Creating a nephrostomy tract involves advancing a needle between the kidney and the flank skin. To plan the procedure, the physician reviews available imaging to understand the three-dimensional anatomic field of the retroperitoneal space and the patient-specific organ positions in the context of the planned procedure. Most imaging reviewed by physicians is presented in a two-dimensional format, e.g. computed tomography, fluoroscopic, and ultrasound images. A three-dimensional representation of the anatomy is useful before and possibly during the procedure because it better reflects the three-dimensional nature of the procedure and can therefore improve physician understanding and decision making. A three-dimensional rendering can be viewed either on a two-dimensional screen or in a virtual reality three-dimensional environment.
Various tools exist that categorize patient-specific anatomic factors into complexity levels for kidney stone surgery generally and percutaneous nephrolithotomy (PCNL) specifically. The Guy's stone score, the CROES nomogram, the S.T.O.N.E. nephrolithotomy score and the S-ReSC system are predominantly based on kidney stone factors such as stone location, stone size, and stone-to-skin distance measurements. In the setting of nephrostomy renal puncture, a sharp member is advanced between the kidney and the flank skin through the retroperitoneal anatomic field, with care taken to avoid injuring nearby precious organs. Defining a window of safe passage through the retroperitoneum would be a benefit, as it would allow enhanced understanding and potentially measurements. This window of safe passage is herein called a 'flank window'. Understanding how potential or actual renal puncture pathways would pass from kidney to flank skin through the 'flank window' would benefit the physician for procedure understanding, qualification and planning. This 'flank window', or safe passage from kidney to flank skin, can be presented on 2D or 3D medical imaging.
There is currently no system that evaluates this critical soft-tissue anatomy (e.g. nearby organs outside the kidney that may lie in the path between the kidney and flank skin) to aid in the performance of nephrostomy creation for the PCNL procedure. A system that focuses on anatomic consideration of non-renal organs in the field between kidney and flank skin would provide understanding of the anatomic field before and during renal procedures, e.g. in the setting of nephrostomy creation. A model that presents patient-specific anatomic detail to the surgeon both before and during the procedure would be a valuable resource to surgeons performing nephrostomy creation and would enhance patient safety and physician adoption of this difficult procedure. Further, a model that adds a proposed or created nephrostomy tract to the patient-specific anatomic field may enhance physician understanding and thus patient safety and adoption of the technique.
Many computer software imaging platforms are currently available to perform anatomic image segmentation and three-dimensional image reconstruction to represent image features relevant to a range of clinical interests. There is an acute need to represent the anatomy related to renal nephrostomy puncture in a three-dimensional fashion. Correlating established nephrostomy puncture pathways in the intraoperative setting with three-dimensionally reconstructed regional anatomy is a function that is not currently available. This requires a system that properly emphasizes relevant anatomy and de-emphasizes non-relevant anatomy in the visual presentation of the anatomic field. To enable that functionality, muscle, fat, bone and skin may need to be segmented along with kidney, liver, bowel, spleen and pleura, and translucency may need to be applied to many of these structures to permit a 'through and through' view of the puncture tract. Collectively, this nephrostomy-creation-focused image reconstruction function is not currently reported. An additional value would be to represent a puncture path on reconstructed imaging in relation to the anatomic field. With this function the surgeon can cognitively map the planned or created puncture to the nearby intracorporeal anatomy, or computer software functionality can map the puncture path onto the image reconstruction data set.
Rules that govern puncture or other surgical interventions such as retrograde nephrostomy wire puncture or tissue biopsy can be defined and applied to this anatomic data to provide puncture tract evaluation, interventional analysis, planning, qualification, and intraoperative procedure confirmation to the physician. Further, the combination of ‘flank window’ anatomic field data with procedural rule sets can be used to assess surgical risk, feasibility, planning, and navigational support during the procedure.
Thus, the present invention endeavors to address these needs, as set forth in the exemplary embodiments below.
In an exemplary embodiment, a model or system defines an anatomic window (the 'flank window') from the kidney, through the retroperitoneum, to the nearby ipsilateral flank skin. This window is defined either by its nearby precious perimeter organs, where the space between these organs is the safe anatomic 'flank window', or by a manually drawn field inside the space defined by the perimeter organs, where the perimeter of this drawn field may be offset a distance from the perimeter organs. This window can be represented visually in various fashions to aid the physician in understanding the patient-specific anatomy relevant to a procedure.
In an exemplary embodiment the ‘flank window’ may be defined in several ways:
In an exemplary embodiment, a 3D imaging reconstruction of previously acquired source patient-specific imaging data demonstrates the kidney and other structures in the region of the path between the kidney and the flank skin, possibly including the position and terminal points of one or more ipsilateral ribs, colon, pleura, liver, spleen, paraspinal muscle, pararenal and perirenal fat, fascia, flank musculature and subcutaneous fat, possibly with specific visual representation of the 'flank window' that defines a safe path from kidney to skin. Renal-specific anatomic features may be incorporated to enhance position and structure information, e.g. renal rotation, anterior vs. posterior regions of the kidney, upper/mid/lower pole regions, renal infundibula, renal papillae, subsegments of renal papillae, any renal masses, etc. Renal anatomic feature input may be limited by imaging detail availability; e.g. CT urography and/or retrograde pyelography may provide more anatomic information for segmentation than a source CT acquired without contrast. All of the above anatomic reference points may be segmented using a combination of manual, automatic and/or AI-supported processes. This visual image reconstruction presents selected anatomic elements in a single visual spatial model. This unified multidimensional visual model may aid the surgeon in planning an antegrade or retrograde renal procedure by presenting a clear view of the relevant anatomy. In an exemplary embodiment this model would be applied to either antegrade or retrograde renal puncture for nephrostomy creation. Translucency may be applied to various anatomic elements of the 3D image reconstruction to permit enhanced visual understanding of the path from kidney to flank skin while still conveying the relevant regional anatomic structures, their interrelationships, and a potential safe puncture path between kidney and skin. This 3D reconstruction can also be formatted and processed to permit a 'drive through' or 'virtual puncture' journey in 3D space between kidney and flank skin using a virtual reality or alternative software feature. A puncture path may be modeled if a puncture point on each of the kidney and the flank skin is selected, with the shortest line between them becoming a modeled puncture path through the anatomic field. The kidney puncture point is cognitively known to the surgeon, who is aware that the puncture was through a specific region of the kidney, e.g. the posterior mid pole. Thus, with the skin puncture site known in relation to, say, the tip of the 12th rib, and the renal puncture site known, the tract can be overlaid onto the pre-existing reconstructed 2D or 3D image set either cognitively (in the 'mind's eye' of the surgeon) or visually using computer functionality.
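For illustration only, the following non-limiting sketch (Python with NumPy, hypothetical coordinates in millimetres) shows one way the shortest-line puncture path between a selected renal point and a selected flank-skin point might be computed and sampled for overlay; it is one possible implementation, not a required one.

```python
import numpy as np

def model_puncture_path(kidney_point_mm, skin_point_mm, n_samples=100):
    """Model a straight puncture path between a selected renal puncture point
    and a selected flank-skin point (both in scanner XYZ, millimetres).
    Returns tract length, a unit direction, and evenly spaced sample points
    that can be overlaid on 2D slices or a 3D reconstruction."""
    k = np.asarray(kidney_point_mm, dtype=float)
    s = np.asarray(skin_point_mm, dtype=float)
    vec = s - k
    length_mm = float(np.linalg.norm(vec))
    direction = vec / length_mm
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    samples = k + t * vec                     # points along the modeled tract
    return length_mm, direction, samples

# Hypothetical example: posterior mid-pole calyx to flank skin near the 12th rib tip
length, direction, pts = model_puncture_path([-60.0, -110.0, -35.0],
                                             [-185.0, -95.0, -20.0])
```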
In an exemplary embodiment, the above anatomic elements are represented as voxel coordinates in a single unified spatial model (X-Y-Z planes) to allow accurate visual image presentation of relative organ/anatomy position, and also to permit procedure modeling on top of this 3D anatomic field converted to specific multiplanar position data.
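As a non-limiting example, a minimal sketch of mapping voxel indices into one unified XYZ frame is shown below; the spacing, origin, and orientation values are hypothetical placeholders standing in for the metadata of the source imaging series.

```python
import numpy as np

# Hypothetical metadata standing in for the source CT/MR series
spacing = np.array([0.78, 0.78, 2.0])          # voxel size in mm (x, y, z)
origin = np.array([-200.0, -180.0, -350.0])    # XYZ position of voxel (0, 0, 0) in mm
orientation = np.eye(3)                        # axis direction cosines (identity here)

def voxel_to_world(ijk):
    """Map integer voxel indices to scanner XYZ coordinates in millimetres."""
    return origin + orientation @ (np.asarray(ijk, dtype=float) * spacing)

# Every segmented structure (kidney, colon, pleura, flank skin, ...) becomes a
# set of XYZ points in the same frame, so relative organ positions and
# puncture-path geometry can be computed directly on the unified model.
kidney_surface_mm = np.array([voxel_to_world(v) for v in [(120, 240, 60), (121, 240, 60)]])
```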
In an exemplary embodiment, information regarding a proposed or intraoperatively performed renal puncture can be added to the above XYZ anatomic positional data to visually model the puncture path's relationship to relevant nearby anatomic structures. Anatomic structures that permit registration of the actual puncture to the image reconstruction may include palpable ribs or portions thereof, skin overlying a palpable rib, the posterior axillary line, the umbilicus, the palpable lateral edge of the paraspinal muscle, skin, subcutaneous fat, fascia and flank musculature, pararenal and perirenal fat, and possibly a radio-visible marker that may have been applied to the patient prior to acquisition of the source CT scan imaging.
In an exemplary embodiment, a model or system defines and applies interactive rule sets to a spatial model that evaluates and proposes potential puncture paths between kidney and flank skin passing through the 'flank window'. An input function may be added to allow the physician to interactively contribute to the selected puncture path. This modeling may inform the surgeon planning an antegrade or retrograde renal procedure, such as nephrostomy creation, as to available and optimal paths. In one exemplary embodiment this model would be applied to retrograde renal puncture for nephrostomy creation. In one exemplary embodiment the proposed or best selected path rule set would have additional outputs such as distance to the nearest precious organ, categorization into 'safe' and 'unsafe' groupings, etc.
In an exemplary embodiment, the above model or system may use various methods to propose a puncture path from the renal collecting system to the flank skin passing through a safe anatomic window ('flank window'). These puncture path models can be interactive, permitting the surgeon to modify and manipulate the puncture paths, for example with a drawing tool. Methods to determine a puncture path include but are not limited to: (i) AI, neural networks or other advanced calculation methodologies applied to select a 'best path' between the kidney and the flank skin passing through the flank window, where 'best path' may consider any of: angle-of-approach limits at the skin, a preference for being closer to the center of the 'flank window' or at least a distance from the edge of the 'flank window' or nearby organs, and any preference regarding approach to or departure from the kidney. Regarding these points, physician input may be used to inform the 'best path' calculation, e.g. 'no puncture from or to the upper pole', 'no puncture from or to the anterior region of the kidney', 'path must be from or to a physician-selected calyx/infundibulum', 'no puncture path angle of approach to the skin greater than 15 degrees from perpendicular', etc. Specific regions within a calyx, e.g. the posterior fornix, can be calculated as part of a 'best path' output; (ii) a retrograde puncture path is established by selecting a minimum of 2 points in X-Y-Z space from within the renal collecting system. These initiation points can be proposed by AI or other advanced calculation systems or by the physician using a manual process, e.g. manual segmentation. In either case, the puncture path trajectory is linearly extrapolated from the path set by these initiation coordinates in X-Y-Z space. A margin of error around the ideal puncture path predicts a potential range of puncture paths, and this range of error grows with distance as the wire progresses further from the initiation of the puncture. This consideration may or may not be applied as a rule set to build a confidence level of possible wire paths in space; (iii) position data from two or more points on the ureteroscope tip is used, e.g. where the 2-point position of the ureteroscope tip is captured either by fluoroscopic registration where CT/fluoroscopy fusion is available, or by visual registration; (iv) a range of positions within an infundibulum and calyx are calculated to determine the optimal position on the papilla for puncture. In an exemplary model a specific rule set considers the angle of approach of the puncture wire to the flank skin. A more tangential approach to the skin creates a weaker puncture interface between the wire and the skin, which can reduce the ability of the puncture wire to advance through the skin. Thus, rules may place limits on ranges of puncture angles to ensure an acceptable wire angle at the flank skin.
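Purely by way of example, the sketch below shows one way the two-point retrograde extrapolation, the growing error cone, and the skin angle-of-approach limit described above might be computed. The 3-degree angular error, the 15-degree skin limit, and the assumption of a locally planar skin patch with a known normal are hypothetical choices of this sketch, not requirements of the system.

```python
import numpy as np

def extrapolate_retrograde_path(p1, p2, distances_mm, angle_error_deg=3.0):
    """Linearly extrapolate a retrograde puncture path from two intrarenal
    points in XYZ mm (method (ii) above) and estimate how a fixed angular
    error grows into a positional error cone with distance from the kidney."""
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    direction = (p2 - p1) / np.linalg.norm(p2 - p1)
    path_points = [p2 + d * direction for d in distances_mm]
    cone_radius_mm = [d * np.tan(np.radians(angle_error_deg)) for d in distances_mm]
    return direction, path_points, cone_radius_mm

def skin_angle_acceptable(path_direction, skin_normal, max_deg=15.0):
    """Check the wire's angle of approach at the flank skin against a limit
    measured from the perpendicular to a locally planar skin patch."""
    cos_a = abs(np.dot(path_direction, skin_normal) /
                (np.linalg.norm(path_direction) * np.linalg.norm(skin_normal)))
    angle_from_perpendicular = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
    return angle_from_perpendicular <= max_deg, angle_from_perpendicular
```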
The above outputs can be presented as sequential 2D anatomic slices or as a 3D reconstructed model. The 3D images can be presented on a two-dimensional screen or in three dimensions, e.g. as a 3D-printed model or in a virtual ureteroscopy or virtual reality environment. This representation provides the surgeon with spatial understanding of the patient-specific flank anatomy and patient-specific safe anatomic windows for puncture from the kidney to the flank, and can be used by the physician to assess surgical candidacy, surgical planning and risk. This can be applied in the pre-operative planning stage or intraoperatively as a cognitive aid to the surgery.
In an exemplary embodiment, a system or model wherein a therapeutic renal procedure plan can be assessed and created based on rule set modeling and case qualification, and these outputs can be presented to the surgeon on a 2D monitor, a 3D printed model, or a virtual ureteroscopy or virtual reality environment; these data outputs may allow the surgeon to interact with them in either a preoperative or intraoperative environment.
In an exemplary embodiment, a computer-based method for segmenting relevant anatomy, registration with live endoscopy, and tracking (e.g. image-based, optical or electromagnetic systems) provides live navigational assistance during the surgery by overlaying preoperative computational modeling of safe, ideal, and/or preselected puncture pathways with live surgical position. In an exemplary embodiment this model may comprise: receiving processed image data of a patient's abdominal and retroperitoneal anatomy and 'window of safety'; applying puncture rule sets to the processed anatomic data to determine best or possible puncture initiation points and best or possible puncture angles; receiving live endoscopic image data of a portion of the patient's renal collecting system collected from a flexible ureteroscope, and possibly fluoroscopic or ultrasound image data; and performing image registration between these image data to determine a global location and orientation of the ureteroscope within the patient's urinary collecting system during flexible ureteroscopy and to provide guidance on how to perform the selected procedure according to the previously modeled and approved puncture trajectories determined by pre-operative image analysis and puncture path rule set application. Image tracking, optical tracking, or electromagnetic tracking methods can be applied to the registration system to provide live ongoing procedural guidance.
Create a rule set to determine overall case complexity: 'easy', 'moderate', 'advanced', or 'unsafe'. This can be based on a number of considerations, some or all of which may include distance and angle calculations, safe anatomic window size, distance from kidney to skin, etc.
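One non-limiting sketch of such a complexity rule set follows; every threshold shown is a hypothetical placeholder rather than a validated clinical value, and the choice of metrics is illustrative only.

```python
def classify_case_complexity(window_area_cm2, tract_length_mm,
                             min_organ_clearance_mm, skin_angle_deg):
    """Return an overall case-complexity label from a few per-case metrics.
    All thresholds are hypothetical placeholders, not validated values."""
    if window_area_cm2 <= 0 or min_organ_clearance_mm < 5:
        return "unsafe"                                  # no usable flank window
    score = 0
    score += 1 if tract_length_mm > 100 else 0           # long tract
    score += 1 if window_area_cm2 < 10 else 0            # small flank window
    score += 1 if min_organ_clearance_mm < 15 else 0     # tight organ clearance
    score += 1 if skin_angle_deg > 10 else 0             # oblique skin approach
    return ("easy", "moderate", "advanced", "advanced", "unsafe")[score]
```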
In an exemplary embodiment:
1. Cross-sectional imaging data is sent to a software program to segment relevant abdominal and retroperitoneal anatomic structures, ribs, detailed anatomic features of the kidney, flank skin, flank muscles, fascia, pararenal and perirenal fat, subcutaneous fat and possibly the 'flank window'. This segmentation process may incorporate automatic, manual and/or AI-supported segmentation processes.
2. Render a visual reconstruction of the above and present this as a 2D slice-by-slice reconstruction or a 3D reconstruction to the surgeon either on a 2-dimensional monitor, a 3D printed model, or in a 3D virtual ureteroscopy or virtual reality environment. This output can be presented to the physician in the preoperative environment or intraoperatively as an aid to surgery.
In another exemplary embodiment, the above outputs 1. and 2. can be used as source processed imaging data and the following functions are added on top of that existing output:
3. In the preoperative or intraoperative setting of an antegrade or retrograde nephrostomy puncture, anatomic points such as (i) the skin puncture point, possibly in positional relation to nearby identifiable bony anatomy, e.g. the tip of the 12th rib, and (ii) the renal puncture point, can be used as the two ends of an interpolated line. This interpolated line can be cognitively modeled by the surgeon to consider the puncture path, or the points can be inputted onto the 2D or reconstructed 3D anatomic image data set using rule sets and tools such as a manual drawing tool or description-based input into a computer system able to receive these anatomic points and model or interpolate a straight line between them onto the preoperative 2D slice-by-slice or 3D reconstructed anatomic images. This permits the physician to view a modeled or actual intraoperative puncture path's relationship to relevant nearby organ structures. For example, after a surgeon creates a puncture intraoperatively, the wire emerges from the flank skin and is controlled with a clamp. The surgeon measures the distance from the nearest palpable rib or other known and segmented anatomic structure, and this information becomes the skin puncture data point. The surgeon is aware of where the puncture penetrated the kidney. With these two points (skin and kidney) known in XYZ planes, the puncture path through the flank tissues can be interpolated between them either cognitively (in the physician's 'mind's eye') or via a function permitting entry of these skin and kidney puncture data into a computer system, which models the created puncture path between the two points and presents this line on either 2D slice-by-slice or 3D reconstructed images, permitting the surgeon to evaluate the created tract and its relation and proximity to nearby organs, angles, etc. Additionally, further rule sets can provide additional outputs relevant to evaluation of the created puncture tract, such as the distance from any point along the puncture path to the nearest precious organ, an overall safety profile based on established rule set criteria, total tract length calculations, angles, etc. (see the sketch following this discussion). Maximum and minimum values can be included in these rule sets. The entry (antegrade puncture) or exit point (retrograde puncture) of the wire in relation to the skin could be registered to either the 2D axial, coronal or sagittal images or the 3D segmented and reconstructed images using an anatomic structure common to both the external flank anatomy and the 3D images, such as a portion or point on a specific palpable bony structure (e.g. a rib) or the skin overlying that palpable bony structure, and the relation of the wire skin entry or exit point to that identified portion of bony structure and its corresponding representation on the 2D or 3D segmented preoperative images, or other structures such as the umbilicus, posterior axillary line, edge of the paraspinal muscle, a radio-opaque marker, etc. A 2D slice-by-slice or 3D representation can be generated that shows the puncture path (a straight line drawn in 3D between the skin and renal wire puncture points) in the larger anatomic field, to determine puncture safety prior to proceeding further in the operating room with the subsequent steps of the procedure.
Regarding the renal entry (in the setting of an antegrade nephrostomy puncture) or exit point (in the setting of a retrograde nephrostomy puncture), information such as anterior or posterior, upper, mid or lower pole regions of the kidney may be used to register or model the true puncture to the XYZ spatial data grid and reconstructed images.
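As a non-limiting illustration of the tract-evaluation outputs mentioned above, the sketch below computes the total tract length and the minimum clearance from the interpolated puncture path to a point cloud sampled from a segmented nearby organ; the function name and inputs are hypothetical.

```python
import numpy as np

def tract_to_organ_clearance(skin_pt, kidney_pt, organ_points_mm, n_samples=200):
    """Return the total tract length (mm) and the minimum distance (mm) from
    the interpolated skin-to-kidney tract to a point cloud sampled from a
    segmented nearby organ (e.g. colon or pleura)."""
    skin_pt = np.asarray(skin_pt, dtype=float)
    kidney_pt = np.asarray(kidney_pt, dtype=float)
    organ = np.asarray(organ_points_mm, dtype=float)
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    tract = kidney_pt + t * (skin_pt - kidney_pt)   # sampled points along the tract
    # distances between every tract sample and every organ surface point
    d = np.linalg.norm(tract[:, None, :] - organ[None, :, :], axis=2)
    return float(np.linalg.norm(skin_pt - kidney_pt)), float(d.min())
```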
In another exemplary embodiment, the above outputs 1. and 2. (and possibly 3.) can be used as source processed imaging data and the following functions are added on top of that existing output:
During the procedure planning stage, apply a set of rules that assess potential puncture paths initiated from various positions within the kidney. These sets of rules include but are not limited to sub-rule sets that (i) define the origin of the puncture path from within the kidney; (ii) constrain puncture paths to travel safely through the segmented 'window of safe passage' based on distance tolerances to nearby organs, etc.; (iii) constrain acceptable angles and lengths of the puncture wire at its contact with the flank skin; and (iv) limit maximum tract length. Alternatively, inputted renal and skin puncture site data (3. above) can be analyzed against rule sets to evaluate the safety of the entered tract data.
Output of this phase may include (i) treatment tract ratings/proposals, including which segment of papilla is ideal for puncture to achieve a proper path angle to the treatment target area, e.g. puncture through the 'window'; (ii) a simulated 2D fluoroscopy image showing the range of acceptable flexible ureteroscope positions in 2D space to achieve acceptable wire angles; (iii) an overall complexity score of the case: 'easy', 'moderate', 'advanced', or 'unsafe'; (iv) a video reconstruction of the urinary collecting system with eligible puncture paths visually overlaid to simulate the surgical plan, possibly with a virtual endoscopy or virtual ureteroscopy element to the presentation of these reconstructions. This phase aids preoperative surgical decision making and planning.
In another exemplary embodiment, the above outputs can be used as source data and the following functions are added on top of that existing output:
A possible extension of the above output is registration and tracking of live video endoscopy (possibly also live fluoroscopy and/or ultrasound images) with preoperatively processed images, performed (1) using soft tissue urinary system targets that can be identified on both the 3D CT and vision endoscopy; (2) registration may be integrated with fluoroscopy (biplanar or multiplanar) and/or ultrasound. Image registration is performed by any one or more of: marking the ureteroscope position (i) under vision and optionally (ii) in relation to a live retrograde pyelogram, onto the corresponding 3D processed CT image, optionally with integrated CT urogram or retrograde pyelogram data. Tracking can be performed using an image-based tracking system, optical tracking or electromagnetic tracking.
In an exemplary embodiment, soft tissue registration targets can include the ureteropelvic junction (UPJ), any number of calyces (anterior/posterior, uppermost, lowermost), the ostium of the selected infundibulum, and the papilla of the infundibulum selected for puncture.
In an exemplary embodiment, registration may include various points on a particular papilla—cephalad fornix, posterior fornix, etc. In particular, soft tissue registration data from the infundibular ostium and papilla selected for puncture can be abstracted in XYZ planes and used to generate the origination point and path prediction for the puncture path probability cone.
In an exemplary embodiment, continuous and discrete ongoing/subsequent registrations can be performed based on visual travel through the mapped branched lumen of the kidney and/or known anatomic landmarks.
In an exemplary embodiment, registration and/or tracking can be performed using image-based tracking systems, optical tracking, EM navigation, or other identifiable landmarks on fluoroscopy such as ribs, the flexible ureteroscope itself, vertebrae, the anterior superior iliac spine, etc. These data may be integrated with direct-vision endoscopic/preoperative imaging data registrations and/or tracking systems.
In an exemplary embodiment, registration and tracking can additionally be integrated with 2D fluoroscopy if CT/fluoroscopy fusion capabilities are available/integrated, with ultrasound, or with electromagnetic (EM) navigation where a 2-point EM probe is inside the working channel or integrated into the distal 2 cm of the flexible ureteroscope. In the scenario of a 2-point EM sensor inside the ureteroscope, either as a built-in feature or a probe in the working channel, having 2-point spatial data inside the kidney permits an accurate extrapolation of a retrograde advanced puncture wire.
In the intraoperative environment, several outputs are contemplated: 1. an integrated visual of live endoscopy video and a color-coded guidance overlay onto the endoscopic video, to guide the physician to the preselected region (upper/mid/lower pole, anterior/posterior) or specific calyx for puncture as well as the sub-portion of the papilla to aim for (e.g. 'aim for posterior portion of papilla'); 2. a 3D representation of patient-specific anatomy and flank window for surgeon review alongside the surgery, possibly with preoperatively proposed or selected puncture paths; 3. a two-dimensional representative image of the ureteroscope's biplanar position on simulated fluoroscopy to define a safe range of ureteroscope positions for puncture, e.g. 0 to 20 degrees cephalad/caudad, to guide the physician to set a safe ureteroscope position on 2D fluoroscopy; 4. projection of soft tissue anatomy or approved puncture pathway data onto 2D fluoroscopy images; 5. an augmented reality display.
In an exemplary embodiment of this invention, the entirety of the systems defined herein is applied to permit various instruments to be extended from the ureteroscope working channel, under the full range of the imaging and procedural concepts and guidance systems defined herein, to diagnose or treat other issues, e.g. directing a biopsy instrument to biopsy a segmented renal mass; advancing a therapeutic needle that either cycles cold liquid for cryotherapy or has a lumen to inject a therapeutic or diagnostic agent including but not limited to immunotherapy or chemotherapy; or intervening for diagnostic or therapeutic purposes using a ureteroscope to deploy therapy or intervention to the liver, colon, spleen, skin, etc.
The method further comprises fusing the endoscopic image data with the 3D CT scan or MRI image data, possibly with biplanar or multiplanar fluoroscopy data, to provide real-time navigational assistance to the surgeon during the procedure. The method further comprises projecting navigational guidance onto the live endoscopy image. The method further comprises superimposing the endoscopic position onto a processed 3D reconstructed CT or MRI image. The method further comprises superimposing image data (video or 3D CT or MRI data) onto fluoroscopy images.
In an exemplary embodiment of the present invention, a system for flexible ureteroscope navigation assistance comprises: a memory device for storing a program; and a processor in communication with the memory device, the processor operative with the program to: receive first image data of a patient's abdominal and retroperitoneal anatomy, the first image data acquired before a surgery is performed; perform a segmentation process, either automated or manual, to mark relevant anatomic sites on the first image data; apply a rule set or sets to the image data in the program to map potential anatomic pathways through the anatomy; receive second image data acquired during the surgical procedure (endoscopic video and/or fluoroscopy data and/or ultrasound data); and perform registration between the first image data and the second image data to determine a global location and orientation of the ureteroscope within the patient's urinary collecting system during the surgery and to inform the operator on how best to proceed with the procedure.
In an exemplary embodiment of the present invention, a system for real-time flexible ureteroscopic procedure planning is based on 3D CT or MRI image data (and possibly retrograde pyelographic or ultrasound data) and rule sets applied to these image data. Navigational assistance is provided with various strategies including, but not limited to, fusion of images, separately displayed images, etc. Any combination of image overlay or guidance instructions between 3D CT/MRI data, endoscopic video, fluoroscopy and ultrasound is contemplated; whichever presentation is most useful for the surgeon will be presented.
It is contemplated that the rule sets will require iterative improvement as a multitude of patient-specific data is collected, to improve performance of predictive elements of surgical guidance outputs. Various means of iteration are contemplated which will result in improved algorithm performance.
The tools and sequences described herein can be applied to a range of organ systems for diagnostic and/or therapeutic purposes. One alternate embodiment is the diagnosis and treatment of renal lesions using a retrograde approach through the collecting system. More generally, other embodiments may apply these principles to any endoluminal approach to diagnostic or therapeutic delivery involving non-renal organs, such as the sinuses, the gastrointestinal system, the central nervous system, and the cardiovascular system.
The foregoing features are of representative embodiments and are presented to assist in understanding the invention. It should be understood that they are not intended to be considered limitations on the invention as defined by the claims, or limitations on equivalents to the claims. Therefore, this summary of features should not be considered dispositive in determining equivalents. Additional features of the invention will become apparent in the following description, from the drawings and from the claims.
It should also be understood that the above descriptions and those represented in the drawings are only representative of illustrative embodiments. For the convenience of the reader, these descriptions have focused on a representative sample of possible embodiments, a sample that is illustrative of the principles of the invention. The description has not attempted to exhaustively enumerate all possible variations. That alternative embodiments may not have been presented for a specific portion of the invention, or that further undescribed alternatives may be available for a portion, is not to be considered a disclaimer of those alternate embodiments. Other applications and embodiments can be implemented without departing from the spirit and scope of the present invention.
It is therefore intended that the invention not be limited to the specifically described embodiments because numerous permutations and combinations of the above and implementations involving non-inventive substitutions for the above can be created, but the invention is to be defined in accordance with the general principles that are contained herein. It can be appreciated that many of those undescribed embodiments are within the literal scope of claims to be filed, and that others are equivalent.
Once the 3D image reconstruction and ROI marking and segmentation 120 have been performed, procedure planning and navigational assistance rule sets 125 are applied to these data 120 to generate an output 130 of (i) acceptable treatment plan proposal(s); (ii) virtual navigation through collecting system imaging to review the procedure plan proposal and rating 505; and (iii) a proposed two-dimensionally represented anterior/posterior simulated fluoroscopic ureteroscope position for acceptable puncture for each acceptable treatment plan 510. To the extent that rule sets rule out pathways that do not meet criteria for puncture path eligibility, only those pathways that qualify will be represented as options for puncture. For example, if a puncture from a region of the kidney cannot pass through the safety window based on extrapolation from the puncture site initiation 405, then that region of the kidney will be ruled out as having a viable path for puncture. For viable puncture pathways, various modes of visual presentation of treatment plan proposals 130 are contemplated. Various aspects of the treatment proposals can be represented, including but not limited to (i) visually represented paths to avoid and papillae suitable for puncture; (ii) sub-regions of the selected papilla ideal for puncture (e.g. the posterior portion of the papilla may be marked for targeting). A rating or case complexity score is also contemplated that would be the product of a rule set that informs the surgeon of the case difficulty or feasibility. A relative complexity scoring of one or more potential procedure paths is contemplated, which could result in a ranking of first and second choice pathways, for example.
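By way of example only, the following sketch filters candidate puncture paths against eligibility rules and ranks the survivors; the metric names, rule thresholds, and candidate values are hypothetical placeholders, and ranking by tract length is just one possible scoring choice.

```python
def rank_candidate_paths(candidates, rules, score_fn):
    """Keep only candidate puncture paths that satisfy every rule, then rank
    the survivors by a relative suitability score (lower = preferred)."""
    eligible = [c for c in candidates if all(rule(c) for rule in rules)]
    return sorted(eligible, key=score_fn)

# Hypothetical precomputed metrics for two candidate paths
candidate_paths = [
    {"name": "posterior mid-pole", "passes_flank_window": True,
     "min_organ_clearance_mm": 22.0, "skin_angle_deg": 8.0, "tract_length_mm": 95.0},
    {"name": "upper pole", "passes_flank_window": True,
     "min_organ_clearance_mm": 6.0, "skin_angle_deg": 12.0, "tract_length_mm": 110.0},
]

# Hypothetical eligibility rules; ranking here is simply by tract length
rules = [
    lambda c: c["passes_flank_window"],
    lambda c: c["min_organ_clearance_mm"] >= 10,
    lambda c: c["skin_angle_deg"] <= 15,
    lambda c: c["tract_length_mm"] <= 140,
]
ranked = rank_candidate_paths(candidate_paths, rules,
                              score_fn=lambda c: c["tract_length_mm"])
print([c["name"] for c in ranked])   # only paths meeting all criteria, best first
```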
Given the procedure plan proposal and complexity rating 130, the surgeon can review these prior to surgery to qualify or disqualify a patient for surgery, plan the surgery, and review the surgical plan visually prior to surgery or as an aid during the surgery. This output 130 can also be integrated in the OR to provide live surgical guidance. In this embodiment, a video endoscopy image is acquired from the patient 133 using the flexible ureteroscope 270, and fluoroscopic data 134 may be integrated into the registration/tracking system.
Next, image registration is performed between the planning image 120 and the video endoscopic image 133, and possibly the fluoroscopy data 134. This is done using any of a number of image registration techniques. These may include identifiable soft tissue anatomic sites common to the endoscopy and the planning image 120, fluoroscopy image fusion with video and/or 3D CT/MRI data, electromagnetic navigation if integrated, or acquisition at this time of a retrograde pyelogram 134. Other methods of registration are contemplated. At this time a deformable mapping can be computed.
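As one non-limiting illustration of the registration techniques listed above, the sketch below solves a paired-landmark rigid alignment (the Kabsch/Procrustes solution); landmark pairing and any subsequent deformable refinement are assumed to be handled elsewhere.

```python
import numpy as np

def rigid_point_registration(fixed_pts, moving_pts):
    """Least-squares rigid alignment (rotation R, translation t) of paired
    landmarks so that R @ moving + t approximates fixed (Kabsch/Procrustes),
    e.g. calyceal landmarks marked on the planning image 120 paired with the
    same landmarks identified intraoperatively."""
    F = np.asarray(fixed_pts, dtype=float)
    M = np.asarray(moving_pts, dtype=float)
    fc, mc = F.mean(axis=0), M.mean(axis=0)
    H = (M - mc).T @ (F - fc)                 # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = fc - R @ mc
    return R, t
```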
Several tracking systems are contemplated: Image tracking, optical tracking, EM navigation tracking.
With the image registration complete, a global location and orientation of the flexible ureteroscope 270 within the urinary collecting system is presented as image guidance output 140. For example, given the location of the ureteroscope 270 within the procedural image, the deformable mapping computed above can then be used to find the location of the ureteroscope 270 in the planning image 120. In order to infer the orientation of the ureteroscope 270 in the planning image 120, an orientation of the ureteroscope 270 must be determined from the procedure image.
Depending on what is required, several options for image guidance outputs 140 exist at this stage. In one option, for example, the procedure proposal image 130 can be superimposed onto the endoscopy video image 133. Alternatively, the video endoscopy image 133 can be superimposed onto the procedure proposal image 130 and/or its pre-selected procedure path. In another option a simulated 2D fluoroscopy image can be presented to show the surgeon acceptable ureteroscope deflection angles on 2D fluoroscopy acquired during the procedure. In another option, a cine loop can be presented to guide the surgeon to the preselected region of the kidney.
As the ureteroscope is advanced 145, virtual ureteroscopy 150 can be applied to guide the ureteroscope to the preselected target. This is performed by the surgeon reviewing preoperatively generated guidance outputs 130 during the operation. Image re-registration 155 can be performed as the ureteroscope moves closer to its target and passes pre-set regions of interest thereby reducing future positional XYZ error. Tracking can be performed in a number of possible ways including but not limited to image-based tracking, electromagnetic navigation 160 and biplanar or multiplanar fluoroscopy image fusion 165.
Image-based or other listed tracking systems 170 may employ registration refinement using 3D-3D modified iterative closest point (ICP) registration. Simultaneous localization and mapping (SLAM) algorithmic frameworks may be used for image-based tracking. Because a flexible endoscope does not have the option of direct motion estimation using an external tracker, since the endoscope tip position cannot be obtained via a rigid transformation, modelling the tip motion using boundary conditions and dynamic constraints, and applying repeated registration to anatomic points further along the anatomic path, can provide meaningful location information. As well, because the path of the flexible scope is constrained by the luminal wall, which can be constructed from preoperative images, this highly constrained possible scope position enables the scope tip position to be reasonably computed by referencing medical images. Thus, image registration is a potentially good choice.
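For illustration only, the sketch below shows the lumen-constraint idea described above: a noisy scope-tip estimate is snapped to the nearest point of a centerline assumed to have been precomputed from the segmented collecting system (the centerline itself and the tip estimate are hypothetical inputs).

```python
import numpy as np

def snap_to_centerline(tip_estimate_mm, centerline_pts_mm):
    """Constrain a noisy scope-tip position estimate to the luminal path by
    returning the nearest point on a centerline precomputed from the segmented
    collecting system, together with its index along the path."""
    tip = np.asarray(tip_estimate_mm, dtype=float)
    C = np.asarray(centerline_pts_mm, dtype=float)
    i = int(np.argmin(np.linalg.norm(C - tip, axis=1)))
    return C[i], i
```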
Cephalad-caudad ureteroscope tip angle data can be obtained from biplanar fluoroscopy, and these data can be integrated intraoperatively into the puncture path calculations 165 and the overall algorithms and guidance outputs for the surgeon. Multiplanar fluoroscopic data can also be integrated to provide 3D position data for the ureteroscope within the overall algorithms and guidance outputs for the surgeon 165.
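Purely as a sketch, the function below shows one simple way angles measured on two fluoroscopic planes might be combined into a single 3D tip direction; the orthogonal AP/lateral views and the simple patient coordinate frame are assumptions of this illustration, not requirements of the system.

```python
import numpy as np

def direction_from_biplanar_angles(ap_angle_deg, lateral_angle_deg):
    """Combine the tip angle measured on an AP image (medial-lateral tilt from
    the cephalad axis) with the angle measured on a lateral image
    (anterior-posterior tilt from the cephalad axis) into one 3D unit vector.
    Assumes orthogonal views and a frame with x = medial-lateral,
    y = anterior-posterior, z = cephalad-caudad."""
    dx = np.tan(np.radians(ap_angle_deg))
    dy = np.tan(np.radians(lateral_angle_deg))
    d = np.array([dx, dy, 1.0])    # undefined only for a purely transverse tip
    return d / np.linalg.norm(d)
```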
The acquisition device 205 may be a computed tomography (CT) imaging device or any other three-dimensional (3D) high resolution imaging device such as a magnetic resonance (MR) scanner. The acquisition device may additionally include a fluoroscopy image acquisition device, such as a C-arm or fixed fluoroscopy table, or an image collected from a CT scout/planning acquisition, which presents 2-dimensional x-ray data.
The computer 210, which may be a portable or laptop computer, includes a CPU 225 and a memory 230 connected to an input device 250 and an output device 255. The CPU 225 includes a procedure evaluation and planning module 242 and a flexible ureteroscopic navigation module 245, each of which contains one or more rule sets 125 and methods. Although shown inside the CPU, the planning module 242 and flexible ureteroscopic navigation module 245 can be located outside the CPU 225.
The memory 230 includes a RAM 235 and a ROM 240. The memory 230 can also include a database, disk drive, USB drive, Wi-Fi, Bluetooth, cloud connection, VPN, wired connection or a combination thereof. The RAM 235 functions as a data memory that stores data used during execution of a program in the CPU 225 and is used as a work area. The ROM 240 functions as a program memory for storing a program executed in the CPU 225. The input 250 is constituted by a keyboard, mouse, etc., and the output 255 is constituted by an LCD, CRT display, printer, etc.
The operation of the system 200 can be controlled from the operator's console 215, which includes a controller 265, e.g. a keyboard or touchscreen, and a display 260 which may have an integrated touch screen controller 265. The operator's console 215 communicates with the computer 210 so that image data collected by the acquisition device 205 can be rendered by the computer 210 and viewed on the display 260. The computer 210 can be configured to operate and display information provided by the acquisition device 205 absent the operator's console 215, by using, e.g., the input 250 and output 255 devices to execute certain tasks performed by the controller 265 and display 260.
The operator's console 215 may permit manual anatomic segmentation and may further include any suitable image rendering system/tool/application that can process digital image data, including automatic anatomic segmentation, of an acquired image dataset (or portion thereof) to generate and display images on the display 260. More specifically, the image rendering system may be an application that provides rendering and visualization of medical image data, and which executes on a general purpose or specific computer workstation. The computer 210 can also include the above-mentioned image rendering system/tool/application.
It is contemplated that some or all of the above functions may be remotely located, within the same building or offsite at a remote location, with connection via secure data transfer over VPN, cloud, Wi-Fi, Bluetooth, or other modalities. Functions which may be hosted offsite include the CPU 225 and the memory components 230.
The flexible ureteroscope 270 is a slender tubular instrument with a small light 275 on the end for illumination to permit an image capture element 277 to capture and transmit images of the interior of the urinary collecting system for display on a video display monitor 280.
In some embodiments, an electronic communications linkage can be established between the image guidance output and one or more of a camera, an item of robotic surgical equipment, a trocar, and an electrosurgery device. The electronic communications linkage can be configured for operating one or more of controls for robotic arms, trocars, electrosurgical devices, other surgical equipment and/or stereotactic video offering a three dimensional view of a surgical field.
The learned-AI-model-enhanced automatic segmentation comprises an application-specific integrated circuit (ASIC) for an artificial neural network connected to a communications network, the ASIC comprising: a plurality of neurons organized in an array, wherein each neuron comprises a register, a processing element and at least one input, and a plurality of synaptic circuits, each synaptic circuit including a memory for storing a synaptic weight, wherein each neuron is connected to at least one other neuron via one of the plurality of synaptic circuits configured to identify the coordinates of the window of safe passage.
The application-specific integrated circuit (ASIC) is used to make a prediction or classification of the coordinates of the window of safe passage based on some input data, which can be labeled or unlabeled. The algorithm will produce an estimate about a pattern in the data. An error function then evaluates the prediction of the model. If there are known examples, an error function can make a comparison to assess the accuracy of the model.
A model optimization process then occurs. If the model can fit better to the data points in the training set, then weights are adjusted to reduce the discrepancy between the known example and the model estimate. The algorithm will repeat this “evaluate and optimize” process, updating weights autonomously until a threshold of accuracy has been met. Supervised learning in particular uses a training set to teach models to yield the desired output. This training dataset includes inputs and correct outputs, which enables the model to learn over time. The algorithm measures its accuracy through the loss function, adjusting until the error has been sufficiently minimized.
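For illustration only, the minimal sketch below shows the 'evaluate and optimize' loop described above using a toy logistic-regression voxel classifier on synthetic data; it illustrates loss-driven weight updates only and is not the ASIC or network architecture itself, and all features, labels, and thresholds are hypothetical.

```python
import numpy as np

# Toy logistic-regression classifier over synthetic per-voxel features,
# with labels 1 = inside the window of safe passage, 0 = outside.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                      # hypothetical voxel features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)    # hypothetical labels
w, b, lr = np.zeros(3), 0.0, 0.1

for epoch in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))                          # prediction
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    grad_w, grad_b = X.T @ (p - y) / len(y), float(np.mean(p - y))  # error gradient
    w, b = w - lr * grad_w, b - lr * grad_b                         # adjust weights
    if loss < 0.05:                                                 # accuracy threshold
        break
```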
As well, the coordinates of the window of safe passage are established not strictly by the position of the perimeter organs, but by the direct perimeter of the window of safe passage drawn by a person (manual segmentation, e.g. applying a mouse and drawing tool onto the image) or by software (AI-assisted/software-trained automatic segmentation) performing the segmentation of the perimeter of the window of safe passage. Other representations of this image reconstruction include weighting the visual prominence of various organ parts to optimize viewer understanding of a puncture path, virtual endoscopy, virtual reality, and tools permitting zoom, rotation, and virtual navigation through the image reconstructions. The sum of this will provide the surgeon a better 3-dimensional understanding of the surgical task along with its requirements, limitations, and challenges. Rule sets may be applied to calculate and present distance measurements from a proposed puncture path to nearby precious organs. Both retrograde and antegrade nephrostomy tract creation are considered exemplary embodiments of these applications.
The acquisition device 1105 may be a computed tomography (CT) imaging device or any other three-dimensional (3D) high resolution imaging device such as a magnetic resonance (MR) scanner. The acquisition device may additionally include a fluoroscopy image acquisition device, such as a C-arm or fixed fluoroscopy table, or an image collected from a CT scout/planning acquisition, which presents 2-dimensional x-ray data.
The computer 1110, which may be a portable or laptop computer, includes a CPU 1125 and a memory 1130 connected to an input device 1150 and an output device 1155. The CPU 1125 includes an image reconstruction, evaluation and planning module 1142, which includes a nephrostomy line modeling function module that may contain one or more rule sets 1125 and methods. Although shown inside the CPU, any or all functions from among the image reconstruction, evaluation and planning module and nephrostomy line modeling functions 1142 can be located outside the CPU 1125.
The memory 1130 includes a RAM 1135 and a ROM 1140. The memory 1130 can also include a database, disk drive, USB drive, Wi-Fi, Bluetooth, cloud connection, VPN, wired connection or a combination thereof. The RAM 1135 functions as a data memory that stores data used during execution of a program in the CPU 1125 and is used as a work area. The ROM 1140 functions as a program memory for storing a program executed in the CPU 1125. The input 1150 is constituted by a keyboard, mouse, etc., and the output 1155 is constituted by an LCD, CRT display, printer, etc.
The operation of the system in
The operator's console 1115 may permit manual anatomic inputs using text or a drawing tool onto an image represented on a screen and may further include any suitable image rendering system/tool/application that can process digital image data, including automatic anatomic segmentation, of an acquired image dataset (or portion thereof) to generate and display images on the display 1160. More specifically, the image rendering system may be an application that provides rendering and visualization of medical image data, and which executes on a general purpose or specific computer workstation. The computer 1110 can also include the above-mentioned image rendering system/tool/application.
It is contemplated that some or all of the above functions may be remotely located, within the same building or offsite at a remote location, with connection via secure data transfer over VPN, cloud, Wi-Fi, Bluetooth, or other modalities. Functions which may be hosted offsite include the CPU 1125 and the memory components 1130.
Visual outputs such as puncture assessment 515, puncture path image modeling onto 2D or 3D image reconstructions, measurement and angle calculations are displayed on monitor 280.
The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specifications and drawings are to be regarded in an illustrative rather than a restrictive sense. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description. Therefore, it is intended that the disclosure not be limited to the particular embodiment(s) disclosed.
Number | Date | Country
---|---|---
63696406 | Sep 2024 | US
63705797 | Oct 2024 | US
63724628 | Nov 2024 | US
63614892 | Dec 2023 | US