This invention concerns a system for medical procedure equipment movement planning that determines and presents, in an image, positions of an X-ray imaging system C-arm and workers at different stages of a medical procedure of a particular medical procedure type.
A hybrid operating room (OR) is a complex working environment where a large team of medical staff (interventionists, surgeons, anesthesiologists, nurses and technicians) needs to work seamlessly together. During a trans-catheter aortic-valve implantation (TAVI), for instance, there can be around 20 medical workers in the OR along with different medical equipment and systems including an X-ray imaging system, for example. In known systems, placement of medical equipment and workers may obstruct robotic X-ray imaging system C-arm access to a patient for image acquisition during surgery in a hybrid OR. As a result, an interventional procedure may need to be paused and the equipment and workers moved to a different location within the OR. This is undesirable during surgery, especially since the patient may already be undergoing a procedure. Rearrangement of the OR equipment and personnel takes time and may compromise patient safety. Furthermore, rearrangement of the equipment may need to be repeated if a subsequent rearrangement is not optimal (i.e., the imaging system C-arm cannot reach the required angles for the whole procedure). A system according to invention principles preplans equipment and worker placement in the OR by providing a planned layout of equipment and worker positions for different stages of an interventional procedure before the procedure is initiated.
A system preplans placement of equipment and medical staff to achieve optimal image acquisition angles of an imaging system for cardiac, vascular or neurological surgery procedures in a hybrid operating room, for example. A system for medical procedure equipment movement planning uses at least one repository of information, a position processor and a display processor. The at least one repository of information associates a type of interventional medical procedure with image acquisition angles used for acquiring images during a particular type of interventional medical procedure and with positions of workers having particular roles in a procedure in an operating room and with corresponding different stages of a medical procedure. The position processor, in response to receiving data identifying a medical procedure type, automatically uses the information and predetermined data identifying location and dimensions of an X-ray imaging system C-arm and a patient support table in determining positions of a C-arm and workers at different stages of a medical procedure of the medical procedure type. The display processor initiates generation of data representing a display image showing the determined positions of the C-arm and workers at different stages of the medical procedure.
A system preplans placement of equipment and medical workers during different stages of a medical procedure in an OR to achieve the image acquisition angles needed by an imaging system for cardiac, vascular or neurological surgery procedures in a hybrid operating room, for example. The system advantageously informs responsible surgeons, radiologists and cardiologists of the different angles a robotic X-ray system C-arm needs to assume to acquire images for a particular type of surgery (e.g. cardiac surgery) in order to perform a particular procedure. The required angles are advantageously indicated in a lookup table or database that identifies different types of procedure (e.g., a Transcatheter Aortic Valve Implantation (TAVI) procedure) and maps each procedure type to the possible angles related to that procedure.
At least one repository of information 17 associates a type of interventional medical procedure with image acquisition angles used for acquiring images during a particular type of interventional medical procedure and with positions of workers having particular roles in a procedure in an operating room and with corresponding different stages of a medical procedure. Position processor 15, in response to receiving data identifying a medical procedure type, automatically uses the information and predetermined data identifying location and dimensions of an X-ray imaging system C-arm and a patient support table in determining positions of a C-arm and workers at different stages of a medical procedure of the medical procedure type. Display processor 29 initiates generation of data representing a display image showing the determined positions of the C-arm and workers at different stages of the medical procedure.
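Purely as an illustration of how repository 17 might organize this association, a minimal sketch follows; the procedure name, stage labels, angle values, worker roles and coordinates are assumptions for illustration and are not taken from the specification.

```python
# Illustrative sketch of repository 17 contents: each interventional procedure
# type maps to ordered procedure stages, and each stage lists the C-arm image
# acquisition angles (primary/secondary angulation, degrees) and the worker
# roles with their planned positions (room grid coordinates).
# All concrete values below are assumed for illustration only.
PROCEDURE_REPOSITORY = {
    "TAVI": {
        "stages": [
            {
                "name": "stage 1 - access",
                "acquisition_angles": [(0, 0), (30, 0)],
                "workers": {"surgeon": (2, 3), "anesthesiologist": (0, 1), "nurse": (4, 2)},
            },
            {
                "name": "stage 2 - valve deployment",
                "acquisition_angles": [(45, 10), (-20, 5)],
                "workers": {"surgeon": (2, 4), "anesthesiologist": (0, 1), "nurse": (4, 3)},
            },
        ],
    },
}

def lookup_procedure(procedure_type):
    """Return the stage, angle and worker information for a procedure type."""
    return PROCEDURE_REPOSITORY[procedure_type]
```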
System 10, in response to a data entry input identifying a procedure type, automatically selects equipment needed for the procedure, including an X-ray imaging system using a C-arm, identifies the number of workers and their roles needed for the procedure, and automatically determines positions of the equipment and workers for different stages of the procedure. The system automatically determines placement and simulates movement of equipment and workers during a procedure and provides a visual image (2D or 3D) layout in different orientations (e.g. overhead and side views) identifying placement of equipment and workers at the different stages of a procedure.
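One way such automatic placement determination could be realized is sketched below: the repository information for a procedure is combined with predetermined C-arm and patient table locations to produce one placement record per stage, which a display processor could draw as an overhead layout. The function name, constants and coordinates are hypothetical.

```python
# Hypothetical sketch: combine repository information with predetermined
# C-arm and patient support table locations to produce per-stage placement
# data suitable for an overhead (2D) layout view.
CARM_LOCATION = (3, 5)    # assumed room grid coordinates of the C-arm base
TABLE_LOCATION = (3, 3)   # assumed patient support table location

def plan_positions(procedure_type, repository):
    """Return a list of per-stage placements: equipment, workers and angles."""
    plan = []
    for stage in repository[procedure_type]["stages"]:
        plan.append({
            "stage": stage["name"],
            "c_arm": CARM_LOCATION,
            "patient_table": TABLE_LOCATION,
            "workers": dict(stage["workers"]),
            "acquisition_angles": list(stage["acquisition_angles"]),
        })
    return plan

# Example: plan_positions("TAVI", PROCEDURE_REPOSITORY) using the repository
# sketch above yields one placement record per stage for display.
```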
In response to equipment placement and movement simulation, the system presents data indicating the possible angles for the procedure to a physician. The physician modifies the angles if required and confirms the possible angles to be used. The simulation system performs C-arm collision detection analysis based on the hybrid OR equipment and staff placement. If there is a possible C-arm collision, the system informs a user of a C-arm angle that cannot be achieved based on the current placement layout. The system prompts the user with a suggested different location for a highlighted object that is in a collision path, or with a different imaging angle that prevents the collision. The system 10 user interface enables the user to accept a suggestion or select a different location for the highlighted object or a different imaging angle. Collision detection processor 27 processes the user-determined device and worker placement data to ensure equipment placement in the OR is optimal.
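The prompt-and-confirm behaviour described above might be organized as in the following sketch, in which a predicted collision yields either a suggested relocation of the obstructing object or a suggested alternative imaging angle; the helper names and the simple first-free-cell search are assumptions, not the specification's method.

```python
# Hypothetical sketch of the suggestion step: when a planned C-arm angle
# collides with an object, propose either a nearby free location for that
# object or an alternative angle that avoids the obstruction.
def suggest_resolution(blocked_angle, obstructing_object, free_cells,
                       alternative_angles, is_blocked):
    """Return a suggestion for the user interface to present and confirm."""
    if free_cells:
        # Suggest relocating the highlighted object to a nearby free room cell.
        return {"kind": "relocate", "object": obstructing_object,
                "new_location": free_cells[0]}
    for angle in alternative_angles:
        if not is_blocked(angle):
            # Otherwise suggest an alternative imaging angle without a collision.
            return {"kind": "change_angle", "from": blocked_angle, "to": angle}
    return {"kind": "unresolved", "angle": blocked_angle}
```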
Collision detection processor 27 checks for obstruction in the path of C-arm movement to a particular angle based on the current location of the C-arm. Processor 27 detects an obstruction in the C-arm path of movement, highlights the object in collision with the C-arm in a 2D or 3D image presentation, and identifies a specific image acquisition angle that cannot be achieved because it results in a collision. Processor 27 repeats the obstruction checking and detection steps until all C-arm angles used for imaging of a particular procedure type have been checked and verified as valid.
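A simplified angular-sweep check of the kind processor 27 might perform is sketched below: for each required acquisition angle, the rotational path from the current angle is tested against objects lying within the C-arm's reach. The geometry is deliberately reduced to a planar approximation and all parameter names and values are assumptions.

```python
import math

# Simplified planar approximation of the obstruction check: an object blocks a
# C-arm rotation if it lies within the arm's sweep radius and its bearing from
# the C-arm isocenter falls inside the angular interval swept during the move.
def blocked_angles(current_angle, required_angles, isocenter, sweep_radius, objects):
    """Return (target_angle, object_name) pairs that cannot be reached."""
    blocked = []
    for target in required_angles:
        lo, hi = sorted((current_angle, target))
        for name, (x, y) in objects.items():
            dx, dy = x - isocenter[0], y - isocenter[1]
            distance = math.hypot(dx, dy)
            bearing = math.degrees(math.atan2(dy, dx))
            if distance <= sweep_radius and lo <= bearing <= hi:
                blocked.append((target, name))   # highlight this object to the user
                break
    return blocked

# Example (assumed values): blocked_angles(0, [45, -20], isocenter=(3, 3),
#     sweep_radius=1.5, objects={"anesthesia cart": (4.0, 3.8)})
# -> [(45, "anesthesia cart")]
```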
System 10 advantageously positions medical equipment and workers in optimal locations for surgery in a hybrid OR. The system preplans placement of equipment and medical workers to achieve the image acquisition angles needed by imaging system 25 for cardiac, vascular or neurological surgery procedures in a hybrid operating room. The system employs a lookup table associating different procedure types with different angular positions of a C-arm, with different numbers of workers having particular roles, and with different positions of workers in the OR. In one embodiment, the lookup table is populated by a user in a configuration step, in consultation with surgeons and for the particular procedures performed in the OR. Collision detection processor 27 calculates a collision zone and highlights objects (equipment/staff) that are within the collision zone in a visual display image, for example.
In the position table, E2, E3 and E4 comprise equipment items and W1, W2, W3 and W4 comprise workers. Grid cells of the table correspond to sections of the imaging room.
Processor 15 in step 209 determines candidate C-arm imaging angles associated with the imaging procedure selected in step 203, using a lookup table in repository 17 accessed in step 206. The physician in step 212 acknowledges the candidate C-arm imaging angles or modifies the angles using user interface 26. In step 214, processor 15 determines whether the C-arm imaging angles selected in step 212 result in a potential collision with a worker or equipment. Position processor 15 provides a stage 1 to stage 2 transition collision table showing collision zones where a worker or equipment may collide with the C-arm. The collision table provides a visual depiction corresponding to an imaging room, with table cells corresponding to room sections and identifying safe zones and collision zones where a worker or equipment may collide with the C-arm. A collision table for a left heart study where the C-arm moves from a parking position to stage 2 comprises, for example,
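The specification's actual table is not reproduced here. Purely as an illustration of how such a collision table might be represented, consider the sketch below; all cell values, labels and occupant positions are assumed.

```python
# Illustrative stage 1 to stage 2 transition collision table represented as a
# grid of room sections: "S" marks a safe zone, "C" marks a collision zone
# swept by the C-arm as it moves from its parking position to the stage 2
# position.  Occupants such as equipment items (E2..E4) and workers (W1..W4)
# are then checked against the "C" cells and highlighted if they fall inside.
STAGE1_TO_STAGE2_COLLISION_TABLE = [
    ["S", "S", "C", "C", "S"],
    ["S", "C", "C", "C", "S"],
    ["S", "S", "C", "S", "S"],
    ["S", "S", "S", "S", "S"],
]

def occupants_in_collision_zone(table, occupants):
    """Return the occupants (name -> (row, col)) located in collision cells."""
    return {name: cell for name, cell in occupants.items()
            if table[cell[0]][cell[1]] == "C"}

# Example: occupants_in_collision_zone(STAGE1_TO_STAGE2_COLLISION_TABLE,
#                                      {"E2": (1, 2), "W1": (3, 0)})
# -> {"E2": (1, 2)}   (equipment item E2 would be highlighted)
```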
In step 217, display 19 presents data identifying the potential collision and highlighting equipment and workers that are involved in the potential collision in a visual depiction of the imaging room. Position processor 15 in step 220 uses the position table and collision table in identifying an alternative candidate position of equipment and workers that avoids the potential collision. Display 19 presents the alternative candidate position of equipment and workers in the visual depiction of the imaging room. The physician in step 223 enters data accepting the alternative candidate position of equipment and workers or selecting a different candidate position of equipment and workers. Processor 15 in step 227 uses the position table and collision table in identifying whether the C-arm imaging angles result in a potential collision with the worker and equipment positions selected in step 223 and alerts a user via display 19 if there is a collision.
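A text-based stand-in for the visual depiction of the imaging room is sketched below: it places the C-arm ("A"), patient table ("T"), workers ("W") and a marked collision position ("X") on a room grid. An actual implementation would render graphics; the symbols and coordinates here are illustrative assumptions.

```python
def render_overhead_view(rows, cols, c_arm, table, workers, collision=None):
    """Return a simple text grid depicting an overhead view of the room."""
    grid = [["." for _ in range(cols)] for _ in range(rows)]
    grid[table[0]][table[1]] = "T"             # patient support table
    grid[c_arm[0]][c_arm[1]] = "A"             # X-ray imaging system C-arm
    for r, c in workers:
        grid[r][c] = "W"                       # worker positions for the stage
    if collision is not None:
        grid[collision[0]][collision[1]] = "X" # indicated collision position
    return "\n".join(" ".join(row) for row in grid)

print(render_overhead_view(5, 6, c_arm=(2, 2), table=(2, 3),
                           workers=[(1, 4), (3, 1)], collision=(2, 4)))
```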
Position processor 15 provides a stage 2 collision table showing collision zones where a worker or equipment may collide with the C-arm. The collision table for a left heart study in stage 2 comprises,
The process iteratively repeats from step 217 until a valid configuration is determined or the system terminates selection after a number of iterations exceeding a predetermined limit (e.g. 3 iterations).
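The iterative confirmation loop of steps 217 through 227 might be organized as in the sketch below, terminating either on a collision-free configuration or after the predetermined limit of three iterations; the callback functions stand in for the position-table and collision-table logic described above and are hypothetical names.

```python
MAX_ITERATIONS = 3   # predetermined iteration limit from the description

def plan_until_valid(layout, angles, find_collisions, propose_alternative, ask_physician):
    """Hypothetical loop over steps 217-227: present collisions, propose and
    confirm alternative placements, and re-check until valid or limit reached."""
    for _ in range(MAX_ITERATIONS):
        collisions = find_collisions(layout, angles)           # steps 214/227: re-check
        if not collisions:
            return layout                                       # valid configuration found
        alternative = propose_alternative(layout, collisions)   # step 220: suggest positions
        layout = ask_physician(alternative)                     # step 223: accept or modify
    return None                                                 # selection terminated at limit
```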
In step 920, collision detection processor 27 uses a position table and collision table in identifying and suggesting an alternative angle for image acquisition to avoid the determined potential collision. Display processor 29 in step 923 initiates generation of data representing a display image alerting a user to a determined potential collision, showing the determined positions of the X-ray imaging system C-arm, patient support table and workers at different stages of the medical procedure, and indicating a collision position. In one embodiment, the display image shows the X-ray imaging system C-arm and patient support table in an overhead view and indicates the collision position. In another embodiment, the display image includes a photographic image showing an X-ray imaging system C-arm and patient support table and indicates at least one of, (a) a safe zone and (b) a collision zone, in the display image showing the X-ray imaging system C-arm and patient support table. The process of
A processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks and may comprise any one or combination of, hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and is conditioned using executable instructions to perform special purpose functions not performed by a general purpose computer. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication there-between. Computer program instructions may be loaded onto a computer, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer or other programmable processing apparatus create means for implementing the functions specified in the block(s) of the flowchart(s). A user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display elements or portions thereof. A user interface comprises one or more display elements enabling user interaction with a processor or other device.
An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters. A graphical user interface (GUI), as used herein, comprises one or more display elements, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
The UI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the UI display images. These signals are supplied to a display device which displays the elements for viewing by the user. The executable procedure or executable application further receives signals from user input devices, such as a keyboard, mouse, light pen, touch screen or any other means allowing a user to provide data to a processor. The processor, under control of an executable procedure or executable application, manipulates the UI display elements in response to signals received from the input devices. In this way, the user interacts with the display elements using the input devices, enabling user interaction with the processor or other device. The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to executable instruction or device operation without user direct initiation of the activity. A histogram of an image is a graph that plots the number of pixels (on the y-axis herein) in the image having a specific intensity value (on the x-axis herein) against the range of available intensity values. The resultant curve is useful in evaluating image content and can be used to process the image for improved display (e.g. enhancing contrast).
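As a concrete illustration of the histogram definition above (illustrative only, not part of the claimed system), the count of pixels at each intensity value can be computed as follows.

```python
from collections import Counter

def image_histogram(pixels, levels=256):
    """Count how many pixels have each intensity value (x: intensity, y: count)."""
    counts = Counter(pixels)
    return [counts.get(value, 0) for value in range(levels)]

# Example: a tiny 8-level "image" flattened to a pixel list.
histogram = image_histogram([0, 1, 1, 3, 3, 3, 7, 7], levels=8)
# histogram == [1, 2, 0, 3, 0, 0, 0, 2]
```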
The system and processes of
This is a non-provisional application of provisional application Ser. No. 61/587,690 filed Jan. 18, 2012, by S. Kargar et al.