The present technique relates to a device, computer program and method.
The “background” description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in the background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present technique.
Stress levels for surgeons are very high during surgery. Whilst experienced surgeons manage these stress levels, trainee surgeons may suffer burnout during training. This is especially the case with surgical procedures having a slower learning rate such as endoscopy and minimally invasive surgery.
It is therefore desirable to manage stress levels of trainee surgeons during training whilst providing good quality training for the surgeon.
It is an aim of the disclosure to address this issue.
According to embodiments, there is provided a device, comprising circuitry configured to: receive surgical data indicating a parameter of a surgical instrument used during a surgical procedure and a stress related parameter of a member of a surgical team performing the surgical procedure; and associate the surgical data with the stress related parameter.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings.
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views.
The present disclosure relates to generating a training regime for surgeons that manages the stress levels of the surgeon undergoing the training. The surgeon under training may be an inexperienced surgeon or may be an experienced surgeon being trained in an unfamiliar procedure or being trained using different surgical tools or techniques. Typically, in more modern training settings, this training is carried out using a surgical simulator.
Although embodiments of the disclosure relate to a training regime for surgeons (and trainee surgeons), the disclosure is not so limited. In other embodiments, managing stress during surgery may also be applicable to other members of a surgical team, such as an anaesthetist, a member of the surgical nursing team or the like. More generally, therefore, embodiments of the disclosure relate to a member of a surgical team.
A surgical simulator is a known system that uses realistic synthetic imagery to train the surgeon. Typically, this imagery is collected from real surgical procedures or from virtually created surgical scenarios.
The disclosure is described in two parts. The first part describes the collection of surgical data indicating a parameter of a surgical instrument used during a surgical procedure and a stress related parameter of a surgeon performing the surgical procedure. The second part describes the generation of a surgical training simulation based on the surgical data and the stress related parameter collected in the first part.
<First Part>
In embodiments of the disclosure, the first and/or second part may include a computer assisted surgical system. In these embodiments, it is especially useful to manage the stress felt by a member of the surgical team, as very few members of the surgical team will initially have experience with a computer assisted surgical system, particularly in collaborative surgery with a degree of autonomy (such as semiautonomous or fully autonomous surgery, where a surgical robot will perform one or more specific tasks autonomously). Moreover, the types of stress felt by the members of the surgical team will likely be different to those felt where no robotic assistance is provided. This makes the training system particularly effective when applied to embodiments where there is an element of computer assistance with the surgery.
During this surgical procedure, the surgical data indicating a parameter of a surgical instrument used during the surgical procedure and a stress related parameter of a surgeon performing the surgical procedure will be collected. The patient 106 lies on an operating table 105 and a human surgeon 104 and a computerised surgical apparatus 103 perform the surgery together. It should be noted here that the surgeon is experienced in the procedure for which the surgical data and the stress related parameter are being collected. In addition, the human surgeon 104 provides an identifier that uniquely identifies him or her.
Each of the human surgeon and the computerised surgical apparatus monitors one or more parameters of the surgery, for example, patient data collected from one or more patient data collection apparatuses (e.g. electrocardiogram (ECG) data from an ECG monitor, blood pressure data from a blood pressure monitor, etc.—patient data collection apparatuses are known in the art and not shown or discussed in detail) and one or more parameters determined by analysing images of the surgery (captured by the surgeon's eyes or a camera 109 of the computerised surgical apparatus) or sounds of the surgery (captured by the surgeon's ears or a microphone (not shown) of the computerised surgical apparatus). Each of the human surgeon and the computerised surgical apparatus carries out respective tasks during the surgery (e.g. some tasks are carried out exclusively by the surgeon, some tasks are carried out exclusively by the computerised surgical apparatus and some tasks are carried out by both the surgeon and the computerised surgical apparatus) and makes decisions about how to carry out those tasks using the monitored one or more surgical parameters.
In addition to the parameters of the surgery described above, further surgical data is collected. The surgical data includes movement data of a surgical tool and of the surgical robot, collected from sensors located within the tool or robot or by tracking the tool or robot, and any feedback provided by that tool or robot. For example, the sensors include accelerometers, gyroscopes, encoders to measure an angle of a joint, or other sensors located within surgical tools such as forceps, tweezers, scalpels, electrodiathermy units or the surgical robot arm that indicate the motion and force of the tool. Moreover, in the example of a surgical robot which is under at least partial control of the experienced surgeon using an interface, the control data provided by the experienced surgeon is also captured.
In addition, image data from cameras showing the experienced surgeon's viewpoint and/or image data from an endoscope or a surgical microscope or an exoscope, or any surgical instrument used in the surgical procedure is captured. This image data may be RGB type image data or may be fluorescent video or the like. In other words, image data of the surgical procedure is image data obtained by the surgical instrument.
In addition, stress related parameters of the experienced surgeon are collected. These stress related parameters may be collected from sensors worn by the surgeon or from images captured of the surgeon or from other physiological parameters of the surgeon. For example, the experienced surgeon may wear a heart rate monitor, sweat analysis sensors, skin conductivity sensors, a breathing rate sensor or a blood pressure monitor. In addition, the experienced surgeon may have his or her blood composition measured regularly or continually during the surgical procedure to measure the amount of cortisol in the blood. The purpose of collecting these stress related parameters is to quantify the stress level of the experienced surgeon at any moment during the surgical procedure. These stress related parameters are captured continually during the surgical procedure.
So, during the surgical procedure, surgical data, stress related parameters and optionally image data are collected. In addition, the unique identifier associated with the experienced surgeon (and possibly the surgical robot) is collected. This information is sent by a communication interface to a network as will be explained later.
The robot 103 comprises a controller 110 and one or more surgical tools 107 (e.g. movable scalpel, clamp or robotic hand). The controller 110 is connected to the camera 109 for capturing images of the surgery, to a movable camera arm 112 for adjusting the position of the camera 109 and to adjustable surgical lighting 111 which illuminates the surgical scene and has one or more adjustable lighting parameters such as brightness and colour. For example, the adjustable surgical lighting comprises a plurality of light emitting diodes (LEDs) or laser diodes (not shown) of different respective colours. The brightness of each LED is individually adjustable (by suitable control circuitry (not shown) of the adjustable surgical lighting) to allow adjustment of the overall colour and brightness of light output by the LEDs. The controller 110 is also connected to a control apparatus 100. The control apparatus 100 is connected to another camera 108 for capturing images of the surgeon's eyes for use in gaze tracking and to an electronic display 102 (e.g. liquid crystal display or Organic Light Emitting Diode (OLED) display) held on a stand 102 so the electronic display 102 is viewable by the surgeon 104 during the surgery. The control apparatus 100 compares the visual regions of the surgical scene paid attention to by the surgeon 104 and robot 103 to help resolve conflicting surgeon and computer decisions according to the present technique.
The control apparatus 100 comprises a control interface 201 for sending electronic information to and/or receiving electronic information from the controller 110, a display interface 202 for sending electronic information representing information to be displayed to the electronic display 102, a processor 203 for processing electronic instructions, a memory 204 for storing the electronic instructions to be processed and input and output data associated with the electronic instructions, a storage medium 205 (e.g. a hard disk drive, solid state drive or the like) for long term storage of electronic information, a camera interface 206 for receiving electronic information representing images of the surgeon's eyes captured by the camera 108 and the image data noted above and a user interface 214 (e.g. comprising a touch screen, physical buttons, a voice control system or the like). Moreover, a communication interface 215 is provided that sends the parameters of the surgery, the surgical data, the image data, the stress related parameters and decision information to the network 300. Each of the control interface 201, display interface 202, processor 203, memory 204, storage medium 205, camera interface 206, user interface 214 and communication interface 215 is implemented using appropriate circuitry, for example. The processor 203 controls the operation of each of the control interface 201, display interface 202, memory 204, storage medium 205, camera interface 206 and user interface 214.
Additionally connected to the network 310 is a medical procedure server 315. As will be explained later, the medical procedure server 315 is a computer server that comprises circuitry configured to: receive surgical data indicating a parameter of a surgical instrument used during a surgical procedure and a stress related parameter of a surgeon performing the surgical procedure; and associate the surgical data with the stress related parameter. In addition, the medical procedure server 315 segments the captured image data provided by the control apparatus 100 from the surgery and also generates one or more stress level values that indicate the levels of stress of the experienced surgeon. These stress level values are derived from the stress related parameters captured during the surgery.
It should be noted that the surgical instrument includes one or more of the surgical tool and any one of the cameras in the surgical environment.
HH:MM:SS
which indicates the number of hours (HH), minutes (MM) and seconds (SS).
The experienced surgeon in the embodiments of
In the embodiments of
Of course, the image data may be used to generate a surgical simulation. In the described embodiment, for brevity, the image data is used to generate the surgical training simulation.
As noted above, in addition to the image data, in embodiments, the stress related parameters and the surgical data that are captured during surgery are also split in accordance with the image data. In other words, the stress related parameters and the surgical data associated with each section of the surgery are also defined. For brevity, however, the following disclosure will discuss only stress related parameters.
So, the stress related parameters captured between 00:00:00 and 00:09:22 are in the incision stress parameters; the stress related parameters captured between 00:09:22 and 00:15:35 are in the insertion stress parameters; the stress related parameters captured between 00:15:35 and 00:42:56 are in the removal stress parameters; and the stress related parameters captured between 00:42:56 and 00:58:22 are in the closure stress parameters.
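By way of illustration only, the following sketch shows one possible way such time-based splitting could be implemented. The section names, boundary times and sample values are taken from, or assumed to match, the example above; the data layout itself is a hypothetical choice and not mandated by the disclosure.

```python
def hms_to_seconds(hms: str) -> int:
    """Convert an HH:MM:SS timestamp into a number of seconds."""
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s

# Section boundaries taken from the example timings above.
SECTIONS = [
    ("incision",  hms_to_seconds("00:00:00"), hms_to_seconds("00:09:22")),
    ("insertion", hms_to_seconds("00:09:22"), hms_to_seconds("00:15:35")),
    ("removal",   hms_to_seconds("00:15:35"), hms_to_seconds("00:42:56")),
    ("closure",   hms_to_seconds("00:42:56"), hms_to_seconds("00:58:22")),
]

def split_by_section(samples):
    """Assign each (timestamp_seconds, value) stress sample to its section."""
    per_section = {name: [] for name, _, _ in SECTIONS}
    for t, value in samples:
        for name, start, end in SECTIONS:
            if start <= t < end:
                per_section[name].append((t, value))
                break
    return per_section

# Example: three hypothetical stress samples captured during the surgery.
samples = [(hms_to_seconds("00:05:00"), 0.4),
           (hms_to_seconds("00:20:10"), 0.7),
           (hms_to_seconds("00:50:00"), 0.3)]
print(split_by_section(samples))
```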
In addition, the medical procedure server 315 may split each of the sections into further sub-sections. This allows for increased granularity of the training regime. In the embodiments shown in
In the embodiments of
The further sub-sections may contain images relevant to other skills that surgeons require and that may form part of a training regime. For example, although the incision section relates to the experienced surgeon performing an incision into the patient's colon, the incision #2 subsection may include the surgeon performing a cauterisation to stop a bleed. This subsection may also be relevant to a training regime based upon cauterising bleeds.
As noted above, the medical procedure server 315 receives the stress related parameters from the experienced surgeon collected during the surgical procedure. These stress related parameters are used to produce a stress indication value that defines the levels of stress felt by the experienced surgeon during the surgical procedure. In other words, physiological measurements of the experienced surgeon are used to generate an indication of stress felt by the surgeon.
There are a number of different physiological measurements that are known to indicate when a person (such as the experienced surgeon) is under stress. For example, the effect of stress on various heart rate complexity measures has been investigated in NPL 1, and stress can be estimated based on the overall heart rate and changes to the variability of the heart rate. Moreover, the amount of sweat secreted by a person increases when the person is feeling stress. In addition, the sweat secreted when stressed is provided by the apocrine glands rather than by the eccrine sweat glands, which secrete sweat when a person is hot. The apocrine glands are located near dense pockets of hair follicles (such as under the arms, around the groin and on the scalp) and secrete a sweat high in fatty acids and proteins. Therefore, by placing sweat analysis sensors in areas close to the apocrine glands, the sweat (both quantity and composition) associated with stress may be analysed. Finally, and as noted above, the amount of cortisol in the blood of the surgeon indicates whether the surgeon is feeling stressed. Therefore, a rise in cortisol levels within the blood may indicate that the surgeon is feeling under stress. This can be measured by collecting blood from the surgeon continually or at least regularly during the surgical procedure.
One or more of these physiological measurements are used to generate the stress related parameter for the experienced surgeon at a given point in time during the surgical procedure. These physiological measurements may be derived from measurements captured by a wearable device. In embodiments, a single stress related parameter is determined for each section or subsection. Moreover, this single stress related parameter may be determined for each physiological measurement or may be a single stress related parameter for all physiological measurements. The single stress related parameter may be a mean or median average of the stress or may be the highest value of the stress related parameter over the entire section or sub-section.
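A minimal sketch of how a single stress related parameter per section could be derived from the physiological measurements is given below. The choice of aggregation function (mean, median or maximum) follows the options described above; the sample values are assumptions for illustration.

```python
from statistics import mean, median

def section_stress_value(samples, method="mean"):
    """Reduce the stress samples captured during one section to a single value.

    samples: list of stress values (e.g. derived from heart rate variability,
    sweat analysis or cortisol readings) captured during the section.
    method: 'mean', 'median' or 'max', as described above.
    """
    if not samples:
        raise ValueError("no stress samples for this section")
    if method == "mean":
        return mean(samples)
    if method == "median":
        return median(samples)
    if method == "max":
        return max(samples)
    raise ValueError(f"unknown aggregation method: {method}")

# Hypothetical samples for the incision section.
incision_samples = [0.42, 0.55, 0.61, 0.48]
print(section_stress_value(incision_samples, method="max"))  # 0.61
```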
Returning to
The procedure is identified in the “procedure” section. In this case, a colon polyp biopsy is noted. The duration of the image data associated with this procedure is also noted in the data structure. The naming of the procedure may be selected from a drop down menu or may be entered using free text. In the situation where free text is used to name the procedure, rules associated with a naming convention may be followed to ensure consistency in searching through the procedures.
The sections of the procedure are also noted in the data structure. Moreover, the stress related parameter associated with the section or sub-section is also stored in the data structure. The time period associated with the section or sub-section is noted in the data structure. This is useful when the data structure is interrogated to find a section or sub-section of the image data that is associated with a certain stress related parameter, certain surgical data and/or a certain surgical procedure; the relevant section and/or sub-section may then be retrieved easily.
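Purely as an illustrative sketch, such a data structure might be represented as follows. The field names, example values and the query helper are assumptions for the example; the actual data structure is that shown in the figures.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Section:
    name: str                    # e.g. "incision" or "incision #2"
    start: str                   # start of the time period, HH:MM:SS
    end: str                     # end of the time period, HH:MM:SS
    stress_value: float          # stress related parameter for the section
    subsections: List["Section"] = field(default_factory=list)

@dataclass
class ProcedureRecord:
    surgeon_id: str              # unique identifier of the experienced surgeon
    procedure: str               # e.g. "colon polyp biopsy"
    duration: str                # duration of the image data, HH:MM:SS
    image_data_id: str           # identifier of the captured image data
    sections: List[Section] = field(default_factory=list)

# Example record following the timings used above (illustrative values only).
record = ProcedureRecord(
    surgeon_id="surgeon-001",
    procedure="colon polyp biopsy",
    duration="00:58:22",
    image_data_id="1234.5678.9101",
    sections=[Section("incision", "00:00:00", "00:09:22", stress_value=0.61)],
)

def find_sections(rec: ProcedureRecord, max_stress: float) -> List[Section]:
    """Interrogate the record for sections at or below a given stress level."""
    return [s for s in rec.sections if s.stress_value <= max_stress]

print([s.name for s in find_sections(record, max_stress=0.7)])
```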
Additionally, although not shown in
To summarise the first part, according to embodiments, there is provided a device, comprising circuitry configured to: receive surgical data indicating a parameter of a surgical instrument used during a surgical procedure and a stress related parameter of a surgeon performing the surgical procedure; and associate the surgical data with the stress related parameter. This allows a training simulation to be created that tests the surgical skill of the surgeon undergoing training by providing the surgical data whilst ensuring the surgeon does not suffer burnout by managing the stress levels of the surgeon undergoing training.
As will be appreciated, the procedure of
Moreover, the mechanism for grouping the categories is not limited in the disclosure. The grouping of the sections may be achieved by reviewing the name of the section in the data structures of
In addition to the sections from the two data structures of
The “stress related parameter” value for each section is stored in the second data structure of
1234.5678.9101//00:00:00-00:09:22
which is formed from the image data ID (1234.5678.9101) and the time period during which the incision section was captured (00:00:00-00:09:22).
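A minimal sketch of how such a composite value could be built and parsed is given below; the delimiter and field layout simply follow the example value above and are not the only possibility.

```python
def make_section_key(image_data_id: str, start: str, end: str) -> str:
    """Form the composite value from the image data ID and the time period."""
    return f"{image_data_id}//{start}-{end}"

def parse_section_key(key: str):
    """Recover the image data ID and the time period from a composite value."""
    image_data_id, time_range = key.split("//")
    start, end = time_range.split("-")
    return image_data_id, start, end

key = make_section_key("1234.5678.9101", "00:00:00", "00:09:22")
print(key)                     # 1234.5678.9101//00:00:00-00:09:22
print(parse_section_key(key))  # ('1234.5678.9101', '00:00:00', '00:09:22')
```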
The purpose of the second data structure of
In embodiments, the surgical procedure is formed of a plurality of subprocedures, and the circuitry is configured to divide the surgical procedure into a plurality of sections, each section comprising the surgical data and the stress related parameter relating to a different subprocedure.
This allows the training simulator to combine relevant parts of different surgical procedures so that the surgeon undergoing training may conduct a relevant surgical procedure that has the correct stress levels.
The circuitry may be configured to receive image data of the surgical procedure and the image data of the surgical procedure is image data obtained by the surgical instrument.
The surgical data may include movement data of a robotic surgical instrument. Further, the circuitry may be configured to receive the stress related parameter from a wearable device that is worn by a member of the surgical team, such as one or more of the surgeons.
To summarise the first part, according to embodiments, there is provided a device, comprising circuitry configured to: receive surgical data indicating a parameter of a surgical instrument used during a surgical procedure and a stress related parameter of a member of a surgical team performing the surgical procedure; and associate the surgical data with the stress related parameter. This allows a training simulation to be created that tests the surgical skill of the surgeon undergoing training by providing the surgical data whilst ensuring the surgeon does not suffer burnout by managing the stress levels of the surgeon undergoing training.
<Second Part>
A training regime for a surgeon undergoing training will now be described.
Firstly, it is envisaged that the surgeon undergoing training will collect the same surgical data and stress related parameters as the experienced surgeon carrying out the surgery in
As noted above, the second part of the disclosure provides a surgical training apparatus comprising circuitry configured to: receive the surgical data and the associated stress related parameter from the device of the first part; and generate a surgical training simulation based on the surgical data and the stress related parameter.
As noted above in respect of the surgery carried out by the experienced surgeon, the procedure to remove a polyp using an endoscope in the training regime broadly comprises four sections (or subprocedures): the incision section, where an incision is performed on the patient to allow entry of the endoscope; an insertion section, where the endoscope is inserted into the patient; a biopsy section, where a biopsy of the polyp is carried out; and finally a closure section, where the endoscope is removed, the patient is closed and the surgery finished.
In the training schedule of
In the example of
In the specific embodiment of
Although the foregoing in
Although the foregoing describes one mechanism for generating a simulation, the disclosure is not so limited.
Firstly, in respect of the data structures of
So, in the example embodiments of
Although a mean value is established, the disclosure is not so limited and any kind of average value (such as the median average) is envisaged. By calculating the average value for the SRV across a plurality (which may be all or a subset) of the experienced surgeons who have carried out the procedure, a better representation of the stress levels associated with the section or sub-section for an experienced surgeon may be obtained.
The average value may then be used to calculate a stress correlation value between different sections. This allows a value to be determined that indicates how much one section induces similar levels of increased or decreased stress in different experienced surgeons, relative to the average SRV across experienced surgeons for that section or sub-section.
The stress correlation value may, in embodiments, be calculated in the following manner:
a. For each section identifier, a vector is formed containing, for each experienced surgeon who has carried out that section or sub-section, that surgeon's stress related value (SRV) relative to the average SRV for that section or sub-section.
b. Across all the section identifiers, a correlation analysis is performed using these vectors to determine the similarity of these vectors.
c. This process yields, for each pair of section identifiers, a Section Correlation Link Value ranging from 0 to 1.
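A minimal sketch of one way this calculation could be implemented is given below. The per-surgeon stress related values are assumed inputs, and Pearson correlation mapped from the range [-1, 1] onto [0, 1] is used purely as one plausible choice of correlation analysis; the disclosure does not mandate this particular measure.

```python
from itertools import combinations
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length vectors."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

# Hypothetical stress related values (SRVs): section identifier -> one value
# per experienced surgeon (same surgeon order in every vector).
srv_by_section = {
    "incision":  [0.60, 0.55, 0.70],
    "insertion": [0.40, 0.35, 0.50],
    "biopsy":    [0.80, 0.85, 0.75],
}

def section_correlation_link_values(srvs):
    """For each pair of section identifiers, compute a value in the range 0 to 1."""
    sclv = {}
    for a, b in combinations(srvs, 2):
        r = pearson(srvs[a], srvs[b])
        sclv[(a, b)] = (r + 1.0) / 2.0  # map [-1, 1] onto [0, 1]
    return sclv

print(section_correlation_link_values(srv_by_section))
```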
Referring to
Referring to
Referring to
a. A first virtual training program is constructed using the training graph, by choosing an initial level of section average stress value as a threshold value. All the nodes to the left of that threshold line are marked as ‘in training’. Virtual surgery simulations are then generated by the surgical simulation server, which will play through those section identifiers which are ‘in training’ while avoiding those section identifiers which are not ‘in training’. Each simulation generates the average stress level for each point in the simulation.
b. When a surgeon undergoing training performs these virtual surgical training sessions, the same sensors are used to collect stress related parameters, but in this case for the surgeon undergoing training. The levels of stress in this trainee simulation data are compared to the stress related parameters derived from the experienced surgeon. When the stress levels of the surgeon undergoing training within the section with a specific section label are close enough to the stress related parameters of the experienced surgeon, and the surgical data is similar to that of the experienced surgeon (so the quality of the surgery is satisfactory), then that node in the graph is labelled as ‘mastered’.
c. For each node that is labelled as ‘mastered’, nodes to the right in the Training Graph (those with a higher average stress level) are examined to determine if they should be ‘in training’. The determination takes the average stress level of the mastered section and of the sections linked to it, and the Section Correlation Link Value (SCLV) between them (and possibly the current average stress levels of the surgeon undergoing training for that section), and uses them to calculate whether it should return ‘True’ or ‘False’. If ‘True’, the node linked to the ‘mastered’ node is set as ‘in training’. For example, a node which has a high SCLV and whose average stress level for that section is not too much higher would return ‘True’.
d. A subsequent training program of virtual surgery simulations is then generated by the surgical simulator, which will play through those section identifier nodes which are ‘in training’ while avoiding those section identifier nodes which are not ‘in training’. The probability of the use of section identifier nodes which are marked as ‘mastered’ is reduced so that the training concentrates on new sections which are ‘in training’ but not ‘mastered’.
e. This process of ‘mastering’ certain sections and extending the sections which are ‘in training’ then continues until the trainee has ‘mastered’ all the sections considered to be important in their training.
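The following sketch illustrates, under simplifying assumptions, the progression logic described in items a to e above: section nodes are ordered by their average stress level, nodes below an initial threshold start ‘in training’, and a node becomes eligible for training once a linked node is ‘mastered’, its SCLV to that node is high and its average stress level is not too much higher. The graph, threshold values and stress levels are hypothetical examples only.

```python
# Hypothetical training graph: each node is a section identifier with an
# average stress level; links carry a Section Correlation Link Value (SCLV).
avg_stress = {"incision": 0.3, "insertion": 0.4, "biopsy": 0.7, "closure": 0.5}
sclv = {("incision", "insertion"): 0.9,
        ("insertion", "closure"): 0.8,
        ("closure", "biopsy"): 0.85}

def linked(node):
    """Neighbouring nodes of `node` in the (undirected) training graph."""
    out = set()
    for a, b in sclv:
        if a == node:
            out.add(b)
        elif b == node:
            out.add(a)
    return out

def link_value(a, b):
    return sclv.get((a, b), sclv.get((b, a), 0.0))

# a. Initial training program: nodes below the threshold are 'in training'.
initial_threshold = 0.45
status = {n: ("in training" if s <= initial_threshold else "untrained")
          for n, s in avg_stress.items()}

def should_open(candidate, mastered, min_sclv=0.75, max_stress_step=0.25):
    """c. Decide whether a node linked to a mastered node becomes 'in training'."""
    close_in_stress = avg_stress[candidate] - avg_stress[mastered] <= max_stress_step
    return link_value(candidate, mastered) >= min_sclv and close_in_stress

def mark_mastered(node):
    """b./c. Mark a node as mastered and open up suitable linked nodes."""
    status[node] = "mastered"
    for neighbour in linked(node):
        if status[neighbour] == "untrained" and should_open(neighbour, node):
            status[neighbour] = "in training"

mark_mastered("insertion")   # trainee stress and surgical data deemed satisfactory
print(status)                # 'closure' is now 'in training'; 'biopsy' is not yet
```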
Further, the control unit 100 and a simulation delivery server 1004 are connected to the network 1002. The simulation delivery server 1004 may include circuitry 1004A that is configured to perform embodiments of the first part described above. The circuitry 1004A may include processing circuitry such as a microprocessor that uses software stored in a storage medium (not shown) to operate. Communication circuitry may also be provided within the circuitry 1004A to communicate with the network 1002.
In summary, the second part discloses a surgical training apparatus comprising circuitry configured to: receive the surgical data and the associated stress related parameter from the device of any one of the embodiments of the first part; and generate a surgical training simulation based on the surgical data and the stress related parameter.
As noted above, by using the surgical data and the associated stress related parameter, a surgeon undergoing training can master the skills of becoming a surgeon without risking burnout by managing the stress levels of the surgeon undergoing training.
The circuitry may be further configured to: receive a second stress related parameter from the surgeon using the surgical training apparatus; and generate the surgical training simulation based on the second stress related parameter.
The circuitry may be further configured to: generate the surgical training simulation based on a value of the stress related parameter that is higher than the second stress related parameter. This allows the surgeon undergoing training to progress through his or her training to more complex matters where stress levels increase in a more gradual manner. This reduces the risk of burnout.
The circuitry may be configured to: control a display to display the generated surgical training simulation.
The circuitry may be configured to control the display to display a training graph defining the surgical training simulation.
The surgeon controls the one or more surgeon-controlled arms 1101 using a master console 1104. The master console includes a master controller 1105. The master controller 1105 includes one or more force sensors 1106 (e.g. torque sensors), one or more rotation sensors 1107 (e.g. encoders) and one or more actuators 1108. The master console includes an arm (not shown) including one or more joints and an operation portion. The operation portion can be grasped by the surgeon and moved to cause movement of the arm about the one or more joints. The one or more force sensors 1106 detect a force provided by the surgeon on the operation portion of the arm about the one or more joints. The one or more rotation sensors detect a rotation angle of the one or more joints of the arm. The actuator 1108 drives the arm about the one or more joints to allow the arm to provide haptic feedback to the surgeon. The master console includes a natural user interface (NUI) input/output for receiving input information from and providing output information to the surgeon. The NUI input/output includes the arm (which the surgeon moves to provide input information and which provides haptic feedback to the surgeon as output information). The NUI input/output may also include voice input, line of sight input and/or gesture input, for example. The master console includes the electronic display 1110 for outputting images captured by the imaging device 1102.
The master console 1104 communicates with each of the autonomous arm 1100 and one or more surgeon-controlled arms 1101 via a robotic control system 1111. The robotic control system is connected to the master console 1104, autonomous arm 1100 and one or more surgeon-controlled arms 1101 by wired or wireless connections 1123, 1124 and 1125. The connections 1123, 1124 and 1125 allow the exchange of wired or wireless signals between the master console, autonomous arm and one or more surgeon-controlled arms.
The robotic control system includes a control processor 1112 and a database 1113. The control processor 1112 processes signals received from the one or more force sensors 1106 and one or more rotation sensors 1107 and outputs control signals in response to which one or more actuators 1116 drive the one or more surgeon controlled arms 1101. In this way, movement of the operation portion of the master console 1104 causes corresponding movement of the one or more surgeon controlled arms.
The control processor 1112 also outputs control signals in response to which one or more actuators 1116 drive the autonomous arm 1100. The control signals output to the autonomous arm are determined by the control processor 1112 in response to signals received from one or more of the master console 1104, one or more surgeon-controlled arms 1101, autonomous arm 1100 and any other signal sources (not shown). The received signals are signals which indicate an appropriate position of the autonomous arm for images with an appropriate view to be captured by the imaging device 1102. The database 1113 stores values of the received signals and corresponding positions of the autonomous arm.
For example, for a given combination of values of signals received from the one or more force sensors 1106 and rotation sensors 1107 of the master controller (which, in turn, indicate the corresponding movement of the one or more surgeon-controlled arms 1101), a corresponding position of the autonomous arm 1100 is set so that images captured by the imaging device 1102 are not occluded by the one or more surgeon-controlled arms 1101.
As another example, if signals output by one or more force sensors 1117 (e.g. torque sensors) of the autonomous arm indicate the autonomous arm is experiencing resistance (e.g. due to an obstacle in the autonomous arm's path), a corresponding position of the autonomous arm is set so that images are captured by the imaging device 1102 from an alternative view (e.g. one which allows the autonomous arm to move along an alternative path not involving the obstacle).
It will be appreciated there may be other types of received signals which indicate an appropriate position of the autonomous arm.
The control processor 1112 looks up the values of the received signals in the database 1113 and retrieves information indicating the corresponding position of the autonomous arm 1100. This information is then processed to generate further signals in response to which the actuators 1116 of the autonomous arm cause the autonomous arm to move to the indicated position.
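A minimal sketch of this look-up behaviour is given below. The quantisation of the sensor readings, the database keys and the stored positions are entirely hypothetical values chosen for illustration.

```python
# Hypothetical database mapping quantised sensor readings of the
# surgeon-controlled arms to a stored autonomous-arm (camera) position.
position_db = {
    ("low_force", "joint_0deg"):   {"x": 0.10, "y": 0.25, "z": 0.40},
    ("low_force", "joint_30deg"):  {"x": 0.05, "y": 0.30, "z": 0.45},
    ("high_force", "joint_30deg"): {"x": 0.00, "y": 0.35, "z": 0.50},
}

def quantise(force_nm: float, angle_deg: float):
    """Reduce raw force/rotation sensor readings to database keys."""
    force_key = "high_force" if force_nm > 2.0 else "low_force"
    angle_key = "joint_30deg" if angle_deg >= 15.0 else "joint_0deg"
    return force_key, angle_key

def autonomous_arm_target(force_nm: float, angle_deg: float):
    """Look up the stored autonomous arm position for the received signals."""
    key = quantise(force_nm, angle_deg)
    return position_db.get(key)

# The actuators would then be driven towards the retrieved position.
print(autonomous_arm_target(force_nm=1.2, angle_deg=28.0))
```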
Each of the autonomous arm 1100 and one or more surgeon-controlled arms 1101 includes an arm unit 1114. The arm unit includes an arm (not shown), a control unit 1115, one or more actuators 1116 and one or more force sensors 1117 (e.g. torque sensors). The arm includes one or more links and joints to allow movement of the arm. The control unit 1115 sends signals to and receives signals from the robotic control system 1111.
In response to signals received from the robotic control system, the control unit 1115 controls the one or more actuators 1116 to drive the arm about the one or more joints to move it to an appropriate position. For the one or more surgeon-controlled arms 1101, the received signals are generated by the robotic control system based on signals received from the master console 1104 (e.g. by the surgeon controlling the arm of the master console). For the autonomous arm 1100, the received signals are generated by the robotic control system looking up suitable autonomous arm position information in the database 1113.
In response to signals output by the one or more force sensors 1117 about the one or more joints, the control unit 1115 outputs signals to the robotic control system. For example, this allows the robotic control system to send signals indicative of resistance experienced by the one or more surgeon-controlled arms 1101 to the master console 1104 to provide corresponding haptic feedback to the surgeon (e.g. so that a resistance experienced by the one or more surgeon-controlled arms results in the actuators 1108 of the master console causing a corresponding resistance in the arm of the master console). As another example, this allows the robotic control system to look up suitable autonomous arm position information in the database 1113 (e.g. to find an alternative position of the autonomous arm if the one or more force sensors 1117 indicate an obstacle is in the path of the autonomous arm).
The imaging device 1102 of the autonomous arm 1100 includes a camera control unit 1118 and an imaging unit 1119. The camera control unit controls the imaging unit to capture images and controls various parameters of the captured image such as zoom level, exposure value, white balance and the like. The imaging unit captures images of the surgical scene. The imaging unit includes all components necessary for capturing images including one or more lenses and an image sensor (not shown). The view of the surgical scene from which images are captured depends on the position of the autonomous arm.
The surgical device 1103 of the one or more surgeon-controlled arms includes a device control unit 1120, manipulator 1121 (e.g. including one or more motors and/or actuators) and one or more force sensors 1122 (e.g. torque sensors).
The device control unit 1120 controls the manipulator to perform a physical action (e.g. a cutting action when the surgical device 1103 is a cutting tool) in response to signals received from the robotic control system 1111. The signals are generated by the robotic control system in response to signals received from the master console 1104 which are generated by the surgeon inputting information to the NUI input/output 1109 to control the surgical device. For example, the NUI input/output includes one or more buttons or levers comprised as part of the operation portion of the arm of the master console which are operable by the surgeon to cause the surgical device to perform a predetermined action (e.g. turning an electric blade on or off when the surgical device is a cutting tool).
The device control unit 1120 also receives signals from the one or more force sensors 1122. In response to the received signals, the device control unit provides corresponding signals to the robotic control system 1111 which, in turn, provides corresponding signals to the master console 1104. The master console provides haptic feedback to the surgeon via the NUI input/output 1109. The surgeon therefore receives haptic feedback from the surgical device 1103 as well as from the one or more surgeon-controlled arms 1101. For example, when the surgical device is a cutting tool, the haptic feedback involves the button or lever which operates the cutting tool to give greater resistance to operation when the signals from the one or more force sensors 1122 indicate a greater force on the cutting tool (as occurs when cutting through a harder material, e.g. bone) and to give lesser resistance to operation when the signals from the one or more force sensors 1122 indicate a lesser force on the cutting tool (as occurs when cutting through a softer material, e.g. muscle). The NUI input/output 1109 includes one or more suitable motors, actuators or the like to provide the haptic feedback in response to signals received from the robot control system 1111.
The master-slave system 1126 is the same as
The computerised surgical apparatus 1200 includes a robotic control system 1201 and a tool holder arm apparatus 1210. The tool holder arm apparatus 1210 includes an arm unit 1204 and a surgical device 1208. The arm unit includes an arm (not shown), a control unit 1205, one or more actuators 1206 and one or more force sensors 1207 (e.g. torque sensors). The arm includes one or more joints to allow movement of the arm. The tool holder arm apparatus 1210 sends signals to and receives signals from the robotic control system 1201 via a wired or wireless connection 1211. The robotic control system 1201 includes a control processor 1202 and a database 1203. Although shown as a separate robotic control system, the robotic control system 1201 and the robotic control system 1111 may be one and the same. The surgical device 1208 has the same components as the surgical device 1103. These are not shown in
In response to control signals received from the robotic control system 1201, the control unit 1205 controls the one or more actuators 1206 to drive the arm about the one or more joints to move it to an appropriate position. The operation of the surgical device 1208 is also controlled by control signals received from the robotic control system 1201. The control signals are generated by the control processor 1202 in response to signals received from one or more of the arm unit 1204, surgical device 1208 and any other signal sources (not shown). The other signal sources may include an imaging device (e.g. imaging device 1102 of the master-slave system 1126) which captures images of the surgical scene. The values of the signals received by the control processor 1202 are compared to signal values stored in the database 1203 along with corresponding arm position and/or surgical device operation state information. The control processor 1202 retrieves from the database 1203 arm position and/or surgical device operation state information associated with the values of the received signals. The control processor 1202 then generates the control signals to be transmitted to the control unit 1205 and surgical device 1208 using the retrieved arm position and/or surgical device operation state information.
For example, if signals received from an imaging device which captures images of the surgical scene indicate a predetermined surgical scenario (e.g. via neural network image classification process or the like), the predetermined surgical scenario is looked up in the database 1203 and arm position information and/or surgical device operation state information associated with the predetermined surgical scenario is retrieved from the database. As another example, if signals indicate a value of resistance measured by the one or more force sensors 1207 about the one or more joints of the arm unit 1204, the value of resistance is looked up in the database 1203 and arm position information and/or surgical device operation state information associated with the value of resistance is retrieved from the database (e.g. to allow the position of the arm to be changed to an alternative position if an increased resistance corresponds to an obstacle in the arm's path). In either case, the control processor 1202 then sends signals to the control unit 1205 to control the one or more actuators 1206 to change the position of the arm to that indicated by the retrieved arm position information and/or signals to the surgical device 1208 to control the surgical device 1208 to enter an operation state indicated by the retrieved operation state information (e.g. turning an electric blade to an “on” state or “off” state if the surgical device 1208 is a cutting tool).
The computer assisted medical scope system 1300 also includes a robotic control system 1302 for controlling the autonomous arm 1100. The robotic control system 1302 includes a control processor 1303 and a database 1304. Wired or wireless signals are exchanged between the robotic control system 1302 and autonomous arm 1100 via connection 1301.
In response to control signals received from the robotic control system 1302, the control unit 1115 controls the one or more actuators 1116 to drive the autonomous arm 1100 to move it to an appropriate position for images with an appropriate view to be captured by the imaging device 1102. The control signals are generated by the control processor 1303 in response to signals received from one or more of the arm unit 1114, imaging device 1102 and any other signal sources (not shown). The values of the signals received by the control processor 1303 are compared to signal values stored in the database 1304 along with corresponding arm position information. The control processor 1303 retrieves from the database 1304 arm position information associated with the values of the received signals. The control processor 1303 then generates the control signals to be transmitted to the control unit 1115 using the retrieved arm position information.
For example, if signals received from the imaging device 1102 indicate a predetermined surgical scenario (e.g. via a neural network image classification process or the like), the predetermined surgical scenario is looked up in the database 1304 and arm position information associated with the predetermined surgical scenario is retrieved from the database. As another example, if signals indicate a value of resistance measured by the one or more force sensors 1117 of the arm unit 1114, the value of resistance is looked up in the database 1304 and arm position information associated with the value of resistance is retrieved from the database (e.g. to allow the position of the arm to be changed to an alternative position if an increased resistance corresponds to an obstacle in the arm's path). In either case, the control processor 1303 then sends signals to the control unit 1115 to control the one or more actuators 1116 to change the position of the arm to that indicated by the retrieved arm position information.
The autonomous arms 1100 and 1210 perform at least a part of the surgery completely autonomously (e.g. when the system 1400 is an open surgery system). The robotic control system 1408 controls the autonomous arms 1100 and 1210 to perform predetermined actions during the surgery based on input information indicative of the current stage of the surgery and/or events happening in the surgery. For example, the input information includes images captured by the image capture device 1102. The input information may also include sounds captured by a microphone (not shown), detection of in-use surgical instruments based on motion sensors comprised with the surgical instruments (not shown) and/or any other suitable input information.
The input information is analysed using a suitable machine learning (ML) algorithm (e.g. a suitable artificial neural network) implemented by machine learning based surgery planning apparatus 1402. The planning apparatus 1402 includes a machine learning processor 1403, a machine learning database 1404 and a trainer 1405.
The machine learning database 1404 includes information indicating classifications of surgical stages (e.g. making an incision, removing an organ or applying stitches) and/or surgical events (e.g. a bleed or a patient parameter falling outside a predetermined range) and input information known in advance to correspond to those classifications (e.g. one or more images captured by the imaging device 1102 during each classified surgical stage and/or surgical event). The machine learning database 1404 is populated during a training phase by providing information indicating each classification and corresponding input information to the trainer 1405. The trainer 1405 then uses this information to train the machine learning algorithm (e.g. by using the information to determine suitable artificial neural network parameters). The machine learning algorithm is implemented by the machine learning processor 1403.
Once trained, previously unseen input information (e.g. newly captured images of a surgical scene) can be classified by the machine learning algorithm to determine a surgical stage and/or surgical event associated with that input information. The machine learning database also includes action information indicating the actions to be undertaken by each of the autonomous arms 1100 and 1210 in response to each surgical stage and/or surgical event stored in the machine learning database (e.g. controlling the autonomous arm 1210 to make the incision at the relevant location for the surgical stage “making an incision” and controlling the autonomous arm 1210 to perform an appropriate cauterisation for the surgical event “bleed”). The machine learning based surgery planner 1402 is therefore able to determine the relevant action to be taken by the autonomous arms 1100 and/or 1210 in response to the surgical stage and/or surgical event classification output by the machine learning algorithm. Information indicating the relevant action is provided to the robotic control system 1408 which, in turn, provides signals to the autonomous arms 1100 and/or 1210 to cause the relevant action to be performed.
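The following sketch illustrates the mapping from a classified surgical stage or event to actions for the autonomous arms. The classifier is stubbed out, and the class labels, action descriptions and arm identifiers are assumptions made for the example rather than a definitive implementation.

```python
from typing import Dict, List

# Action information: classification label -> actions per autonomous arm.
ACTION_TABLE: Dict[str, List[str]] = {
    "making an incision": ["arm_1210: make incision at planned location"],
    "bleed":              ["arm_1210: cauterise bleed",
                           "arm_1100: move imaging device to view bleed site"],
    "applying stitches":  ["arm_1100: zoom imaging device on suture site"],
}

def classify(image) -> str:
    """Stand-in for the trained machine learning classifier.

    In practice this would run the trained network on the captured image and
    return a surgical stage or surgical event label.
    """
    return "bleed"  # hypothetical classification of the current frame

def plan_actions(image) -> List[str]:
    """Classify the input information and look up the corresponding actions."""
    label = classify(image)
    return ACTION_TABLE.get(label, ["no action required"])

# The resulting action list would be passed to the robotic control system.
print(plan_actions(image=None))
```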
The planning apparatus 1402 may be included within a control unit 1401 with the robotic control system 1408, thereby allowing direct electronic communication between the planning apparatus 1402 and robotic control system 1408. Alternatively or in addition, the robotic control system 1408 may receive signals from other devices 1407 over a communications network 1405 (e.g. the internet). This allows the autonomous arms 1100 and 1210 to be remotely controlled based on processing carried out by these other devices 1407. In an example, the devices 1407 are cloud servers with sufficient processing power to quickly implement complex machine learning algorithms, thereby arriving at more reliable surgical stage and/or surgical event classifications. Different machine learning algorithms may be implemented by different respective devices 1407 using the same training data stored in an external (e.g. cloud based) machine learning database 1406 accessible by each of the devices. Each device 1407 therefore does not need its own machine learning database (like machine learning database 1404 of planning apparatus 1402) and the training data can be updated and made available to all devices 1407 centrally. Each of the devices 1407 still includes a trainer (like trainer 1405) and machine learning processor (like machine learning processor 1403) to implement its respective machine learning algorithm.
The arm unit 1114 includes a base 710 and an arm 720 extending from the base 710. The arm 720 includes a plurality of active joints 721a to 721f and supports the endoscope 1102 at a distal end of the arm 720. The links 722a to 722f are substantially rod-shaped members. Ends of the plurality of links 722a to 722f are connected to each other by the active joints 721a to 721f, a passive slide mechanism 724 and a passive joint 726. The base 710 acts as a fulcrum so that an arm shape extends from the base 710.
A position and a posture of the endoscope 1102 are controlled by driving and controlling actuators provided in the active joints 721a to 721f of the arm 720. According to this example, a distal end of the endoscope 1102 is caused to enter a patient's body cavity, which is a treatment site, and captures an image of the treatment site. However, the endoscope 1102 may instead be another device such as another imaging device or a surgical device. More generally, a device held at the end of the arm 720 is referred to as a distal unit or distal device.
Here, the arm unit 1114 is described by defining coordinate axes as follows. Furthermore, a vertical direction, a longitudinal direction, and a horizontal direction are defined according to the coordinate axes. In other words, a vertical direction with respect to the base 710 installed on the floor surface is defined as a z-axis direction and the vertical direction. Furthermore, a direction orthogonal to the z axis in which the arm 720 extends from the base 710 (in other words, a direction in which the endoscope 1102 is positioned with respect to the base 710) is defined as a y-axis direction and the longitudinal direction. Moreover, a direction orthogonal to the y axis and the z axis is defined as an x-axis direction and the horizontal direction.
The active joints 721a to 721f connect the links to each other so as to be rotatable. The active joints 721a to 721f each have an actuator and a rotation mechanism that is driven to rotate about a predetermined rotation axis by the actuator. As the rotational drive of each of the active joints 721a to 721f is controlled, it is possible to control the drive of the arm 720, for example, to extend or contract (fold) the arm 720.
The passive slide mechanism 724 is an aspect of a passive form change mechanism, and connects the link 722c and the link 722d to each other so as to be movable forward and rearward along a predetermined direction. The passive slide mechanism 724 is operated to move forward and rearward by, for example, a user, and a distance between the active joint 721c at one end side of the link 722c and the passive joint 726 is variable. With this configuration, the whole form of the arm 720 can be changed.
The passive joint 726 is an aspect of the passive form change mechanism, and connects the link 722d and the link 722e to each other so as to be rotatable. The passive joint 726 is operated to rotate by, for example, the user, and an angle formed between the link 722d and the link 722e is variable. With this configuration, the whole form of the arm 720 can be changed.
In an embodiment, the arm unit 1114 has the six active joints 721a to 721f, and six degrees of freedom are realized regarding the drive of the arm 720. That is, the passive slide mechanism 724 and the passive joint 726 are not objects to be subjected to the drive control, while the drive control of the arm unit 1114 is realized by the drive control of the six active joints 721a to 721f.
Specifically, the active joints 721a, 721d, and 721f are provided so as to have each long axis direction of the connected links 722a and 722e and a capturing direction of the connected endoscope 1102 as a rotational axis direction. The active joints 721b, 721c, and 721e are provided so as to have the x-axis direction, which is a direction in which a connection angle of each of the connected links 722a to 722c, 722e, and 722f and the endoscope 1102 is changed within a y-z plane (a plane defined by the y axis and the z axis), as a rotation axis direction. In this manner, the active joints 721a, 721d, and 721f have a function of performing so-called yawing, and the active joints 721b, 721c, and 721e have a function of performing so-called pitching.
Since the six degrees of freedom are realized with respect to the drive of the arm 720 in the arm unit 1114, the endoscope 1102 can be freely moved within a movable range of the arm 720. A hemisphere is an example of the movable range of the endoscope 1102. Assuming that a central point RCM (remote centre of motion) of the hemisphere is a capturing centre of a treatment site captured by the endoscope 1102, it is possible to capture the treatment site from various angles by moving the endoscope 1102 on a spherical surface of the hemisphere in a state where the capturing centre of the endoscope 1102 is fixed at the centre point of the hemisphere.
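As a purely geometric illustration of moving the endoscope over such a hemisphere while keeping the capturing centre fixed at the RCM, the following sketch computes a camera position and its viewing direction from two spherical angles. The radius and angle values are arbitrary examples and are not derived from the disclosure.

```python
import math

def endoscope_pose(rcm, radius, azimuth_deg, elevation_deg):
    """Position on the hemisphere of given radius centred on the RCM point.

    The viewing direction always points from the computed position back
    towards the RCM, so the capturing centre stays fixed while the angle
    of view changes.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)   # 0 deg = level with the RCM, 90 deg = directly above
    x = rcm[0] + radius * math.cos(el) * math.cos(az)
    y = rcm[1] + radius * math.cos(el) * math.sin(az)
    z = rcm[2] + radius * math.sin(el)  # upper hemisphere (above the RCM)
    position = (x, y, z)
    direction = tuple((c - p) / radius for c, p in zip(rcm, position))  # unit vector
    return position, direction

# Example: treatment site (RCM) at the origin, endoscope 0.1 m away.
pos, view_dir = endoscope_pose(rcm=(0.0, 0.0, 0.0), radius=0.1,
                               azimuth_deg=45.0, elevation_deg=60.0)
print(pos, view_dir)
```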
Numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practiced otherwise than as specifically described herein.
In so far as embodiments of the disclosure have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non-transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present disclosure.
It will be appreciated that the above description for clarity has described embodiments with reference to different functional units, circuitry and/or processors. However, it will be apparent that any suitable distribution of functionality between different functional units, circuitry and/or processors may be used without detracting from the embodiments.
Described embodiments may be implemented in any suitable form including hardware, software, firmware or any combination of these. Described embodiments may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of any embodiment may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the disclosed embodiments may be implemented in a single unit or may be physically and functionally distributed between different units, circuitry and/or processors.
Although the present disclosure has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in any manner suitable to implement the technique.