The present disclosure relates to an information processing apparatus, an information processing system, and an information processing method.
In surgery using a surgical support device such as a surgical robot, there is the problem of determining the operating room layout in which the surgical robot should be arranged. Determining the operating room layout is cumbersome and time consuming. One reason is that surgical procedures and patient body shapes vary, and the operating room size and the machines on hand vary from hospital to hospital; that is, the surgical environment is not reproducible. Another reason is the restriction that the layout cannot be determined freely because a plurality of other devices must also be arranged.
A number of techniques for supporting robot arrangement have been proposed. However, in order to automatically calculate a robot arrangement, it is necessary to accurately grasp the operating room environment, such as the positions, sizes, and arrangement restrictions of people and other machines. Since the appropriate arrangement of people and other machines varies depending on the surgical method, the operating room layout needs to be determined by actually incorporating the knowledge and judgment of medical staff such as doctors and nurses.
At present, a system including an operator console (master device) and a robot arm cart (slave device) that supports a surgical tool or a camera (endoscope, microscope, etc.) is widely used as a surgical robot. However, if an installer, such as a nurse, does not understand the overall range of motion of the surgical robot system (in particular, of the robot arm cart), appropriate arrangement and effective use of the range of motion are not possible. Furthermore, even in a case where a doctor directly arranges the robot arm cart, it is necessary to understand the range of motion before doing so. In addition, not only the surgical robot described above but also a surgical support device such as an articulated arm robot holding an endoscope has a similar problem in that it is important to arrange the surgical support device in consideration of its range of motion.
Patent Document 1 described below discloses a technique for automatically calculating a position of a robot, but it requires the patient, the surgical procedure, and the surrounding environment to be accurately input into or recognized by a system. Patent Document 2 described below discloses a system that supports arrangement of a robot in an operating room, but it requires device tracking, and manual fine adjustment is difficult because the range of motion of the robot is not indicated.
The present disclosure provides an information processing apparatus, an information processing system, and an information processing method that support easy arrangement of a robot.
An information processing apparatus of the present disclosure includes:
a projection image generating section which generates a projection image that projects information specifying a range of motion of a target part of a medical arm onto the range of motion on the basis of first information regarding the range of motion and second information regarding a position and a posture of a projection device that projects an image in an operating room; and
an output instruction section that outputs a projection instruction for the projection image to the projection device.
An information processing system of the present disclosure includes:
a projection device which projects an image in an operating room;
a projection image generating section which generates a projection image that projects information specifying a range of motion of a target part of a medical arm onto the range of motion on the basis of first information regarding the range of motion and second information regarding a position and a posture of the projection device; and
an output instruction section which outputs a projection instruction for the projection image to the projection device.
An information processing method of the present disclosure
generates a projection image that projects information specifying a range of motion of a target part of a medical arm onto the range of motion on the basis of first information regarding the range of motion and second information regarding a position and a posture of a projection device that projects an image in an operating room, and
projects the projection image by the projection device.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. In one or more embodiments illustrated in the present disclosure, elements included in each embodiment can be combined with each other, and the combined result also forms a part of the embodiments shown in the present disclosure.
The robot 101 has a distal end portion 111 that performs an operation on an operation target, a medical arm (robot arm, articulated arm) 121 that supports the distal end portion 111 at a distal end, and a base 131 that supports a proximal end of the arm 121. The distal end portion 111 is an example of a movable target part in a medical arm.
Examples of the distal end portion 111 include a microscope section for enlarging and observing an observation target, an imaging device (camera, etc.) for capturing an observation target, a projection device (projector), an endoscope, forceps, tweezers, a pneumoperitoneum tube for pneumoperitoneum, and an energy treatment tool for performing incision of a tissue or sealing of a blood vessel by cauterization. The surgical system 100 may include a plurality of arms, for example, and each arm may be configured to have a different distal end portion 111. For example, the arms may be configured as an arm holding an imaging device, an arm holding forceps or tweezers, an arm having an energy treatment tool, or the like. Examples of the observation target include an observation part of a subject, specifically, a surgical site of a patient. The distal end portion 111 may include a plurality of the items listed here. By supporting an item using the distal end portion 111, the position of the item can be more stably fixed and a burden on the medical staff can be reduced as compared with a case where the medical staff manually supports the item.
One end of the arm 121 is attached to the base 131 such that the arm 121 extends from the base 131. The base 131 may be movable by a user on a floor surface using wheels attached to a lower portion. The user can fix the position of the robot by operating an unillustrated brake. The height of the arm 121 may be adjustable relative to the base 131.
The arm 121 includes a plurality of links 122A, 122B, 122C, 122D, and 122E and a plurality of joints 123A, 123B, 123C, and 123D coupling the links 122A to 122E. The plurality of links 122A to 122E is mutually rotatable by the plurality of joints 123A to 123D. The distal end portion 111 is coupled to a distal end of the link 122E. When the distal end portion 111 is supported by the arm 121, the position and posture of the distal end portion 111 are controlled and stably fixed.
In the diagram, the configuration of the arm 121 is illustrated in simplified form. In actuality, the shape, number, and arrangement of the joints 123A to 123D and the links 122A to 122E, the direction of the rotation axis of each of the joints 123A to 123D, the rotation or linear-movement drive mechanisms, and the like may be set appropriately so that the arm 121 has a desired degree of freedom. For example, the arm 121 may suitably be configured to have six or more degrees of freedom. The distal end portion 111 can therefore be moved freely within the range of motion of the arm 121.
The link 122D is provided with a projection device (projector) 141 that projects an image. The projection device 141 projects an image on the basis of a projection image provided from the control device 201. The projection device 141 may be coupled to the link 122D such that the projection direction of the projection device 141 can be rotated with a desired degree of freedom. Alternatively, the projection device 141 may be fixed to the link 122D such that the projection device 141 projects only in a specific direction. In a case where the projection device 141 is rotatable with a desired degree of freedom, the posture of the projection device 141 relative to the link 122D may be controllable by the control device 201. Parameters such as a focal length and a zoom magnification of the projection device 141 can also be controlled by the control device 201. The projection device 141 may be movable along the link 122D. In this case, the position of the projection device 141 on the link 122D may be controllable by the control device 201, or the position of the projection device 141 may be manually adjustable by the user. Examples of the target (projection target) on which the projection device 141 projects an image include a part (e.g., a surgical site) of a patient on a bed apparatus, a floor surface on which the bed apparatus is installed (a floor surface on which the bed apparatus is to be installed), and a lying surface of the patient in the bed apparatus (patient bed, operating table, etc.). The projection device 141 may be provided in a link other than the link 122D, or may be included in the distal end portion 111. Furthermore, the projection device 141 may be provided at any joint. In addition, the projection device 141 may be provided at a location other than the robot, such as a wall or a ceiling of the operating room.
The arm 121 is driven under the control of the control device 201. The joints 123A to 123D are provided with actuators including a drive mechanism such as a motor, an encoder that detects rotation angles of the joints 123A to 123D, and the like. The joints 123A to 123D are configured to be rotatable around a predetermined rotation axis by driving of the actuator. Then, the driving of each actuator is controlled by the control device 201, whereby the posture of the arm 121, that is, the position and posture of the distal end portion 111 are controlled. The control device 201 can grasp the current posture of the arm 121 and the current position and posture of the distal end portion 111 on the basis of information regarding the rotation angles of the joints 123A to 123D detected by the encoder. The base 131 may be equipped with a position detection function using a marker or the like. In this case, the control device 201 may acquire information on the position of the base 131 from the position detection function.
The control device 201 uses the grasped information regarding the position and posture of the arm 121 to calculate control values (e.g., rotation angles, generated torque, etc.) for the joints 123A to 123D to achieve movement of the distal end portion 111 according to operation input from the user. Then, the drive mechanisms of the joints 123A to 123D are driven according to the control values. The control method of the arm 121 by the control device 201 is not limited to a specific method, and various known control methods such as force control or position control may be applied.
As an example, when the user performs operation input via the input device 401, the driving of the arm 121 may be controlled by the control device 201, and the position and posture of the distal end portion 111 may be controlled. The control device 201 calculates control values (e.g., rotation angles, generated torque, etc.) for the joints 123A to 123D according to the operation input, and drives the drive mechanisms of the joints 123A to 123D according to the control values. After the distal end portion 111 has been moved to an arbitrary position, the distal end portion 111 is fixedly supported at the position after moving. Note that the arm 121 may be operated by a so-called master-slave method. In this case, the arm 121 may be remotely operated by the user via the input device 401 installed at a location in the operating room or a location away from the operating room.
The control device 201 integrally controls the operation of the surgical system 100 by controlling the operation of the robot 101 and the display device 301. For example, the control device 201 controls the driving of the arm 121 by operating the actuators of the joints 123A to 123D according to a predetermined control method. Furthermore, for example, the control device 201 generates image data for display by applying various types of signal processing to an image signal acquired by an imaging device included in the distal end portion 111 of the robot 101. The control device 201 also causes the display device 301 to display the generated image data. Examples of the signal processing include development processing (demosaic processing), image quality improvement processing (such as band emphasis processing, super resolution processing, noise reduction (NR) processing, and camera shake correction processing), enlargement processing (i.e., electronic zoom processing), and 3D image generation processing.
In addition, the control device 201 of the present embodiment calculates a range of motion (e.g., a range in which the distal end portion can move in a three-dimensional space) of a target part (e.g., the distal end portion of the arm) of the robot 101 in a three-dimensional space in the operating room, and generates a projection image that projects information (an image) specifying the range of motion onto the range of motion. The control device 201 outputs a projection instruction for the generated projection image to the projection device 141. The projection device 141 projects the projection image provided from the control device 201. The user can intuitively grasp the range of motion of the target part of the robot 101 by viewing the projected image. A configuration in which the control device 201 generates a projection image and a configuration in which the projection device projects the projection image will be described later. The target part may be an arbitrary part of the arm 121 other than the distal end portion 111, such as an arbitrary link or an arbitrary joint.
Transmission and reception of information between the control device 201 and the distal end portion 111, transmission and reception of information between the control device 201 and the joints 123A to 123D, and transmission and reception of information between the control device 201 and the projection device 141 are performed by wired communication or wireless communication. Wired communication may be communication by an electric signal or communication by an optical signal. As a transmission cable used for wired communication, an electric signal cable, an optical fiber, or a composite cable of the foregoing is used according to the communication method. A wireless communication method may be an arbitrary method such as a wireless local area network (LAN), Bluetooth, a dedicated communication method, 4G communication, or 5G communication. In the case of wireless communication, since it is not necessary to lay a transmission cable, a situation may be eliminated in which movement of the medical staff in the operating room is hindered by a transmission cable.
The control device 201 may be a processor such as a central processing unit (CPU) or a graphics processing unit (GPU), a microcomputer in which a processor and a storage element such as memory are combined, a control board, or the like. The processor of the control device 201 operates according to a predetermined program, whereby the above-described various functions may be achieved. Note that in the illustrated example, the control device 201 is provided as a separate device from the robot 101, but the control device 201 may be installed inside the base 131 of the robot 101 and configured integrally with the robot 101. Alternatively, the control device 201 may be configured by a plurality of devices. For example, a microcomputer, a control board, or the like may be disposed in each of the distal end portion 111 and the joints 123A to 123D of the arm 121, and these elements may be communicably connected to each other to achieve similar function to the control device 201.
As an example, the display device 301 is provided in an operating room, and displays an image corresponding to image data generated by the control device 201 under the control of the control device 201. The display device 301 is a display device such as a liquid-crystal display device or an electroluminescent (EL) display device, for example. The display device 301 displays an image of a surgical site captured by an imaging device provided in the distal end portion 111, another part of the robot 101, the operating room, or the like, or an image of the environment or equipment in the operating room. The display device 301 may display various types of information regarding a surgery instead of or together with images of the surgical site, environment, equipment, or the like. Examples of the information include body information of the patient or information regarding a surgical procedure of the surgery. The display device 301 may be provided in a plurality thereof. A plurality of imaging devices may be provided, and image data obtained by every imaging device may be displayed on different display devices. Image data captured by the plurality of imaging devices may be simultaneously displayed on the same display device.
The input device 401 is an operation device for the user to perform various operation input. The input device 401 is a device that can be operated even if the user has a surgical tool in hand, such as a foot switch or a device that performs voice recognition as an example. Alternatively, the input device 401 may be a device capable of accepting operation input in a non-contact manner on the basis of gesture detection or line-of-sight detection using a wearable device or a camera provided in the operating room. Furthermore, the input device 401 may be a device manually operated by the user, such as a touch panel, a keyboard or a mouse, or a haptic device. In addition, in the case of a master-slave type surgical system, the input device 401 is an input device included in a master console and operated by an operator.
During surgery as illustrated in
The information processing system 210 includes an information processing apparatus 211, a projection device 141, and an input device 401. The information processing apparatus 211 includes a joint angle acquiring section 221, a position and posture calculating section 222, a projection image generating section 223, an output instruction section 224, and storage 225. The joint angle acquiring section 221 acquires information on the joint angles (rotation angles) of the joints 123A to 123D from the encoders provided in the joints 123A to 123D. The information on the joint angles of the joints 123A to 123D may be stored in the storage 225 of the control device 201 in advance. In this case, information on the joint angles of the joints 123A to 123D may be acquired from the storage 225.
The position and posture calculating section 222 calculates the position and posture of the projection device 141 on the basis of the joint angles (arm postures) of the joints connecting the links present between the base 131 and the location where the projection device 141 is provided. In the present example, since the projection device 141 is installed in the link 122D, the position and posture of the projection device 141 are calculated on the basis of the joint angles of the joints 123A to 123C. In a case where the projection device 141 is relatively rotatable with respect to the link 122D with an arbitrary degree of freedom, a relative posture of the projection device 141 with respect to the link 122D is specified, and the posture of the projection device 141 is calculated on the basis of the posture of the arm 121 and the relative posture. The posture of the projection device 141 can be represented by three angle variables in a three-axis space, for example. The position of the projection device 141 can also be represented by coordinates in a three-axis space. In a case where the position of the projection device 141 is movable (e.g., in a case where the position is movable in parallel along the link 122D), the position of the projection device 141 need only be calculated on the basis of the relative position of the projection device 141 in the link 122D and the posture of the arm.
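As a minimal illustration (not part of the present disclosure), the position and posture of the projection device 141 could be obtained by composing homogeneous transforms from the base 131 to the link on which the projection device is mounted. The Python sketch below assumes simple revolute joints rotating about their local z-axes and hypothetical link offsets; all function names and numerical values are illustrative assumptions.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform: rotation of theta [rad] about the joint's z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def trans(x, y, z):
    """Homogeneous transform: pure translation along a link."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def projector_pose(joint_angles, link_offsets, projector_offset):
    """Forward kinematics from the base to the projection device.

    joint_angles     : rotation angles of the joints between the base and the projector link
    link_offsets     : translation of each link, as (x, y, z) tuples
    projector_offset : fixed mounting transform of the projector on its link
    Returns the 4x4 pose of the projection device in base coordinates.
    """
    T = np.eye(4)
    for theta, offset in zip(joint_angles, link_offsets):
        T = T @ rot_z(theta) @ trans(*offset)
    return T @ projector_offset

# Hypothetical numbers purely for illustration (not from the disclosure).
angles = np.deg2rad([30.0, -45.0, 20.0])            # joints 123A to 123C
offsets = [(0, 0, 0.4), (0, 0, 0.35), (0, 0, 0.3)]  # link lengths [m]
T_projector = projector_pose(angles, offsets, trans(0.05, 0, 0.1))
position = T_projector[:3, 3]        # projector position in the base frame
orientation = T_projector[:3, :3]    # projector posture (rotation matrix)
```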
The projection image generating section 223 specifies a range of motion of the target part of the robot 101 relative to the base 131 (refer to
On the basis of the range of motion information of the distal end portion 111 relative to the base 131 and the position and posture of the projection device 141, the projection image generating section 223 generates a projection image that projects information (an image) specifying the range of motion of the distal end portion 111 onto the range of motion. That is, by projecting the projection image from the projection device 141, an image capable of identifying the range of motion of the distal end portion 111 is displayed in the range of motion. In generating the projection image, parameter information (focal length, zoom magnification, etc.) of the projection device 141 may be used in addition to the position and posture of the projection device 141.
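A minimal sketch of one way such a projection image could be generated is shown below, assuming the range of motion is available as sampled 3D points and the projection device is modeled as an inverse pinhole camera whose intrinsic matrix is built from the focal length and zoom parameters. The function names, the intrinsic values, and the resolution are illustrative assumptions, not the implementation of the disclosure.

```python
import numpy as np

def project_points(points_base, T_projector, K):
    """Map 3D range-of-motion points (base frame) to projector pixel coordinates.

    points_base : (N, 3) points sampled from the range of motion, in base coordinates
    T_projector : 4x4 pose of the projection device in base coordinates
    K           : 3x3 intrinsic matrix built from focal length / zoom parameters
    """
    # Transform the points into the projector coordinate frame.
    T_base_in_proj = np.linalg.inv(T_projector)
    pts_h = np.hstack([points_base, np.ones((len(points_base), 1))])
    pts_proj = (T_base_in_proj @ pts_h.T).T[:, :3]

    # Keep points in front of the projector and apply the pinhole model.
    pts_proj = pts_proj[pts_proj[:, 2] > 0]
    uv = (K @ (pts_proj / pts_proj[:, 2:3]).T).T[:, :2]
    return uv

def rasterize(uv, width=1920, height=1080):
    """Build a binary projection image whose lit pixels cover the range of motion."""
    image = np.zeros((height, width), dtype=np.uint8)
    uv = np.round(uv).astype(int)
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < width) & (uv[:, 1] >= 0) & (uv[:, 1] < height)
    image[uv[valid, 1], uv[valid, 0]] = 255
    return image

# Hypothetical intrinsics derived from focal length / zoom parameters (illustrative only).
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
```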
The projection target of the image of the range of motion is, for example, an observation part (e.g., a surgical site) of a patient allowed to lie on a bed apparatus, a lying surface of the patient in the bed apparatus, or a floor surface on which the bed apparatus is installed (a floor surface on which the bed apparatus is to be installed).
In a case where the image of the range of motion is projected on the surgical site of the patient, the user (operator) can grasp the range of motion of the distal end portion 111 during surgery without moving the line of sight from the surgical site. Furthermore, in a case where the image of the range of motion is projected on the lying surface of the patient in the bed apparatus, it is easy to allow the patient to lie on the bed apparatus such that the surgical site of the patient is positioned in the range of motion. In addition, in a case where the image of the range of motion is displayed on the floor surface, it is possible to easily perform positioning of the bed apparatus on which the patient is allowed to lie or positioning of the robot 101 or the arm 121.
The output instruction section 224 outputs a projection instruction for the projection image generated by the projection image generating section 223 to the projection device 141. The projection device 141 projects the projection image in accordance with the projection instruction from the output instruction section 224. The information processing apparatus 211 may control the position or posture of the projection device 141 relative to the link 122D to a predetermined position or posture according to the posture of the arm 121 (the joint angle of each joint). As a result, even in a case where a posture of the arm 121 would otherwise place the range of motion outside the projectable range of the projection device 141, the projection device 141 can be adjusted so that projection onto the range of motion is performed appropriately.
The projection device 141 is a two-dimensional projection device (2D projector) that projects a two-dimensional image or a three-dimensional projection device (3D projector) that projects a three-dimensional image. The projection image generating section 223 generates a projection image adapted to each method depending on whether the projection device 141 is a two-dimensional projection device or a three-dimensional projection device. In the case of a three-dimensional projection device, by projecting a three-dimensional image from the projection device 141, the range of motion of the distal end portion 111 is stereoscopically displayed in three-dimensional space, and the range of motion can be intuitively recognized including the depth direction. In the case of a two-dimensional projection device, a two-dimensional image is projected from the projection device 141. As an example, an image of a region at an arbitrary height in the range of motion is displayed as the two-dimensional image. That is, the height at which the range of motion is to be displayed may be determined, and a projection image that projects the range of motion at the determined height may be generated.
As an example, the displayed range of motion is a range of motion in a plane at the height of the patient 502 in the three-dimensional range of motion of the distal end portion 111. Information on the posture (height, inclination, etc.) of the bed apparatus 501 and the thickness of the affected part of the patient (which may be a statistical value such as an average thickness) are stored in advance in the storage 225 of the information processing apparatus, and the projection image generating section 223 generates a projection image representing an image of the range of motion at the height using this information. In a case where the bed apparatus 501 can be driven such that the lying surface of the patient is inclined obliquely from a horizontal state, a projection image representing an image of a range of motion along the inclination of the bed apparatus 501 may be generated. In a case where information on the posture of the bed apparatus 501 is not stored in the storage 225, information on the posture of the bed apparatus 501 may be acquired by a method of measuring a distance to a marker installed in the bed apparatus. Alternatively, information on the posture of the bed apparatus 501 may be acquired by a method of communicating with a communication device provided in the bed apparatus to acquire at least one of the height and the inclination of the bed apparatus.
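As an illustrative sketch (assumed, not taken from the disclosure), the cross section of the range of motion at the patient height could be extracted from sampled range-of-motion points as follows; the bed height and affected-part thickness values are hypothetical, and an inclined lying surface would use a tilted plane instead of a constant height.

```python
import numpy as np

def rom_slice_at_height(rom_points, bed_height, part_thickness, tolerance=0.01):
    """Extract the range-of-motion cross section at the patient height.

    rom_points     : (N, 3) points sampled from the 3D range of motion (base frame)
    bed_height     : height of the lying surface of the bed apparatus [m]
    part_thickness : thickness of the affected part (may be a statistical average) [m]
    tolerance      : half-thickness of the slice to keep [m]
    """
    target_height = bed_height + part_thickness
    mask = np.abs(rom_points[:, 2] - target_height) <= tolerance
    return rom_points[mask]

# Hypothetical values purely for illustration.
rom = np.random.uniform([-0.5, -0.5, 0.0], [0.5, 0.5, 1.2], size=(100000, 3))
slice_points = rom_slice_at_height(rom, bed_height=0.85, part_thickness=0.20)
```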
In a case where the image of the range of motion includes a plurality of types of regions, information for identifying the regions may be included in the image of the range of motion. For example, there are a region where the distal end portion 111 can be operated in a free posture and a region where the distal end portion 111 can be operated only in a specific posture. The identification information of each region may be color information.
The joint angle acquiring section 221 acquires information on the joint angles (rotation angles) of the joints 123A to 123D from the encoders provided in the joints 123A to 123D (S101). Alternatively, the joint angle acquiring section 221 acquires information on the joint angles of the joints 123A to 123D stored in the storage 225 in advance.
The position and posture calculating section 222 calculates the position and posture of the projection device 141 on the basis of the joint angles (arm posture) of the joints 123A to 123D (S102). Specifically, the position and posture of the projection device 141 are calculated by forward kinematics on the basis of the joint angles of joints present from the base 131 to the installation location of the projection device 141.
The projection image generating section 223 acquires, from the storage 225, information expressing the range of motion of a target part (here, the distal end portion 111, etc.) of the robot 101 relative to the base 131 (S103). On the basis of the range of motion information of the distal end portion 111 relative to the base 131 and the position and posture of the projection device 141, the projection image generating section 223 generates a projection image that projects information specifying the range of motion of the distal end portion 111 onto the range of motion (S104). The output instruction section 224 outputs a projection instruction for the generated projection image to the projection device 141.
The projection device 141 projects the projection image in accordance with the projection instruction from the output instruction section 224 (S105). Therefore, an image for specifying the range of motion is displayed in the range of motion of the distal end portion 111.
The order of the steps illustrated in
As described above, even in a case where the user has changed the position of the arm, an image can be displayed in the range of motion by recalculating the posture of the arm and the position and posture of the projection device 141 following the change.
In the example of
According to the present embodiment as described above, the projection image that projects information specifying the range of motion onto the range of motion is generated on the basis of information regarding the range of motion of the target part of the robot and the posture of the arm calculated from the joint angle of the robot, and the projection image is projected from the projection device. Therefore, the user can intuitively understand the range of motion, and thus, in the operating room, the robot or the arm can be easily, appropriately, and quickly arranged so that the range of motion is at an appropriate position in accordance with the surgical information and the surgical situation held by the doctor. Also according to the present embodiment, the installability of the robot and the reliability of the installation position are improved.
(First Variation)
The information processing system 210 in
In the first variation, the surface shape of the observation target (e.g., the observation part of the subject) is calculated using the imaging device 142 and the projection device 141. According to a projection instruction from the output instruction section 224, a two-dimensional image of a predetermined pattern is projected from the projection device 141 onto the observation target. The output instruction section 224 outputs an imaging instruction to the imaging device 142 so that the projected two-dimensional image is captured by the imaging device 142. The number of imaging devices 142 may be one or two or more. The imaging device 142 captures an image of the projected predetermined pattern, and stores the captured image data in the storage 225. A shape calculating section 226 specifies a correspondence relationship between the pattern of the projected image and the pattern included in the captured image, and calculates the surface shape of the observation target on the basis of the specified correspondence relationship and the principle of triangulation. That is, the depth at each position on the surface of the observation target is calculated. Calibration of the projection device 141 and the imaging device 142 may be performed in advance to acquire each piece of parameter information, and the parameter information may be used for calculation of the surface shape.
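A minimal sketch of the triangulation step is given below, assuming that the advance calibration has yielded 3x4 projection matrices for the projection device 141 and the imaging device 142 and that the pattern correspondences have already been matched. The linear (DLT) formulation is one standard way to implement triangulation and is an assumption, not the specific method of the disclosure.

```python
import numpy as np

def triangulate(P_proj, P_cam, uv_proj, uv_cam):
    """Linear (DLT) triangulation of corresponding pattern points.

    P_proj, P_cam : 3x4 projection matrices of projector and camera (from calibration)
    uv_proj       : (N, 2) pattern point coordinates in the projected image
    uv_cam        : (N, 2) coordinates of the same pattern points in the captured image
    Returns (N, 3) surface points of the observation target.
    """
    points = []
    for (u1, v1), (u2, v2) in zip(uv_proj, uv_cam):
        # Each correspondence gives four linear constraints on the homogeneous point X.
        A = np.vstack([u1 * P_proj[2] - P_proj[0],
                       v1 * P_proj[2] - P_proj[1],
                       u2 * P_cam[2] - P_cam[0],
                       v2 * P_cam[2] - P_cam[1]])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        points.append(X[:3] / X[3])
    return np.asarray(points)
```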
The projection image generating section 223 calculates the range of motion on the surface along the surface shape of the observation target in the three-dimensional range of motion of the target part (here, the distal end portion 111) of the robot 101. A projection image that projects information specifying the calculated range of motion onto the range of motion is generated. The output instruction section 224 outputs a projection instruction for the projection image to the projection device 141. As a result, the range of motion can be correctly displayed on the surface of the observation target. For example, in a case where there is unevenness on the surface of the observation target, there may be a position or region where the surgical site can be operated on from the distal end portion 111 and a position or region where it cannot, depending on the position of the surface. In this case, in the present variation, an image is not projected onto a position or region where operation cannot be performed, and an image is projected only onto a position or region where operation can be performed. In the above-described embodiment, an image of a range of motion in a plane at a certain height in a three-dimensional range of motion is projected. Therefore, in a case where an image is projected on an uneven observation target, the image can be projected even at a position where the distal end portion 111 cannot actually operate (e.g., a recessed position that the distal end portion 111 does not reach). In the present variation, the range of motion can be displayed more accurately by generating a projection image based on measurement values of the surface shape of the observation target.
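As an illustrative sketch, the range of motion on the measured surface could be obtained by keeping only the surface points for which a reachability test succeeds. The reachability test below is a hypothetical spherical-shell stand-in for the actual range of motion information held in the storage 225; all names are assumptions.

```python
import numpy as np

def reachable_on_surface(surface_points, is_reachable):
    """Keep only the measured surface points that lie inside the range of motion.

    surface_points : (N, 3) points from the surface-shape calculation
    is_reachable   : callable returning True where the distal end portion can reach
    """
    mask = np.array([is_reachable(p) for p in surface_points])
    return surface_points[mask]

# Hypothetical reachability test purely for illustration: a spherical shell around
# the base, standing in for the true range-of-motion data stored in the storage 225.
def sphere_shell_reach(point, center=np.zeros(3), r_min=0.3, r_max=0.9):
    d = np.linalg.norm(point - center)
    return r_min <= d <= r_max

# Usage (illustrative): surface = triangulate(...)  # surface points from the step above
# visible = reachable_on_surface(surface, sphere_shell_reach)
```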
In the above description, the projection image generating section 223 calculates the range of motion on the surface of the observation target, but it may instead calculate the range of motion at a height lower or higher than the surface of the observation target by a certain distance (a range of motion having a shape parallel to the surface of the observation target). For example, by displaying the range of motion at a height lower than the surface by a certain distance, the user (operator) can predict in advance the range of motion at that depth, so that surgery can be performed more appropriately. By displaying the range of motion at a height higher than the surface by a certain distance, it is possible, for example, to appropriately grasp a region in which the distal end portion 111 can be moved without making contact with the observation target.
In the present variation, the surface shape of the observation target has been calculated using the imaging device 142 and the projection device 141, but the surface shape may be calculated using a depth sensor such as a distance measuring sensor.
In the present variation, the surgical site of the patient is mainly assumed to be the observation target, but the lying surface of the patient on the bed apparatus, the floor surface on which the bed apparatus is installed during surgery, or the like may be used as the measurement target.
After Step S102, according to an instruction of the output instruction section 224, an image of a predetermined pattern is projected from the projection device 141 onto the observation target (S201).
According to the instruction of the output instruction section 224, the imaging device 142 captures the image projected from the projection device 141 (S202).
A correspondence relationship between a predetermined pattern included in the projected image and a predetermined pattern included in the captured image is specified. The surface shape of the observation target is calculated using the principle of triangulation on the basis of the specified correspondence and the parameter information of the imaging device 142 and the projection device 141 acquired by advance calibration (S203).
The projection image generating section 223 acquires range of motion information of the distal end portion 111 from the storage 225 (S103). On the basis of the range of motion information of the distal end portion 111 and the surface shape of the observation target, the range of motion of the distal end portion 111 on the surface of the observation target is specified, and a projection image that projects the information specifying the specified range of motion onto the range of motion is generated (S104). The output instruction section 224 outputs a projection instruction for the generated projection image to the projection device 141 (also S104). The projection device 141 projects the projection image in accordance with the instruction (S105).
The order of the steps in
Information for identifying the distance (depth) by which the distal end portion 111 can move in the depth direction from the surface of the projection target (observation target) may be included in the image projected onto the observation target (e.g., the surgical site). The distance that can be moved in the depth direction from the surface is calculated by the shape calculating section 226 on the basis of the range of motion of the distal end portion 111 and the surface shape of the observation target.
In
(Second Variation)
A block diagram of an information processing system of a second variation is the same as that of
In the present variation, an image including a reference mark is acquired in advance. For example, an affected part image including an affected part (e.g., a tumor) of a patient is acquired by a technique such as computed tomography (CT) or magnetic resonance imaging (MRI) before surgery. The affected part image may be a two-dimensional image or a three-dimensional image. Using the range of motion information of the distal end portion 111, alignment between the affected part image and the range of motion is performed such that the affected part in the affected part image is included in the range of motion. The position of the affected part image associated with the range of motion of the robot relative to the base 131 is determined by this alignment. Alignment of the affected part image and the range of motion information may be manually performed by the user, or may be performed by the projection image generating section 223 or another computer. In a case where the projection image generating section 223 performs alignment, data of the affected part image is stored in the storage 225. As a method of alignment, for example, affected part detection may be performed by image analysis on the affected part image, and the range of motion information may be aligned with a detected affected part. The image analysis may be performed using a model such as a neural network generated by machine learning, or may be performed using image clustering or the like. Other methods may be used.
In the aligned state, the range of motion information is combined with the affected part image to generate composite information (a composite image). A projection image of the composite information is generated such that the range of motion information included in the composite information is displayed on the range of motion. The projection device 141 projects the projection image.
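A minimal sketch of one possible alignment and compositing step is shown below, assuming the affected part has already been detected as a binary mask and the range of motion is available as a two-dimensional footprint at the relevant image scale. The centroid-based alignment and all names and shapes are illustrative assumptions, not the disclosure's specific method.

```python
import numpy as np

def align_and_composite(affected_image, affected_mask, rom_mask):
    """Overlay the range-of-motion footprint so that it covers the detected affected part.

    affected_image : (H, W) grayscale preoperative image (e.g., from CT or MRI)
    affected_mask  : (H, W) boolean mask of the detected affected part
    rom_mask       : (h, w) boolean footprint of the range of motion at image scale
    Returns an RGB composite with the range of motion drawn around the affected part.
    """
    cy, cx = np.argwhere(affected_mask).mean(axis=0)        # affected-part centroid
    top = int(round(cy - rom_mask.shape[0] / 2))
    left = int(round(cx - rom_mask.shape[1] / 2))

    composite = np.stack([affected_image] * 3, axis=-1).astype(np.uint8)
    ys, xs = np.nonzero(rom_mask)
    ys, xs = ys + top, xs + left
    inside = (ys >= 0) & (ys < composite.shape[0]) & (xs >= 0) & (xs < composite.shape[1])
    composite[ys[inside], xs[inside], 1] = 255               # mark the range of motion in green
    return composite
```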
Although the image is projected on the floor surface in
In the above description, the range of motion information is aligned with the affected part of the patient as a reference mark. However, other than the affected part of the patient, items such as a mark affixed to the bed surface, an arbitrary part of the patient (head, waist), or a human form may be used.
After Step S103, the projection image generating section 223 reads the affected part image from the storage 225 (S301) and generates a composite image in which the range of motion information acquired in Step S103 is aligned with the affected part in the affected part image (S302). On the basis of the position and posture of the projection device 141, a projection image that projects the composite image such that the range of motion information in the composite image is displayed in the range of motion is generated. The output instruction section 224 outputs a projection instruction for the projection image to the projection device 141 (S104). The projection device 141 projects the projection image in accordance with the instruction from the output instruction section 224.
The order of the steps in
(Third Variation)
According to a projection instruction from the output instruction section 224, a predetermined pattern image (correction image) is projected from each of the projection devices 141A and 141B. As an example, the projection target is a floor surface or a bed surface. The output instruction section 224 outputs an imaging instruction to the imaging devices 142A and 142B so that the projected correction images are captured by the imaging devices 142A and 142B. The imaging devices 142A and 142B each capture both correction images projected from the two projection devices, and store the captured image data in the storage 225.
The positional relationship calculating section 227 calculates a positional relationship (arm positional relationship) between the two robots 101A and 101B on the basis of the image data captured by the imaging devices 142A and 142B. For example, the position of a projected pattern is determined by the principle of triangulation from the relationship between the pattern projected by the projection device 141A and the pattern captured by the imaging device 142A. Since the positional relationship between the imaging devices is obtained from the images of the projected patterns captured by the imaging device 142A and the imaging device 142B, the positional relationship between the bases of the robots can be obtained. In addition, a model (e.g., a neural network) that uses the two pieces of image data as inputs and outputs the positional relationship between the two robots may be trained in advance, and the positional relationship may be calculated using the model.
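One possible way (an assumption, not the disclosure's stated method) to recover the base-to-base relationship is to express the same set of triangulated pattern points in each robot's base frame and then solve for the rigid transform between the two point sets with the Kabsch algorithm. A minimal sketch, assuming the per-robot triangulation described above has already produced corresponding point sets:

```python
import numpy as np

def rigid_transform(points_a, points_b):
    """Rigid transform (R, t) such that points_b ≈ R @ points_a + t (Kabsch algorithm).

    points_a : (N, 3) pattern points expressed in the base frame of robot 101A
    points_b : (N, 3) the same pattern points expressed in the base frame of robot 101B
    """
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    H = (points_a - ca).T @ (points_b - cb)      # cross-covariance of the centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```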
The projection image generating section 223 calculates an integrated region in which the ranges of motion of the distal end portions 111A and 111B are integrated on the basis of the information regarding the ranges of motion of the distal end portions 111A and 111B, the positions and the postures of the projection devices 141A and 141B, and the positional relationship calculated by the positional relationship calculating section 227. A projection image that projects information specifying the calculated integrated region onto the integrated region is generated. The output instruction section outputs a projection instruction for the generated projection image to the projection device 141A or 141B.
In addition, the projection image generating section 223 may specify, in the integrated region, a region where interference between the two distal end portions 111 is likely to occur on the basis of the above positional relationship, and include information for identifying the specified region in the image. Specifically, a region (first region) in which interference between the distal end portions 111 is likely to occur, a region (second region) where the two distal end portions 111 are simultaneously movable and interference is unlikely to occur, and a region (third region) where only one of the distal end portions 111 is movable may be specified, and information for identifying these three regions may be included in the image. For example, a region having a constant width from the center of the intersection region of the two ranges of motion is referred to as a first region, a region other than the first region within the intersection region is referred to as a second region, and a region other than these is referred to as a third region.
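As an illustrative sketch of the constant-width criterion described above, sampled points of the integrated region could be labeled as follows; the reachability callables and the width value are hypothetical assumptions, not values from the disclosure.

```python
import numpy as np

def classify_regions(points, reach_a, reach_b, interference_radius=0.15):
    """Label sampled points of the integrated region.

    points              : (N, 3) sample points covering both ranges of motion
    reach_a, reach_b    : callables; True where distal end portion 111A / 111B can reach
    interference_radius : width of the first region around the intersection center [m]
    Returns an (N,) array of labels: 1 = interference likely, 2 = both reachable,
    3 = only one arm reachable, 0 = outside the integrated region.
    """
    in_a = np.array([reach_a(p) for p in points])
    in_b = np.array([reach_b(p) for p in points])
    both = in_a & in_b
    labels = np.zeros(len(points), dtype=int)
    labels[in_a ^ in_b] = 3                       # only one distal end portion movable
    if both.any():
        center = points[both].mean(axis=0)        # center of the intersection region
        near = np.linalg.norm(points - center, axis=1) <= interference_radius
        labels[both & near] = 1                   # interference likely
        labels[both & ~near] = 2                  # simultaneously movable, interference unlikely
    return labels
```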
The joint angle acquiring section 221 acquires information on the joint angles (rotation angles) of each joint from the encoders provided in the joints of the robots 101A and 101B (S101). The position and posture calculating section 222 calculates the position and posture of the projection devices 141A and 141B on the basis of the joint angles of the joints of the robots 101A and 101B (S102). In addition to the positions and postures of the projection devices 141A and 141B, the positions and postures of the imaging devices 142A and 142B may be calculated. The projection image generating section 223 acquires, from the storage 225, information expressing the ranges of motion of target parts (distal end portions 111, etc.) of the robots 101A and 101B relative to the bases 131A and 131B (S103).
The projection image generating section 223 generates projection images representing correction images for the robots 101A and 101B, respectively (S401). The output instruction section 224 outputs a projection instruction for the correction images represented by the projection images to the projection devices 141A and 141B of the robots 101A and 101B (also Step S401).
The output instruction section 224 outputs an imaging instruction to the imaging devices 142A and 142B of the robots 101A and 101B (S402). The imaging devices 142A and 142B perform imaging and provide captured image data to the information processing apparatus 211 (also S402). The information processing apparatus 211 stores each piece of correction image data in the storage 225 (also Step S402). Each piece of correction image data includes correction images projected from both projection devices 141A and 141B.
The positional relationship calculating section 227 calculates a positional relationship (arm positional relationship) between the two robots on the basis of the correction image data captured by the imaging devices 142A and 142B (S403).
The projection image generating section 223 calculates an integrated region in which the ranges of motion of the distal end portions 111A and 111B are integrated on the basis of the information regarding the ranges of motion of the distal end portions 111A and 111B, the positions and postures of the projection devices 141A and 141B, and the positional relationship calculated by the positional relationship calculating section 227 (S104). As an example, the integrated region includes a region (first region) where interference between the distal end portions 111A and 111B is likely to occur, a region (second region) where the distal end portions 111A and 111B are simultaneously movable and interference is unlikely to occur, and a region (third region) where only one of the distal end portions 111A and 111B is movable. The projection image generating section 223 generates a projection image that projects information specifying the integrated region onto the integrated region (also S104). The output instruction section 224 outputs a projection instruction for the projection image to the projection device 141A or 141B (also S104).
The projection device 141A or 141B projects the projection image in accordance with the projection instruction from the output instruction section 224 (S105).
In the present variation, the positional relationship between the robots is calculated using the imaging devices 142A and 142B and the projection devices 141A and 141B. However, in a case where each robot includes a position detection function, the information processing apparatus 211 may communicate with each robot to acquire position information of each robot. The positional relationship calculating section 227 calculates a positional relationship between the robots on the basis of the positional information of each robot.
Note that the above-described embodiments illustrate examples for embodying the present disclosure, and the present disclosure can be implemented in various other forms. For example, various modifications, substitutions, omissions, or combinations thereof can be made without departing from the gist of the present disclosure. Such modifications, substitutions, omissions, and the like are also included in the scope of the present disclosure and are included in the invention described in the claims and the equivalent scope thereof.
Furthermore, the effects of the present disclosure described in the present specification are merely examples, and other effects may be provided.
The present disclosure can also have the following configurations.
An information processing apparatus including:
a projection image generating section which generates a projection image that projects information specifying a range of motion of a target part of a medical arm onto the range of motion on the basis of first information regarding the range of motion and second information regarding a position and a posture of a projection device that projects an image in an operating room; and
an output instruction section which outputs a projection instruction for the projection image to the projection device.
The information processing apparatus according to item 1, in which
the range of motion of the target part includes a region in which a distal end portion of the medical arm is movable or a region in which an imaging device provided at an arbitrary part of the medical arm is capable of imaging.
The information processing apparatus according to item 1 or 2, further including a position and posture calculating section, in which
the projection device is provided in the medical arm, and
the position and posture calculating section calculates the position and the posture of the projection device on the basis of the posture of the medical arm.
The information processing apparatus according to any one of items 1 to 3, in which
the projection image generating section generates the projection image on the basis of at least one of a position and a posture of a target on which the projection image is projected.
The information processing apparatus according to item 4, in which
the target on which the projection image is projected is a lying surface of a bed apparatus on which a subject to undergo medical treatment is allowed to lie, an observation part of the subject allowed to lie on the bed apparatus, or a floor surface on which the bed apparatus is installed.
The information processing apparatus according to item 4, further including
a shape calculating section which calculates a surface shape of the target on which the projection image is projected, in which
the projection image generating section generates the projection image on the basis of the surface shape.
The information processing apparatus according to item 6, in which
the range of motion is a range of motion on a surface of the target on which the projection image is projected.
The information processing apparatus according to item 7, in which
the range of motion is a range of motion at a height lower or higher than the surface of the target by a certain distance.
The information processing apparatus according to item 8, in which
the range of motion having the height higher by the certain distance is a region where the target part is movable without making contact with the target.
The information processing apparatus according to item 6, in which
the projection image includes information for identifying a movable distance in a depth direction of the target on which the projection image is projected.
The information processing apparatus according to any one of items 1 to 10, in which
the projection device is a three-dimensional projection device, and
the projection image is a three-dimensional image.
The information processing apparatus according to any one of items 1 to 11, in which
the projection image generating section generates composite information by combining information on the range of motion, in alignment with a reference mark, with an image including the reference mark, and generates the projection image that projects the composite information.
The information processing apparatus according to item 12, in which
the reference mark is an affected part of the subject in an image including the affected part.
The information processing apparatus according to any one of items 1 to 13, in which
the projection image generating section calculates an integrated region obtained by integrating ranges of motion of target parts of a plurality of medical arms on the basis of a plurality of the first information regarding the ranges of motion of the target parts of the plurality of medical arms and the second information, and generates the projection image that projects information for specifying the integrated region onto the integrated region.
The information processing apparatus according to item 14, in which
the integrated region includes a first region in which the plurality of medical arms interfere with each other, and
the projection image includes information for identifying the first region.
The information processing apparatus according to item 15, in which
a second region different from the first region in the integrated region is different in color from the first region.
The information processing apparatus according to item 14, in which
the projection image generating section generates the projection image on the basis of the positional relationship of the plurality of medical arms.
The information processing apparatus according to item 17, further including a positional relationship calculating section, in which
the projection device is provided in a plurality thereof,
the projection devices are installed in the plurality of medical arms,
correction images including predetermined patterns are projected from the projection devices of the medical arms, and
the positional relationship calculating section acquires image data obtained by capturing the plurality of projected correction images from the plurality of imaging devices, and calculates a positional relationship between the plurality of medical arms on the basis of the plurality of predetermined patterns included in each piece of the acquired image data.
An information processing system including:
a projection device which projects an image in an operating room;
a projection image generating section which generates a projection image that projects information specifying a range of motion of a target part of a medical arm onto the range of motion on the basis of first information regarding the range of motion and second information regarding a position and a posture of the projection device; and
an output instruction section which outputs a projection instruction for the projection image to the projection device.
An information processing method including:
generating a projection image that projects information specifying a range of motion of a target part of a medical arm onto the range of motion on the basis of first information regarding the range of motion and second information regarding a position and a posture of a projection device that projects an image in an operating room; and
projecting the projection image by the projection device.
Number | Date | Country | Kind
---|---|---|---
2020-061407 | Mar 2020 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2021/009444 | 3/10/2021 | WO |