The present invention relates to a program, an information processing device, an information processing system, and an information processing method.
For health management (e.g., medical treatment such as surgical care), it is known that body data (for example, body weight, body temperature, blood pressure, and the like) of an individual measured using a measuring device is accumulated in a server via a network and managed by the server. For example, an information processing device is disclosed which executes processing of storing numerical information acquired from an image displayed on a display unit of a body data measuring device for each type of measurement items of the body data measuring device (see, e.g., JP5601250B1).
However, in an operating room in which various medical devices are used for medical treatment, it is difficult to automatically and accurately record information about those devices using the information processing device disclosed in JP5601250B1.
In addition, a plurality of measuring devices may be used in an operating room, with the measuring devices connected by cables to transfer data. In that case, erroneous recognition of the data from each of the measuring devices due to noise generated in the cables may become a problem. Furthermore, since multiple cables are routed through the operating room, there is a concern that, for example, an operator may stumble over a cable while moving around in the operating room.
The present invention has been made in view of such circumstances, and an object thereof is to provide a program, an information processing device, an information processing system, and an information processing method with which it is possible to automatically and accurately record information about a medical device in an operating room.
A program according to one aspect of the present disclosure causes a computer to execute processing of: acquiring a surgical space image that is captured by an imaging device imaging a surgical space and that includes monitor screens of a plurality of display devices dispersedly located in the surgical space; extracting information regarding the monitor screen of each of the display devices on the basis of the acquired surgical space image; and storing the extracted information regarding the monitor screen of each of the display devices in time series.
An information processing device according to one aspect of the present disclosure includes: an acquisition unit that acquires a surgical space image which is captured by an imaging device imaging a surgical space and which includes monitor screens of a plurality of display devices dispersedly located in the surgical space; an extraction unit that extracts information regarding the monitor screen of each of the display devices on the basis of the surgical space image that has been acquired by the acquisition unit; and a storage unit that stores, in time series, the information regarding the monitor screen of each of the display devices that has been extracted by the extraction unit.
An information processing system according to one aspect of the present disclosure includes: an imaging device that acquires a surgical space image by imaging a surgical space, the surgical space image including monitor screens of a plurality of display devices dispersedly located in the surgical space; and an information processing device that communicates with the imaging device, wherein the information processing device includes an acquisition unit that acquires the surgical space image captured by the imaging device, an extraction unit that extracts information regarding the monitor screen of each of the display devices on the basis of the surgical space image that has been acquired by the acquisition unit, and a storage unit that stores, in time series, the information regarding the monitor screen of each of the display devices that has been extracted by the extraction unit.
An information processing method according to one aspect of the present disclosure is an information processing method executed by a computer, the method including steps of: acquiring a surgical space image which is captured by an imaging device imaging a surgical space and which includes monitor screens of a plurality of display devices dispersedly located in the surgical space; extracting information regarding the monitor screen of each of the display devices on the basis of the acquired surgical space image; and storing, in time series, the extracted information regarding the monitor screen of each of the display devices.
According to the present disclosure, information about medical devices in an operating room can be automatically and accurately recorded.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
The imaging device 10 includes, for example, an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor. The imaging device 10 captures an image of the surgical space in the operating room 100 at predetermined time intervals, for example, and transmits an image (hereinafter referred to as a surgical space image) including the display screens of a plurality of monitors 201, 211, 221, . . . obtained by the image capture to the information processing device 50. Note that the imaging device 10 may sequentially acquire surgical space images frame by frame by continuously imaging the surgical space in the operating room 100 during a surgery period. The imaging device 10 and the information processing device 50 are connected via a network N, but may instead be directly connected by a wired cable, or connected wirelessly using, for example, Bluetooth (registered trademark), to exchange information.
The control unit 51 includes an arithmetic processing unit such as a central processing unit (CPU), a micro-processing unit (MPU), or a graphics processing unit (GPU), and executes various types of information processing, control processing, and the like performed by the information processing device 50 by reading and executing a control program 1P stored in the storage unit 53.
The communication unit 52 is a communication module for performing processing related to communication, and transmits and receives information between the imaging device 10 and the information processing device 50 via the network N.
The storage unit 53 includes memory elements such as a random access memory (RAM) and a read only memory (ROM), and stores the program 1P, data, or the like necessary for the control unit 51 to execute processing. In addition, the storage unit 53 temporarily stores data and the like necessary for the control unit 51 to execute arithmetic processing.
The mass storage unit 54 includes a recording medium such as a hard disk drive (HDD) or a solid state drive (SSD). The mass storage unit 54 includes an image database (DB) 541 and a measurement information DB 542 to be described later.
In the present embodiment, the storage unit 53 and the mass storage unit 54 may be configured as an integrated storage device. In addition, the mass storage unit 54 may include a plurality of storage devices. Furthermore, the mass storage unit 54 may be an external storage device connected to the information processing device 50.
The reading unit 55 reads a portable storage medium 1a such as a compact disc (CD)-ROM or a digital versatile disc (DVD)-ROM. The control unit 51 may read the program 1P from the portable storage medium 1a via the reading unit 55 and store the program 1P in the mass storage unit 54. In addition, the control unit 51 may download the program 1P from another computer via the network N or the like and store the program 1P in the mass storage unit 54. Furthermore, the control unit 51 may read the program 1P from a semiconductor memory 1b.
The display unit 56 is a liquid crystal display, an organic electroluminescence (EL) display, or the like, and displays various kinds of information in accordance with instructions from the control unit 51. The operation unit 57 receives an operation from operating personnel and notifies the control unit 51 of the received operation. The operation unit 57 receives the operation via a mechanical button or an input device such as a touch panel provided on the surface of the display unit 56. Furthermore, the operation unit 57 may be an input device such as a mouse or a keyboard, and such input devices may be detachable from the information processing device 50.
In the information processing device 50 according to the first embodiment, the control unit 51 acquires time-series surgical space images captured by the imaging device 10 via the communication unit 52, extracts information regarding the screens of the monitors 201, 211, 221, . . . on the basis of the acquired surgical space images, and stores the extracted information regarding the monitor screens in time series.
The control unit 51 extracts the information regarding the monitor screen by performing, on the surgical space image acquired via the communication unit 52, detection processing of detecting the monitor screen and extraction processing (described later) of extracting measurement information displayed on the monitor screen.

In the detection processing, the control unit 51 detects the monitor screen from the surgical space image using, for example, a learning model trained in advance by machine learning. The learning model is trained in advance to receive data of a surgical space image as an input and to output the region coordinates of each monitor screen included in the surgical space image. The learning model is trained using, for example, training data in which data of a surgical space image is associated with coordinates indicating the image region, within that image, that includes a monitor screen. The control unit 51 inputs the surgical space image captured by the imaging device 10 to the learning model and acquires the region coordinates of the monitor screen output by the learning model.

Note that the control unit 51 may instead detect the monitor screen from the surgical space image captured by the imaging device 10 using, for example, a conventional object detection technology. In addition to pattern matching, such technologies include local feature amount extraction methods such as accelerated KAZE (A-KAZE) and scale invariant feature transform (SIFT), which detect a monitor screen by extracting feature amounts from the image.

Furthermore, the storage unit 53 stores in advance a table in which, for each of the monitors 201, 211, 221, . . . , the frame shape of the monitor is associated with the measurement item to be displayed. The control unit 51 can thereby identify each of the monitors 201, 211, 221, . . . from its frame shape in the surgical space image and specify the corresponding measurement item. Note that the method for identifying each of the monitors 201, 211, 221, . . . is not limited to this method; for example, an identification marker (a colored label or the like) may be attached in advance to each of the monitors 201, 211, 221, . . . , and the color of the marker may be associated with the measurement item.
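For illustration only, the following is a minimal sketch in Python of the two detection approaches described above. The detector interface (a `predict` method returning monitor identifiers and bounding boxes), the contents of the frame-shape table, and the helper names are assumptions introduced for the example, not part of the embodiment; the classical variant uses OpenCV's standard A-KAZE feature-matching API.

```python
import cv2  # OpenCV, used here for the classical feature-matching variant

# Hypothetical frame-shape table, analogous to the table stored in the
# storage unit 53: maps an identified monitor to its measurement item.
FRAME_SHAPE_TABLE = {
    "monitor_201": "blood_pressure",
    "monitor_211": "heart_rate",
    "monitor_221": "body_temperature",
}

def detect_monitor_screens(model, surgical_space_image):
    """Detect monitor screens with a pre-trained learning model.

    `model` is assumed to expose predict(image) -> iterable of
    (monitor_id, (x, y, w, h)) pairs; this interface is illustrative.
    """
    regions = {}
    for monitor_id, bbox in model.predict(surgical_space_image):
        regions[monitor_id] = {
            "bbox": bbox,
            "measurement_item": FRAME_SHAPE_TABLE.get(monitor_id, "unknown"),
        }
    return regions

def match_monitor_by_features(template, surgical_space_image):
    """Classical alternative: A-KAZE local features plus brute-force
    matching (A-KAZE descriptors are binary, hence Hamming distance)."""
    akaze = cv2.AKAZE_create()
    kp_t, des_t = akaze.detectAndCompute(template, None)
    kp_s, des_s = akaze.detectAndCompute(surgical_space_image, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    return sorted(matcher.match(des_t, des_s), key=lambda m: m.distance)
```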
The control unit 51 stores the region coordinates of the monitor screen as data of the monitor image in the image DB 541 of the mass storage unit 54. The control unit 51 may extract an image region including the monitor screen from the surgical space image and store data of the extracted image region in the image DB 541 of the mass storage unit 54 as data of the monitor image.
The control unit 51 uses the detection result of the monitor screen to extract the information regarding the monitor screen from the image within the region coordinates. The information regarding the monitor screen includes various types of measurement information displayed on the monitor screen, and may be a measurement value, text data, graph data, or a medical image such as an intravascular ultrasound (IVUS) image or an angiographic image. In the present embodiment, a measurement value is used as an example of the measurement information.

In the processing of extracting the measurement value, the control unit 51 recognizes the numbers on the monitor screen included in the surgical space image using, for example, a learning model trained in advance by machine learning. The learning model is trained in advance to receive, as an input, data of a monitor image including data of a surgical space image and the region coordinates of a monitor screen, and to output, for each number on the monitor screen included in this image, the probability of the number being each of "0" to "9". Note that the region coordinates of the monitor screen used in this processing are not limited to the coordinates output by the learning model described above, and may be coordinates input by the operating personnel using the operation unit 57 while observing the surgical space image. Alternatively, data of the image region of the monitor screen extracted from the surgical space image may be used as the input to the learning model. The learning model is trained using, for example, training data in which the data of the monitor image is associated with a label indicating, for each number in the image within the region coordinates, the probability of the number being each of "0" to "9".

In the first embodiment, the control unit 51 inputs the data of the monitor image to the learning model and acquires, for each number, the probabilities of the number being "0" to "9" output from the learning model. A measurement value on the monitor screen is extracted by specifying, from these probabilities, the number whose probability is higher than a predetermined threshold (for example, 80%). The control unit 51 stores the data of the extracted measurement value in the measurement information DB 542 of the mass storage unit 54. As a result, the accuracy of the recorded measurement value can be improved.
Furthermore, when the probabilities output from the learning model of a certain number being "0" to "9" are all equal to or less than a predetermined threshold (for example, 50%), the control unit 51 determines that the identification rate for the number is low and does not store the measurement value indicated by the number in the measurement information DB 542. Note that the control unit 51 may create a trend graph using the measurement information stored in the measurement information DB 542 and display the trend graph on the display unit 56.
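For illustration only, the following is a minimal sketch of this thresholding logic, assuming the learning model returns one probability distribution over "0" to "9" per digit; the array shape and the handling of probabilities between the two thresholds (treated here as unreliable) are assumptions made for the example.

```python
import numpy as np

ACCEPT_THRESHOLD = 0.80  # a digit is adopted only above this probability
REJECT_THRESHOLD = 0.50  # if no class exceeds this, the digit is unreliable

def read_measurement_value(digit_probs):
    """Convert per-digit class probabilities into one measurement value.

    `digit_probs` is assumed to have shape (n_digits, 10): one probability
    per digit position for each of the classes "0" to "9". Returns None
    when the reading is too unreliable to store, mirroring the behavior of
    not recording values with a low identification rate.
    """
    digits = []
    for probs in np.asarray(digit_probs):
        if np.all(probs <= REJECT_THRESHOLD):
            return None  # identification rate is low; do not store
        best = int(np.argmax(probs))
        if probs[best] <= ACCEPT_THRESHOLD:
            return None  # ambiguous digit (assumption: skip the whole value)
        digits.append(str(best))
    return int("".join(digits)) if digits else None
```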
The control unit 51 acquires the surgical space image captured by the imaging device 10 (step S1), detects the monitor screen from the acquired image (step S2), and stores data of the detected monitor image in the image DB 541 (step S3).
The control unit 51 extracts information regarding the monitor screen from the data of the monitor image stored in the image DB 541 (step S4), and stores the extracted information in the measurement information DB 542 together with the imaging date and time (step S5).
The control unit 51 determines whether or not there is an unprocessed monitor image (step S6), and when determining that there is an unprocessed monitor image (YES in step S6), returns the processing to step S4. When determining that there is no unprocessed monitor image (NO in step S6), the control unit 51 ends the processing.
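For illustration only, steps S1 to S6 can be summarized as the following sketch, reusing the `detect_monitor_screens` helper from the earlier sketch; the `extract_screen_information` OCR helper and the list-based stand-ins for the image DB 541 and the measurement information DB 542 are assumptions made for the example.

```python
from datetime import datetime

image_db = []        # stand-in for the image DB 541
measurement_db = []  # stand-in for the measurement information DB 542

def process_surgical_space_image(model, image):
    # Steps S1-S2: acquire the surgical space image and detect monitor screens.
    regions = detect_monitor_screens(model, image)
    # Step S3: store the detected monitor-image data.
    image_db.extend(regions.values())
    # Steps S4-S6: extract information from every unprocessed monitor image.
    for monitor_id, region in regions.items():
        value = extract_screen_information(image, region)  # hypothetical helper
        if value is not None:
            # Step S5: store together with the imaging date and time.
            measurement_db.append({
                "monitor": monitor_id,
                "value": value,
                "timestamp": datetime.now().isoformat(),
            })
```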
In the first embodiment, the surgical space in the operating room 100 is imaged, and the information regarding the monitor screens is extracted from the surgical space image and stored, whereby the measurement information about each of the measurement devices 20, 21, 22, . . . in the imaging region can be automatically recorded. Since the information about the plurality of medical devices is collectively managed, it is possible to reduce the burden on medical professionals and to avoid human errors caused by manual recording. In addition, it is not necessary to provide wires between each of the measurement devices 20, 21, 22, . . . and the information processing device 50, which eliminates the trouble of routing wires for the medical devices and reduces various risks caused by complicated wiring in a medical environment (for example, erroneous recognition due to cable noise, entanglement between cables, or the cables getting in the way when a measuring device or the operator moves in the operating room). Furthermore, when a medical device is newly introduced, large-scale software revision is not required even if no communication protocol has been prepared for the device.
In addition, the recorded measurement information can be used, for example, to illustrate time-series data of the patient's condition graphically (for example, with a trend graph). The recorded measurement information contains no outliers caused by cable noise or human error, and thus highly accurate trend information can be obtained when a trend graph is created from the extracted measurement information. Furthermore, when the identified measurement information is not reliable, it is not recorded, so an accurate trend graph can be created without using identification results with low reliability.
Note that the surgical space image may include, in addition to the monitor screens, a blood tank of an extracorporeal circulation device and its circuits. The information processing system according to the first embodiment can easily monitor the liquid level of the extracorporeal circulation device by extracting information regarding the liquid level of the blood tank from the surgical space image.
An information processing system according to the second embodiment is different from the first embodiment in including a plurality of imaging devices, and thus, the above difference will be mainly described below. The other configurations, operation, and effect are similar to those of the first embodiment, and thus, the corresponding portions are denoted by the same reference numerals, and the detailed description thereof will be omitted.
In the information processing device 50, a control unit 51 acquires a plurality of surgical space images via a communication unit 52, selects one image from the plurality of surgical space images according to the installation location of each of the imaging devices 10, 11, 12, . . . , and extracts information regarding a monitor screen from the selected image.
A storage unit 53 stores, for each of monitors 201, 211, 221, . . . , a priority table (not illustrated) in which priorities of the imaging devices 10, 11, 12, . . . are set in advance on the basis of the installation locations of the imaging devices. When detecting the monitor screen, the control unit 51 uses, for the monitor to be detected, the surgical space image captured by the imaging device having the highest priority on the basis of the priority table. Note that the method for setting the priority is not limited to the above method, and the priority may be set for each surgical space image by comparing the accuracy among surgical space images captured by the imaging devices 10, 11, 12, . . . at the same time point.
In a case where it is impossible to detect the monitor screen or recognize the measurement information because a blocking object is included in the surgical space image captured by the imaging device having the highest priority, the control unit 51 performs the processing of detecting the monitor screen again using the surgical space image captured by the imaging device with the second highest priority. The control unit 51 recognizes a blocking object by, for example, a technology called “YOLO” that detects an object such as a person from an image using a learning model trained by deep learning.
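For illustration only, the following is a minimal sketch of the priority-based selection and fallback described above. The priority table contents and the `is_blocked` occlusion check (assumed to wrap a person/object detector such as a YOLO-style model) are assumptions made for the example.

```python
# Hypothetical priority table: for each monitor, imaging devices ordered
# from highest to lowest priority based on their installation locations.
PRIORITY_TABLE = {
    "monitor_201": ["camera_10", "camera_11", "camera_12"],
    "monitor_211": ["camera_11", "camera_10", "camera_12"],
}

def select_usable_image(monitor_id, images_by_camera, is_blocked):
    """Return the highest-priority image with an unobstructed view.

    `images_by_camera` maps camera IDs to surgical space images captured at
    the same time point; `is_blocked(image, monitor_id)` is assumed to
    return True when a blocking object covers the monitor screen.
    """
    for camera_id in PRIORITY_TABLE.get(monitor_id, []):
        image = images_by_camera.get(camera_id)
        if image is not None and not is_blocked(image, monitor_id):
            return camera_id, image
    return None, None  # no camera has an unobstructed view at this time point
```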
The control unit 51 acquires a plurality of surgical space images captured by the imaging devices 10, 11, 12, . . . (step S21), and stores the acquired images in an image DB 541 of a mass storage unit 54 (step S22).
The control unit 51 selects one image from the plurality of surgical space images on the basis of the priority table (step S23), and detects the monitor screen from the selected surgical space image (step S24). The control unit 51 determines whether or not the monitor screen is blocked by the blocking object (step S25). When determining that the monitor screen is blocked (YES in step S25), the control unit 51 selects another image from the plurality of surgical space images on the basis of the priority table (step S26), and returns the processing to step S24.
When determining that the monitor screen is not blocked (NO in step S25), the control unit 51 stores data of the detected monitor image in the image DB 541 (step S27).
The control unit 51 extracts information regarding the monitor screen on the basis of the data of the monitor image stored in the image DB 541 (step S28), and stores the extracted information in the measurement information DB 542 of the mass storage unit 54 together with the imaging date and time (step S29).
The control unit 51 determines whether or not there is an unprocessed monitor image (step S30), and when determining that there is an unprocessed monitor image (YES in step S30), returns the processing to step S23. When determining that there is no unprocessed monitor image (NO in step S30), the control unit 51 ends the processing.
In the second embodiment, the plurality of imaging devices 10, 11, 12, . . . is installed. Therefore, when the measurement information cannot be recognized from one surgical space image, another surgical space image can be used to complement it, so that the information about the medical device can be accurately recorded. In addition, by using, for each monitor screen, the surgical space image captured by the imaging device with high reliability, the information about the medical device can be recorded with even higher accuracy.
An information processing system according to the third embodiment is different from the first embodiment in that an information processing device 50 recognizes a blocking object present between each of monitor screens and an imaging device 10, and thus, the above difference will be mainly described below. The other configurations, operation, and effect are similar to those of the first embodiment, and thus, the corresponding portions are denoted by the same reference numerals, and the detailed description thereof will be omitted.
A control unit 51 recognizes a blocking object present between the monitor screen and the imaging device 10, and extracts information regarding the monitor screen except for a region of the recognized blocking object. In addition, when recognizing an operator as the blocking object, the control unit 51 specifies the position of the recognized operator.
The control unit 51 recognizes a blocking object using, for example, a technology called "YOLO", which detects an object such as a person from an image using a learning model trained by deep learning. For example, in a case where the display screen of the monitor 201 is blocked by the operator in the surgical space image captured by the imaging device 10 at a certain time point, the control unit 51 specifies the position of the operator as a point on the straight line between the imaging device 10 and the monitor 201. Note that, even in a case where only a part of the display screen of the monitor 201 is blocked by the operator, information may be extracted from the region that is not blocked. In this case, the control unit 51 specifies the pixel region of the operator on the display screen using a machine learning model of a segmentation method such as SegNet, extracts the image in the pixel region of the display screen other than the specified pixel region, and recognizes the measurement information in that region using a machine learning model that performs character recognition or by pattern matching.
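For illustration only, the following is a minimal sketch of extracting the unblocked portion of a partially blocked screen. The per-pixel person mask is assumed to come from a SegNet-style segmentation model, which is not shown; the helper names are illustrative.

```python
import numpy as np

def extract_unblocked_region(screen_image, person_mask):
    """Blank out the pixels occupied by the operator before recognition.

    `person_mask` is assumed to be a boolean array with the same height
    and width as `screen_image`, True where a SegNet-style segmentation
    model labeled the pixel as a person.
    """
    visible = screen_image.copy()
    visible[person_mask] = 0  # remove the blocked pixel region
    return visible  # character recognition is then run on this image

def specify_operator_position(imaging_device_id, monitor_id):
    """The operator is on the line of sight between the imaging device and
    the blocked monitor; record that constraint as the position."""
    return {"between": (imaging_device_id, monitor_id)}
```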
In the information processing device 50, the control unit 51 acquires the surgical space image captured by the imaging device 10 (step S41), and determines whether or not the monitor screen in the acquired surgical space image is blocked (step S42). When determining that the monitor screen is blocked (YES in step S42), the control unit 51 removes the blocked screen region (step S43), detects the monitor screen for the region that is not blocked (step S44), and stores data of the detected monitor image in the image DB 541 (step S45).
The control unit 51 extracts information regarding the monitor screen from the data of the monitor image stored in the image DB 541 (step S46), and stores the extracted information in the measurement information DB 542 together with the imaging date and time (step S47).
The control unit 51 determines whether or not the blocking object is an operator (step S48). When determining that the blocking object is the operator (YES in step S48), the control unit 51 specifies the position of the operator as a position between the monitor 201 and the imaging device 10, between the monitor 211 and the imaging device 10, or between the monitor 221 and the imaging device 10 (step S49), and stores the specified position in the measurement information DB 542 together with the imaging date and time (step S50). The control unit 51 determines whether or not there is an unprocessed monitor image (step S51), and when determining that there is an unprocessed monitor image (YES in step S51), returns the processing to step S46. When determining that there is no unprocessed monitor image (NO in step S51), the control unit 51 ends the processing.
When determining that the blocking object is not the operator (NO in step S48), the control unit 51 returns the processing to step S51.
In the third embodiment, the measurement information can be extracted from the monitor screen not blocked by the blocking object, and the position of the operator can be specified and recorded.
An information processing system according to the fourth embodiment is different from the first embodiment in that an information processing device 50 further acquires an actual measurement value by each measurement device, and thus, the above difference will be mainly described below. The other configurations, operation, and effect are similar to those of the first embodiment, and thus, the corresponding portions are denoted by the same reference numerals, and the detailed description thereof will be omitted.
In the information processing device 50, a control unit 51 compares an actual measurement value measured by each of the measurement devices 20, 21, 22, . . . with the measurement value (hereinafter referred to as an extracted measurement value) extracted from the surgical space image, and in a case where the difference between them is equal to or greater than a predetermined threshold, stores the monitor screen from which the measurement value was extracted and the imaging date and time in a measurement information DB 542 of a mass storage unit 54 in association with each other. The predetermined threshold may be set as necessary and stored in a storage unit 53 in advance. Note that, in a case where the information regarding the monitor screen extracted from the surgical space image is graph data, the control unit 51 may convert the graph data into a plurality of extracted measurement values and compare the converted extracted measurement values with the actual measurement values.
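For illustration only, the following is a minimal sketch of this comparison; the threshold value and the list-based stand-in for the measurement information DB 542 are assumptions made for the example.

```python
DIFFERENCE_THRESHOLD = 5.0  # illustrative value; set as necessary in advance

def check_extracted_value(actual, extracted, monitor_image, timestamp, db):
    """Compare the actual and extracted measurement values (step S63).

    When they differ by the threshold or more, store the monitor screen in
    association with the imaging date and time (step S64) and signal that
    warning information should be output (step S65).
    """
    if abs(actual - extracted) >= DIFFERENCE_THRESHOLD:
        db.append({"monitor_image": monitor_image, "timestamp": timestamp})
        return "warn"  # caller displays warning information on display unit 56
    return "ok"
```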
The control unit 51 acquires an actual measurement value (step S61), and compares the acquired actual measurement value with the extracted measurement value stored in the measurement information DB 542 whose imaging date and time matches the acquisition date and time (step S62).
The control unit 51 determines whether or not the difference between them is equal to or greater than a predetermined threshold (step S63). When determining that the difference is equal to or greater than the predetermined threshold (YES in step S63), the control unit 51 stores the monitor screen displaying the extracted measurement value in the measurement information DB 542 in association with the imaging date and time (step S64).
The control unit 51 outputs, to the display unit 56, warning information indicating that the difference is equal to or greater than the predetermined threshold (step S65), and ends the processing.
When determining that the difference is not equal to or greater than the predetermined threshold (NO in step S63), the control unit 51 ends the processing.
In the fourth embodiment, it is possible to determine whether or not an abnormality has occurred in the measurement device by comparing the actual measurement value with the extracted measurement value. When there is an abnormality in the measurement device, warning information is displayed, so that the status of the abnormality can be clearly presented. Furthermore, an extracted measurement value determined to be abnormal can be excluded in advance, which has not been conceived of in the related art.
An information processing system according to the fifth embodiment is different from the first embodiment in that an information processing device 50 detects the abnormality of a patient (subject), and thus, the above difference will be mainly described below. The other configurations, operation, and effect are similar to those of the first embodiment, and thus, the corresponding portions are denoted by the same reference numerals, and the detailed description thereof will be omitted.
In the information processing device 50, a control unit 51 detects an abnormality of the patient on the basis of various types of measurement information stored in a measurement information DB 542. For example, a storage unit 53 may store in advance a table in which a numerical value indicating a normal range is set for each measurement item, and it may be determined that an abnormality occurs when the measurement information about the patient is outside the normal range with reference to the table. Alternatively, the storage unit 53 may store in advance a table in which information indicating a normal trend is set for each measurement item, and it may be determined that an abnormality occurs when the trend of the measurement information about the patient is different from the normal trend with reference to the table.
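For illustration only, the following is a minimal sketch of the normal-range check; the table contents and units are assumptions made for the example.

```python
# Hypothetical normal-range table, one entry per measurement item.
NORMAL_RANGES = {
    "systolic_blood_pressure": (90, 140),  # mmHg; illustrative values
    "heart_rate": (50, 100),               # bpm; illustrative values
}

def is_abnormal(measurement_item, value):
    """Return True when the value falls outside the normal range
    registered in advance for the measurement item."""
    low, high = NORMAL_RANGES[measurement_item]
    return not (low <= value <= high)
```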
When determining that there is an abnormality, the control unit 51 outputs information regarding the abnormality to the display unit 56. The information regarding the abnormality may be information presenting an abnormal measurement value or trend information, or may be information presenting a measurement item in which an abnormality has occurred. When determining that there is an abnormality, the control unit 51 may output advice information for eliminating the abnormality to the display unit 56.
The present embodiment will describe a case where the measurement device is an extracorporeal circulation device as an example.
Specifically, the control unit 51 calculates a change in the flow rate or the pressure per unit time for each measurement item, stores the calculated change in, for example, the mass storage unit 54, determines whether or not an abnormality has occurred on the basis of the reference table, specifies the type of the abnormality when determining that the abnormality has occurred, and outputs advice information for eliminating the abnormality to the display unit 56. For example, when each of the measurement items K1, K2, K3, and K4 shows a decreasing trend, the control unit 51 determines that there is abnormality A, and displays (outputs) a message indicating “there may be a thrombus in the cannula for drawing blood” on the display unit 56 as advice information for eliminating the abnormality. In addition, the control unit 51 may display the type of abnormality on the display unit 56 together with the advice information. Furthermore, the control unit 51 may display, on the display unit 56, a combination of the trend information about various measurement values by the extracorporeal circulation device and the advice information. Alternatively, the control unit 51 may display an image according to the reference table on the display unit 56.
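For illustration only, the following is a minimal sketch of the trend-based determination and advice lookup. The entry for abnormality A follows the example above; the trend classification and the remainder of the reference table are assumptions made for the example.

```python
# Hypothetical reference table mapping the trends of measurement items
# K1-K4 to an abnormality type and advice information.
REFERENCE_TABLE = {
    ("down", "down", "down", "down"): (
        "abnormality A",
        "there may be a thrombus in the cannula for drawing blood",
    ),
}

def classify_trend(values):
    """Label a series of measurement values as 'up', 'down', or 'flat'
    from the overall change (a simplification for the example)."""
    delta = values[-1] - values[0]
    if delta < 0:
        return "down"
    return "up" if delta > 0 else "flat"

def diagnose(k1, k2, k3, k4):
    """Return (abnormality type, advice) for measurement items K1-K4,
    or None when no entry in the reference table matches."""
    pattern = tuple(classify_trend(v) for v in (k1, k2, k3, k4))
    return REFERENCE_TABLE.get(pattern)
```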
Note that the control unit 51 may determine the possibility of abnormality in the patient when detecting that the abnormality information is displayed on the monitor screen. When determining that there is an abnormality, the control unit 51 may store the monitor screen on which the abnormality information is displayed in the measurement information DB 542 in association with the imaging date and time.
The control unit 51 acquires measurement information for each measurement item (step S71), calculates a change in the measurement information about each measurement item (step S72), and stores the calculated change in the measurement information in the mass storage unit 54 (step S73). The control unit 51 determines whether or not there is an abnormality in the measurement information on the basis of the reference table (step S74), and when determining that there is no abnormality (NO in step S74), the control unit 51 determines that the measurement information is normal (step S75), and ends the processing.
When determining that there is an abnormality (YES in step S74), the control unit 51 reads advice information for eliminating the abnormality from the reference table (step S76), outputs the read advice information to the display unit 56 (step S77), and ends the processing.
In the fifth embodiment, when there is an abnormality in the information about a medical device, the content of the abnormality can be recorded, and a countermeasure can be presented depending on the abnormality. Therefore, the burden on medical professionals can be reduced.
An information processing system according to the sixth embodiment is different from the first embodiment in that an information processing device 50 further performs behavior recognition processing on the basis of an image of an operator included in a surgical space image, and thus, the above difference will be mainly described below. The other configurations, operation, and effect are similar to those of the first embodiment, and thus, the corresponding portions are denoted by the same reference numerals, and the detailed description thereof will be omitted.
An imaging device 10 captures images of the surgical space and sequentially acquires surgical space images frame by frame. The imaging device 10 is configured to acquire surgical space images at, for example, 60, 30, or 15 frames per second, and the surgical space images acquired by the imaging device 10 are sequentially transmitted to the information processing device 50.
In addition to extracting the information regarding the monitor screen, the control unit 51 performs behavior recognition processing of recognizing the behavior of an operator in the surgical space imaged by the imaging device 10, and stores information regarding the recognized behavior of the operator and the information regarding the monitor screen in time series in association with each other. In the behavior recognition processing, the control unit 51 recognizes the behavior of the operator from the surgical space image using a behavior recognition model M stored in the storage unit 53. The behavior of the operator includes, but is not limited to, incision, suture, injection, endoscopic operation, catheter insertion, balloon expansion, use of an electric scalpel, hemostasis, waiting, and the like.
The behavior recognition model M is, for example, a neural network that receives a captured image as an input and outputs a discrimination label corresponding to the behavior of the person included in the image.
The behavior recognition model M is trained using training data that includes a correct answer label indicating each behavior and a captured image of a person performing that behavior. The model is trained so that, when the captured image included in the training data is input, it outputs the discrimination label corresponding to the behavior indicated by the correct answer label. In the learning processing, the weighting coefficients that couple the nodes of each layer and the coefficients of the functions are optimized. A trained behavior recognition model M is thereby obtained that, upon receiving a captured image as an input, specifies the behavior performed by the person included in the captured image and outputs information regarding the behavior as the result. Note that, as the captured images used for the training data, images obtained by capturing persons performing various different behaviors can be used as images of the respective behaviors.
The behavior recognition model M in the present embodiment may receive, for example, captured images of a plurality of frames sequentially captured by the imaging device 10, or captured images of a plurality of frames extracted at predetermined time intervals (for example, 1 second) from the captured images captured by the imaging device 10. The behavior recognition model M is trained by another learning device, but may instead be trained by the information processing device 50. The behavior recognition model M is also not limited to the neural network described above.
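For illustration only, the following is a minimal sketch of applying the behavior recognition model M to a frame sequence; the `predict` interface and the label list are assumptions made for the example, not a fixed API.

```python
BEHAVIOR_LABELS = [
    "incision", "suture", "injection", "endoscopic operation",
    "catheter insertion", "balloon expansion",
    "use of an electric scalpel", "hemostasis", "waiting",
]

def recognize_behavior(model, frames):
    """Return the most probable behavior for a short frame sequence.

    `model` is assumed to expose predict(frames) -> one probability per
    label in BEHAVIOR_LABELS; `frames` would typically be frames sampled
    from the imaging device 10 at predetermined intervals.
    """
    probs = model.predict(frames)
    best = max(range(len(BEHAVIOR_LABELS)), key=lambda i: probs[i])
    return BEHAVIOR_LABELS[best], probs[best]
```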
The control unit 51 stores the information regarding the behavior of the operator output from the behavior recognition model M in a measurement information DB 542. In addition, when the control unit 51 determines that an abnormality has occurred and specifies the type of the abnormality as in the fourth and fifth embodiments described above, it outputs abnormality information including the behavior of the operator and the type of the abnormality to the display unit 56.
The control unit 51 acquires the surgical space image (step S81) and inputs the image to the behavior recognition model M (step S82). The control unit 51 acquires the information regarding the behavior of the operator output from the behavior recognition model M (step S83), and stores the acquired information in the measurement information DB 542 in association with the measurement information (step S84).
The control unit 51 determines whether or not the abnormality flag is set (step S85), and when determining that the abnormality flag is not set (NO in step S85), the control unit 51 stores the type of the abnormality as “none” (step S86), and ends the processing.
When determining that the abnormality flag is set (YES in step S85), the control unit 51 determines whether or not the type of the abnormality has been identified (step S87). When determining that the type of the abnormality has been identified (YES in step S87), the control unit 51 stores the identified type of the abnormality in the abnormality type column of the measurement information DB 542 (step S88), outputs the abnormality information to the display unit 56 (step S89), and ends the processing.
When determining that the type of the abnormality has not been identified (NO in step S87), the control unit 51 determines whether or not the behavior of the operator corresponds to a behavior that is likely to generate noise (step S90). When determining that the behavior of the operator corresponds to the behavior that is likely to generate noise (YES in step S90), the control unit 51 stores “noise” as the type of the abnormality in the abnormality type column of the measurement information DB 542 (step S91), and returns the processing to step S89.
When determining that the behavior of the operator does not correspond to the behavior which is likely to generate noise (NO in step S90), the control unit 51 stores “unknown” as the type of abnormality in the abnormality type column of the measurement information DB 542 (step S92), outputs a message indicating “Please check the device” to the display unit 56 (step S93), and ends the processing.
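For illustration only, steps S85 to S93 can be summarized as the following sketch; the set of noise-prone behaviors and the dictionary stand-in for one record of the measurement information DB 542 are assumptions made for the example.

```python
# Illustrative assumption: behaviors treated as likely to generate noise.
NOISY_BEHAVIORS = {"use of an electric scalpel"}

def record_abnormality_type(abnormality_flag, identified_type, behavior, record):
    """Fill the abnormality type column following steps S85 to S93;
    `record` stands in for one row of the measurement information DB 542."""
    if not abnormality_flag:
        record["abnormality_type"] = "none"           # step S86
        return None
    if identified_type is not None:
        record["abnormality_type"] = identified_type  # step S88
        return "output abnormality information"       # step S89
    if behavior in NOISY_BEHAVIORS:
        record["abnormality_type"] = "noise"          # step S91
        return "output abnormality information"       # step S89
    record["abnormality_type"] = "unknown"            # step S92
    return "Please check the device"                  # step S93
```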
In the sixth embodiment, it is possible to support determination of the cause of the abnormality in the measurement information by specifying the type of the behavior of the operator and recording the type together with the measurement information.
An information processing system according to the seventh embodiment is different from the first embodiment in that an information processing device 50 does not store the detection result of the monitor screen, and thus, the above difference will be mainly described below. The other configurations, operation, and effect are similar to those of the first embodiment, and thus, the corresponding portions are denoted by the same reference numerals, and the detailed description thereof will be omitted.
The control unit 51 acquires a surgical space image captured by an imaging device 10 (step S11), and detects a monitor screen from the acquired image (step S12).
The control unit 51 extracts information regarding the monitor screen from the detected monitor screen (step S13), stores the extracted information in a measurement information DB 542 together with the imaging date and time (step S14), and ends the processing.
In the seventh embodiment, the detection result of the monitor screen is not stored but is used directly in the processing of extracting the measurement information, whereby the real-time performance of the processing can be improved. In addition, no capacity is needed to store the detection result of the monitor screen, so the capacity of the mass storage unit 54 is not a concern.
The technical features (components) described in the respective embodiments can be combined with each other, and new technical features can be formed by the combination.
It should be construed that the embodiments disclosed herein are illustrative in all respects and not restrictive. The scope of the present invention is indicated not by the above description but by the claims, and is intended to include all changes within the meaning and scope equivalent to the claims.
Number | Date | Country | Kind
---|---|---|---
2022-054471 | Mar. 2022 | JP | national
This application is a continuation of PCT Application No. PCT/JP2023/007183, filed Feb. 28, 2023, based on and claiming priority to Japanese Application No. JP2022-054471, filed Mar. 29, 2022, both of which are incorporated herein by reference in their entirety.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2023/007183 | Feb. 2023 | WO
Child | 18769701 | | US