The present U.S. patent application claims priority under the Paris Convention to Japanese patent application No. 2014-030821 filed on Feb. 20, 2014, the entirety of which is incorporated herein by reference.
1. Field of the Invention
The present invention relates to an information processing system and a non-transitory computer readable recording medium.
2. Description of the Background Art
In medical settings, doctors make image diagnoses by displaying medical images generated by various modalities, observing lesions, and thereby obtaining findings. In image diagnosis, doctors create reports that show diagnostic results including finding sentences.
The reports include a plurality of items, such as a shooting condition, a basic body part, basic findings, and diagnosis, for which various terms should be input. Creating such reports thus places a heavy burden on doctors.
To address the problem, technology for detecting an abnormality included in an image, and creating a radiologic interpretation report with use of a combination of a type and a location of the abnormality has been proposed for the purpose of supporting creation of reports showing diagnostic results, for example (e.g., Japanese Patent No. 3332104).
With the above-mentioned technology disclosed in Japanese Patent No. 3332104, however, the anatomy captured in an image is handled in a two-dimensional coordinate system, and a location of an abnormality is detected from only a small number of locations, for example, six locations such as left and right upper lung fields, left and right middle lung fields, and left and right lower lung fields. As such, with the above-mentioned technology disclosed in Japanese Patent No. 3332104, it is difficult to accurately detect locations of abnormalities in three-dimensional anatomies, which are the subjects captured in medical images generated by various modalities, and to efficiently create proper finding sentences.
An object of the present invention is therefore to provide technology for efficiently and properly creating diagnostic reports.
To achieve the above-mentioned object, an information processing system reflecting one aspect of the present invention includes a storage, a location designation unit, a body part identification unit, a body part designation unit, a display controller, and an element designation unit. The storage is capable of storing therein a combination information group that indicates a plurality of combinations of elements that belong to a plurality of items. The plurality of items include a first item concerning a body part in a three-dimensional anatomy, and a second item and a third item each different from the first item. The location designation unit designates, in accordance with a user action, an attention location in a medical image displayed by a display unit. The body part identification unit identifies, from among elements that belong to the first item, two or more elements that correspond to the attention location. The body part designation unit designates, in accordance with a user action, one of the two or more elements identified by the body part identification unit. The display controller causes the display unit to display, based on one or more combinations of elements that are indicated by the combination information group and correspond to the one element designated by the body part designation unit, two or more elements that are included in the one or more combinations of elements and belong to the second item so that the two or more displayed elements are distinguishable from the other elements. The element designation unit designates, in accordance with a user action, one or more of the two or more elements that belong to the second item and are displayed by the display unit. 
The display controller causes the display unit to display, based on at least one combination of elements that is indicated by the combination information group and corresponds to a combination of the one element designated by the body part designation unit and each of the one or more elements designated by the element designation unit, one or more elements that are included in the at least one combination of elements and belong to the third item so that the one or more displayed elements are distinguishable from the other elements.
Another aspect of the present invention is also directed to a non-transitory computer readable recording medium storing a computer-readable program, the program controlling an information processing system to operate as one information processing system. The one information processing system includes a storage, a location designation unit, a body part identification unit, a body part designation unit, a display controller, and an element designation unit. The storage is capable of storing therein a combination information group that indicates a plurality of combinations of elements that belong to a plurality of items. The plurality of items include a first item concerning a body part in a three-dimensional anatomy, and a second item and a third item each different from the first item. The location designation unit designates, in accordance with a user action, an attention location in a medical image displayed by a display unit. The body part identification unit identifies, from among elements that belong to the first item, two or more elements that correspond to the attention location. The body part designation unit designates, in accordance with a user action, one of the two or more elements identified by the body part identification unit. The display controller causes the display unit to display, based on one or more combinations of elements that are indicated by the combination information group and correspond to the one element designated by the body part designation unit, two or more elements that are included in the one or more combinations of elements and belong to the second item so that the two or more displayed elements are distinguishable from the other elements. The element designation unit designates, in accordance with a user action, one or more of the two or more elements that belong to the second item and are displayed by the display unit. 
The display controller causes the display unit to display, based on at least one combination of elements that is indicated by the combination information group and corresponds to a combination of the one element designated by the body part designation unit and each of the one or more elements designated by the element designation unit, one or more elements that are included in the at least one combination of elements and belong to the third item so that the one or more displayed elements are distinguishable from the other elements.
These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
The following describes one embodiment and modifications of the present invention based on the drawings. It should be noted that components having a similar structure and function bear the same reference sign in the drawings, and repetition of description thereof is avoided below. The drawings are those shown schematically, and sizes of and positional relationships among various components in each of the drawings are not accurate, and may be changed as appropriate.
The server 10 is a computing device that stores therein data on a medical image which is acquired by the modality 20 and in which three-dimensional anatomical components as subjects are captured. In response to a request from the terminal device 30, the server 10 provides data on a desired medical image to the terminal device 30. The server 10 has various functions to perform processing (also referred to as diagnosis support processing) to support a diagnostic action taken with respect to a medical image by use of the terminal device 30. The diagnosis support processing includes processing to support creation of a report (also referred to as a diagnostic result report) that shows a result of diagnosis made with respect to the medical image, for example. The diagnostic result report includes a radiologic interpretation report that shows a result of radiologic interpretation, for example.
The modality 20 is a device that acquires data on a medical image in which three-dimensional anatomical components as subjects are captured. Examples of the modality 20 are a computed radiography (CR) device, an ultrasound diagnostic (US) device and the like. A subject of shooting is not limited to a human, and may be other animals. Examples of the three-dimensional anatomical components are various organs, bones, joints, and the like included in a chest area.
The terminal device 30 is in the form of a personal computer (hereinafter, abbreviated as a PC), for example, and is a terminal device into which a diagnostic result report is input by a user, such as a doctor, who has specialized medical knowledge.
The controller 11 includes a processor 11a, such as a central processing unit (CPU), and volatile memory 11b, such as random access memory (RAM), for example. The processor 11a achieves the diagnosis support processing by reading a program 123 stored in the storage 12 and performing a variety of processing in accordance with the program 123. That is to say, functions of the information processing system 1 to perform the diagnosis support processing are achieved by the processor 11a executing the program 123.
The storage 12 includes non-volatile semiconductor memory or a hard disk, for example. The storage 12 can store therein medical care information 121, support information 122, and the program 123. The storage 12 can also store therein a variety of data that indicates a parameter and the like that are required to perform processing in accordance with the program 123 and data that is at least temporarily generated as a result of arithmetic processing, for example.
The electronic medical record DB 121a stores therein data on electronic medical records of many patients, for example.
The examination list 121b is a list of basic information pieces on many examinations, for example. The basic information pieces include information pieces on identification information (an ID) of a subject, a name, a birth date, age, sex, a state regarding creation of a diagnostic result report, identification information of an examination (an examination ID), an examination date, an examined body part, a modality that has acquired a medical image, and the number of medical images, for example. The state regarding creation of a diagnostic result report includes a state (also referred to as a not created state) in which a diagnostic result report has not been created and a state (also referred to as a created state) in which a diagnostic result report has been created.
The image DB 121c stores therein, for each of the examinations listed in the examination list 121b, data on a medical image acquired by the modality 20 in association with identification information, such as an examination ID, listed in the examination list.
The report DB 121d stores therein, for each of the examinations listed in the examination list 121b, data on a diagnostic result report in association with identification information, such as an examination ID, listed in the examination list 121b.
The location correspondence information 122a is information that associates information pieces on locations in a predetermined model (an anatomical model) concerning a three-dimensional anatomy with elements that belong to an item showing a basic body part in the three-dimensional anatomy. In the present embodiment, the elements that belong to the item showing the basic body part are each a term showing a name of the basic body part. That is to say, the location correspondence information 122a includes information that associates terms showing names of a plurality of basic body parts that indicate anatomical sections of a human body with information pieces on locations of predetermined areas in a preset anatomical model.
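The association described above can be illustrated with a minimal Python sketch, assuming the anatomical model is represented by simple three-dimensional bounding-box regions; the class names, region bounds, and body part entries below are hypothetical and chosen only for illustration.

```python
from dataclasses import dataclass

@dataclass
class ModelRegion:
    # (min, max) bounds of a predetermined area in the anatomical model
    x_range: tuple
    y_range: tuple
    z_range: tuple

    def contains(self, point):
        # True when the 3-D point lies inside this region of the model
        x, y, z = point
        return (self.x_range[0] <= x <= self.x_range[1]
                and self.y_range[0] <= y <= self.y_range[1]
                and self.z_range[0] <= z <= self.z_range[1])

# Illustrative entries only; actual regions would come from the preset model.
LOCATION_CORRESPONDENCE = {
    "right upper lung field": ModelRegion((0.0, 0.4), (0.6, 1.0), (0.0, 1.0)),
    "right middle lung field": ModelRegion((0.0, 0.4), (0.3, 0.6), (0.0, 1.0)),
    "heart": ModelRegion((0.35, 0.65), (0.2, 0.6), (0.3, 0.8)),
}

def body_parts_at(point):
    """Return the names of all basic body parts whose region contains the point."""
    return [name for name, region in LOCATION_CORRESPONDENCE.items()
            if region.contains(point)]
```

Because regions may overlap, a single model location can map to two or more basic body part terms, which is what makes the priority information described next useful.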
As illustrated in
The priority information 122b is information that associates the terms showing names of the basic body parts that indicate anatomical sections in the chest area, which are listed in the location correspondence information 122a, with respective numerical values showing priorities (also referred to as priority degrees). The priority information 122b is used by a body part designation unit 308, which is described later. For example, when a doctor designates an area, on a medical image, to which attention is to be paid (also referred to as an attention area), and a plurality of terms concerning basic body parts are identified as terms that correspond to the attention area, a term concerning a basic body part having a high priority degree is designated preferentially in accordance with the priority degrees.
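The preferential designation described above can be sketched as follows; the priority degrees assigned here are hypothetical values for illustration, not the actual contents of the priority information 122b.

```python
# Illustrative priority degrees: higher values are designated preferentially.
PRIORITY = {
    "right middle lung field": 2,
    "heart": 5,
    "rib": 1,
}

def preferred_term(candidates):
    """Pick, from the candidate body part terms, the one with the highest priority degree."""
    return max(candidates, key=lambda term: PRIORITY.get(term, 0))
```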
The term information 122c is information that includes, with respect to a finding sentence and attribute information concerning the finding sentence each included in a diagnostic result report, a list of one or more terms as one or more elements, that can be used as one or more term for a corresponding one of items, by an item. In the present embodiment, the term information 122c is in the form of a table.
When a finding sentence included in a radiologic interpretation report is taken as an example, the items include items concerning the attribute information, such as “examined body part” and “shooting condition”, and a plurality of items concerning a current state of a patient, such as “basic body part”, “basic findings”, and “diagnosis”. That is to say, in the present embodiment, the plurality of items include an item concerning a body part as a first item, an item concerning findings as a second item, and an item concerning diagnosis as a third item. This facilitates designation of elements concerning findings and diagnosis in creating the diagnostic result report. A term includes a symbol that represents the term.
For example, as for the category “lung field”, a plurality of terms (a frontal view, a lateral view, and a lateral decubitus view) are listed for the item “shooting condition”, and a plurality of terms (e.g., an entire lung field, an apical portion of a lung, and an upper lung field) are listed for the item “basic body part”. In addition, a plurality of terms (e.g., tumor shadow, ground-glass opacity, and an increased concentration area) are listed for the item “basic findings”, and a plurality of terms (e.g., interstitial pneumonia and pneumonia) concerning each of classes of a disease (e.g., an infectious disease) are listed for the item “diagnosis”, for example.
The combination frequency information 122d is information that indicates frequency of combination of terms used in diagnostic result reports including finding sentences. In the present embodiment, the combination frequency information 122d is in the form of a table. As the frequency, the number of times terms are used in combination in many past diagnostic result reports can be used, for example. That is to say, in the combination frequency information 122d, a plurality of terms belong to each of a plurality of items that include the item concerning a body part in the three-dimensional anatomy, which is the first item, the item concerning basic findings, which is the second item, and the item concerning diagnosis, which is the third item. The combination frequency information 122d as a combination information group indicates a plurality of combinations of terms that belong to the plurality of items.
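Counting how often term combinations appear in past reports can be sketched in Python as below; the report contents are hypothetical examples, and a real implementation would read structured report data from the report DB 121d.

```python
from collections import Counter

# Illustrative structured report data only; each report is reduced here to
# one term per item of interest.
past_reports = [
    {"basic body part": "upper lung field", "basic findings": "tumor shadow",
     "diagnosis": "pneumonia"},
    {"basic body part": "upper lung field", "basic findings": "ground-glass opacity",
     "diagnosis": "interstitial pneumonia"},
    {"basic body part": "upper lung field", "basic findings": "tumor shadow",
     "diagnosis": "pneumonia"},
]

# The combination frequency table maps each (body part, findings, diagnosis)
# combination of terms to the number of reports in which it appears.
combination_frequency = Counter(
    (r["basic body part"], r["basic findings"], r["diagnosis"])
    for r in past_reports
)
```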
In the present embodiment, the term information 122c and the combination frequency information 122d are stored such that combinations of terms can be distinguished from one another by a combination of a term concerning an examined body part (e.g., CHEST and HEAD) and a term concerning a modality (e.g., CR and US).
The term information 122c and the combination frequency information 122d as described above can be created from information obtained by structuring many diagnostic result reports with use of a resource description framework (RDF) and an extensible markup language (XML), for example. For example, structuring with use of the resource description framework (RDF) is achieved by extracting, as for each diagnostic result report, a necessary element from a finding sentence that is a natural sentence and extracting various elements from the attribute information. As a result, as for many diagnostic result reports, data on many structured diagnostic result reports (also referred to as structured report data) is generated. In the present embodiment, an element extracted from a finding sentence and the attribute information is a term. A term includes a symbol that represents the term.
Specifically, information on a diagnostic result report is structured by dividing a variety of information included in report data and the attribute information into terms that belong to respective items and describing the terms with use of the RDF based on model data obtained through machine learning, for example. The model data is herein data of a model that indicates how elements constituting an existing radiologic interpretation report are divided into elements that belong to respective items. The items include an item concerning the attribute information (e.g., an examined body part and a shooting condition) and a plurality of items concerning a current state of a patient (e.g., a basic body part, basic findings, and diagnosis).
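The division of a report into terms that belong to respective items can be pictured as RDF-style (subject, predicate, object) triples. The sketch below is a simplified illustration with hypothetical item names and terms, not the actual machine-learned structuring.

```python
def structure_report(report_id, terms_by_item):
    """Describe the terms extracted from one report as RDF-style triples."""
    return [(report_id, item, term)
            for item, terms in terms_by_item.items()
            for term in terms]

# Illustrative structured report: attribute information items plus items
# concerning the current state of the patient.
triples = structure_report("report-001", {
    "examined body part": ["CHEST"],
    "shooting condition": ["frontal view"],
    "basic body part": ["upper lung field"],
    "basic findings": ["tumor shadow"],
    "diagnosis": ["pneumonia"],
})
```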
For example, the finding sentences shown in
If there are too many synonyms (e.g., terms “T2-weighted image” and “T2WI”) when the term information 122c and the combination frequency information 122d are built up, the number of terms increases excessively. With respect to synonyms, processing to replace each of the synonyms with a single representative term may be performed, for example. Replacement with the representative term can be achieved by including, in information used in machine learning, a table in which a plurality of terms are associated with a representative term, for example.
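The replacement of synonyms with a single representative term can be sketched as a simple lookup; the table below contains only the example pair mentioned above, and any real synonym table would be far larger.

```python
# Hypothetical synonym table: each synonym maps to its representative term,
# so that the number of terms does not increase excessively.
SYNONYM_TABLE = {
    "T2WI": "T2-weighted image",
}

def normalize(term):
    """Replace a synonym with its representative term, if one is defined."""
    return SYNONYM_TABLE.get(term, term)
```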
In a diagnostic result report, a term such as a modifier can be added to each of terms that belong to respective items, for example. For example, a term “circular nodular shadow” is a complex of a term “nodular shadow” that belongs to the item “basic findings” and a term “circular” as a modifier. In this case, a term that belongs to a detailed item “modifier” (hereinafter also referred to as a detailed element) may be identified from a finding sentence that is a natural sentence, and structured report data may be generated such that the detailed element is included. The term information 122c and the combination frequency information 122d may also be generated such that the detailed element is included.
The number of diagnostic result reports stored in the medical care information 121 can increase each time a new diagnostic result report is generated in response to input from the terminal device 30. Use of the diagnostic result reports stored over time as knowledge from the past is effective. Therefore, reflecting a term and a combination of terms included in a newly generated diagnostic result report in the term information 122c and the combination frequency information 122d is also effective. A newly stored diagnostic result report that includes a new finding sentence is especially valuable because it further develops the knowledge from the past.
The communication unit 15 performs data transmission/reception with a device other than the server 10 via the communication line W1, for example.
The controller 31 includes a processor 31a, such as a central processing unit (CPU), and volatile memory 31b, such as random access memory (RAM), for example. The processor 31a achieves the diagnosis support processing by reading a program 321 stored in the storage 32 and performing a variety of processing in accordance with the program 321. That is to say, functions of the information processing system 1 to perform the diagnosis support processing are achieved by the processor 31a executing the program 321.
The storage 32 includes non-volatile semiconductor memory or a hard disk, for example, and stores therein the program 321 and a variety of data. The variety of data can include data that indicates a parameter and the like that are required to perform processing in accordance with the program 321 and data that is at least temporarily generated as a result of arithmetic processing.
The operation unit 33 includes a keyboard and a pointing device, such as a mouse, for example. The operation unit 33 outputs, to the controller 31, a signal (also referred to as an instruction signal) that is generated in accordance with an operation performed on the keyboard, the mouse, and the like. The operation unit 33 may be in the form of a touch panel or the like.
The display unit 34 includes various display devices, such as a liquid crystal display (LCD), for example. The display unit 34 visually outputs a variety of information in response to a signal input from the controller 31.
The communication unit 35 performs data transmission/reception with a device other than the terminal device 30 via the communication line W1 and the like.
The information processing system 1 includes a plurality of functional components for achieving the diagnosis support processing in the controllers 11 and 31, for example. The plurality of functional components include a reading unit 301, an examination designation unit 302, a management unit 303, a display controller 304, a location designation unit 305, a body part identification unit 306, a storage controller 307, a body part designation unit 308, an information extraction unit 309, an association identification unit 310, an element designation unit 311, and a registration designation unit 312.
The reading unit 301 reads a variety of information from the medical care information 121 in response to a signal from the operation unit 33 and a command from the management unit 303. The variety of read information includes information on the examination list 121b and a variety of data on an examination targeted for creation of a diagnostic result report (also referred to as a creation target examination), for example. The variety of information read by the reading unit 301 is visually output by the display unit 34 as appropriate through control performed by the display controller 304. As a result, an examination list screen DL1 (
The examination designation unit 302 designates a creation target examination in response to a signal from the operation unit 33.
The management unit 303 specifies a task of creating a diagnostic result report corresponding to the creation target examination designated by the examination designation unit 302, and causes the reading unit 301 to read a variety of information on the creation target examination from the medical care information 121. The variety of read information includes the data on the medical image regarding the creation target examination and basic information regarding the creation target examination, for example.
The display controller 304 controls visual output of a variety of information performed by the display unit 34. The display controller 304 causes the display unit 34 to display the examination list screen DL1 (
The display controller 304 also causes the display unit 34 to display a list of terms (also referred to as a term list) as a plurality of options, based on information on at least one or more combinations of terms indicated by the combination frequency information 122d, which is the combination information group. The term list includes a first part concerning the first item, a second part concerning the second item, and a third part concerning the third item, for example. The first part is a part in which terms belonging to an item “basic body part”, which is the first item, are listed for a name of the first item. The second part is a part in which terms belonging to an item “basic findings”, which is the second item, are listed for a name of the second item. The third part is a part in which terms belonging to an item “diagnosis”, which is the third item, are listed for a name of the third item. The information on at least one or more combinations of terms indicated by the combination frequency information 122d can be extracted by the information extraction unit 309 as appropriate.
The location designation unit 305 designates, in accordance with a user operation on the operation unit 33, an attention location on a medical image displayed by the display unit 34. An example of the attention location is a location, on a medical image, at which a doctor has found an abnormal shadow or the like. Designation of the attention location may be achieved, for example, by setting a mouse pointer at a given location on the medical image and double-clicking the left mouse button, that is, pressing the left mouse button twice in quick succession.
The body part identification unit 306 performs processing (also referred to as body part identification processing) to identify, from among terms that belong to the “basic body part”, which is the first item, two or more terms that correspond to the attention location designated by the location designation unit 305 as candidates for a term concerning the item “basic body part”. In the body part identification processing, information (also referred to as location information) indicating the attention location on the medical image is converted into information on a location in the anatomical model, and two or more terms that belong to the item “basic body part”, which is the first item, and are associated with the information on the location in the anatomical model are extracted from the location correspondence information 122a. This facilitates identification of a body part according to the three-dimensional anatomy.
Since subjects captured in a two-dimensional medical image are three-dimensional anatomical components, two or more body parts can exist in a depth direction of the medical image at the attention location designated on the two-dimensional medical image. In the present embodiment, two or more terms that show names of basic body parts and correspond to the designated attention location are identified as candidates for a term concerning the item “basic body part”.
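The body part identification processing described above can be sketched as follows, assuming the conversion from image coordinates to model coordinates is a simple normalization and that model regions are given as 2-D bounds viewed along the depth direction; both assumptions, and all names and values, are hypothetical illustrations.

```python
# Illustrative 2-D projections of model regions (x and y bounds); depth (z)
# is deliberately ignored, since two or more body parts can exist in the
# depth direction at the designated attention location.
MODEL_REGIONS_2D = {
    "right middle lung field": ((0.0, 0.4), (0.3, 0.6)),
    "heart": ((0.35, 0.65), (0.2, 0.6)),
}

def identify_body_parts(attention_xy, model_regions, image_size):
    """Return candidate body part terms for a 2-D attention location on the image."""
    w, h = image_size
    # Convert the pixel location into the model's coordinate system
    # (an assumed normalization, for illustration only).
    mx, my = attention_xy[0] / w, attention_xy[1] / h
    candidates = []
    for name, ((x0, x1), (y0, y1)) in model_regions.items():
        # A region matches when it covers (mx, my) at any depth.
        if x0 <= mx <= x1 and y0 <= my <= y1:
            candidates.append(name)
    return candidates
```

A location that lies over overlapping regions yields two or more candidate terms, which are then narrowed down by the body part designation unit 308.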
The storage controller 307 causes the storage 12 to store therein the location information on the attention location, in the medical image, designated by the location designation unit 305. In this case, the location information is stored within the information on the creation target examination included in the medical care information 121 stored in the storage 12, for example. Examples of the location information are an address and coordinates that specify a pixel of the medical image.
At the same time, the display controller 304 causes the display unit 34 to display the attention location on the medical image based on the location information on the attention location, in the medical image, stored in the storage 12 so that the attention location is distinguishable from a surrounding area. For example, the attention location may be indicated by use of a preset marker. Examples of the marker are a frame enclosing the attention location and an arrow pointing to the attention location. By thus displaying the attention location on the medical image so that the attention location is distinguishable, the attention location can easily be referred to. As a result, diagnostic result reports can properly be created.
The body part designation unit 308 designates, in accordance with a user operation on the operation unit 33, one of the two or more terms identified, as the candidates for the term, by the body part identification unit 306.
The information extraction unit 309 extracts information that corresponds to the one term that is designated by the body part designation unit 308 and belongs to the item “basic body part” from the combination frequency information 122d, which is the combination information group, stored in the support information 122.
The association identification unit 310 identifies, in response to designation of the one term by the body part designation unit 308, one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to the one term designated by the body part designation unit 308. In this case, the display controller 304 causes the display unit 34 to display one or more terms that are included in the one or more combinations of terms identified by the association identification unit 310 and belong to the second item “basic findings” based on the one or more combinations of terms so that the one or more terms are distinguishable from the other terms, for example.
Specifically, the display controller 304 causes the display unit 34 to display, in the term list, the one or more terms that are included in the one or more combinations of terms identified by the association identification unit 310 and belong to the second item “basic findings” so that the one or more terms are distinguishable from one or more remaining terms that belong to the second item “basic findings”, for example. The one or more terms that belong to the item “basic findings” and are associated with the designated term that belongs to the item “basic body part” can thus easily be known. As a result, designation of terms in creating a diagnostic result report is further facilitated.
Furthermore, the display controller 304 may cause the display unit 34 to display, in the term list, one or more terms that belong to the second item “basic findings” and are included in the one or more combinations of terms identified by the association identification unit 310 in accordance with frequency of appearance in the one or more combinations of terms, for example. An example of the frequency of appearance is frequency regarding one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to the one term that is designated by the body part designation unit 308. The one or more terms that belong to the item “basic findings” and are strongly associated with the designated term that belongs to the item “basic body part” can thus easily be known. As a result, designation of terms in creating a diagnostic result report is further facilitated.
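The extraction and frequency-ordered display of findings terms can be sketched in Python as below; the combination frequency table and its contents are hypothetical examples standing in for the combination frequency information 122d.

```python
from collections import Counter

# Illustrative combination frequency information: keys are
# (basic body part, basic findings, diagnosis), values are frequencies.
COMBINATION_FREQUENCY = {
    ("upper lung field", "tumor shadow", "pneumonia"): 3,
    ("upper lung field", "ground-glass opacity", "interstitial pneumonia"): 5,
    ("heart", "cardiomegaly", "heart failure"): 2,
}

def findings_for_body_part(combination_frequency, body_part):
    """Rank the findings terms combined with the body part, most frequent first."""
    counts = Counter()
    for (part, finding, _diagnosis), freq in combination_frequency.items():
        if part == body_part:
            counts[finding] += freq
    return [term for term, _ in counts.most_common()]
```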
The element designation unit 311 designates, in accordance with a user operation on the operation unit 33, one or more of two or more terms that belong to the second item “basic findings” and are displayed by the display unit 34. The element designation unit 311 also designates, in accordance with a user operation on the operation unit 33, one or more of two or more terms that belong to the third item “diagnosis” and are displayed by the display unit 34.
In response to designation of the one or more terms by the element designation unit 311, the association identification unit 310 identifies at least one combination of terms that is indicated by the combination frequency information 122d and corresponds to a combination of the one term designated by the body part designation unit 308 and each of the one or more terms designated by the element designation unit 311. In this case, the display controller 304 causes the display unit 34 to display, in the term list, one or more terms that are included in the at least one combination of terms identified by the association identification unit 310 and belong to the third item “diagnosis” based on the at least one combination of terms so that the one or more terms are distinguishable from the other terms, for example.
Specifically, the display controller 304 causes the display unit 34 to display, in the term list, the one or more terms that are included in the at least one combination of terms identified by the association identification unit 310 and belong to the third item “diagnosis” so that the one or more terms are distinguishable from one or more remaining terms that belong to the third item “diagnosis”, for example. The one or more terms that belong to the item “diagnosis” and are associated with a combination of the designated term that belongs to the item “basic body part” and the designated term that belongs to the item “basic findings” can thus easily be known. As a result, designation of terms in creating a diagnostic result report is further facilitated.
Furthermore, the display controller 304 may cause the display unit 34 to display, in the term list, one or more terms that belong to the third item “diagnosis” and are included in the at least one combination of terms identified by the association identification unit 310 in accordance with frequency of appearance in the at least one combination of terms, for example. An example of the frequency of appearance is frequency regarding at least one combination of terms that is indicated by the combination frequency information 122d and corresponds to a combination of the one term that is designated by the body part designation unit 308 and each of the one or more terms designated by the element designation unit 311. The one or more terms that belong to the item “diagnosis” and are strongly associated with a combination of the designated term that belongs to the item “basic body part” and the designated term that belongs to the item “basic findings” can thus easily be known. As a result, designation of terms in creating a diagnostic result report is further facilitated.
Each term may be displayed so as to be distinguishable in accordance with frequency of appearance of the term, for example, by changing density, a color, a size, thickness, brightness, or a font of the term, or by changing density, a color, or brightness of a frame enclosing the term. Alternatively, the term may be displayed so as to be distinguishable in accordance with frequency of appearance of the term by linking terms that belong to different items to each other by use of various display elements, such as a line and an arrow, in accordance with the combination information. The term may also be displayed in accordance with frequency of appearance of the term by changing a method for displaying the term when the frequency of appearance of the term falls within a preset value range. The one or more terms that belong to the item “diagnosis” and are strongly associated with a combination of the designated term that belongs to the item “basic body part” and the designated term that belongs to the item “basic findings” can thus easily be known. As a result, designation of terms in creating a diagnostic result report is further facilitated.
The registration designation unit 312 designates registration of a diagnostic result report in accordance with a user operation on the operation unit 33. In this case, the storage controller 307 causes the storage 12 to store therein a diagnostic result report including a finding sentence that is created based on one term concerning the item “basic body part” designated by the body part designation unit 308, and two or more terms concerning the items “basic findings” and “diagnosis” designated by the element designation unit 311, for example. Data on a diagnostic result report can be stored in the storage 12 as information on the creation target examination included in the medical care information 121, for example.
In step S1, the reading unit 301 reads information on the examination list 121b included in the medical care information 121, and the display controller 304 causes the display unit 34 to display the examination list screen DL1 (
In step S2, when a doctor presses a set button (e.g., an enter key) on the operation unit 33 in a state in which the solid frame CS1 is set to a desired examination in the examination list screen DL1, the examination designation unit 302 designates the examination enclosed by the solid frame CS1 as a creation target examination. In this case, the management unit 303 specifies a task of creating a diagnostic result report corresponding to the creation target examination designated by the examination designation unit 302.
In step S3, the reading unit 301 reads a variety of information on the creation target examination from the medical care information 121. The variety of information read herein includes data on the medical image regarding the creation target examination and basic information regarding the creation target examination, for example.
In step S4, the display controller 304 causes the display unit 34 to display the diagnosis support screen SD1 (
The following describes the diagnosis support screen SD1 with reference to
As shown in
The display controller 304 causes the display unit 34 to display the support template as a template that includes a list of terms (a term list) as a plurality of options, based on information on one or more combinations of terms indicated by the combination frequency information 122d, which is the combination information group. In the present embodiment, the information on the one or more combinations of terms is information on one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to a combination of an examined body part and a modality regarding the creation target examination.
The category buttons SP1-SP5 are buttons for selectively designating one or more terms that belong to a category regarding the diagnostic result report as an input target. One or more terms that belong to a category corresponding to an examined body part and a modality regarding the creation target examination are identified from the term information 122c, and buttons for selectively designating the identified terms are shown. In the support template shown in
The selection areas A31-A34 are large areas that occupy the middle part of the support template, and are sequentially arranged from left to right in the stated order, for example. Specifically, in the selection area A31, a plurality of options for a term that belongs to the item “shooting condition” (e.g., a frontal view, a lateral view, and a lateral decubitus view) are listed downwards, for example. In the selection area A32 as the first part, a plurality of options for a term that belongs to the item “basic body part” (e.g., an entire lung field, an upper lung field, and a middle lung field) are listed downwards. In the selection area A33 as the second part, a plurality of options for a term that belongs to the item “basic findings” (e.g., tumor shadow, ground-glass opacity, and an increased concentration area) are listed downwards. In the selection area A34 as the third part, a plurality of classes that belong to the item “diagnosis” (e.g., an infectious disease, a respiratory disease, and a tumor disease) are listed downwards, and, for each of the classes, a plurality of options for a term (e.g., interstitial pneumonia, pneumonia, and aspiration pneumonia) are listed downwards. That is to say, in the selection areas A31-A34, a plurality of options for the terms that belong to the items “shooting condition”, “basic body part”, “basic findings”, and “diagnosis” are presented so as to be distinguishable by an item. In the selection areas A31-A34, terms that belong to the category regarding one of the category buttons SP1-SP5 that is currently designated are listed. In the present embodiment, some of the terms are shown as “XXXX”, for example, to avoid complexity of the drawing.
In the selection area A31, when a desired option for the term is pressed with the mouse pointer M1 in accordance with a doctor's operation on the operation unit 33, the desired option for the term is designated for the item “shooting condition”. In this case, the designated option for the term is shown so as to be distinguishable from the other options (e.g., highlighted), and the designated term is displayed in the text box Tb1 as a term that belongs to the item “shooting condition”. The term that belongs to the item “shooting condition” can appropriately be designated in the selection area A31 while a doctor checks the medical image displayed in the second area Ar2. In the present embodiment, however, a term indicating a shooting condition is provided in advance to the data on the medical image regarding the creation target examination, and, upon determination of the medical image displayed in the second area Ar2, the term indicating the shooting condition provided to the medical image is designated automatically in the selection area A31.
In the selection area A32, a desired option for the term is designated for the item “basic body part” in accordance with a doctor's operation on the operation unit 33. In this case, the designated option for the term is shown so as to be distinguishable from the other options (e.g., highlighted), and the designated term is displayed in the text box Tb2 as a term that belongs to the item “basic body part”. As for designation of the term that belongs to the item “basic body part”, upon designation of the attention location on the medical image displayed in the second area Ar2, candidates for the term that belongs to the item “basic body part” are automatically identified through the body part identification processing, which is described later, for example. Options for the term that belongs to the item “basic body part” are thus narrowed down, and designation of the term that belongs to the item “basic body part” is facilitated. By pressing, with the mouse pointer M1, any of buttons “L”, “B”, “R” provided on the left side of the designated option for the term, a modifier “left”, “both”, or “right” may be added to the term displayed in the text box Tb2 as appropriate.
In the selection area A33, when a desired option for the term is pressed with the mouse pointer M1 in accordance with a doctor's operation on the operation unit 33, the desired option for the term is designated for the item “basic findings”. In this case, the designated option for the term is shown so as to be distinguishable from the other options (e.g., highlighted), and the designated term is displayed in the text box Tb3 as a term that belongs to the item “basic findings”.
In the selection area A34, when a desired option for the term is pressed with the mouse pointer M1 in accordance with a doctor's operation on the operation unit 33, the desired option for the term is designated for the item “diagnosis”. In this case, the designated option for the term is shown so as to be distinguishable from the other options (e.g., highlighted), and the designated term is displayed in the text box Tb4 as a term that belongs to the item “diagnosis”.
In the selection areas A31-A34, a detail designate button PS is provided on the right side of each option for the term. When the detail designate button PS is pressed with the mouse pointer M1, a window (also referred to as a detail window) OW1 in which a detailed element that modifies the designated term can be input is displayed so as to be overlaid on the support template. The detailed element includes a term such as a modifier, for example.
Use of such a support template allows doctors to designate an option for a term in each of the selection areas A31-A34 while referring to a medical image regarding the creation target examination displayed in the second area Ar2 of the diagnosis support screen SD1, thereby facilitating creation of finding sentences. For example, when terms “frontal view”, “entire lung field”, “ground-glass opacity”, and “pneumonia” are designated for respective four items (a shooting condition, a basic body part, basic findings, and diagnosis), finding sentences “In a frontal view, ground-glass opacity is found in the entire lung field. Pneumonia is suspected.” can be created, for example. For a specific item (e.g., basic findings), two or more options for the term may be designated to create a finding sentence. Specifically, when terms “frontal view”, “entire lung field”, “ground-glass opacity”, “nodular shadow”, and “pneumonia” are designated for four items (a shooting condition, a basic body part, basic findings, and diagnosis), finding sentences “In a frontal view, ground-glass opacity and nodular shadow are found in the entire lung field. Pneumonia is suspected.” can be created, for example.
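The assembly of finding sentences from the designated terms can be sketched in ordinary code. The following is a minimal illustration, not the patented implementation; the function name and the sentence templates are assumptions chosen to reproduce the example sentences given above, including the case where two or more terms are designated for the item “basic findings”.

```python
# Hypothetical sketch: build finding sentences from the four designated terms.
# Function name and templates are assumptions, not the patented implementation.

def build_finding_sentence(shooting, body_part, findings, diagnosis):
    """Join designated terms into finding sentences.

    `findings` is a list so that two or more terms designated for the item
    "basic findings" can be combined (e.g. "ground-glass opacity and
    nodular shadow").
    """
    if len(findings) == 1:
        findings_text = findings[0]
        verb = "is"
    else:
        findings_text = ", ".join(findings[:-1]) + " and " + findings[-1]
        verb = "are"
    return (f"In a {shooting}, {findings_text} {verb} found in the "
            f"{body_part}. {diagnosis.capitalize()} is suspected.")

print(build_finding_sentence(
    "frontal view", "entire lung field",
    ["ground-glass opacity"], "pneumonia"))
# → In a frontal view, ground-glass opacity is found in the entire lung field. Pneumonia is suspected.
```

With two findings designated, the same function yields “… ground-glass opacity and nodular shadow are found …”, matching the second example sentence.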
When a doctor makes a diagnosis with reference to a chest frontal view acquired by a CR device, the doctor typically checks whether there is an abnormal shadow in the order of “lung”, “mediastinum”, “bone”, “soft part”, and “pleura (margin)” in accordance with a display order of the category buttons SP1-SP5. When a three-dimensional anatomy is captured in a two-dimensional medical image, specialists may be able to specify the “basic body part” at which the abnormal shadow exists, as they can imagine the three-dimensional anatomy from overlapping of shadows and the like, but general practitioners may have difficulty specifying the “basic body part” accurately. In the present embodiment, when a location, on a medical image, at which an abnormal shadow is found is designated by a doctor, candidates for the term that belongs to the item “basic body part” that can be included in the designated location are identified and presented. As a result, false identification and overlooking of a location at which an abnormality is found are less likely to occur in image diagnosis, and more reliable image diagnosis can be made.
In step S5, the body part identification unit 306 sets a diagnostic area r0 as an anatomical area targeted for diagnosis made with respect to a medical image displayed by the display unit 34 (also referred to as a diagnostic target).
The diagnostic area r0 is set at a location and in a range that cover an area that is required for making diagnosis with respect to the medical image regarding the creation target examination. In the present embodiment, a thoracic area that includes a lung field and a mediastinum, which are the most important areas in making diagnosis with respect to a chest frontal view as a medical image on the diagnosis support screen SD1 (
First, in an area corresponding to a radiation field F1 that is obtained by excluding an upper part and a lower part of the medical image, a relationship between an X coordinate and a cumulative value of density (also referred to as a cumulative density value) at each X coordinate is obtained as a profile prj (X) in the X direction. When the area corresponding to the radiation field F1 is divided into three equal parts in the X direction, an X coordinate corresponding to the minimum cumulative density value in a middle part of the area is set to an X coordinate of a midline (XC). Furthermore, when the area corresponding to the radiation field F1 is divided into three equal parts in the X direction, in a left part of the area, an X coordinate that corresponds to a cumulative density value that first becomes equal to a preset threshold TR in a direction toward a left end of the medical image in a part that is closer to the left end than an X coordinate corresponding to the local maximum cumulative density value is set to a right end XR of the thoracic area. On the other hand, when the area corresponding to the radiation field F1 is divided into three equal parts in the X direction, in a right part of the area, an X coordinate that corresponds to a cumulative density value that first becomes equal to a preset threshold TL in a direction toward a right end of the medical image in a part that is closer to the right end than an X coordinate corresponding to the local maximum cumulative density value is set to a left end XL of the thoracic area.
Next, in an area of the medical image that corresponds to the radiation field F1 between the right end XR and the left end XL, a relationship between a Y coordinate and a cumulative value of density (also referred to as a cumulative density value) at each Y coordinate is obtained as a profile prj (Y) in the Y direction. When the area corresponding to the radiation field F1 is divided into four equal parts in the Y direction, in the highest part of the area, a Y coordinate that corresponds to a cumulative density value that first becomes equal to a preset threshold TT in a direction toward a top end of the medical image in a part that is closer to the top end than a Y coordinate corresponding to the maximum cumulative density value is set to a top end YT of the thoracic area. On the other hand, when the area corresponding to the radiation field F1 is divided into four equal parts in the Y direction, in lower two parts (a lower half) of the area, a Y coordinate that corresponds to a cumulative density value that first becomes equal to a preset threshold TB in a direction toward a bottom end of the medical image is set to a bottom end YB of the thoracic area.
The thoracic area defined by the top end, the bottom end, the left end, and the right end thus obtained is set as the diagnostic area r0. Coordinates (xi1, yi1) of an upper left point r1 and coordinates (xi2, yi2) of a lower right point r2 of the diagnostic area r0 are also set.
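The profile-based extraction described above can be illustrated with a simplified sketch. The following is not the patented algorithm itself; the data layout (a medical image held as a list of pixel rows) and the function names are assumptions, the threshold scan is shown only for the right end XR, and the scans for XL, YT, and YB would be analogous.

```python
# Simplified sketch of the X-direction profile, the midline XC, and the
# right end XR of the thoracic area. Data layout and names are assumptions.

def x_profile(image):
    """Cumulative density value at each X coordinate (column sums)."""
    return [sum(col) for col in zip(*image)]

def find_midline(profile):
    """X coordinate of the minimum cumulative density value in the
    middle third of the radiation field."""
    w = len(profile)
    lo, hi = w // 3, 2 * w // 3
    middle = profile[lo:hi]
    return lo + middle.index(min(middle))

def find_right_end(profile, peak_x, threshold_tr):
    """Scan from the local-maximum column toward the left end of the image
    and return the first X whose cumulative density value reaches the
    preset threshold TR; this X is set to the right end XR."""
    for x in range(peak_x, -1, -1):
        if profile[x] <= threshold_tr:
            return x
    return 0
```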
As described above, the thoracic area extracted by the body part identification unit 306 is set as the diagnostic area r0 in the present embodiment. The present invention, however, is not limited to this structure. For example, a doctor as a user may manually designate any area in the medical image displayed by the display unit 34 as the diagnostic area r0 (thoracic area). The diagnostic area r0 is not limited to the thoracic area, and may optionally be set by a doctor as a user.
In step S6, the location designation unit 305 designates the attention location on the medical image in response to a doctor's operation on the operation unit 33.
In step S7, the body part identification unit 306 identifies candidates for a term concerning a basic body part that corresponds to the attention location designated in step S6. For example, first processing and second processing are performed herein. In the first processing, a location in the anatomical model that corresponds to the attention location designated in the medical image is obtained. In the second processing, two or more terms concerning a basic body part are identified as candidates for the term based on the location in the anatomical model obtained by the first processing.
In the first processing, the body part identification unit 306 associates the diagnostic area r0 on the medical image with the reference area R0 in the anatomical model in accordance with the following equations (1) and (2).
(xi−xi1)/(xi2−xi1)=(xm−xm1)/(xm2−xm1) (1)
(yi−yi1)/(yi2−yi1)=(ym−ym1)/(ym2−ym1) (2)
In the first processing, the body part identification unit 306 converts the coordinates (xi, yi) of the point P1 as the attention location on the medical image into the coordinates (xm, ym) of the point p1 as the attention location in the anatomical model in accordance with the following equations (3) and (4).
xm=(xi−xi1)/(xi2−xi1)×(xm2−xm1)+xm1 (3)
ym=(yi−yi1)/(yi2−yi1)×(ym2−ym1)+ym1 (4)
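Equations (1) to (4) describe a linear mapping between the diagnostic area r0 and the reference area R0. A direct transcription into code might look as follows; the function name and tuple layout are assumptions made for illustration, while the arithmetic follows equations (3) and (4) verbatim.

```python
# Sketch of the first processing: map the attention point (xi, yi) in the
# diagnostic area r0 of the medical image to (xm, ym) in the reference area
# R0 of the anatomical model. Variable names follow the text.

def to_model_coords(xi, yi, r0, R0):
    """r0 = (xi1, yi1, xi2, yi2): upper-left / lower-right of the diagnostic area.
    R0 = (xm1, ym1, xm2, ym2): the corresponding reference area in the model."""
    xi1, yi1, xi2, yi2 = r0
    xm1, ym1, xm2, ym2 = R0
    xm = (xi - xi1) / (xi2 - xi1) * (xm2 - xm1) + xm1   # equation (3)
    ym = (yi - yi1) / (yi2 - yi1) * (ym2 - ym1) + ym1   # equation (4)
    return xm, ym

# The centre of r0 maps to the centre of R0:
print(to_model_coords(50, 50, (0, 0, 100, 100), (0, 0, 200, 400)))
# → (100.0, 200.0)
```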
In the second processing, the body part identification unit 306 compares the location information (the coordinates (xm, ym) of the point p1) on the attention location in the anatomical model obtained in the first processing with a predetermined area (e.g., the areas A1-A3) in the anatomical model for each term concerning a basic body part as an anatomical section. As a result, two or more terms concerning a basic body part can be identified.
In the examples of
As described above, whether the point p1 overlaps a predetermined area in the anatomical model is judged in the second processing for each term concerning a basic body part as an anatomical section. In this case, a flag “1: display” is set when the point p1 overlaps the predetermined area, and a flag “0: not display” is set when the point p1 does not overlap the predetermined area. A term concerning a basic body part with respect to which the flag “1: display” is set is identified as a candidate for the term concerning the basic body part. In the examples of
The body part identification unit 306 judges, for each candidate for the term concerning the basic body part with respect to which the flag “1: display” is set, whether the point p1 is located in a left area or in a right area of the anatomical model by using a center CO of the anatomical model in a horizontal direction as a reference. In accordance with a result of the judgment, a term “left” or “right” is added to each candidate for the term concerning the basic body part with respect to which the flag “1: display” is set. In the examples of
As described above, the processing to add the term “left” or “right” to each identified candidate for the term is performed after setting the flag “1: display” or “0: not display” in the present embodiment. The present invention, however, is not limited to this structure. For example, a term concerning a basic body part as an anatomical section to which the term “left” or “right” is added in advance may be prepared.
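The second processing and the subsequent left/right judgment can be sketched together as follows. The rectangular representation of the predetermined areas, the function name, and the convention that the left side of a frontal image corresponds to the patient's right are assumptions made for illustration; the point-in-area test corresponds to setting the flag “1: display”.

```python
# Sketch of the second processing: test the model point p1 against each
# predetermined area (flag "1: display" on overlap) and prefix "left"/"right"
# using the model centre CO. Area layout and side convention are assumptions.

def identify_candidates(p1, areas, center_x):
    """p1: (xm, ym) in the anatomical model.
    areas: {term: (x1, y1, x2, y2)} predetermined area per body-part term.
    Returns candidate terms with "left"/"right" added."""
    xm, ym = p1
    # Assumed convention: image left is the patient's right in a frontal view.
    side = "right" if xm < center_x else "left"
    candidates = []
    for term, (x1, y1, x2, y2) in areas.items():
        if x1 <= xm <= x2 and y1 <= ym <= y2:   # flag "1: display"
            candidates.append(f"{side} {term}")
    return candidates
```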
In step S8, the body part designation unit 308 designates one of two or more terms as candidates for the term that belongs to the item “basic body part” as identified in step S7. From among the two or more terms that belong to the item “basic body part” identified in step S7, a term that has higher priority is designated as the term that belongs to the item “basic body part” in an initial state. Specifically, the body part designation unit 308 designates a term that is associated with the highest priority degree of all the two or more terms that belong to the item “basic body part” identified in step S7 with reference to the priority information 122b, for example. For example, when the two or more terms as candidates for the term that belongs to the item “basic body part” are a middle lung field, a hilum of a lung, and a rib, the middle lung field, which is associated with the highest priority degree in the priority information 122b shown in
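The initial designation of step S8 amounts to taking a maximum over priority degrees. A minimal sketch follows, assuming a simple dictionary layout for the priority information 122b; the priority values shown are illustrative, chosen only so that the middle lung field ranks highest as in the example above.

```python
# Sketch of step S8: designate the candidate term associated with the highest
# priority degree. The dictionary layout and values stand in for the priority
# information 122b and are assumptions.

priority_info = {"middle lung field": 3, "hilum of a lung": 2, "rib": 1}

def designate_initial(candidates, priority=priority_info):
    """Return the candidate with the highest priority degree."""
    return max(candidates, key=lambda term: priority.get(term, 0))

print(designate_initial(["rib", "hilum of a lung", "middle lung field"]))
# → middle lung field
```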
In step S9, the information extraction unit 309 extracts information on the term that belongs to the item “basic body part” designated in step S8 from the combination frequency information 122d, which is a combination information group, stored in the support information 122. In this case, when the term “middle lung field” is designated in step S8, for example, information on one or more combinations of terms concerning the middle lung field and information indicating frequency of the one or more combinations of terms are extracted from the combination frequency information 122d.
In step S10, the display controller 304 causes the display unit to display a support template based on the information on one or more combinations of terms and the information indicating frequency of the one or more combinations of terms as extracted in step S9. In the support template, a term list including a list of terms as a plurality of options is displayed based on the information on one or more combinations of terms extracted from the combination frequency information 122d in step S9.
In step S10, the display controller 304 causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings” and are included in the information on one or more combinations of terms extracted in step S9 so that the one or more terms are distinguishable from one or more remaining terms that belong to the item “basic findings”. For example, the association identification unit 310 identifies one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to the one term designated in step S8. The display controller 304 then causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings” and are included in the information on one or more combinations of terms based on the one or more combinations of terms identified by the association identification unit 310 so that the one or more terms are distinguishable from one or more remaining terms that belong to the item “basic findings”, for example.
In the example of
In step S10, the display controller 304 causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings” and are included in the information on one or more combinations of terms extracted in step S9 in accordance with the information indicating frequency of the one or more combinations of terms. For example, the association identification unit 310 identifies one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to the one term designated in step S8. The display controller 304 then causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings”, as the second item, and are included in the one or more combinations of terms identified by the association identification unit 310 in accordance with frequency of appearance in the one or more combinations of terms, for example.
In the example of
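Steps S9 and S10 can be sketched as a lookup and sort over a combination-frequency table. The tuple layout and the frequency values below are assumptions standing in for the combination frequency information 122d; the function aggregates, for the designated “basic body part” term, the frequencies of the “basic findings” terms and orders them most frequent first, as used for the distinguishable display.

```python
# Sketch of steps S9-S10: extract combinations containing the designated
# "basic body part" term and order "basic findings" terms by frequency.
# The table layout and values are assumptions, not the stored 122d format.

combination_frequency = [
    # (basic body part, basic findings, diagnosis, frequency)
    ("middle lung field", "ground-glass opacity", "pneumonia", 42),
    ("middle lung field", "tumor shadow", "lung cancer", 17),
    ("middle lung field", "ground-glass opacity", "interstitial pneumonia", 9),
    ("upper lung field", "nodular shadow", "tuberculosis", 30),
]

def findings_by_frequency(body_part, table=combination_frequency):
    """Aggregate frequencies of 'basic findings' terms over all combinations
    that include the designated body-part term, most frequent first."""
    totals = {}
    for part, finding, _diagnosis, freq in table:
        if part == body_part:
            totals[finding] = totals.get(finding, 0) + freq
    return sorted(totals, key=totals.get, reverse=True)

print(findings_by_frequency("middle lung field"))
# → ['ground-glass opacity', 'tumor shadow']
```

The same pattern, keyed on the pair of designated “basic body part” and “basic findings” terms, would serve the frequency-ordered display of “diagnosis” terms in step S14.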
In the selection areas A31-A34, a detail designate button PS is provided on the right side of each option for the term. When the detail designate button PS is pressed with the mouse pointer M1, a detail window OW1 in which a detailed element (e.g., a modifier) that modifies the designated term can be input is displayed so as to be overlaid on the support template.
In step S11 shown in
In step S12, the body part designation unit 308 judges whether an instruction to change a category has been issued in the support template. When the instruction to change a category has not been issued, processing proceeds to step S13. When the instruction to change a category has been issued, processing proceeds to step S23 shown in
In step S13, the element designation unit 311 judges whether a term that belongs to the item “basic findings” has been designated in the selection area A33 of the support template. When the term that belongs to the item “basic findings” has not been designated, processing returns to step S11. When the term that belongs to the item “basic findings” has been designated, processing proceeds to step S14.
In step S14, the display controller 304 causes the display unit to change, in the term list, a method for displaying the term that belongs to the item “diagnosis” in accordance with the term that belongs to the item “basic findings” designated in step S11. For example, the association identification unit 310 identifies at least one combination of terms that is indicated by the combination frequency information 122d and corresponds to a combination of the one term designated by the body part designation unit 308 and each of the one or more terms designated by the element designation unit 311. The display controller 304 then causes the display unit to display, in the term list, one or more terms that belong to the item “diagnosis” and are included in the at least one combination of terms identified by the association identification unit 310 based on the at least one combination of terms so that the one or more terms are distinguishable from the other terms, for example. Specifically, the display controller 304 causes the display unit to display, in the term list, the one or more terms that belong to the item “diagnosis” and are included in the at least one combination of terms identified by the association identification unit 310 so that the one or more terms are distinguishable from one or more remaining terms that belong to the item “diagnosis”, for example.
In this step, the display controller 304 causes the display unit to display, in the term list, the one or more terms that belong to the item “diagnosis” and are included in the at least one combination of terms identified by the association identification unit 310 in accordance with frequency of appearance in the at least one combination of terms, for example. That is to say, terms that belong to the item “diagnosis” are displayed in accordance with frequency of appearance in at least one combination of terms that is indicated by the combination frequency information 122d and corresponds to a combination of the designated term that belongs to the item “basic body part” and each of the designated one or more terms that belong to the item “basic findings”.
In the example of
In step S15, the controllers 11 and 31 receive input in accordance with a doctor's operation on the operation unit 33 in the support template and the like. For example, input regarding cancellation of designation of a term achieved by pressing of the reset button RB1, change of a category achieved by pressing of any of the category buttons SP1-SP5, designation of a term that belongs to the item “diagnosis” achieved by the element designation unit 311, and the like is received in accordance with the operation on the operation unit 33.
In step S16, the controllers 11 and 31 judge whether an instruction to cancel designation of the term has been issued by pressing of the reset button RB1 with the mouse pointer M1. When the instruction to cancel designation of the term has not been issued, processing proceeds to step S17. When the instruction to cancel designation of the term has been issued, designation of the term that belongs to the item “basic findings” is canceled, and processing returns to step S11.
In step S17, judgment that is similar to that made in step S12 described above is made. When the instruction to change a category has not been issued, processing proceeds to step S18. When the instruction to change a category has been issued, processing proceeds to step S23 shown in
In step S18, the element designation unit 311 judges whether a term that belongs to the item “diagnosis” has been designated in the selection area A34 of the support template. When the term that belongs to the item “diagnosis” has not been designated, processing returns to step S15. When the term that belongs to the item “diagnosis” has been designated, processing proceeds to step S19. When processing proceeds to step S19, terms are displayed in all the text boxes Tb1 to Tb4. In this case, a finding sentence is generated based on terms displayed in the text boxes Tb1-Tb4 and predicates designated in the predicate lists PL1-PL4, for example, and is automatically input into a comment display area Ar21 (
In step S19, the controllers 11 and 31 receive input in accordance with a doctor's operation on the operation unit 33 in the support template and the like. For example, input regarding cancellation of designation of a term achieved by pressing of the reset button RB1, registration of a diagnostic result report achieved by pressing of a set button FB1, and the like is received in accordance with the operation on the operation unit 33.
In step S20, judgment that is similar to that made in step S16 described above is made. When the instruction to cancel designation of the term has not been issued, processing proceeds to step S21. When the instruction to cancel designation of the term has been issued, designation of the term that belongs to the item “basic findings” and the term that belongs to the item “diagnosis” is canceled, and processing returns to step S11. When the instruction to cancel designation of the term has been issued, designation of only the term that belongs to the item “diagnosis” may be canceled, and processing may return to step S15, for example.
In step S21, the registration designation unit 312 judges whether an instruction to register a diagnostic result report has been issued by pressing of the set button FB1 with the mouse pointer M1. When the instruction to register a diagnostic result report has not been issued, processing returns to step S19. When the instruction to register a diagnostic result report has been issued, processing proceeds to step S22.
In step S22, the storage controller 307 causes the medical care information 121 to store therein a diagnostic result report that includes a finding sentence generated based on the one term designated for the item “basic body part” and terms that are designated for the items “basic findings” and “diagnosis”. Specifically, the finding sentence displayed in the comment display area Ar21 in the fourth area Ar4 of the diagnosis support screen SD1 is stored in the medical care information 121.
In step S23, the body part designation unit 308 designates one of the candidates for the term that belongs to the item “basic body part” identified in step S7, in accordance with the category designated for the diagnostic result report as an input target.
The following describes, as a specific example, a case where the candidates for the term that belongs to the item “basic body part” identified in step S7 are a term “middle lung field” that corresponds to the category “lung”, a term “hilum of a lung” that corresponds to the category “mediastinum”, and a term “rib” that corresponds to the category “bone”. When the terms that belong to the category for the diagnostic result report as an input target are changed from the terms that belong to the category “lung” or “bone” to the terms that belong to the category “mediastinum”, the body part designation unit 308 designates the term “hilum of a lung” that corresponds to the category “mediastinum”. When the terms that belong to the category for the diagnostic result report as an input target are changed from the terms that belong to the category “lung” or “mediastinum” to the terms that belong to the category “bone”, the body part designation unit 308 designates the term “rib” that corresponds to the category “bone”. When the terms that belong to the category for the diagnostic result report as an input target are changed from the terms that belong to the category “mediastinum” or “bone” to the terms that belong to the category “lung”, the body part designation unit 308 designates the term “middle lung field” that corresponds to the category “lung”.
In step S24, the information extraction unit 309 extracts information on terms that belong to the item “basic body part” designated in step S23 from the combination frequency information 122d, which is the combination information group, stored in the support information 122. In this case, when the category “mediastinum” is designated in step S23, for example, information on one or more combinations of terms that belong to the category “mediastinum” and information indicating frequency of the one or more combinations of terms are extracted from the combination frequency information 122d. When the category “bone” is designated in step S23, for example, information on one or more combinations of terms that belong to the category “bone” and information indicating frequency of the one or more combinations of terms are extracted from the combination frequency information 122d. When the category “lung” is designated in step S23, for example, information on one or more combinations of terms that belong to the category “lung” and information indicating frequency of the one or more combinations of terms are extracted from the combination frequency information 122d.
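The extraction in step S24 can be sketched with an in-memory stand-in for the combination frequency information 122d. The data layout, the example terms, and the frequency counts below are all hypothetical illustrations.

```python
# Hypothetical stand-in for the combination frequency information 122d:
# each "basic body part" term maps to the combinations of
# (basic findings, diagnosis) terms observed with it, together with the
# frequency of each combination.
COMBINATION_FREQUENCY = {
    "middle lung field": {("nodular shadow", "lung cancer"): 12,
                          ("infiltrative shadow", "pneumonia"): 30},
    "hilum of a lung":   {("enlargement", "lymphadenopathy"): 7},
    "rib":               {("fracture line", "rib fracture"): 9},
}

def extract_combinations(body_part_term):
    # Step S24: pull only the combinations (and their frequencies) that
    # involve the designated "basic body part" term.
    return COMBINATION_FREQUENCY.get(body_part_term, {})
```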
In step S25, the display controller 304 causes the display unit to display the support template based on the information on the one or more combinations of terms and the information indicating the frequency of the one or more combinations of terms, both extracted in step S24. In the support template, a term list including a list of terms as a plurality of options is displayed based on the information on the one or more combinations of terms extracted from the combination frequency information 122d in step S24.
In step S25, the display controller 304 causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings” and are included in the information on the one or more combinations of terms extracted in step S24 so that the one or more terms are distinguishable from one or more remaining terms that belong to the item “basic findings”. For example, the association identification unit 310 identifies one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to the one term designated in step S23. The display controller 304 then causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings” and are included in the information on the one or more combinations of terms identified by the association identification unit 310 based on the one or more combinations of terms so that the one or more terms are distinguishable from one or more remaining terms that belong to the item “basic findings”, for example.
In step S25, the display controller 304 causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings” and are included in the information on the one or more combinations of terms extracted in step S24 in accordance with the information indicating the frequency of the one or more combinations of terms. For example, the association identification unit 310 identifies one or more combinations of terms that are indicated by the combination frequency information 122d and correspond to the one term designated in step S23. The display controller 304 then causes the display unit to display, in the term list, one or more terms that belong to the item “basic findings”, which is the second item, and are included in the one or more combinations of terms identified by the association identification unit 310 in accordance with frequency of appearance in the one or more combinations of terms, for example.
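The display ordering described for step S25 can be sketched as follows: terms that appear in the extracted combinations are shown first, in descending order of frequency, so that they are distinguishable from the remaining terms. The flat term-to-frequency mapping used here is a hypothetical simplification of the extracted combination information.

```python
def order_term_list(all_terms, combo_freq):
    # combo_freq: hypothetical mapping from a "basic findings" term to
    # the frequency with which it appears in combinations involving the
    # designated "basic body part" term.
    freq = {t: combo_freq.get(t, 0) for t in all_terms}
    # Associated terms (frequency > 0) come first, in descending
    # frequency; the remaining terms keep their original relative order.
    associated = sorted((t for t in all_terms if freq[t] > 0),
                        key=lambda t: -freq[t])
    rest = [t for t in all_terms if freq[t] == 0]
    return associated + rest
```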
After processing in step S25 is performed, processing proceeds to step S11.
The diagnosis support processing as described above supports creation of a diagnostic result report with respect to a medical image in which a three-dimensional anatomy of, for example, a chest area is captured, in consideration of not only diagnosis for a specific category “lung” but also diagnosis for other categories “mediastinum”, “bone”, and the like.
As set forth above, the information processing system 1 according to the present embodiment identifies, upon designation of an attention location in a medical image in which a three-dimensional anatomy is captured, a plurality of terms as a plurality of candidates for a term concerning the basic body part that corresponds to the attention location. With this structure, even general practitioners who, unlike specialists, may have difficulty specifying a “basic body part” at which an abnormal shadow exists while imagining a three-dimensional anatomy from overlapping of shadows in the medical image can accurately specify the “basic body part” at which the abnormal shadow exists. When one of a plurality of terms as a plurality of candidates for the term that belongs to the basic body part is designated, terms that are associated with the designated term and belong to items other than the item “basic body part” are displayed so as to be distinguishable from the other terms. This promotes creation of a diagnostic result report according to the three-dimensional anatomy. As a result, the diagnostic result report can efficiently and properly be created.
It should be noted that the present invention is not limited to the above-mentioned one embodiment, and various modifications and improvements can be made without departing from the scope of the present invention.
For example, in the above-mentioned one embodiment, terms that belong to a plurality of items are listed in the term list in the support template. The present invention, however, is not limited to this structure. For example, term lists that each include a plurality of options that belong to a corresponding one of items may be displayed in time sequence. In this case, the display controller 304 causes the display unit 34 to display a first term list, as a first list, that includes one or more terms that belong to the item “basic findings”, which is the second item, and are included in one or more combinations of terms corresponding to the one term designated by the body part designation unit 308. In response to designation of one or more of the one or more terms included in the first term list by the element designation unit 311, the display controller 304 causes the display unit 34 to display a second term list, as a second list, that includes one or more terms that belong to the item “diagnosis”, which is the third item. The one or more terms included in the second term list are included in at least one combination of terms that is indicated by the combination frequency information 122d, which is the combination information group, and corresponds to a combination of the one term designated by the body part designation unit 308 and each of the one or more terms designated by the element designation unit 311. Use of such a structure facilitates creation of a diagnostic result report even in a limited display area.
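The time-sequence display of term lists described in this modification can be sketched as two cascading lookups against the combination information group. The triple layout and the example terms below are hypothetical.

```python
# Hypothetical stand-in for the combination information group: each
# entry is a (basic body part, basic findings, diagnosis) combination.
COMBINATIONS = [
    ("middle lung field", "nodular shadow", "lung cancer"),
    ("middle lung field", "nodular shadow", "tuberculoma"),
    ("middle lung field", "infiltrative shadow", "pneumonia"),
]

def first_term_list(body_part):
    # First list: "basic findings" terms that are combined with the
    # designated "basic body part" term.
    return sorted({f for bp, f, _ in COMBINATIONS if bp == body_part})

def second_term_list(body_part, finding):
    # Second list: "diagnosis" terms that are combined with both the
    # designated body-part term and the designated findings term.
    return sorted({d for bp, f, d in COMBINATIONS
                   if bp == body_part and f == finding})
```

Displaying `first_term_list(...)`, waiting for a designation, and only then displaying `second_term_list(...)` matches the limited-display-area behavior this modification describes.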
Instead of displaying term lists each including a plurality of terms that belong to a corresponding one of items in time sequence, term lists each including a plurality of terms that belong to a group of two or more items may be displayed in time sequence.
In the above-mentioned one embodiment, a term concerning the shooting condition provided to the medical image is automatically designated as the term that belongs to the item “shooting condition”. The present invention, however, is not limited to this structure. For example, a given term may be designated in the selection area A31 in accordance with an operation on the operation unit 33.
In the one embodiment described above, from among two or more terms as candidates for the term that belongs to the item “basic body part”, a term that has higher priority is designated by the body part designation unit 308 as the term that belongs to the item “basic body part” in an initial state. The present invention, however, is not limited to this structure. For example, the two or more terms as the candidates for the term that belongs to the item “basic body part” identified by the body part identification unit 306 may be displayed as options, and the body part designation unit 308 may designate a term concerning one of the options in response to an operation on the operation unit 33. In this case, the term concerning one of the options may be designated by indirectly designating the term that belongs to the item “basic body part” through pressing of any of the category buttons SP1-SP5, or may be designated by directly designating the term in the selection area A32.
In the one embodiment described above, one of the candidates for the term that belongs to the item “basic body part” is identified for each term concerning a category. The present invention, however, is not limited to this structure. For example, two or more candidates for the term that belongs to the item “basic body part” may be identified for each term concerning a category. That is to say, the body part identification unit 306 may identify one or more candidates for the term that belongs to the item “basic body part” for each term concerning a category. In this case, two or more candidates for the term that belongs to the item “basic body part” may be displayed, as options, for each term concerning a category, and the body part designation unit 308 may designate a term corresponding to one of the options in response to an operation on the operation unit 33, for example.
In the one embodiment described above, the body part designation unit 308 designates a term from among candidates for the term that belongs to the item “basic body part” identified by the body part identification unit 306. The present invention, however, is not limited to this structure. For example, when an appropriate term concerning the items “basic findings” and “diagnosis” cannot be designated from the limited candidates for the term that belongs to the item “basic body part” identified by the body part identification unit 306, another term that belongs to the item “basic body part” may be designated.
In the one embodiment described above, the term that belongs to the item “basic body part” is indirectly designated by pressing of any of the category buttons SP1-SP5. The present invention, however, is not limited to this structure. For example, in place of or separately from the category buttons SP1-SP5, candidates for the term that belongs to the item “basic body part” identified by the body part identification unit 306 may be displayed as a plurality of options, and one of the options may directly be designated.
In the one embodiment described above, the body part identification unit 306 identifies candidates for the term that belongs to the item “basic body part” and corresponds to the attention location by performing the first processing and the second processing. The present invention, however, is not limited to this structure. For example, the candidates for the term that belongs to the item “basic body part” and corresponds to the attention location may be identified by performing other processing. As the other processing, a plurality of terms that show names of basic body parts that indicate anatomical sections may be identified from arrangement of bones in the medical image, for example. For example, in a typical chest frontal view, a lower edge of a clavicle corresponds to a center location of an upper lung field, and an upper edge of a diaphragm coincides with a lower edge of a lung field. A range of each anatomical section may thus be identified based on a relationship between a location of the lower edge of the clavicle and a location of the upper edge of the diaphragm, which are identified relatively easily from the chest frontal view, and a term that belongs to the item “basic body part” and indicates an anatomical section corresponding to the attention location may be identified.
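The bone-landmark approach described in this modification can be sketched as follows. The equal-thirds split of the lung field, the coordinate convention, and the function name are assumptions introduced for illustration; the embodiment states only the two landmark relationships.

```python
def classify_lung_field(y, clavicle_lower_y, diaphragm_upper_y):
    # Hypothetical sketch for a chest frontal view, with y growing
    # downward (image coordinates). Assume three lung fields of equal
    # height h: the lower edge of the clavicle is the center of the
    # upper lung field and the upper edge of the diaphragm is the lower
    # edge of the lung field, so clavicle_lower_y + 2.5 * h equals
    # diaphragm_upper_y.
    h = (diaphragm_upper_y - clavicle_lower_y) / 2.5
    top = clavicle_lower_y - h / 2  # estimated upper edge of the lung field
    if not (top <= y <= diaphragm_upper_y):
        return "outside lung field"
    if y <= top + h:
        return "upper lung field"
    if y <= top + 2 * h:
        return "middle lung field"
    return "lower lung field"
```

The returned section name would then serve as the candidate for the term that belongs to the item “basic body part” corresponding to the attention location.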
In the one embodiment described above, each time the body part designation unit 308 designates the term that belongs to the item “basic body part”, the information extraction unit 309 extracts information corresponding to the designated term that belongs to the item “basic body part” from the combination frequency information 122d. The present invention, however, is not limited to this structure. For example, in response to identification of two or more terms that belong to the item “basic body part” by the body part identification unit 306, the information extraction unit 309 may once extract information pieces corresponding to the respective two or more terms that belong to the item “basic body part” from the combination frequency information 122d. In this case, each time the body part designation unit 308 designates a term that belongs to the item “basic body part”, information corresponding to the designated term that belongs to the item “basic body part” may be extracted from the above-mentioned information extracted once. This allows for smooth processing according to change of the term that belongs to the item “basic body part”.
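The one-time extraction described in this modification can be sketched as a small cache: the combination entries for all candidate body-part terms are extracted once, and each later designation is served from the cache. The class name and data layout are hypothetical.

```python
class CachedExtractor:
    # Hypothetical sketch: extract once for every candidate term
    # identified in step S7, then answer per-designation lookups
    # without touching the full combination frequency information.
    def __init__(self, combination_frequency, candidate_terms):
        self._cache = {t: combination_frequency.get(t, {})
                       for t in candidate_terms}

    def extract(self, designated_term):
        # Per-designation lookup hits only the cache, which allows for
        # smooth processing when the designated term is changed.
        return self._cache.get(designated_term, {})
```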
In the one embodiment described above, the medical care information 121 and the support information 122 are stored in a single storage 12 included in the server 10. The present invention, however, is not limited to this structure. For example, the medical care information 121 and the support information 122 may separately be stored in two or more storage devices included in the storage as appropriate.
In the one embodiment described above, a variety of information is visually output by a single display unit 34. The present invention, however, is not limited to this structure. For example, a variety of information may separately and visually be output by two or more display devices included in the display unit as appropriate. For example, a medical image may be displayed by a first display device and a support template may be displayed by a second display device.
In the one embodiment described above, the support information 122 includes the term information 122c and the combination frequency information 122d. The present invention, however, is not limited to this structure. For example, the support information 122 may include many structured report data pieces. In this case, the many structured report data pieces indicate a plurality of combinations of elements that belong to a plurality of items that include the first item concerning a body part in a three-dimensional anatomy, and the second item and the third item each different from the first item.
In the one embodiment described above, terms are designated for respective items “shooting condition”, “basic body part”, “basic findings”, and “diagnosis” to create a diagnostic result report. The present invention, however, is not limited to this structure. For example, terms may be designated for respective items that include the first item concerning a body part in a three-dimensional anatomy captured in a medical image, and the second item and the third item each different from the first item.
In the one embodiment described above, an instruction signal is input in accordance with a user operation on the operation unit 33. The present invention, however, is not limited to this structure. For example, in response to input of a sound, the sound may be analyzed, and an instruction signal may be input. That is to say, it is sufficient that the instruction signal is input in accordance with a user action.
In the one embodiment described above, an element that belongs to each item is a term. The present invention, however, is not limited to this structure. The element that belongs to each item may be other elements, such as various words and phrases and a diagram that indicates a position and an area.
In the one embodiment described above, various functions for achieving the medical care support processing are shared by the server 10 and the terminal device 30. A ratio of functions achieved by the server 10 to functions achieved by the terminal device 30, however, may be changed as appropriate.
In the one embodiment described above, the information processing system 1 is a server-client system in which the server 10 and the terminal device 30 are connected via the communication line W1. The present invention, however, is not limited to this structure. For example, functions of the above-mentioned information processing system 1 may be achieved by a single computer, assuming that the information processing system 1 is a system in a private hospital.
It should be appreciated that all or part of the one embodiment and various modifications set forth above can appropriately be combined with one another unless any contradiction occurs.
While the invention has been shown and described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is therefore understood that numerous modifications and variations can be devised without departing from the scope of the invention.
Number | Date | Country | Kind
---|---|---|---
2014-030821 | Feb 2014 | JP | national