The present invention relates to a technology that allows a user to display desired medical information using an intuitive user interface, such as a touch panel.
In medical sites, a wide variety of medical information is generated, including waveform information, such as electrocardiograms, electroencephalograms, and the like, numerical information, such as blood pressures, body temperatures, and the like, and textual information, such as various examination reports, medical records, and the like, as well as image information obtained by various modalities, such as CT, MRI, US, PET, and the like.
Some medical institutions have established a system for managing such medical information. For example, such medical information is stored in a database as electronic data, then medical information desired by a user is selected in response to a request from a client terminal, and the selected information is displayed on a display device connected to the client terminal.
In order to improve the operability of selecting and displaying such medical information, various user interfaces have been proposed. For example, Japanese Unexamined Patent Publication No. 2003-260030 describes a user interface in which the user is allowed to specify, using a pointing device, a region within a human shape or within an image representing a portion of a body displayed on a display screen and, when a region is specified, medical image information of a diseased region within or around the specified area is extracted from a medical database and a list of the extracted medical image information is displayed.
Further, Japanese Unexamined Patent Publication No. 2009-119000 describes another user interface in which the user is allowed to draw a reference line in an axial cross-sectional image by a touch operation using an input device having a touch screen display connected to and used with a medical image processing workstation and, when the line is drawn, a coronal cross-sectional image with the reference line as the cutting plane is generated and displayed.
The user interface described in Japanese Unexamined Patent Publication No. 2003-260030, however, is intended to obtain as much medical image information as possible by specifying a single region, so its use causes a list of a very large amount of medical information to be displayed; given the wide variety of medical information generated in medical sites, it may sometimes be difficult to intuitively narrow down the range of required medical information appropriately and rapidly. The user interface described in Japanese Unexamined Patent Publication No. 2009-119000 is intended to switch an already selected image to another and not to appropriately narrow down the range of required medical information.
The present invention has been developed in view of the circumstances described above, and it is an object of the present invention to provide a medical information display apparatus, method, and program that allow a user to obtain desired medical information more easily through a more intuitive operation.
A medical information display apparatus of the present invention is an apparatus, including:
a display means for displaying given information;
a gesture input means for detecting a gesture operation performed on a display surface of the display means and outputting gesture information representing a content of the detected gesture operation;
a first display control means for displaying a subject appearance image representing an appearance of a subject at a predetermined display position of the display means based on image data of the subject appearance image, wherein each position of the image is related to region identification information for identifying a region of the subject;
a gesture type analysis means for determining, based on gesture information outputted according to a gesture operation detected by the gesture input means while the subject appearance image is displayed, a gesture type representing to which of a plurality of predetermined gesture operation patterns the detected gesture operation corresponds;
a gesture region analysis means for identifying a gesture region which is a region of the subject corresponding to the detected gesture operation based on information of the display position of the subject appearance image, the region identification information related to the subject appearance image data, and gesture position information representing a position on which the gesture operation has been performed included in the gesture information outputted according to the gesture operation while the subject appearance image is displayed;
an obtaining condition identification means for identifying a medical information obtaining condition for obtaining medical information of the subject corresponding to the gesture operation performed while the subject appearance image is displayed based on the gesture type and the gesture region;
a medical information obtaining means for selectively obtaining medical information satisfying the identified medical information obtaining condition from a medical information storage means storing a plurality of sets of medical information; and
a second display control means for displaying the obtained medical information on the display means.
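The cooperation of these means can be illustrated with a minimal sketch, assuming a detected gesture has already been analyzed into a gesture type and a gesture region; all function and data names here are hypothetical and serve only to show the flow from condition identification to display, not an implementation of the claimed apparatus.

```python
# Hypothetical end-to-end sketch of the means enumerated above: an analyzed
# gesture (type and region) is turned into a medical information obtaining
# condition, which selects matching sets of information from storage, and the
# result is handed to the display.
def obtain_and_display(gesture_type, gesture_region, reference_table, storage, display):
    # Obtaining condition identification means: look up the (type, region) pair.
    condition = reference_table.get((gesture_type, gesture_region))
    if condition is None:
        return None
    # Medical information obtaining means: select the sets satisfying the condition.
    obtained = [info for info in storage
                if all(info.get(k) == v for k, v in condition.items())]
    # Second display control means: hand the obtained information to the display.
    display(obtained)
    return obtained
```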
A medical information display system of the present invention is a system in which a medical information supply apparatus for selectively supplying medical information of a subject based on a given medical information obtaining condition and the medical information display apparatus of the present invention are communicatively linked via a network.
Here, the medical information supply apparatus may include: a medical information storage means storing a plurality of sets of medical information in a data structure that allows selection of medical information based on a given medical information obtaining condition; an obtaining condition receiving means for receiving a medical information obtaining condition from the medical information display apparatus; a medical information retrieval means for obtaining medical information satisfying the received medical information obtaining condition from the medical information storage means; and a medical information transmission means for transmitting the obtained medical information to the medical information display apparatus that has transmitted the medical information obtaining condition.
A medical information display method of the present invention is a method, including:
a step of displaying a subject appearance image representing an appearance of a subject at a predetermined display position of a display means based on image data of the subject appearance image, wherein each position of the image is related to region identification information for identifying a region of the subject;
a step of detecting a gesture operation performed on a display surface of the display means while the subject appearance image is displayed and outputting gesture information representing a content of the detected gesture operation;
a step of determining a gesture type representing to which of a plurality of predetermined gesture operation patterns the detected gesture operation corresponds based on the outputted gesture information;
a step of identifying a gesture region which is a region of the subject corresponding to the detected gesture operation based on information of the display position of the subject appearance image, the region identification information related to the subject appearance image data, and gesture position information representing a position on which the gesture operation has been performed included in the gesture information outputted according to the gesture operation;
a step of identifying a medical information obtaining condition for obtaining medical information of the subject corresponding to the gesture operation performed while the subject appearance image is displayed based on the gesture type and gesture region;
a step of selectively obtaining medical information satisfying the identified medical information obtaining condition from a medical information storage means storing a plurality of sets of medical information; and
a step of displaying the obtained medical information on the display means.
A medical information display control program of the present invention is a program for causing a computer to perform:
a step of displaying a subject appearance image representing an appearance of a subject at a predetermined display position of a display means based on image data of the subject appearance image, wherein each position of the image is related to region identification information for identifying a region of the subject;
a step of detecting a gesture operation performed on a display surface of the display means while the subject appearance image is displayed and outputting gesture information representing a content of the detected gesture operation;
a step of determining a gesture type representing to which of a plurality of predetermined gesture operation patterns the detected gesture operation corresponds based on the outputted gesture information;
a step of identifying a gesture region which is a region of the subject corresponding to the detected gesture operation based on information of the display position of the subject appearance image, the region identification information related to the subject appearance image data, and gesture position information representing a position on which the gesture operation has been performed included in the gesture information outputted according to the gesture operation;
a step of identifying a medical information obtaining condition for obtaining medical information of the subject corresponding to the gesture operation performed while the subject appearance image is displayed based on the gesture type and gesture region;
a step of selectively obtaining medical information satisfying the identified medical information obtaining condition from a medical information storage means storing a plurality of sets of medical information; and
a step of displaying the obtained medical information on the display means.
In the present invention, a touch panel type input means may be used to input a gesture.
The subject appearance image may be an image schematically representing the subject.
The subject appearance image may be displayed by changing the appearance of the subject in the subject appearance image to a predetermined appearance according to the detected gesture operation based on the gesture type and/or the gesture region corresponding to the detected gesture operation while the subject appearance image is displayed.
Otherwise, the subject appearance image may be displayed by changing, based on a first gesture type determined with respect to a first gesture operation detected while the subject appearance image is displayed and/or a first gesture region identified with respect to the first gesture operation, the appearance of the subject in the subject appearance image to a predetermined appearance according to the detected gesture operation and, based on at least some of the first gesture type, the first gesture region, a second gesture type determined by the gesture type analysis means with respect to a second gesture operation detected while the changed subject appearance image is displayed, and a second gesture region identified by the gesture region analysis means with respect to the second gesture operation, a medical information obtaining condition corresponding to the first and second gesture operations may be identified.
When identifying a medical information obtaining condition, the medical information obtaining condition may be identified based on reference data in which a medical information obtaining condition is related to a combination of a gesture type and a gesture region.
The reference data may be data in which one or more medical information obtaining conditions are related to a pair of a gesture type and a gesture region and, if two or more medical information obtaining conditions are related to the pair, a priority may further be related to each of the medical information obtaining conditions.
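Such reference data can be sketched as follows; the gesture names, region names, and condition contents are illustrative assumptions, not values defined by the invention.

```python
# Hypothetical reference data: each (gesture type, gesture region) pair is
# related to one or more medical information obtaining conditions; when more
# than one condition is related to a pair, a priority value orders them.
REFERENCE_DATA = {
    ("tap", "chest"): [
        {"condition": {"region": "chest", "type": "CT"}, "priority": 1},
        {"condition": {"region": "chest", "type": "electrocardiogram"}, "priority": 2},
    ],
    ("pinch", "abdomen"): [
        {"condition": {"region": "abdomen", "type": "CT"}, "priority": 1},
    ],
}

def identify_conditions(gesture_type, gesture_region):
    """Return the related obtaining conditions, highest priority first."""
    entries = REFERENCE_DATA.get((gesture_type, gesture_region), [])
    return [e["condition"] for e in sorted(entries, key=lambda e: e["priority"])]
```

Editing the reference data, as mentioned below, would then amount to adding, removing, or reordering entries of this structure.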
Further, an arrangement may be adopted in which the reference data are allowed to be edited.
Medical information satisfying a given condition may be pre-obtained from the medical information storage means. Further, each region of the subject represented in the subject appearance image may be displayed such that a region whose medical information is included in the pre-obtained medical information differs in appearance from a region whose medical information is not included in the pre-obtained medical information.
When displaying medical information, if a plurality of sets of medical information with respect to examinations of the same type performed at different examination times is obtained by the medical information obtaining means, the plurality of sets of medical information may be displayed in a comparable manner.
When a medical image representing the subject is obtained from the medical information storage means, predetermined image processing may be performed on the obtained medical image to obtain a medical image satisfying the medical information obtaining condition, as required.
When a plurality of sets of medical information satisfying the medical information obtaining condition is obtained, the plurality of sets of medical information may be list-displayed to receive selection of medical information to be displayed, and the selected medical information may be displayed.
Here, when list-displaying the plurality of extracted sets of medical information, the plurality of sets of medical information may be displayed in the form of thumbnails or icons.
According to the present invention, the following are performed: receiving, while a subject appearance image is displayed, a gesture operation performed on the display surface of the image; determining a gesture type representing to which of a plurality of predetermined gesture operation patterns the detected gesture operation corresponds based on gesture information representing a content of the gesture operation; identifying a gesture region which is a region of the subject corresponding to the detected gesture operation based on information of the display position of the subject appearance image, region identification information related to the subject appearance image data, and gesture position information representing a position on which the gesture operation has been performed; identifying a medical information obtaining condition for obtaining medical information of the subject corresponding to the gesture operation performed while the subject appearance image is displayed based on the gesture type and gesture region; selectively obtaining medical information satisfying the identified medical information obtaining condition from a medical information storage means storing a plurality of sets of medical information; and displaying the obtained medical information on the display means. This allows the user to obtain desired medical information more easily through a more intuitive operation.
In the present embodiment, medical information generated in the electronic medical record system 4, the image diagnostic system 5, the endoscopic examination system 6, the pathological examination system 7, and each clinical department system 8 is integrally collected and managed by the medical information management server 2. The medical information display apparatus 1 makes a request for desired medical information to the medical information management server 2 and displays medical information satisfying the request supplied from the medical information management server 2.
As illustrated in
As illustrated in
The CPU 15 performs each processing by loading middleware, such as an operating system and the like, and each program, such as application software for obtaining and displaying medical information of the present invention, stored in the auxiliary storage device 17 to the main memory 16. This allows receiving of user input via the touch panel 11, input/output control, such as display control of various types of information, including medical information, on the liquid crystal display 12, communication via the communication interface 18, and the like.
As for the auxiliary storage device 17, a well-known flash memory drive (SSD: Solid State Drive) or a hard disk drive (HDD) is used. The auxiliary storage device 17 includes each program described above installed therein. The application software for displaying medical information of the present invention may be installed from a recording medium, such as a CD-ROM or the like, using a disk drive connected to the medical information display apparatus 1, or installed after being downloaded from a storage device of a server linked to the apparatus 1 via a network, such as the Internet or the like. Further, the auxiliary storage device 17 is used for temporarily storing medical information obtained from the medical information management server 2.
As for the touch panel 11, any known type may be used, including resistive, capacitive, electromagnetic, surface acoustic (ultrasonic) wave, infrared, and the like. In the present embodiment, a projected capacitive touch panel capable of detecting a multi-touch, i.e., touches at a plurality of positions, is used in order to explain a wide variety of gesture patterns. Touch operations on the touch panel 11 are performed with a finger of the user or with a predetermined pen or the like. The touch panel 11 detects the start of touching thereon, movement of the touched position, and end of the touching at a time interval defined by the control program, and outputs information of the detected touch type and touched position at the time in a coordinate system of the touch panel 11. The term “start of touching” as used herein refers to an operation of touching a new position on the touch panel 11, the term “movement of touched position” refers to an operation of moving the touched position with the touch panel 11 being kept touched, and the term “end of touching” refers to an operation of removing the touch from the touch panel. This allows various gesture operations performed on the touch panel 11 to be detected. That is, a series of operations from the start of touching, through movement of the touched position, to the end of the touching is detected as one gesture operation, and the touch type and position information detected at each time point of the series of operations is obtained as gesture information. The correspondence relationship between the coordinate system of the touch panel 11 and the coordinate system of the liquid crystal display 12 is identified through calibration at the time when the medical information display apparatus 1 is manufactured, so that a mutual coordinate conversion is possible.
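The grouping of a series of touch events into one gesture's worth of gesture information can be sketched as follows; the event names and the dictionary layout of the gesture information are assumptions for illustration only.

```python
# Hypothetical accumulator that groups the raw touch events reported by the
# panel (start of touching, movement of the touched position, end of touching)
# into the gesture information for one gesture operation.
class GestureRecorder:
    def __init__(self):
        self.events = []  # (touch type, touched position) at each sampling time

    def on_event(self, touch_type, position):
        """Record one event; return the gesture information once the gesture ends."""
        self.events.append((touch_type, position))
        if touch_type == "end":
            gesture = {"positions": [p for _, p in self.events],
                       "types": [t for t, _ in self.events]}
            self.events = []  # ready for the next gesture operation
            return gesture
        return None
```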
Hereinafter, the coordinate system of the liquid crystal display 12 and the coordinate system of the touch panel 11 are assumed to be the same coordinate system and referred to as the coordinate system of the display apparatus in order to simplify the description.
The communication interface 18 controls communication through a well-known mobile communication network, wireless LAN, and the like. In the present embodiment, communication with the medical information management server 2 is performed via the communication interface 18.
Meanwhile, the medical information management server 2 is a computer having a medical information database. As for the hardware configuration, it includes an external storage device in addition to well-known hardware devices, including a CPU, main memory, auxiliary storage device, I/O interface, communication interface, data bus, and the like. The medical information management server 2 is provided with application software for medical information registration in and extraction from the database, as well as a well-known operating system and database management software. Such software is installed from a recording medium, such as a CD-ROM or the like, or after being downloaded from a storage device of a server linked thereto via a network, such as the Internet or the like.
The electronic medical record system 4 employs a known computer system and is of a configuration in which, for example, a terminal of each clinical department and an electronic medical record management server having an electronic medical record database in which electronic medical record information is stored are communicatively linked via a network. Electronic medical record information inputted from a terminal of each clinical department and the like is managed using the electronic medical record database. For example, the electronic medical record includes: patient information, such as the name, date of birth, gender, and the like of a patient; examination history information, such as the date, contents, results, and the like of each examination received; diagnostic history, such as the date of diagnosis received, major complaint, determined disease name, and the like; and treatment history information, such as the date of operation, procedure, or medication and the like. In the present embodiment, the electronic medical record database has a database structure in which a patient ID for identifying each patient is related to the aforementioned electronic medical record.
The image diagnostic system 5 also employs a known computer system and is of a configuration in which, for example, an image diagnostic medical workstation, an image management server having a database storing image data captured by modalities, such as CT, MRI, and the like, and an image interpretation report server having an image interpretation report database storing image interpretation reports of image interpretation results of the captured images are communicatively linked via a network. Here, the image diagnostic medical workstation is capable of performing known image processing, such as MIP, MPR, CPR, volume rendering (VR), or the like, according to the purpose or target of the diagnosis, in combination with known image analyses, such as bone extraction/elimination, blood vessel extraction, organ extraction, detection of abnormal tissue patterns, or the like, and these processed/analyzed images are also stored in the image database. The image data may include both two-dimensional images (pixel data) and three-dimensional images (voxel data), and both still images and moving images. In addition to the patient ID, the image database includes other auxiliary information related to each image, such as an image ID for identifying each image, modality information by which the image is obtained, region information of a subject in the image, and the like. The modality information is provided by the modality at the time of image generation. The region information of a subject may be provided by the modality at the time of image generation based on the examination order or the like or, if the image is a tomographic image, such as a CT image or the like, the region information may be provided by the image diagnostic medical workstation for each slice using a well-known region recognition technique, such as that described in Japanese Unexamined Patent Publication No. 2008-259682.
The image interpretation report database has a database structure in which each image interpretation report, patient ID, and image ID of an interpretation target image are related to each other. Each image data or image interpretation report may be indirectly related to the patient ID by way of examination ID for identifying each examination (imaging).
The endoscopic examination system 6 also employs a known computer system and includes an endoscopic examination management server with an endoscopic examination database having therein real endoscopic images obtained by various types of endoscopes, endoscopic examination reports which include summaries of endoscopic examination results, and the like related to the examination IDs and patient IDs, and access control to the endoscopic examination database is performed by the server.
The pathological examination system 7 also employs a known computer system and includes a pathological examination management server with a pathological examination database having therein microscope images obtained by pathological examinations, pathological examination reports which include summaries of pathological examination results, and the like related to examination IDs and patient IDs, and access control to the pathological examination database is performed by the server.
Each clinical department system 8 includes a management server of each clinical department with a database of each clinical department having therein examination data, examination reports, and the like unique to each clinical department related to the examination IDs and patient IDs, and access control to the database of each clinical department is performed by each server. The examination data unique to each clinical department may be, for example, electrocardiogram data and the like (waveforms, numerical values, or the like) if the clinical department is a cardiovascular department, auditory test data and the like (waveforms, numerical values, or the like) if the department is an otolaryngology department, or visual acuity test data, fundus examination data, or the like (numerical values, or the like) if the department is an ophthalmology department.
In the present embodiment, the endoscopic examination system 6 and pathological examination system 7 are systems separate from each clinical department system 8, but they may be integrated as a part of each clinical department system 8. In this case, information of endoscopic examinations and pathological examinations is managed as examination data of each clinical department according to the content of each examination.
A first embodiment of the present invention is an embodiment in which medical information is obtained from the medical information management server 2 according to each of various types of touch panel operations performed in the medical information display apparatus 1 and displayed on the liquid crystal display 12.
The medical information database 53 has a database structure in which patient ID, index information (to be described later) corresponding to medical information obtaining condition, and real data of the medical information are related.
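This database structure can be sketched, for example, with an in-memory SQLite table; the column names and the sample record are assumptions for illustration, not the actual schema of the medical information database 53.

```python
import sqlite3

# Hypothetical schema: patient ID, index information (examination region and
# information type, corresponding to the medical information obtaining
# condition), and real data of the medical information are related per record.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE medical_information (
    patient_id TEXT, examination_region TEXT, information_type TEXT, real_data BLOB)""")
conn.execute("INSERT INTO medical_information VALUES (?, ?, ?, ?)",
             ("012345", "abdominal region", "CT", b"<axial image data>"))

def retrieve(patient_id, region, info_type):
    """Select real data satisfying a medical information obtaining condition."""
    rows = conn.execute(
        "SELECT real_data FROM medical_information "
        "WHERE patient_id = ? AND examination_region = ? AND information_type = ?",
        (patient_id, region, info_type)).fetchall()
    return [r[0] for r in rows]
```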
The medical information registration unit 51 of the medical information management server 2 obtains medical information generated in other systems (the electronic medical record system 4, the image diagnostic system 5, the endoscopic examination system 6, the pathological examination system 7, and each clinical department system 8) at a predetermined time interval, extracts the patient ID and index information from the obtained medical information, converts the obtained medical information to the data structure of the medical information database 53, and registers the information in the medical information database 53. This causes display target medical information for the medical information display apparatus 1 to be accumulated in the medical information database 53.
First, in the medical information display apparatus 1, the patient ID input UI 31 receives a patient ID and stores the inputted patient ID to a predetermined area of the main memory 16 (#1). More specifically, the patient ID is received, for example, using a software keyboard system in which an image of a keyboard or a numeric keypad is displayed on the liquid crystal display 12 and a key input displayed at the touched position on the touch panel 11 is received.
Next, obtaining condition input UI 32 reads a human body icon image (
The gesture type analysis unit 33 determines to which of a plurality of predetermined gesture patterns the inputted gesture corresponds based on the gesture information and outputs a result of the determination to a predetermined area for storing gesture information in the main memory 16 (#4). If no gesture pattern corresponding to the inputted gesture is identified, the processing returns to a waiting state for input of a new gesture.
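The pattern determination can be sketched as follows for two simple patterns; the pattern names and the 10-pixel movement threshold are assumptions for illustration, and a real analysis unit would cover many more patterns, including multi-touch ones.

```python
import math

# Hypothetical gesture pattern determination from the touched positions in the
# gesture information: little overall movement is treated as a "tap" and a
# long stroke as a "swipe"; an empty gesture matches no pattern (None).
def determine_gesture_type(positions):
    if not positions:
        return None  # no corresponding gesture pattern identified
    start, end = positions[0], positions[-1]
    distance = math.hypot(end[0] - start[0], end[1] - start[1])
    return "tap" if distance < 10 else "swipe"
```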
The gesture region analysis unit 34 first identifies on which position of the human body icon 45 the gesture input has been performed based on the gesture information and the position information of the human body icon 45. That is, because the gesture information is position information in the coordinate system of the display device while the position information of the human body icon 45 is position information in the coordinate system of the human body icon 45, the gesture region analysis unit 34 converts the two sets of position information to position information in the same coordinate system using information of the display position of the human body icon 45 in the coordinate system of the display device. This allows a relative gesture position, which is a point on the human body icon 45 constituting the gesture input, to be identified. The relative gesture position may be represented by either the coordinate system of the display device or the coordinate system of the human body icon 45. In the present embodiment, it is represented by the coordinate system of the human body icon. Next, the gesture region analysis unit 34 identifies region information for identifying a region of the human body icon 45 related to the relative gesture position of the gesture and outputs the identified region information to a predetermined area for storing gesture region information in the main memory 16 (#5). If no region corresponding to the inputted gesture is identified on the human body icon 45, or if a gesture operation is performed only outside the human body icon 45, the processing returns to a waiting state for input of a new gesture.
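The coordinate conversion and region lookup can be sketched as follows, assuming the origin of the human body icon's coordinate system lies at its display position and that regions are looked up point by point (a real implementation would relate areas of the icon, not single points, to region identification information).

```python
# Hypothetical coordinate conversion by the gesture region analysis unit: a
# gesture position in the display device coordinate system is converted into
# the coordinate system of the human body icon 45, then the region
# identification information related to that relative position is looked up.
def to_icon_coordinates(gesture_pos, icon_display_pos):
    return (gesture_pos[0] - icon_display_pos[0],
            gesture_pos[1] - icon_display_pos[1])

def identify_gesture_region(gesture_pos, icon_display_pos, region_of):
    """region_of maps icon-coordinate positions to region identification info."""
    relative = to_icon_coordinates(gesture_pos, icon_display_pos)
    return region_of.get(relative)  # None if the gesture missed the icon
```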
With reference to the obtaining condition table 46, the obtaining condition identification unit 35 identifies a medical information obtaining condition corresponding to the gesture type information outputted from the gesture type analysis unit 33 and the gesture region information outputted from the gesture region analysis unit 34, and outputs the identified medical information obtaining condition to a predetermined area of the main memory 16 (#6). If no medical information obtaining condition corresponding to the inputted gesture is identified, the processing returns to a waiting state for input of a new gesture.
Next, the medical information obtaining unit 36 of the medical information display apparatus 1 transmits the medical information obtaining condition set by the obtaining condition identification unit 35 to the medical information management server 2 (#7). The medical information retrieval unit 52 of the medical information management server 2 receives the medical information obtaining condition from the medical information display apparatus 1 (#8), searches the medical information database 53 to extract real data of the medical information satisfying the received medical information obtaining condition (#9), and transmits the extracted real data of the medical information to the medical information display apparatus 1 (#10). The medical information obtaining unit 36 of the medical information display apparatus 1 receives the transmitted real data of the medical information and stores them in a predetermined area of the main memory 16 or in the auxiliary storage device 17 (#11). Then, the medical information display control unit 37 displays the medical information on the liquid crystal display 12 based on the received real data of the medical information (#12). If no medical information satisfying the medical information obtaining condition is registered in the medical information database 53, a notification to that effect is displayed.
As described above, in the first embodiment of the present invention, when a gesture performed on the human body icon 45 is inputted from the touch panel 11 of the medical information display apparatus 1, a medical information obtaining condition corresponding to the gesture is identified, then medical information satisfying the identified medical information obtaining condition is extracted from the medical information database 53 of the medical information management server 2, and the extracted medical information is displayed on the liquid crystal display 12 of the medical information display apparatus 1. Hereinafter, a series of processing steps performed until medical information corresponding to each of various gesture inputs is obtained will be described in detail.
The medical information registration unit 51 collects medical information from each system and creates registration data to be registered in the medical information database 53 using the collected medical information. More specifically, a patient ID is extracted from each item of the collected medical information and set in the patient ID entry of the medical information database 53. Information of a region of a patient is extracted from auxiliary information of each item of the collected medical information or the like and set in the examination region entry. Likewise, the type of the information is extracted from the auxiliary information of each item of the collected medical information or the like and set in the information type entry. Note that values based on a code system designed in advance are allocated to the examination region and information type. The examination region and information type of each item of medical information are automatically set by the medical information registration unit 51 based on a predetermined setting condition. The collected real data are set in the real data entry. Then, the created registration data are registered in (inserted into) the medical information database 53. For example, in the case of axial cross-sectional image data, “abdominal region” is set as the examination region and “CT” is set as the information type, as the index information in the present embodiment (Information No. 11).
In the present embodiment, the gesture type analysis unit 33 sequentially determines to which of four gesture patterns, namely knife, specification, seize, and hammer, each inputted gesture corresponds.
If the trajectory of the inputted gesture is recognized as a straight line (line segment) by a known pattern recognition process based on position information at each time point from the start of touching, through movement of touching, to the end of touching included in the gesture information, the gesture type analysis unit 33 recognizes the gesture as a knife gesture and outputs gesture type information representing a knife gesture. Thus, the gesture in the example shown in
If the gesture type information is a knife gesture, the gesture region analysis unit 34 identifies, based on the position information at each time point from the start of touching, through movement of touching, to the end of touching included in the gesture information, a relative position on the human body icon 45 at each time point by the coordinate conversion described above, and further obtains information of a body region and a region detail related to each identified relative position. In the example shown in
If the inputted gesture is recognized as a knife gesture by the gesture type analysis unit 33, the obtaining condition input UI 32 may display the human body icon 45 separated along the trajectory of the gesture, as illustrated in the example shown in
If the amount of movement of the touched position is determined to be smaller than a predetermined threshold value (close to zero) based on position information at each time point from the start of touching, through movement of touching, to the end of touching included in the gesture information, the gesture type analysis unit 33 recognizes the gesture as a specification gesture and outputs gesture type information representing a specification gesture. Thus, the gesture in the example shown in
If the gesture type information is a specification gesture, the gesture region analysis unit 34 identifies, based on the position information at the start of touching or end of touching, a relative position on the human body icon 45 at the time by the coordinate conversion described above, and further obtains information of a body region and a region detail related to the identified relative gesture position. In the example shown in
If the inputted gesture is recognized as a specification gesture by the gesture type analysis unit 33, the obtaining condition input UI 32 may display an organ or the like (heart, in this case) represented by the region detail information related to the specified position in a display mode different from that of the other areas of the human body icon 45, as illustrated in the example shown in
If, by a known pattern recognition process based on position information at each time point from the start of touching, through movement of touching, to the end of touching included in the gesture information, a trajectory is recognized in which two touched points are first moved in a first direction so as to come closer to each other and are then moved in a second direction different from the first direction while maintaining the distance between them at the end of the movement in the first direction, the gesture type analysis unit 33 recognizes the gesture as a seize gesture and outputs gesture type information representing a seize gesture. Thus, the gesture in the example shown in
If the gesture type information is a seize gesture, the gesture region analysis unit 34 identifies, based on position information from the start of touching, through movement of touching, to the end of touching included in the gesture information, a position between two points at a time point at which the movement of the two touched points in the first direction ends and the movement direction is about to be changed (end point of each arrow (1) in
If the inputted gesture is recognized as a seize gesture by the gesture type analysis unit 33, the obtaining condition input UI 32 may display an organ or the like (heart, in this case) represented by the region detail information obtained by the gesture region analysis unit 34 in a display mode different from that of the other areas of the human body icon 45 and in an animated fashion in which the heart is seized out of the human body icon 45 by moving the heart in the second direction (arrows (2) direction), as illustrated in the example shown in
If the amount of movement of the touched position is determined to be smaller than a predetermined threshold value (close to zero) and the gesture time from the start to the end of touching is longer than a predetermined threshold value, based on position information at each time point from the start of touching, through movement of touching, to the end of touching included in the gesture information, the gesture type analysis unit 33 recognizes the gesture as a hammer gesture and outputs gesture type information representing a hammer gesture. If the gesture time is shorter than the predetermined threshold value, the gesture is recognized as a specification gesture in this recognition method. Thus, the gesture in the example shown in
If the gesture type information is a hammer gesture, the gesture region analysis unit 34 identifies, based on position information at the time point of the start or end of touching included in the gesture information, a relative gesture position on the human body icon 45 by the coordinate conversion described above, and further obtains body region information and detailed body region information related to the identified relative gesture position. In the example shown in
If the inputted gesture is recognized as a hammer gesture by the gesture type analysis unit 33, the obtaining condition input UI 32 may display an organ or the like (brain, in this case) represented by the region detail information related to the specified position in a display mode in which the brain appears to be broken down, as illustrated in the example shown in
For example, the gesture type analysis unit 33 may be configured to recognize an operation gesture of a predetermined medical instrument performed on the human body icon 45 (e.g., a gesture of an endoscope insertion operation) in addition to the gestures described above, and the gesture region analysis unit 34 may be configured to recognize the operation target region of the medical instrument. Further, gestures different from those described above may be related to the respective gesture patterns described above.
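The single-touch branch of the four-way recognition described above (knife, specification, hammer; the two-finger seize gesture is omitted for brevity) can be sketched as follows. The thresholds, the trajectory representation as a list of points, and the straight-line test are assumptions of this sketch; the embodiment only states that a known pattern recognition process is used.

```python
import math

MOVE_THRESHOLD = 5.0      # pixels: below this, the touch "did not move" (assumed value)
LONG_PRESS_SECONDS = 1.0  # press longer than this becomes a hammer gesture (assumed value)

def is_straight(points, tolerance=3.0):
    """Crude straight-line check: every point lies close to the start-end chord."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    if length == 0:
        return False
    for (x, y) in points:
        # Perpendicular distance from (x, y) to the chord.
        dist = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / length
        if dist > tolerance:
            return False
    return True

def classify_single_touch(points, duration):
    """points: (x, y) samples from touch start to touch end; duration in seconds."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    moved = math.hypot(x1 - x0, y1 - y0)
    if moved < MOVE_THRESHOLD:
        # Stationary touch: a long press is a hammer, a short one a specification.
        return "hammer" if duration > LONG_PRESS_SECONDS else "specification"
    # A moving touch whose trajectory is (nearly) a straight line is a knife.
    if is_straight(points):
        return "knife"
    return None  # unrecognized; processing waits for a new gesture
```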
As described above, the medical information obtaining condition identified by the obtaining condition identification unit 35 includes an examination region condition representing a condition with respect to examination region of the medical information to be obtained and an information type condition representing a condition with respect to information type. The obtaining condition identification unit 35 identifies an information type condition corresponding to the combination of gesture type information outputted from the gesture type analysis unit 33 and gesture region information outputted from the gesture region analysis unit 34 with reference to the obtaining condition table 46, and sets the gesture region information used to identify the information type condition to the examination region condition.
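The lookup described above can be sketched as a table keyed on the combination of gesture type and gesture region. The table contents below are illustrative assumptions only; the actual obtaining condition table 46 is defined per installation and is editable, as noted next.

```python
# Hypothetical contents of the obtaining condition table 46.
OBTAINING_CONDITION_TABLE = {
    ("knife", "abdominal region"): ["CT", "MR"],
    ("specification", "heart"): ["electrocardiogram"],
    ("seize", "heart"): ["VR image of heart"],
    ("hammer", "head region"): ["CT", "MR"],
}

def identify_obtaining_condition(gesture_type, gesture_region):
    """Return (examination region condition, information type condition),
    or None when no condition corresponds to the inputted gesture, in
    which case processing returns to waiting for a new gesture."""
    info_types = OBTAINING_CONDITION_TABLE.get((gesture_type, gesture_region))
    if info_types is None:
        return None
    # The gesture region used for the lookup doubles as the
    # examination region condition.
    return (gesture_region, info_types)
```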
The obtaining condition table 46 is editable by an obtaining condition table editing UI 38.
In the case of the knife gesture performed on the abdominal region illustrated, by way of example, in
The medical information retrieval unit 52 of the medical information management server 2 receives the medical information obtaining conditions identified by the obtaining condition identification unit 35 from the medical information obtaining unit 36 and searches the medical information database 53 using the received medical information obtaining conditions as the retrieval conditions in the order of priority. Note that, if medical information satisfying the current retrieval condition is extracted, database retrieval using the remaining lower priority medical information obtaining conditions is not performed. On the other hand, if no medical information satisfying the current retrieval condition is extracted, database retrieval is performed with the medical information obtaining condition next in priority below the current retrieval condition as a new retrieval condition.
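The priority-ordered retrieval described above can be sketched as follows: conditions are tried from highest priority down, and the search stops at the first condition that matches any record. The record field names are assumptions of this sketch.

```python
def retrieve_by_priority(database, conditions):
    """database: list of records (dicts). conditions: list of
    (examination_region, information_type) pairs ordered from highest to
    lowest priority. Returns the records matching the highest-priority
    condition that yields any result."""
    for region, info_type in conditions:
        hits = [rec for rec in database
                if rec["examination_region"] == region
                and rec["information_type"] == info_type]
        if hits:
            return hits  # lower-priority conditions are not searched
    return []  # no medical information satisfied any condition
```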
In the case of the knife gesture performed on the abdominal region illustrated, by way of example, in
In the case of the specification gesture performed on the heart illustrated, by way of example, in
In the case of the seize gesture performed on the heart illustrated, by way of example, in
In the case of the hammer gesture performed on the head region illustrated, by way of example, in
As described above, in the first embodiment of the present invention, different medical information may be obtained and displayed even for the same region of the human body icon 45 according to gesture patterns, such as, for example, the specification gesture performed on the heart shown in
As shown in the obtaining condition table 46 of
As described above, in the first embodiment of the present invention, a gesture inputted through the obtaining condition input UI 32 is analyzed by the gesture type analysis unit 33 and gesture region analysis unit 34, whereby a gesture pattern and a position are obtained. Then, based on the obtained gesture pattern and position, a medical information obtaining condition intuitively represented by the gesture is identified by the obtaining condition identification unit 35. Then, medical information satisfying the identified medical information obtaining condition is obtained by the medical information obtaining unit 36 from the medical information database 53 of the medical information management server 2 and the obtained medical information is displayed on the liquid crystal display 12 by the medical information display control unit 37. Thus, the user may easily narrow down and obtain desired medical information for display only by a single action of performing an intuitive gesture on the touch panel 11 of the medical information display apparatus 1. In this way, the medical information display apparatus 1 of the present embodiment has extremely high operability and a high practical value.
As the obtaining condition table editing UI 38 for editing the obtaining condition table 46 is provided, it is possible to flexibly define medical information obtaining conditions that meet the requirements of clinical sites or preferences of users, whereby the operability and flexibility of the medical information display apparatus 1 may further be enhanced, thereby contributing to further improvement of working efficiency in the clinical sites.
Further, region information is hierarchically related to the human body icon 45, like the body region information and region detail information, so that it is possible to combine the gesture pattern and gesture region more flexibly and sophisticatedly.
The first embodiment described above is arranged such that medical information corresponding to a single gesture, from the start of touching, through the movement, to the end of touching, is obtained; that is, relatively simple and easy operability is provided in the first embodiment. In other words, a user interface for beginners inexperienced in operation is provided. On the other hand, this interface alone may be insufficient in operability for skilled users. Consequently, an object of a second embodiment of the present invention is to provide a user interface having more advanced operability for skilled users. A functional structure implemented in a medical information display apparatus and a medical information management server, and a flow of display processing performed in a medical information integration system in the second embodiment of the present invention are identical to those of the first embodiment (
The gesture type analysis unit 33 and the gesture region analysis unit 34 output gesture type information and gesture region information with respect to each of a plurality of gestures inputted through the obtaining condition input UI 32. That is, the input is divided into units each extending from the start to the end of touching, and the gesture type information and gesture region information are outputted with respect to each divided gesture.
In the case of only the knife gesture illustrated, by way of example, in
In the case of the input example illustrated, by way of example, in
As described above, in the second embodiment of the present invention, the obtaining condition input UI 32 is capable of receiving an input which includes a plurality of gestures and is more complicated than that in the first embodiment. Then, with respect to each of the plurality of inputted gestures, the gesture type and gesture region are recognized by the gesture type analysis unit 33 and the gesture region analysis unit 34, and a medical information obtaining condition related to the types and regions of the plurality of gestures is identified by the obtaining condition identification unit 35. This allows an input containing more information to be received by the obtaining condition input UI 32 and a more detailed medical information obtaining condition to be identified according to the amount of information.
As described above, it is possible to identify the medical information obtaining condition for obtaining a cross-sectional image in the coronal direction by first rotating the human body icon 45 according to a rotation gesture, which is the first gesture performed through the obtaining condition input UI 32, and then receiving the up-to-down direction knife gesture which is the second gesture in the specific example.
As described above, it is possible to cause the heart icon to be displayed in response to the heart seize gesture which is the first gesture performed through the obtaining condition input UI 32, and to identify the medical information obtaining condition for obtaining the medical information of the coronary artery in response to the coronary artery specification gesture which is the subsequently performed second gesture. Further, in response to a seize gesture performed on a given organ in the human body icon 45, it is also possible to cause an icon representing the organ to be displayed and to identify the medical information obtaining condition according to the gesture performed on the icon of the organ, as in the specific example. Thus, even for a fine structure for which it is difficult to input a gesture, such as a specific component of an organ or the like, it is possible to input the gesture by displaying an icon of the organ or the like in an enlarged form in response to a first gesture performed on the human body icon and receiving a second gesture input performed on the icon of the organ or the like, thereby resulting in improved operability.
Note that the gesture pattern explained in the first embodiment may be formed of a plurality of gestures. For example, the gesture type analysis unit 33 may be configured to recognize two tapping operations within a predetermined time period as a specification gesture and three tapping operations as a hammer gesture.
In the case of the first and second embodiments, a medical information obtaining condition is identified first by the obtaining condition identification unit 35 of the medical information display apparatus 1 by analyzing an inputted gesture and then the medical information is obtained from the medical information management server 2. This may result in a prolonged wait time for the user of the medical information display apparatus 1 from the completion of the gesture to the display of the medical information, whereby the operability may be degraded. A third embodiment of the present invention is to solve such a problem.
As illustrated in the drawing, after an input of a patient ID is received by the patient ID input UI 31 as in the first embodiment (#21), the medical information pre-obtaining unit 39 transmits a medical information obtaining condition having only the inputted patient ID to the medical information management server 2 (#22). The medical information retrieval unit 52 of the medical information management server 2 receives the medical information obtaining condition (only the patient ID) from the medical information display apparatus 1 (#23), performs retrieval of the medical information database 53, and extracts medical information in the database whose patient ID corresponds to the patient ID of the received medical information obtaining condition (#24). Here, not only real data of the medical information but also index information corresponding to the medical information obtaining condition are extracted. The medical information retrieval unit 52 transmits the extracted medical information to the medical information display apparatus 1 (#25). The medical information pre-obtaining unit 39 of the medical information display apparatus 1 receives the transmitted medical information and stores the information in a predetermined area of the auxiliary storage device 17 or main memory 16 (#26).
In the meantime, the receiving of a gesture input and the setting of a medical information obtaining condition according to the gesture are performed (#27 to #31) in the medical information display apparatus 1, as in steps #2 to #7 of the first embodiment, while the aforementioned processing is performed by the medical information pre-obtaining unit 39.
Then, based on the medical information obtaining condition identified according to the inputted gesture and the index information included in the medical information obtained by the medical information pre-obtaining unit 39, the medical information extraction unit 40 extracts medical information that satisfies the identified medical information obtaining condition from the pre-obtained medical information (#32). Then, based on real data of the extracted medical information, the medical information display control unit 37 displays the medical information on the liquid crystal display 12 (#33).
As described above, in the third embodiment of the present invention, the medical information pre-obtaining unit 39 pre-obtains medical information related to the patient ID inputted through the patient ID input UI 31 from the medical information database 53 of the medical information management server 2, in parallel with the receiving of a gesture input by the obtaining condition input UI 32 and the identification of a medical information obtaining condition by the obtaining condition identification unit 35 in the medical information display apparatus 1. Thus, when medical information that satisfies the medical information obtaining condition corresponding to the inputted gesture is obtained, there is no need to access the medical information database 53 of the medical information management server 2. This eliminates the need for the user of the medical information display apparatus 1 to wait for the retrieval operation performed by the medical information management server 2 and the communication between the medical information display apparatus 1 and the medical information management server 2, so that throughput from the viewpoint of the user is improved and the operability is enhanced. Even when the medical information management server 2 and the network 9 have high loads or low performance, this embodiment may alleviate the influence thereof by pre-obtaining medical information.
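The parallelism described above can be sketched as follows: the pre-fetch runs in a background thread while the gesture input is received, so the final extraction is a local filter rather than a server round trip. The server call is a stand-in stub and all names are assumptions of this sketch.

```python
import queue
import threading

def prefetch(patient_id, fetch_from_server, out):
    """Background task: obtain all of the patient's medical information."""
    out.put(fetch_from_server(patient_id))

def display_flow(patient_id, fetch_from_server, receive_gesture_condition):
    out = queue.Queue()
    worker = threading.Thread(target=prefetch,
                              args=(patient_id, fetch_from_server, out))
    worker.start()                           # pre-fetch runs during gesture input
    condition = receive_gesture_condition()  # corresponds to steps #27 to #31
    worker.join()
    cached = out.get()
    region, info_type = condition
    # Local extraction (step #32): no further server access is needed.
    return [rec for rec in cached
            if rec["examination_region"] == region
            and rec["information_type"] == info_type]
```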
In each of the aforementioned embodiments, the user is unable to know whether or not medical information with respect to the region on which a gesture is performed is present at the time when the gesture is inputted through the obtaining condition input UI 32. A fourth embodiment is to solve this problem.
As illustrated in the drawing, after an input of a patient ID is received by the patient ID input UI 31 as in the first embodiment (#41), the medical information pre-obtaining unit 39 transmits a medical information obtaining condition having only the inputted patient ID to the medical information management server 2 (#42) as in the third embodiment. The medical information retrieval unit 52 of the medical information management server 2 receives the medical information obtaining condition (only the patient ID) from the medical information display apparatus 1 (#43), performs retrieval of the medical information database 53, and extracts medical information in the database whose patient ID corresponds to the patient ID of the received medical information obtaining condition (#44). Here, not only real data of the medical information but also index information corresponding to the medical information obtaining condition are extracted. Now, in the present embodiment, the medical information retrieval unit 52 transmits only an index portion of the extracted medical information to the medical information display apparatus 1 (#45). The medical information pre-obtaining unit 39 of the medical information display apparatus 1 receives the transmitted index portion of the medical information and stores the information in a predetermined area of the auxiliary storage device 17 or main memory 16 (#46). Further, in the present embodiment, while the aforementioned processing is performed by the medical information pre-obtaining unit 39, the display of a human body icon (step #27 in the third embodiment) is not performed.
The obtaining condition input UI 32 reads information of examination regions of the transmitted medical information, then classifies the regions in the human body icon into a group for which medical information is present and a group for which medical information is not present, and displays the human body icon 45 in a manner in which both groups are distinguishable (#47).
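The classification in step #47 can be sketched as follows, using only the pre-obtained index portion. The field names are assumptions of this sketch.

```python
def classify_icon_regions(icon_regions, index_records):
    """Split the regions of the human body icon into those for which
    medical information is present and those for which it is not, so the
    UI can render the two groups in distinguishable display modes."""
    present = {rec["examination_region"] for rec in index_records}
    with_info = [r for r in icon_regions if r in present]
    without_info = [r for r in icon_regions if r not in present]
    return with_info, without_info
```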
Then, as in the step #32 of the third embodiment, the medical information extraction unit 40 extracts medical information that satisfies the identified medical information obtaining condition from the pre-obtained medical information (#54), and medical information display control unit 37 displays the extracted medical information (#55).
As described above, in the fourth embodiment of the present invention, the obtaining condition input UI 32 displays the human body icon 45 in a manner in which a region for which displayable medical information is present and a region for which displayable medical information is not present are distinguishable. This allows the user to know whether or not medical information is present for each region of the human body icon 45 with respect to the patient specified through the patient ID input UI 31 before inputting a gesture through the obtaining condition input UI 32, whereby a redundant gesture input for which medical information cannot be obtained may be avoided and the operation efficiency may be improved.
In order to display the human body icon 45 in the manner described above, it is necessary for the obtaining condition input UI 32 to refer to medical information pre-obtained by the medical information pre-obtaining unit 39. This makes it impossible to perform the display of the human body icon 45 and pre-obtaining of the entire medical information in parallel with each other, as in the third embodiment. Here, if the medical information retrieval unit 52 transmits the entire medical information extracted based on the patient ID, as in the third embodiment, the wait time from the entry of the patient ID to the display of the human body icon is increased, whereby the operability and working efficiency are degraded. Consequently, the medical information retrieval unit 52 is configured to transmit only the index portion of the extracted medical information required by the obtaining condition input UI 32. This may largely reduce the wait time from the entry of the patient ID to the display of the human body icon in comparison with the case in which all items of medical information are received. Further, processing from the receiving of gesture input to identification of medical information obtaining condition and the processing of receiving real data of the medical information, which can be performed in parallel with each other, are performed in parallel, as in the third embodiment, so that the wait time from the completion of a gesture input to the display of desired medical information is reduced in comparison with the first and second embodiments.
Each of the embodiments described above does not take into account the case in which sets of medical information of the same patient, same examination region, and the same information type with different examination dates and times are present, i.e., the case in which a plurality of sets of medical information satisfying medical information obtaining conditions of the same priority as, for example, in the registration example of the medical information database of
A fifth embodiment of the present invention is to realize a further effective display of medical information in such a case. A functional structure implemented in a medical information display apparatus and a medical information management server, and a flow of display processing performed in a medical information integration system in the fifth embodiment of the present invention are identical to those of each embodiment described above. Note that, however, if a plurality of sets of medical information is present, all of them are transmitted from the medical information management server 2 to the medical information display apparatus 1 and for the medical information to be transmitted, not only a real data portion but also an index portion are transmitted.
In the present embodiment, the medical information display control unit 37 refers to the index portion of the display target medical information and, if sets of medical information of the same patient, the same examination region, and the same information type with different examination dates and times are present, displays them on the liquid crystal display 12 in a comparable manner.
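The grouping behind such a comparable display can be sketched as follows: records sharing patient, examination region, and information type are collected together and ordered chronologically so they can be shown side by side. The field names and date format are assumptions of this sketch.

```python
from collections import defaultdict

def group_for_comparison(records):
    """Group records by (patient, examination region, information type)
    and sort each group by examination date for side-by-side display."""
    groups = defaultdict(list)
    for rec in records:
        key = (rec["patient_id"], rec["examination_region"],
               rec["information_type"])
        groups[key].append(rec)
    return {k: sorted(v, key=lambda r: r["examination_date"])
            for k, v in groups.items()}
```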
In each of the embodiments described above, medical information having a higher priority is displayed based on the priority attached to the medical information obtaining condition. There may be a case, however, in which medical information having a lower priority corresponding to the inputted gesture is desired to be displayed. A sixth embodiment of the present invention takes such a case into account.
The medical information extraction unit 40 extracts medical information satisfying each of a plurality of medical information obtaining conditions of different priorities identified by the obtaining condition identification unit 35 (#74). Here, if one set of medical information is extracted by the medical information extraction unit 40 (#75; NO), the medical information display control unit 37 displays the medical information on the liquid crystal display 12 based on real data of the extracted medical information, as in the fourth embodiment described above (#78). On the other hand, if a plurality of sets of medical information is extracted by the medical information extraction unit 40 (#75; YES), the medical information selection UI 41 displays on the liquid crystal display 12 a medical information selection screen in which the extracted sets of medical information are listed in the order of priority of the medical information obtaining condition satisfied by each set (#76).
As described above, in the sixth embodiment of the present invention, if a plurality of sets of medical information satisfying a medical information obtaining condition identified by the obtaining condition identification unit 35 is extracted by the medical information extraction unit 40 of the medical information display apparatus 1, the medical information selection UI 41 receives a selection of medical information to be displayed, so that the user may display desired medical information by a simple touch operation, whereby the operability is further enhanced.
Each of the embodiments described above assumes that, if the medical information is an image, for example, an image generated in the image diagnostic system 5 is registered in the medical information database 53. In contrast, a seventh embodiment of the present invention deals with the case in which volume data or the like are registered in the medical information database as medical information.
If volume data are extracted by the medical information extraction unit 40 as medical information satisfying the medical information obtaining condition (#95; YES), the display image generation unit 42 reads the examination region and information type of the medical information related to the volume data and generates an image according to the content thereof. For example, in the case of Information No. 51 of
As described above, in the seventh embodiment of the present invention, instead of a generated medical image, volume data which are the original data of the image are registered in the medical information database 53 by the medical information registration unit 51, and if the medical information described above is extracted as the medical information satisfying the medical information obtaining condition, a display image is generated by the display image generation unit 42 according to the index information of the medical information. Thus, even in this case, medical information (image) identical to that of each embodiment may be displayed.
Further, in the present embodiment, if a user interface for changing an image generation condition is further provided, image generation conditions, such as the position and orientation of the cross-section, the color template and opacity curve of the volume rendering, the position of viewpoint and visual line direction, and the like, may be changed freely, and the display image generation unit 42 may generate a display image according to the changed image generation conditions. This allows more interactive medical information display to be realized in the medical information display apparatus 1.
In each of the aforementioned embodiments, a plurality of sets of medical information obtained based on medical information obtaining conditions of different priorities defined in the obtaining condition table 46 are not displayed simultaneously. Depending on the user's preference or on demands from medical sites, however, these sets of medical information may be required to be displayed at the same time. In such a case, the problem arises of determining the layout in which the plurality of sets of medical information should be displayed. An eighth embodiment of the present invention solves this problem.
In the present embodiment, the medical information obtaining unit 36 or the medical information extraction unit 40 obtains all sets of medical information, each satisfying one of the medical information obtaining conditions of different priorities. Then, with reference to the obtaining condition table 46, the medical information display control unit 37 identifies the display condition related to the medical information obtaining condition that each set of the obtained medical information satisfies and displays each set of the obtained medical information based on the identified display condition.
As described above, in the present embodiment, the medical information display control unit 37 may identify a display condition related to the medical information obtaining condition corresponding to display target medical information with reference to the obtaining condition table 46 and display the display target medical information based on the identified display condition. Thus, in the case in which a plurality of sets of display target medical information is present, the sets of medical information may be displayed in an appropriate layout.
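The lookup performed by the medical information display control unit 37 can be sketched as follows. The row layout of the obtaining condition table 46 and all field names here are assumptions made for illustration; the specification does not prescribe this data structure.

```python
# Hypothetical sketch of the obtaining condition table 46: each row relates
# a priority, a medical information obtaining condition, and a display
# condition. Field names are assumptions, not from the specification.
obtaining_condition_table = [
    {"priority": 1, "obtaining_condition": "electrocardiogram",
     "display_condition": {"region": "upper", "format": "waveform"}},
    {"priority": 2, "obtaining_condition": "CT axial image",
     "display_condition": {"region": "lower", "format": "image"}},
]

def display_all(extracted_sets):
    """Stand-in for the display control unit 37: lay out every obtained set
    of medical information using the display condition related to the
    obtaining condition that the set satisfies."""
    layout = []
    for info in extracted_sets:
        row = next(r for r in obtaining_condition_table
                   if r["obtaining_condition"] == info["satisfied_condition"])
        layout.append((row["display_condition"]["region"], info["name"]))
    return layout

sets = [
    {"name": "ECG of patient A", "satisfied_condition": "electrocardiogram"},
    {"name": "CT slice 42", "satisfied_condition": "CT axial image"},
]
print(display_all(sets))  # [('upper', 'ECG of patient A'), ('lower', 'CT slice 42')]
```

Because every display condition is tied to the obtaining condition it came from, multiple sets of simultaneously displayed medical information each land in a layout region appropriate to their kind.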
Each embodiment described above is provided for illustrative purposes only and all the explanations above should not be used to limit the technical scope of the present invention. Further, various changes and modifications made to the system configurations, hardware configurations, processing flows, module configurations, user interfaces, specific processing contents, and the like without departing from the spirit of the present invention are included in the technical scope of the present invention.
For example, a characteristic configuration of each embodiment may be combined, as appropriate, to produce a new embodiment. More specifically, the obtaining condition table 46 of the second embodiment of the present invention may be employed in the third to eighth embodiments, and the medical information selection UI 41 may be employed in the first and second embodiments.
Further, the description has been made of a case in which real data of medical information are also registered in the medical information database 53 of each embodiment. Instead, an arrangement may be adopted in which link information (address information) for gaining access to the real data is registered in the database 53, while the real data remain stored in a database of the source system of the data, and the real data are obtained based on the link information only when the medical information becomes the display target.
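This link-based arrangement amounts to lazy resolution of real data. The following sketch assumes a simple key-value store for both databases; `fetch_from_source`, the `source://` address scheme, and all field names are hypothetical stand-ins, not part of the specification.

```python
# Hypothetical source system database holding the real data.
source_system_db = {"img-001": b"<real pixel data>"}

# Sketch of the medical information database 53: instead of real data,
# only link (address) information is registered for each entry.
medical_information_db = {
    "img-001": {"examination_region": "chest", "link": "source://img-001"},
}

def fetch_from_source(link):
    # Hypothetical access to the source system's database via the link.
    key = link.split("://", 1)[1]
    return source_system_db[key]

def get_real_data(info_id):
    """Resolve the real data via the registered link information only when
    the medical information becomes the display target."""
    entry = medical_information_db[info_id]
    return fetch_from_source(entry["link"])

print(get_real_data("img-001"))  # b'<real pixel data>'
```

Deferring the fetch keeps the medical information database 53 small while still letting any registered item be displayed on demand.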
In the embodiments described above, the medical information management server 2 for integrally managing medical information is provided, and the medical information display apparatus 1 obtains medical information from the medical information database 53 of the medical information management server 2. But an arrangement may be adopted in which medical information is obtained directly from each of the other systems, such as the image diagnostic system 5, the endoscopic examination system 6, and the like.
Further, the medical information display apparatus 1 may include the medical information database 53. In this case, it is only necessary to provide the function of the medical information retrieval unit 52 in the medical information obtaining unit 36 or medical information pre-obtaining unit 39.
In the embodiments described above, the description has been made of a case in which the medical information display apparatus 1 is a portable device, as illustrated in
Further, although the content of a gesture inputted from the touch panel 11 is analyzed by the gesture type analysis unit 33 and the gesture region analysis unit 34, an arrangement may be adopted in which the whole or a part of the analysis is performed by the operating system of the medical information display apparatus 1 or by a touch panel driver (software).
Still further, the image which is an example of medical information may be a moving image instead of a still image.
Number | Date | Country | Kind |
---|---|---|---|
2010-191853 | Aug 2010 | JP | national |
| Number | Date | Country |
---|---|---|---|
Parent | PCT/JP2011/004710 | Aug 2011 | US |
Child | 13780687 | | US |