The present disclosure relates to an information-processing apparatus, a method for processing information, an information-processing system, and a program.
In recent years, information about diagnosis and medical images that are used for the diagnosis has been computerized. PTL 1 discloses that image data is deleted after a predetermined period has elapsed since being saved in order to decrease the amount of data of the medical images that are saved in a server.
PTL 1: Japanese Patent Laid-Open No. 2008-287653
An information-processing apparatus according to an embodiment of the present invention includes an identification unit that identifies plural kinds of photoacoustic images that are captured on the basis of the same photoacoustic signal and that are stored in a saving unit, and a determination unit that determines whether at least one of the plural kinds of photoacoustic images is to be deleted from the saving unit on the basis of information about the kind of a photoacoustic image that is included in the plural kinds of photoacoustic images that are identified.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Embodiments of the present invention will hereinafter be described with reference to the drawings.
In the present disclosure, an acoustic wave that is generated by expansion inside a test object when the test object is irradiated with light is referred to as a photoacoustic wave.
Attention has been paid to photoacoustic imaging as a technique for imaging the state of the inside of a test object in a minimally invasive manner. In photoacoustic imaging, an organism is irradiated with pulsed light that is generated from a light source, and photoacoustic waves that are generated from a living tissue are detected after the living tissue absorbs the energy of the pulsed light that propagates and diffuses inside the organism. In the following description, an image that is imaged by using the photoacoustic waves is referred to as a photoacoustic image. Photoacoustic imaging uses a difference in light-energy absorbance between the test object such as a tumor and another tissue, and a transducer receives the elastic waves (photoacoustic waves) that are generated when the test object absorbs the energy of the irradiated light and instantaneously expands. In the following description, a signal that is detected at this time is referred to as a photoacoustic signal. A photoacoustic imaging device can obtain the distribution of optical properties in the organism, particularly the distribution of light energy absorption density, by analyzing the photoacoustic signal. There are various kinds of photoacoustic images depending on the optical properties inside the test object. Examples of the photoacoustic images include an absorption coefficient image that represents the distribution of the absorption coefficient. An image that represents the existence or ratio of an organism molecule such as oxyhemoglobin, reduced hemoglobin, water, fat, or collagen is generated from the absorption coefficient image. For example, an image that is related to oxygen saturation, which is an indicator that represents the state of the bond between hemoglobin and oxygen, is generated on the basis of the ratio between the oxyhemoglobin and the reduced hemoglobin. The plural kinds of photoacoustic images that are generated by the photoacoustic imaging device are correlated with each other.
For example, an image that represents an absorption coefficient is generated from an image that represents an initial sound pressure and an image that represents light intensity distribution.
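As a minimal numerical sketch of this relationship (the array names and values are hypothetical illustrations, not part of the disclosure), the absorption coefficient image can be obtained by element-wise division of the initial sound pressure image by the light fluence image, with the Grueneisen coefficient assumed to be normalized to 1:

```python
import numpy as np

# Hypothetical 2 x 2 initial sound pressure image and light fluence image.
initial_sound_pressure = np.array([[2.0, 4.0],
                                   [6.0, 8.0]])
light_fluence = np.array([[1.0, 2.0],
                          [3.0, 4.0]])

# Forward generation of the absorption coefficient image by element-wise
# division (Grueneisen coefficient assumed to be 1).
absorption_coefficient = initial_sound_pressure / light_fluence

print(absorption_coefficient)
```

Because each kind of image is derived from others by such a calculation, the images generated from one photoacoustic signal are mutually correlated.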
In recent years, medical images that are used for diagnosis, including the above photoacoustic images, and various kinds of information about diagnosis have been computerized. For example, the DICOM (Digital Imaging and Communications in Medicine) standard is frequently used for information sharing between an imaging device and the various devices that are connected to the imaging device. The DICOM standard defines the format of each medical image and the communication protocol between the devices that use the medical image. Data that is transmitted and received in accordance with the DICOM standard is referred to as an information object (IOD or Information Object Definition). In the following description, the information object is referred to as an IOD or an object in some cases. Examples of the IOD include a medical image, patient information, inspection information, and a structured report. Various kinds of data related to inspection and treatment in which the medical image is used can also be included.
An image that is dealt with in accordance with the DICOM standard, that is, the IOD of an image, includes metadata and image data. The metadata includes information about a patient, inspection, a series, and the image. The metadata is an aggregate of data elements called DICOM data elements. A tag for identification of each data element is added to the corresponding DICOM data element. The image data is pixel data and has a tag representing that it is image data. For example, a patient name in the metadata has a tag representing that it is the name of the patient. In the case where the metadata and the image data make a DICOM data set, the IOD may also include DICOM file meta-information about the DICOM data set. The DICOM file meta-information includes, for example, information about the application that has created the IOD (DICOM file).
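As an illustrative sketch only (a simplified dictionary model, not the actual DICOM binary encoding; the field names and values are hypothetical), an IOD can be thought of as metadata elements keyed by tags plus tagged pixel data:

```python
# Simplified model of a DICOM IOD: metadata elements keyed by
# (group, element) tags, plus pixel data under its own tag.
# (0010,0010) Patient's Name and (7FE0,0010) Pixel Data are the
# standard DICOM tags for those elements.
iod = {
    "meta": {
        (0x0010, 0x0010): "YAMADA^TARO",   # Patient's Name (toy value)
        (0x0008, 0x0060): "OT",            # Modality (hypothetical value)
    },
    "pixel_data": {
        (0x7FE0, 0x0010): [0, 1, 2, 3],    # Pixel Data (toy values)
    },
}

# Each data element is identified by its tag.
patient_name = iod["meta"][(0x0010, 0x0010)]
print(patient_name)
```

This separation of tagged metadata from tagged pixel data is what later allows the image data to be deleted while the metadata is retained.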
The photoacoustic imaging device preferably outputs the IOD of a photoacoustic image in accordance with the DICOM standard in order to use the photoacoustic image in various devices in a medical facility. In photoacoustic imaging, the various kinds of photoacoustic images can be generated from the photoacoustic signal obtained during shooting at one time as described above. However, in the case where all of the generated photoacoustic images are saved in a device, the free storage capacity of the device may be restricted. In the case where all of the plural kinds of photoacoustic images that are associated with each other are collectively deleted, however, the association cannot be used to reuse the photoacoustic images.
The capacity for saving the image data can be decreased by merely deleting image data after a certain period has elapsed. However, a user cannot observe the deleted image data. An object of a first embodiment is to manage the IOD such that metadata that is related to the photoacoustic images is used to decrease the capacity for saving.
Structure of Information-Processing Apparatus
An example of the imaging device 102 is a photoacoustic imaging device. The control apparatus 101 controls the imaging device 102, captures a photoacoustic image on the basis of the photoacoustic signal, and outputs an IOD photoacoustic image to the information-processing apparatus 100 or the viewer 104. An example of the information-processing apparatus 100 is a PACS (Picture Archiving and Communication System). The information-processing apparatus 100 obtains and saves the IOD that is related to the photoacoustic image. The information-processing apparatus 100 manages the saving form of plural kinds of photoacoustic images that are captured during an inspection depending on the kind thereof. Specifically, the information-processing apparatus 100 deletes the image data that can be generated on the basis of another kind of the photoacoustic image and saves only the metadata. This will now be described in detail.
The information-processing apparatus 100 includes a saving unit 108, an identification unit 109, a determination unit 110, a communication unit 111, and an input-output control unit 112.
The saving unit 108 saves the IOD and various kinds of data that are obtained from the control apparatus 101 and the imaging device 102. The saving unit 108 saves information about settings of deletion of the image data and information about a grouping process that is performed by the identification unit 109.
The identification unit 109 identifies different kinds of photoacoustic images, for example, on the basis of the metadata of the IOD that is related to the photoacoustic images that are received from the control apparatus 101. Specifically, the identification unit 109 identifies and groups the photoacoustic images that are generated on the basis of the same photoacoustic signal. For example, on the basis of information about the inspection time that is written in the metadata of the IOD, the identification unit 109 determines whether the photoacoustic images are generated on the basis of the same photoacoustic signal. The identification unit 109 adds the same identifier to the photoacoustic images that are grouped and saves information about the grouping, such as the identifier, in the saving unit 108.
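The grouping described above can be sketched as follows (the metadata keys and identifier scheme are hypothetical simplifications): IODs whose metadata records the same inspection time are treated as originating from the same photoacoustic signal and receive the same group identifier.

```python
from collections import defaultdict

# Hypothetical IOD metadata: each entry records the inspection time and
# the kind of photoacoustic image.
iods = [
    {"inspection_time": "20240101T1000", "kind": "initial_sound_pressure"},
    {"inspection_time": "20240101T1000", "kind": "absorption_coefficient"},
    {"inspection_time": "20240101T1030", "kind": "oxygen_saturation"},
]

# Group the IODs that share the same inspection time, then add the same
# group identifier to every member of a group.
groups = defaultdict(list)
for iod in iods:
    groups[iod["inspection_time"]].append(iod)
for group_id, (time, members) in enumerate(sorted(groups.items())):
    for iod in members:
        iod["group_id"] = group_id

print([iod["group_id"] for iod in iods])
```

The first two IODs share an inspection time and therefore share a group identifier; the third forms its own group.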
The determination unit 110 determines whether the image data of the IOD that is saved in the saving unit 108 is to be deleted on the basis of information about the kind of the corresponding photoacoustic image and information about the group that is identified by the identification unit 109. The determination unit 110 may make the determination on the basis of information about various kinds of settings that is saved in the saving unit 108.
The communication unit 111 communicates with external devices such as the control apparatus 101 and the viewer 104 via the network 105.
The input-output control unit 112 controls the display unit 106 to cause the display unit 106 to display information. The input-output control unit 112 controls the console 107 to receive an input from the console 107.
The display unit 106 displays an image that is imaged by the photoacoustic imaging device 102 and information about inspection in response to control of the information-processing apparatus 100. The display unit 106 provides an interface for receiving a user instruction in response to control of the information-processing apparatus 100. An example of the display unit 106 is a liquid-crystal display. The console 107 transmits information about a manipulation input of a user to the information-processing apparatus 100. Examples of the console 107 include a keyboard and a mouse.
The display unit 106 and the console 107 may be integrated into a touch panel display. The display unit 106 and the console 107 may be a display unit and a console of a computer (not illustrated) that is connected to the information-processing apparatus 100 with a serial port or a network interposed therebetween provided that the information-processing apparatus 100 can input and output.
The photoacoustic imaging device 102 (also referred to below simply as the imaging device 102) performs imaging by using photoacoustic imaging. Examples of an inner region of the targeted test object include a circulatory organ region, the breast, the groin, the abdomen, and the limbs that include the fingers and the toes. In particular, the target of each photoacoustic image to be imaged may include a blood vessel region that includes a new blood vessel and plaque on a blood vessel wall, depending on characteristics that are related to light absorption inside the test object. A contrast agent may be given to a test object 1030 to image the photoacoustic image. Examples of the contrast agent include pigments such as methylene blue and indocyanine green, and gold granules. An accumulation of at least one of the above substances or a substance that is chemically modified may be used as the contrast agent.
The imaging device 102 includes an irradiation unit (not illustrated) that irradiates the test object with light and a receiver (not illustrated) that receives the photoacoustic waves from the test object.
The pulse width of the light that is emitted from the irradiation unit (not illustrated) is, for example, no less than 1 ns and no more than 100 ns. The wavelength of the light that is emitted from the irradiation unit (not illustrated) is, for example, no less than 400 nm and no more than 1600 nm. In the case where a blood vessel near a surface of the test object is imaged with high resolution, the wavelength is preferably no less than 400 nm and no more than 700 nm at which the light is greatly absorbed in the blood vessel. In the case where a deep portion of the test object is imaged, the wavelength is preferably no less than 700 nm and no more than 1100 nm at which the light is unlikely to be absorbed by water and tissue such as fat. In the case where information about oxygen saturation is to be obtained, the test object is irradiated with, for example, light at a wavelength of 756 nm and light at a wavelength of 797 nm.
The receiver (not illustrated) includes at least one transducer that can detect a frequency component at, for example, 0.1 to 100 MHz. The imaging device 102 converts a time-resolved signal that is obtained by the transducer (not illustrated) into the photoacoustic signal, which is a digital signal, and transmits the converted signal to the information-processing apparatus 100.
The control apparatus 101 controls the imaging device 102. An example of the control apparatus 101 is a computer. The control apparatus 101 includes an image-capturing unit 113 and a communication unit 114.
The image-capturing unit 113 captures the photoacoustic image on the basis of the photoacoustic signal that is obtained from the imaging device 102. Specifically, the image-capturing unit 113 reconstructs the distribution of acoustic waves at the time the light is emitted (referred to below as initial sound pressure distribution) on the basis of the photoacoustic signal. The image-capturing unit 113 obtains the absorption coefficient distribution of light inside the test object by dividing the reconstructed initial sound pressure distribution by the light fluence distribution of the test object with respect to the light with which the test object is irradiated. For example, the light fluence distribution is obtained in advance and saved in a memory, not illustrated, which the control apparatus 101 includes. The fact that the degree of absorption of light inside the test object varies depending on the wavelength of the light with which the test object is irradiated is used to obtain the concentration distribution of a substance inside the test object from the absorption coefficient distributions at the respective wavelengths. For example, the image-capturing unit 113 obtains the concentration distributions of oxyhemoglobin and deoxyhemoglobin inside the test object. The image-capturing unit 113 also obtains the oxygen saturation distribution as the ratio of the oxyhemoglobin concentration to the total hemoglobin concentration, that is, the sum of the oxyhemoglobin and deoxyhemoglobin concentrations. For example, the photoacoustic image that is generated by the image-capturing unit 113 represents information about any one of or all of the initial sound pressure distribution, the light fluence distribution, the absorption coefficient distribution, the concentration distribution of the substance, and the oxygen saturation distribution, described above.
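The final step of the chain above can be sketched numerically (toy concentration maps, not actual reconstruction output): oxygen saturation, as commonly defined, is the oxyhemoglobin concentration divided by the total hemoglobin concentration at each pixel.

```python
import numpy as np

# Hypothetical per-pixel concentration maps (arbitrary units) obtained
# from multi-wavelength absorption coefficient distributions.
oxy_hb = np.array([[3.0, 1.0],
                   [2.0, 0.0]])
deoxy_hb = np.array([[1.0, 3.0],
                     [2.0, 4.0]])

# Oxygen saturation: oxyhemoglobin over total hemoglobin at each pixel.
oxygen_saturation = oxy_hb / (oxy_hb + deoxy_hb)

print(oxygen_saturation)
```

Each kind of distribution in the chain is thus computed from the kinds upstream of it, which is what later makes regeneration of a deleted kind possible.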
The communication unit 114 communicates with the information-processing apparatus 100 and the external devices via the network 105. For example, the communication unit 114 obtains information about the order for inspection from the ordering apparatus 103 and outputs information based on the order for the inspection to the imaging device 102. The communication unit 114 also outputs, to the external device such as the information-processing apparatus 100, the data of the photoacoustic image that is captured by the image-capturing unit 113 and the IOD that includes the metadata that is related to the photoacoustic image.
The ordering apparatus 103 is a system that manages inspection information and manages the progress of the inspection by the imaging device. The inspection information includes information about an inspection ID for identification of the inspection and a shooting technique that is included in the inspection. The ordering apparatus 103 transmits information about the inspection that is carried out by the imaging device 102 to the control apparatus 101 in response to an inquiry from the control apparatus 101. The ordering apparatus 103 receives information about the progress of the inspection from the control apparatus 101.
The viewer 104 is a terminal for image diagnosis, reads the image that is stored in, for example, the information-processing apparatus 100, and displays the image for the diagnosis. A doctor observes the image that is displayed on the viewer 104 and records an image diagnosis report of information that is obtained by the observation. The image diagnosis report that is created by using the viewer 104 may be stored in the viewer 104 or may be outputted to the information-processing apparatus 100 or a report server (not illustrated) and stored therein.
The CPU (Central Processing Unit) 1001 is a control circuit that comprehensively controls the information-processing apparatus 100 and components that are connected thereto. The CPU 1001 executes programs that are stored in the ROM 1002 for the control. The CPU 1001 executes a display driver, which is software for controlling the display unit 106, for display control of the display unit 106. The CPU 1001 controls input and output for the console 107.
The ROM (Read Only Memory) 1002 stores a program in which control procedures of the CPU 1001 are written, and data. The ROM 1002 stores a boot program of the information-processing apparatus 100 and various initial data. In addition, various programs for the processes of the information-processing apparatus 100 are stored therein.
The RAM (Random Access Memory) 1003 provides a working memory area when the CPU 1001 executes an instruction program for the control. The RAM 1003 has a stack and a working area. The RAM 1003 stores programs for performing the processes of the information-processing apparatus 100 and the components that are connected thereto, and various parameters that are used for an imaging process. The RAM 1003 stores a control program that is executed by the CPU 1001 and temporarily stores various kinds of data for various kinds of control of the CPU 1001.
The storage device 1004 is an auxiliary storage device that saves various kinds of data such as an ultrasonic image and the photoacoustic image. Examples of the storage device 1004 include an HDD (Hard Disk Drive) and an SSD (Solid State Drive). The storage device 1004 preferably has a RAID (Redundant Arrays of Inexpensive Disks) structure.
The USB (Universal Serial Bus) 1005 is a connector that is connected to the console 107.
The communication circuit 1006 is a circuit for communication with various external devices that are connected to the components of a system 1000 and the network 105. For example, the communication circuit 1006 outputs information that is contained in a transfer packet to the external devices via the network 105 by using a communication technique such as TCP/IP. The information-processing apparatus 100 may include plural communication circuits to fit a desired communication form.
The graphics board 1007 includes a GPU (Graphics Processing Unit) and a video memory. For example, the GPU makes calculations that are related to a reconstruction process for generating the photoacoustic image from the photoacoustic signal.
A HDMI (registered trademark) (High Definition Multimedia Interface) 1008 is a connector that is connected to the display unit 106.
The CPU 1001 and the GPU are examples of a processor. The ROM 1002, the RAM 1003, and the storage device 1004 are examples of a memory. The information-processing apparatus 100 may include plural processors. According to the first embodiment, the processor of the information-processing apparatus 100 executes the programs that are stored in the memory to perform the functions of the components of the information-processing apparatus 100.
The information-processing apparatus 100 may include a CPU, a GPU, and an ASIC (Application Specific Integrated Circuit) that exclusively perform a specific process. The information-processing apparatus 100 may include an FPGA (Field-Programmable Gate Array) in which the specific process or all of the processes are programmed.
In the case where the information-processing apparatus 100 is not directly connected to the display unit 106 or the console 107, the information-processing apparatus 100 may not include the USB 1005, the graphics board 1007, or the HDMI 1008. Instead of the storage device 1004 that the information-processing apparatus 100 includes, the information-processing apparatus 100 may use a NAS (Network Attached Storage) that is connected to the network 105, a SAN (Storage Area Network), or both. In any case, the information-processing apparatus 100 preferably has a RAID structure.
Example of Process Performed by Information-Processing Apparatus
The items 204 that are related to a series of the respective photoacoustic images that are grouped are displayed in the list display section 202. The items 204 are displayed by characters that represent the kind of the photoacoustic images. Deleted marks 206 represent that deletion is instructed by the manipulation input into the corresponding deletion instruction sections 205. The display form of an item 207 that is selected differs from that of the other items. An image that is related to the item 207 is displayed in the image display section 203. In an example illustrated in
At a step S301, the determination unit 110 receives a user instruction to delete the image data. For example, the content of the manipulation input into the corresponding deletion instruction section 205 in
At a step S302, the identification unit 109 groups the photoacoustic images that are saved in the saving unit 108 on the basis of the metadata of the IOD that is related to the image data, deletion of which is instructed at the step S301. Specifically, the identification unit 109 identifies and groups the different kinds of photoacoustic images that are generated on the basis of the same photoacoustic signal as the image data, deletion of which is instructed. On the basis of the metadata, the identification unit 109 identifies the plural kinds of photoacoustic images that are captured on the basis of the same photoacoustic signal. For example, on the basis of information about the inspection time that is written in the metadata, the identification unit 109 identifies the photoacoustic images that are generated on the basis of the same photoacoustic signal and groups them into a group. The identification unit 109 adds an identifier to each group for management.
At a step S303, the determination unit 110 determines whether forward generation of the photoacoustic image, deletion of which is instructed at the step S301, is possible on the basis of the plural kinds of photoacoustic images that are grouped at the step S302. Specifically, the determination unit 110 determines whether part thereof is to be deleted from the saving unit 108 on the basis of information about the kind of each photoacoustic image of the plural kinds of photoacoustic images that are grouped. The determination whether the forward generation is possible will be described in detail later.
At a step S304, the process branches on the basis of the result of the determination at the step S303. In the case where it is determined that the forward generation is possible at the step S303, the flow proceeds to a step S311. In the case where it is determined that the forward generation is impossible, the flow proceeds to a step S305.
At the step S305, the determination unit 110 determines whether backward generation of the photoacoustic image, deletion of which is instructed at the step S301, is possible on the basis of the plural kinds of photoacoustic images that are grouped at the step S302. Specifically, the determination unit 110 determines whether part thereof is to be deleted from the saving unit 108 on the basis of the information about the kind of each photoacoustic image of the plural kinds of photoacoustic images that are grouped. The determination whether the backward generation is possible will be described in detail later.
At a step S306, the process branches on the basis of the result of the determination at the step S305. In the case where it is determined that the backward generation is possible at the step S305, the flow proceeds to the step S311. In the case where the backward generation is impossible, the flow proceeds to a step S307.
The determination whether the forward generation is possible and the determination whether the backward generation is possible will now be described in detail.
When the determination unit 110 determines whether the forward generation of the photoacoustic image (referred to below as a target image) the kind of which is to be deleted is possible, the determination unit 110 determines whether all of the photoacoustic images (referred to below as upstream images) the kind of which is adjacent to the target image in the upstream direction belong to the same group that is generated by the identification unit 109. In the example illustrated in the table in
When the determination unit 110 determines whether the backward generation of the target image is possible, the determination unit 110 determines whether all of the photoacoustic images (referred to below as downstream images) the kind of which is adjacent to the target image in the downstream direction belong to the same group. In the example illustrated in the table in
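The two determinations described above can be sketched as follows (the kind names and the derivation graph are hypothetical simplifications of the relationships in the table): forward generation of a target kind is possible when all of its upstream kinds belong to the same group; backward generation, when all of its downstream kinds do.

```python
# Hypothetical derivation graph: each kind maps to the set of upstream
# kinds from which it is directly generated.
UPSTREAM = {
    "absorption_coefficient": {"initial_sound_pressure", "light_fluence"},
    "oxygen_saturation": {"absorption_coefficient"},
}

def forward_generation_possible(target, group_kinds):
    """The target can be regenerated from its upstream kinds if all of
    them belong to the same group."""
    upstream = UPSTREAM.get(target, set())
    return bool(upstream) and upstream <= set(group_kinds)

def backward_generation_possible(target, group_kinds):
    """The target can be regenerated from the downstream kinds derived
    from it if all of them belong to the same group."""
    downstream = {kind for kind, ups in UPSTREAM.items() if target in ups}
    return bool(downstream) and downstream <= set(group_kinds)

group = ["initial_sound_pressure", "light_fluence", "absorption_coefficient"]
print(forward_generation_possible("absorption_coefficient", group))  # True
print(backward_generation_possible("initial_sound_pressure", group))  # True
```

Under this sketch, an absorption coefficient image in the group above can be deleted because its upstream images are saved, while an initial sound pressure image can be deleted only if backward generation from the images derived from it is possible.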
At a step S508 illustrated in
In the above examples, the information illustrated in
Returning now to the description of
At the step S307, the determination unit 110 reads information about a deletion prohibition level from the saving unit 108. The deletion prohibition level is set in advance by the user and represents whether the deletion of the target image is permitted in the case where neither the forward generation nor the backward generation of the target image is possible.
The setting image 601 includes a level setting section 602, a cancel section 603, and a confirmation section 604. The level setting section 602 is set at either the “high” level or the “low” level as described above. The cancel section 603 is a button for canceling an edited content in the setting image 601. The confirmation section 604 is a button for confirming the edited content in the setting image 601. The confirmed content is saved in the saving unit 108.
At a step S308, the process branches depending on the setting of the deletion prohibition level that is obtained at the step S307. In the case where the level is set at the “low” level, the flow proceeds to a step S309. In the case where the level is set at the “high” level, the target image is not deleted, and the processes illustrated in
At the step S309, the input-output control unit 112 causes the display unit 106 to display a dialog. The dialog is a user interface by which the user selects whether the target image is deleted.
At a step S310, the input-output control unit 112 obtains information about the manipulation input of the user into the dialog 701. In the case where the user decides that the target image is deleted, the flow proceeds to the step S311. In the case where the user decides that the target image is not deleted, the target image, deletion of which is instructed by the user at the step S301, is not deleted, and the processes illustrated in
At the step S311, the image data of the target image, deletion of which is instructed at the step S301, is deleted from the saving unit 108. According to the first embodiment, the information-processing apparatus 100 deletes only the image data of the IOD of the target image. The determination unit 110 may add information for generating the target image into the metadata of the IOD. In the case where the user instructs the deletion at the step S310, the instruction may be added into the metadata of the IOD. In another example, the information-processing apparatus 100 may delete the IOD of the target image. In this case, a method for generating the target image may be saved in the saving unit 108.
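The deletion at the step S311 can be sketched as follows (the field names and the regeneration-information format are hypothetical): only the pixel data of the target IOD is removed, while the metadata is retained together with information recording how the image can be regenerated later.

```python
def delete_image_data(iod, regeneration_info):
    """Remove only the pixel data from the IOD, keep the metadata, and
    record how the image can be regenerated later."""
    iod.pop("pixel_data", None)
    iod["meta"]["regeneration_info"] = regeneration_info
    return iod

# Toy IOD for a deletable target image.
iod = {
    "meta": {"kind": "absorption_coefficient"},
    "pixel_data": [0, 1, 2, 3],
}
delete_image_data(iod, {"method": "forward",
                        "sources": ["initial_sound_pressure",
                                    "light_fluence"]})
print("pixel_data" in iod)  # False
```

Because the metadata survives, the saved capacity is decreased while the information needed to regenerate and reuse the image is preserved.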
The information-processing apparatus 100 identifies and groups the photoacoustic images that are generated on the basis of the same photoacoustic signal as described above. The information-processing apparatus 100 does not delete a group of the image data collectively but determines whether a piece of the image data can be regenerated from another piece of the image data to control the deletion of the image data. Since the plural kinds of photoacoustic images are generated by different calculations, the information-processing apparatus 100 controls the deletion on the basis of the calculation methods.
With the structure according to the first embodiment, image data that can be regenerated on the basis of another kind of image data is determined to be deletable. This enables the capacity for saving to be decreased. Since the control is based on the methods for generating the image data, the possibility that image data that is required for diagnosis is mistakenly deleted is decreased.
In the example described above, whether the target image is deleted is determined on the basis of the deletion prohibition level at the step S307 to the step S310 in
In an example described according to the first embodiment, the image data is deleted by the manipulation input of the user to instruct deletion. In an example described according to a second embodiment, the image data is deleted depending on a predetermined save period.
The structure of the information-processing apparatus 100 and the structure of the system 1000 are the same as those according to the first embodiment, and the above description is referred to; a detailed description is omitted here.
The kind of the photoacoustic image is displayed in a column 801. Buttons 803 for selecting whether the kind of the photoacoustic image is outputted to the external device are displayed in the rows of a data kind 802. The user can select the kind of the photoacoustic image that is outputted to the external device by the manipulation input into the corresponding button. The buttons 803 are displayed such that selected buttons can be distinguished from unselected ones.
The kind that is selected by the corresponding button 803 is displayed in a region 804. The photoacoustic image the kind of which is selected by the button 803 is previewed in a region 805.
A region 806 is used to select an output destination to which the IOD that is related to the photoacoustic image the kind of which is selected by the manipulation input into the corresponding button 803 is outputted. A button 807 is used to decide that the IOD is outputted to a PACS (the information-processing apparatus 100 according to the second embodiment). A button 808 is used to decide that the IOD is outputted to the viewer 104. A button 809 is used to freely select the output destination by the user and enables the output destination to be specified by the manipulation input into a region 810.
A region 811 is used to specify the format of the image data of the photoacoustic image the kind of which is selected by the manipulation input into the buttons 803. A button 812 is used to specify a non-compression format of the image data in accordance with the DICOM standard. A button 813 is used to specify a compression format (for example, JPEG2000) of the image data in accordance with the DICOM standard. A button 814 is used to freely select the format by the user and enables the format to be specified by the manipulation input into a region 815.
A region 816 is used to specify the save period of the image data of the photoacoustic image the kind of which is selected by the manipulation input into the buttons 803. A button 817 is used to set the save period at half a year. A button 818 is used to set the save period at 5 years, that is, for preservation as a medical record. A button 819 is used to freely select the save period by the user and enables the save period of the photoacoustic image the kind of which is selected to be specified by the manipulation input into a region 820. The control apparatus 101 writes information about the save period in the metadata of the IOD and outputs the information to the information-processing apparatus 100.
A button 821 is used to instruct the output of the IOD that is related to the photoacoustic image the kind of which is selected by the manipulation input into the corresponding button 803. The IOD is transmitted to the information-processing apparatus 100 in response to the manipulation input into the button 821.
The determination unit 110 of the information-processing apparatus 100 according to the second embodiment determines that the image data of the IOD is to be deleted when the image data of the IOD has not been read during a period that is longer than the save period that is written in the IOD.
At a step S901, the determination unit 110 reads the metadata of the IOD that is saved in the saving unit 108 and obtains the information about the save period. The determination unit 110 also obtains information about history in which the image data of the IOD that is saved in the saving unit 108 has been read. Examples of the history in which the image data has been read include a history of display of the image data on the display unit 106 that is connected to the information-processing apparatus 100 and a history of output of the image data to an external device, such as the viewer 104, that can display the image data. According to the second embodiment, the information-processing apparatus 100 saves the information about the history in the saving unit 108. The determination unit 110 reads and obtains the information about the history from the saving unit 108.
The determination unit 110 obtains the information about the save period and information about the IOD the image data of which has not been read during a period that is longer than the save period on the basis of the history in which the image data has been read. When there is no such IOD, the processes illustrated in
At the step S902, the determination unit 110 determines whether the target image can be regenerated. The process at the step S902 is the same as the processes at the step S301 to the step S306 illustrated in
At the step S903, the target image is deleted from the saving unit 108. Also, according to the second embodiment, the information-processing apparatus 100 may perform the process at the step S310 to cause the display unit 106 to display the dialog 701, and the user may select whether the image data is deleted. Also, according to the second embodiment, the information-processing apparatus 100 may perform the processes at the step S307 to the step S310 and may control the deletion on the basis of a predetermined deletion prohibition level.
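The flow from the step S901 to the step S903 can be sketched as follows. This is a minimal sketch under stated assumptions: the UIDs, the metadata fields `SavePeriodDays` and `SavedAt`, and the `can_regenerate` callback standing in for the determination at the step S902 are all hypothetical.

```python
from datetime import datetime, timedelta

def find_expired_iods(iods, read_history, now):
    """Step S901: return the UIDs of IODs whose image data has not been
    read during a period longer than the save period written in the IOD.
    `iods` maps a hypothetical instance UID to metadata carrying the
    hypothetical fields 'SavePeriodDays' and 'SavedAt'; `read_history`
    maps a UID to the datetime of the most recent read (display on the
    display unit 106 or output to a device such as the viewer 104)."""
    expired = []
    for uid, meta in iods.items():
        # An image that has never been read is measured from its save time.
        last_read = read_history.get(uid, meta["SavedAt"])
        if now - last_read > timedelta(days=meta["SavePeriodDays"]):
            expired.append(uid)
    return expired

def delete_if_regenerable(uid, iods, can_regenerate):
    """Steps S902 and S903: delete the image data only when the target
    image can be regenerated from another kind of photoacoustic image."""
    if can_regenerate(iods[uid]):
        del iods[uid]
        return True
    return False

iods = {
    "a": {"SavePeriodDays": 182, "SavedAt": datetime(2016, 1, 1)},
    "b": {"SavePeriodDays": 1825, "SavedAt": datetime(2016, 1, 1)},
}
history = {"a": datetime(2016, 3, 1)}
expired = find_expired_iods(iods, history, now=datetime(2017, 6, 1))
print(expired)  # → ['a']
delete_if_regenerable("a", iods, can_regenerate=lambda meta: True)
print(sorted(iods))  # → ['b']
```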
Modification
In the examples described according to the above embodiments, the image data of the IOD is deleted. The present invention, however, is not limited thereto. For example, the IOD itself may be deleted. This enables data capacity to be decreased.
In the examples described according to the above embodiments, the image data that can be generated by using another kind of the photoacoustic image is deleted on the basis of the method for generating the image. The present invention, however, is not limited thereto. For example, only the image data that is used for diagnosis may be saved in the saving unit 108, and the other kinds of the image data may be deleted. For example, the determination unit 110 may save only the image of the oxygen saturation and the image of the total amount of hemoglobin, which are set as the kinds that are used for diagnosis, and may delete the other kinds of the image data.
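The selection of deletion candidates described in this modification can be sketched as follows. The kind names and the UID-to-kind mapping are illustrative assumptions; the set of kinds used for diagnosis would in practice be configurable.

```python
# Hypothetical set of image kinds that are set as being used for diagnosis.
DIAGNOSIS_KINDS = {"oxygen_saturation", "total_hemoglobin"}

def select_images_to_delete(saved_images, diagnosis_kinds=DIAGNOSIS_KINDS):
    """Return the UIDs of saved photoacoustic images whose kind is not
    among the kinds used for diagnosis; only these are candidates for
    deletion, and the diagnosis kinds stay in the saving unit."""
    return [uid for uid, kind in saved_images.items()
            if kind not in diagnosis_kinds]

saved = {
    "1": "initial_sound_pressure",
    "2": "oxygen_saturation",
    "3": "absorption_coefficient",
    "4": "total_hemoglobin",
}
print(sorted(select_images_to_delete(saved)))  # → ['1', '3']
```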
In the examples described according to the embodiments, the images that are related to the initial sound pressure and the light intensity distribution are the most upstream images in
In the examples described according to the above embodiments, the information-processing apparatus 100 is the PACS. The present invention, however, is not limited thereto. The entire functional configuration of the information-processing apparatus 100 may be included in the control apparatus 101 that controls the imaging device 102. In this case, the control apparatus 101 may control the deletion of the image data that is saved in a PACS that is connected to the control apparatus 101. The functional configuration of the information-processing apparatus 100 may be shared by the PACS and the control apparatus 101 that controls the imaging device 102, and the above processes may be performed as a system.
The present invention can also be carried out in a manner in which the system or the apparatus is provided with a program for performing one or more functions according to the above embodiments via a network or a storage medium, and one or more processors of a computer of the system or the apparatus read and execute the program. The present invention can also be carried out by a circuit (for example, an ASIC) for performing one or more functions.
The information-processing apparatus according to each embodiment described above may be a single apparatus, or a plurality of apparatuses may be combined so as to be able to communicate with each other to perform the above processes. These are included in the embodiments of the present invention. The above processes may be performed by a common server apparatus or a server group. It is not necessary for a plurality of apparatuses that achieve the information-processing apparatus and the information-processing system to be installed in the same facility or the same country provided that the apparatuses can communicate at a predetermined communication rate.
The embodiments of the present invention include an embodiment in which the system or the apparatus is provided with a software program that performs the functions according to the above embodiments, and the computer of the system or the apparatus reads and executes codes of the provided program.
Accordingly, the program codes that are installed in the computer to perform the processes according to the embodiments by the computer are included in the embodiments of the present invention. The functions according to the above embodiments can be performed in a manner in which an OS that acts on the computer, for example, performs a part or all of actual processing on the basis of instructions that are included in the program that the computer reads.
An appropriate combination of the above embodiments is also included in the embodiments of the present invention.
The information-processing apparatus enables a part of the image data to be deleted to decrease the capacity for saving. Even thereafter, a user can display an image corresponding to the deleted image data because the image data can be regenerated.
The present invention is not limited to the above embodiments. Various modifications and alterations can be made without departing from the spirit and scope of the present invention. Accordingly, the following claims are attached to publish the scope of the present invention.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Number | Date | Country | Kind
---|---|---|---
2016-239376 | Dec 2016 | JP | national
This application is a Continuation of International Patent Application No. PCT/JP2017/042818, filed Nov. 29, 2017, which claims the benefit of Japanese Patent Application No. 2016-239376, filed Dec. 9, 2016, both of which are hereby incorporated by reference herein in their entirety.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2017/042818 | Nov 2017 | US
Child | 16402108 | | US