The invention relates in particular to a PACS (Picture Archiving and Communication System). Systems of this type are available from various manufacturers for the effective acquisition and storage of the data volumes generated, for example, in the health care sector. The data volumes are particularly large due to the acquired image data, e.g. greater than 1 terabyte per hospital and year. However, large image data volumes occur not only in the healthcare sector but also in other areas, such as cartographic services, social networks and the like.
The editing of the image data with image editing programs or with image editing functions and/or the editing of other data with other data editing functions can be separated from the storage of the data. However, this may result, for example, in an increase in the data volumes to be transmitted via data transmission networks and/or in a complex maintenance of the data editing functions.
The invention relates to a method for editing, for example, image data or measurement value data, comprising:
The invention furthermore relates to a data processing system or a data processing system assembly, in particular for carrying out the method as claimed in one of the preceding claims, comprising:
The object of developments is to indicate simple methods for editing image data, measurement value data and/or additional data which, in particular, reduce the volume of the data to be transmitted and/or the maintenance complexity of data editing functions and/or offer further advantages. Furthermore, an associated data processing system or an associated data processing system assembly is to be specified.
One method for editing data may comprise:
The dataset may, in particular, be a data object which has been implemented according to a predefined class definition.
The method is, in particular, carried out automatically. The DP systems in each case contain a processor which executes program commands that are stored in an electronic memory.
The technical effects of the method consist, in particular, in that
Further technical effects can be found in the detailed descriptions of the general method given below.
A method for editing image data may comprise:
The image data may be:
The image editing function may be related:
The data editing function may be an image editing function. The image editing function may, in particular, be:
The additional image data may be stored, for example, according to the widely used DICOM (Digital Imaging and Communications in Medicine) format or EXIF (Exchangeable Image File) format. The additional image data may describe the image content in words and/or may specify the circumstances of the image recording. Furthermore, for example, the name of the patient or an identifier for the patient may be included. The name or an identifier for a recorded organ/bone or tissue may also be included in the additional image data.
“Transmitted jointly” means that the data are transmitted in the same message or in a plurality of messages which, for example, have a common identifier or have a different relationship known to the recipient, e.g. are transmitted in immediate succession or with little time delay.
The aforementioned method may, in particular, be used for image editing functions that can be performed with comparatively little processing effort, and/or for image data for which it is established that they will be retrieved or read again in the foreseeable future. Format conversions, for example, can be carried out with comparatively little processing effort. Image editing functions which improve the storage can also be used.
The measurement value data may be medical measurement value data, in particular the measurement value series generated by medical devices, e.g. ECG, EEG, etc. The editing function for editing the measurement value data may be a filter function, a function for determining specific features or a different function.
The editing function may, however, also relate to the additional image data or the additional measurement value data.
The rule may contain a condition part (IF) and an action part (THEN) specifying what applies if the condition of the rule is fulfilled. Logical link operators can be used which, for example, link a plurality of conditions, e.g. AND, OR, XOR or NOT operators. The action part may specify one or more actions. In an alternative described below with specification of a function in a request message, the rule can also be evaluated inversely, i.e., starting from the action, the condition or conditions to be fulfilled by a data object can be determined, so that the function specified in the action part can be performed.
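Purely as an illustration, a rule of this type could be represented as follows; the Python sketch is an assumption made for explanation only (the class name Rule and the method names matches and provides are not part of any standard) and shows both the forward evaluation and the inverse evaluation mentioned above.

    # Minimal sketch of a rule with a condition part (IF) and an action part (THEN).
    # All names are illustrative; conditions operate on a dict of additional
    # image data (e.g. DICOM data elements).

    from dataclasses import dataclass
    from typing import Callable, Dict, List


    @dataclass
    class Rule:
        condition: Callable[[Dict[str, str]], bool]  # IF part
        actions: List[str]                           # THEN part: identifiers of editing functions

        def matches(self, additional_data: Dict[str, str]) -> bool:
            """Forward evaluation: does the IF part apply to this dataset?"""
            return self.condition(additional_data)

        def provides(self, function_id: str) -> bool:
            """Inverse evaluation: does this rule make the requested function available?"""
            return function_id in self.actions


    # Example: IF modality is CT AND the body part is "KIDNEY" THEN apply function "1".
    rule_r1 = Rule(
        condition=lambda d: d.get("Modality") == "CT" and d.get("BodyPartExamined") == "KIDNEY",
        actions=["1"],
    )

    if __name__ == "__main__":
        sample = {"Modality": "CT", "BodyPartExamined": "KIDNEY"}
        print(rule_r1.matches(sample))   # True -> perform the editing function(s) in the THEN part
        print(rule_r1.provides("1"))     # True -> function "1" can be selected via this rule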
The storing of the rule enables a separation of functions for transmitting image data and functions for editing image data. No image editing function thus needs to be specified in the transmission of the image data or after the transmission of the image data.
The technical effects of this method may consist in that:
An API (Application Programming Interface) may be present in the data editing unit which forms part of a picture archiving system. This API can be controlled automatically on the basis of the additional image data or the additional measurement value data and predefined rules. An upstream interface can thus be used as the API, which is integrated into the automatic control and, for example, transmits messages to a control unit, as explained in further detail below with reference to
In particular, it is also simply possible by means of the method to perform the image editing or a different data editing on the side of the second data processing system, in particular in close physical proximity, i.e. at a distance of less than 100 meters or less than 10 meters. The first data processing system can thus be of simple design. Moreover, the technical effects specified below can be achieved, for example in terms of a simple maintenance of the first or second data processing system (DP system) and/or the image editing functions integrated into a PACS (Picture Archiving and Communication System) or a different data editing function integrated into a PACS (measurement value data, additional data), in terms of the avoidance of unnecessary transport of image data, etc.
A further method for editing image data and/or measurement value data may comprise:
The additional image data may be stored in an image data store or in a different DP system (data processing system), i.e. separately from the image data. The additional measurement value data can also be stored in the or in an image data store or in a different DP system, i.e. separately from the measurement value data.
The method can be used in particular if incoming image data or measurement value data are not initially edited, but are only stored together with the associated additional image data or additional measurement value data. On the other hand, some of the editing functions can be performed during the storage and others only during the reading.
The statements made above apply to the image data, the image editing function, the additional image data, the measurement value data, the additional measurement value data, the data editing function, the joint transmission and the rules.
The method using the request message can be employed, in particular, for image editing functions or measurement value editing functions which have to be performed with comparatively substantial computing effort, and/or for image data or measurement value data for which it is established that they will in any case be required again, e.g. immediately, within a specific period or e.g. on the next day.
The storage of the rule enables a separation of functions for requesting data and functions for editing data. No data editing function therefore needs to be specified when the data are requested or after the data have been requested.
The technical effects of this method may consist in that:
An API (Application Programming Interface) may be present in the image editing unit or in a different data editing unit which forms part of a picture archiving system. This API can be controlled automatically on the basis of the additional image data or other additional data and predefined rules. An upstream interface can thus be used as the API, which is integrated into the automatic control and, for example, transmits messages to a control unit, as explained in further detail below with reference to
In particular, it is also simply possible by means of the method to perform the image editing or a different data editing on the side of the second data processing system, in particular in close physical proximity, i.e., for example, at a distance of less than 100 meters or less than 10 meters. The first data processing system can thus be of simple design. Moreover, the technical effects specified below can be achieved, for example in terms of a simple maintenance of the first or second DP system and/or the image editing functions integrated into a PACS (Picture Archiving and Communication System) or other data editing functions, in terms of the avoidance of unnecessary transport of image data or other data, etc.
The image editing function and, where appropriate, other data editing functions can also be performed in a picture archiving unit into which at least one image editing unit is preferably integrated.
A component of the image editing unit or the measurement value editing unit may therefore be present on the first DP system, in particular a user interface (UI). The API of the image editing unit is used if the IEU (image editing unit) accesses the PAU (picture archiving unit) via the latter's API. However, a different component of the image editing unit is located, for example, on the second DP system or on another DP system which is different from the first DP system. Alternatively, all components of the data editing unit may also be located on one DP system or on a plurality of DP systems that are different from the first DP system.
The picture archiving units are known, for example, by the name of PACS, see e.g. the syngo.plaza system from SIEMENS AG. The picture archiving unit may support the DICOM standard.
The storage may comprise the storage of at least one rule which specifies a data editing function depending on an identifier which has been defined for a dataset or for a data object. The message may contain the identifier which specifies the data editing function. When the data editing function is performed, the image data of at least one image and/or the additional image data and/or the measurement value data and/or the additional measurement value data can be determined, in particular data to which the data editing function specified in the message is applicable. When the data editing function is performed, the data editing function can then be applied to the determined data.
Additional data specified in the message can be used when the data are determined.
An additional query facility or selection facility is therefore created. In particular, no new definitions are required for the message in order to specify a function directly in the message, since the function is defined indirectly via a rule.
The method with specification of the editing function in the message can also be used in particular if incoming image data or measurement value data are not initially edited, but are only stored together with the associated additional image data or additional measurement value data. On the other hand, some of the editing functions can be performed during the storage and others only during the reading.
The statements made above apply to the image data, the image editing function, the additional image data, the measurement value data, the additional measurement value data, the data editing function, the joint transmission and the rules.
The method using the request message with specification of the editing function can also be used in particular for image editing functions or measurement value editing functions which have to be performed with comparatively substantial computing effort, and/or for image data or measurement value data for which it is established that they will in any case be required again, e.g. immediately, within a specific period or e.g. on the next day.
The picture archiving unit may contain an image data store with a storage capacity greater than 1 terabyte or even greater than 100 terabytes, in particular a short-term storage unit for storing the image data for less than e.g. 6 months. The specified data volumes may also contain the measurement value data.
A user retrieving the image data or the measurement value data could thus be interested only in the edited image data or the edited measurement value data. In this case, the unedited data do not have to be transmitted via a data transmission network.
The data editing function may be defined in a dataset which is stored in the picture archiving unit, in particular in a dataset which meets the DICOM standard or the HL7 standard. The data editing function can thus be recorded in a simple manner in the PACS.
In one design, the picture archiving unit may contain only the second data processing system or a plurality of data processing systems on which at least one of the aforementioned method steps is in each case carried out. Groups or clusters of data processing systems containing more than 10, more than 100 or even more than 1000 data processing systems can thus be used, wherein, however, the first data processing system does not belong to the cluster.
The arrangement of the image editing functions in the cluster may, particularly in the case of very large clusters, result in a considerable reduction in the data traffic outside the cluster. In one design, at least two or all data processing systems of the picture archiving unit may be interconnected via a data transmission connection with a data transmission rate which is at least three times or at least ten times higher than the data transmission rate between the first data processing system and the second data processing system.
Clusters of computers with particularly fast data transmission connections or bus systems can thus be interconnected, e.g. more than 10, more than 100, or more than 1000 data processing systems. Fast access and therefore also a fast editing of the image data in the PACS or a different archiving system are thus possible.
In one design, the picture archiving unit may contain a first storage unit and a second storage unit, wherein image data are stored in the second storage unit for a period of time which is longer than a period of time for the storage of the image data in the first storage unit, e.g. more than four times or more than ten times longer.
The access time of the second storage unit may be less than the access time of the first storage unit, e.g. by more than 10 percent or more than 50 percent in relation to the access time of the first storage unit.
The image data, the measurement value data or the additional data may be stored, for example, for a maximum of six months in the first storage unit. The image data, the measurement value data or the additional data may be stored, for example, for longer than two years in the second storage unit.
RAID (Redundant Array of Independent Disks) systems can be used in both storage units, in which the data are stored or mirrored multiple times. However, the outlay for the mirroring or data backup may differ in size in the two storage units.
In particular, the outlay for the data backup in the first storage unit may be higher.
Magnetic storage media, electronic storage media, such as e.g. EEPROMs (Electrically Erasable Programmable Read Only Memory) or Flash EEPROMs, solid state disks (SSD) or other storage types can be used. The storage type of the first storage unit may differ from the storage type of the second storage unit.
The picture archiving unit may also communicate with imaging devices as provided, for example, in the DICOM standard. The imaging devices may be the aforementioned devices, e.g. a computed tomography (CT) scanner, an MRT scanner, etc. In this way, relevant additional image data can be retrieved automatically, directly from the devices.
The image data may contain or may be medical data and/or the measurement value data may contain or may be medical measurement value data. The proposed solutions can be employed particularly effectively, specifically in the field of medicine, since very large data volumes have to be edited.
The additional image data or the additional measurement value data may be structured according to the DICOM standard or a standard based thereon. The DICOM standard is based on an object-oriented information model and enables data exchange via point-to-point connections and/or via networks and/or via the exchange of transportable media. The DICOM standard goes back to ACR (American College of Radiology)/NEMA (National Electrical Manufacturers Association) 300-1985, i.e. version 1.0.
The Information Object Definitions (IOD) of DICOM 3.0 are specified in the following two tables:
A computed tomography according to DICOM is, for example, specified in the following table (Computed Tomography Image IOD Module Table):
A Patient Module contains, for example, the following data according to DICOM:
Similar specifications apply to measurement value data, e.g. ECG (electrocardiogram), EEG (electroencephalogram), US (ultrasound), blood flow velocity values, etc.
For the message exchange, DICOM defines a message transmission service, the DICOM Message Service Element (DIMSE), which is based on TCP/IP (Transmission Control Protocol/Internet Protocol), ISO OSI (International Organization for Standardization Open Systems Interconnection) or point-to-point interfaces. The combination of an information object and a data service of this type is referred to as a Service Object Pair (SOP). The SOP class represents the basic functional unit which is defined in DICOM. Through the definition of an SOP class, it is possible to define a specific subset of the DICOM functionality.
However, the additional image data or the additional measurement value data may also be structured, for example, according to the EXIF (Exchangeable Image File) standard or a standard based thereon. Alternatively, the additional image data may be structured according to the HL7 (Health Level 7) standard or a standard based thereon, which is similarly widely used in some fields of medicine.
Widely used standards which are also suitable, in particular, for medical image data or measurement value data are therefore employed.
The additional image data or the additional measurement value data may contain at least one, at least two or at least three of the following data: —a datum to indicate the identity of a patient,
These data are particularly suitable for use as decision criteria to determine which image editing function or other data editing function is to be carried out. However, further data are similarly suitable, e.g. a time of the recording of the image data or the measurement value data, etc.
In the method for storing image data, the or at least a part of the additional image data transmitted jointly with the image data can be transmitted separately from the image data in a message to a control unit which has access to the stored rules. In particular, a copy of the additional image data may be contained in the message. Alternatively, the or a part of the additional measurement value data transmitted jointly with the measurement value data can be transmitted separately from the measurement value data in a message to a control unit which has access to the stored rules. The image data or the measurement value data themselves do not therefore have to be transmitted unnecessarily in the picture archiving system.
In the method for requesting or reading image data, the determined additional image data or additional measurement value data or the additional image data or additional measurement value data contained in the request message can be transmitted in a message to a or to the control unit which has access to the stored rules. In particular, a copy of the additional image data or additional measurement value data can be contained in the message. The image data or measurement value data themselves do not therefore have to be transmitted unnecessarily in the picture archiving system.
The control unit can evaluate the additional image data contained in the message using the stored rule or rules. The control unit can generate a further message for an image editing unit or a different data editing unit (measurement value data, additional image data, additional measurement value data), wherein the further message contains, in particular, an identifier to specify the data editing function, in particular an image editing function, and/or the additional data, e.g. additional image data. This separation of the evaluation of the rules and the image editing function enables a simple programming and/or maintenance of the picture archiving system.
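A minimal sketch of this control unit step, under the assumption of rule objects as outlined above, could look as follows; the function names evaluate_and_dispatch and send_to_editing_unit are merely illustrative and not part of the method as such.

    # Sketch of the control unit step: the additional image data received in a
    # message are evaluated against the stored rules; for each applicable rule
    # a further message containing the identifier of the editing function and a
    # copy of the additional data is generated for the data editing unit.
    # The image data themselves are not transmitted. Names are illustrative.

    from typing import Callable, Dict, List


    def evaluate_and_dispatch(additional_data: Dict[str, str],
                              rules: List,
                              send_to_editing_unit: Callable[[Dict], None]) -> int:
        """Return the number of trigger messages generated."""
        count = 0
        for rule in rules:
            if rule.matches(additional_data):            # forward evaluation of the IF part
                for function_id in rule.actions:         # THEN part: one or more functions
                    send_to_editing_unit({"id": function_id,
                                          "fields": dict(additional_data)})
                    count += 1
        return count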
The control unit can generate the further message with an intentionally introduced time delay in relation to the first message, e.g. with a delay greater than 5 minutes or greater than 30 minutes.
However, the delay may be less than 24 hours or less than one week. The calculations can thus be carried out particularly during the night or on Saturdays or Sundays, when more processing capacity may be available than during the day or on the working days of a week. For example, batch processes can be used with which a multiplicity of identical data editing functions, in particular image editing functions or measurement value editing functions, are carried out for a multiplicity of data. The performance in the editing of the data can be increased as a result. A data editing function therefore needs to be initialized, for example, only once or only a few times, and can then be used to edit the data of a plurality of images or measurement value series, in particular more than 10, more than 100 or more than 1000 images or measurement value series.
An asynchronous editing is therefore carried out. The receiving or storing of the received data is therefore temporally decoupled from the data editing itself. Similarly, the receiving of the request message can be temporally decoupled from the data editing itself.
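By way of illustration only, such a temporally decoupled, batched performance of the editing functions could be sketched as follows; the class name BatchScheduler, the time window and the minimum batch size are assumptions and not prescribed by the method.

    # Sketch of a temporally decoupled, batched performance of editing functions:
    # trigger requests are only queued on receipt, grouped by function identifier
    # and processed within a configured time window (e.g. at night), so that an
    # editing function is initialized once and then applied to many datasets.
    # All names and default values are illustrative.

    from collections import defaultdict
    from datetime import datetime, time


    class BatchScheduler:
        def __init__(self, start=time(22, 0), end=time(5, 0)):
            self.window = (start, end)
            self.pending = defaultdict(list)      # function id -> list of field-data dicts

        def submit(self, function_id, additional_data):
            """Receiving is decoupled from the editing itself: only queue the request."""
            self.pending[function_id].append(additional_data)

        def in_window(self, now=None):
            now = (now or datetime.now()).time()
            start, end = self.window
            return now >= start or now <= end     # window spans midnight

        def run(self, editing_functions, min_batch=10):
            """Perform each queued editing function for a multiplicity of datasets."""
            if not self.in_window():
                return
            for function_id, batch in list(self.pending.items()):
                if len(batch) < min_batch:
                    continue
                func = editing_functions[function_id]     # initialize once ...
                for fields in batch:                      # ... then edit many images/series
                    func(fields)
                del self.pending[function_id]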
The image editing or the editing of other data (measurement value data, additional data) can also be based on incremental methods to which only newly added image data are subjected.
At least one message containing an identifier for a data editing function, in particular an image editing function or a measurement value data editing function, and command code for a data editing function, in particular an image editing function or a measurement value data editing function, can be transmitted from the first data processing system to the second data processing system. The identifier and the command code can be stored in a or the image editing unit.
Through this measure, data editing functions can be inserted following the completion of the initial installation of the picture archiving program or system. Alternatively or additionally, such functions can also be permanently predefined in the picture archiving program or system.
The data editing functions can thus be adapted in a simple manner to the needs of a user or a plurality of users of the PACS or a different picture archiving system. Updates of individual data editing functions can be carried out at a central location, similarly in a simple manner.
The command code is, for example, Java code, JavaScript code or C code, in particular C++ code or Visual C code. The command code may be object-oriented and/or sequential code which, for example, can be executed by a processor following a compilation and/or a link process, in particular a dynamic link process.
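A possible sketch of the registration of transmitted command code with subsequent dynamic linking at runtime is given below; the function register_operation_module, the dictionary OPERATIONS and the convention that each module exposes a callable named "apply" are illustrative assumptions and not part of the method itself.

    # Sketch of registering transmitted command code: the code is written to a
    # module file and linked dynamically at runtime via importlib.

    import importlib.util
    from pathlib import Path

    OPERATIONS = {}      # identifier of the editing function -> callable


    def register_operation_module(identifier: str, source_code: str,
                                  module_dir: str = "modules"):
        directory = Path(module_dir)
        directory.mkdir(exist_ok=True)
        module_file = directory / f"op_{identifier}.py"
        module_file.write_text(source_code)

        spec = importlib.util.spec_from_file_location(f"op_{identifier}", module_file)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)               # dynamic linking at runtime

        OPERATIONS[identifier] = module.apply         # assumed module convention
        return OPERATIONS[identifier]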
The identifier for the image editing function can also be used in at least one stored rule. The identifier for the image editing function can be used in one of the aforementioned messages.
At least one message in which an identifier for at least one data editing function, in particular an image editing function or measurement value editing function, and at least one rule are specified, said rule defining the condition under which the data editing function is to be performed depending on additional image data or additional measurement value data, or serving to establish whether a function affected by the rule is applicable to a dataset, can be transmitted from the first data processing system to the second data processing system. The rule can be stored in such a way that a control unit has access to it, wherein the rule is stored, in particular, in the control unit.
Through this measure, the rules or rule can be inserted following the completion of the initial installation of the picture archiving program or system. Alternatively or additionally, such rules can also be permanently predefined in the picture archiving program or system.
The rules can thus be adapted in a simple manner to the needs of a user or a plurality of users of the PACS or a different picture archiving system. Updates of individual rules can be carried out at a central location, similarly in a simple manner.
A data processing system or a data processing system assembly, in particular for carrying out one of the methods explained above, may contain:
The aforementioned first DP system and/or the similarly aforementioned second DP system, in particular, may belong to the DP system assembly. However, the DP system assembly may contain only the second DP system and further DP systems, wherein, however, the first DP system is not included in the data processing system assembly.
The technical effects indicated above for the method and its developments apply, in particular, to the DP system or to the DP system assembly.
“Integrated” means, in particular, a linking of programs and/or program parts with a larger program package, in particular also dynamically, i.e. at program runtime. In other words, a standardized environment is specified, for example, for the processing of medical image data in the memory.
The processing of medical image data by computer systems is becoming increasingly important for various reasons. According to one study, the introduction of IT (Information Technology) in the healthcare sector is a strong trend in some countries, see e.g. “Electronic Health Records: A Global Perspective”, Healthcare Information and Management Systems Society (HIMSS), pp. 6 to 7.
The processing of, for example, medical image data requires a very large amount of memory resources due to the required precision and wealth of detail of the data. If an existing image is to be edited or processed, it must first be requested from the storage system and transmitted to the processing station or to the processing server (service provision computer) in a client-server-based image processing system, wherein the client can also be referred to as a service usage computer. Following the editing steps, the updated image is transmitted back into the storage system once more.
The transmission back and forth is disadvantageous due to:
1.2 Load on the medical workstation or on the mobile terminal device and consequent poor UI (User Interface) experience or poor operability:
1.3 No central optimization of image processing or image editing algorithms:
1.4 High transmission costs and poor transmission quality in networks:
The prior art consists in that the editing or processing is undertaken on the processing station. This results in the disadvantages described above.
Database-oriented storage systems allow the feed-in of procedural code for the execution of stored procedures on the basis of ordered events. However, these stored procedures are not adequate tools for processing image data, in particular medical image data. They are provided instead for the editing of tabular data. The method is essentially known for textual data in the form of stored procedures:
In this way the contents of database tables can be manipulated directly in the storage system.
Stored procedures are unsuitable for complex processes and algorithms due to their deficient structuring facilities and deficient constructs; dedicated data structures, instructions and arithmetic operations are lacking.
Distributed Map Reduce (Hadoop MR) solutions are optimized to perform parallel calculations on large data volumes in computer clusters directly in the nodes (computer nodes) in which the data are stored, but offer no fundamental functionality for editing image data, in particular medical image data, nor the corresponding interfaces. Map Reduce solutions are typically appropriate only if the operations are applicable to a large part of the stored data and are readily parallelizable.
Specifically in the case of medical image data or medical measurement value data, for example, the server-side processing is more useful due to the aforementioned problems with the data volume and the transmission bandwidths. A medical image processing step could thus be made available centrally for medical workstations that are placed at different locations. This would be possible with a conventional application server which can be installed in addition to the PACS system. However, the approach would have the disadvantage that a separate application server is poorly integrated and requires separate maintenance.
A runtime environment for the execution of processing steps for image data, in particular for medical image data, can therefore be provided in the image data store, i.e. in the specific case of medical image data or other image data, in a PACS. Measurement value data and additional data can similarly be integrated.
This form of application architecture resolves the previous paradigm of a 3-layer architecture which is described as follows:
This paradigm is resolved by a novel processing architecture that is made available by a novel infrastructure:
Standardized image processing steps are placed in the persistence layer or storage layer in the form of a module and are carried out following the feeding of the module into the memory.
The system may consist of a plurality of components:
Concerning B2, i.e. the runtime environment which provides medical evaluations or image processing steps for execution: The runtime environment standardizes or represents an API (Application Programming Interface) for the image editing modules, i.e. an interface which implements the file access to the image data contained in the PACS. The API enables operations that are normally carried out locally on the image processing workstations. The aim is to place image editing programs which run on the user workstation in the “PACS EE”. As a result, transmission times can be saved, i.e. the image operation is closer to the storage or memory, and the resource consumption can be reduced, i.e. fewer resources are consumed if fewer transmissions take place.
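A conceivable form of such an API is sketched below; the interface names ArchiveAccess and ImageEditingModule and their methods are assumptions made purely for illustration and do not correspond to an existing product interface.

    # Sketch of an API that the runtime environment ("PACS EE") could expose to
    # image editing modules: file access to the image data held in the archive is
    # implemented behind the interface, so a module never transports pixel data
    # over the network. All names are assumptions.

    from abc import ABC, abstractmethod


    class ArchiveAccess(ABC):
        """Server-side access to image data and additional image data."""

        @abstractmethod
        def read_image(self, object_id: str) -> bytes: ...

        @abstractmethod
        def write_image(self, object_id: str, data: bytes) -> None: ...

        @abstractmethod
        def read_fields(self, object_id: str) -> dict: ...


    class ImageEditingModule(ABC):
        """Interface each module placed in the persistence layer has to implement."""

        identifier: str

        @abstractmethod
        def apply(self, object_id: str, archive: ArchiveAccess, fields: dict) -> None:
            """Edit the designated image in place, close to the storage."""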
Concerning B3, i.e. concerning the interface which allows the storage of definitions or rules: The definitions define the condition under which an image editing module is executed. Conditions may relate, for example, to a certain age of the file or to a recording angle or to a specific imaging device with which the image was created. The conditions may relate to the fields that are defined, for example, by the DICOM format.
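Purely as an example, a stored definition of this type could be recorded as the following data structure, which ties a condition on DICOM attributes (here the modality and the age of the study) to the identifier of an image editing module; the structure, the operator names and the evaluator are illustrative assumptions.

    # Sketch of a stored definition: conditions on DICOM fields linked to the
    # identifier of the module to be executed when the conditions hold.

    from datetime import datetime, timedelta

    definition = {
        "module": "extract_jpeg",                  # module to execute when the condition holds
        "all_of": [
            ("Modality", "equals", "CT"),
            ("StudyDate", "older_than_days", 30),  # "a certain age of the file"
        ],
    }


    def holds(definition, fields):
        """Check the stored definition against the DICOM fields of one object."""
        for attr, op, value in definition["all_of"]:
            if op == "equals" and fields.get(attr) != value:
                return False
            if op == "older_than_days":
                study_date = fields.get(attr)
                if study_date is None:
                    return False
                study = datetime.strptime(study_date, "%Y%m%d")   # DICOM DA format
                if datetime.now() - study < timedelta(days=value):
                    return False
        return True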
The dynamics may be represented in the sequence diagram shown in
The transmission of data in relation to the execution code for processing medical image data is reversed in the sense that:
The image data are thus no longer manipulated on the workstations, but rather on the server side in the memory. Due to the storing of the image processing modules in the PACS system, the entire system can be scaled centrally and relates less to the capacity of an image processing workstation or a mobile device.
In this way, further transmissions of the medical image data from the PACS server can be avoided. On the one hand, this relieves the I/O (input/output) capacity of the PACS server (service provision computer) and, on the other hand, relieves the medical workstations by addressing or solving the aforementioned problems 1.1, 1.2 and 1.3. A further advantage can be achieved if, for example in a DICOM image series, a quantity or a set of image data is edited or transformed in a manner dependent on one another; an increased processing effort would otherwise be required due to the conditional concatenation of the image data.
In one example embodiment, medical image data, for example, can be loaded into a storage system in the DICOM format. One processing step here may entail the extraction and conversion of the 2D (two-dimensional) images contained in a DICOM file. For example, all images contained in a DICOM file can be extracted as JPEG (Joint Photographic Experts Group) image data. For example, EXIF data can be placed in the header of the JPEG format. Examples of EXIF data are the date, time of the recording, exposure parameters, preview images, copyright notices, etc.
It is assumed that the DICOM files have already been loaded into the storage system. The further steps then normally entail the reloading of the files from the storage system, the performance of the extraction and the transmission back into the storage system.
It is now proposed to design the extraction of DICOM data as a module and to load this module into the storage system. The module, i.e. the extraction of the image files, is executed there on the server side without transmission procedures being required.
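A minimal sketch of such an extraction module is given below; it assumes that the pydicom, NumPy and Pillow libraries are available in the runtime environment and omits windowing, multi-frame series and error handling.

    # Minimal sketch of the extraction module: the pixel data of a DICOM file are
    # read on the server side and written out as an 8-bit JPEG image.

    import numpy as np
    import pydicom
    from PIL import Image


    def extract_jpeg(dicom_path: str, jpeg_path: str) -> None:
        ds = pydicom.dcmread(dicom_path)          # read the DICOM file from the store
        frame = ds.pixel_array.astype("float32")  # pixel data as a NumPy array
        frame -= frame.min()                      # scale the (typically 12/16 bit) values ...
        if frame.max() > 0:
            frame *= 255.0 / frame.max()          # ... to the 8-bit range required for JPEG
        Image.fromarray(frame.astype(np.uint8)).convert("L").save(jpeg_path, "JPEG")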
Methods for CBIR (Content Based Image Retrieval), inter alia, can be considered as a further application. CBIR enables the search for specific image content through recognition of the image contents. In the configuration described above, CBIR methods can be applied to newly arriving image data so that the data required for the search for image contents are generated. Algorithms, for example, are used for the extraction of features in CBIR. CBIR is improved here by two circumstances:
a) For the cluster formation of recognized features and the specification of the recognition rate, the feature recognition benefits from a global perspective of the images. In this respect, the use of CBIR on medical workstations would be disadvantageous.
b) In order to load new features for the recognition, a server-side approach is similarly advantageous, since the software would otherwise have to be updated on, for example, medical image processing workstations.
In CBIR, the textual data are stored as metadata of the image. In DICOM, this means the use of user-definable tags as DICOM data elements.
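By way of illustration, the following sketch computes a very simple feature (a grey-value histogram) and records it as a user-definable data element in a private block of the DICOM dataset using pydicom; the group 0x0041, the private creator string and the feature itself are placeholders for a real CBIR algorithm.

    # Sketch of recording a CBIR result as user-definable metadata of the image.

    import numpy as np
    import pydicom


    def annotate_with_cbir(dicom_path: str) -> None:
        ds = pydicom.dcmread(dicom_path)
        histogram, _ = np.histogram(ds.pixel_array, bins=16)
        feature_text = ",".join(str(int(v)) for v in histogram)

        block = ds.private_block(0x0041, "CBIR FEATURES", create=True)
        block.add_new(0x01, "LT", feature_text)   # human-readable textual metadata
        ds.save_as(dicom_path)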
The characteristics, features and advantages of this invention described above, and the manner in which these are achieved, will become clearer and more readily understandable in connection with the following description of the example embodiments. Insofar as the term “can” is used in this application, this means both the technical possibility and the actual technical implementation. Insofar as the term “approximately” is used in this application, this means that the exact value is also disclosed.
The figures are not drawn to scale and, in particular, the aspect ratios can be selected differently.
Example embodiments of the invention are explained below with reference to the attached drawings, in which:
The method steps are carried out using a picture archiving system, e.g. using a PACS 8, which is shown in
To the left of the dividing line 21, a data processing system 12 is shown, e.g. a workstation, a personal computer or a terminal, such as e.g. a tablet PC or smartphone.
Application software 10, for example, is installed on the data processing system 12, for example the interface of an image editing program.
The PACS 8 may contain a data processing system (DP system) or a plurality of DP systems on which a plurality of units are disposed, see e.g.:
Before the PACS 8 is used, it is configured, for example, using the DP system 12. To do this, an optional message 22 is transmitted from the DP system 12 to the interface 14, for example via a wired, a fiber-connected or a wireless network (radio). The message 22 is, for example, a message with the name submitOperationModule and contains, for example:
On the basis of the message 22, the interface 14 generates an optional message 24 which is transmitted from the interface 14 to the image editing unit 18. The message 24 has, for example, the name registerOperationModule and contains:
The interface 14 and the image editing unit 18 may be located on the same DP system or on different DP systems.
The “module” data are stored in the image editing unit 18 or are integrated into the command code of the image editing unit 18, which takes place immediately or on demand, in particular using a compiler and/or using dynamic linking, i.e. linking at runtime.
Alternatively, the image editing function defined by the “module” data can also be permanently integrated into the PACS 8, i.e. can already be integrated during the installation of the latter into the image editing unit 18. Further image editing functions can be installed by further submitOperationModule messages which originate, for example, from the DP system 12 or from other DP systems.
Before the PACS 8 is used, it is further configured, for example using the DP system 12. An optional message 26 has the name submitConditions and contains:
The additional image data are explained in further detail below with reference to
On the basis of the message 26, the interface 14 generates an optional message 28 which is transmitted from the interface 14 to the control unit 20. The message 28 bears, for example, the name registerConditions or contains an identifier specifying this name. The message 28 furthermore contains:
The interface 14 and the control unit 20 may be located on the same DP system or on different DP systems.
The control unit 20 records the transmitted rule in an internal storage unit or in an external storage unit to which the control unit 20 has access. When the image data are accessed, the rules recorded in the control unit 20 are checked and, where relevant, result in corresponding image editing steps, which is explained in further detail below with reference to
Alternatively, the condition or the rule defined by the “conditions” data can also be permanently integrated into the PACS 8, or can be integrated during the installation of the latter into the control unit 20. Further conditions and rules can be installed through further submitConditions messages which originate, for example, from the DP system 12 or from other DP systems.
Confirmation messages, e.g. for the message 22 or 26, can be transmitted according to the DICOM standard.
It is assumed that an operating person initiates the transmission of image data from the data processing system 12 to the PACS 8 through a user input 6. A message 30 is then transmitted from the DP system 12 to the interface 14, for example via a wired, a fiber-connected or a wireless network (radio).
The message 30 bears the name sendObject or a corresponding identifier. Furthermore, the message 30 contains image data and additional image data, for example DICOM data generated according to the DICOM standard which contain both pixel data and additional image data, which is shown in
The DICOM data fields or the field data are extracted in the interface 14 or in the interface unit 14 following the reception of the message 30, wherein the image data are not included, see time 32. The field data are the additional image data or DICOM data elements.
The interface unit 14 then stores the actual image data and the additional image data in the image data store 16, for which purpose, for example, a message 34 is used. The message 34 is designated, for example, as storeImage and contains the image data and the additional image data, referred to here as “image” for short.
Furthermore, the interface unit 14 generates a message 36 before or after the storage of the image data. The message 36 is also designated as submit and contains the DICOM field data, wherein the actual pixel data are not included. The message 36 is associated with an asynchronous access to the data stored with the message 34, which takes place at a later time. The message 36 is transmitted from the interface unit 14 to the control unit 20 and is further edited there, for example depending on a predefined scheduling function, which ensures an effective performance of image editing functions.
The control unit 20 receives the message 36 and evaluates the DICOM field data contained therein at a later time 38 according to the rules R1, R2, etc., stored in the control unit. If a rule applies, a scheduling function can be performed which specifies when the image editing function to be performed is started, for example at a specific time or in a specific time period. It can also be ensured, for example, that a defined minimum number of images that are to be edited with this image editing function have been received. However, the operation can also be carried out without a scheduling function, e.g. according to the FIFO (First In First Out) principle.
It is assumed that the control unit 20 generates a message 40 at a time occurring after the time 38 in order to start an image processing function on the basis of the message 36. The message 40 is named, for example, “trigger” and contains:
The message 40 is transmitted from the control unit 20 to the image editing unit 18 in order to trigger the associated image editing.
The image editing unit 18 receives the message 40 and determines the image editing function that is to be performed, designated by the identifier id. Furthermore, the image editing unit 18 submits a request to read the image data designated by the DICOM field data from the image data store 16, see message acquireImage(data), wherein “data” specifies the image data to be read.
The image data store 16 is located, for example, on the same data processing system as the image editing unit 18. Alternatively, however, the image data store 16 is located on a different DP system. Both DP systems can be connected by a particularly fast data transmission network or bus system, e.g. a backplane.
At a time 44, the image data are transmitted from the image data store 16 to the image editing unit 18 and are edited there according to the image editing function designated by the identifier id in the message 40, see the cross or time 46.
The edited image data are then stored in the image data store 16 in addition to or instead of the image data read in step 44, for example using a storeImage(image) message 48 from the image editing unit 18. The associated additional image data, i.e. the DICOM field data, are similarly stored for the edited data. The data transmitted with the message 40 or the data read in step 44 can be used for this purpose.
Further sendObject messages from the DP system 12 or from other DP systems are edited by the PACS 8 in the same way. Read requests can also be edited by the PACS 8, which is explained in further detail below with reference to
The stored original image data and/or the stored edited image data can be retrieved on demand from the DP system 12 or from the application program 10 or from other DP systems or application programs, wherein, for example, no further editing is carried out. Alternatively, however, a further editing can be carried out when the image data are retrieved, which is explained in further detail below with reference to
In one example embodiment, the image editing carried out at the time 46 consists, for example, in the extraction of JPEG (Joint Photographic Experts Group) image data. The aim of the extraction is, for example, to check whether a thumbnail view (preview) can be found as an EXIF entry in the JPEG header.
The image editing can be started asynchronously by a corresponding identification of the metadata (additional image data) in the picture archive or image data store. For example: if no thumbnail view (preview) is available, it is generated from the original image as an image processing step. This editing step is then stored in a rule which is executed, where relevant, immediately following the extraction or later.
Alternatively, the image editing can be started asynchronously in the user-definable metadata or additional image data, insofar as supported by the file format. These data may be further data, for example those created through CBIR.
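A minimal sketch of the thumbnail editing step mentioned above, assuming the Pillow library and an already extracted JPEG file, could read as follows; the file naming convention and the thumbnail size are illustrative assumptions.

    # Sketch of the editing step: if no preview exists, one is generated from the
    # original image on the server side and stored next to it.

    from pathlib import Path
    from PIL import Image


    def ensure_thumbnail(jpeg_path: str, size=(128, 128)) -> str:
        original = Path(jpeg_path)
        thumb_path = original.with_name(original.stem + "_thumb.jpg")
        if thumb_path.exists():                    # a preview is already available
            return str(thumb_path)
        with Image.open(original) as img:
            img.thumbnail(size)                    # generated from the original image
            img.save(thumb_path, "JPEG")
        return str(thumb_path)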
In a further example embodiment, an automatic image recognition method is carried out at the time 46 in the context of the CBIR. The recognized structures are classified. The acquisition result is automatically recorded in the additional image data, for example in text form, which is also human-readable. The feature recognition can be improved manually or automatically in stages, since the image editing function is performed centrally for a very rapidly expanding database. In particular, the acquisition rate can be increased quickly, since a multiplicity of images are available at a central location. In the case of the image processing on the user workstations, these functions would have to be updated individually, thereby incurring an administrative/organizational overhead. The overhead for certification or for acceptance testing during commissioning is also much less if the software is updated on one (server) computer only, rather than on X image editing workstations.
The statements made with reference to
However, instead of the user input 6, a user input 58 by means of which the image data are requested, for example through input of a patient identifier, is performed in the method shown in
Instead of the message 30, a message 60, which is also designated as readObject and contains the DICOM field data, is generated by the DP system 12. Alternatively, the message 60 contains only an identifier which allows the determination of associated DICOM data. The message 60 is transmitted from the DP system 12 to the interface unit 14. The message 60 contains no pixel data.
Following the receipt of the message 60 in the interface unit 14, step 32b is carried out, wherein the DICOM field data of the message 60 are read. Alternatively, a memory access can be carried out in order to read the DICOM data depending on an identifier, for example from the image data store 16.
Since there are no pixel data in the message 60, a message corresponding to the message 34 storeImage(image) is absent from the sequence shown in
The edited image data can also be transmitted to the DP system 12 only, without a storage taking place in the image storage unit or in the image data store 16, see transmission 61 from the image editing unit 18 to the interface unit 14 and transmission 62 from the interface unit 14 to the DP system 12.
The DP system 12 outputs the edited image data, for example on a screen, by means of the application program or a different program, see screen output 64.
Further messages 68 readObject from the DP system 12 or from other DP systems can be edited by the PACS 8. Write requests can also be edited by the PACS 8, as explained, for example, in detail above with reference to
The stored original image data and/or the stored edited image data can also be retrieved on demand from the DP system 12 or from the application program 10 or from other DP systems or application programs, wherein, for example, no further editing is carried out.
This design offers the advantage that an image editing function can use the image data present at this time in the unit 10 for an aggregation. For example, the message 60c can be coded in such a way that an overlay with the addition of currently available images in work step 46b is requested, which is then transmitted back in message 62 as a newly generated image to the unit 12. In this way, images can be generated which do not yet exist as such in the transmission of the message 60c, but are generated dynamically only on request, which can also be referred to as “virtual” objects.
In a further example embodiment, an automatic image recognition method is carried out at the time 46b in the context of the CBIR. The recognized structures are classified. The acquisition result is automatically recorded in the additional image data, for example in text form, which is also human-readable. The feature recognition can be improved manually or automatically in stages, since the image editing function is carried out centrally for a very rapidly expanding database. In particular, the acquisition rate can be increased quickly, since a multiplicity of images are available at a central location. In the case of the image processing on the user workstations 12, these functions would have to be updated individually, thereby incurring an administrative/organizational overhead. The overhead for certification or for acceptance testing during commissioning is also much less if the software is updated on one (server) computer only, rather than on X image editing workstations.
The method explained with reference to
In a different example embodiment of the method shown in
The statements made with reference to
However, instead of the user input 6 or 58, a user input 58c is performed in the method shown in
It could thus be instigated that all JPEG images are subjected to a transformation to TIFF.
Instead of the message 60, a message 60c which, in the format of a DICOM object identifier, indirectly specifies a data editing function is generated by the DP system 12. This indirect specification of a function can also be regarded as a specification of a virtual object (VO). The message 60c may additionally contain DICOM field data FD also.
Following the receipt of the message 60c in the interface unit 14, step 32c is carried out, wherein the DICOM VO data of the message 60c are read, along with any field data FD that are present.
Since there are no pixel data in the message 60c, a message corresponding to the message 34 storeImage(image) is again absent from the sequence shown in
The identifier id in the message 40c now specifies the editing function determined using the identifier and the associated rule, e.g. F1 or F2, see also
When the data editing function is performed, data objects are determined by the image editing unit 18, wherein, where relevant, the transmitted field data FD are also used to interrogate the storage unit 16, see also the explanations relating to
The method then continues for all determined datasets and data objects as explained above with reference to
The edited image data may also only be transmitted to the DP system 12 without a storage taking place in the image storage unit or in the image data store 16, see transmission 62c.
The DP system 12 outputs the edited image data or other editing results, for example on a screen using the application program or a different program, see screen output 64c.
Further messages 60c readObject from the DP system 12 or from other DP systems can be edited by the PACS 8. Write requests can also be edited by the PACS 8, as explained, for example, in detail above with reference to
The stored original image data and/or the stored edited image data can also be retrieved on demand from the DP system 12 or from the application program 10 or from other DP systems or application programs, wherein, for example, no further editing is carried out.
This design offers the advantage that an image editing function can use the image data present at this time in the unit 10 for an aggregation. For example, the message 60c can be coded in such a way that an overlay with the addition of currently available images in work step 46c is requested, which is then transmitted back in message 62 as a newly generated image to the unit 12. The objects are selected according to the function stored in the image editing unit 18 or taking account of additional data FD or DICOM field data FD.
In a further example embodiment, an automatic image recognition method is carried out at the time 46c in the context of the CBIR. The recognized structures are classified. The acquisition result is automatically recorded in the additional image data, for example in text form, which is also human-readable. The feature recognition can be improved manually or automatically in stages, since the image editing function is carried out centrally for a very rapidly expanding database. In particular, the acquisition rate can be increased quickly, since a multiplicity of images are available at a central location. In the case of the image processing on the user workstations 12, these functions would have to be updated individually, thereby incurring an administrative/organizational overhead. The overhead for certification or for acceptance testing during commissioning is also much less if the software is updated on one (server) computer 8 only, rather than on X image editing workstations.
The method explained with reference to
In an alternative to the method shown in
The image data BD are embedded in a data block 102 which, for example, meets the DICOM standard. The data block 102 contains patient data PD which are also designated as additional image data BZD1. Examples of patient data are specified above in the table (Patient Module) mentioned in the introduction, e.g. “Patient Identification Number”.
Further data 104, for example, are stored between the patient data PD and a partial image block BT. The partial image block BT contains image data BD, e.g. pixel data in JPEG format or TIFF (Tagged Image File Format). The additional image data contained in the partial image block BT are also designated as additional image data BZD2. Examples of additional image data BZD2 are specified above in the introduction, e.g. “contrast”, “recording angle”, etc.
A rule R1 reads:
IF D5=kidney THEN ID=1 (edge detector).
An edge detection is thus carried out with the specification in a data field D5, according to which the static image of a kidney is involved.
A rule R2 reads:
IF D5=heart THEN ID=2 (determine chamber volume, e.g. with fuzzy methods).
In the case of a recording of the heart, the chamber volume of one or both cardiac atria or cardiac ventricles is determined over a plurality of dynamic images of the heart, wherein, for example, a fuzzy method is employed.
A rule R10 reads:
IF ID=O1 THEN F1 (edge detector).
A function F1, e.g. an edge detection, is thus defined here for an identifier O1. Without further additional data FD, the function F1 would be applied to all suitable objects in the image data store 16. Additional data FD can be used to demarcate the objects under consideration or to find relevant objects. The additional data may be one or more of the following data:
—a time restriction, e.g. images recorded in the last month, and/or
—specification of a patient, a study or another criterion.
A rule R12 reads:
IF ID=O2 THEN F2 (aggregation).
It is thus specified here for an identifier (ID) O2 (a “virtual” object) that the performance of the function F2 is to be instigated, e.g. an aggregation over a plurality of blood pressure values, wherein, for example, a chart or a graph is produced. Additional data FD can be used to demarcate the object under consideration or to find relevant data objects. If no additional data FD are present, all relevant objects, for example, are edited, or a predefined limiting value is taken into account. The additional data may be one or more of the following data:
The rules R1, R2, R10 and/or R12 can also be of more complex design, in particular with logical links in the IF part, e.g. AND, OR, XOR or NEGATION. A plurality of functions can also be specified in the THEN part.
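Purely as an illustration, the rules R1, R2, R10 and R12 could be recorded and checked as follows; the evaluator and the surrounding data structure are assumptions for this sketch only.

    # The rules from the example embodiments as a simple table-like structure,
    # checked against the DICOM field data D5 or the identifier ID of a request.

    RULES = [
        # name, IF part (field, value), THEN part (identifier of the editing function)
        ("R1",  ("D5", "kidney"), "1"),   # edge detector
        ("R2",  ("D5", "heart"),  "2"),   # determine chamber volume
        ("R10", ("ID", "O1"),     "F1"),  # edge detector for the "virtual" object O1
        ("R12", ("ID", "O2"),     "F2"),  # aggregation for the "virtual" object O2
    ]


    def applicable_functions(fields: dict) -> list:
        """Return the editing functions whose IF part is fulfilled by the field data."""
        return [then for _, (field, value), then in RULES if fields.get(field) == value]


    if __name__ == "__main__":
        print(applicable_functions({"D5": "kidney"}))   # ['1']
        print(applicable_functions({"ID": "O2"}))       # ['F2']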
Instead of or in addition to the images, measurement value data can also be edited in the methods according to
The rules may also refer, for example, to the age of an image file and/or to the recording angle in the recording of the image data or to other data which are contained in the additional image data BZD1 and BZD2.
The example embodiments are not drawn to scale and are not limiting. Variations in the context of the activity of the person skilled in the art are possible. Although the invention has been illustrated and described in further detail by means of the preferred example embodiments, the invention is not restricted by the disclosed examples, and other variations can be derived herefrom by the person skilled in the art without exceeding the scope of protection of the invention. The developments and designs specified in the introduction may be combined with one another. The example embodiments specified in the description of the figures may similarly be combined with one another. Furthermore, the developments and designs specified in the introduction may be combined with the example embodiments specified in the description of the figures.
Priority application: number 10 2013 206 754.2, date Apr 2013, country DE, kind national.
Filing document: PCT/EP2014/052295, filing date 2/6/2014, country WO, kind 00.