This application is a National Stage Entry of PCT/JP2020/006048 filed on Feb. 17, 2020, which claims priority from Japanese Patent Application 2019-051317 filed on Mar. 19, 2019, the contents of all of which are incorporated herein by reference, in their entirety.
The present invention relates to a system, a client apparatus included in the system, a data processing method in the system, a computer program and a recording medium.
A cloud service such as an IaaS (Infrastructure as a Service) may be used for a machine learning process. A cloud service provider takes security measures such as, for example, encryption of a communication path and storage encryption. However, for example, due to a human error or the like on the provider side, there is a risk of a leakage of user information. For this reason, a user needs to take a measure in preparation for the information leakage, for example, by encrypting data by himself or herself before transmitting the data to the cloud service.
As a technique in preparation for the information leakage, for example, a technique described in a Patent Literature 1 is proposed. In the technique described in the Patent Literature 1, first, data DT0 is divided into n pieces of distributed information; then, a secret distribution that allows the data DT0 to be restored from k (1<k<n) pieces of distributed information out of the n pieces of distributed information is performed, and the k pieces of distributed information generated by the secret distribution are stored in different storage apparatuses. Then, data DT1, which is generated by editing the data DT0 restored from the k pieces of distributed information, is divided into n pieces of edited distributed information. Then, a difference is calculated between the distributed information stored in each storage apparatus and the corresponding edited distributed information, and the corresponding edited distributed information and a plurality of differences related to a plurality of edited distributed information other than the corresponding edited distributed information are stored in each storage apparatus. There are Patent Literatures 2 to 5 as other related techniques.
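For illustration only, the following Python sketch shows a generic Shamir-style (k, n) threshold scheme, which is one well-known way of realizing the kind of secret distribution described above; the prime, the function names, and the toy integer secret are assumptions of this sketch and are not taken from the Patent Literature 1.

```python
# A generic Shamir-style (k, n) threshold secret sharing sketch, given only to
# illustrate the kind of "secret distribution" described above; it is not the
# specific method of Patent Literature 1.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for small integer secrets


def split_secret(secret: int, k: int, n: int):
    """Split `secret` into n shares such that any k shares can reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):  # Horner evaluation of the random polynomial
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares


def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * (-xj)) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


shares = split_secret(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
```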
Patent Literature 1: JP 2007-304962A
Patent Literature 2: JP 2014-142871A
Patent Literature 3: JP 2017-211689A
Patent Literature 4: JP 4590048B
Patent Literature 5: JP 4588142B
In the technique described in the Patent Literature 1, the cloud services are not considered, and there is room for improvement.
In view of the above-described problems, it is therefore an example object of the present invention to provide a system, a client apparatus, a data processing method, a computer program and a recording medium that are configured to carry out a security measure in preparation for an information leakage without depending on a cloud service provider.
A system according to an example aspect of the present invention is a system including: a cloud server that is configured to perform a machine learning process; and a client apparatus that is configured to communicate with the cloud server, the client apparatus including: a generating unit that generates one or a plurality of reference data from a plurality of data used for the machine learning and that generates a plurality of difference data, wherein the reference data is a reference for at least a part of the plurality of data, and each difference data indicates a difference between each of the plurality of data and corresponding reference data out of the one or the plurality of reference data; and a storage unit that stores the plurality of difference data in a storage apparatus of the cloud server.
A client apparatus according to an example aspect of the present invention is a client apparatus that is configured to communicate with a cloud server that is configured to perform a machine learning process, the client apparatus including: a generating unit that generates one or a plurality of reference data from a plurality of data used for the machine learning and that generates a plurality of difference data, wherein the reference data is a reference for at least a part of the plurality of data, and each difference data indicates a difference between each of the plurality of data and corresponding reference data out of the one or the plurality of reference data; and a storage unit that stores the plurality of difference data in a storage apparatus of the cloud server.
A data processing method according to an example aspect of the present invention is a data processing method in a system including: a cloud server that is configured to perform a machine learning process; and a client apparatus that is configured to communicate with the cloud server, the data processing method including: a generating step in which one or a plurality of reference data is generated from a plurality of data used for the machine learning and a plurality of difference data are generated, wherein the reference data is a reference for at least a part of the plurality of data, and each difference data indicates a difference between each of the plurality of data and corresponding reference data out of the one or the plurality of reference data; and a storage step in which the plurality of difference data are stored in a storage apparatus of the cloud server.
A computer program according to an example aspect of the present invention allows a computer to execute the data processing method according to the example aspect described above.
A recording medium according to an example aspect of the present invention is a recording medium on which the computer program according to the example aspect described above is recorded.
According to the system, the client apparatus, the data processing method, the computer program, and the recording medium in the respective example aspects described above, it is possible to carry out a security measure in preparation for an information leakage without depending on a cloud service provider.
An example embodiment of a system, a client apparatus, a data processing method, a computer program and a recording medium will be described with reference to the drawings. The following describes the example embodiment of a system, a client apparatus, a data processing method, a computer program and a recording medium, by using a system 1 that includes: a cloud server that is configured to perform a machine learning process; and a client apparatus that is configured to communicate with the cloud server.
A configuration of the system 1 according to the example embodiment will be described with reference to
Next, a hardware configuration of the client apparatus 100 will be described with reference to
In the illustrated configuration, the client apparatus 100 includes a CPU 11, a RAM 12, a ROM 13, a storage apparatus 14, an input apparatus 15, and an output apparatus 16.
The CPU 11 reads a computer program. For example, the CPU 11 may read a computer program stored by at least one of the RAM 12, the ROM 13 and the storage apparatus 14. For example, the CPU 11 may read a computer program stored in a computer-readable recording medium, by using a not-illustrated recording medium reading apparatus. The CPU 11 may obtain (i.e., read) a computer program from a not illustrated apparatus disposed outside the client apparatus 100, through a network interface. The CPU 11 controls the RAM 12, the storage apparatus 14, the input apparatus 15, and the output apparatus 16 by executing the read computer program. Especially in this example embodiment, when the CPU 11 executes the read computer program, a logical functional block for encrypting a plurality of data used for a machine learning (which will be described later in detail) is implemented in the CPU 11. In other words, the CPU 11 is configured to function as a controller for encrypting the plurality of data used for the machine learning. A configuration of the functional block implemented in the CPU 11 will be described in detail later with reference to
The RAM 12 temporarily stores the computer program to be executed by the CPU 11. The RAM 12 temporarily stores the data that are temporarily used by the CPU 11 when the CPU 11 executes the computer program. The RAM 12 may be, for example, a D-RAM (Dynamic RAM).
The ROM 13 stores a computer program to be executed by the CPU 11. The ROM 13 may otherwise store fixed data. The ROM 13 may be, for example, a P-ROM (Programmable ROM).
The storage apparatus 14 stores the data that are stored for a long term by the client apparatus 100. The storage apparatus 14 may operate as a temporary storage apparatus of the CPU 11. The storage apparatus 14 may include, for example, at least one of a hard disk apparatus, a magneto-optical disk apparatus, an SSD (Solid State Drive), and a disk array apparatus.
The input apparatus 15 is an apparatus that receives an input instruction from a user of the client apparatus 100. The input apparatus 15 may include, for example, at least one of a keyboard, a mouse, and a touch panel.
The output apparatus 16 is an apparatus that outputs information about the client apparatus 100, to the outside. For example, the output apparatus 16 may be a display apparatus that is configured to display the information about the client apparatus 100.
Next, a hardware configuration of the cloud server 200 will be described with reference to
In the illustrated configuration, the cloud server 200 includes a CPU 21, a RAM 22, a ROM 23, a storage apparatus 24, an input apparatus 25, and an output apparatus 26.
The CPU 21 reads a computer program. For example, the CPU 21 may read a computer program stored by at least one of the RAM 22, the ROM 23 and the storage apparatus 24. For example, the CPU 21 may read a computer program stored in a computer-readable recording medium, by using a not-illustrated recording medium reading apparatus. The CPU 21 may obtain (i.e., read) a computer program from a not illustrated apparatus disposed outside the cloud server 200, through a network interface. The CPU 21 controls the RAM 22, the storage apparatus 24, the input apparatus 25, and the output apparatus 26 by executing the read computer program. Especially in this example embodiment, when the CPU 21 executes the read computer program, a logical functional block for performing a machine learning process and for decrypting the plurality of data used for the machine learning (which will be described later in detail) is implemented in the CPU 21. In other words, the CPU 21 is configured to function as a controller for performing the machine learning process and for decrypting the plurality of data used for the machine learning. A configuration of the functional block implemented in the CPU 21 will be described in detail later with reference to
The RAM 22 temporarily stores the computer program to be executed by the CPU 21. The RAM 22 temporarily stores the data that are temporarily used by the CPU 21 when the CPU 21 executes the computer program. The RAM 22 may be, for example, a D-RAM.
The ROM 23 stores a computer program to be executed by the CPU 21. The ROM 23 may otherwise store fixed data. The ROM 23 may be, for example, a P-ROM.
The storage apparatus 24 stores the data that are stored for a long term by the cloud server 200. The storage apparatus 24 may operate as a temporary storage apparatus of the CPU 21. The storage apparatus 24 may include, for example, at least one of a hard disk apparatus, a magneto-optical disk apparatus, an SSD, and a disk array apparatus.
The input apparatus 25 is an apparatus that receives an input instruction from an administrator of the cloud server 200. The input apparatus 25 may include, for example, at least one of a keyboard, a mouse, and a touch panel.
The output apparatus 26 is an apparatus that outputs information about the cloud server 200, to the outside. For example, the output apparatus 26 may be a display apparatus that is configured to display the information about the cloud server 200.
Next, the configurations of the functional blocks implemented in the CPU 11 and the CPU 21 will be described with reference to
In the CPU 11 of the client apparatus 100, a data encryption apparatus 111 is implemented as the logical functional block for encrypting the plurality of data used for the machine learning. The data encryption apparatus 111 includes a data preprocessing apparatus 1111, a simple data classification apparatus 1112, a reference data calculation apparatus 1113, and a difference data calculation apparatus 1114.
In the CPU 21 of the cloud server 200, a data decryption apparatus 211, a learning engine 212, and a prediction engine 213 are implemented as the logical functional block for performing the machine learning process and for decrypting the plurality of data used for the machine learning. The data decryption apparatus 211 includes a combined data calculation apparatus 2111.
As illustrated in
Incidentally, in
The data preprocessing apparatus 1111 of the data encryption apparatus 111 performs a predetermined data preprocessing on learning data and prediction data that are obtained from the learning data input apparatus 302 and the prediction data input apparatus 303, respectively. When the learning data and the prediction data are image data, examples of the predetermined preprocessing include a contrast correction, an alignment, a rotational correction, and the like.
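As an illustration, a contrast correction of the kind mentioned above could look like the following numpy sketch; the function name, the min-max stretching, and the toy image are assumptions of this sketch, not the exact preprocessing of the data preprocessing apparatus 1111, and alignment and rotational correction are omitted.

```python
# A minimal sketch of one plausible preprocessing step: a min-max contrast
# stretch for a grayscale image (an illustrative assumption).
import numpy as np


def correct_contrast(image: np.ndarray) -> np.ndarray:
    """Stretch the pixel values of a grayscale image to the full 0-255 range."""
    img = image.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:  # a flat image has no contrast to stretch
        return np.zeros_like(img)
    return (img - lo) / (hi - lo) * 255.0


dim_image = np.random.uniform(100, 140, size=(8, 8))  # toy low-contrast image
corrected = correct_contrast(dim_image)
print(corrected.min(), corrected.max())  # -> 0.0 255.0
```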
The simple data classification apparatus 1112 classifies each of the learning data or the prediction data by category, by using a model for a data classification that is obtained from the simple classification model input apparatus 301. Note that the model for the data classification may be a model disclosed on the Internet, or a model generated by the machine learning process in the past.
For example, when six face images A to F illustrated in
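The by-category grouping performed by the simple data classification apparatus 1112 can be sketched as below; the `classify` callable stands in for whatever simple classification model is supplied, and its interface as well as the brightness-based toy classifier are assumptions of this sketch.

```python
# A hedged sketch of grouping data by category with a supplied simple
# classification model; the classifier interface here is an assumption.
from collections import defaultdict
import numpy as np


def group_by_category(named_images, classify):
    """Return {category: [image names]} using the supplied classifier."""
    groups = defaultdict(list)
    for name, img in named_images.items():
        groups[classify(img)].append(name)
    return dict(groups)


# Toy stand-ins for face images A to F and a trivial brightness-based model.
faces = {name: np.random.rand(8, 8) for name in "ABCDEF"}
print(group_by_category(faces, classify=lambda im: "bright" if im.mean() > 0.5 else "dark"))
```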
The reference data calculation apparatus 1113 calculates reference data by category, from each of the learning data or the prediction data classified by category. The reference data is a reference for a plurality of data classified into one category. Such reference data may be, for example, an average value of the plurality of data classified into one category. When the learning data and the prediction data are image data, image data in which the pixel value of each pixel is the average of the pixel values of the corresponding pixels of the plurality of image data serving as the learning data or the prediction data may be calculated as the reference data. Alternatively, the reference data may be, for example, data that include a common component that is common to each of the plurality of data classified into one category.
The reference data calculation apparatus 1113 may calculate an image illustrated in
The difference data calculation apparatus 1114 calculates difference data by calculating a difference between the reference data and the learning data or the prediction data, by category. The difference data calculation apparatus 1114 calculates difference data illustrated in
Here, an image A in
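As a concrete, deliberately simplified sketch of the two calculations above, assuming (as one of the options the text describes) that the reference data is the pixel-wise average of one category, the reference data and difference data can be computed as follows; the function name and the toy images are assumptions of this sketch.

```python
# A minimal numpy sketch of the reference data (per-category pixel-wise
# average) and the difference data (each image minus that average). The
# reference stays on the client; only the differences go to the cloud.
import numpy as np


def split_into_reference_and_differences(images):
    """Return (reference, list of signed difference images) for one category."""
    stack = np.stack([img.astype(np.float64) for img in images], axis=0)
    reference = stack.mean(axis=0)                     # kept on the client side
    differences = [img - reference for img in stack]   # stored in the cloud
    return reference, differences


# Six toy "face images" classified into one category.
category = [np.random.randint(0, 256, (16, 16)).astype(np.float64) for _ in range(6)]
ref, diffs = split_into_reference_and_differences(category)

# A difference image alone does not reproduce a face; adding back the withheld
# reference restores the original exactly.
assert np.allclose(diffs[2] + ref, category[2])
```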
When the learning data and the prediction data are image data, the difference data calculated by the difference data calculation apparatus 1114 contains fewer high-frequency components than the original image data, because it does not include the reference data. Therefore, when a frequency conversion compression (i.e., a process that converts the image data into data in a frequency domain and then compresses the converted data, such as the compression process used in JPEG (Joint Photographic Experts Group)) is performed on the difference data, the compressed difference data can be expected to have a relatively small data size. For this reason, the difference data calculation apparatus 1114 may perform the frequency conversion compression on the calculated difference data, and may store the compressed difference data in the difference data storage apparatus 241. With this configuration, it is possible to reduce the time spent for storing the difference data in the difference data storage apparatus 241, and it is also possible to reduce the consumption of the capacity of the difference data storage apparatus 241.
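JPEG itself uses a block DCT with quantization and entropy coding; purely as an illustrative stand-in, the following sketch converts a difference image to the frequency domain with numpy's FFT and keeps only the largest-magnitude coefficients. The 10% keep ratio, the function names, and the lossy reconstruction are assumptions of this sketch, not the compression actually used by the difference data calculation apparatus 1114.

```python
# An illustrative stand-in for frequency conversion compression: transform the
# difference image to the frequency domain and keep only the strongest
# coefficients as a sparse representation (a lossy scheme, like JPEG).
import numpy as np


def compress_difference(diff, keep_ratio=0.10):
    """Keep only the largest-magnitude frequency coefficients of `diff`."""
    spectrum = np.fft.fft2(diff)
    threshold = np.quantile(np.abs(spectrum), 1.0 - keep_ratio)
    kept = np.where(np.abs(spectrum) >= threshold)
    return diff.shape, kept, spectrum[kept]


def decompress_difference(shape, kept, values):
    """Rebuild an approximate difference image from the kept coefficients."""
    spectrum = np.zeros(shape, dtype=complex)
    spectrum[kept] = values
    return np.real(np.fft.ifft2(spectrum))


difference_image = np.random.randn(16, 16)
approximation = decompress_difference(*compress_difference(difference_image))
```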
The client apparatus 100 transmits the reference data stored in the reference data storage apparatus 141 to the data decryption apparatus 211 of the cloud server 200 when the machine learning process is requested to the cloud server 200.
The combined data calculation apparatus 2111 of the data decryption apparatus 211 combines the reference data transmitted from the client apparatus 100 with the difference data stored in advance in the difference data storage apparatus 241 to restore the learning data or the prediction data. Specifically, the combined data calculation apparatus 2111 combines the image as the reference data illustrated in
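The server-side restoration is a simple element-wise addition, as in the sketch below; the variable names and toy arrays are assumptions of this sketch.

```python
# A sketch of the combined data calculation apparatus 2111: each stored
# difference image plus the reference image received from the client yields a
# restored learning (or prediction) image.
import numpy as np


def restore_images(reference, differences):
    """Recombine the reference data with each stored difference datum."""
    return [reference + diff for diff in differences]


reference = np.full((16, 16), 128.0)                       # sent by the client
differences = [np.random.randn(16, 16) for _ in range(3)]  # stored in the cloud
restored = restore_images(reference, differences)
```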
The learning engine 212 generates an analytical model by the machine learning that uses the learning data restored by the combined data calculation apparatus 2111. The learning engine 212 stores the generated analytical model, for example, in an analysis result storage apparatus 242 realized by the storage apparatus 24.
The prediction engine 213 stores a prediction result, for example, in the prediction result storage apparatus 243 realized by the storage apparatus 24, wherein the prediction result is obtained, for example, by predicting the category of each of the plurality of image data that are the prediction data by using the prediction data restored or decrypted by the combined data calculation apparatus 2111 and the analytical model stored in the analysis result storage apparatus 242.
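The learning engine 212 and the prediction engine 213 are not tied to a particular algorithm in this description, so the sketch below uses a deliberately simple stand-in (a nearest-centroid classifier over flattened images) only to show where the restored learning data and prediction data enter; the function names and toy data are assumptions.

```python
# A stand-in "analytical model": one centroid per category, learned from the
# restored learning data and used to predict the category of restored
# prediction data. This is an illustrative assumption, not the engines' actual
# machine learning algorithm.
import numpy as np


def train_analytical_model(restored_images, labels):
    """Return {label: centroid of the flattened images with that label}."""
    model = {}
    for label in set(labels):
        members = [img.ravel() for img, y in zip(restored_images, labels) if y == label]
        model[label] = np.mean(members, axis=0)
    return model


def predict_category(model, restored_image):
    """Return the label whose centroid is closest to the flattened image."""
    flat = restored_image.ravel()
    return min(model, key=lambda label: np.linalg.norm(flat - model[label]))


# Toy restored learning data with two categories.
images = [np.zeros((8, 8)), np.full((8, 8), 200.0), np.full((8, 8), 10.0)]
model = train_analytical_model(images, labels=["dark", "bright", "dark"])
print(predict_category(model, np.full((8, 8), 5.0)))  # -> "dark"
```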
Next, the operation of the system 1 when the machine learning process is performed will be described with reference to a flowchart in
In the learning flow, first, the data encryption apparatus 111 of the client apparatus 100 obtains the simple classification model from the simple classification model input apparatus 301 (a step S101), and obtains the learning data from the learning data input apparatus 302 (a step S102).
The data preprocessing apparatus 1111 of the data encryption apparatus 111 performs the predetermined preprocessing on the learning data obtained in the step S102 (a step S103). Then, the simple data classification apparatus 1112 of the data encryption apparatus 111 classifies the learning data by category, by using the simple classification model obtained in the step S101 (a step S104).
Then, the reference data calculation apparatus 1113 of the data encryption apparatus 111 calculates the reference data by category, from the learning data classified by category (a step S105). The reference data calculation apparatus 1113 stores the calculated reference data in the reference data storage apparatus 141 (a step S106) and outputs the calculated reference data to the difference data calculation apparatus 1114 of the data encryption apparatus 111.
The difference data calculation apparatus 1114 calculates the difference data by category, by calculating the difference between the reference data and the learning data (a step S107). The difference data calculation apparatus 1114 stores the calculated plurality of difference data in the difference data storage apparatus 241 of the cloud server 200 (a step S108).
Then, when the machine learning process is requested to the cloud server 200, the client apparatus 100 transmits the reference data stored in the reference data storage apparatus 141 to the data decryption apparatus 211 of the cloud server 200 (a step S109).
The combined data calculation apparatus 2111 of the data decryption apparatus 211 combines the reference data transmitted from the client apparatus 100 with the difference data stored in advance in the difference data storage apparatus 241 to restore the learning data (a step S110). The data decryption apparatus 211 outputs the restored learning data to the learning engine 212.
The learning engine 212 generates the analytical model by performing the machine learning process by using the learning data (i.e., by analyzing the learning data) (a step S111). The learning engine 212 stores the generated analytical model in the analysis result storage apparatus 242 (a step S112).
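Putting the steps together, the learning flow can be summarized by the compact sketch below, with the client/server boundary marked in comments; the toy data, the hard-coded categories, and the per-category-mean "analytical model" are assumptions made only so that the snippet runs on its own, while the split/upload/recombine structure follows the steps above.

```python
# A compact end-to-end sketch of the learning flow (steps S101 to S112); only
# the split/upload/recombine structure follows the text, everything else is a
# toy stand-in.
import numpy as np

rng = np.random.default_rng(0)
learning_data = [rng.integers(0, 256, (8, 8)).astype(float) for _ in range(6)]
categories = ["A", "A", "A", "B", "B", "B"]              # S104: simple classification

# --- client apparatus 100 (on-premises) ---------------------------------------
reference_store, difference_store = {}, {}               # 141 (local) / 241 (cloud)
for cat in set(categories):
    members = [d for d, c in zip(learning_data, categories) if c == cat]
    ref = np.mean(members, axis=0)                       # S105: reference data
    reference_store[cat] = ref                           # S106: kept on the client
    difference_store[cat] = [m - ref for m in members]   # S107-S108: uploaded

# --- cloud server 200 ----------------------------------------------------------
# S109-S110: the reference data is sent only when learning is requested.
restored = {cat: [d + reference_store[cat] for d in diffs]
            for cat, diffs in difference_store.items()}
# S111-S112: a stand-in analytical model (per-category mean of restored data).
analytical_model = {cat: np.mean(imgs, axis=0) for cat, imgs in restored.items()}
```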
Next, the operation of the system 1 when the analysis that uses the analytical model generated by the machine learning process is performed will be described with reference to a flowchart in
In the prediction flow, first, the data encryption apparatus 111 of the client apparatus 100 obtains the simple classification model from the simple classification model input apparatus 301 (a step S201), and obtains the prediction data from the prediction data input apparatus 303 (a step S202).
The data preprocessing apparatus 1111 of the data encryption apparatus 111 performs the predetermined preprocessing on the prediction data obtained in the step S202 (a step S203). Then, the simple data classification apparatus 1112 of the data encryption apparatus 111 classifies the prediction data by category, by using the simple classification model obtained in the step S201 (a step S204).
Then, the reference data calculation apparatus 1113 of the data encryption apparatus 111 calculates the reference data by category, from the prediction data classified by category (a step S205). The reference data calculation apparatus 1113 stores the calculated reference data in the reference data storage apparatus 141 (a step S206) and outputs the calculated reference data to the difference data calculation apparatus 1114 of the data encryption apparatus 111.
The difference data calculation apparatus 1114 calculates the difference data by category, by calculating the difference between the reference data and the prediction data (a step S207). The difference data calculation apparatus 1114 stores the calculated plurality of difference data in the difference data storage apparatus 241 of the cloud server 200 (a step S208).
Then, when the analysis that uses the analytical model is requested to the cloud server 200, the client apparatus 100 transmits the reference data stored in the reference data storage apparatus 141 to the data decryption apparatus 211 of the cloud server 200 (a step S209).
The combined data calculation apparatus 2111 of the data decryption apparatus 211 combines the reference data transmitted from the client apparatus 100 with the difference data stored in advance in the difference data storage apparatus 241 to restore the prediction data (a step S210). The data decryption apparatus 211 outputs the restored prediction data to the prediction engine 213.
The prediction engine 213 stores a prediction result in the prediction result storage apparatus 243, wherein the prediction result is obtained, for example, by predicting the category of each of the plurality of image data that are the prediction data by using the restored prediction data and the analytical model stored in the analysis result storage apparatus 242 (a step S212).
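The prediction flow mirrors the learning flow; the short sketch below shows only the prediction-specific part, reusing the style of the toy analytical model above (hard-coded here so the snippet is self-contained). The nearest-centroid lookup and the toy data are assumptions of this sketch.

```python
# A matching sketch of the prediction flow (steps S201 to S212): the client
# again uploads only difference data, and the reference data is sent only when
# the analysis is requested.
import numpy as np

analytical_model = {"A": np.zeros((8, 8)), "B": np.full((8, 8), 200.0)}  # from S112
prediction_data = [np.full((8, 8), 5.0), np.full((8, 8), 190.0)]

ref = np.mean(prediction_data, axis=0)        # S205: reference data on the client
diffs = [p - ref for p in prediction_data]    # S207-S208: uploaded to the cloud

restored = [d + ref for d in diffs]           # S209-S210: server-side restoration
predictions = [min(analytical_model,          # S212: toy nearest-centroid lookup
                   key=lambda c: np.linalg.norm(r - analytical_model[c]))
               for r in restored]
print(predictions)  # -> ['A', 'B']
```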
Incidentally, the “reference data calculation apparatus 1113”, the “difference data calculation apparatus 1114” and the “simple data classification apparatus 1112” correspond to an example of the “generating unit” in the Supplementary Notes that will be described later. The “difference data calculation apparatus 1114” corresponds to an example of the “storage unit” in the Supplementary Notes that will be described later.
In the system 1, as described above, the learning data used for the machine learning process in the cloud server 200 and the prediction data used for the analysis that uses the analytical model are divided into the reference data and the difference data in the client apparatus 100 in the on-premises environment (as a result of which, the learning data and the prediction data are encrypted). Then, the difference data is stored in the difference data storage apparatus 241 of the cloud server 200, while the reference data is stored in the reference data storage apparatus 141 of the client apparatus 100.
Even when the cloud server 200 is attacked and the difference data stored in the difference data storage apparatus 241 is stolen, it is extremely difficult (or it can be said that it is virtually impossible) to restore the original data (i.e., the learning data or the prediction data) only from the difference data. When the machine learning process or the analysis that uses the analytical model is performed, the reference data is transmitted to the cloud server 200. However, the reference data and the original data restored from the reference data and the difference data are temporarily stored, for example, in a cache memory of the cloud server 200, and are deleted from the cloud server 200 after the end of the machine learning process or the like. Therefore, there is very little chance that the original data is stolen from the cloud server 200.
Thus, by dividing the original data (i.e., the learning data or the prediction data) into the reference data and the difference data in the client apparatus 100 in the on-premises environment, it is possible to carry out a security measure in preparation for an information leakage without depending on a cloud service provider. In particular, since the data are not degraded by data encryption (i.e., dividing the data into the reference data and the difference data) and decryption (i.e., restoring the original data from the reference data and the difference data), the accuracy of the machine learning process or the like is not impaired due to the encryption and the decryption.
In addition, since the difference data stored in (in other words, uploaded to) the difference data storage apparatus 241 of the cloud server 200 has a relatively small data size, it is possible to reduce the consumption of the capacity of the difference data storage apparatus 241 that is caused by the difference data. Furthermore, when distributed processing is performed in the cloud server 200, it is possible to suppress a communication traffic volume in the distributed processing.
(1) In the above-described example embodiment, the image data is exemplified as an example of the learning data and the prediction data; however, the learning data and the prediction data are not limited to the image data, and may be, for example, audio data, time-series numerical data, or the like.
(2) As illustrated in
With respect to the example embodiments described above, the following Supplementary Notes will be further disclosed.
The system described in Supplementary Note 1 is a system including: a cloud server that is configured to perform a machine learning process; and a client apparatus that is configured to communicate with the cloud server, the client apparatus including: a generating unit that generates one or a plurality of reference data from a plurality of data used for the machine learning and that generates a plurality of difference data, wherein the reference data is a reference for at least a part of the plurality of data, and each difference data indicates a difference between each of the plurality of data and corresponding reference data out of the one or the plurality of reference data; and a storage unit that stores the plurality of difference data in a storage apparatus of the cloud server.
The system described in Supplementary Note 2 is the system described in Supplementary Note 1, wherein the generating unit classifies each of the plurality of data into one or a plurality of categories, and generates the reference data and the difference data for each of the one or a plurality of categories.
The system described in Supplementary Note 3 is the system described in Supplementary Note 1 or 2, wherein the storage unit performs frequency conversion compression on the plurality of difference data and then stores them in the storage apparatus.
The system described in Supplementary Note 4 is the system described in any one of Supplementary Notes 1 to 3, wherein the client apparatus transmits the one or a plurality of reference data to the cloud server when the machine learning process is requested to the cloud server, and the cloud server includes a restoration unit that restores the plurality of data from the plurality of difference data stored in the storage apparatus and the one or a plurality of reference data.
The client apparatus described in Supplementary Note 5 is a client apparatus that is configured to communicate with a cloud server that is configured to perform a machine learning process, the client apparatus including: a generating unit that generates one or a plurality of reference data from a plurality of data used for the machine learning and that generates a plurality of difference data, wherein the reference data is a reference for at least a part of the plurality of data, and each difference data indicates a difference between each of the plurality of data and corresponding reference data out of the one or the plurality of reference data; and a storage unit that stores the plurality of difference data in a storage apparatus of the cloud server.
The data processing method described in Supplementary Note 6 is a data processing method in a system including: a cloud server that is configured to perform a machine learning process; and a client apparatus that is configured to communicate with the cloud server, the data processing method including: a generating step in which one or a plurality of reference data is generated from a plurality of data used for the machine learning and a plurality of difference data are generated, wherein the reference data is a reference for at least a part of the plurality of data, and each difference data indicates a difference between each of the plurality of data and corresponding reference data out of the one or the plurality of reference data; and a storage step in which the plurality of difference data are stored in a storage apparatus of the cloud server.
The computer program described in Supplementary Note 7 is a computer program that allows a computer to execute the data processing method described in Supplementary Note 6.
The recording medium described in Supplementary Note 8 is a recording medium on which the computer program described in Supplementary Note 7 is recorded.
The present invention is allowed to be changed, if desired, without departing from the essence or spirit of the invention which can be read from the claims and the entire specification. A system, a client apparatus, a data processing method, a computer program and a recording medium, which involve such changes, are also intended to be within the technical scope of the present invention.
This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2019-051317, filed on Mar. 19, 2019, the disclosure of which is incorporated herein in its entirety by reference.