The present application claims the benefit under 35 U.S.C. § 119 of German Patent Application No. DE 102019217811.1 filed on Nov. 19, 2019, which is expressly incorporated herein by reference in its entirety.
The present invention relates to a method for processing data of a technical system.
The present invention further relates to a device for processing data of a technical system.
Preferred specific embodiments of the present invention relate to a method, in particular a computer-implemented method, for processing data of a technical system having multiple data processing areas, which is designed for processing input data, in particular of a computing device for carrying out cryptographic methods based on the input data, including: a) modeling at least one part of the technical system with the aid of at least one, in particular artificial, neural network; b) ascertaining at least two physical variables of the technical system for at least two of the multiple data processing areas, in particular during a processing of at least one part of the input data, the at least two physical variables each being assigned to one different data processing area, the at least two physical variables being at least temporarily supplied to the neural network as input variables. This advantageously makes it possible to jointly process the at least two physical variables with the aid of the neural network, whereby, for example, side channel information of the different data processing areas of the technical system may be combined with the aid of the neural network. In further preferred specific embodiments, efficient side channel attacks may thereby be implemented, for example, on technical systems which distribute a processing of data to their multiple data processing areas, in particular using cryptographic methods for concealment or masking purposes.
The term side channel attack describes a class of techniques with the aid of which, for example, confidential parameters, e.g., of a cryptographic implementation implemented by a technical system, may be extracted (e.g., key material of encryption methods).
To make side channel attacks more difficult, it is conventional to incorporate certain countermeasures into cryptographic implementations. One main category thereof is so-called masking. Masking is based on the idea of randomizing ("masking") sensitive intermediate results which occur, for example, during the calculation of a cryptographic operation (e.g., encrypting a plain text with the aid of a secret key), in particular to interrupt a data dependency of, e.g., the power consumption of the technical system or of the implementation on the secret key. In an additive masking scheme of order d, e.g., every sensitive intermediate value x of an algorithm is represented and processed in d+1 shares; for example, the first d shares are randomly selected, and the last share is calculated according to a certain specification. If the specified shares are processed or calculated, for example with the aid of different data processing areas of the technical system, conventional side channel attacks are made significantly more difficult.
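As an illustration of the additive masking described above, a minimal sketch follows; the function names and the choice of XOR as the addition operation over GF(2^8) are assumptions for illustration, not taken from the application:

```python
import secrets

def mask(x: int, d: int) -> list[int]:
    """Split sensitive byte x into d+1 shares (additive masking of order d
    over GF(2**8), where addition is XOR): the first d shares are chosen
    at random, the last is calculated so that all shares recombine to x."""
    shares = [secrets.randbelow(256) for _ in range(d)]
    last = x
    for s in shares:
        last ^= s  # last share per the scheme's specification
    return shares + [last]

def unmask(shares: list[int]) -> int:
    """Recombine the shares; only a party observing all shares recovers x."""
    x = 0
    for s in shares:
        x ^= s
    return x
```

For example, `unmask(mask(0x3A, 2))` returns `0x3A`, while each individual share on its own is uniformly distributed and thus carries no information about x.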
Example methods according to the preferred specific embodiments of the present invention, however, permit efficient side channel attacks even in approaches of this type, because the neural network according to preferred specific embodiments may combine information, in particular side channel leakage, of the different data processing areas and thus the different shares. In this way, it is possible to successfully ascertain, for example, an interesting intermediate value x of the algorithm without the actual mask values of the masking method having to be known or ascertained.
In further preferred specific example embodiments of the present invention, it is provided that the modeling includes at least one of the following elements: a) providing the neural network, the neural network being designed, in particular, as a deep neural network (including an input layer, an output layer and at least one intermediate layer (“hidden layer”) situated between the input layer and the output layer), the neural network being, in particular, already trained; b) training the neural network using the at least two physical variables and further input variables, in particular the further input variables including at least one of the following elements: A) known input data for the technical system; B) a cryptographic key.
In further preferred specific embodiments of the present invention, it is provided that the method also includes: training the neural network using known or the known input data, in particular in a first operating phase; processing data of the technical system and/or a further technical system, in particular in a second operating phase following the first operating phase.
In further preferred specific embodiments of the present invention, it is provided that the neural network is an artificial neural network of the CNN type (convolutional neural network, i.e., a neural network based on convolutional operations) and/or the RNN type (recurrent neural network) and/or the MLP type (multilayer perceptron) and/or a mixed form thereof.
In further preferred specific embodiments of the present invention, it is provided that the ascertainment of the at least two physical variables of the technical system includes at least one of the following elements: a) ascertaining a time characteristic of an electric field of the at least one data processing area; b) ascertaining a time characteristic of a magnetic field of the at least one data processing area; c) ascertaining a time characteristic of an electromagnetic field of the at least one data processing area; d) ascertaining a time characteristic of a voltage assigned to the at least one data processing area (or a variable characterizing the voltage), in particular the voltage being ascertained at at least one decoupling capacitor, which is assigned, in particular, to the at least one data processing area.
Further preferred specific embodiments of the present invention relate to a method, in particular a computer-implemented method, for training an, in particular artificial, neural network, which is designed to model at least one part of a technical system including multiple data processing areas, the technical system being designed to process input data, in particular the technical system including or being a computing device for carrying out cryptographic methods based on the input data, the method including: operating the technical system with the aid of known input data; ascertaining at least two physical variables of the technical system, in particular during the operation of the technical system, with the aid of the known input data, the at least two physical variables each being assigned to a different data processing area of the multiple data processing areas; training the neural network as a function of at least the known input data and/or the at least two physical variables.
In further preferred specific embodiments of the present invention, it is provided that the ascertainment of the at least two physical variables of the technical system includes at least one of the following elements: a) ascertaining a time characteristic of an electric field of the at least one data processing area; b) ascertaining a time characteristic of a magnetic field of the at least one data processing area; c) ascertaining a time characteristic of an electromagnetic field of the at least one data processing area; d) ascertaining a time characteristic of a voltage assigned to the at least one data processing area, in particular the voltage being ascertained at at least one decoupling capacitor, which is assigned, in particular, to the at least one data processing area.
Further preferred specific embodiments of the present invention relate to a device for carrying out the method according to the specific embodiments.
In further specific embodiments of the present invention, it is provided that the device includes: a computing device (“computer”), a memory device assigned to the computing device for at least temporarily storing at least one of the following elements: a) data; b) computer program, in particular for carrying out the method according to the specific embodiments.
In further preferred specific embodiments of the present invention, the data may at least temporarily include information, in particular parameters (e.g., weights, bias values, parameters of activation functions, etc.) of the neural network.
Further preferred specific embodiments of the present invention relate to a computer-readable storage medium, including commands, which, when executed by a computer, prompt the latter to carry out the method according to the specific embodiments.
Further preferred specific embodiments of the present invention relate to a computer program, including commands, which, when the program is executed by a computer, prompt the latter to carry out the method according to the specific embodiments.
Further preferred specific embodiments of the present invention relate to a data carrier signal, which characterizes and/or transfers the computer program according to the specific embodiments.
Further preferred specific embodiments of the present invention relate to a use of the method according to the specific embodiments and/or the device according to the specific embodiments and/or the computer program according to the specific embodiments and/or the data carrier signal according to the specific embodiments for at least one of the following elements: a) carrying out at least one attack, in particular a side channel attack, on the technical system; b) combining side channel information of the technical system assigned to different processing areas; c) training the neural network; d) training the neural network to combine side channel information of the technical system assigned to different processing areas.
Additional features, possible applications and advantages of the present invention are derived from the following description of exemplary embodiments of the present invention, which are illustrated in the figures. All features described or illustrated form the subject matter of the present invention alone or in any arbitrary combination, regardless of their wording in the description or illustration in the figures.
In other preferred specific embodiments, system 100 may be designed, for example, as a system on chip (SoC) 100, including a computing unit or a processor (“processing system”) B5 and, for example, a programmable logic unit PL, for example an FPGA (field-programmable gate array), logic unit PL implementing data processing areas B1, B2, B3, B4. For example, SoC 100 is designed to execute cryptographic algorithms, for example steps of the AES method, masking techniques being applicable, which assign different shares to each data processing area B1, B2, B3 for carrying out the corresponding calculations, e.g., for the purpose of making conventional side channel attacks more difficult.
For details on AES (advanced encryption standard), cf. for example https://doi.org/10.6028/NIST.FIPS.197. In further preferred specific embodiments, the cryptographic method may also include a method other than the encryption method mentioned as an example, e.g., a hash value formation or the like.
In further preferred specific embodiments of the present invention, data processing areas B1, B2, B3 may be assigned or correspond to, in particular, different clock regions (CR) of FPGA PL, for example to avoid coupling effects therebetween, which would reduce the security of the implementation of the cryptographic algorithms in SoC 100.
In further preferred specific embodiments of the present invention, multiple decoupling capacitors, which are designated collectively as EK and individually as EK1, EK2, EK3, may be provided, each decoupling capacitor being assigned, in particular, to precisely one of data processing areas B1, B2, B3.
In further specific embodiments of the present invention, SoC 100 includes multiple voltage (supply) lines, which each supply, in particular, a different part of SoC 100 with power. In particular, FPGA PL may be divided into multiple of the clock regions ("CR") already mentioned above, the individual CRs being suppliable with power via different pins of a power supply of FPGA PL.
The features according to preferred specific embodiments of the present invention, described below as an example with reference to the figures, may be combined with the features described above alone or in any arbitrary combination.
Further preferred specific embodiments of the present invention relate to a method, in particular a computer-implemented method, for processing data of a or the technical system 100 having multiple data processing areas B1, B2, B3, which is designed for processing input data ED, including: a) modeling 200 at least one part of technical system 100 with the aid of at least one, in particular artificial, neural network NN; b) ascertaining 210 at least two physical variables GB1, GB2, GB3 of technical system 100 for at least two of the multiple data processing areas, in particular during a processing of at least one part of input data ED, the at least two physical variables each being assigned to one different data processing area, the at least two physical variables being at least temporarily supplied to neural network NN as input variables.
Side channel attacks describe a class of techniques with the aid of which, for example, confidential parameters, e.g., of a cryptographic implementation implemented by technical system 100, may be extracted (e.g., key material of encryption methods).
To make side channel attacks more difficult, it is conventional to incorporate certain countermeasures into cryptographic implementations. One main category thereof is so-called masking. Masking is based on the idea of randomizing ("masking") sensitive intermediate results which occur, for example, during the calculation of a cryptographic operation (e.g., encrypting a plain text with the aid of a secret key), in particular to interrupt a data dependency of, e.g., the power consumption of the technical system or of the implementation on the secret key. In an additive masking scheme of order d, e.g., every sensitive intermediate value x of an algorithm is represented and processed in d+1 shares; for example, the first d shares are randomly selected, and the last share is calculated according to a certain specification. If the specified shares are processed or calculated, for example with the aid of different data processing areas of the technical system, conventional side channel attacks are made significantly more difficult.
The method according to preferred specific embodiments of the present invention, however, permits efficient side channel attacks even in approaches of this type, because neural network NN according to preferred specific embodiments may combine information, in particular side channel leakage, of the different data processing areas B1, B2, B3 and thus the different shares. In this way, it is possible to successfully ascertain, for example, an interesting intermediate value x of the algorithm without the actual mask values of the masking method having to be known or ascertained.
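Why combining the leakages of different data processing areas matters can be sketched with a synthetic example; the leakage model below (each area leaks its own share of a first-order XOR masking plus Gaussian noise) is a hypothetical illustration, not the application's measurement setup. Neither area's leakage alone correlates with the sensitive value x, but a joint function of both does, which is exactly the kind of combination neural network NN can learn:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
x  = rng.integers(0, 2, n)            # sensitive bit x
s1 = rng.integers(0, 2, n)            # random mask share, processed in area B1
s2 = x ^ s1                           # second share, processed in area B2
l1 = s1 + 0.1 * rng.normal(size=n)    # noisy leakage of area B1 (cf. GB1)
l2 = s2 + 0.1 * rng.normal(size=n)    # noisy leakage of area B2 (cf. GB2)

def corr(a, b):
    """Absolute Pearson correlation between two sample vectors."""
    return abs(np.corrcoef(a, b)[0, 1])

# Each share's leakage on its own is statistically independent of x ...
single = max(corr(l1, x), corr(l2, x))
# ... but the centered product of both areas' leakages reveals x:
combined = corr((l1 - l1.mean()) * (l2 - l2.mean()), x)
```

With this model, `single` stays near zero while `combined` is close to one: the centered product equals roughly +0.25 when x = 0 (shares equal) and -0.25 when x = 1 (shares differ).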
In further preferred specific embodiments of the present invention, it is provided that modeling 200 includes at least one of the following elements: a) providing neural network NN, the neural network being designed, in particular, as a deep neural network (including an input layer IL, an output layer and at least one intermediate layer ("hidden layer") situated between the input layer and the output layer), the neural network being, in particular, already trained; b) training neural network NN using the at least two physical variables GB1, GB2, GB3 and further input variables, in particular the further input variables including at least one of the following elements: A) known input data ED′ for technical system 100; B) a cryptographic key.
In further preferred specific embodiments of the present invention, it is provided that the method also includes: training neural network NN using known or the known input data ED′, in particular in a first operating phase PH1; processing data of technical system 100 and/or a further technical system 100′, in particular in a second operating phase following first operating phase PH1. This is illustrated as an example in the figures.
In further preferred specific embodiments of the present invention, it is provided that neural network NN is an artificial neural network of the CNN type (convolutional neural network, i.e., a neural network based on convolutional operations) and/or the RNN type (recurrent neural network) and/or the MLP type (multilayer perceptron) and/or a mixed form thereof.
Block 210 in the figures represents ascertainment 210 of the at least two physical variables GB1, GB2, GB3 of technical system 100.
In further preferred specific embodiments of the present invention, it is provided that ascertainment 210 of the at least two physical variables GB1, GB2, GB3 of technical system 100 includes at least one of the following elements: a) ascertaining a time characteristic of an electric field of the at least one data processing area; b) ascertaining a time characteristic of a magnetic field of the at least one data processing area; c) ascertaining a time characteristic of an electromagnetic field of the at least one data processing area; d) ascertaining a time characteristic of a voltage assigned to the at least one data processing area, in particular the voltage being ascertained at at least one decoupling capacitor EK1, EK2, EK3, which is assigned, in particular, to the at least one data processing area.
In the present case, the voltage, or a time characteristic of the voltage, present at individual decoupling capacitors EK1, EK2, EK3 is preferably ascertained, the decoupling capacitors, as described above, each being assigned, in particular, to precisely one of data processing areas B1, B2, B3. For example, physical variable GB1 thus corresponds to the time characteristic of the voltage at decoupling capacitor EK1, which is assigned to data processing area B1, physical variable GB2 corresponds to the time characteristic of the voltage at decoupling capacitor EK2, and physical variable GB3 corresponds to the time characteristic of the voltage at decoupling capacitor EK3.
In further preferred specific embodiments of the present invention, neural network NN is trained in first operating phase PH1 using known input data ED′, the at least two physical variables GB1, GB2, GB3, which are ascertained during a processing of known input data ED′ by technical system 100, being at least temporarily supplied to neural network NN as input variables, neural network NN forming an approximation v* of a second variable v.
Second variable v preferably corresponds, for example, to a result or an intermediate result of the cryptographic method, as carried out by technical system 100, based on input data ED′ (known in the present case).
In further preferred specific embodiments of the present invention, during the training, weights and/or other parameters of neural network NN are changed, in particular according to known training methods, to achieve the desired behavior of the neural network, for example a sufficiently accurate approximation v* of second variable v.
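A minimal training sketch follows. The setup is illustrative only: synthetic two-area leakage inputs standing in for GB1, GB2, a tiny fully connected network, and plain gradient descent as the "known training method"; none of these choices is prescribed by the application. The parameters are changed so that the network's output approximates the target variable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic leakage: two inputs (one per data processing area) and a binary
# target v = s1 XOR s2 (hypothetical first-order masked bit).
n  = 2000
v  = rng.integers(0, 2, n)
s1 = rng.integers(0, 2, n)
GB = np.stack([s1 + 0.2 * rng.normal(size=n),
               (v ^ s1) + 0.2 * rng.normal(size=n)], axis=1)
y  = v.reshape(-1, 1).astype(float)

# Tiny MLP 2 -> 16 -> 1; the sigmoid output plays the role of approximation v*.
W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(p):
    """Mean binary cross-entropy between prediction p and target y."""
    return float(-(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9)).mean())

_, p = forward(GB)
loss_before = loss(p)

lr = 0.5
for _ in range(3000):
    h, p = forward(GB)
    g   = (p - y) / n                  # gradient of the loss wrt the logits
    gW2 = h.T @ g;   gb2 = g.sum(0)
    gh  = (g @ W2.T) * (1.0 - h * h)   # backpropagate through tanh
    gW1 = GB.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2;  b2 -= lr * gb2    # change the weights and bias values
    W1 -= lr * gW1;  b1 -= lr * gb1

_, p = forward(GB)
loss_after = loss(p)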
Once neural network NN has been trained, for example as described above, e.g., a side channel attack on system 100 or a further system 100′ (which includes, e.g., a same or similar implementation of the cryptographic method as system 100) may be carried out.
Further preferred specific embodiments relate to a method, in particular a computer-implemented method, for training an, in particular artificial, neural network NN, which is designed to model at least one part of a technical system 100 including multiple data processing areas B1, B2, B3, technical system 100 being designed to process input data ED, in particular technical system 100 including or being a computing device for carrying out cryptographic methods based on the input data. According to further preferred specific embodiments, the training method may form, for example, a supplement to the methods described above.
In further preferred specific embodiments of the present invention, it is provided that the training method is carried out as an independent method, e.g., separately from the methods for processing data described above.
In further preferred specific embodiments of the present invention, the training method includes: operating technical system 100 with the aid of known input data ED′; ascertaining 230b at least two physical variables GB1, GB2, GB3 of technical system 100, in particular during the operation of technical system 100 with the aid of the known input data ED′, the at least two physical variables each being assigned to a different data processing area of the multiple data processing areas B1, B2, B3; training neural network NN as a function of at least the known input data ED′ and/or the at least two physical variables GB1, GB2, GB3.
In further preferred specific embodiments of the present invention, it is provided that ascertainment 230b of the at least two physical variables of technical system 100 includes at least one of the following elements: a) ascertaining a time characteristic of an electric field of the at least one data processing area; b) ascertaining a time characteristic of a magnetic field of the at least one data processing area; c) ascertaining a time characteristic of an electromagnetic field of the at least one data processing area; d) ascertaining a time characteristic of a voltage assigned to the at least one data processing area, in particular the voltage being ascertained at at least one decoupling capacitor, which is assigned, in particular, to the at least one data processing area.
Further preferred specific embodiments of the present invention relate to a device 300 for carrying out the method according to the specific embodiments.
In further specific embodiments of the present invention, it is provided that device 300 includes: a computing device 302 (“computer”) including at least one core 302a; a memory device 304 assigned to computing device 302 for at least temporarily storing at least one of the following elements: a) data DAT; b) computer program PRG, in particular for carrying out the method according to the specific embodiments.
In further preferred specific embodiments of the present invention, data DAT may at least temporarily include information, in particular parameters (e.g., weights, bias values, parameters of activation functions, etc.) of neural network NN, NN′. In further preferred specific embodiments, data DAT may at least temporarily include physical variables GB1, GB2, . . . and/or data derivable therefrom.
In further preferred specific embodiments of the present invention, memory device 304 includes a volatile memory 304a (e.g., random-access memory (RAM)) and/or a non-volatile memory 304b (e.g., flash EEPROM).
Further preferred specific embodiments of the present invention relate to a computer-readable storage medium SM, including commands PRG′, which, when executed by a computer, prompt the latter to carry out the method according to the specific embodiments of the present invention.
Further preferred specific embodiments of the present invention relate to a computer program PRG, PRG′, including commands, which, when program PRG, PRG′ is executed by a computer 302, prompt the latter to carry out the method according to the specific embodiments of the present invention.
Further preferred specific embodiments of the present invention relate to a data carrier signal DCS, which characterizes and/or transfers computer program PRG, PRG′ according to the specific embodiments. Data carrier signal DCS is receivable, for example, via an optional data interface 306 of device 300.
In further preferred specific embodiments of the present invention, computing device 302 may also include at least one computing unit 302b optimized, for example, for an evaluation or design of (trained) neural network NN, NN′ or for a training of a neural network NN, for example a graphics processor (GPU) and/or a tensor processor or the like.
In further preferred specific embodiments of the present invention, device 300 may also include a data interface 305 for receiving physical variables GB1, GB2, GB3 and/or input data ED or known input data ED′.
Further preferred specific embodiments of the present invention relate to a use 250 of the method according to the specific embodiments and/or of device 300 and/or of computer program PRG, PRG′ and/or of data carrier signal DCS for at least one of the following elements: a) carrying out at least one attack, in particular a side channel attack, on technical system 100; b) combining side channel information of technical system 100 assigned to different processing areas B1, B2, B3; c) training neural network NN; d) training neural network NN to combine side channel information of technical system 100 assigned to different processing areas.
In further preferred specific embodiments of the present invention, a method may be carried out, based on the following equation: k = argmax_{k*} Σ_{i=1}^{N_A} log P(g(k*) | O_i),
k corresponding to the (secret) key, k* corresponding to a key hypothesis, g( ) corresponding to a target operation, O_i corresponding to the physical variables or side channel information GB1, GB2, GB3, and N_A corresponding to a number of measurements of the physical variables for a side channel attack.
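The ranking expressed by the equation can be sketched as follows. The sketch makes illustrative assumptions not taken from the application: `probs` stands for the neural network's output probabilities over the targeted intermediate value, the target operation g is modeled as a hypothetical table lookup on plaintext XOR key hypothesis, and all names are chosen for illustration:

```python
import numpy as np

def rank_key_hypotheses(probs, plaintexts, g_table):
    """probs[i][z]: NN output probability for intermediate value z on
    measurement i (of N_A total). Each key hypothesis k* is scored by the
    summed log-likelihood of the hypothetical intermediate values
    g_table[plaintext ^ k*]; hypotheses are returned best-first."""
    n_a = len(plaintexts)                    # N_A measurements
    scores = np.empty(256)
    for k_star in range(256):
        z = g_table[plaintexts ^ k_star]     # hypothetical intermediate values
        scores[k_star] = np.log(probs[np.arange(n_a), z] + 1e-40).sum()
    return np.argsort(scores)[::-1]          # most likely hypothesis first

# Tiny synthetic check: if the NN assigns extra probability to the true
# intermediate values, the true key should be ranked first.
rng = np.random.default_rng(2)
g_table = rng.permutation(256)               # stand-in for a real target operation
true_k  = 0x2A
pts     = rng.integers(0, 256, 500)
probs   = np.full((500, 256), 1.0 / 256)
probs[np.arange(500), g_table[pts ^ true_k]] += 0.05
probs /= probs.sum(axis=1, keepdims=True)
ranking = rank_key_hypotheses(probs, pts, g_table)
```

The correct key accumulates the largest log-likelihood because only its hypothesis consistently selects the intermediate values the network assigns high probability to.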
In further preferred specific embodiments of the present invention, neural network NN may also be referred to as a “multi-input” DNN, i.e., as a deep neural network having multiple input variables, since the at least two physical variables GB1, GB2, GB3 are at least temporarily supplied thereto as input variables according to further preferred specific embodiments.
In further preferred specific embodiments of the present invention, input layer IL of neural network NN is designed to at least temporarily receive the at least two physical variables GB1, GB2, GB3 as input variables.