The disclosure relates in general to a neural network (NN) processing method, and a server and an electrical device therefor.
With the development of technology, more and more attention has been paid to applications of AI (artificial intelligence). However, most frameworks for AI only support CPU and GPU hardware environments. An AI compiler (neural network compiler) enables neural network (NN) models to be executed on different types of hardware, such as mobile phones, embedded system devices, low-power special-purpose chips, and so on.
The AI compiler can be implemented in cloud services. Model developers only need to upload the NN model, and then the AI compiler in the cloud can optimize, benchmark, and package the model for different hardware platforms. The compiled NN model can be transmitted to an AI SoC (System on a Chip) or an AI dongle (such as a USB AI dongle) for execution, or be executed on other hardware platforms.
However, this approach of implementing the AI compiler in cloud services leaves the NN model without privacy protection. When NN models developed with different frameworks are transmitted to the cloud for compilation, the NN models may be exposed to others. Well-developed NN models are often important assets of a company; if an NN model is disclosed to others, it may cause significant loss to the company. Therefore, how to properly protect NN models developed by model developers when the models are uploaded to the cloud has become a prominent task for the industry.
According to one embodiment, a neural network (NN) processing method is provided. The method includes the following steps. An AI (artificial intelligence) compiler code of an AI compiler is transformed to a garbled circuit code by: sending a circuit graph of a garbled circuit corresponding to the garbled circuit code to a first electrical device by a server, the garbled circuit having a number of logic gates; creating a number of key codebooks for a number of candidate gates corresponding to each logic gate by the first electrical device; generating a number of garbled truth tables for the candidate gates corresponding to each logic gate by the first electrical device; transmitting the garbled truth tables for the candidate gates corresponding to each logic gate to the server by the first electrical device by using an OT (Oblivious Transfer) protocol; and obtaining a target garbled truth table of each logic gate by using the OT protocol based on the garbled truth tables for the candidate gates corresponding to each logic gate by the server. The neural network processing method further includes the steps of encrypting an NN model according to the key codebooks by the first electrical device and generating a compiled NN model of an encrypted NN model according to the garbled circuit code with the target garbled truth table of each logic gate by the server.
According to another embodiment, a server for processing a neural network is provided. The server includes a transmission circuit and a processor. The processor is configured to transform an AI compiler code of an AI compiler to a garbled circuit code by performing the following procedures: sending a circuit graph of a garbled circuit corresponding to the garbled circuit code to a first electrical device through the transmission circuit, the garbled circuit having a number of logic gates; receiving a number of garbled truth tables for a number of candidate gates corresponding to each logic gate from the first electrical device by using an OT protocol via the transmission circuit; and obtaining a target garbled truth table of each logic gate by using the OT protocol based on the garbled truth tables for the candidate gates corresponding to each logic gate. A number of key codebooks for the candidate gates corresponding to each logic gate are created by the first electrical device, an NN model is encrypted according to the key codebooks by the first electrical device, and the processor is further configured to generate a compiled NN model of an encrypted NN model according to the garbled circuit code with the target garbled truth table of each logic gate.
According to an alternative embodiment, an electrical device for processing a neural network is provided. The electrical device includes a transmission circuit and a processor. The processor is configured to assist a server in transforming an AI compiler code of an AI compiler to a garbled circuit code by performing the following procedures: receiving a circuit graph of a garbled circuit corresponding to the garbled circuit code from the server through the transmission circuit, the garbled circuit having a number of logic gates; creating a number of key codebooks for a number of candidate gates corresponding to each logic gate; generating a number of garbled truth tables for the candidate gates corresponding to each logic gate; and transmitting the garbled truth tables for the candidate gates corresponding to each logic gate to the server by using an OT protocol via the transmission circuit. The processor is further configured to encrypt an NN model according to the key codebooks, a target garbled truth table of each logic gate is obtained by using the OT protocol based on the garbled truth tables for the candidate gates corresponding to each logic gate by the server, and a compiled NN model of an encrypted NN model is generated according to the garbled circuit code with the target garbled truth table of each logic gate by the server.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Referring to
Afterward, the method proceeds to step 112, in which an NN model is encrypted according to the key codebooks by the electrical device 204. Then, step 114 is performed, in which a compiled NN model of the encrypted NN model is generated according to the garbled circuit code with the target garbled truth table of each logic gate by the server 202.
The electrical device 204 operates as, for example, a client end in the system 200. The server 202 is, for example, a cloud server. By encrypting the NN model according to the key codebooks and transmitting the garbled truth tables by using the OT protocol, the content of the NN model provided by the electrical device 204 (client end) will not be released to the server 202 (cloud server), and the privacy of the NN model is protected. In addition, by sending a circuit graph Gc instead of the garbled circuit, using the garbled circuit code, and transmitting the garbled truth tables by using the OT protocol, the content of the AI compiler of the server 202 (cloud server) will not be released to the electrical device 204 (client end). Therefore, the privacy of the AI compiler is also protected. The neural network (NN) processing method according to the embodiment of the disclosure will be described further below.
In cryptography, the oblivious transfer (OT) protocol is a protocol in which a sender transfers one of many pieces of information to a receiver, but the sender remains oblivious as to which piece has been transferred. The first form of oblivious transfer was introduced in 1981 by Michael O. Rabin (Michael O. Rabin, "How to exchange secrets with oblivious transfer," Technical Report TR-81, Aiken Computation Laboratory, Harvard University, 1981). A more useful form of oblivious transfer, called 1-2 oblivious transfer or "1 out of 2 oblivious transfer," was developed later by Shimon Even, Oded Goldreich, and Abraham Lempel in 1985 (S. Even, O. Goldreich, and A. Lempel, "A Randomized Protocol for Signing Contracts," Communications of the ACM, Volume 28, Issue 6, pp. 637-647, 1985). It was later generalized to "1 out of n oblivious transfer," in which the receiver gets exactly one element without the sender learning which element was queried, and without the receiver learning anything about the other elements that were not retrieved.
Referring to
Referring to
Assume Z_q is a group of order q; that is, Z_q represents the set of elements g^m mod q, that is, Z_q = {0, 1, 2, . . . , q−1}. A generator "g" is known to both user A and user B. User A randomly chooses an element of Z_q, which is denoted "c". User A transmits the element "c" to user B. User B randomly chooses an element of Z_q, which is denoted "k". User B chooses a bit value b, where b belongs to the set {0, 1}. User B also sets z_b = g^k and z_(1−b) = c/g^k (that is, when the value "0" is chosen for b, z0 = g^k and z1 = c/g^k; when the value "1" is chosen for b, z1 = g^k and z0 = c/g^k). Then user B sends z_b and z_(1−b) to user A.
User A randomly chooses values "r0" and "r1", and generates the values g^(r0) and g^(r1). User A has two pieces of information "x0" and "x1", and user A encrypts "x0" and "x1" with the values "r0" and "r1"; for example, user A generates the values H(z0^(r0)) ⊕ x0 and H(z1^(r1)) ⊕ x1. User A then sends data C0 and C1 to user B. Data C0 and C1 are defined in Equation 1:
C0 = [g^(r0), H(z0^(r0)) ⊕ x0]
C1 = [g^(r1), H(z1^(r1)) ⊕ x1]   (Equation 1)
H is a hash function that maps data of arbitrary length to data of the same length as x0 and x1. The operator "⊕" represents the bit-wise exclusive OR (XOR) operation.
After user B receives data C0 and C1, user B decrypts C_b = [v1, v2] by computing H(v1^k) ⊕ v2. Take b = 0 for example. When b = 0, z0 = g^k and z1 = c/g^k. Then z0^(r0) = (g^k)^(r0), z1^(r1) = (c/g^k)^(r1), and C_b = C0 = [v1, v2] = [g^(r0), H(z0^(r0)) ⊕ x0]. Therefore, H(v1^k) ⊕ v2 = H((g^(r0))^k) ⊕ H(z0^(r0)) ⊕ x0 = H((g^(r0))^k) ⊕ H((g^k)^(r0)) ⊕ x0 = x0. However, for C1 = [g^(r1), H(z1^(r1)) ⊕ x1], the value H(v1^k) ⊕ v2 equals H((g^(r1))^k) ⊕ H(z1^(r1)) ⊕ x1 = H((g^(r1))^k) ⊕ H((c/g^k)^(r1)) ⊕ x1. Since c^(r1) is unknown, the value of "x1" cannot be obtained. As a result, user B can obtain the information "x0" without learning the information "x1", and user A does not know the value of b chosen by user B. That is, user B obtains only one of x0 and x1, and user A does not know which one of x0 and x1 user B obtained.
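For illustration, the 1-out-of-2 exchange described above can be sketched in Python as follows. This is a minimal sketch under stated assumptions: a toy prime modulus and generator, SHA-256 standing in for the hash H, and equal-length messages; a real implementation would need vetted group parameters and further checks.

```python
# Minimal 1-out-of-2 oblivious transfer sketch (illustrative parameters only).
import hashlib
import secrets

p = 2**127 - 1          # illustrative prime modulus (not cryptographically vetted)
g = 3                   # illustrative generator

def H(value: int, length: int) -> bytes:
    """Hash an integer to `length` bytes (the length of x0/x1)."""
    data = value.to_bytes((value.bit_length() + 7) // 8 or 1, "big")
    return hashlib.sha256(data).digest()[:length]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# --- Sender (user A) holds two equal-length messages x0, x1 -------------------
x0, x1 = b"msg-zero________", b"msg-one_________"
c = pow(g, secrets.randbelow(p - 1) + 1, p)          # random group element c

# --- Receiver (user B) holds choice bit b --------------------------------------
b = 1
k = secrets.randbelow(p - 1) + 1
gk = pow(g, k, p)
z = [0, 0]
z[b] = gk                                            # z_b = g^k
z[1 - b] = (c * pow(gk, -1, p)) % p                  # z_(1-b) = c / g^k

# --- Sender encrypts both messages ---------------------------------------------
r0, r1 = secrets.randbelow(p - 1) + 1, secrets.randbelow(p - 1) + 1
C0 = (pow(g, r0, p), xor(H(pow(z[0], r0, p), len(x0)), x0))
C1 = (pow(g, r1, p), xor(H(pow(z[1], r1, p), len(x1)), x1))

# --- Receiver decrypts only C_b -------------------------------------------------
v1, v2 = (C0, C1)[b]
recovered = xor(H(pow(v1, k, p), len(v2)), v2)
assert recovered == (x0, x1)[b]
print("receiver recovered:", recovered)
```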
Referring to
Assume Z_q is a group of order q; that is, Z_q represents the set of elements g^m mod q, that is, Z_q = {0, 1, 2, . . . , q−1}. Generators "g" and "h" of Z_q are known to both user A and user B. User B chooses a value a, where a belongs to the set {1, . . . , n} and n is an integer. User B randomly chooses an element of Z_q, which is denoted "r", and user B transmits the value y = g^r·h^a to user A. User A randomly chooses n elements of Z_q, which are denoted k_1, k_2, . . . , k_n. User A has the information (x_1, x_2, . . . , x_n). Then user A transmits {c_i = (g^(k_i), x_i·(y/h^i)^(k_i)) : i = 1, . . . , n} to user B. That is, user A transmits c_1 = (g^(k_1), x_1·(y/h^1)^(k_1)), c_2 = (g^(k_2), x_2·(y/h^2)^(k_2)), . . . , c_n = (g^(k_n), x_n·(y/h^n)^(k_n)) to user B. User B takes c_a = (v, w) = (g^(k_a), x_a·(y/h^a)^(k_a)) and computes x_a = w/v^r. For example, when user B chooses a = 2, x_2 = w/v^r = x_2·(y/h^2)^(k_2)/(g^(k_2))^r = x_2·(g^r·h^2/h^2)^(k_2)/(g^(k_2))^r = x_2. As a result, user B can obtain the information "x_2" without learning the information "x_1" and "x_3" to "x_n", and user A does not know the value of a chosen by user B. That is, user B obtains only one of x_1 to x_n, and user A does not know which one of x_1 to x_n user B obtained.
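The 1-out-of-n variant can be sketched in a similar way. The prime modulus, the pair of generators, and the encoding of the messages as small integers are illustrative assumptions, not parameters taken from the disclosure.

```python
# Minimal 1-out-of-n oblivious transfer sketch (illustrative parameters only).
import secrets

p = 2**127 - 1           # illustrative prime modulus
g, h = 3, 5              # illustrative generators; the discrete log of h w.r.t. g is assumed unknown

n = 4
x = [None, 11, 22, 33, 44]        # messages x_1..x_n as small integers (index 0 unused)

# --- Receiver (user B): choice a and blinding exponent r ------------------------
a = 2
r = secrets.randbelow(p - 1) + 1
y = (pow(g, r, p) * pow(h, a, p)) % p

# --- Sender (user A): one ciphertext c_i per message ----------------------------
c = [None]
for i in range(1, n + 1):
    k_i = secrets.randbelow(p - 1) + 1
    base = (y * pow(pow(h, i, p), -1, p)) % p      # y / h^i
    c.append((pow(g, k_i, p), (x[i] * pow(base, k_i, p)) % p))

# --- Receiver: decrypt only c_a --------------------------------------------------
v, w = c[a]
x_a = (w * pow(pow(v, r, p), -1, p)) % p           # x_a = w / v^r
assert x_a == x[a]
print("receiver recovered x_a =", x_a)
```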
Referring to
As shown in
Then, user A encrypts each row of the truth table of AND gate 504 by encrypting the output-wire keys K0z and K1z with the corresponding pair of input-wire keys, as shown in
User A encrypts the output-wire key K0z with the corresponding pair of input-wire keys K0x and K1y, corresponding to the second row of the truth table, which shows the relation that z=0 when x=0 and y=1, to generate the content EK0x(EK1y(K0z)) for the second row of the encrypted truth table. The function EK1y(K0z) represents encrypting the output-wire key K0z by using the input-wire key K1y, and the function EK0x(EK1y(K0z)) represents encrypting the value of EK1y(K0z) by using the input-wire key K0x. The output-wire key K0z can be decrypted by using the input-wire keys K1y and K0x.
User A encrypts the output-wire key K0z with the corresponding pair of input-wire keys K1x and K0y, corresponding to the third row of the truth table, which shows the relation that z=0 when x=1 and y=0, to generate the content EK1x(EK0y(K0z)) for the third row of the encrypted truth table. The function EK0y(K0z) represents encrypting the output-wire key K0z by using the input-wire key K0y, and the function EK1x(EK0y(K0z)) represents encrypting the value of EK0y(K0z) by using the input-wire key K1x. The output-wire key K0z can be decrypted by using the input-wire keys K0y and K1x.
Similarly, user A encrypts the output-wire key K1z with the corresponding pair of input-wire keys K1x and K1y, corresponding to the fourth row of the truth table, which shows the relation that z=1 when x=1 and y=1, to generate the content EK1x(EK1y(K1z)) for the fourth row of the encrypted truth table. The function EK1y(K1z) represents encrypting the output-wire key K1z by using the input-wire key K1y, and the function EK1x(EK1y(K1z)) represents encrypting the value of EK1y(K1z) by using the input-wire key K1x. The output-wire key K1z can be decrypted by using the input-wire keys K1y and K1x.
After the encrypted truth table for the AND gate 504 is generated, each row of the encrypted truth table may be listed in a random order for additional protection. The encrypted truth table is taken as a garbled truth table, which user A sends to user B. User A may perform a similar process to that described above to generate the encrypted truth table (garbled truth table) for the other gates in the Boolean circuit in
The main steps of the garbled circuit protocol will be described below with one example. In main step 1, when user A's bit value is 1, user A simply sends the input-wire key K1x to user B. When user A's bit value is 0, user A simply sends the input-wire key K0x to user B. In main step 2, when user B's bit value is b, user B simply retrieves Kby from user A by using the OT protocol. That is, user A offers the input-wire keys K0y and K1y through the OT protocol, and user B only retrieves K0y from user A when user B's bit value is 0, and user B only retrieves K1y from user A when user B's bit value is 1.
In main step 3, assuming user A's bit value is 1 and user B's bit value is 0, user B can use the input-wire keys K1x and K0y to compute K0z based on the garbled truth table which user A has sent to user B. Since user B only has the input-wire keys K1x and K0y, user B cannot decrypt K0z from the content EK0x(EK0y(K0z)) in the first row of the garbled truth table in
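A minimal sketch of garbling and evaluating a single AND gate is given below. For brevity it collapses the nested encryption EKx(EKy(·)) into a single hash-based one-time pad and appends an all-zero tag so the evaluator can recognize the one row it can decrypt; both choices are illustrative assumptions rather than the encryption scheme of the disclosure.

```python
# Garble one AND gate and evaluate it with a single pair of input-wire keys.
import hashlib
import os
import random

KEYLEN, TAG = 16, bytes(16)

def enc(kx: bytes, ky: bytes, kz: bytes) -> bytes:
    """Encrypt output-wire key kz under the pair of input-wire keys (kx, ky)."""
    pad = hashlib.sha256(kx + ky).digest()           # 32-byte one-time pad
    return bytes(a ^ b for a, b in zip(pad, kz + TAG))

# --- Garbler (user A): two keys per wire, one encrypted row per truth-table row --
keys = {w: {0: os.urandom(KEYLEN), 1: os.urandom(KEYLEN)} for w in "xyz"}
garbled_table = [enc(keys["x"][x], keys["y"][y], keys["z"][x & y])
                 for x in (0, 1) for y in (0, 1)]
random.shuffle(garbled_table)                         # list rows in a random order

# --- Evaluator (user B): holds exactly one key per input wire --------------------
kx, ky = keys["x"][1], keys["y"][0]                   # e.g. x = 1, y = 0
pad = hashlib.sha256(kx + ky).digest()
for row in garbled_table:
    plain = bytes(a ^ b for a, b in zip(pad, row))
    if plain.endswith(TAG):                           # only one row decrypts cleanly
        kz = plain[:KEYLEN]
        break
assert kz == keys["z"][1 & 0]                         # recovered key encodes z = 0
print("evaluator recovered the output-wire key for z = 0")
```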
After user B finish the computation for AND gate 504, user B may further proceed with the computation for other gates, for example, OR gate 512, OR gate 514, AND gate 516, NOT gate 518, and AND gate 520 as shown in
Referring to
Then, the circuit code is converted into a garbled circuit code by the garbled circuit protocol described above. The garbled circuit code can simulate the function of a garbled circuit.
In step 102 of
In step 104, a number of key codebooks for a number of candidate gates corresponding to each logic gate are created by the electrical device 204. The candidate gates include at least one 1-input candidate gate and at least one 2-input candidate gate. The garbled truth tables include 1-input garbled truth tables and 2-input garbled truth tables. For example, the at least one 1-input candidate gate includes a buffer gate and a NOT gate, that is, a buffer candidate gate and a NOT candidate gate. The at least one 2-input candidate gate includes an AND gate, an OR gate, a NAND gate, a NOR gate, an XOR gate, and an XNOR gate, that is, an AND candidate gate, an OR candidate gate, a NAND candidate gate, a NOR candidate gate, an XOR candidate gate, and an XNOR candidate gate.
Referring to
Take the logic gate 604(1) and graph gate 704(1) for example. Since graph gate 704(1) is a 2-input graph gate as shown in the circuit graph 700 of
In step 106, a number of garbled truth tables for the candidate gates corresponding to each logic gate are generated by the electrical device 204. Take the logic gate 604(1) and graph gate 704(1) for example. Since the electrical device 204 (client end) does not know the type of gate for graph gate 704(1) but knows that the graph gate 704(1) is a two-input graph gate, the electrical device 204 knows that the candidate gates for the graph gate 704(1) include an AND candidate gate, an OR candidate gate, a NAND candidate gate, a NOR candidate gate, an XOR candidate gate, and an XNOR candidate gate. Therefore, the electrical device 204 generates the garbled truth tables for the AND candidate gate, the OR candidate gate, the NAND candidate gate, the NOR candidate gate, the XOR candidate gate, and the XNOR candidate gate corresponding to the logic gate 604(1). Furthermore, take the logic gate 604(5) and graph gate 704(5) for example. Since the electrical device 204 does not know the type of gate for graph gate 704(5) but knows that the graph gate 704(5) is a one-input graph gate, the electrical device 204 knows that the candidate gates for the graph gate 704(5) include a buffer candidate gate and a NOT candidate gate. Therefore, the electrical device 204 generates the garbled truth tables for the buffer candidate gate and the NOT candidate gate corresponding to the logic gate 604(5).
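Steps 104 and 106 can be sketched from the client side as follows: one key codebook (two random keys per wire) is created for a 2-input graph gate, and one garbled truth table is generated for each of the six 2-input candidate gates. The hash-based row encryption and the 16-byte keys are the same illustrative assumptions as in the earlier sketch.

```python
# Create a key codebook and garbled truth tables for all six 2-input candidate gates.
import hashlib
import os
import random

KEYLEN, TAG = 16, bytes(16)

CANDIDATE_GATES_2IN = {              # the six 2-input candidate gate types
    "AND":  lambda x, y: x & y,
    "OR":   lambda x, y: x | y,
    "NAND": lambda x, y: 1 - (x & y),
    "NOR":  lambda x, y: 1 - (x | y),
    "XOR":  lambda x, y: x ^ y,
    "XNOR": lambda x, y: 1 - (x ^ y),
}

def make_codebook():
    """One key codebook for a 2-input gate: two random keys per wire x, y, z."""
    return {w: {0: os.urandom(KEYLEN), 1: os.urandom(KEYLEN)} for w in "xyz"}

def garble(codebook, fn):
    """Garbled truth table for one candidate gate with truth function `fn`."""
    rows = []
    for x in (0, 1):
        for y in (0, 1):
            pad = hashlib.sha256(codebook["x"][x] + codebook["y"][y]).digest()
            rows.append(bytes(a ^ b for a, b in zip(pad, codebook["z"][fn(x, y)] + TAG)))
    random.shuffle(rows)
    return rows

codebook = make_codebook()                                    # step 104
garbled_tables = {name: garble(codebook, fn)                  # step 106
                  for name, fn in CANDIDATE_GATES_2IN.items()}
print("generated", len(garbled_tables), "candidate garbled truth tables")
```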
In step 108, the garbled truth tables for the candidate gates corresponding to each logic gate are transmitted to the server 202 by the electrical device 204 by using the OT protocol. The circuit graph 700 of the garbled circuit 600 is, for example, a numbered circuit graph. The numbered circuit graph 700 has its graph gates assigned numbers 1 to N without indicating the types of the graph gates. For example, the graph gates 704(1) to 704(6) of the graph unit 702(1) are sequentially numbered as 1 to 6. Similarly, the graph gates of the graph units 702(2) to 702(T) are numbered from 7 to N.
The process of step 108 may include sending a request R(i) corresponding to an i-th logic gate of the logic gates to the electrical device 204 by the server 202, wherein i is an integer between 1 and N; and transmitting the garbled truth table X1(i) of a first type candidate gate, the garbled truth table X2(i) of a second type candidate gate, . . . , and the garbled truth table XM(i) of an M-th type candidate gate corresponding to the i-th logic gate to the server 202 in response to the request R(i) by the electrical device 204. When the i-th logic gate is a two-input-wire logic gate, R(i) is set to ai (ai belongs to the set {1, 2, . . . , 6}) to correspond to the i-th logic gate, where the i-th logic gate belongs to the set {AND logic gate, OR logic gate, NAND logic gate, NOR logic gate, XOR logic gate, XNOR logic gate}. When the i-th logic gate is a one-input-wire logic gate, R(i) is set to bi (bi belongs to the set {0, 1}) to correspond to the i-th logic gate, where the i-th logic gate belongs to the set {buffer logic gate, NOT logic gate}.
For example, when i=1, the server 202 sends a request R(1) corresponding to the logic gate 604(1) to the electrical device 204. Then, the electrical device 204 transmits the garbled truth table X1(1) of the AND candidate gate, the garbled truth table X2(1) of the OR candidate gate, the garbled truth table X3(1) of the NAND candidate gate, the garbled truth table X4(1) of the NOR candidate gate, the garbled truth table X5(1) of the XOR candidate gate, and the garbled truth table X6(1) of the XNOR candidate gate corresponding to the logic gate 604(1) to the server 202 in response to the request R(1). Since the logic gate 604(1) is a two-input-wire logic gate, the value of M equals 6.
Furthermore, when i=5, the server 202 sends a request R(5) corresponding to the logic gate 604(5) to the electrical device 204. Then, the electrical device 204 transmits the garbled truth table X1(5) of the buffer candidate gate and the garbled truth table X2(5) of the NOT candidate gate corresponding to the logic gate 604(5) to the server 202 in response to the request R(5). Since the logic gate 604(5) is a one-input-wire logic gate, the value of M equals 2.
In step 110, a target garbled truth table of each logic gate is obtained by the server 202 by using the OT protocol based on the garbled truth tables for the candidate gates corresponding to each logic gate. The process of step 110 may include obtaining the target garbled truth table of the i-th logic gate, which corresponds to the garbled truth table of a j-th type candidate gate, by the server through a decrypting process using the OT protocol, wherein the j-th type candidate gate has the same gate type as the i-th logic gate, and j is an integer between 1 and M.
Take i=1 for example. The server 202 obtains the target garbled truth table TG(1) of the logic gate 604(1) by using the OT protocol based on the garbled truth tables X1(1) to X6(1) of the candidate gates corresponding to the logic gate 604(1). Specifically, the server 202 obtains, through a decrypting process using the OT protocol, the target garbled truth table TG(1) of the logic gate 604(1), which corresponds to the garbled truth table of the first type candidate gate (i.e., the garbled truth table X1(1) of the AND candidate gate). The first type candidate gate (i.e., the AND candidate gate of X1(1)) has the same gate type as the logic gate 604(1).
Take i=5 for example. The server 202 obtains the target garbled truth table TG(5) of the logic gate 604(5) by using the OT protocol based on the garbled truth tables X1(5) and X2(5) of the candidate gates corresponding to the logic gate 604(5). Specifically, the server 202 obtains, through a decrypting process using the OT protocol, the target garbled truth table TG(5) of the logic gate 604(5), which corresponds to the garbled truth table of the second type candidate gate (i.e., the garbled truth table X2(5) of the NOT candidate gate). The second type candidate gate (i.e., the NOT candidate gate of X2(5)) has the same gate type as the logic gate 604(5).
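The data flow of steps 108 and 110 can be pictured with the toy sketch below, in which each garbled truth table is treated as an opaque byte string and the dictionary names are hypothetical. The 1-out-of-M oblivious selection is shown as a plain index lookup purely for readability; in the scheme described above that selection is carried out with the OT protocol, so the electrical device 204 never learns which candidate table the server 202 retrieves.

```python
# Toy data flow for steps 108 and 110 (the oblivious selection itself is abstracted away).
import os

TWO_INPUT_TYPES = ["AND", "OR", "NAND", "NOR", "XOR", "XNOR"]
ONE_INPUT_TYPES = ["BUFFER", "NOT"]

# Client side: candidate garbled truth tables X1(i)..XM(i) per graph gate (opaque bytes here).
client_tables = {
    1: {t: os.urandom(64) for t in TWO_INPUT_TYPES},   # graph gate 1 is a 2-input gate
    5: {t: os.urandom(64) for t in ONE_INPUT_TYPES},   # graph gate 5 is a 1-input gate
}

# Server side: the secret gate types of its garbled circuit code.
server_gate_types = {1: "AND", 5: "NOT"}

target_tables = {}
for i, secret_type in server_gate_types.items():
    candidates = client_tables[i]          # client's response to request R(i)
    # In the real protocol this lookup is a 1-out-of-len(candidates) oblivious transfer.
    target_tables[i] = candidates[secret_type]
print("obtained target garbled truth tables for gates:", sorted(target_tables))
```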
The above steps 102 to 110 may be regarded as performing a setup procedure for setting up the AI compiler in the server 202. Steps 112 to 114 may be regarded as performing an execution procedure for encrypting and decrypting an NN model sent by the client end (i.e., the electrical device 204).
In step 112, an NN model is encrypted according to the key codebooks by the electrical device 204. The process of step 112 may include transforming the NN model to binary values and transforming the binary values to key-type original model values according to the key codebooks by the electrical device 204. The key-type original model values are sent to the server 202. For example, the electrical device 204 encrypts an NN model which is pre-trained by a model developer with trained parameters or weights according to the key codebooks by transforming the NN model to binary values (for example, binary values (bI0 bI1 bI2 . . . bIS), S is an integer) firstly and transforming the binary values to key-type original model values (for example, key-type original model values (KI1 KI2 KI3 . . . KIS), S is an integer) according to the key codebooks. The key-type original model values (KI1 KI2 KI3 . . . KIS) are, for example, chosen from input-wire keys corresponding to the input ends of circuit graph 700. The input ends of circuit graph 700 may include the input ends of the first level graph gates of each circuit unit, for example, the input end of the graph gates numbered as 1, 2, 3, 7, 8, 9, 10, . . . , N−5, N−4, N−3, as shown in
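Step 112 can be sketched as follows, assuming the pre-trained weights have already been quantized to bytes and each bit is replaced by the matching input-wire key from the key codebooks. The key length and the toy weight bytes are illustrative assumptions.

```python
# Transform a toy "NN model" to binary values and then to key-type original model values.
import os

KEYLEN = 16
S = 24                                                   # number of model bits in this toy

# Key codebook for the S input wires of the circuit graph: two keys per wire.
codebook = [{0: os.urandom(KEYLEN), 1: os.urandom(KEYLEN)} for _ in range(S)]

model_bytes = bytes([0b10110010, 0b01111000, 0b11001100])   # toy quantized weights

# Transform the NN model to binary values (bI1 .. bIS).
bits = [(byte >> (7 - j)) & 1 for byte in model_bytes for j in range(8)]

# Transform the binary values to key-type original model values (KI1 .. KIS).
key_type_values = [codebook[i][bit] for i, bit in enumerate(bits)]

print(len(key_type_values), "key-type original model values ready to send to the server")
```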
In step 114, a compiled NN model of the encrypted NN model is generated according to the garbled circuit code with the target garbled truth table of each logic gate by the server 202. The process of step 114 may include performing the garbled circuit code based on the key-type original model values to generate key-type compiled model values of the compiled NN model. That is, since the AI compiler in the server 202 has been transformed to the garbled circuit code corresponding to the garbled circuit 600 and the target garbled truth table of each logic gate has been obtained, the server 202 can generate the compiled NN model by evaluating the garbled circuit code with the inputted key-type original model values (KI1 KI2 KI3 . . . KIS).
The process of evaluating the garbled circuit code can be explained with the garbled circuit 600. Take the circuit unit 602(1) for example. The key-type input values KI1 and KI2 are inputted to the AND logic gate 604(1). The AND logic gate 604(1) uses the target garbled truth table TG(1) (i.e. the garbled truth table of the AND candidate gate for the graph gate 704(1)) of logic gate 604(1) to get a key-type output value Ka of the AND logic gate 604(1) with the key-type input value KI1 and KI2. Similarly, the NOR logic gate 604(2) uses the target garbled truth table TG(2) (i.e. the garbled truth table of the NOR candidate gate for the graph gate 704(2)) of logic gate 604(2) to get a key-type output value Kb of the NOR logic gate 604(2) with the key-type input values KI3 and KI4. The OR logic gate 604(3) uses the target garbled truth table TG(3) (i.e. the garbled truth table of the OR candidate gate for the graph gate 704(3)) of logic gate 604(3) to get a key-type output value Kc of the OR logic gate 604(3) with the key-type input values KI5 and KI6. The AND logic gate 604(4) uses the target garbled truth table TG(4) (i.e. the garbled truth table of the AND candidate gate for the graph gate 704(4)) of logic gate 604(4) to get a key-type output value Kd of the AND logic gate 604(4) with the key-type input values Ka and Kb. The NOT logic gate 604(5) uses the target garbled truth table TG(5) (i.e. the garbled truth table of the NOT candidate gate for the graph gate 704(5)) of logic gate 604(5) to get a key-type output value Ke of the NOT logic gate 604(5) with the key-type input value Kc. The AND logic gate 604(6) uses the target garbled truth table TG(6) (i.e. the garbled truth table of the AND candidate gate for the graph gate 704(6)) of AND logic gate 604(6) to get a key-type output value KO1 of the AND logic gate 604(6) with the key-type input values Kd and Ke.
The circuit units 602(2) to 602(T) perform a similar process to generate the key-type output values KO2 to KOT, respectively. The key-type compiled model values of the compiled NN model are then generated as (KO1 KO2 KO3 . . . KOT). The server 202 then sends the key-type compiled model values (KO1 KO2 KO3 . . . KOT) of the compiled NN model to another electrical device 214.
The electrical device 214 further decrypts the compiled NN model according to the key codebooks to generate a machine code and performs the machine code. That is, the electrical device 204 sends the key codebooks Key_cb to the electrical device 214, and then the electrical device 214 decrypts the compiled NN model in the form of the key-type compiled model values (KO1 KO2 . . . KOT) according to the key codebooks Key_cb to generate a machine code and performs the machine code.
For example, the electrical device 214 may include a transmission circuit 216, a processor 218, and an AI execution module 220. The transmission circuit 216 receives the key-type compiled model values (KO1 KO2 . . . KOT). The processor 218 decrypts the key-type compiled model values (KO1 KO2 . . . KOT) by using the key codebooks Key_cb to generate binary values (bO1 bO2 . . . bOT). The processor 218 further transforms the binary values (bO1 bO2 . . . bOT) to a machine code (or a deployable code) which can be performed by the AI execution module 220. The AI execution module 220 may perform the machine code through a runtime module.
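The decryption in the electrical device 214 can be sketched as follows, assuming the key codebooks Key_cb map each output wire's two possible keys back to the bits 0 and 1, and that the recovered bits are simply packed into bytes to stand in for the deployable machine code; both are illustrative assumptions.

```python
# Map key-type compiled model values back to bits and pack them into bytes.
import os

KEYLEN = 16
T = 16                                                    # number of output wires in this toy

# Output-wire key codebooks (normally created by the electrical device 204).
out_codebook = [{0: os.urandom(KEYLEN), 1: os.urandom(KEYLEN)} for _ in range(T)]

# Key-type compiled model values (KO1 .. KOT) received from the server 202.
expected_bits = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0]
compiled_values = [out_codebook[i][b] for i, b in enumerate(expected_bits)]

# Decrypt: look each received key up in the codebook to recover the bits (bO1 .. bOT).
reverse = [{keys[0]: 0, keys[1]: 1} for keys in out_codebook]
bits = [reverse[i][k] for i, k in enumerate(compiled_values)]
assert bits == expected_bits

# Pack the bits into bytes standing in for the deployable machine code.
machine_code = bytes(
    sum(bit << (7 - j) for j, bit in enumerate(bits[i:i + 8])) for i in range(0, T, 8)
)
print("recovered machine code bytes:", machine_code.hex())
```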
Referring to
The electrical device 204 creates a number of key codebooks for the candidate gates corresponding to each logic gate, and an NN model is encrypted according to the key codebooks by the electrical device 204. The processor 208 is further configured to generate a compiled NN model of an encrypted NN model according to the garbled circuit code with the target garbled truth table of each logic gate.
Referring to
The processor 212 is further configured to encrypt an NN model according to the key codebooks. The server 202 obtains a target garbled truth table of each logic gate through using OT protocol based on the garbled truth tables for the candidate gates corresponding to each logic gate. The server 202 generates a compiled NN model of an encrypted NN model according to the garbled circuit code with the target garbled truth table of each logic gate.
Although the embodiment in
By encrypting the NN model according to the key codebooks and transmitting the garbled truth tables by using the OT protocol, the content of the NN model provided by the model provider (client end) will not be released to the cloud server, and the privacy of the NN model is protected. In addition, by sending a circuit graph instead of the garbled circuit, using the garbled circuit code, and transmitting the garbled truth tables by using the OT protocol, the content of the AI compiler of the cloud server will not be released to the client end. Therefore, the privacy of the AI compiler is also protected.
This disclosure provides a secure AI compiler (neural network compiler, deep learning compiler) that can complete NN compilation, without knowing the pre-trained model, model parameters, and weights, to generate a compiled, optimized, and encrypted file; the client end can then decrypt the encrypted file to generate the low-level machine code which can be executed on the hardware. The disclosure can achieve the function of protecting the NN model from being decoded by the compiler. The embodiment of the disclosure can increase the model privacy protection for the cloud AI compiler service. By using the oblivious transfer protocol and garbled circuit (garbled gates) technology, the privacy protection mechanism for NN model compilation is achieved. Through the embodiment of this disclosure, model developers can protect their NN models, and the user can execute the compiled model code by obtaining the decryption information (for example, the key codebooks) from the model developers.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.