
I read the paper *CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy* by Microsoft Research. In the paper, they combine machine learning with homomorphic encryption, making it possible to apply a neural network to encrypted data and produce encrypted predictions.
I think the main contribution of this paper is that they use a suitable homomorphic encryption scheme, with appropriate optimizations, to convert neural networks into CryptoNets. Some ideas in the transformation are worth borrowing. According to their method, constructing a CryptoNet takes two main steps.
1. Create neural networks using basic operations.
We want CryptoNets to be applicable to encrypted data; that is to say, the encrypted data must support homomorphic computation. So the main problem is to transform the functions in a neural network into low-degree polynomials, which can be evaluated on the ciphertexts by simple boolean circuits. Although a neural network is of considerable complexity, it can actually be broken down into small component operations. Thus, we can evaluate simple operations on the encrypted data and finally accomplish the neural network's prediction.
In the paper, the prediction nodes of the neural network use five common functions; apart from the nonlinear activation functions, the statistical functions can be broken down into basic additions and multiplications:
Statistical function:
$$
\begin{array}{|l|c|} \hline
\text{Weighted Sum} & +,\ \times \\ \hline
\text{Max Pooling} & \text{Sorting Circuit} \\ \hline
\text{Mean Pooling} & +,\ \times \\ \hline
\end{array}
$$
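As a toy illustration (plain Python on unencrypted values, not the paper's code), the first and third rows of the table reduce to the two ring operations a homomorphic scheme evaluates natively:

```python
# Sketch: weighted sum and scaled mean pooling written with only + and *,
# the two operations a homomorphic encryption scheme supports natively.
# Plaintext numbers stand in for ciphertexts here.

def weighted_sum(xs, ws):
    # sum_i w_i * x_i : additions and multiplications only
    acc = 0
    for x, w in zip(xs, ws):
        acc = acc + x * w
    return acc

def scaled_mean_pool(xs):
    # Sum instead of mean: the division by len(xs) cannot be done
    # homomorphically, so it is deferred (e.g. folded into the next
    # layer's weights), leaving a scalar multiple of the mean.
    acc = 0
    for x in xs:
        acc = acc + x
    return acc
```

Only the structure matters: every step is an addition or a multiplication, so each could equally be applied to ciphertexts.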
For max pooling, the sorting circuit needs depth at least 5, so it requires larger parameters. I calculated this for the GSW encryption scheme with security parameter $k=80$ and circuit depth $L=5$; valid parameters for the scheme are the following:
$$
\begin{array}{|l|c|} \hline
q & 113603506650998636563 \\ \hline
n & 100 \\ \hline
l & 67 \\ \hline
m & 6663 \\ \hline
\end{array}
$$
In the paper, they consider an equivalent method due to the relation
$$\max(x_1,\cdots,x_n) = \lim_{d\rightarrow \infty }\Big(\sum_i x_i^d\Big)^{1/d}.$$
When $d=1$ it returns a scalar multiple of the mean-pooling function. They use this scaled mean-pool function in place of the max-pool function, so max pooling can also be broken down into additions and multiplications. Since the homomorphic encryption scheme supports only additions and multiplications, the main difficulty is the activation functions: sigmoid and rectified linear. As with max pooling, the paper substitutes a polynomial: the sigmoid function $\mathrm{sigm}(z)=\frac{1}{1+\exp(-z)}$ is replaced by the lowest-degree non-linear polynomial, $\mathrm{sqr}(z) = z^2$.
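As a quick numerical sanity check (a sketch in plain Python on cleartext values, not from the paper), the $d$-norm approaches the max as $d$ grows, and the square activation is a single multiplication:

```python
# Sketch: the d-norm approximation to max, and the square activation
# used in place of sigmoid. On ciphertexts only the polynomial parts
# (sums and powers) would be evaluable; the 1/d root is for checking
# the approximation on cleartext.

def dnorm_max(xs, d):
    return sum(x ** d for x in xs) ** (1.0 / d)

def sqr(z):
    # lowest-degree non-linear polynomial activation
    return z * z

xs = [0.5, 1.0, 3.0]
# As d grows, dnorm_max(xs, d) approaches max(xs) = 3.0
approx = dnorm_max(xs, 20)
```

With small $d$ the approximation is rough but already dominated by the largest entry, which is why the accuracy loss in the network turns out to be small.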
Neither the substitute for max pooling nor the one for sigmoid is exact; however, the loss seems to be negligible in the neural network. For a more precise fit, a Taylor series expansion is a good choice, such as:
$$e^x = 1 + x + \frac{x^2}{2!} + \cdots + \frac{x^n}{n!}$$
$$\ln(1+x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \cdots$$
So we can get a better-fitting polynomial for sigmoid.
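For instance, a degree-3 Taylor polynomial of sigmoid around $0$ (my own illustration; the paper itself uses $z^2$) already tracks the true function closely near the origin:

```python
# Sketch: comparing the true sigmoid with its degree-3 Taylor
# polynomial around 0. The polynomial uses only + and *, so it is
# the kind of function a homomorphic scheme can evaluate.
import math

def sigm(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigm_taylor(z):
    # Taylor expansion of sigmoid at 0: sigm(z) ~ 1/2 + z/4 - z^3/48
    return 0.5 + z / 4.0 - z ** 3 / 48.0

# Near zero the polynomial is a very close fit
err = abs(sigm(0.5) - sigm_taylor(0.5))
```

Higher-degree terms buy more accuracy but cost multiplicative depth, which is exactly the trade-off the parameter choices above have to absorb.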
2. Choose a proper encryption scheme.
In the paper, they use YASHE for encrypting data. On the one hand, it is a Ring-LWE based encryption scheme, so some optimizations in the SEAL library, such as the encoding strategy and SIMD, suit YASHE; for the same reason, RLWE-based variants of GSW may be more appropriate for CryptoNets. On the other hand, the SEAL library provides many template functions for the optimizations that are common in RLWE-based homomorphic encryption schemes, including encoding numbers with the Chinese Remainder Theorem, parallel computation, and so on. Now I am reading the documentation of the SEAL library and learning how to use it.
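As a toy model of the CRT encoding idea (plain Python with small hand-picked moduli, not SEAL's actual API), a number is split into residues modulo pairwise-coprime moduli, operated on componentwise, and recombined at the end:

```python
# Sketch of CRT batching: encode a number as residues modulo
# pairwise-coprime moduli, compute componentwise (in parallel in a
# real scheme), then recombine. Toy moduli for illustration only;
# real parameters come from the encryption scheme.

MODULI = [7, 11, 13]            # pairwise coprime; product is 1001

def crt_encode(x):
    return [x % m for m in MODULI]

def crt_add(a, b):
    return [(x + y) % m for x, y, m in zip(a, b, MODULI)]

def crt_mul(a, b):
    return [(x * y) % m for x, y, m in zip(a, b, MODULI)]

def crt_decode(residues):
    # Reconstruct x mod prod(MODULI) via the CRT formula
    N = 1
    for m in MODULI:
        N *= m
    x = 0
    for r, m in zip(residues, MODULI):
        Ni = N // m
        x += r * Ni * pow(Ni, -1, m)   # pow(..., -1, m): modular inverse
    return x % N
```

The point is that one addition or multiplication on the residue vector corresponds to the same operation on the encoded number, as long as results stay below the product of the moduli.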



