Shuffled Transformer for Privacy-Preserving Split Learning

04/16/2023
by Hengyuan Xu, et al.

In conventional split learning, both training and testing data face severe privacy leakage threats. Existing solutions typically trade learning accuracy for data privacy, or vice versa. We propose a lossless privacy-preserving split learning framework built on the permutation equivalence property inherent to many neural network modules, and adopt the Transformer as the example building block of the framework. We prove that the Transformer encoder block is permutation equivalent, so training and testing can be carried out equivalently on permuted data. We further introduce a shuffling-based privacy guarantee and strengthen it with mix-up training. Experiments verify all of these properties and show a strong defence against privacy attacks compared to the state of the art, with no loss of accuracy.
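The lossless claim rests on the permutation equivalence of the Transformer encoder block: shuffling the tokens before the block and shuffling its outputs afterwards yield the same result, since self-attention and the position-wise feed-forward network apply identically to every token. The following is a minimal sketch (not the authors' code) that checks this property empirically with PyTorch's nn.TransformerEncoderLayer; the shapes and parameters are illustrative, and positional encodings are deliberately omitted since they would break the symmetry.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A single Transformer encoder block with no positional encoding.
layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
layer.eval()  # disable dropout so both forward passes are deterministic

x = torch.randn(2, 10, 64)    # (batch, tokens, embedding)
perm = torch.randperm(10)     # a random shuffle of the token positions

with torch.no_grad():
    encode_then_shuffle = layer(x)[:, perm]   # encode, then permute outputs
    shuffle_then_encode = layer(x[:, perm])   # permute inputs, then encode

# The two paths agree up to floating-point error, so the block is
# permutation equivalent and can be trained or evaluated on shuffled data.
print(torch.allclose(encode_then_shuffle, shuffle_then_encode, atol=1e-5))
```

Printing True confirms the equivalence for this layer; the paper's framework exploits it by letting the client shuffle token representations before sending them to the server, which never sees them in their original order.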
