Privacy-preserving Machine Learning: Resources and Materials

Table of Contents

  1. About
  2. Survey Papers
  3. Cryptographic-based Approaches
  4. Non-cryptographic-based Approaches
  5. Courses
  6. Frameworks
  7. Other Resources

About

This is a compiled list of resources and materials for privacy-preserving machine learning (PPML).

Survey Papers

  1. [arXiv’20] Privacy in Deep Learning: A Survey
  2. [arXiv’20] SoK: Training Machine Learning Models over Multiple Sources with Privacy Preservation
  3. [IEEE Access’20] Privacy-Preserving Deep Learning on Machine Learning as a Service: A Comprehensive Survey
  4. [PETS’21] SoK: Privacy-Preserving Computation Techniques for Deep Learning
  5. [PETS’21] SoK: Efficient Privacy Preserving Clustering

Cryptographic-based Approaches

Training Phase

Homomorphic Encryption (HE) & Functional Encryption (FE)

  1. [BMC Medical Genomics’18] Privacy-preserving logistic regression training
  2. [arXiv’19] CryptoNN: Training Neural Networks over Encrypted Data (Functional Encryption). Code (Python)
  3. [PETS’18] CryptoDL: Privacy-preserving Machine Learning as a Service
  4. [IEEE/CVF’19] Towards Deep Neural Network Training on Encrypted Data
  5. [arXiv’20] Neural Network Training With Homomorphic Encryption
  6. [arXiv’20] PrivFT: Private and Fast Text Classification with Homomorphic Encryption

HE-based Hybrid Techniques

  1. [NeurIPS’20] Glyph: Fast and Accurately Training Deep Neural Networks on Encrypted Data (switches between the TFHE and BGV HE schemes)
  2. [NDSS’21] POSEIDON: Privacy-Preserving Federated Neural Network Learning (Federated Learning + HE) (code not publicly available)

Secure Multi-Party Computation (SMPC)

  1. [S&P’17] SecureML: A System for Scalable Privacy-Preserving Machine Learning. Code (C++)
  2. [CCS’19] QUOTIENT: two-party secure neural network training and prediction
  3. [arXiv’19] CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning
  4. [PETS’20] Falcon: Honest-Majority Maliciously Secure Framework for Private Deep Learning. Code (C++)
  5. [ICPP’20] ParSecureML: An Efficient Parallel Secure Machine Learning Framework on GPUs. Code (C++)
  6. [PETS’20] FLASH: Fast and Robust Framework for Privacy-preserving Machine Learning
  7. [NDSS’20] BLAZE: Blazing Fast Privacy-Preserving Machine Learning
  8. [USENIX’21] Cerebro: A Platform for Multi-Party Cryptographic Collaborative Learning. Code (Python)
  9. [USENIX’21] Fantastic Four: Honest-Majority Four-Party Secure Computation With Malicious Security. Code (Python)
  10. [S&P’21] CryptGPU: Fast Privacy-Preserving Machine Learning on the GPU. Code (Python)
  11. [USENIX’21] SWIFT: Super-fast and Robust Privacy-Preserving Machine Learning
  12. [arXiv’21] Adam in Private: Secure and Fast Training of Deep Neural Networks with Adaptive Moment Estimation
  13. [arXiv’21] Secure Quantized Training for Deep Learning. Code (Python)
  14. [USENIX’21] ABY2.0: Improved Mixed-Protocol Secure Two-Party Computation. Code (C++)

SMPC-based Hybrid Techniques

  1. [CCS’18] ABY3: A Mixed Protocol Framework for Machine Learning. Code (C++)
  2. [PETS’18] SecureNN: 3-Party Secure Computation for Neural Network Training. Code (C++)
  3. [NDSS’20] Trident: Efficient 4PC Framework for Privacy Preserving Machine Learning
  4. [arXiv’21] Tetrad: Actively Secure 4PC for Secure Training and Inference
  5. [S&P’21] MPCLeague: Robust 4-party Computation for Privacy-preserving Machine Learning

Hybrid Techniques

  1. [IACR ePrint’17] Private Collaborative Neural Network Learning (SMPC + DP)

Inference Phase

Homomorphic Encryption (HE) & Functional Encryption (FE)

  1. [ICML’16] CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy. Code (C#).

Secure Multi-Party Computation (SMPC)

  1. [CCS’17] Oblivious Neural Network Predictions via MiniONN Transformations. Code (Python).
  2. [ASIACCS’18] Chameleon: A Hybrid Secure Computation Framework for Machine Learning Applications
  3. [S&P’20] CrypTFlow: Secure TensorFlow Inference

Non-cryptographic-based Approaches

Federated Learning

Courses

Frameworks

  • PySyft (Python): decouples private data from model training by combining federated learning (FL), differential privacy (DP), HE, SMPC, and related techniques
  • TASTY (Python): combines garbled circuits with homomorphic encryption
  • ABY (C++): combines arithmetic, Boolean, and garbled-circuit (Yao) computation, with protocols to switch between the arithmetic/Boolean/Yao worlds for two parties (see the secret-sharing sketch after this list)
  • ABY3 (C++): extends ABY to three parties
  • ABY2.0: improves on the ABY framework and provides a faster online phase, with applications to PPML
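
For intuition, here is a minimal conceptual sketch of the additive (arithmetic) secret sharing that ABY-style frameworks build on; it is plain Python for illustration, not the API of ABY, ABY3, or ABY2.0. Addition of shared values is purely local, while multiplication requires an interactive protocol (e.g. Beaver triples), which is what these frameworks' optimized online phases are about.

```python
# Conceptual sketch (not the ABY/ABY3/ABY2.0 API): two-party additive
# secret sharing over the ring Z_{2^64}, i.e. the "arithmetic world" above.
import secrets

RING = 2 ** 64  # shared values live in Z_{2^64}

def share(x):
    """Split a secret x into two random-looking additive shares."""
    r = secrets.randbelow(RING)
    return r, (x - r) % RING  # party 0 holds r, party 1 holds x - r

def reconstruct(s0, s1):
    """Recombine the two shares to reveal the secret."""
    return (s0 + s1) % RING

# Each party adds its local shares; addition needs no communication.
a0, a1 = share(20)
b0, b1 = share(22)
c0, c1 = (a0 + b0) % RING, (a1 + b1) % RING
assert reconstruct(c0, c1) == 42
```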

HE

  • TenSEAL (Python): a library for homomorphic encryption operations on tensors (see the sketch below)
  • concrete (Rust): zama.ai's variant of the TFHE scheme, based on the Learning With Errors (LWE) and Ring Learning With Errors (RLWE) problems
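
Below is a minimal sketch of encrypted vector arithmetic with TenSEAL's CKKS scheme; the parameter choices are illustrative only, and the exact API may vary across TenSEAL versions.

```python
# Minimal sketch: encrypted vector arithmetic with TenSEAL (CKKS scheme).
# Parameters are illustrative; consult the TenSEAL docs for your version.
import tenseal as ts

# CKKS supports approximate arithmetic over real-valued vectors.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()  # rotation keys, needed for dot products

x = ts.ckks_vector(context, [1.0, 2.0, 3.0])
y = ts.ckks_vector(context, [4.0, 5.0, 6.0])

enc_sum = x + y      # element-wise addition on ciphertexts
enc_dot = x.dot(y)   # encrypted dot product

print(enc_sum.decrypt())  # ~[5.0, 7.0, 9.0] (CKKS results are approximate)
print(enc_dot.decrypt())  # ~[32.0]
```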

SMPC

  • CrypTen (Python): a framework for privacy-preserving machine learning built on PyTorch (see the sketch below)
  • MP-SPDZ (C++): software to benchmark various secure multi-party computation (MPC) protocols such as SPDZ, SPDZ2k, MASCOT, Overdrive, BMR garbled circuits, Yao's garbled circuits, and computation based on three-party replicated secret sharing as well as Shamir's secret sharing (with an honest majority)
  • MOTION (C++): a framework for mixed-protocol multi-party computation
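
The sketch below runs CrypTen in a single process for illustration (one process plays every party); a real deployment launches multiple parties, and the API may differ slightly across CrypTen versions.

```python
# Minimal single-process sketch of CrypTen's secret-shared tensors.
import torch
import crypten

crypten.init()  # set up the (local) communicator

# Secret-share two tensors; arithmetic then runs on the shares, PyTorch-style.
x = crypten.cryptensor(torch.tensor([1.0, 2.0, 3.0]))
w = crypten.cryptensor(torch.tensor([0.5, 0.5, 0.5]))

y = (x * w).sum()          # encrypted multiply-and-sum
print(y.get_plain_text())  # reveal the result: tensor(3.)
```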

Other Resources