Learn Secure Multi-Party Computation
Master privacy-preserving computation for AI. From secret sharing and garbled circuits to secure inference with CrypTen and MP-SPDZ — enable multiple parties to jointly compute on private data without revealing their inputs.
Your Learning Path
Follow these lessons in order to build a complete understanding of secure multi-party computation for AI.
1. Introduction
What is MPC, the millionaires' problem, why MPC matters for AI, and the landscape of privacy-preserving computation.
2. MPC Protocols
Semi-honest vs malicious security, two-party and multi-party protocols, communication complexity, and protocol design principles.
3. Secret Sharing
Shamir's secret sharing, additive sharing, arithmetic and Boolean circuits, and computing on shared data.
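To preview the core idea of this lesson, here is a minimal sketch of additive secret sharing over a prime field, written in plain Python (the modulus `P` and party counts are toy parameters chosen for illustration, not values from any particular library):

```python
import secrets

# Toy prime field modulus (an assumption for this sketch; real MPC
# deployments typically use 64- or 128-bit rings or fields).
P = 2**61 - 1

def share(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n random shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Add all shares back together mod P to recover the secret."""
    return sum(shares) % P

# Each of 3 parties holds one share of each input; any strict subset
# of the shares reveals nothing about the underlying value.
alice_shares = share(42, 3)
bob_shares = share(100, 3)

# Computing on shared data: addition happens locally, share by share,
# with no communication and no party ever seeing 42 or 100.
sum_shares = [(a + b) % P for a, b in zip(alice_shares, bob_shares)]
assert reconstruct(sum_shares) == 142
```

Addition is "free" in this scheme because the parties only add their local shares; multiplication is where real protocols need interaction, which the lesson covers.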
4. Garbled Circuits
Yao's garbled circuits, oblivious transfer, point-and-permute optimization, and free XOR gates.
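As a taste of this lesson, the sketch below garbles a single AND gate in the style of Yao's protocol. It is a toy: instead of the point-and-permute optimization covered in the lesson, it tags the correct table row with zero padding, and it uses SHA-256 in place of a proper circuit-garbling cipher. All names and parameters here are illustrative assumptions:

```python
import os
import random
import hashlib

LABEL_LEN = 16  # bytes per wire label (toy parameter)

def H(a: bytes, b: bytes) -> bytes:
    """Derive a 32-byte one-time pad from two input wire labels."""
    return hashlib.sha256(a + b).digest()

def garble_and_gate():
    """Garbler picks two random labels per wire (index = the bit it encodes)
    and encrypts each output label under the matching pair of input labels."""
    wa = [os.urandom(LABEL_LEN) for _ in range(2)]
    wb = [os.urandom(LABEL_LEN) for _ in range(2)]
    wc = [os.urandom(LABEL_LEN) for _ in range(2)]
    table = []
    for a in (0, 1):
        for b in (0, 1):
            # Zero padding lets the evaluator recognize the valid row
            # (a stand-in for point-and-permute).
            plaintext = wc[a & b] + b"\x00" * 16
            pad = H(wa[a], wb[b])
            table.append(bytes(x ^ y for x, y in zip(plaintext, pad)))
    random.shuffle(table)  # hide which row is which
    return wa, wb, wc, table

def evaluate(table, label_a, label_b):
    """Evaluator holds exactly one label per input wire and can decrypt
    exactly one row, learning one output label and nothing else."""
    pad = H(label_a, label_b)
    for ct in table:
        pt = bytes(x ^ y for x, y in zip(ct, pad))
        if pt[LABEL_LEN:] == b"\x00" * 16:
            return pt[:LABEL_LEN]
    raise ValueError("no decryptable row")

wa, wb, wc, table = garble_and_gate()
# AND(1, 1) = 1: the recovered label is the one encoding bit 1.
assert evaluate(table, wa[1], wb[1]) == wc[1]
# AND(0, 1) = 0: the recovered label is the one encoding bit 0.
assert evaluate(table, wa[0], wb[1]) == wc[0]
```

In the full protocol, the evaluator obtains its own input-wire labels via oblivious transfer, which is why that primitive appears alongside garbled circuits in this lesson.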
5. Applications
Secure ML inference, private set intersection, and collaborative training with CrypTen and MP-SPDZ.
6. Best Practices
Performance optimization, choosing the right protocol, combining MPC with other PETs, and deployment considerations.
What You'll Learn
By the end of this course, you'll be able to:
Understand MPC Theory
Grasp the cryptographic foundations of secure computation including secret sharing and garbled circuits.
Implement Secret Sharing
Build and use secret sharing schemes for privacy-preserving data analysis and ML.
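As an example of this outcome, here is a minimal sketch of Shamir's (t, n) threshold scheme in plain Python: any t shares reconstruct the secret via Lagrange interpolation, while fewer reveal nothing. The field modulus and thresholds are illustrative assumptions, not parameters from CrypTen or MP-SPDZ:

```python
import secrets

P = 2**61 - 1  # toy prime field modulus (assumption for this sketch)

def shamir_share(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Embed `secret` as f(0) of a random degree-(t-1) polynomial
    and hand party i the point (i, f(i))."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def shamir_reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # Divide via the modular inverse (Fermat's little theorem).
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = shamir_share(1234, t=2, n=3)
assert shamir_reconstruct(shares[:2]) == 1234   # any 2 of 3 suffice
assert shamir_reconstruct(shares[1:]) == 1234
```

Unlike additive sharing, this scheme tolerates dropped parties: the threshold t, not the total n, decides who can reconstruct.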
Run Secure ML Inference
Deploy ML models that compute on encrypted inputs using CrypTen and MP-SPDZ.
Design MPC Protocols
Choose and configure the right MPC protocol for your privacy and performance requirements.
Lilly Tech Systems