Intermediate

HE Libraries & Tools

A practical guide to the open-source homomorphic encryption libraries available for building encrypted ML applications, from low-level crypto libraries to high-level ML frameworks.

Library Comparison

| Library | Maintainer | Schemes | Level | Best For |
| --- | --- | --- | --- | --- |
| Microsoft SEAL | Microsoft | BFV, BGV, CKKS | Low-level | Custom HE applications |
| Concrete ML | Zama | TFHE | High-level | ML model compilation to FHE |
| TenSEAL | OpenMined | BFV, CKKS | Mid-level | Tensor operations on encrypted data |
| OpenFHE | Duality Technologies | BFV, BGV, CKKS, TFHE | Low-level | Research, full FHE support |
| HElib | IBM | BGV, CKKS | Low-level | Research, bootstrapping |

Microsoft SEAL

SEAL is the most widely used HE library, providing well-documented implementations of BFV, BGV, and CKKS:

Python - Microsoft SEAL (via seal-python)
import seal

# Setup CKKS parameters
parms = seal.EncryptionParameters(seal.scheme_type.ckks)
parms.set_poly_modulus_degree(8192)
parms.set_coeff_modulus(seal.CoeffModulus.Create(
    8192, [60, 40, 40, 60]))

context = seal.SEALContext(parms)
keygen = seal.KeyGenerator(context)
public_key = keygen.create_public_key()
secret_key = keygen.secret_key()

encryptor = seal.Encryptor(context, public_key)
evaluator = seal.Evaluator(context)
decryptor = seal.Decryptor(context, secret_key)
encoder = seal.CKKSEncoder(context)

# Encode and encrypt
scale = 2.0**40
plain = encoder.encode([1.5, 2.3, 3.7], scale)
encrypted = encryptor.encrypt(plain)

# Compute on encrypted data (the scale grows to 2^80 after the
# multiply; rescaling drops it back toward 2^40)
plain_coeff = encoder.encode([2.0, 2.0, 2.0], scale)
evaluator.multiply_plain_inplace(encrypted, plain_coeff)
evaluator.rescale_to_next_inplace(encrypted)

# Decrypt result (CKKS is approximate, so values are close but not exact)
result = decryptor.decrypt(encrypted)
decoded = encoder.decode(result)  # ≈ [3.0, 4.6, 7.4]
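The multiply-then-rescale pattern above is easier to see with plain numbers. Here is a toy sketch of the scale bookkeeping only (pure Python, no encryption; the primes are idealized as exactly equal to the scale, and the helper names are illustrative, not SEAL's API):

```python
# Toy CKKS-style scale bookkeeping: encode floats as scaled integers,
# multiply (scales multiply too), then rescale by a prime ~= the scale.
SCALE = 2.0 ** 40              # matches `scale` in the SEAL example
PRIME = 2 ** 40                # idealized 40-bit rescaling prime

def encode(xs, scale):
    # Real values become scaled integers
    return [round(x * scale) for x in xs]

def decode(xs, scale):
    return [x / scale for x in xs]

msg = encode([1.5, 2.3, 3.7], SCALE)
coeff = encode([2.0, 2.0, 2.0], SCALE)

# multiply_plain: the underlying scales multiply, so the product
# now sits at SCALE**2
prod = [m * c for m, c in zip(msg, coeff)]
scale_after_mult = SCALE * SCALE

# rescale_to_next: divide by the next prime, dropping back to ~SCALE
rescaled = [round(x / PRIME) for x in prod]
scale_after_rescale = scale_after_mult / PRIME

print(decode(rescaled, scale_after_rescale))  # ≈ [3.0, 4.6, 7.4]
```

Without the rescale step, every multiplication would square the scale, quickly overflowing the coefficient modulus; this is why CKKS circuits have a bounded multiplicative depth.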

Concrete ML

Concrete ML by Zama is the highest-level FHE ML library. It can compile standard scikit-learn and PyTorch models to run on encrypted data:

Python - Concrete ML Random Forest
from concrete.ml.sklearn import RandomForestClassifier
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load data and train
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y)

model = RandomForestClassifier(n_bits=6, n_estimators=10)
model.fit(X_train, y_train)

# Compile to FHE circuit
fhe_circuit = model.compile(X_train)

# Simulate FHE execution (faster for testing)
y_pred_simulated = model.predict(X_test, fhe="simulate")

# Run actual FHE inference (much slower than simulation)
fhe_circuit.client.keygen()  # generate the client's encryption keys
y_pred_fhe = model.predict(X_test, fhe="execute")
print(f"FHE accuracy: {(y_pred_fhe == y_test).mean():.2%}")
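The `n_bits=6` argument above controls how Concrete ML quantizes inputs and model parameters, since TFHE operates on small integers. A toy sketch of uniform affine quantization conveys the idea (pure Python; the helper names are illustrative, not Concrete ML internals):

```python
# Toy uniform quantization to n_bits, illustrating the float -> integer
# mapping that FHE compilation requires. Illustrative only.
def quantize(values, n_bits):
    """Map floats onto integers in [0, 2**n_bits - 1]."""
    lo, hi = min(values), max(values)
    levels = 2 ** n_bits - 1
    scale = (hi - lo) / levels if hi != lo else 1.0
    return [round((v - lo) / scale) for v in values], scale, lo

def dequantize(q, scale, lo):
    return [x * scale + lo for x in q]

features = [5.1, 3.5, 1.4, 0.2]   # one iris-like sample
q, scale, zero = quantize(features, n_bits=6)
approx = dequantize(q, scale, zero)
# Round-trip error is at most scale/2 per feature; fewer bits means
# faster FHE circuits but larger quantization error.
```

This is the core accuracy/latency trade-off when tuning `n_bits`: smaller values shrink the FHE circuit but degrade the model's fidelity to its float counterpart.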

TenSEAL

TenSEAL provides a tensor abstraction over SEAL, making it natural for ML applications. It supports encrypted matrix operations and is well-suited for implementing neural network layers.

OpenFHE

OpenFHE (successor to PALISADE) is the most comprehensive FHE library, supporting all major schemes including TFHE with programmable bootstrapping. It is primarily a C++ library with Python bindings.

Choosing a library: For ML applications, start with Concrete ML — it abstracts away all HE complexity. For custom encrypted computations with CKKS, use TenSEAL (Python-friendly) or SEAL (more control). For research or TFHE-based applications, use OpenFHE.