Attacking SplitNN

Attacking_SplitNN lets you easily experiment with various combinations of attack and defense algorithms against SplitNN, built on PyTorch and scikit-learn.

Install

    pip install git+https://github.com/Koukyosyumei/Attack_SplitNN

SplitNN

You can easily create a two-party SplitNN with this package as follows.
The client holds only the input data, and the server holds only the labels. The package implements SplitNN as a custom torch.nn.Module, so you can train it like a normal torch model.

Example:

    import torch
    import torch.nn as nn
    import torch.optim as optim

    # Client, Server, SplitNN and torch_auc are provided by this package.
    # FirstNet, SecondNet, device and train_loader are defined by the user.

    model_1 = FirstNet()
    model_1 = model_1.to(device)

    model_2 = SecondNet()
    model_2 = model_2.to(device)

    opt_1 = optim.Adam(model_1.parameters(), lr=1e-3)
    opt_2 = optim.Adam(model_2.parameters(), lr=1e-3)

    criterion = nn.BCELoss()

    # wrap the client-side and server-side halves of the model
    client = Client(model_1)
    server = Server(model_2)

    splitnn = SplitNN(client, server, opt_1, opt_2)

    splitnn.train()
    for epoch in range(3):
        epoch_loss = 0
        epoch_outputs = []
        epoch_labels = []
        for i, data in enumerate(train_loader):
            splitnn.zero_grads()
            inputs, labels = data
            inputs = inputs.to(device)
            labels = labels.to(device)

            outputs = splitnn(inputs)
            loss = criterion(outputs, labels)
            loss.backward()
            epoch_loss += loss.item() / len(train_loader.dataset)

            epoch_outputs.append(outputs)
            epoch_labels.append(labels)

            # propagate the gradient from the server back to the client
            splitnn.backward()
            splitnn.step()

        print(epoch_loss, torch_auc(torch.cat(epoch_labels),
                                    torch.cat(epoch_outputs)))
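
FirstNet and SecondNet in the example are placeholders for the client-side and server-side halves of the network; they are not shipped with the package. A minimal sketch, assuming flattened inputs with 784 features and binary labels (the architecture and dimensions are purely illustrative):

    class FirstNet(nn.Module):
        """Client-side half: maps raw inputs to the intermediate representation."""

        def __init__(self, in_features=784, hidden_dim=64):
            super().__init__()
            self.fc = nn.Linear(in_features, hidden_dim)

        def forward(self, x):
            return torch.relu(self.fc(x))


    class SecondNet(nn.Module):
        """Server-side half: maps the intermediate representation to a probability."""

        def __init__(self, hidden_dim=64):
            super().__init__()
            self.fc = nn.Linear(hidden_dim, 1)

        def forward(self, x):
            # BCELoss expects probabilities, so finish with a sigmoid
            return torch.sigmoid(self.fc(x))

With this setup, the labels passed to BCELoss should be float tensors whose shape matches the model output (batch_size, 1).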

Attack

Attacking_SplitNN offers several attack methods with the same interface.

|  | type | example | Reference |
| --- | --- | --- | --- |
| Intermediate Level Attack | evasion attack | notebook | original paper |
| Norm Attack | label leakage attack | notebook | original paper |
| Transfer Inherit Attack | membership inference attack | notebook | original paper |
| Black Box Model Inversion Attack | model inversion attack | notebook | blog |
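
As an illustration of the label-leakage row: the norm attack exploits the fact that, for an imbalanced binary task trained with BCE loss, the gradient the server sends back through the cut layer tends to have a larger norm for positive examples, so the client can infer labels it was never given. The package's own implementation is shown in the linked notebook; the following is only a from-scratch sketch of the idea, reusing model_1, model_2, criterion, device and train_loader from the example above:

    from sklearn.metrics import roc_auc_score

    grad_norms, true_labels = [], []
    for inputs, labels in train_loader:
        inputs, labels = inputs.to(device), labels.to(device)
        model_1.zero_grad()
        model_2.zero_grad()

        # client half runs up to the cut layer
        intermediate = model_1(inputs)
        intermediate.retain_grad()

        # server half finishes the forward pass and computes the loss
        outputs = model_2(intermediate)
        loss = criterion(outputs, labels)
        loss.backward()

        # per-example norm of the gradient flowing back through the cut layer
        grad_norms.append(intermediate.grad.norm(dim=1).detach().cpu())
        true_labels.append(labels.detach().cpu().flatten())

    # an AUC far from 0.5 means the gradient norm alone reveals the label
    print(roc_auc_score(torch.cat(true_labels), torch.cat(grad_norms)))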

Defense

|  | example | Reference |
| --- | --- | --- |
| Max Norm | notebook | original paper |
| NoPeek | notebook | original paper |
| Shredder | notebook | original paper |
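
NoPeek, for instance, reduces leakage by adding a distance-correlation penalty between the raw inputs and the intermediate activations to the task loss, so the cut-layer representation carries less information about the input. A from-scratch sketch of that regularizer under the same two-model setup as above (the weight and the flattening are illustrative; see the linked notebook for the package's own implementation):

    def distance_correlation(x, z, eps=1e-12):
        """Sample distance correlation between two batches of row vectors."""
        def double_centered(a):
            d = torch.cdist(a, a)
            return d - d.mean(dim=0, keepdim=True) - d.mean(dim=1, keepdim=True) + d.mean()

        A, B = double_centered(x), double_centered(z)
        dcov2_xz = (A * B).mean().clamp_min(0.0)
        dvar2_x = (A * A).mean()
        dvar2_z = (B * B).mean()
        return torch.sqrt(dcov2_xz / (torch.sqrt(dvar2_x * dvar2_z) + eps))

    dcor_weight = 0.5  # illustrative trade-off between utility and leakage

    # inside the training loop, instead of the plain task loss:
    intermediate = model_1(inputs)
    outputs = model_2(intermediate)
    task_loss = criterion(outputs, labels)
    leak_loss = distance_correlation(inputs.flatten(start_dim=1), intermediate)
    loss = task_loss + dcor_weight * leak_loss
    loss.backward()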

License

This software is released under the MIT License, see LICENSE.txt.