| { | |
| "cells": [ | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "1ESON-aFJUXi" | |
| }, | |
| "source": [ | |
| "#Ensuring data and model privacy\n", | |
| "\n", | |
| "In most of the industries today having both the required data in the required size, and the in-house Machine Learning expertise is extremely rare. But using third-party data or a third party model poses huge IP and privacy risks.\n", | |
| "\n", | |
| "Using third party data means that the data owner has to send his/her data to the model, and using a third party model means that the model has to be downloaded and loaded. Both of these scenarios expose either the data or the model to the third party.\n", | |
| "\n", | |
| "n this context, one potential solution is to encrypt both the model and the data in a way which allows one organization to use a model owned by another organization without either disclosing their IP to one another. Several encryption schemes exist that allow for computation over encrypted data, among which Secure Multi-Party Computation (SMPC), Homomorphic Encryption (FHE/SHE) and Functional Encryption (FE) are the most well known types. We will focus here on Secure Multi-Party Computation which consists of private additive sharing.\n" | |
| ] | |
| }, | |
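| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "Before diving in, here is a minimal sketch of additive secret sharing in plain Python. It is illustrative only and independent of PySyft; the modulus `Q` and the helper names are made up for this example. A secret is split into random shares that individually look like noise but sum back to the secret modulo a large public number, and adding shares locally yields shares of the sum." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "import random\n", | |
| "\n", | |
| "Q = 2**62  # large public modulus; all arithmetic is done mod Q\n", | |
| "\n", | |
| "def additive_share(secret, n_parties=2):\n", | |
| "    # Split an integer secret into n random shares that sum to it mod Q\n", | |
| "    shares = [random.randrange(Q) for _ in range(n_parties - 1)]\n", | |
| "    shares.append((secret - sum(shares)) % Q)\n", | |
| "    return shares\n", | |
| "\n", | |
| "def reconstruct(shares):\n", | |
| "    # Recombining is only possible when all parties cooperate\n", | |
| "    return sum(shares) % Q\n", | |
| "\n", | |
| "x_shares = additive_share(25)  # e.g. Alice holds x_shares[0], Bob holds x_shares[1]\n", | |
| "y_shares = additive_share(17)\n", | |
| "\n", | |
| "# Each party adds its own shares locally -> shares of x + y; nothing is revealed\n", | |
| "z_shares = [(a + b) % Q for a, b in zip(x_shares, y_shares)]\n", | |
| "print(reconstruct(z_shares))  # 42" | |
| ] | |
| }, | |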
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "i3A-XuZWJmEI" | |
| }, | |
| "source": [ | |
| "# Scope of the Demo\n", | |
| "\n", | |
| "In this demo we will implement in a simple manner a Privacy-Preserving Prediction model. This model will allow a model trained on some data to make predictions on some other remote third-party data, without revealing anything about the model\n", | |
| "\n", | |
| "It will be divided in 2 parts:\n", | |
| "\n", | |
| "1. The learning process. We will train a simple Neural Network on the CIFAR dataset\n", | |
| "2. Prediction process. We will demonstrate how this trained model can be applied to another slice of this dataset which lies in a remote machine and provide labeling without revealing the model or the data that is getting fed. Essentially, the model holder (You) cannot see the target data, and the data owners (Bob and Alice) cannot see the model\n", | |
| "\n", | |
| "The end result will be a remote dataset with the predicted labels from the model we trained previously" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "oJy2n1v7J6WV" | |
| }, | |
| "source": [ | |
| "#Importing and installing the required libraries\n", | |
| "\n", | |
| "Our product is based on open-source code to ensure auditability and transparency. \n", | |
| "\n", | |
| "We are main contributors in PySyft library, which combines cryptographic protocols such as Secure Multiparty Computation with PyTorch. The library has been recently featured in the latest F8 of Facebook and there is an upcoming Udacity course on privacy-Preserving Machine Learning based on it" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "BGCSKEWjdH6X" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "!git clone https://github.com/OpenMined/PySyft.git\n", | |
| "%cd PySyft\n", | |
| "!pip install -r requirements.txt\n", | |
| "# Install the required libraries and do the required tests. \n", | |
| "# Warnings are ok, but if you get an error please contact decentriq\n", | |
| "!python setup.py install\n", | |
| "!python setup.py test" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "jhH4SScjZM_X" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "import torch\n", | |
| "import torch.nn as nn\n", | |
| "import torch.nn.functional as F\n", | |
| "import torch.optim as optim\n", | |
| "from torchvision import datasets, transforms\n", | |
| "import random\n", | |
| "import numpy as np\n", | |
| "import matplotlib.pyplot as plt\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "VDVBJHegKMOE" | |
| }, | |
| "source": [ | |
| "# The scenario\n", | |
| "\n", | |
| "Imagine that there is a model (you), with a model on some data (small images in this simple example) You wants to sell predictions for this model to two clients ,Bob and Alice who want to use this model architecure for their own image recognition. The current way of doing this is either by you sending the model to Bob and Alice where they would have unlimited access to all the work that you have done on the model (they would be able to see your weights), or by Alice and Bob sending continuously their data over to you in which situation they would share potentially sensitive data to you.\n", | |
| "\n", | |
| "With the current advancements in cryptography and Machine Learning however, there is a third way. Neither you nor them need to share any sensitive information, the model predictions are done on encrypted data while the model is also encrypted." | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "gEVjwLJ5L-Hh" | |
| }, | |
| "source": [ | |
| "# Training an image recognition task\n", | |
| "\n", | |
| "In order to conduct privacy-preserving inference, we first have to train a model. The model is trained and tested locally. Meaning that Bob and Alice are not involved in this proccess. " | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "bOF47frmcIcR" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "class Arguments():\n", | |
| " def __init__(self):\n", | |
| " self.batch_size = 64\n", | |
| " self.test_batch_size = 50\n", | |
| " self.epochs = 5\n", | |
| " self.lr = 0.001\n", | |
| " self.log_interval = 100\n", | |
| "\n", | |
| "args = Arguments()\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 52 | |
| }, | |
| "colab_type": "code", | |
| "id": "aGGy5oiZcMG4", | |
| "outputId": "be4f0f2e-62e2-46ad-90be-7768399cd03b" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stderr", | |
| "output_type": "stream", | |
| "text": [ | |
| "0it [00:00, ?it/s]" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "Downloading https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz to ../data/cifar-10-python.tar.gz\n" | |
| ] | |
| }, | |
| { | |
| "name": "stderr", | |
| "output_type": "stream", | |
| "text": [ | |
| " 99%|█████████▉| 169189376/170498071 [00:39<00:00, 26308541.14it/s]" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "# seed = 42\n", | |
| "# random.seed(seed)\n", | |
| "# np.random.seed(seed)\n", | |
| "# torch.manual_seed(seed)\n", | |
| "# torch.cuda.manual_seed(seed)\n", | |
| "\n", | |
| "def _init_fn():\n", | |
| " return np.random.seed(seed)\n", | |
| "\n", | |
| "train_loader = torch.utils.data.DataLoader(\n", | |
| " datasets.CIFAR10('../data', train=True, download=True,\n", | |
| " transform=transforms.Compose([\n", | |
| " transforms.ToTensor()\n", | |
| " # transforms.Normalize((0.1307,), (0.3081,))\n", | |
| " ])),\n", | |
| " batch_size=args.batch_size, shuffle=True, worker_init_fn=_init_fn)\n", | |
| "\n", | |
| "test_loader = torch.utils.data.DataLoader(\n", | |
| " datasets.CIFAR10('../data', train=False,\n", | |
| " transform=transforms.Compose([\n", | |
| " transforms.ToTensor()\n", | |
| " # transforms.Normalize((0.1307,), (0.3081,))\n", | |
| " ])),\n", | |
| " batch_size=args.test_batch_size, shuffle=True, worker_init_fn=_init_fn)\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "-LGt5igJcPdy" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "# INTERNAL NOTE: Working on the network. For now I'm making sure that the whole NB runs from start to end.\n", | |
| "\n", | |
| "class Net(nn.Module):\n", | |
| " def __init__(self):\n", | |
| " super(Net, self).__init__()\n", | |
| " self.fc1 = nn.Linear(3072, 512)\n", | |
| " self.fc2 = nn.Linear(512, 256)\n", | |
| " self.fc3 = nn.Linear(256, 512)\n", | |
| " self.fc4 = nn.Linear(512, 10)\n", | |
| "\n", | |
| " def forward(self, x):\n", | |
| " x = x.view(-1, 3072)\n", | |
| " x = self.fc1(x)\n", | |
| " x = F.relu(x)\n", | |
| " x = self.fc2(x)\n", | |
| " x = self.fc3(x)\n", | |
| " x = F.relu(x)\n", | |
| " x = self.fc4(x)\n", | |
| " return x" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "y529C7rHcTvW" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "def train(args, model, train_loader, optimizer, epoch):\n", | |
| " model.train()\n", | |
| " for batch_idx, (data, target) in enumerate(train_loader):\n", | |
| " optimizer.zero_grad()\n", | |
| " output = model(data)\n", | |
| " output = F.log_softmax(output, dim=1)\n", | |
| " loss = F.nll_loss(output, target)\n", | |
| " loss.backward()\n", | |
| " optimizer.step()\n", | |
| " if batch_idx % args.log_interval == 0:\n", | |
| " print('Train Epoch: {} [{}/{} ({:.0f}%)]\\tLoss: {:.6f}'.format(\n", | |
| " epoch, batch_idx * args.batch_size, len(train_loader) * args.batch_size,\n", | |
| " 100. * batch_idx / len(train_loader), loss.item()))" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 712 | |
| }, | |
| "colab_type": "code", | |
| "id": "6FHqRoGGcWBr", | |
| "outputId": "59785622-31d8-44a7-a10a-d78a472b9673" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "Train Epoch: 1 [0/50048 (0%)]\tLoss: 2.304078\n", | |
| "Train Epoch: 1 [6400/50048 (13%)]\tLoss: 1.921039\n", | |
| "Train Epoch: 1 [12800/50048 (26%)]\tLoss: 1.735386\n" | |
| ] | |
| }, | |
| { | |
| "name": "stderr", | |
| "output_type": "stream", | |
| "text": [ | |
| "170500096it [00:50, 26308541.14it/s] " | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "Train Epoch: 1 [19200/50048 (38%)]\tLoss: 1.940833\n", | |
| "Train Epoch: 1 [25600/50048 (51%)]\tLoss: 2.075466\n", | |
| "Train Epoch: 1 [32000/50048 (64%)]\tLoss: 1.812921\n", | |
| "Train Epoch: 1 [38400/50048 (77%)]\tLoss: 1.975147\n", | |
| "Train Epoch: 1 [44800/50048 (90%)]\tLoss: 1.774485\n", | |
| "Train Epoch: 2 [0/50048 (0%)]\tLoss: 1.720586\n", | |
| "Train Epoch: 2 [6400/50048 (13%)]\tLoss: 1.767773\n", | |
| "Train Epoch: 2 [12800/50048 (26%)]\tLoss: 1.689665\n", | |
| "Train Epoch: 2 [19200/50048 (38%)]\tLoss: 1.696862\n", | |
| "Train Epoch: 2 [25600/50048 (51%)]\tLoss: 1.670267\n", | |
| "Train Epoch: 2 [32000/50048 (64%)]\tLoss: 1.776497\n", | |
| "Train Epoch: 2 [38400/50048 (77%)]\tLoss: 1.596704\n", | |
| "Train Epoch: 2 [44800/50048 (90%)]\tLoss: 1.607974\n", | |
| "Train Epoch: 3 [0/50048 (0%)]\tLoss: 1.645938\n", | |
| "Train Epoch: 3 [6400/50048 (13%)]\tLoss: 1.715776\n", | |
| "Train Epoch: 3 [12800/50048 (26%)]\tLoss: 1.574470\n", | |
| "Train Epoch: 3 [19200/50048 (38%)]\tLoss: 1.677928\n", | |
| "Train Epoch: 3 [25600/50048 (51%)]\tLoss: 1.708422\n", | |
| "Train Epoch: 3 [32000/50048 (64%)]\tLoss: 1.565789\n", | |
| "Train Epoch: 3 [38400/50048 (77%)]\tLoss: 1.648729\n", | |
| "Train Epoch: 3 [44800/50048 (90%)]\tLoss: 1.549786\n", | |
| "Train Epoch: 4 [0/50048 (0%)]\tLoss: 1.402773\n", | |
| "Train Epoch: 4 [6400/50048 (13%)]\tLoss: 1.317211\n", | |
| "Train Epoch: 4 [12800/50048 (26%)]\tLoss: 1.402785\n", | |
| "Train Epoch: 4 [19200/50048 (38%)]\tLoss: 1.381953\n", | |
| "Train Epoch: 4 [25600/50048 (51%)]\tLoss: 1.621290\n", | |
| "Train Epoch: 4 [32000/50048 (64%)]\tLoss: 1.648951\n", | |
| "Train Epoch: 4 [38400/50048 (77%)]\tLoss: 1.417453\n", | |
| "Train Epoch: 4 [44800/50048 (90%)]\tLoss: 1.415839\n", | |
| "Train Epoch: 5 [0/50048 (0%)]\tLoss: 1.410007\n", | |
| "Train Epoch: 5 [6400/50048 (13%)]\tLoss: 1.485538\n", | |
| "Train Epoch: 5 [12800/50048 (26%)]\tLoss: 1.496825\n", | |
| "Train Epoch: 5 [19200/50048 (38%)]\tLoss: 1.491663\n", | |
| "Train Epoch: 5 [25600/50048 (51%)]\tLoss: 1.582479\n", | |
| "Train Epoch: 5 [32000/50048 (64%)]\tLoss: 1.388218\n", | |
| "Train Epoch: 5 [38400/50048 (77%)]\tLoss: 1.564148\n", | |
| "Train Epoch: 5 [44800/50048 (90%)]\tLoss: 1.706602\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "model = Net()\n", | |
| "optimizer = torch.optim.Adam(model.parameters(), lr=args.lr)\n", | |
| "\n", | |
| "for epoch in range(1, args.epochs + 1):\n", | |
| " train(args, model, train_loader, optimizer, epoch)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "4OzGozDecYrG" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "def test(args, model, test_loader):\n", | |
| " model.eval()\n", | |
| " test_loss = 0\n", | |
| " correct = 0\n", | |
| " with torch.no_grad():\n", | |
| " for data, target in test_loader:\n", | |
| " output = model(data)\n", | |
| " test_loss += F.nll_loss(output, target, reduction='sum').item() # sum up batch loss\n", | |
| " pred = output.argmax(1, keepdim=True) # get the index of the max log-probability \n", | |
| " correct += pred.eq(target.view_as(pred)).sum().item()\n", | |
| "\n", | |
| " test_loss /= len(test_loader.dataset)\n", | |
| "\n", | |
| " print('\\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\\n'.format(\n", | |
| " test_loss, correct, len(test_loader.dataset),\n", | |
| " 100. * correct / len(test_loader.dataset)))" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 69 | |
| }, | |
| "colab_type": "code", | |
| "id": "l9YG-gSJcbjq", | |
| "outputId": "d000694a-ff1f-45a7-e92e-8cc265fe4a48" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\n", | |
| "Test set: Average loss: -1.5302, Accuracy: 4663/10000 (47%)\n", | |
| "\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "test(args, model, test_loader)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "wHx5Bho1ce2t" | |
| }, | |
| "source": [ | |
| "# Privacy Preserving Inference on remote data\n", | |
| "\n", | |
| "After the training and the testing of the model has been done on your side, you are ready now to use it on new data that Alice and Bob hold. As we said, Alice and Bob do not want to give you their raw data because its privacy-sensitive. \n", | |
| "\n", | |
| "Here we demonstrate how both you and them can benefit from the data and the model respectively while keeping them both private" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "tuLqN9Un5BOp" | |
| }, | |
| "source": [ | |
| "# Constructing the distributed environment\n", | |
| "\n", | |
| "Our product works with workers, each worker is basically a remote machine that is initialized in the beggining and creates the connection to the library.\n", | |
| "\n", | |
| "Here, since we are working locally (and because its a demo) we are creating two virtual worker() which are acting as virtual seperate machines. We are also creating a worker named \"crypto_provider\" who is basically an independed machine, which its only job is generating random numbers to be used for the cryptography operations.\n", | |
| "\n", | |
| "It is importand to be stated that **the crypto_provider does not hold any data or any shares of the model**\n", | |
| "\n", | |
| "In our example we name the remote machines (which will hold the real world data) Alice and Bob" | |
| ] | |
| }, | |
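| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "To give an intuition for the crypto_provider's role, below is a minimal plain-Python sketch of a Beaver multiplication triple, which is how correlated randomness from a third party lets the data-holding parties multiply secret-shared values without ever seeing them. This is a conceptual sketch under a two-party additive-sharing assumption, not PySyft internals; `Q` and the helper names are made up for the example." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "import random\n", | |
| "\n", | |
| "Q = 2**62  # public modulus shared by everyone\n", | |
| "\n", | |
| "def additive_share(secret):\n", | |
| "    r = random.randrange(Q)\n", | |
| "    return [r, (secret - r) % Q]\n", | |
| "\n", | |
| "def reconstruct(shares):\n", | |
| "    return sum(shares) % Q\n", | |
| "\n", | |
| "# Crypto provider: generates a random triple (a, b, c) with c = a*b and hands\n", | |
| "# out only its shares; it never sees the private inputs x and y.\n", | |
| "a, b = random.randrange(Q), random.randrange(Q)\n", | |
| "a_sh, b_sh, c_sh = additive_share(a), additive_share(b), additive_share((a * b) % Q)\n", | |
| "\n", | |
| "# The data owners hold shares of their private inputs x and y\n", | |
| "x_sh, y_sh = additive_share(6), additive_share(7)\n", | |
| "\n", | |
| "# The parties open the masked values eps = x - a and delta = y - b; these leak\n", | |
| "# nothing about x and y because a and b are uniformly random.\n", | |
| "eps = reconstruct([(x_sh[i] - a_sh[i]) % Q for i in range(2)])\n", | |
| "delta = reconstruct([(y_sh[i] - b_sh[i]) % Q for i in range(2)])\n", | |
| "\n", | |
| "# Each party computes its share of x*y from the public eps, delta and its own shares\n", | |
| "z_sh = [(c_sh[i] + eps * b_sh[i] + delta * a_sh[i]) % Q for i in range(2)]\n", | |
| "z_sh[0] = (z_sh[0] + eps * delta) % Q  # the public constant is added by one party only\n", | |
| "\n", | |
| "print(reconstruct(z_sh))  # 42 = 6 * 7" | |
| ] | |
| }, | |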
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "Zqeax08X6yFZ" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "import syft as sy\n", | |
| "from syft.frameworks.torch.tensors.interpreters import PointerTensor\n", | |
| "from syft.frameworks.torch.tensors.decorators import LoggingTensor\n", | |
| "import sys\n", | |
| "\n", | |
| "\n", | |
| "hook = sy.TorchHook(torch) \n", | |
| "client = sy.VirtualWorker(hook, id=\"client\")\n", | |
| "bob = sy.VirtualWorker(hook, id=\"bob\")\n", | |
| "alice = sy.VirtualWorker(hook, id=\"alice\")\n", | |
| "crypto_provider = sy.VirtualWorker(hook, id=\"crypto_provider\")" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "940UAgSG7DLF" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "#Since the remote machine in the demo is virtual\n", | |
| "#(*Alice and Bob are not real*), we need to send them the that slice of data in the first place\n", | |
| "# This is what the code below is doing\n", | |
| "private_test_loader = []\n", | |
| "for data, target in test_loader:\n", | |
| " private_test_loader.append((\n", | |
| " data.fix_precision().share(alice, bob, crypto_provider=crypto_provider),\n", | |
| " target.fix_precision().share(alice, bob, crypto_provider=crypto_provider) \n", | |
| " ))" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "ucNVpDx45IX8" | |
| }, | |
| "source": [ | |
| "##Model sharing and visualising\n", | |
| "\n", | |
| "In order to do the private prediction we need to split the model in encrypted parts. That way the data owner will not be able to infer the weights of the model while being able to do part of the calculation." | |
| ] | |
| }, | |
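| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "Conceptually, `fix_precision().share(...)` first encodes the floating-point weights as integers at a fixed precision, then splits those integers into additive shares. The cell below is a plain-PyTorch sketch of that idea under simplifying assumptions (two parties; the constants `PRECISION` and `Q` and the helper names are illustrative, not PySyft internals)." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "import torch\n", | |
| "\n", | |
| "PRECISION = 10**3  # keep 3 fractional digits (illustrative choice)\n", | |
| "Q = 2**62          # public modulus\n", | |
| "\n", | |
| "def fix_precision(t):\n", | |
| "    # Encode a float tensor as integers: 0.123 -> 123\n", | |
| "    return (t * PRECISION).round().long()\n", | |
| "\n", | |
| "def encrypt_additively(t_int):\n", | |
| "    # Split an integer tensor into two random additive shares mod Q\n", | |
| "    share_alice = torch.randint(0, Q, t_int.shape, dtype=torch.long)\n", | |
| "    share_bob = (t_int - share_alice) % Q\n", | |
| "    return share_alice, share_bob\n", | |
| "\n", | |
| "def decrypt(share_alice, share_bob):\n", | |
| "    # Recombine the shares and decode back to floats\n", | |
| "    t_int = (share_alice + share_bob) % Q\n", | |
| "    t_int = torch.where(t_int > Q // 2, t_int - Q, t_int)  # recover negatives\n", | |
| "    return t_int.float() / PRECISION\n", | |
| "\n", | |
| "w = torch.tensor([0.25, -1.5, 3.141])  # stand-in for a few model weights\n", | |
| "alice_w, bob_w = encrypt_additively(fix_precision(w))\n", | |
| "print(alice_w)                  # looks like uniform noise on its own\n", | |
| "print(decrypt(alice_w, bob_w))  # recovers 0.2500, -1.5000, 3.1410" | |
| ] | |
| }, | |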
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 121 | |
| }, | |
| "colab_type": "code", | |
| "id": "6Gu0jsUP82ie", | |
| "outputId": "00d75b78-8bb8-4d6e-9c54-dea40ffb79e3" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "Net(\n", | |
| " (fc1): Linear(in_features=3072, out_features=512, bias=True)\n", | |
| " (fc2): Linear(in_features=512, out_features=256, bias=True)\n", | |
| " (fc3): Linear(in_features=256, out_features=512, bias=True)\n", | |
| " (fc4): Linear(in_features=512, out_features=10, bias=True)\n", | |
| ")" | |
| ] | |
| }, | |
| "execution_count": 13, | |
| "metadata": { | |
| "tags": [] | |
| }, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "model.fix_precision().share(alice, bob, crypto_provider=crypto_provider)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 34 | |
| }, | |
| "colab_type": "code", | |
| "id": "h2OP8ZzSO5gh", | |
| "outputId": "6f70c99a-7626-4211-d8ff-494a84a47bfb" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "408" | |
| ] | |
| }, | |
| "execution_count": 14, | |
| "metadata": { | |
| "tags": [] | |
| }, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "len(alice._objects)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "NQ7EQbG7uGXO" | |
| }, | |
| "source": [ | |
| "Here we visualize how the data that Bob holds will appear to us. As you can see it is pretty random" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 318 | |
| }, | |
| "colab_type": "code", | |
| "id": "C0iQMQc6PiCo", | |
| "outputId": "fb06c206-bbb7-410b-cc75-c12e01d70eb8" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlMAAAEtCAYAAAAsgeXEAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3XeYFFXaBfDzTo7knAQJIiiiIiom\nzK5Z14ARd3XVlXXVVdTPiBkxrWvWRRHXAAJmDCwqiglBkSAqkvOQZwYmz/3+qJrddnbqVENNaPT8\nnoeHmT59q25XV925XV39tjnnICIiIiLbJ6mhOyAiIiKyI9NkSkRERCQCTaZEREREItBkSkRERCQC\nTaZEREREItBkSkRERCQCTaYSmJndYGb/rO37xrEsZ2bd4rzvMDP7V22sN8719TKz6WZm/u+LzeyI\nONvG/bhqq62ZtTazeWaWvj3rlR1P2DFhZnPNbOA2LvMgM/sxcueCl/+xmV1UV8sXzszuNLN1Zra6\nofuyrRp634kyrtcmTabqiZldYGazzWyrma02syfMrAlr45y72zkX1066LfdtKGY2yszujLiYOwDc\n7xKoQJr/x7Ew5l+5mb0FAM65NQA+AnBxw/ZSaku157rSzIpifj8nrL1zrrdz7uNtWadz7lPn3C7b\n3ek6VEvH9W+WmXUCcDWAXs65Ng2w/oH+fly1D68ws9vqYb0fb+uLiojr6+xPvFLqYvmaTNUDM7sa\nwL0AhgJoDGA/ADsBmGRmaQFt6uQJ35GZWVsAhwJ4vaH7Esv/45jjnMsBkAtgGYBXY+7yIoBLGqRz\nUuuqnmv/+V4K4ISY215s6P7JDqcTgPXOubyawnr6W7AyZp8+EMCFZnZyPaz3V0OTqTpmZo0A3Abg\ncufce865MufcYgBnAOgM4Fz/fsPMbJyZ/cvM8gFcUP3tAjM738yWmNl6M7s59i2u2PvGzMAHm9lS\n//TxjTHL6W9mX5jZJjNbZWaPBk3qang8XcxsipkVmNkkAC2q5a/6Z942m9knZtbbv/1iAOcAuNZ/\n9fOWf/v1ZrbAX973ZnYKWf2RAL5xzhUH9C2ex3WsmS30t8l9ZpYU0/6P/ltyG83sfTPbKZ5tUs3B\n8LbJ+JjbvgKw83YuT3ZMaWY22t+v55pZv6qg2nHb37y3rfPNbI2ZPVjTwvyzB8tjfr/OP4NQYGY/\nmtnhNbTp4h8LSf7vz5hZXkz+gpldGdNkJzP7zF/mB2bWIua+23RcV+vHbWb2iP9zqpltMbP7/N8z\nzazYzJr5v+9nZp/7/f6Onbkws45mNsHM1vpj4qP+7V3N7EP/tnVm9qLFvAvgb/9rzGyW/3jGmFlG\nwDq6+ePdZn9ZY2KynmY2ycw2+M/BGTFZupnd74+/a8zsSTPLrGH5RwCYBKCdv/1G2X/H7wvNbCmA\nD/37nujvS5vMO6uza7XHNNR/TFvMbKR5lxi86z+f/zazpkHbMpZzbhGAzwH0iln+ADP72t8OX5vZ\ngGrNuprZNH8/fiPm+cww72/aer/fX5tZ6xq2Q1zHgX/foeaN7yvN7I/VsuPM7Ft/OcvMbFhM/In/\n/yZ/W+8ftq9sE+ec/tXhPwDHACgHkFJD9jyAl/2fhwEoA3AyvElupn/bv/y8F4BCeK8a0gDc79//\niJj2VfftDMABeMZfzh4ASgDs6ud7wzs7luLfdx6AK2P65QB0C3g8XwB4EEA6vIlDQdV6/fyP8M7O\npAP4O4CZMdkoAHdWW97pANr5j/lMAFsAtA1Y930AHqt22+KYbRDP4/oIQDN4rwZ/AnCRn50E4GcA\nu/rtbwLweTzbpFp/ngUwqobbZwE4saH3R/2r3X+x+1/MbcMAFAM4FkAygHsAfFlTG/94Os//OQfA\nfgHrGQhguf/zLvDOfrbzf+8MoGtAu6UA9vZ//hHAwphxYCmAPf2fPwawAEAPf8z4GMDwmOVs03Fd\nrQ+HAZjt/zzAX89XMdl3/s/tAaz3t1sSvBdP6wG0rGGZyQC+A/AQgGwAGQAO9LNuftt0AC3h/RH9\ne7XtPw3euNPMHycuDej7ywBu9PsTu45s/zn4gz9e7AlgHby36uD3601/+bkA3gJwT9hzG/N8OgCj\n/fVk+s/LFv9xpQK4Ft54lRbzmL4E0NrfjnkAvvH7lQFvQnZrnOvvDmAFgMP835sB2AjgPP+xnuX/\n3jxm31kBYDe/v+Px379Fl/iPPct/zvYG0KiGPsR7HBwDYE3Mul5CzNjsP5bd/eerj3/fk6tt15SY\n5dF9ZVv+6cxU3WsBYJ1zrryGbBV+eWbnC+fc6865SudcUbX7ngbgLefcVOdcKYBb4O0YzG3OuSLn\n3HfwBp49AMA5N8M596Vzrtx5Z8meAnBI2AMx7739fQDc7Jwrcc59Au9A+Q/n3LPOuQLnXAm8Pyp7\nmFnjoGU65151zq30H/MYAPMB9A+4exN4k7egZcXzuO51zm1wzi2F90fhLP/2S+ENdvP85+puAH1t\nG84mmVkWvOdpVA1xgd9/+W2Y6pyb6JyrAPAC/GOvBmUAuplZC+dcoXPuyziWXQFv8O9lZqnOucXO\nuQUB950C4BAzq7oWZ5z/excAjeCNC1Wec8795I89YwH0rQq29biu5gsA3c2sObwXYCMBtDezHHjH\n5xT/fucCmOhvt0rn3CQA0+FNrqrrD28yNNQ5t8U5V+ycm+r39Wfn3CR/jFoL78Vf9XHgH/64swHe\nGNYXNSuDd0lGu9h1ADgewGLn3HP+ePMtvEnE6WZm8K6RvMofawrgjSeD4txeVYb5j60I3gvNd/zH\nVQbvxXQmvMlplUecc2uccysAfApvwvqt887kvwZvYhWknX/mKB/ei8yvAFQ91uMAzHfOveA/1pcB\n/ADghJj2Lzjn5jjntgC4GcAZZpbsb7/m8CY7Ff4YnV/D+uM9Ds6At59WrWtYbOic+9g5N9vff2bB\nmwwH/m2Lc1+JiyZTdW8dgBZW8/vebf28yjKynHaxuXNuK7xXbUzsJ0O2wpvxw8x6mNnb5p22z4d3\noLeoaQE19GGjvxNXWVL1g5klm9lw8962y4f3agls2ea9dTnTP5A3wXvFEXT/jfBe5QUtK57HFbuN\nl/iPCfAGzIdj+rEBgMF7lRevU/12U2rIcgFs2oZlyY6t+rGXETAGXAjvrMMP/lsgx4ct2Dn3M4Ar\n4f0hyTOzV8ysXcDdp8B7tX4wvFfdH8P7Y3EIgE+dc5Wkz1XjxTYf19X6WwRvUnSI348p8N5GOgC/\nnEztBG8ysinmODwQ3jhZXUcAS2p6keq/vfWKeW+D5gP4Vw19rfGx1uBaeOPANP8ttqq3lXYCsG+1\nvp4DoA28MxxZAGbEZO/5t2+L2LGqHWLGWv95W4Zfjk9rYn4uquH3oMcIeNdMNXHONYL3oq8I3jsn\n/7Nu35Jq664+rqbC2+YvAHgfwCv+2
3IjzCy1hvXHexz84u9g9X6Z2b5m9pF5b/1uhvcimf39iWdf\niYsmU3XvC3hvsZ0ae6P/qux3ACbH3MzONK0C0CGmfSa8Gf/2eALeK4vu/sFzA7wBI8wqAE3NLDvm\ntk4xP58N7+2yI+BdaN+5qrv+/794fP5Zn2cA/AXeKeMmAOaQvsyCd8AFiedxdazW95X+z8sAXOIP\nKFX/Mp1zn5P1VTcYwGjnnz+u4v8R7YZfngUQgXNuvnPuLACt4H1IZVy14yuo3UvOuQPh/VF3ftua\nTAFwELwJ1RR4ZxuqT2LCbNNxTfpxGLyzI1/7vx8N7wxT1bUsy+Cd4Yg9BrOdc8NrWN4yAJ0CJqh3\n+33a3R8HzkV849v/cM6tds79yTnXDt5bVo+b9zH8ZQCmVOtrjnPuz/BeIBcB6B2TNXbexd3btPqY\nn1fCe64BAP7Zr47w3l6rVc65zfDePqs68/SLdfs6VVt39XG1DN47MmXOuducc73gnUU7HsD5Nawz\n3uNgVQ3rivUSvLdXOzrnGgN4Enw/rbV9RZOpOubvmLcBeMTMjjHvAszO8E6jL4c3c4/HOAAn+BcC\npsF7VbpdTzq8syT5AArNrCeAP8fTyDm3BN4rzNvMLM3MDsQvT/Xmwps4rof3yuzuaotYA2DnmN+z\n4e3IawHAzP4A78xUkEkA9rKAi0XjfFxDzaypmXUEcAWAqgtKnwTwf/bfC2sbm9nppC+/YGYd4H3S\n8Pka4v7w3hKo/upOfuPM7Fwza+mfaag6c1kZ0mYXMzvMvNplxfD+cNfYxjk338/PhffHPx/ecfh7\nxD+Z2tbjuiZT4P0R/d55lyl8DOAiAIv8t1cA76zACWZ2tH82LMO8C+871LC8afD+sA43s2z/vgfE\n9LcQwGYzaw/vU9TbxcxOj1n/RnjjVSWAtwH0MLPz/DE91cz2MbNd/efyGQAPmVkrfzntzezo7e0H\nvL8Xx5nZ4f6ZnavhPSfb8mIvLv4L/UEA5vo3TYT3WM82sxQzOxPeNbxvxzQ717wagFkAbgcwzjlX\nYWaHmtnu/lt++fAmWf+zr27DcTAW3oezqtZ1a7U8F8AG51yxmfWH90Kgylp/mTtXu3+t7CuaTNUD\n59wIeGdJ7oe3Q30F75XN4f41CPEsYy6AywG8Am8QKYR3kWFc7au5Bt5OVgDvoB/D7/4LZwPYF97b\nWbfCu0iyymh4p11XAPge3gWRsUbCu85jk5m97pz7HsAD8M7erYF34eBnQSt2Xs2mD+G9St7ex/UG\ngBkAZgJ4x+8TnHOvwXtF9Ip/uncOvDOH8ToP3jVvNV27cg68yZpIdccAmGtmhQAeBjDI/e/1ktWl\nAxgO7wzIaniv5v+P3H8KvI/eL4v53eBdoByPbTquA5bxObxrfKrOQn0PbyJY9Tv8/p0Eb6xcC2+M\nHIoa/k4571q0E+Cd8V0K74XpmX58G4C9AGyGd4xPiPNx1mQfAF/5z8+bAK5wzi30r4M6Ct6kYyW8\n5+FeeM8NAFwH7wLxL/3x5N/wPjiwXZxzP8KbED8C73k/AV5JjtLtXWY1VZ8mLIT3XDeDN27BObce\n3hmlq+FNqK8FcLxzLvYSlRfgXSu6Gt4F73/1b28D70RAPrwL/aeg5hMIcR0Hzrl34V3r+iG87fth\ntbtcBuB2MyuAd13x2Ji2WwHcBeAzf1/dD7W4r1i1dyRkB+G/etgE7y2tRQ3dn/piZr3gnf3pX/3t\ntETkvzKdAu9TUzWWdBARkR2bJlM7EDM7Ad41VgbvjM6+APbaESYVIiIiv1Z6m2/HchK8U8or4dUC\nGaSJlIiISMPSmSkRERGRCHRmSkRERCQCTaZEREREIoj0bdRmdgy8jzEmA/hnQHG12PvrPcXt5NVo\n+/WK+vh25PZR32pv6PZhKisr1znntrX6c73YljEsxxq7Zkn/8x2t/5G585rADACyFvBPsVunVjSf\n06wZzcs20vJUyM7kz3NFyrrArOfmzbRt0Ur+RQE/t9xC86yS4HUDQKXx5fcoCfxiBADA3DaraN77\n5+DnFQBWpfHamE0y+ffEJ1fwx7euMOjbhjwb+/DnNqmc5z228i/LmL8heN/LxULatlnroOL7nk0r\ng77pyFOW2Y3mm9vzx2arA79hDADQKa8RzUub8/bL1i2Pa/za7mum/CJcP8H7ksDl8KranuXXDgpq\n45KSgk+GNfSEoaHXz0TtG9vu8Sy/rrdN2PLD+p+Swl8XJCcnR1p+WJ6aWtM3JHjCHltZWRnNy8tr\n+lrH/6qoqKB5ZSUfjMLGgKh5YWHhDOdcP3qnBrCtY1in5B7u2uzHA5e360sP0PXtc+pSmqc8dQXN\nu592Bs1Xjd9K8/59+X60penIwOyTdybStvOG3UXz44bwrxzc8+dnaV6azJf/weIjaL7rNbfT/PuT\n/kbzuzqwMl7AKbt3oXlO/j9pPnLqappPWFlI86x1vDTZ5G9G0/zIly8PzA4P+UrBQVfwbfvG7afS\nfOXub9D8vbt5RZm0eybT/OFHjuLrv+BTmv/l6aviGr+ivM3XH8DPfgGzUnjFJIOKKYqIJBqNYSJS\nK6JMptrjl184uBzb9qWwIiINSWOYiNSKSNdMxcPMLgZwcV2vR0SktsWOX02NX9MkIr9dUc5MrcAv\nv725A2r4Bmvn3NPOuX6JeM2EiPymhY5hseNXjjWu186JyI4jymTqawDdzayLmaXB+8LHN2unWyIi\ndU5jmIjUiu1+m885V25mfwHwPryPFT/rnJtbaz0TEalDGsNEpLZEumbKOTcRAP/MbDXsI+b6apvt\nF/Xj6w0uYuWFsPIAYY8/aumIKNs3rHRBWGmEsPZheZgdft8itmUMa7x7CY7+8KfAPG/vUbT9x6mP\n0vy9BStp/kUX/vH2l68fRvP7fuS1os4eMSAwS1tyGG07uPx+mnc58hqav96Zf7T/5Md4aZMWl86i\n+VVzeS2ka1tl0fygNkfT/JtneVmKgs/+QPMjz9qT5mUV79N8t8ZjaV4ydSPN3767T2B27LpvaNus\n65vSvOe7bWk+6bkvaJ495Pc0n3cIr/H1XYsrab7mMl7aAU/zuIoqoIuIiIhEoMmUiIiISASaTImI\niIhEoMmUiIiISASaTImIiIhEoMmUiIiISASaTImIiIhEYPVZI8bMXErK9pe2itrXsFpBUdtHXX6U\nZYdt15ycHJoXFhbSPGzb1/W2TU7mdWZSU1Npnp6eTvPs7GyaZ2Zm0pwpLS2l+ZYtW2heXFwcaflR\n61RFrTNVXFw849fwdVI5ySmub3ajwPza03g9nZevWETzW/Y5luaHX3ADzQtGn0vzpa32p/kJ56wO\nzE66cxBtO/6xcTRPP4zXQm0/83Gap6zhX+867olHaF707zU0f/pwfoxM3aMjzc/s+keav/W7x2h+\n0d28DlbLqbvSPLvPWpp3HPgwzR9rkRGY/W1A8H4BAIOXtaD58UeNofk9RRfR/PABz9D8qOf48t
vO\nLKP5uJv4vnll8VVxjV86MyUiIiISgSZTIiIiIhFoMiUiIiISgSZTIiIiIhFoMiUiIiISgSZTIiIi\nIhFoMiUiIiISwfYXfWoAdVnHKZ7lR8mTkvi8NSwPqwWUlZVF886dO9N86dKlNA+rdRRWBypMWPu0\ntDSaZ4XUgcrJzaV5y5Ytad6mTRuas/7n5+fTtps3b6b5pk2bIrUPqyEW9txWVFREysOWv6OobJ+L\n/KGHBOZXruG1yu4d0JXmk/a9nObNpi2meVLyjzQfN3o6zccMvy8wu/zSK2jbg3f6gOafHP8hzbOP\nWkDznpcOpPnwTxrT/NuhX9O8z4+9ae7cNTRf+iWvc/fyAH6M2Ncn0jxz3DKaX3kZ79/EDx+l+d8y\nZgdmT77G++ba8hqGG97vRPPStlfT/OZnmtD89TReg6v/MSto/vvlfHzE33lcRWemRERERCLQZEpE\nREQkAk2mRERERCLQZEpEREQkAk2mRERERCLQZEpEREQkAk2mRERERCKo9zpTzrnALGqdpyjrjieP\nIqwWT9S+lZeX0zw7m9fACcu3bt1K85QUviuF1ZFKTeV1WjIyMmieGVJnK2z5ZWVlNA97/lidsKj7\nbVjf09PTaR62b4TVMAvbNnVd/y1R7LS8C566ZnRgPjyP17O5+FhesOahJx6g+fIuvJbSlrP4fmJ7\n30jzN65sG5it+WAWbfvhxutpfk/I+PCHxvz43bvDZTT/obw7zc/p/RLNX5n6Dc13OukCmp9Wwev0\nrT/nfZq3zbyTL/9wXgfr2GGf0/zGa8+n+ezjg/fdHnMPo21PGzCZ5l8+2JrmjTs1o/mt+/A6VkXN\n+L4zqs+bNB/5+yNoHi+dmRIRERGJQJMpERERkQg0mRIRERGJQJMpERERkQg0mRIRERGJQJMpERER\nkQg0mRIRERGJIFKdKTNbDKAAQAWAcudcv5D7R6rHw9rWh7qsgxXelteZqqjgtYTClp+bm0vzVatW\n0dyF1CoKq+CVnMyf28LCQprn5+fTPKwOVmYmr1WyadOm7V5+2LYP63tpaSnNS0pKaB5WIyusjlRd\n1l9raNsyhv3YBBh4dHC9tJ1mf0TX9dH5h9N8j5470TzpweE07zWX56sf2JnmhTd8H5h98nxX2vbJ\n/U+i+bMntaL5gg8+oPmhY5bQfPAFh9L8kLX8+O753UyaX3kAHx+LT+V1rGb9jtdqGnvZpTRfbtfR\n/OGf36F5+5N5DbIN/2wZmGXfzfebx5vNp3mXoSfS/OHWfPxZ+B6vo/feQffQvPfnz9H81EUP0nzi\nqzT+j9oo2nmoc25dLSxHRKQhaAwTkUj0Np+IiIhIBFEnUw7AB2Y2w8wuro0OiYjUI41hIhJZ1Lf5\nDnTOrTCzVgAmmdkPzrlPYu/gD1AapEQkEdEx7BfjV1bHBuqiiCS6SGemnHMr/P/zALwGoH8N93na\nOdfPOdfvt/KFqCKyYwgbw34xfqW3aIguisgOYLsnU2aWbWa5VT8DOArAnNrqmIhIXdIYJiK1Jcrb\nfK0BvOafbUoB8JJz7r1a6ZWISN3TGCYitWK7J1POuYUA9tjWdsnJwXVaotZxqus8DGvPHreX85OE\nKUkhtX6Mt09P43VGmjdpTvMWOTw347WMKpJ4LZGSkFpIYbWSKsqLaZ7k0mheVMmf+82bNtO8sjK4\nf2F1mpzjNboqQ/Lycr5twoT179f69vy2jmFp2YvRsd8fAvP8TcNo+6U9ea2iQ57jtYoWNXqR5ief\nt5jmKQc9TPPxo04NzJbN48d/1yt/ovlpTw+m+Z3Lv6N5Rnteq6j52gKa3//QuzR/80ReGWPGO3w3\nGTlkGs1fML78zycuovkbjSbRvPMxK2h+SR7fPgNccI2xU+6dQNve/9Vkml/8Kv/b92Kj/Wl+YDO+\n3z9SdhnN2xz2Jc2P7v4QzSeC7ztVVBpBREREJAJNpkREREQi0GRKREREJAJNpkREREQi0GRKRERE\nJAJNpkREREQi0GRKREREJAILqzFTm5KSklx6enpwZxK8zlRSEp97Rqmh5X3farCwOlMVpM4RAPTv\ntx/NG6fzGjh5i3gdk01b1tN8a1IJzStTeMkzC6nTlZnC60g1zW1K85Jyvn035+fT3FUG14IqKy+l\nbYuLeY2ssorySO1Da3SF5OF1snheVFQ0wznXj95pB9B5zxR305TGgfmEp/gxtPvxs2i+4M1HaH77\nrbzWUdF8Xm90SuqrNM+Z3T0wW/tpFm07dNXdNB/4Kd9HX7yoDc2/O+gZmrf4oQ/NN37XhealbflX\nBQ0/iz93GXcMofmPb/Pxa+jbH9F86+iTaH77YXzf+OismTQ/7Z0DArMueWtp2+mT+H43dY9lNJ9z\nCK/B1XUXXocq7d2jab589myaz/5mT5rfPWFMXOOXzkyJiIiIRKDJlIiIiEgEmkyJiIiIRKDJlIiI\niEgEmkyJiIiIRKDJlIiIiEgE/PPovzJRy0CEta8kH48PqXwAF3KHylQ+7y0rI+sGgHL+8ffdenSi\neUVuE5pPm/MNzX9cv5nmlpHBc/DSEhXgj3/D+o18+SGlF3KyeP/Yc19WzpedErLu8pCyFxkh2y6s\n9AHdbwGUl/PSDGVlZTQvKiqi+Y6i6eLGOGPwMYF52nOZtP1upXwfPrfoVJr32f8Vmnd66iuajx/P\nn4er7l8cmA0Yxz+6v2uTl2g+egIvGzHy+9U0P7D36zSftQ9f/0sbd6b5A3kjaL7fiXfQ/OCN/PGV\n/ul8mrc88UuaDx66L80P+JKPAc/0/iPN+302PzD7Ip+PH3d8/hnN95vbnuaN3t2F5tN3Du4bABy2\nObhcCQAsb/c8zTtU8tIJ8dKZKREREZEINJkSERERiUCTKREREZEINJkSERERiUCTKREREZEINJkS\nERERiUCTKREREZEIflN1psKE1duJUmfKjNeYsaSQOlKOty+vDClkVc5rASWVbKV501S++N07daC5\n42VQUMxjLM9bT/MtZaU0D6vllBRSx6u4ZAvNQepgVYY8N6Hlz0L2jbB9K6zOVNQ8av22HcWWlhX4\nakh+YP79olNo+7zfX0jznSa9SfMVz91G88emnkfzV4/iteR2u+uWwOyLWbzO3Es7/x/Nc6cMoHnG\noL/S/Np/70Hz477hI8i5dgLNl/+J10oq3u1gmk/7ko9/awr5+HV8++BtDwB5l31C8xtKX6T5jPv+\nQfMRnwavv9/Z/LEd2ONMmi97ndf4a/dQV5rjp2dpvFf28TSfcGZfmq+7aiRff5x0ZkpEREQkAk2m\nRERERCLQZEpEREQkAk2mRERERCLQZEpEREQkAk2mRERERCLQZEpEREQkgtA6U2b2LIDjAeQ553bz\nb2sGYAyAzgAWAzjDOceLSdSCRK9nE6l/YW1DcldRzvOyEpqngLdPM15rqEebVjTv1LkdzZds2kDz\nrQUFNC/P4oWweAUxoLSCb5+KkO2bZMHrT01No20Nv
AZWmePbvriU9720lNfgKi8P2XfC9r0EPy5r\nawxbnZuBew7uEZh3Hfkk7cfpU++l+c3/5LWaOn/2Lc1b39qb5svsJpp/M+KAwOzyE6+lbYe0u5Lm\nJw7l+9jUthNoPuLeq2n+u0Fv0/y4V4MfGwCcdPvRNC/p+R7NO+3yE80/3XsczQ/ah9cQe2r0fjRf\nvuFVmnfpvITm3XNuDsx2efQw2nbSI8Np/uUwXgNryD960vzOiYto3nI+r0HW+BZeI/CO2Xz8jVc8\nZ6ZGATim2m3XA5jsnOsOYLL/u4hIIhoFjWEiUodCJ1POuU8AVD9tcBKA5/2fnwdwci33S0SkVmgM\nE5G6tr3XTLV2zq3yf14NoHUt9UdEpD5oDBORWhP5AnTnXTAReNGEmV1sZtPNbHqiX1shIr89bAyL\nHb9K1/HvrxSR367tnUytMbO2AOD/nxd0R+fc0865fs65fmFfyCoiUk/iGsNix6+0Fln12kER2XFs\n72TqTQCD/Z8HA3ijdrojIlIvNIaJSK0JnUyZ2csAvgCwi5ktN7MLAQwHcKSZzQdwhP+7iEjC0Rgm\nInUttM6Uc+6sgOjwWu5LqIa+5irsbUqWh77FGfLYUhyv05KewZ/KtDQ+b640XompoIRfL9I2OZPm\n7bNyaF5ezGsxNQ15fLnt2tO8IonXEtm6JZ/nW4tpvqUwuNZTZQXf9uXl/LkvLi6ieUkJrzMV9bjZ\n0etM1dYY1mxuPs7q81Fgfs7gi7thAAAgAElEQVRbr9P2Xd7itY7+lDOG5kPnXErzq06+g+aPrl1I\n80Gjgt/GXFbSgbbdpfWpNP/ykONpXj54V5rPmMfryC2adyTNizYtpXlrN5rmd711Es3Pv5zXmdrn\nx1tovsdYXgdq/Lv30fyu3/1I87T7r6L5U7cFj//pn/EaXuvnXEHznrfwv30Ze31P8wPm8BpkJ307\nhObDzj+f5qdW8PU/OZHG/6EK6CIiIiIRaDIlIiIiEoEmUyIiIiIRaDIlIiIiEoEmUyIiIiIRaDIl\nIiIiEoEmUyIiIiIRhNaZqm2VlcH1LBr+62b4+sPq6bDuh5XiyUjndVSaNeJfZdGkUTbNW7VuQfPc\nZk1ovm7jRprnrC2geXr+BpqnJPMN1DwrneZI489dVtPGNG/WlG8/V8FXv3lzcB2u5UvX8LYb19O8\nuKKU5pWO1whjxxyQ+HWiEsUytyuuLvsiMF+w+Gza/o4DTqH5n9t+SvO3B/xA8/v3e5jmj02/l+aF\nzYP3s83d/0rbrn6Q13E64jNe5+ie3h/T/KgbetB80J25NM8beCzNL3plCs2HTuPj203vXUnz0iW8\nDtXYlbyG2FsnHUHzTT/PoPmV+31H82MbzQ3MRh59GF/24Yto/sNzvEbiv5/fRPPif1xC87cfGUzz\ntEI+vh31yd9p/uTEC2leRWemRERERCLQZEpEREQkAk2mRERERCLQZEpEREQkAk2mRERERCLQZEpE\nREQkAk2mRERERCKo1zpTzjlUVgYX7AmrMxVehyqsfXJI65D2YXWmSD2f3JxM2rZ1q5Y0b9WyOc2z\nsnkdppyQOlUtmvM6TEl5vA7T1iUraF4WUkeqUTO+/J1btKX5nPxCmrdq34bmqMygcXkZr9WUnBRc\nJ6y0hNeJKijkNbyKC8pongS+bSvDyki5sPpqIXWqwPNfi17Js/BG446B+cY3ttD2JR8E1/IBgN+P\n7UXz4edfT/MDJ/NaS5+O5sdIIYJrId26ldfyueL+U2m+58mdaX7izd1pPvPHf9E8+/07aD5i8Uc0\n/3thHs273MJreGWdyWsx3Z+0F80zb21F81evaErzPumjef7DBTQvPDm4TthtE5bTtocf8CjNP7mM\nr/uTJF4D8ZivZtL8kEX8uZtx2TSan/pdO5rHS2emRERERCLQZEpEREQkAk2mRERERCLQZEpEREQk\nAk2mRERERCLQZEpEREQkAk2mRERERCKo1zpTZt6/YGEFcSKuPyRPAa9DlWTlNG/aOLjOS4d2vI5U\nixbNaJ6Zk0Pz1PTgOkcAYCl83pyUxLd9ivHc5fA6VjmlJTQvXZtP87Ulm2i+qpLXYurYhdcSSUnh\ndbrCpKYGb9+WLXgNr8JCXkNmY/46mleUhdSZKuf7dUVYnSkLrg0HAAjZN34tUi0bba1/YH7dZyfR\n9vvN+YzmLb/n9XAevv4cms8aci/NH+jBn6f77jgtMJuZ8Qfa9tGMPWjexe6h+Q/91tB89pqFND9j\n8XE0v/+nd2n+VgUf24ee25Pm78xYSfMDHvua5r9/82eaT71/EM1f2InX+bul69s03+PFIwOz6/J4\nDbC9m/PnpvjpvWn+1Hcn0Lx08Jc0HzHiLppf+BKvUXbc7Ftp/g5N/0tnpkREREQi0GRKREREJAJN\npkREREQi0GRKREREJAJNpkREREQi0GRKREREJAJNpkREREQiCK0zZWbPAjgeQJ5zbjf/tmEA/gRg\nrX+3G5xzE+NZofFCU3XLVfI8pI5Uk8aNaN6pU9vArFmT4BpUAJCTnUHzls14LaKMkDpPrVrzOleZ\nLVvRfHkmr2OS3L45zctW5vH2m7bSPL+ggOZ54HWsli/jdWCSkngtpjbtO9A8JSX4UEpN4zXAWrXi\nz83qtXzbrc7bSPPKypA6VC6khpiFHTeJXWeqtsawkjbp+On/ugTmrQ7htZgeOoPXgTpu7uk0nzLm\nRpq3XNmb5ltTHqD5KY3HBmZHLuW1evZ9hVfjOfCOF2l+xfTg7QoAE956guZ33MzHpz+cE/zYAOAP\nZ15F86vffZXmzVIOpvmy3UfQ/KcTx9A8r7gJzfP33kDz9kWlNJ928wGB2aj2L9C2HR7gNfxu7MNr\nCL7RfzPNJ2/l7WfctTvNP9yHH3fffcX/tsR7zimee40CcEwNtz/knOvr/4trIiUi0gBGQWOYiNSh\n0MmUc+4TAHzaKyKSoDSGiUhdi3LN1F/MbJaZPWtm/D0oEZHEozFMRGrF9k6mngDQFUBfAKsABL4Z\nb2YXm9l0M5secmmGiEh9iWsMix2/NuYX1Wf/RGQHsl2TKefcGudchXOuEsAzAAK//dM597Rzrp9z\nrl9DXnsuIlIl3jEsdvxq2iizfjspIjuM7ZpMmVnsx9ZOATCndrojIlL3NIaJSG2KpzTCywAGAmhh\nZssB3ApgoJn1BeAALAZwSR32UURku2kME5G6FjqZcs6dVcPNI7dvdRapzpQLuegqbNkGXi+ncWNe\nq6lTxzY0b9k8uBZIZnoabZuWzE8Sdm4fXMMKAPru04+336UnzTOzcmhelsTrYC2bNY3mFSE1vrKS\neQ2vPuW8BljPvfjj6943pAbPZl5rZMMG/mGwVSuD61ilpvI6U9nZ2TRv1bo1zdeu53VYKiv5vuUc\nP27KKytCls/zhlZbY1hq2UZ0XDUhMG/UaRZtX/p3Ptwu/ewRmj+/+A2aH3304zTvXf4TzduduiQw\n6zfjddq2
5Hd87MzMPZLmKzpeQfP/e2o6zfuv5HXuLj2Ut/974+dovrSoMc1HOF7r7aXN42l+2w18\n+1V23Ifm/57I/zam5K2l+XM9zwvM/jr+C9r2n0/vT/OhC2s6/P5r6kt87L6iH6/f9syJR9D8n5/3\npfnj1/MaXCfS9L9UAV1EREQkAk2mRERERCLQZEpEREQkAk2mRERERCLQZEpEREQkAk2mRERERCLQ\nZEpEREQkgtA6U4kkKYnP/cLqTOVk8VpJ7dq2pHmTxrweUGpKcP/S03itodQk3vdmTfn3sHbt1p3m\nuc1a8Lwpzw876WSaf5vG+184n9e46X7UrjRv8dNimrc+iNfZSuvP60zlpvE6LysWBdfgAYCxY8YE\nZitJDSoASE5Opnl2Fu9beno6zSsreI2v8pDcq2sp360HWo8K3s9fvvxc2v7lO/5B81vfOpPmf3n9\nGJqPODK4zh0ANNp4J81vO+2VwOyId/k+OKGU1zHaczkfW/eZcCDN+y66keaNXgz8RjMAQPMNvE7f\nF60m0fyNf/H2C7/ehee35dF8aofg+mUAsPQtXofr4LY30/zxte/RfP3ERYHZI+ecQNtmJr9N8+uS\nb6N5k+P5+HLvnKE0P6N8FM1bzN+b5vsv4X+b46UzUyIiIiIRaDIlIiIiEoEmUyIiIiIRaDIlIiIi\nEoEmUyIiIiIRaDIlIiIiEoEmUyIiIiIRJFSdqbA6UWFSUvjDadWK11Jq2aIxzbMyeT2KjLTg9ael\n8lpCySGPPT0rl+YFBUU0T8rkeTZ/6MjK5utv2bYjzef99DPNd92tF80bbcqn+eaff6R5+/58+akh\ntZratmtP85atWgVmK1asoG2TkvlrmszMTJpnZPD6aQUFfNtVVPA6L5Uhdaac+23Uocqu6IG+BeMC\n85s+fJK2/+N1V9N8ce/7aP7End/T/HdNcnj7iZ/RvPLkGYHZxla707bP9xpE8+IXb6V5j4fW0HzL\nndNpnnwor7HVOy24hhYA9GjyF5q3vboDzcs+ak7zQRNm0jxt5EU039LoBZrvfV1Pmv91Mt8+pwz5\ne2A2d58PaNs20xrR/NNXn6b5vns+S/M3Z/EaZI9nfErzR96/neZ5A+fTHOP434YqOjMlIiIiEoEm\nUyIiIiIRaDIlIiIiEoEmUyIiIiIRaDIlIiIiEoEmUyIiIiIRaDIlIiIiEkFC1ZkKE1aFqlEjXu+i\neXNeayMjndeCSkvlc880VmcqhS/bQua1hcVlNN9SXEHzjK2lNC8rKqF5UmolzVvu1JnmpdnNaL50\n/Waad2zKaymtX7CA5p0281pLlskLbZWW8e23bu3awGz16tW0bfPmvEZNcjLfdyor+XNTXl5Ocxey\n7znVmQIAbG2dhGl/yw7MZ416kbbfY/zrNE/NW0nz9+4eS/NuWbyW3MY9eb2yPc7rEZjtm8GP37Kj\njqb5zRfwWkIDB55C8xbreZ2oz45uR/NL8vgxeOXRc2meMegKmm+46Amafz2jO83f/Ac/xj/ufy3N\nd+7RheYPH/wNzb+dF1zH6o07l9G2W9vz/fKKkm40b1JyGs2vu6cpzUefzP82PnMjH98WpB5Lc/AS\nZP+hM1MiIiIiEWgyJSIiIhKBJlMiIiIiEWgyJSIiIhKBJlMiIiIiEWgyJSIiIhKBJlMiIiIiEYTW\nmTKzjgBGA2gNwAF42jn3sJk1AzAGQGcAiwGc4ZzbyJfmAFKTxkLq1WRkpNK8TUgdqSY5wTViACAj\njVeySk3hmyuF1ANKDqkzBcfXXRlSK2hD3hqaL1u8iOZ7JfP1t+vUieaNWrWieftOO9F8zqzZNF9R\nvI7mjXNyaQ5Lo3FlBa/TlZ9fSPO53/8UmM38bg5tu+dee9A8JY33PazMkwOvQ+VCK7iF5Ja4r8lq\nc/zac+MifP7quYH5Pxx/Hm9+5Aya98wdRPPJ3y2l+ZgXXqV574NH0/ycue8FZn9+ZDFte+w+nWm+\nulUxzdv1yqH5qFZjaD5+5q00H3pjB5o3mTGK5i//8RqaP/kkHx+nHteR5jf135/mbw5vQ/M2HXam\n+fo7x9P8iOEHBGbXZA2kbde+M4Dmm8rn0/yLlDtoPqb/STR/bzjftwZ0/hPNx721N82BySG5J55R\nsBzA1c65XgD2AzDEzHoBuB7AZOdcd39t18e1RhGR+qPxS0TqXOhkyjm3yjn3jf9zAYB5ANoDOAnA\n8/7dngdwcl11UkRke2j8EpH6sE3n582sM4A9AXwFoLVzbpUfrYZ3Gl1EJCFp/BKRuhL3ZMrMcgCM\nB3Clc+4XX3TmvC/nqvHKDTO72Mymm9n038hXeIlIgqmN8WttKf8OMBH57YprMmVmqfAGohedcxP8\nm9eYWVs/bwsgr6a2zrmnnXP9nHP9LOw6VxGRWlZb41fLNP4BGBH57QqdTJmZARgJYJ5z7sGY6E0A\ng/2fBwN4o/a7JyKy/TR+iUh9CC2NAOAAAOcBmG1mM/3bbgAwHMBYM7sQwBIA/HO/IiL1T+OXiNS5\n0MmUc24qggvNHL6tK7SaL00AACSRDAAaZ2XQvHkuz7NS+Im49BRezyctJE9KCl5+EqlBBQDJpC0A\noKyIxpvXraL5hvwCmpcU9aZ5kvFaRSkh26bPbr1o/snkf9N84U/zaH7+ecH1fwDAGjWl+dZivn1X\nrw6p47VidWC2fmN+YAYAW4r4urOT+dtLZWW8BllYnSmEvP9uju+bLoEvhqzN8asyJQVbWzYLzFtm\nfk3br/xgV5q/8wdeB6rLwtNoPrxvJs03X3I/zZ9oEVxmq+MsXgNr/Kdraf7CqZ1p/tjan2k+eRzf\nNhOW8xpfPRbPpfmlsy+g+ZoN79B8yurLaL7gppE0v/FW/mHSnw4KrmMHAON+XkjzdksW0/zoA84O\nzArL+X71zax+NJ85gz/2XU79kOZjj7+b5r1z3qb55AGf0bz1mZtoHq/ErbYnIiIisgPQZEpEREQk\nAk2mRERERCLQZEpEREQkAk2mRERERCLQZEpEREQkAk2mRERERCKIp2hn7SIlaZJTeC2mJo0b0Tw3\nO5vmqSn84SaH1IKykHo8tM5USB2plJC+rV0bXMcIADZtDq4RAwBNW7aieVISrxVUXl5K89Qkvu06\ntGtL84EDD6Z5aWkxzbObNqf5mrW8Ds7Wws0037hxA83Xrw9efuPGubRtTnYWzQsKeJ2qkhK+bcJK\nmIUpq6wIWf5v4zXZ+mYdMfrsfwTmj70xhLZ/b9hgmv/QZwzNCw7nx3izc4fS/NVevFba8JevC8w6\n//lQ2vaT9/nxdf73HWj+xLl/o/mb82bS/L4zD6R5zkRea+j9OfwY/eOtR9D88OVv0RzXvELjzFtK\naP5o5v40nz9tMc3HLhxI81a9g+unFbQbS9sO++BEmt+ZPYrmE9/oT/OOj/MaXmdfl07zRzvNpnne\nqPY0j9dvYxQUERERqSOaTImIiIhEoMmUiIiISASaTImIiIhEoMmUi
IiISASaTImIiIhEoMmUiIiI\nSAT1X2eKSEtLpXluLq8zlZ7B602E1ZEKq5cT1p7VigpbdlheHlLrBxVlPA5pX1lRTvOigkLePovX\nqTLju9q++/Tj6y/ktZaWL15E8/VrVtE8rM5WkyZNaQ4XvP127tyJNs0JqY+2ZMlymleE1AAzxx8b\nwPeNjLQ0mleGLf5XYkvBekyb/EJgXo7HafvDDxxB873m3Efz5z7rTPNVrXidqYm9H6T5/msGBmbz\nBoymbVtMeJfmzUZ8SPN+jb6m+bwL/0zz7IG8DtSTp+1B80kPH0vzH1/ldapyOvAaY+c++QnNh5W+\nTvOVmfy5rfzxW5pf/rsraL4gN7hOWLcex9G2HebvR/OTz7uW5i9fO4/muz/xEM373PQczd89/jCa\nF2efSnMMfIbnPp2ZEhEREYlAkykRERGRCDSZEhEREYlAkykRERGRCDSZEhEREYlAkykRERGRCOq9\nNIJD8OeoU1N4aYSw0gkV5SHlA1J5aYOw0gdR8rC2aSEfP09N5U9VUsjyy0M+Pr91Ky99sHF98Edn\nASBp82aaN2rUhOazZn5D88ULF9B81Spe+qBpU17aIC1k++bm5tA8KyN43+zYsR1tW1pcQvPC/AKa\nl5Xw5zYpie8bnTrx0g0FW7bSfM3adTT/tUjt0Ajt7gv+mHXervfQ9ss77E3zvKydaL7bS2Npfucr\nl/B89kSaH3DQXYHZ/eAfP9//3i9pPrI9L10w5IelNP+qLS99cuaoqTRv3bQZzfvhKJo/M/Btmj98\n4QU0z+3Ex6dPj+Tj/9NNptP8xMZ303z9/d/R/M97DA7MHpjC/65OPetMmrc48nqaL7hrf5qXjW9D\n8/cP4MdFm6f42P3WQXz9fPT+L52ZEhEREYlAkykRERGRCDSZEhEREYlAkykRERGRCDSZEhEREYlA\nkykRERGRCDSZEhEREYkgtM6UmXUEMBpAawAOwNPOuYfNbBiAPwGoKkB0g3OOFzIBkGTB87fkFF4P\nJyWZdzc1ldehCsuTkvjc0sy2u33YssEXjXJXxtddWUnz0lJey8gqeS2RRQt+pvm0r3mdqMOOOJLm\n/570Ac3feustmrds1ZLmO3XiNXxKthbRHCinaZPG2YFZdiavIbNyxRqa54fU8GrejNfQ2mtvXt+o\nPLj0GwDg8y+m8Tu4kPpuDag2x6/0yiJ02zIvMJ/x/p20L1M29aB52lheryet17c0n7uC12L66NlB\nvH3B4YHZ6U/sSdt2ahdcowoAHhrCx58te02mecdT+bYtbD+a5hP/ehnNjzqH16m6a+wmmrfvyOvo\nff/zvjT/8npeR691fz7+nd71d7x91sM0/+PY4L+tK47iz93CP/P6Zrse/wea9x7J/zZcfSevI9Xn\n2y00b7KY15Eq7XYDzYFrQnJPPEU7ywFc7Zz7xsxyAcwws0l+9pBz7v641iQiUv80folInQudTDnn\nVgFY5f9cYGbzALSv646JiESl8UtE6sM2XTNlZp0B7AngK/+mv5jZLDN71sz4ew0iIg1I45eI1JW4\nJ1NmlgNgPIArnXP5AJ4A0BVAX3iv/B4IaHexmU03s+nka/lEROpMbYxfhev491eKyG9XXJMpM0uF\nNxC96JybAADOuTXOuQrnXCWAZwD0r6mtc+5p51w/51y/sIusRURqW22NXzkt+BemishvV+hkyryP\nsI0EMM8592DM7W1j7nYKgDm13z0Rke2n8UtE6kM8n+Y7AMB5AGab2Uz/thsAnGVmfeF93HgxAP75\nSBGR+qfxS0TqXDyf5puKmqsghdaUqs5gMFJvKTmZ15mKmieF5SG1oFJS+OZKT08PzKLWuEISr+WT\nlMLfQ62o5HWSyst5HasVy5bS/PXXxtO8UePGNG/VqhXNmzXldVy6de1K89WrVvN8xUqat2/P+9e2\nTYvArLKCb9v169bRvE1IDa0999qL5kUlpTT/ZOoXNC8v4+1TQ46L4tKGq0NVm+NXeUk28hYHb+vH\nRwyh7Y85me9D41/nx8jnh75I82eN17G666fraH7WwxmB2V7nvUnbvvT5DJovuXk+zdsufIUv/11+\nwe3ERw6l+R9v6kfzn678lOY35q6geZ9uF9D8zeGv03zqMbyO19qPD6T5T8/Npvni0/gY8u3y4DqB\n3bufQdtOnsbrQB1X2IXmA8p+pPmxju/3Z2edS/PbT+H73ur9Z9I8XqqALiIiIhKBJlMiIiIiEWgy\nJSIiIhKBJlMiIiIiEWgyJSIiIhKBJlMiIiIiEWgyJSIiIhJBPEU7a29lKUlo0TT4KxmaNs6i7XOy\n02ienRWWZ9I8IzO4zgoApKXx5efmBj+2lGS+qS2J14lyxutEJaeE1OAynpcUbaF5aclWmu/SbWea\np4fUwUptnE3z/v14HZYuXfj654R8lVFyyPdGduzYht8Bwc/PhvVracv27VvTfMAAXmNm0+Z8mn/4\n8cc0b5zL9/sk43WiwupYFWwtofmOoiCrAp/2CT5OevY7jrbv2/Iymh/+Da/31eeGSpq/cdqFNP/m\nvo9pvuW2zwKzln/hx/+3/e6n+avN+Pj346Ov0rz4869ofvrfd6f5s9Om0/yuU3mNr9KfFtI869EH\naT41hdcyWtaNr//RI/j4+MnnvI7eRev+TvOH7s0NzAo++BNtW/wD//KAQ7sfSfMR/yii+dgkvu8d\ndf7hNH/v0J1o3uaoXjSPl85MiYiIiESgyZSIiIhIBJpMiYiIiESgyZSIiIhIBJpMiYiIiESgyZSI\niIhIBJpMiYiIiERQr3WmcnOycdjB+wTmqcm8GFCLprwWR24Or1OVncVrdWRk8Dw5iddqysgIrteT\nnMzbWkgdJDNeYyY5Odq8uKhgM81Tk3ghpv333ZvmjbN5LaN58+bRvEWT4DooANA0l9cQ23mn9jRv\n2awZzYuKea2TgsJNgVmXLrzOSd++e9A8JYXXNysoWM+X36cnzQsLC2mel5fH22/h22bV+h9ovqPI\nzi9F/8lLAvOP08+m7W/fsh/Nbz6W1wp6pPRumrd5MrhvAJD2zQ00P+3i8YHZvyr4PnDW1XNpPuJl\nXkdq4qSraP7EbtNoPv/pz2l+ZLNnaP7xX26h+V1D+N+WK/vzWnF7/ms+zcf87UeaT3qc1yBb/kMq\nzfOO5bWgWiw+KzA75OPf07bzBj1P812/59v2ycv4ft2tzz00P3r+GJqX7n8rzVNf24Xm8dKZKRER\nEZEINJkSERERiUCTKREREZEINJkSERERiUCTKREREZEINJkSERERiUCTKREREZEI6rXOVEZGOnru\n0j0wrygt4e3TeXfT03huIXPHivIy3j6F11oqLy8lKe9bSgrPk5N438NqYBUV8VpAX3/9Nc0zs3gd\np5YtWtHcVVZEyi2kEFdBPq+TVbSV11JKCqlxlhzy3Pfdc/fAbNdde9C2hYUFNF+5ktcfSk3l+05G\nOq9BU1HO61i1btWC5o1L
+HEL/DrqTOVvXYz3ZlwYmOf87SnavtM+vFZS5pT+NL/wmcU03/W6i2ne\n4ol+NH+mdaPA7PMP+f6/26QPeN71UJqf8NRNNP/TuAE0H//1YJrntuLjw77vPEHzloteoHnB5S/R\nfPruh9N8SIdJND/k3hE0H7rvLJrf+cQQml9/VPDzMzOzL2173MW8vtrpx7aj+aARV9B84msP0fzs\nnXn9ttNXNKf5e935vhUvnZkSERERiUCTKREREZEINJkSERERiUCTKREREZEINJkSERERiUCTKRER\nEZEINJkSERERiSC0zpSZZQD4BEC6f/9xzrlbzawLgFcANAcwA8B5zjlWaAmlpaVYvHhJYJ6Zxuvh\nNGmUQ/PkkDwlpJaQJfFaKsnJfO7J6v2kp6fTthkZGXzZKXzdzvG+l5QU0zwvbw3Nu3XrRvPsbL7t\nKyp5/1q3bs3bV/A6VFu2bKF5aUgNs6wc/vzs0ZfXWmnXrm1gtmnTBto2P5/nKSH7XUlInafiYv7c\nl5eX0zxs30JI3NBqawyrcLui0L0SuJ4zXuS1hk58qTPND3nhVpq/+9olND+7E6+3Uz55b5pXPB1c\n6+mRvN607YXN+T622zG8hl/Ly4PrtAFAdnbw8QUAY194l+YpF7xB840ZD9L8psv+SfOCgTvT/PQ/\nn07zVpXjaT5kJX9uP3iwE80P/vAqmi/LDn78jUe3oW0fenkQzSdcwLfNwK78uX+t2d9ovvBPfPwa\nWfoozV9MX0TzeMVzZqoEwGHOuT0A9AVwjJntB+BeAA8557oB2AgguJqdiEjD0RgmInUqdDLlPFXl\no1P9fw7AYQDG+bc/D+DkOumhiEgEGsNEpK7Fdc2UmSWb2UwAeQAmAVgAYJNzrur82nIA7eumiyIi\n0WgME5G6FNdkyjlX4ZzrC6ADgP4Aesa7AjO72Mymm9n0omJ6SZWISJ3Y3jEsdvyq2LKxTvsoIjuu\nbfo0n3NuE4CPAOwPoImZVV1x3QHAioA2Tzvn+jnn+mVm8C9UFRGpS9s6hsWOX8nZTeuxpyKyIwmd\nTJlZSzNr4v+cCeBIAPPgDUin+XcbDIB/XEJEpAFoDBORuhZaGgFAWwDPm1kyvMnXWOfc22b2PYBX\nzOxOAN8CGFmH/RQR2V4aw0SkToVOppxzswDsWcPtC+FdexC3kuISLFwQXNOhR7eutH1ZGa81BMdP\ntOU2akTzjAxea6iyspLmSUnB609N45u60vHHFna9mQvpW1kZr8XRti2v45KbmxuyfF5HpjTkuduy\nZSvNw2oplZWFXI/n+FzqvwEAAAXTSURBVPbZuUsHmnfqwGutrF27NjBbv2E9bRtW52nrlpBtWxrt\nWsSwOlJhNb7C8oZWW2NY062rcNr0OwLzF0pep+3POW4uzW+6+F6aH9b5CZpfut9xNL/6U15n76Jl\nUwKz3XZaQNte8M2dND/yjc9p/u1tQ2j+dm9ex26/4Ytpvvs/g49PALin7WaaH3NJFs0/L1lJ85Tn\nhtL8oZd5nb82Y2+m+ZDTrqD5sEP4Mbru5uDxb8v/TaRt1+8efEwAQJOjD6L5gK8up/mh/7yU5i27\nPEXzJx/j+9ZzNz9J870vovF/qAK6iIiISASaTImIiIhEoMmUiIiISASaTImIiIhEoMmUiIiISASa\nTImIiIhEoMmUiIiISAQWVmOmVldmthbAkpibWgBYV28d2HaJ3L9E7huQ2P1L5L4Bv77+7eSca1lX\nnakvGr9qXSL3L5H7BiR2/xK5b0AdjV/1Opn6n5WbTXfO9WuwDoRI5P4lct+AxO5fIvcNUP92FIm+\nHdS/7ZfIfQMSu3+J3Deg7vqnt/lEREREItBkSkRERCSChp5MPd3A6w+TyP1L5L4Bid2/RO4boP7t\nKBJ9O6h/2y+R+wYkdv8SuW9AHfWvQa+ZEhEREdnRNfSZKREREZEdWoNMpszsGDP70cx+NrPrG6IP\njJktNrPZZjbTzKYnQH+eNbM8M5sTc1szM5tkZvP9/5smWP+GmdkKfxvONLNjG6hvHc3sIzP73szm\nmtkV/u0Nvv1I3xJl22WY2TQz+87v323+7V3M7Cv/+B1jZmkN0b+GpDFsm/qi8Wv7+5aw41dI/xJl\n+9XfGOacq9d/AJIBLACwM4A0AN8B6FXf/Qjp42IALRq6HzH9ORjAXgDmxNw2AsD1/s/XA7g3wfo3\nDMA1CbDt2gLYy/85F8BPAHolwvYjfUuUbWcAcvyfUwF8BWA/AGMBDPJvfxLAnxu6r/W8XTSGbVtf\nNH5tf98SdvwK6V+ibL96G8Ma4sxUfwA/O+cWOudKAbwC4KQG6McOwzn3CYAN1W4+CcDz/s/PAzi5\nXjsVI6B/CcE5t8o5943/cwGAeQDaIwG2H+lbQnCeQv/XVP+fA3AYgHH+7Q267zUQjWHbQOPX9kvk\n8SukfwmhPsewhphMtQewLOb35Uigje9zAD4wsxlmdnFDdyZAa+fcKv/n1QBaN2RnAvzFzGb5p9Eb\n7DR+FTPrDGBPeK9OEmr7VesbkCDbzsySzWwmgDwAk+CdkdnknCv375KIx29d0xgWXUIdfwES4his\nksjjF6AxTBeg1+xA59xeAH4HYIiZHdzQHWKcd64y0T6W+QSArgD6AlgF4IGG7IyZ5QAYD+BK51x+\nbNbQ26+GviXMtnPOVTjn+gLoAO+MTM+G6otskx1mDGvo4y9AwhyDQGKPX4DGMKBhJlMrAHSM+b2D\nf1vCcM6t8P/PA/AavCcg0awxs7YA4P+f18D9+QXn3Bp/J64E8AwacBuaWSq8A/1F59wE/+aE2H41\n9S2Rtl0V59wmAB8B2B9AEzNL8aOEO37rgcaw6BLi+AuSSMdgIo9fQf1LpO1Xpa7HsIaYTH0NoLt/\nNX0agEEA3myAftTIzLLNLLfqZwBHAZjDWzWINwEM9n8eDOCNBuzL/6g60H2noIG2oZkZgJEA5jnn\nHoyJGnz7BfUtgbZdSzNr4v+cCeBIeNdEfATgNP9uCbfv1QONYdE1+PHHJNAxmLDjF6Ax7Bca6Ar7\nY+Fd9b8AwI0N0QfSt53hfTrnOwBzE6F/AF6Gd6q0DN77uxcCaA5gMoD5AP4NoFmC9e8FALMBzIJ3\n4LdtoL4dCO8U+CwAM/1/xybC9iN9S5Rt1wfAt34/5gC4xb99ZwDTAPwM4FUA6Q217zXUP41h29Qf\njV/b37eEHb9C+pco26/exjBVQBcRERGJQBegi4iIiESgyZSIiIhIBJpMiYiIiESgyZSIiIhIBJpM\niYiIiESgyZSIiIhIBJpMiYiIiESgyZSIiIhIBP8PjtButOJymWEAAAAASUVORK5CYII=\n", | |
| "text/plain": [ | |
| "<Figure size 720x1080 with 2 Axes>" | |
| ] | |
| }, | |
| "metadata": { | |
| "tags": [] | |
| }, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "# Fix seed so we always get the same result from the sampler\n", | |
| "# seed = 42\n", | |
| "# random.seed(seed) <--\n", | |
| "# np.random.seed(seed) <--\n", | |
| "# torch.manual_seed(seed) <--\n", | |
| "# torch.cuda.manual_seed(seed) <-- This BLOCK is messing up wiht the keys\n", | |
| "\n", | |
| "# Getting the plaintext copy of the data\n", | |
| "actual_data, actual_label = next(iter(test_loader))\n", | |
| "\n", | |
| "plt.figure(figsize=(10,15))\n", | |
| "plt.subplot(121)\n", | |
| "plt.title(f\"Original data (label {int(actual_label[0])})\")\n", | |
| "plt.imshow(np.moveaxis(actual_data[0].numpy(), 0, 2))\n", | |
| "\n", | |
| "\n", | |
| "# Visualzing the secret data\n", | |
| "\n", | |
| "# NOTE: Other thing I'm testing\n", | |
| "sample_idx = list(bob._objects.keys())[0]\n", | |
| "bob_sample = bob.get_obj(sample_idx).numpy()[0]\n", | |
| "bob_sample = bob_sample - bob_sample.min()\n", | |
| "bob_sample = bob_sample / bob_sample.max()\n", | |
| "\n", | |
| "plt.subplot(122)\n", | |
| "plt.title(\"This is what we can see from Bobs's data\")\n", | |
| "plt.imshow(np.moveaxis(bob_sample/bob_sample.max(), 0, -1))\n", | |
| "plt.show()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "MRKSuKv0uRiQ" | |
| }, | |
| "source": [ | |
| "### Testing\n", | |
| "\n", | |
| "Now we do define the test function and we do the privacy-preserving predictions on the last part of the data" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": {}, | |
| "colab_type": "code", | |
| "id": "JNGFND2yckHV" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "def test(args, model, test_loader):\n", | |
| " model.eval()\n", | |
| " n_correct_priv = 0\n", | |
| " n_total = 0\n", | |
| " with torch.no_grad():\n", | |
| " for idx, (data, target) in enumerate(test_loader):\n", | |
| " output = model(data)\n", | |
| " pred = output.argmax(dim=1) \n", | |
| " n_correct_priv += pred.eq(target.view_as(pred)).sum() \n", | |
| " n_total += args.test_batch_size\n", | |
| " # The following test function performs the encrypted evaluation. \n", | |
| " # The model weights, the data inputs, the prediction and the target \n", | |
| " # used for scoring are all encrypted! However as you can observe, \n", | |
| " # the syntax is very similar to normal PyTorch testing! Nice!\n", | |
| " # The only thing we decrypt from the server side is \n", | |
| " # the final score at the end to verify predictions were on average good. \n", | |
| " n_correct = n_correct_priv.copy().get().float_precision().long().item()\n", | |
| " \n", | |
| " print('Test set: Accuracy: {}/{} ({:.0f}%)'.format(\n", | |
| " n_correct, n_total,\n", | |
| " 100. * n_correct / n_total))\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": 0, | |
| "metadata": { | |
| "colab": { | |
| "base_uri": "https://localhost:8080/", | |
| "height": 1511 | |
| }, | |
| "colab_type": "code", | |
| "id": "0o0pL8R-cm_X", | |
| "outputId": "1b9daf7a-66c4-4410-c5ab-3f6140af90b7" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "Test set: Accuracy: 21/50 (42%)\n", | |
| "Test set: Accuracy: 43/100 (43%)\n", | |
| "Test set: Accuracy: 63/150 (42%)\n", | |
| "Test set: Accuracy: 83/200 (42%)\n", | |
| "Test set: Accuracy: 107/250 (43%)\n", | |
| "Test set: Accuracy: 121/300 (40%)\n", | |
| "Test set: Accuracy: 144/350 (41%)\n", | |
| "Test set: Accuracy: 166/400 (42%)\n", | |
| "Test set: Accuracy: 187/450 (42%)\n", | |
| "Test set: Accuracy: 210/500 (42%)\n", | |
| "Test set: Accuracy: 233/550 (42%)\n", | |
| "Test set: Accuracy: 260/600 (43%)\n", | |
| "Test set: Accuracy: 283/650 (44%)\n", | |
| "Test set: Accuracy: 310/700 (44%)\n", | |
| "Test set: Accuracy: 336/750 (45%)\n", | |
| "Test set: Accuracy: 361/800 (45%)\n", | |
| "Test set: Accuracy: 386/850 (45%)\n", | |
| "Test set: Accuracy: 414/900 (46%)\n", | |
| "Test set: Accuracy: 439/950 (46%)\n", | |
| "Test set: Accuracy: 463/1000 (46%)\n", | |
| "Test set: Accuracy: 491/1050 (47%)\n", | |
| "Test set: Accuracy: 520/1100 (47%)\n", | |
| "Test set: Accuracy: 544/1150 (47%)\n", | |
| "Test set: Accuracy: 559/1200 (47%)\n", | |
| "Test set: Accuracy: 584/1250 (47%)\n", | |
| "Test set: Accuracy: 610/1300 (47%)\n", | |
| "Test set: Accuracy: 634/1350 (47%)\n", | |
| "Test set: Accuracy: 657/1400 (47%)\n", | |
| "Test set: Accuracy: 682/1450 (47%)\n", | |
| "Test set: Accuracy: 703/1500 (47%)\n", | |
| "Test set: Accuracy: 723/1550 (47%)\n", | |
| "Test set: Accuracy: 749/1600 (47%)\n", | |
| "Test set: Accuracy: 772/1650 (47%)\n", | |
| "Test set: Accuracy: 800/1700 (47%)\n", | |
| "Test set: Accuracy: 821/1750 (47%)\n", | |
| "Test set: Accuracy: 842/1800 (47%)\n", | |
| "Test set: Accuracy: 862/1850 (47%)\n", | |
| "Test set: Accuracy: 889/1900 (47%)\n", | |
| "Test set: Accuracy: 912/1950 (47%)\n", | |
| "Test set: Accuracy: 935/2000 (47%)\n", | |
| "Test set: Accuracy: 954/2050 (47%)\n", | |
| "Test set: Accuracy: 981/2100 (47%)\n", | |
| "Test set: Accuracy: 1003/2150 (47%)\n", | |
| "Test set: Accuracy: 1021/2200 (46%)\n", | |
| "Test set: Accuracy: 1042/2250 (46%)\n", | |
| "Test set: Accuracy: 1066/2300 (46%)\n", | |
| "Test set: Accuracy: 1090/2350 (46%)\n", | |
| "Test set: Accuracy: 1110/2400 (46%)\n", | |
| "Test set: Accuracy: 1131/2450 (46%)\n", | |
| "Test set: Accuracy: 1155/2500 (46%)\n", | |
| "Test set: Accuracy: 1175/2550 (46%)\n", | |
| "Test set: Accuracy: 1191/2600 (46%)\n", | |
| "Test set: Accuracy: 1212/2650 (46%)\n", | |
| "Test set: Accuracy: 1238/2700 (46%)\n", | |
| "Test set: Accuracy: 1266/2750 (46%)\n", | |
| "Test set: Accuracy: 1300/2800 (46%)\n", | |
| "Test set: Accuracy: 1324/2850 (46%)\n", | |
| "Test set: Accuracy: 1344/2900 (46%)\n", | |
| "Test set: Accuracy: 1364/2950 (46%)\n", | |
| "Test set: Accuracy: 1390/3000 (46%)\n", | |
| "Test set: Accuracy: 1412/3050 (46%)\n", | |
| "Test set: Accuracy: 1436/3100 (46%)\n", | |
| "Test set: Accuracy: 1458/3150 (46%)\n", | |
| "Test set: Accuracy: 1481/3200 (46%)\n", | |
| "Test set: Accuracy: 1501/3250 (46%)\n", | |
| "Test set: Accuracy: 1527/3300 (46%)\n", | |
| "Test set: Accuracy: 1552/3350 (46%)\n", | |
| "Test set: Accuracy: 1572/3400 (46%)\n", | |
| "Test set: Accuracy: 1597/3450 (46%)\n", | |
| "Test set: Accuracy: 1618/3500 (46%)\n", | |
| "Test set: Accuracy: 1636/3550 (46%)\n", | |
| "Test set: Accuracy: 1665/3600 (46%)\n", | |
| "Test set: Accuracy: 1693/3650 (46%)\n", | |
| "Test set: Accuracy: 1720/3700 (46%)\n", | |
| "Test set: Accuracy: 1741/3750 (46%)\n", | |
| "Test set: Accuracy: 1765/3800 (46%)\n", | |
| "Test set: Accuracy: 1787/3850 (46%)\n", | |
| "Test set: Accuracy: 1810/3900 (46%)\n", | |
| "Test set: Accuracy: 1830/3950 (46%)\n", | |
| "Test set: Accuracy: 1857/4000 (46%)\n", | |
| "Test set: Accuracy: 1881/4050 (46%)\n", | |
| "Test set: Accuracy: 1897/4100 (46%)\n", | |
| "Test set: Accuracy: 1917/4150 (46%)\n", | |
| "Test set: Accuracy: 1936/4200 (46%)\n", | |
| "Test set: Accuracy: 1960/4250 (46%)\n", | |
| "Test set: Accuracy: 1981/4300 (46%)\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "test(args, model, private_test_loader)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "metadata": { | |
| "colab_type": "text", | |
| "id": "kloCn-I0wUt6" | |
| }, | |
| "source": [ | |
| "As you can see the testing results are of the same accuracy levels as the local test thas has been done above." | |
| ] | |
| } | |
| ], | |
| "metadata": { | |
| "accelerator": "GPU", | |
| "colab": { | |
| "collapsed_sections": [], | |
| "name": "CIFAR Privacy-Preserving inference.ipynb", | |
| "provenance": [] | |
| }, | |
| "kernelspec": { | |
| "display_name": "Python 3", | |
| "language": "python", | |
| "name": "python3" | |
| }, | |
| "language_info": { | |
| "codemirror_mode": { | |
| "name": "ipython", | |
| "version": 3 | |
| }, | |
| "file_extension": ".py", | |
| "mimetype": "text/x-python", | |
| "name": "python", | |
| "nbconvert_exporter": "python", | |
| "pygments_lexer": "ipython3", | |
| "version": "3.6.8" | |
| } | |
| }, | |
| "nbformat": 4, | |
| "nbformat_minor": 1 | |
| } |