@3sdd
Created December 22, 2019 20:35
PyTorchGAN_DCGAN.ipynb
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "PyTorchGAN_DCGAN.ipynb",
"provenance": [],
"collapsed_sections": [],
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/3sdd/bd8a038a08bf87d258f57ab9aae9466c/pytorchgan_dcgan.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "code",
"metadata": {
"id": "d4VZpkw7cGbQ",
"colab_type": "code",
"outputId": "bbfb0413-1f59-4308-db23-7963da959138",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 312
}
},
"source": [
"!nvidia-smi"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Sun Dec 22 19:09:04 2019 \n",
"+-----------------------------------------------------------------------------+\n",
"| NVIDIA-SMI 440.44 Driver Version: 418.67 CUDA Version: 10.1 |\n",
"|-------------------------------+----------------------+----------------------+\n",
"| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |\n",
"| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |\n",
"|===============================+======================+======================|\n",
"| 0 Tesla P100-PCIE... Off | 00000000:00:04.0 Off | 0 |\n",
"| N/A 39C P0 26W / 250W | 0MiB / 16280MiB | 0% Default |\n",
"+-------------------------------+----------------------+----------------------+\n",
" \n",
"+-----------------------------------------------------------------------------+\n",
"| Processes: GPU Memory |\n",
"| GPU PID Type Process name Usage |\n",
"|=============================================================================|\n",
"| No running processes found |\n",
"+-----------------------------------------------------------------------------+\n"
],
"name": "stdout"
}
]
},
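{
"cell_type": "markdown",
"metadata": {},
"source": [
"The cell above confirms that Colab has attached a CUDA GPU (a Tesla P100 in this run). A minimal check from the Python side, assuming the standard Colab PyTorch install, would be:\n",
"\n",
"```python\n",
"import torch\n",
"\n",
"# Should print True and the attached device name when a GPU runtime is active\n",
"print(torch.cuda.is_available())\n",
"print(torch.cuda.get_device_name(0))\n",
"```"
]
},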
{
"cell_type": "code",
"metadata": {
"id": "Dy_RYzwmcXt4",
"colab_type": "code",
"outputId": "3b6fcc58-5075-417a-9d38-38227b7fe269",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 451
}
},
"source": [
"!git clone https://github.com/eriklindernoren/PyTorch-GAN\n",
"%cd PyTorch-GAN/\n",
"!pip install -r requirements.txt"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Cloning into 'PyTorch-GAN'...\n",
"remote: Enumerating objects: 1277, done.\u001b[K\n",
"remote: Total 1277 (delta 0), reused 0 (delta 0), pack-reused 1277\u001b[K\n",
"Receiving objects: 100% (1277/1277), 68.04 MiB | 56.50 MiB/s, done.\n",
"Resolving deltas: 100% (747/747), done.\n",
"/content/PyTorch-GAN\n",
"Requirement already satisfied: torch>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 1)) (1.3.1)\n",
"Requirement already satisfied: torchvision in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 2)) (0.4.2)\n",
"Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 3)) (3.1.2)\n",
"Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 4)) (1.17.4)\n",
"Requirement already satisfied: scipy in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 5)) (1.3.3)\n",
"Requirement already satisfied: pillow in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 6)) (4.3.0)\n",
"Requirement already satisfied: urllib3 in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 7)) (1.24.3)\n",
"Requirement already satisfied: scikit-image in /usr/local/lib/python3.6/dist-packages (from -r requirements.txt (line 8)) (0.15.0)\n",
"Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from torchvision->-r requirements.txt (line 2)) (1.12.0)\n",
"Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->-r requirements.txt (line 3)) (0.10.0)\n",
"Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->-r requirements.txt (line 3)) (1.1.0)\n",
"Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->-r requirements.txt (line 3)) (2.6.1)\n",
"Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->-r requirements.txt (line 3)) (2.4.5)\n",
"Requirement already satisfied: olefile in /usr/local/lib/python3.6/dist-packages (from pillow->-r requirements.txt (line 6)) (0.46)\n",
"Requirement already satisfied: PyWavelets>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->-r requirements.txt (line 8)) (1.1.1)\n",
"Requirement already satisfied: networkx>=2.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->-r requirements.txt (line 8)) (2.4)\n",
"Requirement already satisfied: imageio>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from scikit-image->-r requirements.txt (line 8)) (2.4.1)\n",
"Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from kiwisolver>=1.0.1->matplotlib->-r requirements.txt (line 3)) (42.0.2)\n",
"Requirement already satisfied: decorator>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from networkx>=2.0->scikit-image->-r requirements.txt (line 8)) (4.4.1)\n"
],
"name": "stdout"
}
]
},
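{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell trains the DCGAN on MNIST via `dcgan.py`. The `Namespace(...)` line in its output lists the script's command-line options (`--n_epochs`, `--batch_size`, `--lr`, `--latent_dim`, `--img_size`, `--channels`, `--sample_interval`, ...), so other settings can be passed the same way. A shorter run with different settings might look like the sketch below (values are illustrative; only `--batch_size 1024` is used in the actual run):\n",
"\n",
"```python\n",
"# Hypothetical alternative settings for dcgan.py, using flags shown in the Namespace output\n",
"!python dcgan.py --n_epochs 50 --batch_size 256 --lr 0.0002 --sample_interval 400\n",
"```"
]
},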
{
"cell_type": "code",
"metadata": {
"id": "rmFjrLrkccyH",
"colab_type": "code",
"outputId": "69704a1c-9c7e-4fcb-bbc2-8e3349f79043",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 1000
}
},
"source": [
"%cd /content/PyTorch-GAN/implementations/dcgan/\n",
"!python dcgan.py --batch_size 1024"
],
"execution_count": 3,
"outputs": [
{
"output_type": "stream",
"text": [
"/content/PyTorch-GAN/implementations/dcgan\n",
"Namespace(b1=0.5, b2=0.999, batch_size=1024, channels=1, img_size=32, latent_dim=100, lr=0.0002, n_cpu=8, n_epochs=200, sample_interval=400)\n",
"Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz to ../../data/mnist/MNIST/raw/train-images-idx3-ubyte.gz\n",
"9920512it [00:00, 26632234.02it/s] \n",
"Extracting ../../data/mnist/MNIST/raw/train-images-idx3-ubyte.gz to ../../data/mnist/MNIST/raw\n",
"Downloading http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz to ../../data/mnist/MNIST/raw/train-labels-idx1-ubyte.gz\n",
"32768it [00:00, 448427.37it/s]\n",
"Extracting ../../data/mnist/MNIST/raw/train-labels-idx1-ubyte.gz to ../../data/mnist/MNIST/raw\n",
"Downloading http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz to ../../data/mnist/MNIST/raw/t10k-images-idx3-ubyte.gz\n",
"1654784it [00:00, 7756521.33it/s] \n",
"Extracting ../../data/mnist/MNIST/raw/t10k-images-idx3-ubyte.gz to ../../data/mnist/MNIST/raw\n",
"Downloading http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz to ../../data/mnist/MNIST/raw/t10k-labels-idx1-ubyte.gz\n",
"8192it [00:00, 188450.15it/s]\n",
"Extracting ../../data/mnist/MNIST/raw/t10k-labels-idx1-ubyte.gz to ../../data/mnist/MNIST/raw\n",
"Processing...\n",
"Done!\n",
"[Epoch 0/200] [Batch 0/59] [D loss: 0.693297] [G loss: 0.712285]\n",
"[Epoch 0/200] [Batch 1/59] [D loss: 0.693157] [G loss: 0.711613]\n",
"[Epoch 0/200] [Batch 2/59] [D loss: 0.693071] [G loss: 0.710946]\n",
"[Epoch 0/200] [Batch 3/59] [D loss: 0.692900] [G loss: 0.710350]\n",
"[Epoch 0/200] [Batch 4/59] [D loss: 0.692759] [G loss: 0.709665]\n",
"[Epoch 0/200] [Batch 5/59] [D loss: 0.692555] [G loss: 0.709005]\n",
"[Epoch 0/200] [Batch 6/59] [D loss: 0.692355] [G loss: 0.708297]\n",
"[Epoch 0/200] [Batch 7/59] [D loss: 0.692079] [G loss: 0.707614]\n",
"[Epoch 0/200] [Batch 8/59] [D loss: 0.691774] [G loss: 0.706711]\n",
"[Epoch 0/200] [Batch 9/59] [D loss: 0.691521] [G loss: 0.705680]\n",
"[Epoch 0/200] [Batch 10/59] [D loss: 0.691130] [G loss: 0.704435]\n",
"[Epoch 0/200] [Batch 11/59] [D loss: 0.690937] [G loss: 0.702763]\n",
"[Epoch 0/200] [Batch 12/59] [D loss: 0.690907] [G loss: 0.700451]\n",
"[Epoch 0/200] [Batch 13/59] [D loss: 0.690924] [G loss: 0.697742]\n",
"[Epoch 0/200] [Batch 14/59] [D loss: 0.690873] [G loss: 0.695238]\n",
"[Epoch 0/200] [Batch 15/59] [D loss: 0.691678] [G loss: 0.691481]\n",
"[Epoch 0/200] [Batch 16/59] [D loss: 0.691869] [G loss: 0.688974]\n",
"[Epoch 0/200] [Batch 17/59] [D loss: 0.692920] [G loss: 0.686545]\n",
"[Epoch 0/200] [Batch 18/59] [D loss: 0.693053] [G loss: 0.686700]\n",
"[Epoch 0/200] [Batch 19/59] [D loss: 0.693405] [G loss: 0.687200]\n",
"[Epoch 0/200] [Batch 20/59] [D loss: 0.693037] [G loss: 0.689098]\n",
"[Epoch 0/200] [Batch 21/59] [D loss: 0.692744] [G loss: 0.691385]\n",
"[Epoch 0/200] [Batch 22/59] [D loss: 0.692135] [G loss: 0.694105]\n",
"[Epoch 0/200] [Batch 23/59] [D loss: 0.691381] [G loss: 0.697101]\n",
"[Epoch 0/200] [Batch 24/59] [D loss: 0.690421] [G loss: 0.700271]\n",
"[Epoch 0/200] [Batch 25/59] [D loss: 0.688975] [G loss: 0.704383]\n",
"[Epoch 0/200] [Batch 26/59] [D loss: 0.687533] [G loss: 0.708523]\n",
"[Epoch 0/200] [Batch 27/59] [D loss: 0.685831] [G loss: 0.713528]\n",
"[Epoch 0/200] [Batch 28/59] [D loss: 0.684240] [G loss: 0.718151]\n",
"[Epoch 0/200] [Batch 29/59] [D loss: 0.683141] [G loss: 0.721105]\n",
"[Epoch 0/200] [Batch 30/59] [D loss: 0.683488] [G loss: 0.721458]\n",
"[Epoch 0/200] [Batch 31/59] [D loss: 0.683939] [G loss: 0.719254]\n",
"[Epoch 0/200] [Batch 32/59] [D loss: 0.685025] [G loss: 0.713590]\n",
"[Epoch 0/200] [Batch 33/59] [D loss: 0.684878] [G loss: 0.707927]\n",
"[Epoch 0/200] [Batch 34/59] [D loss: 0.684419] [G loss: 0.700486]\n",
"[Epoch 0/200] [Batch 35/59] [D loss: 0.681913] [G loss: 0.694297]\n",
"[Epoch 0/200] [Batch 36/59] [D loss: 0.678433] [G loss: 0.688242]\n",
"[Epoch 0/200] [Batch 37/59] [D loss: 0.674760] [G loss: 0.686116]\n",
"[Epoch 0/200] [Batch 38/59] [D loss: 0.669461] [G loss: 0.681021]\n",
"[Epoch 0/200] [Batch 39/59] [D loss: 0.662687] [G loss: 0.679210]\n",
"[Epoch 0/200] [Batch 40/59] [D loss: 0.658942] [G loss: 0.671297]\n",
"[Epoch 0/200] [Batch 41/59] [D loss: 0.655516] [G loss: 0.661073]\n",
"[Epoch 0/200] [Batch 42/59] [D loss: 0.657300] [G loss: 0.642795]\n",
"[Epoch 0/200] [Batch 43/59] [D loss: 0.661842] [G loss: 0.620194]\n",
"[Epoch 0/200] [Batch 44/59] [D loss: 0.673018] [G loss: 0.599659]\n",
"[Epoch 0/200] [Batch 45/59] [D loss: 0.680457] [G loss: 0.584012]\n",
"[Epoch 0/200] [Batch 46/59] [D loss: 0.693852] [G loss: 0.582227]\n",
"[Epoch 0/200] [Batch 47/59] [D loss: 0.697778] [G loss: 0.592895]\n",
"[Epoch 0/200] [Batch 48/59] [D loss: 0.698040] [G loss: 0.609568]\n",
"[Epoch 0/200] [Batch 49/59] [D loss: 0.696136] [G loss: 0.634670]\n",
"[Epoch 0/200] [Batch 50/59] [D loss: 0.693818] [G loss: 0.661065]\n",
"[Epoch 0/200] [Batch 51/59] [D loss: 0.690809] [G loss: 0.687528]\n",
"[Epoch 0/200] [Batch 52/59] [D loss: 0.687954] [G loss: 0.710390]\n",
"[Epoch 0/200] [Batch 53/59] [D loss: 0.688335] [G loss: 0.719064]\n",
"[Epoch 0/200] [Batch 54/59] [D loss: 0.692532] [G loss: 0.720412]\n",
"[Epoch 0/200] [Batch 55/59] [D loss: 0.699031] [G loss: 0.708979]\n",
"[Epoch 0/200] [Batch 56/59] [D loss: 0.704981] [G loss: 0.698369]\n",
"[Epoch 0/200] [Batch 57/59] [D loss: 0.704371] [G loss: 0.692246]\n",
"[Epoch 0/200] [Batch 58/59] [D loss: 0.699354] [G loss: 0.690341]\n",
"[Epoch 1/200] [Batch 0/59] [D loss: 0.692744] [G loss: 0.690592]\n",
"[Epoch 1/200] [Batch 1/59] [D loss: 0.687252] [G loss: 0.689049]\n",
"[Epoch 1/200] [Batch 2/59] [D loss: 0.684595] [G loss: 0.685629]\n",
"[Epoch 1/200] [Batch 3/59] [D loss: 0.683002] [G loss: 0.676710]\n",
"[Epoch 1/200] [Batch 4/59] [D loss: 0.684915] [G loss: 0.662315]\n",
"[Epoch 1/200] [Batch 5/59] [D loss: 0.691454] [G loss: 0.647059]\n",
"[Epoch 1/200] [Batch 6/59] [D loss: 0.698540] [G loss: 0.636100]\n",
"[Epoch 1/200] [Batch 7/59] [D loss: 0.705944] [G loss: 0.634703]\n",
"[Epoch 1/200] [Batch 8/59] [D loss: 0.706609] [G loss: 0.644098]\n",
"[Epoch 1/200] [Batch 9/59] [D loss: 0.705144] [G loss: 0.663659]\n",
"[Epoch 1/200] [Batch 10/59] [D loss: 0.702811] [G loss: 0.685115]\n",
"[Epoch 1/200] [Batch 11/59] [D loss: 0.697593] [G loss: 0.708806]\n",
"[Epoch 1/200] [Batch 12/59] [D loss: 0.695315] [G loss: 0.726766]\n",
"[Epoch 1/200] [Batch 13/59] [D loss: 0.690547] [G loss: 0.741708]\n",
"[Epoch 1/200] [Batch 14/59] [D loss: 0.686239] [G loss: 0.754000]\n",
"[Epoch 1/200] [Batch 15/59] [D loss: 0.684212] [G loss: 0.765233]\n",
"[Epoch 1/200] [Batch 16/59] [D loss: 0.681057] [G loss: 0.772236]\n",
"[Epoch 1/200] [Batch 17/59] [D loss: 0.678801] [G loss: 0.778834]\n",
"[Epoch 1/200] [Batch 18/59] [D loss: 0.678412] [G loss: 0.786010]\n",
"[Epoch 1/200] [Batch 19/59] [D loss: 0.679524] [G loss: 0.774911]\n",
"[Epoch 1/200] [Batch 20/59] [D loss: 0.683639] [G loss: 0.764698]\n",
"[Epoch 1/200] [Batch 21/59] [D loss: 0.691872] [G loss: 0.744536]\n",
"[Epoch 1/200] [Batch 22/59] [D loss: 0.703184] [G loss: 0.712204]\n",
"[Epoch 1/200] [Batch 23/59] [D loss: 0.713629] [G loss: 0.681594]\n",
"[Epoch 1/200] [Batch 24/59] [D loss: 0.715007] [G loss: 0.664335]\n",
"[Epoch 1/200] [Batch 25/59] [D loss: 0.708461] [G loss: 0.662284]\n",
"[Epoch 1/200] [Batch 26/59] [D loss: 0.700703] [G loss: 0.664084]\n",
"[Epoch 1/200] [Batch 27/59] [D loss: 0.692483] [G loss: 0.670017]\n",
"[Epoch 1/200] [Batch 28/59] [D loss: 0.684937] [G loss: 0.671565]\n",
"[Epoch 1/200] [Batch 29/59] [D loss: 0.679691] [G loss: 0.673641]\n",
"[Epoch 1/200] [Batch 30/59] [D loss: 0.673561] [G loss: 0.672711]\n",
"[Epoch 1/200] [Batch 31/59] [D loss: 0.669733] [G loss: 0.668972]\n",
"[Epoch 1/200] [Batch 32/59] [D loss: 0.670069] [G loss: 0.659102]\n",
"[Epoch 1/200] [Batch 33/59] [D loss: 0.672318] [G loss: 0.646299]\n",
"[Epoch 1/200] [Batch 34/59] [D loss: 0.678071] [G loss: 0.626048]\n",
"[Epoch 1/200] [Batch 35/59] [D loss: 0.690303] [G loss: 0.607749]\n",
"[Epoch 1/200] [Batch 36/59] [D loss: 0.704117] [G loss: 0.595449]\n",
"[Epoch 1/200] [Batch 37/59] [D loss: 0.709369] [G loss: 0.598389]\n",
"[Epoch 1/200] [Batch 38/59] [D loss: 0.709770] [G loss: 0.610857]\n",
"[Epoch 1/200] [Batch 39/59] [D loss: 0.708943] [G loss: 0.624385]\n",
"[Epoch 1/200] [Batch 40/59] [D loss: 0.708157] [G loss: 0.647882]\n",
"[Epoch 1/200] [Batch 41/59] [D loss: 0.706968] [G loss: 0.667193]\n",
"[Epoch 1/200] [Batch 42/59] [D loss: 0.701706] [G loss: 0.684390]\n",
"[Epoch 1/200] [Batch 43/59] [D loss: 0.700054] [G loss: 0.703449]\n",
"[Epoch 1/200] [Batch 44/59] [D loss: 0.696558] [G loss: 0.717780]\n",
"[Epoch 1/200] [Batch 45/59] [D loss: 0.694002] [G loss: 0.726714]\n",
"[Epoch 1/200] [Batch 46/59] [D loss: 0.691905] [G loss: 0.739772]\n",
"[Epoch 1/200] [Batch 47/59] [D loss: 0.689420] [G loss: 0.746116]\n",
"[Epoch 1/200] [Batch 48/59] [D loss: 0.688868] [G loss: 0.754177]\n",
"[Epoch 1/200] [Batch 49/59] [D loss: 0.687548] [G loss: 0.760977]\n",
"[Epoch 1/200] [Batch 50/59] [D loss: 0.686128] [G loss: 0.767595]\n",
"[Epoch 1/200] [Batch 51/59] [D loss: 0.685994] [G loss: 0.768289]\n",
"[Epoch 1/200] [Batch 52/59] [D loss: 0.686079] [G loss: 0.770338]\n",
"[Epoch 1/200] [Batch 53/59] [D loss: 0.687528] [G loss: 0.769332]\n",
"[Epoch 1/200] [Batch 54/59] [D loss: 0.689249] [G loss: 0.766611]\n",
"[Epoch 1/200] [Batch 55/59] [D loss: 0.689836] [G loss: 0.761908]\n",
"[Epoch 1/200] [Batch 56/59] [D loss: 0.693299] [G loss: 0.749032]\n",
"[Epoch 1/200] [Batch 57/59] [D loss: 0.696889] [G loss: 0.738571]\n",
"[Epoch 1/200] [Batch 58/59] [D loss: 0.698065] [G loss: 0.727614]\n",
"[Epoch 2/200] [Batch 0/59] [D loss: 0.700117] [G loss: 0.716397]\n",
"[Epoch 2/200] [Batch 1/59] [D loss: 0.702317] [G loss: 0.707592]\n",
"[Epoch 2/200] [Batch 2/59] [D loss: 0.698874] [G loss: 0.700401]\n",
"[Epoch 2/200] [Batch 3/59] [D loss: 0.698870] [G loss: 0.694700]\n",
"[Epoch 2/200] [Batch 4/59] [D loss: 0.696554] [G loss: 0.692704]\n",
"[Epoch 2/200] [Batch 5/59] [D loss: 0.695717] [G loss: 0.689336]\n",
"[Epoch 2/200] [Batch 6/59] [D loss: 0.692556] [G loss: 0.684506]\n",
"[Epoch 2/200] [Batch 7/59] [D loss: 0.690727] [G loss: 0.682804]\n",
"[Epoch 2/200] [Batch 8/59] [D loss: 0.690438] [G loss: 0.679805]\n",
"[Epoch 2/200] [Batch 9/59] [D loss: 0.689019] [G loss: 0.675662]\n",
"[Epoch 2/200] [Batch 10/59] [D loss: 0.689757] [G loss: 0.671279]\n",
"[Epoch 2/200] [Batch 11/59] [D loss: 0.690640] [G loss: 0.666627]\n",
"[Epoch 2/200] [Batch 12/59] [D loss: 0.692058] [G loss: 0.663094]\n",
"[Epoch 2/200] [Batch 13/59] [D loss: 0.692934] [G loss: 0.661497]\n",
"[Epoch 2/200] [Batch 14/59] [D loss: 0.694579] [G loss: 0.661242]\n",
"[Epoch 2/200] [Batch 15/59] [D loss: 0.694934] [G loss: 0.659110]\n",
"[Epoch 2/200] [Batch 16/59] [D loss: 0.696329] [G loss: 0.662610]\n",
"[Epoch 2/200] [Batch 17/59] [D loss: 0.696102] [G loss: 0.668581]\n",
"[Epoch 2/200] [Batch 18/59] [D loss: 0.695672] [G loss: 0.671410]\n",
"[Epoch 2/200] [Batch 19/59] [D loss: 0.697215] [G loss: 0.677307]\n",
"[Epoch 2/200] [Batch 20/59] [D loss: 0.695809] [G loss: 0.682288]\n",
"[Epoch 2/200] [Batch 21/59] [D loss: 0.695884] [G loss: 0.688998]\n",
"[Epoch 2/200] [Batch 22/59] [D loss: 0.695164] [G loss: 0.696388]\n",
"[Epoch 2/200] [Batch 23/59] [D loss: 0.694937] [G loss: 0.699855]\n",
"[Epoch 2/200] [Batch 24/59] [D loss: 0.694908] [G loss: 0.704468]\n",
"[Epoch 2/200] [Batch 25/59] [D loss: 0.693902] [G loss: 0.710766]\n",
"[Epoch 2/200] [Batch 26/59] [D loss: 0.693114] [G loss: 0.713598]\n",
"[Epoch 2/200] [Batch 27/59] [D loss: 0.693802] [G loss: 0.716666]\n",
"[Epoch 2/200] [Batch 28/59] [D loss: 0.693040] [G loss: 0.718175]\n",
"[Epoch 2/200] [Batch 29/59] [D loss: 0.692457] [G loss: 0.719235]\n",
"[Epoch 2/200] [Batch 30/59] [D loss: 0.692002] [G loss: 0.724231]\n",
"[Epoch 2/200] [Batch 31/59] [D loss: 0.692481] [G loss: 0.723035]\n",
"[Epoch 2/200] [Batch 32/59] [D loss: 0.691720] [G loss: 0.724682]\n",
"[Epoch 2/200] [Batch 33/59] [D loss: 0.692332] [G loss: 0.724452]\n",
"[Epoch 2/200] [Batch 34/59] [D loss: 0.692962] [G loss: 0.724260]\n",
"[Epoch 2/200] [Batch 35/59] [D loss: 0.694190] [G loss: 0.723616]\n",
"[Epoch 2/200] [Batch 36/59] [D loss: 0.694909] [G loss: 0.720008]\n",
"[Epoch 2/200] [Batch 37/59] [D loss: 0.694010] [G loss: 0.718105]\n",
"[Epoch 2/200] [Batch 38/59] [D loss: 0.695039] [G loss: 0.714417]\n",
"[Epoch 2/200] [Batch 39/59] [D loss: 0.695525] [G loss: 0.712127]\n",
"[Epoch 2/200] [Batch 40/59] [D loss: 0.695395] [G loss: 0.709939]\n",
"[Epoch 2/200] [Batch 41/59] [D loss: 0.695254] [G loss: 0.707979]\n",
"[Epoch 2/200] [Batch 42/59] [D loss: 0.695170] [G loss: 0.705536]\n",
"[Epoch 2/200] [Batch 43/59] [D loss: 0.695916] [G loss: 0.702960]\n",
"[Epoch 2/200] [Batch 44/59] [D loss: 0.694992] [G loss: 0.701217]\n",
"[Epoch 2/200] [Batch 45/59] [D loss: 0.694745] [G loss: 0.697938]\n",
"[Epoch 2/200] [Batch 46/59] [D loss: 0.693849] [G loss: 0.696318]\n",
"[Epoch 2/200] [Batch 47/59] [D loss: 0.694439] [G loss: 0.694512]\n",
"[Epoch 2/200] [Batch 48/59] [D loss: 0.693732] [G loss: 0.692566]\n",
"[Epoch 2/200] [Batch 49/59] [D loss: 0.694111] [G loss: 0.690701]\n",
"[Epoch 2/200] [Batch 50/59] [D loss: 0.693167] [G loss: 0.689065]\n",
"[Epoch 2/200] [Batch 51/59] [D loss: 0.694073] [G loss: 0.688925]\n",
"[Epoch 2/200] [Batch 52/59] [D loss: 0.693326] [G loss: 0.687649]\n",
"[Epoch 2/200] [Batch 53/59] [D loss: 0.693094] [G loss: 0.687057]\n",
"[Epoch 2/200] [Batch 54/59] [D loss: 0.693545] [G loss: 0.687177]\n",
"[Epoch 2/200] [Batch 55/59] [D loss: 0.694007] [G loss: 0.686188]\n",
"[Epoch 2/200] [Batch 56/59] [D loss: 0.693735] [G loss: 0.686941]\n",
"[Epoch 2/200] [Batch 57/59] [D loss: 0.693385] [G loss: 0.687194]\n",
"[Epoch 2/200] [Batch 58/59] [D loss: 0.693263] [G loss: 0.688869]\n",
"[Epoch 3/200] [Batch 0/59] [D loss: 0.693607] [G loss: 0.688580]\n",
"[Epoch 3/200] [Batch 1/59] [D loss: 0.693658] [G loss: 0.690547]\n",
"[Epoch 3/200] [Batch 2/59] [D loss: 0.693612] [G loss: 0.691177]\n",
"[Epoch 3/200] [Batch 3/59] [D loss: 0.692694] [G loss: 0.693362]\n",
"[Epoch 3/200] [Batch 4/59] [D loss: 0.693786] [G loss: 0.693596]\n",
"[Epoch 3/200] [Batch 5/59] [D loss: 0.693619] [G loss: 0.694880]\n",
"[Epoch 3/200] [Batch 6/59] [D loss: 0.693271] [G loss: 0.696842]\n",
"[Epoch 3/200] [Batch 7/59] [D loss: 0.692822] [G loss: 0.696997]\n",
"[Epoch 3/200] [Batch 8/59] [D loss: 0.693108] [G loss: 0.698016]\n",
"[Epoch 3/200] [Batch 9/59] [D loss: 0.693903] [G loss: 0.698729]\n",
"[Epoch 3/200] [Batch 10/59] [D loss: 0.693347] [G loss: 0.699027]\n",
"[Epoch 3/200] [Batch 11/59] [D loss: 0.693668] [G loss: 0.700879]\n",
"[Epoch 3/200] [Batch 12/59] [D loss: 0.693726] [G loss: 0.700334]\n",
"[Epoch 3/200] [Batch 13/59] [D loss: 0.693283] [G loss: 0.701309]\n",
"[Epoch 3/200] [Batch 14/59] [D loss: 0.694063] [G loss: 0.699800]\n",
"[Epoch 3/200] [Batch 15/59] [D loss: 0.693970] [G loss: 0.700111]\n",
"[Epoch 3/200] [Batch 16/59] [D loss: 0.694080] [G loss: 0.700342]\n",
"[Epoch 3/200] [Batch 17/59] [D loss: 0.694793] [G loss: 0.699568]\n",
"[Epoch 3/200] [Batch 18/59] [D loss: 0.694164] [G loss: 0.700626]\n",
"[Epoch 3/200] [Batch 19/59] [D loss: 0.694462] [G loss: 0.698978]\n",
"[Epoch 3/200] [Batch 20/59] [D loss: 0.694739] [G loss: 0.700297]\n",
"[Epoch 3/200] [Batch 21/59] [D loss: 0.694230] [G loss: 0.697650]\n",
"[Epoch 3/200] [Batch 22/59] [D loss: 0.694037] [G loss: 0.697893]\n",
"[Epoch 3/200] [Batch 23/59] [D loss: 0.694371] [G loss: 0.696887]\n",
"[Epoch 3/200] [Batch 24/59] [D loss: 0.695380] [G loss: 0.695598]\n",
"[Epoch 3/200] [Batch 25/59] [D loss: 0.694045] [G loss: 0.696423]\n",
"[Epoch 3/200] [Batch 26/59] [D loss: 0.694022] [G loss: 0.695725]\n",
"[Epoch 3/200] [Batch 27/59] [D loss: 0.694658] [G loss: 0.693901]\n",
"[Epoch 3/200] [Batch 28/59] [D loss: 0.694350] [G loss: 0.694743]\n",
"[Epoch 3/200] [Batch 29/59] [D loss: 0.694168] [G loss: 0.693421]\n",
"[Epoch 3/200] [Batch 30/59] [D loss: 0.694346] [G loss: 0.694039]\n",
"[Epoch 3/200] [Batch 31/59] [D loss: 0.693691] [G loss: 0.692708]\n",
"[Epoch 3/200] [Batch 32/59] [D loss: 0.693519] [G loss: 0.693119]\n",
"[Epoch 3/200] [Batch 33/59] [D loss: 0.693600] [G loss: 0.692907]\n",
"[Epoch 3/200] [Batch 34/59] [D loss: 0.692541] [G loss: 0.692258]\n",
"[Epoch 3/200] [Batch 35/59] [D loss: 0.692565] [G loss: 0.691333]\n",
"[Epoch 3/200] [Batch 36/59] [D loss: 0.692671] [G loss: 0.691339]\n",
"[Epoch 3/200] [Batch 37/59] [D loss: 0.693186] [G loss: 0.689902]\n",
"[Epoch 3/200] [Batch 38/59] [D loss: 0.691569] [G loss: 0.690288]\n",
"[Epoch 3/200] [Batch 39/59] [D loss: 0.691480] [G loss: 0.689648]\n",
"[Epoch 3/200] [Batch 40/59] [D loss: 0.691218] [G loss: 0.688893]\n",
"[Epoch 3/200] [Batch 41/59] [D loss: 0.691272] [G loss: 0.688564]\n",
"[Epoch 3/200] [Batch 42/59] [D loss: 0.691576] [G loss: 0.686523]\n",
"[Epoch 3/200] [Batch 43/59] [D loss: 0.692069] [G loss: 0.686549]\n",
"[Epoch 3/200] [Batch 44/59] [D loss: 0.690900] [G loss: 0.685682]\n",
"[Epoch 3/200] [Batch 45/59] [D loss: 0.691699] [G loss: 0.685259]\n",
"[Epoch 3/200] [Batch 46/59] [D loss: 0.692164] [G loss: 0.684732]\n",
"[Epoch 3/200] [Batch 47/59] [D loss: 0.692494] [G loss: 0.683728]\n",
"[Epoch 3/200] [Batch 48/59] [D loss: 0.693588] [G loss: 0.684096]\n",
"[Epoch 3/200] [Batch 49/59] [D loss: 0.692513] [G loss: 0.685819]\n",
"[Epoch 3/200] [Batch 50/59] [D loss: 0.693534] [G loss: 0.685414]\n",
"[Epoch 3/200] [Batch 51/59] [D loss: 0.693053] [G loss: 0.686465]\n",
"[Epoch 3/200] [Batch 52/59] [D loss: 0.694013] [G loss: 0.687285]\n",
"[Epoch 3/200] [Batch 53/59] [D loss: 0.693200] [G loss: 0.689191]\n",
"[Epoch 3/200] [Batch 54/59] [D loss: 0.694054] [G loss: 0.689088]\n",
"[Epoch 3/200] [Batch 55/59] [D loss: 0.692925] [G loss: 0.690574]\n",
"[Epoch 3/200] [Batch 56/59] [D loss: 0.693813] [G loss: 0.691217]\n",
"[Epoch 3/200] [Batch 57/59] [D loss: 0.694174] [G loss: 0.692380]\n",
"[Epoch 3/200] [Batch 58/59] [D loss: 0.693990] [G loss: 0.694467]\n",
"[Epoch 4/200] [Batch 0/59] [D loss: 0.693594] [G loss: 0.696960]\n",
"[Epoch 4/200] [Batch 1/59] [D loss: 0.693410] [G loss: 0.697178]\n",
"[Epoch 4/200] [Batch 2/59] [D loss: 0.692742] [G loss: 0.698591]\n",
"[Epoch 4/200] [Batch 3/59] [D loss: 0.692981] [G loss: 0.700124]\n",
"[Epoch 4/200] [Batch 4/59] [D loss: 0.693006] [G loss: 0.700410]\n",
"[Epoch 4/200] [Batch 5/59] [D loss: 0.692000] [G loss: 0.702108]\n",
"[Epoch 4/200] [Batch 6/59] [D loss: 0.692259] [G loss: 0.702691]\n",
"[Epoch 4/200] [Batch 7/59] [D loss: 0.692559] [G loss: 0.703596]\n",
"[Epoch 4/200] [Batch 8/59] [D loss: 0.692408] [G loss: 0.703362]\n",
"[Epoch 4/200] [Batch 9/59] [D loss: 0.691855] [G loss: 0.703270]\n",
"[Epoch 4/200] [Batch 10/59] [D loss: 0.692164] [G loss: 0.700957]\n",
"[Epoch 4/200] [Batch 11/59] [D loss: 0.693084] [G loss: 0.700920]\n",
"[Epoch 4/200] [Batch 12/59] [D loss: 0.693285] [G loss: 0.700431]\n",
"[Epoch 4/200] [Batch 13/59] [D loss: 0.693078] [G loss: 0.700035]\n",
"[Epoch 4/200] [Batch 14/59] [D loss: 0.693689] [G loss: 0.698784]\n",
"[Epoch 4/200] [Batch 15/59] [D loss: 0.694308] [G loss: 0.696704]\n",
"[Epoch 4/200] [Batch 16/59] [D loss: 0.694874] [G loss: 0.696556]\n",
"[Epoch 4/200] [Batch 17/59] [D loss: 0.694984] [G loss: 0.697460]\n",
"[Epoch 4/200] [Batch 18/59] [D loss: 0.694573] [G loss: 0.696411]\n",
"[Epoch 4/200] [Batch 19/59] [D loss: 0.695130] [G loss: 0.696402]\n",
"[Epoch 4/200] [Batch 20/59] [D loss: 0.694426] [G loss: 0.695802]\n",
"[Epoch 4/200] [Batch 21/59] [D loss: 0.694714] [G loss: 0.696538]\n",
"[Epoch 4/200] [Batch 22/59] [D loss: 0.693611] [G loss: 0.696803]\n",
"[Epoch 4/200] [Batch 23/59] [D loss: 0.693858] [G loss: 0.696507]\n",
"[Epoch 4/200] [Batch 24/59] [D loss: 0.693601] [G loss: 0.695642]\n",
"[Epoch 4/200] [Batch 25/59] [D loss: 0.694046] [G loss: 0.695844]\n",
"[Epoch 4/200] [Batch 26/59] [D loss: 0.692715] [G loss: 0.695069]\n",
"[Epoch 4/200] [Batch 27/59] [D loss: 0.693145] [G loss: 0.695452]\n",
"[Epoch 4/200] [Batch 28/59] [D loss: 0.692228] [G loss: 0.694827]\n",
"[Epoch 4/200] [Batch 29/59] [D loss: 0.690911] [G loss: 0.694636]\n",
"[Epoch 4/200] [Batch 30/59] [D loss: 0.692067] [G loss: 0.694309]\n",
"[Epoch 4/200] [Batch 31/59] [D loss: 0.691411] [G loss: 0.693281]\n",
"[Epoch 4/200] [Batch 32/59] [D loss: 0.691124] [G loss: 0.692649]\n",
"[Epoch 4/200] [Batch 33/59] [D loss: 0.692209] [G loss: 0.693577]\n",
"[Epoch 4/200] [Batch 34/59] [D loss: 0.691012] [G loss: 0.693289]\n",
"[Epoch 4/200] [Batch 35/59] [D loss: 0.690723] [G loss: 0.692568]\n",
"[Epoch 4/200] [Batch 36/59] [D loss: 0.691194] [G loss: 0.692761]\n",
"[Epoch 4/200] [Batch 37/59] [D loss: 0.691723] [G loss: 0.692389]\n",
"[Epoch 4/200] [Batch 38/59] [D loss: 0.692340] [G loss: 0.691499]\n",
"[Epoch 4/200] [Batch 39/59] [D loss: 0.692277] [G loss: 0.691959]\n",
"[Epoch 4/200] [Batch 40/59] [D loss: 0.692330] [G loss: 0.693036]\n",
"[Epoch 4/200] [Batch 41/59] [D loss: 0.693462] [G loss: 0.694368]\n",
"[Epoch 4/200] [Batch 42/59] [D loss: 0.693751] [G loss: 0.694558]\n",
"[Epoch 4/200] [Batch 43/59] [D loss: 0.695015] [G loss: 0.694219]\n",
"[Epoch 4/200] [Batch 44/59] [D loss: 0.694533] [G loss: 0.695491]\n",
"[Epoch 4/200] [Batch 45/59] [D loss: 0.694772] [G loss: 0.696578]\n",
"[Epoch 4/200] [Batch 46/59] [D loss: 0.695670] [G loss: 0.696785]\n",
"[Epoch 4/200] [Batch 47/59] [D loss: 0.695569] [G loss: 0.698890]\n",
"[Epoch 4/200] [Batch 48/59] [D loss: 0.695181] [G loss: 0.700139]\n",
"[Epoch 4/200] [Batch 49/59] [D loss: 0.695174] [G loss: 0.701277]\n",
"[Epoch 4/200] [Batch 50/59] [D loss: 0.694819] [G loss: 0.701720]\n",
"[Epoch 4/200] [Batch 51/59] [D loss: 0.694447] [G loss: 0.702784]\n",
"[Epoch 4/200] [Batch 52/59] [D loss: 0.693735] [G loss: 0.704412]\n",
"[Epoch 4/200] [Batch 53/59] [D loss: 0.692899] [G loss: 0.704628]\n",
"[Epoch 4/200] [Batch 54/59] [D loss: 0.692993] [G loss: 0.705020]\n",
"[Epoch 4/200] [Batch 55/59] [D loss: 0.691798] [G loss: 0.706309]\n",
"[Epoch 4/200] [Batch 56/59] [D loss: 0.692127] [G loss: 0.704728]\n",
"[Epoch 4/200] [Batch 57/59] [D loss: 0.691207] [G loss: 0.705623]\n",
"[Epoch 4/200] [Batch 58/59] [D loss: 0.690899] [G loss: 0.703765]\n",
"[Epoch 5/200] [Batch 0/59] [D loss: 0.691252] [G loss: 0.701608]\n",
"[Epoch 5/200] [Batch 1/59] [D loss: 0.690847] [G loss: 0.700967]\n",
"[Epoch 5/200] [Batch 2/59] [D loss: 0.691247] [G loss: 0.697896]\n",
"[Epoch 5/200] [Batch 3/59] [D loss: 0.690761] [G loss: 0.696160]\n",
"[Epoch 5/200] [Batch 4/59] [D loss: 0.691939] [G loss: 0.692842]\n",
"[Epoch 5/200] [Batch 5/59] [D loss: 0.692833] [G loss: 0.692204]\n",
"[Epoch 5/200] [Batch 6/59] [D loss: 0.693166] [G loss: 0.688869]\n",
"[Epoch 5/200] [Batch 7/59] [D loss: 0.694196] [G loss: 0.686843]\n",
"[Epoch 5/200] [Batch 8/59] [D loss: 0.693979] [G loss: 0.684440]\n",
"[Epoch 5/200] [Batch 9/59] [D loss: 0.694595] [G loss: 0.681237]\n",
"[Epoch 5/200] [Batch 10/59] [D loss: 0.696054] [G loss: 0.680929]\n",
"[Epoch 5/200] [Batch 11/59] [D loss: 0.694897] [G loss: 0.679869]\n",
"[Epoch 5/200] [Batch 12/59] [D loss: 0.694108] [G loss: 0.680095]\n",
"[Epoch 5/200] [Batch 13/59] [D loss: 0.695205] [G loss: 0.679500]\n",
"[Epoch 5/200] [Batch 14/59] [D loss: 0.694646] [G loss: 0.677813]\n",
"[Epoch 5/200] [Batch 15/59] [D loss: 0.695034] [G loss: 0.679267]\n",
"[Epoch 5/200] [Batch 16/59] [D loss: 0.695097] [G loss: 0.678749]\n",
"[Epoch 5/200] [Batch 17/59] [D loss: 0.693654] [G loss: 0.679861]\n",
"[Epoch 5/200] [Batch 18/59] [D loss: 0.693672] [G loss: 0.681583]\n",
"[Epoch 5/200] [Batch 19/59] [D loss: 0.693279] [G loss: 0.682195]\n",
"[Epoch 5/200] [Batch 20/59] [D loss: 0.692960] [G loss: 0.682659]\n",
"[Epoch 5/200] [Batch 21/59] [D loss: 0.692877] [G loss: 0.683707]\n",
"[Epoch 5/200] [Batch 22/59] [D loss: 0.692364] [G loss: 0.686361]\n",
"[Epoch 5/200] [Batch 23/59] [D loss: 0.691698] [G loss: 0.687712]\n",
"[Epoch 5/200] [Batch 24/59] [D loss: 0.692098] [G loss: 0.687942]\n",
"[Epoch 5/200] [Batch 25/59] [D loss: 0.692429] [G loss: 0.688859]\n",
"[Epoch 5/200] [Batch 26/59] [D loss: 0.691374] [G loss: 0.690600]\n",
"[Epoch 5/200] [Batch 27/59] [D loss: 0.692114] [G loss: 0.691183]\n",
"[Epoch 5/200] [Batch 28/59] [D loss: 0.692181] [G loss: 0.692247]\n",
"[Epoch 5/200] [Batch 29/59] [D loss: 0.691710] [G loss: 0.694169]\n",
"[Epoch 5/200] [Batch 30/59] [D loss: 0.692407] [G loss: 0.695787]\n",
"[Epoch 5/200] [Batch 31/59] [D loss: 0.693601] [G loss: 0.696445]\n",
"[Epoch 5/200] [Batch 32/59] [D loss: 0.693034] [G loss: 0.698398]\n",
"[Epoch 5/200] [Batch 33/59] [D loss: 0.694466] [G loss: 0.698040]\n",
"[Epoch 5/200] [Batch 34/59] [D loss: 0.694979] [G loss: 0.699622]\n",
"[Epoch 5/200] [Batch 35/59] [D loss: 0.694736] [G loss: 0.702040]\n",
"[Epoch 5/200] [Batch 36/59] [D loss: 0.695413] [G loss: 0.702475]\n",
"[Epoch 5/200] [Batch 37/59] [D loss: 0.695101] [G loss: 0.703319]\n",
"[Epoch 5/200] [Batch 38/59] [D loss: 0.695147] [G loss: 0.704369]\n",
"[Epoch 5/200] [Batch 39/59] [D loss: 0.695301] [G loss: 0.706970]\n",
"[Epoch 5/200] [Batch 40/59] [D loss: 0.694511] [G loss: 0.707120]\n",
"[Epoch 5/200] [Batch 41/59] [D loss: 0.694443] [G loss: 0.707738]\n",
"[Epoch 5/200] [Batch 42/59] [D loss: 0.694368] [G loss: 0.707959]\n",
"[Epoch 5/200] [Batch 43/59] [D loss: 0.693000] [G loss: 0.708790]\n",
"[Epoch 5/200] [Batch 44/59] [D loss: 0.692326] [G loss: 0.709597]\n",
"[Epoch 5/200] [Batch 45/59] [D loss: 0.691976] [G loss: 0.707893]\n",
"[Epoch 5/200] [Batch 46/59] [D loss: 0.691880] [G loss: 0.708895]\n",
"[Epoch 5/200] [Batch 47/59] [D loss: 0.690708] [G loss: 0.707904]\n",
"[Epoch 5/200] [Batch 48/59] [D loss: 0.690190] [G loss: 0.705278]\n",
"[Epoch 5/200] [Batch 49/59] [D loss: 0.689713] [G loss: 0.703068]\n",
"[Epoch 5/200] [Batch 50/59] [D loss: 0.689663] [G loss: 0.699235]\n",
"[Epoch 5/200] [Batch 51/59] [D loss: 0.687892] [G loss: 0.695997]\n",
"[Epoch 5/200] [Batch 52/59] [D loss: 0.687598] [G loss: 0.691680]\n",
"[Epoch 5/200] [Batch 53/59] [D loss: 0.687396] [G loss: 0.686121]\n",
"[Epoch 5/200] [Batch 54/59] [D loss: 0.685250] [G loss: 0.680664]\n",
"[Epoch 5/200] [Batch 55/59] [D loss: 0.685660] [G loss: 0.675986]\n",
"[Epoch 5/200] [Batch 56/59] [D loss: 0.683994] [G loss: 0.670779]\n",
"[Epoch 5/200] [Batch 57/59] [D loss: 0.683804] [G loss: 0.661932]\n",
"[Epoch 5/200] [Batch 58/59] [D loss: 0.684480] [G loss: 0.654681]\n",
"[Epoch 6/200] [Batch 0/59] [D loss: 0.688100] [G loss: 0.643532]\n",
"[Epoch 6/200] [Batch 1/59] [D loss: 0.692939] [G loss: 0.635614]\n",
"[Epoch 6/200] [Batch 2/59] [D loss: 0.697727] [G loss: 0.628806]\n",
"[Epoch 6/200] [Batch 3/59] [D loss: 0.703079] [G loss: 0.631778]\n",
"[Epoch 6/200] [Batch 4/59] [D loss: 0.703609] [G loss: 0.640635]\n",
"[Epoch 6/200] [Batch 5/59] [D loss: 0.702849] [G loss: 0.654130]\n",
"[Epoch 6/200] [Batch 6/59] [D loss: 0.702920] [G loss: 0.664681]\n",
"[Epoch 6/200] [Batch 7/59] [D loss: 0.702259] [G loss: 0.674045]\n",
"[Epoch 6/200] [Batch 8/59] [D loss: 0.700553] [G loss: 0.684365]\n",
"[Epoch 6/200] [Batch 9/59] [D loss: 0.699300] [G loss: 0.692730]\n",
"[Epoch 6/200] [Batch 10/59] [D loss: 0.697669] [G loss: 0.699503]\n",
"[Epoch 6/200] [Batch 11/59] [D loss: 0.696859] [G loss: 0.704411]\n",
"[Epoch 6/200] [Batch 12/59] [D loss: 0.695806] [G loss: 0.708991]\n",
"[Epoch 6/200] [Batch 13/59] [D loss: 0.694372] [G loss: 0.713352]\n",
"[Epoch 6/200] [Batch 14/59] [D loss: 0.694194] [G loss: 0.714897]\n",
"[Epoch 6/200] [Batch 15/59] [D loss: 0.693085] [G loss: 0.718085]\n",
"[Epoch 6/200] [Batch 16/59] [D loss: 0.693241] [G loss: 0.719268]\n",
"[Epoch 6/200] [Batch 17/59] [D loss: 0.691959] [G loss: 0.723172]\n",
"[Epoch 6/200] [Batch 18/59] [D loss: 0.691186] [G loss: 0.724396]\n",
"[Epoch 6/200] [Batch 19/59] [D loss: 0.690385] [G loss: 0.726039]\n",
"[Epoch 6/200] [Batch 20/59] [D loss: 0.689405] [G loss: 0.727576]\n",
"[Epoch 6/200] [Batch 21/59] [D loss: 0.688583] [G loss: 0.730791]\n",
"[Epoch 6/200] [Batch 22/59] [D loss: 0.687881] [G loss: 0.731881]\n",
"[Epoch 6/200] [Batch 23/59] [D loss: 0.687040] [G loss: 0.732762]\n",
"[Epoch 6/200] [Batch 24/59] [D loss: 0.686931] [G loss: 0.735090]\n",
"[Epoch 6/200] [Batch 25/59] [D loss: 0.685878] [G loss: 0.735662]\n",
"[Epoch 6/200] [Batch 26/59] [D loss: 0.685859] [G loss: 0.737128]\n",
"[Epoch 6/200] [Batch 27/59] [D loss: 0.684996] [G loss: 0.737524]\n",
"[Epoch 6/200] [Batch 28/59] [D loss: 0.684775] [G loss: 0.736923]\n",
"[Epoch 6/200] [Batch 29/59] [D loss: 0.684705] [G loss: 0.737504]\n",
"[Epoch 6/200] [Batch 30/59] [D loss: 0.686799] [G loss: 0.733105]\n",
"[Epoch 6/200] [Batch 31/59] [D loss: 0.689535] [G loss: 0.724519]\n",
"[Epoch 6/200] [Batch 32/59] [D loss: 0.691443] [G loss: 0.713915]\n",
"[Epoch 6/200] [Batch 33/59] [D loss: 0.697443] [G loss: 0.697978]\n",
"[Epoch 6/200] [Batch 34/59] [D loss: 0.701676] [G loss: 0.681604]\n",
"[Epoch 6/200] [Batch 35/59] [D loss: 0.707646] [G loss: 0.667690]\n",
"[Epoch 6/200] [Batch 36/59] [D loss: 0.705926] [G loss: 0.660027]\n",
"[Epoch 6/200] [Batch 37/59] [D loss: 0.704276] [G loss: 0.658083]\n",
"[Epoch 6/200] [Batch 38/59] [D loss: 0.699439] [G loss: 0.661569]\n",
"[Epoch 6/200] [Batch 39/59] [D loss: 0.694306] [G loss: 0.664456]\n",
"[Epoch 6/200] [Batch 40/59] [D loss: 0.691344] [G loss: 0.666941]\n",
"[Epoch 6/200] [Batch 41/59] [D loss: 0.688455] [G loss: 0.666437]\n",
"[Epoch 6/200] [Batch 42/59] [D loss: 0.688036] [G loss: 0.663638]\n",
"[Epoch 6/200] [Batch 43/59] [D loss: 0.685957] [G loss: 0.660438]\n",
"[Epoch 6/200] [Batch 44/59] [D loss: 0.686220] [G loss: 0.654876]\n",
"[Epoch 6/200] [Batch 45/59] [D loss: 0.686536] [G loss: 0.651238]\n",
"[Epoch 6/200] [Batch 46/59] [D loss: 0.688494] [G loss: 0.646687]\n",
"[Epoch 6/200] [Batch 47/59] [D loss: 0.689735] [G loss: 0.645718]\n",
"[Epoch 6/200] [Batch 48/59] [D loss: 0.691365] [G loss: 0.644650]\n",
"[Epoch 6/200] [Batch 49/59] [D loss: 0.693261] [G loss: 0.650509]\n",
"[Epoch 6/200] [Batch 50/59] [D loss: 0.693970] [G loss: 0.655298]\n",
"[Epoch 6/200] [Batch 51/59] [D loss: 0.692496] [G loss: 0.667771]\n",
"[Epoch 6/200] [Batch 52/59] [D loss: 0.692514] [G loss: 0.674720]\n",
"[Epoch 6/200] [Batch 53/59] [D loss: 0.692332] [G loss: 0.684311]\n",
"[Epoch 6/200] [Batch 54/59] [D loss: 0.693343] [G loss: 0.692679]\n",
"[Epoch 6/200] [Batch 55/59] [D loss: 0.692872] [G loss: 0.701641]\n",
"[Epoch 6/200] [Batch 56/59] [D loss: 0.691254] [G loss: 0.712201]\n",
"[Epoch 6/200] [Batch 57/59] [D loss: 0.691021] [G loss: 0.718515]\n",
"[Epoch 6/200] [Batch 58/59] [D loss: 0.691477] [G loss: 0.724922]\n",
"[Epoch 7/200] [Batch 0/59] [D loss: 0.691783] [G loss: 0.729529]\n",
"[Epoch 7/200] [Batch 1/59] [D loss: 0.692707] [G loss: 0.732456]\n",
"[Epoch 7/200] [Batch 2/59] [D loss: 0.693746] [G loss: 0.735018]\n",
"[Epoch 7/200] [Batch 3/59] [D loss: 0.694682] [G loss: 0.733092]\n",
"[Epoch 7/200] [Batch 4/59] [D loss: 0.696198] [G loss: 0.731895]\n",
"[Epoch 7/200] [Batch 5/59] [D loss: 0.697768] [G loss: 0.726397]\n",
"[Epoch 7/200] [Batch 6/59] [D loss: 0.699620] [G loss: 0.722446]\n",
"[Epoch 7/200] [Batch 7/59] [D loss: 0.699765] [G loss: 0.717936]\n",
"[Epoch 7/200] [Batch 8/59] [D loss: 0.698928] [G loss: 0.713442]\n",
"[Epoch 7/200] [Batch 9/59] [D loss: 0.698617] [G loss: 0.710619]\n",
"[Epoch 7/200] [Batch 10/59] [D loss: 0.697882] [G loss: 0.706529]\n",
"[Epoch 7/200] [Batch 11/59] [D loss: 0.696757] [G loss: 0.704147]\n",
"[Epoch 7/200] [Batch 12/59] [D loss: 0.694754] [G loss: 0.703200]\n",
"[Epoch 7/200] [Batch 13/59] [D loss: 0.693260] [G loss: 0.700638]\n",
"[Epoch 7/200] [Batch 14/59] [D loss: 0.691414] [G loss: 0.700326]\n",
"[Epoch 7/200] [Batch 15/59] [D loss: 0.690132] [G loss: 0.699276]\n",
"[Epoch 7/200] [Batch 16/59] [D loss: 0.688524] [G loss: 0.697719]\n",
"[Epoch 7/200] [Batch 17/59] [D loss: 0.686630] [G loss: 0.696018]\n",
"[Epoch 7/200] [Batch 18/59] [D loss: 0.684819] [G loss: 0.694451]\n",
"[Epoch 7/200] [Batch 19/59] [D loss: 0.683661] [G loss: 0.690693]\n",
"[Epoch 7/200] [Batch 20/59] [D loss: 0.682956] [G loss: 0.686250]\n",
"[Epoch 7/200] [Batch 21/59] [D loss: 0.681783] [G loss: 0.683103]\n",
"[Epoch 7/200] [Batch 22/59] [D loss: 0.681684] [G loss: 0.678539]\n",
"[Epoch 7/200] [Batch 23/59] [D loss: 0.682043] [G loss: 0.670440]\n",
"[Epoch 7/200] [Batch 24/59] [D loss: 0.682629] [G loss: 0.664810]\n",
"[Epoch 7/200] [Batch 25/59] [D loss: 0.687797] [G loss: 0.658278]\n",
"[Epoch 7/200] [Batch 26/59] [D loss: 0.690084] [G loss: 0.651292]\n",
"[Epoch 7/200] [Batch 27/59] [D loss: 0.694650] [G loss: 0.647680]\n",
"[Epoch 7/200] [Batch 28/59] [D loss: 0.702263] [G loss: 0.645165]\n",
"[Epoch 7/200] [Batch 29/59] [D loss: 0.704782] [G loss: 0.649209]\n",
"[Epoch 7/200] [Batch 30/59] [D loss: 0.706901] [G loss: 0.655867]\n",
"[Epoch 7/200] [Batch 31/59] [D loss: 0.705476] [G loss: 0.667937]\n",
"[Epoch 7/200] [Batch 32/59] [D loss: 0.706454] [G loss: 0.675536]\n",
"[Epoch 7/200] [Batch 33/59] [D loss: 0.703360] [G loss: 0.686074]\n",
"[Epoch 7/200] [Batch 34/59] [D loss: 0.701834] [G loss: 0.694709]\n",
"[Epoch 7/200] [Batch 35/59] [D loss: 0.699514] [G loss: 0.700072]\n",
"[Epoch 7/200] [Batch 36/59] [D loss: 0.698103] [G loss: 0.706693]\n",
"[Epoch 7/200] [Batch 37/59] [D loss: 0.696495] [G loss: 0.708435]\n",
"[Epoch 7/200] [Batch 38/59] [D loss: 0.694655] [G loss: 0.711820]\n",
"[Epoch 7/200] [Batch 39/59] [D loss: 0.693632] [G loss: 0.714147]\n",
"[Epoch 7/200] [Batch 40/59] [D loss: 0.692683] [G loss: 0.717019]\n",
"[Epoch 7/200] [Batch 41/59] [D loss: 0.691914] [G loss: 0.719255]\n",
"[Epoch 7/200] [Batch 42/59] [D loss: 0.691265] [G loss: 0.719511]\n",
"[Epoch 7/200] [Batch 43/59] [D loss: 0.689581] [G loss: 0.720591]\n",
"[Epoch 7/200] [Batch 44/59] [D loss: 0.688954] [G loss: 0.722364]\n",
"[Epoch 7/200] [Batch 45/59] [D loss: 0.688738] [G loss: 0.722310]\n",
"[Epoch 7/200] [Batch 46/59] [D loss: 0.686582] [G loss: 0.723909]\n",
"[Epoch 7/200] [Batch 47/59] [D loss: 0.687193] [G loss: 0.723389]\n",
"[Epoch 7/200] [Batch 48/59] [D loss: 0.684884] [G loss: 0.724051]\n",
"[Epoch 7/200] [Batch 49/59] [D loss: 0.685064] [G loss: 0.725500]\n",
"[Epoch 7/200] [Batch 50/59] [D loss: 0.684582] [G loss: 0.723881]\n",
"[Epoch 7/200] [Batch 51/59] [D loss: 0.686443] [G loss: 0.720855]\n",
"[Epoch 7/200] [Batch 52/59] [D loss: 0.686674] [G loss: 0.714772]\n",
"[Epoch 7/200] [Batch 53/59] [D loss: 0.688784] [G loss: 0.707922]\n",
"[Epoch 7/200] [Batch 54/59] [D loss: 0.691846] [G loss: 0.696842]\n",
"[Epoch 7/200] [Batch 55/59] [D loss: 0.695094] [G loss: 0.684270]\n",
"[Epoch 7/200] [Batch 56/59] [D loss: 0.695712] [G loss: 0.673962]\n",
"[Epoch 7/200] [Batch 57/59] [D loss: 0.695986] [G loss: 0.666266]\n",
"[Epoch 7/200] [Batch 58/59] [D loss: 0.696099] [G loss: 0.658508]\n",
"[Epoch 8/200] [Batch 0/59] [D loss: 0.696537] [G loss: 0.652858]\n",
"[Epoch 8/200] [Batch 1/59] [D loss: 0.693932] [G loss: 0.648986]\n",
"[Epoch 8/200] [Batch 2/59] [D loss: 0.695556] [G loss: 0.645926]\n",
"[Epoch 8/200] [Batch 3/59] [D loss: 0.695352] [G loss: 0.643668]\n",
"[Epoch 8/200] [Batch 4/59] [D loss: 0.696794] [G loss: 0.644512]\n",
"[Epoch 8/200] [Batch 5/59] [D loss: 0.697049] [G loss: 0.646577]\n",
"[Epoch 8/200] [Batch 6/59] [D loss: 0.695254] [G loss: 0.652079]\n",
"[Epoch 8/200] [Batch 7/59] [D loss: 0.696824] [G loss: 0.659050]\n",
"[Epoch 8/200] [Batch 8/59] [D loss: 0.696357] [G loss: 0.666673]\n",
"[Epoch 8/200] [Batch 9/59] [D loss: 0.695955] [G loss: 0.674464]\n",
"[Epoch 8/200] [Batch 10/59] [D loss: 0.694594] [G loss: 0.684792]\n",
"[Epoch 8/200] [Batch 11/59] [D loss: 0.693889] [G loss: 0.695872]\n",
"[Epoch 8/200] [Batch 12/59] [D loss: 0.692336] [G loss: 0.705420]\n",
"[Epoch 8/200] [Batch 13/59] [D loss: 0.690659] [G loss: 0.715019]\n",
"[Epoch 8/200] [Batch 14/59] [D loss: 0.689305] [G loss: 0.724131]\n",
"[Epoch 8/200] [Batch 15/59] [D loss: 0.687724] [G loss: 0.732091]\n",
"[Epoch 8/200] [Batch 16/59] [D loss: 0.687558] [G loss: 0.737264]\n",
"[Epoch 8/200] [Batch 17/59] [D loss: 0.688321] [G loss: 0.742978]\n",
"[Epoch 8/200] [Batch 18/59] [D loss: 0.689615] [G loss: 0.743552]\n",
"[Epoch 8/200] [Batch 19/59] [D loss: 0.692520] [G loss: 0.742009]\n",
"[Epoch 8/200] [Batch 20/59] [D loss: 0.694300] [G loss: 0.736338]\n",
"[Epoch 8/200] [Batch 21/59] [D loss: 0.698317] [G loss: 0.728739]\n",
"[Epoch 8/200] [Batch 22/59] [D loss: 0.701287] [G loss: 0.719947]\n",
"[Epoch 8/200] [Batch 23/59] [D loss: 0.700832] [G loss: 0.710504]\n",
"[Epoch 8/200] [Batch 24/59] [D loss: 0.699156] [G loss: 0.704494]\n",
"[Epoch 8/200] [Batch 25/59] [D loss: 0.698028] [G loss: 0.699791]\n",
"[Epoch 8/200] [Batch 26/59] [D loss: 0.696130] [G loss: 0.696485]\n",
"[Epoch 8/200] [Batch 27/59] [D loss: 0.693785] [G loss: 0.692551]\n",
"[Epoch 8/200] [Batch 28/59] [D loss: 0.692554] [G loss: 0.688855]\n",
"[Epoch 8/200] [Batch 29/59] [D loss: 0.689957] [G loss: 0.685175]\n",
"[Epoch 8/200] [Batch 30/59] [D loss: 0.691271] [G loss: 0.679354]\n",
"[Epoch 8/200] [Batch 31/59] [D loss: 0.690747] [G loss: 0.674458]\n",
"[Epoch 8/200] [Batch 32/59] [D loss: 0.690289] [G loss: 0.671489]\n",
"[Epoch 8/200] [Batch 33/59] [D loss: 0.691202] [G loss: 0.667324]\n",
"[Epoch 8/200] [Batch 34/59] [D loss: 0.692466] [G loss: 0.666502]\n",
"[Epoch 8/200] [Batch 35/59] [D loss: 0.695332] [G loss: 0.664366]\n",
"[Epoch 8/200] [Batch 36/59] [D loss: 0.695669] [G loss: 0.667806]\n",
"[Epoch 8/200] [Batch 37/59] [D loss: 0.695636] [G loss: 0.673631]\n",
"[Epoch 8/200] [Batch 38/59] [D loss: 0.695241] [G loss: 0.680379]\n",
"[Epoch 8/200] [Batch 39/59] [D loss: 0.695419] [G loss: 0.685782]\n",
"[Epoch 8/200] [Batch 40/59] [D loss: 0.694754] [G loss: 0.691800]\n",
"[Epoch 8/200] [Batch 41/59] [D loss: 0.693948] [G loss: 0.699101]\n",
"[Epoch 8/200] [Batch 42/59] [D loss: 0.692953] [G loss: 0.704709]\n",
"[Epoch 8/200] [Batch 43/59] [D loss: 0.693535] [G loss: 0.707167]\n",
"[Epoch 8/200] [Batch 44/59] [D loss: 0.692731] [G loss: 0.712865]\n",
"[Epoch 8/200] [Batch 45/59] [D loss: 0.692031] [G loss: 0.716781]\n",
"[Epoch 8/200] [Batch 46/59] [D loss: 0.692221] [G loss: 0.718389]\n",
"[Epoch 8/200] [Batch 47/59] [D loss: 0.691759] [G loss: 0.722415]\n",
"[Epoch 8/200] [Batch 48/59] [D loss: 0.691275] [G loss: 0.724055]\n",
"[Epoch 8/200] [Batch 49/59] [D loss: 0.691721] [G loss: 0.724928]\n",
"[Epoch 8/200] [Batch 50/59] [D loss: 0.692355] [G loss: 0.724839]\n",
"[Epoch 8/200] [Batch 51/59] [D loss: 0.692812] [G loss: 0.724383]\n",
"[Epoch 8/200] [Batch 52/59] [D loss: 0.694329] [G loss: 0.723115]\n",
"[Epoch 8/200] [Batch 53/59] [D loss: 0.694269] [G loss: 0.720690]\n",
"[Epoch 8/200] [Batch 54/59] [D loss: 0.694502] [G loss: 0.719359]\n",
"[Epoch 8/200] [Batch 55/59] [D loss: 0.695437] [G loss: 0.713526]\n",
"[Epoch 8/200] [Batch 56/59] [D loss: 0.695424] [G loss: 0.711411]\n",
"[Epoch 8/200] [Batch 57/59] [D loss: 0.695459] [G loss: 0.706269]\n",
"[Epoch 8/200] [Batch 58/59] [D loss: 0.695024] [G loss: 0.702669]\n",
"[Epoch 9/200] [Batch 0/59] [D loss: 0.695081] [G loss: 0.697950]\n",
"[Epoch 9/200] [Batch 1/59] [D loss: 0.694480] [G loss: 0.694290]\n",
"[Epoch 9/200] [Batch 2/59] [D loss: 0.692899] [G loss: 0.691526]\n",
"[Epoch 9/200] [Batch 3/59] [D loss: 0.692175] [G loss: 0.688782]\n",
"[Epoch 9/200] [Batch 4/59] [D loss: 0.692704] [G loss: 0.685044]\n",
"[Epoch 9/200] [Batch 5/59] [D loss: 0.691575] [G loss: 0.681822]\n",
"[Epoch 9/200] [Batch 6/59] [D loss: 0.690368] [G loss: 0.678231]\n",
"[Epoch 9/200] [Batch 7/59] [D loss: 0.690132] [G loss: 0.675422]\n",
"[Epoch 9/200] [Batch 8/59] [D loss: 0.689961] [G loss: 0.672009]\n",
"[Epoch 9/200] [Batch 9/59] [D loss: 0.689525] [G loss: 0.667953]\n",
"[Epoch 9/200] [Batch 10/59] [D loss: 0.690034] [G loss: 0.667166]\n",
"[Epoch 9/200] [Batch 11/59] [D loss: 0.690410] [G loss: 0.665238]\n",
"[Epoch 9/200] [Batch 12/59] [D loss: 0.690521] [G loss: 0.663930]\n",
"[Epoch 9/200] [Batch 13/59] [D loss: 0.691399] [G loss: 0.664695]\n",
"[Epoch 9/200] [Batch 14/59] [D loss: 0.691323] [G loss: 0.666928]\n",
"[Epoch 9/200] [Batch 15/59] [D loss: 0.692145] [G loss: 0.668375]\n",
"[Epoch 9/200] [Batch 16/59] [D loss: 0.692780] [G loss: 0.671809]\n",
"[Epoch 9/200] [Batch 17/59] [D loss: 0.692565] [G loss: 0.676376]\n",
"[Epoch 9/200] [Batch 18/59] [D loss: 0.692773] [G loss: 0.681828]\n",
"[Epoch 9/200] [Batch 19/59] [D loss: 0.693161] [G loss: 0.688571]\n",
"[Epoch 9/200] [Batch 20/59] [D loss: 0.694191] [G loss: 0.693942]\n",
"[Epoch 9/200] [Batch 21/59] [D loss: 0.693701] [G loss: 0.700106]\n",
"[Epoch 9/200] [Batch 22/59] [D loss: 0.693659] [G loss: 0.705481]\n",
"[Epoch 9/200] [Batch 23/59] [D loss: 0.693936] [G loss: 0.710904]\n",
"[Epoch 9/200] [Batch 24/59] [D loss: 0.694670] [G loss: 0.713442]\n",
"[Epoch 9/200] [Batch 25/59] [D loss: 0.694936] [G loss: 0.716430]\n",
"[Epoch 9/200] [Batch 26/59] [D loss: 0.694991] [G loss: 0.716557]\n",
"[Epoch 9/200] [Batch 27/59] [D loss: 0.695447] [G loss: 0.716318]\n",
"[Epoch 9/200] [Batch 28/59] [D loss: 0.695698] [G loss: 0.717806]\n",
"[Epoch 9/200] [Batch 29/59] [D loss: 0.696498] [G loss: 0.714977]\n",
"[Epoch 9/200] [Batch 30/59] [D loss: 0.696952] [G loss: 0.713593]\n",
"[Epoch 9/200] [Batch 31/59] [D loss: 0.696691] [G loss: 0.710937]\n",
"[Epoch 9/200] [Batch 32/59] [D loss: 0.696227] [G loss: 0.709209]\n",
"[Epoch 9/200] [Batch 33/59] [D loss: 0.695683] [G loss: 0.706196]\n",
"[Epoch 9/200] [Batch 34/59] [D loss: 0.694120] [G loss: 0.704917]\n",
"[Epoch 9/200] [Batch 35/59] [D loss: 0.693650] [G loss: 0.702461]\n",
"[Epoch 9/200] [Batch 36/59] [D loss: 0.691976] [G loss: 0.700251]\n",
"[Epoch 9/200] [Batch 37/59] [D loss: 0.690905] [G loss: 0.697068]\n",
"[Epoch 9/200] [Batch 38/59] [D loss: 0.688820] [G loss: 0.695360]\n",
"[Epoch 9/200] [Batch 39/59] [D loss: 0.687097] [G loss: 0.692447]\n",
"[Epoch 9/200] [Batch 40/59] [D loss: 0.686249] [G loss: 0.687414]\n",
"[Epoch 9/200] [Batch 41/59] [D loss: 0.686805] [G loss: 0.681775]\n",
"[Epoch 9/200] [Batch 42/59] [D loss: 0.685863] [G loss: 0.677394]\n",
"[Epoch 9/200] [Batch 43/59] [D loss: 0.686580] [G loss: 0.672243]\n",
"[Epoch 9/200] [Batch 44/59] [D loss: 0.687710] [G loss: 0.668473]\n",
"[Epoch 9/200] [Batch 45/59] [D loss: 0.687851] [G loss: 0.666558]\n",
"[Epoch 9/200] [Batch 46/59] [D loss: 0.688445] [G loss: 0.670815]\n",
"[Epoch 9/200] [Batch 47/59] [D loss: 0.689597] [G loss: 0.671903]\n",
"[Epoch 9/200] [Batch 48/59] [D loss: 0.691640] [G loss: 0.675851]\n",
"[Epoch 9/200] [Batch 49/59] [D loss: 0.690251] [G loss: 0.683956]\n",
"[Epoch 9/200] [Batch 50/59] [D loss: 0.692771] [G loss: 0.688749]\n",
"[Epoch 9/200] [Batch 51/59] [D loss: 0.692027] [G loss: 0.697278]\n",
"[Epoch 9/200] [Batch 52/59] [D loss: 0.694054] [G loss: 0.698591]\n",
"[Epoch 9/200] [Batch 53/59] [D loss: 0.693724] [G loss: 0.704164]\n",
"[Epoch 9/200] [Batch 54/59] [D loss: 0.693920] [G loss: 0.708476]\n",
"[Epoch 9/200] [Batch 55/59] [D loss: 0.694350] [G loss: 0.708867]\n",
"[Epoch 9/200] [Batch 56/59] [D loss: 0.693713] [G loss: 0.710203]\n",
"[Epoch 9/200] [Batch 57/59] [D loss: 0.692572] [G loss: 0.710346]\n",
"[Epoch 9/200] [Batch 58/59] [D loss: 0.694075] [G loss: 0.707114]\n",
"[Epoch 10/200] [Batch 0/59] [D loss: 0.692904] [G loss: 0.707080]\n",
"[Epoch 10/200] [Batch 1/59] [D loss: 0.693502] [G loss: 0.703729]\n",
"[Epoch 10/200] [Batch 2/59] [D loss: 0.691705] [G loss: 0.702351]\n",
"[Epoch 10/200] [Batch 3/59] [D loss: 0.691651] [G loss: 0.695779]\n",
"[Epoch 10/200] [Batch 4/59] [D loss: 0.691546] [G loss: 0.692388]\n",
"[Epoch 10/200] [Batch 5/59] [D loss: 0.692575] [G loss: 0.684904]\n",
"[Epoch 10/200] [Batch 6/59] [D loss: 0.694378] [G loss: 0.681139]\n",
"[Epoch 10/200] [Batch 7/59] [D loss: 0.693643] [G loss: 0.676323]\n",
"[Epoch 10/200] [Batch 8/59] [D loss: 0.695010] [G loss: 0.672132]\n",
"[Epoch 10/200] [Batch 9/59] [D loss: 0.695080] [G loss: 0.670683]\n",
"[Epoch 10/200] [Batch 10/59] [D loss: 0.695259] [G loss: 0.668638]\n",
"[Epoch 10/200] [Batch 11/59] [D loss: 0.697028] [G loss: 0.667039]\n",
"[Epoch 10/200] [Batch 12/59] [D loss: 0.697399] [G loss: 0.667819]\n",
"[Epoch 10/200] [Batch 13/59] [D loss: 0.697105] [G loss: 0.670305]\n",
"[Epoch 10/200] [Batch 14/59] [D loss: 0.697061] [G loss: 0.675097]\n",
"[Epoch 10/200] [Batch 15/59] [D loss: 0.696915] [G loss: 0.680594]\n",
"[Epoch 10/200] [Batch 16/59] [D loss: 0.696047] [G loss: 0.684132]\n",
"[Epoch 10/200] [Batch 17/59] [D loss: 0.695489] [G loss: 0.689019]\n",
"[Epoch 10/200] [Batch 18/59] [D loss: 0.694823] [G loss: 0.691259]\n",
"[Epoch 10/200] [Batch 19/59] [D loss: 0.695070] [G loss: 0.696464]\n",
"[Epoch 10/200] [Batch 20/59] [D loss: 0.693528] [G loss: 0.698511]\n",
"[Epoch 10/200] [Batch 21/59] [D loss: 0.693334] [G loss: 0.701753]\n",
"[Epoch 10/200] [Batch 22/59] [D loss: 0.692721] [G loss: 0.704590]\n",
"[Epoch 10/200] [Batch 23/59] [D loss: 0.691703] [G loss: 0.707362]\n",
"[Epoch 10/200] [Batch 24/59] [D loss: 0.691351] [G loss: 0.709099]\n",
"[Epoch 10/200] [Batch 25/59] [D loss: 0.690384] [G loss: 0.710851]\n",
"[Epoch 10/200] [Batch 26/59] [D loss: 0.690160] [G loss: 0.711327]\n",
"[Epoch 10/200] [Batch 27/59] [D loss: 0.688899] [G loss: 0.713262]\n",
"[Epoch 10/200] [Batch 28/59] [D loss: 0.689064] [G loss: 0.711956]\n",
"[Epoch 10/200] [Batch 29/59] [D loss: 0.686837] [G loss: 0.714768]\n",
"[Epoch 10/200] [Batch 30/59] [D loss: 0.686293] [G loss: 0.714699]\n",
"[Epoch 10/200] [Batch 31/59] [D loss: 0.686230] [G loss: 0.713466]\n",
"[Epoch 10/200] [Batch 32/59] [D loss: 0.685559] [G loss: 0.713997]\n",
"[Epoch 10/200] [Batch 33/59] [D loss: 0.685219] [G loss: 0.713220]\n",
"[Epoch 10/200] [Batch 34/59] [D loss: 0.685790] [G loss: 0.711202]\n",
"[Epoch 10/200] [Batch 35/59] [D loss: 0.686988] [G loss: 0.705888]\n",
"[Epoch 10/200] [Batch 36/59] [D loss: 0.688667] [G loss: 0.703319]\n",
"[Epoch 10/200] [Batch 37/59] [D loss: 0.689465] [G loss: 0.699413]\n",
"[Epoch 10/200] [Batch 38/59] [D loss: 0.693245] [G loss: 0.692795]\n",
"[Epoch 10/200] [Batch 39/59] [D loss: 0.695301] [G loss: 0.685515]\n",
"[Epoch 10/200] [Batch 40/59] [D loss: 0.698985] [G loss: 0.678214]\n",
"[Epoch 10/200] [Batch 41/59] [D loss: 0.698728] [G loss: 0.676767]\n",
"[Epoch 10/200] [Batch 42/59] [D loss: 0.700120] [G loss: 0.672060]\n",
"[Epoch 10/200] [Batch 43/59] [D loss: 0.699803] [G loss: 0.671531]\n",
"[Epoch 10/200] [Batch 44/59] [D loss: 0.696268] [G loss: 0.671396]\n",
"[Epoch 10/200] [Batch 45/59] [D loss: 0.695211] [G loss: 0.675077]\n",
"[Epoch 10/200] [Batch 46/59] [D loss: 0.693569] [G loss: 0.675106]\n",
"[Epoch 10/200] [Batch 47/59] [D loss: 0.690816] [G loss: 0.677537]\n",
"[Epoch 10/200] [Batch 48/59] [D loss: 0.689884] [G loss: 0.678517]\n",
"[Epoch 10/200] [Batch 49/59] [D loss: 0.688890] [G loss: 0.676108]\n",
"[Epoch 10/200] [Batch 50/59] [D loss: 0.686437] [G loss: 0.676952]\n",
"[Epoch 10/200] [Batch 51/59] [D loss: 0.686574] [G loss: 0.672033]\n",
"[Epoch 10/200] [Batch 52/59] [D loss: 0.685658] [G loss: 0.671854]\n",
"[Epoch 10/200] [Batch 53/59] [D loss: 0.687153] [G loss: 0.670111]\n",
"[Epoch 10/200] [Batch 54/59] [D loss: 0.690112] [G loss: 0.667593]\n",
"[Epoch 10/200] [Batch 55/59] [D loss: 0.691480] [G loss: 0.667659]\n",
"[Epoch 10/200] [Batch 56/59] [D loss: 0.695808] [G loss: 0.667695]\n",
"[Epoch 10/200] [Batch 57/59] [D loss: 0.699722] [G loss: 0.668191]\n",
"[Epoch 10/200] [Batch 58/59] [D loss: 0.703375] [G loss: 0.672880]\n",
"[Epoch 11/200] [Batch 0/59] [D loss: 0.704311] [G loss: 0.681134]\n",
"[Epoch 11/200] [Batch 1/59] [D loss: 0.703311] [G loss: 0.689698]\n",
"[Epoch 11/200] [Batch 2/59] [D loss: 0.702264] [G loss: 0.697760]\n",
"[Epoch 11/200] [Batch 3/59] [D loss: 0.699913] [G loss: 0.704584]\n",
"[Epoch 11/200] [Batch 4/59] [D loss: 0.698103] [G loss: 0.709145]\n",
"[Epoch 11/200] [Batch 5/59] [D loss: 0.697447] [G loss: 0.712225]\n",
"[Epoch 11/200] [Batch 6/59] [D loss: 0.696190] [G loss: 0.714767]\n",
"[Epoch 11/200] [Batch 7/59] [D loss: 0.694338] [G loss: 0.716139]\n",
"[Epoch 11/200] [Batch 8/59] [D loss: 0.694277] [G loss: 0.716463]\n",
"[Epoch 11/200] [Batch 9/59] [D loss: 0.692478] [G loss: 0.717143]\n",
"[Epoch 11/200] [Batch 10/59] [D loss: 0.692515] [G loss: 0.716218]\n",
"[Epoch 11/200] [Batch 11/59] [D loss: 0.691659] [G loss: 0.715618]\n",
"[Epoch 11/200] [Batch 12/59] [D loss: 0.691314] [G loss: 0.714675]\n",
"[Epoch 11/200] [Batch 13/59] [D loss: 0.690258] [G loss: 0.713487]\n",
"[Epoch 11/200] [Batch 14/59] [D loss: 0.689334] [G loss: 0.713272]\n",
"[Epoch 11/200] [Batch 15/59] [D loss: 0.688735] [G loss: 0.712469]\n",
"[Epoch 11/200] [Batch 16/59] [D loss: 0.687931] [G loss: 0.711042]\n",
"[Epoch 11/200] [Batch 17/59] [D loss: 0.687410] [G loss: 0.709323]\n",
"[Epoch 11/200] [Batch 18/59] [D loss: 0.685803] [G loss: 0.709073]\n",
"[Epoch 11/200] [Batch 19/59] [D loss: 0.685464] [G loss: 0.708792]\n",
"[Epoch 11/200] [Batch 20/59] [D loss: 0.684568] [G loss: 0.707721]\n",
"[Epoch 11/200] [Batch 21/59] [D loss: 0.683119] [G loss: 0.707126]\n",
"[Epoch 11/200] [Batch 22/59] [D loss: 0.684342] [G loss: 0.704093]\n",
"[Epoch 11/200] [Batch 23/59] [D loss: 0.684435] [G loss: 0.704144]\n",
"[Epoch 11/200] [Batch 24/59] [D loss: 0.687026] [G loss: 0.700063]\n",
"[Epoch 11/200] [Batch 25/59] [D loss: 0.688620] [G loss: 0.701554]\n",
"[Epoch 11/200] [Batch 26/59] [D loss: 0.691452] [G loss: 0.699636]\n",
"[Epoch 11/200] [Batch 27/59] [D loss: 0.696290] [G loss: 0.696136]\n",
"[Epoch 11/200] [Batch 28/59] [D loss: 0.700004] [G loss: 0.698414]\n",
"[Epoch 11/200] [Batch 29/59] [D loss: 0.702279] [G loss: 0.694518]\n",
"[Epoch 11/200] [Batch 30/59] [D loss: 0.704218] [G loss: 0.692346]\n",
"[Epoch 11/200] [Batch 31/59] [D loss: 0.703648] [G loss: 0.695037]\n",
"[Epoch 11/200] [Batch 32/59] [D loss: 0.702746] [G loss: 0.694302]\n",
"[Epoch 11/200] [Batch 33/59] [D loss: 0.700848] [G loss: 0.697100]\n",
"[Epoch 11/200] [Batch 34/59] [D loss: 0.699188] [G loss: 0.699520]\n",
"[Epoch 11/200] [Batch 35/59] [D loss: 0.697701] [G loss: 0.698802]\n",
"[Epoch 11/200] [Batch 36/59] [D loss: 0.694451] [G loss: 0.700719]\n",
"[Epoch 11/200] [Batch 37/59] [D loss: 0.693115] [G loss: 0.700736]\n",
"[Epoch 11/200] [Batch 38/59] [D loss: 0.691794] [G loss: 0.700263]\n",
"[Epoch 11/200] [Batch 39/59] [D loss: 0.690234] [G loss: 0.699027]\n",
"[Epoch 11/200] [Batch 40/59] [D loss: 0.688873] [G loss: 0.696963]\n",
"[Epoch 11/200] [Batch 41/59] [D loss: 0.687566] [G loss: 0.693139]\n",
"[Epoch 11/200] [Batch 42/59] [D loss: 0.685574] [G loss: 0.687501]\n",
"[Epoch 11/200] [Batch 43/59] [D loss: 0.683348] [G loss: 0.682495]\n",
"[Epoch 11/200] [Batch 44/59] [D loss: 0.682921] [G loss: 0.674590]\n",
"[Epoch 11/200] [Batch 45/59] [D loss: 0.681671] [G loss: 0.669715]\n",
"[Epoch 11/200] [Batch 46/59] [D loss: 0.683152] [G loss: 0.661244]\n",
"[Epoch 11/200] [Batch 47/59] [D loss: 0.683231] [G loss: 0.657217]\n",
"[Epoch 11/200] [Batch 48/59] [D loss: 0.686577] [G loss: 0.651715]\n",
"[Epoch 11/200] [Batch 49/59] [D loss: 0.689720] [G loss: 0.649786]\n",
"[Epoch 11/200] [Batch 50/59] [D loss: 0.693681] [G loss: 0.652892]\n",
"[Epoch 11/200] [Batch 51/59] [D loss: 0.695942] [G loss: 0.662045]\n",
"[Epoch 11/200] [Batch 52/59] [D loss: 0.698220] [G loss: 0.671636]\n",
"[Epoch 11/200] [Batch 53/59] [D loss: 0.700894] [G loss: 0.681657]\n",
"[Epoch 11/200] [Batch 54/59] [D loss: 0.704370] [G loss: 0.685056]\n",
"[Epoch 11/200] [Batch 55/59] [D loss: 0.707675] [G loss: 0.690139]\n",
"[Epoch 11/200] [Batch 56/59] [D loss: 0.706786] [G loss: 0.693553]\n",
"[Epoch 11/200] [Batch 57/59] [D loss: 0.705975] [G loss: 0.696625]\n",
"[Epoch 11/200] [Batch 58/59] [D loss: 0.701801] [G loss: 0.704375]\n",
"[Epoch 12/200] [Batch 0/59] [D loss: 0.698747] [G loss: 0.706727]\n",
"[Epoch 12/200] [Batch 1/59] [D loss: 0.696215] [G loss: 0.711146]\n",
"[Epoch 12/200] [Batch 2/59] [D loss: 0.693937] [G loss: 0.712462]\n",
"[Epoch 12/200] [Batch 3/59] [D loss: 0.691936] [G loss: 0.713215]\n",
"[Epoch 12/200] [Batch 4/59] [D loss: 0.690550] [G loss: 0.713410]\n",
"[Epoch 12/200] [Batch 5/59] [D loss: 0.689197] [G loss: 0.712608]\n",
"[Epoch 12/200] [Batch 6/59] [D loss: 0.687874] [G loss: 0.711812]\n",
"[Epoch 12/200] [Batch 7/59] [D loss: 0.687357] [G loss: 0.710140]\n",
"[Epoch 12/200] [Batch 8/59] [D loss: 0.685810] [G loss: 0.707411]\n",
"[Epoch 12/200] [Batch 9/59] [D loss: 0.685194] [G loss: 0.706128]\n",
"[Epoch 12/200] [Batch 10/59] [D loss: 0.684005] [G loss: 0.704538]\n",
"[Epoch 12/200] [Batch 11/59] [D loss: 0.682898] [G loss: 0.703167]\n",
"[Epoch 12/200] [Batch 12/59] [D loss: 0.681939] [G loss: 0.700541]\n",
"[Epoch 12/200] [Batch 13/59] [D loss: 0.681285] [G loss: 0.699013]\n",
"[Epoch 12/200] [Batch 14/59] [D loss: 0.680940] [G loss: 0.697221]\n",
"[Epoch 12/200] [Batch 15/59] [D loss: 0.680356] [G loss: 0.699958]\n",
"[Epoch 12/200] [Batch 16/59] [D loss: 0.681438] [G loss: 0.699029]\n",
"[Epoch 12/200] [Batch 17/59] [D loss: 0.681629] [G loss: 0.706019]\n",
"[Epoch 12/200] [Batch 18/59] [D loss: 0.683356] [G loss: 0.708678]\n",
"[Epoch 12/200] [Batch 19/59] [D loss: 0.685357] [G loss: 0.717362]\n",
"[Epoch 12/200] [Batch 20/59] [D loss: 0.691759] [G loss: 0.720655]\n",
"[Epoch 12/200] [Batch 21/59] [D loss: 0.694656] [G loss: 0.717919]\n",
"[Epoch 12/200] [Batch 22/59] [D loss: 0.702108] [G loss: 0.712596]\n",
"[Epoch 12/200] [Batch 23/59] [D loss: 0.707383] [G loss: 0.704936]\n",
"[Epoch 12/200] [Batch 24/59] [D loss: 0.710918] [G loss: 0.692190]\n",
"[Epoch 12/200] [Batch 25/59] [D loss: 0.710098] [G loss: 0.687294]\n",
"[Epoch 12/200] [Batch 26/59] [D loss: 0.707674] [G loss: 0.683672]\n",
"[Epoch 12/200] [Batch 27/59] [D loss: 0.702759] [G loss: 0.686196]\n",
"[Epoch 12/200] [Batch 28/59] [D loss: 0.699927] [G loss: 0.687240]\n",
"[Epoch 12/200] [Batch 29/59] [D loss: 0.696267] [G loss: 0.686888]\n",
"[Epoch 12/200] [Batch 30/59] [D loss: 0.693630] [G loss: 0.685848]\n",
"[Epoch 12/200] [Batch 31/59] [D loss: 0.690520] [G loss: 0.684995]\n",
"[Epoch 12/200] [Batch 32/59] [D loss: 0.688664] [G loss: 0.682152]\n",
"[Epoch 12/200] [Batch 33/59] [D loss: 0.687389] [G loss: 0.676776]\n",
"[Epoch 12/200] [Batch 34/59] [D loss: 0.686008] [G loss: 0.673335]\n",
"[Epoch 12/200] [Batch 35/59] [D loss: 0.685551] [G loss: 0.668025]\n",
"[Epoch 12/200] [Batch 36/59] [D loss: 0.684960] [G loss: 0.662853]\n",
"[Epoch 12/200] [Batch 37/59] [D loss: 0.683804] [G loss: 0.658924]\n",
"[Epoch 12/200] [Batch 38/59] [D loss: 0.683799] [G loss: 0.657574]\n",
"[Epoch 12/200] [Batch 39/59] [D loss: 0.682590] [G loss: 0.658264]\n",
"[Epoch 12/200] [Batch 40/59] [D loss: 0.681870] [G loss: 0.658575]\n",
"[Epoch 12/200] [Batch 41/59] [D loss: 0.682700] [G loss: 0.661108]\n",
"[Epoch 12/200] [Batch 42/59] [D loss: 0.681658] [G loss: 0.664453]\n",
"[Epoch 12/200] [Batch 43/59] [D loss: 0.686436] [G loss: 0.669707]\n",
"[Epoch 12/200] [Batch 44/59] [D loss: 0.690551] [G loss: 0.673011]\n",
"[Epoch 12/200] [Batch 45/59] [D loss: 0.697359] [G loss: 0.675927]\n",
"[Epoch 12/200] [Batch 46/59] [D loss: 0.703196] [G loss: 0.677411]\n",
"[Epoch 12/200] [Batch 47/59] [D loss: 0.709412] [G loss: 0.678176]\n",
"[Epoch 12/200] [Batch 48/59] [D loss: 0.711304] [G loss: 0.680945]\n",
"[Epoch 12/200] [Batch 49/59] [D loss: 0.705987] [G loss: 0.690189]\n",
"[Epoch 12/200] [Batch 50/59] [D loss: 0.702045] [G loss: 0.700519]\n",
"[Epoch 12/200] [Batch 51/59] [D loss: 0.698209] [G loss: 0.708535]\n",
"[Epoch 12/200] [Batch 52/59] [D loss: 0.694435] [G loss: 0.715417]\n",
"[Epoch 12/200] [Batch 53/59] [D loss: 0.692466] [G loss: 0.720070]\n",
"[Epoch 12/200] [Batch 54/59] [D loss: 0.690213] [G loss: 0.720813]\n",
"[Epoch 12/200] [Batch 55/59] [D loss: 0.688162] [G loss: 0.722465]\n",
"[Epoch 12/200] [Batch 56/59] [D loss: 0.687220] [G loss: 0.723369]\n",
"[Epoch 12/200] [Batch 57/59] [D loss: 0.685299] [G loss: 0.723297]\n",
"[Epoch 12/200] [Batch 58/59] [D loss: 0.683961] [G loss: 0.725510]\n",
"[Epoch 13/200] [Batch 0/59] [D loss: 0.682553] [G loss: 0.723108]\n",
"[Epoch 13/200] [Batch 1/59] [D loss: 0.681979] [G loss: 0.724258]\n",
"[Epoch 13/200] [Batch 2/59] [D loss: 0.678573] [G loss: 0.728131]\n",
"[Epoch 13/200] [Batch 3/59] [D loss: 0.677051] [G loss: 0.731084]\n",
"[Epoch 13/200] [Batch 4/59] [D loss: 0.678176] [G loss: 0.728931]\n",
"[Epoch 13/200] [Batch 5/59] [D loss: 0.677731] [G loss: 0.732207]\n",
"[Epoch 13/200] [Batch 6/59] [D loss: 0.680105] [G loss: 0.736404]\n",
"[Epoch 13/200] [Batch 7/59] [D loss: 0.682622] [G loss: 0.736530]\n",
"[Epoch 13/200] [Batch 8/59] [D loss: 0.689157] [G loss: 0.730592]\n",
"[Epoch 13/200] [Batch 9/59] [D loss: 0.698350] [G loss: 0.724148]\n",
"[Epoch 13/200] [Batch 10/59] [D loss: 0.705718] [G loss: 0.713213]\n",
"[Epoch 13/200] [Batch 11/59] [D loss: 0.712568] [G loss: 0.697551]\n",
"[Epoch 13/200] [Batch 12/59] [D loss: 0.717331] [G loss: 0.684388]\n",
"[Epoch 13/200] [Batch 13/59] [D loss: 0.714577] [G loss: 0.679167]\n",
"[Epoch 13/200] [Batch 14/59] [D loss: 0.707732] [G loss: 0.678402]\n",
"[Epoch 13/200] [Batch 15/59] [D loss: 0.703650] [G loss: 0.677601]\n",
"[Epoch 13/200] [Batch 16/59] [D loss: 0.698750] [G loss: 0.677302]\n",
"[Epoch 13/200] [Batch 17/59] [D loss: 0.695386] [G loss: 0.677449]\n",
"[Epoch 13/200] [Batch 18/59] [D loss: 0.691475] [G loss: 0.675918]\n",
"[Epoch 13/200] [Batch 19/59] [D loss: 0.690465] [G loss: 0.673930]\n",
"[Epoch 13/200] [Batch 20/59] [D loss: 0.688153] [G loss: 0.670467]\n",
"[Epoch 13/200] [Batch 21/59] [D loss: 0.686412] [G loss: 0.667620]\n",
"[Epoch 13/200] [Batch 22/59] [D loss: 0.685941] [G loss: 0.663994]\n",
"[Epoch 13/200] [Batch 23/59] [D loss: 0.683274] [G loss: 0.660694]\n",
"[Epoch 13/200] [Batch 24/59] [D loss: 0.682576] [G loss: 0.657878]\n",
"[Epoch 13/200] [Batch 25/59] [D loss: 0.680509] [G loss: 0.656686]\n",
"[Epoch 13/200] [Batch 26/59] [D loss: 0.677964] [G loss: 0.656382]\n",
"[Epoch 13/200] [Batch 27/59] [D loss: 0.679238] [G loss: 0.653781]\n",
"[Epoch 13/200] [Batch 28/59] [D loss: 0.677656] [G loss: 0.658589]\n",
"[Epoch 13/200] [Batch 29/59] [D loss: 0.678459] [G loss: 0.659296]\n",
"[Epoch 13/200] [Batch 30/59] [D loss: 0.683426] [G loss: 0.656194]\n",
"[Epoch 13/200] [Batch 31/59] [D loss: 0.687887] [G loss: 0.654902]\n",
"[Epoch 13/200] [Batch 32/59] [D loss: 0.696159] [G loss: 0.656868]\n",
"[Epoch 13/200] [Batch 33/59] [D loss: 0.703490] [G loss: 0.652983]\n",
"[Epoch 13/200] [Batch 34/59] [D loss: 0.706852] [G loss: 0.652376]\n",
"[Epoch 13/200] [Batch 35/59] [D loss: 0.707744] [G loss: 0.653359]\n",
"[Epoch 13/200] [Batch 36/59] [D loss: 0.703999] [G loss: 0.659896]\n",
"[Epoch 13/200] [Batch 37/59] [D loss: 0.699915] [G loss: 0.669509]\n",
"[Epoch 13/200] [Batch 38/59] [D loss: 0.697573] [G loss: 0.678867]\n",
"[Epoch 13/200] [Batch 39/59] [D loss: 0.694085] [G loss: 0.685554]\n",
"[Epoch 13/200] [Batch 40/59] [D loss: 0.693001] [G loss: 0.692715]\n",
"[Epoch 13/200] [Batch 41/59] [D loss: 0.692131] [G loss: 0.696701]\n",
"[Epoch 13/200] [Batch 42/59] [D loss: 0.691948] [G loss: 0.700969]\n",
"[Epoch 13/200] [Batch 43/59] [D loss: 0.693573] [G loss: 0.701953]\n",
"[Epoch 13/200] [Batch 44/59] [D loss: 0.691326] [G loss: 0.708092]\n",
"[Epoch 13/200] [Batch 45/59] [D loss: 0.690990] [G loss: 0.713729]\n",
"[Epoch 13/200] [Batch 46/59] [D loss: 0.690927] [G loss: 0.716805]\n",
"[Epoch 13/200] [Batch 47/59] [D loss: 0.689298] [G loss: 0.720115]\n",
"[Epoch 13/200] [Batch 48/59] [D loss: 0.687817] [G loss: 0.725762]\n",
"[Epoch 13/200] [Batch 49/59] [D loss: 0.687591] [G loss: 0.729917]\n",
"[Epoch 13/200] [Batch 50/59] [D loss: 0.686909] [G loss: 0.736601]\n",
"[Epoch 13/200] [Batch 51/59] [D loss: 0.685558] [G loss: 0.743603]\n",
"[Epoch 13/200] [Batch 52/59] [D loss: 0.683879] [G loss: 0.749887]\n",
"[Epoch 13/200] [Batch 53/59] [D loss: 0.684279] [G loss: 0.751942]\n",
"[Epoch 13/200] [Batch 54/59] [D loss: 0.686021] [G loss: 0.753020]\n",
"[Epoch 13/200] [Batch 55/59] [D loss: 0.687901] [G loss: 0.752087]\n",
"[Epoch 13/200] [Batch 56/59] [D loss: 0.691960] [G loss: 0.744812]\n",
"[Epoch 13/200] [Batch 57/59] [D loss: 0.695971] [G loss: 0.732055]\n",
"[Epoch 13/200] [Batch 58/59] [D loss: 0.698391] [G loss: 0.720011]\n",
"[Epoch 14/200] [Batch 0/59] [D loss: 0.697629] [G loss: 0.707888]\n",
"[Epoch 14/200] [Batch 1/59] [D loss: 0.696497] [G loss: 0.697483]\n",
"[Epoch 14/200] [Batch 2/59] [D loss: 0.692719] [G loss: 0.685069]\n",
"[Epoch 14/200] [Batch 3/59] [D loss: 0.691976] [G loss: 0.672188]\n",
"[Epoch 14/200] [Batch 4/59] [D loss: 0.693562] [G loss: 0.659441]\n",
"[Epoch 14/200] [Batch 5/59] [D loss: 0.694072] [G loss: 0.652971]\n",
"[Epoch 14/200] [Batch 6/59] [D loss: 0.694451] [G loss: 0.652698]\n",
"[Epoch 14/200] [Batch 7/59] [D loss: 0.692071] [G loss: 0.658952]\n",
"[Epoch 14/200] [Batch 8/59] [D loss: 0.690152] [G loss: 0.667860]\n",
"[Epoch 14/200] [Batch 9/59] [D loss: 0.690576] [G loss: 0.680762]\n",
"[Epoch 14/200] [Batch 10/59] [D loss: 0.688768] [G loss: 0.690597]\n",
"[Epoch 14/200] [Batch 11/59] [D loss: 0.686766] [G loss: 0.699017]\n",
"[Epoch 14/200] [Batch 12/59] [D loss: 0.685069] [G loss: 0.711739]\n",
"[Epoch 14/200] [Batch 13/59] [D loss: 0.686480] [G loss: 0.714725]\n",
"[Epoch 14/200] [Batch 14/59] [D loss: 0.686390] [G loss: 0.715451]\n",
"[Epoch 14/200] [Batch 15/59] [D loss: 0.690138] [G loss: 0.713724]\n",
"[Epoch 14/200] [Batch 16/59] [D loss: 0.691504] [G loss: 0.707338]\n",
"[Epoch 14/200] [Batch 17/59] [D loss: 0.691838] [G loss: 0.702308]\n",
"[Epoch 14/200] [Batch 18/59] [D loss: 0.694582] [G loss: 0.692729]\n",
"[Epoch 14/200] [Batch 19/59] [D loss: 0.695099] [G loss: 0.686287]\n",
"[Epoch 14/200] [Batch 20/59] [D loss: 0.697016] [G loss: 0.677721]\n",
"[Epoch 14/200] [Batch 21/59] [D loss: 0.694736] [G loss: 0.675364]\n",
"[Epoch 14/200] [Batch 22/59] [D loss: 0.695988] [G loss: 0.671864]\n",
"[Epoch 14/200] [Batch 23/59] [D loss: 0.694951] [G loss: 0.672425]\n",
"[Epoch 14/200] [Batch 24/59] [D loss: 0.696773] [G loss: 0.674455]\n",
"[Epoch 14/200] [Batch 25/59] [D loss: 0.696352] [G loss: 0.678755]\n",
"[Epoch 14/200] [Batch 26/59] [D loss: 0.694635] [G loss: 0.686094]\n",
"[Epoch 14/200] [Batch 27/59] [D loss: 0.692916] [G loss: 0.692328]\n",
"[Epoch 14/200] [Batch 28/59] [D loss: 0.691792] [G loss: 0.700219]\n",
"[Epoch 14/200] [Batch 29/59] [D loss: 0.690875] [G loss: 0.706379]\n",
"[Epoch 14/200] [Batch 30/59] [D loss: 0.689639] [G loss: 0.709777]\n",
"[Epoch 14/200] [Batch 31/59] [D loss: 0.686840] [G loss: 0.714047]\n",
"[Epoch 14/200] [Batch 32/59] [D loss: 0.686168] [G loss: 0.716894]\n",
"[Epoch 14/200] [Batch 33/59] [D loss: 0.686645] [G loss: 0.715013]\n",
"[Epoch 14/200] [Batch 34/59] [D loss: 0.686478] [G loss: 0.715138]\n",
"[Epoch 14/200] [Batch 35/59] [D loss: 0.684042] [G loss: 0.716811]\n",
"[Epoch 14/200] [Batch 36/59] [D loss: 0.687309] [G loss: 0.710740]\n",
"[Epoch 14/200] [Batch 37/59] [D loss: 0.686044] [G loss: 0.709140]\n",
"[Epoch 14/200] [Batch 38/59] [D loss: 0.687344] [G loss: 0.703422]\n",
"[Epoch 14/200] [Batch 39/59] [D loss: 0.685842] [G loss: 0.699122]\n",
"[Epoch 14/200] [Batch 40/59] [D loss: 0.687989] [G loss: 0.693541]\n",
"[Epoch 14/200] [Batch 41/59] [D loss: 0.688456] [G loss: 0.685598]\n",
"[Epoch 14/200] [Batch 42/59] [D loss: 0.688322] [G loss: 0.679386]\n",
"[Epoch 14/200] [Batch 43/59] [D loss: 0.692011] [G loss: 0.673921]\n",
"[Epoch 14/200] [Batch 44/59] [D loss: 0.692717] [G loss: 0.670426]\n",
"[Epoch 14/200] [Batch 45/59] [D loss: 0.692369] [G loss: 0.666933]\n",
"[Epoch 14/200] [Batch 46/59] [D loss: 0.691918] [G loss: 0.666639]\n",
"[Epoch 14/200] [Batch 47/59] [D loss: 0.694439] [G loss: 0.665319]\n",
"[Epoch 14/200] [Batch 48/59] [D loss: 0.693085] [G loss: 0.672009]\n",
"[Epoch 14/200] [Batch 49/59] [D loss: 0.694403] [G loss: 0.673625]\n",
"[Epoch 14/200] [Batch 50/59] [D loss: 0.693457] [G loss: 0.676104]\n",
"[Epoch 14/200] [Batch 51/59] [D loss: 0.692600] [G loss: 0.687249]\n",
"[Epoch 14/200] [Batch 52/59] [D loss: 0.691761] [G loss: 0.687143]\n",
"[Epoch 14/200] [Batch 53/59] [D loss: 0.691598] [G loss: 0.689166]\n",
"[Epoch 14/200] [Batch 54/59] [D loss: 0.690157] [G loss: 0.691830]\n",
"[Epoch 14/200] [Batch 55/59] [D loss: 0.689376] [G loss: 0.692943]\n",
"[Epoch 14/200] [Batch 56/59] [D loss: 0.688064] [G loss: 0.693543]\n",
"[Epoch 14/200] [Batch 57/59] [D loss: 0.688335] [G loss: 0.689475]\n",
"[Epoch 14/200] [Batch 58/59] [D loss: 0.686649] [G loss: 0.686952]\n",
"[Epoch 15/200] [Batch 0/59] [D loss: 0.685152] [G loss: 0.687098]\n",
"[Epoch 15/200] [Batch 1/59] [D loss: 0.684606] [G loss: 0.685131]\n",
"[Epoch 15/200] [Batch 2/59] [D loss: 0.686136] [G loss: 0.679458]\n",
"[Epoch 15/200] [Batch 3/59] [D loss: 0.683975] [G loss: 0.680205]\n",
"[Epoch 15/200] [Batch 4/59] [D loss: 0.685566] [G loss: 0.676973]\n",
"[Epoch 15/200] [Batch 5/59] [D loss: 0.686558] [G loss: 0.680748]\n",
"[Epoch 15/200] [Batch 6/59] [D loss: 0.686324] [G loss: 0.684735]\n",
"[Epoch 15/200] [Batch 7/59] [D loss: 0.685469] [G loss: 0.688122]\n",
"[Epoch 15/200] [Batch 8/59] [D loss: 0.688093] [G loss: 0.691270]\n",
"[Epoch 15/200] [Batch 9/59] [D loss: 0.688054] [G loss: 0.695842]\n",
"[Epoch 15/200] [Batch 10/59] [D loss: 0.689724] [G loss: 0.696703]\n",
"[Epoch 15/200] [Batch 11/59] [D loss: 0.689645] [G loss: 0.698988]\n",
"[Epoch 15/200] [Batch 12/59] [D loss: 0.688838] [G loss: 0.702416]\n",
"[Epoch 15/200] [Batch 13/59] [D loss: 0.688487] [G loss: 0.704859]\n",
"[Epoch 15/200] [Batch 14/59] [D loss: 0.690563] [G loss: 0.701830]\n",
"[Epoch 15/200] [Batch 15/59] [D loss: 0.688472] [G loss: 0.704736]\n",
"[Epoch 15/200] [Batch 16/59] [D loss: 0.686594] [G loss: 0.705218]\n",
"[Epoch 15/200] [Batch 17/59] [D loss: 0.687345] [G loss: 0.703596]\n",
"[Epoch 15/200] [Batch 18/59] [D loss: 0.686159] [G loss: 0.704927]\n",
"[Epoch 15/200] [Batch 19/59] [D loss: 0.687377] [G loss: 0.704065]\n",
"[Epoch 15/200] [Batch 20/59] [D loss: 0.688539] [G loss: 0.704458]\n",
"[Epoch 15/200] [Batch 21/59] [D loss: 0.690862] [G loss: 0.708216]\n",
"[Epoch 15/200] [Batch 22/59] [D loss: 0.691391] [G loss: 0.704534]\n",
"[Epoch 15/200] [Batch 23/59] [D loss: 0.694267] [G loss: 0.702776]\n",
"[Epoch 15/200] [Batch 24/59] [D loss: 0.694009] [G loss: 0.698887]\n",
"[Epoch 15/200] [Batch 25/59] [D loss: 0.696817] [G loss: 0.691434]\n",
"[Epoch 15/200] [Batch 26/59] [D loss: 0.695317] [G loss: 0.690927]\n",
"[Epoch 15/200] [Batch 27/59] [D loss: 0.694531] [G loss: 0.684330]\n",
"[Epoch 15/200] [Batch 28/59] [D loss: 0.693252] [G loss: 0.681258]\n",
"[Epoch 15/200] [Batch 29/59] [D loss: 0.691660] [G loss: 0.682038]\n",
"[Epoch 15/200] [Batch 30/59] [D loss: 0.688186] [G loss: 0.684353]\n",
"[Epoch 15/200] [Batch 31/59] [D loss: 0.689329] [G loss: 0.685833]\n",
"[Epoch 15/200] [Batch 32/59] [D loss: 0.690531] [G loss: 0.687972]\n",
"[Epoch 15/200] [Batch 33/59] [D loss: 0.685970] [G loss: 0.694930]\n",
"[Epoch 15/200] [Batch 34/59] [D loss: 0.684909] [G loss: 0.697812]\n",
"[Epoch 15/200] [Batch 35/59] [D loss: 0.680790] [G loss: 0.701403]\n",
"[Epoch 15/200] [Batch 36/59] [D loss: 0.681572] [G loss: 0.696786]\n",
"[Epoch 15/200] [Batch 37/59] [D loss: 0.682412] [G loss: 0.686150]\n",
"[Epoch 15/200] [Batch 38/59] [D loss: 0.682008] [G loss: 0.673499]\n",
"[Epoch 15/200] [Batch 39/59] [D loss: 0.682786] [G loss: 0.665251]\n",
"[Epoch 15/200] [Batch 40/59] [D loss: 0.686578] [G loss: 0.669343]\n",
"[Epoch 15/200] [Batch 41/59] [D loss: 0.685242] [G loss: 0.675143]\n",
"[Epoch 15/200] [Batch 42/59] [D loss: 0.686046] [G loss: 0.678231]\n",
"[Epoch 15/200] [Batch 43/59] [D loss: 0.686856] [G loss: 0.683696]\n",
"[Epoch 15/200] [Batch 44/59] [D loss: 0.688290] [G loss: 0.688075]\n",
"[Epoch 15/200] [Batch 45/59] [D loss: 0.693129] [G loss: 0.682419]\n",
"[Epoch 15/200] [Batch 46/59] [D loss: 0.696941] [G loss: 0.680373]\n",
"[Epoch 15/200] [Batch 47/59] [D loss: 0.697143] [G loss: 0.680286]\n",
"[Epoch 15/200] [Batch 48/59] [D loss: 0.696012] [G loss: 0.689562]\n",
"[Epoch 15/200] [Batch 49/59] [D loss: 0.696644] [G loss: 0.693456]\n",
"[Epoch 15/200] [Batch 50/59] [D loss: 0.695733] [G loss: 0.698357]\n",
"[Epoch 15/200] [Batch 51/59] [D loss: 0.693266] [G loss: 0.708415]\n",
"[Epoch 15/200] [Batch 52/59] [D loss: 0.693007] [G loss: 0.710492]\n",
"[Epoch 15/200] [Batch 53/59] [D loss: 0.689988] [G loss: 0.713175]\n",
"[Epoch 15/200] [Batch 54/59] [D loss: 0.692112] [G loss: 0.714178]\n",
"[Epoch 15/200] [Batch 55/59] [D loss: 0.688935] [G loss: 0.716118]\n",
"[Epoch 15/200] [Batch 56/59] [D loss: 0.690714] [G loss: 0.712301]\n",
"[Epoch 15/200] [Batch 57/59] [D loss: 0.692619] [G loss: 0.706856]\n",
"[Epoch 15/200] [Batch 58/59] [D loss: 0.692921] [G loss: 0.706199]\n",
"[Epoch 16/200] [Batch 0/59] [D loss: 0.694277] [G loss: 0.707719]\n",
"[Epoch 16/200] [Batch 1/59] [D loss: 0.694842] [G loss: 0.707931]\n",
"[Epoch 16/200] [Batch 2/59] [D loss: 0.695491] [G loss: 0.711689]\n",
"[Epoch 16/200] [Batch 3/59] [D loss: 0.691732] [G loss: 0.714064]\n",
"[Epoch 16/200] [Batch 4/59] [D loss: 0.692066] [G loss: 0.715013]\n",
"[Epoch 16/200] [Batch 5/59] [D loss: 0.688456] [G loss: 0.715335]\n",
"[Epoch 16/200] [Batch 6/59] [D loss: 0.684889] [G loss: 0.718472]\n",
"[Epoch 16/200] [Batch 7/59] [D loss: 0.678656] [G loss: 0.716805]\n",
"[Epoch 16/200] [Batch 8/59] [D loss: 0.677941] [G loss: 0.707300]\n",
"[Epoch 16/200] [Batch 9/59] [D loss: 0.673142] [G loss: 0.708111]\n",
"[Epoch 16/200] [Batch 10/59] [D loss: 0.670073] [G loss: 0.694571]\n",
"[Epoch 16/200] [Batch 11/59] [D loss: 0.668113] [G loss: 0.696012]\n",
"[Epoch 16/200] [Batch 12/59] [D loss: 0.675515] [G loss: 0.695500]\n",
"[Epoch 16/200] [Batch 13/59] [D loss: 0.678612] [G loss: 0.697912]\n",
"[Epoch 16/200] [Batch 14/59] [D loss: 0.685939] [G loss: 0.707430]\n",
"[Epoch 16/200] [Batch 15/59] [D loss: 0.694600] [G loss: 0.704516]\n",
"[Epoch 16/200] [Batch 16/59] [D loss: 0.704933] [G loss: 0.685505]\n",
"[Epoch 16/200] [Batch 17/59] [D loss: 0.705503] [G loss: 0.677934]\n",
"[Epoch 16/200] [Batch 18/59] [D loss: 0.704710] [G loss: 0.653301]\n",
"[Epoch 16/200] [Batch 19/59] [D loss: 0.700451] [G loss: 0.650587]\n",
"[Epoch 16/200] [Batch 20/59] [D loss: 0.697789] [G loss: 0.662736]\n",
"[Epoch 16/200] [Batch 21/59] [D loss: 0.695881] [G loss: 0.689106]\n",
"[Epoch 16/200] [Batch 22/59] [D loss: 0.692327] [G loss: 0.708879]\n",
"[Epoch 16/200] [Batch 23/59] [D loss: 0.688249] [G loss: 0.725853]\n",
"[Epoch 16/200] [Batch 24/59] [D loss: 0.689172] [G loss: 0.729876]\n",
"[Epoch 16/200] [Batch 25/59] [D loss: 0.686705] [G loss: 0.726963]\n",
"[Epoch 16/200] [Batch 26/59] [D loss: 0.686626] [G loss: 0.714676]\n",
"[Epoch 16/200] [Batch 27/59] [D loss: 0.684514] [G loss: 0.707326]\n",
"[Epoch 16/200] [Batch 28/59] [D loss: 0.687453] [G loss: 0.690384]\n",
"[Epoch 16/200] [Batch 29/59] [D loss: 0.686929] [G loss: 0.678677]\n",
"[Epoch 16/200] [Batch 30/59] [D loss: 0.689463] [G loss: 0.673159]\n",
"[Epoch 16/200] [Batch 31/59] [D loss: 0.688979] [G loss: 0.675702]\n",
"[Epoch 16/200] [Batch 32/59] [D loss: 0.692292] [G loss: 0.681621]\n",
"[Epoch 16/200] [Batch 33/59] [D loss: 0.687706] [G loss: 0.692139]\n",
"[Epoch 16/200] [Batch 34/59] [D loss: 0.688708] [G loss: 0.701531]\n",
"[Epoch 16/200] [Batch 35/59] [D loss: 0.686795] [G loss: 0.708474]\n",
"[Epoch 16/200] [Batch 36/59] [D loss: 0.688485] [G loss: 0.700076]\n",
"[Epoch 16/200] [Batch 37/59] [D loss: 0.687507] [G loss: 0.700423]\n",
"[Epoch 16/200] [Batch 38/59] [D loss: 0.688959] [G loss: 0.693411]\n",
"[Epoch 16/200] [Batch 39/59] [D loss: 0.688938] [G loss: 0.685638]\n",
"[Epoch 16/200] [Batch 40/59] [D loss: 0.690272] [G loss: 0.685690]\n",
"[Epoch 16/200] [Batch 41/59] [D loss: 0.689151] [G loss: 0.686121]\n",
"[Epoch 16/200] [Batch 42/59] [D loss: 0.692508] [G loss: 0.692892]\n",
"[Epoch 16/200] [Batch 43/59] [D loss: 0.692956] [G loss: 0.697330]\n",
"[Epoch 16/200] [Batch 44/59] [D loss: 0.695534] [G loss: 0.706346]\n",
"[Epoch 16/200] [Batch 45/59] [D loss: 0.696351] [G loss: 0.707378]\n",
"[Epoch 16/200] [Batch 46/59] [D loss: 0.696336] [G loss: 0.709850]\n",
"[Epoch 16/200] [Batch 47/59] [D loss: 0.698897] [G loss: 0.706666]\n",
"[Epoch 16/200] [Batch 48/59] [D loss: 0.697178] [G loss: 0.714071]\n",
"[Epoch 16/200] [Batch 49/59] [D loss: 0.695928] [G loss: 0.717970]\n",
"[Epoch 16/200] [Batch 50/59] [D loss: 0.689946] [G loss: 0.723925]\n",
"[Epoch 16/200] [Batch 51/59] [D loss: 0.683398] [G loss: 0.729997]\n",
"[Epoch 16/200] [Batch 52/59] [D loss: 0.680051] [G loss: 0.733527]\n",
"[Epoch 16/200] [Batch 53/59] [D loss: 0.670683] [G loss: 0.728414]\n",
"[Epoch 16/200] [Batch 54/59] [D loss: 0.668245] [G loss: 0.724579]\n",
"[Epoch 16/200] [Batch 55/59] [D loss: 0.664902] [G loss: 0.710694]\n",
"[Epoch 16/200] [Batch 56/59] [D loss: 0.665321] [G loss: 0.705309]\n",
"[Epoch 16/200] [Batch 57/59] [D loss: 0.666629] [G loss: 0.701317]\n",
"[Epoch 16/200] [Batch 58/59] [D loss: 0.672392] [G loss: 0.692734]\n",
"[Epoch 17/200] [Batch 0/59] [D loss: 0.687815] [G loss: 0.685565]\n",
"[Epoch 17/200] [Batch 1/59] [D loss: 0.684758] [G loss: 0.690903]\n",
"[Epoch 17/200] [Batch 2/59] [D loss: 0.691708] [G loss: 0.680162]\n",
"[Epoch 17/200] [Batch 3/59] [D loss: 0.700959] [G loss: 0.671043]\n",
"[Epoch 17/200] [Batch 4/59] [D loss: 0.698469] [G loss: 0.690323]\n",
"[Epoch 17/200] [Batch 5/59] [D loss: 0.693179] [G loss: 0.711763]\n",
"[Epoch 17/200] [Batch 6/59] [D loss: 0.693599] [G loss: 0.710999]\n",
"[Epoch 17/200] [Batch 7/59] [D loss: 0.689724] [G loss: 0.706628]\n",
"[Epoch 17/200] [Batch 8/59] [D loss: 0.687372] [G loss: 0.687816]\n",
"[Epoch 17/200] [Batch 9/59] [D loss: 0.688335] [G loss: 0.679638]\n",
"[Epoch 17/200] [Batch 10/59] [D loss: 0.685385] [G loss: 0.683230]\n",
"[Epoch 17/200] [Batch 11/59] [D loss: 0.684878] [G loss: 0.688351]\n",
"[Epoch 17/200] [Batch 12/59] [D loss: 0.686732] [G loss: 0.688366]\n",
"[Epoch 17/200] [Batch 13/59] [D loss: 0.685352] [G loss: 0.692026]\n",
"[Epoch 17/200] [Batch 14/59] [D loss: 0.687538] [G loss: 0.680728]\n",
"[Epoch 17/200] [Batch 15/59] [D loss: 0.685224] [G loss: 0.675315]\n",
"[Epoch 17/200] [Batch 16/59] [D loss: 0.688577] [G loss: 0.664011]\n",
"[Epoch 17/200] [Batch 17/59] [D loss: 0.685724] [G loss: 0.667490]\n",
"[Epoch 17/200] [Batch 18/59] [D loss: 0.682916] [G loss: 0.679958]\n",
"[Epoch 17/200] [Batch 19/59] [D loss: 0.683284] [G loss: 0.685830]\n",
"[Epoch 17/200] [Batch 20/59] [D loss: 0.682025] [G loss: 0.696615]\n",
"[Epoch 17/200] [Batch 21/59] [D loss: 0.683681] [G loss: 0.701701]\n",
"[Epoch 17/200] [Batch 22/59] [D loss: 0.678927] [G loss: 0.707093]\n",
"[Epoch 17/200] [Batch 23/59] [D loss: 0.677204] [G loss: 0.704711]\n",
"[Epoch 17/200] [Batch 24/59] [D loss: 0.676565] [G loss: 0.699673]\n",
"[Epoch 17/200] [Batch 25/59] [D loss: 0.679650] [G loss: 0.685471]\n",
"[Epoch 17/200] [Batch 26/59] [D loss: 0.680762] [G loss: 0.680173]\n",
"[Epoch 17/200] [Batch 27/59] [D loss: 0.682976] [G loss: 0.679450]\n",
"[Epoch 17/200] [Batch 28/59] [D loss: 0.688234] [G loss: 0.683632]\n",
"[Epoch 17/200] [Batch 29/59] [D loss: 0.685885] [G loss: 0.700878]\n",
"[Epoch 17/200] [Batch 30/59] [D loss: 0.690196] [G loss: 0.713458]\n",
"[Epoch 17/200] [Batch 31/59] [D loss: 0.693740] [G loss: 0.712732]\n",
"[Epoch 17/200] [Batch 32/59] [D loss: 0.690634] [G loss: 0.711645]\n",
"[Epoch 17/200] [Batch 33/59] [D loss: 0.694467] [G loss: 0.700235]\n",
"[Epoch 17/200] [Batch 34/59] [D loss: 0.690036] [G loss: 0.693120]\n",
"[Epoch 17/200] [Batch 35/59] [D loss: 0.689234] [G loss: 0.682971]\n",
"[Epoch 17/200] [Batch 36/59] [D loss: 0.684898] [G loss: 0.699176]\n",
"[Epoch 17/200] [Batch 37/59] [D loss: 0.688224] [G loss: 0.709624]\n",
"[Epoch 17/200] [Batch 38/59] [D loss: 0.685607] [G loss: 0.721163]\n",
"[Epoch 17/200] [Batch 39/59] [D loss: 0.681751] [G loss: 0.724240]\n",
"[Epoch 17/200] [Batch 40/59] [D loss: 0.683338] [G loss: 0.705757]\n",
"[Epoch 17/200] [Batch 41/59] [D loss: 0.682151] [G loss: 0.687079]\n",
"[Epoch 17/200] [Batch 42/59] [D loss: 0.676340] [G loss: 0.669116]\n",
"[Epoch 17/200] [Batch 43/59] [D loss: 0.680174] [G loss: 0.659354]\n",
"[Epoch 17/200] [Batch 44/59] [D loss: 0.674928] [G loss: 0.672601]\n",
"[Epoch 17/200] [Batch 45/59] [D loss: 0.677476] [G loss: 0.710515]\n",
"[Epoch 17/200] [Batch 46/59] [D loss: 0.667611] [G loss: 0.768544]\n",
"[Epoch 17/200] [Batch 47/59] [D loss: 0.661734] [G loss: 0.786937]\n",
"[Epoch 17/200] [Batch 48/59] [D loss: 0.661914] [G loss: 0.786094]\n",
"[Epoch 17/200] [Batch 49/59] [D loss: 0.667503] [G loss: 0.747909]\n",
"[Epoch 17/200] [Batch 50/59] [D loss: 0.678546] [G loss: 0.712468]\n",
"[Epoch 17/200] [Batch 51/59] [D loss: 0.674835] [G loss: 0.669163]\n",
"[Epoch 17/200] [Batch 52/59] [D loss: 0.686870] [G loss: 0.637820]\n",
"[Epoch 17/200] [Batch 53/59] [D loss: 0.694018] [G loss: 0.663273]\n",
"[Epoch 17/200] [Batch 54/59] [D loss: 0.705625] [G loss: 0.712290]\n",
"[Epoch 17/200] [Batch 55/59] [D loss: 0.707198] [G loss: 0.742464]\n",
"[Epoch 17/200] [Batch 56/59] [D loss: 0.711832] [G loss: 0.725441]\n",
"[Epoch 17/200] [Batch 57/59] [D loss: 0.701159] [G loss: 0.695198]\n",
"[Epoch 17/200] [Batch 58/59] [D loss: 0.695391] [G loss: 0.662595]\n",
"[Epoch 18/200] [Batch 0/59] [D loss: 0.688738] [G loss: 0.650485]\n",
"[Epoch 18/200] [Batch 1/59] [D loss: 0.682086] [G loss: 0.659805]\n",
"[Epoch 18/200] [Batch 2/59] [D loss: 0.680098] [G loss: 0.690137]\n",
"[Epoch 18/200] [Batch 3/59] [D loss: 0.673989] [G loss: 0.715701]\n",
"[Epoch 18/200] [Batch 4/59] [D loss: 0.670898] [G loss: 0.732767]\n",
"[Epoch 18/200] [Batch 5/59] [D loss: 0.671681] [G loss: 0.705645]\n",
"[Epoch 18/200] [Batch 6/59] [D loss: 0.673433] [G loss: 0.689221]\n",
"[Epoch 18/200] [Batch 7/59] [D loss: 0.678303] [G loss: 0.658407]\n",
"[Epoch 18/200] [Batch 8/59] [D loss: 0.682196] [G loss: 0.656441]\n",
"[Epoch 18/200] [Batch 9/59] [D loss: 0.686622] [G loss: 0.675697]\n",
"[Epoch 18/200] [Batch 10/59] [D loss: 0.688548] [G loss: 0.703123]\n",
"[Epoch 18/200] [Batch 11/59] [D loss: 0.694363] [G loss: 0.738244]\n",
"[Epoch 18/200] [Batch 12/59] [D loss: 0.691955] [G loss: 0.753468]\n",
"[Epoch 18/200] [Batch 13/59] [D loss: 0.686548] [G loss: 0.750572]\n",
"[Epoch 18/200] [Batch 14/59] [D loss: 0.686859] [G loss: 0.733319]\n",
"[Epoch 18/200] [Batch 15/59] [D loss: 0.677916] [G loss: 0.719394]\n",
"[Epoch 18/200] [Batch 16/59] [D loss: 0.677713] [G loss: 0.699280]\n",
"[Epoch 18/200] [Batch 17/59] [D loss: 0.676355] [G loss: 0.695558]\n",
"[Epoch 18/200] [Batch 18/59] [D loss: 0.673194] [G loss: 0.705831]\n",
"[Epoch 18/200] [Batch 19/59] [D loss: 0.685028] [G loss: 0.716022]\n",
"[Epoch 18/200] [Batch 20/59] [D loss: 0.683558] [G loss: 0.715500]\n",
"[Epoch 18/200] [Batch 21/59] [D loss: 0.691128] [G loss: 0.698625]\n",
"[Epoch 18/200] [Batch 22/59] [D loss: 0.689925] [G loss: 0.677687]\n",
"[Epoch 18/200] [Batch 23/59] [D loss: 0.693870] [G loss: 0.678272]\n",
"[Epoch 18/200] [Batch 24/59] [D loss: 0.695488] [G loss: 0.693092]\n",
"[Epoch 18/200] [Batch 25/59] [D loss: 0.691872] [G loss: 0.705550]\n",
"[Epoch 18/200] [Batch 26/59] [D loss: 0.694521] [G loss: 0.709662]\n",
"[Epoch 18/200] [Batch 27/59] [D loss: 0.687024] [G loss: 0.716763]\n",
"[Epoch 18/200] [Batch 28/59] [D loss: 0.682360] [G loss: 0.711210]\n",
"[Epoch 18/200] [Batch 29/59] [D loss: 0.682420] [G loss: 0.706829]\n",
"[Epoch 18/200] [Batch 30/59] [D loss: 0.676201] [G loss: 0.701057]\n",
"[Epoch 18/200] [Batch 31/59] [D loss: 0.672494] [G loss: 0.700362]\n",
"[Epoch 18/200] [Batch 32/59] [D loss: 0.681949] [G loss: 0.697449]\n",
"[Epoch 18/200] [Batch 33/59] [D loss: 0.674452] [G loss: 0.704648]\n",
"[Epoch 18/200] [Batch 34/59] [D loss: 0.675670] [G loss: 0.726842]\n",
"[Epoch 18/200] [Batch 35/59] [D loss: 0.673102] [G loss: 0.736374]\n",
"[Epoch 18/200] [Batch 36/59] [D loss: 0.664849] [G loss: 0.750262]\n",
"[Epoch 18/200] [Batch 37/59] [D loss: 0.666138] [G loss: 0.744165]\n",
"[Epoch 18/200] [Batch 38/59] [D loss: 0.669247] [G loss: 0.723291]\n",
"[Epoch 18/200] [Batch 39/59] [D loss: 0.665533] [G loss: 0.710167]\n",
"[Epoch 18/200] [Batch 40/59] [D loss: 0.674314] [G loss: 0.699154]\n",
"[Epoch 18/200] [Batch 41/59] [D loss: 0.666525] [G loss: 0.702343]\n",
"[Epoch 18/200] [Batch 42/59] [D loss: 0.674059] [G loss: 0.677234]\n",
"[Epoch 18/200] [Batch 43/59] [D loss: 0.675063] [G loss: 0.687709]\n",
"[Epoch 18/200] [Batch 44/59] [D loss: 0.683927] [G loss: 0.695905]\n",
"[Epoch 18/200] [Batch 45/59] [D loss: 0.676861] [G loss: 0.732893]\n",
"[Epoch 18/200] [Batch 46/59] [D loss: 0.676803] [G loss: 0.729337]\n",
"[Epoch 18/200] [Batch 47/59] [D loss: 0.680884] [G loss: 0.720392]\n",
"[Epoch 18/200] [Batch 48/59] [D loss: 0.675663] [G loss: 0.701882]\n",
"[Epoch 18/200] [Batch 49/59] [D loss: 0.677218] [G loss: 0.687475]\n",
"[Epoch 18/200] [Batch 50/59] [D loss: 0.674020] [G loss: 0.686154]\n",
"[Epoch 18/200] [Batch 51/59] [D loss: 0.678910] [G loss: 0.681131]\n",
"[Epoch 18/200] [Batch 52/59] [D loss: 0.685433] [G loss: 0.705263]\n",
"[Epoch 18/200] [Batch 53/59] [D loss: 0.692713] [G loss: 0.715951]\n",
"[Epoch 18/200] [Batch 54/59] [D loss: 0.701772] [G loss: 0.712533]\n",
"[Epoch 18/200] [Batch 55/59] [D loss: 0.705582] [G loss: 0.688379]\n",
"[Epoch 18/200] [Batch 56/59] [D loss: 0.707591] [G loss: 0.690644]\n",
"[Epoch 18/200] [Batch 57/59] [D loss: 0.706565] [G loss: 0.699619]\n",
"[Epoch 18/200] [Batch 58/59] [D loss: 0.701055] [G loss: 0.711448]\n",
"[Epoch 19/200] [Batch 0/59] [D loss: 0.696831] [G loss: 0.745305]\n",
"[Epoch 19/200] [Batch 1/59] [D loss: 0.680039] [G loss: 0.753749]\n",
"[Epoch 19/200] [Batch 2/59] [D loss: 0.669679] [G loss: 0.751789]\n",
"[Epoch 19/200] [Batch 3/59] [D loss: 0.679025] [G loss: 0.709963]\n",
"[Epoch 19/200] [Batch 4/59] [D loss: 0.677583] [G loss: 0.684736]\n",
"[Epoch 19/200] [Batch 5/59] [D loss: 0.694022] [G loss: 0.681697]\n",
"[Epoch 19/200] [Batch 6/59] [D loss: 0.694934] [G loss: 0.700758]\n",
"[Epoch 19/200] [Batch 7/59] [D loss: 0.694521] [G loss: 0.700193]\n",
"[Epoch 19/200] [Batch 8/59] [D loss: 0.689559] [G loss: 0.701722]\n",
"[Epoch 19/200] [Batch 9/59] [D loss: 0.685592] [G loss: 0.701274]\n",
"[Epoch 19/200] [Batch 10/59] [D loss: 0.680755] [G loss: 0.702292]\n",
"[Epoch 19/200] [Batch 11/59] [D loss: 0.673914] [G loss: 0.700191]\n",
"[Epoch 19/200] [Batch 12/59] [D loss: 0.673821] [G loss: 0.706865]\n",
"[Epoch 19/200] [Batch 13/59] [D loss: 0.668641] [G loss: 0.730650]\n",
"[Epoch 19/200] [Batch 14/59] [D loss: 0.670037] [G loss: 0.724227]\n",
"[Epoch 19/200] [Batch 15/59] [D loss: 0.678576] [G loss: 0.717508]\n",
"[Epoch 19/200] [Batch 16/59] [D loss: 0.677539] [G loss: 0.724579]\n",
"[Epoch 19/200] [Batch 17/59] [D loss: 0.690920] [G loss: 0.715086]\n",
"[Epoch 19/200] [Batch 18/59] [D loss: 0.687372] [G loss: 0.702405]\n",
"[Epoch 19/200] [Batch 19/59] [D loss: 0.689804] [G loss: 0.702673]\n",
"[Epoch 19/200] [Batch 20/59] [D loss: 0.683994] [G loss: 0.711857]\n",
"[Epoch 19/200] [Batch 21/59] [D loss: 0.678819] [G loss: 0.709486]\n",
"[Epoch 19/200] [Batch 22/59] [D loss: 0.675363] [G loss: 0.734382]\n",
"[Epoch 19/200] [Batch 23/59] [D loss: 0.670914] [G loss: 0.743141]\n",
"[Epoch 19/200] [Batch 24/59] [D loss: 0.674318] [G loss: 0.747012]\n",
"[Epoch 19/200] [Batch 25/59] [D loss: 0.672747] [G loss: 0.730700]\n",
"[Epoch 19/200] [Batch 26/59] [D loss: 0.670836] [G loss: 0.719565]\n",
"[Epoch 19/200] [Batch 27/59] [D loss: 0.668822] [G loss: 0.693076]\n",
"[Epoch 19/200] [Batch 28/59] [D loss: 0.668062] [G loss: 0.655705]\n",
"[Epoch 19/200] [Batch 29/59] [D loss: 0.670763] [G loss: 0.662419]\n",
"[Epoch 19/200] [Batch 30/59] [D loss: 0.667663] [G loss: 0.694887]\n",
"[Epoch 19/200] [Batch 31/59] [D loss: 0.668918] [G loss: 0.701906]\n",
"[Epoch 19/200] [Batch 32/59] [D loss: 0.674619] [G loss: 0.701083]\n",
"[Epoch 19/200] [Batch 33/59] [D loss: 0.675251] [G loss: 0.686464]\n",
"[Epoch 19/200] [Batch 34/59] [D loss: 0.688979] [G loss: 0.660311]\n",
"[Epoch 19/200] [Batch 35/59] [D loss: 0.685205] [G loss: 0.698679]\n",
"[Epoch 19/200] [Batch 36/59] [D loss: 0.688663] [G loss: 0.721133]\n",
"[Epoch 19/200] [Batch 37/59] [D loss: 0.681998] [G loss: 0.717051]\n",
"[Epoch 19/200] [Batch 38/59] [D loss: 0.691265] [G loss: 0.711469]\n",
"[Epoch 19/200] [Batch 39/59] [D loss: 0.678942] [G loss: 0.713958]\n",
"[Epoch 19/200] [Batch 40/59] [D loss: 0.681011] [G loss: 0.706900]\n",
"[Epoch 19/200] [Batch 41/59] [D loss: 0.679144] [G loss: 0.720137]\n",
"[Epoch 19/200] [Batch 42/59] [D loss: 0.681377] [G loss: 0.727545]\n",
"[Epoch 19/200] [Batch 43/59] [D loss: 0.684400] [G loss: 0.734793]\n",
"[Epoch 19/200] [Batch 44/59] [D loss: 0.682500] [G loss: 0.727080]\n",
"[Epoch 19/200] [Batch 45/59] [D loss: 0.690144] [G loss: 0.724701]\n",
"[Epoch 19/200] [Batch 46/59] [D loss: 0.690051] [G loss: 0.732792]\n",
"[Epoch 19/200] [Batch 47/59] [D loss: 0.683006] [G loss: 0.744437]\n",
"[Epoch 19/200] [Batch 48/59] [D loss: 0.687414] [G loss: 0.739369]\n",
"[Epoch 19/200] [Batch 49/59] [D loss: 0.683220] [G loss: 0.732197]\n",
"[Epoch 19/200] [Batch 50/59] [D loss: 0.687607] [G loss: 0.720550]\n",
"[Epoch 19/200] [Batch 51/59] [D loss: 0.681693] [G loss: 0.722250]\n",
"[Epoch 19/200] [Batch 52/59] [D loss: 0.677017] [G loss: 0.737677]\n",
"[Epoch 19/200] [Batch 53/59] [D loss: 0.671587] [G loss: 0.732513]\n",
"[Epoch 19/200] [Batch 54/59] [D loss: 0.674860] [G loss: 0.722636]\n",
"[Epoch 19/200] [Batch 55/59] [D loss: 0.663359] [G loss: 0.721339]\n",
"[Epoch 19/200] [Batch 56/59] [D loss: 0.665433] [G loss: 0.716831]\n",
"[Epoch 19/200] [Batch 57/59] [D loss: 0.657719] [G loss: 0.737795]\n",
"[Epoch 19/200] [Batch 58/59] [D loss: 0.663611] [G loss: 0.708206]\n",
"[Epoch 20/200] [Batch 0/59] [D loss: 0.663170] [G loss: 0.701142]\n",
"[Epoch 20/200] [Batch 1/59] [D loss: 0.663524] [G loss: 0.708092]\n",
"[Epoch 20/200] [Batch 2/59] [D loss: 0.655809] [G loss: 0.710032]\n",
"[Epoch 20/200] [Batch 3/59] [D loss: 0.667840] [G loss: 0.691370]\n",
"[Epoch 20/200] [Batch 4/59] [D loss: 0.661947] [G loss: 0.701880]\n",
"[Epoch 20/200] [Batch 5/59] [D loss: 0.673435] [G loss: 0.705598]\n",
"[Epoch 20/200] [Batch 6/59] [D loss: 0.674783] [G loss: 0.679596]\n",
"[Epoch 20/200] [Batch 7/59] [D loss: 0.682331] [G loss: 0.678676]\n",
"[Epoch 20/200] [Batch 8/59] [D loss: 0.684038] [G loss: 0.730575]\n",
"[Epoch 20/200] [Batch 9/59] [D loss: 0.683512] [G loss: 0.730900]\n",
"[Epoch 20/200] [Batch 10/59] [D loss: 0.690651] [G loss: 0.702880]\n",
"[Epoch 20/200] [Batch 11/59] [D loss: 0.696092] [G loss: 0.702981]\n",
"[Epoch 20/200] [Batch 12/59] [D loss: 0.686448] [G loss: 0.759533]\n",
"[Epoch 20/200] [Batch 13/59] [D loss: 0.681537] [G loss: 0.777331]\n",
"[Epoch 20/200] [Batch 14/59] [D loss: 0.682702] [G loss: 0.769346]\n",
"[Epoch 20/200] [Batch 15/59] [D loss: 0.667234] [G loss: 0.729257]\n",
"[Epoch 20/200] [Batch 16/59] [D loss: 0.672318] [G loss: 0.735882]\n",
"[Epoch 20/200] [Batch 17/59] [D loss: 0.668069] [G loss: 0.760254]\n",
"[Epoch 20/200] [Batch 18/59] [D loss: 0.669246] [G loss: 0.753886]\n",
"[Epoch 20/200] [Batch 19/59] [D loss: 0.680442] [G loss: 0.691581]\n",
"[Epoch 20/200] [Batch 20/59] [D loss: 0.688884] [G loss: 0.677392]\n",
"[Epoch 20/200] [Batch 21/59] [D loss: 0.681853] [G loss: 0.688989]\n",
"[Epoch 20/200] [Batch 22/59] [D loss: 0.680765] [G loss: 0.706873]\n",
"[Epoch 20/200] [Batch 23/59] [D loss: 0.673086] [G loss: 0.719680]\n",
"[Epoch 20/200] [Batch 24/59] [D loss: 0.670582] [G loss: 0.715650]\n",
"[Epoch 20/200] [Batch 25/59] [D loss: 0.668021] [G loss: 0.721052]\n",
"[Epoch 20/200] [Batch 26/59] [D loss: 0.669542] [G loss: 0.713237]\n",
"[Epoch 20/200] [Batch 27/59] [D loss: 0.678557] [G loss: 0.717866]\n",
"[Epoch 20/200] [Batch 28/59] [D loss: 0.693272] [G loss: 0.700466]\n",
"[Epoch 20/200] [Batch 29/59] [D loss: 0.690760] [G loss: 0.733708]\n",
"[Epoch 20/200] [Batch 30/59] [D loss: 0.688291] [G loss: 0.752449]\n",
"[Epoch 20/200] [Batch 31/59] [D loss: 0.670009] [G loss: 0.761527]\n",
"[Epoch 20/200] [Batch 32/59] [D loss: 0.659533] [G loss: 0.766174]\n",
"[Epoch 20/200] [Batch 33/59] [D loss: 0.634424] [G loss: 0.788000]\n",
"[Epoch 20/200] [Batch 34/59] [D loss: 0.633237] [G loss: 0.757791]\n",
"[Epoch 20/200] [Batch 35/59] [D loss: 0.637955] [G loss: 0.733403]\n",
"[Epoch 20/200] [Batch 36/59] [D loss: 0.644740] [G loss: 0.742774]\n",
"[Epoch 20/200] [Batch 37/59] [D loss: 0.657684] [G loss: 0.725789]\n",
"[Epoch 20/200] [Batch 38/59] [D loss: 0.671105] [G loss: 0.732580]\n",
"[Epoch 20/200] [Batch 39/59] [D loss: 0.672330] [G loss: 0.687265]\n",
"[Epoch 20/200] [Batch 40/59] [D loss: 0.681132] [G loss: 0.650307]\n",
"[Epoch 20/200] [Batch 41/59] [D loss: 0.668903] [G loss: 0.687247]\n",
"[Epoch 20/200] [Batch 42/59] [D loss: 0.664779] [G loss: 0.705261]\n",
"[Epoch 20/200] [Batch 43/59] [D loss: 0.650516] [G loss: 0.698456]\n",
"[Epoch 20/200] [Batch 44/59] [D loss: 0.651283] [G loss: 0.693329]\n",
"[Epoch 20/200] [Batch 45/59] [D loss: 0.660842] [G loss: 0.688824]\n",
"[Epoch 20/200] [Batch 46/59] [D loss: 0.664258] [G loss: 0.725332]\n",
"[Epoch 20/200] [Batch 47/59] [D loss: 0.670185] [G loss: 0.718857]\n",
"[Epoch 20/200] [Batch 48/59] [D loss: 0.682948] [G loss: 0.684265]\n",
"[Epoch 20/200] [Batch 49/59] [D loss: 0.677186] [G loss: 0.725094]\n",
"[Epoch 20/200] [Batch 50/59] [D loss: 0.686356] [G loss: 0.722313]\n",
"[Epoch 20/200] [Batch 51/59] [D loss: 0.702889] [G loss: 0.703795]\n",
"[Epoch 20/200] [Batch 52/59] [D loss: 0.701668] [G loss: 0.743629]\n",
"[Epoch 20/200] [Batch 53/59] [D loss: 0.699246] [G loss: 0.788861]\n",
"[Epoch 20/200] [Batch 54/59] [D loss: 0.695331] [G loss: 0.768376]\n",
"[Epoch 20/200] [Batch 55/59] [D loss: 0.674532] [G loss: 0.772498]\n",
"[Epoch 20/200] [Batch 56/59] [D loss: 0.657832] [G loss: 0.794505]\n",
"[Epoch 20/200] [Batch 57/59] [D loss: 0.663659] [G loss: 0.774419]\n",
"[Epoch 20/200] [Batch 58/59] [D loss: 0.659204] [G loss: 0.750924]\n",
"[Epoch 21/200] [Batch 0/59] [D loss: 0.671398] [G loss: 0.724192]\n",
"[Epoch 21/200] [Batch 1/59] [D loss: 0.672342] [G loss: 0.722715]\n",
"[Epoch 21/200] [Batch 2/59] [D loss: 0.675460] [G loss: 0.723022]\n",
"[Epoch 21/200] [Batch 3/59] [D loss: 0.670997] [G loss: 0.732021]\n",
"[Epoch 21/200] [Batch 4/59] [D loss: 0.648912] [G loss: 0.742330]\n",
"[Epoch 21/200] [Batch 5/59] [D loss: 0.665767] [G loss: 0.731848]\n",
"[Epoch 21/200] [Batch 6/59] [D loss: 0.645182] [G loss: 0.744688]\n",
"[Epoch 21/200] [Batch 7/59] [D loss: 0.652108] [G loss: 0.730302]\n",
"[Epoch 21/200] [Batch 8/59] [D loss: 0.652368] [G loss: 0.755440]\n",
"[Epoch 21/200] [Batch 9/59] [D loss: 0.667633] [G loss: 0.755806]\n",
"[Epoch 21/200] [Batch 10/59] [D loss: 0.665053] [G loss: 0.723498]\n",
"[Epoch 21/200] [Batch 11/59] [D loss: 0.680986] [G loss: 0.723688]\n",
"[Epoch 21/200] [Batch 12/59] [D loss: 0.676385] [G loss: 0.712382]\n",
"[Epoch 21/200] [Batch 13/59] [D loss: 0.669979] [G loss: 0.722708]\n",
"[Epoch 21/200] [Batch 14/59] [D loss: 0.656279] [G loss: 0.723297]\n",
"[Epoch 21/200] [Batch 15/59] [D loss: 0.663152] [G loss: 0.733994]\n",
"[Epoch 21/200] [Batch 16/59] [D loss: 0.642889] [G loss: 0.754145]\n",
"[Epoch 21/200] [Batch 17/59] [D loss: 0.635909] [G loss: 0.752340]\n",
"[Epoch 21/200] [Batch 18/59] [D loss: 0.626743] [G loss: 0.756608]\n",
"[Epoch 21/200] [Batch 19/59] [D loss: 0.646463] [G loss: 0.737275]\n",
"[Epoch 21/200] [Batch 20/59] [D loss: 0.646189] [G loss: 0.730360]\n",
"[Epoch 21/200] [Batch 21/59] [D loss: 0.660674] [G loss: 0.736733]\n",
"[Epoch 21/200] [Batch 22/59] [D loss: 0.671545] [G loss: 0.698237]\n",
"[Epoch 21/200] [Batch 23/59] [D loss: 0.673974] [G loss: 0.706243]\n",
"[Epoch 21/200] [Batch 24/59] [D loss: 0.675686] [G loss: 0.708612]\n",
"[Epoch 21/200] [Batch 25/59] [D loss: 0.668847] [G loss: 0.736777]\n",
"[Epoch 21/200] [Batch 26/59] [D loss: 0.675173] [G loss: 0.737302]\n",
"[Epoch 21/200] [Batch 27/59] [D loss: 0.672417] [G loss: 0.713458]\n",
"[Epoch 21/200] [Batch 28/59] [D loss: 0.677087] [G loss: 0.707130]\n",
"[Epoch 21/200] [Batch 29/59] [D loss: 0.679011] [G loss: 0.722669]\n",
"[Epoch 21/200] [Batch 30/59] [D loss: 0.669465] [G loss: 0.786415]\n",
"[Epoch 21/200] [Batch 31/59] [D loss: 0.678528] [G loss: 0.695291]\n",
"[Epoch 21/200] [Batch 32/59] [D loss: 0.695208] [G loss: 0.700033]\n",
"[Epoch 21/200] [Batch 33/59] [D loss: 0.685425] [G loss: 0.772002]\n",
"[Epoch 21/200] [Batch 34/59] [D loss: 0.674165] [G loss: 0.786917]\n",
"[Epoch 21/200] [Batch 35/59] [D loss: 0.681111] [G loss: 0.724378]\n",
"[Epoch 21/200] [Batch 36/59] [D loss: 0.670515] [G loss: 0.754562]\n",
"[Epoch 21/200] [Batch 37/59] [D loss: 0.652475] [G loss: 0.791084]\n",
"[Epoch 21/200] [Batch 38/59] [D loss: 0.659273] [G loss: 0.757737]\n",
"[Epoch 21/200] [Batch 39/59] [D loss: 0.659125] [G loss: 0.791124]\n",
"[Epoch 21/200] [Batch 40/59] [D loss: 0.655868] [G loss: 0.776731]\n",
"[Epoch 21/200] [Batch 41/59] [D loss: 0.660715] [G loss: 0.770609]\n",
"[Epoch 21/200] [Batch 42/59] [D loss: 0.653712] [G loss: 0.712726]\n",
"[Epoch 21/200] [Batch 43/59] [D loss: 0.669737] [G loss: 0.715005]\n",
"[Epoch 21/200] [Batch 44/59] [D loss: 0.676370] [G loss: 0.748493]\n",
"[Epoch 21/200] [Batch 45/59] [D loss: 0.671867] [G loss: 0.728427]\n",
"[Epoch 21/200] [Batch 46/59] [D loss: 0.687560] [G loss: 0.710236]\n",
"[Epoch 21/200] [Batch 47/59] [D loss: 0.667321] [G loss: 0.726057]\n",
"[Epoch 21/200] [Batch 48/59] [D loss: 0.678936] [G loss: 0.717192]\n",
"[Epoch 21/200] [Batch 49/59] [D loss: 0.673839] [G loss: 0.728395]\n",
"[Epoch 21/200] [Batch 50/59] [D loss: 0.684834] [G loss: 0.727596]\n",
"[Epoch 21/200] [Batch 51/59] [D loss: 0.680541] [G loss: 0.735609]\n",
"[Epoch 21/200] [Batch 52/59] [D loss: 0.683171] [G loss: 0.745948]\n",
"[Epoch 21/200] [Batch 53/59] [D loss: 0.679545] [G loss: 0.721404]\n",
"[Epoch 21/200] [Batch 54/59] [D loss: 0.688616] [G loss: 0.722189]\n",
"[Epoch 21/200] [Batch 55/59] [D loss: 0.680041] [G loss: 0.748906]\n",
"[Epoch 21/200] [Batch 56/59] [D loss: 0.676400] [G loss: 0.752239]\n",
"[Epoch 21/200] [Batch 57/59] [D loss: 0.656478] [G loss: 0.730737]\n",
"[Epoch 21/200] [Batch 58/59] [D loss: 0.659869] [G loss: 0.758530]\n",
"[Epoch 22/200] [Batch 0/59] [D loss: 0.655012] [G loss: 0.749241]\n",
"[Epoch 22/200] [Batch 1/59] [D loss: 0.654101] [G loss: 0.782978]\n",
"[Epoch 22/200] [Batch 2/59] [D loss: 0.651911] [G loss: 0.764211]\n",
"[Epoch 22/200] [Batch 3/59] [D loss: 0.641407] [G loss: 0.756487]\n",
"[Epoch 22/200] [Batch 4/59] [D loss: 0.654058] [G loss: 0.751289]\n",
"[Epoch 22/200] [Batch 5/59] [D loss: 0.659799] [G loss: 0.722998]\n",
"[Epoch 22/200] [Batch 6/59] [D loss: 0.648251] [G loss: 0.747976]\n",
"[Epoch 22/200] [Batch 7/59] [D loss: 0.653293] [G loss: 0.753886]\n",
"[Epoch 22/200] [Batch 8/59] [D loss: 0.659271] [G loss: 0.744456]\n",
"[Epoch 22/200] [Batch 9/59] [D loss: 0.664854] [G loss: 0.725738]\n",
"[Epoch 22/200] [Batch 10/59] [D loss: 0.664240] [G loss: 0.743832]\n",
"[Epoch 22/200] [Batch 11/59] [D loss: 0.666809] [G loss: 0.744071]\n",
"[Epoch 22/200] [Batch 12/59] [D loss: 0.690805] [G loss: 0.756108]\n",
"[Epoch 22/200] [Batch 13/59] [D loss: 0.697492] [G loss: 0.694728]\n",
"[Epoch 22/200] [Batch 14/59] [D loss: 0.703598] [G loss: 0.695839]\n",
"[Epoch 22/200] [Batch 15/59] [D loss: 0.700274] [G loss: 0.747425]\n",
"[Epoch 22/200] [Batch 16/59] [D loss: 0.698592] [G loss: 0.705054]\n",
"[Epoch 22/200] [Batch 17/59] [D loss: 0.691302] [G loss: 0.719961]\n",
"[Epoch 22/200] [Batch 18/59] [D loss: 0.673950] [G loss: 0.744617]\n",
"[Epoch 22/200] [Batch 19/59] [D loss: 0.674567] [G loss: 0.716426]\n",
"[Epoch 22/200] [Batch 20/59] [D loss: 0.669299] [G loss: 0.738027]\n",
"[Epoch 22/200] [Batch 21/59] [D loss: 0.658259] [G loss: 0.770758]\n",
"[Epoch 22/200] [Batch 22/59] [D loss: 0.661150] [G loss: 0.761246]\n",
"[Epoch 22/200] [Batch 23/59] [D loss: 0.654855] [G loss: 0.736101]\n",
"[Epoch 22/200] [Batch 24/59] [D loss: 0.653385] [G loss: 0.742915]\n",
"[Epoch 22/200] [Batch 25/59] [D loss: 0.646984] [G loss: 0.779569]\n",
"[Epoch 22/200] [Batch 26/59] [D loss: 0.651139] [G loss: 0.756196]\n",
"[Epoch 22/200] [Batch 27/59] [D loss: 0.653239] [G loss: 0.738245]\n",
"[Epoch 22/200] [Batch 28/59] [D loss: 0.669450] [G loss: 0.698503]\n",
"[Epoch 22/200] [Batch 29/59] [D loss: 0.654138] [G loss: 0.739815]\n",
"[Epoch 22/200] [Batch 30/59] [D loss: 0.670944] [G loss: 0.751987]\n",
"[Epoch 22/200] [Batch 31/59] [D loss: 0.668103] [G loss: 0.691593]\n",
"[Epoch 22/200] [Batch 32/59] [D loss: 0.654920] [G loss: 0.703061]\n",
"[Epoch 22/200] [Batch 33/59] [D loss: 0.687989] [G loss: 0.719453]\n",
"[Epoch 22/200] [Batch 34/59] [D loss: 0.662314] [G loss: 0.763503]\n",
"[Epoch 22/200] [Batch 35/59] [D loss: 0.668312] [G loss: 0.682684]\n",
"[Epoch 22/200] [Batch 36/59] [D loss: 0.665279] [G loss: 0.732941]\n",
"[Epoch 22/200] [Batch 37/59] [D loss: 0.677636] [G loss: 0.741523]\n",
"[Epoch 22/200] [Batch 38/59] [D loss: 0.683356] [G loss: 0.711169]\n",
"[Epoch 22/200] [Batch 39/59] [D loss: 0.684902] [G loss: 0.760904]\n",
"[Epoch 22/200] [Batch 40/59] [D loss: 0.686397] [G loss: 0.770704]\n",
"[Epoch 22/200] [Batch 41/59] [D loss: 0.681244] [G loss: 0.745245]\n",
"[Epoch 22/200] [Batch 42/59] [D loss: 0.676139] [G loss: 0.763701]\n",
"[Epoch 22/200] [Batch 43/59] [D loss: 0.663173] [G loss: 0.762545]\n",
"[Epoch 22/200] [Batch 44/59] [D loss: 0.668109] [G loss: 0.777587]\n",
"[Epoch 22/200] [Batch 45/59] [D loss: 0.661484] [G loss: 0.767345]\n",
"[Epoch 22/200] [Batch 46/59] [D loss: 0.659944] [G loss: 0.747092]\n",
"[Epoch 22/200] [Batch 47/59] [D loss: 0.667366] [G loss: 0.739101]\n",
"[Epoch 22/200] [Batch 48/59] [D loss: 0.661400] [G loss: 0.736022]\n",
"[Epoch 22/200] [Batch 49/59] [D loss: 0.665513] [G loss: 0.717859]\n",
"[Epoch 22/200] [Batch 50/59] [D loss: 0.682446] [G loss: 0.715735]\n",
"[Epoch 22/200] [Batch 51/59] [D loss: 0.676078] [G loss: 0.715172]\n",
"[Epoch 22/200] [Batch 52/59] [D loss: 0.667426] [G loss: 0.713966]\n",
"[Epoch 22/200] [Batch 53/59] [D loss: 0.672680] [G loss: 0.723864]\n",
"[Epoch 22/200] [Batch 54/59] [D loss: 0.666521] [G loss: 0.702722]\n",
"[Epoch 22/200] [Batch 55/59] [D loss: 0.665980] [G loss: 0.711685]\n",
"[Epoch 22/200] [Batch 56/59] [D loss: 0.660577] [G loss: 0.749475]\n",
"[Epoch 22/200] [Batch 57/59] [D loss: 0.665767] [G loss: 0.716116]\n",
"[Epoch 22/200] [Batch 58/59] [D loss: 0.663811] [G loss: 0.710620]\n",
"[Epoch 23/200] [Batch 0/59] [D loss: 0.677612] [G loss: 0.725299]\n",
"[Epoch 23/200] [Batch 1/59] [D loss: 0.665991] [G loss: 0.746346]\n",
"[Epoch 23/200] [Batch 2/59] [D loss: 0.666255] [G loss: 0.727900]\n",
"[Epoch 23/200] [Batch 3/59] [D loss: 0.665125] [G loss: 0.734305]\n",
"[Epoch 23/200] [Batch 4/59] [D loss: 0.676263] [G loss: 0.718527]\n",
"[Epoch 23/200] [Batch 5/59] [D loss: 0.678003] [G loss: 0.746763]\n",
"[Epoch 23/200] [Batch 6/59] [D loss: 0.674528] [G loss: 0.779380]\n",
"[Epoch 23/200] [Batch 7/59] [D loss: 0.675035] [G loss: 0.739115]\n",
"[Epoch 23/200] [Batch 8/59] [D loss: 0.672321] [G loss: 0.738208]\n",
"[Epoch 23/200] [Batch 9/59] [D loss: 0.673626] [G loss: 0.739279]\n",
"[Epoch 23/200] [Batch 10/59] [D loss: 0.684080] [G loss: 0.732897]\n",
"[Epoch 23/200] [Batch 11/59] [D loss: 0.671318] [G loss: 0.749392]\n",
"[Epoch 23/200] [Batch 12/59] [D loss: 0.669910] [G loss: 0.724999]\n",
"[Epoch 23/200] [Batch 13/59] [D loss: 0.669073] [G loss: 0.720553]\n",
"[Epoch 23/200] [Batch 14/59] [D loss: 0.681442] [G loss: 0.730423]\n",
"[Epoch 23/200] [Batch 15/59] [D loss: 0.668363] [G loss: 0.776441]\n",
"[Epoch 23/200] [Batch 16/59] [D loss: 0.674102] [G loss: 0.738354]\n",
"[Epoch 23/200] [Batch 17/59] [D loss: 0.657653] [G loss: 0.748371]\n",
"[Epoch 23/200] [Batch 18/59] [D loss: 0.656761] [G loss: 0.746289]\n",
"[Epoch 23/200] [Batch 19/59] [D loss: 0.652363] [G loss: 0.749263]\n",
"[Epoch 23/200] [Batch 20/59] [D loss: 0.663822] [G loss: 0.727448]\n",
"[Epoch 23/200] [Batch 21/59] [D loss: 0.668431] [G loss: 0.752461]\n",
"[Epoch 23/200] [Batch 22/59] [D loss: 0.670092] [G loss: 0.743525]\n",
"[Epoch 23/200] [Batch 23/59] [D loss: 0.670382] [G loss: 0.763927]\n",
"[Epoch 23/200] [Batch 24/59] [D loss: 0.672744] [G loss: 0.746794]\n",
"[Epoch 23/200] [Batch 25/59] [D loss: 0.670672] [G loss: 0.710369]\n",
"[Epoch 23/200] [Batch 26/59] [D loss: 0.686001] [G loss: 0.748021]\n",
"[Epoch 23/200] [Batch 27/59] [D loss: 0.687216] [G loss: 0.753514]\n",
"[Epoch 23/200] [Batch 28/59] [D loss: 0.675421] [G loss: 0.699264]\n",
"[Epoch 23/200] [Batch 29/59] [D loss: 0.674409] [G loss: 0.724405]\n",
"[Epoch 23/200] [Batch 30/59] [D loss: 0.687235] [G loss: 0.708252]\n",
"[Epoch 23/200] [Batch 31/59] [D loss: 0.688799] [G loss: 0.709876]\n",
"[Epoch 23/200] [Batch 32/59] [D loss: 0.665022] [G loss: 0.747558]\n",
"[Epoch 23/200] [Batch 33/59] [D loss: 0.665653] [G loss: 0.708957]\n",
"[Epoch 23/200] [Batch 34/59] [D loss: 0.666816] [G loss: 0.713076]\n",
"[Epoch 23/200] [Batch 35/59] [D loss: 0.666615] [G loss: 0.743480]\n",
"[Epoch 23/200] [Batch 36/59] [D loss: 0.666507] [G loss: 0.767761]\n",
"[Epoch 23/200] [Batch 37/59] [D loss: 0.661239] [G loss: 0.713752]\n",
"[Epoch 23/200] [Batch 38/59] [D loss: 0.657106] [G loss: 0.716302]\n",
"[Epoch 23/200] [Batch 39/59] [D loss: 0.681775] [G loss: 0.750332]\n",
"[Epoch 23/200] [Batch 40/59] [D loss: 0.666083] [G loss: 0.791063]\n",
"[Epoch 23/200] [Batch 41/59] [D loss: 0.662753] [G loss: 0.766698]\n",
"[Epoch 23/200] [Batch 42/59] [D loss: 0.683832] [G loss: 0.742439]\n",
"[Epoch 23/200] [Batch 43/59] [D loss: 0.661294] [G loss: 0.743133]\n",
"[Epoch 23/200] [Batch 44/59] [D loss: 0.683057] [G loss: 0.740775]\n",
"[Epoch 23/200] [Batch 45/59] [D loss: 0.679808] [G loss: 0.772780]\n",
"[Epoch 23/200] [Batch 46/59] [D loss: 0.665085] [G loss: 0.765888]\n",
"[Epoch 23/200] [Batch 47/59] [D loss: 0.665409] [G loss: 0.737432]\n",
"[Epoch 23/200] [Batch 48/59] [D loss: 0.669081] [G loss: 0.745125]\n",
"[Epoch 23/200] [Batch 49/59] [D loss: 0.659748] [G loss: 0.766702]\n",
"[Epoch 23/200] [Batch 50/59] [D loss: 0.659546] [G loss: 0.792514]\n",
"[Epoch 23/200] [Batch 51/59] [D loss: 0.664308] [G loss: 0.699571]\n",
"[Epoch 23/200] [Batch 52/59] [D loss: 0.677861] [G loss: 0.729462]\n",
"[Epoch 23/200] [Batch 53/59] [D loss: 0.673499] [G loss: 0.748783]\n",
"[Epoch 23/200] [Batch 54/59] [D loss: 0.666924] [G loss: 0.737876]\n",
"[Epoch 23/200] [Batch 55/59] [D loss: 0.682476] [G loss: 0.707208]\n",
"[Epoch 23/200] [Batch 56/59] [D loss: 0.674493] [G loss: 0.751687]\n",
"[Epoch 23/200] [Batch 57/59] [D loss: 0.669440] [G loss: 0.775311]\n",
"[Epoch 23/200] [Batch 58/59] [D loss: 0.669333] [G loss: 0.735888]\n",
"[Epoch 24/200] [Batch 0/59] [D loss: 0.678393] [G loss: 0.691959]\n",
"[Epoch 24/200] [Batch 1/59] [D loss: 0.676437] [G loss: 0.774858]\n",
"[Epoch 24/200] [Batch 2/59] [D loss: 0.667338] [G loss: 0.758906]\n",
"[Epoch 24/200] [Batch 3/59] [D loss: 0.682112] [G loss: 0.708991]\n",
"[Epoch 24/200] [Batch 4/59] [D loss: 0.678770] [G loss: 0.701365]\n",
"[Epoch 24/200] [Batch 5/59] [D loss: 0.676318] [G loss: 0.779483]\n",
"[Epoch 24/200] [Batch 6/59] [D loss: 0.670123] [G loss: 0.783790]\n",
"[Epoch 24/200] [Batch 7/59] [D loss: 0.682803] [G loss: 0.712791]\n",
"[Epoch 24/200] [Batch 8/59] [D loss: 0.669407] [G loss: 0.751408]\n",
"[Epoch 24/200] [Batch 9/59] [D loss: 0.669745] [G loss: 0.727929]\n",
"[Epoch 24/200] [Batch 10/59] [D loss: 0.674010] [G loss: 0.756098]\n",
"[Epoch 24/200] [Batch 11/59] [D loss: 0.648315] [G loss: 0.749139]\n",
"[Epoch 24/200] [Batch 12/59] [D loss: 0.663080] [G loss: 0.722594]\n",
"[Epoch 24/200] [Batch 13/59] [D loss: 0.647946] [G loss: 0.750403]\n",
"[Epoch 24/200] [Batch 14/59] [D loss: 0.653553] [G loss: 0.747621]\n",
"[Epoch 24/200] [Batch 15/59] [D loss: 0.647355] [G loss: 0.699497]\n",
"[Epoch 24/200] [Batch 16/59] [D loss: 0.664576] [G loss: 0.720461]\n",
"[Epoch 24/200] [Batch 17/59] [D loss: 0.664211] [G loss: 0.829982]\n",
"[Epoch 24/200] [Batch 18/59] [D loss: 0.648510] [G loss: 0.797774]\n",
"[Epoch 24/200] [Batch 19/59] [D loss: 0.665349] [G loss: 0.674811]\n",
"[Epoch 24/200] [Batch 20/59] [D loss: 0.663845] [G loss: 0.714840]\n",
"[Epoch 24/200] [Batch 21/59] [D loss: 0.653659] [G loss: 0.804569]\n",
"[Epoch 24/200] [Batch 22/59] [D loss: 0.663143] [G loss: 0.757945]\n",
"[Epoch 24/200] [Batch 23/59] [D loss: 0.673216] [G loss: 0.693495]\n",
"[Epoch 24/200] [Batch 24/59] [D loss: 0.658530] [G loss: 0.744525]\n",
"[Epoch 24/200] [Batch 25/59] [D loss: 0.671890] [G loss: 0.789921]\n",
"[Epoch 24/200] [Batch 26/59] [D loss: 0.673437] [G loss: 0.742693]\n",
"[Epoch 24/200] [Batch 27/59] [D loss: 0.667559] [G loss: 0.703422]\n",
"[Epoch 24/200] [Batch 28/59] [D loss: 0.663097] [G loss: 0.723638]\n",
"[Epoch 24/200] [Batch 29/59] [D loss: 0.679649] [G loss: 0.776230]\n",
"[Epoch 24/200] [Batch 30/59] [D loss: 0.687960] [G loss: 0.743839]\n",
"[Epoch 24/200] [Batch 31/59] [D loss: 0.666008] [G loss: 0.710437]\n",
"[Epoch 24/200] [Batch 32/59] [D loss: 0.675880] [G loss: 0.699094]\n",
"[Epoch 24/200] [Batch 33/59] [D loss: 0.663886] [G loss: 0.756091]\n",
"[Epoch 24/200] [Batch 34/59] [D loss: 0.678197] [G loss: 0.771038]\n",
"[Epoch 24/200] [Batch 35/59] [D loss: 0.654434] [G loss: 0.734970]\n",
"[Epoch 24/200] [Batch 36/59] [D loss: 0.643560] [G loss: 0.750546]\n",
"[Epoch 24/200] [Batch 37/59] [D loss: 0.655121] [G loss: 0.783613]\n",
"[Epoch 24/200] [Batch 38/59] [D loss: 0.651188] [G loss: 0.814659]\n",
"[Epoch 24/200] [Batch 39/59] [D loss: 0.638762] [G loss: 0.742135]\n",
"[Epoch 24/200] [Batch 40/59] [D loss: 0.649886] [G loss: 0.743944]\n",
"[Epoch 24/200] [Batch 41/59] [D loss: 0.654122] [G loss: 0.815588]\n",
"[Epoch 24/200] [Batch 42/59] [D loss: 0.638310] [G loss: 0.782404]\n",
"[Epoch 24/200] [Batch 43/59] [D loss: 0.678154] [G loss: 0.707301]\n",
"[Epoch 24/200] [Batch 44/59] [D loss: 0.684779] [G loss: 0.753219]\n",
"[Epoch 24/200] [Batch 45/59] [D loss: 0.678689] [G loss: 0.783840]\n",
"[Epoch 24/200] [Batch 46/59] [D loss: 0.667767] [G loss: 0.747438]\n",
"[Epoch 24/200] [Batch 47/59] [D loss: 0.673694] [G loss: 0.695904]\n",
"[Epoch 24/200] [Batch 48/59] [D loss: 0.658890] [G loss: 0.774595]\n",
"[Epoch 24/200] [Batch 49/59] [D loss: 0.663490] [G loss: 0.756427]\n",
"[Epoch 24/200] [Batch 50/59] [D loss: 0.649229] [G loss: 0.709542]\n",
"[Epoch 24/200] [Batch 51/59] [D loss: 0.661659] [G loss: 0.710203]\n",
"[Epoch 24/200] [Batch 52/59] [D loss: 0.654838] [G loss: 0.800000]\n",
"[Epoch 24/200] [Batch 53/59] [D loss: 0.671427] [G loss: 0.760021]\n",
"[Epoch 24/200] [Batch 54/59] [D loss: 0.656394] [G loss: 0.736202]\n",
"[Epoch 24/200] [Batch 55/59] [D loss: 0.670033] [G loss: 0.712355]\n",
"[Epoch 24/200] [Batch 56/59] [D loss: 0.667305] [G loss: 0.797476]\n",
"[Epoch 24/200] [Batch 57/59] [D loss: 0.658524] [G loss: 0.780275]\n",
"[Epoch 24/200] [Batch 58/59] [D loss: 0.691110] [G loss: 0.733590]\n",
"[Epoch 25/200] [Batch 0/59] [D loss: 0.666581] [G loss: 0.739031]\n",
"[Epoch 25/200] [Batch 1/59] [D loss: 0.662795] [G loss: 0.788791]\n",
"[Epoch 25/200] [Batch 2/59] [D loss: 0.644096] [G loss: 0.776511]\n",
"[Epoch 25/200] [Batch 3/59] [D loss: 0.659243] [G loss: 0.754391]\n",
"[Epoch 25/200] [Batch 4/59] [D loss: 0.646150] [G loss: 0.739558]\n",
"[Epoch 25/200] [Batch 5/59] [D loss: 0.653021] [G loss: 0.800872]\n",
"[Epoch 25/200] [Batch 6/59] [D loss: 0.646481] [G loss: 0.782496]\n",
"[Epoch 25/200] [Batch 7/59] [D loss: 0.660800] [G loss: 0.730220]\n",
"[Epoch 25/200] [Batch 8/59] [D loss: 0.649911] [G loss: 0.712242]\n",
"[Epoch 25/200] [Batch 9/59] [D loss: 0.661354] [G loss: 0.748939]\n",
"[Epoch 25/200] [Batch 10/59] [D loss: 0.660116] [G loss: 0.743577]\n",
"[Epoch 25/200] [Batch 11/59] [D loss: 0.671181] [G loss: 0.682848]\n",
"[Epoch 25/200] [Batch 12/59] [D loss: 0.666964] [G loss: 0.708725]\n",
"[Epoch 25/200] [Batch 13/59] [D loss: 0.670440] [G loss: 0.732269]\n",
"[Epoch 25/200] [Batch 14/59] [D loss: 0.665853] [G loss: 0.767693]\n",
"[Epoch 25/200] [Batch 15/59] [D loss: 0.662320] [G loss: 0.754656]\n",
"[Epoch 25/200] [Batch 16/59] [D loss: 0.678324] [G loss: 0.731215]\n",
"[Epoch 25/200] [Batch 17/59] [D loss: 0.662405] [G loss: 0.795596]\n",
"[Epoch 25/200] [Batch 18/59] [D loss: 0.651566] [G loss: 0.756850]\n",
"[Epoch 25/200] [Batch 19/59] [D loss: 0.658651] [G loss: 0.735060]\n",
"[Epoch 25/200] [Batch 20/59] [D loss: 0.662898] [G loss: 0.768866]\n",
"[Epoch 25/200] [Batch 21/59] [D loss: 0.670755] [G loss: 0.727753]\n",
"[Epoch 25/200] [Batch 22/59] [D loss: 0.649201] [G loss: 0.777556]\n",
"[Epoch 25/200] [Batch 23/59] [D loss: 0.650907] [G loss: 0.831616]\n",
"[Epoch 25/200] [Batch 24/59] [D loss: 0.661109] [G loss: 0.732630]\n",
"[Epoch 25/200] [Batch 25/59] [D loss: 0.674142] [G loss: 0.695088]\n",
"[Epoch 25/200] [Batch 26/59] [D loss: 0.666974] [G loss: 0.771703]\n",
"[Epoch 25/200] [Batch 27/59] [D loss: 0.662446] [G loss: 0.754975]\n",
"[Epoch 25/200] [Batch 28/59] [D loss: 0.674748] [G loss: 0.733073]\n",
"[Epoch 25/200] [Batch 29/59] [D loss: 0.656847] [G loss: 0.736313]\n",
"[Epoch 25/200] [Batch 30/59] [D loss: 0.659473] [G loss: 0.760955]\n",
"[Epoch 25/200] [Batch 31/59] [D loss: 0.673452] [G loss: 0.754613]\n",
"[Epoch 25/200] [Batch 32/59] [D loss: 0.677642] [G loss: 0.728492]\n",
"[Epoch 25/200] [Batch 33/59] [D loss: 0.654376] [G loss: 0.720149]\n",
"[Epoch 25/200] [Batch 34/59] [D loss: 0.689148] [G loss: 0.774106]\n",
"[Epoch 25/200] [Batch 35/59] [D loss: 0.676182] [G loss: 0.792201]\n",
"[Epoch 25/200] [Batch 36/59] [D loss: 0.671028] [G loss: 0.734934]\n",
"[Epoch 25/200] [Batch 37/59] [D loss: 0.655107] [G loss: 0.737575]\n",
"[Epoch 25/200] [Batch 38/59] [D loss: 0.647014] [G loss: 0.757044]\n",
"[Epoch 25/200] [Batch 39/59] [D loss: 0.656901] [G loss: 0.781471]\n",
"[Epoch 25/200] [Batch 40/59] [D loss: 0.656476] [G loss: 0.775637]\n",
"[Epoch 25/200] [Batch 41/59] [D loss: 0.647229] [G loss: 0.751134]\n",
"[Epoch 25/200] [Batch 42/59] [D loss: 0.664406] [G loss: 0.749646]\n",
"[Epoch 25/200] [Batch 43/59] [D loss: 0.656260] [G loss: 0.773862]\n",
"[Epoch 25/200] [Batch 44/59] [D loss: 0.647423] [G loss: 0.785525]\n",
"[Epoch 25/200] [Batch 45/59] [D loss: 0.657615] [G loss: 0.699761]\n",
"[Epoch 25/200] [Batch 46/59] [D loss: 0.679201] [G loss: 0.735326]\n",
"[Epoch 25/200] [Batch 47/59] [D loss: 0.654019] [G loss: 0.781125]\n",
"[Epoch 25/200] [Batch 48/59] [D loss: 0.666758] [G loss: 0.752946]\n",
"[Epoch 25/200] [Batch 49/59] [D loss: 0.659635] [G loss: 0.731186]\n",
"[Epoch 25/200] [Batch 50/59] [D loss: 0.655558] [G loss: 0.780396]\n",
"[Epoch 25/200] [Batch 51/59] [D loss: 0.668208] [G loss: 0.750309]\n",
"[Epoch 25/200] [Batch 52/59] [D loss: 0.661144] [G loss: 0.717905]\n",
"[Epoch 25/200] [Batch 53/59] [D loss: 0.661234] [G loss: 0.747669]\n",
"[Epoch 25/200] [Batch 54/59] [D loss: 0.655091] [G loss: 0.775691]\n",
"[Epoch 25/200] [Batch 55/59] [D loss: 0.655263] [G loss: 0.749094]\n",
"[Epoch 25/200] [Batch 56/59] [D loss: 0.672688] [G loss: 0.712581]\n",
"[Epoch 25/200] [Batch 57/59] [D loss: 0.652288] [G loss: 0.786774]\n",
"[Epoch 25/200] [Batch 58/59] [D loss: 0.670114] [G loss: 0.736606]\n",
"[Epoch 26/200] [Batch 0/59] [D loss: 0.668098] [G loss: 0.806283]\n",
"[Epoch 26/200] [Batch 1/59] [D loss: 0.677911] [G loss: 0.765849]\n",
"[Epoch 26/200] [Batch 2/59] [D loss: 0.648973] [G loss: 0.759622]\n",
"[Epoch 26/200] [Batch 3/59] [D loss: 0.665153] [G loss: 0.713924]\n",
"[Epoch 26/200] [Batch 4/59] [D loss: 0.654985] [G loss: 0.783173]\n",
"[Epoch 26/200] [Batch 5/59] [D loss: 0.651309] [G loss: 0.750196]\n",
"[Epoch 26/200] [Batch 6/59] [D loss: 0.656556] [G loss: 0.671719]\n",
"[Epoch 26/200] [Batch 7/59] [D loss: 0.656987] [G loss: 0.745102]\n",
"[Epoch 26/200] [Batch 8/59] [D loss: 0.672048] [G loss: 0.767061]\n",
"[Epoch 26/200] [Batch 9/59] [D loss: 0.670035] [G loss: 0.736046]\n",
"[Epoch 26/200] [Batch 10/59] [D loss: 0.653644] [G loss: 0.788736]\n",
"[Epoch 26/200] [Batch 11/59] [D loss: 0.662855] [G loss: 0.758104]\n",
"[Epoch 26/200] [Batch 12/59] [D loss: 0.674468] [G loss: 0.726610]\n",
"[Epoch 26/200] [Batch 13/59] [D loss: 0.683798] [G loss: 0.745374]\n",
"[Epoch 26/200] [Batch 14/59] [D loss: 0.657028] [G loss: 0.757443]\n",
"[Epoch 26/200] [Batch 15/59] [D loss: 0.675816] [G loss: 0.764458]\n",
"[Epoch 26/200] [Batch 16/59] [D loss: 0.687396] [G loss: 0.755810]\n",
"[Epoch 26/200] [Batch 17/59] [D loss: 0.669399] [G loss: 0.766892]\n",
"[Epoch 26/200] [Batch 18/59] [D loss: 0.684037] [G loss: 0.761803]\n",
"[Epoch 26/200] [Batch 19/59] [D loss: 0.675320] [G loss: 0.718147]\n",
"[Epoch 26/200] [Batch 20/59] [D loss: 0.666923] [G loss: 0.760108]\n",
"[Epoch 26/200] [Batch 21/59] [D loss: 0.649958] [G loss: 0.785450]\n",
"[Epoch 26/200] [Batch 22/59] [D loss: 0.656392] [G loss: 0.703362]\n",
"[Epoch 26/200] [Batch 23/59] [D loss: 0.662593] [G loss: 0.733486]\n",
"[Epoch 26/200] [Batch 24/59] [D loss: 0.652446] [G loss: 0.782612]\n",
"[Epoch 26/200] [Batch 25/59] [D loss: 0.640459] [G loss: 0.788012]\n",
"[Epoch 26/200] [Batch 26/59] [D loss: 0.651330] [G loss: 0.694592]\n",
"[Epoch 26/200] [Batch 27/59] [D loss: 0.646672] [G loss: 0.779588]\n",
"[Epoch 26/200] [Batch 28/59] [D loss: 0.652901] [G loss: 0.796791]\n",
"[Epoch 26/200] [Batch 29/59] [D loss: 0.640958] [G loss: 0.755650]\n",
"[Epoch 26/200] [Batch 30/59] [D loss: 0.636854] [G loss: 0.737483]\n",
"[Epoch 26/200] [Batch 31/59] [D loss: 0.637622] [G loss: 0.769009]\n",
"[Epoch 26/200] [Batch 32/59] [D loss: 0.643293] [G loss: 0.782197]\n",
"[Epoch 26/200] [Batch 33/59] [D loss: 0.638321] [G loss: 0.772933]\n",
"[Epoch 26/200] [Batch 34/59] [D loss: 0.645037] [G loss: 0.745366]\n",
"[Epoch 26/200] [Batch 35/59] [D loss: 0.636759] [G loss: 0.787921]\n",
"[Epoch 26/200] [Batch 36/59] [D loss: 0.653336] [G loss: 0.739064]\n",
"[Epoch 26/200] [Batch 37/59] [D loss: 0.655679] [G loss: 0.752460]\n",
"[Epoch 26/200] [Batch 38/59] [D loss: 0.664900] [G loss: 0.723187]\n",
"[Epoch 26/200] [Batch 39/59] [D loss: 0.682270] [G loss: 0.745446]\n",
"[Epoch 26/200] [Batch 40/59] [D loss: 0.691963] [G loss: 0.741766]\n",
"[Epoch 26/200] [Batch 41/59] [D loss: 0.699723] [G loss: 0.736026]\n",
"[Epoch 26/200] [Batch 42/59] [D loss: 0.692640] [G loss: 0.737087]\n",
"[Epoch 26/200] [Batch 43/59] [D loss: 0.694416] [G loss: 0.719052]\n",
"[Epoch 26/200] [Batch 44/59] [D loss: 0.668392] [G loss: 0.763749]\n",
"[Epoch 26/200] [Batch 45/59] [D loss: 0.666505] [G loss: 0.783430]\n",
"[Epoch 26/200] [Batch 46/59] [D loss: 0.641482] [G loss: 0.806550]\n",
"[Epoch 26/200] [Batch 47/59] [D loss: 0.650085] [G loss: 0.767905]\n",
"[Epoch 26/200] [Batch 48/59] [D loss: 0.643287] [G loss: 0.764731]\n",
"[Epoch 26/200] [Batch 49/59] [D loss: 0.653262] [G loss: 0.727046]\n",
"[Epoch 26/200] [Batch 50/59] [D loss: 0.649845] [G loss: 0.784762]\n",
"[Epoch 26/200] [Batch 51/59] [D loss: 0.644407] [G loss: 0.742276]\n",
"[Epoch 26/200] [Batch 52/59] [D loss: 0.635836] [G loss: 0.750986]\n",
"[Epoch 26/200] [Batch 53/59] [D loss: 0.627634] [G loss: 0.817412]\n",
"[Epoch 26/200] [Batch 54/59] [D loss: 0.652978] [G loss: 0.736672]\n",
"[Epoch 26/200] [Batch 55/59] [D loss: 0.656751] [G loss: 0.761720]\n",
"[Epoch 26/200] [Batch 56/59] [D loss: 0.648632] [G loss: 0.778596]\n",
"[Epoch 26/200] [Batch 57/59] [D loss: 0.659502] [G loss: 0.755692]\n",
"[Epoch 26/200] [Batch 58/59] [D loss: 0.678361] [G loss: 0.762636]\n",
"[Epoch 27/200] [Batch 0/59] [D loss: 0.686717] [G loss: 0.757232]\n",
"[Epoch 27/200] [Batch 1/59] [D loss: 0.676014] [G loss: 0.780843]\n",
"[Epoch 27/200] [Batch 2/59] [D loss: 0.676288] [G loss: 0.758659]\n",
"[Epoch 27/200] [Batch 3/59] [D loss: 0.688452] [G loss: 0.705002]\n",
"[Epoch 27/200] [Batch 4/59] [D loss: 0.665193] [G loss: 0.740003]\n",
"[Epoch 27/200] [Batch 5/59] [D loss: 0.667034] [G loss: 0.692409]\n",
"[Epoch 27/200] [Batch 6/59] [D loss: 0.648281] [G loss: 0.729038]\n",
"[Epoch 27/200] [Batch 7/59] [D loss: 0.664117] [G loss: 0.733045]\n",
"[Epoch 27/200] [Batch 8/59] [D loss: 0.655657] [G loss: 0.768463]\n",
"[Epoch 27/200] [Batch 9/59] [D loss: 0.653475] [G loss: 0.762863]\n",
"[Epoch 27/200] [Batch 10/59] [D loss: 0.658059] [G loss: 0.769491]\n",
"[Epoch 27/200] [Batch 11/59] [D loss: 0.650682] [G loss: 0.788895]\n",
"[Epoch 27/200] [Batch 12/59] [D loss: 0.644141] [G loss: 0.806155]\n",
"[Epoch 27/200] [Batch 13/59] [D loss: 0.644722] [G loss: 0.773391]\n",
"[Epoch 27/200] [Batch 14/59] [D loss: 0.656866] [G loss: 0.769480]\n",
"[Epoch 27/200] [Batch 15/59] [D loss: 0.651845] [G loss: 0.775932]\n",
"[Epoch 27/200] [Batch 16/59] [D loss: 0.641605] [G loss: 0.760534]\n",
"[Epoch 27/200] [Batch 17/59] [D loss: 0.642084] [G loss: 0.761239]\n",
"[Epoch 27/200] [Batch 18/59] [D loss: 0.638879] [G loss: 0.790796]\n",
"[Epoch 27/200] [Batch 19/59] [D loss: 0.628441] [G loss: 0.753500]\n",
"[Epoch 27/200] [Batch 20/59] [D loss: 0.654370] [G loss: 0.753302]\n",
"[Epoch 27/200] [Batch 21/59] [D loss: 0.665842] [G loss: 0.784741]\n",
"[Epoch 27/200] [Batch 22/59] [D loss: 0.643909] [G loss: 0.755951]\n",
"[Epoch 27/200] [Batch 23/59] [D loss: 0.662808] [G loss: 0.695605]\n",
"[Epoch 27/200] [Batch 24/59] [D loss: 0.677753] [G loss: 0.842490]\n",
"[Epoch 27/200] [Batch 25/59] [D loss: 0.675958] [G loss: 0.771350]\n",
"[Epoch 27/200] [Batch 26/59] [D loss: 0.706051] [G loss: 0.710293]\n",
"[Epoch 27/200] [Batch 27/59] [D loss: 0.664536] [G loss: 0.735108]\n",
"[Epoch 27/200] [Batch 28/59] [D loss: 0.687375] [G loss: 0.740001]\n",
"[Epoch 27/200] [Batch 29/59] [D loss: 0.680574] [G loss: 0.721184]\n",
"[Epoch 27/200] [Batch 30/59] [D loss: 0.673733] [G loss: 0.831398]\n",
"[Epoch 27/200] [Batch 31/59] [D loss: 0.656319] [G loss: 0.814225]\n",
"[Epoch 27/200] [Batch 32/59] [D loss: 0.665116] [G loss: 0.653903]\n",
"[Epoch 27/200] [Batch 33/59] [D loss: 0.640244] [G loss: 0.831879]\n",
"[Epoch 27/200] [Batch 34/59] [D loss: 0.636566] [G loss: 0.823981]\n",
"[Epoch 27/200] [Batch 35/59] [D loss: 0.664580] [G loss: 0.620938]\n",
"[Epoch 27/200] [Batch 36/59] [D loss: 0.634774] [G loss: 0.938527]\n",
"[Epoch 27/200] [Batch 37/59] [D loss: 0.620045] [G loss: 0.850816]\n",
"[Epoch 27/200] [Batch 38/59] [D loss: 0.663569] [G loss: 0.603351]\n",
"[Epoch 27/200] [Batch 39/59] [D loss: 0.653623] [G loss: 0.801577]\n",
"[Epoch 27/200] [Batch 40/59] [D loss: 0.669081] [G loss: 0.923229]\n",
"[Epoch 27/200] [Batch 41/59] [D loss: 0.655352] [G loss: 0.662172]\n",
"[Epoch 27/200] [Batch 42/59] [D loss: 0.678328] [G loss: 0.664695]\n",
"[Epoch 27/200] [Batch 43/59] [D loss: 0.683062] [G loss: 0.882155]\n",
"[Epoch 27/200] [Batch 44/59] [D loss: 0.689619] [G loss: 0.763941]\n",
"[Epoch 27/200] [Batch 45/59] [D loss: 0.669442] [G loss: 0.672661]\n",
"[Epoch 27/200] [Batch 46/59] [D loss: 0.670685] [G loss: 0.746673]\n",
"[Epoch 27/200] [Batch 47/59] [D loss: 0.664816] [G loss: 0.769350]\n",
"[Epoch 27/200] [Batch 48/59] [D loss: 0.662922] [G loss: 0.757492]\n",
"[Epoch 27/200] [Batch 49/59] [D loss: 0.660933] [G loss: 0.710837]\n",
"[Epoch 27/200] [Batch 50/59] [D loss: 0.648287] [G loss: 0.742744]\n",
"[Epoch 27/200] [Batch 51/59] [D loss: 0.670384] [G loss: 0.804464]\n",
"[Epoch 27/200] [Batch 52/59] [D loss: 0.645813] [G loss: 0.803853]\n",
"[Epoch 27/200] [Batch 53/59] [D loss: 0.661639] [G loss: 0.717902]\n",
"[Epoch 27/200] [Batch 54/59] [D loss: 0.649721] [G loss: 0.777146]\n",
"[Epoch 27/200] [Batch 55/59] [D loss: 0.637353] [G loss: 0.814057]\n",
"[Epoch 27/200] [Batch 56/59] [D loss: 0.654103] [G loss: 0.773047]\n",
"[Epoch 27/200] [Batch 57/59] [D loss: 0.636185] [G loss: 0.799392]\n",
"[Epoch 27/200] [Batch 58/59] [D loss: 0.655646] [G loss: 0.786425]\n",
"[Epoch 28/200] [Batch 0/59] [D loss: 0.649536] [G loss: 0.802757]\n",
"[Epoch 28/200] [Batch 1/59] [D loss: 0.644835] [G loss: 0.740165]\n",
"[Epoch 28/200] [Batch 2/59] [D loss: 0.661897] [G loss: 0.741404]\n",
"[Epoch 28/200] [Batch 3/59] [D loss: 0.664902] [G loss: 0.753102]\n",
"[Epoch 28/200] [Batch 4/59] [D loss: 0.667426] [G loss: 0.742108]\n",
"[Epoch 28/200] [Batch 5/59] [D loss: 0.650038] [G loss: 0.791790]\n",
"[Epoch 28/200] [Batch 6/59] [D loss: 0.652773] [G loss: 0.713829]\n",
"[Epoch 28/200] [Batch 7/59] [D loss: 0.666488] [G loss: 0.734530]\n",
"[Epoch 28/200] [Batch 8/59] [D loss: 0.674642] [G loss: 0.777414]\n",
"[Epoch 28/200] [Batch 9/59] [D loss: 0.657547] [G loss: 0.768765]\n",
"[Epoch 28/200] [Batch 10/59] [D loss: 0.666443] [G loss: 0.709853]\n",
"[Epoch 28/200] [Batch 11/59] [D loss: 0.661269] [G loss: 0.789106]\n",
"[Epoch 28/200] [Batch 12/59] [D loss: 0.682432] [G loss: 0.775883]\n",
"[Epoch 28/200] [Batch 13/59] [D loss: 0.677395] [G loss: 0.758042]\n",
"[Epoch 28/200] [Batch 14/59] [D loss: 0.662961] [G loss: 0.793060]\n",
"[Epoch 28/200] [Batch 15/59] [D loss: 0.672151] [G loss: 0.733777]\n",
"[Epoch 28/200] [Batch 16/59] [D loss: 0.668669] [G loss: 0.740055]\n",
"[Epoch 28/200] [Batch 17/59] [D loss: 0.645008] [G loss: 0.765189]\n",
"[Epoch 28/200] [Batch 18/59] [D loss: 0.661004] [G loss: 0.735641]\n",
"[Epoch 28/200] [Batch 19/59] [D loss: 0.665004] [G loss: 0.737288]\n",
"[Epoch 28/200] [Batch 20/59] [D loss: 0.669533] [G loss: 0.733425]\n",
"[Epoch 28/200] [Batch 21/59] [D loss: 0.658943] [G loss: 0.759451]\n",
"[Epoch 28/200] [Batch 22/59] [D loss: 0.650565] [G loss: 0.783979]\n",
"[Epoch 28/200] [Batch 23/59] [D loss: 0.651670] [G loss: 0.769417]\n",
"[Epoch 28/200] [Batch 24/59] [D loss: 0.653640] [G loss: 0.759717]\n",
"[Epoch 28/200] [Batch 25/59] [D loss: 0.653794] [G loss: 0.815917]\n",
"[Epoch 28/200] [Batch 26/59] [D loss: 0.651263] [G loss: 0.778177]\n",
"[Epoch 28/200] [Batch 27/59] [D loss: 0.661335] [G loss: 0.732491]\n",
"[Epoch 28/200] [Batch 28/59] [D loss: 0.642385] [G loss: 0.795157]\n",
"[Epoch 28/200] [Batch 29/59] [D loss: 0.652565] [G loss: 0.820492]\n",
"[Epoch 28/200] [Batch 30/59] [D loss: 0.658848] [G loss: 0.736953]\n",
"[Epoch 28/200] [Batch 31/59] [D loss: 0.674175] [G loss: 0.775336]\n",
"[Epoch 28/200] [Batch 32/59] [D loss: 0.660372] [G loss: 0.800967]\n",
"[Epoch 28/200] [Batch 33/59] [D loss: 0.655911] [G loss: 0.742359]\n",
"[Epoch 28/200] [Batch 34/59] [D loss: 0.660286] [G loss: 0.696631]\n",
"[Epoch 28/200] [Batch 35/59] [D loss: 0.664729] [G loss: 0.785102]\n",
"[Epoch 28/200] [Batch 36/59] [D loss: 0.669038] [G loss: 0.807919]\n",
"[Epoch 28/200] [Batch 37/59] [D loss: 0.664130] [G loss: 0.725441]\n",
"[Epoch 28/200] [Batch 38/59] [D loss: 0.652796] [G loss: 0.697377]\n",
"[Epoch 28/200] [Batch 39/59] [D loss: 0.653029] [G loss: 0.805883]\n",
"[Epoch 28/200] [Batch 40/59] [D loss: 0.635362] [G loss: 0.813763]\n",
"[Epoch 28/200] [Batch 41/59] [D loss: 0.644890] [G loss: 0.719990]\n",
"[Epoch 28/200] [Batch 42/59] [D loss: 0.631184] [G loss: 0.802221]\n",
"[Epoch 28/200] [Batch 43/59] [D loss: 0.647885] [G loss: 0.781693]\n",
"[Epoch 28/200] [Batch 44/59] [D loss: 0.656815] [G loss: 0.751343]\n",
"[Epoch 28/200] [Batch 45/59] [D loss: 0.652833] [G loss: 0.793534]\n",
"[Epoch 28/200] [Batch 46/59] [D loss: 0.656349] [G loss: 0.776084]\n",
"[Epoch 28/200] [Batch 47/59] [D loss: 0.666149] [G loss: 0.741354]\n",
"[Epoch 28/200] [Batch 48/59] [D loss: 0.686245] [G loss: 0.748323]\n",
"[Epoch 28/200] [Batch 49/59] [D loss: 0.657892] [G loss: 0.781573]\n",
"[Epoch 28/200] [Batch 50/59] [D loss: 0.669900] [G loss: 0.750621]\n",
"[Epoch 28/200] [Batch 51/59] [D loss: 0.668033] [G loss: 0.750993]\n",
"[Epoch 28/200] [Batch 52/59] [D loss: 0.659002] [G loss: 0.735040]\n",
"[Epoch 28/200] [Batch 53/59] [D loss: 0.670593] [G loss: 0.800146]\n",
"[Epoch 28/200] [Batch 54/59] [D loss: 0.663720] [G loss: 0.759935]\n",
"[Epoch 28/200] [Batch 55/59] [D loss: 0.658204] [G loss: 0.754392]\n",
"[Epoch 28/200] [Batch 56/59] [D loss: 0.653537] [G loss: 0.803111]\n",
"[Epoch 28/200] [Batch 57/59] [D loss: 0.640021] [G loss: 0.768119]\n",
"[Epoch 28/200] [Batch 58/59] [D loss: 0.635491] [G loss: 0.753059]\n",
"[Epoch 29/200] [Batch 0/59] [D loss: 0.644225] [G loss: 0.786207]\n",
"[Epoch 29/200] [Batch 1/59] [D loss: 0.643534] [G loss: 0.731113]\n",
"[Epoch 29/200] [Batch 2/59] [D loss: 0.667735] [G loss: 0.754639]\n",
"[Epoch 29/200] [Batch 3/59] [D loss: 0.658155] [G loss: 0.759417]\n",
"[Epoch 29/200] [Batch 4/59] [D loss: 0.663595] [G loss: 0.771511]\n",
"[Epoch 29/200] [Batch 5/59] [D loss: 0.675609] [G loss: 0.710828]\n",
"[Epoch 29/200] [Batch 6/59] [D loss: 0.681469] [G loss: 0.765005]\n",
"[Epoch 29/200] [Batch 7/59] [D loss: 0.665061] [G loss: 0.743708]\n",
"[Epoch 29/200] [Batch 8/59] [D loss: 0.671976] [G loss: 0.718429]\n",
"[Epoch 29/200] [Batch 9/59] [D loss: 0.662776] [G loss: 0.752075]\n",
"[Epoch 29/200] [Batch 10/59] [D loss: 0.653898] [G loss: 0.774513]\n",
"[Epoch 29/200] [Batch 11/59] [D loss: 0.651039] [G loss: 0.722588]\n",
"[Epoch 29/200] [Batch 12/59] [D loss: 0.634477] [G loss: 0.745561]\n",
"[Epoch 29/200] [Batch 13/59] [D loss: 0.621515] [G loss: 0.810066]\n",
"[Epoch 29/200] [Batch 14/59] [D loss: 0.637828] [G loss: 0.807595]\n",
"[Epoch 29/200] [Batch 15/59] [D loss: 0.631785] [G loss: 0.774785]\n",
"[Epoch 29/200] [Batch 16/59] [D loss: 0.632230] [G loss: 0.761345]\n",
"[Epoch 29/200] [Batch 17/59] [D loss: 0.638439] [G loss: 0.768561]\n",
"[Epoch 29/200] [Batch 18/59] [D loss: 0.659670] [G loss: 0.786983]\n",
"[Epoch 29/200] [Batch 19/59] [D loss: 0.644745] [G loss: 0.793011]\n",
"[Epoch 29/200] [Batch 20/59] [D loss: 0.666955] [G loss: 0.727544]\n",
"[Epoch 29/200] [Batch 21/59] [D loss: 0.663971] [G loss: 0.693342]\n",
"[Epoch 29/200] [Batch 22/59] [D loss: 0.660375] [G loss: 0.787691]\n",
"[Epoch 29/200] [Batch 23/59] [D loss: 0.682385] [G loss: 0.723670]\n",
"[Epoch 29/200] [Batch 24/59] [D loss: 0.670714] [G loss: 0.726041]\n",
"[Epoch 29/200] [Batch 25/59] [D loss: 0.655009] [G loss: 0.684284]\n",
"[Epoch 29/200] [Batch 26/59] [D loss: 0.675513] [G loss: 0.816486]\n",
"[Epoch 29/200] [Batch 27/59] [D loss: 0.663841] [G loss: 0.852596]\n",
"[Epoch 29/200] [Batch 28/59] [D loss: 0.653801] [G loss: 0.757550]\n",
"[Epoch 29/200] [Batch 29/59] [D loss: 0.654698] [G loss: 0.761666]\n",
"[Epoch 29/200] [Batch 30/59] [D loss: 0.633485] [G loss: 0.850599]\n",
"[Epoch 29/200] [Batch 31/59] [D loss: 0.640190] [G loss: 0.817241]\n",
"[Epoch 29/200] [Batch 32/59] [D loss: 0.632800] [G loss: 0.734723]\n",
"[Epoch 29/200] [Batch 33/59] [D loss: 0.643674] [G loss: 0.742153]\n",
"[Epoch 29/200] [Batch 34/59] [D loss: 0.639579] [G loss: 0.826332]\n",
"[Epoch 29/200] [Batch 35/59] [D loss: 0.637472] [G loss: 0.746111]\n",
"[Epoch 29/200] [Batch 36/59] [D loss: 0.653497] [G loss: 0.677877]\n",
"[Epoch 29/200] [Batch 37/59] [D loss: 0.657250] [G loss: 0.770926]\n",
"[Epoch 29/200] [Batch 38/59] [D loss: 0.671281] [G loss: 0.777490]\n",
"[Epoch 29/200] [Batch 39/59] [D loss: 0.680014] [G loss: 0.736968]\n",
"[Epoch 29/200] [Batch 40/59] [D loss: 0.663795] [G loss: 0.750822]\n",
"[Epoch 29/200] [Batch 41/59] [D loss: 0.675725] [G loss: 0.775436]\n",
"[Epoch 29/200] [Batch 42/59] [D loss: 0.667981] [G loss: 0.774839]\n",
"[Epoch 29/200] [Batch 43/59] [D loss: 0.662228] [G loss: 0.727093]\n",
"[Epoch 29/200] [Batch 44/59] [D loss: 0.650485] [G loss: 0.788512]\n",
"[Epoch 29/200] [Batch 45/59] [D loss: 0.657539] [G loss: 0.744906]\n",
"[Epoch 29/200] [Batch 46/59] [D loss: 0.651094] [G loss: 0.748534]\n",
"[Epoch 29/200] [Batch 47/59] [D loss: 0.664052] [G loss: 0.770204]\n",
"[Epoch 29/200] [Batch 48/59] [D loss: 0.657706] [G loss: 0.761496]\n",
"[Epoch 29/200] [Batch 49/59] [D loss: 0.661503] [G loss: 0.718047]\n",
"[Epoch 29/200] [Batch 50/59] [D loss: 0.689558] [G loss: 0.794715]\n",
"[Epoch 29/200] [Batch 51/59] [D loss: 0.684711] [G loss: 0.776962]\n",
"[Epoch 29/200] [Batch 52/59] [D loss: 0.685533] [G loss: 0.651318]\n",
"[Epoch 29/200] [Batch 53/59] [D loss: 0.672347] [G loss: 0.786885]\n",
"[Epoch 29/200] [Batch 54/59] [D loss: 0.673324] [G loss: 0.819705]\n",
"[Epoch 29/200] [Batch 55/59] [D loss: 0.664058] [G loss: 0.714641]\n",
"[Epoch 29/200] [Batch 56/59] [D loss: 0.669291] [G loss: 0.773038]\n",
"[Epoch 29/200] [Batch 57/59] [D loss: 0.640625] [G loss: 0.793502]\n",
"[Epoch 29/200] [Batch 58/59] [D loss: 0.635873] [G loss: 0.729229]\n",
"[Epoch 30/200] [Batch 0/59] [D loss: 0.650853] [G loss: 0.753235]\n",
"[Epoch 30/200] [Batch 1/59] [D loss: 0.629507] [G loss: 0.844733]\n",
"[Epoch 30/200] [Batch 2/59] [D loss: 0.643948] [G loss: 0.801261]\n",
"[Epoch 30/200] [Batch 3/59] [D loss: 0.632042] [G loss: 0.716886]\n",
"[Epoch 30/200] [Batch 4/59] [D loss: 0.639381] [G loss: 0.849020]\n",
"[Epoch 30/200] [Batch 5/59] [D loss: 0.641878] [G loss: 0.851580]\n",
"[Epoch 30/200] [Batch 6/59] [D loss: 0.639627] [G loss: 0.683964]\n",
"[Epoch 30/200] [Batch 7/59] [D loss: 0.660472] [G loss: 0.841947]\n",
"[Epoch 30/200] [Batch 8/59] [D loss: 0.662034] [G loss: 0.880561]\n",
"[Epoch 30/200] [Batch 9/59] [D loss: 0.686143] [G loss: 0.651645]\n",
"[Epoch 30/200] [Batch 10/59] [D loss: 0.676301] [G loss: 0.749810]\n",
"[Epoch 30/200] [Batch 11/59] [D loss: 0.662051] [G loss: 0.762997]\n",
"[Epoch 30/200] [Batch 12/59] [D loss: 0.674041] [G loss: 0.761928]\n",
"[Epoch 30/200] [Batch 13/59] [D loss: 0.665778] [G loss: 0.700539]\n",
"[Epoch 30/200] [Batch 14/59] [D loss: 0.664967] [G loss: 0.771647]\n",
"[Epoch 30/200] [Batch 15/59] [D loss: 0.652311] [G loss: 0.711969]\n",
"[Epoch 30/200] [Batch 16/59] [D loss: 0.642630] [G loss: 0.807027]\n",
"[Epoch 30/200] [Batch 17/59] [D loss: 0.660946] [G loss: 0.772352]\n",
"[Epoch 30/200] [Batch 18/59] [D loss: 0.638575] [G loss: 0.768874]\n",
"[Epoch 30/200] [Batch 19/59] [D loss: 0.640043] [G loss: 0.762762]\n",
"[Epoch 30/200] [Batch 20/59] [D loss: 0.650187] [G loss: 0.826455]\n",
"[Epoch 30/200] [Batch 21/59] [D loss: 0.639356] [G loss: 0.781568]\n",
"[Epoch 30/200] [Batch 22/59] [D loss: 0.652444] [G loss: 0.718703]\n",
"[Epoch 30/200] [Batch 23/59] [D loss: 0.682724] [G loss: 0.772998]\n",
"[Epoch 30/200] [Batch 24/59] [D loss: 0.657296] [G loss: 0.768959]\n",
"[Epoch 30/200] [Batch 25/59] [D loss: 0.663574] [G loss: 0.702048]\n",
"[Epoch 30/200] [Batch 26/59] [D loss: 0.662355] [G loss: 0.773845]\n",
"[Epoch 30/200] [Batch 27/59] [D loss: 0.664325] [G loss: 0.779812]\n",
"[Epoch 30/200] [Batch 28/59] [D loss: 0.666458] [G loss: 0.737376]\n",
"[Epoch 30/200] [Batch 29/59] [D loss: 0.657582] [G loss: 0.795508]\n",
"[Epoch 30/200] [Batch 30/59] [D loss: 0.622183] [G loss: 0.793804]\n",
"[Epoch 30/200] [Batch 31/59] [D loss: 0.641333] [G loss: 0.754105]\n",
"[Epoch 30/200] [Batch 32/59] [D loss: 0.636009] [G loss: 0.801220]\n",
"[Epoch 30/200] [Batch 33/59] [D loss: 0.635251] [G loss: 0.785489]\n",
"[Epoch 30/200] [Batch 34/59] [D loss: 0.640993] [G loss: 0.809821]\n",
"[Epoch 30/200] [Batch 35/59] [D loss: 0.645701] [G loss: 0.778992]\n",
"[Epoch 30/200] [Batch 36/59] [D loss: 0.646936] [G loss: 0.755502]\n",
"[Epoch 30/200] [Batch 37/59] [D loss: 0.651318] [G loss: 0.718691]\n",
"[Epoch 30/200] [Batch 38/59] [D loss: 0.664112] [G loss: 0.847996]\n",
"[Epoch 30/200] [Batch 39/59] [D loss: 0.681772] [G loss: 0.806853]\n",
"[Epoch 30/200] [Batch 40/59] [D loss: 0.659100] [G loss: 0.722579]\n",
"[Epoch 30/200] [Batch 41/59] [D loss: 0.687133] [G loss: 0.716818]\n",
"[Epoch 30/200] [Batch 42/59] [D loss: 0.683552] [G loss: 0.780054]\n",
"[Epoch 30/200] [Batch 43/59] [D loss: 0.677113] [G loss: 0.752042]\n",
"[Epoch 30/200] [Batch 44/59] [D loss: 0.675138] [G loss: 0.682730]\n",
"[Epoch 30/200] [Batch 45/59] [D loss: 0.642081] [G loss: 0.696727]\n",
"[Epoch 30/200] [Batch 46/59] [D loss: 0.638346] [G loss: 0.843935]\n",
"[Epoch 30/200] [Batch 47/59] [D loss: 0.624730] [G loss: 0.798382]\n",
"[Epoch 30/200] [Batch 48/59] [D loss: 0.626122] [G loss: 0.771657]\n",
"[Epoch 30/200] [Batch 49/59] [D loss: 0.621570] [G loss: 0.806817]\n",
"[Epoch 30/200] [Batch 50/59] [D loss: 0.626970] [G loss: 0.789630]\n",
"[Epoch 30/200] [Batch 51/59] [D loss: 0.622148] [G loss: 0.803996]\n",
"[Epoch 30/200] [Batch 52/59] [D loss: 0.634440] [G loss: 0.753721]\n",
"[Epoch 30/200] [Batch 53/59] [D loss: 0.649163] [G loss: 0.725066]\n",
"[Epoch 30/200] [Batch 54/59] [D loss: 0.680113] [G loss: 0.852885]\n",
"[Epoch 30/200] [Batch 55/59] [D loss: 0.689054] [G loss: 0.721357]\n",
"[Epoch 30/200] [Batch 56/59] [D loss: 0.689145] [G loss: 0.737388]\n",
"[Epoch 30/200] [Batch 57/59] [D loss: 0.678575] [G loss: 0.771121]\n",
"[Epoch 30/200] [Batch 58/59] [D loss: 0.697097] [G loss: 0.711640]\n",
"[Epoch 31/200] [Batch 0/59] [D loss: 0.685415] [G loss: 0.723424]\n",
"[Epoch 31/200] [Batch 1/59] [D loss: 0.671356] [G loss: 0.814313]\n",
"[Epoch 31/200] [Batch 2/59] [D loss: 0.655814] [G loss: 0.814685]\n",
"[Epoch 31/200] [Batch 3/59] [D loss: 0.659423] [G loss: 0.749374]\n",
"[Epoch 31/200] [Batch 4/59] [D loss: 0.636602] [G loss: 0.826593]\n",
"[Epoch 31/200] [Batch 5/59] [D loss: 0.634524] [G loss: 0.851139]\n",
"[Epoch 31/200] [Batch 6/59] [D loss: 0.640791] [G loss: 0.794891]\n",
"[Epoch 31/200] [Batch 7/59] [D loss: 0.620020] [G loss: 0.804804]\n",
"[Epoch 31/200] [Batch 8/59] [D loss: 0.653763] [G loss: 0.750167]\n",
"[Epoch 31/200] [Batch 9/59] [D loss: 0.656584] [G loss: 0.750299]\n",
"[Epoch 31/200] [Batch 10/59] [D loss: 0.659628] [G loss: 0.782237]\n",
"[Epoch 31/200] [Batch 11/59] [D loss: 0.669246] [G loss: 0.734365]\n",
"[Epoch 31/200] [Batch 12/59] [D loss: 0.682511] [G loss: 0.705014]\n",
"[Epoch 31/200] [Batch 13/59] [D loss: 0.669500] [G loss: 0.733263]\n",
"[Epoch 31/200] [Batch 14/59] [D loss: 0.665465] [G loss: 0.784873]\n",
"[Epoch 31/200] [Batch 15/59] [D loss: 0.650627] [G loss: 0.707448]\n",
"[Epoch 31/200] [Batch 16/59] [D loss: 0.663224] [G loss: 0.723787]\n",
"[Epoch 31/200] [Batch 17/59] [D loss: 0.658665] [G loss: 0.803147]\n",
"[Epoch 31/200] [Batch 18/59] [D loss: 0.639799] [G loss: 0.794344]\n",
"[Epoch 31/200] [Batch 19/59] [D loss: 0.633320] [G loss: 0.763647]\n",
"[Epoch 31/200] [Batch 20/59] [D loss: 0.640436] [G loss: 0.770756]\n",
"[Epoch 31/200] [Batch 21/59] [D loss: 0.629383] [G loss: 0.858150]\n",
"[Epoch 31/200] [Batch 22/59] [D loss: 0.623060] [G loss: 0.782695]\n",
"[Epoch 31/200] [Batch 23/59] [D loss: 0.633195] [G loss: 0.751576]\n",
"[Epoch 31/200] [Batch 24/59] [D loss: 0.661914] [G loss: 0.803252]\n",
"[Epoch 31/200] [Batch 25/59] [D loss: 0.679742] [G loss: 0.826512]\n",
"[Epoch 31/200] [Batch 26/59] [D loss: 0.657297] [G loss: 0.775010]\n",
"[Epoch 31/200] [Batch 27/59] [D loss: 0.667529] [G loss: 0.697419]\n",
"[Epoch 31/200] [Batch 28/59] [D loss: 0.683945] [G loss: 0.765400]\n",
"[Epoch 31/200] [Batch 29/59] [D loss: 0.670995] [G loss: 0.793030]\n",
"[Epoch 31/200] [Batch 30/59] [D loss: 0.673501] [G loss: 0.717637]\n",
"[Epoch 31/200] [Batch 31/59] [D loss: 0.673628] [G loss: 0.766606]\n",
"[Epoch 31/200] [Batch 32/59] [D loss: 0.647739] [G loss: 0.800217]\n",
"[Epoch 31/200] [Batch 33/59] [D loss: 0.639343] [G loss: 0.768545]\n",
"[Epoch 31/200] [Batch 34/59] [D loss: 0.634254] [G loss: 0.768384]\n",
"[Epoch 31/200] [Batch 35/59] [D loss: 0.629791] [G loss: 0.831478]\n",
"[Epoch 31/200] [Batch 36/59] [D loss: 0.622891] [G loss: 0.780452]\n",
"[Epoch 31/200] [Batch 37/59] [D loss: 0.641538] [G loss: 0.720079]\n",
"[Epoch 31/200] [Batch 38/59] [D loss: 0.631921] [G loss: 0.837310]\n",
"[Epoch 31/200] [Batch 39/59] [D loss: 0.639255] [G loss: 0.766185]\n",
"[Epoch 31/200] [Batch 40/59] [D loss: 0.664874] [G loss: 0.746502]\n",
"[Epoch 31/200] [Batch 41/59] [D loss: 0.672047] [G loss: 0.744550]\n",
"[Epoch 31/200] [Batch 42/59] [D loss: 0.679796] [G loss: 0.776780]\n",
"[Epoch 31/200] [Batch 43/59] [D loss: 0.678818] [G loss: 0.742828]\n",
"[Epoch 31/200] [Batch 44/59] [D loss: 0.674991] [G loss: 0.754013]\n",
"[Epoch 31/200] [Batch 45/59] [D loss: 0.674734] [G loss: 0.704025]\n",
"[Epoch 31/200] [Batch 46/59] [D loss: 0.653653] [G loss: 0.764096]\n",
"[Epoch 31/200] [Batch 47/59] [D loss: 0.659915] [G loss: 0.799496]\n",
"[Epoch 31/200] [Batch 48/59] [D loss: 0.638492] [G loss: 0.774277]\n",
"[Epoch 31/200] [Batch 49/59] [D loss: 0.631148] [G loss: 0.757985]\n",
"[Epoch 31/200] [Batch 50/59] [D loss: 0.640318] [G loss: 0.737352]\n",
"[Epoch 31/200] [Batch 51/59] [D loss: 0.631244] [G loss: 0.815338]\n",
"[Epoch 31/200] [Batch 52/59] [D loss: 0.645251] [G loss: 0.774154]\n",
"[Epoch 31/200] [Batch 53/59] [D loss: 0.651068] [G loss: 0.757798]\n",
"[Epoch 31/200] [Batch 54/59] [D loss: 0.648045] [G loss: 0.828413]\n",
"[Epoch 31/200] [Batch 55/59] [D loss: 0.638733] [G loss: 0.751031]\n",
"[Epoch 31/200] [Batch 56/59] [D loss: 0.660108] [G loss: 0.698243]\n",
"[Epoch 31/200] [Batch 57/59] [D loss: 0.658211] [G loss: 0.768853]\n",
"[Epoch 31/200] [Batch 58/59] [D loss: 0.661159] [G loss: 0.818439]\n",
"[Epoch 32/200] [Batch 0/59] [D loss: 0.684474] [G loss: 0.683537]\n",
"[Epoch 32/200] [Batch 1/59] [D loss: 0.678689] [G loss: 0.755400]\n",
"[Epoch 32/200] [Batch 2/59] [D loss: 0.659840] [G loss: 0.821391]\n",
"[Epoch 32/200] [Batch 3/59] [D loss: 0.674177] [G loss: 0.761321]\n",
"[Epoch 32/200] [Batch 4/59] [D loss: 0.640106] [G loss: 0.725199]\n",
"[Epoch 32/200] [Batch 5/59] [D loss: 0.657246] [G loss: 0.813550]\n",
"[Epoch 32/200] [Batch 6/59] [D loss: 0.632151] [G loss: 0.837982]\n",
"[Epoch 32/200] [Batch 7/59] [D loss: 0.639282] [G loss: 0.776488]\n",
"[Epoch 32/200] [Batch 8/59] [D loss: 0.646998] [G loss: 0.761039]\n",
"[Epoch 32/200] [Batch 9/59] [D loss: 0.641532] [G loss: 0.791653]\n",
"[Epoch 32/200] [Batch 10/59] [D loss: 0.634187] [G loss: 0.862832]\n",
"[Epoch 32/200] [Batch 11/59] [D loss: 0.660534] [G loss: 0.784221]\n",
"[Epoch 32/200] [Batch 12/59] [D loss: 0.666389] [G loss: 0.688339]\n",
"[Epoch 32/200] [Batch 13/59] [D loss: 0.679949] [G loss: 0.748570]\n",
"[Epoch 32/200] [Batch 14/59] [D loss: 0.673661] [G loss: 0.791203]\n",
"[Epoch 32/200] [Batch 15/59] [D loss: 0.684615] [G loss: 0.692242]\n",
"[Epoch 32/200] [Batch 16/59] [D loss: 0.677391] [G loss: 0.718022]\n",
"[Epoch 32/200] [Batch 17/59] [D loss: 0.676994] [G loss: 0.754561]\n",
"[Epoch 32/200] [Batch 18/59] [D loss: 0.671844] [G loss: 0.725078]\n",
"[Epoch 32/200] [Batch 19/59] [D loss: 0.639549] [G loss: 0.784809]\n",
"[Epoch 32/200] [Batch 20/59] [D loss: 0.643891] [G loss: 0.826812]\n",
"[Epoch 32/200] [Batch 21/59] [D loss: 0.628060] [G loss: 0.817648]\n",
"[Epoch 32/200] [Batch 22/59] [D loss: 0.650674] [G loss: 0.702908]\n",
"[Epoch 32/200] [Batch 23/59] [D loss: 0.627792] [G loss: 0.878922]\n",
"[Epoch 32/200] [Batch 24/59] [D loss: 0.620143] [G loss: 0.805931]\n",
"[Epoch 32/200] [Batch 25/59] [D loss: 0.619079] [G loss: 0.720540]\n",
"[Epoch 32/200] [Batch 26/59] [D loss: 0.626575] [G loss: 0.766783]\n",
"[Epoch 32/200] [Batch 27/59] [D loss: 0.640387] [G loss: 0.838866]\n",
"[Epoch 32/200] [Batch 28/59] [D loss: 0.645053] [G loss: 0.748885]\n",
"[Epoch 32/200] [Batch 29/59] [D loss: 0.658816] [G loss: 0.745893]\n",
"[Epoch 32/200] [Batch 30/59] [D loss: 0.677979] [G loss: 0.750871]\n",
"[Epoch 32/200] [Batch 31/59] [D loss: 0.680265] [G loss: 0.819637]\n",
"[Epoch 32/200] [Batch 32/59] [D loss: 0.680729] [G loss: 0.775834]\n",
"[Epoch 32/200] [Batch 33/59] [D loss: 0.682484] [G loss: 0.671353]\n",
"[Epoch 32/200] [Batch 34/59] [D loss: 0.666544] [G loss: 0.786736]\n",
"[Epoch 32/200] [Batch 35/59] [D loss: 0.672583] [G loss: 0.836240]\n",
"[Epoch 32/200] [Batch 36/59] [D loss: 0.653454] [G loss: 0.799389]\n",
"[Epoch 32/200] [Batch 37/59] [D loss: 0.644268] [G loss: 0.784175]\n",
"[Epoch 32/200] [Batch 38/59] [D loss: 0.625503] [G loss: 0.842600]\n",
"[Epoch 32/200] [Batch 39/59] [D loss: 0.624421] [G loss: 0.853349]\n",
"[Epoch 32/200] [Batch 40/59] [D loss: 0.630441] [G loss: 0.797021]\n",
"[Epoch 32/200] [Batch 41/59] [D loss: 0.629482] [G loss: 0.783148]\n",
"[Epoch 32/200] [Batch 42/59] [D loss: 0.622277] [G loss: 0.853561]\n",
"[Epoch 32/200] [Batch 43/59] [D loss: 0.624641] [G loss: 0.843189]\n",
"[Epoch 32/200] [Batch 44/59] [D loss: 0.623582] [G loss: 0.758814]\n",
"[Epoch 32/200] [Batch 45/59] [D loss: 0.669867] [G loss: 0.767854]\n",
"[Epoch 32/200] [Batch 46/59] [D loss: 0.654528] [G loss: 0.837494]\n",
"[Epoch 32/200] [Batch 47/59] [D loss: 0.669891] [G loss: 0.729573]\n",
"[Epoch 32/200] [Batch 48/59] [D loss: 0.659136] [G loss: 0.701394]\n",
"[Epoch 32/200] [Batch 49/59] [D loss: 0.661879] [G loss: 0.792645]\n",
"[Epoch 32/200] [Batch 50/59] [D loss: 0.650616] [G loss: 0.751919]\n",
"[Epoch 32/200] [Batch 51/59] [D loss: 0.659312] [G loss: 0.728985]\n",
"[Epoch 32/200] [Batch 52/59] [D loss: 0.649763] [G loss: 0.741374]\n",
"[Epoch 32/200] [Batch 53/59] [D loss: 0.661097] [G loss: 0.836168]\n",
"[Epoch 32/200] [Batch 54/59] [D loss: 0.647946] [G loss: 0.733311]\n",
"[Epoch 32/200] [Batch 55/59] [D loss: 0.658789] [G loss: 0.738905]\n",
"[Epoch 32/200] [Batch 56/59] [D loss: 0.654722] [G loss: 0.901498]\n",
"[Epoch 32/200] [Batch 57/59] [D loss: 0.643676] [G loss: 0.784548]\n",
"[Epoch 32/200] [Batch 58/59] [D loss: 0.652633] [G loss: 0.658408]\n",
"[Epoch 33/200] [Batch 0/59] [D loss: 0.669388] [G loss: 0.880010]\n",
"[Epoch 33/200] [Batch 1/59] [D loss: 0.654646] [G loss: 0.772737]\n",
"[Epoch 33/200] [Batch 2/59] [D loss: 0.646779] [G loss: 0.694903]\n",
"[Epoch 33/200] [Batch 3/59] [D loss: 0.655523] [G loss: 0.820058]\n",
"[Epoch 33/200] [Batch 4/59] [D loss: 0.651855] [G loss: 0.784656]\n",
"[Epoch 33/200] [Batch 5/59] [D loss: 0.638805] [G loss: 0.743894]\n",
"[Epoch 33/200] [Batch 6/59] [D loss: 0.640610] [G loss: 0.763412]\n",
"[Epoch 33/200] [Batch 7/59] [D loss: 0.647526] [G loss: 0.786167]\n",
"[Epoch 33/200] [Batch 8/59] [D loss: 0.651521] [G loss: 0.791361]\n",
"[Epoch 33/200] [Batch 9/59] [D loss: 0.652038] [G loss: 0.750297]\n",
"[Epoch 33/200] [Batch 10/59] [D loss: 0.649794] [G loss: 0.775145]\n",
"[Epoch 33/200] [Batch 11/59] [D loss: 0.659806] [G loss: 0.756833]\n",
"[Epoch 33/200] [Batch 12/59] [D loss: 0.655342] [G loss: 0.828124]\n",
"[Epoch 33/200] [Batch 13/59] [D loss: 0.666563] [G loss: 0.772664]\n",
"[Epoch 33/200] [Batch 14/59] [D loss: 0.647060] [G loss: 0.770480]\n",
"[Epoch 33/200] [Batch 15/59] [D loss: 0.658236] [G loss: 0.753692]\n",
"[Epoch 33/200] [Batch 16/59] [D loss: 0.649872] [G loss: 0.756965]\n",
"[Epoch 33/200] [Batch 17/59] [D loss: 0.672260] [G loss: 0.771190]\n",
"[Epoch 33/200] [Batch 18/59] [D loss: 0.654454] [G loss: 0.803496]\n",
"[Epoch 33/200] [Batch 19/59] [D loss: 0.686987] [G loss: 0.769000]\n",
"[Epoch 33/200] [Batch 20/59] [D loss: 0.656880] [G loss: 0.785083]\n",
"[Epoch 33/200] [Batch 21/59] [D loss: 0.681241] [G loss: 0.732956]\n",
"[Epoch 33/200] [Batch 22/59] [D loss: 0.656685] [G loss: 0.787753]\n",
"[Epoch 33/200] [Batch 23/59] [D loss: 0.654277] [G loss: 0.807694]\n",
"[Epoch 33/200] [Batch 24/59] [D loss: 0.645878] [G loss: 0.786698]\n",
"[Epoch 33/200] [Batch 25/59] [D loss: 0.640566] [G loss: 0.792666]\n",
"[Epoch 33/200] [Batch 26/59] [D loss: 0.623707] [G loss: 0.802118]\n",
"[Epoch 33/200] [Batch 27/59] [D loss: 0.621144] [G loss: 0.805316]\n",
"[Epoch 33/200] [Batch 28/59] [D loss: 0.631460] [G loss: 0.835301]\n",
"[Epoch 33/200] [Batch 29/59] [D loss: 0.621338] [G loss: 0.803074]\n",
"[Epoch 33/200] [Batch 30/59] [D loss: 0.632346] [G loss: 0.769694]\n",
"[Epoch 33/200] [Batch 31/59] [D loss: 0.640681] [G loss: 0.755385]\n",
"[Epoch 33/200] [Batch 32/59] [D loss: 0.648499] [G loss: 0.847022]\n",
"[Epoch 33/200] [Batch 33/59] [D loss: 0.657474] [G loss: 0.697615]\n",
"[Epoch 33/200] [Batch 34/59] [D loss: 0.681843] [G loss: 0.785609]\n",
"[Epoch 33/200] [Batch 35/59] [D loss: 0.684331] [G loss: 0.752542]\n",
"[Epoch 33/200] [Batch 36/59] [D loss: 0.668856] [G loss: 0.765069]\n",
"[Epoch 33/200] [Batch 37/59] [D loss: 0.695691] [G loss: 0.733672]\n",
"[Epoch 33/200] [Batch 38/59] [D loss: 0.680588] [G loss: 0.808792]\n",
"[Epoch 33/200] [Batch 39/59] [D loss: 0.670035] [G loss: 0.750229]\n",
"[Epoch 33/200] [Batch 40/59] [D loss: 0.651728] [G loss: 0.796789]\n",
"[Epoch 33/200] [Batch 41/59] [D loss: 0.637462] [G loss: 0.848386]\n",
"[Epoch 33/200] [Batch 42/59] [D loss: 0.630079] [G loss: 0.788353]\n",
"[Epoch 33/200] [Batch 43/59] [D loss: 0.611775] [G loss: 0.792302]\n",
"[Epoch 33/200] [Batch 44/59] [D loss: 0.622234] [G loss: 0.803892]\n",
"[Epoch 33/200] [Batch 45/59] [D loss: 0.631427] [G loss: 0.813378]\n",
"[Epoch 33/200] [Batch 46/59] [D loss: 0.642822] [G loss: 0.782822]\n",
"[Epoch 33/200] [Batch 47/59] [D loss: 0.634674] [G loss: 0.775963]\n",
"[Epoch 33/200] [Batch 48/59] [D loss: 0.667692] [G loss: 0.809739]\n",
"[Epoch 33/200] [Batch 49/59] [D loss: 0.672466] [G loss: 0.769952]\n",
"[Epoch 33/200] [Batch 50/59] [D loss: 0.691877] [G loss: 0.692512]\n",
"[Epoch 33/200] [Batch 51/59] [D loss: 0.665623] [G loss: 0.742266]\n",
"[Epoch 33/200] [Batch 52/59] [D loss: 0.692179] [G loss: 0.803489]\n",
"[Epoch 33/200] [Batch 53/59] [D loss: 0.678742] [G loss: 0.806746]\n",
"[Epoch 33/200] [Batch 54/59] [D loss: 0.664922] [G loss: 0.689846]\n",
"[Epoch 33/200] [Batch 55/59] [D loss: 0.680403] [G loss: 0.841543]\n",
"[Epoch 33/200] [Batch 56/59] [D loss: 0.653277] [G loss: 0.778136]\n",
"[Epoch 33/200] [Batch 57/59] [D loss: 0.649024] [G loss: 0.766127]\n",
"[Epoch 33/200] [Batch 58/59] [D loss: 0.642560] [G loss: 0.852029]\n",
"[Epoch 34/200] [Batch 0/59] [D loss: 0.636180] [G loss: 0.801913]\n",
"[Epoch 34/200] [Batch 1/59] [D loss: 0.628369] [G loss: 0.794604]\n",
"[Epoch 34/200] [Batch 2/59] [D loss: 0.646614] [G loss: 0.789243]\n",
"[Epoch 34/200] [Batch 3/59] [D loss: 0.655642] [G loss: 0.867730]\n",
"[Epoch 34/200] [Batch 4/59] [D loss: 0.625738] [G loss: 0.836918]\n",
"[Epoch 34/200] [Batch 5/59] [D loss: 0.644128] [G loss: 0.737232]\n",
"[Epoch 34/200] [Batch 6/59] [D loss: 0.650500] [G loss: 0.812745]\n",
"[Epoch 34/200] [Batch 7/59] [D loss: 0.651998] [G loss: 0.804181]\n",
"[Epoch 34/200] [Batch 8/59] [D loss: 0.644354] [G loss: 0.766598]\n",
"[Epoch 34/200] [Batch 9/59] [D loss: 0.662671] [G loss: 0.684938]\n",
"[Epoch 34/200] [Batch 10/59] [D loss: 0.654316] [G loss: 0.876027]\n",
"[Epoch 34/200] [Batch 11/59] [D loss: 0.665345] [G loss: 0.789666]\n",
"[Epoch 34/200] [Batch 12/59] [D loss: 0.667343] [G loss: 0.754158]\n",
"[Epoch 34/200] [Batch 13/59] [D loss: 0.647604] [G loss: 0.771137]\n",
"[Epoch 34/200] [Batch 14/59] [D loss: 0.662106] [G loss: 0.879623]\n",
"[Epoch 34/200] [Batch 15/59] [D loss: 0.629471] [G loss: 0.746970]\n",
"[Epoch 34/200] [Batch 16/59] [D loss: 0.633155] [G loss: 0.744072]\n",
"[Epoch 34/200] [Batch 17/59] [D loss: 0.650679] [G loss: 0.887518]\n",
"[Epoch 34/200] [Batch 18/59] [D loss: 0.651963] [G loss: 0.822613]\n",
"[Epoch 34/200] [Batch 19/59] [D loss: 0.649235] [G loss: 0.656655]\n",
"[Epoch 34/200] [Batch 20/59] [D loss: 0.648792] [G loss: 0.756263]\n",
"[Epoch 34/200] [Batch 21/59] [D loss: 0.652554] [G loss: 0.819580]\n",
"[Epoch 34/200] [Batch 22/59] [D loss: 0.648317] [G loss: 0.730393]\n",
"[Epoch 34/200] [Batch 23/59] [D loss: 0.655370] [G loss: 0.688481]\n",
"[Epoch 34/200] [Batch 24/59] [D loss: 0.640397] [G loss: 0.834856]\n",
"[Epoch 34/200] [Batch 25/59] [D loss: 0.644774] [G loss: 0.775910]\n",
"[Epoch 34/200] [Batch 26/59] [D loss: 0.663013] [G loss: 0.759773]\n",
"[Epoch 34/200] [Batch 27/59] [D loss: 0.648821] [G loss: 0.738145]\n",
"[Epoch 34/200] [Batch 28/59] [D loss: 0.644046] [G loss: 0.773905]\n",
"[Epoch 34/200] [Batch 29/59] [D loss: 0.635754] [G loss: 0.834165]\n",
"[Epoch 34/200] [Batch 30/59] [D loss: 0.655362] [G loss: 0.782214]\n",
"[Epoch 34/200] [Batch 31/59] [D loss: 0.652931] [G loss: 0.715861]\n",
"[Epoch 34/200] [Batch 32/59] [D loss: 0.668024] [G loss: 0.791957]\n",
"[Epoch 34/200] [Batch 33/59] [D loss: 0.674775] [G loss: 0.803009]\n",
"[Epoch 34/200] [Batch 34/59] [D loss: 0.642631] [G loss: 0.758019]\n",
"[Epoch 34/200] [Batch 35/59] [D loss: 0.636176] [G loss: 0.754024]\n",
"[Epoch 34/200] [Batch 36/59] [D loss: 0.661220] [G loss: 0.850866]\n",
"[Epoch 34/200] [Batch 37/59] [D loss: 0.663172] [G loss: 0.763880]\n",
"[Epoch 34/200] [Batch 38/59] [D loss: 0.667004] [G loss: 0.694408]\n",
"[Epoch 34/200] [Batch 39/59] [D loss: 0.676603] [G loss: 0.810592]\n",
"[Epoch 34/200] [Batch 40/59] [D loss: 0.659401] [G loss: 0.799698]\n",
"[Epoch 34/200] [Batch 41/59] [D loss: 0.655362] [G loss: 0.721504]\n",
"[Epoch 34/200] [Batch 42/59] [D loss: 0.644011] [G loss: 0.784145]\n",
"[Epoch 34/200] [Batch 43/59] [D loss: 0.651504] [G loss: 0.845045]\n",
"[Epoch 34/200] [Batch 44/59] [D loss: 0.646150] [G loss: 0.846003]\n",
"[Epoch 34/200] [Batch 45/59] [D loss: 0.633228] [G loss: 0.717407]\n",
"[Epoch 34/200] [Batch 46/59] [D loss: 0.628699] [G loss: 0.841301]\n",
"[Epoch 34/200] [Batch 47/59] [D loss: 0.614970] [G loss: 0.825426]\n",
"[Epoch 34/200] [Batch 48/59] [D loss: 0.602687] [G loss: 0.777349]\n",
"[Epoch 34/200] [Batch 49/59] [D loss: 0.596432] [G loss: 0.820244]\n",
"[Epoch 34/200] [Batch 50/59] [D loss: 0.621813] [G loss: 0.815968]\n",
"[Epoch 34/200] [Batch 51/59] [D loss: 0.621847] [G loss: 0.748972]\n",
"[Epoch 34/200] [Batch 52/59] [D loss: 0.643405] [G loss: 0.825675]\n",
"[Epoch 34/200] [Batch 53/59] [D loss: 0.666904] [G loss: 0.755703]\n",
"[Epoch 34/200] [Batch 54/59] [D loss: 0.670452] [G loss: 0.762553]\n",
"[Epoch 34/200] [Batch 55/59] [D loss: 0.668510] [G loss: 0.727844]\n",
"[Epoch 34/200] [Batch 56/59] [D loss: 0.696064] [G loss: 0.783874]\n",
"[Epoch 34/200] [Batch 57/59] [D loss: 0.666865] [G loss: 0.705598]\n",
"[Epoch 34/200] [Batch 58/59] [D loss: 0.681372] [G loss: 0.760516]\n",
"[Epoch 35/200] [Batch 0/59] [D loss: 0.655488] [G loss: 0.782755]\n",
"[Epoch 35/200] [Batch 1/59] [D loss: 0.662997] [G loss: 0.772883]\n",
"[Epoch 35/200] [Batch 2/59] [D loss: 0.644874] [G loss: 0.737064]\n",
"[Epoch 35/200] [Batch 3/59] [D loss: 0.636469] [G loss: 0.839385]\n",
"[Epoch 35/200] [Batch 4/59] [D loss: 0.616803] [G loss: 0.716753]\n",
"[Epoch 35/200] [Batch 5/59] [D loss: 0.633329] [G loss: 0.715989]\n",
"[Epoch 35/200] [Batch 6/59] [D loss: 0.625807] [G loss: 0.853119]\n",
"[Epoch 35/200] [Batch 7/59] [D loss: 0.652256] [G loss: 0.841136]\n",
"[Epoch 35/200] [Batch 8/59] [D loss: 0.636482] [G loss: 0.729522]\n",
"[Epoch 35/200] [Batch 9/59] [D loss: 0.642413] [G loss: 0.872112]\n",
"[Epoch 35/200] [Batch 10/59] [D loss: 0.637974] [G loss: 0.792149]\n",
"[Epoch 35/200] [Batch 11/59] [D loss: 0.662384] [G loss: 0.721904]\n",
"[Epoch 35/200] [Batch 12/59] [D loss: 0.654286] [G loss: 0.786044]\n",
"[Epoch 35/200] [Batch 13/59] [D loss: 0.669859] [G loss: 0.772443]\n",
"[Epoch 35/200] [Batch 14/59] [D loss: 0.645150] [G loss: 0.772233]\n",
"[Epoch 35/200] [Batch 15/59] [D loss: 0.669474] [G loss: 0.813939]\n",
"[Epoch 35/200] [Batch 16/59] [D loss: 0.650063] [G loss: 0.748971]\n",
"[Epoch 35/200] [Batch 17/59] [D loss: 0.657658] [G loss: 0.802807]\n",
"[Epoch 35/200] [Batch 18/59] [D loss: 0.645818] [G loss: 0.811106]\n",
"[Epoch 35/200] [Batch 19/59] [D loss: 0.654234] [G loss: 0.831305]\n",
"[Epoch 35/200] [Batch 20/59] [D loss: 0.627621] [G loss: 0.731654]\n",
"[Epoch 35/200] [Batch 21/59] [D loss: 0.653200] [G loss: 0.747776]\n",
"[Epoch 35/200] [Batch 22/59] [D loss: 0.641049] [G loss: 0.812690]\n",
"[Epoch 35/200] [Batch 23/59] [D loss: 0.645578] [G loss: 0.770689]\n",
"[Epoch 35/200] [Batch 24/59] [D loss: 0.647114] [G loss: 0.804095]\n",
"[Epoch 35/200] [Batch 25/59] [D loss: 0.671050] [G loss: 0.703795]\n",
"[Epoch 35/200] [Batch 26/59] [D loss: 0.660774] [G loss: 0.769235]\n",
"[Epoch 35/200] [Batch 27/59] [D loss: 0.664049] [G loss: 0.808354]\n",
"[Epoch 35/200] [Batch 28/59] [D loss: 0.663566] [G loss: 0.693006]\n",
"[Epoch 35/200] [Batch 29/59] [D loss: 0.655567] [G loss: 0.757025]\n",
"[Epoch 35/200] [Batch 30/59] [D loss: 0.674276] [G loss: 0.885728]\n",
"[Epoch 35/200] [Batch 31/59] [D loss: 0.612082] [G loss: 0.781491]\n",
"[Epoch 35/200] [Batch 32/59] [D loss: 0.626604] [G loss: 0.747682]\n",
"[Epoch 35/200] [Batch 33/59] [D loss: 0.621547] [G loss: 0.837236]\n",
"[Epoch 35/200] [Batch 34/59] [D loss: 0.585728] [G loss: 0.945989]\n",
"[Epoch 35/200] [Batch 35/59] [D loss: 0.635645] [G loss: 0.773723]\n",
"[Epoch 35/200] [Batch 36/59] [D loss: 0.632153] [G loss: 0.727171]\n",
"[Epoch 35/200] [Batch 37/59] [D loss: 0.651300] [G loss: 0.868704]\n",
"[Epoch 35/200] [Batch 38/59] [D loss: 0.665713] [G loss: 0.742195]\n",
"[Epoch 35/200] [Batch 39/59] [D loss: 0.652330] [G loss: 0.726846]\n",
"[Epoch 35/200] [Batch 40/59] [D loss: 0.630430] [G loss: 0.783819]\n",
"[Epoch 35/200] [Batch 41/59] [D loss: 0.643231] [G loss: 0.718038]\n",
"[Epoch 35/200] [Batch 42/59] [D loss: 0.667179] [G loss: 0.754600]\n",
"[Epoch 35/200] [Batch 43/59] [D loss: 0.669775] [G loss: 0.742469]\n",
"[Epoch 35/200] [Batch 44/59] [D loss: 0.670103] [G loss: 0.723582]\n",
"[Epoch 35/200] [Batch 45/59] [D loss: 0.677529] [G loss: 0.773220]\n",
"[Epoch 35/200] [Batch 46/59] [D loss: 0.663910] [G loss: 0.821357]\n",
"[Epoch 35/200] [Batch 47/59] [D loss: 0.661431] [G loss: 0.714569]\n",
"[Epoch 35/200] [Batch 48/59] [D loss: 0.637124] [G loss: 0.788175]\n",
"[Epoch 35/200] [Batch 49/59] [D loss: 0.634405] [G loss: 0.804944]\n",
"[Epoch 35/200] [Batch 50/59] [D loss: 0.631307] [G loss: 0.756991]\n",
"[Epoch 35/200] [Batch 51/59] [D loss: 0.627090] [G loss: 0.833790]\n",
"[Epoch 35/200] [Batch 52/59] [D loss: 0.614188] [G loss: 0.839842]\n",
"[Epoch 35/200] [Batch 53/59] [D loss: 0.572668] [G loss: 0.828371]\n",
"[Epoch 35/200] [Batch 54/59] [D loss: 0.611729] [G loss: 0.860632]\n",
"[Epoch 35/200] [Batch 55/59] [D loss: 0.624460] [G loss: 0.869163]\n",
"[Epoch 35/200] [Batch 56/59] [D loss: 0.624998] [G loss: 0.738018]\n",
"[Epoch 35/200] [Batch 57/59] [D loss: 0.645716] [G loss: 0.712848]\n",
"[Epoch 35/200] [Batch 58/59] [D loss: 0.691877] [G loss: 0.788089]\n",
"[Epoch 36/200] [Batch 0/59] [D loss: 0.676771] [G loss: 0.839526]\n",
"[Epoch 36/200] [Batch 1/59] [D loss: 0.673689] [G loss: 0.654849]\n",
"[Epoch 36/200] [Batch 2/59] [D loss: 0.717885] [G loss: 0.790000]\n",
"[Epoch 36/200] [Batch 3/59] [D loss: 0.670022] [G loss: 0.740757]\n",
"[Epoch 36/200] [Batch 4/59] [D loss: 0.664027] [G loss: 0.728242]\n",
"[Epoch 36/200] [Batch 5/59] [D loss: 0.676552] [G loss: 0.795421]\n",
"[Epoch 36/200] [Batch 6/59] [D loss: 0.674525] [G loss: 0.790200]\n",
"[Epoch 36/200] [Batch 7/59] [D loss: 0.625364] [G loss: 0.782866]\n",
"[Epoch 36/200] [Batch 8/59] [D loss: 0.627114] [G loss: 0.800511]\n",
"[Epoch 36/200] [Batch 9/59] [D loss: 0.626492] [G loss: 0.919651]\n",
"[Epoch 36/200] [Batch 10/59] [D loss: 0.618562] [G loss: 0.816421]\n",
"[Epoch 36/200] [Batch 11/59] [D loss: 0.601381] [G loss: 0.717461]\n",
"[Epoch 36/200] [Batch 12/59] [D loss: 0.642305] [G loss: 0.838110]\n",
"[Epoch 36/200] [Batch 13/59] [D loss: 0.618517] [G loss: 0.860122]\n",
"[Epoch 36/200] [Batch 14/59] [D loss: 0.640022] [G loss: 0.739309]\n",
"[Epoch 36/200] [Batch 15/59] [D loss: 0.675553] [G loss: 0.779189]\n",
"[Epoch 36/200] [Batch 16/59] [D loss: 0.663901] [G loss: 0.764890]\n",
"[Epoch 36/200] [Batch 17/59] [D loss: 0.705110] [G loss: 0.766522]\n",
"[Epoch 36/200] [Batch 18/59] [D loss: 0.670698] [G loss: 0.739125]\n",
"[Epoch 36/200] [Batch 19/59] [D loss: 0.687915] [G loss: 0.730606]\n",
"[Epoch 36/200] [Batch 20/59] [D loss: 0.667570] [G loss: 0.755025]\n",
"[Epoch 36/200] [Batch 21/59] [D loss: 0.682387] [G loss: 0.799531]\n",
"[Epoch 36/200] [Batch 22/59] [D loss: 0.649359] [G loss: 0.704237]\n",
"[Epoch 36/200] [Batch 23/59] [D loss: 0.642802] [G loss: 0.830796]\n",
"[Epoch 36/200] [Batch 24/59] [D loss: 0.655140] [G loss: 0.848497]\n",
"[Epoch 36/200] [Batch 25/59] [D loss: 0.627133] [G loss: 0.794533]\n",
"[Epoch 36/200] [Batch 26/59] [D loss: 0.634936] [G loss: 0.730738]\n",
"[Epoch 36/200] [Batch 27/59] [D loss: 0.606699] [G loss: 0.871122]\n",
"[Epoch 36/200] [Batch 28/59] [D loss: 0.622831] [G loss: 0.820673]\n",
"[Epoch 36/200] [Batch 29/59] [D loss: 0.636677] [G loss: 0.753072]\n",
"[Epoch 36/200] [Batch 30/59] [D loss: 0.637483] [G loss: 0.801149]\n",
"[Epoch 36/200] [Batch 31/59] [D loss: 0.638987] [G loss: 0.804499]\n",
"[Epoch 36/200] [Batch 32/59] [D loss: 0.650778] [G loss: 0.802835]\n",
"[Epoch 36/200] [Batch 33/59] [D loss: 0.647424] [G loss: 0.797181]\n",
"[Epoch 36/200] [Batch 34/59] [D loss: 0.668489] [G loss: 0.679544]\n",
"[Epoch 36/200] [Batch 35/59] [D loss: 0.662544] [G loss: 0.797093]\n",
"[Epoch 36/200] [Batch 36/59] [D loss: 0.655206] [G loss: 0.786138]\n",
"[Epoch 36/200] [Batch 37/59] [D loss: 0.654463] [G loss: 0.723391]\n",
"[Epoch 36/200] [Batch 38/59] [D loss: 0.644290] [G loss: 0.826027]\n",
"[Epoch 36/200] [Batch 39/59] [D loss: 0.650783] [G loss: 0.789845]\n",
"[Epoch 36/200] [Batch 40/59] [D loss: 0.673010] [G loss: 0.682253]\n",
"[Epoch 36/200] [Batch 41/59] [D loss: 0.662326] [G loss: 0.781808]\n",
"[Epoch 36/200] [Batch 42/59] [D loss: 0.634687] [G loss: 0.880848]\n",
"[Epoch 36/200] [Batch 43/59] [D loss: 0.622965] [G loss: 0.808287]\n",
"[Epoch 36/200] [Batch 44/59] [D loss: 0.643010] [G loss: 0.702149]\n",
"[Epoch 36/200] [Batch 45/59] [D loss: 0.638521] [G loss: 0.807580]\n",
"[Epoch 36/200] [Batch 46/59] [D loss: 0.632717] [G loss: 0.795879]\n",
"[Epoch 36/200] [Batch 47/59] [D loss: 0.639801] [G loss: 0.783421]\n",
"[Epoch 36/200] [Batch 48/59] [D loss: 0.664258] [G loss: 0.771726]\n",
"[Epoch 36/200] [Batch 49/59] [D loss: 0.671206] [G loss: 0.770661]\n",
"[Epoch 36/200] [Batch 50/59] [D loss: 0.633877] [G loss: 0.809448]\n",
"[Epoch 36/200] [Batch 51/59] [D loss: 0.652983] [G loss: 0.752999]\n",
"[Epoch 36/200] [Batch 52/59] [D loss: 0.664699] [G loss: 0.776110]\n",
"[Epoch 36/200] [Batch 53/59] [D loss: 0.655776] [G loss: 0.798923]\n",
"[Epoch 36/200] [Batch 54/59] [D loss: 0.667631] [G loss: 0.755711]\n",
"[Epoch 36/200] [Batch 55/59] [D loss: 0.657491] [G loss: 0.771306]\n",
"[Epoch 36/200] [Batch 56/59] [D loss: 0.644096] [G loss: 0.927608]\n",
"[Epoch 36/200] [Batch 57/59] [D loss: 0.640256] [G loss: 0.738529]\n",
"[Epoch 36/200] [Batch 58/59] [D loss: 0.635965] [G loss: 0.710504]\n",
"[Epoch 37/200] [Batch 0/59] [D loss: 0.633351] [G loss: 0.950669]\n",
"[Epoch 37/200] [Batch 1/59] [D loss: 0.626070] [G loss: 0.820879]\n",
"[Epoch 37/200] [Batch 2/59] [D loss: 0.653596] [G loss: 0.722615]\n",
"[Epoch 37/200] [Batch 3/59] [D loss: 0.639848] [G loss: 0.830385]\n",
"[Epoch 37/200] [Batch 4/59] [D loss: 0.636359] [G loss: 0.776929]\n",
"[Epoch 37/200] [Batch 5/59] [D loss: 0.646256] [G loss: 0.745400]\n",
"[Epoch 37/200] [Batch 6/59] [D loss: 0.642459] [G loss: 0.783834]\n",
"[Epoch 37/200] [Batch 7/59] [D loss: 0.647081] [G loss: 0.796955]\n",
"[Epoch 37/200] [Batch 8/59] [D loss: 0.681369] [G loss: 0.724388]\n",
"[Epoch 37/200] [Batch 9/59] [D loss: 0.677273] [G loss: 0.738444]\n",
"[Epoch 37/200] [Batch 10/59] [D loss: 0.680592] [G loss: 0.826157]\n",
"[Epoch 37/200] [Batch 11/59] [D loss: 0.662335] [G loss: 0.711368]\n",
"[Epoch 37/200] [Batch 12/59] [D loss: 0.657612] [G loss: 0.751434]\n",
"[Epoch 37/200] [Batch 13/59] [D loss: 0.623262] [G loss: 0.822955]\n",
"[Epoch 37/200] [Batch 14/59] [D loss: 0.620948] [G loss: 0.821082]\n",
"[Epoch 37/200] [Batch 15/59] [D loss: 0.629881] [G loss: 0.760788]\n",
"[Epoch 37/200] [Batch 16/59] [D loss: 0.622056] [G loss: 0.771189]\n",
"[Epoch 37/200] [Batch 17/59] [D loss: 0.628629] [G loss: 0.799334]\n",
"[Epoch 37/200] [Batch 18/59] [D loss: 0.627012] [G loss: 0.816984]\n",
"[Epoch 37/200] [Batch 19/59] [D loss: 0.607225] [G loss: 0.833929]\n",
"[Epoch 37/200] [Batch 20/59] [D loss: 0.621395] [G loss: 0.806314]\n",
"[Epoch 37/200] [Batch 21/59] [D loss: 0.644307] [G loss: 0.729765]\n",
"[Epoch 37/200] [Batch 22/59] [D loss: 0.653863] [G loss: 0.825755]\n",
"[Epoch 37/200] [Batch 23/59] [D loss: 0.666766] [G loss: 0.766921]\n",
"[Epoch 37/200] [Batch 24/59] [D loss: 0.684354] [G loss: 0.686348]\n",
"[Epoch 37/200] [Batch 25/59] [D loss: 0.703124] [G loss: 0.872811]\n",
"[Epoch 37/200] [Batch 26/59] [D loss: 0.658517] [G loss: 0.825275]\n",
"[Epoch 37/200] [Batch 27/59] [D loss: 0.652730] [G loss: 0.679761]\n",
"[Epoch 37/200] [Batch 28/59] [D loss: 0.661267] [G loss: 0.793528]\n",
"[Epoch 37/200] [Batch 29/59] [D loss: 0.639152] [G loss: 0.764877]\n",
"[Epoch 37/200] [Batch 30/59] [D loss: 0.644363] [G loss: 0.806161]\n",
"[Epoch 37/200] [Batch 31/59] [D loss: 0.648273] [G loss: 0.773333]\n",
"[Epoch 37/200] [Batch 32/59] [D loss: 0.620293] [G loss: 0.764389]\n",
"[Epoch 37/200] [Batch 33/59] [D loss: 0.642748] [G loss: 0.811424]\n",
"[Epoch 37/200] [Batch 34/59] [D loss: 0.637406] [G loss: 0.791899]\n",
"[Epoch 37/200] [Batch 35/59] [D loss: 0.625183] [G loss: 0.890339]\n",
"[Epoch 37/200] [Batch 36/59] [D loss: 0.642264] [G loss: 0.766123]\n",
"[Epoch 37/200] [Batch 37/59] [D loss: 0.636223] [G loss: 0.817444]\n",
"[Epoch 37/200] [Batch 38/59] [D loss: 0.665228] [G loss: 0.851164]\n",
"[Epoch 37/200] [Batch 39/59] [D loss: 0.642591] [G loss: 0.844396]\n",
"[Epoch 37/200] [Batch 40/59] [D loss: 0.684543] [G loss: 0.642930]\n",
"[Epoch 37/200] [Batch 41/59] [D loss: 0.646256] [G loss: 0.785781]\n",
"[Epoch 37/200] [Batch 42/59] [D loss: 0.665543] [G loss: 0.789403]\n",
"[Epoch 37/200] [Batch 43/59] [D loss: 0.680316] [G loss: 0.631745]\n",
"[Epoch 37/200] [Batch 44/59] [D loss: 0.675016] [G loss: 0.788063]\n",
"[Epoch 37/200] [Batch 45/59] [D loss: 0.643851] [G loss: 0.853826]\n",
"[Epoch 37/200] [Batch 46/59] [D loss: 0.655719] [G loss: 0.722521]\n",
"[Epoch 37/200] [Batch 47/59] [D loss: 0.630729] [G loss: 0.848656]\n",
"[Epoch 37/200] [Batch 48/59] [D loss: 0.634953] [G loss: 0.810498]\n",
"[Epoch 37/200] [Batch 49/59] [D loss: 0.619284] [G loss: 0.794998]\n",
"[Epoch 37/200] [Batch 50/59] [D loss: 0.624948] [G loss: 0.765448]\n",
"[Epoch 37/200] [Batch 51/59] [D loss: 0.620254] [G loss: 0.809770]\n",
"[Epoch 37/200] [Batch 52/59] [D loss: 0.650337] [G loss: 0.794238]\n",
"[Epoch 37/200] [Batch 53/59] [D loss: 0.642211] [G loss: 0.754615]\n",
"[Epoch 37/200] [Batch 54/59] [D loss: 0.640782] [G loss: 0.806093]\n",
"[Epoch 37/200] [Batch 55/59] [D loss: 0.656546] [G loss: 0.780941]\n",
"[Epoch 37/200] [Batch 56/59] [D loss: 0.647980] [G loss: 0.777265]\n",
"[Epoch 37/200] [Batch 57/59] [D loss: 0.643349] [G loss: 0.837986]\n",
"[Epoch 37/200] [Batch 58/59] [D loss: 0.653798] [G loss: 0.747349]\n",
"[Epoch 38/200] [Batch 0/59] [D loss: 0.648396] [G loss: 0.735675]\n",
"[Epoch 38/200] [Batch 1/59] [D loss: 0.650851] [G loss: 0.854828]\n",
"[Epoch 38/200] [Batch 2/59] [D loss: 0.645174] [G loss: 0.840469]\n",
"[Epoch 38/200] [Batch 3/59] [D loss: 0.651694] [G loss: 0.722484]\n",
"[Epoch 38/200] [Batch 4/59] [D loss: 0.640949] [G loss: 0.762666]\n",
"[Epoch 38/200] [Batch 5/59] [D loss: 0.649594] [G loss: 0.842075]\n",
"[Epoch 38/200] [Batch 6/59] [D loss: 0.631162] [G loss: 0.799779]\n",
"[Epoch 38/200] [Batch 7/59] [D loss: 0.633052] [G loss: 0.741335]\n",
"[Epoch 38/200] [Batch 8/59] [D loss: 0.644576] [G loss: 0.896336]\n",
"[Epoch 38/200] [Batch 9/59] [D loss: 0.640026] [G loss: 0.842175]\n",
"[Epoch 38/200] [Batch 10/59] [D loss: 0.628149] [G loss: 0.803519]\n",
"[Epoch 38/200] [Batch 11/59] [D loss: 0.631719] [G loss: 0.767367]\n",
"[Epoch 38/200] [Batch 12/59] [D loss: 0.632358] [G loss: 0.804953]\n",
"[Epoch 38/200] [Batch 13/59] [D loss: 0.637555] [G loss: 0.915032]\n",
"[Epoch 38/200] [Batch 14/59] [D loss: 0.644906] [G loss: 0.726359]\n",
"[Epoch 38/200] [Batch 15/59] [D loss: 0.634302] [G loss: 0.811061]\n",
"[Epoch 38/200] [Batch 16/59] [D loss: 0.653227] [G loss: 0.787022]\n",
"[Epoch 38/200] [Batch 17/59] [D loss: 0.631643] [G loss: 0.812627]\n",
"[Epoch 38/200] [Batch 18/59] [D loss: 0.642072] [G loss: 0.789631]\n",
"[Epoch 38/200] [Batch 19/59] [D loss: 0.643207] [G loss: 0.726159]\n",
"[Epoch 38/200] [Batch 20/59] [D loss: 0.659944] [G loss: 0.788427]\n",
"[Epoch 38/200] [Batch 21/59] [D loss: 0.643977] [G loss: 0.834991]\n",
"[Epoch 38/200] [Batch 22/59] [D loss: 0.639584] [G loss: 0.800551]\n",
"[Epoch 38/200] [Batch 23/59] [D loss: 0.631515] [G loss: 0.750512]\n",
"[Epoch 38/200] [Batch 24/59] [D loss: 0.647812] [G loss: 0.810976]\n",
"[Epoch 38/200] [Batch 25/59] [D loss: 0.628190] [G loss: 0.867364]\n",
"[Epoch 38/200] [Batch 26/59] [D loss: 0.643193] [G loss: 0.677250]\n",
"[Epoch 38/200] [Batch 27/59] [D loss: 0.639985] [G loss: 0.815157]\n",
"[Epoch 38/200] [Batch 28/59] [D loss: 0.608758] [G loss: 0.812284]\n",
"[Epoch 38/200] [Batch 29/59] [D loss: 0.638288] [G loss: 0.696268]\n",
"[Epoch 38/200] [Batch 30/59] [D loss: 0.655561] [G loss: 0.814393]\n",
"[Epoch 38/200] [Batch 31/59] [D loss: 0.631806] [G loss: 0.902183]\n",
"[Epoch 38/200] [Batch 32/59] [D loss: 0.628752] [G loss: 0.743314]\n",
"[Epoch 38/200] [Batch 33/59] [D loss: 0.655253] [G loss: 0.787812]\n",
"[Epoch 38/200] [Batch 34/59] [D loss: 0.626932] [G loss: 0.819237]\n",
"[Epoch 38/200] [Batch 35/59] [D loss: 0.635171] [G loss: 0.816631]\n",
"[Epoch 38/200] [Batch 36/59] [D loss: 0.675753] [G loss: 0.770492]\n",
"[Epoch 38/200] [Batch 37/59] [D loss: 0.653551] [G loss: 0.828919]\n",
"[Epoch 38/200] [Batch 38/59] [D loss: 0.661790] [G loss: 0.736784]\n",
"[Epoch 38/200] [Batch 39/59] [D loss: 0.665193] [G loss: 0.818726]\n",
"[Epoch 38/200] [Batch 40/59] [D loss: 0.660596] [G loss: 0.791044]\n",
"[Epoch 38/200] [Batch 41/59] [D loss: 0.663986] [G loss: 0.732557]\n",
"[Epoch 38/200] [Batch 42/59] [D loss: 0.670466] [G loss: 0.736535]\n",
"[Epoch 38/200] [Batch 43/59] [D loss: 0.662227] [G loss: 0.781466]\n",
"[Epoch 38/200] [Batch 44/59] [D loss: 0.669485] [G loss: 0.860485]\n",
"[Epoch 38/200] [Batch 45/59] [D loss: 0.634588] [G loss: 0.727843]\n",
"[Epoch 38/200] [Batch 46/59] [D loss: 0.644181] [G loss: 0.754747]\n",
"[Epoch 38/200] [Batch 47/59] [D loss: 0.632727] [G loss: 0.883294]\n",
"[Epoch 38/200] [Batch 48/59] [D loss: 0.632113] [G loss: 0.866535]\n",
"[Epoch 38/200] [Batch 49/59] [D loss: 0.628307] [G loss: 0.702810]\n",
"[Epoch 38/200] [Batch 50/59] [D loss: 0.630634] [G loss: 0.788599]\n",
"[Epoch 38/200] [Batch 51/59] [D loss: 0.648237] [G loss: 0.889766]\n",
"[Epoch 38/200] [Batch 52/59] [D loss: 0.609736] [G loss: 0.855111]\n",
"[Epoch 38/200] [Batch 53/59] [D loss: 0.629360] [G loss: 0.726739]\n",
"[Epoch 38/200] [Batch 54/59] [D loss: 0.622438] [G loss: 0.784043]\n",
"[Epoch 38/200] [Batch 55/59] [D loss: 0.647539] [G loss: 0.820479]\n",
"[Epoch 38/200] [Batch 56/59] [D loss: 0.642333] [G loss: 0.745143]\n",
"[Epoch 38/200] [Batch 57/59] [D loss: 0.638490] [G loss: 0.826775]\n",
"[Epoch 38/200] [Batch 58/59] [D loss: 0.656230] [G loss: 0.810978]\n",
"[Epoch 39/200] [Batch 0/59] [D loss: 0.648483] [G loss: 0.734929]\n",
"[Epoch 39/200] [Batch 1/59] [D loss: 0.656573] [G loss: 0.765942]\n",
"[Epoch 39/200] [Batch 2/59] [D loss: 0.667760] [G loss: 0.770372]\n",
"[Epoch 39/200] [Batch 3/59] [D loss: 0.679574] [G loss: 0.826298]\n",
"[Epoch 39/200] [Batch 4/59] [D loss: 0.668889] [G loss: 0.798504]\n",
"[Epoch 39/200] [Batch 5/59] [D loss: 0.647665] [G loss: 0.718337]\n",
"[Epoch 39/200] [Batch 6/59] [D loss: 0.649953] [G loss: 0.845589]\n",
"[Epoch 39/200] [Batch 7/59] [D loss: 0.641760] [G loss: 0.831010]\n",
"[Epoch 39/200] [Batch 8/59] [D loss: 0.630201] [G loss: 0.735266]\n",
"[Epoch 39/200] [Batch 9/59] [D loss: 0.624628] [G loss: 0.821373]\n",
"[Epoch 39/200] [Batch 10/59] [D loss: 0.626257] [G loss: 0.850457]\n",
"[Epoch 39/200] [Batch 11/59] [D loss: 0.616903] [G loss: 0.787793]\n",
"[Epoch 39/200] [Batch 12/59] [D loss: 0.630840] [G loss: 0.747360]\n",
"[Epoch 39/200] [Batch 13/59] [D loss: 0.626017] [G loss: 0.808689]\n",
"[Epoch 39/200] [Batch 14/59] [D loss: 0.647205] [G loss: 0.845644]\n",
"[Epoch 39/200] [Batch 15/59] [D loss: 0.639479] [G loss: 0.739769]\n",
"[Epoch 39/200] [Batch 16/59] [D loss: 0.645428] [G loss: 0.818122]\n",
"[Epoch 39/200] [Batch 17/59] [D loss: 0.664574] [G loss: 0.822414]\n",
"[Epoch 39/200] [Batch 18/59] [D loss: 0.647071] [G loss: 0.780687]\n",
"[Epoch 39/200] [Batch 19/59] [D loss: 0.664157] [G loss: 0.714357]\n",
"[Epoch 39/200] [Batch 20/59] [D loss: 0.697372] [G loss: 0.844368]\n",
"[Epoch 39/200] [Batch 21/59] [D loss: 0.660242] [G loss: 0.702118]\n",
"[Epoch 39/200] [Batch 22/59] [D loss: 0.673077] [G loss: 0.809795]\n",
"[Epoch 39/200] [Batch 23/59] [D loss: 0.648484] [G loss: 0.808594]\n",
"[Epoch 39/200] [Batch 24/59] [D loss: 0.654154] [G loss: 0.805939]\n",
"[Epoch 39/200] [Batch 25/59] [D loss: 0.645468] [G loss: 0.784712]\n",
"[Epoch 39/200] [Batch 26/59] [D loss: 0.623801] [G loss: 0.866517]\n",
"[Epoch 39/200] [Batch 27/59] [D loss: 0.627553] [G loss: 0.787254]\n",
"[Epoch 39/200] [Batch 28/59] [D loss: 0.614487] [G loss: 0.738901]\n",
"[Epoch 39/200] [Batch 29/59] [D loss: 0.617321] [G loss: 0.872280]\n",
"[Epoch 39/200] [Batch 30/59] [D loss: 0.635414] [G loss: 0.811154]\n",
"[Epoch 39/200] [Batch 31/59] [D loss: 0.649873] [G loss: 0.756654]\n",
"[Epoch 39/200] [Batch 32/59] [D loss: 0.647167] [G loss: 0.733974]\n",
"[Epoch 39/200] [Batch 33/59] [D loss: 0.643908] [G loss: 0.941932]\n",
"[Epoch 39/200] [Batch 34/59] [D loss: 0.657847] [G loss: 0.754423]\n",
"[Epoch 39/200] [Batch 35/59] [D loss: 0.647492] [G loss: 0.683355]\n",
"[Epoch 39/200] [Batch 36/59] [D loss: 0.674269] [G loss: 0.756523]\n",
"[Epoch 39/200] [Batch 37/59] [D loss: 0.659177] [G loss: 0.847477]\n",
"[Epoch 39/200] [Batch 38/59] [D loss: 0.634708] [G loss: 0.798609]\n",
"[Epoch 39/200] [Batch 39/59] [D loss: 0.639169] [G loss: 0.733682]\n",
"[Epoch 39/200] [Batch 40/59] [D loss: 0.644581] [G loss: 0.802441]\n",
"[Epoch 39/200] [Batch 41/59] [D loss: 0.638351] [G loss: 0.902885]\n",
"[Epoch 39/200] [Batch 42/59] [D loss: 0.631938] [G loss: 0.818921]\n",
"[Epoch 39/200] [Batch 43/59] [D loss: 0.629341] [G loss: 0.774752]\n",
"[Epoch 39/200] [Batch 44/59] [D loss: 0.622143] [G loss: 0.810750]\n",
"[Epoch 39/200] [Batch 45/59] [D loss: 0.648864] [G loss: 0.842851]\n",
"[Epoch 39/200] [Batch 46/59] [D loss: 0.640794] [G loss: 0.808206]\n",
"[Epoch 39/200] [Batch 47/59] [D loss: 0.645014] [G loss: 0.831038]\n",
"[Epoch 39/200] [Batch 48/59] [D loss: 0.652230] [G loss: 0.878975]\n",
"[Epoch 39/200] [Batch 49/59] [D loss: 0.642331] [G loss: 0.788362]\n",
"[Epoch 39/200] [Batch 50/59] [D loss: 0.633039] [G loss: 0.785229]\n",
"[Epoch 39/200] [Batch 51/59] [D loss: 0.681550] [G loss: 0.776613]\n",
"[Epoch 39/200] [Batch 52/59] [D loss: 0.644640] [G loss: 0.749008]\n",
"[Epoch 39/200] [Batch 53/59] [D loss: 0.639894] [G loss: 0.796372]\n",
"[Epoch 39/200] [Batch 54/59] [D loss: 0.652316] [G loss: 0.741604]\n",
"[Epoch 39/200] [Batch 55/59] [D loss: 0.651512] [G loss: 0.774420]\n",
"[Epoch 39/200] [Batch 56/59] [D loss: 0.659196] [G loss: 0.774429]\n",
"[Epoch 39/200] [Batch 57/59] [D loss: 0.650817] [G loss: 0.828733]\n",
"[Epoch 39/200] [Batch 58/59] [D loss: 0.655200] [G loss: 0.769399]\n",
"[Epoch 40/200] [Batch 0/59] [D loss: 0.634224] [G loss: 0.822380]\n",
"[Epoch 40/200] [Batch 1/59] [D loss: 0.608573] [G loss: 0.859418]\n",
"[Epoch 40/200] [Batch 2/59] [D loss: 0.635962] [G loss: 0.785123]\n",
"[Epoch 40/200] [Batch 3/59] [D loss: 0.628352] [G loss: 0.848755]\n",
"[Epoch 40/200] [Batch 4/59] [D loss: 0.648220] [G loss: 0.775762]\n",
"[Epoch 40/200] [Batch 5/59] [D loss: 0.629887] [G loss: 0.902281]\n",
"[Epoch 40/200] [Batch 6/59] [D loss: 0.631251] [G loss: 0.766715]\n",
"[Epoch 40/200] [Batch 7/59] [D loss: 0.660267] [G loss: 0.764349]\n",
"[Epoch 40/200] [Batch 8/59] [D loss: 0.657094] [G loss: 0.832508]\n",
"[Epoch 40/200] [Batch 9/59] [D loss: 0.627837] [G loss: 0.734674]\n",
"[Epoch 40/200] [Batch 10/59] [D loss: 0.654314] [G loss: 0.770268]\n",
"[Epoch 40/200] [Batch 11/59] [D loss: 0.641963] [G loss: 0.749648]\n",
"[Epoch 40/200] [Batch 12/59] [D loss: 0.639665] [G loss: 0.738935]\n",
"[Epoch 40/200] [Batch 13/59] [D loss: 0.664971] [G loss: 0.777772]\n",
"[Epoch 40/200] [Batch 14/59] [D loss: 0.653726] [G loss: 0.860151]\n",
"[Epoch 40/200] [Batch 15/59] [D loss: 0.646238] [G loss: 0.756735]\n",
"[Epoch 40/200] [Batch 16/59] [D loss: 0.648712] [G loss: 0.759996]\n",
"[Epoch 40/200] [Batch 17/59] [D loss: 0.651253] [G loss: 0.796063]\n",
"[Epoch 40/200] [Batch 18/59] [D loss: 0.643772] [G loss: 0.748778]\n",
"[Epoch 40/200] [Batch 19/59] [D loss: 0.662894] [G loss: 0.831392]\n",
"[Epoch 40/200] [Batch 20/59] [D loss: 0.658337] [G loss: 0.789386]\n",
"[Epoch 40/200] [Batch 21/59] [D loss: 0.648482] [G loss: 0.794516]\n",
"[Epoch 40/200] [Batch 22/59] [D loss: 0.642683] [G loss: 0.778932]\n",
"[Epoch 40/200] [Batch 23/59] [D loss: 0.638767] [G loss: 0.724065]\n",
"[Epoch 40/200] [Batch 24/59] [D loss: 0.658486] [G loss: 0.883695]\n",
"[Epoch 40/200] [Batch 25/59] [D loss: 0.650576] [G loss: 0.722038]\n",
"[Epoch 40/200] [Batch 26/59] [D loss: 0.643302] [G loss: 0.794825]\n",
"[Epoch 40/200] [Batch 27/59] [D loss: 0.647871] [G loss: 0.811104]\n",
"[Epoch 40/200] [Batch 28/59] [D loss: 0.665059] [G loss: 0.715559]\n",
"[Epoch 40/200] [Batch 29/59] [D loss: 0.636959] [G loss: 0.768316]\n",
"[Epoch 40/200] [Batch 30/59] [D loss: 0.646191] [G loss: 0.861480]\n",
"[Epoch 40/200] [Batch 31/59] [D loss: 0.648899] [G loss: 0.758954]\n",
"[Epoch 40/200] [Batch 32/59] [D loss: 0.656514] [G loss: 0.824229]\n",
"[Epoch 40/200] [Batch 33/59] [D loss: 0.643205] [G loss: 0.848753]\n",
"[Epoch 40/200] [Batch 34/59] [D loss: 0.636270] [G loss: 0.744502]\n",
"[Epoch 40/200] [Batch 35/59] [D loss: 0.651882] [G loss: 0.749646]\n",
"[Epoch 40/200] [Batch 36/59] [D loss: 0.654686] [G loss: 0.801311]\n",
"[Epoch 40/200] [Batch 37/59] [D loss: 0.621658] [G loss: 0.849835]\n",
"[Epoch 40/200] [Batch 38/59] [D loss: 0.634693] [G loss: 0.745805]\n",
"[Epoch 40/200] [Batch 39/59] [D loss: 0.655086] [G loss: 0.761998]\n",
"[Epoch 40/200] [Batch 40/59] [D loss: 0.652075] [G loss: 0.841031]\n",
"[Epoch 40/200] [Batch 41/59] [D loss: 0.628459] [G loss: 0.749937]\n",
"[Epoch 40/200] [Batch 42/59] [D loss: 0.667385] [G loss: 0.763132]\n",
"[Epoch 40/200] [Batch 43/59] [D loss: 0.657371] [G loss: 0.793658]\n",
"[Epoch 40/200] [Batch 44/59] [D loss: 0.615184] [G loss: 0.832531]\n",
"[Epoch 40/200] [Batch 45/59] [D loss: 0.641320] [G loss: 0.690630]\n",
"[Epoch 40/200] [Batch 46/59] [D loss: 0.668192] [G loss: 0.772589]\n",
"[Epoch 40/200] [Batch 47/59] [D loss: 0.667317] [G loss: 0.931704]\n",
"[Epoch 40/200] [Batch 48/59] [D loss: 0.641790] [G loss: 0.734566]\n",
"[Epoch 40/200] [Batch 49/59] [D loss: 0.654011] [G loss: 0.734952]\n",
"[Epoch 40/200] [Batch 50/59] [D loss: 0.657828] [G loss: 0.928217]\n",
"[Epoch 40/200] [Batch 51/59] [D loss: 0.642967] [G loss: 0.842307]\n",
"[Epoch 40/200] [Batch 52/59] [D loss: 0.645844] [G loss: 0.683355]\n",
"[Epoch 40/200] [Batch 53/59] [D loss: 0.645830] [G loss: 0.789379]\n",
"[Epoch 40/200] [Batch 54/59] [D loss: 0.645815] [G loss: 0.889486]\n",
"[Epoch 40/200] [Batch 55/59] [D loss: 0.663650] [G loss: 0.734970]\n",
"[Epoch 40/200] [Batch 56/59] [D loss: 0.635528] [G loss: 0.713973]\n",
"[Epoch 40/200] [Batch 57/59] [D loss: 0.642623] [G loss: 0.819841]\n",
"[Epoch 40/200] [Batch 58/59] [D loss: 0.639459] [G loss: 0.747176]\n",
"[Epoch 41/200] [Batch 0/59] [D loss: 0.649218] [G loss: 0.798123]\n",
"[Epoch 41/200] [Batch 1/59] [D loss: 0.645931] [G loss: 0.741436]\n",
"[Epoch 41/200] [Batch 2/59] [D loss: 0.660884] [G loss: 0.731404]\n",
"[Epoch 41/200] [Batch 3/59] [D loss: 0.642887] [G loss: 0.804615]\n",
"[Epoch 41/200] [Batch 4/59] [D loss: 0.670181] [G loss: 0.813226]\n",
"[Epoch 41/200] [Batch 5/59] [D loss: 0.660600] [G loss: 0.753578]\n",
"[Epoch 41/200] [Batch 6/59] [D loss: 0.667367] [G loss: 0.753410]\n",
"[Epoch 41/200] [Batch 7/59] [D loss: 0.646299] [G loss: 0.840065]\n",
"[Epoch 41/200] [Batch 8/59] [D loss: 0.631886] [G loss: 0.844218]\n",
"[Epoch 41/200] [Batch 9/59] [D loss: 0.636637] [G loss: 0.744294]\n",
"[Epoch 41/200] [Batch 10/59] [D loss: 0.648239] [G loss: 0.814704]\n",
"[Epoch 41/200] [Batch 11/59] [D loss: 0.625579] [G loss: 0.783695]\n",
"[Epoch 41/200] [Batch 12/59] [D loss: 0.622431] [G loss: 0.801449]\n",
"[Epoch 41/200] [Batch 13/59] [D loss: 0.610945] [G loss: 0.746806]\n",
"[Epoch 41/200] [Batch 14/59] [D loss: 0.623126] [G loss: 0.808017]\n",
"[Epoch 41/200] [Batch 15/59] [D loss: 0.638881] [G loss: 0.847206]\n",
"[Epoch 41/200] [Batch 16/59] [D loss: 0.631126] [G loss: 0.740076]\n",
"[Epoch 41/200] [Batch 17/59] [D loss: 0.635593] [G loss: 0.779043]\n",
"[Epoch 41/200] [Batch 18/59] [D loss: 0.636809] [G loss: 0.709843]\n",
"[Epoch 41/200] [Batch 19/59] [D loss: 0.660549] [G loss: 0.840879]\n",
"[Epoch 41/200] [Batch 20/59] [D loss: 0.641718] [G loss: 0.748935]\n",
"[Epoch 41/200] [Batch 21/59] [D loss: 0.642208] [G loss: 0.757161]\n",
"[Epoch 41/200] [Batch 22/59] [D loss: 0.678976] [G loss: 0.742917]\n",
"[Epoch 41/200] [Batch 23/59] [D loss: 0.671150] [G loss: 0.863395]\n",
"[Epoch 41/200] [Batch 24/59] [D loss: 0.648517] [G loss: 0.720556]\n",
"[Epoch 41/200] [Batch 25/59] [D loss: 0.631258] [G loss: 0.722028]\n",
"[Epoch 41/200] [Batch 26/59] [D loss: 0.636816] [G loss: 0.821093]\n",
"[Epoch 41/200] [Batch 27/59] [D loss: 0.642725] [G loss: 0.810954]\n",
"[Epoch 41/200] [Batch 28/59] [D loss: 0.638612] [G loss: 0.758732]\n",
"[Epoch 41/200] [Batch 29/59] [D loss: 0.629054] [G loss: 0.930583]\n",
"[Epoch 41/200] [Batch 30/59] [D loss: 0.625914] [G loss: 0.762782]\n",
"[Epoch 41/200] [Batch 31/59] [D loss: 0.633452] [G loss: 0.779303]\n",
"[Epoch 41/200] [Batch 32/59] [D loss: 0.616866] [G loss: 0.856992]\n",
"[Epoch 41/200] [Batch 33/59] [D loss: 0.635346] [G loss: 0.721844]\n",
"[Epoch 41/200] [Batch 34/59] [D loss: 0.654369] [G loss: 0.704024]\n",
"[Epoch 41/200] [Batch 35/59] [D loss: 0.635711] [G loss: 0.858596]\n",
"[Epoch 41/200] [Batch 36/59] [D loss: 0.637064] [G loss: 0.803595]\n",
"[Epoch 41/200] [Batch 37/59] [D loss: 0.653312] [G loss: 0.801854]\n",
"[Epoch 41/200] [Batch 38/59] [D loss: 0.626436] [G loss: 0.730878]\n",
"[Epoch 41/200] [Batch 39/59] [D loss: 0.648967] [G loss: 0.790245]\n",
"[Epoch 41/200] [Batch 40/59] [D loss: 0.637571] [G loss: 0.801743]\n",
"[Epoch 41/200] [Batch 41/59] [D loss: 0.662904] [G loss: 0.754541]\n",
"[Epoch 41/200] [Batch 42/59] [D loss: 0.654767] [G loss: 0.791770]\n",
"[Epoch 41/200] [Batch 43/59] [D loss: 0.649567] [G loss: 0.814522]\n",
"[Epoch 41/200] [Batch 44/59] [D loss: 0.655482] [G loss: 0.771858]\n",
"[Epoch 41/200] [Batch 45/59] [D loss: 0.627897] [G loss: 0.833314]\n",
"[Epoch 41/200] [Batch 46/59] [D loss: 0.668818] [G loss: 0.801990]\n",
"[Epoch 41/200] [Batch 47/59] [D loss: 0.633635] [G loss: 0.769671]\n",
"[Epoch 41/200] [Batch 48/59] [D loss: 0.636472] [G loss: 0.778923]\n",
"[Epoch 41/200] [Batch 49/59] [D loss: 0.629386] [G loss: 0.820886]\n",
"[Epoch 41/200] [Batch 50/59] [D loss: 0.656667] [G loss: 0.771354]\n",
"[Epoch 41/200] [Batch 51/59] [D loss: 0.638056] [G loss: 0.818697]\n",
"[Epoch 41/200] [Batch 52/59] [D loss: 0.645687] [G loss: 0.737121]\n",
"[Epoch 41/200] [Batch 53/59] [D loss: 0.642618] [G loss: 0.793338]\n",
"[Epoch 41/200] [Batch 54/59] [D loss: 0.634057] [G loss: 0.804400]\n",
"[Epoch 41/200] [Batch 55/59] [D loss: 0.641751] [G loss: 0.758563]\n",
"[Epoch 41/200] [Batch 56/59] [D loss: 0.624767] [G loss: 0.829552]\n",
"[Epoch 41/200] [Batch 57/59] [D loss: 0.640015] [G loss: 0.728109]\n",
"[Epoch 41/200] [Batch 58/59] [D loss: 0.639160] [G loss: 0.852950]\n",
"[Epoch 42/200] [Batch 0/59] [D loss: 0.649372] [G loss: 0.812305]\n",
"[Epoch 42/200] [Batch 1/59] [D loss: 0.619709] [G loss: 0.773658]\n",
"[Epoch 42/200] [Batch 2/59] [D loss: 0.660788] [G loss: 0.817662]\n",
"[Epoch 42/200] [Batch 3/59] [D loss: 0.622475] [G loss: 0.788209]\n",
"[Epoch 42/200] [Batch 4/59] [D loss: 0.682154] [G loss: 0.786773]\n",
"[Epoch 42/200] [Batch 5/59] [D loss: 0.657799] [G loss: 0.774345]\n",
"[Epoch 42/200] [Batch 6/59] [D loss: 0.652440] [G loss: 0.817309]\n",
"[Epoch 42/200] [Batch 7/59] [D loss: 0.640326] [G loss: 0.814378]\n",
"[Epoch 42/200] [Batch 8/59] [D loss: 0.664348] [G loss: 0.691426]\n",
"[Epoch 42/200] [Batch 9/59] [D loss: 0.625609] [G loss: 0.815088]\n",
"[Epoch 42/200] [Batch 10/59] [D loss: 0.645984] [G loss: 0.961212]\n",
"[Epoch 42/200] [Batch 11/59] [D loss: 0.639634] [G loss: 0.788344]\n",
"[Epoch 42/200] [Batch 12/59] [D loss: 0.604075] [G loss: 0.803832]\n",
"[Epoch 42/200] [Batch 13/59] [D loss: 0.639964] [G loss: 0.950677]\n",
"[Epoch 42/200] [Batch 14/59] [D loss: 0.652900] [G loss: 0.793614]\n",
"[Epoch 42/200] [Batch 15/59] [D loss: 0.633232] [G loss: 0.651805]\n",
"[Epoch 42/200] [Batch 16/59] [D loss: 0.652816] [G loss: 1.038809]\n",
"[Epoch 42/200] [Batch 17/59] [D loss: 0.630120] [G loss: 0.769097]\n",
"[Epoch 42/200] [Batch 18/59] [D loss: 0.653854] [G loss: 0.593347]\n",
"[Epoch 42/200] [Batch 19/59] [D loss: 0.639671] [G loss: 0.932402]\n",
"[Epoch 42/200] [Batch 20/59] [D loss: 0.636983] [G loss: 0.832103]\n",
"[Epoch 42/200] [Batch 21/59] [D loss: 0.653021] [G loss: 0.606240]\n",
"[Epoch 42/200] [Batch 22/59] [D loss: 0.643521] [G loss: 0.882325]\n",
"[Epoch 42/200] [Batch 23/59] [D loss: 0.657742] [G loss: 0.840804]\n",
"[Epoch 42/200] [Batch 24/59] [D loss: 0.643897] [G loss: 0.826165]\n",
"[Epoch 42/200] [Batch 25/59] [D loss: 0.614889] [G loss: 0.807078]\n",
"[Epoch 42/200] [Batch 26/59] [D loss: 0.656594] [G loss: 0.791300]\n",
"[Epoch 42/200] [Batch 27/59] [D loss: 0.621918] [G loss: 0.881508]\n",
"[Epoch 42/200] [Batch 28/59] [D loss: 0.630880] [G loss: 0.726330]\n",
"[Epoch 42/200] [Batch 29/59] [D loss: 0.649555] [G loss: 0.780735]\n",
"[Epoch 42/200] [Batch 30/59] [D loss: 0.636880] [G loss: 0.868961]\n",
"[Epoch 42/200] [Batch 31/59] [D loss: 0.626217] [G loss: 0.750233]\n",
"[Epoch 42/200] [Batch 32/59] [D loss: 0.671689] [G loss: 0.684645]\n",
"[Epoch 42/200] [Batch 33/59] [D loss: 0.659312] [G loss: 0.948957]\n",
"[Epoch 42/200] [Batch 34/59] [D loss: 0.644932] [G loss: 0.750189]\n",
"[Epoch 42/200] [Batch 35/59] [D loss: 0.628632] [G loss: 0.687531]\n",
"[Epoch 42/200] [Batch 36/59] [D loss: 0.654309] [G loss: 0.851619]\n",
"[Epoch 42/200] [Batch 37/59] [D loss: 0.623950] [G loss: 0.836119]\n",
"[Epoch 42/200] [Batch 38/59] [D loss: 0.647380] [G loss: 0.728967]\n",
"[Epoch 42/200] [Batch 39/59] [D loss: 0.643623] [G loss: 0.750813]\n",
"[Epoch 42/200] [Batch 40/59] [D loss: 0.616653] [G loss: 0.873367]\n",
"[Epoch 42/200] [Batch 41/59] [D loss: 0.640316] [G loss: 0.749421]\n",
"[Epoch 42/200] [Batch 42/59] [D loss: 0.655002] [G loss: 0.794166]\n",
"[Epoch 42/200] [Batch 43/59] [D loss: 0.652073] [G loss: 0.812695]\n",
"[Epoch 42/200] [Batch 44/59] [D loss: 0.625407] [G loss: 0.776548]\n",
"[Epoch 42/200] [Batch 45/59] [D loss: 0.602372] [G loss: 0.791803]\n",
"[Epoch 42/200] [Batch 46/59] [D loss: 0.636842] [G loss: 0.859614]\n",
"[Epoch 42/200] [Batch 47/59] [D loss: 0.632880] [G loss: 0.876790]\n",
"[Epoch 42/200] [Batch 48/59] [D loss: 0.652396] [G loss: 0.802251]\n",
"[Epoch 42/200] [Batch 49/59] [D loss: 0.676181] [G loss: 0.672657]\n",
"[Epoch 42/200] [Batch 50/59] [D loss: 0.654245] [G loss: 0.846225]\n",
"[Epoch 42/200] [Batch 51/59] [D loss: 0.642072] [G loss: 0.818389]\n",
"[Epoch 42/200] [Batch 52/59] [D loss: 0.684868] [G loss: 0.719064]\n",
"[Epoch 42/200] [Batch 53/59] [D loss: 0.629469] [G loss: 0.766280]\n",
"[Epoch 42/200] [Batch 54/59] [D loss: 0.625219] [G loss: 0.871850]\n",
"[Epoch 42/200] [Batch 55/59] [D loss: 0.632928] [G loss: 0.777850]\n",
"[Epoch 42/200] [Batch 56/59] [D loss: 0.646549] [G loss: 0.836596]\n",
"[Epoch 42/200] [Batch 57/59] [D loss: 0.639743] [G loss: 0.696122]\n",
"[Epoch 42/200] [Batch 58/59] [D loss: 0.619398] [G loss: 0.884690]\n",
"[Epoch 43/200] [Batch 0/59] [D loss: 0.642616] [G loss: 0.792306]\n",
"[Epoch 43/200] [Batch 1/59] [D loss: 0.617236] [G loss: 0.719291]\n",
"[Epoch 43/200] [Batch 2/59] [D loss: 0.633011] [G loss: 0.865710]\n",
"[Epoch 43/200] [Batch 3/59] [D loss: 0.667017] [G loss: 0.771870]\n",
"[Epoch 43/200] [Batch 4/59] [D loss: 0.648679] [G loss: 0.817710]\n",
"[Epoch 43/200] [Batch 5/59] [D loss: 0.630348] [G loss: 0.824841]\n",
"[Epoch 43/200] [Batch 6/59] [D loss: 0.638904] [G loss: 0.789050]\n",
"[Epoch 43/200] [Batch 7/59] [D loss: 0.645111] [G loss: 0.711301]\n",
"[Epoch 43/200] [Batch 8/59] [D loss: 0.645214] [G loss: 0.821127]\n",
"[Epoch 43/200] [Batch 9/59] [D loss: 0.626634] [G loss: 0.907322]\n",
"[Epoch 43/200] [Batch 10/59] [D loss: 0.638512] [G loss: 0.783604]\n",
"[Epoch 43/200] [Batch 11/59] [D loss: 0.639152] [G loss: 0.687030]\n",
"[Epoch 43/200] [Batch 12/59] [D loss: 0.614591] [G loss: 0.787365]\n",
"[Epoch 43/200] [Batch 13/59] [D loss: 0.609881] [G loss: 0.808275]\n",
"[Epoch 43/200] [Batch 14/59] [D loss: 0.627384] [G loss: 0.814409]\n",
"[Epoch 43/200] [Batch 15/59] [D loss: 0.669887] [G loss: 0.758423]\n",
"[Epoch 43/200] [Batch 16/59] [D loss: 0.662419] [G loss: 0.837371]\n",
"[Epoch 43/200] [Batch 17/59] [D loss: 0.636908] [G loss: 0.720575]\n",
"[Epoch 43/200] [Batch 18/59] [D loss: 0.654732] [G loss: 0.825797]\n",
"[Epoch 43/200] [Batch 19/59] [D loss: 0.632593] [G loss: 0.784934]\n",
"[Epoch 43/200] [Batch 20/59] [D loss: 0.623917] [G loss: 0.829890]\n",
"[Epoch 43/200] [Batch 21/59] [D loss: 0.632236] [G loss: 0.779766]\n",
"[Epoch 43/200] [Batch 22/59] [D loss: 0.592218] [G loss: 0.801054]\n",
"[Epoch 43/200] [Batch 23/59] [D loss: 0.604730] [G loss: 0.806747]\n",
"[Epoch 43/200] [Batch 24/59] [D loss: 0.624128] [G loss: 0.833602]\n",
"[Epoch 43/200] [Batch 25/59] [D loss: 0.623406] [G loss: 0.798416]\n",
"[Epoch 43/200] [Batch 26/59] [D loss: 0.608037] [G loss: 0.790949]\n",
"[Epoch 43/200] [Batch 27/59] [D loss: 0.612180] [G loss: 0.848393]\n",
"[Epoch 43/200] [Batch 28/59] [D loss: 0.635467] [G loss: 0.767040]\n",
"[Epoch 43/200] [Batch 29/59] [D loss: 0.645475] [G loss: 0.822307]\n",
"[Epoch 43/200] [Batch 30/59] [D loss: 0.626380] [G loss: 0.791008]\n",
"[Epoch 43/200] [Batch 31/59] [D loss: 0.640763] [G loss: 0.710056]\n",
"[Epoch 43/200] [Batch 32/59] [D loss: 0.641016] [G loss: 0.899671]\n",
"[Epoch 43/200] [Batch 33/59] [D loss: 0.668212] [G loss: 0.826855]\n",
"[Epoch 43/200] [Batch 34/59] [D loss: 0.668607] [G loss: 0.672765]\n",
"[Epoch 43/200] [Batch 35/59] [D loss: 0.667848] [G loss: 0.828289]\n",
"[Epoch 43/200] [Batch 36/59] [D loss: 0.636391] [G loss: 0.864502]\n",
"[Epoch 43/200] [Batch 37/59] [D loss: 0.649741] [G loss: 0.671214]\n",
"[Epoch 43/200] [Batch 38/59] [D loss: 0.654199] [G loss: 0.833065]\n",
"[Epoch 43/200] [Batch 39/59] [D loss: 0.633549] [G loss: 0.854496]\n",
"[Epoch 43/200] [Batch 40/59] [D loss: 0.619054] [G loss: 0.656317]\n",
"[Epoch 43/200] [Batch 41/59] [D loss: 0.630630] [G loss: 0.831772]\n",
"[Epoch 43/200] [Batch 42/59] [D loss: 0.622716] [G loss: 0.834630]\n",
"[Epoch 43/200] [Batch 43/59] [D loss: 0.623120] [G loss: 0.789414]\n",
"[Epoch 43/200] [Batch 44/59] [D loss: 0.611177] [G loss: 0.779991]\n",
"[Epoch 43/200] [Batch 45/59] [D loss: 0.632722] [G loss: 0.821916]\n",
"[Epoch 43/200] [Batch 46/59] [D loss: 0.641742] [G loss: 0.727180]\n",
"[Epoch 43/200] [Batch 47/59] [D loss: 0.656360] [G loss: 0.793606]\n",
"[Epoch 43/200] [Batch 48/59] [D loss: 0.639434] [G loss: 0.814949]\n",
"[Epoch 43/200] [Batch 49/59] [D loss: 0.653635] [G loss: 0.697710]\n",
"[Epoch 43/200] [Batch 50/59] [D loss: 0.630914] [G loss: 0.786419]\n",
"[Epoch 43/200] [Batch 51/59] [D loss: 0.626064] [G loss: 0.886143]\n",
"[Epoch 43/200] [Batch 52/59] [D loss: 0.638392] [G loss: 0.754334]\n",
"[Epoch 43/200] [Batch 53/59] [D loss: 0.648719] [G loss: 0.731615]\n",
"[Epoch 43/200] [Batch 54/59] [D loss: 0.618544] [G loss: 0.876715]\n",
"[Epoch 43/200] [Batch 55/59] [D loss: 0.632196] [G loss: 0.819416]\n",
"[Epoch 43/200] [Batch 56/59] [D loss: 0.661490] [G loss: 0.739958]\n",
"[Epoch 43/200] [Batch 57/59] [D loss: 0.625195] [G loss: 0.830630]\n",
"[Epoch 43/200] [Batch 58/59] [D loss: 0.661904] [G loss: 0.859977]\n",
"[Epoch 44/200] [Batch 0/59] [D loss: 0.641497] [G loss: 0.661335]\n",
"[Epoch 44/200] [Batch 1/59] [D loss: 0.665652] [G loss: 0.855976]\n",
"[Epoch 44/200] [Batch 2/59] [D loss: 0.637166] [G loss: 0.840405]\n",
"[Epoch 44/200] [Batch 3/59] [D loss: 0.659004] [G loss: 0.733928]\n",
"[Epoch 44/200] [Batch 4/59] [D loss: 0.670537] [G loss: 0.858361]\n",
"[Epoch 44/200] [Batch 5/59] [D loss: 0.638173] [G loss: 0.851738]\n",
"[Epoch 44/200] [Batch 6/59] [D loss: 0.657024] [G loss: 0.746842]\n",
"[Epoch 44/200] [Batch 7/59] [D loss: 0.655520] [G loss: 0.792912]\n",
"[Epoch 44/200] [Batch 8/59] [D loss: 0.669309] [G loss: 0.913465]\n",
"[Epoch 44/200] [Batch 9/59] [D loss: 0.618469] [G loss: 0.850217]\n",
"[Epoch 44/200] [Batch 10/59] [D loss: 0.657515] [G loss: 0.651384]\n",
"[Epoch 44/200] [Batch 11/59] [D loss: 0.632136] [G loss: 0.874564]\n",
"[Epoch 44/200] [Batch 12/59] [D loss: 0.638776] [G loss: 0.817199]\n",
"[Epoch 44/200] [Batch 13/59] [D loss: 0.662496] [G loss: 0.834234]\n",
"[Epoch 44/200] [Batch 14/59] [D loss: 0.648639] [G loss: 0.811979]\n",
"[Epoch 44/200] [Batch 15/59] [D loss: 0.646951] [G loss: 0.815407]\n",
"[Epoch 44/200] [Batch 16/59] [D loss: 0.648519] [G loss: 0.853881]\n",
"[Epoch 44/200] [Batch 17/59] [D loss: 0.633213] [G loss: 0.715936]\n",
"[Epoch 44/200] [Batch 18/59] [D loss: 0.642843] [G loss: 0.772734]\n",
"[Epoch 44/200] [Batch 19/59] [D loss: 0.648469] [G loss: 0.806618]\n",
"[Epoch 44/200] [Batch 20/59] [D loss: 0.668319] [G loss: 0.849924]\n",
"[Epoch 44/200] [Batch 21/59] [D loss: 0.646729] [G loss: 0.751657]\n",
"[Epoch 44/200] [Batch 22/59] [D loss: 0.640017] [G loss: 0.749320]\n",
"[Epoch 44/200] [Batch 23/59] [D loss: 0.621025] [G loss: 0.912334]\n",
"[Epoch 44/200] [Batch 24/59] [D loss: 0.624128] [G loss: 0.737269]\n",
"[Epoch 44/200] [Batch 25/59] [D loss: 0.629927] [G loss: 0.767675]\n",
"[Epoch 44/200] [Batch 26/59] [D loss: 0.667112] [G loss: 0.829769]\n",
"[Epoch 44/200] [Batch 27/59] [D loss: 0.651980] [G loss: 0.745779]\n",
"[Epoch 44/200] [Batch 28/59] [D loss: 0.637475] [G loss: 0.739462]\n",
"[Epoch 44/200] [Batch 29/59] [D loss: 0.620943] [G loss: 0.778792]\n",
"[Epoch 44/200] [Batch 30/59] [D loss: 0.649360] [G loss: 0.814462]\n",
"[Epoch 44/200] [Batch 31/59] [D loss: 0.642709] [G loss: 0.851236]\n",
"[Epoch 44/200] [Batch 32/59] [D loss: 0.634052] [G loss: 0.690868]\n",
"[Epoch 44/200] [Batch 33/59] [D loss: 0.648117] [G loss: 0.772256]\n",
"[Epoch 44/200] [Batch 34/59] [D loss: 0.610582] [G loss: 0.821851]\n",
"[Epoch 44/200] [Batch 35/59] [D loss: 0.647685] [G loss: 0.795714]\n",
"[Epoch 44/200] [Batch 36/59] [D loss: 0.649978] [G loss: 0.675629]\n",
"[Epoch 44/200] [Batch 37/59] [D loss: 0.646201] [G loss: 0.858788]\n",
"[Epoch 44/200] [Batch 38/59] [D loss: 0.691651] [G loss: 0.795850]\n",
"[Epoch 44/200] [Batch 39/59] [D loss: 0.638998] [G loss: 0.790063]\n",
"[Epoch 44/200] [Batch 40/59] [D loss: 0.631175] [G loss: 0.858329]\n",
"[Epoch 44/200] [Batch 41/59] [D loss: 0.638592] [G loss: 0.812446]\n",
"[Epoch 44/200] [Batch 42/59] [D loss: 0.636077] [G loss: 0.676097]\n",
"[Epoch 44/200] [Batch 43/59] [D loss: 0.638172] [G loss: 0.928421]\n",
"[Epoch 44/200] [Batch 44/59] [D loss: 0.642231] [G loss: 0.799784]\n",
"[Epoch 44/200] [Batch 45/59] [D loss: 0.614777] [G loss: 0.772635]\n",
"[Epoch 44/200] [Batch 46/59] [D loss: 0.669670] [G loss: 0.780073]\n",
"[Epoch 44/200] [Batch 47/59] [D loss: 0.658103] [G loss: 0.883609]\n",
"[Epoch 44/200] [Batch 48/59] [D loss: 0.628102] [G loss: 0.787885]\n",
"[Epoch 44/200] [Batch 49/59] [D loss: 0.616253] [G loss: 0.850940]\n",
"[Epoch 44/200] [Batch 50/59] [D loss: 0.628493] [G loss: 0.769189]\n",
"[Epoch 44/200] [Batch 51/59] [D loss: 0.659857] [G loss: 0.723991]\n",
"[Epoch 44/200] [Batch 52/59] [D loss: 0.632686] [G loss: 0.977547]\n",
"[Epoch 44/200] [Batch 53/59] [D loss: 0.637295] [G loss: 0.802057]\n",
"[Epoch 44/200] [Batch 54/59] [D loss: 0.649911] [G loss: 0.740963]\n",
"[Epoch 44/200] [Batch 55/59] [D loss: 0.618738] [G loss: 0.812189]\n",
"[Epoch 44/200] [Batch 56/59] [D loss: 0.651859] [G loss: 0.880950]\n",
"[Epoch 44/200] [Batch 57/59] [D loss: 0.623557] [G loss: 0.723860]\n",
"[Epoch 44/200] [Batch 58/59] [D loss: 0.639719] [G loss: 0.771288]\n",
"[Epoch 45/200] [Batch 0/59] [D loss: 0.652025] [G loss: 0.885184]\n",
"[Epoch 45/200] [Batch 1/59] [D loss: 0.640113] [G loss: 0.728023]\n",
"[Epoch 45/200] [Batch 2/59] [D loss: 0.653778] [G loss: 0.809286]\n",
"[Epoch 45/200] [Batch 3/59] [D loss: 0.651071] [G loss: 0.839915]\n",
"[Epoch 45/200] [Batch 4/59] [D loss: 0.640483] [G loss: 0.795136]\n",
"[Epoch 45/200] [Batch 5/59] [D loss: 0.622807] [G loss: 0.731643]\n",
"[Epoch 45/200] [Batch 6/59] [D loss: 0.629114] [G loss: 0.795860]\n",
"[Epoch 45/200] [Batch 7/59] [D loss: 0.642260] [G loss: 0.862418]\n",
"[Epoch 45/200] [Batch 8/59] [D loss: 0.622983] [G loss: 0.802194]\n",
"[Epoch 45/200] [Batch 9/59] [D loss: 0.613434] [G loss: 0.746270]\n",
"[Epoch 45/200] [Batch 10/59] [D loss: 0.665954] [G loss: 0.909802]\n",
"[Epoch 45/200] [Batch 11/59] [D loss: 0.646867] [G loss: 0.845102]\n",
"[Epoch 45/200] [Batch 12/59] [D loss: 0.623515] [G loss: 0.795175]\n",
"[Epoch 45/200] [Batch 13/59] [D loss: 0.621039] [G loss: 0.845310]\n",
"[Epoch 45/200] [Batch 14/59] [D loss: 0.623454] [G loss: 0.778158]\n",
"[Epoch 45/200] [Batch 15/59] [D loss: 0.652427] [G loss: 0.884396]\n",
"[Epoch 45/200] [Batch 16/59] [D loss: 0.666316] [G loss: 0.673382]\n",
"[Epoch 45/200] [Batch 17/59] [D loss: 0.641163] [G loss: 0.817702]\n",
"[Epoch 45/200] [Batch 18/59] [D loss: 0.668895] [G loss: 0.795713]\n",
"[Epoch 45/200] [Batch 19/59] [D loss: 0.634339] [G loss: 0.806111]\n",
"[Epoch 45/200] [Batch 20/59] [D loss: 0.649130] [G loss: 0.750167]\n",
"[Epoch 45/200] [Batch 21/59] [D loss: 0.665665] [G loss: 0.873214]\n",
"[Epoch 45/200] [Batch 22/59] [D loss: 0.647184] [G loss: 0.767658]\n",
"[Epoch 45/200] [Batch 23/59] [D loss: 0.607017] [G loss: 0.843264]\n",
"[Epoch 45/200] [Batch 24/59] [D loss: 0.644918] [G loss: 0.851535]\n",
"[Epoch 45/200] [Batch 25/59] [D loss: 0.649635] [G loss: 0.794122]\n",
"[Epoch 45/200] [Batch 26/59] [D loss: 0.644204] [G loss: 0.669341]\n",
"[Epoch 45/200] [Batch 27/59] [D loss: 0.622407] [G loss: 0.910585]\n",
"[Epoch 45/200] [Batch 28/59] [D loss: 0.649670] [G loss: 0.729461]\n",
"[Epoch 45/200] [Batch 29/59] [D loss: 0.613095] [G loss: 0.817712]\n",
"[Epoch 45/200] [Batch 30/59] [D loss: 0.629583] [G loss: 0.880347]\n",
"[Epoch 45/200] [Batch 31/59] [D loss: 0.617492] [G loss: 0.778994]\n",
"[Epoch 45/200] [Batch 32/59] [D loss: 0.615343] [G loss: 0.769345]\n",
"[Epoch 45/200] [Batch 33/59] [D loss: 0.641984] [G loss: 0.826945]\n",
"[Epoch 45/200] [Batch 34/59] [D loss: 0.663928] [G loss: 0.793314]\n",
"[Epoch 45/200] [Batch 35/59] [D loss: 0.661498] [G loss: 0.828134]\n",
"[Epoch 45/200] [Batch 36/59] [D loss: 0.650349] [G loss: 0.759987]\n",
"[Epoch 45/200] [Batch 37/59] [D loss: 0.632868] [G loss: 0.859075]\n",
"[Epoch 45/200] [Batch 38/59] [D loss: 0.619920] [G loss: 0.742281]\n",
"[Epoch 45/200] [Batch 39/59] [D loss: 0.650320] [G loss: 0.724451]\n",
"[Epoch 45/200] [Batch 40/59] [D loss: 0.634330] [G loss: 0.884260]\n",
"[Epoch 45/200] [Batch 41/59] [D loss: 0.630115] [G loss: 0.732579]\n",
"[Epoch 45/200] [Batch 42/59] [D loss: 0.653078] [G loss: 0.793229]\n",
"[Epoch 45/200] [Batch 43/59] [D loss: 0.655940] [G loss: 0.851860]\n",
"[Epoch 45/200] [Batch 44/59] [D loss: 0.644199] [G loss: 0.746932]\n",
"[Epoch 45/200] [Batch 45/59] [D loss: 0.633475] [G loss: 0.732347]\n",
"[Epoch 45/200] [Batch 46/59] [D loss: 0.633626] [G loss: 0.789697]\n",
"[Epoch 45/200] [Batch 47/59] [D loss: 0.645269] [G loss: 0.870618]\n",
"[Epoch 45/200] [Batch 48/59] [D loss: 0.645403] [G loss: 0.792261]\n",
"[Epoch 45/200] [Batch 49/59] [D loss: 0.640831] [G loss: 0.784560]\n",
"[Epoch 45/200] [Batch 50/59] [D loss: 0.631871] [G loss: 0.803540]\n",
"[Epoch 45/200] [Batch 51/59] [D loss: 0.638836] [G loss: 0.873414]\n",
"[Epoch 45/200] [Batch 52/59] [D loss: 0.641237] [G loss: 0.704033]\n",
"[Epoch 45/200] [Batch 53/59] [D loss: 0.637749] [G loss: 0.831328]\n",
"[Epoch 45/200] [Batch 54/59] [D loss: 0.634415] [G loss: 0.837363]\n",
"[Epoch 45/200] [Batch 55/59] [D loss: 0.610280] [G loss: 0.770646]\n",
"[Epoch 45/200] [Batch 56/59] [D loss: 0.603565] [G loss: 0.864386]\n",
"[Epoch 45/200] [Batch 57/59] [D loss: 0.625093] [G loss: 0.807217]\n",
"[Epoch 45/200] [Batch 58/59] [D loss: 0.616461] [G loss: 0.767609]\n",
"[Epoch 46/200] [Batch 0/59] [D loss: 0.666657] [G loss: 0.765697]\n",
"[Epoch 46/200] [Batch 1/59] [D loss: 0.655741] [G loss: 0.790408]\n",
"[Epoch 46/200] [Batch 2/59] [D loss: 0.646675] [G loss: 0.814122]\n",
"[Epoch 46/200] [Batch 3/59] [D loss: 0.676841] [G loss: 0.659261]\n",
"[Epoch 46/200] [Batch 4/59] [D loss: 0.662796] [G loss: 0.871818]\n",
"[Epoch 46/200] [Batch 5/59] [D loss: 0.650687] [G loss: 0.726824]\n",
"[Epoch 46/200] [Batch 6/59] [D loss: 0.678924] [G loss: 0.726007]\n",
"[Epoch 46/200] [Batch 7/59] [D loss: 0.637035] [G loss: 0.983052]\n",
"[Epoch 46/200] [Batch 8/59] [D loss: 0.636117] [G loss: 0.834373]\n",
"[Epoch 46/200] [Batch 9/59] [D loss: 0.661643] [G loss: 0.650886]\n",
"[Epoch 46/200] [Batch 10/59] [D loss: 0.656992] [G loss: 0.983009]\n",
"[Epoch 46/200] [Batch 11/59] [D loss: 0.636775] [G loss: 0.844190]\n",
"[Epoch 46/200] [Batch 12/59] [D loss: 0.648217] [G loss: 0.790040]\n",
"[Epoch 46/200] [Batch 13/59] [D loss: 0.633289] [G loss: 0.806171]\n",
"[Epoch 46/200] [Batch 14/59] [D loss: 0.649640] [G loss: 0.794791]\n",
"[Epoch 46/200] [Batch 15/59] [D loss: 0.628629] [G loss: 0.857321]\n",
"[Epoch 46/200] [Batch 16/59] [D loss: 0.618566] [G loss: 0.791448]\n",
"[Epoch 46/200] [Batch 17/59] [D loss: 0.627491] [G loss: 0.781579]\n",
"[Epoch 46/200] [Batch 18/59] [D loss: 0.634636] [G loss: 0.784531]\n",
"[Epoch 46/200] [Batch 19/59] [D loss: 0.643881] [G loss: 0.714480]\n",
"[Epoch 46/200] [Batch 20/59] [D loss: 0.628210] [G loss: 0.840928]\n",
"[Epoch 46/200] [Batch 21/59] [D loss: 0.655948] [G loss: 0.755292]\n",
"[Epoch 46/200] [Batch 22/59] [D loss: 0.667675] [G loss: 0.696951]\n",
"[Epoch 46/200] [Batch 23/59] [D loss: 0.647062] [G loss: 1.032614]\n",
"[Epoch 46/200] [Batch 24/59] [D loss: 0.610461] [G loss: 0.794303]\n",
"[Epoch 46/200] [Batch 25/59] [D loss: 0.642978] [G loss: 0.666321]\n",
"[Epoch 46/200] [Batch 26/59] [D loss: 0.636826] [G loss: 0.889230]\n",
"[Epoch 46/200] [Batch 27/59] [D loss: 0.622220] [G loss: 0.847663]\n",
"[Epoch 46/200] [Batch 28/59] [D loss: 0.658566] [G loss: 0.777886]\n",
"[Epoch 46/200] [Batch 29/59] [D loss: 0.642226] [G loss: 0.773262]\n",
"[Epoch 46/200] [Batch 30/59] [D loss: 0.633255] [G loss: 0.762913]\n",
"[Epoch 46/200] [Batch 31/59] [D loss: 0.651296] [G loss: 0.844063]\n",
"[Epoch 46/200] [Batch 32/59] [D loss: 0.664071] [G loss: 0.816873]\n",
"[Epoch 46/200] [Batch 33/59] [D loss: 0.639885] [G loss: 0.727727]\n",
"[Epoch 46/200] [Batch 34/59] [D loss: 0.624890] [G loss: 0.911457]\n",
"[Epoch 46/200] [Batch 35/59] [D loss: 0.608300] [G loss: 0.825494]\n",
"[Epoch 46/200] [Batch 36/59] [D loss: 0.645060] [G loss: 0.699106]\n",
"[Epoch 46/200] [Batch 37/59] [D loss: 0.647478] [G loss: 0.854428]\n",
"[Epoch 46/200] [Batch 38/59] [D loss: 0.622025] [G loss: 0.900901]\n",
"[Epoch 46/200] [Batch 39/59] [D loss: 0.639780] [G loss: 0.763779]\n",
"[Epoch 46/200] [Batch 40/59] [D loss: 0.635591] [G loss: 0.880071]\n",
"[Epoch 46/200] [Batch 41/59] [D loss: 0.626313] [G loss: 0.798920]\n",
"[Epoch 46/200] [Batch 42/59] [D loss: 0.641880] [G loss: 0.786037]\n",
"[Epoch 46/200] [Batch 43/59] [D loss: 0.626617] [G loss: 0.798424]\n",
"[Epoch 46/200] [Batch 44/59] [D loss: 0.638323] [G loss: 0.801340]\n",
"[Epoch 46/200] [Batch 45/59] [D loss: 0.642342] [G loss: 0.844954]\n",
"[Epoch 46/200] [Batch 46/59] [D loss: 0.639737] [G loss: 0.721400]\n",
"[Epoch 46/200] [Batch 47/59] [D loss: 0.662480] [G loss: 0.714654]\n",
"[Epoch 46/200] [Batch 48/59] [D loss: 0.619552] [G loss: 0.782266]\n",
"[Epoch 46/200] [Batch 49/59] [D loss: 0.656965] [G loss: 0.859397]\n",
"[Epoch 46/200] [Batch 50/59] [D loss: 0.666572] [G loss: 0.765094]\n",
"[Epoch 46/200] [Batch 51/59] [D loss: 0.654227] [G loss: 0.751362]\n",
"[Epoch 46/200] [Batch 52/59] [D loss: 0.610666] [G loss: 0.760401]\n",
"[Epoch 46/200] [Batch 53/59] [D loss: 0.641499] [G loss: 0.841730]\n",
"[Epoch 46/200] [Batch 54/59] [D loss: 0.630593] [G loss: 0.717726]\n",
"[Epoch 46/200] [Batch 55/59] [D loss: 0.618582] [G loss: 0.938105]\n",
"[Epoch 46/200] [Batch 56/59] [D loss: 0.607366] [G loss: 0.759322]\n",
"[Epoch 46/200] [Batch 57/59] [D loss: 0.626748] [G loss: 0.863234]\n",
"[Epoch 46/200] [Batch 58/59] [D loss: 0.614205] [G loss: 0.880958]\n",
"[Epoch 47/200] [Batch 0/59] [D loss: 0.630307] [G loss: 0.695689]\n",
"[Epoch 47/200] [Batch 1/59] [D loss: 0.658156] [G loss: 0.780381]\n",
"[Epoch 47/200] [Batch 2/59] [D loss: 0.634793] [G loss: 0.894605]\n",
"[Epoch 47/200] [Batch 3/59] [D loss: 0.648372] [G loss: 0.804515]\n",
"[Epoch 47/200] [Batch 4/59] [D loss: 0.650199] [G loss: 0.706542]\n",
"[Epoch 47/200] [Batch 5/59] [D loss: 0.645679] [G loss: 0.795565]\n",
"[Epoch 47/200] [Batch 6/59] [D loss: 0.663168] [G loss: 0.873976]\n",
"[Epoch 47/200] [Batch 7/59] [D loss: 0.645886] [G loss: 0.712806]\n",
"[Epoch 47/200] [Batch 8/59] [D loss: 0.630128] [G loss: 0.797733]\n",
"[Epoch 47/200] [Batch 9/59] [D loss: 0.648910] [G loss: 0.928767]\n",
"[Epoch 47/200] [Batch 10/59] [D loss: 0.627301] [G loss: 0.809488]\n",
"[Epoch 47/200] [Batch 11/59] [D loss: 0.605760] [G loss: 0.827496]\n",
"[Epoch 47/200] [Batch 12/59] [D loss: 0.662153] [G loss: 0.811099]\n",
"[Epoch 47/200] [Batch 13/59] [D loss: 0.610590] [G loss: 0.834620]\n",
"[Epoch 47/200] [Batch 14/59] [D loss: 0.648107] [G loss: 0.789031]\n",
"[Epoch 47/200] [Batch 15/59] [D loss: 0.630339] [G loss: 0.822094]\n",
"[Epoch 47/200] [Batch 16/59] [D loss: 0.616487] [G loss: 0.913437]\n",
"[Epoch 47/200] [Batch 17/59] [D loss: 0.640020] [G loss: 0.774403]\n",
"[Epoch 47/200] [Batch 18/59] [D loss: 0.610512] [G loss: 0.764739]\n",
"[Epoch 47/200] [Batch 19/59] [D loss: 0.622983] [G loss: 0.802789]\n",
"[Epoch 47/200] [Batch 20/59] [D loss: 0.638974] [G loss: 0.894210]\n",
"[Epoch 47/200] [Batch 21/59] [D loss: 0.665954] [G loss: 0.675905]\n",
"[Epoch 47/200] [Batch 22/59] [D loss: 0.652591] [G loss: 0.744891]\n",
"[Epoch 47/200] [Batch 23/59] [D loss: 0.642694] [G loss: 0.904920]\n",
"[Epoch 47/200] [Batch 24/59] [D loss: 0.670892] [G loss: 0.729722]\n",
"[Epoch 47/200] [Batch 25/59] [D loss: 0.630722] [G loss: 0.812714]\n",
"[Epoch 47/200] [Batch 26/59] [D loss: 0.639828] [G loss: 0.846697]\n",
"[Epoch 47/200] [Batch 27/59] [D loss: 0.641998] [G loss: 0.803451]\n",
"[Epoch 47/200] [Batch 28/59] [D loss: 0.614086] [G loss: 0.766440]\n",
"[Epoch 47/200] [Batch 29/59] [D loss: 0.637920] [G loss: 0.744860]\n",
"[Epoch 47/200] [Batch 30/59] [D loss: 0.640064] [G loss: 0.999650]\n",
"[Epoch 47/200] [Batch 31/59] [D loss: 0.643169] [G loss: 0.609188]\n",
"[Epoch 47/200] [Batch 32/59] [D loss: 0.621700] [G loss: 0.885217]\n",
"[Epoch 47/200] [Batch 33/59] [D loss: 0.652208] [G loss: 1.007511]\n",
"[Epoch 47/200] [Batch 34/59] [D loss: 0.632003] [G loss: 0.657439]\n",
"[Epoch 47/200] [Batch 35/59] [D loss: 0.615870] [G loss: 0.759140]\n",
"[Epoch 47/200] [Batch 36/59] [D loss: 0.666615] [G loss: 0.984169]\n",
"[Epoch 47/200] [Batch 37/59] [D loss: 0.645797] [G loss: 0.824392]\n",
"[Epoch 47/200] [Batch 38/59] [D loss: 0.652903] [G loss: 0.674996]\n",
"[Epoch 47/200] [Batch 39/59] [D loss: 0.624820] [G loss: 0.828247]\n",
"[Epoch 47/200] [Batch 40/59] [D loss: 0.677798] [G loss: 0.890671]\n",
"[Epoch 47/200] [Batch 41/59] [D loss: 0.619105] [G loss: 0.782850]\n",
"[Epoch 47/200] [Batch 42/59] [D loss: 0.662273] [G loss: 0.697269]\n",
"[Epoch 47/200] [Batch 43/59] [D loss: 0.623579] [G loss: 0.910778]\n",
"[Epoch 47/200] [Batch 44/59] [D loss: 0.639158] [G loss: 0.912995]\n",
"[Epoch 47/200] [Batch 45/59] [D loss: 0.627369] [G loss: 0.705526]\n",
"[Epoch 47/200] [Batch 46/59] [D loss: 0.653156] [G loss: 0.840310]\n",
"[Epoch 47/200] [Batch 47/59] [D loss: 0.622294] [G loss: 0.854301]\n",
"[Epoch 47/200] [Batch 48/59] [D loss: 0.624802] [G loss: 0.848809]\n",
"[Epoch 47/200] [Batch 49/59] [D loss: 0.636842] [G loss: 0.758409]\n",
"[Epoch 47/200] [Batch 50/59] [D loss: 0.657726] [G loss: 0.761678]\n",
"[Epoch 47/200] [Batch 51/59] [D loss: 0.640935] [G loss: 0.769839]\n",
"[Epoch 47/200] [Batch 52/59] [D loss: 0.626347] [G loss: 0.819265]\n",
"[Epoch 47/200] [Batch 53/59] [D loss: 0.638746] [G loss: 0.847678]\n",
"[Epoch 47/200] [Batch 54/59] [D loss: 0.625767] [G loss: 0.753918]\n",
"[Epoch 47/200] [Batch 55/59] [D loss: 0.624936] [G loss: 0.941414]\n",
"[Epoch 47/200] [Batch 56/59] [D loss: 0.640748] [G loss: 0.740667]\n",
"[Epoch 47/200] [Batch 57/59] [D loss: 0.633725] [G loss: 0.813975]\n",
"[Epoch 47/200] [Batch 58/59] [D loss: 0.646056] [G loss: 0.831706]\n",
"[Epoch 48/200] [Batch 0/59] [D loss: 0.663121] [G loss: 0.774199]\n",
"[Epoch 48/200] [Batch 1/59] [D loss: 0.659096] [G loss: 0.712702]\n",
"[Epoch 48/200] [Batch 2/59] [D loss: 0.628620] [G loss: 0.868171]\n",
"[Epoch 48/200] [Batch 3/59] [D loss: 0.629973] [G loss: 0.745560]\n",
"[Epoch 48/200] [Batch 4/59] [D loss: 0.647831] [G loss: 0.723663]\n",
"[Epoch 48/200] [Batch 5/59] [D loss: 0.657170] [G loss: 1.020383]\n",
"[Epoch 48/200] [Batch 6/59] [D loss: 0.646503] [G loss: 0.822968]\n",
"[Epoch 48/200] [Batch 7/59] [D loss: 0.640434] [G loss: 0.657265]\n",
"[Epoch 48/200] [Batch 8/59] [D loss: 0.652078] [G loss: 0.807771]\n",
"[Epoch 48/200] [Batch 9/59] [D loss: 0.669993] [G loss: 1.005569]\n",
"[Epoch 48/200] [Batch 10/59] [D loss: 0.638884] [G loss: 0.755759]\n",
"[Epoch 48/200] [Batch 11/59] [D loss: 0.619597] [G loss: 0.747612]\n",
"[Epoch 48/200] [Batch 12/59] [D loss: 0.630412] [G loss: 0.957555]\n",
"[Epoch 48/200] [Batch 13/59] [D loss: 0.639622] [G loss: 0.797226]\n",
"[Epoch 48/200] [Batch 14/59] [D loss: 0.636918] [G loss: 0.753317]\n",
"[Epoch 48/200] [Batch 15/59] [D loss: 0.609210] [G loss: 0.975641]\n",
"[Epoch 48/200] [Batch 16/59] [D loss: 0.621637] [G loss: 0.773820]\n",
"[Epoch 48/200] [Batch 17/59] [D loss: 0.617820] [G loss: 0.758253]\n",
"[Epoch 48/200] [Batch 18/59] [D loss: 0.637195] [G loss: 0.742773]\n",
"[Epoch 48/200] [Batch 19/59] [D loss: 0.631895] [G loss: 0.808729]\n",
"[Epoch 48/200] [Batch 20/59] [D loss: 0.653451] [G loss: 0.801346]\n",
"[Epoch 48/200] [Batch 21/59] [D loss: 0.637916] [G loss: 0.673118]\n",
"[Epoch 48/200] [Batch 22/59] [D loss: 0.644516] [G loss: 0.869877]\n",
"[Epoch 48/200] [Batch 23/59] [D loss: 0.628691] [G loss: 0.850952]\n",
"[Epoch 48/200] [Batch 24/59] [D loss: 0.640749] [G loss: 0.714301]\n",
"[Epoch 48/200] [Batch 25/59] [D loss: 0.643218] [G loss: 0.685720]\n",
"[Epoch 48/200] [Batch 26/59] [D loss: 0.650065] [G loss: 0.956491]\n",
"[Epoch 48/200] [Batch 27/59] [D loss: 0.636522] [G loss: 0.849419]\n",
"[Epoch 48/200] [Batch 28/59] [D loss: 0.637583] [G loss: 0.746868]\n",
"[Epoch 48/200] [Batch 29/59] [D loss: 0.613917] [G loss: 0.849003]\n",
"[Epoch 48/200] [Batch 30/59] [D loss: 0.634292] [G loss: 0.796940]\n",
"[Epoch 48/200] [Batch 31/59] [D loss: 0.644308] [G loss: 0.772130]\n",
"[Epoch 48/200] [Batch 32/59] [D loss: 0.619704] [G loss: 0.775981]\n",
"[Epoch 48/200] [Batch 33/59] [D loss: 0.653059] [G loss: 0.770771]\n",
"[Epoch 48/200] [Batch 34/59] [D loss: 0.699690] [G loss: 0.816842]\n",
"[Epoch 48/200] [Batch 35/59] [D loss: 0.657526] [G loss: 0.846863]\n",
"[Epoch 48/200] [Batch 36/59] [D loss: 0.611209] [G loss: 0.841566]\n",
"[Epoch 48/200] [Batch 37/59] [D loss: 0.635774] [G loss: 0.745801]\n",
"[Epoch 48/200] [Batch 38/59] [D loss: 0.648581] [G loss: 0.795748]\n",
"[Epoch 48/200] [Batch 39/59] [D loss: 0.654355] [G loss: 0.885903]\n",
"[Epoch 48/200] [Batch 40/59] [D loss: 0.643709] [G loss: 0.738493]\n",
"[Epoch 48/200] [Batch 41/59] [D loss: 0.665767] [G loss: 0.843202]\n",
"[Epoch 48/200] [Batch 42/59] [D loss: 0.648292] [G loss: 0.841373]\n",
"[Epoch 48/200] [Batch 43/59] [D loss: 0.641768] [G loss: 0.786527]\n",
"[Epoch 48/200] [Batch 44/59] [D loss: 0.658845] [G loss: 0.835956]\n",
"[Epoch 48/200] [Batch 45/59] [D loss: 0.655844] [G loss: 0.745807]\n",
"[Epoch 48/200] [Batch 46/59] [D loss: 0.639460] [G loss: 0.756149]\n",
"[Epoch 48/200] [Batch 47/59] [D loss: 0.632014] [G loss: 0.904233]\n",
"[Epoch 48/200] [Batch 48/59] [D loss: 0.612420] [G loss: 0.754114]\n",
"[Epoch 48/200] [Batch 49/59] [D loss: 0.644857] [G loss: 0.759348]\n",
"[Epoch 48/200] [Batch 50/59] [D loss: 0.660375] [G loss: 0.795284]\n",
"[Epoch 48/200] [Batch 51/59] [D loss: 0.669351] [G loss: 0.823053]\n",
"[Epoch 48/200] [Batch 52/59] [D loss: 0.663286] [G loss: 0.698168]\n",
"[Epoch 48/200] [Batch 53/59] [D loss: 0.650530] [G loss: 0.886538]\n",
"[Epoch 48/200] [Batch 54/59] [D loss: 0.627232] [G loss: 0.796694]\n",
"[Epoch 48/200] [Batch 55/59] [D loss: 0.663003] [G loss: 0.716191]\n",
"[Epoch 48/200] [Batch 56/59] [D loss: 0.617775] [G loss: 0.935046]\n",
"[Epoch 48/200] [Batch 57/59] [D loss: 0.647026] [G loss: 0.803038]\n",
"[Epoch 48/200] [Batch 58/59] [D loss: 0.632144] [G loss: 0.731496]\n",
"[Epoch 49/200] [Batch 0/59] [D loss: 0.627807] [G loss: 0.881724]\n",
"[Epoch 49/200] [Batch 1/59] [D loss: 0.633980] [G loss: 0.837983]\n",
"[Epoch 49/200] [Batch 2/59] [D loss: 0.612584] [G loss: 0.768690]\n",
"[Epoch 49/200] [Batch 3/59] [D loss: 0.636166] [G loss: 0.906842]\n",
"[Epoch 49/200] [Batch 4/59] [D loss: 0.607249] [G loss: 0.780681]\n",
"[Epoch 49/200] [Batch 5/59] [D loss: 0.619477] [G loss: 0.753798]\n",
"[Epoch 49/200] [Batch 6/59] [D loss: 0.636848] [G loss: 0.849281]\n",
"[Epoch 49/200] [Batch 7/59] [D loss: 0.625344] [G loss: 0.752525]\n",
"[Epoch 49/200] [Batch 8/59] [D loss: 0.658053] [G loss: 0.769634]\n",
"[Epoch 49/200] [Batch 9/59] [D loss: 0.665498] [G loss: 0.878006]\n",
"[Epoch 49/200] [Batch 10/59] [D loss: 0.645450] [G loss: 0.778856]\n",
"[Epoch 49/200] [Batch 11/59] [D loss: 0.666467] [G loss: 0.846379]\n",
"[Epoch 49/200] [Batch 12/59] [D loss: 0.663206] [G loss: 0.775131]\n",
"[Epoch 49/200] [Batch 13/59] [D loss: 0.652341] [G loss: 0.780313]\n",
"[Epoch 49/200] [Batch 14/59] [D loss: 0.624651] [G loss: 0.775349]\n",
"[Epoch 49/200] [Batch 15/59] [D loss: 0.638883] [G loss: 0.830499]\n",
"[Epoch 49/200] [Batch 16/59] [D loss: 0.610828] [G loss: 0.772532]\n",
"[Epoch 49/200] [Batch 17/59] [D loss: 0.613131] [G loss: 0.784768]\n",
"[Epoch 49/200] [Batch 18/59] [D loss: 0.630820] [G loss: 0.810330]\n",
"[Epoch 49/200] [Batch 19/59] [D loss: 0.602213] [G loss: 0.706040]\n",
"[Epoch 49/200] [Batch 20/59] [D loss: 0.623110] [G loss: 0.947509]\n",
"[Epoch 49/200] [Batch 21/59] [D loss: 0.606883] [G loss: 0.798224]\n",
"[Epoch 49/200] [Batch 22/59] [D loss: 0.635233] [G loss: 0.822472]\n",
"[Epoch 49/200] [Batch 23/59] [D loss: 0.635358] [G loss: 0.817966]\n",
"[Epoch 49/200] [Batch 24/59] [D loss: 0.637413] [G loss: 0.825450]\n",
"[Epoch 49/200] [Batch 25/59] [D loss: 0.647526] [G loss: 0.752366]\n",
"[Epoch 49/200] [Batch 26/59] [D loss: 0.638326] [G loss: 0.993666]\n",
"[Epoch 49/200] [Batch 27/59] [D loss: 0.644775] [G loss: 0.685313]\n",
"[Epoch 49/200] [Batch 28/59] [D loss: 0.645518] [G loss: 0.799418]\n",
"[Epoch 49/200] [Batch 29/59] [D loss: 0.666443] [G loss: 0.830526]\n",
"[Epoch 49/200] [Batch 30/59] [D loss: 0.663966] [G loss: 0.759858]\n",
"[Epoch 49/200] [Batch 31/59] [D loss: 0.625624] [G loss: 0.806403]\n",
"[Epoch 49/200] [Batch 32/59] [D loss: 0.668784] [G loss: 0.926256]\n",
"[Epoch 49/200] [Batch 33/59] [D loss: 0.626665] [G loss: 0.742100]\n",
"[Epoch 49/200] [Batch 34/59] [D loss: 0.660701] [G loss: 0.820814]\n",
"[Epoch 49/200] [Batch 35/59] [D loss: 0.665413] [G loss: 0.946519]\n",
"[Epoch 49/200] [Batch 36/59] [D loss: 0.610817] [G loss: 0.723904]\n",
"[Epoch 49/200] [Batch 37/59] [D loss: 0.647903] [G loss: 0.815816]\n",
"[Epoch 49/200] [Batch 38/59] [D loss: 0.613511] [G loss: 0.987824]\n",
"[Epoch 49/200] [Batch 39/59] [D loss: 0.587365] [G loss: 0.783526]\n",
"[Epoch 49/200] [Batch 40/59] [D loss: 0.649646] [G loss: 0.893550]\n",
"[Epoch 49/200] [Batch 41/59] [D loss: 0.615623] [G loss: 0.862699]\n",
"[Epoch 49/200] [Batch 42/59] [D loss: 0.635900] [G loss: 0.796572]\n",
"[Epoch 49/200] [Batch 43/59] [D loss: 0.621698] [G loss: 0.771291]\n",
"[Epoch 49/200] [Batch 44/59] [D loss: 0.650204] [G loss: 0.768933]\n",
"[Epoch 49/200] [Batch 45/59] [D loss: 0.650131] [G loss: 1.022239]\n",
"[Epoch 49/200] [Batch 46/59] [D loss: 0.619774] [G loss: 0.671510]\n",
"[Epoch 49/200] [Batch 47/59] [D loss: 0.639856] [G loss: 0.763444]\n",
"[Epoch 49/200] [Batch 48/59] [D loss: 0.643570] [G loss: 0.908854]\n",
"[Epoch 49/200] [Batch 49/59] [D loss: 0.649719] [G loss: 0.696090]\n",
"[Epoch 49/200] [Batch 50/59] [D loss: 0.619963] [G loss: 0.820897]\n",
"[Epoch 49/200] [Batch 51/59] [D loss: 0.674407] [G loss: 0.825243]\n",
"[Epoch 49/200] [Batch 52/59] [D loss: 0.659491] [G loss: 0.761293]\n",
"[Epoch 49/200] [Batch 53/59] [D loss: 0.636977] [G loss: 0.789033]\n",
"[Epoch 49/200] [Batch 54/59] [D loss: 0.621879] [G loss: 0.832962]\n",
"[Epoch 49/200] [Batch 55/59] [D loss: 0.608357] [G loss: 0.685266]\n",
"[Epoch 49/200] [Batch 56/59] [D loss: 0.626423] [G loss: 0.845292]\n",
"[Epoch 49/200] [Batch 57/59] [D loss: 0.625876] [G loss: 0.876358]\n",
"[Epoch 49/200] [Batch 58/59] [D loss: 0.630331] [G loss: 0.745423]\n",
"[Epoch 50/200] [Batch 0/59] [D loss: 0.638745] [G loss: 0.830078]\n",
"[Epoch 50/200] [Batch 1/59] [D loss: 0.633713] [G loss: 0.873796]\n",
"[Epoch 50/200] [Batch 2/59] [D loss: 0.625077] [G loss: 0.812447]\n",
"[Epoch 50/200] [Batch 3/59] [D loss: 0.612148] [G loss: 0.731272]\n",
"[Epoch 50/200] [Batch 4/59] [D loss: 0.641737] [G loss: 0.866211]\n",
"[Epoch 50/200] [Batch 5/59] [D loss: 0.641172] [G loss: 0.865521]\n",
"[Epoch 50/200] [Batch 6/59] [D loss: 0.612609] [G loss: 0.751044]\n",
"[Epoch 50/200] [Batch 7/59] [D loss: 0.639261] [G loss: 0.815037]\n",
"[Epoch 50/200] [Batch 8/59] [D loss: 0.645811] [G loss: 0.831367]\n",
"[Epoch 50/200] [Batch 9/59] [D loss: 0.652987] [G loss: 0.823055]\n",
"[Epoch 50/200] [Batch 10/59] [D loss: 0.647903] [G loss: 0.690821]\n",
"[Epoch 50/200] [Batch 11/59] [D loss: 0.648829] [G loss: 0.897127]\n",
"[Epoch 50/200] [Batch 12/59] [D loss: 0.665888] [G loss: 0.802460]\n",
"[Epoch 50/200] [Batch 13/59] [D loss: 0.631326] [G loss: 0.656487]\n",
"[Epoch 50/200] [Batch 14/59] [D loss: 0.625305] [G loss: 1.112048]\n",
"[Epoch 50/200] [Batch 15/59] [D loss: 0.631230] [G loss: 0.787244]\n",
"[Epoch 50/200] [Batch 16/59] [D loss: 0.604852] [G loss: 0.666687]\n",
"[Epoch 50/200] [Batch 17/59] [D loss: 0.605499] [G loss: 1.002526]\n",
"[Epoch 50/200] [Batch 18/59] [D loss: 0.629244] [G loss: 0.857190]\n",
"[Epoch 50/200] [Batch 19/59] [D loss: 0.628169] [G loss: 0.697902]\n",
"[Epoch 50/200] [Batch 20/59] [D loss: 0.669912] [G loss: 0.755530]\n",
"[Epoch 50/200] [Batch 21/59] [D loss: 0.658261] [G loss: 0.867238]\n",
"[Epoch 50/200] [Batch 22/59] [D loss: 0.649733] [G loss: 0.736142]\n",
"[Epoch 50/200] [Batch 23/59] [D loss: 0.665574] [G loss: 0.695032]\n",
"[Epoch 50/200] [Batch 24/59] [D loss: 0.623030] [G loss: 0.864901]\n",
"[Epoch 50/200] [Batch 25/59] [D loss: 0.639345] [G loss: 0.949036]\n",
"[Epoch 50/200] [Batch 26/59] [D loss: 0.633561] [G loss: 0.682153]\n",
"[Epoch 50/200] [Batch 27/59] [D loss: 0.640775] [G loss: 0.688441]\n",
"[Epoch 50/200] [Batch 28/59] [D loss: 0.637022] [G loss: 0.904094]\n",
"[Epoch 50/200] [Batch 29/59] [D loss: 0.622060] [G loss: 0.862518]\n",
"[Epoch 50/200] [Batch 30/59] [D loss: 0.640749] [G loss: 0.731715]\n",
"[Epoch 50/200] [Batch 31/59] [D loss: 0.645473] [G loss: 0.813013]\n",
"[Epoch 50/200] [Batch 32/59] [D loss: 0.659403] [G loss: 0.981361]\n",
"[Epoch 50/200] [Batch 33/59] [D loss: 0.639016] [G loss: 0.686589]\n",
"[Epoch 50/200] [Batch 34/59] [D loss: 0.638526] [G loss: 0.732330]\n",
"[Epoch 50/200] [Batch 35/59] [D loss: 0.655202] [G loss: 0.983893]\n",
"[Epoch 50/200] [Batch 36/59] [D loss: 0.598997] [G loss: 0.829322]\n",
"[Epoch 50/200] [Batch 37/59] [D loss: 0.647607] [G loss: 0.622081]\n",
"[Epoch 50/200] [Batch 38/59] [D loss: 0.618968] [G loss: 1.005672]\n",
"[Epoch 50/200] [Batch 39/59] [D loss: 0.634890] [G loss: 0.919499]\n",
"[Epoch 50/200] [Batch 40/59] [D loss: 0.640181] [G loss: 0.635660]\n",
"[Epoch 50/200] [Batch 41/59] [D loss: 0.667420] [G loss: 0.872022]\n",
"[Epoch 50/200] [Batch 42/59] [D loss: 0.670248] [G loss: 0.816381]\n",
"[Epoch 50/200] [Batch 43/59] [D loss: 0.670129] [G loss: 0.758320]\n",
"[Epoch 50/200] [Batch 44/59] [D loss: 0.628532] [G loss: 0.746689]\n",
"[Epoch 50/200] [Batch 45/59] [D loss: 0.653600] [G loss: 0.804872]\n",
"[Epoch 50/200] [Batch 46/59] [D loss: 0.648738] [G loss: 0.878313]\n",
"[Epoch 50/200] [Batch 47/59] [D loss: 0.616821] [G loss: 0.741609]\n",
"[Epoch 50/200] [Batch 48/59] [D loss: 0.631923] [G loss: 0.710803]\n",
"[Epoch 50/200] [Batch 49/59] [D loss: 0.657182] [G loss: 0.946761]\n",
"[Epoch 50/200] [Batch 50/59] [D loss: 0.595556] [G loss: 0.849759]\n",
"[Epoch 50/200] [Batch 51/59] [D loss: 0.593927] [G loss: 0.718927]\n",
"[Epoch 50/200] [Batch 52/59] [D loss: 0.601480] [G loss: 0.854058]\n",
"[Epoch 50/200] [Batch 53/59] [D loss: 0.597906] [G loss: 0.787875]\n",
"[Epoch 50/200] [Batch 54/59] [D loss: 0.617541] [G loss: 0.726536]\n",
"[Epoch 50/200] [Batch 55/59] [D loss: 0.621148] [G loss: 0.902273]\n",
"[Epoch 50/200] [Batch 56/59] [D loss: 0.622225] [G loss: 0.821702]\n",
"[Epoch 50/200] [Batch 57/59] [D loss: 0.650331] [G loss: 0.775957]\n",
"[Epoch 50/200] [Batch 58/59] [D loss: 0.651228] [G loss: 0.845902]\n",
"[Epoch 51/200] [Batch 0/59] [D loss: 0.641647] [G loss: 0.821201]\n",
"[Epoch 51/200] [Batch 1/59] [D loss: 0.631777] [G loss: 0.791893]\n",
"[Epoch 51/200] [Batch 2/59] [D loss: 0.640911] [G loss: 0.755562]\n",
"[Epoch 51/200] [Batch 3/59] [D loss: 0.642001] [G loss: 0.812731]\n",
"[Epoch 51/200] [Batch 4/59] [D loss: 0.651338] [G loss: 0.828636]\n",
"[Epoch 51/200] [Batch 5/59] [D loss: 0.606925] [G loss: 0.777382]\n",
"[Epoch 51/200] [Batch 6/59] [D loss: 0.630267] [G loss: 0.711380]\n",
"[Epoch 51/200] [Batch 7/59] [D loss: 0.634353] [G loss: 0.729459]\n",
"[Epoch 51/200] [Batch 8/59] [D loss: 0.665543] [G loss: 0.939878]\n",
"[Epoch 51/200] [Batch 9/59] [D loss: 0.599185] [G loss: 0.784000]\n",
"[Epoch 51/200] [Batch 10/59] [D loss: 0.637172] [G loss: 0.673374]\n",
"[Epoch 51/200] [Batch 11/59] [D loss: 0.624898] [G loss: 0.835361]\n",
"[Epoch 51/200] [Batch 12/59] [D loss: 0.633848] [G loss: 0.867491]\n",
"[Epoch 51/200] [Batch 13/59] [D loss: 0.610039] [G loss: 0.851194]\n",
"[Epoch 51/200] [Batch 14/59] [D loss: 0.594792] [G loss: 0.883625]\n",
"[Epoch 51/200] [Batch 15/59] [D loss: 0.620334] [G loss: 0.835758]\n",
"[Epoch 51/200] [Batch 16/59] [D loss: 0.605228] [G loss: 0.803891]\n",
"[Epoch 51/200] [Batch 17/59] [D loss: 0.637479] [G loss: 0.799961]\n",
"[Epoch 51/200] [Batch 18/59] [D loss: 0.597055] [G loss: 0.862934]\n",
"[Epoch 51/200] [Batch 19/59] [D loss: 0.649524] [G loss: 0.791477]\n",
"[Epoch 51/200] [Batch 20/59] [D loss: 0.635958] [G loss: 0.858007]\n",
"[Epoch 51/200] [Batch 21/59] [D loss: 0.634491] [G loss: 0.758580]\n",
"[Epoch 51/200] [Batch 22/59] [D loss: 0.650471] [G loss: 0.783441]\n",
"[Epoch 51/200] [Batch 23/59] [D loss: 0.647200] [G loss: 0.768489]\n",
"[Epoch 51/200] [Batch 24/59] [D loss: 0.625901] [G loss: 0.751876]\n",
"[Epoch 51/200] [Batch 25/59] [D loss: 0.631149] [G loss: 0.861897]\n",
"[Epoch 51/200] [Batch 26/59] [D loss: 0.641458] [G loss: 0.822369]\n",
"[Epoch 51/200] [Batch 27/59] [D loss: 0.617790] [G loss: 0.806131]\n",
"[Epoch 51/200] [Batch 28/59] [D loss: 0.622111] [G loss: 0.760581]\n",
"[Epoch 51/200] [Batch 29/59] [D loss: 0.608655] [G loss: 0.796081]\n",
"[Epoch 51/200] [Batch 30/59] [D loss: 0.612447] [G loss: 0.829974]\n",
"[Epoch 51/200] [Batch 31/59] [D loss: 0.621134] [G loss: 0.886557]\n",
"[Epoch 51/200] [Batch 32/59] [D loss: 0.582275] [G loss: 0.825139]\n",
"[Epoch 51/200] [Batch 33/59] [D loss: 0.611128] [G loss: 0.831396]\n",
"[Epoch 51/200] [Batch 34/59] [D loss: 0.644346] [G loss: 0.724209]\n",
"[Epoch 51/200] [Batch 35/59] [D loss: 0.632116] [G loss: 0.994852]\n",
"[Epoch 51/200] [Batch 36/59] [D loss: 0.610295] [G loss: 0.749300]\n",
"[Epoch 51/200] [Batch 37/59] [D loss: 0.610650] [G loss: 0.657057]\n",
"[Epoch 51/200] [Batch 38/59] [D loss: 0.648454] [G loss: 0.929078]\n",
"[Epoch 51/200] [Batch 39/59] [D loss: 0.653143] [G loss: 0.796370]\n",
"[Epoch 51/200] [Batch 40/59] [D loss: 0.637518] [G loss: 0.755976]\n",
"[Epoch 51/200] [Batch 41/59] [D loss: 0.629731] [G loss: 0.904540]\n",
"[Epoch 51/200] [Batch 42/59] [D loss: 0.642886] [G loss: 0.756982]\n",
"[Epoch 51/200] [Batch 43/59] [D loss: 0.657048] [G loss: 0.840493]\n",
"[Epoch 51/200] [Batch 44/59] [D loss: 0.656599] [G loss: 0.727652]\n",
"[Epoch 51/200] [Batch 45/59] [D loss: 0.665612] [G loss: 0.834960]\n",
"[Epoch 51/200] [Batch 46/59] [D loss: 0.653088] [G loss: 0.812920]\n",
"[Epoch 51/200] [Batch 47/59] [D loss: 0.642365] [G loss: 0.775366]\n",
"[Epoch 51/200] [Batch 48/59] [D loss: 0.619232] [G loss: 0.824654]\n",
"[Epoch 51/200] [Batch 49/59] [D loss: 0.649724] [G loss: 0.803166]\n",
"[Epoch 51/200] [Batch 50/59] [D loss: 0.601940] [G loss: 0.857878]\n",
"[Epoch 51/200] [Batch 51/59] [D loss: 0.626426] [G loss: 0.804829]\n",
"[Epoch 51/200] [Batch 52/59] [D loss: 0.630305] [G loss: 0.896057]\n",
"[Epoch 51/200] [Batch 53/59] [D loss: 0.659559] [G loss: 0.808868]\n",
"[Epoch 51/200] [Batch 54/59] [D loss: 0.605613] [G loss: 0.779805]\n",
"[Epoch 51/200] [Batch 55/59] [D loss: 0.637470] [G loss: 0.859927]\n",
"[Epoch 51/200] [Batch 56/59] [D loss: 0.613098] [G loss: 0.790177]\n",
"[Epoch 51/200] [Batch 57/59] [D loss: 0.608208] [G loss: 0.768526]\n",
"[Epoch 51/200] [Batch 58/59] [D loss: 0.639083] [G loss: 0.895519]\n",
"[Epoch 52/200] [Batch 0/59] [D loss: 0.675588] [G loss: 0.638980]\n",
"[Epoch 52/200] [Batch 1/59] [D loss: 0.612179] [G loss: 1.045007]\n",
"[Epoch 52/200] [Batch 2/59] [D loss: 0.649773] [G loss: 0.723417]\n",
"[Epoch 52/200] [Batch 3/59] [D loss: 0.631720] [G loss: 0.820539]\n",
"[Epoch 52/200] [Batch 4/59] [D loss: 0.638769] [G loss: 0.838801]\n",
"[Epoch 52/200] [Batch 5/59] [D loss: 0.632369] [G loss: 0.761900]\n",
"[Epoch 52/200] [Batch 6/59] [D loss: 0.633837] [G loss: 0.733399]\n",
"[Epoch 52/200] [Batch 7/59] [D loss: 0.653595] [G loss: 0.862304]\n",
"[Epoch 52/200] [Batch 8/59] [D loss: 0.623532] [G loss: 0.753416]\n",
"[Epoch 52/200] [Batch 9/59] [D loss: 0.651754] [G loss: 0.780965]\n",
"[Epoch 52/200] [Batch 10/59] [D loss: 0.663823] [G loss: 0.922191]\n",
"[Epoch 52/200] [Batch 11/59] [D loss: 0.652062] [G loss: 0.978918]\n",
"[Epoch 52/200] [Batch 12/59] [D loss: 0.615759] [G loss: 0.747713]\n",
"[Epoch 52/200] [Batch 13/59] [D loss: 0.634506] [G loss: 0.765151]\n",
"[Epoch 52/200] [Batch 14/59] [D loss: 0.638855] [G loss: 0.837375]\n",
"[Epoch 52/200] [Batch 15/59] [D loss: 0.618224] [G loss: 0.822745]\n",
"[Epoch 52/200] [Batch 16/59] [D loss: 0.610581] [G loss: 0.790959]\n",
"[Epoch 52/200] [Batch 17/59] [D loss: 0.630242] [G loss: 0.741067]\n",
"[Epoch 52/200] [Batch 18/59] [D loss: 0.618810] [G loss: 0.870275]\n",
"[Epoch 52/200] [Batch 19/59] [D loss: 0.653947] [G loss: 0.777243]\n",
"[Epoch 52/200] [Batch 20/59] [D loss: 0.614740] [G loss: 0.819666]\n",
"[Epoch 52/200] [Batch 21/59] [D loss: 0.621225] [G loss: 0.812567]\n",
"[Epoch 52/200] [Batch 22/59] [D loss: 0.632743] [G loss: 0.714193]\n",
"[Epoch 52/200] [Batch 23/59] [D loss: 0.655030] [G loss: 0.921232]\n",
"[Epoch 52/200] [Batch 24/59] [D loss: 0.640667] [G loss: 0.761176]\n",
"[Epoch 52/200] [Batch 25/59] [D loss: 0.601739] [G loss: 0.794575]\n",
"[Epoch 52/200] [Batch 26/59] [D loss: 0.591303] [G loss: 0.864754]\n",
"[Epoch 52/200] [Batch 27/59] [D loss: 0.648677] [G loss: 0.721568]\n",
"[Epoch 52/200] [Batch 28/59] [D loss: 0.639585] [G loss: 0.870360]\n",
"[Epoch 52/200] [Batch 29/59] [D loss: 0.629189] [G loss: 0.770303]\n",
"[Epoch 52/200] [Batch 30/59] [D loss: 0.617071] [G loss: 0.768030]\n",
"[Epoch 52/200] [Batch 31/59] [D loss: 0.648907] [G loss: 0.861158]\n",
"[Epoch 52/200] [Batch 32/59] [D loss: 0.639672] [G loss: 0.779607]\n",
"[Epoch 52/200] [Batch 33/59] [D loss: 0.629128] [G loss: 0.798824]\n",
"[Epoch 52/200] [Batch 34/59] [D loss: 0.637793] [G loss: 0.889566]\n",
"[Epoch 52/200] [Batch 35/59] [D loss: 0.611414] [G loss: 0.824846]\n",
"[Epoch 52/200] [Batch 36/59] [D loss: 0.610637] [G loss: 0.820446]\n",
"[Epoch 52/200] [Batch 37/59] [D loss: 0.586186] [G loss: 0.899713]\n",
"[Epoch 52/200] [Batch 38/59] [D loss: 0.640709] [G loss: 0.856903]\n",
"[Epoch 52/200] [Batch 39/59] [D loss: 0.615146] [G loss: 0.869917]\n",
"[Epoch 52/200] [Batch 40/59] [D loss: 0.635597] [G loss: 0.790217]\n",
"[Epoch 52/200] [Batch 41/59] [D loss: 0.636468] [G loss: 0.874377]\n",
"[Epoch 52/200] [Batch 42/59] [D loss: 0.640043] [G loss: 0.828597]\n",
"[Epoch 52/200] [Batch 43/59] [D loss: 0.640460] [G loss: 0.729581]\n",
"[Epoch 52/200] [Batch 44/59] [D loss: 0.660698] [G loss: 0.756925]\n",
"[Epoch 52/200] [Batch 45/59] [D loss: 0.653297] [G loss: 0.937936]\n",
"[Epoch 52/200] [Batch 46/59] [D loss: 0.638089] [G loss: 0.650589]\n",
"[Epoch 52/200] [Batch 47/59] [D loss: 0.643290] [G loss: 0.868465]\n",
"[Epoch 52/200] [Batch 48/59] [D loss: 0.639807] [G loss: 0.942226]\n",
"[Epoch 52/200] [Batch 49/59] [D loss: 0.629963] [G loss: 0.691005]\n",
"[Epoch 52/200] [Batch 50/59] [D loss: 0.620334] [G loss: 0.705092]\n",
"[Epoch 52/200] [Batch 51/59] [D loss: 0.636014] [G loss: 0.947888]\n",
"[Epoch 52/200] [Batch 52/59] [D loss: 0.605637] [G loss: 0.762254]\n",
"[Epoch 52/200] [Batch 53/59] [D loss: 0.604185] [G loss: 0.914953]\n",
"[Epoch 52/200] [Batch 54/59] [D loss: 0.608689] [G loss: 0.807860]\n",
"[Epoch 52/200] [Batch 55/59] [D loss: 0.646652] [G loss: 0.864959]\n",
"[Epoch 52/200] [Batch 56/59] [D loss: 0.604701] [G loss: 0.768702]\n",
"[Epoch 52/200] [Batch 57/59] [D loss: 0.610069] [G loss: 0.920197]\n",
"[Epoch 52/200] [Batch 58/59] [D loss: 0.629665] [G loss: 0.842527]\n",
"[Epoch 53/200] [Batch 0/59] [D loss: 0.606934] [G loss: 0.770100]\n",
"[Epoch 53/200] [Batch 1/59] [D loss: 0.641863] [G loss: 0.685695]\n",
"[Epoch 53/200] [Batch 2/59] [D loss: 0.646733] [G loss: 0.907325]\n",
"[Epoch 53/200] [Batch 3/59] [D loss: 0.649035] [G loss: 0.839332]\n",
"[Epoch 53/200] [Batch 4/59] [D loss: 0.651271] [G loss: 0.686266]\n",
"[Epoch 53/200] [Batch 5/59] [D loss: 0.661534] [G loss: 0.866113]\n",
"[Epoch 53/200] [Batch 6/59] [D loss: 0.621910] [G loss: 0.832695]\n",
"[Epoch 53/200] [Batch 7/59] [D loss: 0.636095] [G loss: 0.743575]\n",
"[Epoch 53/200] [Batch 8/59] [D loss: 0.625049] [G loss: 0.930314]\n",
"[Epoch 53/200] [Batch 9/59] [D loss: 0.622680] [G loss: 0.805086]\n",
"[Epoch 53/200] [Batch 10/59] [D loss: 0.613787] [G loss: 0.699462]\n",
"[Epoch 53/200] [Batch 11/59] [D loss: 0.646464] [G loss: 0.950683]\n",
"[Epoch 53/200] [Batch 12/59] [D loss: 0.611262] [G loss: 0.767113]\n",
"[Epoch 53/200] [Batch 13/59] [D loss: 0.596119] [G loss: 0.798731]\n",
"[Epoch 53/200] [Batch 14/59] [D loss: 0.619906] [G loss: 0.828093]\n",
"[Epoch 53/200] [Batch 15/59] [D loss: 0.630336] [G loss: 0.807858]\n",
"[Epoch 53/200] [Batch 16/59] [D loss: 0.603323] [G loss: 0.793772]\n",
"[Epoch 53/200] [Batch 17/59] [D loss: 0.578876] [G loss: 0.832159]\n",
"[Epoch 53/200] [Batch 18/59] [D loss: 0.620734] [G loss: 0.909790]\n",
"[Epoch 53/200] [Batch 19/59] [D loss: 0.618936] [G loss: 0.772044]\n",
"[Epoch 53/200] [Batch 20/59] [D loss: 0.661043] [G loss: 0.740817]\n",
"[Epoch 53/200] [Batch 21/59] [D loss: 0.685013] [G loss: 0.882140]\n",
"[Epoch 53/200] [Batch 22/59] [D loss: 0.648905] [G loss: 0.756447]\n",
"[Epoch 53/200] [Batch 23/59] [D loss: 0.657583] [G loss: 0.845876]\n",
"[Epoch 53/200] [Batch 24/59] [D loss: 0.642464] [G loss: 0.797922]\n",
"[Epoch 53/200] [Batch 25/59] [D loss: 0.660466] [G loss: 0.782495]\n",
"[Epoch 53/200] [Batch 26/59] [D loss: 0.672242] [G loss: 0.754305]\n",
"[Epoch 53/200] [Batch 27/59] [D loss: 0.668916] [G loss: 0.974732]\n",
"[Epoch 53/200] [Batch 28/59] [D loss: 0.583383] [G loss: 0.673478]\n",
"[Epoch 53/200] [Batch 29/59] [D loss: 0.625092] [G loss: 0.786528]\n",
"[Epoch 53/200] [Batch 30/59] [D loss: 0.640973] [G loss: 1.016264]\n",
"[Epoch 53/200] [Batch 31/59] [D loss: 0.615058] [G loss: 0.778085]\n",
"[Epoch 53/200] [Batch 32/59] [D loss: 0.596932] [G loss: 0.650243]\n",
"[Epoch 53/200] [Batch 33/59] [D loss: 0.613856] [G loss: 0.973154]\n",
"[Epoch 53/200] [Batch 34/59] [D loss: 0.622988] [G loss: 0.789967]\n",
"[Epoch 53/200] [Batch 35/59] [D loss: 0.601265] [G loss: 0.761259]\n",
"[Epoch 53/200] [Batch 36/59] [D loss: 0.630328] [G loss: 0.819013]\n",
"[Epoch 53/200] [Batch 37/59] [D loss: 0.615384] [G loss: 0.955161]\n",
"[Epoch 53/200] [Batch 38/59] [D loss: 0.649486] [G loss: 0.803807]\n",
"[Epoch 53/200] [Batch 39/59] [D loss: 0.632914] [G loss: 0.720198]\n",
"[Epoch 53/200] [Batch 40/59] [D loss: 0.646773] [G loss: 0.869407]\n",
"[Epoch 53/200] [Batch 41/59] [D loss: 0.651624] [G loss: 0.946139]\n",
"[Epoch 53/200] [Batch 42/59] [D loss: 0.644235] [G loss: 0.748019]\n",
"[Epoch 53/200] [Batch 43/59] [D loss: 0.583022] [G loss: 0.844807]\n",
"[Epoch 53/200] [Batch 44/59] [D loss: 0.654336] [G loss: 0.768193]\n",
"[Epoch 53/200] [Batch 45/59] [D loss: 0.655113] [G loss: 0.810862]\n",
"[Epoch 53/200] [Batch 46/59] [D loss: 0.671188] [G loss: 0.835467]\n",
"[Epoch 53/200] [Batch 47/59] [D loss: 0.611316] [G loss: 0.717272]\n",
"[Epoch 53/200] [Batch 48/59] [D loss: 0.666257] [G loss: 0.760846]\n",
"[Epoch 53/200] [Batch 49/59] [D loss: 0.625909] [G loss: 0.799752]\n",
"[Epoch 53/200] [Batch 50/59] [D loss: 0.633613] [G loss: 0.660975]\n",
"[Epoch 53/200] [Batch 51/59] [D loss: 0.640206] [G loss: 0.864625]\n",
"[Epoch 53/200] [Batch 52/59] [D loss: 0.654676] [G loss: 0.830691]\n",
"[Epoch 53/200] [Batch 53/59] [D loss: 0.616182] [G loss: 0.873512]\n",
"[Epoch 53/200] [Batch 54/59] [D loss: 0.613911] [G loss: 0.717793]\n",
"[Epoch 53/200] [Batch 55/59] [D loss: 0.630029] [G loss: 0.958912]\n",
"[Epoch 53/200] [Batch 56/59] [D loss: 0.575197] [G loss: 0.830136]\n",
"[Epoch 53/200] [Batch 57/59] [D loss: 0.592512] [G loss: 0.829961]\n",
"[Epoch 53/200] [Batch 58/59] [D loss: 0.626696] [G loss: 0.815550]\n",
"[Epoch 54/200] [Batch 0/59] [D loss: 0.628794] [G loss: 0.919765]\n",
"[Epoch 54/200] [Batch 1/59] [D loss: 0.592549] [G loss: 0.761950]\n",
"[Epoch 54/200] [Batch 2/59] [D loss: 0.642004] [G loss: 0.709217]\n",
"[Epoch 54/200] [Batch 3/59] [D loss: 0.647381] [G loss: 0.929643]\n",
"[Epoch 54/200] [Batch 4/59] [D loss: 0.642451] [G loss: 0.821344]\n",
"[Epoch 54/200] [Batch 5/59] [D loss: 0.644174] [G loss: 0.666933]\n",
"[Epoch 54/200] [Batch 6/59] [D loss: 0.622925] [G loss: 0.765394]\n",
"[Epoch 54/200] [Batch 7/59] [D loss: 0.641725] [G loss: 0.940573]\n",
"[Epoch 54/200] [Batch 8/59] [D loss: 0.672138] [G loss: 0.678795]\n",
"[Epoch 54/200] [Batch 9/59] [D loss: 0.635240] [G loss: 0.819801]\n",
"[Epoch 54/200] [Batch 10/59] [D loss: 0.638464] [G loss: 0.963747]\n",
"[Epoch 54/200] [Batch 11/59] [D loss: 0.641270] [G loss: 0.686600]\n",
"[Epoch 54/200] [Batch 12/59] [D loss: 0.633205] [G loss: 0.745616]\n",
"[Epoch 54/200] [Batch 13/59] [D loss: 0.585158] [G loss: 0.969221]\n",
"[Epoch 54/200] [Batch 14/59] [D loss: 0.619706] [G loss: 0.847537]\n",
"[Epoch 54/200] [Batch 15/59] [D loss: 0.603405] [G loss: 0.748738]\n",
"[Epoch 54/200] [Batch 16/59] [D loss: 0.631424] [G loss: 0.778709]\n",
"[Epoch 54/200] [Batch 17/59] [D loss: 0.616032] [G loss: 0.848731]\n",
"[Epoch 54/200] [Batch 18/59] [D loss: 0.597940] [G loss: 0.679387]\n",
"[Epoch 54/200] [Batch 19/59] [D loss: 0.640579] [G loss: 0.876829]\n",
"[Epoch 54/200] [Batch 20/59] [D loss: 0.618243] [G loss: 0.989066]\n",
"[Epoch 54/200] [Batch 21/59] [D loss: 0.644637] [G loss: 0.783848]\n",
"[Epoch 54/200] [Batch 22/59] [D loss: 0.613172] [G loss: 0.708848]\n",
"[Epoch 54/200] [Batch 23/59] [D loss: 0.623664] [G loss: 0.954659]\n",
"[Epoch 54/200] [Batch 24/59] [D loss: 0.639691] [G loss: 0.866560]\n",
"[Epoch 54/200] [Batch 25/59] [D loss: 0.613627] [G loss: 0.802944]\n",
"[Epoch 54/200] [Batch 26/59] [D loss: 0.653401] [G loss: 0.715177]\n",
"[Epoch 54/200] [Batch 27/59] [D loss: 0.682608] [G loss: 1.034922]\n",
"[Epoch 54/200] [Batch 28/59] [D loss: 0.673065] [G loss: 0.686608]\n",
"[Epoch 54/200] [Batch 29/59] [D loss: 0.644787] [G loss: 0.799425]\n",
"[Epoch 54/200] [Batch 30/59] [D loss: 0.610260] [G loss: 0.867331]\n",
"[Epoch 54/200] [Batch 31/59] [D loss: 0.665697] [G loss: 0.724518]\n",
"[Epoch 54/200] [Batch 32/59] [D loss: 0.610968] [G loss: 0.745239]\n",
"[Epoch 54/200] [Batch 33/59] [D loss: 0.617678] [G loss: 0.898180]\n",
"[Epoch 54/200] [Batch 34/59] [D loss: 0.586316] [G loss: 0.870178]\n",
"[Epoch 54/200] [Batch 35/59] [D loss: 0.647894] [G loss: 0.796392]\n",
"[Epoch 54/200] [Batch 36/59] [D loss: 0.580259] [G loss: 0.844981]\n",
"[Epoch 54/200] [Batch 37/59] [D loss: 0.624211] [G loss: 0.972116]\n",
"[Epoch 54/200] [Batch 38/59] [D loss: 0.608386] [G loss: 0.665793]\n",
"[Epoch 54/200] [Batch 39/59] [D loss: 0.586897] [G loss: 0.935176]\n",
"[Epoch 54/200] [Batch 40/59] [D loss: 0.589147] [G loss: 0.999187]\n",
"[Epoch 54/200] [Batch 41/59] [D loss: 0.595663] [G loss: 0.761111]\n",
"[Epoch 54/200] [Batch 42/59] [D loss: 0.642335] [G loss: 0.818792]\n",
"[Epoch 54/200] [Batch 43/59] [D loss: 0.633605] [G loss: 0.934722]\n",
"[Epoch 54/200] [Batch 44/59] [D loss: 0.627711] [G loss: 0.746051]\n",
"[Epoch 54/200] [Batch 45/59] [D loss: 0.648482] [G loss: 0.731705]\n",
"[Epoch 54/200] [Batch 46/59] [D loss: 0.659157] [G loss: 0.789699]\n",
"[Epoch 54/200] [Batch 47/59] [D loss: 0.660467] [G loss: 0.778513]\n",
"[Epoch 54/200] [Batch 48/59] [D loss: 0.670576] [G loss: 0.727220]\n",
"[Epoch 54/200] [Batch 49/59] [D loss: 0.637608] [G loss: 0.780185]\n",
"[Epoch 54/200] [Batch 50/59] [D loss: 0.676366] [G loss: 0.746611]\n",
"[Epoch 54/200] [Batch 51/59] [D loss: 0.619515] [G loss: 0.823961]\n",
"[Epoch 54/200] [Batch 52/59] [D loss: 0.617950] [G loss: 0.850919]\n",
"[Epoch 54/200] [Batch 53/59] [D loss: 0.608926] [G loss: 0.755451]\n",
"[Epoch 54/200] [Batch 54/59] [D loss: 0.640651] [G loss: 0.827156]\n",
"[Epoch 54/200] [Batch 55/59] [D loss: 0.582191] [G loss: 0.995524]\n",
"[Epoch 54/200] [Batch 56/59] [D loss: 0.594078] [G loss: 0.806515]\n",
"[Epoch 54/200] [Batch 57/59] [D loss: 0.590793] [G loss: 0.798669]\n",
"[Epoch 54/200] [Batch 58/59] [D loss: 0.621658] [G loss: 0.890159]\n",
"[Epoch 55/200] [Batch 0/59] [D loss: 0.593544] [G loss: 0.845173]\n",
"[Epoch 55/200] [Batch 1/59] [D loss: 0.614864] [G loss: 0.742244]\n",
"[Epoch 55/200] [Batch 2/59] [D loss: 0.651530] [G loss: 0.782104]\n",
"[Epoch 55/200] [Batch 3/59] [D loss: 0.627285] [G loss: 0.864216]\n",
"[Epoch 55/200] [Batch 4/59] [D loss: 0.665951] [G loss: 0.894316]\n",
"[Epoch 55/200] [Batch 5/59] [D loss: 0.648037] [G loss: 0.755000]\n",
"[Epoch 55/200] [Batch 6/59] [D loss: 0.611788] [G loss: 0.812441]\n",
"[Epoch 55/200] [Batch 7/59] [D loss: 0.678595] [G loss: 0.780537]\n",
"[Epoch 55/200] [Batch 8/59] [D loss: 0.651796] [G loss: 0.757843]\n",
"[Epoch 55/200] [Batch 9/59] [D loss: 0.639934] [G loss: 0.767555]\n",
"[Epoch 55/200] [Batch 10/59] [D loss: 0.662147] [G loss: 0.979067]\n",
"[Epoch 55/200] [Batch 11/59] [D loss: 0.622941] [G loss: 0.774112]\n",
"[Epoch 55/200] [Batch 12/59] [D loss: 0.628892] [G loss: 0.769182]\n",
"[Epoch 55/200] [Batch 13/59] [D loss: 0.642906] [G loss: 1.059508]\n",
"[Epoch 55/200] [Batch 14/59] [D loss: 0.613789] [G loss: 0.684590]\n",
"[Epoch 55/200] [Batch 15/59] [D loss: 0.629797] [G loss: 0.827584]\n",
"[Epoch 55/200] [Batch 16/59] [D loss: 0.620898] [G loss: 1.007275]\n",
"[Epoch 55/200] [Batch 17/59] [D loss: 0.611081] [G loss: 0.697531]\n",
"[Epoch 55/200] [Batch 18/59] [D loss: 0.610126] [G loss: 0.836358]\n",
"[Epoch 55/200] [Batch 19/59] [D loss: 0.612048] [G loss: 0.864653]\n",
"[Epoch 55/200] [Batch 20/59] [D loss: 0.615716] [G loss: 0.834468]\n",
"[Epoch 55/200] [Batch 21/59] [D loss: 0.569683] [G loss: 0.786310]\n",
"[Epoch 55/200] [Batch 22/59] [D loss: 0.627633] [G loss: 0.803155]\n",
"[Epoch 55/200] [Batch 23/59] [D loss: 0.608012] [G loss: 0.834876]\n",
"[Epoch 55/200] [Batch 24/59] [D loss: 0.630315] [G loss: 0.822474]\n",
"[Epoch 55/200] [Batch 25/59] [D loss: 0.636553] [G loss: 0.756314]\n",
"[Epoch 55/200] [Batch 26/59] [D loss: 0.604615] [G loss: 0.987261]\n",
"[Epoch 55/200] [Batch 27/59] [D loss: 0.639417] [G loss: 0.701505]\n",
"[Epoch 55/200] [Batch 28/59] [D loss: 0.645896] [G loss: 0.917264]\n",
"[Epoch 55/200] [Batch 29/59] [D loss: 0.635917] [G loss: 0.762463]\n",
"[Epoch 55/200] [Batch 30/59] [D loss: 0.623495] [G loss: 0.758568]\n",
"[Epoch 55/200] [Batch 31/59] [D loss: 0.629609] [G loss: 0.863542]\n",
"[Epoch 55/200] [Batch 32/59] [D loss: 0.667316] [G loss: 0.842258]\n",
"[Epoch 55/200] [Batch 33/59] [D loss: 0.646345] [G loss: 0.674152]\n",
"[Epoch 55/200] [Batch 34/59] [D loss: 0.700985] [G loss: 0.842023]\n",
"[Epoch 55/200] [Batch 35/59] [D loss: 0.630432] [G loss: 0.760004]\n",
"[Epoch 55/200] [Batch 36/59] [D loss: 0.635296] [G loss: 0.977090]\n",
"[Epoch 55/200] [Batch 37/59] [D loss: 0.602499] [G loss: 0.758855]\n",
"[Epoch 55/200] [Batch 38/59] [D loss: 0.579022] [G loss: 0.761194]\n",
"[Epoch 55/200] [Batch 39/59] [D loss: 0.637452] [G loss: 1.013844]\n",
"[Epoch 55/200] [Batch 40/59] [D loss: 0.623177] [G loss: 0.855423]\n",
"[Epoch 55/200] [Batch 41/59] [D loss: 0.577007] [G loss: 0.853833]\n",
"[Epoch 55/200] [Batch 42/59] [D loss: 0.593706] [G loss: 0.808092]\n",
"[Epoch 55/200] [Batch 43/59] [D loss: 0.600653] [G loss: 0.892564]\n",
"[Epoch 55/200] [Batch 44/59] [D loss: 0.594779] [G loss: 0.832465]\n",
"[Epoch 55/200] [Batch 45/59] [D loss: 0.597985] [G loss: 0.829323]\n",
"[Epoch 55/200] [Batch 46/59] [D loss: 0.624659] [G loss: 0.787763]\n",
"[Epoch 55/200] [Batch 47/59] [D loss: 0.665171] [G loss: 0.773995]\n",
"[Epoch 55/200] [Batch 48/59] [D loss: 0.676781] [G loss: 0.772235]\n",
"[Epoch 55/200] [Batch 49/59] [D loss: 0.631093] [G loss: 0.761533]\n",
"[Epoch 55/200] [Batch 50/59] [D loss: 0.656290] [G loss: 0.777393]\n",
"[Epoch 55/200] [Batch 51/59] [D loss: 0.648877] [G loss: 0.809141]\n",
"[Epoch 55/200] [Batch 52/59] [D loss: 0.640628] [G loss: 0.726710]\n",
"[Epoch 55/200] [Batch 53/59] [D loss: 0.632954] [G loss: 0.833384]\n",
"[Epoch 55/200] [Batch 54/59] [D loss: 0.623459] [G loss: 0.798339]\n",
"[Epoch 55/200] [Batch 55/59] [D loss: 0.627094] [G loss: 0.784711]\n",
"[Epoch 55/200] [Batch 56/59] [D loss: 0.637218] [G loss: 0.943962]\n",
"[Epoch 55/200] [Batch 57/59] [D loss: 0.608906] [G loss: 0.771149]\n",
"[Epoch 55/200] [Batch 58/59] [D loss: 0.604544] [G loss: 0.844477]\n",
"[Epoch 56/200] [Batch 0/59] [D loss: 0.627441] [G loss: 0.884642]\n",
"[Epoch 56/200] [Batch 1/59] [D loss: 0.603105] [G loss: 0.726838]\n",
"[Epoch 56/200] [Batch 2/59] [D loss: 0.626076] [G loss: 0.931872]\n",
"[Epoch 56/200] [Batch 3/59] [D loss: 0.657209] [G loss: 0.901560]\n",
"[Epoch 56/200] [Batch 4/59] [D loss: 0.653520] [G loss: 0.816869]\n",
"[Epoch 56/200] [Batch 5/59] [D loss: 0.663664] [G loss: 0.873397]\n",
"[Epoch 56/200] [Batch 6/59] [D loss: 0.637120] [G loss: 0.747174]\n",
"[Epoch 56/200] [Batch 7/59] [D loss: 0.631277] [G loss: 0.791916]\n",
"[Epoch 56/200] [Batch 8/59] [D loss: 0.661959] [G loss: 0.872577]\n",
"[Epoch 56/200] [Batch 9/59] [D loss: 0.608502] [G loss: 0.922157]\n",
"[Epoch 56/200] [Batch 10/59] [D loss: 0.605633] [G loss: 0.731878]\n",
"[Epoch 56/200] [Batch 11/59] [D loss: 0.606060] [G loss: 0.909646]\n",
"[Epoch 56/200] [Batch 12/59] [D loss: 0.631789] [G loss: 0.818626]\n",
"[Epoch 56/200] [Batch 13/59] [D loss: 0.628448] [G loss: 0.883743]\n",
"[Epoch 56/200] [Batch 14/59] [D loss: 0.645907] [G loss: 0.876296]\n",
"[Epoch 56/200] [Batch 15/59] [D loss: 0.642001] [G loss: 0.706736]\n",
"[Epoch 56/200] [Batch 16/59] [D loss: 0.623246] [G loss: 0.943674]\n",
"[Epoch 56/200] [Batch 17/59] [D loss: 0.675383] [G loss: 0.838820]\n",
"[Epoch 56/200] [Batch 18/59] [D loss: 0.659564] [G loss: 0.852708]\n",
"[Epoch 56/200] [Batch 19/59] [D loss: 0.641254] [G loss: 0.776018]\n",
"[Epoch 56/200] [Batch 20/59] [D loss: 0.619409] [G loss: 0.788258]\n",
"[Epoch 56/200] [Batch 21/59] [D loss: 0.622471] [G loss: 0.824937]\n",
"[Epoch 56/200] [Batch 22/59] [D loss: 0.624682] [G loss: 0.816000]\n",
"[Epoch 56/200] [Batch 23/59] [D loss: 0.643599] [G loss: 0.983448]\n",
"[Epoch 56/200] [Batch 24/59] [D loss: 0.663656] [G loss: 0.713695]\n",
"[Epoch 56/200] [Batch 25/59] [D loss: 0.580526] [G loss: 0.912943]\n",
"[Epoch 56/200] [Batch 26/59] [D loss: 0.611443] [G loss: 0.942268]\n",
"[Epoch 56/200] [Batch 27/59] [D loss: 0.618839] [G loss: 0.765945]\n",
"[Epoch 56/200] [Batch 28/59] [D loss: 0.599383] [G loss: 0.810364]\n",
"[Epoch 56/200] [Batch 29/59] [D loss: 0.617315] [G loss: 0.955942]\n",
"[Epoch 56/200] [Batch 30/59] [D loss: 0.642560] [G loss: 0.730451]\n",
"[Epoch 56/200] [Batch 31/59] [D loss: 0.637303] [G loss: 0.847995]\n",
"[Epoch 56/200] [Batch 32/59] [D loss: 0.638179] [G loss: 0.841983]\n",
"[Epoch 56/200] [Batch 33/59] [D loss: 0.618036] [G loss: 0.730301]\n",
"[Epoch 56/200] [Batch 34/59] [D loss: 0.660647] [G loss: 0.885009]\n",
"[Epoch 56/200] [Batch 35/59] [D loss: 0.636252] [G loss: 0.736156]\n",
"[Epoch 56/200] [Batch 36/59] [D loss: 0.609494] [G loss: 0.829733]\n",
"[Epoch 56/200] [Batch 37/59] [D loss: 0.637604] [G loss: 0.897984]\n",
"[Epoch 56/200] [Batch 38/59] [D loss: 0.632882] [G loss: 0.781318]\n",
"[Epoch 56/200] [Batch 39/59] [D loss: 0.654881] [G loss: 0.697765]\n",
"[Epoch 56/200] [Batch 40/59] [D loss: 0.667830] [G loss: 0.943560]\n",
"[Epoch 56/200] [Batch 41/59] [D loss: 0.601308] [G loss: 0.728087]\n",
"[Epoch 56/200] [Batch 42/59] [D loss: 0.639402] [G loss: 0.747122]\n",
"[Epoch 56/200] [Batch 43/59] [D loss: 0.642821] [G loss: 0.949930]\n",
"[Epoch 56/200] [Batch 44/59] [D loss: 0.646977] [G loss: 0.752001]\n",
"[Epoch 56/200] [Batch 45/59] [D loss: 0.603847] [G loss: 0.841837]\n",
"[Epoch 56/200] [Batch 46/59] [D loss: 0.617638] [G loss: 0.805829]\n",
"[Epoch 56/200] [Batch 47/59] [D loss: 0.657848] [G loss: 0.883613]\n",
"[Epoch 56/200] [Batch 48/59] [D loss: 0.624741] [G loss: 0.768269]\n",
"[Epoch 56/200] [Batch 49/59] [D loss: 0.628956] [G loss: 0.822134]\n",
"[Epoch 56/200] [Batch 50/59] [D loss: 0.640828] [G loss: 0.849786]\n",
"[Epoch 56/200] [Batch 51/59] [D loss: 0.630780] [G loss: 0.943000]\n",
"[Epoch 56/200] [Batch 52/59] [D loss: 0.643415] [G loss: 0.710126]\n",
"[Epoch 56/200] [Batch 53/59] [D loss: 0.643633] [G loss: 0.882689]\n",
"[Epoch 56/200] [Batch 54/59] [D loss: 0.612933] [G loss: 0.877552]\n",
"[Epoch 56/200] [Batch 55/59] [D loss: 0.634347] [G loss: 0.724851]\n",
"[Epoch 56/200] [Batch 56/59] [D loss: 0.589574] [G loss: 0.762235]\n",
"[Epoch 56/200] [Batch 57/59] [D loss: 0.609097] [G loss: 1.015443]\n",
"[Epoch 56/200] [Batch 58/59] [D loss: 0.632284] [G loss: 0.770250]\n",
"[Epoch 57/200] [Batch 0/59] [D loss: 0.640149] [G loss: 0.663076]\n",
"[Epoch 57/200] [Batch 1/59] [D loss: 0.643595] [G loss: 0.983477]\n",
"[Epoch 57/200] [Batch 2/59] [D loss: 0.633603] [G loss: 0.910408]\n",
"[Epoch 57/200] [Batch 3/59] [D loss: 0.642857] [G loss: 0.651728]\n",
"[Epoch 57/200] [Batch 4/59] [D loss: 0.623559] [G loss: 0.842115]\n",
"[Epoch 57/200] [Batch 5/59] [D loss: 0.621465] [G loss: 0.794577]\n",
"[Epoch 57/200] [Batch 6/59] [D loss: 0.602511] [G loss: 0.829525]\n",
"[Epoch 57/200] [Batch 7/59] [D loss: 0.658635] [G loss: 0.726726]\n",
"[Epoch 57/200] [Batch 8/59] [D loss: 0.641006] [G loss: 0.925232]\n",
"[Epoch 57/200] [Batch 9/59] [D loss: 0.621384] [G loss: 0.685296]\n",
"[Epoch 57/200] [Batch 10/59] [D loss: 0.637455] [G loss: 0.797301]\n",
"[Epoch 57/200] [Batch 11/59] [D loss: 0.648639] [G loss: 0.954742]\n",
"[Epoch 57/200] [Batch 12/59] [D loss: 0.604317] [G loss: 0.721869]\n",
"[Epoch 57/200] [Batch 13/59] [D loss: 0.633894] [G loss: 0.876703]\n",
"[Epoch 57/200] [Batch 14/59] [D loss: 0.608090] [G loss: 0.884873]\n",
"[Epoch 57/200] [Batch 15/59] [D loss: 0.631352] [G loss: 0.708748]\n",
"[Epoch 57/200] [Batch 16/59] [D loss: 0.608546] [G loss: 0.852217]\n",
"[Epoch 57/200] [Batch 17/59] [D loss: 0.637546] [G loss: 0.816135]\n",
"[Epoch 57/200] [Batch 18/59] [D loss: 0.635298] [G loss: 0.734893]\n",
"[Epoch 57/200] [Batch 19/59] [D loss: 0.645992] [G loss: 0.973103]\n",
"[Epoch 57/200] [Batch 20/59] [D loss: 0.679567] [G loss: 0.611620]\n",
"[Epoch 57/200] [Batch 21/59] [D loss: 0.642210] [G loss: 0.785770]\n",
"[Epoch 57/200] [Batch 22/59] [D loss: 0.640321] [G loss: 0.857710]\n",
"[Epoch 57/200] [Batch 23/59] [D loss: 0.642564] [G loss: 0.956677]\n",
"[Epoch 57/200] [Batch 24/59] [D loss: 0.619304] [G loss: 0.807128]\n",
"[Epoch 57/200] [Batch 25/59] [D loss: 0.637758] [G loss: 0.721552]\n",
"[Epoch 57/200] [Batch 26/59] [D loss: 0.643957] [G loss: 0.977992]\n",
"[Epoch 57/200] [Batch 27/59] [D loss: 0.642675] [G loss: 0.866422]\n",
"[Epoch 57/200] [Batch 28/59] [D loss: 0.629339] [G loss: 0.725051]\n",
"[Epoch 57/200] [Batch 29/59] [D loss: 0.610585] [G loss: 0.867524]\n",
"[Epoch 57/200] [Batch 30/59] [D loss: 0.603608] [G loss: 0.915625]\n",
"[Epoch 57/200] [Batch 31/59] [D loss: 0.579196] [G loss: 0.833690]\n",
"[Epoch 57/200] [Batch 32/59] [D loss: 0.585202] [G loss: 0.787529]\n",
"[Epoch 57/200] [Batch 33/59] [D loss: 0.619953] [G loss: 0.801119]\n",
"[Epoch 57/200] [Batch 34/59] [D loss: 0.609560] [G loss: 0.958574]\n",
"[Epoch 57/200] [Batch 35/59] [D loss: 0.609406] [G loss: 0.702725]\n",
"[Epoch 57/200] [Batch 36/59] [D loss: 0.620025] [G loss: 0.831691]\n",
"[Epoch 57/200] [Batch 37/59] [D loss: 0.653821] [G loss: 0.932620]\n",
"[Epoch 57/200] [Batch 38/59] [D loss: 0.626528] [G loss: 0.789965]\n",
"[Epoch 57/200] [Batch 39/59] [D loss: 0.632561] [G loss: 0.761612]\n",
"[Epoch 57/200] [Batch 40/59] [D loss: 0.624085] [G loss: 0.820076]\n",
"[Epoch 57/200] [Batch 41/59] [D loss: 0.606159] [G loss: 0.869464]\n",
"[Epoch 57/200] [Batch 42/59] [D loss: 0.639637] [G loss: 0.748755]\n",
"[Epoch 57/200] [Batch 43/59] [D loss: 0.619479] [G loss: 0.779586]\n",
"[Epoch 57/200] [Batch 44/59] [D loss: 0.620088] [G loss: 0.828551]\n",
"[Epoch 57/200] [Batch 45/59] [D loss: 0.646662] [G loss: 0.757948]\n",
"[Epoch 57/200] [Batch 46/59] [D loss: 0.619036] [G loss: 0.796908]\n",
"[Epoch 57/200] [Batch 47/59] [D loss: 0.601168] [G loss: 0.859821]\n",
"[Epoch 57/200] [Batch 48/59] [D loss: 0.652609] [G loss: 0.824824]\n",
"[Epoch 57/200] [Batch 49/59] [D loss: 0.603410] [G loss: 0.745937]\n",
"[Epoch 57/200] [Batch 50/59] [D loss: 0.626267] [G loss: 0.973509]\n",
"[Epoch 57/200] [Batch 51/59] [D loss: 0.573563] [G loss: 0.770843]\n",
"[Epoch 57/200] [Batch 52/59] [D loss: 0.598384] [G loss: 0.774985]\n",
"[Epoch 57/200] [Batch 53/59] [D loss: 0.626273] [G loss: 0.981688]\n",
"[Epoch 57/200] [Batch 54/59] [D loss: 0.642919] [G loss: 0.841766]\n",
"[Epoch 57/200] [Batch 55/59] [D loss: 0.595926] [G loss: 0.808689]\n",
"[Epoch 57/200] [Batch 56/59] [D loss: 0.595263] [G loss: 0.843468]\n",
"[Epoch 57/200] [Batch 57/59] [D loss: 0.624361] [G loss: 0.846447]\n",
"[Epoch 57/200] [Batch 58/59] [D loss: 0.624621] [G loss: 0.842289]\n",
"[Epoch 58/200] [Batch 0/59] [D loss: 0.632060] [G loss: 0.857131]\n",
"[Epoch 58/200] [Batch 1/59] [D loss: 0.623591] [G loss: 0.787789]\n",
"[Epoch 58/200] [Batch 2/59] [D loss: 0.638115] [G loss: 0.999361]\n",
"[Epoch 58/200] [Batch 3/59] [D loss: 0.602028] [G loss: 0.763188]\n",
"[Epoch 58/200] [Batch 4/59] [D loss: 0.656930] [G loss: 0.791975]\n",
"[Epoch 58/200] [Batch 5/59] [D loss: 0.648113] [G loss: 0.961957]\n",
"[Epoch 58/200] [Batch 6/59] [D loss: 0.668504] [G loss: 0.694422]\n",
"[Epoch 58/200] [Batch 7/59] [D loss: 0.620250] [G loss: 0.778467]\n",
"[Epoch 58/200] [Batch 8/59] [D loss: 0.645862] [G loss: 0.809327]\n",
"[Epoch 58/200] [Batch 9/59] [D loss: 0.661701] [G loss: 0.870392]\n",
"[Epoch 58/200] [Batch 10/59] [D loss: 0.618168] [G loss: 0.772457]\n",
"[Epoch 58/200] [Batch 11/59] [D loss: 0.588873] [G loss: 0.815258]\n",
"[Epoch 58/200] [Batch 12/59] [D loss: 0.622229] [G loss: 0.848333]\n",
"[Epoch 58/200] [Batch 13/59] [D loss: 0.615324] [G loss: 0.886547]\n",
"[Epoch 58/200] [Batch 14/59] [D loss: 0.630886] [G loss: 0.825283]\n",
"[Epoch 58/200] [Batch 15/59] [D loss: 0.603576] [G loss: 0.881311]\n",
"[Epoch 58/200] [Batch 16/59] [D loss: 0.588748] [G loss: 0.734348]\n",
"[Epoch 58/200] [Batch 17/59] [D loss: 0.604947] [G loss: 0.922491]\n",
"[Epoch 58/200] [Batch 18/59] [D loss: 0.604596] [G loss: 0.814846]\n",
"[Epoch 58/200] [Batch 19/59] [D loss: 0.632592] [G loss: 0.858107]\n",
"[Epoch 58/200] [Batch 20/59] [D loss: 0.623257] [G loss: 0.832626]\n",
"[Epoch 58/200] [Batch 21/59] [D loss: 0.641303] [G loss: 0.850886]\n",
"[Epoch 58/200] [Batch 22/59] [D loss: 0.635728] [G loss: 0.802710]\n",
"[Epoch 58/200] [Batch 23/59] [D loss: 0.676267] [G loss: 0.841157]\n",
"[Epoch 58/200] [Batch 24/59] [D loss: 0.649999] [G loss: 0.755686]\n",
"[Epoch 58/200] [Batch 25/59] [D loss: 0.649136] [G loss: 0.891175]\n",
"[Epoch 58/200] [Batch 26/59] [D loss: 0.643848] [G loss: 0.660126]\n",
"[Epoch 58/200] [Batch 27/59] [D loss: 0.623059] [G loss: 0.948738]\n",
"[Epoch 58/200] [Batch 28/59] [D loss: 0.624524] [G loss: 0.870614]\n",
"[Epoch 58/200] [Batch 29/59] [D loss: 0.597643] [G loss: 0.800628]\n",
"[Epoch 58/200] [Batch 30/59] [D loss: 0.615273] [G loss: 0.873868]\n",
"[Epoch 58/200] [Batch 31/59] [D loss: 0.599867] [G loss: 0.834615]\n",
"[Epoch 58/200] [Batch 32/59] [D loss: 0.623287] [G loss: 0.777982]\n",
"[Epoch 58/200] [Batch 33/59] [D loss: 0.631336] [G loss: 0.933656]\n",
"[Epoch 58/200] [Batch 34/59] [D loss: 0.609694] [G loss: 0.860122]\n",
"[Epoch 58/200] [Batch 35/59] [D loss: 0.608877] [G loss: 0.754584]\n",
"[Epoch 58/200] [Batch 36/59] [D loss: 0.604258] [G loss: 0.917390]\n",
"[Epoch 58/200] [Batch 37/59] [D loss: 0.613273] [G loss: 0.916412]\n",
"[Epoch 58/200] [Batch 38/59] [D loss: 0.628102] [G loss: 0.806163]\n",
"[Epoch 58/200] [Batch 39/59] [D loss: 0.638656] [G loss: 0.818559]\n",
"[Epoch 58/200] [Batch 40/59] [D loss: 0.637425] [G loss: 0.904657]\n",
"[Epoch 58/200] [Batch 41/59] [D loss: 0.664369] [G loss: 0.759255]\n",
"[Epoch 58/200] [Batch 42/59] [D loss: 0.671628] [G loss: 0.761563]\n",
"[Epoch 58/200] [Batch 43/59] [D loss: 0.639053] [G loss: 0.809555]\n",
"[Epoch 58/200] [Batch 44/59] [D loss: 0.660658] [G loss: 0.839012]\n",
"[Epoch 58/200] [Batch 45/59] [D loss: 0.607599] [G loss: 0.721195]\n",
"[Epoch 58/200] [Batch 46/59] [D loss: 0.633321] [G loss: 0.789938]\n",
"[Epoch 58/200] [Batch 47/59] [D loss: 0.617144] [G loss: 0.931643]\n",
"[Epoch 58/200] [Batch 48/59] [D loss: 0.604339] [G loss: 0.828841]\n",
"[Epoch 58/200] [Batch 49/59] [D loss: 0.621000] [G loss: 0.729258]\n",
"[Epoch 58/200] [Batch 50/59] [D loss: 0.611513] [G loss: 0.960860]\n",
"[Epoch 58/200] [Batch 51/59] [D loss: 0.586623] [G loss: 0.812439]\n",
"[Epoch 58/200] [Batch 52/59] [D loss: 0.656315] [G loss: 0.775843]\n",
"[Epoch 58/200] [Batch 53/59] [D loss: 0.595760] [G loss: 1.049212]\n",
"[Epoch 58/200] [Batch 54/59] [D loss: 0.596953] [G loss: 0.753160]\n",
"[Epoch 58/200] [Batch 55/59] [D loss: 0.618723] [G loss: 0.833056]\n",
"[Epoch 58/200] [Batch 56/59] [D loss: 0.641006] [G loss: 0.847389]\n",
"[Epoch 58/200] [Batch 57/59] [D loss: 0.606457] [G loss: 0.870260]\n",
"[Epoch 58/200] [Batch 58/59] [D loss: 0.608073] [G loss: 0.818193]\n",
"[Epoch 59/200] [Batch 0/59] [D loss: 0.636168] [G loss: 0.825880]\n",
"[Epoch 59/200] [Batch 1/59] [D loss: 0.645320] [G loss: 0.808021]\n",
"[Epoch 59/200] [Batch 2/59] [D loss: 0.659102] [G loss: 0.820314]\n",
"[Epoch 59/200] [Batch 3/59] [D loss: 0.639749] [G loss: 0.834902]\n",
"[Epoch 59/200] [Batch 4/59] [D loss: 0.650365] [G loss: 0.775252]\n",
"[Epoch 59/200] [Batch 5/59] [D loss: 0.661425] [G loss: 0.776598]\n",
"[Epoch 59/200] [Batch 6/59] [D loss: 0.630927] [G loss: 0.785273]\n",
"[Epoch 59/200] [Batch 7/59] [D loss: 0.632485] [G loss: 0.773718]\n",
"[Epoch 59/200] [Batch 8/59] [D loss: 0.631404] [G loss: 0.823068]\n",
"[Epoch 59/200] [Batch 9/59] [D loss: 0.624787] [G loss: 0.881391]\n",
"[Epoch 59/200] [Batch 10/59] [D loss: 0.591756] [G loss: 0.719922]\n",
"[Epoch 59/200] [Batch 11/59] [D loss: 0.612775] [G loss: 0.809584]\n",
"[Epoch 59/200] [Batch 12/59] [D loss: 0.602908] [G loss: 0.914608]\n",
"[Epoch 59/200] [Batch 13/59] [D loss: 0.611587] [G loss: 0.777804]\n",
"[Epoch 59/200] [Batch 14/59] [D loss: 0.590952] [G loss: 0.810787]\n",
"[Epoch 59/200] [Batch 15/59] [D loss: 0.593106] [G loss: 0.961967]\n",
"[Epoch 59/200] [Batch 16/59] [D loss: 0.592872] [G loss: 0.753752]\n",
"[Epoch 59/200] [Batch 17/59] [D loss: 0.587283] [G loss: 0.884738]\n",
"[Epoch 59/200] [Batch 18/59] [D loss: 0.620431] [G loss: 1.002686]\n",
"[Epoch 59/200] [Batch 19/59] [D loss: 0.645774] [G loss: 0.659404]\n",
"[Epoch 59/200] [Batch 20/59] [D loss: 0.634273] [G loss: 0.859827]\n",
"[Epoch 59/200] [Batch 21/59] [D loss: 0.660306] [G loss: 0.998965]\n",
"[Epoch 59/200] [Batch 22/59] [D loss: 0.597302] [G loss: 0.752655]\n",
"[Epoch 59/200] [Batch 23/59] [D loss: 0.676458] [G loss: 0.755224]\n",
"[Epoch 59/200] [Batch 24/59] [D loss: 0.669751] [G loss: 1.022961]\n",
"[Epoch 59/200] [Batch 25/59] [D loss: 0.635539] [G loss: 0.801226]\n",
"[Epoch 59/200] [Batch 26/59] [D loss: 0.605426] [G loss: 0.703410]\n",
"[Epoch 59/200] [Batch 27/59] [D loss: 0.654026] [G loss: 0.969899]\n",
"[Epoch 59/200] [Batch 28/59] [D loss: 0.623919] [G loss: 0.855879]\n",
"[Epoch 59/200] [Batch 29/59] [D loss: 0.642080] [G loss: 0.725967]\n",
"[Epoch 59/200] [Batch 30/59] [D loss: 0.635525] [G loss: 0.912580]\n",
"[Epoch 59/200] [Batch 31/59] [D loss: 0.612106] [G loss: 0.878523]\n",
"[Epoch 59/200] [Batch 32/59] [D loss: 0.627584] [G loss: 0.692679]\n",
"[Epoch 59/200] [Batch 33/59] [D loss: 0.569717] [G loss: 0.894843]\n",
"[Epoch 59/200] [Batch 34/59] [D loss: 0.603353] [G loss: 0.892385]\n",
"[Epoch 59/200] [Batch 35/59] [D loss: 0.635813] [G loss: 0.774524]\n",
"[Epoch 59/200] [Batch 36/59] [D loss: 0.638436] [G loss: 0.936131]\n",
"[Epoch 59/200] [Batch 37/59] [D loss: 0.589879] [G loss: 0.722758]\n",
"[Epoch 59/200] [Batch 38/59] [D loss: 0.589714] [G loss: 0.865002]\n",
"[Epoch 59/200] [Batch 39/59] [D loss: 0.613889] [G loss: 0.863312]\n",
"[Epoch 59/200] [Batch 40/59] [D loss: 0.642362] [G loss: 0.659029]\n",
"[Epoch 59/200] [Batch 41/59] [D loss: 0.654680] [G loss: 0.818722]\n",
"[Epoch 59/200] [Batch 42/59] [D loss: 0.640363] [G loss: 0.805721]\n",
"[Epoch 59/200] [Batch 43/59] [D loss: 0.640713] [G loss: 0.728460]\n",
"[Epoch 59/200] [Batch 44/59] [D loss: 0.657507] [G loss: 0.784142]\n",
"[Epoch 59/200] [Batch 45/59] [D loss: 0.639639] [G loss: 0.756862]\n",
"[Epoch 59/200] [Batch 46/59] [D loss: 0.655136] [G loss: 0.982501]\n",
"[Epoch 59/200] [Batch 47/59] [D loss: 0.616476] [G loss: 0.738779]\n",
"[Epoch 59/200] [Batch 48/59] [D loss: 0.636001] [G loss: 0.689948]\n",
"[Epoch 59/200] [Batch 49/59] [D loss: 0.644304] [G loss: 1.111641]\n",
"[Epoch 59/200] [Batch 50/59] [D loss: 0.614610] [G loss: 0.722877]\n",
"[Epoch 59/200] [Batch 51/59] [D loss: 0.630046] [G loss: 0.787335]\n",
"[Epoch 59/200] [Batch 52/59] [D loss: 0.566033] [G loss: 1.064676]\n",
"[Epoch 59/200] [Batch 53/59] [D loss: 0.602207] [G loss: 0.808765]\n",
"[Epoch 59/200] [Batch 54/59] [D loss: 0.577369] [G loss: 0.822064]\n",
"[Epoch 59/200] [Batch 55/59] [D loss: 0.625418] [G loss: 0.973798]\n",
"[Epoch 59/200] [Batch 56/59] [D loss: 0.636198] [G loss: 0.818161]\n",
"[Epoch 59/200] [Batch 57/59] [D loss: 0.633356] [G loss: 0.685758]\n",
"[Epoch 59/200] [Batch 58/59] [D loss: 0.595649] [G loss: 0.849361]\n",
"[Epoch 60/200] [Batch 0/59] [D loss: 0.672749] [G loss: 0.843846]\n",
"[Epoch 60/200] [Batch 1/59] [D loss: 0.683399] [G loss: 0.633283]\n",
"[Epoch 60/200] [Batch 2/59] [D loss: 0.632878] [G loss: 0.851246]\n",
"[Epoch 60/200] [Batch 3/59] [D loss: 0.661814] [G loss: 0.749946]\n",
"[Epoch 60/200] [Batch 4/59] [D loss: 0.662747] [G loss: 0.870883]\n",
"[Epoch 60/200] [Batch 5/59] [D loss: 0.668933] [G loss: 0.861377]\n",
"[Epoch 60/200] [Batch 6/59] [D loss: 0.628115] [G loss: 0.880793]\n",
"[Epoch 60/200] [Batch 7/59] [D loss: 0.617657] [G loss: 0.857487]\n",
"[Epoch 60/200] [Batch 8/59] [D loss: 0.595589] [G loss: 0.798075]\n",
"[Epoch 60/200] [Batch 9/59] [D loss: 0.613547] [G loss: 0.862488]\n",
"[Epoch 60/200] [Batch 10/59] [D loss: 0.598134] [G loss: 0.866419]\n",
"[Epoch 60/200] [Batch 11/59] [D loss: 0.605525] [G loss: 0.849844]\n",
"[Epoch 60/200] [Batch 12/59] [D loss: 0.613308] [G loss: 0.823802]\n",
"[Epoch 60/200] [Batch 13/59] [D loss: 0.610682] [G loss: 1.010552]\n",
"[Epoch 60/200] [Batch 14/59] [D loss: 0.626344] [G loss: 0.759366]\n",
"[Epoch 60/200] [Batch 15/59] [D loss: 0.619512] [G loss: 0.842798]\n",
"[Epoch 60/200] [Batch 16/59] [D loss: 0.617993] [G loss: 0.872445]\n",
"[Epoch 60/200] [Batch 17/59] [D loss: 0.622874] [G loss: 0.773158]\n",
"[Epoch 60/200] [Batch 18/59] [D loss: 0.620900] [G loss: 0.781186]\n",
"[Epoch 60/200] [Batch 19/59] [D loss: 0.645649] [G loss: 0.830038]\n",
"[Epoch 60/200] [Batch 20/59] [D loss: 0.638314] [G loss: 0.714384]\n",
"[Epoch 60/200] [Batch 21/59] [D loss: 0.630440] [G loss: 0.767112]\n",
"[Epoch 60/200] [Batch 22/59] [D loss: 0.647713] [G loss: 0.873191]\n",
"[Epoch 60/200] [Batch 23/59] [D loss: 0.611370] [G loss: 0.821309]\n",
"[Epoch 60/200] [Batch 24/59] [D loss: 0.644306] [G loss: 0.756719]\n",
"[Epoch 60/200] [Batch 25/59] [D loss: 0.619192] [G loss: 0.827296]\n",
"[Epoch 60/200] [Batch 26/59] [D loss: 0.623299] [G loss: 0.962669]\n",
"[Epoch 60/200] [Batch 27/59] [D loss: 0.593327] [G loss: 0.761389]\n",
"[Epoch 60/200] [Batch 28/59] [D loss: 0.625753] [G loss: 0.811146]\n",
"[Epoch 60/200] [Batch 29/59] [D loss: 0.626857] [G loss: 0.892764]\n",
"[Epoch 60/200] [Batch 30/59] [D loss: 0.622565] [G loss: 0.825631]\n",
"[Epoch 60/200] [Batch 31/59] [D loss: 0.634267] [G loss: 0.784071]\n",
"[Epoch 60/200] [Batch 32/59] [D loss: 0.646075] [G loss: 0.895800]\n",
"[Epoch 60/200] [Batch 33/59] [D loss: 0.649651] [G loss: 0.735118]\n",
"[Epoch 60/200] [Batch 34/59] [D loss: 0.577736] [G loss: 0.883154]\n",
"[Epoch 60/200] [Batch 35/59] [D loss: 0.634001] [G loss: 0.898469]\n",
"[Epoch 60/200] [Batch 36/59] [D loss: 0.608880] [G loss: 0.830536]\n",
"[Epoch 60/200] [Batch 37/59] [D loss: 0.610720] [G loss: 0.729818]\n",
"[Epoch 60/200] [Batch 38/59] [D loss: 0.608352] [G loss: 0.964383]\n",
"[Epoch 60/200] [Batch 39/59] [D loss: 0.605068] [G loss: 0.719716]\n",
"[Epoch 60/200] [Batch 40/59] [D loss: 0.651264] [G loss: 0.775572]\n",
"[Epoch 60/200] [Batch 41/59] [D loss: 0.598443] [G loss: 0.913800]\n",
"[Epoch 60/200] [Batch 42/59] [D loss: 0.607916] [G loss: 0.832236]\n",
"[Epoch 60/200] [Batch 43/59] [D loss: 0.641254] [G loss: 0.814298]\n",
"[Epoch 60/200] [Batch 44/59] [D loss: 0.656032] [G loss: 0.935362]\n",
"[Epoch 60/200] [Batch 45/59] [D loss: 0.654004] [G loss: 0.672464]\n",
"[Epoch 60/200] [Batch 46/59] [D loss: 0.654541] [G loss: 0.844865]\n",
"[Epoch 60/200] [Batch 47/59] [D loss: 0.657379] [G loss: 0.779010]\n",
"[Epoch 60/200] [Batch 48/59] [D loss: 0.620804] [G loss: 0.765678]\n",
"[Epoch 60/200] [Batch 49/59] [D loss: 0.632680] [G loss: 0.884484]\n",
"[Epoch 60/200] [Batch 50/59] [D loss: 0.627213] [G loss: 0.814189]\n",
"[Epoch 60/200] [Batch 51/59] [D loss: 0.638054] [G loss: 0.793135]\n",
"[Epoch 60/200] [Batch 52/59] [D loss: 0.596160] [G loss: 0.848376]\n",
"[Epoch 60/200] [Batch 53/59] [D loss: 0.601195] [G loss: 1.022701]\n",
"[Epoch 60/200] [Batch 54/59] [D loss: 0.588912] [G loss: 0.652649]\n",
"[Epoch 60/200] [Batch 55/59] [D loss: 0.618794] [G loss: 0.945513]\n",
"[Epoch 60/200] [Batch 56/59] [D loss: 0.604338] [G loss: 0.933783]\n",
"[Epoch 60/200] [Batch 57/59] [D loss: 0.619620] [G loss: 0.716026]\n",
"[Epoch 60/200] [Batch 58/59] [D loss: 0.592092] [G loss: 0.915706]\n",
"[Epoch 61/200] [Batch 0/59] [D loss: 0.617795] [G loss: 0.772964]\n",
"[Epoch 61/200] [Batch 1/59] [D loss: 0.661221] [G loss: 0.763574]\n",
"[Epoch 61/200] [Batch 2/59] [D loss: 0.669352] [G loss: 1.102026]\n",
"[Epoch 61/200] [Batch 3/59] [D loss: 0.650600] [G loss: 0.610928]\n",
"[Epoch 61/200] [Batch 4/59] [D loss: 0.608881] [G loss: 0.810147]\n",
"[Epoch 61/200] [Batch 5/59] [D loss: 0.634069] [G loss: 0.933528]\n",
"[Epoch 61/200] [Batch 6/59] [D loss: 0.624602] [G loss: 0.758624]\n",
"[Epoch 61/200] [Batch 7/59] [D loss: 0.618784] [G loss: 0.737580]\n",
"[Epoch 61/200] [Batch 8/59] [D loss: 0.650770] [G loss: 1.017779]\n",
"[Epoch 61/200] [Batch 9/59] [D loss: 0.613539] [G loss: 0.718999]\n",
"[Epoch 61/200] [Batch 10/59] [D loss: 0.617875] [G loss: 0.762882]\n",
"[Epoch 61/200] [Batch 11/59] [D loss: 0.609852] [G loss: 1.051448]\n",
"[Epoch 61/200] [Batch 12/59] [D loss: 0.617798] [G loss: 0.779419]\n",
"[Epoch 61/200] [Batch 13/59] [D loss: 0.651602] [G loss: 0.728423]\n",
"[Epoch 61/200] [Batch 14/59] [D loss: 0.612461] [G loss: 0.931287]\n",
"[Epoch 61/200] [Batch 15/59] [D loss: 0.618390] [G loss: 0.857735]\n",
"[Epoch 61/200] [Batch 16/59] [D loss: 0.647866] [G loss: 0.715782]\n",
"[Epoch 61/200] [Batch 17/59] [D loss: 0.596494] [G loss: 0.844907]\n",
"[Epoch 61/200] [Batch 18/59] [D loss: 0.647970] [G loss: 0.997455]\n",
"[Epoch 61/200] [Batch 19/59] [D loss: 0.600171] [G loss: 0.730151]\n",
"[Epoch 61/200] [Batch 20/59] [D loss: 0.640620] [G loss: 0.778004]\n",
"[Epoch 61/200] [Batch 21/59] [D loss: 0.606016] [G loss: 1.035290]\n",
"[Epoch 61/200] [Batch 22/59] [D loss: 0.595105] [G loss: 0.900399]\n",
"[Epoch 61/200] [Batch 23/59] [D loss: 0.624471] [G loss: 0.713882]\n",
"[Epoch 61/200] [Batch 24/59] [D loss: 0.639736] [G loss: 0.945912]\n",
"[Epoch 61/200] [Batch 25/59] [D loss: 0.661236] [G loss: 0.901618]\n",
"[Epoch 61/200] [Batch 26/59] [D loss: 0.646977] [G loss: 0.723786]\n",
"[Epoch 61/200] [Batch 27/59] [D loss: 0.632513] [G loss: 0.904921]\n",
"[Epoch 61/200] [Batch 28/59] [D loss: 0.623919] [G loss: 0.884341]\n",
"[Epoch 61/200] [Batch 29/59] [D loss: 0.645277] [G loss: 0.649266]\n",
"[Epoch 61/200] [Batch 30/59] [D loss: 0.635511] [G loss: 0.947954]\n",
"[Epoch 61/200] [Batch 31/59] [D loss: 0.607965] [G loss: 0.817009]\n",
"[Epoch 61/200] [Batch 32/59] [D loss: 0.607638] [G loss: 0.747061]\n",
"[Epoch 61/200] [Batch 33/59] [D loss: 0.588722] [G loss: 0.972184]\n",
"[Epoch 61/200] [Batch 34/59] [D loss: 0.619450] [G loss: 0.827355]\n",
"[Epoch 61/200] [Batch 35/59] [D loss: 0.595614] [G loss: 0.848052]\n",
"[Epoch 61/200] [Batch 36/59] [D loss: 0.638563] [G loss: 1.065046]\n",
"[Epoch 61/200] [Batch 37/59] [D loss: 0.613539] [G loss: 0.703737]\n",
"[Epoch 61/200] [Batch 38/59] [D loss: 0.608306] [G loss: 0.890316]\n",
"[Epoch 61/200] [Batch 39/59] [D loss: 0.632347] [G loss: 0.923052]\n",
"[Epoch 61/200] [Batch 40/59] [D loss: 0.594143] [G loss: 0.756896]\n",
"[Epoch 61/200] [Batch 41/59] [D loss: 0.653289] [G loss: 0.699460]\n",
"[Epoch 61/200] [Batch 42/59] [D loss: 0.629955] [G loss: 0.885396]\n",
"[Epoch 61/200] [Batch 43/59] [D loss: 0.643578] [G loss: 0.825048]\n",
"[Epoch 61/200] [Batch 44/59] [D loss: 0.642390] [G loss: 0.770309]\n",
"[Epoch 61/200] [Batch 45/59] [D loss: 0.626369] [G loss: 0.746501]\n",
"[Epoch 61/200] [Batch 46/59] [D loss: 0.657480] [G loss: 0.900870]\n",
"[Epoch 61/200] [Batch 47/59] [D loss: 0.606076] [G loss: 0.803607]\n",
"[Epoch 61/200] [Batch 48/59] [D loss: 0.621983] [G loss: 0.761599]\n",
"[Epoch 61/200] [Batch 49/59] [D loss: 0.628539] [G loss: 0.771446]\n",
"[Epoch 61/200] [Batch 50/59] [D loss: 0.631663] [G loss: 0.830786]\n",
"[Epoch 61/200] [Batch 51/59] [D loss: 0.660664] [G loss: 0.812206]\n",
"[Epoch 61/200] [Batch 52/59] [D loss: 0.628941] [G loss: 0.744654]\n",
"[Epoch 61/200] [Batch 53/59] [D loss: 0.626160] [G loss: 0.752510]\n",
"[Epoch 61/200] [Batch 54/59] [D loss: 0.624377] [G loss: 0.910837]\n",
"[Epoch 61/200] [Batch 55/59] [D loss: 0.626292] [G loss: 0.786327]\n",
"[Epoch 61/200] [Batch 56/59] [D loss: 0.612279] [G loss: 0.761791]\n",
"[Epoch 61/200] [Batch 57/59] [D loss: 0.643835] [G loss: 0.889847]\n",
"[Epoch 61/200] [Batch 58/59] [D loss: 0.617033] [G loss: 0.752806]\n",
"[Epoch 62/200] [Batch 0/59] [D loss: 0.617663] [G loss: 0.919553]\n",
"[Epoch 62/200] [Batch 1/59] [D loss: 0.606596] [G loss: 0.776788]\n",
"[Epoch 62/200] [Batch 2/59] [D loss: 0.579398] [G loss: 0.826141]\n",
"[Epoch 62/200] [Batch 3/59] [D loss: 0.595912] [G loss: 0.831063]\n",
"[Epoch 62/200] [Batch 4/59] [D loss: 0.623063] [G loss: 0.786633]\n",
"[Epoch 62/200] [Batch 5/59] [D loss: 0.621946] [G loss: 0.993861]\n",
"[Epoch 62/200] [Batch 6/59] [D loss: 0.610896] [G loss: 0.799636]\n",
"[Epoch 62/200] [Batch 7/59] [D loss: 0.632549] [G loss: 0.708588]\n",
"[Epoch 62/200] [Batch 8/59] [D loss: 0.629486] [G loss: 0.914771]\n",
"[Epoch 62/200] [Batch 9/59] [D loss: 0.640214] [G loss: 0.772025]\n",
"[Epoch 62/200] [Batch 10/59] [D loss: 0.613695] [G loss: 0.674862]\n",
"[Epoch 62/200] [Batch 11/59] [D loss: 0.648012] [G loss: 0.974654]\n",
"[Epoch 62/200] [Batch 12/59] [D loss: 0.615711] [G loss: 0.743776]\n",
"[Epoch 62/200] [Batch 13/59] [D loss: 0.596174] [G loss: 0.693961]\n",
"[Epoch 62/200] [Batch 14/59] [D loss: 0.636072] [G loss: 0.950857]\n",
"[Epoch 62/200] [Batch 15/59] [D loss: 0.625793] [G loss: 0.770049]\n",
"[Epoch 62/200] [Batch 16/59] [D loss: 0.631130] [G loss: 0.655195]\n",
"[Epoch 62/200] [Batch 17/59] [D loss: 0.611508] [G loss: 1.042367]\n",
"[Epoch 62/200] [Batch 18/59] [D loss: 0.651367] [G loss: 0.897454]\n",
"[Epoch 62/200] [Batch 19/59] [D loss: 0.635153] [G loss: 0.669747]\n",
"[Epoch 62/200] [Batch 20/59] [D loss: 0.604707] [G loss: 1.002812]\n",
"[Epoch 62/200] [Batch 21/59] [D loss: 0.629937] [G loss: 1.007599]\n",
"[Epoch 62/200] [Batch 22/59] [D loss: 0.622182] [G loss: 0.679247]\n",
"[Epoch 62/200] [Batch 23/59] [D loss: 0.632503] [G loss: 0.848384]\n",
"[Epoch 62/200] [Batch 24/59] [D loss: 0.635751] [G loss: 0.942927]\n",
"[Epoch 62/200] [Batch 25/59] [D loss: 0.586789] [G loss: 0.845878]\n",
"[Epoch 62/200] [Batch 26/59] [D loss: 0.642638] [G loss: 0.810667]\n",
"[Epoch 62/200] [Batch 27/59] [D loss: 0.645456] [G loss: 0.879248]\n",
"[Epoch 62/200] [Batch 28/59] [D loss: 0.632628] [G loss: 0.861624]\n",
"[Epoch 62/200] [Batch 29/59] [D loss: 0.608036] [G loss: 0.720977]\n",
"[Epoch 62/200] [Batch 30/59] [D loss: 0.649103] [G loss: 0.834077]\n",
"[Epoch 62/200] [Batch 31/59] [D loss: 0.629462] [G loss: 0.925028]\n",
"[Epoch 62/200] [Batch 32/59] [D loss: 0.639001] [G loss: 0.769846]\n",
"[Epoch 62/200] [Batch 33/59] [D loss: 0.627813] [G loss: 0.754541]\n",
"[Epoch 62/200] [Batch 34/59] [D loss: 0.651060] [G loss: 0.873725]\n",
"[Epoch 62/200] [Batch 35/59] [D loss: 0.616426] [G loss: 0.715828]\n",
"[Epoch 62/200] [Batch 36/59] [D loss: 0.612888] [G loss: 0.871016]\n",
"[Epoch 62/200] [Batch 37/59] [D loss: 0.641783] [G loss: 0.896166]\n",
"[Epoch 62/200] [Batch 38/59] [D loss: 0.619489] [G loss: 0.674241]\n",
"[Epoch 62/200] [Batch 39/59] [D loss: 0.607151] [G loss: 0.921007]\n",
"[Epoch 62/200] [Batch 40/59] [D loss: 0.624031] [G loss: 0.855281]\n",
"[Epoch 62/200] [Batch 41/59] [D loss: 0.621670] [G loss: 0.866186]\n",
"[Epoch 62/200] [Batch 42/59] [D loss: 0.616582] [G loss: 0.777120]\n",
"[Epoch 62/200] [Batch 43/59] [D loss: 0.651462] [G loss: 0.940707]\n",
"[Epoch 62/200] [Batch 44/59] [D loss: 0.612925] [G loss: 0.790297]\n",
"[Epoch 62/200] [Batch 45/59] [D loss: 0.610857] [G loss: 0.749981]\n",
"[Epoch 62/200] [Batch 46/59] [D loss: 0.636472] [G loss: 0.942162]\n",
"[Epoch 62/200] [Batch 47/59] [D loss: 0.646264] [G loss: 0.857065]\n",
"[Epoch 62/200] [Batch 48/59] [D loss: 0.637658] [G loss: 0.809407]\n",
"[Epoch 62/200] [Batch 49/59] [D loss: 0.623142] [G loss: 0.673296]\n",
"[Epoch 62/200] [Batch 50/59] [D loss: 0.643302] [G loss: 0.868922]\n",
"[Epoch 62/200] [Batch 51/59] [D loss: 0.611956] [G loss: 0.810362]\n",
"[Epoch 62/200] [Batch 52/59] [D loss: 0.658695] [G loss: 0.724096]\n",
"[Epoch 62/200] [Batch 53/59] [D loss: 0.632016] [G loss: 0.936458]\n",
"[Epoch 62/200] [Batch 54/59] [D loss: 0.616320] [G loss: 0.814749]\n",
"[Epoch 62/200] [Batch 55/59] [D loss: 0.611195] [G loss: 0.938837]\n",
"[Epoch 62/200] [Batch 56/59] [D loss: 0.589356] [G loss: 0.723922]\n",
"[Epoch 62/200] [Batch 57/59] [D loss: 0.619776] [G loss: 0.756091]\n",
"[Epoch 62/200] [Batch 58/59] [D loss: 0.597317] [G loss: 0.889512]\n",
"[Epoch 63/200] [Batch 0/59] [D loss: 0.606544] [G loss: 0.818529]\n",
"[Epoch 63/200] [Batch 1/59] [D loss: 0.638752] [G loss: 0.805942]\n",
"[Epoch 63/200] [Batch 2/59] [D loss: 0.610188] [G loss: 0.809087]\n",
"[Epoch 63/200] [Batch 3/59] [D loss: 0.658744] [G loss: 0.877321]\n",
"[Epoch 63/200] [Batch 4/59] [D loss: 0.615984] [G loss: 0.748321]\n",
"[Epoch 63/200] [Batch 5/59] [D loss: 0.610661] [G loss: 0.870115]\n",
"[Epoch 63/200] [Batch 6/59] [D loss: 0.565180] [G loss: 0.885872]\n",
"[Epoch 63/200] [Batch 7/59] [D loss: 0.640174] [G loss: 0.910268]\n",
"[Epoch 63/200] [Batch 8/59] [D loss: 0.620115] [G loss: 0.744465]\n",
"[Epoch 63/200] [Batch 9/59] [D loss: 0.615994] [G loss: 0.862109]\n",
"[Epoch 63/200] [Batch 10/59] [D loss: 0.617453] [G loss: 0.955992]\n",
"[Epoch 63/200] [Batch 11/59] [D loss: 0.594122] [G loss: 0.676182]\n",
"[Epoch 63/200] [Batch 12/59] [D loss: 0.621048] [G loss: 0.815745]\n",
"[Epoch 63/200] [Batch 13/59] [D loss: 0.644683] [G loss: 1.041584]\n",
"[Epoch 63/200] [Batch 14/59] [D loss: 0.642879] [G loss: 0.690834]\n",
"[Epoch 63/200] [Batch 15/59] [D loss: 0.610687] [G loss: 0.835522]\n",
"[Epoch 63/200] [Batch 16/59] [D loss: 0.631409] [G loss: 1.028199]\n",
"[Epoch 63/200] [Batch 17/59] [D loss: 0.633330] [G loss: 0.674753]\n",
"[Epoch 63/200] [Batch 18/59] [D loss: 0.667333] [G loss: 0.711130]\n",
"[Epoch 63/200] [Batch 19/59] [D loss: 0.659715] [G loss: 1.144767]\n",
"[Epoch 63/200] [Batch 20/59] [D loss: 0.595809] [G loss: 0.762433]\n",
"[Epoch 63/200] [Batch 21/59] [D loss: 0.639415] [G loss: 0.780710]\n",
"[Epoch 63/200] [Batch 22/59] [D loss: 0.631103] [G loss: 0.880315]\n",
"[Epoch 63/200] [Batch 23/59] [D loss: 0.598922] [G loss: 0.969092]\n",
"[Epoch 63/200] [Batch 24/59] [D loss: 0.644077] [G loss: 0.742202]\n",
"[Epoch 63/200] [Batch 25/59] [D loss: 0.596729] [G loss: 0.824642]\n",
"[Epoch 63/200] [Batch 26/59] [D loss: 0.619743] [G loss: 1.013191]\n",
"[Epoch 63/200] [Batch 27/59] [D loss: 0.606081] [G loss: 0.712330]\n",
"[Epoch 63/200] [Batch 28/59] [D loss: 0.614697] [G loss: 0.711868]\n",
"[Epoch 63/200] [Batch 29/59] [D loss: 0.657620] [G loss: 0.950657]\n",
"[Epoch 63/200] [Batch 30/59] [D loss: 0.638907] [G loss: 0.899022]\n",
"[Epoch 63/200] [Batch 31/59] [D loss: 0.600802] [G loss: 0.684868]\n",
"[Epoch 63/200] [Batch 32/59] [D loss: 0.601010] [G loss: 0.822533]\n",
"[Epoch 63/200] [Batch 33/59] [D loss: 0.575717] [G loss: 0.851871]\n",
"[Epoch 63/200] [Batch 34/59] [D loss: 0.612089] [G loss: 0.836419]\n",
"[Epoch 63/200] [Batch 35/59] [D loss: 0.606050] [G loss: 0.829181]\n",
"[Epoch 63/200] [Batch 36/59] [D loss: 0.626816] [G loss: 0.903164]\n",
"[Epoch 63/200] [Batch 37/59] [D loss: 0.634848] [G loss: 0.813066]\n",
"[Epoch 63/200] [Batch 38/59] [D loss: 0.671606] [G loss: 0.822080]\n",
"[Epoch 63/200] [Batch 39/59] [D loss: 0.619760] [G loss: 0.818625]\n",
"[Epoch 63/200] [Batch 40/59] [D loss: 0.618548] [G loss: 0.837886]\n",
"[Epoch 63/200] [Batch 41/59] [D loss: 0.615690] [G loss: 0.866814]\n",
"[Epoch 63/200] [Batch 42/59] [D loss: 0.609694] [G loss: 0.821065]\n",
"[Epoch 63/200] [Batch 43/59] [D loss: 0.642053] [G loss: 0.912170]\n",
"[Epoch 63/200] [Batch 44/59] [D loss: 0.658175] [G loss: 0.751501]\n",
"[Epoch 63/200] [Batch 45/59] [D loss: 0.634283] [G loss: 0.987710]\n",
"[Epoch 63/200] [Batch 46/59] [D loss: 0.621560] [G loss: 0.789886]\n",
"[Epoch 63/200] [Batch 47/59] [D loss: 0.603603] [G loss: 1.019633]\n",
"[Epoch 63/200] [Batch 48/59] [D loss: 0.643950] [G loss: 0.797077]\n",
"[Epoch 63/200] [Batch 49/59] [D loss: 0.629024] [G loss: 0.732988]\n",
"[Epoch 63/200] [Batch 50/59] [D loss: 0.611586] [G loss: 0.974532]\n",
"[Epoch 63/200] [Batch 51/59] [D loss: 0.660379] [G loss: 0.792868]\n",
"[Epoch 63/200] [Batch 52/59] [D loss: 0.627246] [G loss: 0.710118]\n",
"[Epoch 63/200] [Batch 53/59] [D loss: 0.615327] [G loss: 0.968288]\n",
"[Epoch 63/200] [Batch 54/59] [D loss: 0.616378] [G loss: 0.833395]\n",
"[Epoch 63/200] [Batch 55/59] [D loss: 0.615052] [G loss: 0.797001]\n",
"[Epoch 63/200] [Batch 56/59] [D loss: 0.635400] [G loss: 0.918651]\n",
"[Epoch 63/200] [Batch 57/59] [D loss: 0.581320] [G loss: 0.846452]\n",
"[Epoch 63/200] [Batch 58/59] [D loss: 0.601486] [G loss: 0.788422]\n",
"[Epoch 64/200] [Batch 0/59] [D loss: 0.631733] [G loss: 0.818449]\n",
"[Epoch 64/200] [Batch 1/59] [D loss: 0.592501] [G loss: 0.812302]\n",
"[Epoch 64/200] [Batch 2/59] [D loss: 0.620036] [G loss: 0.904888]\n",
"[Epoch 64/200] [Batch 3/59] [D loss: 0.613869] [G loss: 0.707798]\n",
"[Epoch 64/200] [Batch 4/59] [D loss: 0.576764] [G loss: 0.871256]\n",
"[Epoch 64/200] [Batch 5/59] [D loss: 0.624905] [G loss: 0.874844]\n",
"[Epoch 64/200] [Batch 6/59] [D loss: 0.656634] [G loss: 0.689302]\n",
"[Epoch 64/200] [Batch 7/59] [D loss: 0.637994] [G loss: 0.895383]\n",
"[Epoch 64/200] [Batch 8/59] [D loss: 0.601539] [G loss: 0.952000]\n",
"[Epoch 64/200] [Batch 9/59] [D loss: 0.643085] [G loss: 0.662411]\n",
"[Epoch 64/200] [Batch 10/59] [D loss: 0.645381] [G loss: 0.838488]\n",
"[Epoch 64/200] [Batch 11/59] [D loss: 0.628561] [G loss: 1.009555]\n",
"[Epoch 64/200] [Batch 12/59] [D loss: 0.645376] [G loss: 0.782859]\n",
"[Epoch 64/200] [Batch 13/59] [D loss: 0.675096] [G loss: 0.622934]\n",
"[Epoch 64/200] [Batch 14/59] [D loss: 0.667279] [G loss: 0.991101]\n",
"[Epoch 64/200] [Batch 15/59] [D loss: 0.638572] [G loss: 0.780140]\n",
"[Epoch 64/200] [Batch 16/59] [D loss: 0.627210] [G loss: 0.718854]\n",
"[Epoch 64/200] [Batch 17/59] [D loss: 0.629998] [G loss: 0.874196]\n",
"[Epoch 64/200] [Batch 18/59] [D loss: 0.619534] [G loss: 0.929562]\n",
"[Epoch 64/200] [Batch 19/59] [D loss: 0.619008] [G loss: 0.654577]\n",
"[Epoch 64/200] [Batch 20/59] [D loss: 0.587786] [G loss: 0.915440]\n",
"[Epoch 64/200] [Batch 21/59] [D loss: 0.638888] [G loss: 0.736756]\n",
"[Epoch 64/200] [Batch 22/59] [D loss: 0.604396] [G loss: 0.846199]\n",
"[Epoch 64/200] [Batch 23/59] [D loss: 0.567921] [G loss: 0.863804]\n",
"[Epoch 64/200] [Batch 24/59] [D loss: 0.613676] [G loss: 0.874830]\n",
"[Epoch 64/200] [Batch 25/59] [D loss: 0.626010] [G loss: 0.901084]\n",
"[Epoch 64/200] [Batch 26/59] [D loss: 0.627552] [G loss: 0.738342]\n",
"[Epoch 64/200] [Batch 27/59] [D loss: 0.625000] [G loss: 0.876516]\n",
"[Epoch 64/200] [Batch 28/59] [D loss: 0.618730] [G loss: 0.781258]\n",
"[Epoch 64/200] [Batch 29/59] [D loss: 0.627966] [G loss: 0.851064]\n",
"[Epoch 64/200] [Batch 30/59] [D loss: 0.622888] [G loss: 0.782670]\n",
"[Epoch 64/200] [Batch 31/59] [D loss: 0.620575] [G loss: 0.734583]\n",
"[Epoch 64/200] [Batch 32/59] [D loss: 0.666908] [G loss: 0.772416]\n",
"[Epoch 64/200] [Batch 33/59] [D loss: 0.611809] [G loss: 0.894013]\n",
"[Epoch 64/200] [Batch 34/59] [D loss: 0.605152] [G loss: 0.816630]\n",
"[Epoch 64/200] [Batch 35/59] [D loss: 0.624677] [G loss: 0.768258]\n",
"[Epoch 64/200] [Batch 36/59] [D loss: 0.623496] [G loss: 0.866725]\n",
"[Epoch 64/200] [Batch 37/59] [D loss: 0.602249] [G loss: 0.878098]\n",
"[Epoch 64/200] [Batch 38/59] [D loss: 0.637835] [G loss: 0.719977]\n",
"[Epoch 64/200] [Batch 39/59] [D loss: 0.610207] [G loss: 1.042106]\n",
"[Epoch 64/200] [Batch 40/59] [D loss: 0.607745] [G loss: 0.751936]\n",
"[Epoch 64/200] [Batch 41/59] [D loss: 0.584429] [G loss: 0.803264]\n",
"[Epoch 64/200] [Batch 42/59] [D loss: 0.599595] [G loss: 0.956885]\n",
"[Epoch 64/200] [Batch 43/59] [D loss: 0.646585] [G loss: 0.992375]\n",
"[Epoch 64/200] [Batch 44/59] [D loss: 0.587360] [G loss: 0.671201]\n",
"[Epoch 64/200] [Batch 45/59] [D loss: 0.648059] [G loss: 0.953753]\n",
"[Epoch 64/200] [Batch 46/59] [D loss: 0.632065] [G loss: 0.817547]\n",
"[Epoch 64/200] [Batch 47/59] [D loss: 0.612736] [G loss: 0.712733]\n",
"[Epoch 64/200] [Batch 48/59] [D loss: 0.672783] [G loss: 0.982530]\n",
"[Epoch 64/200] [Batch 49/59] [D loss: 0.630872] [G loss: 0.675781]\n",
"[Epoch 64/200] [Batch 50/59] [D loss: 0.633036] [G loss: 0.716407]\n",
"[Epoch 64/200] [Batch 51/59] [D loss: 0.633597] [G loss: 0.991836]\n",
"[Epoch 64/200] [Batch 52/59] [D loss: 0.644946] [G loss: 0.796043]\n",
"[Epoch 64/200] [Batch 53/59] [D loss: 0.625602] [G loss: 0.937041]\n",
"[Epoch 64/200] [Batch 54/59] [D loss: 0.615316] [G loss: 0.743112]\n",
"[Epoch 64/200] [Batch 55/59] [D loss: 0.660417] [G loss: 0.908148]\n",
"[Epoch 64/200] [Batch 56/59] [D loss: 0.619730] [G loss: 0.835489]\n",
"[Epoch 64/200] [Batch 57/59] [D loss: 0.607762] [G loss: 0.848994]\n",
"[Epoch 64/200] [Batch 58/59] [D loss: 0.601590] [G loss: 0.903900]\n",
"[Epoch 65/200] [Batch 0/59] [D loss: 0.578694] [G loss: 0.841206]\n",
"[Epoch 65/200] [Batch 1/59] [D loss: 0.609015] [G loss: 0.766571]\n",
"[Epoch 65/200] [Batch 2/59] [D loss: 0.601607] [G loss: 0.949039]\n",
"[Epoch 65/200] [Batch 3/59] [D loss: 0.620679] [G loss: 0.802734]\n",
"[Epoch 65/200] [Batch 4/59] [D loss: 0.617783] [G loss: 0.828022]\n",
"[Epoch 65/200] [Batch 5/59] [D loss: 0.583433] [G loss: 0.910597]\n",
"[Epoch 65/200] [Batch 6/59] [D loss: 0.600304] [G loss: 0.740963]\n",
"[Epoch 65/200] [Batch 7/59] [D loss: 0.628743] [G loss: 0.877499]\n",
"[Epoch 65/200] [Batch 8/59] [D loss: 0.644118] [G loss: 0.872272]\n",
"[Epoch 65/200] [Batch 9/59] [D loss: 0.610052] [G loss: 0.751326]\n",
"[Epoch 65/200] [Batch 10/59] [D loss: 0.650840] [G loss: 0.790084]\n",
"[Epoch 65/200] [Batch 11/59] [D loss: 0.625720] [G loss: 1.015603]\n",
"[Epoch 65/200] [Batch 12/59] [D loss: 0.604666] [G loss: 0.725584]\n",
"[Epoch 65/200] [Batch 13/59] [D loss: 0.638783] [G loss: 0.858301]\n",
"[Epoch 65/200] [Batch 14/59] [D loss: 0.633216] [G loss: 0.940402]\n",
"[Epoch 65/200] [Batch 15/59] [D loss: 0.649006] [G loss: 0.771899]\n",
"[Epoch 65/200] [Batch 16/59] [D loss: 0.624439] [G loss: 0.726321]\n",
"[Epoch 65/200] [Batch 17/59] [D loss: 0.618591] [G loss: 0.938654]\n",
"[Epoch 65/200] [Batch 18/59] [D loss: 0.631545] [G loss: 0.993386]\n",
"[Epoch 65/200] [Batch 19/59] [D loss: 0.614253] [G loss: 0.695975]\n",
"[Epoch 65/200] [Batch 20/59] [D loss: 0.586724] [G loss: 0.897798]\n",
"[Epoch 65/200] [Batch 21/59] [D loss: 0.602020] [G loss: 0.922087]\n",
"[Epoch 65/200] [Batch 22/59] [D loss: 0.623682] [G loss: 0.707767]\n",
"[Epoch 65/200] [Batch 23/59] [D loss: 0.612330] [G loss: 0.851802]\n",
"[Epoch 65/200] [Batch 24/59] [D loss: 0.661661] [G loss: 1.104184]\n",
"[Epoch 65/200] [Batch 25/59] [D loss: 0.651490] [G loss: 0.661936]\n",
"[Epoch 65/200] [Batch 26/59] [D loss: 0.649588] [G loss: 0.899118]\n",
"[Epoch 65/200] [Batch 27/59] [D loss: 0.663024] [G loss: 0.824857]\n",
"[Epoch 65/200] [Batch 28/59] [D loss: 0.594292] [G loss: 0.810248]\n",
"[Epoch 65/200] [Batch 29/59] [D loss: 0.644089] [G loss: 0.728346]\n",
"[Epoch 65/200] [Batch 30/59] [D loss: 0.625348] [G loss: 0.674455]\n",
"[Epoch 65/200] [Batch 31/59] [D loss: 0.658763] [G loss: 1.061939]\n",
"[Epoch 65/200] [Batch 32/59] [D loss: 0.566441] [G loss: 0.757100]\n",
"[Epoch 65/200] [Batch 33/59] [D loss: 0.612017] [G loss: 0.636939]\n",
"[Epoch 65/200] [Batch 34/59] [D loss: 0.614032] [G loss: 0.969420]\n",
"[Epoch 65/200] [Batch 35/59] [D loss: 0.628133] [G loss: 0.905915]\n",
"[Epoch 65/200] [Batch 36/59] [D loss: 0.584007] [G loss: 0.645701]\n",
"[Epoch 65/200] [Batch 37/59] [D loss: 0.615790] [G loss: 0.849978]\n",
"[Epoch 65/200] [Batch 38/59] [D loss: 0.615053] [G loss: 0.909240]\n",
"[Epoch 65/200] [Batch 39/59] [D loss: 0.566375] [G loss: 0.786888]\n",
"[Epoch 65/200] [Batch 40/59] [D loss: 0.601724] [G loss: 0.789234]\n",
"[Epoch 65/200] [Batch 41/59] [D loss: 0.650036] [G loss: 0.884951]\n",
"[Epoch 65/200] [Batch 42/59] [D loss: 0.566013] [G loss: 0.797735]\n",
"[Epoch 65/200] [Batch 43/59] [D loss: 0.629610] [G loss: 0.915702]\n",
"[Epoch 65/200] [Batch 44/59] [D loss: 0.599347] [G loss: 0.869229]\n",
"[Epoch 65/200] [Batch 45/59] [D loss: 0.620188] [G loss: 0.856230]\n",
"[Epoch 65/200] [Batch 46/59] [D loss: 0.607511] [G loss: 0.767002]\n",
"[Epoch 65/200] [Batch 47/59] [D loss: 0.633602] [G loss: 0.910817]\n",
"[Epoch 65/200] [Batch 48/59] [D loss: 0.579072] [G loss: 0.753727]\n",
"[Epoch 65/200] [Batch 49/59] [D loss: 0.582918] [G loss: 0.780386]\n",
"[Epoch 65/200] [Batch 50/59] [D loss: 0.648040] [G loss: 0.863694]\n",
"[Epoch 65/200] [Batch 51/59] [D loss: 0.632923] [G loss: 0.699886]\n",
"[Epoch 65/200] [Batch 52/59] [D loss: 0.623297] [G loss: 0.825309]\n",
"[Epoch 65/200] [Batch 53/59] [D loss: 0.640493] [G loss: 0.797781]\n",
"[Epoch 65/200] [Batch 54/59] [D loss: 0.615061] [G loss: 0.852274]\n",
"[Epoch 65/200] [Batch 55/59] [D loss: 0.642336] [G loss: 0.848754]\n",
"[Epoch 65/200] [Batch 56/59] [D loss: 0.654256] [G loss: 0.763402]\n",
"[Epoch 65/200] [Batch 57/59] [D loss: 0.654342] [G loss: 0.653670]\n",
"[Epoch 65/200] [Batch 58/59] [D loss: 0.600252] [G loss: 1.089794]\n",
"[Epoch 66/200] [Batch 0/59] [D loss: 0.607637] [G loss: 0.856201]\n",
"[Epoch 66/200] [Batch 1/59] [D loss: 0.629294] [G loss: 0.669105]\n",
"[Epoch 66/200] [Batch 2/59] [D loss: 0.656226] [G loss: 0.875104]\n",
"[Epoch 66/200] [Batch 3/59] [D loss: 0.601568] [G loss: 1.076345]\n",
"[Epoch 66/200] [Batch 4/59] [D loss: 0.567824] [G loss: 0.697856]\n",
"[Epoch 66/200] [Batch 5/59] [D loss: 0.636835] [G loss: 0.733829]\n",
"[Epoch 66/200] [Batch 6/59] [D loss: 0.602493] [G loss: 1.045776]\n",
"[Epoch 66/200] [Batch 7/59] [D loss: 0.589686] [G loss: 0.889660]\n",
"[Epoch 66/200] [Batch 8/59] [D loss: 0.614300] [G loss: 0.776369]\n",
"[Epoch 66/200] [Batch 9/59] [D loss: 0.652941] [G loss: 0.870061]\n",
"[Epoch 66/200] [Batch 10/59] [D loss: 0.667811] [G loss: 0.868672]\n",
"[Epoch 66/200] [Batch 11/59] [D loss: 0.685902] [G loss: 0.740494]\n",
"[Epoch 66/200] [Batch 12/59] [D loss: 0.646031] [G loss: 0.676668]\n",
"[Epoch 66/200] [Batch 13/59] [D loss: 0.619282] [G loss: 0.912310]\n",
"[Epoch 66/200] [Batch 14/59] [D loss: 0.671993] [G loss: 0.807174]\n",
"[Epoch 66/200] [Batch 15/59] [D loss: 0.629801] [G loss: 1.038970]\n",
"[Epoch 66/200] [Batch 16/59] [D loss: 0.609677] [G loss: 0.810825]\n",
"[Epoch 66/200] [Batch 17/59] [D loss: 0.635111] [G loss: 0.807156]\n",
"[Epoch 66/200] [Batch 18/59] [D loss: 0.582605] [G loss: 0.855220]\n",
"[Epoch 66/200] [Batch 19/59] [D loss: 0.614326] [G loss: 0.853810]\n",
"[Epoch 66/200] [Batch 20/59] [D loss: 0.611038] [G loss: 0.833134]\n",
"[Epoch 66/200] [Batch 21/59] [D loss: 0.652377] [G loss: 0.836625]\n",
"[Epoch 66/200] [Batch 22/59] [D loss: 0.654542] [G loss: 0.924014]\n",
"[Epoch 66/200] [Batch 23/59] [D loss: 0.612899] [G loss: 0.727632]\n",
"[Epoch 66/200] [Batch 24/59] [D loss: 0.624192] [G loss: 0.829960]\n",
"[Epoch 66/200] [Batch 25/59] [D loss: 0.630505] [G loss: 1.082713]\n",
"[Epoch 66/200] [Batch 26/59] [D loss: 0.632601] [G loss: 0.787677]\n",
"[Epoch 66/200] [Batch 27/59] [D loss: 0.640469] [G loss: 0.617268]\n",
"[Epoch 66/200] [Batch 28/59] [D loss: 0.665158] [G loss: 0.955820]\n",
"[Epoch 66/200] [Batch 29/59] [D loss: 0.598958] [G loss: 0.859701]\n",
"[Epoch 66/200] [Batch 30/59] [D loss: 0.610561] [G loss: 0.743504]\n",
"[Epoch 66/200] [Batch 31/59] [D loss: 0.618858] [G loss: 0.777383]\n",
"[Epoch 66/200] [Batch 32/59] [D loss: 0.605975] [G loss: 0.856322]\n",
"[Epoch 66/200] [Batch 33/59] [D loss: 0.636149] [G loss: 0.784693]\n",
"[Epoch 66/200] [Batch 34/59] [D loss: 0.642801] [G loss: 0.714349]\n",
"[Epoch 66/200] [Batch 35/59] [D loss: 0.648435] [G loss: 0.906159]\n",
"[Epoch 66/200] [Batch 36/59] [D loss: 0.613459] [G loss: 0.827573]\n",
"[Epoch 66/200] [Batch 37/59] [D loss: 0.613815] [G loss: 0.842709]\n",
"[Epoch 66/200] [Batch 38/59] [D loss: 0.628431] [G loss: 0.846287]\n",
"[Epoch 66/200] [Batch 39/59] [D loss: 0.657832] [G loss: 0.750592]\n",
"[Epoch 66/200] [Batch 40/59] [D loss: 0.620341] [G loss: 0.972290]\n",
"[Epoch 66/200] [Batch 41/59] [D loss: 0.616081] [G loss: 0.777124]\n",
"[Epoch 66/200] [Batch 42/59] [D loss: 0.613569] [G loss: 0.750759]\n",
"[Epoch 66/200] [Batch 43/59] [D loss: 0.617497] [G loss: 0.895121]\n",
"[Epoch 66/200] [Batch 44/59] [D loss: 0.641625] [G loss: 0.893048]\n",
"[Epoch 66/200] [Batch 45/59] [D loss: 0.643165] [G loss: 0.866797]\n",
"[Epoch 66/200] [Batch 46/59] [D loss: 0.601141] [G loss: 0.833378]\n",
"[Epoch 66/200] [Batch 47/59] [D loss: 0.594019] [G loss: 0.966527]\n",
"[Epoch 66/200] [Batch 48/59] [D loss: 0.614269] [G loss: 0.791809]\n",
"[Epoch 66/200] [Batch 49/59] [D loss: 0.616266] [G loss: 0.901391]\n",
"[Epoch 66/200] [Batch 50/59] [D loss: 0.634683] [G loss: 0.913535]\n",
"[Epoch 66/200] [Batch 51/59] [D loss: 0.619392] [G loss: 0.869168]\n",
"[Epoch 66/200] [Batch 52/59] [D loss: 0.584012] [G loss: 0.777973]\n",
"[Epoch 66/200] [Batch 53/59] [D loss: 0.645980] [G loss: 0.831138]\n",
"[Epoch 66/200] [Batch 54/59] [D loss: 0.610639] [G loss: 0.732002]\n",
"[Epoch 66/200] [Batch 55/59] [D loss: 0.608583] [G loss: 0.985942]\n",
"[Epoch 66/200] [Batch 56/59] [D loss: 0.589573] [G loss: 0.886423]\n",
"[Epoch 66/200] [Batch 57/59] [D loss: 0.624677] [G loss: 0.628293]\n",
"[Epoch 66/200] [Batch 58/59] [D loss: 0.591747] [G loss: 0.893201]\n",
"[Epoch 67/200] [Batch 0/59] [D loss: 0.625888] [G loss: 0.906829]\n",
"[Epoch 67/200] [Batch 1/59] [D loss: 0.610154] [G loss: 0.773347]\n",
"[Epoch 67/200] [Batch 2/59] [D loss: 0.601421] [G loss: 0.902612]\n",
"[Epoch 67/200] [Batch 3/59] [D loss: 0.608654] [G loss: 0.780897]\n",
"[Epoch 67/200] [Batch 4/59] [D loss: 0.604719] [G loss: 0.803925]\n",
"[Epoch 67/200] [Batch 5/59] [D loss: 0.599928] [G loss: 0.746402]\n",
"[Epoch 67/200] [Batch 6/59] [D loss: 0.592908] [G loss: 0.750107]\n",
"[Epoch 67/200] [Batch 7/59] [D loss: 0.623332] [G loss: 1.024060]\n",
"[Epoch 67/200] [Batch 8/59] [D loss: 0.590006] [G loss: 0.687731]\n",
"[Epoch 67/200] [Batch 9/59] [D loss: 0.663241] [G loss: 0.772020]\n",
"[Epoch 67/200] [Batch 10/59] [D loss: 0.644371] [G loss: 0.995910]\n",
"[Epoch 67/200] [Batch 11/59] [D loss: 0.601530] [G loss: 0.848042]\n",
"[Epoch 67/200] [Batch 12/59] [D loss: 0.645318] [G loss: 0.695931]\n",
"[Epoch 67/200] [Batch 13/59] [D loss: 0.637522] [G loss: 0.935686]\n",
"[Epoch 67/200] [Batch 14/59] [D loss: 0.634970] [G loss: 0.790926]\n",
"[Epoch 67/200] [Batch 15/59] [D loss: 0.622794] [G loss: 0.813466]\n",
"[Epoch 67/200] [Batch 16/59] [D loss: 0.679076] [G loss: 0.938805]\n",
"[Epoch 67/200] [Batch 17/59] [D loss: 0.627779] [G loss: 0.740950]\n",
"[Epoch 67/200] [Batch 18/59] [D loss: 0.602537] [G loss: 0.807017]\n",
"[Epoch 67/200] [Batch 19/59] [D loss: 0.610791] [G loss: 0.803604]\n",
"[Epoch 67/200] [Batch 20/59] [D loss: 0.591311] [G loss: 0.917367]\n",
"[Epoch 67/200] [Batch 21/59] [D loss: 0.631213] [G loss: 0.681902]\n",
"[Epoch 67/200] [Batch 22/59] [D loss: 0.611665] [G loss: 0.869478]\n",
"[Epoch 67/200] [Batch 23/59] [D loss: 0.623355] [G loss: 0.896790]\n",
"[Epoch 67/200] [Batch 24/59] [D loss: 0.613714] [G loss: 0.695335]\n",
"[Epoch 67/200] [Batch 25/59] [D loss: 0.632877] [G loss: 0.899524]\n",
"[Epoch 67/200] [Batch 26/59] [D loss: 0.637602] [G loss: 0.873309]\n",
"[Epoch 67/200] [Batch 27/59] [D loss: 0.621885] [G loss: 0.656125]\n",
"[Epoch 67/200] [Batch 28/59] [D loss: 0.608313] [G loss: 0.834053]\n",
"[Epoch 67/200] [Batch 29/59] [D loss: 0.619399] [G loss: 0.964483]\n",
"[Epoch 67/200] [Batch 30/59] [D loss: 0.640617] [G loss: 0.741646]\n",
"[Epoch 67/200] [Batch 31/59] [D loss: 0.609066] [G loss: 0.962361]\n",
"[Epoch 67/200] [Batch 32/59] [D loss: 0.606089] [G loss: 0.926032]\n",
"[Epoch 67/200] [Batch 33/59] [D loss: 0.685079] [G loss: 0.768658]\n",
"[Epoch 67/200] [Batch 34/59] [D loss: 0.598361] [G loss: 0.972290]\n",
"[Epoch 67/200] [Batch 35/59] [D loss: 0.610858] [G loss: 0.805825]\n",
"[Epoch 67/200] [Batch 36/59] [D loss: 0.627454] [G loss: 0.755500]\n",
"[Epoch 67/200] [Batch 37/59] [D loss: 0.612679] [G loss: 0.877447]\n",
"[Epoch 67/200] [Batch 38/59] [D loss: 0.596671] [G loss: 0.639118]\n",
"[Epoch 67/200] [Batch 39/59] [D loss: 0.616554] [G loss: 0.831004]\n",
"[Epoch 67/200] [Batch 40/59] [D loss: 0.628088] [G loss: 0.963205]\n",
"[Epoch 67/200] [Batch 41/59] [D loss: 0.602406] [G loss: 0.876881]\n",
"[Epoch 67/200] [Batch 42/59] [D loss: 0.620630] [G loss: 0.695872]\n",
"[Epoch 67/200] [Batch 43/59] [D loss: 0.615396] [G loss: 0.978920]\n",
"[Epoch 67/200] [Batch 44/59] [D loss: 0.626857] [G loss: 0.952504]\n",
"[Epoch 67/200] [Batch 45/59] [D loss: 0.592592] [G loss: 0.713441]\n",
"[Epoch 67/200] [Batch 46/59] [D loss: 0.635975] [G loss: 0.938219]\n",
"[Epoch 67/200] [Batch 47/59] [D loss: 0.592648] [G loss: 0.904166]\n",
"[Epoch 67/200] [Batch 48/59] [D loss: 0.623046] [G loss: 0.684387]\n",
"[Epoch 67/200] [Batch 49/59] [D loss: 0.618757] [G loss: 0.733029]\n",
"[Epoch 67/200] [Batch 50/59] [D loss: 0.608826] [G loss: 0.954175]\n",
"[Epoch 67/200] [Batch 51/59] [D loss: 0.613805] [G loss: 0.831706]\n",
"[Epoch 67/200] [Batch 52/59] [D loss: 0.616054] [G loss: 0.683218]\n",
"[Epoch 67/200] [Batch 53/59] [D loss: 0.626588] [G loss: 0.975714]\n",
"[Epoch 67/200] [Batch 54/59] [D loss: 0.635272] [G loss: 0.806535]\n",
"[Epoch 67/200] [Batch 55/59] [D loss: 0.627465] [G loss: 0.718847]\n",
"[Epoch 67/200] [Batch 56/59] [D loss: 0.638426] [G loss: 0.883576]\n",
"[Epoch 67/200] [Batch 57/59] [D loss: 0.623097] [G loss: 0.788584]\n",
"[Epoch 67/200] [Batch 58/59] [D loss: 0.602582] [G loss: 0.758452]\n",
"[Epoch 68/200] [Batch 0/59] [D loss: 0.624964] [G loss: 0.810593]\n",
"[Epoch 68/200] [Batch 1/59] [D loss: 0.621237] [G loss: 0.765121]\n",
"[Epoch 68/200] [Batch 2/59] [D loss: 0.649182] [G loss: 0.884005]\n",
"[Epoch 68/200] [Batch 3/59] [D loss: 0.597443] [G loss: 0.746962]\n",
"[Epoch 68/200] [Batch 4/59] [D loss: 0.587616] [G loss: 0.884818]\n",
"[Epoch 68/200] [Batch 5/59] [D loss: 0.623327] [G loss: 0.979649]\n",
"[Epoch 68/200] [Batch 6/59] [D loss: 0.643446] [G loss: 0.734840]\n",
"[Epoch 68/200] [Batch 7/59] [D loss: 0.637036] [G loss: 0.896411]\n",
"[Epoch 68/200] [Batch 8/59] [D loss: 0.658741] [G loss: 0.891225]\n",
"[Epoch 68/200] [Batch 9/59] [D loss: 0.612904] [G loss: 0.776188]\n",
"[Epoch 68/200] [Batch 10/59] [D loss: 0.619015] [G loss: 0.851581]\n",
"[Epoch 68/200] [Batch 11/59] [D loss: 0.621681] [G loss: 0.896268]\n",
"[Epoch 68/200] [Batch 12/59] [D loss: 0.655010] [G loss: 0.876766]\n",
"[Epoch 68/200] [Batch 13/59] [D loss: 0.652535] [G loss: 0.683588]\n",
"[Epoch 68/200] [Batch 14/59] [D loss: 0.566135] [G loss: 0.995941]\n",
"[Epoch 68/200] [Batch 15/59] [D loss: 0.632539] [G loss: 1.003896]\n",
"[Epoch 68/200] [Batch 16/59] [D loss: 0.663918] [G loss: 0.587719]\n",
"[Epoch 68/200] [Batch 17/59] [D loss: 0.618416] [G loss: 0.826977]\n",
"[Epoch 68/200] [Batch 18/59] [D loss: 0.636906] [G loss: 1.128894]\n",
"[Epoch 68/200] [Batch 19/59] [D loss: 0.596277] [G loss: 0.694771]\n",
"[Epoch 68/200] [Batch 20/59] [D loss: 0.590074] [G loss: 0.961518]\n",
"[Epoch 68/200] [Batch 21/59] [D loss: 0.578475] [G loss: 1.038382]\n",
"[Epoch 68/200] [Batch 22/59] [D loss: 0.624162] [G loss: 0.866539]\n",
"[Epoch 68/200] [Batch 23/59] [D loss: 0.583328] [G loss: 0.738044]\n",
"[Epoch 68/200] [Batch 24/59] [D loss: 0.655503] [G loss: 0.857311]\n",
"[Epoch 68/200] [Batch 25/59] [D loss: 0.650400] [G loss: 0.835171]\n",
"[Epoch 68/200] [Batch 26/59] [D loss: 0.645420] [G loss: 0.759304]\n",
"[Epoch 68/200] [Batch 27/59] [D loss: 0.609868] [G loss: 0.804346]\n",
"[Epoch 68/200] [Batch 28/59] [D loss: 0.631799] [G loss: 0.912950]\n",
"[Epoch 68/200] [Batch 29/59] [D loss: 0.606871] [G loss: 0.857268]\n",
"[Epoch 68/200] [Batch 30/59] [D loss: 0.624615] [G loss: 0.715204]\n",
"[Epoch 68/200] [Batch 31/59] [D loss: 0.615100] [G loss: 0.921601]\n",
"[Epoch 68/200] [Batch 32/59] [D loss: 0.599812] [G loss: 0.884016]\n",
"[Epoch 68/200] [Batch 33/59] [D loss: 0.624184] [G loss: 0.829941]\n",
"[Epoch 68/200] [Batch 34/59] [D loss: 0.621470] [G loss: 0.850829]\n",
"[Epoch 68/200] [Batch 35/59] [D loss: 0.605892] [G loss: 0.823926]\n",
"[Epoch 68/200] [Batch 36/59] [D loss: 0.611643] [G loss: 0.789430]\n",
"[Epoch 68/200] [Batch 37/59] [D loss: 0.574734] [G loss: 0.686738]\n",
"[Epoch 68/200] [Batch 38/59] [D loss: 0.626037] [G loss: 0.793213]\n",
"[Epoch 68/200] [Batch 39/59] [D loss: 0.644843] [G loss: 0.869330]\n",
"[Epoch 68/200] [Batch 40/59] [D loss: 0.671336] [G loss: 0.741661]\n",
"[Epoch 68/200] [Batch 41/59] [D loss: 0.621981] [G loss: 0.837546]\n",
"[Epoch 68/200] [Batch 42/59] [D loss: 0.599180] [G loss: 0.954515]\n",
"[Epoch 68/200] [Batch 43/59] [D loss: 0.593286] [G loss: 0.832969]\n",
"[Epoch 68/200] [Batch 44/59] [D loss: 0.608474] [G loss: 0.908553]\n",
"[Epoch 68/200] [Batch 45/59] [D loss: 0.630671] [G loss: 0.892937]\n",
"[Epoch 68/200] [Batch 46/59] [D loss: 0.623070] [G loss: 0.880592]\n",
"[Epoch 68/200] [Batch 47/59] [D loss: 0.632947] [G loss: 0.846896]\n",
"[Epoch 68/200] [Batch 48/59] [D loss: 0.635237] [G loss: 0.929882]\n",
"[Epoch 68/200] [Batch 49/59] [D loss: 0.577000] [G loss: 0.810192]\n",
"[Epoch 68/200] [Batch 50/59] [D loss: 0.601684] [G loss: 0.838468]\n",
"[Epoch 68/200] [Batch 51/59] [D loss: 0.588362] [G loss: 0.874440]\n",
"[Epoch 68/200] [Batch 52/59] [D loss: 0.579728] [G loss: 0.769952]\n",
"[Epoch 68/200] [Batch 53/59] [D loss: 0.643677] [G loss: 0.909928]\n",
"[Epoch 68/200] [Batch 54/59] [D loss: 0.623093] [G loss: 0.777300]\n",
"[Epoch 68/200] [Batch 55/59] [D loss: 0.618468] [G loss: 0.823699]\n",
"[Epoch 68/200] [Batch 56/59] [D loss: 0.636228] [G loss: 0.759966]\n",
"[Epoch 68/200] [Batch 57/59] [D loss: 0.614170] [G loss: 0.825871]\n",
"[Epoch 68/200] [Batch 58/59] [D loss: 0.577244] [G loss: 0.819971]\n",
"[Epoch 69/200] [Batch 0/59] [D loss: 0.608323] [G loss: 0.746075]\n",
"[Epoch 69/200] [Batch 1/59] [D loss: 0.664532] [G loss: 0.897260]\n",
"[Epoch 69/200] [Batch 2/59] [D loss: 0.557413] [G loss: 0.921471]\n",
"[Epoch 69/200] [Batch 3/59] [D loss: 0.674875] [G loss: 0.651165]\n",
"[Epoch 69/200] [Batch 4/59] [D loss: 0.607008] [G loss: 1.064168]\n",
"[Epoch 69/200] [Batch 5/59] [D loss: 0.597968] [G loss: 0.872707]\n",
"[Epoch 69/200] [Batch 6/59] [D loss: 0.619748] [G loss: 0.630455]\n",
"[Epoch 69/200] [Batch 7/59] [D loss: 0.570345] [G loss: 0.918732]\n",
"[Epoch 69/200] [Batch 8/59] [D loss: 0.597818] [G loss: 0.954518]\n",
"[Epoch 69/200] [Batch 9/59] [D loss: 0.601309] [G loss: 0.729485]\n",
"[Epoch 69/200] [Batch 10/59] [D loss: 0.621870] [G loss: 0.941500]\n",
"[Epoch 69/200] [Batch 11/59] [D loss: 0.589594] [G loss: 0.765716]\n",
"[Epoch 69/200] [Batch 12/59] [D loss: 0.642594] [G loss: 0.876680]\n",
"[Epoch 69/200] [Batch 13/59] [D loss: 0.604857] [G loss: 0.840396]\n",
"[Epoch 69/200] [Batch 14/59] [D loss: 0.597042] [G loss: 0.746596]\n",
"[Epoch 69/200] [Batch 15/59] [D loss: 0.613724] [G loss: 0.865383]\n",
"[Epoch 69/200] [Batch 16/59] [D loss: 0.625635] [G loss: 0.946653]\n",
"[Epoch 69/200] [Batch 17/59] [D loss: 0.621903] [G loss: 0.816575]\n",
"[Epoch 69/200] [Batch 18/59] [D loss: 0.673191] [G loss: 0.720718]\n",
"[Epoch 69/200] [Batch 19/59] [D loss: 0.600989] [G loss: 1.058010]\n",
"[Epoch 69/200] [Batch 20/59] [D loss: 0.595779] [G loss: 0.859600]\n",
"[Epoch 69/200] [Batch 21/59] [D loss: 0.640212] [G loss: 0.674887]\n",
"[Epoch 69/200] [Batch 22/59] [D loss: 0.638902] [G loss: 0.961394]\n",
"[Epoch 69/200] [Batch 23/59] [D loss: 0.632955] [G loss: 0.831620]\n",
"[Epoch 69/200] [Batch 24/59] [D loss: 0.593154] [G loss: 0.748101]\n",
"[Epoch 69/200] [Batch 25/59] [D loss: 0.594925] [G loss: 0.933057]\n",
"[Epoch 69/200] [Batch 26/59] [D loss: 0.621326] [G loss: 0.858963]\n",
"[Epoch 69/200] [Batch 27/59] [D loss: 0.608581] [G loss: 0.790183]\n",
"[Epoch 69/200] [Batch 28/59] [D loss: 0.605555] [G loss: 0.908872]\n",
"[Epoch 69/200] [Batch 29/59] [D loss: 0.637031] [G loss: 0.805577]\n",
"[Epoch 69/200] [Batch 30/59] [D loss: 0.611432] [G loss: 0.905984]\n",
"[Epoch 69/200] [Batch 31/59] [D loss: 0.613183] [G loss: 0.850648]\n",
"[Epoch 69/200] [Batch 32/59] [D loss: 0.605534] [G loss: 0.761776]\n",
"[Epoch 69/200] [Batch 33/59] [D loss: 0.625173] [G loss: 0.807687]\n",
"[Epoch 69/200] [Batch 34/59] [D loss: 0.652263] [G loss: 0.867845]\n",
"[Epoch 69/200] [Batch 35/59] [D loss: 0.640568] [G loss: 0.782592]\n",
"[Epoch 69/200] [Batch 36/59] [D loss: 0.590465] [G loss: 0.842973]\n",
"[Epoch 69/200] [Batch 37/59] [D loss: 0.656399] [G loss: 0.786557]\n",
"[Epoch 69/200] [Batch 38/59] [D loss: 0.623863] [G loss: 0.771365]\n",
"[Epoch 69/200] [Batch 39/59] [D loss: 0.644734] [G loss: 0.994394]\n",
"[Epoch 69/200] [Batch 40/59] [D loss: 0.530390] [G loss: 0.772367]\n",
"[Epoch 69/200] [Batch 41/59] [D loss: 0.626627] [G loss: 0.656625]\n",
"[Epoch 69/200] [Batch 42/59] [D loss: 0.657702] [G loss: 1.106752]\n",
"[Epoch 69/200] [Batch 43/59] [D loss: 0.574631] [G loss: 0.788121]\n",
"[Epoch 69/200] [Batch 44/59] [D loss: 0.585052] [G loss: 0.651701]\n",
"[Epoch 69/200] [Batch 45/59] [D loss: 0.632372] [G loss: 1.221718]\n",
"[Epoch 69/200] [Batch 46/59] [D loss: 0.586648] [G loss: 0.858449]\n",
"[Epoch 69/200] [Batch 47/59] [D loss: 0.648593] [G loss: 0.538485]\n",
"[Epoch 69/200] [Batch 48/59] [D loss: 0.623141] [G loss: 1.323195]\n",
"[Epoch 69/200] [Batch 49/59] [D loss: 0.593736] [G loss: 0.781878]\n",
"[Epoch 69/200] [Batch 50/59] [D loss: 0.580242] [G loss: 0.779228]\n",
"[Epoch 69/200] [Batch 51/59] [D loss: 0.615963] [G loss: 0.729258]\n",
"[Epoch 69/200] [Batch 52/59] [D loss: 0.566369] [G loss: 0.969354]\n",
"[Epoch 69/200] [Batch 53/59] [D loss: 0.631045] [G loss: 0.932705]\n",
"[Epoch 69/200] [Batch 54/59] [D loss: 0.640139] [G loss: 0.614850]\n",
"[Epoch 69/200] [Batch 55/59] [D loss: 0.598211] [G loss: 0.977778]\n",
"[Epoch 69/200] [Batch 56/59] [D loss: 0.618613] [G loss: 0.888877]\n",
"[Epoch 69/200] [Batch 57/59] [D loss: 0.614803] [G loss: 0.784886]\n",
"[Epoch 69/200] [Batch 58/59] [D loss: 0.629491] [G loss: 0.891689]\n",
"[Epoch 70/200] [Batch 0/59] [D loss: 0.632917] [G loss: 0.936635]\n",
"[Epoch 70/200] [Batch 1/59] [D loss: 0.643843] [G loss: 0.647382]\n",
"[Epoch 70/200] [Batch 2/59] [D loss: 0.623147] [G loss: 1.095006]\n",
"[Epoch 70/200] [Batch 3/59] [D loss: 0.625817] [G loss: 0.905499]\n",
"[Epoch 70/200] [Batch 4/59] [D loss: 0.615988] [G loss: 0.659959]\n",
"[Epoch 70/200] [Batch 5/59] [D loss: 0.645366] [G loss: 0.817665]\n",
"[Epoch 70/200] [Batch 6/59] [D loss: 0.609044] [G loss: 1.020996]\n",
"[Epoch 70/200] [Batch 7/59] [D loss: 0.591307] [G loss: 0.749994]\n",
"[Epoch 70/200] [Batch 8/59] [D loss: 0.603050] [G loss: 0.700399]\n",
"[Epoch 70/200] [Batch 9/59] [D loss: 0.605573] [G loss: 1.034546]\n",
"[Epoch 70/200] [Batch 10/59] [D loss: 0.590000] [G loss: 0.811294]\n",
"[Epoch 70/200] [Batch 11/59] [D loss: 0.600756] [G loss: 0.754028]\n",
"[Epoch 70/200] [Batch 12/59] [D loss: 0.606877] [G loss: 0.846986]\n",
"[Epoch 70/200] [Batch 13/59] [D loss: 0.633043] [G loss: 0.787828]\n",
"[Epoch 70/200] [Batch 14/59] [D loss: 0.638415] [G loss: 0.769221]\n",
"[Epoch 70/200] [Batch 15/59] [D loss: 0.609365] [G loss: 0.960384]\n",
"[Epoch 70/200] [Batch 16/59] [D loss: 0.627190] [G loss: 0.664235]\n",
"[Epoch 70/200] [Batch 17/59] [D loss: 0.618268] [G loss: 0.848101]\n",
"[Epoch 70/200] [Batch 18/59] [D loss: 0.627608] [G loss: 0.899450]\n",
"[Epoch 70/200] [Batch 19/59] [D loss: 0.612483] [G loss: 0.778182]\n",
"[Epoch 70/200] [Batch 20/59] [D loss: 0.602046] [G loss: 0.787456]\n",
"[Epoch 70/200] [Batch 21/59] [D loss: 0.596264] [G loss: 0.922749]\n",
"[Epoch 70/200] [Batch 22/59] [D loss: 0.622704] [G loss: 0.843707]\n",
"[Epoch 70/200] [Batch 23/59] [D loss: 0.584813] [G loss: 0.865335]\n",
"[Epoch 70/200] [Batch 24/59] [D loss: 0.588298] [G loss: 0.808314]\n",
"[Epoch 70/200] [Batch 25/59] [D loss: 0.557188] [G loss: 0.912204]\n",
"[Epoch 70/200] [Batch 26/59] [D loss: 0.614194] [G loss: 0.850306]\n",
"[Epoch 70/200] [Batch 27/59] [D loss: 0.601567] [G loss: 0.688895]\n",
"[Epoch 70/200] [Batch 28/59] [D loss: 0.620211] [G loss: 0.901888]\n",
"[Epoch 70/200] [Batch 29/59] [D loss: 0.611100] [G loss: 0.861194]\n",
"[Epoch 70/200] [Batch 30/59] [D loss: 0.579042] [G loss: 0.928160]\n",
"[Epoch 70/200] [Batch 31/59] [D loss: 0.620249] [G loss: 0.685752]\n",
"[Epoch 70/200] [Batch 32/59] [D loss: 0.617275] [G loss: 0.908132]\n",
"[Epoch 70/200] [Batch 33/59] [D loss: 0.675397] [G loss: 0.865259]\n",
"[Epoch 70/200] [Batch 34/59] [D loss: 0.606467] [G loss: 0.850236]\n",
"[Epoch 70/200] [Batch 35/59] [D loss: 0.623108] [G loss: 0.774149]\n",
"[Epoch 70/200] [Batch 36/59] [D loss: 0.635059] [G loss: 0.743448]\n",
"[Epoch 70/200] [Batch 37/59] [D loss: 0.657300] [G loss: 0.863376]\n",
"[Epoch 70/200] [Batch 38/59] [D loss: 0.627359] [G loss: 0.777901]\n",
"[Epoch 70/200] [Batch 39/59] [D loss: 0.652481] [G loss: 0.784903]\n",
"[Epoch 70/200] [Batch 40/59] [D loss: 0.643489] [G loss: 0.848639]\n",
"[Epoch 70/200] [Batch 41/59] [D loss: 0.617000] [G loss: 0.854759]\n",
"[Epoch 70/200] [Batch 42/59] [D loss: 0.607340] [G loss: 0.924563]\n",
"[Epoch 70/200] [Batch 43/59] [D loss: 0.606691] [G loss: 0.900409]\n",
"[Epoch 70/200] [Batch 44/59] [D loss: 0.614030] [G loss: 0.765361]\n",
"[Epoch 70/200] [Batch 45/59] [D loss: 0.587114] [G loss: 0.908151]\n",
"[Epoch 70/200] [Batch 46/59] [D loss: 0.593187] [G loss: 0.764321]\n",
"[Epoch 70/200] [Batch 47/59] [D loss: 0.615946] [G loss: 0.710706]\n",
"[Epoch 70/200] [Batch 48/59] [D loss: 0.656608] [G loss: 1.054176]\n",
"[Epoch 70/200] [Batch 49/59] [D loss: 0.552803] [G loss: 0.767110]\n",
"[Epoch 70/200] [Batch 50/59] [D loss: 0.629468] [G loss: 0.655654]\n",
"[Epoch 70/200] [Batch 51/59] [D loss: 0.666776] [G loss: 1.034702]\n",
"[Epoch 70/200] [Batch 52/59] [D loss: 0.632045] [G loss: 0.774893]\n",
"[Epoch 70/200] [Batch 53/59] [D loss: 0.607555] [G loss: 0.829442]\n",
"[Epoch 70/200] [Batch 54/59] [D loss: 0.589472] [G loss: 0.877733]\n",
"[Epoch 70/200] [Batch 55/59] [D loss: 0.701750] [G loss: 0.851920]\n",
"[Epoch 70/200] [Batch 56/59] [D loss: 0.590724] [G loss: 0.799436]\n",
"[Epoch 70/200] [Batch 57/59] [D loss: 0.620365] [G loss: 0.917763]\n",
"[Epoch 70/200] [Batch 58/59] [D loss: 0.647624] [G loss: 0.787093]\n",
"[Epoch 71/200] [Batch 0/59] [D loss: 0.652037] [G loss: 0.980516]\n",
"[Epoch 71/200] [Batch 1/59] [D loss: 0.652504] [G loss: 0.656089]\n",
"[Epoch 71/200] [Batch 2/59] [D loss: 0.619174] [G loss: 1.060407]\n",
"[Epoch 71/200] [Batch 3/59] [D loss: 0.594381] [G loss: 0.843923]\n",
"[Epoch 71/200] [Batch 4/59] [D loss: 0.596375] [G loss: 0.733493]\n",
"[Epoch 71/200] [Batch 5/59] [D loss: 0.619419] [G loss: 1.074896]\n",
"[Epoch 71/200] [Batch 6/59] [D loss: 0.611314] [G loss: 0.749161]\n",
"[Epoch 71/200] [Batch 7/59] [D loss: 0.647504] [G loss: 0.729433]\n",
"[Epoch 71/200] [Batch 8/59] [D loss: 0.610391] [G loss: 1.006168]\n",
"[Epoch 71/200] [Batch 9/59] [D loss: 0.650423] [G loss: 0.934193]\n",
"[Epoch 71/200] [Batch 10/59] [D loss: 0.616496] [G loss: 0.628127]\n",
"[Epoch 71/200] [Batch 11/59] [D loss: 0.667550] [G loss: 0.829823]\n",
"[Epoch 71/200] [Batch 12/59] [D loss: 0.664441] [G loss: 0.827482]\n",
"[Epoch 71/200] [Batch 13/59] [D loss: 0.644835] [G loss: 0.773266]\n",
"[Epoch 71/200] [Batch 14/59] [D loss: 0.655561] [G loss: 0.999723]\n",
"[Epoch 71/200] [Batch 15/59] [D loss: 0.596574] [G loss: 0.842835]\n",
"[Epoch 71/200] [Batch 16/59] [D loss: 0.630301] [G loss: 0.702778]\n",
"[Epoch 71/200] [Batch 17/59] [D loss: 0.640957] [G loss: 0.782934]\n",
"[Epoch 71/200] [Batch 18/59] [D loss: 0.582022] [G loss: 0.883679]\n",
"[Epoch 71/200] [Batch 19/59] [D loss: 0.634335] [G loss: 0.915512]\n",
"[Epoch 71/200] [Batch 20/59] [D loss: 0.599389] [G loss: 0.685189]\n",
"[Epoch 71/200] [Batch 21/59] [D loss: 0.577841] [G loss: 0.994941]\n",
"[Epoch 71/200] [Batch 22/59] [D loss: 0.557432] [G loss: 0.911857]\n",
"[Epoch 71/200] [Batch 23/59] [D loss: 0.610821] [G loss: 0.760891]\n",
"[Epoch 71/200] [Batch 24/59] [D loss: 0.658360] [G loss: 0.772491]\n",
"[Epoch 71/200] [Batch 25/59] [D loss: 0.647453] [G loss: 1.056476]\n",
"[Epoch 71/200] [Batch 26/59] [D loss: 0.628929] [G loss: 0.730968]\n",
"[Epoch 71/200] [Batch 27/59] [D loss: 0.625096] [G loss: 0.734836]\n",
"[Epoch 71/200] [Batch 28/59] [D loss: 0.623493] [G loss: 0.917739]\n",
"[Epoch 71/200] [Batch 29/59] [D loss: 0.602245] [G loss: 0.837992]\n",
"[Epoch 71/200] [Batch 30/59] [D loss: 0.612325] [G loss: 0.721741]\n",
"[Epoch 71/200] [Batch 31/59] [D loss: 0.569048] [G loss: 0.867908]\n",
"[Epoch 71/200] [Batch 32/59] [D loss: 0.588186] [G loss: 0.851584]\n",
"[Epoch 71/200] [Batch 33/59] [D loss: 0.616975] [G loss: 0.692735]\n",
"[Epoch 71/200] [Batch 34/59] [D loss: 0.628519] [G loss: 0.992869]\n",
"[Epoch 71/200] [Batch 35/59] [D loss: 0.669082] [G loss: 0.733729]\n",
"[Epoch 71/200] [Batch 36/59] [D loss: 0.659417] [G loss: 0.774364]\n",
"[Epoch 71/200] [Batch 37/59] [D loss: 0.625968] [G loss: 0.932671]\n",
"[Epoch 71/200] [Batch 38/59] [D loss: 0.641058] [G loss: 0.914384]\n",
"[Epoch 71/200] [Batch 39/59] [D loss: 0.613554] [G loss: 0.926183]\n",
"[Epoch 71/200] [Batch 40/59] [D loss: 0.602018] [G loss: 0.793773]\n",
"[Epoch 71/200] [Batch 41/59] [D loss: 0.604466] [G loss: 0.888339]\n",
"[Epoch 71/200] [Batch 42/59] [D loss: 0.579297] [G loss: 0.806023]\n",
"[Epoch 71/200] [Batch 43/59] [D loss: 0.634441] [G loss: 0.802602]\n",
"[Epoch 71/200] [Batch 44/59] [D loss: 0.604161] [G loss: 0.979397]\n",
"[Epoch 71/200] [Batch 45/59] [D loss: 0.631764] [G loss: 0.892309]\n",
"[Epoch 71/200] [Batch 46/59] [D loss: 0.598151] [G loss: 1.001694]\n",
"[Epoch 71/200] [Batch 47/59] [D loss: 0.608073] [G loss: 0.709744]\n",
"[Epoch 71/200] [Batch 48/59] [D loss: 0.648255] [G loss: 0.796062]\n",
"[Epoch 71/200] [Batch 49/59] [D loss: 0.644964] [G loss: 0.976678]\n",
"[Epoch 71/200] [Batch 50/59] [D loss: 0.610747] [G loss: 0.738497]\n",
"[Epoch 71/200] [Batch 51/59] [D loss: 0.624890] [G loss: 0.790873]\n",
"[Epoch 71/200] [Batch 52/59] [D loss: 0.627822] [G loss: 0.941319]\n",
"[Epoch 71/200] [Batch 53/59] [D loss: 0.637128] [G loss: 0.774539]\n",
"[Epoch 71/200] [Batch 54/59] [D loss: 0.665195] [G loss: 0.863844]\n",
"[Epoch 71/200] [Batch 55/59] [D loss: 0.606207] [G loss: 0.891991]\n",
"[Epoch 71/200] [Batch 56/59] [D loss: 0.634650] [G loss: 0.942628]\n",
"[Epoch 71/200] [Batch 57/59] [D loss: 0.598455] [G loss: 0.780278]\n",
"[Epoch 71/200] [Batch 58/59] [D loss: 0.593988] [G loss: 0.999423]\n",
"[Epoch 72/200] [Batch 0/59] [D loss: 0.621444] [G loss: 1.000966]\n",
"[Epoch 72/200] [Batch 1/59] [D loss: 0.586843] [G loss: 0.789194]\n",
"[Epoch 72/200] [Batch 2/59] [D loss: 0.611174] [G loss: 0.901808]\n",
"[Epoch 72/200] [Batch 3/59] [D loss: 0.574872] [G loss: 0.787848]\n",
"[Epoch 72/200] [Batch 4/59] [D loss: 0.610446] [G loss: 0.860223]\n",
"[Epoch 72/200] [Batch 5/59] [D loss: 0.614841] [G loss: 0.893861]\n",
"[Epoch 72/200] [Batch 6/59] [D loss: 0.591120] [G loss: 0.891254]\n",
"[Epoch 72/200] [Batch 7/59] [D loss: 0.643353] [G loss: 0.720528]\n",
"[Epoch 72/200] [Batch 8/59] [D loss: 0.641326] [G loss: 0.905774]\n",
"[Epoch 72/200] [Batch 9/59] [D loss: 0.646640] [G loss: 0.931036]\n",
"[Epoch 72/200] [Batch 10/59] [D loss: 0.571421] [G loss: 0.699316]\n",
"[Epoch 72/200] [Batch 11/59] [D loss: 0.584342] [G loss: 0.845338]\n",
"[Epoch 72/200] [Batch 12/59] [D loss: 0.609520] [G loss: 0.956786]\n",
"[Epoch 72/200] [Batch 13/59] [D loss: 0.650167] [G loss: 0.730854]\n",
"[Epoch 72/200] [Batch 14/59] [D loss: 0.624355] [G loss: 0.789822]\n",
"[Epoch 72/200] [Batch 15/59] [D loss: 0.611932] [G loss: 0.720929]\n",
"[Epoch 72/200] [Batch 16/59] [D loss: 0.655700] [G loss: 0.940770]\n",
"[Epoch 72/200] [Batch 17/59] [D loss: 0.579793] [G loss: 0.755063]\n",
"[Epoch 72/200] [Batch 18/59] [D loss: 0.619252] [G loss: 0.802294]\n",
"[Epoch 72/200] [Batch 19/59] [D loss: 0.578444] [G loss: 0.912715]\n",
"[Epoch 72/200] [Batch 20/59] [D loss: 0.577552] [G loss: 0.768966]\n",
"[Epoch 72/200] [Batch 21/59] [D loss: 0.631323] [G loss: 0.958418]\n",
"[Epoch 72/200] [Batch 22/59] [D loss: 0.608359] [G loss: 0.921787]\n",
"[Epoch 72/200] [Batch 23/59] [D loss: 0.639208] [G loss: 0.724339]\n",
"[Epoch 72/200] [Batch 24/59] [D loss: 0.632727] [G loss: 0.853388]\n",
"[Epoch 72/200] [Batch 25/59] [D loss: 0.603695] [G loss: 0.961360]\n",
"[Epoch 72/200] [Batch 26/59] [D loss: 0.613295] [G loss: 0.875563]\n",
"[Epoch 72/200] [Batch 27/59] [D loss: 0.614488] [G loss: 0.706887]\n",
"[Epoch 72/200] [Batch 28/59] [D loss: 0.604925] [G loss: 0.893520]\n",
"[Epoch 72/200] [Batch 29/59] [D loss: 0.631922] [G loss: 0.960921]\n",
"[Epoch 72/200] [Batch 30/59] [D loss: 0.661715] [G loss: 0.585575]\n",
"[Epoch 72/200] [Batch 31/59] [D loss: 0.639717] [G loss: 0.878331]\n",
"[Epoch 72/200] [Batch 32/59] [D loss: 0.648838] [G loss: 0.771220]\n",
"[Epoch 72/200] [Batch 33/59] [D loss: 0.612885] [G loss: 0.720607]\n",
"[Epoch 72/200] [Batch 34/59] [D loss: 0.656128] [G loss: 0.795013]\n",
"[Epoch 72/200] [Batch 35/59] [D loss: 0.612239] [G loss: 1.096057]\n",
"[Epoch 72/200] [Batch 36/59] [D loss: 0.601449] [G loss: 0.740758]\n",
"[Epoch 72/200] [Batch 37/59] [D loss: 0.605636] [G loss: 0.900143]\n",
"[Epoch 72/200] [Batch 38/59] [D loss: 0.592043] [G loss: 0.969820]\n",
"[Epoch 72/200] [Batch 39/59] [D loss: 0.616786] [G loss: 0.728301]\n",
"[Epoch 72/200] [Batch 40/59] [D loss: 0.630579] [G loss: 0.892664]\n",
"[Epoch 72/200] [Batch 41/59] [D loss: 0.640426] [G loss: 0.716223]\n",
"[Epoch 72/200] [Batch 42/59] [D loss: 0.615708] [G loss: 0.942229]\n",
"[Epoch 72/200] [Batch 43/59] [D loss: 0.645730] [G loss: 0.723101]\n",
"[Epoch 72/200] [Batch 44/59] [D loss: 0.633113] [G loss: 0.892821]\n",
"[Epoch 72/200] [Batch 45/59] [D loss: 0.626412] [G loss: 0.790976]\n",
"[Epoch 72/200] [Batch 46/59] [D loss: 0.628538] [G loss: 0.765149]\n",
"[Epoch 72/200] [Batch 47/59] [D loss: 0.622116] [G loss: 0.829273]\n",
"[Epoch 72/200] [Batch 48/59] [D loss: 0.633767] [G loss: 0.704588]\n",
"[Epoch 72/200] [Batch 49/59] [D loss: 0.644671] [G loss: 0.749870]\n",
"[Epoch 72/200] [Batch 50/59] [D loss: 0.681562] [G loss: 1.001645]\n",
"[Epoch 72/200] [Batch 51/59] [D loss: 0.663495] [G loss: 0.851374]\n",
"[Epoch 72/200] [Batch 52/59] [D loss: 0.624115] [G loss: 0.959163]\n",
"[Epoch 72/200] [Batch 53/59] [D loss: 0.591662] [G loss: 0.712242]\n",
"[Epoch 72/200] [Batch 54/59] [D loss: 0.611994] [G loss: 0.951960]\n",
"[Epoch 72/200] [Batch 55/59] [D loss: 0.604127] [G loss: 0.863121]\n",
"[Epoch 72/200] [Batch 56/59] [D loss: 0.591437] [G loss: 0.783284]\n",
"[Epoch 72/200] [Batch 57/59] [D loss: 0.608678] [G loss: 1.114930]\n",
"[Epoch 72/200] [Batch 58/59] [D loss: 0.547989] [G loss: 0.882811]\n",
"[Epoch 73/200] [Batch 0/59] [D loss: 0.615518] [G loss: 0.761776]\n",
"[Epoch 73/200] [Batch 1/59] [D loss: 0.628905] [G loss: 0.977347]\n",
"[Epoch 73/200] [Batch 2/59] [D loss: 0.608378] [G loss: 1.063433]\n",
"[Epoch 73/200] [Batch 3/59] [D loss: 0.620278] [G loss: 0.723472]\n",
"[Epoch 73/200] [Batch 4/59] [D loss: 0.619269] [G loss: 0.690911]\n",
"[Epoch 73/200] [Batch 5/59] [D loss: 0.629836] [G loss: 1.115722]\n",
"[Epoch 73/200] [Batch 6/59] [D loss: 0.579939] [G loss: 0.786028]\n",
"[Epoch 73/200] [Batch 7/59] [D loss: 0.644446] [G loss: 0.619304]\n",
"[Epoch 73/200] [Batch 8/59] [D loss: 0.651466] [G loss: 0.838626]\n",
"[Epoch 73/200] [Batch 9/59] [D loss: 0.614286] [G loss: 0.850634]\n",
"[Epoch 73/200] [Batch 10/59] [D loss: 0.684136] [G loss: 0.899496]\n",
"[Epoch 73/200] [Batch 11/59] [D loss: 0.636498] [G loss: 0.752164]\n",
"[Epoch 73/200] [Batch 12/59] [D loss: 0.579552] [G loss: 0.827380]\n",
"[Epoch 73/200] [Batch 13/59] [D loss: 0.649293] [G loss: 0.979697]\n",
"[Epoch 73/200] [Batch 14/59] [D loss: 0.612554] [G loss: 0.738969]\n",
"[Epoch 73/200] [Batch 15/59] [D loss: 0.595180] [G loss: 1.025052]\n",
"[Epoch 73/200] [Batch 16/59] [D loss: 0.629052] [G loss: 0.884927]\n",
"[Epoch 73/200] [Batch 17/59] [D loss: 0.580688] [G loss: 0.837919]\n",
"[Epoch 73/200] [Batch 18/59] [D loss: 0.602660] [G loss: 0.912859]\n",
"[Epoch 73/200] [Batch 19/59] [D loss: 0.615768] [G loss: 0.814064]\n",
"[Epoch 73/200] [Batch 20/59] [D loss: 0.595487] [G loss: 0.914596]\n",
"[Epoch 73/200] [Batch 21/59] [D loss: 0.582215] [G loss: 0.805302]\n",
"[Epoch 73/200] [Batch 22/59] [D loss: 0.579822] [G loss: 0.836236]\n",
"[Epoch 73/200] [Batch 23/59] [D loss: 0.577259] [G loss: 0.919370]\n",
"[Epoch 73/200] [Batch 24/59] [D loss: 0.608532] [G loss: 0.671503]\n",
"[Epoch 73/200] [Batch 25/59] [D loss: 0.608231] [G loss: 0.899443]\n",
"[Epoch 73/200] [Batch 26/59] [D loss: 0.627535] [G loss: 0.911807]\n",
"[Epoch 73/200] [Batch 27/59] [D loss: 0.638524] [G loss: 0.685963]\n",
"[Epoch 73/200] [Batch 28/59] [D loss: 0.631150] [G loss: 0.868696]\n",
"[Epoch 73/200] [Batch 29/59] [D loss: 0.630697] [G loss: 0.821954]\n",
"[Epoch 73/200] [Batch 30/59] [D loss: 0.638857] [G loss: 0.647853]\n",
"[Epoch 73/200] [Batch 31/59] [D loss: 0.621691] [G loss: 1.099312]\n",
"[Epoch 73/200] [Batch 32/59] [D loss: 0.631465] [G loss: 0.788556]\n",
"[Epoch 73/200] [Batch 33/59] [D loss: 0.654072] [G loss: 0.760195]\n",
"[Epoch 73/200] [Batch 34/59] [D loss: 0.624606] [G loss: 0.813141]\n",
"[Epoch 73/200] [Batch 35/59] [D loss: 0.597924] [G loss: 0.831801]\n",
"[Epoch 73/200] [Batch 36/59] [D loss: 0.632396] [G loss: 0.891209]\n",
"[Epoch 73/200] [Batch 37/59] [D loss: 0.615224] [G loss: 0.827067]\n",
"[Epoch 73/200] [Batch 38/59] [D loss: 0.602563] [G loss: 0.790396]\n",
"[Epoch 73/200] [Batch 39/59] [D loss: 0.618965] [G loss: 0.921263]\n",
"[Epoch 73/200] [Batch 40/59] [D loss: 0.613712] [G loss: 0.849639]\n",
"[Epoch 73/200] [Batch 41/59] [D loss: 0.635316] [G loss: 0.678085]\n",
"[Epoch 73/200] [Batch 42/59] [D loss: 0.611969] [G loss: 0.832469]\n",
"[Epoch 73/200] [Batch 43/59] [D loss: 0.607237] [G loss: 1.185167]\n",
"[Epoch 73/200] [Batch 44/59] [D loss: 0.611707] [G loss: 0.687090]\n",
"[Epoch 73/200] [Batch 45/59] [D loss: 0.594567] [G loss: 0.784175]\n",
"[Epoch 73/200] [Batch 46/59] [D loss: 0.639119] [G loss: 1.135593]\n",
"[Epoch 73/200] [Batch 47/59] [D loss: 0.655145] [G loss: 0.779426]\n",
"[Epoch 73/200] [Batch 48/59] [D loss: 0.614723] [G loss: 0.751577]\n",
"[Epoch 73/200] [Batch 49/59] [D loss: 0.604599] [G loss: 0.920747]\n",
"[Epoch 73/200] [Batch 50/59] [D loss: 0.610997] [G loss: 0.904689]\n",
"[Epoch 73/200] [Batch 51/59] [D loss: 0.623527] [G loss: 0.835181]\n",
"[Epoch 73/200] [Batch 52/59] [D loss: 0.594958] [G loss: 0.812734]\n",
"[Epoch 73/200] [Batch 53/59] [D loss: 0.639156] [G loss: 0.655640]\n",
"[Epoch 73/200] [Batch 54/59] [D loss: 0.593849] [G loss: 0.811434]\n",
"[Epoch 73/200] [Batch 55/59] [D loss: 0.611347] [G loss: 1.024133]\n",
"[Epoch 73/200] [Batch 56/59] [D loss: 0.589811] [G loss: 0.699737]\n",
"[Epoch 73/200] [Batch 57/59] [D loss: 0.604672] [G loss: 0.911689]\n",
"[Epoch 73/200] [Batch 58/59] [D loss: 0.586884] [G loss: 1.132888]\n",
"[Epoch 74/200] [Batch 0/59] [D loss: 0.613957] [G loss: 0.790156]\n",
"[Epoch 74/200] [Batch 1/59] [D loss: 0.633584] [G loss: 0.723950]\n",
"[Epoch 74/200] [Batch 2/59] [D loss: 0.637753] [G loss: 1.155303]\n",
"[Epoch 74/200] [Batch 3/59] [D loss: 0.682240] [G loss: 0.759100]\n",
"[Epoch 74/200] [Batch 4/59] [D loss: 0.612693] [G loss: 0.792673]\n",
"[Epoch 74/200] [Batch 5/59] [D loss: 0.586903] [G loss: 0.941116]\n",
"[Epoch 74/200] [Batch 6/59] [D loss: 0.599895] [G loss: 0.802963]\n",
"[Epoch 74/200] [Batch 7/59] [D loss: 0.627417] [G loss: 0.757011]\n",
"[Epoch 74/200] [Batch 8/59] [D loss: 0.635900] [G loss: 0.857067]\n",
"[Epoch 74/200] [Batch 9/59] [D loss: 0.588462] [G loss: 0.881324]\n",
"[Epoch 74/200] [Batch 10/59] [D loss: 0.624288] [G loss: 0.901136]\n",
"[Epoch 74/200] [Batch 11/59] [D loss: 0.655708] [G loss: 0.814426]\n",
"[Epoch 74/200] [Batch 12/59] [D loss: 0.615827] [G loss: 0.750352]\n",
"[Epoch 74/200] [Batch 13/59] [D loss: 0.618931] [G loss: 0.798216]\n",
"[Epoch 74/200] [Batch 14/59] [D loss: 0.609522] [G loss: 0.869251]\n",
"[Epoch 74/200] [Batch 15/59] [D loss: 0.633304] [G loss: 0.673017]\n",
"[Epoch 74/200] [Batch 16/59] [D loss: 0.643188] [G loss: 0.919434]\n",
"[Epoch 74/200] [Batch 17/59] [D loss: 0.597654] [G loss: 0.958670]\n",
"[Epoch 74/200] [Batch 18/59] [D loss: 0.581954] [G loss: 0.668009]\n",
"[Epoch 74/200] [Batch 19/59] [D loss: 0.631106] [G loss: 0.761856]\n",
"[Epoch 74/200] [Batch 20/59] [D loss: 0.624846] [G loss: 1.029688]\n",
"[Epoch 74/200] [Batch 21/59] [D loss: 0.598086] [G loss: 0.714722]\n",
"[Epoch 74/200] [Batch 22/59] [D loss: 0.586601] [G loss: 0.771517]\n",
"[Epoch 74/200] [Batch 23/59] [D loss: 0.596589] [G loss: 1.061702]\n",
"[Epoch 74/200] [Batch 24/59] [D loss: 0.603929] [G loss: 0.733583]\n",
"[Epoch 74/200] [Batch 25/59] [D loss: 0.606832] [G loss: 0.947224]\n",
"[Epoch 74/200] [Batch 26/59] [D loss: 0.623144] [G loss: 0.816753]\n",
"[Epoch 74/200] [Batch 27/59] [D loss: 0.569960] [G loss: 0.796269]\n",
"[Epoch 74/200] [Batch 28/59] [D loss: 0.657301] [G loss: 0.915784]\n",
"[Epoch 74/200] [Batch 29/59] [D loss: 0.604419] [G loss: 0.889096]\n",
"[Epoch 74/200] [Batch 30/59] [D loss: 0.609636] [G loss: 0.758985]\n",
"[Epoch 74/200] [Batch 31/59] [D loss: 0.638345] [G loss: 0.847518]\n",
"[Epoch 74/200] [Batch 32/59] [D loss: 0.576557] [G loss: 0.939202]\n",
"[Epoch 74/200] [Batch 33/59] [D loss: 0.627852] [G loss: 0.860247]\n",
"[Epoch 74/200] [Batch 34/59] [D loss: 0.627181] [G loss: 0.930719]\n",
"[Epoch 74/200] [Batch 35/59] [D loss: 0.610959] [G loss: 0.776550]\n",
"[Epoch 74/200] [Batch 36/59] [D loss: 0.589947] [G loss: 0.805375]\n",
"[Epoch 74/200] [Batch 37/59] [D loss: 0.568877] [G loss: 1.060714]\n",
"[Epoch 74/200] [Batch 38/59] [D loss: 0.576733] [G loss: 0.816999]\n",
"[Epoch 74/200] [Batch 39/59] [D loss: 0.614943] [G loss: 0.820785]\n",
"[Epoch 74/200] [Batch 40/59] [D loss: 0.567679] [G loss: 1.023789]\n",
"[Epoch 74/200] [Batch 41/59] [D loss: 0.603540] [G loss: 0.878063]\n",
"[Epoch 74/200] [Batch 42/59] [D loss: 0.602194] [G loss: 0.726796]\n",
"[Epoch 74/200] [Batch 43/59] [D loss: 0.602320] [G loss: 0.749505]\n",
"[Epoch 74/200] [Batch 44/59] [D loss: 0.657685] [G loss: 1.044859]\n",
"[Epoch 74/200] [Batch 45/59] [D loss: 0.587190] [G loss: 0.622055]\n",
"[Epoch 74/200] [Batch 46/59] [D loss: 0.633540] [G loss: 0.852463]\n",
"[Epoch 74/200] [Batch 47/59] [D loss: 0.644961] [G loss: 0.951185]\n",
"[Epoch 74/200] [Batch 48/59] [D loss: 0.628643] [G loss: 0.672852]\n",
"[Epoch 74/200] [Batch 49/59] [D loss: 0.629632] [G loss: 0.969854]\n",
"[Epoch 74/200] [Batch 50/59] [D loss: 0.614006] [G loss: 0.888086]\n",
"[Epoch 74/200] [Batch 51/59] [D loss: 0.596962] [G loss: 0.795334]\n",
"[Epoch 74/200] [Batch 52/59] [D loss: 0.626438] [G loss: 0.790734]\n",
"[Epoch 74/200] [Batch 53/59] [D loss: 0.629241] [G loss: 0.837843]\n",
"[Epoch 74/200] [Batch 54/59] [D loss: 0.617979] [G loss: 0.770909]\n",
"[Epoch 74/200] [Batch 55/59] [D loss: 0.610791] [G loss: 0.780595]\n",
"[Epoch 74/200] [Batch 56/59] [D loss: 0.614438] [G loss: 0.944581]\n",
"[Epoch 74/200] [Batch 57/59] [D loss: 0.650632] [G loss: 0.782668]\n",
"[Epoch 74/200] [Batch 58/59] [D loss: 0.640918] [G loss: 0.819873]\n",
"[Epoch 75/200] [Batch 0/59] [D loss: 0.643373] [G loss: 0.647849]\n",
"[Epoch 75/200] [Batch 1/59] [D loss: 0.639552] [G loss: 0.994782]\n",
"[Epoch 75/200] [Batch 2/59] [D loss: 0.583631] [G loss: 0.783255]\n",
"[Epoch 75/200] [Batch 3/59] [D loss: 0.636477] [G loss: 0.812234]\n",
"[Epoch 75/200] [Batch 4/59] [D loss: 0.625534] [G loss: 0.871562]\n",
"[Epoch 75/200] [Batch 5/59] [D loss: 0.599269] [G loss: 0.839003]\n",
"[Epoch 75/200] [Batch 6/59] [D loss: 0.635112] [G loss: 0.780943]\n",
"[Epoch 75/200] [Batch 7/59] [D loss: 0.651303] [G loss: 1.039566]\n",
"[Epoch 75/200] [Batch 8/59] [D loss: 0.576734] [G loss: 0.817062]\n",
"[Epoch 75/200] [Batch 9/59] [D loss: 0.598697] [G loss: 0.838236]\n",
"[Epoch 75/200] [Batch 10/59] [D loss: 0.613695] [G loss: 0.749742]\n",
"[Epoch 75/200] [Batch 11/59] [D loss: 0.583897] [G loss: 0.866789]\n",
"[Epoch 75/200] [Batch 12/59] [D loss: 0.601633] [G loss: 0.979158]\n",
"[Epoch 75/200] [Batch 13/59] [D loss: 0.576692] [G loss: 0.787807]\n",
"[Epoch 75/200] [Batch 14/59] [D loss: 0.621779] [G loss: 0.780565]\n",
"[Epoch 75/200] [Batch 15/59] [D loss: 0.640172] [G loss: 1.110565]\n",
"[Epoch 75/200] [Batch 16/59] [D loss: 0.563430] [G loss: 0.785604]\n",
"[Epoch 75/200] [Batch 17/59] [D loss: 0.608972] [G loss: 0.782722]\n",
"[Epoch 75/200] [Batch 18/59] [D loss: 0.595420] [G loss: 0.912624]\n",
"[Epoch 75/200] [Batch 19/59] [D loss: 0.565572] [G loss: 0.934806]\n",
"[Epoch 75/200] [Batch 20/59] [D loss: 0.613200] [G loss: 0.747211]\n",
"[Epoch 75/200] [Batch 21/59] [D loss: 0.694597] [G loss: 0.880641]\n",
"[Epoch 75/200] [Batch 22/59] [D loss: 0.614472] [G loss: 0.981106]\n",
"[Epoch 75/200] [Batch 23/59] [D loss: 0.624025] [G loss: 0.829214]\n",
"[Epoch 75/200] [Batch 24/59] [D loss: 0.627223] [G loss: 0.710626]\n",
"[Epoch 75/200] [Batch 25/59] [D loss: 0.620837] [G loss: 1.043869]\n",
"[Epoch 75/200] [Batch 26/59] [D loss: 0.604576] [G loss: 0.773511]\n",
"[Epoch 75/200] [Batch 27/59] [D loss: 0.637615] [G loss: 0.706551]\n",
"[Epoch 75/200] [Batch 28/59] [D loss: 0.591759] [G loss: 1.006593]\n",
"[Epoch 75/200] [Batch 29/59] [D loss: 0.628221] [G loss: 0.653552]\n",
"[Epoch 75/200] [Batch 30/59] [D loss: 0.636358] [G loss: 0.824891]\n",
"[Epoch 75/200] [Batch 31/59] [D loss: 0.632025] [G loss: 0.879216]\n",
"[Epoch 75/200] [Batch 32/59] [D loss: 0.618698] [G loss: 0.675643]\n",
"[Epoch 75/200] [Batch 33/59] [D loss: 0.618914] [G loss: 0.808496]\n",
"[Epoch 75/200] [Batch 34/59] [D loss: 0.639820] [G loss: 0.992822]\n",
"[Epoch 75/200] [Batch 35/59] [D loss: 0.579245] [G loss: 0.871719]\n",
"[Epoch 75/200] [Batch 36/59] [D loss: 0.608353] [G loss: 0.820244]\n",
"[Epoch 75/200] [Batch 37/59] [D loss: 0.588073] [G loss: 0.865827]\n",
"[Epoch 75/200] [Batch 38/59] [D loss: 0.602900] [G loss: 0.770518]\n",
"[Epoch 75/200] [Batch 39/59] [D loss: 0.641393] [G loss: 0.678105]\n",
"[Epoch 75/200] [Batch 40/59] [D loss: 0.644918] [G loss: 1.051215]\n",
"[Epoch 75/200] [Batch 41/59] [D loss: 0.656367] [G loss: 0.766135]\n",
"[Epoch 75/200] [Batch 42/59] [D loss: 0.587473] [G loss: 0.764640]\n",
"[Epoch 75/200] [Batch 43/59] [D loss: 0.637906] [G loss: 1.067010]\n",
"[Epoch 75/200] [Batch 44/59] [D loss: 0.613623] [G loss: 0.921697]\n",
"[Epoch 75/200] [Batch 45/59] [D loss: 0.633769] [G loss: 0.821398]\n",
"[Epoch 75/200] [Batch 46/59] [D loss: 0.565228] [G loss: 0.963899]\n",
"[Epoch 75/200] [Batch 47/59] [D loss: 0.600765] [G loss: 0.948257]\n",
"[Epoch 75/200] [Batch 48/59] [D loss: 0.622040] [G loss: 0.830972]\n",
"[Epoch 75/200] [Batch 49/59] [D loss: 0.647979] [G loss: 0.781730]\n",
"[Epoch 75/200] [Batch 50/59] [D loss: 0.607877] [G loss: 0.859722]\n",
"[Epoch 75/200] [Batch 51/59] [D loss: 0.691314] [G loss: 1.001108]\n",
"[Epoch 75/200] [Batch 52/59] [D loss: 0.617681] [G loss: 0.662378]\n",
"[Epoch 75/200] [Batch 53/59] [D loss: 0.635306] [G loss: 0.794811]\n",
"[Epoch 75/200] [Batch 54/59] [D loss: 0.624821] [G loss: 1.136699]\n",
"[Epoch 75/200] [Batch 55/59] [D loss: 0.635517] [G loss: 0.797206]\n",
"[Epoch 75/200] [Batch 56/59] [D loss: 0.636719] [G loss: 0.655880]\n",
"[Epoch 75/200] [Batch 57/59] [D loss: 0.608402] [G loss: 1.028390]\n",
"[Epoch 75/200] [Batch 58/59] [D loss: 0.635782] [G loss: 0.974264]\n",
"[Epoch 76/200] [Batch 0/59] [D loss: 0.654911] [G loss: 0.557953]\n",
"[Epoch 76/200] [Batch 1/59] [D loss: 0.622285] [G loss: 0.991579]\n",
"[Epoch 76/200] [Batch 2/59] [D loss: 0.657450] [G loss: 0.987529]\n",
"[Epoch 76/200] [Batch 3/59] [D loss: 0.587590] [G loss: 0.694086]\n",
"[Epoch 76/200] [Batch 4/59] [D loss: 0.609804] [G loss: 0.658703]\n",
"[Epoch 76/200] [Batch 5/59] [D loss: 0.612177] [G loss: 1.139725]\n",
"[Epoch 76/200] [Batch 6/59] [D loss: 0.603796] [G loss: 0.812683]\n",
"[Epoch 76/200] [Batch 7/59] [D loss: 0.627728] [G loss: 0.689893]\n",
"[Epoch 76/200] [Batch 8/59] [D loss: 0.593393] [G loss: 0.853722]\n",
"[Epoch 76/200] [Batch 9/59] [D loss: 0.633988] [G loss: 0.858706]\n",
"[Epoch 76/200] [Batch 10/59] [D loss: 0.653046] [G loss: 0.969221]\n",
"[Epoch 76/200] [Batch 11/59] [D loss: 0.629210] [G loss: 0.745420]\n",
"[Epoch 76/200] [Batch 12/59] [D loss: 0.665579] [G loss: 0.739291]\n",
"[Epoch 76/200] [Batch 13/59] [D loss: 0.632944] [G loss: 0.966707]\n",
"[Epoch 76/200] [Batch 14/59] [D loss: 0.645150] [G loss: 0.833777]\n",
"[Epoch 76/200] [Batch 15/59] [D loss: 0.581846] [G loss: 0.800709]\n",
"[Epoch 76/200] [Batch 16/59] [D loss: 0.621331] [G loss: 0.830489]\n",
"[Epoch 76/200] [Batch 17/59] [D loss: 0.568935] [G loss: 0.830351]\n",
"[Epoch 76/200] [Batch 18/59] [D loss: 0.583979] [G loss: 1.098030]\n",
"[Epoch 76/200] [Batch 19/59] [D loss: 0.595352] [G loss: 0.787886]\n",
"[Epoch 76/200] [Batch 20/59] [D loss: 0.619771] [G loss: 0.876295]\n",
"[Epoch 76/200] [Batch 21/59] [D loss: 0.619073] [G loss: 0.882483]\n",
"[Epoch 76/200] [Batch 22/59] [D loss: 0.620686] [G loss: 0.756180]\n",
"[Epoch 76/200] [Batch 23/59] [D loss: 0.607020] [G loss: 0.944900]\n",
"[Epoch 76/200] [Batch 24/59] [D loss: 0.622498] [G loss: 0.775309]\n",
"[Epoch 76/200] [Batch 25/59] [D loss: 0.627561] [G loss: 0.662326]\n",
"[Epoch 76/200] [Batch 26/59] [D loss: 0.587497] [G loss: 0.856808]\n",
"[Epoch 76/200] [Batch 27/59] [D loss: 0.644330] [G loss: 0.905265]\n",
"[Epoch 76/200] [Batch 28/59] [D loss: 0.590903] [G loss: 0.811562]\n",
"[Epoch 76/200] [Batch 29/59] [D loss: 0.604403] [G loss: 0.718180]\n",
"[Epoch 76/200] [Batch 30/59] [D loss: 0.586761] [G loss: 0.879520]\n",
"[Epoch 76/200] [Batch 31/59] [D loss: 0.620157] [G loss: 1.000334]\n",
"[Epoch 76/200] [Batch 32/59] [D loss: 0.602081] [G loss: 0.558013]\n",
"[Epoch 76/200] [Batch 33/59] [D loss: 0.603742] [G loss: 0.892334]\n",
"[Epoch 76/200] [Batch 34/59] [D loss: 0.674962] [G loss: 1.076115]\n",
"[Epoch 76/200] [Batch 35/59] [D loss: 0.611500] [G loss: 0.755731]\n",
"[Epoch 76/200] [Batch 36/59] [D loss: 0.620245] [G loss: 0.765264]\n",
"[Epoch 76/200] [Batch 37/59] [D loss: 0.641400] [G loss: 1.022531]\n",
"[Epoch 76/200] [Batch 38/59] [D loss: 0.597288] [G loss: 0.847020]\n",
"[Epoch 76/200] [Batch 39/59] [D loss: 0.553190] [G loss: 0.694588]\n",
"[Epoch 76/200] [Batch 40/59] [D loss: 0.597828] [G loss: 0.922037]\n",
"[Epoch 76/200] [Batch 41/59] [D loss: 0.632462] [G loss: 0.824934]\n",
"[Epoch 76/200] [Batch 42/59] [D loss: 0.634464] [G loss: 0.696245]\n",
"[Epoch 76/200] [Batch 43/59] [D loss: 0.582259] [G loss: 0.899638]\n",
"[Epoch 76/200] [Batch 44/59] [D loss: 0.600233] [G loss: 0.863883]\n",
"[Epoch 76/200] [Batch 45/59] [D loss: 0.687237] [G loss: 0.810452]\n",
"[Epoch 76/200] [Batch 46/59] [D loss: 0.570059] [G loss: 0.783125]\n",
"[Epoch 76/200] [Batch 47/59] [D loss: 0.642849] [G loss: 0.860887]\n",
"[Epoch 76/200] [Batch 48/59] [D loss: 0.616618] [G loss: 0.649453]\n",
"[Epoch 76/200] [Batch 49/59] [D loss: 0.638585] [G loss: 0.904614]\n",
"[Epoch 76/200] [Batch 50/59] [D loss: 0.599583] [G loss: 0.882590]\n",
"[Epoch 76/200] [Batch 51/59] [D loss: 0.594547] [G loss: 0.652661]\n",
"[Epoch 76/200] [Batch 52/59] [D loss: 0.655756] [G loss: 0.985838]\n",
"[Epoch 76/200] [Batch 53/59] [D loss: 0.624616] [G loss: 0.970830]\n",
"[Epoch 76/200] [Batch 54/59] [D loss: 0.614764] [G loss: 0.786555]\n",
"[Epoch 76/200] [Batch 55/59] [D loss: 0.554664] [G loss: 0.858166]\n",
"[Epoch 76/200] [Batch 56/59] [D loss: 0.608553] [G loss: 0.937287]\n",
"[Epoch 76/200] [Batch 57/59] [D loss: 0.638688] [G loss: 0.995865]\n",
"[Epoch 76/200] [Batch 58/59] [D loss: 0.633920] [G loss: 0.788144]\n",
"[Epoch 77/200] [Batch 0/59] [D loss: 0.637630] [G loss: 0.836454]\n",
"[Epoch 77/200] [Batch 1/59] [D loss: 0.601779] [G loss: 0.980793]\n",
"[Epoch 77/200] [Batch 2/59] [D loss: 0.642166] [G loss: 0.746156]\n",
"[Epoch 77/200] [Batch 3/59] [D loss: 0.653318] [G loss: 0.700446]\n",
"[Epoch 77/200] [Batch 4/59] [D loss: 0.645620] [G loss: 1.139374]\n",
"[Epoch 77/200] [Batch 5/59] [D loss: 0.642375] [G loss: 0.710977]\n",
"[Epoch 77/200] [Batch 6/59] [D loss: 0.638094] [G loss: 0.748070]\n",
"[Epoch 77/200] [Batch 7/59] [D loss: 0.661139] [G loss: 0.823947]\n",
"[Epoch 77/200] [Batch 8/59] [D loss: 0.583267] [G loss: 0.884971]\n",
"[Epoch 77/200] [Batch 9/59] [D loss: 0.603146] [G loss: 0.884926]\n",
"[Epoch 77/200] [Batch 10/59] [D loss: 0.617467] [G loss: 0.771417]\n",
"[Epoch 77/200] [Batch 11/59] [D loss: 0.655365] [G loss: 0.965973]\n",
"[Epoch 77/200] [Batch 12/59] [D loss: 0.635631] [G loss: 0.634129]\n",
"[Epoch 77/200] [Batch 13/59] [D loss: 0.592272] [G loss: 0.927387]\n",
"[Epoch 77/200] [Batch 14/59] [D loss: 0.638645] [G loss: 1.066016]\n",
"[Epoch 77/200] [Batch 15/59] [D loss: 0.591251] [G loss: 0.843197]\n",
"[Epoch 77/200] [Batch 16/59] [D loss: 0.542902] [G loss: 0.751164]\n",
"[Epoch 77/200] [Batch 17/59] [D loss: 0.587951] [G loss: 1.149647]\n",
"[Epoch 77/200] [Batch 18/59] [D loss: 0.568795] [G loss: 0.874611]\n",
"[Epoch 77/200] [Batch 19/59] [D loss: 0.603387] [G loss: 0.745661]\n",
"[Epoch 77/200] [Batch 20/59] [D loss: 0.639386] [G loss: 0.824728]\n",
"[Epoch 77/200] [Batch 21/59] [D loss: 0.664880] [G loss: 1.034939]\n",
"[Epoch 77/200] [Batch 22/59] [D loss: 0.651241] [G loss: 0.773706]\n",
"[Epoch 77/200] [Batch 23/59] [D loss: 0.641852] [G loss: 0.727546]\n",
"[Epoch 77/200] [Batch 24/59] [D loss: 0.661806] [G loss: 1.031597]\n",
"[Epoch 77/200] [Batch 25/59] [D loss: 0.641882] [G loss: 0.660146]\n",
"[Epoch 77/200] [Batch 26/59] [D loss: 0.598834] [G loss: 0.950462]\n",
"[Epoch 77/200] [Batch 27/59] [D loss: 0.612552] [G loss: 0.951917]\n",
"[Epoch 77/200] [Batch 28/59] [D loss: 0.564987] [G loss: 0.728408]\n",
"[Epoch 77/200] [Batch 29/59] [D loss: 0.618241] [G loss: 0.940862]\n",
"[Epoch 77/200] [Batch 30/59] [D loss: 0.630992] [G loss: 0.782911]\n",
"[Epoch 77/200] [Batch 31/59] [D loss: 0.589911] [G loss: 0.722125]\n",
"[Epoch 77/200] [Batch 32/59] [D loss: 0.594712] [G loss: 0.785269]\n",
"[Epoch 77/200] [Batch 33/59] [D loss: 0.605368] [G loss: 1.157631]\n",
"[Epoch 77/200] [Batch 34/59] [D loss: 0.611198] [G loss: 0.732299]\n",
"[Epoch 77/200] [Batch 35/59] [D loss: 0.619273] [G loss: 0.738423]\n",
"[Epoch 77/200] [Batch 36/59] [D loss: 0.593292] [G loss: 1.065879]\n",
"[Epoch 77/200] [Batch 37/59] [D loss: 0.631764] [G loss: 0.798949]\n",
"[Epoch 77/200] [Batch 38/59] [D loss: 0.640877] [G loss: 0.783848]\n",
"[Epoch 77/200] [Batch 39/59] [D loss: 0.621422] [G loss: 0.931364]\n",
"[Epoch 77/200] [Batch 40/59] [D loss: 0.647557] [G loss: 0.824530]\n",
"[Epoch 77/200] [Batch 41/59] [D loss: 0.596277] [G loss: 0.880628]\n",
"[Epoch 77/200] [Batch 42/59] [D loss: 0.610129] [G loss: 0.809780]\n",
"[Epoch 77/200] [Batch 43/59] [D loss: 0.648390] [G loss: 1.014668]\n",
"[Epoch 77/200] [Batch 44/59] [D loss: 0.631782] [G loss: 0.823974]\n",
"[Epoch 77/200] [Batch 45/59] [D loss: 0.578555] [G loss: 0.872502]\n",
"[Epoch 77/200] [Batch 46/59] [D loss: 0.611913] [G loss: 0.755832]\n",
"[Epoch 77/200] [Batch 47/59] [D loss: 0.612189] [G loss: 0.790978]\n",
"[Epoch 77/200] [Batch 48/59] [D loss: 0.593595] [G loss: 0.923335]\n",
"[Epoch 77/200] [Batch 49/59] [D loss: 0.636941] [G loss: 0.965518]\n",
"[Epoch 77/200] [Batch 50/59] [D loss: 0.623737] [G loss: 0.678770]\n",
"[Epoch 77/200] [Batch 51/59] [D loss: 0.625118] [G loss: 0.882934]\n",
"[Epoch 77/200] [Batch 52/59] [D loss: 0.638464] [G loss: 0.866693]\n",
"[Epoch 77/200] [Batch 53/59] [D loss: 0.621931] [G loss: 0.917709]\n",
"[Epoch 77/200] [Batch 54/59] [D loss: 0.546219] [G loss: 0.840465]\n",
"[Epoch 77/200] [Batch 55/59] [D loss: 0.626288] [G loss: 0.875431]\n",
"[Epoch 77/200] [Batch 56/59] [D loss: 0.636376] [G loss: 0.866543]\n",
"[Epoch 77/200] [Batch 57/59] [D loss: 0.614971] [G loss: 0.806211]\n",
"[Epoch 77/200] [Batch 58/59] [D loss: 0.607682] [G loss: 0.741571]\n",
"[Epoch 78/200] [Batch 0/59] [D loss: 0.615222] [G loss: 1.036712]\n",
"[Epoch 78/200] [Batch 1/59] [D loss: 0.657500] [G loss: 0.988226]\n",
"[Epoch 78/200] [Batch 2/59] [D loss: 0.587127] [G loss: 0.722164]\n",
"[Epoch 78/200] [Batch 3/59] [D loss: 0.653345] [G loss: 0.957040]\n",
"[Epoch 78/200] [Batch 4/59] [D loss: 0.697666] [G loss: 0.885664]\n",
"[Epoch 78/200] [Batch 5/59] [D loss: 0.618154] [G loss: 0.720607]\n",
"[Epoch 78/200] [Batch 6/59] [D loss: 0.591802] [G loss: 1.002880]\n",
"[Epoch 78/200] [Batch 7/59] [D loss: 0.659429] [G loss: 0.896640]\n",
"[Epoch 78/200] [Batch 8/59] [D loss: 0.620627] [G loss: 0.871527]\n",
"[Epoch 78/200] [Batch 9/59] [D loss: 0.634484] [G loss: 0.694964]\n",
"[Epoch 78/200] [Batch 10/59] [D loss: 0.612620] [G loss: 0.852968]\n",
"[Epoch 78/200] [Batch 11/59] [D loss: 0.652028] [G loss: 0.783378]\n",
"[Epoch 78/200] [Batch 12/59] [D loss: 0.633402] [G loss: 0.772519]\n",
"[Epoch 78/200] [Batch 13/59] [D loss: 0.596668] [G loss: 0.875403]\n",
"[Epoch 78/200] [Batch 14/59] [D loss: 0.529530] [G loss: 0.918755]\n",
"[Epoch 78/200] [Batch 15/59] [D loss: 0.598751] [G loss: 0.839173]\n",
"[Epoch 78/200] [Batch 16/59] [D loss: 0.599460] [G loss: 0.817438]\n",
"[Epoch 78/200] [Batch 17/59] [D loss: 0.606542] [G loss: 0.858820]\n",
"[Epoch 78/200] [Batch 18/59] [D loss: 0.680770] [G loss: 0.898593]\n",
"[Epoch 78/200] [Batch 19/59] [D loss: 0.604607] [G loss: 0.849051]\n",
"[Epoch 78/200] [Batch 20/59] [D loss: 0.587283] [G loss: 0.768182]\n",
"[Epoch 78/200] [Batch 21/59] [D loss: 0.614472] [G loss: 0.796780]\n",
"[Epoch 78/200] [Batch 22/59] [D loss: 0.667870] [G loss: 0.811612]\n",
"[Epoch 78/200] [Batch 23/59] [D loss: 0.555684] [G loss: 0.785276]\n",
"[Epoch 78/200] [Batch 24/59] [D loss: 0.596139] [G loss: 0.902551]\n",
"[Epoch 78/200] [Batch 25/59] [D loss: 0.634481] [G loss: 0.797958]\n",
"[Epoch 78/200] [Batch 26/59] [D loss: 0.632094] [G loss: 0.790082]\n",
"[Epoch 78/200] [Batch 27/59] [D loss: 0.617844] [G loss: 0.918346]\n",
"[Epoch 78/200] [Batch 28/59] [D loss: 0.629463] [G loss: 0.737759]\n",
"[Epoch 78/200] [Batch 29/59] [D loss: 0.598426] [G loss: 0.659983]\n",
"[Epoch 78/200] [Batch 30/59] [D loss: 0.626306] [G loss: 0.793589]\n",
"[Epoch 78/200] [Batch 31/59] [D loss: 0.597362] [G loss: 0.921684]\n",
"[Epoch 78/200] [Batch 32/59] [D loss: 0.618493] [G loss: 0.672428]\n",
"[Epoch 78/200] [Batch 33/59] [D loss: 0.654710] [G loss: 0.755565]\n",
"[Epoch 78/200] [Batch 34/59] [D loss: 0.600594] [G loss: 1.013310]\n",
"[Epoch 78/200] [Batch 35/59] [D loss: 0.581072] [G loss: 0.781683]\n",
"[Epoch 78/200] [Batch 36/59] [D loss: 0.605506] [G loss: 0.759213]\n",
"[Epoch 78/200] [Batch 37/59] [D loss: 0.622261] [G loss: 0.952984]\n",
"[Epoch 78/200] [Batch 38/59] [D loss: 0.626507] [G loss: 0.729917]\n",
"[Epoch 78/200] [Batch 39/59] [D loss: 0.607430] [G loss: 0.798123]\n",
"[Epoch 78/200] [Batch 40/59] [D loss: 0.595227] [G loss: 0.810476]\n",
"[Epoch 78/200] [Batch 41/59] [D loss: 0.641245] [G loss: 0.948612]\n",
"[Epoch 78/200] [Batch 42/59] [D loss: 0.616472] [G loss: 0.995507]\n",
"[Epoch 78/200] [Batch 43/59] [D loss: 0.635467] [G loss: 0.779215]\n",
"[Epoch 78/200] [Batch 44/59] [D loss: 0.573327] [G loss: 0.814657]\n",
"[Epoch 78/200] [Batch 45/59] [D loss: 0.651813] [G loss: 1.024965]\n",
"[Epoch 78/200] [Batch 46/59] [D loss: 0.599298] [G loss: 0.933129]\n",
"[Epoch 78/200] [Batch 47/59] [D loss: 0.604426] [G loss: 0.688826]\n",
"[Epoch 78/200] [Batch 48/59] [D loss: 0.612210] [G loss: 1.010052]\n",
"[Epoch 78/200] [Batch 49/59] [D loss: 0.593000] [G loss: 0.910691]\n",
"[Epoch 78/200] [Batch 50/59] [D loss: 0.646636] [G loss: 0.714222]\n",
"[Epoch 78/200] [Batch 51/59] [D loss: 0.599130] [G loss: 1.023582]\n",
"[Epoch 78/200] [Batch 52/59] [D loss: 0.620967] [G loss: 0.768683]\n",
"[Epoch 78/200] [Batch 53/59] [D loss: 0.624880] [G loss: 0.745888]\n",
"[Epoch 78/200] [Batch 54/59] [D loss: 0.592667] [G loss: 0.964892]\n",
"[Epoch 78/200] [Batch 55/59] [D loss: 0.616807] [G loss: 0.835485]\n",
"[Epoch 78/200] [Batch 56/59] [D loss: 0.634312] [G loss: 0.804759]\n",
"[Epoch 78/200] [Batch 57/59] [D loss: 0.618622] [G loss: 0.914944]\n",
"[Epoch 78/200] [Batch 58/59] [D loss: 0.597645] [G loss: 0.800134]\n",
"[Epoch 79/200] [Batch 0/59] [D loss: 0.592445] [G loss: 0.807892]\n",
"[Epoch 79/200] [Batch 1/59] [D loss: 0.609376] [G loss: 0.920422]\n",
"[Epoch 79/200] [Batch 2/59] [D loss: 0.597130] [G loss: 0.833406]\n",
"[Epoch 79/200] [Batch 3/59] [D loss: 0.595935] [G loss: 0.751587]\n",
"[Epoch 79/200] [Batch 4/59] [D loss: 0.586768] [G loss: 0.855821]\n",
"[Epoch 79/200] [Batch 5/59] [D loss: 0.645754] [G loss: 0.905094]\n",
"[Epoch 79/200] [Batch 6/59] [D loss: 0.626230] [G loss: 0.907599]\n",
"[Epoch 79/200] [Batch 7/59] [D loss: 0.630870] [G loss: 0.724442]\n",
"[Epoch 79/200] [Batch 8/59] [D loss: 0.593433] [G loss: 0.982090]\n",
"[Epoch 79/200] [Batch 9/59] [D loss: 0.595780] [G loss: 0.860384]\n",
"[Epoch 79/200] [Batch 10/59] [D loss: 0.595348] [G loss: 0.765519]\n",
"[Epoch 79/200] [Batch 11/59] [D loss: 0.604769] [G loss: 0.828785]\n",
"[Epoch 79/200] [Batch 12/59] [D loss: 0.596405] [G loss: 0.996227]\n",
"[Epoch 79/200] [Batch 13/59] [D loss: 0.571090] [G loss: 0.789729]\n",
"[Epoch 79/200] [Batch 14/59] [D loss: 0.587619] [G loss: 0.994760]\n",
"[Epoch 79/200] [Batch 15/59] [D loss: 0.587280] [G loss: 0.802792]\n",
"[Epoch 79/200] [Batch 16/59] [D loss: 0.646443] [G loss: 0.650903]\n",
"[Epoch 79/200] [Batch 17/59] [D loss: 0.655496] [G loss: 1.022918]\n",
"[Epoch 79/200] [Batch 18/59] [D loss: 0.603073] [G loss: 0.875128]\n",
"[Epoch 79/200] [Batch 19/59] [D loss: 0.642798] [G loss: 0.844660]\n",
"[Epoch 79/200] [Batch 20/59] [D loss: 0.656559] [G loss: 0.999041]\n",
"[Epoch 79/200] [Batch 21/59] [D loss: 0.604870] [G loss: 0.811974]\n",
"[Epoch 79/200] [Batch 22/59] [D loss: 0.578229] [G loss: 0.780639]\n",
"[Epoch 79/200] [Batch 23/59] [D loss: 0.601924] [G loss: 0.899424]\n",
"[Epoch 79/200] [Batch 24/59] [D loss: 0.614451] [G loss: 0.855420]\n",
"[Epoch 79/200] [Batch 25/59] [D loss: 0.610428] [G loss: 0.695615]\n",
"[Epoch 79/200] [Batch 26/59] [D loss: 0.610605] [G loss: 0.842886]\n",
"[Epoch 79/200] [Batch 27/59] [D loss: 0.620575] [G loss: 0.859144]\n",
"[Epoch 79/200] [Batch 28/59] [D loss: 0.602258] [G loss: 0.666817]\n",
"[Epoch 79/200] [Batch 29/59] [D loss: 0.628129] [G loss: 1.055517]\n",
"[Epoch 79/200] [Batch 30/59] [D loss: 0.604541] [G loss: 0.762130]\n",
"[Epoch 79/200] [Batch 31/59] [D loss: 0.601393] [G loss: 0.867153]\n",
"[Epoch 79/200] [Batch 32/59] [D loss: 0.598367] [G loss: 0.808272]\n",
"[Epoch 79/200] [Batch 33/59] [D loss: 0.609718] [G loss: 0.804044]\n",
"[Epoch 79/200] [Batch 34/59] [D loss: 0.618726] [G loss: 0.817230]\n",
"[Epoch 79/200] [Batch 35/59] [D loss: 0.606652] [G loss: 0.746539]\n",
"[Epoch 79/200] [Batch 36/59] [D loss: 0.572562] [G loss: 0.997129]\n",
"[Epoch 79/200] [Batch 37/59] [D loss: 0.606584] [G loss: 0.795155]\n",
"[Epoch 79/200] [Batch 38/59] [D loss: 0.573944] [G loss: 0.932935]\n",
"[Epoch 79/200] [Batch 39/59] [D loss: 0.670877] [G loss: 0.816423]\n",
"[Epoch 79/200] [Batch 40/59] [D loss: 0.587581] [G loss: 0.947846]\n",
"[Epoch 79/200] [Batch 41/59] [D loss: 0.551361] [G loss: 0.786759]\n",
"[Epoch 79/200] [Batch 42/59] [D loss: 0.598334] [G loss: 0.850338]\n",
"[Epoch 79/200] [Batch 43/59] [D loss: 0.606198] [G loss: 0.756822]\n",
"[Epoch 79/200] [Batch 44/59] [D loss: 0.620344] [G loss: 1.043360]\n",
"[Epoch 79/200] [Batch 45/59] [D loss: 0.583774] [G loss: 0.941296]\n",
"[Epoch 79/200] [Batch 46/59] [D loss: 0.626849] [G loss: 0.768663]\n",
"[Epoch 79/200] [Batch 47/59] [D loss: 0.613914] [G loss: 0.982384]\n",
"[Epoch 79/200] [Batch 48/59] [D loss: 0.587962] [G loss: 0.963674]\n",
"[Epoch 79/200] [Batch 49/59] [D loss: 0.540537] [G loss: 0.669793]\n",
"[Epoch 79/200] [Batch 50/59] [D loss: 0.652672] [G loss: 0.672242]\n",
"[Epoch 79/200] [Batch 51/59] [D loss: 0.635192] [G loss: 1.172108]\n",
"[Epoch 79/200] [Batch 52/59] [D loss: 0.649483] [G loss: 0.722578]\n",
"[Epoch 79/200] [Batch 53/59] [D loss: 0.621385] [G loss: 0.873151]\n",
"[Epoch 79/200] [Batch 54/59] [D loss: 0.646774] [G loss: 1.028313]\n",
"[Epoch 79/200] [Batch 55/59] [D loss: 0.564373] [G loss: 0.793116]\n",
"[Epoch 79/200] [Batch 56/59] [D loss: 0.591493] [G loss: 0.834241]\n",
"[Epoch 79/200] [Batch 57/59] [D loss: 0.611029] [G loss: 1.006179]\n",
"[Epoch 79/200] [Batch 58/59] [D loss: 0.588323] [G loss: 0.894005]\n",
"[Epoch 80/200] [Batch 0/59] [D loss: 0.581074] [G loss: 0.720799]\n",
"[Epoch 80/200] [Batch 1/59] [D loss: 0.596121] [G loss: 0.944990]\n",
"[Epoch 80/200] [Batch 2/59] [D loss: 0.660487] [G loss: 1.068706]\n",
"[Epoch 80/200] [Batch 3/59] [D loss: 0.625882] [G loss: 0.796479]\n",
"[Epoch 80/200] [Batch 4/59] [D loss: 0.563628] [G loss: 0.884343]\n",
"[Epoch 80/200] [Batch 5/59] [D loss: 0.613364] [G loss: 0.957732]\n",
"[Epoch 80/200] [Batch 6/59] [D loss: 0.611343] [G loss: 0.861509]\n",
"[Epoch 80/200] [Batch 7/59] [D loss: 0.581924] [G loss: 0.855303]\n",
"[Epoch 80/200] [Batch 8/59] [D loss: 0.681341] [G loss: 1.068616]\n",
"[Epoch 80/200] [Batch 9/59] [D loss: 0.619947] [G loss: 0.656008]\n",
"[Epoch 80/200] [Batch 10/59] [D loss: 0.623742] [G loss: 0.870317]\n",
"[Epoch 80/200] [Batch 11/59] [D loss: 0.647209] [G loss: 1.016381]\n",
"[Epoch 80/200] [Batch 12/59] [D loss: 0.643386] [G loss: 0.604839]\n",
"[Epoch 80/200] [Batch 13/59] [D loss: 0.654462] [G loss: 1.134534]\n",
"[Epoch 80/200] [Batch 14/59] [D loss: 0.592968] [G loss: 0.853605]\n",
"[Epoch 80/200] [Batch 15/59] [D loss: 0.613639] [G loss: 0.790214]\n",
"[Epoch 80/200] [Batch 16/59] [D loss: 0.588750] [G loss: 0.738963]\n",
"[Epoch 80/200] [Batch 17/59] [D loss: 0.586788] [G loss: 1.202501]\n",
"[Epoch 80/200] [Batch 18/59] [D loss: 0.621080] [G loss: 0.820708]\n",
"[Epoch 80/200] [Batch 19/59] [D loss: 0.661073] [G loss: 0.604933]\n",
"[Epoch 80/200] [Batch 20/59] [D loss: 0.644749] [G loss: 1.242885]\n",
"[Epoch 80/200] [Batch 21/59] [D loss: 0.564732] [G loss: 0.780089]\n",
"[Epoch 80/200] [Batch 22/59] [D loss: 0.618982] [G loss: 0.693925]\n",
"[Epoch 80/200] [Batch 23/59] [D loss: 0.569252] [G loss: 0.937864]\n",
"[Epoch 80/200] [Batch 24/59] [D loss: 0.600862] [G loss: 0.858393]\n",
"[Epoch 80/200] [Batch 25/59] [D loss: 0.589488] [G loss: 0.623529]\n",
"[Epoch 80/200] [Batch 26/59] [D loss: 0.604769] [G loss: 0.927629]\n",
"[Epoch 80/200] [Batch 27/59] [D loss: 0.629639] [G loss: 0.907399]\n",
"[Epoch 80/200] [Batch 28/59] [D loss: 0.560305] [G loss: 0.919220]\n",
"[Epoch 80/200] [Batch 29/59] [D loss: 0.566679] [G loss: 0.686103]\n",
"[Epoch 80/200] [Batch 30/59] [D loss: 0.594290] [G loss: 0.881463]\n",
"[Epoch 80/200] [Batch 31/59] [D loss: 0.609937] [G loss: 1.132896]\n",
"[Epoch 80/200] [Batch 32/59] [D loss: 0.591079] [G loss: 0.647364]\n",
"[Epoch 80/200] [Batch 33/59] [D loss: 0.616719] [G loss: 0.738558]\n",
"[Epoch 80/200] [Batch 34/59] [D loss: 0.616485] [G loss: 0.982487]\n",
"[Epoch 80/200] [Batch 35/59] [D loss: 0.627796] [G loss: 0.797792]\n",
"[Epoch 80/200] [Batch 36/59] [D loss: 0.657890] [G loss: 0.665344]\n",
"[Epoch 80/200] [Batch 37/59] [D loss: 0.585841] [G loss: 0.958339]\n",
"[Epoch 80/200] [Batch 38/59] [D loss: 0.613078] [G loss: 0.913260]\n",
"[Epoch 80/200] [Batch 39/59] [D loss: 0.600030] [G loss: 0.723026]\n",
"[Epoch 80/200] [Batch 40/59] [D loss: 0.644888] [G loss: 0.849425]\n",
"[Epoch 80/200] [Batch 41/59] [D loss: 0.602259] [G loss: 0.849740]\n",
"[Epoch 80/200] [Batch 42/59] [D loss: 0.568892] [G loss: 0.736629]\n",
"[Epoch 80/200] [Batch 43/59] [D loss: 0.636354] [G loss: 0.940598]\n",
"[Epoch 80/200] [Batch 44/59] [D loss: 0.616707] [G loss: 0.843715]\n",
"[Epoch 80/200] [Batch 45/59] [D loss: 0.615464] [G loss: 0.948528]\n",
"[Epoch 80/200] [Batch 46/59] [D loss: 0.608238] [G loss: 0.773042]\n",
"[Epoch 80/200] [Batch 47/59] [D loss: 0.587714] [G loss: 0.915222]\n",
"[Epoch 80/200] [Batch 48/59] [D loss: 0.615052] [G loss: 1.007997]\n",
"[Epoch 80/200] [Batch 49/59] [D loss: 0.657454] [G loss: 0.766046]\n",
"[Epoch 80/200] [Batch 50/59] [D loss: 0.593873] [G loss: 0.914419]\n",
"[Epoch 80/200] [Batch 51/59] [D loss: 0.589721] [G loss: 0.927393]\n",
"[Epoch 80/200] [Batch 52/59] [D loss: 0.575240] [G loss: 0.941968]\n",
"[Epoch 80/200] [Batch 53/59] [D loss: 0.623334] [G loss: 0.746309]\n",
"[Epoch 80/200] [Batch 54/59] [D loss: 0.597597] [G loss: 0.912960]\n",
"[Epoch 80/200] [Batch 55/59] [D loss: 0.604331] [G loss: 0.813106]\n",
"[Epoch 80/200] [Batch 56/59] [D loss: 0.607412] [G loss: 0.902141]\n",
"[Epoch 80/200] [Batch 57/59] [D loss: 0.587647] [G loss: 0.757428]\n",
"[Epoch 80/200] [Batch 58/59] [D loss: 0.603133] [G loss: 0.818588]\n",
"[Epoch 81/200] [Batch 0/59] [D loss: 0.642000] [G loss: 0.930431]\n",
"[Epoch 81/200] [Batch 1/59] [D loss: 0.578936] [G loss: 0.771862]\n",
"[Epoch 81/200] [Batch 2/59] [D loss: 0.608424] [G loss: 0.782128]\n",
"[Epoch 81/200] [Batch 3/59] [D loss: 0.642830] [G loss: 0.823212]\n",
"[Epoch 81/200] [Batch 4/59] [D loss: 0.666798] [G loss: 1.020722]\n",
"[Epoch 81/200] [Batch 5/59] [D loss: 0.612549] [G loss: 0.486949]\n",
"[Epoch 81/200] [Batch 6/59] [D loss: 0.643870] [G loss: 0.927835]\n",
"[Epoch 81/200] [Batch 7/59] [D loss: 0.617647] [G loss: 0.955011]\n",
"[Epoch 81/200] [Batch 8/59] [D loss: 0.599713] [G loss: 0.746395]\n",
"[Epoch 81/200] [Batch 9/59] [D loss: 0.617104] [G loss: 0.785600]\n",
"[Epoch 81/200] [Batch 10/59] [D loss: 0.615611] [G loss: 0.915329]\n",
"[Epoch 81/200] [Batch 11/59] [D loss: 0.644033] [G loss: 0.882748]\n",
"[Epoch 81/200] [Batch 12/59] [D loss: 0.669248] [G loss: 1.017403]\n",
"[Epoch 81/200] [Batch 13/59] [D loss: 0.598258] [G loss: 0.849370]\n",
"[Epoch 81/200] [Batch 14/59] [D loss: 0.584313] [G loss: 0.790889]\n",
"[Epoch 81/200] [Batch 15/59] [D loss: 0.592618] [G loss: 0.875847]\n",
"[Epoch 81/200] [Batch 16/59] [D loss: 0.622385] [G loss: 0.814579]\n",
"[Epoch 81/200] [Batch 17/59] [D loss: 0.620377] [G loss: 0.832353]\n",
"[Epoch 81/200] [Batch 18/59] [D loss: 0.601339] [G loss: 0.938978]\n",
"[Epoch 81/200] [Batch 19/59] [D loss: 0.615993] [G loss: 0.838800]\n",
"[Epoch 81/200] [Batch 20/59] [D loss: 0.659053] [G loss: 0.819471]\n",
"[Epoch 81/200] [Batch 21/59] [D loss: 0.570664] [G loss: 1.055049]\n",
"[Epoch 81/200] [Batch 22/59] [D loss: 0.600944] [G loss: 0.771779]\n",
"[Epoch 81/200] [Batch 23/59] [D loss: 0.629316] [G loss: 0.703210]\n",
"[Epoch 81/200] [Batch 24/59] [D loss: 0.601099] [G loss: 0.970972]\n",
"[Epoch 81/200] [Batch 25/59] [D loss: 0.632684] [G loss: 0.697767]\n",
"[Epoch 81/200] [Batch 26/59] [D loss: 0.616011] [G loss: 0.976348]\n",
"[Epoch 81/200] [Batch 27/59] [D loss: 0.575059] [G loss: 0.839686]\n",
"[Epoch 81/200] [Batch 28/59] [D loss: 0.564576] [G loss: 0.716650]\n",
"[Epoch 81/200] [Batch 29/59] [D loss: 0.661011] [G loss: 1.012420]\n",
"[Epoch 81/200] [Batch 30/59] [D loss: 0.606170] [G loss: 0.779191]\n",
"[Epoch 81/200] [Batch 31/59] [D loss: 0.647346] [G loss: 0.706235]\n",
"[Epoch 81/200] [Batch 32/59] [D loss: 0.642808] [G loss: 1.069297]\n",
"[Epoch 81/200] [Batch 33/59] [D loss: 0.619147] [G loss: 0.690314]\n",
"[Epoch 81/200] [Batch 34/59] [D loss: 0.661347] [G loss: 0.880256]\n",
"[Epoch 81/200] [Batch 35/59] [D loss: 0.612921] [G loss: 0.814419]\n",
"[Epoch 81/200] [Batch 36/59] [D loss: 0.658421] [G loss: 0.812241]\n",
"[Epoch 81/200] [Batch 37/59] [D loss: 0.619016] [G loss: 0.939267]\n",
"[Epoch 81/200] [Batch 38/59] [D loss: 0.629719] [G loss: 0.759417]\n",
"[Epoch 81/200] [Batch 39/59] [D loss: 0.626755] [G loss: 1.040031]\n",
"[Epoch 81/200] [Batch 40/59] [D loss: 0.636316] [G loss: 0.830285]\n",
"[Epoch 81/200] [Batch 41/59] [D loss: 0.567831] [G loss: 0.737686]\n",
"[Epoch 81/200] [Batch 42/59] [D loss: 0.611557] [G loss: 1.044320]\n",
"[Epoch 81/200] [Batch 43/59] [D loss: 0.601326] [G loss: 0.940311]\n",
"[Epoch 81/200] [Batch 44/59] [D loss: 0.629597] [G loss: 0.800973]\n",
"[Epoch 81/200] [Batch 45/59] [D loss: 0.560662] [G loss: 0.912100]\n",
"[Epoch 81/200] [Batch 46/59] [D loss: 0.677533] [G loss: 0.948037]\n",
"[Epoch 81/200] [Batch 47/59] [D loss: 0.631080] [G loss: 0.776809]\n",
"[Epoch 81/200] [Batch 48/59] [D loss: 0.613192] [G loss: 0.706373]\n",
"[Epoch 81/200] [Batch 49/59] [D loss: 0.625258] [G loss: 0.980177]\n",
"[Epoch 81/200] [Batch 50/59] [D loss: 0.641121] [G loss: 0.847263]\n",
"[Epoch 81/200] [Batch 51/59] [D loss: 0.599845] [G loss: 0.754326]\n",
"[Epoch 81/200] [Batch 52/59] [D loss: 0.613899] [G loss: 0.675651]\n",
"[Epoch 81/200] [Batch 53/59] [D loss: 0.578060] [G loss: 1.072487]\n",
"[Epoch 81/200] [Batch 54/59] [D loss: 0.614713] [G loss: 0.726409]\n",
"[Epoch 81/200] [Batch 55/59] [D loss: 0.578425] [G loss: 0.616123]\n",
"[Epoch 81/200] [Batch 56/59] [D loss: 0.603653] [G loss: 1.023882]\n",
"[Epoch 81/200] [Batch 57/59] [D loss: 0.567235] [G loss: 0.803644]\n",
"[Epoch 81/200] [Batch 58/59] [D loss: 0.623015] [G loss: 0.615675]\n",
"[Epoch 82/200] [Batch 0/59] [D loss: 0.648602] [G loss: 1.061874]\n",
"[Epoch 82/200] [Batch 1/59] [D loss: 0.628945] [G loss: 0.843757]\n",
"[Epoch 82/200] [Batch 2/59] [D loss: 0.617964] [G loss: 0.670517]\n",
"[Epoch 82/200] [Batch 3/59] [D loss: 0.635880] [G loss: 1.060787]\n",
"[Epoch 82/200] [Batch 4/59] [D loss: 0.578648] [G loss: 0.804950]\n",
"[Epoch 82/200] [Batch 5/59] [D loss: 0.632930] [G loss: 0.823999]\n",
"[Epoch 82/200] [Batch 6/59] [D loss: 0.607940] [G loss: 0.894133]\n",
"[Epoch 82/200] [Batch 7/59] [D loss: 0.608550] [G loss: 0.894801]\n",
"[Epoch 82/200] [Batch 8/59] [D loss: 0.619446] [G loss: 0.892106]\n",
"[Epoch 82/200] [Batch 9/59] [D loss: 0.640459] [G loss: 0.942181]\n",
"[Epoch 82/200] [Batch 10/59] [D loss: 0.572765] [G loss: 0.753715]\n",
"[Epoch 82/200] [Batch 11/59] [D loss: 0.619036] [G loss: 1.060737]\n",
"[Epoch 82/200] [Batch 12/59] [D loss: 0.633686] [G loss: 0.734391]\n",
"[Epoch 82/200] [Batch 13/59] [D loss: 0.646316] [G loss: 0.799675]\n",
"[Epoch 82/200] [Batch 14/59] [D loss: 0.644398] [G loss: 1.041854]\n",
"[Epoch 82/200] [Batch 15/59] [D loss: 0.681173] [G loss: 0.755002]\n",
"[Epoch 82/200] [Batch 16/59] [D loss: 0.551879] [G loss: 0.863734]\n",
"[Epoch 82/200] [Batch 17/59] [D loss: 0.635552] [G loss: 0.960116]\n",
"[Epoch 82/200] [Batch 18/59] [D loss: 0.608497] [G loss: 0.755196]\n",
"[Epoch 82/200] [Batch 19/59] [D loss: 0.657890] [G loss: 1.013093]\n",
"[Epoch 82/200] [Batch 20/59] [D loss: 0.590939] [G loss: 0.963814]\n",
"[Epoch 82/200] [Batch 21/59] [D loss: 0.585240] [G loss: 0.763208]\n",
"[Epoch 82/200] [Batch 22/59] [D loss: 0.618230] [G loss: 1.085814]\n",
"[Epoch 82/200] [Batch 23/59] [D loss: 0.606811] [G loss: 0.854648]\n",
"[Epoch 82/200] [Batch 24/59] [D loss: 0.627820] [G loss: 0.739205]\n",
"[Epoch 82/200] [Batch 25/59] [D loss: 0.613811] [G loss: 0.907485]\n",
"[Epoch 82/200] [Batch 26/59] [D loss: 0.578478] [G loss: 0.797827]\n",
"[Epoch 82/200] [Batch 27/59] [D loss: 0.607769] [G loss: 0.913874]\n",
"[Epoch 82/200] [Batch 28/59] [D loss: 0.592402] [G loss: 0.880185]\n",
"[Epoch 82/200] [Batch 29/59] [D loss: 0.652174] [G loss: 0.621671]\n",
"[Epoch 82/200] [Batch 30/59] [D loss: 0.615108] [G loss: 1.271120]\n",
"[Epoch 82/200] [Batch 31/59] [D loss: 0.532553] [G loss: 0.814510]\n",
"[Epoch 82/200] [Batch 32/59] [D loss: 0.612636] [G loss: 0.574311]\n",
"[Epoch 82/200] [Batch 33/59] [D loss: 0.649094] [G loss: 1.272679]\n",
"[Epoch 82/200] [Batch 34/59] [D loss: 0.603612] [G loss: 0.862985]\n",
"[Epoch 82/200] [Batch 35/59] [D loss: 0.578463] [G loss: 0.694954]\n",
"[Epoch 82/200] [Batch 36/59] [D loss: 0.584716] [G loss: 0.906147]\n",
"[Epoch 82/200] [Batch 37/59] [D loss: 0.673992] [G loss: 1.043206]\n",
"[Epoch 82/200] [Batch 38/59] [D loss: 0.624475] [G loss: 0.708554]\n",
"[Epoch 82/200] [Batch 39/59] [D loss: 0.616587] [G loss: 0.720910]\n",
"[Epoch 82/200] [Batch 40/59] [D loss: 0.649236] [G loss: 1.090541]\n",
"[Epoch 82/200] [Batch 41/59] [D loss: 0.611836] [G loss: 0.781902]\n",
"[Epoch 82/200] [Batch 42/59] [D loss: 0.564817] [G loss: 0.788139]\n",
"[Epoch 82/200] [Batch 43/59] [D loss: 0.561312] [G loss: 0.939544]\n",
"[Epoch 82/200] [Batch 44/59] [D loss: 0.615341] [G loss: 0.937970]\n",
"[Epoch 82/200] [Batch 45/59] [D loss: 0.549605] [G loss: 0.725873]\n",
"[Epoch 82/200] [Batch 46/59] [D loss: 0.599769] [G loss: 0.967028]\n",
"[Epoch 82/200] [Batch 47/59] [D loss: 0.661390] [G loss: 1.051340]\n",
"[Epoch 82/200] [Batch 48/59] [D loss: 0.616261] [G loss: 0.653406]\n",
"[Epoch 82/200] [Batch 49/59] [D loss: 0.611490] [G loss: 0.839481]\n",
"[Epoch 82/200] [Batch 50/59] [D loss: 0.653165] [G loss: 0.930219]\n",
"[Epoch 82/200] [Batch 51/59] [D loss: 0.570765] [G loss: 0.661137]\n",
"[Epoch 82/200] [Batch 52/59] [D loss: 0.577134] [G loss: 0.849373]\n",
"[Epoch 82/200] [Batch 53/59] [D loss: 0.625276] [G loss: 0.894328]\n",
"[Epoch 82/200] [Batch 54/59] [D loss: 0.584106] [G loss: 0.759805]\n",
"[Epoch 82/200] [Batch 55/59] [D loss: 0.639440] [G loss: 0.888150]\n",
"[Epoch 82/200] [Batch 56/59] [D loss: 0.608131] [G loss: 0.879239]\n",
"[Epoch 82/200] [Batch 57/59] [D loss: 0.575270] [G loss: 0.840492]\n",
"[Epoch 82/200] [Batch 58/59] [D loss: 0.599078] [G loss: 0.943662]\n",
"[Epoch 83/200] [Batch 0/59] [D loss: 0.655528] [G loss: 0.960052]\n",
"[Epoch 83/200] [Batch 1/59] [D loss: 0.620782] [G loss: 0.738462]\n",
"[Epoch 83/200] [Batch 2/59] [D loss: 0.608857] [G loss: 0.913332]\n",
"[Epoch 83/200] [Batch 3/59] [D loss: 0.577645] [G loss: 1.090383]\n",
"[Epoch 83/200] [Batch 4/59] [D loss: 0.630587] [G loss: 0.853372]\n",
"[Epoch 83/200] [Batch 5/59] [D loss: 0.605003] [G loss: 0.688565]\n",
"[Epoch 83/200] [Batch 6/59] [D loss: 0.608883] [G loss: 1.047383]\n",
"[Epoch 83/200] [Batch 7/59] [D loss: 0.589947] [G loss: 1.016548]\n",
"[Epoch 83/200] [Batch 8/59] [D loss: 0.615534] [G loss: 0.591123]\n",
"[Epoch 83/200] [Batch 9/59] [D loss: 0.615468] [G loss: 1.053214]\n",
"[Epoch 83/200] [Batch 10/59] [D loss: 0.609990] [G loss: 0.978492]\n",
"[Epoch 83/200] [Batch 11/59] [D loss: 0.581044] [G loss: 0.759457]\n",
"[Epoch 83/200] [Batch 12/59] [D loss: 0.562778] [G loss: 0.858096]\n",
"[Epoch 83/200] [Batch 13/59] [D loss: 0.629006] [G loss: 0.873076]\n",
"[Epoch 83/200] [Batch 14/59] [D loss: 0.611324] [G loss: 0.929069]\n",
"[Epoch 83/200] [Batch 15/59] [D loss: 0.601400] [G loss: 0.690675]\n",
"[Epoch 83/200] [Batch 16/59] [D loss: 0.683378] [G loss: 0.992309]\n",
"[Epoch 83/200] [Batch 17/59] [D loss: 0.632486] [G loss: 0.818873]\n",
"[Epoch 83/200] [Batch 18/59] [D loss: 0.622500] [G loss: 0.849853]\n",
"[Epoch 83/200] [Batch 19/59] [D loss: 0.656323] [G loss: 1.003344]\n",
"[Epoch 83/200] [Batch 20/59] [D loss: 0.624985] [G loss: 0.690070]\n",
"[Epoch 83/200] [Batch 21/59] [D loss: 0.623936] [G loss: 0.859349]\n",
"[Epoch 83/200] [Batch 22/59] [D loss: 0.673508] [G loss: 0.850296]\n",
"[Epoch 83/200] [Batch 23/59] [D loss: 0.599408] [G loss: 0.791739]\n",
"[Epoch 83/200] [Batch 24/59] [D loss: 0.590614] [G loss: 0.794108]\n",
"[Epoch 83/200] [Batch 25/59] [D loss: 0.640741] [G loss: 0.962041]\n",
"[Epoch 83/200] [Batch 26/59] [D loss: 0.563698] [G loss: 0.775730]\n",
"[Epoch 83/200] [Batch 27/59] [D loss: 0.600560] [G loss: 0.992082]\n",
"[Epoch 83/200] [Batch 28/59] [D loss: 0.589172] [G loss: 0.816150]\n",
"[Epoch 83/200] [Batch 29/59] [D loss: 0.608970] [G loss: 0.824287]\n",
"[Epoch 83/200] [Batch 30/59] [D loss: 0.603813] [G loss: 0.795444]\n",
"[Epoch 83/200] [Batch 31/59] [D loss: 0.628115] [G loss: 0.861314]\n",
"[Epoch 83/200] [Batch 32/59] [D loss: 0.661112] [G loss: 0.891902]\n",
"[Epoch 83/200] [Batch 33/59] [D loss: 0.626809] [G loss: 0.768580]\n",
"[Epoch 83/200] [Batch 34/59] [D loss: 0.611926] [G loss: 0.946595]\n",
"[Epoch 83/200] [Batch 35/59] [D loss: 0.564724] [G loss: 0.818461]\n",
"[Epoch 83/200] [Batch 36/59] [D loss: 0.572529] [G loss: 0.842538]\n",
"[Epoch 83/200] [Batch 37/59] [D loss: 0.611436] [G loss: 1.013881]\n",
"[Epoch 83/200] [Batch 38/59] [D loss: 0.613982] [G loss: 0.988910]\n",
"[Epoch 83/200] [Batch 39/59] [D loss: 0.609683] [G loss: 0.786225]\n",
"[Epoch 83/200] [Batch 40/59] [D loss: 0.635226] [G loss: 0.771926]\n",
"[Epoch 83/200] [Batch 41/59] [D loss: 0.645813] [G loss: 1.046544]\n",
"[Epoch 83/200] [Batch 42/59] [D loss: 0.652971] [G loss: 0.713157]\n",
"[Epoch 83/200] [Batch 43/59] [D loss: 0.675555] [G loss: 0.733042]\n",
"[Epoch 83/200] [Batch 44/59] [D loss: 0.602384] [G loss: 0.972962]\n",
"[Epoch 83/200] [Batch 45/59] [D loss: 0.655792] [G loss: 0.781549]\n",
"[Epoch 83/200] [Batch 46/59] [D loss: 0.616828] [G loss: 0.964384]\n",
"[Epoch 83/200] [Batch 47/59] [D loss: 0.607098] [G loss: 0.763806]\n",
"[Epoch 83/200] [Batch 48/59] [D loss: 0.620678] [G loss: 0.783547]\n",
"[Epoch 83/200] [Batch 49/59] [D loss: 0.622957] [G loss: 0.923070]\n",
"[Epoch 83/200] [Batch 50/59] [D loss: 0.581133] [G loss: 0.761931]\n",
"[Epoch 83/200] [Batch 51/59] [D loss: 0.606736] [G loss: 0.810632]\n",
"[Epoch 83/200] [Batch 52/59] [D loss: 0.575891] [G loss: 0.993055]\n",
"[Epoch 83/200] [Batch 53/59] [D loss: 0.621888] [G loss: 0.690503]\n",
"[Epoch 83/200] [Batch 54/59] [D loss: 0.623452] [G loss: 0.669724]\n",
"[Epoch 83/200] [Batch 55/59] [D loss: 0.617627] [G loss: 1.078491]\n",
"[Epoch 83/200] [Batch 56/59] [D loss: 0.628846] [G loss: 0.855224]\n",
"[Epoch 83/200] [Batch 57/59] [D loss: 0.571382] [G loss: 0.798587]\n",
"[Epoch 83/200] [Batch 58/59] [D loss: 0.639584] [G loss: 0.814083]\n",
"[Epoch 84/200] [Batch 0/59] [D loss: 0.670063] [G loss: 1.051372]\n",
"[Epoch 84/200] [Batch 1/59] [D loss: 0.629290] [G loss: 0.784091]\n",
"[Epoch 84/200] [Batch 2/59] [D loss: 0.587638] [G loss: 0.818763]\n",
"[Epoch 84/200] [Batch 3/59] [D loss: 0.632549] [G loss: 1.089876]\n",
"[Epoch 84/200] [Batch 4/59] [D loss: 0.580250] [G loss: 0.766462]\n",
"[Epoch 84/200] [Batch 5/59] [D loss: 0.631643] [G loss: 0.647729]\n",
"[Epoch 84/200] [Batch 6/59] [D loss: 0.655857] [G loss: 1.114937]\n",
"[Epoch 84/200] [Batch 7/59] [D loss: 0.585394] [G loss: 0.889459]\n",
"[Epoch 84/200] [Batch 8/59] [D loss: 0.656448] [G loss: 0.645554]\n",
"[Epoch 84/200] [Batch 9/59] [D loss: 0.665670] [G loss: 0.884558]\n",
"[Epoch 84/200] [Batch 10/59] [D loss: 0.606944] [G loss: 0.946914]\n",
"[Epoch 84/200] [Batch 11/59] [D loss: 0.657297] [G loss: 0.740708]\n",
"[Epoch 84/200] [Batch 12/59] [D loss: 0.643062] [G loss: 0.960675]\n",
"[Epoch 84/200] [Batch 13/59] [D loss: 0.590089] [G loss: 0.652196]\n",
"[Epoch 84/200] [Batch 14/59] [D loss: 0.619311] [G loss: 0.778645]\n",
"[Epoch 84/200] [Batch 15/59] [D loss: 0.655817] [G loss: 1.083309]\n",
"[Epoch 84/200] [Batch 16/59] [D loss: 0.616130] [G loss: 0.637857]\n",
"[Epoch 84/200] [Batch 17/59] [D loss: 0.630169] [G loss: 0.773459]\n",
"[Epoch 84/200] [Batch 18/59] [D loss: 0.588842] [G loss: 0.886767]\n",
"[Epoch 84/200] [Batch 19/59] [D loss: 0.634849] [G loss: 0.695049]\n",
"[Epoch 84/200] [Batch 20/59] [D loss: 0.564945] [G loss: 0.750931]\n",
"[Epoch 84/200] [Batch 21/59] [D loss: 0.614247] [G loss: 1.072548]\n",
"[Epoch 84/200] [Batch 22/59] [D loss: 0.586793] [G loss: 0.785543]\n",
"[Epoch 84/200] [Batch 23/59] [D loss: 0.625808] [G loss: 0.770965]\n",
"[Epoch 84/200] [Batch 24/59] [D loss: 0.653112] [G loss: 0.927910]\n",
"[Epoch 84/200] [Batch 25/59] [D loss: 0.617029] [G loss: 0.949349]\n",
"[Epoch 84/200] [Batch 26/59] [D loss: 0.663334] [G loss: 0.607643]\n",
"[Epoch 84/200] [Batch 27/59] [D loss: 0.608116] [G loss: 1.039228]\n",
"[Epoch 84/200] [Batch 28/59] [D loss: 0.584533] [G loss: 0.853276]\n",
"[Epoch 84/200] [Batch 29/59] [D loss: 0.580950] [G loss: 0.795798]\n",
"[Epoch 84/200] [Batch 30/59] [D loss: 0.620208] [G loss: 0.915508]\n",
"[Epoch 84/200] [Batch 31/59] [D loss: 0.612501] [G loss: 0.784362]\n",
"[Epoch 84/200] [Batch 32/59] [D loss: 0.595388] [G loss: 1.022832]\n",
"[Epoch 84/200] [Batch 33/59] [D loss: 0.597311] [G loss: 0.915291]\n",
"[Epoch 84/200] [Batch 34/59] [D loss: 0.651393] [G loss: 0.778169]\n",
"[Epoch 84/200] [Batch 35/59] [D loss: 0.594221] [G loss: 0.810815]\n",
"[Epoch 84/200] [Batch 36/59] [D loss: 0.607004] [G loss: 0.980628]\n",
"[Epoch 84/200] [Batch 37/59] [D loss: 0.604452] [G loss: 0.860402]\n",
"[Epoch 84/200] [Batch 38/59] [D loss: 0.614107] [G loss: 0.796189]\n",
"[Epoch 84/200] [Batch 39/59] [D loss: 0.628766] [G loss: 0.949238]\n",
"[Epoch 84/200] [Batch 40/59] [D loss: 0.585674] [G loss: 0.807703]\n",
"[Epoch 84/200] [Batch 41/59] [D loss: 0.577734] [G loss: 0.754832]\n",
"[Epoch 84/200] [Batch 42/59] [D loss: 0.608674] [G loss: 0.847375]\n",
"[Epoch 84/200] [Batch 43/59] [D loss: 0.636871] [G loss: 0.799264]\n",
"[Epoch 84/200] [Batch 44/59] [D loss: 0.606915] [G loss: 0.952277]\n",
"[Epoch 84/200] [Batch 45/59] [D loss: 0.633341] [G loss: 0.747183]\n",
"[Epoch 84/200] [Batch 46/59] [D loss: 0.574690] [G loss: 0.799090]\n",
"[Epoch 84/200] [Batch 47/59] [D loss: 0.645644] [G loss: 0.810276]\n",
"[Epoch 84/200] [Batch 48/59] [D loss: 0.585641] [G loss: 0.955312]\n",
"[Epoch 84/200] [Batch 49/59] [D loss: 0.635026] [G loss: 0.773423]\n",
"[Epoch 84/200] [Batch 50/59] [D loss: 0.638229] [G loss: 0.987140]\n",
"[Epoch 84/200] [Batch 51/59] [D loss: 0.593991] [G loss: 0.761222]\n",
"[Epoch 84/200] [Batch 52/59] [D loss: 0.595818] [G loss: 0.910285]\n",
"[Epoch 84/200] [Batch 53/59] [D loss: 0.592653] [G loss: 0.887646]\n",
"[Epoch 84/200] [Batch 54/59] [D loss: 0.618137] [G loss: 1.012433]\n",
"[Epoch 84/200] [Batch 55/59] [D loss: 0.587347] [G loss: 0.811669]\n",
"[Epoch 84/200] [Batch 56/59] [D loss: 0.607343] [G loss: 0.812306]\n",
"[Epoch 84/200] [Batch 57/59] [D loss: 0.607095] [G loss: 0.797285]\n",
"[Epoch 84/200] [Batch 58/59] [D loss: 0.603780] [G loss: 0.933481]\n",
"[Epoch 85/200] [Batch 0/59] [D loss: 0.638143] [G loss: 0.753895]\n",
"[Epoch 85/200] [Batch 1/59] [D loss: 0.648000] [G loss: 0.861362]\n",
"[Epoch 85/200] [Batch 2/59] [D loss: 0.611767] [G loss: 0.803751]\n",
"[Epoch 85/200] [Batch 3/59] [D loss: 0.616463] [G loss: 0.838120]\n",
"[Epoch 85/200] [Batch 4/59] [D loss: 0.604432] [G loss: 0.645508]\n",
"[Epoch 85/200] [Batch 5/59] [D loss: 0.684482] [G loss: 0.994152]\n",
"[Epoch 85/200] [Batch 6/59] [D loss: 0.560294] [G loss: 0.787417]\n",
"[Epoch 85/200] [Batch 7/59] [D loss: 0.603049] [G loss: 0.702419]\n",
"[Epoch 85/200] [Batch 8/59] [D loss: 0.627981] [G loss: 0.895059]\n",
"[Epoch 85/200] [Batch 9/59] [D loss: 0.636256] [G loss: 0.884462]\n",
"[Epoch 85/200] [Batch 10/59] [D loss: 0.598791] [G loss: 0.593334]\n",
"[Epoch 85/200] [Batch 11/59] [D loss: 0.606087] [G loss: 0.999200]\n",
"[Epoch 85/200] [Batch 12/59] [D loss: 0.617781] [G loss: 0.911822]\n",
"[Epoch 85/200] [Batch 13/59] [D loss: 0.586692] [G loss: 0.759838]\n",
"[Epoch 85/200] [Batch 14/59] [D loss: 0.625413] [G loss: 0.921715]\n",
"[Epoch 85/200] [Batch 15/59] [D loss: 0.635554] [G loss: 0.694951]\n",
"[Epoch 85/200] [Batch 16/59] [D loss: 0.612319] [G loss: 0.898174]\n",
"[Epoch 85/200] [Batch 17/59] [D loss: 0.644277] [G loss: 1.037483]\n",
"[Epoch 85/200] [Batch 18/59] [D loss: 0.630690] [G loss: 0.699537]\n",
"[Epoch 85/200] [Batch 19/59] [D loss: 0.612188] [G loss: 0.790841]\n",
"[Epoch 85/200] [Batch 20/59] [D loss: 0.598239] [G loss: 1.010503]\n",
"[Epoch 85/200] [Batch 21/59] [D loss: 0.598697] [G loss: 0.840870]\n",
"[Epoch 85/200] [Batch 22/59] [D loss: 0.615408] [G loss: 0.665269]\n",
"[Epoch 85/200] [Batch 23/59] [D loss: 0.563149] [G loss: 0.923917]\n",
"[Epoch 85/200] [Batch 24/59] [D loss: 0.614612] [G loss: 0.855415]\n",
"[Epoch 85/200] [Batch 25/59] [D loss: 0.591731] [G loss: 0.694559]\n",
"[Epoch 85/200] [Batch 26/59] [D loss: 0.628942] [G loss: 0.716916]\n",
"[Epoch 85/200] [Batch 27/59] [D loss: 0.613133] [G loss: 1.120901]\n",
"[Epoch 85/200] [Batch 28/59] [D loss: 0.605034] [G loss: 0.768377]\n",
"[Epoch 85/200] [Batch 29/59] [D loss: 0.574222] [G loss: 0.760273]\n",
"[Epoch 85/200] [Batch 30/59] [D loss: 0.637906] [G loss: 0.993902]\n",
"[Epoch 85/200] [Batch 31/59] [D loss: 0.662150] [G loss: 0.785296]\n",
"[Epoch 85/200] [Batch 32/59] [D loss: 0.619298] [G loss: 0.826323]\n",
"[Epoch 85/200] [Batch 33/59] [D loss: 0.585210] [G loss: 0.832660]\n",
"[Epoch 85/200] [Batch 34/59] [D loss: 0.614028] [G loss: 1.145419]\n",
"[Epoch 85/200] [Batch 35/59] [D loss: 0.567939] [G loss: 0.775971]\n",
"[Epoch 85/200] [Batch 36/59] [D loss: 0.630855] [G loss: 0.837257]\n",
"[Epoch 85/200] [Batch 37/59] [D loss: 0.594049] [G loss: 0.916464]\n",
"[Epoch 85/200] [Batch 38/59] [D loss: 0.612300] [G loss: 0.712052]\n",
"[Epoch 85/200] [Batch 39/59] [D loss: 0.627571] [G loss: 0.783550]\n",
"[Epoch 85/200] [Batch 40/59] [D loss: 0.623035] [G loss: 1.026767]\n",
"[Epoch 85/200] [Batch 41/59] [D loss: 0.630235] [G loss: 0.669896]\n",
"[Epoch 85/200] [Batch 42/59] [D loss: 0.584668] [G loss: 0.778677]\n",
"[Epoch 85/200] [Batch 43/59] [D loss: 0.647304] [G loss: 1.155906]\n",
"[Epoch 85/200] [Batch 44/59] [D loss: 0.595125] [G loss: 0.724192]\n",
"[Epoch 85/200] [Batch 45/59] [D loss: 0.583926] [G loss: 0.886008]\n",
"[Epoch 85/200] [Batch 46/59] [D loss: 0.594500] [G loss: 0.866800]\n",
"[Epoch 85/200] [Batch 47/59] [D loss: 0.604293] [G loss: 0.761825]\n",
"[Epoch 85/200] [Batch 48/59] [D loss: 0.608592] [G loss: 0.901883]\n",
"[Epoch 85/200] [Batch 49/59] [D loss: 0.586152] [G loss: 0.930586]\n",
"[Epoch 85/200] [Batch 50/59] [D loss: 0.611118] [G loss: 0.570940]\n",
"[Epoch 85/200] [Batch 51/59] [D loss: 0.604201] [G loss: 0.884368]\n",
"[Epoch 85/200] [Batch 52/59] [D loss: 0.596960] [G loss: 1.089091]\n",
"[Epoch 85/200] [Batch 53/59] [D loss: 0.602742] [G loss: 0.845128]\n",
"[Epoch 85/200] [Batch 54/59] [D loss: 0.616947] [G loss: 0.654972]\n",
"[Epoch 85/200] [Batch 55/59] [D loss: 0.657997] [G loss: 1.130117]\n",
"[Epoch 85/200] [Batch 56/59] [D loss: 0.627873] [G loss: 0.852525]\n",
"[Epoch 85/200] [Batch 57/59] [D loss: 0.623795] [G loss: 0.759920]\n",
"[Epoch 85/200] [Batch 58/59] [D loss: 0.655926] [G loss: 1.137366]\n",
"[Epoch 86/200] [Batch 0/59] [D loss: 0.579415] [G loss: 0.856089]\n",
"[Epoch 86/200] [Batch 1/59] [D loss: 0.587355] [G loss: 0.743969]\n",
"[Epoch 86/200] [Batch 2/59] [D loss: 0.603121] [G loss: 1.089359]\n",
"[Epoch 86/200] [Batch 3/59] [D loss: 0.600067] [G loss: 1.002336]\n",
"[Epoch 86/200] [Batch 4/59] [D loss: 0.620038] [G loss: 0.805022]\n",
"[Epoch 86/200] [Batch 5/59] [D loss: 0.554303] [G loss: 0.823038]\n",
"[Epoch 86/200] [Batch 6/59] [D loss: 0.590324] [G loss: 0.889481]\n",
"[Epoch 86/200] [Batch 7/59] [D loss: 0.654990] [G loss: 0.864619]\n",
"[Epoch 86/200] [Batch 8/59] [D loss: 0.682362] [G loss: 0.827398]\n",
"[Epoch 86/200] [Batch 9/59] [D loss: 0.664514] [G loss: 0.969087]\n",
"[Epoch 86/200] [Batch 10/59] [D loss: 0.586008] [G loss: 0.804541]\n",
"[Epoch 86/200] [Batch 11/59] [D loss: 0.631476] [G loss: 0.702542]\n",
"[Epoch 86/200] [Batch 12/59] [D loss: 0.673144] [G loss: 0.951167]\n",
"[Epoch 86/200] [Batch 13/59] [D loss: 0.566119] [G loss: 0.881393]\n",
"[Epoch 86/200] [Batch 14/59] [D loss: 0.675767] [G loss: 0.667037]\n",
"[Epoch 86/200] [Batch 15/59] [D loss: 0.672484] [G loss: 0.972929]\n",
"[Epoch 86/200] [Batch 16/59] [D loss: 0.589818] [G loss: 1.030217]\n",
"[Epoch 86/200] [Batch 17/59] [D loss: 0.617375] [G loss: 0.788134]\n",
"[Epoch 86/200] [Batch 18/59] [D loss: 0.614495] [G loss: 0.847375]\n",
"[Epoch 86/200] [Batch 19/59] [D loss: 0.577483] [G loss: 0.984586]\n",
"[Epoch 86/200] [Batch 20/59] [D loss: 0.571071] [G loss: 0.936519]\n",
"[Epoch 86/200] [Batch 21/59] [D loss: 0.592562] [G loss: 0.772355]\n",
"[Epoch 86/200] [Batch 22/59] [D loss: 0.663379] [G loss: 0.870951]\n",
"[Epoch 86/200] [Batch 23/59] [D loss: 0.624500] [G loss: 0.837252]\n",
"[Epoch 86/200] [Batch 24/59] [D loss: 0.588097] [G loss: 0.870599]\n",
"[Epoch 86/200] [Batch 25/59] [D loss: 0.624943] [G loss: 0.925191]\n",
"[Epoch 86/200] [Batch 26/59] [D loss: 0.636827] [G loss: 0.789690]\n",
"[Epoch 86/200] [Batch 27/59] [D loss: 0.629991] [G loss: 0.785954]\n",
"[Epoch 86/200] [Batch 28/59] [D loss: 0.605437] [G loss: 0.949279]\n",
"[Epoch 86/200] [Batch 29/59] [D loss: 0.596754] [G loss: 0.865531]\n",
"[Epoch 86/200] [Batch 30/59] [D loss: 0.582345] [G loss: 0.716262]\n",
"[Epoch 86/200] [Batch 31/59] [D loss: 0.655482] [G loss: 0.928557]\n",
"[Epoch 86/200] [Batch 32/59] [D loss: 0.615095] [G loss: 0.884674]\n",
"[Epoch 86/200] [Batch 33/59] [D loss: 0.574223] [G loss: 0.816873]\n",
"[Epoch 86/200] [Batch 34/59] [D loss: 0.642395] [G loss: 0.708783]\n",
"[Epoch 86/200] [Batch 35/59] [D loss: 0.593274] [G loss: 1.039243]\n",
"[Epoch 86/200] [Batch 36/59] [D loss: 0.577010] [G loss: 0.870290]\n",
"[Epoch 86/200] [Batch 37/59] [D loss: 0.625132] [G loss: 0.645460]\n",
"[Epoch 86/200] [Batch 38/59] [D loss: 0.577655] [G loss: 0.943921]\n",
"[Epoch 86/200] [Batch 39/59] [D loss: 0.636951] [G loss: 0.850125]\n",
"[Epoch 86/200] [Batch 40/59] [D loss: 0.624761] [G loss: 0.643283]\n",
"[Epoch 86/200] [Batch 41/59] [D loss: 0.676042] [G loss: 0.984724]\n",
"[Epoch 86/200] [Batch 42/59] [D loss: 0.629981] [G loss: 0.830881]\n",
"[Epoch 86/200] [Batch 43/59] [D loss: 0.666417] [G loss: 0.714775]\n",
"[Epoch 86/200] [Batch 44/59] [D loss: 0.637553] [G loss: 0.884905]\n",
"[Epoch 86/200] [Batch 45/59] [D loss: 0.573398] [G loss: 1.046781]\n",
"[Epoch 86/200] [Batch 46/59] [D loss: 0.675115] [G loss: 0.778389]\n",
"[Epoch 86/200] [Batch 47/59] [D loss: 0.582762] [G loss: 0.857294]\n",
"[Epoch 86/200] [Batch 48/59] [D loss: 0.610106] [G loss: 0.896220]\n",
"[Epoch 86/200] [Batch 49/59] [D loss: 0.638427] [G loss: 0.806727]\n",
"[Epoch 86/200] [Batch 50/59] [D loss: 0.562856] [G loss: 0.780056]\n",
"[Epoch 86/200] [Batch 51/59] [D loss: 0.633638] [G loss: 0.970429]\n",
"[Epoch 86/200] [Batch 52/59] [D loss: 0.574832] [G loss: 0.719008]\n",
"[Epoch 86/200] [Batch 53/59] [D loss: 0.597752] [G loss: 0.723026]\n",
"[Epoch 86/200] [Batch 54/59] [D loss: 0.610111] [G loss: 1.069374]\n",
"[Epoch 86/200] [Batch 55/59] [D loss: 0.568173] [G loss: 0.851827]\n",
"[Epoch 86/200] [Batch 56/59] [D loss: 0.599038] [G loss: 0.724032]\n",
"[Epoch 86/200] [Batch 57/59] [D loss: 0.601541] [G loss: 0.884782]\n",
"[Epoch 86/200] [Batch 58/59] [D loss: 0.654143] [G loss: 0.927740]\n",
"[Epoch 87/200] [Batch 0/59] [D loss: 0.595518] [G loss: 0.661859]\n",
"[Epoch 87/200] [Batch 1/59] [D loss: 0.628381] [G loss: 0.870984]\n",
"[Epoch 87/200] [Batch 2/59] [D loss: 0.619399] [G loss: 0.998198]\n",
"[Epoch 87/200] [Batch 3/59] [D loss: 0.638430] [G loss: 0.665024]\n",
"[Epoch 87/200] [Batch 4/59] [D loss: 0.606060] [G loss: 0.865902]\n",
"[Epoch 87/200] [Batch 5/59] [D loss: 0.605280] [G loss: 1.045851]\n",
"[Epoch 87/200] [Batch 6/59] [D loss: 0.590662] [G loss: 0.622190]\n",
"[Epoch 87/200] [Batch 7/59] [D loss: 0.636693] [G loss: 0.974260]\n",
"[Epoch 87/200] [Batch 8/59] [D loss: 0.643706] [G loss: 1.041334]\n",
"[Epoch 87/200] [Batch 9/59] [D loss: 0.578508] [G loss: 0.875005]\n",
"[Epoch 87/200] [Batch 10/59] [D loss: 0.591092] [G loss: 0.901219]\n",
"[Epoch 87/200] [Batch 11/59] [D loss: 0.657181] [G loss: 0.781149]\n",
"[Epoch 87/200] [Batch 12/59] [D loss: 0.584072] [G loss: 0.901546]\n",
"[Epoch 87/200] [Batch 13/59] [D loss: 0.636151] [G loss: 0.916661]\n",
"[Epoch 87/200] [Batch 14/59] [D loss: 0.618468] [G loss: 0.764662]\n",
"[Epoch 87/200] [Batch 15/59] [D loss: 0.569714] [G loss: 0.850112]\n",
"[Epoch 87/200] [Batch 16/59] [D loss: 0.656585] [G loss: 1.101382]\n",
"[Epoch 87/200] [Batch 17/59] [D loss: 0.566384] [G loss: 0.691114]\n",
"[Epoch 87/200] [Batch 18/59] [D loss: 0.615055] [G loss: 0.719298]\n",
"[Epoch 87/200] [Batch 19/59] [D loss: 0.609923] [G loss: 1.083007]\n",
"[Epoch 87/200] [Batch 20/59] [D loss: 0.600666] [G loss: 0.782692]\n",
"[Epoch 87/200] [Batch 21/59] [D loss: 0.620514] [G loss: 0.838684]\n",
"[Epoch 87/200] [Batch 22/59] [D loss: 0.636059] [G loss: 0.749463]\n",
"[Epoch 87/200] [Batch 23/59] [D loss: 0.589732] [G loss: 0.920669]\n",
"[Epoch 87/200] [Batch 24/59] [D loss: 0.618611] [G loss: 0.914282]\n",
"[Epoch 87/200] [Batch 25/59] [D loss: 0.607291] [G loss: 0.702887]\n",
"[Epoch 87/200] [Batch 26/59] [D loss: 0.608112] [G loss: 0.799055]\n",
"[Epoch 87/200] [Batch 27/59] [D loss: 0.590386] [G loss: 0.866231]\n",
"[Epoch 87/200] [Batch 28/59] [D loss: 0.604314] [G loss: 0.813238]\n",
"[Epoch 87/200] [Batch 29/59] [D loss: 0.578754] [G loss: 0.792694]\n",
"[Epoch 87/200] [Batch 30/59] [D loss: 0.591162] [G loss: 0.899354]\n",
"[Epoch 87/200] [Batch 31/59] [D loss: 0.608387] [G loss: 0.843511]\n",
"[Epoch 87/200] [Batch 32/59] [D loss: 0.630910] [G loss: 0.887834]\n",
"[Epoch 87/200] [Batch 33/59] [D loss: 0.625975] [G loss: 0.912615]\n",
"[Epoch 87/200] [Batch 34/59] [D loss: 0.632843] [G loss: 0.827124]\n",
"[Epoch 87/200] [Batch 35/59] [D loss: 0.580166] [G loss: 0.856653]\n",
"[Epoch 87/200] [Batch 36/59] [D loss: 0.570503] [G loss: 0.833690]\n",
"[Epoch 87/200] [Batch 37/59] [D loss: 0.568658] [G loss: 0.794674]\n",
"[Epoch 87/200] [Batch 38/59] [D loss: 0.579702] [G loss: 0.832368]\n",
"[Epoch 87/200] [Batch 39/59] [D loss: 0.633505] [G loss: 0.838415]\n",
"[Epoch 87/200] [Batch 40/59] [D loss: 0.644605] [G loss: 0.903189]\n",
"[Epoch 87/200] [Batch 41/59] [D loss: 0.583237] [G loss: 1.104957]\n",
"[Epoch 87/200] [Batch 42/59] [D loss: 0.633724] [G loss: 0.683676]\n",
"[Epoch 87/200] [Batch 43/59] [D loss: 0.621799] [G loss: 0.766880]\n",
"[Epoch 87/200] [Batch 44/59] [D loss: 0.572111] [G loss: 0.958449]\n",
"[Epoch 87/200] [Batch 45/59] [D loss: 0.623011] [G loss: 0.820195]\n",
"[Epoch 87/200] [Batch 46/59] [D loss: 0.587656] [G loss: 0.884397]\n",
"[Epoch 87/200] [Batch 47/59] [D loss: 0.611707] [G loss: 0.868264]\n",
"[Epoch 87/200] [Batch 48/59] [D loss: 0.623412] [G loss: 0.833594]\n",
"[Epoch 87/200] [Batch 49/59] [D loss: 0.600353] [G loss: 1.022924]\n",
"[Epoch 87/200] [Batch 50/59] [D loss: 0.639040] [G loss: 0.779276]\n",
"[Epoch 87/200] [Batch 51/59] [D loss: 0.619499] [G loss: 0.726858]\n",
"[Epoch 87/200] [Batch 52/59] [D loss: 0.633829] [G loss: 1.048373]\n",
"[Epoch 87/200] [Batch 53/59] [D loss: 0.624485] [G loss: 0.689844]\n",
"[Epoch 87/200] [Batch 54/59] [D loss: 0.619143] [G loss: 0.814788]\n",
"[Epoch 87/200] [Batch 55/59] [D loss: 0.611304] [G loss: 1.117918]\n",
"[Epoch 87/200] [Batch 56/59] [D loss: 0.598572] [G loss: 0.758907]\n",
"[Epoch 87/200] [Batch 57/59] [D loss: 0.651706] [G loss: 0.947818]\n",
"[Epoch 87/200] [Batch 58/59] [D loss: 0.682870] [G loss: 0.969805]\n",
"[Epoch 88/200] [Batch 0/59] [D loss: 0.636330] [G loss: 0.766213]\n",
"[Epoch 88/200] [Batch 1/59] [D loss: 0.635987] [G loss: 0.729953]\n",
"[Epoch 88/200] [Batch 2/59] [D loss: 0.613038] [G loss: 0.929236]\n",
"[Epoch 88/200] [Batch 3/59] [D loss: 0.623467] [G loss: 0.779774]\n",
"[Epoch 88/200] [Batch 4/59] [D loss: 0.608823] [G loss: 0.865505]\n",
"[Epoch 88/200] [Batch 5/59] [D loss: 0.599351] [G loss: 1.098764]\n",
"[Epoch 88/200] [Batch 6/59] [D loss: 0.604945] [G loss: 0.716594]\n",
"[Epoch 88/200] [Batch 7/59] [D loss: 0.627440] [G loss: 0.882766]\n",
"[Epoch 88/200] [Batch 8/59] [D loss: 0.644345] [G loss: 0.813251]\n",
"[Epoch 88/200] [Batch 9/59] [D loss: 0.584569] [G loss: 0.953030]\n",
"[Epoch 88/200] [Batch 10/59] [D loss: 0.594798] [G loss: 0.801278]\n",
"[Epoch 88/200] [Batch 11/59] [D loss: 0.624783] [G loss: 1.022041]\n",
"[Epoch 88/200] [Batch 12/59] [D loss: 0.613170] [G loss: 0.774131]\n",
"[Epoch 88/200] [Batch 13/59] [D loss: 0.601685] [G loss: 0.836609]\n",
"[Epoch 88/200] [Batch 14/59] [D loss: 0.583625] [G loss: 0.857802]\n",
"[Epoch 88/200] [Batch 15/59] [D loss: 0.665230] [G loss: 0.748663]\n",
"[Epoch 88/200] [Batch 16/59] [D loss: 0.561273] [G loss: 0.860596]\n",
"[Epoch 88/200] [Batch 17/59] [D loss: 0.572335] [G loss: 0.874305]\n",
"[Epoch 88/200] [Batch 18/59] [D loss: 0.605946] [G loss: 0.744605]\n",
"[Epoch 88/200] [Batch 19/59] [D loss: 0.637577] [G loss: 0.911295]\n",
"[Epoch 88/200] [Batch 20/59] [D loss: 0.590688] [G loss: 0.840135]\n",
"[Epoch 88/200] [Batch 21/59] [D loss: 0.650313] [G loss: 0.718471]\n",
"[Epoch 88/200] [Batch 22/59] [D loss: 0.629359] [G loss: 0.977147]\n",
"[Epoch 88/200] [Batch 23/59] [D loss: 0.571465] [G loss: 0.804218]\n",
"[Epoch 88/200] [Batch 24/59] [D loss: 0.636730] [G loss: 0.591917]\n",
"[Epoch 88/200] [Batch 25/59] [D loss: 0.626936] [G loss: 1.080816]\n",
"[Epoch 88/200] [Batch 26/59] [D loss: 0.584514] [G loss: 0.940197]\n",
"[Epoch 88/200] [Batch 27/59] [D loss: 0.540123] [G loss: 0.600006]\n",
"[Epoch 88/200] [Batch 28/59] [D loss: 0.656049] [G loss: 0.797997]\n",
"[Epoch 88/200] [Batch 29/59] [D loss: 0.635727] [G loss: 0.939655]\n",
"[Epoch 88/200] [Batch 30/59] [D loss: 0.609681] [G loss: 0.788928]\n",
"[Epoch 88/200] [Batch 31/59] [D loss: 0.597589] [G loss: 0.787789]\n",
"[Epoch 88/200] [Batch 32/59] [D loss: 0.636947] [G loss: 0.841129]\n",
"[Epoch 88/200] [Batch 33/59] [D loss: 0.577873] [G loss: 1.003240]\n",
"[Epoch 88/200] [Batch 34/59] [D loss: 0.599592] [G loss: 0.852928]\n",
"[Epoch 88/200] [Batch 35/59] [D loss: 0.593680] [G loss: 0.731923]\n",
"[Epoch 88/200] [Batch 36/59] [D loss: 0.646987] [G loss: 0.954979]\n",
"[Epoch 88/200] [Batch 37/59] [D loss: 0.612623] [G loss: 0.919634]\n",
"[Epoch 88/200] [Batch 38/59] [D loss: 0.639390] [G loss: 0.556036]\n",
"[Epoch 88/200] [Batch 39/59] [D loss: 0.663500] [G loss: 1.106081]\n",
"[Epoch 88/200] [Batch 40/59] [D loss: 0.609263] [G loss: 0.895579]\n",
"[Epoch 88/200] [Batch 41/59] [D loss: 0.599887] [G loss: 0.637163]\n",
"[Epoch 88/200] [Batch 42/59] [D loss: 0.587536] [G loss: 1.108267]\n",
"[Epoch 88/200] [Batch 43/59] [D loss: 0.616686] [G loss: 1.062543]\n",
"[Epoch 88/200] [Batch 44/59] [D loss: 0.638983] [G loss: 0.611620]\n",
"[Epoch 88/200] [Batch 45/59] [D loss: 0.681606] [G loss: 0.945814]\n",
"[Epoch 88/200] [Batch 46/59] [D loss: 0.635359] [G loss: 0.890504]\n",
"[Epoch 88/200] [Batch 47/59] [D loss: 0.631233] [G loss: 0.763211]\n",
"[Epoch 88/200] [Batch 48/59] [D loss: 0.571546] [G loss: 0.852032]\n",
"[Epoch 88/200] [Batch 49/59] [D loss: 0.635613] [G loss: 0.891617]\n",
"[Epoch 88/200] [Batch 50/59] [D loss: 0.597028] [G loss: 0.894416]\n",
"[Epoch 88/200] [Batch 51/59] [D loss: 0.625813] [G loss: 0.771281]\n",
"[Epoch 88/200] [Batch 52/59] [D loss: 0.578747] [G loss: 1.110275]\n",
"[Epoch 88/200] [Batch 53/59] [D loss: 0.597690] [G loss: 0.889873]\n",
"[Epoch 88/200] [Batch 54/59] [D loss: 0.608769] [G loss: 0.712053]\n",
"[Epoch 88/200] [Batch 55/59] [D loss: 0.614268] [G loss: 1.003984]\n",
"[Epoch 88/200] [Batch 56/59] [D loss: 0.608065] [G loss: 0.799698]\n",
"[Epoch 88/200] [Batch 57/59] [D loss: 0.594114] [G loss: 0.671407]\n",
"[Epoch 88/200] [Batch 58/59] [D loss: 0.629364] [G loss: 0.982975]\n",
"[Epoch 89/200] [Batch 0/59] [D loss: 0.618560] [G loss: 0.793559]\n",
"[Epoch 89/200] [Batch 1/59] [D loss: 0.574107] [G loss: 0.842057]\n",
"[Epoch 89/200] [Batch 2/59] [D loss: 0.619253] [G loss: 1.005394]\n",
"[Epoch 89/200] [Batch 3/59] [D loss: 0.627161] [G loss: 0.710598]\n",
"[Epoch 89/200] [Batch 4/59] [D loss: 0.582263] [G loss: 0.889146]\n",
"[Epoch 89/200] [Batch 5/59] [D loss: 0.562236] [G loss: 0.951539]\n",
"[Epoch 89/200] [Batch 6/59] [D loss: 0.657678] [G loss: 0.766716]\n",
"[Epoch 89/200] [Batch 7/59] [D loss: 0.610090] [G loss: 0.842598]\n",
"[Epoch 89/200] [Batch 8/59] [D loss: 0.621764] [G loss: 0.916942]\n",
"[Epoch 89/200] [Batch 9/59] [D loss: 0.630003] [G loss: 0.773753]\n",
"[Epoch 89/200] [Batch 10/59] [D loss: 0.627171] [G loss: 0.695312]\n",
"[Epoch 89/200] [Batch 11/59] [D loss: 0.592899] [G loss: 0.915831]\n",
"[Epoch 89/200] [Batch 12/59] [D loss: 0.601616] [G loss: 0.784386]\n",
"[Epoch 89/200] [Batch 13/59] [D loss: 0.617976] [G loss: 0.782847]\n",
"[Epoch 89/200] [Batch 14/59] [D loss: 0.575604] [G loss: 0.732063]\n",
"[Epoch 89/200] [Batch 15/59] [D loss: 0.582091] [G loss: 1.065570]\n",
"[Epoch 89/200] [Batch 16/59] [D loss: 0.654872] [G loss: 0.728181]\n",
"[Epoch 89/200] [Batch 17/59] [D loss: 0.621526] [G loss: 0.674745]\n",
"[Epoch 89/200] [Batch 18/59] [D loss: 0.617383] [G loss: 1.012787]\n",
"[Epoch 89/200] [Batch 19/59] [D loss: 0.641545] [G loss: 0.802252]\n",
"[Epoch 89/200] [Batch 20/59] [D loss: 0.591577] [G loss: 0.899774]\n",
"[Epoch 89/200] [Batch 21/59] [D loss: 0.602134] [G loss: 1.006358]\n",
"[Epoch 89/200] [Batch 22/59] [D loss: 0.594618] [G loss: 0.730089]\n",
"[Epoch 89/200] [Batch 23/59] [D loss: 0.604189] [G loss: 0.904923]\n",
"[Epoch 89/200] [Batch 24/59] [D loss: 0.623364] [G loss: 0.778805]\n",
"[Epoch 89/200] [Batch 25/59] [D loss: 0.620424] [G loss: 0.787137]\n",
"[Epoch 89/200] [Batch 26/59] [D loss: 0.615009] [G loss: 0.878349]\n",
"[Epoch 89/200] [Batch 27/59] [D loss: 0.556661] [G loss: 0.939470]\n",
"[Epoch 89/200] [Batch 28/59] [D loss: 0.638140] [G loss: 1.097179]\n",
"[Epoch 89/200] [Batch 29/59] [D loss: 0.602560] [G loss: 0.623538]\n",
"[Epoch 89/200] [Batch 30/59] [D loss: 0.651470] [G loss: 1.040031]\n",
"[Epoch 89/200] [Batch 31/59] [D loss: 0.639991] [G loss: 0.911796]\n",
"[Epoch 89/200] [Batch 32/59] [D loss: 0.631617] [G loss: 0.686765]\n",
"[Epoch 89/200] [Batch 33/59] [D loss: 0.616044] [G loss: 0.893121]\n",
"[Epoch 89/200] [Batch 34/59] [D loss: 0.671025] [G loss: 0.834980]\n",
"[Epoch 89/200] [Batch 35/59] [D loss: 0.614109] [G loss: 1.018234]\n",
"[Epoch 89/200] [Batch 36/59] [D loss: 0.560758] [G loss: 0.898817]\n",
"[Epoch 89/200] [Batch 37/59] [D loss: 0.621094] [G loss: 0.783768]\n",
"[Epoch 89/200] [Batch 38/59] [D loss: 0.609276] [G loss: 0.991151]\n",
"[Epoch 89/200] [Batch 39/59] [D loss: 0.672027] [G loss: 0.729438]\n",
"[Epoch 89/200] [Batch 40/59] [D loss: 0.628038] [G loss: 0.762437]\n",
"[Epoch 89/200] [Batch 41/59] [D loss: 0.650024] [G loss: 0.879314]\n",
"[Epoch 89/200] [Batch 42/59] [D loss: 0.639531] [G loss: 1.071265]\n",
"[Epoch 89/200] [Batch 43/59] [D loss: 0.591412] [G loss: 0.709833]\n",
"[Epoch 89/200] [Batch 44/59] [D loss: 0.591045] [G loss: 0.867536]\n",
"[Epoch 89/200] [Batch 45/59] [D loss: 0.629464] [G loss: 0.885842]\n",
"[Epoch 89/200] [Batch 46/59] [D loss: 0.601787] [G loss: 1.056639]\n",
"[Epoch 89/200] [Batch 47/59] [D loss: 0.626105] [G loss: 0.736740]\n",
"[Epoch 89/200] [Batch 48/59] [D loss: 0.658312] [G loss: 0.697713]\n",
"[Epoch 89/200] [Batch 49/59] [D loss: 0.589658] [G loss: 0.892012]\n",
"[Epoch 89/200] [Batch 50/59] [D loss: 0.597239] [G loss: 0.830609]\n",
"[Epoch 89/200] [Batch 51/59] [D loss: 0.597196] [G loss: 0.842952]\n",
"[Epoch 89/200] [Batch 52/59] [D loss: 0.569363] [G loss: 0.853365]\n",
"[Epoch 89/200] [Batch 53/59] [D loss: 0.620060] [G loss: 0.956972]\n",
"[Epoch 89/200] [Batch 54/59] [D loss: 0.615776] [G loss: 0.799278]\n",
"[Epoch 89/200] [Batch 55/59] [D loss: 0.604921] [G loss: 0.814766]\n",
"[Epoch 89/200] [Batch 56/59] [D loss: 0.644036] [G loss: 1.039188]\n",
"[Epoch 89/200] [Batch 57/59] [D loss: 0.693912] [G loss: 0.684046]\n",
"[Epoch 89/200] [Batch 58/59] [D loss: 0.604579] [G loss: 0.910238]\n",
"[Epoch 90/200] [Batch 0/59] [D loss: 0.553550] [G loss: 0.918874]\n",
"[Epoch 90/200] [Batch 1/59] [D loss: 0.625739] [G loss: 0.719365]\n",
"[Epoch 90/200] [Batch 2/59] [D loss: 0.609048] [G loss: 0.932824]\n",
"[Epoch 90/200] [Batch 3/59] [D loss: 0.610551] [G loss: 0.968053]\n",
"[Epoch 90/200] [Batch 4/59] [D loss: 0.609927] [G loss: 0.657562]\n",
"[Epoch 90/200] [Batch 5/59] [D loss: 0.621989] [G loss: 0.800315]\n",
"[Epoch 90/200] [Batch 6/59] [D loss: 0.576523] [G loss: 0.952538]\n",
"[Epoch 90/200] [Batch 7/59] [D loss: 0.619114] [G loss: 0.780888]\n",
"[Epoch 90/200] [Batch 8/59] [D loss: 0.568726] [G loss: 0.893120]\n",
"[Epoch 90/200] [Batch 9/59] [D loss: 0.597868] [G loss: 0.921316]\n",
"[Epoch 90/200] [Batch 10/59] [D loss: 0.633259] [G loss: 0.780512]\n",
"[Epoch 90/200] [Batch 11/59] [D loss: 0.594064] [G loss: 0.914469]\n",
"[Epoch 90/200] [Batch 12/59] [D loss: 0.595155] [G loss: 0.796514]\n",
"[Epoch 90/200] [Batch 13/59] [D loss: 0.638422] [G loss: 0.802144]\n",
"[Epoch 90/200] [Batch 14/59] [D loss: 0.639285] [G loss: 1.011203]\n",
"[Epoch 90/200] [Batch 15/59] [D loss: 0.602832] [G loss: 0.882797]\n",
"[Epoch 90/200] [Batch 16/59] [D loss: 0.586010] [G loss: 0.714551]\n",
"[Epoch 90/200] [Batch 17/59] [D loss: 0.648352] [G loss: 0.758780]\n",
"[Epoch 90/200] [Batch 18/59] [D loss: 0.649201] [G loss: 0.969106]\n",
"[Epoch 90/200] [Batch 19/59] [D loss: 0.533226] [G loss: 0.888346]\n",
"[Epoch 90/200] [Batch 20/59] [D loss: 0.617665] [G loss: 0.779053]\n",
"[Epoch 90/200] [Batch 21/59] [D loss: 0.643458] [G loss: 1.025851]\n",
"[Epoch 90/200] [Batch 22/59] [D loss: 0.610493] [G loss: 0.793684]\n",
"[Epoch 90/200] [Batch 23/59] [D loss: 0.580799] [G loss: 0.786664]\n",
"[Epoch 90/200] [Batch 24/59] [D loss: 0.673003] [G loss: 0.980204]\n",
"[Epoch 90/200] [Batch 25/59] [D loss: 0.596388] [G loss: 0.684861]\n",
"[Epoch 90/200] [Batch 26/59] [D loss: 0.562548] [G loss: 0.847258]\n",
"[Epoch 90/200] [Batch 27/59] [D loss: 0.583245] [G loss: 1.043252]\n",
"[Epoch 90/200] [Batch 28/59] [D loss: 0.600411] [G loss: 0.673935]\n",
"[Epoch 90/200] [Batch 29/59] [D loss: 0.630330] [G loss: 0.715701]\n",
"[Epoch 90/200] [Batch 30/59] [D loss: 0.677974] [G loss: 1.092638]\n",
"[Epoch 90/200] [Batch 31/59] [D loss: 0.601199] [G loss: 0.805176]\n",
"[Epoch 90/200] [Batch 32/59] [D loss: 0.576539] [G loss: 0.752604]\n",
"[Epoch 90/200] [Batch 33/59] [D loss: 0.566989] [G loss: 0.865951]\n",
"[Epoch 90/200] [Batch 34/59] [D loss: 0.607719] [G loss: 0.975903]\n",
"[Epoch 90/200] [Batch 35/59] [D loss: 0.618179] [G loss: 0.700164]\n",
"[Epoch 90/200] [Batch 36/59] [D loss: 0.603882] [G loss: 0.941935]\n",
"[Epoch 90/200] [Batch 37/59] [D loss: 0.610591] [G loss: 0.832235]\n",
"[Epoch 90/200] [Batch 38/59] [D loss: 0.573551] [G loss: 0.896054]\n",
"[Epoch 90/200] [Batch 39/59] [D loss: 0.641812] [G loss: 1.066170]\n",
"[Epoch 90/200] [Batch 40/59] [D loss: 0.624773] [G loss: 0.750984]\n",
"[Epoch 90/200] [Batch 41/59] [D loss: 0.622711] [G loss: 0.966666]\n",
"[Epoch 90/200] [Batch 42/59] [D loss: 0.582041] [G loss: 0.914176]\n",
"[Epoch 90/200] [Batch 43/59] [D loss: 0.650915] [G loss: 0.812336]\n",
"[Epoch 90/200] [Batch 44/59] [D loss: 0.634427] [G loss: 0.799743]\n",
"[Epoch 90/200] [Batch 45/59] [D loss: 0.634602] [G loss: 1.009539]\n",
"[Epoch 90/200] [Batch 46/59] [D loss: 0.612005] [G loss: 0.632789]\n",
"[Epoch 90/200] [Batch 47/59] [D loss: 0.621699] [G loss: 1.007055]\n",
"[Epoch 90/200] [Batch 48/59] [D loss: 0.614910] [G loss: 0.988249]\n",
"[Epoch 90/200] [Batch 49/59] [D loss: 0.601865] [G loss: 0.799248]\n",
"[Epoch 90/200] [Batch 50/59] [D loss: 0.595603] [G loss: 0.727952]\n",
"[Epoch 90/200] [Batch 51/59] [D loss: 0.647389] [G loss: 1.129501]\n",
"[Epoch 90/200] [Batch 52/59] [D loss: 0.587945] [G loss: 0.648217]\n",
"[Epoch 90/200] [Batch 53/59] [D loss: 0.644160] [G loss: 0.752754]\n",
"[Epoch 90/200] [Batch 54/59] [D loss: 0.628030] [G loss: 1.137415]\n",
"[Epoch 90/200] [Batch 55/59] [D loss: 0.542512] [G loss: 0.857309]\n",
"[Epoch 90/200] [Batch 56/59] [D loss: 0.632107] [G loss: 0.673348]\n",
"[Epoch 90/200] [Batch 57/59] [D loss: 0.581147] [G loss: 1.234963]\n",
"[Epoch 90/200] [Batch 58/59] [D loss: 0.573809] [G loss: 0.793861]\n",
"[Epoch 91/200] [Batch 0/59] [D loss: 0.592348] [G loss: 0.693926]\n",
"[Epoch 91/200] [Batch 1/59] [D loss: 0.637421] [G loss: 1.051340]\n",
"[Epoch 91/200] [Batch 2/59] [D loss: 0.634456] [G loss: 0.853482]\n",
"[Epoch 91/200] [Batch 3/59] [D loss: 0.600966] [G loss: 0.729720]\n",
"[Epoch 91/200] [Batch 4/59] [D loss: 0.594144] [G loss: 0.779304]\n",
"[Epoch 91/200] [Batch 5/59] [D loss: 0.580767] [G loss: 0.931615]\n",
"[Epoch 91/200] [Batch 6/59] [D loss: 0.584917] [G loss: 0.929947]\n",
"[Epoch 91/200] [Batch 7/59] [D loss: 0.610911] [G loss: 0.762439]\n",
"[Epoch 91/200] [Batch 8/59] [D loss: 0.616993] [G loss: 0.776437]\n",
"[Epoch 91/200] [Batch 9/59] [D loss: 0.580623] [G loss: 0.909934]\n",
"[Epoch 91/200] [Batch 10/59] [D loss: 0.581586] [G loss: 0.884889]\n",
"[Epoch 91/200] [Batch 11/59] [D loss: 0.596032] [G loss: 0.748327]\n",
"[Epoch 91/200] [Batch 12/59] [D loss: 0.598407] [G loss: 0.836258]\n",
"[Epoch 91/200] [Batch 13/59] [D loss: 0.584868] [G loss: 0.863639]\n",
"[Epoch 91/200] [Batch 14/59] [D loss: 0.594609] [G loss: 0.861526]\n",
"[Epoch 91/200] [Batch 15/59] [D loss: 0.570358] [G loss: 0.802579]\n",
"[Epoch 91/200] [Batch 16/59] [D loss: 0.597917] [G loss: 0.954158]\n",
"[Epoch 91/200] [Batch 17/59] [D loss: 0.570714] [G loss: 0.912050]\n",
"[Epoch 91/200] [Batch 18/59] [D loss: 0.583469] [G loss: 0.862521]\n",
"[Epoch 91/200] [Batch 19/59] [D loss: 0.578788] [G loss: 0.809110]\n",
"[Epoch 91/200] [Batch 20/59] [D loss: 0.636495] [G loss: 0.791270]\n",
"[Epoch 91/200] [Batch 21/59] [D loss: 0.625227] [G loss: 1.045126]\n",
"[Epoch 91/200] [Batch 22/59] [D loss: 0.580185] [G loss: 0.803122]\n",
"[Epoch 91/200] [Batch 23/59] [D loss: 0.650963] [G loss: 0.862569]\n",
"[Epoch 91/200] [Batch 24/59] [D loss: 0.604786] [G loss: 0.875803]\n",
"[Epoch 91/200] [Batch 25/59] [D loss: 0.629204] [G loss: 0.885544]\n",
"[Epoch 91/200] [Batch 26/59] [D loss: 0.587607] [G loss: 0.826299]\n",
"[Epoch 91/200] [Batch 27/59] [D loss: 0.606847] [G loss: 0.701884]\n",
"[Epoch 91/200] [Batch 28/59] [D loss: 0.550210] [G loss: 1.048612]\n",
"[Epoch 91/200] [Batch 29/59] [D loss: 0.598275] [G loss: 0.750669]\n",
"[Epoch 91/200] [Batch 30/59] [D loss: 0.595860] [G loss: 0.812343]\n",
"[Epoch 91/200] [Batch 31/59] [D loss: 0.596756] [G loss: 1.135051]\n",
"[Epoch 91/200] [Batch 32/59] [D loss: 0.629192] [G loss: 0.676324]\n",
"[Epoch 91/200] [Batch 33/59] [D loss: 0.618160] [G loss: 0.791412]\n",
"[Epoch 91/200] [Batch 34/59] [D loss: 0.573638] [G loss: 1.005067]\n",
"[Epoch 91/200] [Batch 35/59] [D loss: 0.623762] [G loss: 0.887564]\n",
"[Epoch 91/200] [Batch 36/59] [D loss: 0.656550] [G loss: 0.893174]\n",
"[Epoch 91/200] [Batch 37/59] [D loss: 0.560299] [G loss: 0.837964]\n",
"[Epoch 91/200] [Batch 38/59] [D loss: 0.592206] [G loss: 0.772876]\n",
"[Epoch 91/200] [Batch 39/59] [D loss: 0.630165] [G loss: 0.716753]\n",
"[Epoch 91/200] [Batch 40/59] [D loss: 0.603275] [G loss: 0.810542]\n",
"[Epoch 91/200] [Batch 41/59] [D loss: 0.613303] [G loss: 0.986851]\n",
"[Epoch 91/200] [Batch 42/59] [D loss: 0.632010] [G loss: 0.884800]\n",
"[Epoch 91/200] [Batch 43/59] [D loss: 0.587455] [G loss: 0.756589]\n",
"[Epoch 91/200] [Batch 44/59] [D loss: 0.604519] [G loss: 0.789226]\n",
"[Epoch 91/200] [Batch 45/59] [D loss: 0.610089] [G loss: 0.874290]\n",
"[Epoch 91/200] [Batch 46/59] [D loss: 0.632360] [G loss: 0.863540]\n",
"[Epoch 91/200] [Batch 47/59] [D loss: 0.587036] [G loss: 0.929745]\n",
"[Epoch 91/200] [Batch 48/59] [D loss: 0.633682] [G loss: 0.824802]\n",
"[Epoch 91/200] [Batch 49/59] [D loss: 0.603476] [G loss: 0.810899]\n",
"[Epoch 91/200] [Batch 50/59] [D loss: 0.578176] [G loss: 0.918708]\n",
"[Epoch 91/200] [Batch 51/59] [D loss: 0.598132] [G loss: 0.789163]\n",
"[Epoch 91/200] [Batch 52/59] [D loss: 0.542487] [G loss: 0.842094]\n",
"[Epoch 91/200] [Batch 53/59] [D loss: 0.634267] [G loss: 0.860551]\n",
"[Epoch 91/200] [Batch 54/59] [D loss: 0.577567] [G loss: 0.780236]\n",
"[Epoch 91/200] [Batch 55/59] [D loss: 0.616850] [G loss: 0.760133]\n",
"[Epoch 91/200] [Batch 56/59] [D loss: 0.590452] [G loss: 1.065869]\n",
"[Epoch 91/200] [Batch 57/59] [D loss: 0.582144] [G loss: 0.781143]\n",
"[Epoch 91/200] [Batch 58/59] [D loss: 0.644313] [G loss: 0.660161]\n",
"[Epoch 92/200] [Batch 0/59] [D loss: 0.625642] [G loss: 1.078960]\n",
"[Epoch 92/200] [Batch 1/59] [D loss: 0.558187] [G loss: 0.815601]\n",
"[Epoch 92/200] [Batch 2/59] [D loss: 0.618236] [G loss: 0.715375]\n",
"[Epoch 92/200] [Batch 3/59] [D loss: 0.623760] [G loss: 1.100143]\n",
"[Epoch 92/200] [Batch 4/59] [D loss: 0.601687] [G loss: 0.781904]\n",
"[Epoch 92/200] [Batch 5/59] [D loss: 0.542303] [G loss: 0.787535]\n",
"[Epoch 92/200] [Batch 6/59] [D loss: 0.637009] [G loss: 0.885703]\n",
"[Epoch 92/200] [Batch 7/59] [D loss: 0.624493] [G loss: 1.119714]\n",
"[Epoch 92/200] [Batch 8/59] [D loss: 0.591702] [G loss: 0.654420]\n",
"[Epoch 92/200] [Batch 9/59] [D loss: 0.605635] [G loss: 0.818337]\n",
"[Epoch 92/200] [Batch 10/59] [D loss: 0.635990] [G loss: 1.069099]\n",
"[Epoch 92/200] [Batch 11/59] [D loss: 0.587852] [G loss: 0.838631]\n",
"[Epoch 92/200] [Batch 12/59] [D loss: 0.627964] [G loss: 0.782676]\n",
"[Epoch 92/200] [Batch 13/59] [D loss: 0.689576] [G loss: 1.093107]\n",
"[Epoch 92/200] [Batch 14/59] [D loss: 0.617005] [G loss: 0.726320]\n",
"[Epoch 92/200] [Batch 15/59] [D loss: 0.618558] [G loss: 0.884426]\n",
"[Epoch 92/200] [Batch 16/59] [D loss: 0.590619] [G loss: 0.899565]\n",
"[Epoch 92/200] [Batch 17/59] [D loss: 0.591368] [G loss: 0.802235]\n",
"[Epoch 92/200] [Batch 18/59] [D loss: 0.650629] [G loss: 0.829404]\n",
"[Epoch 92/200] [Batch 19/59] [D loss: 0.643621] [G loss: 1.036299]\n",
"[Epoch 92/200] [Batch 20/59] [D loss: 0.561264] [G loss: 0.806126]\n",
"[Epoch 92/200] [Batch 21/59] [D loss: 0.627977] [G loss: 0.719766]\n",
"[Epoch 92/200] [Batch 22/59] [D loss: 0.614735] [G loss: 1.256870]\n",
"[Epoch 92/200] [Batch 23/59] [D loss: 0.631134] [G loss: 0.852078]\n",
"[Epoch 92/200] [Batch 24/59] [D loss: 0.586412] [G loss: 0.790147]\n",
"[Epoch 92/200] [Batch 25/59] [D loss: 0.640191] [G loss: 1.024127]\n",
"[Epoch 92/200] [Batch 26/59] [D loss: 0.602342] [G loss: 0.864985]\n",
"[Epoch 92/200] [Batch 27/59] [D loss: 0.678178] [G loss: 0.602923]\n",
"[Epoch 92/200] [Batch 28/59] [D loss: 0.630524] [G loss: 1.103837]\n",
"[Epoch 92/200] [Batch 29/59] [D loss: 0.608185] [G loss: 0.826101]\n",
"[Epoch 92/200] [Batch 30/59] [D loss: 0.619232] [G loss: 0.787612]\n",
"[Epoch 92/200] [Batch 31/59] [D loss: 0.601048] [G loss: 1.251373]\n",
"[Epoch 92/200] [Batch 32/59] [D loss: 0.539604] [G loss: 0.824720]\n",
"[Epoch 92/200] [Batch 33/59] [D loss: 0.605319] [G loss: 0.692108]\n",
"[Epoch 92/200] [Batch 34/59] [D loss: 0.555829] [G loss: 1.318718]\n",
"[Epoch 92/200] [Batch 35/59] [D loss: 0.587680] [G loss: 0.875318]\n",
"[Epoch 92/200] [Batch 36/59] [D loss: 0.618997] [G loss: 0.687976]\n",
"[Epoch 92/200] [Batch 37/59] [D loss: 0.588442] [G loss: 1.090015]\n",
"[Epoch 92/200] [Batch 38/59] [D loss: 0.635933] [G loss: 0.840959]\n",
"[Epoch 92/200] [Batch 39/59] [D loss: 0.602952] [G loss: 0.749178]\n",
"[Epoch 92/200] [Batch 40/59] [D loss: 0.620224] [G loss: 1.026200]\n",
"[Epoch 92/200] [Batch 41/59] [D loss: 0.620306] [G loss: 0.911387]\n",
"[Epoch 92/200] [Batch 42/59] [D loss: 0.606092] [G loss: 0.783416]\n",
"[Epoch 92/200] [Batch 43/59] [D loss: 0.610791] [G loss: 0.849970]\n",
"[Epoch 92/200] [Batch 44/59] [D loss: 0.564022] [G loss: 0.757770]\n",
"[Epoch 92/200] [Batch 45/59] [D loss: 0.582275] [G loss: 0.860271]\n",
"[Epoch 92/200] [Batch 46/59] [D loss: 0.653261] [G loss: 0.842391]\n",
"[Epoch 92/200] [Batch 47/59] [D loss: 0.560250] [G loss: 0.802352]\n",
"[Epoch 92/200] [Batch 48/59] [D loss: 0.600672] [G loss: 1.032415]\n",
"[Epoch 92/200] [Batch 49/59] [D loss: 0.578569] [G loss: 0.845041]\n",
"[Epoch 92/200] [Batch 50/59] [D loss: 0.660172] [G loss: 0.698402]\n",
"[Epoch 92/200] [Batch 51/59] [D loss: 0.609767] [G loss: 0.761282]\n",
"[Epoch 92/200] [Batch 52/59] [D loss: 0.682200] [G loss: 1.055679]\n",
"[Epoch 92/200] [Batch 53/59] [D loss: 0.595697] [G loss: 0.802479]\n",
"[Epoch 92/200] [Batch 54/59] [D loss: 0.568926] [G loss: 0.835718]\n",
"[Epoch 92/200] [Batch 55/59] [D loss: 0.558428] [G loss: 0.951336]\n",
"[Epoch 92/200] [Batch 56/59] [D loss: 0.542000] [G loss: 0.959890]\n",
"[Epoch 92/200] [Batch 57/59] [D loss: 0.631799] [G loss: 0.709783]\n",
"[Epoch 92/200] [Batch 58/59] [D loss: 0.605186] [G loss: 1.039766]\n",
"[Epoch 93/200] [Batch 0/59] [D loss: 0.595022] [G loss: 0.892758]\n",
"[Epoch 93/200] [Batch 1/59] [D loss: 0.633605] [G loss: 0.681832]\n",
"[Epoch 93/200] [Batch 2/59] [D loss: 0.657428] [G loss: 1.053155]\n",
"[Epoch 93/200] [Batch 3/59] [D loss: 0.601304] [G loss: 0.801483]\n",
"[Epoch 93/200] [Batch 4/59] [D loss: 0.585480] [G loss: 0.842068]\n",
"[Epoch 93/200] [Batch 5/59] [D loss: 0.575115] [G loss: 1.057056]\n",
"[Epoch 93/200] [Batch 6/59] [D loss: 0.609460] [G loss: 0.815015]\n",
"[Epoch 93/200] [Batch 7/59] [D loss: 0.619743] [G loss: 0.858176]\n",
"[Epoch 93/200] [Batch 8/59] [D loss: 0.645641] [G loss: 0.804270]\n",
"[Epoch 93/200] [Batch 9/59] [D loss: 0.652307] [G loss: 0.831779]\n",
"[Epoch 93/200] [Batch 10/59] [D loss: 0.588609] [G loss: 0.838149]\n",
"[Epoch 93/200] [Batch 11/59] [D loss: 0.632391] [G loss: 0.854766]\n",
"[Epoch 93/200] [Batch 12/59] [D loss: 0.623693] [G loss: 0.895363]\n",
"[Epoch 93/200] [Batch 13/59] [D loss: 0.576037] [G loss: 0.655483]\n",
"[Epoch 93/200] [Batch 14/59] [D loss: 0.579915] [G loss: 1.048783]\n",
"[Epoch 93/200] [Batch 15/59] [D loss: 0.609985] [G loss: 0.946740]\n",
"[Epoch 93/200] [Batch 16/59] [D loss: 0.623525] [G loss: 0.695139]\n",
"[Epoch 93/200] [Batch 17/59] [D loss: 0.614838] [G loss: 1.016478]\n",
"[Epoch 93/200] [Batch 18/59] [D loss: 0.674252] [G loss: 0.817238]\n",
"[Epoch 93/200] [Batch 19/59] [D loss: 0.581091] [G loss: 0.817039]\n",
"[Epoch 93/200] [Batch 20/59] [D loss: 0.589487] [G loss: 0.981021]\n",
"[Epoch 93/200] [Batch 21/59] [D loss: 0.608049] [G loss: 1.009048]\n",
"[Epoch 93/200] [Batch 22/59] [D loss: 0.663854] [G loss: 0.662415]\n",
"[Epoch 93/200] [Batch 23/59] [D loss: 0.596764] [G loss: 0.880744]\n",
"[Epoch 93/200] [Batch 24/59] [D loss: 0.678715] [G loss: 0.962219]\n",
"[Epoch 93/200] [Batch 25/59] [D loss: 0.641771] [G loss: 0.683640]\n",
"[Epoch 93/200] [Batch 26/59] [D loss: 0.638699] [G loss: 0.864485]\n",
"[Epoch 93/200] [Batch 27/59] [D loss: 0.636267] [G loss: 1.296810]\n",
"[Epoch 93/200] [Batch 28/59] [D loss: 0.541159] [G loss: 0.674344]\n",
"[Epoch 93/200] [Batch 29/59] [D loss: 0.591884] [G loss: 0.693733]\n",
"[Epoch 93/200] [Batch 30/59] [D loss: 0.586260] [G loss: 1.016336]\n",
"[Epoch 93/200] [Batch 31/59] [D loss: 0.613669] [G loss: 0.676825]\n",
"[Epoch 93/200] [Batch 32/59] [D loss: 0.683095] [G loss: 0.982939]\n",
"[Epoch 93/200] [Batch 33/59] [D loss: 0.591430] [G loss: 0.741181]\n",
"[Epoch 93/200] [Batch 34/59] [D loss: 0.629830] [G loss: 0.682126]\n",
"[Epoch 93/200] [Batch 35/59] [D loss: 0.658191] [G loss: 1.101891]\n",
"[Epoch 93/200] [Batch 36/59] [D loss: 0.601662] [G loss: 0.908254]\n",
"[Epoch 93/200] [Batch 37/59] [D loss: 0.628910] [G loss: 0.715530]\n",
"[Epoch 93/200] [Batch 38/59] [D loss: 0.614666] [G loss: 1.092075]\n",
"[Epoch 93/200] [Batch 39/59] [D loss: 0.570126] [G loss: 0.768739]\n",
"[Epoch 93/200] [Batch 40/59] [D loss: 0.615104] [G loss: 0.723529]\n",
"[Epoch 93/200] [Batch 41/59] [D loss: 0.631474] [G loss: 0.866683]\n",
"[Epoch 93/200] [Batch 42/59] [D loss: 0.610858] [G loss: 1.227855]\n",
"[Epoch 93/200] [Batch 43/59] [D loss: 0.612728] [G loss: 0.845521]\n",
"[Epoch 93/200] [Batch 44/59] [D loss: 0.639771] [G loss: 0.496919]\n",
"[Epoch 93/200] [Batch 45/59] [D loss: 0.639998] [G loss: 1.282861]\n",
"[Epoch 93/200] [Batch 46/59] [D loss: 0.594885] [G loss: 0.861448]\n",
"[Epoch 93/200] [Batch 47/59] [D loss: 0.609625] [G loss: 0.570179]\n",
"[Epoch 93/200] [Batch 48/59] [D loss: 0.587029] [G loss: 1.108144]\n",
"[Epoch 93/200] [Batch 49/59] [D loss: 0.617147] [G loss: 0.834139]\n",
"[Epoch 93/200] [Batch 50/59] [D loss: 0.550432] [G loss: 0.814698]\n",
"[Epoch 93/200] [Batch 51/59] [D loss: 0.585774] [G loss: 0.744142]\n",
"[Epoch 93/200] [Batch 52/59] [D loss: 0.575361] [G loss: 1.055560]\n",
"[Epoch 93/200] [Batch 53/59] [D loss: 0.584166] [G loss: 0.754146]\n",
"[Epoch 93/200] [Batch 54/59] [D loss: 0.567104] [G loss: 0.730233]\n",
"[Epoch 93/200] [Batch 55/59] [D loss: 0.641921] [G loss: 1.103828]\n",
"[Epoch 93/200] [Batch 56/59] [D loss: 0.637346] [G loss: 0.890556]\n",
"[Epoch 93/200] [Batch 57/59] [D loss: 0.607968] [G loss: 0.827387]\n",
"[Epoch 93/200] [Batch 58/59] [D loss: 0.597376] [G loss: 0.803360]\n",
"[Epoch 94/200] [Batch 0/59] [D loss: 0.629196] [G loss: 0.740284]\n",
"[Epoch 94/200] [Batch 1/59] [D loss: 0.629880] [G loss: 0.819272]\n",
"[Epoch 94/200] [Batch 2/59] [D loss: 0.622505] [G loss: 0.881985]\n",
"[Epoch 94/200] [Batch 3/59] [D loss: 0.604944] [G loss: 0.830851]\n",
"[Epoch 94/200] [Batch 4/59] [D loss: 0.588573] [G loss: 0.920518]\n",
"[Epoch 94/200] [Batch 5/59] [D loss: 0.600510] [G loss: 0.849861]\n",
"[Epoch 94/200] [Batch 6/59] [D loss: 0.570783] [G loss: 0.855796]\n",
"[Epoch 94/200] [Batch 7/59] [D loss: 0.663121] [G loss: 0.963307]\n",
"[Epoch 94/200] [Batch 8/59] [D loss: 0.577783] [G loss: 0.829514]\n",
"[Epoch 94/200] [Batch 9/59] [D loss: 0.635502] [G loss: 0.754557]\n",
"[Epoch 94/200] [Batch 10/59] [D loss: 0.641492] [G loss: 0.890080]\n",
"[Epoch 94/200] [Batch 11/59] [D loss: 0.643283] [G loss: 0.792614]\n",
"[Epoch 94/200] [Batch 12/59] [D loss: 0.558741] [G loss: 0.855381]\n",
"[Epoch 94/200] [Batch 13/59] [D loss: 0.616304] [G loss: 0.668923]\n",
"[Epoch 94/200] [Batch 14/59] [D loss: 0.595493] [G loss: 0.980807]\n",
"[Epoch 94/200] [Batch 15/59] [D loss: 0.604821] [G loss: 0.814062]\n",
"[Epoch 94/200] [Batch 16/59] [D loss: 0.629975] [G loss: 0.756014]\n",
"[Epoch 94/200] [Batch 17/59] [D loss: 0.585333] [G loss: 0.841198]\n",
"[Epoch 94/200] [Batch 18/59] [D loss: 0.605839] [G loss: 0.877917]\n",
"[Epoch 94/200] [Batch 19/59] [D loss: 0.633083] [G loss: 0.727876]\n",
"[Epoch 94/200] [Batch 20/59] [D loss: 0.553547] [G loss: 0.889235]\n",
"[Epoch 94/200] [Batch 21/59] [D loss: 0.567202] [G loss: 1.060595]\n",
"[Epoch 94/200] [Batch 22/59] [D loss: 0.536812] [G loss: 0.793954]\n",
"[Epoch 94/200] [Batch 23/59] [D loss: 0.626488] [G loss: 0.786094]\n",
"[Epoch 94/200] [Batch 24/59] [D loss: 0.573221] [G loss: 0.987643]\n",
"[Epoch 94/200] [Batch 25/59] [D loss: 0.595593] [G loss: 0.744007]\n",
"[Epoch 94/200] [Batch 26/59] [D loss: 0.609021] [G loss: 0.830595]\n",
"[Epoch 94/200] [Batch 27/59] [D loss: 0.615306] [G loss: 0.918375]\n",
"[Epoch 94/200] [Batch 28/59] [D loss: 0.656706] [G loss: 0.758101]\n",
"[Epoch 94/200] [Batch 29/59] [D loss: 0.610439] [G loss: 1.131785]\n",
"[Epoch 94/200] [Batch 30/59] [D loss: 0.604282] [G loss: 0.746412]\n",
"[Epoch 94/200] [Batch 31/59] [D loss: 0.645967] [G loss: 0.798401]\n",
"[Epoch 94/200] [Batch 32/59] [D loss: 0.589028] [G loss: 0.929401]\n",
"[Epoch 94/200] [Batch 33/59] [D loss: 0.650119] [G loss: 0.677397]\n",
"[Epoch 94/200] [Batch 34/59] [D loss: 0.603785] [G loss: 0.835951]\n",
"[Epoch 94/200] [Batch 35/59] [D loss: 0.635957] [G loss: 1.100726]\n",
"[Epoch 94/200] [Batch 36/59] [D loss: 0.547236] [G loss: 0.721895]\n",
"[Epoch 94/200] [Batch 37/59] [D loss: 0.613311] [G loss: 0.708334]\n",
"[Epoch 94/200] [Batch 38/59] [D loss: 0.655945] [G loss: 1.293751]\n",
"[Epoch 94/200] [Batch 39/59] [D loss: 0.647336] [G loss: 0.660783]\n",
"[Epoch 94/200] [Batch 40/59] [D loss: 0.612580] [G loss: 0.802352]\n",
"[Epoch 94/200] [Batch 41/59] [D loss: 0.608948] [G loss: 1.011356]\n",
"[Epoch 94/200] [Batch 42/59] [D loss: 0.582407] [G loss: 0.653563]\n",
"[Epoch 94/200] [Batch 43/59] [D loss: 0.588455] [G loss: 0.956689]\n",
"[Epoch 94/200] [Batch 44/59] [D loss: 0.608718] [G loss: 0.930722]\n",
"[Epoch 94/200] [Batch 45/59] [D loss: 0.603492] [G loss: 0.654044]\n",
"[Epoch 94/200] [Batch 46/59] [D loss: 0.622205] [G loss: 1.059085]\n",
"[Epoch 94/200] [Batch 47/59] [D loss: 0.593178] [G loss: 0.966207]\n",
"[Epoch 94/200] [Batch 48/59] [D loss: 0.657062] [G loss: 0.665164]\n",
"[Epoch 94/200] [Batch 49/59] [D loss: 0.596731] [G loss: 1.035002]\n",
"[Epoch 94/200] [Batch 50/59] [D loss: 0.641895] [G loss: 0.809352]\n",
"[Epoch 94/200] [Batch 51/59] [D loss: 0.619225] [G loss: 0.949602]\n",
"[Epoch 94/200] [Batch 52/59] [D loss: 0.598125] [G loss: 0.846796]\n",
"[Epoch 94/200] [Batch 53/59] [D loss: 0.605448] [G loss: 0.892407]\n",
"[Epoch 94/200] [Batch 54/59] [D loss: 0.638117] [G loss: 0.836699]\n",
"[Epoch 94/200] [Batch 55/59] [D loss: 0.606222] [G loss: 1.000919]\n",
"[Epoch 94/200] [Batch 56/59] [D loss: 0.612472] [G loss: 1.080308]\n",
"[Epoch 94/200] [Batch 57/59] [D loss: 0.604505] [G loss: 0.721581]\n",
"[Epoch 94/200] [Batch 58/59] [D loss: 0.574199] [G loss: 0.730129]\n",
"[Epoch 95/200] [Batch 0/59] [D loss: 0.659315] [G loss: 1.165298]\n",
"[Epoch 95/200] [Batch 1/59] [D loss: 0.669350] [G loss: 0.650963]\n",
"[Epoch 95/200] [Batch 2/59] [D loss: 0.611748] [G loss: 0.800465]\n",
"[Epoch 95/200] [Batch 3/59] [D loss: 0.593913] [G loss: 0.776775]\n",
"[Epoch 95/200] [Batch 4/59] [D loss: 0.593483] [G loss: 0.934263]\n",
"[Epoch 95/200] [Batch 5/59] [D loss: 0.639429] [G loss: 0.830410]\n",
"[Epoch 95/200] [Batch 6/59] [D loss: 0.634815] [G loss: 0.802542]\n",
"[Epoch 95/200] [Batch 7/59] [D loss: 0.604998] [G loss: 1.080555]\n",
"[Epoch 95/200] [Batch 8/59] [D loss: 0.669255] [G loss: 0.941953]\n",
"[Epoch 95/200] [Batch 9/59] [D loss: 0.679702] [G loss: 0.641311]\n",
"[Epoch 95/200] [Batch 10/59] [D loss: 0.555982] [G loss: 0.912535]\n",
"[Epoch 95/200] [Batch 11/59] [D loss: 0.594965] [G loss: 1.088725]\n",
"[Epoch 95/200] [Batch 12/59] [D loss: 0.561360] [G loss: 0.936918]\n",
"[Epoch 95/200] [Batch 13/59] [D loss: 0.671803] [G loss: 0.614136]\n",
"[Epoch 95/200] [Batch 14/59] [D loss: 0.612037] [G loss: 1.054346]\n",
"[Epoch 95/200] [Batch 15/59] [D loss: 0.652794] [G loss: 0.909187]\n",
"[Epoch 95/200] [Batch 16/59] [D loss: 0.537568] [G loss: 0.766114]\n",
"[Epoch 95/200] [Batch 17/59] [D loss: 0.549690] [G loss: 0.968750]\n",
"[Epoch 95/200] [Batch 18/59] [D loss: 0.607511] [G loss: 1.029803]\n",
"[Epoch 95/200] [Batch 19/59] [D loss: 0.662645] [G loss: 0.616303]\n",
"[Epoch 95/200] [Batch 20/59] [D loss: 0.632254] [G loss: 0.871204]\n",
"[Epoch 95/200] [Batch 21/59] [D loss: 0.597930] [G loss: 1.068937]\n",
"[Epoch 95/200] [Batch 22/59] [D loss: 0.608006] [G loss: 0.728057]\n",
"[Epoch 95/200] [Batch 23/59] [D loss: 0.600588] [G loss: 0.914468]\n",
"[Epoch 95/200] [Batch 24/59] [D loss: 0.597471] [G loss: 0.936856]\n",
"[Epoch 95/200] [Batch 25/59] [D loss: 0.607270] [G loss: 0.903238]\n",
"[Epoch 95/200] [Batch 26/59] [D loss: 0.597915] [G loss: 0.795818]\n",
"[Epoch 95/200] [Batch 27/59] [D loss: 0.648139] [G loss: 0.743623]\n",
"[Epoch 95/200] [Batch 28/59] [D loss: 0.640062] [G loss: 1.025759]\n",
"[Epoch 95/200] [Batch 29/59] [D loss: 0.580622] [G loss: 0.715136]\n",
"[Epoch 95/200] [Batch 30/59] [D loss: 0.575314] [G loss: 0.866397]\n",
"[Epoch 95/200] [Batch 31/59] [D loss: 0.584425] [G loss: 0.920994]\n",
"[Epoch 95/200] [Batch 32/59] [D loss: 0.655556] [G loss: 0.631212]\n",
"[Epoch 95/200] [Batch 33/59] [D loss: 0.555251] [G loss: 0.919284]\n",
"[Epoch 95/200] [Batch 34/59] [D loss: 0.599082] [G loss: 0.894145]\n",
"[Epoch 95/200] [Batch 35/59] [D loss: 0.651494] [G loss: 0.695175]\n",
"[Epoch 95/200] [Batch 36/59] [D loss: 0.578225] [G loss: 1.063279]\n",
"[Epoch 95/200] [Batch 37/59] [D loss: 0.642191] [G loss: 1.070476]\n",
"[Epoch 95/200] [Batch 38/59] [D loss: 0.575612] [G loss: 0.633196]\n",
"[Epoch 95/200] [Batch 39/59] [D loss: 0.590058] [G loss: 0.665425]\n",
"[Epoch 95/200] [Batch 40/59] [D loss: 0.585499] [G loss: 1.195518]\n",
"[Epoch 95/200] [Batch 41/59] [D loss: 0.567308] [G loss: 0.806793]\n",
"[Epoch 95/200] [Batch 42/59] [D loss: 0.687040] [G loss: 0.559720]\n",
"[Epoch 95/200] [Batch 43/59] [D loss: 0.568116] [G loss: 1.183795]\n",
"[Epoch 95/200] [Batch 44/59] [D loss: 0.577849] [G loss: 0.882609]\n",
"[Epoch 95/200] [Batch 45/59] [D loss: 0.643314] [G loss: 0.625995]\n",
"[Epoch 95/200] [Batch 46/59] [D loss: 0.577598] [G loss: 1.082209]\n",
"[Epoch 95/200] [Batch 47/59] [D loss: 0.585393] [G loss: 0.839826]\n",
"[Epoch 95/200] [Batch 48/59] [D loss: 0.593036] [G loss: 0.845374]\n",
"[Epoch 95/200] [Batch 49/59] [D loss: 0.620236] [G loss: 0.908683]\n",
"[Epoch 95/200] [Batch 50/59] [D loss: 0.628183] [G loss: 0.842572]\n",
"[Epoch 95/200] [Batch 51/59] [D loss: 0.613817] [G loss: 0.816454]\n",
"[Epoch 95/200] [Batch 52/59] [D loss: 0.628214] [G loss: 0.896390]\n",
"[Epoch 95/200] [Batch 53/59] [D loss: 0.635484] [G loss: 0.911788]\n",
"[Epoch 95/200] [Batch 54/59] [D loss: 0.609056] [G loss: 0.796057]\n",
"[Epoch 95/200] [Batch 55/59] [D loss: 0.615855] [G loss: 0.896545]\n",
"[Epoch 95/200] [Batch 56/59] [D loss: 0.590253] [G loss: 0.826619]\n",
"[Epoch 95/200] [Batch 57/59] [D loss: 0.584670] [G loss: 1.028965]\n",
"[Epoch 95/200] [Batch 58/59] [D loss: 0.670088] [G loss: 0.916370]\n",
"[Epoch 96/200] [Batch 0/59] [D loss: 0.606995] [G loss: 0.660607]\n",
"[Epoch 96/200] [Batch 1/59] [D loss: 0.638749] [G loss: 0.678675]\n",
"[Epoch 96/200] [Batch 2/59] [D loss: 0.638124] [G loss: 1.077113]\n",
"[Epoch 96/200] [Batch 3/59] [D loss: 0.574235] [G loss: 0.894068]\n",
"[Epoch 96/200] [Batch 4/59] [D loss: 0.568231] [G loss: 0.844649]\n",
"[Epoch 96/200] [Batch 5/59] [D loss: 0.569601] [G loss: 0.831957]\n",
"[Epoch 96/200] [Batch 6/59] [D loss: 0.600757] [G loss: 0.860448]\n",
"[Epoch 96/200] [Batch 7/59] [D loss: 0.608622] [G loss: 0.847214]\n",
"[Epoch 96/200] [Batch 8/59] [D loss: 0.586456] [G loss: 0.977871]\n",
"[Epoch 96/200] [Batch 9/59] [D loss: 0.580519] [G loss: 0.819293]\n",
"[Epoch 96/200] [Batch 10/59] [D loss: 0.584075] [G loss: 0.947195]\n",
"[Epoch 96/200] [Batch 11/59] [D loss: 0.618452] [G loss: 0.906382]\n",
"[Epoch 96/200] [Batch 12/59] [D loss: 0.647110] [G loss: 0.726912]\n",
"[Epoch 96/200] [Batch 13/59] [D loss: 0.616585] [G loss: 0.979303]\n",
"[Epoch 96/200] [Batch 14/59] [D loss: 0.616267] [G loss: 0.975400]\n",
"[Epoch 96/200] [Batch 15/59] [D loss: 0.552410] [G loss: 0.761865]\n",
"[Epoch 96/200] [Batch 16/59] [D loss: 0.632024] [G loss: 0.789171]\n",
"[Epoch 96/200] [Batch 17/59] [D loss: 0.539461] [G loss: 0.905258]\n",
"[Epoch 96/200] [Batch 18/59] [D loss: 0.646335] [G loss: 0.931927]\n",
"[Epoch 96/200] [Batch 19/59] [D loss: 0.591486] [G loss: 0.944077]\n",
"[Epoch 96/200] [Batch 20/59] [D loss: 0.630229] [G loss: 0.854751]\n",
"[Epoch 96/200] [Batch 21/59] [D loss: 0.584993] [G loss: 0.876686]\n",
"[Epoch 96/200] [Batch 22/59] [D loss: 0.593609] [G loss: 0.839045]\n",
"[Epoch 96/200] [Batch 23/59] [D loss: 0.633800] [G loss: 0.678574]\n",
"[Epoch 96/200] [Batch 24/59] [D loss: 0.597996] [G loss: 0.848340]\n",
"[Epoch 96/200] [Batch 25/59] [D loss: 0.631943] [G loss: 1.033771]\n",
"[Epoch 96/200] [Batch 26/59] [D loss: 0.623046] [G loss: 0.702353]\n",
"[Epoch 96/200] [Batch 27/59] [D loss: 0.572217] [G loss: 0.788969]\n",
"[Epoch 96/200] [Batch 28/59] [D loss: 0.650555] [G loss: 0.941444]\n",
"[Epoch 96/200] [Batch 29/59] [D loss: 0.616152] [G loss: 0.908180]\n",
"[Epoch 96/200] [Batch 30/59] [D loss: 0.592957] [G loss: 0.703576]\n",
"[Epoch 96/200] [Batch 31/59] [D loss: 0.617316] [G loss: 1.057970]\n",
"[Epoch 96/200] [Batch 32/59] [D loss: 0.653970] [G loss: 0.860632]\n",
"[Epoch 96/200] [Batch 33/59] [D loss: 0.625172] [G loss: 0.787523]\n",
"[Epoch 96/200] [Batch 34/59] [D loss: 0.630939] [G loss: 1.217013]\n",
"[Epoch 96/200] [Batch 35/59] [D loss: 0.614460] [G loss: 0.661542]\n",
"[Epoch 96/200] [Batch 36/59] [D loss: 0.538955] [G loss: 0.950834]\n",
"[Epoch 96/200] [Batch 37/59] [D loss: 0.536626] [G loss: 0.799485]\n",
"[Epoch 96/200] [Batch 38/59] [D loss: 0.588593] [G loss: 0.880066]\n",
"[Epoch 96/200] [Batch 39/59] [D loss: 0.585927] [G loss: 0.932614]\n",
"[Epoch 96/200] [Batch 40/59] [D loss: 0.522896] [G loss: 0.701139]\n",
"[Epoch 96/200] [Batch 41/59] [D loss: 0.631288] [G loss: 1.074685]\n",
"[Epoch 96/200] [Batch 42/59] [D loss: 0.568270] [G loss: 0.815746]\n",
"[Epoch 96/200] [Batch 43/59] [D loss: 0.638533] [G loss: 0.780766]\n",
"[Epoch 96/200] [Batch 44/59] [D loss: 0.547925] [G loss: 0.852520]\n",
"[Epoch 96/200] [Batch 45/59] [D loss: 0.583294] [G loss: 0.830408]\n",
"[Epoch 96/200] [Batch 46/59] [D loss: 0.544761] [G loss: 0.846196]\n",
"[Epoch 96/200] [Batch 47/59] [D loss: 0.552970] [G loss: 0.958872]\n",
"[Epoch 96/200] [Batch 48/59] [D loss: 0.618718] [G loss: 0.748372]\n",
"[Epoch 96/200] [Batch 49/59] [D loss: 0.628052] [G loss: 0.812430]\n",
"[Epoch 96/200] [Batch 50/59] [D loss: 0.622280] [G loss: 0.877321]\n",
"[Epoch 96/200] [Batch 51/59] [D loss: 0.587269] [G loss: 0.874572]\n",
"[Epoch 96/200] [Batch 52/59] [D loss: 0.634850] [G loss: 0.948070]\n",
"[Epoch 96/200] [Batch 53/59] [D loss: 0.612270] [G loss: 0.739700]\n",
"[Epoch 96/200] [Batch 54/59] [D loss: 0.588581] [G loss: 1.007474]\n",
"[Epoch 96/200] [Batch 55/59] [D loss: 0.651812] [G loss: 0.943364]\n",
"[Epoch 96/200] [Batch 56/59] [D loss: 0.654199] [G loss: 0.765690]\n",
"[Epoch 96/200] [Batch 57/59] [D loss: 0.566493] [G loss: 0.889951]\n",
"[Epoch 96/200] [Batch 58/59] [D loss: 0.603614] [G loss: 0.766077]\n",
"[Epoch 97/200] [Batch 0/59] [D loss: 0.588346] [G loss: 0.874726]\n",
"[Epoch 97/200] [Batch 1/59] [D loss: 0.556112] [G loss: 0.836911]\n",
"[Epoch 97/200] [Batch 2/59] [D loss: 0.560519] [G loss: 0.838657]\n",
"[Epoch 97/200] [Batch 3/59] [D loss: 0.578848] [G loss: 0.901026]\n",
"[Epoch 97/200] [Batch 4/59] [D loss: 0.634577] [G loss: 0.902607]\n",
"[Epoch 97/200] [Batch 5/59] [D loss: 0.589354] [G loss: 0.765968]\n",
"[Epoch 97/200] [Batch 6/59] [D loss: 0.611715] [G loss: 0.848713]\n",
"[Epoch 97/200] [Batch 7/59] [D loss: 0.581061] [G loss: 1.052713]\n",
"[Epoch 97/200] [Batch 8/59] [D loss: 0.597489] [G loss: 0.741380]\n",
"[Epoch 97/200] [Batch 9/59] [D loss: 0.605644] [G loss: 1.013587]\n",
"[Epoch 97/200] [Batch 10/59] [D loss: 0.634446] [G loss: 0.810133]\n",
"[Epoch 97/200] [Batch 11/59] [D loss: 0.621542] [G loss: 0.749139]\n",
"[Epoch 97/200] [Batch 12/59] [D loss: 0.613877] [G loss: 1.030080]\n",
"[Epoch 97/200] [Batch 13/59] [D loss: 0.589570] [G loss: 0.841322]\n",
"[Epoch 97/200] [Batch 14/59] [D loss: 0.552294] [G loss: 0.801539]\n",
"[Epoch 97/200] [Batch 15/59] [D loss: 0.633722] [G loss: 0.914205]\n",
"[Epoch 97/200] [Batch 16/59] [D loss: 0.651441] [G loss: 0.868790]\n",
"[Epoch 97/200] [Batch 17/59] [D loss: 0.620626] [G loss: 0.903447]\n",
"[Epoch 97/200] [Batch 18/59] [D loss: 0.563062] [G loss: 0.746909]\n",
"[Epoch 97/200] [Batch 19/59] [D loss: 0.580455] [G loss: 0.911293]\n",
"[Epoch 97/200] [Batch 20/59] [D loss: 0.618696] [G loss: 0.799109]\n",
"[Epoch 97/200] [Batch 21/59] [D loss: 0.625681] [G loss: 0.777583]\n",
"[Epoch 97/200] [Batch 22/59] [D loss: 0.542470] [G loss: 0.897077]\n",
"[Epoch 97/200] [Batch 23/59] [D loss: 0.571080] [G loss: 0.972069]\n",
"[Epoch 97/200] [Batch 24/59] [D loss: 0.598296] [G loss: 0.850914]\n",
"[Epoch 97/200] [Batch 25/59] [D loss: 0.630617] [G loss: 0.944574]\n",
"[Epoch 97/200] [Batch 26/59] [D loss: 0.581193] [G loss: 0.912225]\n",
"[Epoch 97/200] [Batch 27/59] [D loss: 0.618633] [G loss: 0.803859]\n",
"[Epoch 97/200] [Batch 28/59] [D loss: 0.572287] [G loss: 0.875222]\n",
"[Epoch 97/200] [Batch 29/59] [D loss: 0.607025] [G loss: 0.929872]\n",
"[Epoch 97/200] [Batch 30/59] [D loss: 0.596182] [G loss: 0.932929]\n",
"[Epoch 97/200] [Batch 31/59] [D loss: 0.613755] [G loss: 0.777591]\n",
"[Epoch 97/200] [Batch 32/59] [D loss: 0.630388] [G loss: 0.941859]\n",
"[Epoch 97/200] [Batch 33/59] [D loss: 0.563674] [G loss: 0.880884]\n",
"[Epoch 97/200] [Batch 34/59] [D loss: 0.595540] [G loss: 0.951033]\n",
"[Epoch 97/200] [Batch 35/59] [D loss: 0.589005] [G loss: 0.858359]\n",
"[Epoch 97/200] [Batch 36/59] [D loss: 0.601170] [G loss: 0.833866]\n",
"[Epoch 97/200] [Batch 37/59] [D loss: 0.609631] [G loss: 0.936752]\n",
"[Epoch 97/200] [Batch 38/59] [D loss: 0.606783] [G loss: 1.107298]\n",
"[Epoch 97/200] [Batch 39/59] [D loss: 0.586426] [G loss: 0.942949]\n",
"[Epoch 97/200] [Batch 40/59] [D loss: 0.602690] [G loss: 0.849234]\n",
"[Epoch 97/200] [Batch 41/59] [D loss: 0.607240] [G loss: 0.885069]\n",
"[Epoch 97/200] [Batch 42/59] [D loss: 0.608310] [G loss: 0.537968]\n",
"[Epoch 97/200] [Batch 43/59] [D loss: 0.546703] [G loss: 0.931130]\n",
"[Epoch 97/200] [Batch 44/59] [D loss: 0.621788] [G loss: 1.093552]\n",
"[Epoch 97/200] [Batch 45/59] [D loss: 0.617001] [G loss: 0.675555]\n",
"[Epoch 97/200] [Batch 46/59] [D loss: 0.615994] [G loss: 0.914404]\n",
"[Epoch 97/200] [Batch 47/59] [D loss: 0.589588] [G loss: 1.073539]\n",
"[Epoch 97/200] [Batch 48/59] [D loss: 0.578967] [G loss: 0.794649]\n",
"[Epoch 97/200] [Batch 49/59] [D loss: 0.612261] [G loss: 1.076636]\n",
"[Epoch 97/200] [Batch 50/59] [D loss: 0.598431] [G loss: 0.693824]\n",
"[Epoch 97/200] [Batch 51/59] [D loss: 0.658084] [G loss: 0.969494]\n",
"[Epoch 97/200] [Batch 52/59] [D loss: 0.598631] [G loss: 0.892831]\n",
"[Epoch 97/200] [Batch 53/59] [D loss: 0.665471] [G loss: 0.709362]\n",
"[Epoch 97/200] [Batch 54/59] [D loss: 0.558924] [G loss: 0.906967]\n",
"[Epoch 97/200] [Batch 55/59] [D loss: 0.605291] [G loss: 0.878818]\n",
"[Epoch 97/200] [Batch 56/59] [D loss: 0.585881] [G loss: 0.795925]\n",
"[Epoch 97/200] [Batch 57/59] [D loss: 0.610924] [G loss: 1.064873]\n",
"[Epoch 97/200] [Batch 58/59] [D loss: 0.554291] [G loss: 0.902006]\n",
"[Epoch 98/200] [Batch 0/59] [D loss: 0.631386] [G loss: 0.709640]\n",
"[Epoch 98/200] [Batch 1/59] [D loss: 0.666403] [G loss: 1.186560]\n",
"[Epoch 98/200] [Batch 2/59] [D loss: 0.583550] [G loss: 0.793936]\n",
"[Epoch 98/200] [Batch 3/59] [D loss: 0.584564] [G loss: 0.791275]\n",
"[Epoch 98/200] [Batch 4/59] [D loss: 0.581568] [G loss: 1.123329]\n",
"[Epoch 98/200] [Batch 5/59] [D loss: 0.567389] [G loss: 0.795726]\n",
"[Epoch 98/200] [Batch 6/59] [D loss: 0.668550] [G loss: 0.800618]\n",
"[Epoch 98/200] [Batch 7/59] [D loss: 0.609222] [G loss: 0.842119]\n",
"[Epoch 98/200] [Batch 8/59] [D loss: 0.600109] [G loss: 0.945181]\n",
"[Epoch 98/200] [Batch 9/59] [D loss: 0.625430] [G loss: 0.670168]\n",
"[Epoch 98/200] [Batch 10/59] [D loss: 0.620320] [G loss: 1.110651]\n",
"[Epoch 98/200] [Batch 11/59] [D loss: 0.590509] [G loss: 0.679649]\n",
"[Epoch 98/200] [Batch 12/59] [D loss: 0.572667] [G loss: 0.970622]\n",
"[Epoch 98/200] [Batch 13/59] [D loss: 0.546580] [G loss: 0.829072]\n",
"[Epoch 98/200] [Batch 14/59] [D loss: 0.673035] [G loss: 0.630160]\n",
"[Epoch 98/200] [Batch 15/59] [D loss: 0.625209] [G loss: 1.153512]\n",
"[Epoch 98/200] [Batch 16/59] [D loss: 0.600023] [G loss: 1.062845]\n",
"[Epoch 98/200] [Batch 17/59] [D loss: 0.610850] [G loss: 0.688064]\n",
"[Epoch 98/200] [Batch 18/59] [D loss: 0.643114] [G loss: 0.964847]\n",
"[Epoch 98/200] [Batch 19/59] [D loss: 0.589137] [G loss: 1.035977]\n",
"[Epoch 98/200] [Batch 20/59] [D loss: 0.573456] [G loss: 0.825991]\n",
"[Epoch 98/200] [Batch 21/59] [D loss: 0.543501] [G loss: 0.771735]\n",
"[Epoch 98/200] [Batch 22/59] [D loss: 0.639719] [G loss: 1.021733]\n",
"[Epoch 98/200] [Batch 23/59] [D loss: 0.645724] [G loss: 0.803633]\n",
"[Epoch 98/200] [Batch 24/59] [D loss: 0.573492] [G loss: 0.918490]\n",
"[Epoch 98/200] [Batch 25/59] [D loss: 0.575649] [G loss: 0.747175]\n",
"[Epoch 98/200] [Batch 26/59] [D loss: 0.621874] [G loss: 0.820382]\n",
"[Epoch 98/200] [Batch 27/59] [D loss: 0.601078] [G loss: 1.012973]\n",
"[Epoch 98/200] [Batch 28/59] [D loss: 0.615532] [G loss: 0.895822]\n",
"[Epoch 98/200] [Batch 29/59] [D loss: 0.530755] [G loss: 0.819386]\n",
"[Epoch 98/200] [Batch 30/59] [D loss: 0.558586] [G loss: 0.934310]\n",
"[Epoch 98/200] [Batch 31/59] [D loss: 0.578329] [G loss: 0.783380]\n",
"[Epoch 98/200] [Batch 32/59] [D loss: 0.618361] [G loss: 0.829875]\n",
"[Epoch 98/200] [Batch 33/59] [D loss: 0.602941] [G loss: 0.915806]\n",
"[Epoch 98/200] [Batch 34/59] [D loss: 0.581409] [G loss: 0.790232]\n",
"[Epoch 98/200] [Batch 35/59] [D loss: 0.613872] [G loss: 0.835606]\n",
"[Epoch 98/200] [Batch 36/59] [D loss: 0.600936] [G loss: 0.798189]\n",
"[Epoch 98/200] [Batch 37/59] [D loss: 0.596597] [G loss: 0.739853]\n",
"[Epoch 98/200] [Batch 38/59] [D loss: 0.611008] [G loss: 0.759342]\n",
"[Epoch 98/200] [Batch 39/59] [D loss: 0.591299] [G loss: 1.078937]\n",
"[Epoch 98/200] [Batch 40/59] [D loss: 0.632006] [G loss: 0.797450]\n",
"[Epoch 98/200] [Batch 41/59] [D loss: 0.628984] [G loss: 0.875635]\n",
"[Epoch 98/200] [Batch 42/59] [D loss: 0.614705] [G loss: 0.868443]\n",
"[Epoch 98/200] [Batch 43/59] [D loss: 0.632114] [G loss: 0.783967]\n",
"[Epoch 98/200] [Batch 44/59] [D loss: 0.652928] [G loss: 0.905488]\n",
"[Epoch 98/200] [Batch 45/59] [D loss: 0.590286] [G loss: 0.658452]\n",
"[Epoch 98/200] [Batch 46/59] [D loss: 0.594086] [G loss: 0.789381]\n",
"[Epoch 98/200] [Batch 47/59] [D loss: 0.633096] [G loss: 1.025822]\n",
"[Epoch 98/200] [Batch 48/59] [D loss: 0.608321] [G loss: 0.949481]\n",
"[Epoch 98/200] [Batch 49/59] [D loss: 0.539392] [G loss: 0.852598]\n",
"[Epoch 98/200] [Batch 50/59] [D loss: 0.559844] [G loss: 0.933412]\n",
"[Epoch 98/200] [Batch 51/59] [D loss: 0.579697] [G loss: 0.867101]\n",
"[Epoch 98/200] [Batch 52/59] [D loss: 0.600343] [G loss: 1.102559]\n",
"[Epoch 98/200] [Batch 53/59] [D loss: 0.655034] [G loss: 0.724206]\n",
"[Epoch 98/200] [Batch 54/59] [D loss: 0.584644] [G loss: 0.620003]\n",
"[Epoch 98/200] [Batch 55/59] [D loss: 0.576490] [G loss: 0.939500]\n",
"[Epoch 98/200] [Batch 56/59] [D loss: 0.562004] [G loss: 0.763039]\n",
"[Epoch 98/200] [Batch 57/59] [D loss: 0.622875] [G loss: 0.911096]\n",
"[Epoch 98/200] [Batch 58/59] [D loss: 0.644564] [G loss: 0.879132]\n",
"[Epoch 99/200] [Batch 0/59] [D loss: 0.572978] [G loss: 0.875211]\n",
"[Epoch 99/200] [Batch 1/59] [D loss: 0.529884] [G loss: 0.893349]\n",
"[Epoch 99/200] [Batch 2/59] [D loss: 0.595298] [G loss: 0.914592]\n",
"[Epoch 99/200] [Batch 3/59] [D loss: 0.610401] [G loss: 0.768397]\n",
"[Epoch 99/200] [Batch 4/59] [D loss: 0.607810] [G loss: 0.955216]\n",
"[Epoch 99/200] [Batch 5/59] [D loss: 0.626932] [G loss: 0.963210]\n",
"[Epoch 99/200] [Batch 6/59] [D loss: 0.581107] [G loss: 0.942814]\n",
"[Epoch 99/200] [Batch 7/59] [D loss: 0.594694] [G loss: 0.768622]\n",
"[Epoch 99/200] [Batch 8/59] [D loss: 0.653829] [G loss: 0.936649]\n",
"[Epoch 99/200] [Batch 9/59] [D loss: 0.587467] [G loss: 0.728949]\n",
"[Epoch 99/200] [Batch 10/59] [D loss: 0.621934] [G loss: 0.689512]\n",
"[Epoch 99/200] [Batch 11/59] [D loss: 0.617007] [G loss: 1.214522]\n",
"[Epoch 99/200] [Batch 12/59] [D loss: 0.590341] [G loss: 0.886397]\n",
"[Epoch 99/200] [Batch 13/59] [D loss: 0.630384] [G loss: 0.825955]\n",
"[Epoch 99/200] [Batch 14/59] [D loss: 0.582564] [G loss: 1.007762]\n",
"[Epoch 99/200] [Batch 15/59] [D loss: 0.559764] [G loss: 0.846942]\n",
"[Epoch 99/200] [Batch 16/59] [D loss: 0.583794] [G loss: 0.925005]\n",
"[Epoch 99/200] [Batch 17/59] [D loss: 0.624157] [G loss: 0.705192]\n",
"[Epoch 99/200] [Batch 18/59] [D loss: 0.587348] [G loss: 0.836475]\n",
"[Epoch 99/200] [Batch 19/59] [D loss: 0.555463] [G loss: 1.002678]\n",
"[Epoch 99/200] [Batch 20/59] [D loss: 0.618189] [G loss: 0.647684]\n",
"[Epoch 99/200] [Batch 21/59] [D loss: 0.612086] [G loss: 0.958143]\n",
"[Epoch 99/200] [Batch 22/59] [D loss: 0.604464] [G loss: 0.683279]\n",
"[Epoch 99/200] [Batch 23/59] [D loss: 0.607222] [G loss: 0.845433]\n",
"[Epoch 99/200] [Batch 24/59] [D loss: 0.631932] [G loss: 1.118336]\n",
"[Epoch 99/200] [Batch 25/59] [D loss: 0.617200] [G loss: 0.756532]\n",
"[Epoch 99/200] [Batch 26/59] [D loss: 0.628274] [G loss: 0.728515]\n",
"[Epoch 99/200] [Batch 27/59] [D loss: 0.617239] [G loss: 0.917889]\n",
"[Epoch 99/200] [Batch 28/59] [D loss: 0.603869] [G loss: 1.101169]\n",
"[Epoch 99/200] [Batch 29/59] [D loss: 0.679746] [G loss: 0.530485]\n",
"[Epoch 99/200] [Batch 30/59] [D loss: 0.584504] [G loss: 0.913034]\n",
"[Epoch 99/200] [Batch 31/59] [D loss: 0.601861] [G loss: 1.131335]\n",
"[Epoch 99/200] [Batch 32/59] [D loss: 0.594666] [G loss: 0.666649]\n",
"[Epoch 99/200] [Batch 33/59] [D loss: 0.614643] [G loss: 0.791564]\n",
"[Epoch 99/200] [Batch 34/59] [D loss: 0.605616] [G loss: 0.969351]\n",
"[Epoch 99/200] [Batch 35/59] [D loss: 0.614412] [G loss: 0.792625]\n",
"[Epoch 99/200] [Batch 36/59] [D loss: 0.645668] [G loss: 0.904992]\n",
"[Epoch 99/200] [Batch 37/59] [D loss: 0.569555] [G loss: 0.854958]\n",
"[Epoch 99/200] [Batch 38/59] [D loss: 0.570626] [G loss: 0.838522]\n",
"[Epoch 99/200] [Batch 39/59] [D loss: 0.573137] [G loss: 0.735013]\n",
"[Epoch 99/200] [Batch 40/59] [D loss: 0.647068] [G loss: 1.032401]\n",
"[Epoch 99/200] [Batch 41/59] [D loss: 0.646888] [G loss: 0.677704]\n",
"[Epoch 99/200] [Batch 42/59] [D loss: 0.623448] [G loss: 1.291236]\n",
"[Epoch 99/200] [Batch 43/59] [D loss: 0.590618] [G loss: 0.728386]\n",
"[Epoch 99/200] [Batch 44/59] [D loss: 0.594915] [G loss: 0.906317]\n",
"[Epoch 99/200] [Batch 45/59] [D loss: 0.568316] [G loss: 0.913201]\n",
"[Epoch 99/200] [Batch 46/59] [D loss: 0.570293] [G loss: 0.786459]\n",
"[Epoch 99/200] [Batch 47/59] [D loss: 0.600385] [G loss: 0.798346]\n",
"[Epoch 99/200] [Batch 48/59] [D loss: 0.622390] [G loss: 1.032992]\n",
"[Epoch 99/200] [Batch 49/59] [D loss: 0.594033] [G loss: 0.756441]\n",
"[Epoch 99/200] [Batch 50/59] [D loss: 0.589618] [G loss: 0.948168]\n",
"[Epoch 99/200] [Batch 51/59] [D loss: 0.577880] [G loss: 0.838451]\n",
"[Epoch 99/200] [Batch 52/59] [D loss: 0.566398] [G loss: 1.014754]\n",
"[Epoch 99/200] [Batch 53/59] [D loss: 0.613760] [G loss: 0.800734]\n",
"[Epoch 99/200] [Batch 54/59] [D loss: 0.593613] [G loss: 0.794070]\n",
"[Epoch 99/200] [Batch 55/59] [D loss: 0.654368] [G loss: 1.069100]\n",
"[Epoch 99/200] [Batch 56/59] [D loss: 0.601707] [G loss: 0.916183]\n",
"[Epoch 99/200] [Batch 57/59] [D loss: 0.623863] [G loss: 0.890795]\n",
"[Epoch 99/200] [Batch 58/59] [D loss: 0.568074] [G loss: 1.172947]\n",
"[Epoch 100/200] [Batch 0/59] [D loss: 0.571730] [G loss: 0.817637]\n",
"[Epoch 100/200] [Batch 1/59] [D loss: 0.655345] [G loss: 1.071723]\n",
"[Epoch 100/200] [Batch 2/59] [D loss: 0.618901] [G loss: 0.675325]\n",
"[Epoch 100/200] [Batch 3/59] [D loss: 0.601417] [G loss: 0.849835]\n",
"[Epoch 100/200] [Batch 4/59] [D loss: 0.536590] [G loss: 0.869665]\n",
"[Epoch 100/200] [Batch 5/59] [D loss: 0.667388] [G loss: 1.016670]\n",
"[Epoch 100/200] [Batch 6/59] [D loss: 0.584019] [G loss: 0.891545]\n",
"[Epoch 100/200] [Batch 7/59] [D loss: 0.552615] [G loss: 0.837463]\n",
"[Epoch 100/200] [Batch 8/59] [D loss: 0.554514] [G loss: 0.827708]\n",
"[Epoch 100/200] [Batch 9/59] [D loss: 0.571462] [G loss: 0.985092]\n",
"[Epoch 100/200] [Batch 10/59] [D loss: 0.598823] [G loss: 0.693563]\n",
"[Epoch 100/200] [Batch 11/59] [D loss: 0.569159] [G loss: 0.920480]\n",
"[Epoch 100/200] [Batch 12/59] [D loss: 0.620811] [G loss: 0.953724]\n",
"[Epoch 100/200] [Batch 13/59] [D loss: 0.564542] [G loss: 0.733589]\n",
"[Epoch 100/200] [Batch 14/59] [D loss: 0.556701] [G loss: 1.081770]\n",
"[Epoch 100/200] [Batch 15/59] [D loss: 0.580248] [G loss: 0.830479]\n",
"[Epoch 100/200] [Batch 16/59] [D loss: 0.605583] [G loss: 0.949047]\n",
"[Epoch 100/200] [Batch 17/59] [D loss: 0.557800] [G loss: 0.853043]\n",
"[Epoch 100/200] [Batch 18/59] [D loss: 0.599430] [G loss: 0.988969]\n",
"[Epoch 100/200] [Batch 19/59] [D loss: 0.613681] [G loss: 0.701904]\n",
"[Epoch 100/200] [Batch 20/59] [D loss: 0.546631] [G loss: 1.029619]\n",
"[Epoch 100/200] [Batch 21/59] [D loss: 0.598322] [G loss: 0.995731]\n",
"[Epoch 100/200] [Batch 22/59] [D loss: 0.576431] [G loss: 0.655258]\n",
"[Epoch 100/200] [Batch 23/59] [D loss: 0.618502] [G loss: 0.969220]\n",
"[Epoch 100/200] [Batch 24/59] [D loss: 0.628770] [G loss: 1.002376]\n",
"[Epoch 100/200] [Batch 25/59] [D loss: 0.608607] [G loss: 0.763556]\n",
"[Epoch 100/200] [Batch 26/59] [D loss: 0.638389] [G loss: 1.066882]\n",
"[Epoch 100/200] [Batch 27/59] [D loss: 0.619900] [G loss: 0.772975]\n",
"[Epoch 100/200] [Batch 28/59] [D loss: 0.605527] [G loss: 0.659997]\n",
"[Epoch 100/200] [Batch 29/59] [D loss: 0.580672] [G loss: 1.058961]\n",
"[Epoch 100/200] [Batch 30/59] [D loss: 0.582661] [G loss: 0.907897]\n",
"[Epoch 100/200] [Batch 31/59] [D loss: 0.645698] [G loss: 0.815305]\n",
"[Epoch 100/200] [Batch 32/59] [D loss: 0.600572] [G loss: 0.945577]\n",
"[Epoch 100/200] [Batch 33/59] [D loss: 0.652455] [G loss: 0.954452]\n",
"[Epoch 100/200] [Batch 34/59] [D loss: 0.576051] [G loss: 0.759553]\n",
"[Epoch 100/200] [Batch 35/59] [D loss: 0.582050] [G loss: 0.787454]\n",
"[Epoch 100/200] [Batch 36/59] [D loss: 0.632302] [G loss: 1.063353]\n",
"[Epoch 100/200] [Batch 37/59] [D loss: 0.595447] [G loss: 0.830044]\n",
"[Epoch 100/200] [Batch 38/59] [D loss: 0.671206] [G loss: 0.638837]\n",
"[Epoch 100/200] [Batch 39/59] [D loss: 0.643461] [G loss: 1.221359]\n",
"[Epoch 100/200] [Batch 40/59] [D loss: 0.597737] [G loss: 0.749319]\n",
"[Epoch 100/200] [Batch 41/59] [D loss: 0.661955] [G loss: 0.703074]\n",
"[Epoch 100/200] [Batch 42/59] [D loss: 0.635870] [G loss: 1.515780]\n",
"[Epoch 100/200] [Batch 43/59] [D loss: 0.575515] [G loss: 0.867263]\n",
"[Epoch 100/200] [Batch 44/59] [D loss: 0.622105] [G loss: 0.789528]\n",
"[Epoch 100/200] [Batch 45/59] [D loss: 0.626886] [G loss: 1.162294]\n",
"[Epoch 100/200] [Batch 46/59] [D loss: 0.570793] [G loss: 0.698315]\n",
"[Epoch 100/200] [Batch 47/59] [D loss: 0.602059] [G loss: 0.822327]\n",
"[Epoch 100/200] [Batch 48/59] [D loss: 0.614161] [G loss: 1.129461]\n",
"[Epoch 100/200] [Batch 49/59] [D loss: 0.617838] [G loss: 0.761907]\n",
"[Epoch 100/200] [Batch 50/59] [D loss: 0.564405] [G loss: 0.801560]\n",
"[Epoch 100/200] [Batch 51/59] [D loss: 0.583964] [G loss: 0.753683]\n",
"[Epoch 100/200] [Batch 52/59] [D loss: 0.581854] [G loss: 0.925437]\n",
"[Epoch 100/200] [Batch 53/59] [D loss: 0.631064] [G loss: 0.832104]\n",
"[Epoch 100/200] [Batch 54/59] [D loss: 0.573876] [G loss: 0.879587]\n",
"[Epoch 100/200] [Batch 55/59] [D loss: 0.634258] [G loss: 0.778576]\n",
"[Epoch 100/200] [Batch 56/59] [D loss: 0.677383] [G loss: 0.943309]\n",
"[Epoch 100/200] [Batch 57/59] [D loss: 0.586442] [G loss: 0.903118]\n",
"[Epoch 100/200] [Batch 58/59] [D loss: 0.593451] [G loss: 0.789057]\n",
"[Epoch 101/200] [Batch 0/59] [D loss: 0.565051] [G loss: 0.862617]\n",
"[Epoch 101/200] [Batch 1/59] [D loss: 0.612080] [G loss: 1.005996]\n",
"[Epoch 101/200] [Batch 2/59] [D loss: 0.597635] [G loss: 0.742861]\n",
"[Epoch 101/200] [Batch 3/59] [D loss: 0.649019] [G loss: 1.045992]\n",
"[Epoch 101/200] [Batch 4/59] [D loss: 0.546660] [G loss: 0.898361]\n",
"[Epoch 101/200] [Batch 5/59] [D loss: 0.614464] [G loss: 0.927826]\n",
"[Epoch 101/200] [Batch 6/59] [D loss: 0.539685] [G loss: 0.753551]\n",
"[Epoch 101/200] [Batch 7/59] [D loss: 0.630945] [G loss: 0.833091]\n",
"[Epoch 101/200] [Batch 8/59] [D loss: 0.562386] [G loss: 1.139515]\n",
"[Epoch 101/200] [Batch 9/59] [D loss: 0.554951] [G loss: 0.842856]\n",
"[Epoch 101/200] [Batch 10/59] [D loss: 0.564482] [G loss: 0.812289]\n",
"[Epoch 101/200] [Batch 11/59] [D loss: 0.656779] [G loss: 0.998032]\n",
"[Epoch 101/200] [Batch 12/59] [D loss: 0.529036] [G loss: 0.756496]\n",
"[Epoch 101/200] [Batch 13/59] [D loss: 0.586596] [G loss: 0.752691]\n",
"[Epoch 101/200] [Batch 14/59] [D loss: 0.626956] [G loss: 1.238239]\n",
"[Epoch 101/200] [Batch 15/59] [D loss: 0.606524] [G loss: 0.652001]\n",
"[Epoch 101/200] [Batch 16/59] [D loss: 0.581320] [G loss: 0.824918]\n",
"[Epoch 101/200] [Batch 17/59] [D loss: 0.579165] [G loss: 1.023950]\n",
"[Epoch 101/200] [Batch 18/59] [D loss: 0.635049] [G loss: 0.723239]\n",
"[Epoch 101/200] [Batch 19/59] [D loss: 0.577219] [G loss: 0.984548]\n",
"[Epoch 101/200] [Batch 20/59] [D loss: 0.562052] [G loss: 0.972313]\n",
"[Epoch 101/200] [Batch 21/59] [D loss: 0.557492] [G loss: 0.872663]\n",
"[Epoch 101/200] [Batch 22/59] [D loss: 0.654847] [G loss: 0.819404]\n",
"[Epoch 101/200] [Batch 23/59] [D loss: 0.603765] [G loss: 0.934649]\n",
"[Epoch 101/200] [Batch 24/59] [D loss: 0.602854] [G loss: 0.704356]\n",
"[Epoch 101/200] [Batch 25/59] [D loss: 0.638678] [G loss: 1.014142]\n",
"[Epoch 101/200] [Batch 26/59] [D loss: 0.513369] [G loss: 0.821295]\n",
"[Epoch 101/200] [Batch 27/59] [D loss: 0.583013] [G loss: 0.698551]\n",
"[Epoch 101/200] [Batch 28/59] [D loss: 0.561803] [G loss: 1.159563]\n",
"[Epoch 101/200] [Batch 29/59] [D loss: 0.603221] [G loss: 0.891950]\n",
"[Epoch 101/200] [Batch 30/59] [D loss: 0.576940] [G loss: 0.627032]\n",
"[Epoch 101/200] [Batch 31/59] [D loss: 0.596927] [G loss: 1.071118]\n",
"[Epoch 101/200] [Batch 32/59] [D loss: 0.575012] [G loss: 0.863628]\n",
"[Epoch 101/200] [Batch 33/59] [D loss: 0.635981] [G loss: 0.818380]\n",
"[Epoch 101/200] [Batch 34/59] [D loss: 0.607528] [G loss: 1.072722]\n",
"[Epoch 101/200] [Batch 35/59] [D loss: 0.598082] [G loss: 0.868518]\n",
"[Epoch 101/200] [Batch 36/59] [D loss: 0.612466] [G loss: 0.928473]\n",
"[Epoch 101/200] [Batch 37/59] [D loss: 0.661942] [G loss: 0.811152]\n",
"[Epoch 101/200] [Batch 38/59] [D loss: 0.675610] [G loss: 1.181614]\n",
"[Epoch 101/200] [Batch 39/59] [D loss: 0.553437] [G loss: 0.671525]\n",
"[Epoch 101/200] [Batch 40/59] [D loss: 0.618792] [G loss: 0.650919]\n",
"[Epoch 101/200] [Batch 41/59] [D loss: 0.608925] [G loss: 1.180484]\n",
"[Epoch 101/200] [Batch 42/59] [D loss: 0.585141] [G loss: 0.929126]\n",
"[Epoch 101/200] [Batch 43/59] [D loss: 0.633987] [G loss: 0.516503]\n",
"[Epoch 101/200] [Batch 44/59] [D loss: 0.601985] [G loss: 1.158270]\n",
"[Epoch 101/200] [Batch 45/59] [D loss: 0.614941] [G loss: 1.045775]\n",
"[Epoch 101/200] [Batch 46/59] [D loss: 0.631451] [G loss: 0.642301]\n",
"[Epoch 101/200] [Batch 47/59] [D loss: 0.631674] [G loss: 0.928721]\n",
"[Epoch 101/200] [Batch 48/59] [D loss: 0.658996] [G loss: 0.975305]\n",
"[Epoch 101/200] [Batch 49/59] [D loss: 0.592361] [G loss: 1.029419]\n",
"[Epoch 101/200] [Batch 50/59] [D loss: 0.574451] [G loss: 0.733801]\n",
"[Epoch 101/200] [Batch 51/59] [D loss: 0.582961] [G loss: 0.985760]\n",
"[Epoch 101/200] [Batch 52/59] [D loss: 0.557384] [G loss: 0.975321]\n",
"[Epoch 101/200] [Batch 53/59] [D loss: 0.595147] [G loss: 0.887985]\n",
"[Epoch 101/200] [Batch 54/59] [D loss: 0.597752] [G loss: 0.808725]\n",
"[Epoch 101/200] [Batch 55/59] [D loss: 0.578005] [G loss: 0.889469]\n",
"[Epoch 101/200] [Batch 56/59] [D loss: 0.588506] [G loss: 0.958789]\n",
"[Epoch 101/200] [Batch 57/59] [D loss: 0.585667] [G loss: 0.593839]\n",
"[Epoch 101/200] [Batch 58/59] [D loss: 0.586186] [G loss: 1.254934]\n",
"[Epoch 102/200] [Batch 0/59] [D loss: 0.600567] [G loss: 0.789549]\n",
"[Epoch 102/200] [Batch 1/59] [D loss: 0.573061] [G loss: 0.864768]\n",
"[Epoch 102/200] [Batch 2/59] [D loss: 0.583809] [G loss: 0.920165]\n",
"[Epoch 102/200] [Batch 3/59] [D loss: 0.567174] [G loss: 0.843891]\n",
"[Epoch 102/200] [Batch 4/59] [D loss: 0.581930] [G loss: 0.813299]\n",
"[Epoch 102/200] [Batch 5/59] [D loss: 0.621396] [G loss: 0.899332]\n",
"[Epoch 102/200] [Batch 6/59] [D loss: 0.612394] [G loss: 1.179376]\n",
"[Epoch 102/200] [Batch 7/59] [D loss: 0.577473] [G loss: 0.845031]\n",
"[Epoch 102/200] [Batch 8/59] [D loss: 0.610806] [G loss: 0.817243]\n",
"[Epoch 102/200] [Batch 9/59] [D loss: 0.638914] [G loss: 1.311474]\n",
"[Epoch 102/200] [Batch 10/59] [D loss: 0.547002] [G loss: 0.811928]\n",
"[Epoch 102/200] [Batch 11/59] [D loss: 0.726957] [G loss: 0.575134]\n",
"[Epoch 102/200] [Batch 12/59] [D loss: 0.663793] [G loss: 1.431370]\n",
"[Epoch 102/200] [Batch 13/59] [D loss: 0.601336] [G loss: 0.977762]\n",
"[Epoch 102/200] [Batch 14/59] [D loss: 0.641237] [G loss: 0.647447]\n",
"[Epoch 102/200] [Batch 15/59] [D loss: 0.643719] [G loss: 1.055564]\n",
"[Epoch 102/200] [Batch 16/59] [D loss: 0.613994] [G loss: 1.046771]\n",
"[Epoch 102/200] [Batch 17/59] [D loss: 0.554365] [G loss: 0.764752]\n",
"[Epoch 102/200] [Batch 18/59] [D loss: 0.588827] [G loss: 0.940143]\n",
"[Epoch 102/200] [Batch 19/59] [D loss: 0.656110] [G loss: 0.837984]\n",
"[Epoch 102/200] [Batch 20/59] [D loss: 0.646490] [G loss: 0.850897]\n",
"[Epoch 102/200] [Batch 21/59] [D loss: 0.601821] [G loss: 0.892640]\n",
"[Epoch 102/200] [Batch 22/59] [D loss: 0.526841] [G loss: 0.755689]\n",
"[Epoch 102/200] [Batch 23/59] [D loss: 0.577200] [G loss: 0.795482]\n",
"[Epoch 102/200] [Batch 24/59] [D loss: 0.639344] [G loss: 0.895124]\n",
"[Epoch 102/200] [Batch 25/59] [D loss: 0.578437] [G loss: 0.721655]\n",
"[Epoch 102/200] [Batch 26/59] [D loss: 0.568562] [G loss: 0.841757]\n",
"[Epoch 102/200] [Batch 27/59] [D loss: 0.555213] [G loss: 1.179834]\n",
"[Epoch 102/200] [Batch 28/59] [D loss: 0.535356] [G loss: 0.796685]\n",
"[Epoch 102/200] [Batch 29/59] [D loss: 0.570850] [G loss: 0.746310]\n",
"[Epoch 102/200] [Batch 30/59] [D loss: 0.597154] [G loss: 0.914507]\n",
"[Epoch 102/200] [Batch 31/59] [D loss: 0.571655] [G loss: 0.771664]\n",
"[Epoch 102/200] [Batch 32/59] [D loss: 0.584834] [G loss: 0.772537]\n",
"[Epoch 102/200] [Batch 33/59] [D loss: 0.598893] [G loss: 1.001495]\n",
"[Epoch 102/200] [Batch 34/59] [D loss: 0.613793] [G loss: 0.860919]\n",
"[Epoch 102/200] [Batch 35/59] [D loss: 0.561224] [G loss: 0.986721]\n",
"[Epoch 102/200] [Batch 36/59] [D loss: 0.602213] [G loss: 0.882755]\n",
"[Epoch 102/200] [Batch 37/59] [D loss: 0.616448] [G loss: 0.805738]\n",
"[Epoch 102/200] [Batch 38/59] [D loss: 0.574266] [G loss: 1.127170]\n",
"[Epoch 102/200] [Batch 39/59] [D loss: 0.586585] [G loss: 0.752207]\n",
"[Epoch 102/200] [Batch 40/59] [D loss: 0.587860] [G loss: 0.949302]\n",
"[Epoch 102/200] [Batch 41/59] [D loss: 0.588513] [G loss: 0.995224]\n",
"[Epoch 102/200] [Batch 42/59] [D loss: 0.564210] [G loss: 0.767603]\n",
"[Epoch 102/200] [Batch 43/59] [D loss: 0.592170] [G loss: 0.828600]\n",
"[Epoch 102/200] [Batch 44/59] [D loss: 0.596502] [G loss: 1.171102]\n",
"[Epoch 102/200] [Batch 45/59] [D loss: 0.660295] [G loss: 0.745848]\n",
"[Epoch 102/200] [Batch 46/59] [D loss: 0.557728] [G loss: 0.766948]\n",
"[Epoch 102/200] [Batch 47/59] [D loss: 0.589186] [G loss: 1.063131]\n",
"[Epoch 102/200] [Batch 48/59] [D loss: 0.541884] [G loss: 0.971906]\n",
"[Epoch 102/200] [Batch 49/59] [D loss: 0.575037] [G loss: 0.843156]\n",
"[Epoch 102/200] [Batch 50/59] [D loss: 0.611469] [G loss: 0.856547]\n",
"[Epoch 102/200] [Batch 51/59] [D loss: 0.653807] [G loss: 0.996201]\n",
"[Epoch 102/200] [Batch 52/59] [D loss: 0.575679] [G loss: 0.795299]\n",
"[Epoch 102/200] [Batch 53/59] [D loss: 0.634459] [G loss: 0.735028]\n",
"[Epoch 102/200] [Batch 54/59] [D loss: 0.591144] [G loss: 1.091419]\n",
"[Epoch 102/200] [Batch 55/59] [D loss: 0.589326] [G loss: 0.759451]\n",
"[Epoch 102/200] [Batch 56/59] [D loss: 0.572406] [G loss: 0.851280]\n",
"[Epoch 102/200] [Batch 57/59] [D loss: 0.549963] [G loss: 0.952688]\n",
"[Epoch 102/200] [Batch 58/59] [D loss: 0.548795] [G loss: 0.854759]\n",
"[Epoch 103/200] [Batch 0/59] [D loss: 0.591215] [G loss: 1.044264]\n",
"[Epoch 103/200] [Batch 1/59] [D loss: 0.577133] [G loss: 0.721743]\n",
"[Epoch 103/200] [Batch 2/59] [D loss: 0.593984] [G loss: 0.947589]\n",
"[Epoch 103/200] [Batch 3/59] [D loss: 0.663334] [G loss: 0.881646]\n",
"[Epoch 103/200] [Batch 4/59] [D loss: 0.583841] [G loss: 0.880491]\n",
"[Epoch 103/200] [Batch 5/59] [D loss: 0.591437] [G loss: 0.821033]\n",
"[Epoch 103/200] [Batch 6/59] [D loss: 0.624329] [G loss: 1.081908]\n",
"[Epoch 103/200] [Batch 7/59] [D loss: 0.576930] [G loss: 0.789851]\n",
"[Epoch 103/200] [Batch 8/59] [D loss: 0.564307] [G loss: 0.842838]\n",
"[Epoch 103/200] [Batch 9/59] [D loss: 0.609454] [G loss: 0.904208]\n",
"[Epoch 103/200] [Batch 10/59] [D loss: 0.563897] [G loss: 0.970000]\n",
"[Epoch 103/200] [Batch 11/59] [D loss: 0.623441] [G loss: 0.857670]\n",
"[Epoch 103/200] [Batch 12/59] [D loss: 0.579281] [G loss: 0.731180]\n",
"[Epoch 103/200] [Batch 13/59] [D loss: 0.523300] [G loss: 1.041935]\n",
"[Epoch 103/200] [Batch 14/59] [D loss: 0.600153] [G loss: 1.015844]\n",
"[Epoch 103/200] [Batch 15/59] [D loss: 0.549492] [G loss: 0.767233]\n",
"[Epoch 103/200] [Batch 16/59] [D loss: 0.553478] [G loss: 0.798580]\n",
"[Epoch 103/200] [Batch 17/59] [D loss: 0.590093] [G loss: 0.892197]\n",
"[Epoch 103/200] [Batch 18/59] [D loss: 0.576310] [G loss: 1.031433]\n",
"[Epoch 103/200] [Batch 19/59] [D loss: 0.552158] [G loss: 0.731440]\n",
"[Epoch 103/200] [Batch 20/59] [D loss: 0.544974] [G loss: 0.805655]\n",
"[Epoch 103/200] [Batch 21/59] [D loss: 0.653237] [G loss: 1.205770]\n",
"[Epoch 103/200] [Batch 22/59] [D loss: 0.598291] [G loss: 0.774085]\n",
"[Epoch 103/200] [Batch 23/59] [D loss: 0.596126] [G loss: 0.682370]\n",
"[Epoch 103/200] [Batch 24/59] [D loss: 0.622178] [G loss: 1.030441]\n",
"[Epoch 103/200] [Batch 25/59] [D loss: 0.607889] [G loss: 0.846577]\n",
"[Epoch 103/200] [Batch 26/59] [D loss: 0.604219] [G loss: 0.809938]\n",
"[Epoch 103/200] [Batch 27/59] [D loss: 0.607960] [G loss: 0.893367]\n",
"[Epoch 103/200] [Batch 28/59] [D loss: 0.593681] [G loss: 0.769406]\n",
"[Epoch 103/200] [Batch 29/59] [D loss: 0.540207] [G loss: 0.927906]\n",
"[Epoch 103/200] [Batch 30/59] [D loss: 0.581338] [G loss: 0.724773]\n",
"[Epoch 103/200] [Batch 31/59] [D loss: 0.577549] [G loss: 0.835802]\n",
"[Epoch 103/200] [Batch 32/59] [D loss: 0.566665] [G loss: 1.243134]\n",
"[Epoch 103/200] [Batch 33/59] [D loss: 0.565736] [G loss: 0.774124]\n",
"[Epoch 103/200] [Batch 34/59] [D loss: 0.618075] [G loss: 0.816219]\n",
"[Epoch 103/200] [Batch 35/59] [D loss: 0.605940] [G loss: 1.125160]\n",
"[Epoch 103/200] [Batch 36/59] [D loss: 0.575656] [G loss: 0.842289]\n",
"[Epoch 103/200] [Batch 37/59] [D loss: 0.563282] [G loss: 0.826816]\n",
"[Epoch 103/200] [Batch 38/59] [D loss: 0.572585] [G loss: 1.153324]\n",
"[Epoch 103/200] [Batch 39/59] [D loss: 0.574835] [G loss: 0.753982]\n",
"[Epoch 103/200] [Batch 40/59] [D loss: 0.673016] [G loss: 0.588612]\n",
"[Epoch 103/200] [Batch 41/59] [D loss: 0.703250] [G loss: 1.426233]\n",
"[Epoch 103/200] [Batch 42/59] [D loss: 0.585670] [G loss: 0.753607]\n",
"[Epoch 103/200] [Batch 43/59] [D loss: 0.597931] [G loss: 0.552216]\n",
"[Epoch 103/200] [Batch 44/59] [D loss: 0.623321] [G loss: 1.488582]\n",
"[Epoch 103/200] [Batch 45/59] [D loss: 0.571579] [G loss: 0.843735]\n",
"[Epoch 103/200] [Batch 46/59] [D loss: 0.652823] [G loss: 0.555081]\n",
"[Epoch 103/200] [Batch 47/59] [D loss: 0.586349] [G loss: 1.101484]\n",
"[Epoch 103/200] [Batch 48/59] [D loss: 0.603251] [G loss: 1.339581]\n",
"[Epoch 103/200] [Batch 49/59] [D loss: 0.526319] [G loss: 0.654350]\n",
"[Epoch 103/200] [Batch 50/59] [D loss: 0.560409] [G loss: 0.844522]\n",
"[Epoch 103/200] [Batch 51/59] [D loss: 0.564978] [G loss: 1.079976]\n",
"[Epoch 103/200] [Batch 52/59] [D loss: 0.595840] [G loss: 0.953231]\n",
"[Epoch 103/200] [Batch 53/59] [D loss: 0.608980] [G loss: 0.899758]\n",
"[Epoch 103/200] [Batch 54/59] [D loss: 0.579056] [G loss: 0.828517]\n",
"[Epoch 103/200] [Batch 55/59] [D loss: 0.617230] [G loss: 0.862494]\n",
"[Epoch 103/200] [Batch 56/59] [D loss: 0.606666] [G loss: 0.908808]\n",
"[Epoch 103/200] [Batch 57/59] [D loss: 0.609959] [G loss: 0.876288]\n",
"[Epoch 103/200] [Batch 58/59] [D loss: 0.605593] [G loss: 0.763290]\n",
"[Epoch 104/200] [Batch 0/59] [D loss: 0.577062] [G loss: 0.957534]\n",
"[Epoch 104/200] [Batch 1/59] [D loss: 0.618374] [G loss: 0.752102]\n",
"[Epoch 104/200] [Batch 2/59] [D loss: 0.603656] [G loss: 0.922868]\n",
"[Epoch 104/200] [Batch 3/59] [D loss: 0.630000] [G loss: 0.922820]\n",
"[Epoch 104/200] [Batch 4/59] [D loss: 0.663615] [G loss: 0.731988]\n",
"[Epoch 104/200] [Batch 5/59] [D loss: 0.650238] [G loss: 0.917229]\n",
"[Epoch 104/200] [Batch 6/59] [D loss: 0.587642] [G loss: 0.998052]\n",
"[Epoch 104/200] [Batch 7/59] [D loss: 0.547409] [G loss: 0.795876]\n",
"[Epoch 104/200] [Batch 8/59] [D loss: 0.573528] [G loss: 0.788616]\n",
"[Epoch 104/200] [Batch 9/59] [D loss: 0.581328] [G loss: 1.040390]\n",
"[Epoch 104/200] [Batch 10/59] [D loss: 0.617292] [G loss: 0.785114]\n",
"[Epoch 104/200] [Batch 11/59] [D loss: 0.584221] [G loss: 0.920422]\n",
"[Epoch 104/200] [Batch 12/59] [D loss: 0.582211] [G loss: 0.876578]\n",
"[Epoch 104/200] [Batch 13/59] [D loss: 0.532827] [G loss: 0.782137]\n",
"[Epoch 104/200] [Batch 14/59] [D loss: 0.560756] [G loss: 1.070815]\n",
"[Epoch 104/200] [Batch 15/59] [D loss: 0.565862] [G loss: 1.197429]\n",
"[Epoch 104/200] [Batch 16/59] [D loss: 0.650074] [G loss: 0.638281]\n",
"[Epoch 104/200] [Batch 17/59] [D loss: 0.576959] [G loss: 0.986942]\n",
"[Epoch 104/200] [Batch 18/59] [D loss: 0.578251] [G loss: 0.957353]\n",
"[Epoch 104/200] [Batch 19/59] [D loss: 0.536221] [G loss: 0.759257]\n",
"[Epoch 104/200] [Batch 20/59] [D loss: 0.573751] [G loss: 0.731735]\n",
"[Epoch 104/200] [Batch 21/59] [D loss: 0.548366] [G loss: 0.956665]\n",
"[Epoch 104/200] [Batch 22/59] [D loss: 0.638010] [G loss: 0.653412]\n",
"[Epoch 104/200] [Batch 23/59] [D loss: 0.631177] [G loss: 0.957601]\n",
"[Epoch 104/200] [Batch 24/59] [D loss: 0.594981] [G loss: 0.958798]\n",
"[Epoch 104/200] [Batch 25/59] [D loss: 0.618085] [G loss: 0.791601]\n",
"[Epoch 104/200] [Batch 26/59] [D loss: 0.598271] [G loss: 0.865647]\n",
"[Epoch 104/200] [Batch 27/59] [D loss: 0.622950] [G loss: 1.016725]\n",
"[Epoch 104/200] [Batch 28/59] [D loss: 0.641760] [G loss: 0.826552]\n",
"[Epoch 104/200] [Batch 29/59] [D loss: 0.594470] [G loss: 0.750607]\n",
"[Epoch 104/200] [Batch 30/59] [D loss: 0.640638] [G loss: 0.969098]\n",
"[Epoch 104/200] [Batch 31/59] [D loss: 0.583461] [G loss: 1.106016]\n",
"[Epoch 104/200] [Batch 32/59] [D loss: 0.583279] [G loss: 0.907496]\n",
"[Epoch 104/200] [Batch 33/59] [D loss: 0.613003] [G loss: 0.830992]\n",
"[Epoch 104/200] [Batch 34/59] [D loss: 0.591992] [G loss: 1.025283]\n",
"[Epoch 104/200] [Batch 35/59] [D loss: 0.599414] [G loss: 0.833605]\n",
"[Epoch 104/200] [Batch 36/59] [D loss: 0.580333] [G loss: 0.856668]\n",
"[Epoch 104/200] [Batch 37/59] [D loss: 0.567054] [G loss: 0.923946]\n",
"[Epoch 104/200] [Batch 38/59] [D loss: 0.604383] [G loss: 0.865752]\n",
"[Epoch 104/200] [Batch 39/59] [D loss: 0.581728] [G loss: 0.897131]\n",
"[Epoch 104/200] [Batch 40/59] [D loss: 0.591933] [G loss: 0.879069]\n",
"[Epoch 104/200] [Batch 41/59] [D loss: 0.559658] [G loss: 0.850541]\n",
"[Epoch 104/200] [Batch 42/59] [D loss: 0.585479] [G loss: 0.964152]\n",
"[Epoch 104/200] [Batch 43/59] [D loss: 0.598150] [G loss: 1.100071]\n",
"[Epoch 104/200] [Batch 44/59] [D loss: 0.660903] [G loss: 0.943351]\n",
"[Epoch 104/200] [Batch 45/59] [D loss: 0.610204] [G loss: 0.799505]\n",
"[Epoch 104/200] [Batch 46/59] [D loss: 0.626243] [G loss: 1.006049]\n",
"[Epoch 104/200] [Batch 47/59] [D loss: 0.635683] [G loss: 0.786719]\n",
"[Epoch 104/200] [Batch 48/59] [D loss: 0.593903] [G loss: 0.689125]\n",
"[Epoch 104/200] [Batch 49/59] [D loss: 0.567810] [G loss: 1.042224]\n",
"[Epoch 104/200] [Batch 50/59] [D loss: 0.591114] [G loss: 1.043142]\n",
"[Epoch 104/200] [Batch 51/59] [D loss: 0.569332] [G loss: 0.835461]\n",
"[Epoch 104/200] [Batch 52/59] [D loss: 0.567773] [G loss: 0.977561]\n",
"[Epoch 104/200] [Batch 53/59] [D loss: 0.598088] [G loss: 0.869473]\n",
"[Epoch 104/200] [Batch 54/59] [D loss: 0.552914] [G loss: 0.907213]\n",
"[Epoch 104/200] [Batch 55/59] [D loss: 0.576920] [G loss: 0.927096]\n",
"[Epoch 104/200] [Batch 56/59] [D loss: 0.599158] [G loss: 0.891645]\n",
"[Epoch 104/200] [Batch 57/59] [D loss: 0.584398] [G loss: 0.832547]\n",
"[Epoch 104/200] [Batch 58/59] [D loss: 0.598481] [G loss: 0.848197]\n",
"[Epoch 105/200] [Batch 0/59] [D loss: 0.585849] [G loss: 0.762202]\n",
"[Epoch 105/200] [Batch 1/59] [D loss: 0.595136] [G loss: 0.784294]\n",
"[Epoch 105/200] [Batch 2/59] [D loss: 0.587539] [G loss: 1.182234]\n",
"[Epoch 105/200] [Batch 3/59] [D loss: 0.644887] [G loss: 0.734238]\n",
"[Epoch 105/200] [Batch 4/59] [D loss: 0.563386] [G loss: 0.663246]\n",
"[Epoch 105/200] [Batch 5/59] [D loss: 0.655739] [G loss: 1.297860]\n",
"[Epoch 105/200] [Batch 6/59] [D loss: 0.532716] [G loss: 0.790825]\n",
"[Epoch 105/200] [Batch 7/59] [D loss: 0.601843] [G loss: 0.654635]\n",
"[Epoch 105/200] [Batch 8/59] [D loss: 0.624504] [G loss: 1.223623]\n",
"[Epoch 105/200] [Batch 9/59] [D loss: 0.578612] [G loss: 0.794850]\n",
"[Epoch 105/200] [Batch 10/59] [D loss: 0.643572] [G loss: 0.714958]\n",
"[Epoch 105/200] [Batch 11/59] [D loss: 0.586378] [G loss: 1.155227]\n",
"[Epoch 105/200] [Batch 12/59] [D loss: 0.525446] [G loss: 0.808140]\n",
"[Epoch 105/200] [Batch 13/59] [D loss: 0.588187] [G loss: 0.717504]\n",
"[Epoch 105/200] [Batch 14/59] [D loss: 0.599160] [G loss: 0.895368]\n",
"[Epoch 105/200] [Batch 15/59] [D loss: 0.543925] [G loss: 0.925148]\n",
"[Epoch 105/200] [Batch 16/59] [D loss: 0.636538] [G loss: 0.792774]\n",
"[Epoch 105/200] [Batch 17/59] [D loss: 0.548633] [G loss: 1.070751]\n",
"[Epoch 105/200] [Batch 18/59] [D loss: 0.610605] [G loss: 0.973854]\n",
"[Epoch 105/200] [Batch 19/59] [D loss: 0.596127] [G loss: 0.824780]\n",
"[Epoch 105/200] [Batch 20/59] [D loss: 0.624230] [G loss: 0.986001]\n",
"[Epoch 105/200] [Batch 21/59] [D loss: 0.600809] [G loss: 0.872308]\n",
"[Epoch 105/200] [Batch 22/59] [D loss: 0.657037] [G loss: 0.723293]\n",
"[Epoch 105/200] [Batch 23/59] [D loss: 0.652851] [G loss: 1.149356]\n",
"[Epoch 105/200] [Batch 24/59] [D loss: 0.546837] [G loss: 0.985621]\n",
"[Epoch 105/200] [Batch 25/59] [D loss: 0.565843] [G loss: 0.625425]\n",
"[Epoch 105/200] [Batch 26/59] [D loss: 0.574413] [G loss: 0.975318]\n",
"[Epoch 105/200] [Batch 27/59] [D loss: 0.611498] [G loss: 0.865238]\n",
"[Epoch 105/200] [Batch 28/59] [D loss: 0.623798] [G loss: 0.814642]\n",
"[Epoch 105/200] [Batch 29/59] [D loss: 0.544020] [G loss: 0.959781]\n",
"[Epoch 105/200] [Batch 30/59] [D loss: 0.586744] [G loss: 0.864427]\n",
"[Epoch 105/200] [Batch 31/59] [D loss: 0.607518] [G loss: 0.678186]\n",
"[Epoch 105/200] [Batch 32/59] [D loss: 0.608634] [G loss: 0.712030]\n",
"[Epoch 105/200] [Batch 33/59] [D loss: 0.562940] [G loss: 1.145849]\n",
"[Epoch 105/200] [Batch 34/59] [D loss: 0.589768] [G loss: 0.829453]\n",
"[Epoch 105/200] [Batch 35/59] [D loss: 0.605495] [G loss: 0.819908]\n",
"[Epoch 105/200] [Batch 36/59] [D loss: 0.602186] [G loss: 0.919539]\n",
"[Epoch 105/200] [Batch 37/59] [D loss: 0.550492] [G loss: 1.001865]\n",
"[Epoch 105/200] [Batch 38/59] [D loss: 0.633268] [G loss: 0.944462]\n",
"[Epoch 105/200] [Batch 39/59] [D loss: 0.573220] [G loss: 0.695242]\n",
"[Epoch 105/200] [Batch 40/59] [D loss: 0.524179] [G loss: 0.938433]\n",
"[Epoch 105/200] [Batch 41/59] [D loss: 0.572142] [G loss: 0.941051]\n",
"[Epoch 105/200] [Batch 42/59] [D loss: 0.553121] [G loss: 0.957332]\n",
"[Epoch 105/200] [Batch 43/59] [D loss: 0.583023] [G loss: 0.780164]\n",
"[Epoch 105/200] [Batch 44/59] [D loss: 0.587991] [G loss: 0.828515]\n",
"[Epoch 105/200] [Batch 45/59] [D loss: 0.601225] [G loss: 0.966987]\n",
"[Epoch 105/200] [Batch 46/59] [D loss: 0.622743] [G loss: 0.897241]\n",
"[Epoch 105/200] [Batch 47/59] [D loss: 0.626183] [G loss: 1.108947]\n",
"[Epoch 105/200] [Batch 48/59] [D loss: 0.631034] [G loss: 0.857429]\n",
"[Epoch 105/200] [Batch 49/59] [D loss: 0.585799] [G loss: 1.068586]\n",
"[Epoch 105/200] [Batch 50/59] [D loss: 0.584109] [G loss: 0.901606]\n",
"[Epoch 105/200] [Batch 51/59] [D loss: 0.591880] [G loss: 0.853806]\n",
"[Epoch 105/200] [Batch 52/59] [D loss: 0.570358] [G loss: 0.897038]\n",
"[Epoch 105/200] [Batch 53/59] [D loss: 0.611640] [G loss: 0.905771]\n",
"[Epoch 105/200] [Batch 54/59] [D loss: 0.547054] [G loss: 1.024922]\n",
"[Epoch 105/200] [Batch 55/59] [D loss: 0.564454] [G loss: 0.803259]\n",
"[Epoch 105/200] [Batch 56/59] [D loss: 0.603966] [G loss: 0.808389]\n",
"[Epoch 105/200] [Batch 57/59] [D loss: 0.590290] [G loss: 0.903053]\n",
"[Epoch 105/200] [Batch 58/59] [D loss: 0.573637] [G loss: 0.944098]\n",
"[Epoch 106/200] [Batch 0/59] [D loss: 0.563641] [G loss: 0.795758]\n",
"[Epoch 106/200] [Batch 1/59] [D loss: 0.562747] [G loss: 0.982352]\n",
"[Epoch 106/200] [Batch 2/59] [D loss: 0.637574] [G loss: 0.793423]\n",
"[Epoch 106/200] [Batch 3/59] [D loss: 0.600536] [G loss: 0.820263]\n",
"[Epoch 106/200] [Batch 4/59] [D loss: 0.565332] [G loss: 1.050946]\n",
"[Epoch 106/200] [Batch 5/59] [D loss: 0.576596] [G loss: 0.960141]\n",
"[Epoch 106/200] [Batch 6/59] [D loss: 0.581246] [G loss: 0.791546]\n",
"[Epoch 106/200] [Batch 7/59] [D loss: 0.553364] [G loss: 0.906584]\n",
"[Epoch 106/200] [Batch 8/59] [D loss: 0.564355] [G loss: 1.095727]\n",
"[Epoch 106/200] [Batch 9/59] [D loss: 0.572184] [G loss: 0.877030]\n",
"[Epoch 106/200] [Batch 10/59] [D loss: 0.558067] [G loss: 0.968472]\n",
"[Epoch 106/200] [Batch 11/59] [D loss: 0.584108] [G loss: 0.983024]\n",
"[Epoch 106/200] [Batch 12/59] [D loss: 0.603159] [G loss: 0.865467]\n",
"[Epoch 106/200] [Batch 13/59] [D loss: 0.578535] [G loss: 0.802539]\n",
"[Epoch 106/200] [Batch 14/59] [D loss: 0.607434] [G loss: 0.824430]\n",
"[Epoch 106/200] [Batch 15/59] [D loss: 0.625023] [G loss: 0.941152]\n",
"[Epoch 106/200] [Batch 16/59] [D loss: 0.595811] [G loss: 0.980125]\n",
"[Epoch 106/200] [Batch 17/59] [D loss: 0.599005] [G loss: 1.114329]\n",
"[Epoch 106/200] [Batch 18/59] [D loss: 0.645318] [G loss: 0.617678]\n",
"[Epoch 106/200] [Batch 19/59] [D loss: 0.552319] [G loss: 1.254085]\n",
"[Epoch 106/200] [Batch 20/59] [D loss: 0.594566] [G loss: 1.058553]\n",
"[Epoch 106/200] [Batch 21/59] [D loss: 0.710057] [G loss: 0.521887]\n",
"[Epoch 106/200] [Batch 22/59] [D loss: 0.599221] [G loss: 1.140924]\n",
"[Epoch 106/200] [Batch 23/59] [D loss: 0.611172] [G loss: 0.838557]\n",
"[Epoch 106/200] [Batch 24/59] [D loss: 0.570500] [G loss: 0.783424]\n",
"[Epoch 106/200] [Batch 25/59] [D loss: 0.580675] [G loss: 1.136358]\n",
"[Epoch 106/200] [Batch 26/59] [D loss: 0.573494] [G loss: 0.763615]\n",
"[Epoch 106/200] [Batch 27/59] [D loss: 0.606868] [G loss: 0.866397]\n",
"[Epoch 106/200] [Batch 28/59] [D loss: 0.631846] [G loss: 0.948888]\n",
"[Epoch 106/200] [Batch 29/59] [D loss: 0.556090] [G loss: 0.639914]\n",
"[Epoch 106/200] [Batch 30/59] [D loss: 0.542166] [G loss: 1.169613]\n",
"[Epoch 106/200] [Batch 31/59] [D loss: 0.560922] [G loss: 0.692704]\n",
"[Epoch 106/200] [Batch 32/59] [D loss: 0.612273] [G loss: 0.770410]\n",
"[Epoch 106/200] [Batch 33/59] [D loss: 0.595620] [G loss: 1.027321]\n",
"[Epoch 106/200] [Batch 34/59] [D loss: 0.584411] [G loss: 0.843555]\n",
"[Epoch 106/200] [Batch 35/59] [D loss: 0.576358] [G loss: 0.684418]\n",
"[Epoch 106/200] [Batch 36/59] [D loss: 0.632125] [G loss: 1.039706]\n",
"[Epoch 106/200] [Batch 37/59] [D loss: 0.596649] [G loss: 0.840330]\n",
"[Epoch 106/200] [Batch 38/59] [D loss: 0.607313] [G loss: 0.874369]\n",
"[Epoch 106/200] [Batch 39/59] [D loss: 0.589676] [G loss: 1.072729]\n",
"[Epoch 106/200] [Batch 40/59] [D loss: 0.557783] [G loss: 0.722905]\n",
"[Epoch 106/200] [Batch 41/59] [D loss: 0.540769] [G loss: 0.902673]\n",
"[Epoch 106/200] [Batch 42/59] [D loss: 0.601373] [G loss: 0.987605]\n",
"[Epoch 106/200] [Batch 43/59] [D loss: 0.582246] [G loss: 0.705193]\n",
"[Epoch 106/200] [Batch 44/59] [D loss: 0.575934] [G loss: 0.964680]\n",
"[Epoch 106/200] [Batch 45/59] [D loss: 0.559798] [G loss: 0.920512]\n",
"[Epoch 106/200] [Batch 46/59] [D loss: 0.582523] [G loss: 0.678203]\n",
"[Epoch 106/200] [Batch 47/59] [D loss: 0.635429] [G loss: 0.758050]\n",
"[Epoch 106/200] [Batch 48/59] [D loss: 0.661089] [G loss: 1.130551]\n",
"[Epoch 106/200] [Batch 49/59] [D loss: 0.584760] [G loss: 0.793360]\n",
"[Epoch 106/200] [Batch 50/59] [D loss: 0.544514] [G loss: 0.798891]\n",
"[Epoch 106/200] [Batch 51/59] [D loss: 0.590967] [G loss: 1.071137]\n",
"[Epoch 106/200] [Batch 52/59] [D loss: 0.620055] [G loss: 0.898435]\n",
"[Epoch 106/200] [Batch 53/59] [D loss: 0.663401] [G loss: 1.003290]\n",
"[Epoch 106/200] [Batch 54/59] [D loss: 0.594232] [G loss: 0.750695]\n",
"[Epoch 106/200] [Batch 55/59] [D loss: 0.609917] [G loss: 0.931538]\n",
"[Epoch 106/200] [Batch 56/59] [D loss: 0.570926] [G loss: 0.984965]\n",
"[Epoch 106/200] [Batch 57/59] [D loss: 0.579414] [G loss: 0.844497]\n",
"[Epoch 106/200] [Batch 58/59] [D loss: 0.596576] [G loss: 0.808155]\n",
"[Epoch 107/200] [Batch 0/59] [D loss: 0.577643] [G loss: 1.004395]\n",
"[Epoch 107/200] [Batch 1/59] [D loss: 0.538347] [G loss: 0.864268]\n",
"[Epoch 107/200] [Batch 2/59] [D loss: 0.625750] [G loss: 0.894668]\n",
"[Epoch 107/200] [Batch 3/59] [D loss: 0.648635] [G loss: 0.933188]\n",
"[Epoch 107/200] [Batch 4/59] [D loss: 0.644419] [G loss: 0.930792]\n",
"[Epoch 107/200] [Batch 5/59] [D loss: 0.652167] [G loss: 0.733920]\n",
"[Epoch 107/200] [Batch 6/59] [D loss: 0.615842] [G loss: 0.816052]\n",
"[Epoch 107/200] [Batch 7/59] [D loss: 0.608948] [G loss: 1.168364]\n",
"[Epoch 107/200] [Batch 8/59] [D loss: 0.604267] [G loss: 0.677371]\n",
"[Epoch 107/200] [Batch 9/59] [D loss: 0.554676] [G loss: 0.966894]\n",
"[Epoch 107/200] [Batch 10/59] [D loss: 0.593318] [G loss: 0.843909]\n",
"[Epoch 107/200] [Batch 11/59] [D loss: 0.589728] [G loss: 0.732774]\n",
"[Epoch 107/200] [Batch 12/59] [D loss: 0.560423] [G loss: 0.924941]\n",
"[Epoch 107/200] [Batch 13/59] [D loss: 0.597700] [G loss: 0.894967]\n",
"[Epoch 107/200] [Batch 14/59] [D loss: 0.617239] [G loss: 0.748935]\n",
"[Epoch 107/200] [Batch 15/59] [D loss: 0.616385] [G loss: 0.951472]\n",
"[Epoch 107/200] [Batch 16/59] [D loss: 0.590826] [G loss: 1.048191]\n",
"[Epoch 107/200] [Batch 17/59] [D loss: 0.662536] [G loss: 0.585193]\n",
"[Epoch 107/200] [Batch 18/59] [D loss: 0.588295] [G loss: 1.064067]\n",
"[Epoch 107/200] [Batch 19/59] [D loss: 0.597826] [G loss: 1.103060]\n",
"[Epoch 107/200] [Batch 20/59] [D loss: 0.659800] [G loss: 0.594432]\n",
"[Epoch 107/200] [Batch 21/59] [D loss: 0.621152] [G loss: 1.220768]\n",
"[Epoch 107/200] [Batch 22/59] [D loss: 0.537062] [G loss: 1.032207]\n",
"[Epoch 107/200] [Batch 23/59] [D loss: 0.526660] [G loss: 0.988670]\n",
"[Epoch 107/200] [Batch 24/59] [D loss: 0.663298] [G loss: 0.711208]\n",
"[Epoch 107/200] [Batch 25/59] [D loss: 0.621981] [G loss: 1.409520]\n",
"[Epoch 107/200] [Batch 26/59] [D loss: 0.536108] [G loss: 0.955853]\n",
"[Epoch 107/200] [Batch 27/59] [D loss: 0.629537] [G loss: 0.664685]\n",
"[Epoch 107/200] [Batch 28/59] [D loss: 0.563444] [G loss: 1.209939]\n",
"[Epoch 107/200] [Batch 29/59] [D loss: 0.594308] [G loss: 0.854006]\n",
"[Epoch 107/200] [Batch 30/59] [D loss: 0.596186] [G loss: 0.739796]\n",
"[Epoch 107/200] [Batch 31/59] [D loss: 0.577738] [G loss: 0.947318]\n",
"[Epoch 107/200] [Batch 32/59] [D loss: 0.562391] [G loss: 1.113695]\n",
"[Epoch 107/200] [Batch 33/59] [D loss: 0.546327] [G loss: 0.747767]\n",
"[Epoch 107/200] [Batch 34/59] [D loss: 0.583492] [G loss: 1.007200]\n",
"[Epoch 107/200] [Batch 35/59] [D loss: 0.587657] [G loss: 1.094623]\n",
"[Epoch 107/200] [Batch 36/59] [D loss: 0.632539] [G loss: 0.778254]\n",
"[Epoch 107/200] [Batch 37/59] [D loss: 0.497989] [G loss: 0.958541]\n",
"[Epoch 107/200] [Batch 38/59] [D loss: 0.657081] [G loss: 0.763005]\n",
"[Epoch 107/200] [Batch 39/59] [D loss: 0.589916] [G loss: 0.982870]\n",
"[Epoch 107/200] [Batch 40/59] [D loss: 0.616075] [G loss: 0.840327]\n",
"[Epoch 107/200] [Batch 41/59] [D loss: 0.590961] [G loss: 0.667556]\n",
"[Epoch 107/200] [Batch 42/59] [D loss: 0.632969] [G loss: 1.212280]\n",
"[Epoch 107/200] [Batch 43/59] [D loss: 0.588212] [G loss: 0.919014]\n",
"[Epoch 107/200] [Batch 44/59] [D loss: 0.545020] [G loss: 0.924172]\n",
"[Epoch 107/200] [Batch 45/59] [D loss: 0.581554] [G loss: 0.889373]\n",
"[Epoch 107/200] [Batch 46/59] [D loss: 0.564114] [G loss: 1.028155]\n",
"[Epoch 107/200] [Batch 47/59] [D loss: 0.592449] [G loss: 0.797673]\n",
"[Epoch 107/200] [Batch 48/59] [D loss: 0.643599] [G loss: 0.643453]\n",
"[Epoch 107/200] [Batch 49/59] [D loss: 0.621047] [G loss: 1.337353]\n",
"[Epoch 107/200] [Batch 50/59] [D loss: 0.592495] [G loss: 0.844593]\n",
"[Epoch 107/200] [Batch 51/59] [D loss: 0.598883] [G loss: 0.641118]\n",
"[Epoch 107/200] [Batch 52/59] [D loss: 0.592718] [G loss: 1.213196]\n",
"[Epoch 107/200] [Batch 53/59] [D loss: 0.583180] [G loss: 0.797269]\n",
"[Epoch 107/200] [Batch 54/59] [D loss: 0.599708] [G loss: 0.681947]\n",
"[Epoch 107/200] [Batch 55/59] [D loss: 0.572482] [G loss: 1.140264]\n",
"[Epoch 107/200] [Batch 56/59] [D loss: 0.560906] [G loss: 0.849480]\n",
"[Epoch 107/200] [Batch 57/59] [D loss: 0.580541] [G loss: 0.902748]\n",
"[Epoch 107/200] [Batch 58/59] [D loss: 0.582516] [G loss: 0.939956]\n",
"[Epoch 108/200] [Batch 0/59] [D loss: 0.655431] [G loss: 0.609467]\n",
"[Epoch 108/200] [Batch 1/59] [D loss: 0.585131] [G loss: 0.935273]\n",
"[Epoch 108/200] [Batch 2/59] [D loss: 0.579890] [G loss: 1.108028]\n",
"[Epoch 108/200] [Batch 3/59] [D loss: 0.549071] [G loss: 0.751016]\n",
"[Epoch 108/200] [Batch 4/59] [D loss: 0.547613] [G loss: 0.836591]\n",
"[Epoch 108/200] [Batch 5/59] [D loss: 0.575327] [G loss: 0.938381]\n",
"[Epoch 108/200] [Batch 6/59] [D loss: 0.579304] [G loss: 0.894244]\n",
"[Epoch 108/200] [Batch 7/59] [D loss: 0.612056] [G loss: 1.032715]\n",
"[Epoch 108/200] [Batch 8/59] [D loss: 0.565598] [G loss: 0.995205]\n",
"[Epoch 108/200] [Batch 9/59] [D loss: 0.573155] [G loss: 0.846754]\n",
"[Epoch 108/200] [Batch 10/59] [D loss: 0.558849] [G loss: 0.827516]\n",
"[Epoch 108/200] [Batch 11/59] [D loss: 0.542368] [G loss: 0.795399]\n",
"[Epoch 108/200] [Batch 12/59] [D loss: 0.586801] [G loss: 0.838872]\n",
"[Epoch 108/200] [Batch 13/59] [D loss: 0.576744] [G loss: 0.787991]\n",
"[Epoch 108/200] [Batch 14/59] [D loss: 0.604618] [G loss: 0.948234]\n",
"[Epoch 108/200] [Batch 15/59] [D loss: 0.561957] [G loss: 1.133537]\n",
"[Epoch 108/200] [Batch 16/59] [D loss: 0.656088] [G loss: 0.586013]\n",
"[Epoch 108/200] [Batch 17/59] [D loss: 0.549297] [G loss: 0.947162]\n",
"[Epoch 108/200] [Batch 18/59] [D loss: 0.603407] [G loss: 1.384574]\n",
"[Epoch 108/200] [Batch 19/59] [D loss: 0.645512] [G loss: 0.623468]\n",
"[Epoch 108/200] [Batch 20/59] [D loss: 0.559322] [G loss: 0.850303]\n",
"[Epoch 108/200] [Batch 21/59] [D loss: 0.587653] [G loss: 1.373091]\n",
"[Epoch 108/200] [Batch 22/59] [D loss: 0.599449] [G loss: 0.667494]\n",
"[Epoch 108/200] [Batch 23/59] [D loss: 0.585812] [G loss: 0.851312]\n",
"[Epoch 108/200] [Batch 24/59] [D loss: 0.607202] [G loss: 1.071357]\n",
"[Epoch 108/200] [Batch 25/59] [D loss: 0.526870] [G loss: 0.844183]\n",
"[Epoch 108/200] [Batch 26/59] [D loss: 0.552647] [G loss: 0.736060]\n",
"[Epoch 108/200] [Batch 27/59] [D loss: 0.562411] [G loss: 1.023250]\n",
"[Epoch 108/200] [Batch 28/59] [D loss: 0.605577] [G loss: 0.975253]\n",
"[Epoch 108/200] [Batch 29/59] [D loss: 0.597134] [G loss: 0.697077]\n",
"[Epoch 108/200] [Batch 30/59] [D loss: 0.606449] [G loss: 1.052837]\n",
"[Epoch 108/200] [Batch 31/59] [D loss: 0.546699] [G loss: 0.879488]\n",
"[Epoch 108/200] [Batch 32/59] [D loss: 0.555020] [G loss: 0.757112]\n",
"[Epoch 108/200] [Batch 33/59] [D loss: 0.619835] [G loss: 0.976353]\n",
"[Epoch 108/200] [Batch 34/59] [D loss: 0.598179] [G loss: 0.901297]\n",
"[Epoch 108/200] [Batch 35/59] [D loss: 0.636899] [G loss: 0.875031]\n",
"[Epoch 108/200] [Batch 36/59] [D loss: 0.585644] [G loss: 0.991904]\n",
"[Epoch 108/200] [Batch 37/59] [D loss: 0.589757] [G loss: 0.787773]\n",
"[Epoch 108/200] [Batch 38/59] [D loss: 0.627846] [G loss: 0.918479]\n",
"[Epoch 108/200] [Batch 39/59] [D loss: 0.601499] [G loss: 0.963406]\n",
"[Epoch 108/200] [Batch 40/59] [D loss: 0.600296] [G loss: 0.903118]\n",
"[Epoch 108/200] [Batch 41/59] [D loss: 0.559551] [G loss: 0.853622]\n",
"[Epoch 108/200] [Batch 42/59] [D loss: 0.601878] [G loss: 0.833584]\n",
"[Epoch 108/200] [Batch 43/59] [D loss: 0.586432] [G loss: 0.732501]\n",
"[Epoch 108/200] [Batch 44/59] [D loss: 0.551129] [G loss: 0.903144]\n",
"[Epoch 108/200] [Batch 45/59] [D loss: 0.536057] [G loss: 0.866777]\n",
"[Epoch 108/200] [Batch 46/59] [D loss: 0.604575] [G loss: 0.806711]\n",
"[Epoch 108/200] [Batch 47/59] [D loss: 0.553437] [G loss: 0.955554]\n",
"[Epoch 108/200] [Batch 48/59] [D loss: 0.542977] [G loss: 0.895615]\n",
"[Epoch 108/200] [Batch 49/59] [D loss: 0.545084] [G loss: 1.127479]\n",
"[Epoch 108/200] [Batch 50/59] [D loss: 0.554191] [G loss: 0.618179]\n",
"[Epoch 108/200] [Batch 51/59] [D loss: 0.574364] [G loss: 1.180479]\n",
"[Epoch 108/200] [Batch 52/59] [D loss: 0.622555] [G loss: 0.719068]\n",
"[Epoch 108/200] [Batch 53/59] [D loss: 0.589822] [G loss: 0.993570]\n",
"[Epoch 108/200] [Batch 54/59] [D loss: 0.597208] [G loss: 0.919715]\n",
"[Epoch 108/200] [Batch 55/59] [D loss: 0.658771] [G loss: 0.897627]\n",
"[Epoch 108/200] [Batch 56/59] [D loss: 0.573872] [G loss: 0.948614]\n",
"[Epoch 108/200] [Batch 57/59] [D loss: 0.619868] [G loss: 0.884422]\n",
"[Epoch 108/200] [Batch 58/59] [D loss: 0.669120] [G loss: 1.103234]\n",
"[Epoch 109/200] [Batch 0/59] [D loss: 0.641941] [G loss: 0.586587]\n",
"[Epoch 109/200] [Batch 1/59] [D loss: 0.655432] [G loss: 1.104422]\n",
"[Epoch 109/200] [Batch 2/59] [D loss: 0.625975] [G loss: 0.734488]\n",
"[Epoch 109/200] [Batch 3/59] [D loss: 0.611000] [G loss: 0.983947]\n",
"[Epoch 109/200] [Batch 4/59] [D loss: 0.592538] [G loss: 0.833536]\n",
"[Epoch 109/200] [Batch 5/59] [D loss: 0.536162] [G loss: 0.763407]\n",
"[Epoch 109/200] [Batch 6/59] [D loss: 0.599699] [G loss: 0.759442]\n",
"[Epoch 109/200] [Batch 7/59] [D loss: 0.642131] [G loss: 1.099205]\n",
"[Epoch 109/200] [Batch 8/59] [D loss: 0.611306] [G loss: 0.682437]\n",
"[Epoch 109/200] [Batch 9/59] [D loss: 0.637936] [G loss: 0.763869]\n",
"[Epoch 109/200] [Batch 10/59] [D loss: 0.614726] [G loss: 1.137132]\n",
"[Epoch 109/200] [Batch 11/59] [D loss: 0.595633] [G loss: 0.763147]\n",
"[Epoch 109/200] [Batch 12/59] [D loss: 0.627109] [G loss: 0.656356]\n",
"[Epoch 109/200] [Batch 13/59] [D loss: 0.597732] [G loss: 0.937858]\n",
"[Epoch 109/200] [Batch 14/59] [D loss: 0.529403] [G loss: 0.985090]\n",
"[Epoch 109/200] [Batch 15/59] [D loss: 0.581824] [G loss: 1.149075]\n",
"[Epoch 109/200] [Batch 16/59] [D loss: 0.663315] [G loss: 0.579294]\n",
"[Epoch 109/200] [Batch 17/59] [D loss: 0.653161] [G loss: 1.235036]\n",
"[Epoch 109/200] [Batch 18/59] [D loss: 0.580567] [G loss: 0.835747]\n",
"[Epoch 109/200] [Batch 19/59] [D loss: 0.635800] [G loss: 0.753771]\n",
"[Epoch 109/200] [Batch 20/59] [D loss: 0.581254] [G loss: 1.050850]\n",
"[Epoch 109/200] [Batch 21/59] [D loss: 0.656798] [G loss: 1.112870]\n",
"[Epoch 109/200] [Batch 22/59] [D loss: 0.584926] [G loss: 0.789555]\n",
"[Epoch 109/200] [Batch 23/59] [D loss: 0.593439] [G loss: 0.778201]\n",
"[Epoch 109/200] [Batch 24/59] [D loss: 0.534177] [G loss: 1.042687]\n",
"[Epoch 109/200] [Batch 25/59] [D loss: 0.616572] [G loss: 1.085114]\n",
"[Epoch 109/200] [Batch 26/59] [D loss: 0.613030] [G loss: 0.742955]\n",
"[Epoch 109/200] [Batch 27/59] [D loss: 0.614525] [G loss: 1.110834]\n",
"[Epoch 109/200] [Batch 28/59] [D loss: 0.630114] [G loss: 1.010057]\n",
"[Epoch 109/200] [Batch 29/59] [D loss: 0.544046] [G loss: 0.804511]\n",
"[Epoch 109/200] [Batch 30/59] [D loss: 0.579195] [G loss: 1.043883]\n",
"[Epoch 109/200] [Batch 31/59] [D loss: 0.601929] [G loss: 0.954180]\n",
"[Epoch 109/200] [Batch 32/59] [D loss: 0.574762] [G loss: 0.716602]\n",
"[Epoch 109/200] [Batch 33/59] [D loss: 0.624163] [G loss: 0.817016]\n",
"[Epoch 109/200] [Batch 34/59] [D loss: 0.655534] [G loss: 1.449993]\n",
"[Epoch 109/200] [Batch 35/59] [D loss: 0.539482] [G loss: 0.759854]\n",
"[Epoch 109/200] [Batch 36/59] [D loss: 0.657313] [G loss: 0.511813]\n",
"[Epoch 109/200] [Batch 37/59] [D loss: 0.633701] [G loss: 1.427275]\n",
"[Epoch 109/200] [Batch 38/59] [D loss: 0.601040] [G loss: 0.832444]\n",
"[Epoch 109/200] [Batch 39/59] [D loss: 0.603190] [G loss: 0.607693]\n",
"[Epoch 109/200] [Batch 40/59] [D loss: 0.595768] [G loss: 1.150752]\n",
"[Epoch 109/200] [Batch 41/59] [D loss: 0.579654] [G loss: 0.835428]\n",
"[Epoch 109/200] [Batch 42/59] [D loss: 0.576783] [G loss: 0.659694]\n",
"[Epoch 109/200] [Batch 43/59] [D loss: 0.590840] [G loss: 1.183315]\n",
"[Epoch 109/200] [Batch 44/59] [D loss: 0.589275] [G loss: 0.915085]\n",
"[Epoch 109/200] [Batch 45/59] [D loss: 0.600150] [G loss: 0.743630]\n",
"[Epoch 109/200] [Batch 46/59] [D loss: 0.554329] [G loss: 1.030576]\n",
"[Epoch 109/200] [Batch 47/59] [D loss: 0.661035] [G loss: 0.879317]\n",
"[Epoch 109/200] [Batch 48/59] [D loss: 0.594890] [G loss: 0.762119]\n",
"[Epoch 109/200] [Batch 49/59] [D loss: 0.543585] [G loss: 0.842505]\n",
"[Epoch 109/200] [Batch 50/59] [D loss: 0.641801] [G loss: 1.172014]\n",
"[Epoch 109/200] [Batch 51/59] [D loss: 0.642392] [G loss: 0.868258]\n",
"[Epoch 109/200] [Batch 52/59] [D loss: 0.545498] [G loss: 0.853082]\n",
"[Epoch 109/200] [Batch 53/59] [D loss: 0.576695] [G loss: 1.169687]\n",
"[Epoch 109/200] [Batch 54/59] [D loss: 0.554813] [G loss: 0.906499]\n",
"[Epoch 109/200] [Batch 55/59] [D loss: 0.568779] [G loss: 0.825561]\n",
"[Epoch 109/200] [Batch 56/59] [D loss: 0.590952] [G loss: 0.924051]\n",
"[Epoch 109/200] [Batch 57/59] [D loss: 0.556965] [G loss: 1.032307]\n",
"[Epoch 109/200] [Batch 58/59] [D loss: 0.638461] [G loss: 0.912583]\n",
"[Epoch 110/200] [Batch 0/59] [D loss: 0.628297] [G loss: 0.936057]\n",
"[Epoch 110/200] [Batch 1/59] [D loss: 0.639577] [G loss: 1.133903]\n",
"[Epoch 110/200] [Batch 2/59] [D loss: 0.612019] [G loss: 0.721816]\n",
"[Epoch 110/200] [Batch 3/59] [D loss: 0.558260] [G loss: 0.837773]\n",
"[Epoch 110/200] [Batch 4/59] [D loss: 0.577052] [G loss: 1.050344]\n",
"[Epoch 110/200] [Batch 5/59] [D loss: 0.606420] [G loss: 0.982928]\n",
"[Epoch 110/200] [Batch 6/59] [D loss: 0.650745] [G loss: 0.756967]\n",
"[Epoch 110/200] [Batch 7/59] [D loss: 0.643183] [G loss: 0.864678]\n",
"[Epoch 110/200] [Batch 8/59] [D loss: 0.604051] [G loss: 0.963357]\n",
"[Epoch 110/200] [Batch 9/59] [D loss: 0.607468] [G loss: 0.731995]\n",
"[Epoch 110/200] [Batch 10/59] [D loss: 0.600525] [G loss: 0.976180]\n",
"[Epoch 110/200] [Batch 11/59] [D loss: 0.524853] [G loss: 0.920037]\n",
"[Epoch 110/200] [Batch 12/59] [D loss: 0.597689] [G loss: 0.816063]\n",
"[Epoch 110/200] [Batch 13/59] [D loss: 0.579665] [G loss: 0.822735]\n",
"[Epoch 110/200] [Batch 14/59] [D loss: 0.606743] [G loss: 1.002754]\n",
"[Epoch 110/200] [Batch 15/59] [D loss: 0.597686] [G loss: 0.937544]\n",
"[Epoch 110/200] [Batch 16/59] [D loss: 0.604136] [G loss: 0.763938]\n",
"[Epoch 110/200] [Batch 17/59] [D loss: 0.629196] [G loss: 1.113540]\n",
"[Epoch 110/200] [Batch 18/59] [D loss: 0.587480] [G loss: 0.792953]\n",
"[Epoch 110/200] [Batch 19/59] [D loss: 0.637460] [G loss: 0.831128]\n",
"[Epoch 110/200] [Batch 20/59] [D loss: 0.596316] [G loss: 1.258528]\n",
"[Epoch 110/200] [Batch 21/59] [D loss: 0.565488] [G loss: 0.711072]\n",
"[Epoch 110/200] [Batch 22/59] [D loss: 0.665207] [G loss: 0.673229]\n",
"[Epoch 110/200] [Batch 23/59] [D loss: 0.625007] [G loss: 1.147038]\n",
"[Epoch 110/200] [Batch 24/59] [D loss: 0.617048] [G loss: 1.059769]\n",
"[Epoch 110/200] [Batch 25/59] [D loss: 0.595476] [G loss: 0.663219]\n",
"[Epoch 110/200] [Batch 26/59] [D loss: 0.579614] [G loss: 1.118148]\n",
"[Epoch 110/200] [Batch 27/59] [D loss: 0.664698] [G loss: 1.133815]\n",
"[Epoch 110/200] [Batch 28/59] [D loss: 0.652631] [G loss: 0.695342]\n",
"[Epoch 110/200] [Batch 29/59] [D loss: 0.578759] [G loss: 1.003580]\n",
"[Epoch 110/200] [Batch 30/59] [D loss: 0.582805] [G loss: 0.839002]\n",
"[Epoch 110/200] [Batch 31/59] [D loss: 0.595348] [G loss: 0.723135]\n",
"[Epoch 110/200] [Batch 32/59] [D loss: 0.583251] [G loss: 0.794642]\n",
"[Epoch 110/200] [Batch 33/59] [D loss: 0.628817] [G loss: 1.060977]\n",
"[Epoch 110/200] [Batch 34/59] [D loss: 0.598104] [G loss: 0.771787]\n",
"[Epoch 110/200] [Batch 35/59] [D loss: 0.563714] [G loss: 0.960542]\n",
"[Epoch 110/200] [Batch 36/59] [D loss: 0.557849] [G loss: 0.926251]\n",
"[Epoch 110/200] [Batch 37/59] [D loss: 0.588940] [G loss: 0.722443]\n",
"[Epoch 110/200] [Batch 38/59] [D loss: 0.569363] [G loss: 0.910623]\n",
"[Epoch 110/200] [Batch 39/59] [D loss: 0.603206] [G loss: 0.838895]\n",
"[Epoch 110/200] [Batch 40/59] [D loss: 0.632705] [G loss: 0.859884]\n",
"[Epoch 110/200] [Batch 41/59] [D loss: 0.567200] [G loss: 0.718149]\n",
"[Epoch 110/200] [Batch 42/59] [D loss: 0.535069] [G loss: 0.887218]\n",
"[Epoch 110/200] [Batch 43/59] [D loss: 0.606520] [G loss: 1.211852]\n",
"[Epoch 110/200] [Batch 44/59] [D loss: 0.578179] [G loss: 0.801475]\n",
"[Epoch 110/200] [Batch 45/59] [D loss: 0.697425] [G loss: 0.773901]\n",
"[Epoch 110/200] [Batch 46/59] [D loss: 0.715913] [G loss: 1.713839]\n",
"[Epoch 110/200] [Batch 47/59] [D loss: 0.519003] [G loss: 0.869651]\n",
"[Epoch 110/200] [Batch 48/59] [D loss: 0.700965] [G loss: 0.477834]\n",
"[Epoch 110/200] [Batch 49/59] [D loss: 0.640592] [G loss: 1.596147]\n",
"[Epoch 110/200] [Batch 50/59] [D loss: 0.580950] [G loss: 1.054581]\n",
"[Epoch 110/200] [Batch 51/59] [D loss: 0.698935] [G loss: 0.626618]\n",
"[Epoch 110/200] [Batch 52/59] [D loss: 0.636185] [G loss: 1.136931]\n",
"[Epoch 110/200] [Batch 53/59] [D loss: 0.597362] [G loss: 1.092145]\n",
"[Epoch 110/200] [Batch 54/59] [D loss: 0.621602] [G loss: 0.751459]\n",
"[Epoch 110/200] [Batch 55/59] [D loss: 0.597529] [G loss: 0.709630]\n",
"[Epoch 110/200] [Batch 56/59] [D loss: 0.616057] [G loss: 1.081657]\n",
"[Epoch 110/200] [Batch 57/59] [D loss: 0.537922] [G loss: 0.806388]\n",
"[Epoch 110/200] [Batch 58/59] [D loss: 0.583101] [G loss: 0.541278]\n",
"[Epoch 111/200] [Batch 0/59] [D loss: 0.573144] [G loss: 1.025949]\n",
"[Epoch 111/200] [Batch 1/59] [D loss: 0.588983] [G loss: 0.832163]\n",
"[Epoch 111/200] [Batch 2/59] [D loss: 0.582712] [G loss: 0.701047]\n",
"[Epoch 111/200] [Batch 3/59] [D loss: 0.577728] [G loss: 1.045794]\n",
"[Epoch 111/200] [Batch 4/59] [D loss: 0.638493] [G loss: 0.985152]\n",
"[Epoch 111/200] [Batch 5/59] [D loss: 0.604707] [G loss: 0.732643]\n",
"[Epoch 111/200] [Batch 6/59] [D loss: 0.533465] [G loss: 1.049966]\n",
"[Epoch 111/200] [Batch 7/59] [D loss: 0.574297] [G loss: 0.965004]\n",
"[Epoch 111/200] [Batch 8/59] [D loss: 0.559867] [G loss: 0.817794]\n",
"[Epoch 111/200] [Batch 9/59] [D loss: 0.566843] [G loss: 0.777794]\n",
"[Epoch 111/200] [Batch 10/59] [D loss: 0.592775] [G loss: 0.934949]\n",
"[Epoch 111/200] [Batch 11/59] [D loss: 0.577677] [G loss: 0.727659]\n",
"[Epoch 111/200] [Batch 12/59] [D loss: 0.620735] [G loss: 0.904448]\n",
"[Epoch 111/200] [Batch 13/59] [D loss: 0.676808] [G loss: 0.788085]\n",
"[Epoch 111/200] [Batch 14/59] [D loss: 0.584312] [G loss: 0.902361]\n",
"[Epoch 111/200] [Batch 15/59] [D loss: 0.607523] [G loss: 0.760278]\n",
"[Epoch 111/200] [Batch 16/59] [D loss: 0.553427] [G loss: 0.763211]\n",
"[Epoch 111/200] [Batch 17/59] [D loss: 0.560870] [G loss: 1.045430]\n",
"[Epoch 111/200] [Batch 18/59] [D loss: 0.635218] [G loss: 1.014837]\n",
"[Epoch 111/200] [Batch 19/59] [D loss: 0.561634] [G loss: 0.864037]\n",
"[Epoch 111/200] [Batch 20/59] [D loss: 0.575513] [G loss: 0.805947]\n",
"[Epoch 111/200] [Batch 21/59] [D loss: 0.539402] [G loss: 1.016604]\n",
"[Epoch 111/200] [Batch 22/59] [D loss: 0.517573] [G loss: 1.072561]\n",
"[Epoch 111/200] [Batch 23/59] [D loss: 0.551667] [G loss: 0.873509]\n",
"[Epoch 111/200] [Batch 24/59] [D loss: 0.596024] [G loss: 0.890843]\n",
"[Epoch 111/200] [Batch 25/59] [D loss: 0.588292] [G loss: 0.861219]\n",
"[Epoch 111/200] [Batch 26/59] [D loss: 0.513553] [G loss: 0.861936]\n",
"[Epoch 111/200] [Batch 27/59] [D loss: 0.605973] [G loss: 0.903565]\n",
"[Epoch 111/200] [Batch 28/59] [D loss: 0.604843] [G loss: 0.904679]\n",
"[Epoch 111/200] [Batch 29/59] [D loss: 0.621595] [G loss: 0.955138]\n",
"[Epoch 111/200] [Batch 30/59] [D loss: 0.585994] [G loss: 0.727730]\n",
"[Epoch 111/200] [Batch 31/59] [D loss: 0.584311] [G loss: 0.967369]\n",
"[Epoch 111/200] [Batch 32/59] [D loss: 0.581584] [G loss: 1.058432]\n",
"[Epoch 111/200] [Batch 33/59] [D loss: 0.602273] [G loss: 0.562122]\n",
"[Epoch 111/200] [Batch 34/59] [D loss: 0.543375] [G loss: 0.752696]\n",
"[Epoch 111/200] [Batch 35/59] [D loss: 0.582406] [G loss: 1.209053]\n",
"[Epoch 111/200] [Batch 36/59] [D loss: 0.559424] [G loss: 0.967088]\n",
"[Epoch 111/200] [Batch 37/59] [D loss: 0.589800] [G loss: 0.747951]\n",
"[Epoch 111/200] [Batch 38/59] [D loss: 0.572573] [G loss: 0.936891]\n",
"[Epoch 111/200] [Batch 39/59] [D loss: 0.634926] [G loss: 1.047662]\n",
"[Epoch 111/200] [Batch 40/59] [D loss: 0.589907] [G loss: 0.505988]\n",
"[Epoch 111/200] [Batch 41/59] [D loss: 0.623374] [G loss: 0.929361]\n",
"[Epoch 111/200] [Batch 42/59] [D loss: 0.564786] [G loss: 1.187289]\n",
"[Epoch 111/200] [Batch 43/59] [D loss: 0.555878] [G loss: 0.752021]\n",
"[Epoch 111/200] [Batch 44/59] [D loss: 0.629161] [G loss: 0.663206]\n",
"[Epoch 111/200] [Batch 45/59] [D loss: 0.588794] [G loss: 1.092931]\n",
"[Epoch 111/200] [Batch 46/59] [D loss: 0.595987] [G loss: 0.851496]\n",
"[Epoch 111/200] [Batch 47/59] [D loss: 0.585903] [G loss: 0.812951]\n",
"[Epoch 111/200] [Batch 48/59] [D loss: 0.612439] [G loss: 0.908032]\n",
"[Epoch 111/200] [Batch 49/59] [D loss: 0.607952] [G loss: 0.922175]\n",
"[Epoch 111/200] [Batch 50/59] [D loss: 0.586008] [G loss: 0.835222]\n",
"[Epoch 111/200] [Batch 51/59] [D loss: 0.677776] [G loss: 0.871369]\n",
"[Epoch 111/200] [Batch 52/59] [D loss: 0.577128] [G loss: 0.937203]\n",
"[Epoch 111/200] [Batch 53/59] [D loss: 0.526136] [G loss: 1.105658]\n",
"[Epoch 111/200] [Batch 54/59] [D loss: 0.586111] [G loss: 0.901477]\n",
"[Epoch 111/200] [Batch 55/59] [D loss: 0.554894] [G loss: 0.833299]\n",
"[Epoch 111/200] [Batch 56/59] [D loss: 0.611735] [G loss: 1.054993]\n",
"[Epoch 111/200] [Batch 57/59] [D loss: 0.606283] [G loss: 0.825119]\n",
"[Epoch 111/200] [Batch 58/59] [D loss: 0.605200] [G loss: 0.969813]\n",
"[Epoch 112/200] [Batch 0/59] [D loss: 0.629747] [G loss: 0.712248]\n",
"[Epoch 112/200] [Batch 1/59] [D loss: 0.601760] [G loss: 0.851984]\n",
"[Epoch 112/200] [Batch 2/59] [D loss: 0.551901] [G loss: 0.950386]\n",
"[Epoch 112/200] [Batch 3/59] [D loss: 0.585325] [G loss: 0.734579]\n",
"[Epoch 112/200] [Batch 4/59] [D loss: 0.571594] [G loss: 1.105258]\n",
"[Epoch 112/200] [Batch 5/59] [D loss: 0.616596] [G loss: 0.938535]\n",
"[Epoch 112/200] [Batch 6/59] [D loss: 0.622619] [G loss: 0.742505]\n",
"[Epoch 112/200] [Batch 7/59] [D loss: 0.554072] [G loss: 1.036951]\n",
"[Epoch 112/200] [Batch 8/59] [D loss: 0.635544] [G loss: 0.806141]\n",
"[Epoch 112/200] [Batch 9/59] [D loss: 0.561870] [G loss: 0.839301]\n",
"[Epoch 112/200] [Batch 10/59] [D loss: 0.621290] [G loss: 0.893190]\n",
"[Epoch 112/200] [Batch 11/59] [D loss: 0.481155] [G loss: 0.790144]\n",
"[Epoch 112/200] [Batch 12/59] [D loss: 0.569638] [G loss: 0.787181]\n",
"[Epoch 112/200] [Batch 13/59] [D loss: 0.526789] [G loss: 0.938170]\n",
"[Epoch 112/200] [Batch 14/59] [D loss: 0.538187] [G loss: 0.812077]\n",
"[Epoch 112/200] [Batch 15/59] [D loss: 0.550888] [G loss: 1.051985]\n",
"[Epoch 112/200] [Batch 16/59] [D loss: 0.584529] [G loss: 0.941123]\n",
"[Epoch 112/200] [Batch 17/59] [D loss: 0.592038] [G loss: 0.850110]\n",
"[Epoch 112/200] [Batch 18/59] [D loss: 0.572095] [G loss: 0.733865]\n",
"[Epoch 112/200] [Batch 19/59] [D loss: 0.575651] [G loss: 0.921270]\n",
"[Epoch 112/200] [Batch 20/59] [D loss: 0.588577] [G loss: 0.971450]\n",
"[Epoch 112/200] [Batch 21/59] [D loss: 0.605201] [G loss: 0.852297]\n",
"[Epoch 112/200] [Batch 22/59] [D loss: 0.567067] [G loss: 0.764300]\n",
"[Epoch 112/200] [Batch 23/59] [D loss: 0.563510] [G loss: 0.658646]\n",
"[Epoch 112/200] [Batch 24/59] [D loss: 0.589088] [G loss: 1.190795]\n",
"[Epoch 112/200] [Batch 25/59] [D loss: 0.585730] [G loss: 0.719887]\n",
"[Epoch 112/200] [Batch 26/59] [D loss: 0.557399] [G loss: 0.926707]\n",
"[Epoch 112/200] [Batch 27/59] [D loss: 0.559770] [G loss: 1.006930]\n",
"[Epoch 112/200] [Batch 28/59] [D loss: 0.609619] [G loss: 0.948342]\n",
"[Epoch 112/200] [Batch 29/59] [D loss: 0.561598] [G loss: 0.817418]\n",
"[Epoch 112/200] [Batch 30/59] [D loss: 0.557234] [G loss: 1.076885]\n",
"[Epoch 112/200] [Batch 31/59] [D loss: 0.588439] [G loss: 1.047489]\n",
"[Epoch 112/200] [Batch 32/59] [D loss: 0.539879] [G loss: 0.767082]\n",
"[Epoch 112/200] [Batch 33/59] [D loss: 0.558495] [G loss: 0.677084]\n",
"[Epoch 112/200] [Batch 34/59] [D loss: 0.679680] [G loss: 1.022662]\n",
"[Epoch 112/200] [Batch 35/59] [D loss: 0.562871] [G loss: 0.798870]\n",
"[Epoch 112/200] [Batch 36/59] [D loss: 0.594218] [G loss: 0.923085]\n",
"[Epoch 112/200] [Batch 37/59] [D loss: 0.555733] [G loss: 1.018443]\n",
"[Epoch 112/200] [Batch 38/59] [D loss: 0.542356] [G loss: 0.969659]\n",
"[Epoch 112/200] [Batch 39/59] [D loss: 0.586164] [G loss: 0.992345]\n",
"[Epoch 112/200] [Batch 40/59] [D loss: 0.619966] [G loss: 0.720290]\n",
"[Epoch 112/200] [Batch 41/59] [D loss: 0.604585] [G loss: 1.056727]\n",
"[Epoch 112/200] [Batch 42/59] [D loss: 0.602279] [G loss: 0.825754]\n",
"[Epoch 112/200] [Batch 43/59] [D loss: 0.614653] [G loss: 0.712889]\n",
"[Epoch 112/200] [Batch 44/59] [D loss: 0.579938] [G loss: 1.164237]\n",
"[Epoch 112/200] [Batch 45/59] [D loss: 0.550081] [G loss: 0.972104]\n",
"[Epoch 112/200] [Batch 46/59] [D loss: 0.697543] [G loss: 0.666644]\n",
"[Epoch 112/200] [Batch 47/59] [D loss: 0.617275] [G loss: 1.101717]\n",
"[Epoch 112/200] [Batch 48/59] [D loss: 0.599354] [G loss: 1.216753]\n",
"[Epoch 112/200] [Batch 49/59] [D loss: 0.611584] [G loss: 0.608106]\n",
"[Epoch 112/200] [Batch 50/59] [D loss: 0.535998] [G loss: 0.861431]\n",
"[Epoch 112/200] [Batch 51/59] [D loss: 0.537108] [G loss: 1.093592]\n",
"[Epoch 112/200] [Batch 52/59] [D loss: 0.586763] [G loss: 0.726737]\n",
"[Epoch 112/200] [Batch 53/59] [D loss: 0.531843] [G loss: 0.915007]\n",
"[Epoch 112/200] [Batch 54/59] [D loss: 0.527741] [G loss: 1.036904]\n",
"[Epoch 112/200] [Batch 55/59] [D loss: 0.562528] [G loss: 0.767792]\n",
"[Epoch 112/200] [Batch 56/59] [D loss: 0.575303] [G loss: 0.782560]\n",
"[Epoch 112/200] [Batch 57/59] [D loss: 0.575388] [G loss: 1.267113]\n",
"[Epoch 112/200] [Batch 58/59] [D loss: 0.655301] [G loss: 0.647243]\n",
"[Epoch 113/200] [Batch 0/59] [D loss: 0.558916] [G loss: 0.917850]\n",
"[Epoch 113/200] [Batch 1/59] [D loss: 0.572850] [G loss: 1.209232]\n",
"[Epoch 113/200] [Batch 2/59] [D loss: 0.604840] [G loss: 0.595901]\n",
"[Epoch 113/200] [Batch 3/59] [D loss: 0.557255] [G loss: 0.931539]\n",
"[Epoch 113/200] [Batch 4/59] [D loss: 0.523521] [G loss: 0.854144]\n",
"[Epoch 113/200] [Batch 5/59] [D loss: 0.579127] [G loss: 0.886809]\n",
"[Epoch 113/200] [Batch 6/59] [D loss: 0.596709] [G loss: 0.988930]\n",
"[Epoch 113/200] [Batch 7/59] [D loss: 0.602406] [G loss: 0.729905]\n",
"[Epoch 113/200] [Batch 8/59] [D loss: 0.529136] [G loss: 0.744901]\n",
"[Epoch 113/200] [Batch 9/59] [D loss: 0.600818] [G loss: 0.978340]\n",
"[Epoch 113/200] [Batch 10/59] [D loss: 0.567685] [G loss: 1.101016]\n",
"[Epoch 113/200] [Batch 11/59] [D loss: 0.609448] [G loss: 0.867871]\n",
"[Epoch 113/200] [Batch 12/59] [D loss: 0.569790] [G loss: 0.791175]\n",
"[Epoch 113/200] [Batch 13/59] [D loss: 0.571109] [G loss: 0.856233]\n",
"[Epoch 113/200] [Batch 14/59] [D loss: 0.612459] [G loss: 1.286628]\n",
"[Epoch 113/200] [Batch 15/59] [D loss: 0.582762] [G loss: 0.762694]\n",
"[Epoch 113/200] [Batch 16/59] [D loss: 0.567459] [G loss: 0.854475]\n",
"[Epoch 113/200] [Batch 17/59] [D loss: 0.577107] [G loss: 0.897501]\n",
"[Epoch 113/200] [Batch 18/59] [D loss: 0.528687] [G loss: 0.985964]\n",
"[Epoch 113/200] [Batch 19/59] [D loss: 0.584440] [G loss: 0.960012]\n",
"[Epoch 113/200] [Batch 20/59] [D loss: 0.591380] [G loss: 0.757177]\n",
"[Epoch 113/200] [Batch 21/59] [D loss: 0.564492] [G loss: 0.997317]\n",
"[Epoch 113/200] [Batch 22/59] [D loss: 0.578738] [G loss: 0.837292]\n",
"[Epoch 113/200] [Batch 23/59] [D loss: 0.574341] [G loss: 0.753468]\n",
"[Epoch 113/200] [Batch 24/59] [D loss: 0.651071] [G loss: 0.790762]\n",
"[Epoch 113/200] [Batch 25/59] [D loss: 0.571035] [G loss: 0.893302]\n",
"[Epoch 113/200] [Batch 26/59] [D loss: 0.563517] [G loss: 0.838698]\n",
"[Epoch 113/200] [Batch 27/59] [D loss: 0.572324] [G loss: 0.912848]\n",
"[Epoch 113/200] [Batch 28/59] [D loss: 0.588888] [G loss: 1.023959]\n",
"[Epoch 113/200] [Batch 29/59] [D loss: 0.561184] [G loss: 0.848061]\n",
"[Epoch 113/200] [Batch 30/59] [D loss: 0.592013] [G loss: 0.672046]\n",
"[Epoch 113/200] [Batch 31/59] [D loss: 0.632984] [G loss: 1.119809]\n",
"[Epoch 113/200] [Batch 32/59] [D loss: 0.546333] [G loss: 0.838137]\n",
"[Epoch 113/200] [Batch 33/59] [D loss: 0.622639] [G loss: 0.846831]\n",
"[Epoch 113/200] [Batch 34/59] [D loss: 0.575354] [G loss: 1.057453]\n",
"[Epoch 113/200] [Batch 35/59] [D loss: 0.587002] [G loss: 0.853354]\n",
"[Epoch 113/200] [Batch 36/59] [D loss: 0.567462] [G loss: 0.912501]\n",
"[Epoch 113/200] [Batch 37/59] [D loss: 0.651941] [G loss: 0.985047]\n",
"[Epoch 113/200] [Batch 38/59] [D loss: 0.585607] [G loss: 0.629720]\n",
"[Epoch 113/200] [Batch 39/59] [D loss: 0.559051] [G loss: 1.011774]\n",
"[Epoch 113/200] [Batch 40/59] [D loss: 0.590049] [G loss: 0.756072]\n",
"[Epoch 113/200] [Batch 41/59] [D loss: 0.650839] [G loss: 0.781162]\n",
"[Epoch 113/200] [Batch 42/59] [D loss: 0.558955] [G loss: 0.935671]\n",
"[Epoch 113/200] [Batch 43/59] [D loss: 0.624418] [G loss: 0.713258]\n",
"[Epoch 113/200] [Batch 44/59] [D loss: 0.580491] [G loss: 0.703445]\n",
"[Epoch 113/200] [Batch 45/59] [D loss: 0.619577] [G loss: 0.948566]\n",
"[Epoch 113/200] [Batch 46/59] [D loss: 0.592832] [G loss: 0.753918]\n",
"[Epoch 113/200] [Batch 47/59] [D loss: 0.575272] [G loss: 0.899755]\n",
"[Epoch 113/200] [Batch 48/59] [D loss: 0.578415] [G loss: 1.064641]\n",
"[Epoch 113/200] [Batch 49/59] [D loss: 0.566542] [G loss: 0.673457]\n",
"[Epoch 113/200] [Batch 50/59] [D loss: 0.624656] [G loss: 0.851330]\n",
"[Epoch 113/200] [Batch 51/59] [D loss: 0.619806] [G loss: 1.293728]\n",
"[Epoch 113/200] [Batch 52/59] [D loss: 0.588243] [G loss: 0.682613]\n",
"[Epoch 113/200] [Batch 53/59] [D loss: 0.531258] [G loss: 1.048443]\n",
"[Epoch 113/200] [Batch 54/59] [D loss: 0.585789] [G loss: 0.893559]\n",
"[Epoch 113/200] [Batch 55/59] [D loss: 0.553949] [G loss: 0.777155]\n",
"[Epoch 113/200] [Batch 56/59] [D loss: 0.597339] [G loss: 0.807349]\n",
"[Epoch 113/200] [Batch 57/59] [D loss: 0.585884] [G loss: 0.723721]\n",
"[Epoch 113/200] [Batch 58/59] [D loss: 0.591203] [G loss: 1.153109]\n",
"[Epoch 114/200] [Batch 0/59] [D loss: 0.606520] [G loss: 0.864826]\n",
"[Epoch 114/200] [Batch 1/59] [D loss: 0.585202] [G loss: 0.970817]\n",
"[Epoch 114/200] [Batch 2/59] [D loss: 0.602664] [G loss: 0.875216]\n",
"[Epoch 114/200] [Batch 3/59] [D loss: 0.619625] [G loss: 0.854331]\n",
"[Epoch 114/200] [Batch 4/59] [D loss: 0.554575] [G loss: 0.781006]\n",
"[Epoch 114/200] [Batch 5/59] [D loss: 0.569098] [G loss: 0.863190]\n",
"[Epoch 114/200] [Batch 6/59] [D loss: 0.621425] [G loss: 1.045749]\n",
"[Epoch 114/200] [Batch 7/59] [D loss: 0.666735] [G loss: 1.044698]\n",
"[Epoch 114/200] [Batch 8/59] [D loss: 0.594094] [G loss: 0.758691]\n",
"[Epoch 114/200] [Batch 9/59] [D loss: 0.636627] [G loss: 0.988023]\n",
"[Epoch 114/200] [Batch 10/59] [D loss: 0.522066] [G loss: 0.932000]\n",
"[Epoch 114/200] [Batch 11/59] [D loss: 0.527606] [G loss: 0.955288]\n",
"[Epoch 114/200] [Batch 12/59] [D loss: 0.622214] [G loss: 0.823689]\n",
"[Epoch 114/200] [Batch 13/59] [D loss: 0.573425] [G loss: 0.857182]\n",
"[Epoch 114/200] [Batch 14/59] [D loss: 0.583863] [G loss: 0.633780]\n",
"[Epoch 114/200] [Batch 15/59] [D loss: 0.550123] [G loss: 1.050821]\n",
"[Epoch 114/200] [Batch 16/59] [D loss: 0.626382] [G loss: 0.992981]\n",
"[Epoch 114/200] [Batch 17/59] [D loss: 0.667115] [G loss: 0.602596]\n",
"[Epoch 114/200] [Batch 18/59] [D loss: 0.603917] [G loss: 1.136668]\n",
"[Epoch 114/200] [Batch 19/59] [D loss: 0.591156] [G loss: 0.958284]\n",
"[Epoch 114/200] [Batch 20/59] [D loss: 0.538399] [G loss: 0.699129]\n",
"[Epoch 114/200] [Batch 21/59] [D loss: 0.535161] [G loss: 1.245517]\n",
"[Epoch 114/200] [Batch 22/59] [D loss: 0.587740] [G loss: 0.978507]\n",
"[Epoch 114/200] [Batch 23/59] [D loss: 0.627258] [G loss: 0.832870]\n",
"[Epoch 114/200] [Batch 24/59] [D loss: 0.598330] [G loss: 0.982591]\n",
"[Epoch 114/200] [Batch 25/59] [D loss: 0.583097] [G loss: 0.723877]\n",
"[Epoch 114/200] [Batch 26/59] [D loss: 0.573820] [G loss: 0.991863]\n",
"[Epoch 114/200] [Batch 27/59] [D loss: 0.542658] [G loss: 0.968450]\n",
"[Epoch 114/200] [Batch 28/59] [D loss: 0.613272] [G loss: 1.012666]\n",
"[Epoch 114/200] [Batch 29/59] [D loss: 0.587344] [G loss: 0.837667]\n",
"[Epoch 114/200] [Batch 30/59] [D loss: 0.585836] [G loss: 1.015721]\n",
"[Epoch 114/200] [Batch 31/59] [D loss: 0.597525] [G loss: 0.865009]\n",
"[Epoch 114/200] [Batch 32/59] [D loss: 0.530944] [G loss: 0.856861]\n",
"[Epoch 114/200] [Batch 33/59] [D loss: 0.590631] [G loss: 1.043317]\n",
"[Epoch 114/200] [Batch 34/59] [D loss: 0.557790] [G loss: 0.826558]\n",
"[Epoch 114/200] [Batch 35/59] [D loss: 0.577396] [G loss: 0.914132]\n",
"[Epoch 114/200] [Batch 36/59] [D loss: 0.620867] [G loss: 0.793052]\n",
"[Epoch 114/200] [Batch 37/59] [D loss: 0.682327] [G loss: 0.646893]\n",
"[Epoch 114/200] [Batch 38/59] [D loss: 0.613475] [G loss: 1.039973]\n",
"[Epoch 114/200] [Batch 39/59] [D loss: 0.523565] [G loss: 0.745737]\n",
"[Epoch 114/200] [Batch 40/59] [D loss: 0.550049] [G loss: 0.782104]\n",
"[Epoch 114/200] [Batch 41/59] [D loss: 0.655054] [G loss: 1.162422]\n",
"[Epoch 114/200] [Batch 42/59] [D loss: 0.581917] [G loss: 0.838149]\n",
"[Epoch 114/200] [Batch 43/59] [D loss: 0.652887] [G loss: 0.626969]\n",
"[Epoch 114/200] [Batch 44/59] [D loss: 0.585144] [G loss: 1.205202]\n",
"[Epoch 114/200] [Batch 45/59] [D loss: 0.584546] [G loss: 0.967032]\n",
"[Epoch 114/200] [Batch 46/59] [D loss: 0.607126] [G loss: 0.462016]\n",
"[Epoch 114/200] [Batch 47/59] [D loss: 0.603409] [G loss: 1.198027]\n",
"[Epoch 114/200] [Batch 48/59] [D loss: 0.584303] [G loss: 0.736266]\n",
"[Epoch 114/200] [Batch 49/59] [D loss: 0.620685] [G loss: 0.967179]\n",
"[Epoch 114/200] [Batch 50/59] [D loss: 0.608790] [G loss: 0.820087]\n",
"[Epoch 114/200] [Batch 51/59] [D loss: 0.600672] [G loss: 0.977868]\n",
"[Epoch 114/200] [Batch 52/59] [D loss: 0.559128] [G loss: 1.033825]\n",
"[Epoch 114/200] [Batch 53/59] [D loss: 0.593085] [G loss: 0.907080]\n",
"[Epoch 114/200] [Batch 54/59] [D loss: 0.542329] [G loss: 0.839515]\n",
"[Epoch 114/200] [Batch 55/59] [D loss: 0.589643] [G loss: 0.858219]\n",
"[Epoch 114/200] [Batch 56/59] [D loss: 0.602776] [G loss: 0.962644]\n",
"[Epoch 114/200] [Batch 57/59] [D loss: 0.538915] [G loss: 0.812262]\n",
"[Epoch 114/200] [Batch 58/59] [D loss: 0.555870] [G loss: 0.967487]\n",
"[Epoch 115/200] [Batch 0/59] [D loss: 0.514235] [G loss: 0.850408]\n",
"[Epoch 115/200] [Batch 1/59] [D loss: 0.588936] [G loss: 0.829019]\n",
"[Epoch 115/200] [Batch 2/59] [D loss: 0.602446] [G loss: 0.979988]\n",
"[Epoch 115/200] [Batch 3/59] [D loss: 0.633723] [G loss: 0.913751]\n",
"[Epoch 115/200] [Batch 4/59] [D loss: 0.593009] [G loss: 0.621454]\n",
"[Epoch 115/200] [Batch 5/59] [D loss: 0.569572] [G loss: 1.101009]\n",
"[Epoch 115/200] [Batch 6/59] [D loss: 0.522363] [G loss: 0.938931]\n",
"[Epoch 115/200] [Batch 7/59] [D loss: 0.614821] [G loss: 0.756153]\n",
"[Epoch 115/200] [Batch 8/59] [D loss: 0.573876] [G loss: 0.929544]\n",
"[Epoch 115/200] [Batch 9/59] [D loss: 0.541143] [G loss: 1.000701]\n",
"[Epoch 115/200] [Batch 10/59] [D loss: 0.627582] [G loss: 1.017936]\n",
"[Epoch 115/200] [Batch 11/59] [D loss: 0.589629] [G loss: 0.734316]\n",
"[Epoch 115/200] [Batch 12/59] [D loss: 0.595377] [G loss: 1.097624]\n",
"[Epoch 115/200] [Batch 13/59] [D loss: 0.607124] [G loss: 1.161669]\n",
"[Epoch 115/200] [Batch 14/59] [D loss: 0.652978] [G loss: 0.575415]\n",
"[Epoch 115/200] [Batch 15/59] [D loss: 0.613178] [G loss: 1.162077]\n",
"[Epoch 115/200] [Batch 16/59] [D loss: 0.550353] [G loss: 0.942002]\n",
"[Epoch 115/200] [Batch 17/59] [D loss: 0.587721] [G loss: 0.544730]\n",
"[Epoch 115/200] [Batch 18/59] [D loss: 0.556731] [G loss: 1.170590]\n",
"[Epoch 115/200] [Batch 19/59] [D loss: 0.552914] [G loss: 0.990484]\n",
"[Epoch 115/200] [Batch 20/59] [D loss: 0.603457] [G loss: 0.470281]\n",
"[Epoch 115/200] [Batch 21/59] [D loss: 0.603399] [G loss: 0.981967]\n",
"[Epoch 115/200] [Batch 22/59] [D loss: 0.628339] [G loss: 1.239081]\n",
"[Epoch 115/200] [Batch 23/59] [D loss: 0.574150] [G loss: 0.770618]\n",
"[Epoch 115/200] [Batch 24/59] [D loss: 0.592133] [G loss: 0.753335]\n",
"[Epoch 115/200] [Batch 25/59] [D loss: 0.619004] [G loss: 1.097895]\n",
"[Epoch 115/200] [Batch 26/59] [D loss: 0.606024] [G loss: 0.743330]\n",
"[Epoch 115/200] [Batch 27/59] [D loss: 0.625379] [G loss: 0.863477]\n",
"[Epoch 115/200] [Batch 28/59] [D loss: 0.575270] [G loss: 1.203751]\n",
"[Epoch 115/200] [Batch 29/59] [D loss: 0.600018] [G loss: 0.890080]\n",
"[Epoch 115/200] [Batch 30/59] [D loss: 0.573910] [G loss: 0.922642]\n",
"[Epoch 115/200] [Batch 31/59] [D loss: 0.592689] [G loss: 0.991305]\n",
"[Epoch 115/200] [Batch 32/59] [D loss: 0.545125] [G loss: 0.884519]\n",
"[Epoch 115/200] [Batch 33/59] [D loss: 0.534415] [G loss: 0.817410]\n",
"[Epoch 115/200] [Batch 34/59] [D loss: 0.557938] [G loss: 1.069077]\n",
"[Epoch 115/200] [Batch 35/59] [D loss: 0.548110] [G loss: 0.915183]\n",
"[Epoch 115/200] [Batch 36/59] [D loss: 0.574066] [G loss: 0.868570]\n",
"[Epoch 115/200] [Batch 37/59] [D loss: 0.584637] [G loss: 0.990681]\n",
"[Epoch 115/200] [Batch 38/59] [D loss: 0.548105] [G loss: 0.871126]\n",
"[Epoch 115/200] [Batch 39/59] [D loss: 0.600517] [G loss: 0.884246]\n",
"[Epoch 115/200] [Batch 40/59] [D loss: 0.623228] [G loss: 1.294518]\n",
"[Epoch 115/200] [Batch 41/59] [D loss: 0.608633] [G loss: 0.786934]\n",
"[Epoch 115/200] [Batch 42/59] [D loss: 0.534964] [G loss: 0.862314]\n",
"[Epoch 115/200] [Batch 43/59] [D loss: 0.569076] [G loss: 1.174391]\n",
"[Epoch 115/200] [Batch 44/59] [D loss: 0.553953] [G loss: 0.798900]\n",
"[Epoch 115/200] [Batch 45/59] [D loss: 0.587756] [G loss: 1.085380]\n",
"[Epoch 115/200] [Batch 46/59] [D loss: 0.636488] [G loss: 0.775317]\n",
"[Epoch 115/200] [Batch 47/59] [D loss: 0.577554] [G loss: 0.901161]\n",
"[Epoch 115/200] [Batch 48/59] [D loss: 0.624616] [G loss: 1.135971]\n",
"[Epoch 115/200] [Batch 49/59] [D loss: 0.519074] [G loss: 0.912043]\n",
"[Epoch 115/200] [Batch 50/59] [D loss: 0.570529] [G loss: 1.044271]\n",
"[Epoch 115/200] [Batch 51/59] [D loss: 0.610699] [G loss: 0.816026]\n",
"[Epoch 115/200] [Batch 52/59] [D loss: 0.573601] [G loss: 0.839827]\n",
"[Epoch 115/200] [Batch 53/59] [D loss: 0.554374] [G loss: 0.884570]\n",
"[Epoch 115/200] [Batch 54/59] [D loss: 0.597729] [G loss: 0.702038]\n",
"[Epoch 115/200] [Batch 55/59] [D loss: 0.648460] [G loss: 1.060531]\n",
"[Epoch 115/200] [Batch 56/59] [D loss: 0.629669] [G loss: 0.853600]\n",
"[Epoch 115/200] [Batch 57/59] [D loss: 0.645333] [G loss: 0.631374]\n",
"[Epoch 115/200] [Batch 58/59] [D loss: 0.604257] [G loss: 1.111753]\n",
"[Epoch 116/200] [Batch 0/59] [D loss: 0.543787] [G loss: 1.035135]\n",
"[Epoch 116/200] [Batch 1/59] [D loss: 0.569445] [G loss: 0.894399]\n",
"[Epoch 116/200] [Batch 2/59] [D loss: 0.605848] [G loss: 0.726603]\n",
"[Epoch 116/200] [Batch 3/59] [D loss: 0.563560] [G loss: 0.924300]\n",
"[Epoch 116/200] [Batch 4/59] [D loss: 0.638159] [G loss: 1.102156]\n",
"[Epoch 116/200] [Batch 5/59] [D loss: 0.579239] [G loss: 0.912206]\n",
"[Epoch 116/200] [Batch 6/59] [D loss: 0.599653] [G loss: 0.951560]\n",
"[Epoch 116/200] [Batch 7/59] [D loss: 0.630902] [G loss: 0.787907]\n",
"[Epoch 116/200] [Batch 8/59] [D loss: 0.576542] [G loss: 0.939154]\n",
"[Epoch 116/200] [Batch 9/59] [D loss: 0.573220] [G loss: 0.797062]\n",
"[Epoch 116/200] [Batch 10/59] [D loss: 0.608151] [G loss: 0.778249]\n",
"[Epoch 116/200] [Batch 11/59] [D loss: 0.595557] [G loss: 1.003736]\n",
"[Epoch 116/200] [Batch 12/59] [D loss: 0.547585] [G loss: 0.730218]\n",
"[Epoch 116/200] [Batch 13/59] [D loss: 0.622984] [G loss: 0.673093]\n",
"[Epoch 116/200] [Batch 14/59] [D loss: 0.592157] [G loss: 1.157420]\n",
"[Epoch 116/200] [Batch 15/59] [D loss: 0.552368] [G loss: 1.044900]\n",
"[Epoch 116/200] [Batch 16/59] [D loss: 0.632971] [G loss: 0.793752]\n",
"[Epoch 116/200] [Batch 17/59] [D loss: 0.654924] [G loss: 0.854863]\n",
"[Epoch 116/200] [Batch 18/59] [D loss: 0.645252] [G loss: 1.326642]\n",
"[Epoch 116/200] [Batch 19/59] [D loss: 0.586226] [G loss: 0.756790]\n",
"[Epoch 116/200] [Batch 20/59] [D loss: 0.574624] [G loss: 0.607498]\n",
"[Epoch 116/200] [Batch 21/59] [D loss: 0.622630] [G loss: 1.407262]\n",
"[Epoch 116/200] [Batch 22/59] [D loss: 0.570981] [G loss: 0.877437]\n",
"[Epoch 116/200] [Batch 23/59] [D loss: 0.630533] [G loss: 0.593831]\n",
"[Epoch 116/200] [Batch 24/59] [D loss: 0.574087] [G loss: 1.326151]\n",
"[Epoch 116/200] [Batch 25/59] [D loss: 0.542906] [G loss: 0.940095]\n",
"[Epoch 116/200] [Batch 26/59] [D loss: 0.654327] [G loss: 0.707466]\n",
"[Epoch 116/200] [Batch 27/59] [D loss: 0.566865] [G loss: 1.196397]\n",
"[Epoch 116/200] [Batch 28/59] [D loss: 0.581998] [G loss: 0.936955]\n",
"[Epoch 116/200] [Batch 29/59] [D loss: 0.526646] [G loss: 0.946964]\n",
"[Epoch 116/200] [Batch 30/59] [D loss: 0.570916] [G loss: 0.771235]\n",
"[Epoch 116/200] [Batch 31/59] [D loss: 0.629795] [G loss: 0.651707]\n",
"[Epoch 116/200] [Batch 32/59] [D loss: 0.622742] [G loss: 1.058962]\n",
"[Epoch 116/200] [Batch 33/59] [D loss: 0.555088] [G loss: 0.778325]\n",
"[Epoch 116/200] [Batch 34/59] [D loss: 0.556773] [G loss: 0.984311]\n",
"[Epoch 116/200] [Batch 35/59] [D loss: 0.564511] [G loss: 0.692901]\n",
"[Epoch 116/200] [Batch 36/59] [D loss: 0.583020] [G loss: 1.003546]\n",
"[Epoch 116/200] [Batch 37/59] [D loss: 0.595390] [G loss: 0.805086]\n",
"[Epoch 116/200] [Batch 38/59] [D loss: 0.559254] [G loss: 0.753791]\n",
"[Epoch 116/200] [Batch 39/59] [D loss: 0.647997] [G loss: 0.894099]\n",
"[Epoch 116/200] [Batch 40/59] [D loss: 0.567738] [G loss: 1.081934]\n",
"[Epoch 116/200] [Batch 41/59] [D loss: 0.530322] [G loss: 1.013391]\n",
"[Epoch 116/200] [Batch 42/59] [D loss: 0.588396] [G loss: 0.749575]\n",
"[Epoch 116/200] [Batch 43/59] [D loss: 0.603803] [G loss: 1.079785]\n",
"[Epoch 116/200] [Batch 44/59] [D loss: 0.676557] [G loss: 0.836402]\n",
"[Epoch 116/200] [Batch 45/59] [D loss: 0.582859] [G loss: 0.977688]\n",
"[Epoch 116/200] [Batch 46/59] [D loss: 0.599507] [G loss: 0.898164]\n",
"[Epoch 116/200] [Batch 47/59] [D loss: 0.541242] [G loss: 0.879643]\n",
"[Epoch 116/200] [Batch 48/59] [D loss: 0.600266] [G loss: 0.865476]\n",
"[Epoch 116/200] [Batch 49/59] [D loss: 0.616888] [G loss: 0.907491]\n",
"[Epoch 116/200] [Batch 50/59] [D loss: 0.611601] [G loss: 1.071597]\n",
"[Epoch 116/200] [Batch 51/59] [D loss: 0.525049] [G loss: 0.966662]\n",
"[Epoch 116/200] [Batch 52/59] [D loss: 0.552574] [G loss: 0.909810]\n",
"[Epoch 116/200] [Batch 53/59] [D loss: 0.563598] [G loss: 0.798032]\n",
"[Epoch 116/200] [Batch 54/59] [D loss: 0.529853] [G loss: 1.032407]\n",
"[Epoch 116/200] [Batch 55/59] [D loss: 0.621627] [G loss: 1.093418]\n",
"[Epoch 116/200] [Batch 56/59] [D loss: 0.567688] [G loss: 0.845602]\n",
"[Epoch 116/200] [Batch 57/59] [D loss: 0.613330] [G loss: 0.683309]\n",
"[Epoch 116/200] [Batch 58/59] [D loss: 0.672298] [G loss: 1.260940]\n",
"[Epoch 117/200] [Batch 0/59] [D loss: 0.609866] [G loss: 0.763689]\n",
"[Epoch 117/200] [Batch 1/59] [D loss: 0.628013] [G loss: 0.709352]\n",
"[Epoch 117/200] [Batch 2/59] [D loss: 0.598178] [G loss: 0.892660]\n",
"[Epoch 117/200] [Batch 3/59] [D loss: 0.546201] [G loss: 0.824438]\n",
"[Epoch 117/200] [Batch 4/59] [D loss: 0.584826] [G loss: 0.705406]\n",
"[Epoch 117/200] [Batch 5/59] [D loss: 0.607003] [G loss: 0.805752]\n",
"[Epoch 117/200] [Batch 6/59] [D loss: 0.563631] [G loss: 1.128987]\n",
"[Epoch 117/200] [Batch 7/59] [D loss: 0.535988] [G loss: 0.789437]\n",
"[Epoch 117/200] [Batch 8/59] [D loss: 0.569889] [G loss: 0.735450]\n",
"[Epoch 117/200] [Batch 9/59] [D loss: 0.568539] [G loss: 1.116012]\n",
"[Epoch 117/200] [Batch 10/59] [D loss: 0.618102] [G loss: 0.715288]\n",
"[Epoch 117/200] [Batch 11/59] [D loss: 0.596111] [G loss: 0.912118]\n",
"[Epoch 117/200] [Batch 12/59] [D loss: 0.587995] [G loss: 1.191456]\n",
"[Epoch 117/200] [Batch 13/59] [D loss: 0.568317] [G loss: 0.762881]\n",
"[Epoch 117/200] [Batch 14/59] [D loss: 0.621574] [G loss: 0.750869]\n",
"[Epoch 117/200] [Batch 15/59] [D loss: 0.545099] [G loss: 1.197296]\n",
"[Epoch 117/200] [Batch 16/59] [D loss: 0.585197] [G loss: 0.855222]\n",
"[Epoch 117/200] [Batch 17/59] [D loss: 0.521239] [G loss: 0.885811]\n",
"[Epoch 117/200] [Batch 18/59] [D loss: 0.551360] [G loss: 0.928256]\n",
"[Epoch 117/200] [Batch 19/59] [D loss: 0.537484] [G loss: 0.692220]\n",
"[Epoch 117/200] [Batch 20/59] [D loss: 0.540461] [G loss: 0.752079]\n",
"[Epoch 117/200] [Batch 21/59] [D loss: 0.562554] [G loss: 0.691076]\n",
"[Epoch 117/200] [Batch 22/59] [D loss: 0.545164] [G loss: 1.064601]\n",
"[Epoch 117/200] [Batch 23/59] [D loss: 0.543973] [G loss: 1.052174]\n",
"[Epoch 117/200] [Batch 24/59] [D loss: 0.517433] [G loss: 0.901705]\n",
"[Epoch 117/200] [Batch 25/59] [D loss: 0.624200] [G loss: 0.785670]\n",
"[Epoch 117/200] [Batch 26/59] [D loss: 0.583596] [G loss: 1.316412]\n",
"[Epoch 117/200] [Batch 27/59] [D loss: 0.620740] [G loss: 1.038277]\n",
"[Epoch 117/200] [Batch 28/59] [D loss: 0.680542] [G loss: 0.623483]\n",
"[Epoch 117/200] [Batch 29/59] [D loss: 0.576328] [G loss: 1.469464]\n",
"[Epoch 117/200] [Batch 30/59] [D loss: 0.531026] [G loss: 0.801924]\n",
"[Epoch 117/200] [Batch 31/59] [D loss: 0.567777] [G loss: 0.669750]\n",
"[Epoch 117/200] [Batch 32/59] [D loss: 0.664913] [G loss: 0.798739]\n",
"[Epoch 117/200] [Batch 33/59] [D loss: 0.598037] [G loss: 1.188021]\n",
"[Epoch 117/200] [Batch 34/59] [D loss: 0.621659] [G loss: 0.922118]\n",
"[Epoch 117/200] [Batch 35/59] [D loss: 0.584122] [G loss: 0.777503]\n",
"[Epoch 117/200] [Batch 36/59] [D loss: 0.608593] [G loss: 1.356906]\n",
"[Epoch 117/200] [Batch 37/59] [D loss: 0.601964] [G loss: 0.776902]\n",
"[Epoch 117/200] [Batch 38/59] [D loss: 0.524851] [G loss: 1.020723]\n",
"[Epoch 117/200] [Batch 39/59] [D loss: 0.499831] [G loss: 0.840432]\n",
"[Epoch 117/200] [Batch 40/59] [D loss: 0.530606] [G loss: 0.868987]\n",
"[Epoch 117/200] [Batch 41/59] [D loss: 0.572891] [G loss: 1.170858]\n",
"[Epoch 117/200] [Batch 42/59] [D loss: 0.641917] [G loss: 0.931266]\n",
"[Epoch 117/200] [Batch 43/59] [D loss: 0.528231] [G loss: 1.056761]\n",
"[Epoch 117/200] [Batch 44/59] [D loss: 0.585461] [G loss: 0.771032]\n",
"[Epoch 117/200] [Batch 45/59] [D loss: 0.589141] [G loss: 1.125305]\n",
"[Epoch 117/200] [Batch 46/59] [D loss: 0.632281] [G loss: 0.992611]\n",
"[Epoch 117/200] [Batch 47/59] [D loss: 0.552904] [G loss: 0.849663]\n",
"[Epoch 117/200] [Batch 48/59] [D loss: 0.614515] [G loss: 0.855566]\n",
"[Epoch 117/200] [Batch 49/59] [D loss: 0.553129] [G loss: 0.803676]\n",
"[Epoch 117/200] [Batch 50/59] [D loss: 0.585150] [G loss: 0.995601]\n",
"[Epoch 117/200] [Batch 51/59] [D loss: 0.568768] [G loss: 0.966835]\n",
"[Epoch 117/200] [Batch 52/59] [D loss: 0.572915] [G loss: 0.934282]\n",
"[Epoch 117/200] [Batch 53/59] [D loss: 0.584385] [G loss: 0.973058]\n",
"[Epoch 117/200] [Batch 54/59] [D loss: 0.598569] [G loss: 0.665525]\n",
"[Epoch 117/200] [Batch 55/59] [D loss: 0.623525] [G loss: 1.004728]\n",
"[Epoch 117/200] [Batch 56/59] [D loss: 0.604962] [G loss: 1.043618]\n",
"[Epoch 117/200] [Batch 57/59] [D loss: 0.588680] [G loss: 1.043564]\n",
"[Epoch 117/200] [Batch 58/59] [D loss: 0.619264] [G loss: 0.595246]\n",
"[Epoch 118/200] [Batch 0/59] [D loss: 0.644918] [G loss: 0.845484]\n",
"[Epoch 118/200] [Batch 1/59] [D loss: 0.605691] [G loss: 1.017336]\n",
"[Epoch 118/200] [Batch 2/59] [D loss: 0.576939] [G loss: 0.936314]\n",
"[Epoch 118/200] [Batch 3/59] [D loss: 0.626598] [G loss: 0.720330]\n",
"[Epoch 118/200] [Batch 4/59] [D loss: 0.598537] [G loss: 1.096334]\n",
"[Epoch 118/200] [Batch 5/59] [D loss: 0.609784] [G loss: 1.017925]\n",
"[Epoch 118/200] [Batch 6/59] [D loss: 0.589065] [G loss: 0.827760]\n",
"[Epoch 118/200] [Batch 7/59] [D loss: 0.614084] [G loss: 0.922320]\n",
"[Epoch 118/200] [Batch 8/59] [D loss: 0.624356] [G loss: 0.708317]\n",
"[Epoch 118/200] [Batch 9/59] [D loss: 0.598931] [G loss: 1.067241]\n",
"[Epoch 118/200] [Batch 10/59] [D loss: 0.494080] [G loss: 0.831865]\n",
"[Epoch 118/200] [Batch 11/59] [D loss: 0.612876] [G loss: 0.607272]\n",
"[Epoch 118/200] [Batch 12/59] [D loss: 0.635777] [G loss: 1.160181]\n",
"[Epoch 118/200] [Batch 13/59] [D loss: 0.516663] [G loss: 0.753697]\n",
"[Epoch 118/200] [Batch 14/59] [D loss: 0.680159] [G loss: 0.604734]\n",
"[Epoch 118/200] [Batch 15/59] [D loss: 0.661477] [G loss: 1.344296]\n",
"[Epoch 118/200] [Batch 16/59] [D loss: 0.550648] [G loss: 0.872458]\n",
"[Epoch 118/200] [Batch 17/59] [D loss: 0.589223] [G loss: 0.635590]\n",
"[Epoch 118/200] [Batch 18/59] [D loss: 0.562077] [G loss: 1.015734]\n",
"[Epoch 118/200] [Batch 19/59] [D loss: 0.637648] [G loss: 1.017071]\n",
"[Epoch 118/200] [Batch 20/59] [D loss: 0.552322] [G loss: 0.779078]\n",
"[Epoch 118/200] [Batch 21/59] [D loss: 0.528264] [G loss: 0.883730]\n",
"[Epoch 118/200] [Batch 22/59] [D loss: 0.604624] [G loss: 0.917179]\n",
"[Epoch 118/200] [Batch 23/59] [D loss: 0.548598] [G loss: 0.843074]\n",
"[Epoch 118/200] [Batch 24/59] [D loss: 0.616958] [G loss: 0.789837]\n",
"[Epoch 118/200] [Batch 25/59] [D loss: 0.612218] [G loss: 0.950928]\n",
"[Epoch 118/200] [Batch 26/59] [D loss: 0.590464] [G loss: 0.761384]\n",
"[Epoch 118/200] [Batch 27/59] [D loss: 0.568772] [G loss: 0.984983]\n",
"[Epoch 118/200] [Batch 28/59] [D loss: 0.635751] [G loss: 0.968126]\n",
"[Epoch 118/200] [Batch 29/59] [D loss: 0.574954] [G loss: 0.666375]\n",
"[Epoch 118/200] [Batch 30/59] [D loss: 0.566700] [G loss: 1.068241]\n",
"[Epoch 118/200] [Batch 31/59] [D loss: 0.643743] [G loss: 0.858430]\n",
"[Epoch 118/200] [Batch 32/59] [D loss: 0.581690] [G loss: 0.767561]\n",
"[Epoch 118/200] [Batch 33/59] [D loss: 0.570359] [G loss: 0.825027]\n",
"[Epoch 118/200] [Batch 34/59] [D loss: 0.632462] [G loss: 0.909079]\n",
"[Epoch 118/200] [Batch 35/59] [D loss: 0.559602] [G loss: 0.701980]\n",
"[Epoch 118/200] [Batch 36/59] [D loss: 0.601915] [G loss: 0.875750]\n",
"[Epoch 118/200] [Batch 37/59] [D loss: 0.649765] [G loss: 1.025231]\n",
"[Epoch 118/200] [Batch 38/59] [D loss: 0.594500] [G loss: 0.873804]\n",
"[Epoch 118/200] [Batch 39/59] [D loss: 0.590886] [G loss: 1.008770]\n",
"[Epoch 118/200] [Batch 40/59] [D loss: 0.634114] [G loss: 0.840562]\n",
"[Epoch 118/200] [Batch 41/59] [D loss: 0.631476] [G loss: 1.204850]\n",
"[Epoch 118/200] [Batch 42/59] [D loss: 0.539090] [G loss: 0.947786]\n",
"[Epoch 118/200] [Batch 43/59] [D loss: 0.609704] [G loss: 0.666566]\n",
"[Epoch 118/200] [Batch 44/59] [D loss: 0.597296] [G loss: 1.162468]\n",
"[Epoch 118/200] [Batch 45/59] [D loss: 0.587998] [G loss: 0.960084]\n",
"[Epoch 118/200] [Batch 46/59] [D loss: 0.622630] [G loss: 0.538912]\n",
"[Epoch 118/200] [Batch 47/59] [D loss: 0.609043] [G loss: 1.242111]\n",
"[Epoch 118/200] [Batch 48/59] [D loss: 0.545676] [G loss: 0.869880]\n",
"[Epoch 118/200] [Batch 49/59] [D loss: 0.594480] [G loss: 0.904701]\n",
"[Epoch 118/200] [Batch 50/59] [D loss: 0.572332] [G loss: 0.866948]\n",
"[Epoch 118/200] [Batch 51/59] [D loss: 0.607655] [G loss: 0.946613]\n",
"[Epoch 118/200] [Batch 52/59] [D loss: 0.585840] [G loss: 0.692745]\n",
"[Epoch 118/200] [Batch 53/59] [D loss: 0.584133] [G loss: 0.980514]\n",
"[Epoch 118/200] [Batch 54/59] [D loss: 0.663906] [G loss: 0.874088]\n",
"[Epoch 118/200] [Batch 55/59] [D loss: 0.579538] [G loss: 0.723216]\n",
"[Epoch 118/200] [Batch 56/59] [D loss: 0.578961] [G loss: 1.117896]\n",
"[Epoch 118/200] [Batch 57/59] [D loss: 0.569172] [G loss: 0.848227]\n",
"[Epoch 118/200] [Batch 58/59] [D loss: 0.497944] [G loss: 0.876652]\n",
"[Epoch 119/200] [Batch 0/59] [D loss: 0.593365] [G loss: 0.878185]\n",
"[Epoch 119/200] [Batch 1/59] [D loss: 0.568920] [G loss: 0.805058]\n",
"[Epoch 119/200] [Batch 2/59] [D loss: 0.593053] [G loss: 0.674031]\n",
"[Epoch 119/200] [Batch 3/59] [D loss: 0.561252] [G loss: 0.947094]\n",
"[Epoch 119/200] [Batch 4/59] [D loss: 0.542170] [G loss: 0.983557]\n",
"[Epoch 119/200] [Batch 5/59] [D loss: 0.600083] [G loss: 0.857773]\n",
"[Epoch 119/200] [Batch 6/59] [D loss: 0.643074] [G loss: 0.980401]\n",
"[Epoch 119/200] [Batch 7/59] [D loss: 0.544030] [G loss: 0.966106]\n",
"[Epoch 119/200] [Batch 8/59] [D loss: 0.575776] [G loss: 0.763967]\n",
"[Epoch 119/200] [Batch 9/59] [D loss: 0.544287] [G loss: 1.031401]\n",
"[Epoch 119/200] [Batch 10/59] [D loss: 0.581043] [G loss: 1.027261]\n",
"[Epoch 119/200] [Batch 11/59] [D loss: 0.567035] [G loss: 0.862401]\n",
"[Epoch 119/200] [Batch 12/59] [D loss: 0.544797] [G loss: 0.812612]\n",
"[Epoch 119/200] [Batch 13/59] [D loss: 0.581580] [G loss: 0.945504]\n",
"[Epoch 119/200] [Batch 14/59] [D loss: 0.542886] [G loss: 0.846712]\n",
"[Epoch 119/200] [Batch 15/59] [D loss: 0.545349] [G loss: 0.986781]\n",
"[Epoch 119/200] [Batch 16/59] [D loss: 0.534649] [G loss: 0.861205]\n",
"[Epoch 119/200] [Batch 17/59] [D loss: 0.568815] [G loss: 0.774522]\n",
"[Epoch 119/200] [Batch 18/59] [D loss: 0.562075] [G loss: 0.737040]\n",
"[Epoch 119/200] [Batch 19/59] [D loss: 0.527771] [G loss: 1.054323]\n",
"[Epoch 119/200] [Batch 20/59] [D loss: 0.558453] [G loss: 0.768491]\n",
"[Epoch 119/200] [Batch 21/59] [D loss: 0.564547] [G loss: 0.778836]\n",
"[Epoch 119/200] [Batch 22/59] [D loss: 0.544640] [G loss: 1.157722]\n",
"[Epoch 119/200] [Batch 23/59] [D loss: 0.554258] [G loss: 0.950458]\n",
"[Epoch 119/200] [Batch 24/59] [D loss: 0.582803] [G loss: 0.947580]\n",
"[Epoch 119/200] [Batch 25/59] [D loss: 0.535083] [G loss: 0.887055]\n",
"[Epoch 119/200] [Batch 26/59] [D loss: 0.549428] [G loss: 0.974805]\n",
"[Epoch 119/200] [Batch 27/59] [D loss: 0.571079] [G loss: 0.713091]\n",
"[Epoch 119/200] [Batch 28/59] [D loss: 0.587688] [G loss: 0.926831]\n",
"[Epoch 119/200] [Batch 29/59] [D loss: 0.528024] [G loss: 0.930777]\n",
"[Epoch 119/200] [Batch 30/59] [D loss: 0.653616] [G loss: 0.738679]\n",
"[Epoch 119/200] [Batch 31/59] [D loss: 0.635090] [G loss: 1.270831]\n",
"[Epoch 119/200] [Batch 32/59] [D loss: 0.520240] [G loss: 0.933349]\n",
"[Epoch 119/200] [Batch 33/59] [D loss: 0.619982] [G loss: 0.700354]\n",
"[Epoch 119/200] [Batch 34/59] [D loss: 0.571257] [G loss: 0.805611]\n",
"[Epoch 119/200] [Batch 35/59] [D loss: 0.564156] [G loss: 1.116733]\n",
"[Epoch 119/200] [Batch 36/59] [D loss: 0.584631] [G loss: 0.679837]\n",
"[Epoch 119/200] [Batch 37/59] [D loss: 0.595041] [G loss: 0.841775]\n",
"[Epoch 119/200] [Batch 38/59] [D loss: 0.575997] [G loss: 1.042720]\n",
"[Epoch 119/200] [Batch 39/59] [D loss: 0.619271] [G loss: 0.812797]\n",
"[Epoch 119/200] [Batch 40/59] [D loss: 0.623652] [G loss: 0.828368]\n",
"[Epoch 119/200] [Batch 41/59] [D loss: 0.611383] [G loss: 0.968270]\n",
"[Epoch 119/200] [Batch 42/59] [D loss: 0.548402] [G loss: 0.941360]\n",
"[Epoch 119/200] [Batch 43/59] [D loss: 0.627061] [G loss: 1.021216]\n",
"[Epoch 119/200] [Batch 44/59] [D loss: 0.581017] [G loss: 0.759459]\n",
"[Epoch 119/200] [Batch 45/59] [D loss: 0.563180] [G loss: 0.901841]\n",
"[Epoch 119/200] [Batch 46/59] [D loss: 0.551409] [G loss: 0.836834]\n",
"[Epoch 119/200] [Batch 47/59] [D loss: 0.535086] [G loss: 1.108932]\n",
"[Epoch 119/200] [Batch 48/59] [D loss: 0.546900] [G loss: 0.852948]\n",
"[Epoch 119/200] [Batch 49/59] [D loss: 0.611557] [G loss: 0.643448]\n",
"[Epoch 119/200] [Batch 50/59] [D loss: 0.553276] [G loss: 1.040789]\n",
"[Epoch 119/200] [Batch 51/59] [D loss: 0.533986] [G loss: 0.917885]\n",
"[Epoch 119/200] [Batch 52/59] [D loss: 0.585880] [G loss: 0.611810]\n",
"[Epoch 119/200] [Batch 53/59] [D loss: 0.616966] [G loss: 1.175601]\n",
"[Epoch 119/200] [Batch 54/59] [D loss: 0.569000] [G loss: 0.775270]\n",
"[Epoch 119/200] [Batch 55/59] [D loss: 0.559045] [G loss: 0.647792]\n",
"[Epoch 119/200] [Batch 56/59] [D loss: 0.571264] [G loss: 1.066518]\n",
"[Epoch 119/200] [Batch 57/59] [D loss: 0.518394] [G loss: 1.088451]\n",
"[Epoch 119/200] [Batch 58/59] [D loss: 0.579781] [G loss: 1.012492]\n",
"[Epoch 120/200] [Batch 0/59] [D loss: 0.574765] [G loss: 0.911653]\n",
"[Epoch 120/200] [Batch 1/59] [D loss: 0.552334] [G loss: 0.967183]\n",
"[Epoch 120/200] [Batch 2/59] [D loss: 0.600243] [G loss: 1.115021]\n",
"[Epoch 120/200] [Batch 3/59] [D loss: 0.524810] [G loss: 0.849067]\n",
"[Epoch 120/200] [Batch 4/59] [D loss: 0.548421] [G loss: 1.104255]\n",
"[Epoch 120/200] [Batch 5/59] [D loss: 0.572020] [G loss: 0.836135]\n",
"[Epoch 120/200] [Batch 6/59] [D loss: 0.671167] [G loss: 1.164573]\n",
"[Epoch 120/200] [Batch 7/59] [D loss: 0.630472] [G loss: 0.719388]\n",
"[Epoch 120/200] [Batch 8/59] [D loss: 0.664770] [G loss: 0.803613]\n",
"[Epoch 120/200] [Batch 9/59] [D loss: 0.608232] [G loss: 1.236038]\n",
"[Epoch 120/200] [Batch 10/59] [D loss: 0.546691] [G loss: 0.684990]\n",
"[Epoch 120/200] [Batch 11/59] [D loss: 0.545469] [G loss: 0.911264]\n",
"[Epoch 120/200] [Batch 12/59] [D loss: 0.574508] [G loss: 1.069736]\n",
"[Epoch 120/200] [Batch 13/59] [D loss: 0.596450] [G loss: 0.880264]\n",
"[Epoch 120/200] [Batch 14/59] [D loss: 0.556535] [G loss: 0.766606]\n",
"[Epoch 120/200] [Batch 15/59] [D loss: 0.552380] [G loss: 1.030580]\n",
"[Epoch 120/200] [Batch 16/59] [D loss: 0.616041] [G loss: 1.064573]\n",
"[Epoch 120/200] [Batch 17/59] [D loss: 0.483461] [G loss: 0.951235]\n",
"[Epoch 120/200] [Batch 18/59] [D loss: 0.581996] [G loss: 0.783166]\n",
"[Epoch 120/200] [Batch 19/59] [D loss: 0.529723] [G loss: 0.867672]\n",
"[Epoch 120/200] [Batch 20/59] [D loss: 0.573017] [G loss: 0.718819]\n",
"[Epoch 120/200] [Batch 21/59] [D loss: 0.571261] [G loss: 1.123178]\n",
"[Epoch 120/200] [Batch 22/59] [D loss: 0.511013] [G loss: 1.096685]\n",
"[Epoch 120/200] [Batch 23/59] [D loss: 0.616147] [G loss: 0.933609]\n",
"[Epoch 120/200] [Batch 24/59] [D loss: 0.628412] [G loss: 1.004853]\n",
"[Epoch 120/200] [Batch 25/59] [D loss: 0.595393] [G loss: 0.831242]\n",
"[Epoch 120/200] [Batch 26/59] [D loss: 0.596003] [G loss: 0.782704]\n",
"[Epoch 120/200] [Batch 27/59] [D loss: 0.554639] [G loss: 0.875040]\n",
"[Epoch 120/200] [Batch 28/59] [D loss: 0.583037] [G loss: 1.144369]\n",
"[Epoch 120/200] [Batch 29/59] [D loss: 0.603076] [G loss: 0.725836]\n",
"[Epoch 120/200] [Batch 30/59] [D loss: 0.541040] [G loss: 1.064908]\n",
"[Epoch 120/200] [Batch 31/59] [D loss: 0.552206] [G loss: 1.127126]\n",
"[Epoch 120/200] [Batch 32/59] [D loss: 0.628289] [G loss: 0.617708]\n",
"[Epoch 120/200] [Batch 33/59] [D loss: 0.565035] [G loss: 0.999383]\n",
"[Epoch 120/200] [Batch 34/59] [D loss: 0.567145] [G loss: 1.059487]\n",
"[Epoch 120/200] [Batch 35/59] [D loss: 0.590639] [G loss: 0.865479]\n",
"[Epoch 120/200] [Batch 36/59] [D loss: 0.610338] [G loss: 1.039132]\n",
"[Epoch 120/200] [Batch 37/59] [D loss: 0.552115] [G loss: 0.925504]\n",
"[Epoch 120/200] [Batch 38/59] [D loss: 0.554416] [G loss: 0.981271]\n",
"[Epoch 120/200] [Batch 39/59] [D loss: 0.542464] [G loss: 0.930286]\n",
"[Epoch 120/200] [Batch 40/59] [D loss: 0.575106] [G loss: 0.914872]\n",
"[Epoch 120/200] [Batch 41/59] [D loss: 0.549644] [G loss: 0.728057]\n",
"[Epoch 120/200] [Batch 42/59] [D loss: 0.661890] [G loss: 1.169389]\n",
"[Epoch 120/200] [Batch 43/59] [D loss: 0.576035] [G loss: 0.693283]\n",
"[Epoch 120/200] [Batch 44/59] [D loss: 0.576665] [G loss: 0.996510]\n",
"[Epoch 120/200] [Batch 45/59] [D loss: 0.552712] [G loss: 1.025056]\n",
"[Epoch 120/200] [Batch 46/59] [D loss: 0.562315] [G loss: 0.915854]\n",
"[Epoch 120/200] [Batch 47/59] [D loss: 0.509679] [G loss: 0.876650]\n",
"[Epoch 120/200] [Batch 48/59] [D loss: 0.598273] [G loss: 1.053930]\n",
"[Epoch 120/200] [Batch 49/59] [D loss: 0.549715] [G loss: 0.966409]\n",
"[Epoch 120/200] [Batch 50/59] [D loss: 0.535802] [G loss: 0.896368]\n",
"[Epoch 120/200] [Batch 51/59] [D loss: 0.597254] [G loss: 1.107774]\n",
"[Epoch 120/200] [Batch 52/59] [D loss: 0.546067] [G loss: 0.884361]\n",
"[Epoch 120/200] [Batch 53/59] [D loss: 0.622158] [G loss: 0.649158]\n",
"[Epoch 120/200] [Batch 54/59] [D loss: 0.600177] [G loss: 1.046622]\n",
"[Epoch 120/200] [Batch 55/59] [D loss: 0.627944] [G loss: 0.510695]\n",
"[Epoch 120/200] [Batch 56/59] [D loss: 0.692078] [G loss: 1.258902]\n",
"[Epoch 120/200] [Batch 57/59] [D loss: 0.562354] [G loss: 0.766348]\n",
"[Epoch 120/200] [Batch 58/59] [D loss: 0.554530] [G loss: 1.020166]\n",
"[Epoch 121/200] [Batch 0/59] [D loss: 0.632202] [G loss: 0.932040]\n",
"[Epoch 121/200] [Batch 1/59] [D loss: 0.566623] [G loss: 0.906762]\n",
"[Epoch 121/200] [Batch 2/59] [D loss: 0.598892] [G loss: 0.921883]\n",
"[Epoch 121/200] [Batch 3/59] [D loss: 0.596144] [G loss: 0.899086]\n",
"[Epoch 121/200] [Batch 4/59] [D loss: 0.568005] [G loss: 1.017613]\n",
"[Epoch 121/200] [Batch 5/59] [D loss: 0.546977] [G loss: 0.734814]\n",
"[Epoch 121/200] [Batch 6/59] [D loss: 0.600318] [G loss: 1.192007]\n",
"[Epoch 121/200] [Batch 7/59] [D loss: 0.581162] [G loss: 0.623250]\n",
"[Epoch 121/200] [Batch 8/59] [D loss: 0.575519] [G loss: 1.074119]\n",
"[Epoch 121/200] [Batch 9/59] [D loss: 0.614471] [G loss: 0.933049]\n",
"[Epoch 121/200] [Batch 10/59] [D loss: 0.576102] [G loss: 0.844646]\n",
"[Epoch 121/200] [Batch 11/59] [D loss: 0.666328] [G loss: 0.776644]\n",
"[Epoch 121/200] [Batch 12/59] [D loss: 0.572312] [G loss: 1.135261]\n",
"[Epoch 121/200] [Batch 13/59] [D loss: 0.589740] [G loss: 0.734211]\n",
"[Epoch 121/200] [Batch 14/59] [D loss: 0.633655] [G loss: 1.002574]\n",
"[Epoch 121/200] [Batch 15/59] [D loss: 0.569143] [G loss: 0.761287]\n",
"[Epoch 121/200] [Batch 16/59] [D loss: 0.526276] [G loss: 0.986640]\n",
"[Epoch 121/200] [Batch 17/59] [D loss: 0.580259] [G loss: 0.862236]\n",
"[Epoch 121/200] [Batch 18/59] [D loss: 0.564147] [G loss: 0.917942]\n",
"[Epoch 121/200] [Batch 19/59] [D loss: 0.666094] [G loss: 1.096450]\n",
"[Epoch 121/200] [Batch 20/59] [D loss: 0.560888] [G loss: 0.953572]\n",
"[Epoch 121/200] [Batch 21/59] [D loss: 0.615847] [G loss: 0.928347]\n",
"[Epoch 121/200] [Batch 22/59] [D loss: 0.566209] [G loss: 1.001789]\n",
"[Epoch 121/200] [Batch 23/59] [D loss: 0.691911] [G loss: 0.582818]\n",
"[Epoch 121/200] [Batch 24/59] [D loss: 0.686817] [G loss: 1.345833]\n",
"[Epoch 121/200] [Batch 25/59] [D loss: 0.593789] [G loss: 0.928984]\n",
"[Epoch 121/200] [Batch 26/59] [D loss: 0.579763] [G loss: 0.797768]\n",
"[Epoch 121/200] [Batch 27/59] [D loss: 0.606557] [G loss: 1.168275]\n",
"[Epoch 121/200] [Batch 28/59] [D loss: 0.492669] [G loss: 0.798713]\n",
"[Epoch 121/200] [Batch 29/59] [D loss: 0.643792] [G loss: 0.677501]\n",
"[Epoch 121/200] [Batch 30/59] [D loss: 0.637607] [G loss: 1.594234]\n",
"[Epoch 121/200] [Batch 31/59] [D loss: 0.503455] [G loss: 0.756096]\n",
"[Epoch 121/200] [Batch 32/59] [D loss: 0.657951] [G loss: 0.451042]\n",
"[Epoch 121/200] [Batch 33/59] [D loss: 0.668575] [G loss: 1.622492]\n",
"[Epoch 121/200] [Batch 34/59] [D loss: 0.520507] [G loss: 0.970225]\n",
"[Epoch 121/200] [Batch 35/59] [D loss: 0.673231] [G loss: 0.590926]\n",
"[Epoch 121/200] [Batch 36/59] [D loss: 0.643558] [G loss: 1.393373]\n",
"[Epoch 121/200] [Batch 37/59] [D loss: 0.556327] [G loss: 0.943770]\n",
"[Epoch 121/200] [Batch 38/59] [D loss: 0.586747] [G loss: 0.652745]\n",
"[Epoch 121/200] [Batch 39/59] [D loss: 0.576310] [G loss: 0.943232]\n",
"[Epoch 121/200] [Batch 40/59] [D loss: 0.644016] [G loss: 1.002874]\n",
"[Epoch 121/200] [Batch 41/59] [D loss: 0.562244] [G loss: 0.801868]\n",
"[Epoch 121/200] [Batch 42/59] [D loss: 0.589191] [G loss: 1.089487]\n",
"[Epoch 121/200] [Batch 43/59] [D loss: 0.516095] [G loss: 0.936622]\n",
"[Epoch 121/200] [Batch 44/59] [D loss: 0.550453] [G loss: 0.997019]\n",
"[Epoch 121/200] [Batch 45/59] [D loss: 0.568719] [G loss: 0.724157]\n",
"[Epoch 121/200] [Batch 46/59] [D loss: 0.541277] [G loss: 1.097952]\n",
"[Epoch 121/200] [Batch 47/59] [D loss: 0.537077] [G loss: 0.939009]\n",
"[Epoch 121/200] [Batch 48/59] [D loss: 0.588824] [G loss: 0.744393]\n",
"[Epoch 121/200] [Batch 49/59] [D loss: 0.565285] [G loss: 0.824254]\n",
"[Epoch 121/200] [Batch 50/59] [D loss: 0.556372] [G loss: 1.023453]\n",
"[Epoch 121/200] [Batch 51/59] [D loss: 0.619393] [G loss: 0.973936]\n",
"[Epoch 121/200] [Batch 52/59] [D loss: 0.576318] [G loss: 0.846173]\n",
"[Epoch 121/200] [Batch 53/59] [D loss: 0.554459] [G loss: 0.903117]\n",
"[Epoch 121/200] [Batch 54/59] [D loss: 0.576631] [G loss: 0.959490]\n",
"[Epoch 121/200] [Batch 55/59] [D loss: 0.554901] [G loss: 0.780577]\n",
"[Epoch 121/200] [Batch 56/59] [D loss: 0.519742] [G loss: 0.988295]\n",
"[Epoch 121/200] [Batch 57/59] [D loss: 0.558281] [G loss: 1.133356]\n",
"[Epoch 121/200] [Batch 58/59] [D loss: 0.607626] [G loss: 0.863901]\n",
"[Epoch 122/200] [Batch 0/59] [D loss: 0.593824] [G loss: 0.944656]\n",
"[Epoch 122/200] [Batch 1/59] [D loss: 0.630950] [G loss: 0.958832]\n",
"[Epoch 122/200] [Batch 2/59] [D loss: 0.521647] [G loss: 0.837542]\n",
"[Epoch 122/200] [Batch 3/59] [D loss: 0.546775] [G loss: 0.821183]\n",
"[Epoch 122/200] [Batch 4/59] [D loss: 0.588105] [G loss: 0.930583]\n",
"[Epoch 122/200] [Batch 5/59] [D loss: 0.623363] [G loss: 0.892375]\n",
"[Epoch 122/200] [Batch 6/59] [D loss: 0.522292] [G loss: 1.079922]\n",
"[Epoch 122/200] [Batch 7/59] [D loss: 0.566380] [G loss: 0.919255]\n",
"[Epoch 122/200] [Batch 8/59] [D loss: 0.598155] [G loss: 0.760289]\n",
"[Epoch 122/200] [Batch 9/59] [D loss: 0.649314] [G loss: 0.956466]\n",
"[Epoch 122/200] [Batch 10/59] [D loss: 0.513633] [G loss: 0.914319]\n",
"[Epoch 122/200] [Batch 11/59] [D loss: 0.681394] [G loss: 0.779038]\n",
"[Epoch 122/200] [Batch 12/59] [D loss: 0.602515] [G loss: 1.090585]\n",
"[Epoch 122/200] [Batch 13/59] [D loss: 0.570512] [G loss: 0.835800]\n",
"[Epoch 122/200] [Batch 14/59] [D loss: 0.571639] [G loss: 0.899942]\n",
"[Epoch 122/200] [Batch 15/59] [D loss: 0.550497] [G loss: 0.879779]\n",
"[Epoch 122/200] [Batch 16/59] [D loss: 0.558221] [G loss: 0.863748]\n",
"[Epoch 122/200] [Batch 17/59] [D loss: 0.589975] [G loss: 0.911156]\n",
"[Epoch 122/200] [Batch 18/59] [D loss: 0.573120] [G loss: 0.994765]\n",
"[Epoch 122/200] [Batch 19/59] [D loss: 0.616053] [G loss: 0.806344]\n",
"[Epoch 122/200] [Batch 20/59] [D loss: 0.566078] [G loss: 1.006520]\n",
"[Epoch 122/200] [Batch 21/59] [D loss: 0.579589] [G loss: 1.108395]\n",
"[Epoch 122/200] [Batch 22/59] [D loss: 0.652171] [G loss: 0.913243]\n",
"[Epoch 122/200] [Batch 23/59] [D loss: 0.605027] [G loss: 1.065868]\n",
"[Epoch 122/200] [Batch 24/59] [D loss: 0.599382] [G loss: 0.893943]\n",
"[Epoch 122/200] [Batch 25/59] [D loss: 0.641278] [G loss: 0.670891]\n",
"[Epoch 122/200] [Batch 26/59] [D loss: 0.620954] [G loss: 0.815872]\n",
"[Epoch 122/200] [Batch 27/59] [D loss: 0.636180] [G loss: 1.180244]\n",
"[Epoch 122/200] [Batch 28/59] [D loss: 0.559193] [G loss: 0.658543]\n",
"[Epoch 122/200] [Batch 29/59] [D loss: 0.563612] [G loss: 0.698251]\n",
"[Epoch 122/200] [Batch 30/59] [D loss: 0.599396] [G loss: 1.281079]\n",
"[Epoch 122/200] [Batch 31/59] [D loss: 0.634977] [G loss: 0.906771]\n",
"[Epoch 122/200] [Batch 32/59] [D loss: 0.632198] [G loss: 0.634538]\n",
"[Epoch 122/200] [Batch 33/59] [D loss: 0.627574] [G loss: 1.140838]\n",
"[Epoch 122/200] [Batch 34/59] [D loss: 0.635340] [G loss: 0.547427]\n",
"[Epoch 122/200] [Batch 35/59] [D loss: 0.591582] [G loss: 0.945547]\n",
"[Epoch 122/200] [Batch 36/59] [D loss: 0.620793] [G loss: 1.275817]\n",
"[Epoch 122/200] [Batch 37/59] [D loss: 0.592726] [G loss: 0.802242]\n",
"[Epoch 122/200] [Batch 38/59] [D loss: 0.563817] [G loss: 1.006967]\n",
"[Epoch 122/200] [Batch 39/59] [D loss: 0.561517] [G loss: 0.982629]\n",
"[Epoch 122/200] [Batch 40/59] [D loss: 0.625344] [G loss: 0.880767]\n",
"[Epoch 122/200] [Batch 41/59] [D loss: 0.605137] [G loss: 1.133966]\n",
"[Epoch 122/200] [Batch 42/59] [D loss: 0.586104] [G loss: 1.055023]\n",
"[Epoch 122/200] [Batch 43/59] [D loss: 0.633858] [G loss: 0.898818]\n",
"[Epoch 122/200] [Batch 44/59] [D loss: 0.604344] [G loss: 0.901015]\n",
"[Epoch 122/200] [Batch 45/59] [D loss: 0.606345] [G loss: 1.088838]\n",
"[Epoch 122/200] [Batch 46/59] [D loss: 0.592549] [G loss: 0.713596]\n",
"[Epoch 122/200] [Batch 47/59] [D loss: 0.548319] [G loss: 0.865913]\n",
"[Epoch 122/200] [Batch 48/59] [D loss: 0.597797] [G loss: 0.770875]\n",
"[Epoch 122/200] [Batch 49/59] [D loss: 0.572642] [G loss: 0.834103]\n",
"[Epoch 122/200] [Batch 50/59] [D loss: 0.638866] [G loss: 0.726064]\n",
"[Epoch 122/200] [Batch 51/59] [D loss: 0.552975] [G loss: 1.025912]\n",
"[Epoch 122/200] [Batch 52/59] [D loss: 0.585197] [G loss: 0.749233]\n",
"[Epoch 122/200] [Batch 53/59] [D loss: 0.573321] [G loss: 0.755024]\n",
"[Epoch 122/200] [Batch 54/59] [D loss: 0.615927] [G loss: 0.956930]\n",
"[Epoch 122/200] [Batch 55/59] [D loss: 0.553319] [G loss: 1.036281]\n",
"[Epoch 122/200] [Batch 56/59] [D loss: 0.640105] [G loss: 0.778324]\n",
"[Epoch 122/200] [Batch 57/59] [D loss: 0.570043] [G loss: 0.816725]\n",
"[Epoch 122/200] [Batch 58/59] [D loss: 0.620944] [G loss: 1.082430]\n",
"[Epoch 123/200] [Batch 0/59] [D loss: 0.581821] [G loss: 0.767837]\n",
"[Epoch 123/200] [Batch 1/59] [D loss: 0.475023] [G loss: 0.937477]\n",
"[Epoch 123/200] [Batch 2/59] [D loss: 0.559540] [G loss: 0.997162]\n",
"[Epoch 123/200] [Batch 3/59] [D loss: 0.552495] [G loss: 0.794074]\n",
"[Epoch 123/200] [Batch 4/59] [D loss: 0.630328] [G loss: 0.924710]\n",
"[Epoch 123/200] [Batch 5/59] [D loss: 0.661260] [G loss: 0.913800]\n",
"[Epoch 123/200] [Batch 6/59] [D loss: 0.558404] [G loss: 0.948240]\n",
"[Epoch 123/200] [Batch 7/59] [D loss: 0.597113] [G loss: 0.903287]\n",
"[Epoch 123/200] [Batch 8/59] [D loss: 0.555692] [G loss: 0.675866]\n",
"[Epoch 123/200] [Batch 9/59] [D loss: 0.546040] [G loss: 1.034640]\n",
"[Epoch 123/200] [Batch 10/59] [D loss: 0.546028] [G loss: 0.830522]\n",
"[Epoch 123/200] [Batch 11/59] [D loss: 0.544989] [G loss: 0.959486]\n",
"[Epoch 123/200] [Batch 12/59] [D loss: 0.597027] [G loss: 1.028086]\n",
"[Epoch 123/200] [Batch 13/59] [D loss: 0.620410] [G loss: 1.020860]\n",
"[Epoch 123/200] [Batch 14/59] [D loss: 0.681857] [G loss: 1.017840]\n",
"[Epoch 123/200] [Batch 15/59] [D loss: 0.587762] [G loss: 0.842468]\n",
"[Epoch 123/200] [Batch 16/59] [D loss: 0.610328] [G loss: 0.772243]\n",
"[Epoch 123/200] [Batch 17/59] [D loss: 0.540163] [G loss: 0.846380]\n",
"[Epoch 123/200] [Batch 18/59] [D loss: 0.620653] [G loss: 0.964704]\n",
"[Epoch 123/200] [Batch 19/59] [D loss: 0.614053] [G loss: 0.992637]\n",
"[Epoch 123/200] [Batch 20/59] [D loss: 0.544773] [G loss: 0.832408]\n",
"[Epoch 123/200] [Batch 21/59] [D loss: 0.601684] [G loss: 0.760036]\n",
"[Epoch 123/200] [Batch 22/59] [D loss: 0.669472] [G loss: 1.382417]\n",
"[Epoch 123/200] [Batch 23/59] [D loss: 0.581440] [G loss: 0.752610]\n",
"[Epoch 123/200] [Batch 24/59] [D loss: 0.551013] [G loss: 0.676223]\n",
"[Epoch 123/200] [Batch 25/59] [D loss: 0.600461] [G loss: 1.220506]\n",
"[Epoch 123/200] [Batch 26/59] [D loss: 0.605678] [G loss: 0.697567]\n",
"[Epoch 123/200] [Batch 27/59] [D loss: 0.680075] [G loss: 0.744256]\n",
"[Epoch 123/200] [Batch 28/59] [D loss: 0.573021] [G loss: 1.376356]\n",
"[Epoch 123/200] [Batch 29/59] [D loss: 0.603569] [G loss: 0.736913]\n",
"[Epoch 123/200] [Batch 30/59] [D loss: 0.629892] [G loss: 1.081070]\n",
"[Epoch 123/200] [Batch 31/59] [D loss: 0.574754] [G loss: 0.897494]\n",
"[Epoch 123/200] [Batch 32/59] [D loss: 0.605742] [G loss: 0.785209]\n",
"[Epoch 123/200] [Batch 33/59] [D loss: 0.610483] [G loss: 1.123119]\n",
"[Epoch 123/200] [Batch 34/59] [D loss: 0.574358] [G loss: 1.048877]\n",
"[Epoch 123/200] [Batch 35/59] [D loss: 0.541874] [G loss: 0.885620]\n",
"[Epoch 123/200] [Batch 36/59] [D loss: 0.579349] [G loss: 0.931232]\n",
"[Epoch 123/200] [Batch 37/59] [D loss: 0.531791] [G loss: 0.903586]\n",
"[Epoch 123/200] [Batch 38/59] [D loss: 0.598003] [G loss: 0.754383]\n",
"[Epoch 123/200] [Batch 39/59] [D loss: 0.560011] [G loss: 1.004453]\n",
"[Epoch 123/200] [Batch 40/59] [D loss: 0.630770] [G loss: 0.959802]\n",
"[Epoch 123/200] [Batch 41/59] [D loss: 0.503148] [G loss: 1.131447]\n",
"[Epoch 123/200] [Batch 42/59] [D loss: 0.642183] [G loss: 0.815965]\n",
"[Epoch 123/200] [Batch 43/59] [D loss: 0.570532] [G loss: 0.998660]\n",
"[Epoch 123/200] [Batch 44/59] [D loss: 0.546911] [G loss: 0.971450]\n",
"[Epoch 123/200] [Batch 45/59] [D loss: 0.535292] [G loss: 0.870896]\n",
"[Epoch 123/200] [Batch 46/59] [D loss: 0.590091] [G loss: 0.875640]\n",
"[Epoch 123/200] [Batch 47/59] [D loss: 0.603750] [G loss: 0.881799]\n",
"[Epoch 123/200] [Batch 48/59] [D loss: 0.561727] [G loss: 0.608421]\n",
"[Epoch 123/200] [Batch 49/59] [D loss: 0.546110] [G loss: 0.877262]\n",
"[Epoch 123/200] [Batch 50/59] [D loss: 0.550329] [G loss: 0.814255]\n",
"[Epoch 123/200] [Batch 51/59] [D loss: 0.585994] [G loss: 1.017080]\n",
"[Epoch 123/200] [Batch 52/59] [D loss: 0.523743] [G loss: 0.757587]\n",
"[Epoch 123/200] [Batch 53/59] [D loss: 0.537546] [G loss: 0.978497]\n",
"[Epoch 123/200] [Batch 54/59] [D loss: 0.563531] [G loss: 0.959966]\n",
"[Epoch 123/200] [Batch 55/59] [D loss: 0.633342] [G loss: 0.620465]\n",
"[Epoch 123/200] [Batch 56/59] [D loss: 0.588916] [G loss: 0.994501]\n",
"[Epoch 123/200] [Batch 57/59] [D loss: 0.580760] [G loss: 1.275174]\n",
"[Epoch 123/200] [Batch 58/59] [D loss: 0.627337] [G loss: 0.678226]\n",
"[Epoch 124/200] [Batch 0/59] [D loss: 0.608270] [G loss: 1.307993]\n",
"[Epoch 124/200] [Batch 1/59] [D loss: 0.589376] [G loss: 1.139804]\n",
"[Epoch 124/200] [Batch 2/59] [D loss: 0.618239] [G loss: 0.600926]\n",
"[Epoch 124/200] [Batch 3/59] [D loss: 0.640757] [G loss: 1.368517]\n",
"[Epoch 124/200] [Batch 4/59] [D loss: 0.583518] [G loss: 1.006245]\n",
"[Epoch 124/200] [Batch 5/59] [D loss: 0.546249] [G loss: 0.680154]\n",
"[Epoch 124/200] [Batch 6/59] [D loss: 0.629143] [G loss: 1.119927]\n",
"[Epoch 124/200] [Batch 7/59] [D loss: 0.535466] [G loss: 0.770890]\n",
"[Epoch 124/200] [Batch 8/59] [D loss: 0.583306] [G loss: 0.826138]\n",
"[Epoch 124/200] [Batch 9/59] [D loss: 0.599141] [G loss: 1.054567]\n",
"[Epoch 124/200] [Batch 10/59] [D loss: 0.620590] [G loss: 0.784919]\n",
"[Epoch 124/200] [Batch 11/59] [D loss: 0.547462] [G loss: 0.888468]\n",
"[Epoch 124/200] [Batch 12/59] [D loss: 0.642065] [G loss: 0.861400]\n",
"[Epoch 124/200] [Batch 13/59] [D loss: 0.596235] [G loss: 0.834511]\n",
"[Epoch 124/200] [Batch 14/59] [D loss: 0.523655] [G loss: 0.815884]\n",
"[Epoch 124/200] [Batch 15/59] [D loss: 0.555109] [G loss: 0.696362]\n",
"[Epoch 124/200] [Batch 16/59] [D loss: 0.598991] [G loss: 1.070279]\n",
"[Epoch 124/200] [Batch 17/59] [D loss: 0.609909] [G loss: 1.051511]\n",
"[Epoch 124/200] [Batch 18/59] [D loss: 0.607231] [G loss: 0.610165]\n",
"[Epoch 124/200] [Batch 19/59] [D loss: 0.569188] [G loss: 1.070144]\n",
"[Epoch 124/200] [Batch 20/59] [D loss: 0.545321] [G loss: 0.848797]\n",
"[Epoch 124/200] [Batch 21/59] [D loss: 0.588459] [G loss: 0.776975]\n",
"[Epoch 124/200] [Batch 22/59] [D loss: 0.520655] [G loss: 0.909404]\n",
"[Epoch 124/200] [Batch 23/59] [D loss: 0.549556] [G loss: 0.868120]\n",
"[Epoch 124/200] [Batch 24/59] [D loss: 0.642154] [G loss: 0.580795]\n",
"[Epoch 124/200] [Batch 25/59] [D loss: 0.617315] [G loss: 1.483841]\n",
"[Epoch 124/200] [Batch 26/59] [D loss: 0.540072] [G loss: 0.885883]\n",
"[Epoch 124/200] [Batch 27/59] [D loss: 0.610935] [G loss: 0.658438]\n",
"[Epoch 124/200] [Batch 28/59] [D loss: 0.603587] [G loss: 1.159288]\n",
"[Epoch 124/200] [Batch 29/59] [D loss: 0.629659] [G loss: 0.842893]\n",
"[Epoch 124/200] [Batch 30/59] [D loss: 0.631364] [G loss: 0.723378]\n",
"[Epoch 124/200] [Batch 31/59] [D loss: 0.581509] [G loss: 1.004290]\n",
"[Epoch 124/200] [Batch 32/59] [D loss: 0.632020] [G loss: 1.358669]\n",
"[Epoch 124/200] [Batch 33/59] [D loss: 0.495407] [G loss: 0.568460]\n",
"[Epoch 124/200] [Batch 34/59] [D loss: 0.575716] [G loss: 0.649033]\n",
"[Epoch 124/200] [Batch 35/59] [D loss: 0.550246] [G loss: 1.299549]\n",
"[Epoch 124/200] [Batch 36/59] [D loss: 0.532920] [G loss: 0.940886]\n",
"[Epoch 124/200] [Batch 37/59] [D loss: 0.661203] [G loss: 0.600309]\n",
"[Epoch 124/200] [Batch 38/59] [D loss: 0.618559] [G loss: 1.210541]\n",
"[Epoch 124/200] [Batch 39/59] [D loss: 0.584987] [G loss: 0.885680]\n",
"[Epoch 124/200] [Batch 40/59] [D loss: 0.587220] [G loss: 0.792051]\n",
"[Epoch 124/200] [Batch 41/59] [D loss: 0.584704] [G loss: 0.945007]\n",
"[Epoch 124/200] [Batch 42/59] [D loss: 0.555353] [G loss: 0.747132]\n",
"[Epoch 124/200] [Batch 43/59] [D loss: 0.566811] [G loss: 0.947970]\n",
"[Epoch 124/200] [Batch 44/59] [D loss: 0.527375] [G loss: 0.970379]\n",
"[Epoch 124/200] [Batch 45/59] [D loss: 0.581477] [G loss: 0.807019]\n",
"[Epoch 124/200] [Batch 46/59] [D loss: 0.652084] [G loss: 0.969382]\n",
"[Epoch 124/200] [Batch 47/59] [D loss: 0.592585] [G loss: 0.969292]\n",
"[Epoch 124/200] [Batch 48/59] [D loss: 0.593175] [G loss: 0.939676]\n",
"[Epoch 124/200] [Batch 49/59] [D loss: 0.564007] [G loss: 0.786370]\n",
"[Epoch 124/200] [Batch 50/59] [D loss: 0.648454] [G loss: 0.701009]\n",
"[Epoch 124/200] [Batch 51/59] [D loss: 0.597914] [G loss: 1.508849]\n",
"[Epoch 124/200] [Batch 52/59] [D loss: 0.534930] [G loss: 0.962442]\n",
"[Epoch 124/200] [Batch 53/59] [D loss: 0.578841] [G loss: 0.755035]\n",
"[Epoch 124/200] [Batch 54/59] [D loss: 0.633419] [G loss: 1.121367]\n",
"[Epoch 124/200] [Batch 55/59] [D loss: 0.527617] [G loss: 0.964587]\n",
"[Epoch 124/200] [Batch 56/59] [D loss: 0.591123] [G loss: 0.912447]\n",
"[Epoch 124/200] [Batch 57/59] [D loss: 0.541805] [G loss: 0.846384]\n",
"[Epoch 124/200] [Batch 58/59] [D loss: 0.547635] [G loss: 0.918574]\n",
"[Epoch 125/200] [Batch 0/59] [D loss: 0.581487] [G loss: 0.990515]\n",
"[Epoch 125/200] [Batch 1/59] [D loss: 0.633213] [G loss: 1.180568]\n",
"[Epoch 125/200] [Batch 2/59] [D loss: 0.587003] [G loss: 0.642618]\n",
"[Epoch 125/200] [Batch 3/59] [D loss: 0.580994] [G loss: 1.045568]\n",
"[Epoch 125/200] [Batch 4/59] [D loss: 0.542360] [G loss: 1.134871]\n",
"[Epoch 125/200] [Batch 5/59] [D loss: 0.509215] [G loss: 0.647542]\n",
"[Epoch 125/200] [Batch 6/59] [D loss: 0.555166] [G loss: 0.913357]\n",
"[Epoch 125/200] [Batch 7/59] [D loss: 0.661732] [G loss: 1.387609]\n",
"[Epoch 125/200] [Batch 8/59] [D loss: 0.593381] [G loss: 0.894117]\n",
"[Epoch 125/200] [Batch 9/59] [D loss: 0.581688] [G loss: 0.593924]\n",
"[Epoch 125/200] [Batch 10/59] [D loss: 0.578074] [G loss: 1.112864]\n",
"[Epoch 125/200] [Batch 11/59] [D loss: 0.569562] [G loss: 0.886404]\n",
"[Epoch 125/200] [Batch 12/59] [D loss: 0.595336] [G loss: 0.783546]\n",
"[Epoch 125/200] [Batch 13/59] [D loss: 0.514436] [G loss: 1.132097]\n",
"[Epoch 125/200] [Batch 14/59] [D loss: 0.589027] [G loss: 0.932928]\n",
"[Epoch 125/200] [Batch 15/59] [D loss: 0.629676] [G loss: 0.753253]\n",
"[Epoch 125/200] [Batch 16/59] [D loss: 0.583248] [G loss: 1.122025]\n",
"[Epoch 125/200] [Batch 17/59] [D loss: 0.541028] [G loss: 0.919565]\n",
"[Epoch 125/200] [Batch 18/59] [D loss: 0.560513] [G loss: 0.867477]\n",
"[Epoch 125/200] [Batch 19/59] [D loss: 0.648507] [G loss: 0.891835]\n",
"[Epoch 125/200] [Batch 20/59] [D loss: 0.626285] [G loss: 0.944524]\n",
"[Epoch 125/200] [Batch 21/59] [D loss: 0.575612] [G loss: 0.853944]\n",
"[Epoch 125/200] [Batch 22/59] [D loss: 0.620611] [G loss: 0.883271]\n",
"[Epoch 125/200] [Batch 23/59] [D loss: 0.613757] [G loss: 0.904447]\n",
"[Epoch 125/200] [Batch 24/59] [D loss: 0.553012] [G loss: 0.891951]\n",
"[Epoch 125/200] [Batch 25/59] [D loss: 0.529887] [G loss: 0.882249]\n",
"[Epoch 125/200] [Batch 26/59] [D loss: 0.576209] [G loss: 0.810931]\n",
"[Epoch 125/200] [Batch 27/59] [D loss: 0.537878] [G loss: 1.091756]\n",
"[Epoch 125/200] [Batch 28/59] [D loss: 0.575735] [G loss: 0.878082]\n",
"[Epoch 125/200] [Batch 29/59] [D loss: 0.732167] [G loss: 0.763389]\n",
"[Epoch 125/200] [Batch 30/59] [D loss: 0.568865] [G loss: 0.945005]\n",
"[Epoch 125/200] [Batch 31/59] [D loss: 0.559864] [G loss: 0.917732]\n",
"[Epoch 125/200] [Batch 32/59] [D loss: 0.583768] [G loss: 0.768113]\n",
"[Epoch 125/200] [Batch 33/59] [D loss: 0.561815] [G loss: 0.856692]\n",
"[Epoch 125/200] [Batch 34/59] [D loss: 0.543913] [G loss: 0.975429]\n",
"[Epoch 125/200] [Batch 35/59] [D loss: 0.582656] [G loss: 0.928415]\n",
"[Epoch 125/200] [Batch 36/59] [D loss: 0.633697] [G loss: 0.879471]\n",
"[Epoch 125/200] [Batch 37/59] [D loss: 0.557969] [G loss: 1.308412]\n",
"[Epoch 125/200] [Batch 38/59] [D loss: 0.565639] [G loss: 0.805056]\n",
"[Epoch 125/200] [Batch 39/59] [D loss: 0.564774] [G loss: 0.807212]\n",
"[Epoch 125/200] [Batch 40/59] [D loss: 0.561485] [G loss: 1.004864]\n",
"[Epoch 125/200] [Batch 41/59] [D loss: 0.555087] [G loss: 0.976090]\n",
"[Epoch 125/200] [Batch 42/59] [D loss: 0.560276] [G loss: 0.790981]\n",
"[Epoch 125/200] [Batch 43/59] [D loss: 0.543397] [G loss: 1.069056]\n",
"[Epoch 125/200] [Batch 44/59] [D loss: 0.547222] [G loss: 0.976718]\n",
"[Epoch 125/200] [Batch 45/59] [D loss: 0.526412] [G loss: 0.708018]\n",
"[Epoch 125/200] [Batch 46/59] [D loss: 0.598153] [G loss: 0.972623]\n",
"[Epoch 125/200] [Batch 47/59] [D loss: 0.575207] [G loss: 1.053872]\n",
"[Epoch 125/200] [Batch 48/59] [D loss: 0.577361] [G loss: 1.057985]\n",
"[Epoch 125/200] [Batch 49/59] [D loss: 0.498156] [G loss: 0.659505]\n",
"[Epoch 125/200] [Batch 50/59] [D loss: 0.602920] [G loss: 1.013618]\n",
"[Epoch 125/200] [Batch 51/59] [D loss: 0.555774] [G loss: 1.256856]\n",
"[Epoch 125/200] [Batch 52/59] [D loss: 0.559565] [G loss: 0.970581]\n",
"[Epoch 125/200] [Batch 53/59] [D loss: 0.575240] [G loss: 0.708832]\n",
"[Epoch 125/200] [Batch 54/59] [D loss: 0.683369] [G loss: 1.246844]\n",
"[Epoch 125/200] [Batch 55/59] [D loss: 0.705908] [G loss: 0.668873]\n",
"[Epoch 125/200] [Batch 56/59] [D loss: 0.519481] [G loss: 0.871758]\n",
"[Epoch 125/200] [Batch 57/59] [D loss: 0.535304] [G loss: 1.089080]\n",
"[Epoch 125/200] [Batch 58/59] [D loss: 0.549544] [G loss: 0.934196]\n",
"[Epoch 126/200] [Batch 0/59] [D loss: 0.596378] [G loss: 0.789544]\n",
"[Epoch 126/200] [Batch 1/59] [D loss: 0.542257] [G loss: 0.794848]\n",
"[Epoch 126/200] [Batch 2/59] [D loss: 0.567723] [G loss: 1.328443]\n",
"[Epoch 126/200] [Batch 3/59] [D loss: 0.602949] [G loss: 0.789451]\n",
"[Epoch 126/200] [Batch 4/59] [D loss: 0.629304] [G loss: 0.732248]\n",
"[Epoch 126/200] [Batch 5/59] [D loss: 0.611379] [G loss: 1.491189]\n",
"[Epoch 126/200] [Batch 6/59] [D loss: 0.482044] [G loss: 0.825931]\n",
"[Epoch 126/200] [Batch 7/59] [D loss: 0.620628] [G loss: 0.646385]\n",
"[Epoch 126/200] [Batch 8/59] [D loss: 0.669230] [G loss: 1.489647]\n",
"[Epoch 126/200] [Batch 9/59] [D loss: 0.583751] [G loss: 0.796435]\n",
"[Epoch 126/200] [Batch 10/59] [D loss: 0.543802] [G loss: 0.721963]\n",
"[Epoch 126/200] [Batch 11/59] [D loss: 0.592413] [G loss: 1.100576]\n",
"[Epoch 126/200] [Batch 12/59] [D loss: 0.528696] [G loss: 0.846674]\n",
"[Epoch 126/200] [Batch 13/59] [D loss: 0.660616] [G loss: 0.752309]\n",
"[Epoch 126/200] [Batch 14/59] [D loss: 0.534250] [G loss: 1.156988]\n",
"[Epoch 126/200] [Batch 15/59] [D loss: 0.649717] [G loss: 0.960921]\n",
"[Epoch 126/200] [Batch 16/59] [D loss: 0.624271] [G loss: 0.895767]\n",
"[Epoch 126/200] [Batch 17/59] [D loss: 0.589592] [G loss: 1.209976]\n",
"[Epoch 126/200] [Batch 18/59] [D loss: 0.648543] [G loss: 0.832222]\n",
"[Epoch 126/200] [Batch 19/59] [D loss: 0.575030] [G loss: 0.664948]\n",
"[Epoch 126/200] [Batch 20/59] [D loss: 0.617933] [G loss: 1.213467]\n",
"[Epoch 126/200] [Batch 21/59] [D loss: 0.531456] [G loss: 0.978790]\n",
"[Epoch 126/200] [Batch 22/59] [D loss: 0.556224] [G loss: 0.776310]\n",
"[Epoch 126/200] [Batch 23/59] [D loss: 0.578604] [G loss: 0.823395]\n",
"[Epoch 126/200] [Batch 24/59] [D loss: 0.595515] [G loss: 1.100605]\n",
"[Epoch 126/200] [Batch 25/59] [D loss: 0.614550] [G loss: 1.112275]\n",
"[Epoch 126/200] [Batch 26/59] [D loss: 0.576585] [G loss: 1.022768]\n",
"[Epoch 126/200] [Batch 27/59] [D loss: 0.580242] [G loss: 0.927431]\n",
"[Epoch 126/200] [Batch 28/59] [D loss: 0.636496] [G loss: 0.704941]\n",
"[Epoch 126/200] [Batch 29/59] [D loss: 0.603913] [G loss: 1.292747]\n",
"[Epoch 126/200] [Batch 30/59] [D loss: 0.581879] [G loss: 0.764199]\n",
"[Epoch 126/200] [Batch 31/59] [D loss: 0.630571] [G loss: 0.824451]\n",
"[Epoch 126/200] [Batch 32/59] [D loss: 0.559358] [G loss: 1.023466]\n",
"[Epoch 126/200] [Batch 33/59] [D loss: 0.531298] [G loss: 0.956648]\n",
"[Epoch 126/200] [Batch 34/59] [D loss: 0.519950] [G loss: 0.901952]\n",
"[Epoch 126/200] [Batch 35/59] [D loss: 0.563121] [G loss: 1.091414]\n",
"[Epoch 126/200] [Batch 36/59] [D loss: 0.597144] [G loss: 0.864090]\n",
"[Epoch 126/200] [Batch 37/59] [D loss: 0.563371] [G loss: 0.817318]\n",
"[Epoch 126/200] [Batch 38/59] [D loss: 0.562708] [G loss: 1.041319]\n",
"[Epoch 126/200] [Batch 39/59] [D loss: 0.559469] [G loss: 0.937857]\n",
"[Epoch 126/200] [Batch 40/59] [D loss: 0.617542] [G loss: 0.967353]\n",
"[Epoch 126/200] [Batch 41/59] [D loss: 0.576716] [G loss: 1.027776]\n",
"[Epoch 126/200] [Batch 42/59] [D loss: 0.515257] [G loss: 1.255890]\n",
"[Epoch 126/200] [Batch 43/59] [D loss: 0.565994] [G loss: 0.870235]\n",
"[Epoch 126/200] [Batch 44/59] [D loss: 0.570117] [G loss: 0.941382]\n",
"[Epoch 126/200] [Batch 45/59] [D loss: 0.555714] [G loss: 1.026278]\n",
"[Epoch 126/200] [Batch 46/59] [D loss: 0.559878] [G loss: 0.733398]\n",
"[Epoch 126/200] [Batch 47/59] [D loss: 0.582335] [G loss: 1.206261]\n",
"[Epoch 126/200] [Batch 48/59] [D loss: 0.562255] [G loss: 1.154602]\n",
"[Epoch 126/200] [Batch 49/59] [D loss: 0.591991] [G loss: 0.795454]\n",
"[Epoch 126/200] [Batch 50/59] [D loss: 0.564743] [G loss: 0.871164]\n",
"[Epoch 126/200] [Batch 51/59] [D loss: 0.637245] [G loss: 1.159328]\n",
"[Epoch 126/200] [Batch 52/59] [D loss: 0.662504] [G loss: 0.738294]\n",
"[Epoch 126/200] [Batch 53/59] [D loss: 0.572448] [G loss: 0.752229]\n",
"[Epoch 126/200] [Batch 54/59] [D loss: 0.619612] [G loss: 1.365871]\n",
"[Epoch 126/200] [Batch 55/59] [D loss: 0.569620] [G loss: 0.963038]\n",
"[Epoch 126/200] [Batch 56/59] [D loss: 0.642678] [G loss: 0.645153]\n",
"[Epoch 126/200] [Batch 57/59] [D loss: 0.522590] [G loss: 0.997351]\n",
"[Epoch 126/200] [Batch 58/59] [D loss: 0.623583] [G loss: 0.999375]\n",
"[Epoch 127/200] [Batch 0/59] [D loss: 0.565831] [G loss: 0.770398]\n",
"[Epoch 127/200] [Batch 1/59] [D loss: 0.578829] [G loss: 0.715538]\n",
"[Epoch 127/200] [Batch 2/59] [D loss: 0.593439] [G loss: 1.160588]\n",
"[Epoch 127/200] [Batch 3/59] [D loss: 0.556155] [G loss: 1.026998]\n",
"[Epoch 127/200] [Batch 4/59] [D loss: 0.581214] [G loss: 0.908168]\n",
"[Epoch 127/200] [Batch 5/59] [D loss: 0.560167] [G loss: 1.168517]\n",
"[Epoch 127/200] [Batch 6/59] [D loss: 0.509559] [G loss: 0.834165]\n",
"[Epoch 127/200] [Batch 7/59] [D loss: 0.600115] [G loss: 0.716037]\n",
"[Epoch 127/200] [Batch 8/59] [D loss: 0.575039] [G loss: 1.021208]\n",
"[Epoch 127/200] [Batch 9/59] [D loss: 0.576577] [G loss: 0.783441]\n",
"[Epoch 127/200] [Batch 10/59] [D loss: 0.595291] [G loss: 0.843630]\n",
"[Epoch 127/200] [Batch 11/59] [D loss: 0.598988] [G loss: 1.159931]\n",
"[Epoch 127/200] [Batch 12/59] [D loss: 0.544245] [G loss: 0.987164]\n",
"[Epoch 127/200] [Batch 13/59] [D loss: 0.570705] [G loss: 0.836763]\n",
"[Epoch 127/200] [Batch 14/59] [D loss: 0.624237] [G loss: 1.148693]\n",
"[Epoch 127/200] [Batch 15/59] [D loss: 0.574403] [G loss: 0.830520]\n",
"[Epoch 127/200] [Batch 16/59] [D loss: 0.553156] [G loss: 0.726562]\n",
"[Epoch 127/200] [Batch 17/59] [D loss: 0.567139] [G loss: 0.739183]\n",
"[Epoch 127/200] [Batch 18/59] [D loss: 0.580377] [G loss: 1.011904]\n",
"[Epoch 127/200] [Batch 19/59] [D loss: 0.612812] [G loss: 0.997715]\n",
"[Epoch 127/200] [Batch 20/59] [D loss: 0.583757] [G loss: 0.703446]\n",
"[Epoch 127/200] [Batch 21/59] [D loss: 0.575538] [G loss: 1.007696]\n",
"[Epoch 127/200] [Batch 22/59] [D loss: 0.587347] [G loss: 0.881485]\n",
"[Epoch 127/200] [Batch 23/59] [D loss: 0.564892] [G loss: 1.177241]\n",
"[Epoch 127/200] [Batch 24/59] [D loss: 0.527136] [G loss: 0.795879]\n",
"[Epoch 127/200] [Batch 25/59] [D loss: 0.512399] [G loss: 0.918707]\n",
"[Epoch 127/200] [Batch 26/59] [D loss: 0.537123] [G loss: 1.180714]\n",
"[Epoch 127/200] [Batch 27/59] [D loss: 0.532538] [G loss: 0.734958]\n",
"[Epoch 127/200] [Batch 28/59] [D loss: 0.601220] [G loss: 0.666770]\n",
"[Epoch 127/200] [Batch 29/59] [D loss: 0.599420] [G loss: 1.299061]\n",
"[Epoch 127/200] [Batch 30/59] [D loss: 0.506358] [G loss: 0.948112]\n",
"[Epoch 127/200] [Batch 31/59] [D loss: 0.635345] [G loss: 0.657534]\n",
"[Epoch 127/200] [Batch 32/59] [D loss: 0.611033] [G loss: 1.260250]\n",
"[Epoch 127/200] [Batch 33/59] [D loss: 0.617307] [G loss: 1.039391]\n",
"[Epoch 127/200] [Batch 34/59] [D loss: 0.658418] [G loss: 0.593087]\n",
"[Epoch 127/200] [Batch 35/59] [D loss: 0.623586] [G loss: 0.998999]\n",
"[Epoch 127/200] [Batch 36/59] [D loss: 0.597790] [G loss: 0.776136]\n",
"[Epoch 127/200] [Batch 37/59] [D loss: 0.567091] [G loss: 0.818562]\n",
"[Epoch 127/200] [Batch 38/59] [D loss: 0.572727] [G loss: 0.779141]\n",
"[Epoch 127/200] [Batch 39/59] [D loss: 0.645108] [G loss: 1.216725]\n",
"[Epoch 127/200] [Batch 40/59] [D loss: 0.639946] [G loss: 0.679009]\n",
"[Epoch 127/200] [Batch 41/59] [D loss: 0.577164] [G loss: 0.903740]\n",
"[Epoch 127/200] [Batch 42/59] [D loss: 0.578596] [G loss: 1.173132]\n",
"[Epoch 127/200] [Batch 43/59] [D loss: 0.552733] [G loss: 0.723029]\n",
"[Epoch 127/200] [Batch 44/59] [D loss: 0.525234] [G loss: 1.044951]\n",
"[Epoch 127/200] [Batch 45/59] [D loss: 0.561905] [G loss: 0.898809]\n",
"[Epoch 127/200] [Batch 46/59] [D loss: 0.653466] [G loss: 0.558923]\n",
"[Epoch 127/200] [Batch 47/59] [D loss: 0.687548] [G loss: 1.072158]\n",
"[Epoch 127/200] [Batch 48/59] [D loss: 0.558689] [G loss: 1.039646]\n",
"[Epoch 127/200] [Batch 49/59] [D loss: 0.593683] [G loss: 0.691011]\n",
"[Epoch 127/200] [Batch 50/59] [D loss: 0.614365] [G loss: 1.058477]\n",
"[Epoch 127/200] [Batch 51/59] [D loss: 0.570963] [G loss: 1.040607]\n",
"[Epoch 127/200] [Batch 52/59] [D loss: 0.548859] [G loss: 1.090167]\n",
"[Epoch 127/200] [Batch 53/59] [D loss: 0.627262] [G loss: 0.973831]\n",
"[Epoch 127/200] [Batch 54/59] [D loss: 0.629332] [G loss: 0.662116]\n",
"[Epoch 127/200] [Batch 55/59] [D loss: 0.535364] [G loss: 1.116208]\n",
"[Epoch 127/200] [Batch 56/59] [D loss: 0.577443] [G loss: 0.934725]\n",
"[Epoch 127/200] [Batch 57/59] [D loss: 0.614522] [G loss: 0.873795]\n",
"[Epoch 127/200] [Batch 58/59] [D loss: 0.653193] [G loss: 0.925974]\n",
"[Epoch 128/200] [Batch 0/59] [D loss: 0.556311] [G loss: 0.746673]\n",
"[Epoch 128/200] [Batch 1/59] [D loss: 0.618916] [G loss: 0.673637]\n",
"[Epoch 128/200] [Batch 2/59] [D loss: 0.602772] [G loss: 1.258059]\n",
"[Epoch 128/200] [Batch 3/59] [D loss: 0.490521] [G loss: 0.949012]\n",
"[Epoch 128/200] [Batch 4/59] [D loss: 0.532491] [G loss: 0.742279]\n",
"[Epoch 128/200] [Batch 5/59] [D loss: 0.602136] [G loss: 0.889314]\n",
"[Epoch 128/200] [Batch 6/59] [D loss: 0.551043] [G loss: 1.321568]\n",
"[Epoch 128/200] [Batch 7/59] [D loss: 0.578253] [G loss: 0.817190]\n",
"[Epoch 128/200] [Batch 8/59] [D loss: 0.595684] [G loss: 0.754953]\n",
"[Epoch 128/200] [Batch 9/59] [D loss: 0.610287] [G loss: 1.186391]\n",
"[Epoch 128/200] [Batch 10/59] [D loss: 0.546606] [G loss: 0.895969]\n",
"[Epoch 128/200] [Batch 11/59] [D loss: 0.609744] [G loss: 0.655477]\n",
"[Epoch 128/200] [Batch 12/59] [D loss: 0.579276] [G loss: 1.264752]\n",
"[Epoch 128/200] [Batch 13/59] [D loss: 0.544286] [G loss: 0.626475]\n",
"[Epoch 128/200] [Batch 14/59] [D loss: 0.578018] [G loss: 0.963706]\n",
"[Epoch 128/200] [Batch 15/59] [D loss: 0.611911] [G loss: 1.224401]\n",
"[Epoch 128/200] [Batch 16/59] [D loss: 0.588248] [G loss: 0.662858]\n",
"[Epoch 128/200] [Batch 17/59] [D loss: 0.603864] [G loss: 1.062153]\n",
"[Epoch 128/200] [Batch 18/59] [D loss: 0.642172] [G loss: 0.832292]\n",
"[Epoch 128/200] [Batch 19/59] [D loss: 0.586635] [G loss: 1.118703]\n",
"[Epoch 128/200] [Batch 20/59] [D loss: 0.569170] [G loss: 0.862147]\n",
"[Epoch 128/200] [Batch 21/59] [D loss: 0.552539] [G loss: 0.925405]\n",
"[Epoch 128/200] [Batch 22/59] [D loss: 0.612255] [G loss: 0.975961]\n",
"[Epoch 128/200] [Batch 23/59] [D loss: 0.586582] [G loss: 0.850191]\n",
"[Epoch 128/200] [Batch 24/59] [D loss: 0.507487] [G loss: 1.012504]\n",
"[Epoch 128/200] [Batch 25/59] [D loss: 0.610459] [G loss: 1.048185]\n",
"[Epoch 128/200] [Batch 26/59] [D loss: 0.613504] [G loss: 1.077453]\n",
"[Epoch 128/200] [Batch 27/59] [D loss: 0.587870] [G loss: 0.816285]\n",
"[Epoch 128/200] [Batch 28/59] [D loss: 0.575097] [G loss: 0.848637]\n",
"[Epoch 128/200] [Batch 29/59] [D loss: 0.530550] [G loss: 0.968925]\n",
"[Epoch 128/200] [Batch 30/59] [D loss: 0.576359] [G loss: 0.886600]\n",
"[Epoch 128/200] [Batch 31/59] [D loss: 0.568190] [G loss: 1.048024]\n",
"[Epoch 128/200] [Batch 32/59] [D loss: 0.630381] [G loss: 0.703709]\n",
"[Epoch 128/200] [Batch 33/59] [D loss: 0.640652] [G loss: 0.959497]\n",
"[Epoch 128/200] [Batch 34/59] [D loss: 0.582907] [G loss: 0.808108]\n",
"[Epoch 128/200] [Batch 35/59] [D loss: 0.575075] [G loss: 0.834547]\n",
"[Epoch 128/200] [Batch 36/59] [D loss: 0.619417] [G loss: 0.903585]\n",
"[Epoch 128/200] [Batch 37/59] [D loss: 0.586621] [G loss: 0.756486]\n",
"[Epoch 128/200] [Batch 38/59] [D loss: 0.546393] [G loss: 1.047620]\n",
"[Epoch 128/200] [Batch 39/59] [D loss: 0.562203] [G loss: 1.011693]\n",
"[Epoch 128/200] [Batch 40/59] [D loss: 0.526893] [G loss: 0.886884]\n",
"[Epoch 128/200] [Batch 41/59] [D loss: 0.613834] [G loss: 1.097099]\n",
"[Epoch 128/200] [Batch 42/59] [D loss: 0.621887] [G loss: 0.936780]\n",
"[Epoch 128/200] [Batch 43/59] [D loss: 0.599373] [G loss: 0.875246]\n",
"[Epoch 128/200] [Batch 44/59] [D loss: 0.600579] [G loss: 1.181279]\n",
"[Epoch 128/200] [Batch 45/59] [D loss: 0.563672] [G loss: 0.815231]\n",
"[Epoch 128/200] [Batch 46/59] [D loss: 0.573833] [G loss: 0.682279]\n",
"[Epoch 128/200] [Batch 47/59] [D loss: 0.684108] [G loss: 1.224346]\n",
"[Epoch 128/200] [Batch 48/59] [D loss: 0.620784] [G loss: 0.867472]\n",
"[Epoch 128/200] [Batch 49/59] [D loss: 0.601515] [G loss: 0.960927]\n",
"[Epoch 128/200] [Batch 50/59] [D loss: 0.575487] [G loss: 1.206269]\n",
"[Epoch 128/200] [Batch 51/59] [D loss: 0.583577] [G loss: 0.793691]\n",
"[Epoch 128/200] [Batch 52/59] [D loss: 0.585382] [G loss: 0.762554]\n",
"[Epoch 128/200] [Batch 53/59] [D loss: 0.633568] [G loss: 1.273597]\n",
"[Epoch 128/200] [Batch 54/59] [D loss: 0.653683] [G loss: 0.934896]\n",
"[Epoch 128/200] [Batch 55/59] [D loss: 0.557348] [G loss: 0.680392]\n",
"[Epoch 128/200] [Batch 56/59] [D loss: 0.590317] [G loss: 1.222491]\n",
"[Epoch 128/200] [Batch 57/59] [D loss: 0.624599] [G loss: 1.083014]\n",
"[Epoch 128/200] [Batch 58/59] [D loss: 0.583093] [G loss: 0.591017]\n",
"[Epoch 129/200] [Batch 0/59] [D loss: 0.559262] [G loss: 1.027130]\n",
"[Epoch 129/200] [Batch 1/59] [D loss: 0.585116] [G loss: 1.119258]\n",
"[Epoch 129/200] [Batch 2/59] [D loss: 0.584683] [G loss: 0.777924]\n",
"[Epoch 129/200] [Batch 3/59] [D loss: 0.570889] [G loss: 1.016744]\n",
"[Epoch 129/200] [Batch 4/59] [D loss: 0.593912] [G loss: 1.040239]\n",
"[Epoch 129/200] [Batch 5/59] [D loss: 0.614885] [G loss: 0.959807]\n",
"[Epoch 129/200] [Batch 6/59] [D loss: 0.602387] [G loss: 0.959950]\n",
"[Epoch 129/200] [Batch 7/59] [D loss: 0.539263] [G loss: 1.048950]\n",
"[Epoch 129/200] [Batch 8/59] [D loss: 0.569928] [G loss: 0.797902]\n",
"[Epoch 129/200] [Batch 9/59] [D loss: 0.591477] [G loss: 0.865325]\n",
"[Epoch 129/200] [Batch 10/59] [D loss: 0.519903] [G loss: 1.026273]\n",
"[Epoch 129/200] [Batch 11/59] [D loss: 0.576650] [G loss: 0.871192]\n",
"[Epoch 129/200] [Batch 12/59] [D loss: 0.559645] [G loss: 0.868215]\n",
"[Epoch 129/200] [Batch 13/59] [D loss: 0.569475] [G loss: 0.982211]\n",
"[Epoch 129/200] [Batch 14/59] [D loss: 0.576294] [G loss: 0.839238]\n",
"[Epoch 129/200] [Batch 15/59] [D loss: 0.575404] [G loss: 1.223442]\n",
"[Epoch 129/200] [Batch 16/59] [D loss: 0.562975] [G loss: 0.742678]\n",
"[Epoch 129/200] [Batch 17/59] [D loss: 0.555773] [G loss: 0.881967]\n",
"[Epoch 129/200] [Batch 18/59] [D loss: 0.558766] [G loss: 1.262535]\n",
"[Epoch 129/200] [Batch 19/59] [D loss: 0.530169] [G loss: 1.129955]\n",
"[Epoch 129/200] [Batch 20/59] [D loss: 0.626962] [G loss: 0.488675]\n",
"[Epoch 129/200] [Batch 21/59] [D loss: 0.569319] [G loss: 1.243421]\n",
"[Epoch 129/200] [Batch 22/59] [D loss: 0.568850] [G loss: 0.960196]\n",
"[Epoch 129/200] [Batch 23/59] [D loss: 0.564951] [G loss: 0.514284]\n",
"[Epoch 129/200] [Batch 24/59] [D loss: 0.644141] [G loss: 1.289282]\n",
"[Epoch 129/200] [Batch 25/59] [D loss: 0.554131] [G loss: 0.783786]\n",
"[Epoch 129/200] [Batch 26/59] [D loss: 0.602304] [G loss: 0.807084]\n",
"[Epoch 129/200] [Batch 27/59] [D loss: 0.506388] [G loss: 1.278993]\n",
"[Epoch 129/200] [Batch 28/59] [D loss: 0.573334] [G loss: 0.925153]\n",
"[Epoch 129/200] [Batch 29/59] [D loss: 0.611915] [G loss: 0.655623]\n",
"[Epoch 129/200] [Batch 30/59] [D loss: 0.554483] [G loss: 0.986566]\n",
"[Epoch 129/200] [Batch 31/59] [D loss: 0.538499] [G loss: 0.888827]\n",
"[Epoch 129/200] [Batch 32/59] [D loss: 0.547422] [G loss: 0.952147]\n",
"[Epoch 129/200] [Batch 33/59] [D loss: 0.551236] [G loss: 0.900202]\n",
"[Epoch 129/200] [Batch 34/59] [D loss: 0.584566] [G loss: 1.073375]\n",
"[Epoch 129/200] [Batch 35/59] [D loss: 0.588417] [G loss: 0.935552]\n",
"[Epoch 129/200] [Batch 36/59] [D loss: 0.590734] [G loss: 0.797729]\n",
"[Epoch 129/200] [Batch 37/59] [D loss: 0.554934] [G loss: 0.734763]\n",
"[Epoch 129/200] [Batch 38/59] [D loss: 0.648643] [G loss: 0.891623]\n",
"[Epoch 129/200] [Batch 39/59] [D loss: 0.621637] [G loss: 1.145677]\n",
"[Epoch 129/200] [Batch 40/59] [D loss: 0.580420] [G loss: 0.709415]\n",
"[Epoch 129/200] [Batch 41/59] [D loss: 0.549432] [G loss: 0.871889]\n",
"[Epoch 129/200] [Batch 42/59] [D loss: 0.618609] [G loss: 1.047340]\n",
"[Epoch 129/200] [Batch 43/59] [D loss: 0.512060] [G loss: 0.755568]\n",
"[Epoch 129/200] [Batch 44/59] [D loss: 0.512417] [G loss: 1.071028]\n",
"[Epoch 129/200] [Batch 45/59] [D loss: 0.563202] [G loss: 1.181258]\n",
"[Epoch 129/200] [Batch 46/59] [D loss: 0.628971] [G loss: 0.871267]\n",
"[Epoch 129/200] [Batch 47/59] [D loss: 0.564235] [G loss: 1.265176]\n",
"[Epoch 129/200] [Batch 48/59] [D loss: 0.592104] [G loss: 0.918188]\n",
"[Epoch 129/200] [Batch 49/59] [D loss: 0.679511] [G loss: 0.733941]\n",
"[Epoch 129/200] [Batch 50/59] [D loss: 0.563409] [G loss: 0.974603]\n",
"[Epoch 129/200] [Batch 51/59] [D loss: 0.597611] [G loss: 0.837829]\n",
"[Epoch 129/200] [Batch 52/59] [D loss: 0.601577] [G loss: 0.658776]\n",
"[Epoch 129/200] [Batch 53/59] [D loss: 0.639651] [G loss: 1.296105]\n",
"[Epoch 129/200] [Batch 54/59] [D loss: 0.566455] [G loss: 0.844684]\n",
"[Epoch 129/200] [Batch 55/59] [D loss: 0.586933] [G loss: 0.991764]\n",
"[Epoch 129/200] [Batch 56/59] [D loss: 0.575791] [G loss: 0.924239]\n",
"[Epoch 129/200] [Batch 57/59] [D loss: 0.583301] [G loss: 0.972408]\n",
"[Epoch 129/200] [Batch 58/59] [D loss: 0.562886] [G loss: 1.054331]\n",
"[Epoch 130/200] [Batch 0/59] [D loss: 0.643859] [G loss: 1.260525]\n",
"[Epoch 130/200] [Batch 1/59] [D loss: 0.658176] [G loss: 0.508372]\n",
"[Epoch 130/200] [Batch 2/59] [D loss: 0.593556] [G loss: 1.134489]\n",
"[Epoch 130/200] [Batch 3/59] [D loss: 0.599283] [G loss: 1.148710]\n",
"[Epoch 130/200] [Batch 4/59] [D loss: 0.627750] [G loss: 0.564031]\n",
"[Epoch 130/200] [Batch 5/59] [D loss: 0.521789] [G loss: 1.135496]\n",
"[Epoch 130/200] [Batch 6/59] [D loss: 0.571968] [G loss: 1.262105]\n",
"[Epoch 130/200] [Batch 7/59] [D loss: 0.560501] [G loss: 0.560074]\n",
"[Epoch 130/200] [Batch 8/59] [D loss: 0.658681] [G loss: 0.838860]\n",
"[Epoch 130/200] [Batch 9/59] [D loss: 0.709876] [G loss: 1.498987]\n",
"[Epoch 130/200] [Batch 10/59] [D loss: 0.564629] [G loss: 0.772907]\n",
"[Epoch 130/200] [Batch 11/59] [D loss: 0.610559] [G loss: 0.554913]\n",
"[Epoch 130/200] [Batch 12/59] [D loss: 0.652136] [G loss: 1.429653]\n",
"[Epoch 130/200] [Batch 13/59] [D loss: 0.567079] [G loss: 0.917497]\n",
"[Epoch 130/200] [Batch 14/59] [D loss: 0.590538] [G loss: 0.864238]\n",
"[Epoch 130/200] [Batch 15/59] [D loss: 0.606478] [G loss: 1.115624]\n",
"[Epoch 130/200] [Batch 16/59] [D loss: 0.560245] [G loss: 1.100520]\n",
"[Epoch 130/200] [Batch 17/59] [D loss: 0.566599] [G loss: 0.717611]\n",
"[Epoch 130/200] [Batch 18/59] [D loss: 0.525412] [G loss: 1.019341]\n",
"[Epoch 130/200] [Batch 19/59] [D loss: 0.561778] [G loss: 1.036425]\n",
"[Epoch 130/200] [Batch 20/59] [D loss: 0.536946] [G loss: 0.970947]\n",
"[Epoch 130/200] [Batch 21/59] [D loss: 0.601788] [G loss: 1.049886]\n",
"[Epoch 130/200] [Batch 22/59] [D loss: 0.560561] [G loss: 0.914483]\n",
"[Epoch 130/200] [Batch 23/59] [D loss: 0.627742] [G loss: 1.053621]\n",
"[Epoch 130/200] [Batch 24/59] [D loss: 0.591258] [G loss: 0.650423]\n",
"[Epoch 130/200] [Batch 25/59] [D loss: 0.530636] [G loss: 1.172942]\n",
"[Epoch 130/200] [Batch 26/59] [D loss: 0.617689] [G loss: 1.041661]\n",
"[Epoch 130/200] [Batch 27/59] [D loss: 0.577277] [G loss: 1.007109]\n",
"[Epoch 130/200] [Batch 28/59] [D loss: 0.598857] [G loss: 1.129072]\n",
"[Epoch 130/200] [Batch 29/59] [D loss: 0.576723] [G loss: 0.947748]\n",
"[Epoch 130/200] [Batch 30/59] [D loss: 0.599641] [G loss: 0.837678]\n",
"[Epoch 130/200] [Batch 31/59] [D loss: 0.585281] [G loss: 0.783984]\n",
"[Epoch 130/200] [Batch 32/59] [D loss: 0.644914] [G loss: 1.282605]\n",
"[Epoch 130/200] [Batch 33/59] [D loss: 0.618034] [G loss: 0.847683]\n",
"[Epoch 130/200] [Batch 34/59] [D loss: 0.603428] [G loss: 0.633637]\n",
"[Epoch 130/200] [Batch 35/59] [D loss: 0.519342] [G loss: 1.125677]\n",
"[Epoch 130/200] [Batch 36/59] [D loss: 0.489266] [G loss: 0.789300]\n",
"[Epoch 130/200] [Batch 37/59] [D loss: 0.573251] [G loss: 0.778226]\n",
"[Epoch 130/200] [Batch 38/59] [D loss: 0.574852] [G loss: 0.906637]\n",
"[Epoch 130/200] [Batch 39/59] [D loss: 0.541386] [G loss: 1.177854]\n",
"[Epoch 130/200] [Batch 40/59] [D loss: 0.509836] [G loss: 0.910263]\n",
"[Epoch 130/200] [Batch 41/59] [D loss: 0.575872] [G loss: 0.706650]\n",
"[Epoch 130/200] [Batch 42/59] [D loss: 0.541647] [G loss: 1.000054]\n",
"[Epoch 130/200] [Batch 43/59] [D loss: 0.586796] [G loss: 0.994643]\n",
"[Epoch 130/200] [Batch 44/59] [D loss: 0.590506] [G loss: 0.674782]\n",
"[Epoch 130/200] [Batch 45/59] [D loss: 0.605711] [G loss: 0.870677]\n",
"[Epoch 130/200] [Batch 46/59] [D loss: 0.590738] [G loss: 1.145783]\n",
"[Epoch 130/200] [Batch 47/59] [D loss: 0.566731] [G loss: 0.873236]\n",
"[Epoch 130/200] [Batch 48/59] [D loss: 0.570988] [G loss: 0.997330]\n",
"[Epoch 130/200] [Batch 49/59] [D loss: 0.495393] [G loss: 1.099466]\n",
"[Epoch 130/200] [Batch 50/59] [D loss: 0.609007] [G loss: 1.125672]\n",
"[Epoch 130/200] [Batch 51/59] [D loss: 0.564994] [G loss: 0.864964]\n",
"[Epoch 130/200] [Batch 52/59] [D loss: 0.630758] [G loss: 0.912892]\n",
"[Epoch 130/200] [Batch 53/59] [D loss: 0.554926] [G loss: 1.001204]\n",
"[Epoch 130/200] [Batch 54/59] [D loss: 0.626475] [G loss: 0.683141]\n",
"[Epoch 130/200] [Batch 55/59] [D loss: 0.575216] [G loss: 1.115735]\n",
"[Epoch 130/200] [Batch 56/59] [D loss: 0.558949] [G loss: 0.917647]\n",
"[Epoch 130/200] [Batch 57/59] [D loss: 0.535389] [G loss: 0.819544]\n",
"[Epoch 130/200] [Batch 58/59] [D loss: 0.529654] [G loss: 0.889340]\n",
"[Epoch 131/200] [Batch 0/59] [D loss: 0.575452] [G loss: 0.943351]\n",
"[Epoch 131/200] [Batch 1/59] [D loss: 0.655251] [G loss: 1.141792]\n",
"[Epoch 131/200] [Batch 2/59] [D loss: 0.548543] [G loss: 0.953050]\n",
"[Epoch 131/200] [Batch 3/59] [D loss: 0.518681] [G loss: 0.934211]\n",
"[Epoch 131/200] [Batch 4/59] [D loss: 0.619064] [G loss: 0.969031]\n",
"[Epoch 131/200] [Batch 5/59] [D loss: 0.550663] [G loss: 0.846703]\n",
"[Epoch 131/200] [Batch 6/59] [D loss: 0.506711] [G loss: 0.902182]\n",
"[Epoch 131/200] [Batch 7/59] [D loss: 0.626621] [G loss: 1.033816]\n",
"[Epoch 131/200] [Batch 8/59] [D loss: 0.669641] [G loss: 1.005889]\n",
"[Epoch 131/200] [Batch 9/59] [D loss: 0.536650] [G loss: 0.601260]\n",
"[Epoch 131/200] [Batch 10/59] [D loss: 0.593701] [G loss: 1.068571]\n",
"[Epoch 131/200] [Batch 11/59] [D loss: 0.617362] [G loss: 0.935346]\n",
"[Epoch 131/200] [Batch 12/59] [D loss: 0.642449] [G loss: 0.681874]\n",
"[Epoch 131/200] [Batch 13/59] [D loss: 0.542975] [G loss: 1.109102]\n",
"[Epoch 131/200] [Batch 14/59] [D loss: 0.572903] [G loss: 1.041063]\n",
"[Epoch 131/200] [Batch 15/59] [D loss: 0.502969] [G loss: 0.699019]\n",
"[Epoch 131/200] [Batch 16/59] [D loss: 0.545362] [G loss: 0.809935]\n",
"[Epoch 131/200] [Batch 17/59] [D loss: 0.539580] [G loss: 1.058633]\n",
"[Epoch 131/200] [Batch 18/59] [D loss: 0.519023] [G loss: 0.979632]\n",
"[Epoch 131/200] [Batch 19/59] [D loss: 0.539961] [G loss: 1.064542]\n",
"[Epoch 131/200] [Batch 20/59] [D loss: 0.588875] [G loss: 0.888804]\n",
"[Epoch 131/200] [Batch 21/59] [D loss: 0.599497] [G loss: 0.825063]\n",
"[Epoch 131/200] [Batch 22/59] [D loss: 0.526320] [G loss: 1.238547]\n",
"[Epoch 131/200] [Batch 23/59] [D loss: 0.591935] [G loss: 1.044388]\n",
"[Epoch 131/200] [Batch 24/59] [D loss: 0.603467] [G loss: 0.776460]\n",
"[Epoch 131/200] [Batch 25/59] [D loss: 0.579878] [G loss: 1.246014]\n",
"[Epoch 131/200] [Batch 26/59] [D loss: 0.614961] [G loss: 0.963371]\n",
"[Epoch 131/200] [Batch 27/59] [D loss: 0.572823] [G loss: 0.880468]\n",
"[Epoch 131/200] [Batch 28/59] [D loss: 0.573261] [G loss: 0.850424]\n",
"[Epoch 131/200] [Batch 29/59] [D loss: 0.556312] [G loss: 1.026613]\n",
"[Epoch 131/200] [Batch 30/59] [D loss: 0.571975] [G loss: 0.734351]\n",
"[Epoch 131/200] [Batch 31/59] [D loss: 0.597498] [G loss: 0.897519]\n",
"[Epoch 131/200] [Batch 32/59] [D loss: 0.620878] [G loss: 1.436944]\n",
"[Epoch 131/200] [Batch 33/59] [D loss: 0.580974] [G loss: 0.683867]\n",
"[Epoch 131/200] [Batch 34/59] [D loss: 0.532592] [G loss: 0.924577]\n",
"[Epoch 131/200] [Batch 35/59] [D loss: 0.576663] [G loss: 0.967905]\n",
"[Epoch 131/200] [Batch 36/59] [D loss: 0.543232] [G loss: 0.978806]\n",
"[Epoch 131/200] [Batch 37/59] [D loss: 0.553657] [G loss: 0.711626]\n",
"[Epoch 131/200] [Batch 38/59] [D loss: 0.565235] [G loss: 1.132220]\n",
"[Epoch 131/200] [Batch 39/59] [D loss: 0.573374] [G loss: 1.139548]\n",
"[Epoch 131/200] [Batch 40/59] [D loss: 0.587583] [G loss: 0.810012]\n",
"[Epoch 131/200] [Batch 41/59] [D loss: 0.598051] [G loss: 1.034504]\n",
"[Epoch 131/200] [Batch 42/59] [D loss: 0.613385] [G loss: 0.757545]\n",
"[Epoch 131/200] [Batch 43/59] [D loss: 0.554110] [G loss: 1.125520]\n",
"[Epoch 131/200] [Batch 44/59] [D loss: 0.567321] [G loss: 0.917888]\n",
"[Epoch 131/200] [Batch 45/59] [D loss: 0.553921] [G loss: 0.846752]\n",
"[Epoch 131/200] [Batch 46/59] [D loss: 0.534047] [G loss: 1.051666]\n",
"[Epoch 131/200] [Batch 47/59] [D loss: 0.533093] [G loss: 0.874361]\n",
"[Epoch 131/200] [Batch 48/59] [D loss: 0.475737] [G loss: 1.075405]\n",
"[Epoch 131/200] [Batch 49/59] [D loss: 0.561615] [G loss: 1.031511]\n",
"[Epoch 131/200] [Batch 50/59] [D loss: 0.587710] [G loss: 0.697450]\n",
"[Epoch 131/200] [Batch 51/59] [D loss: 0.586792] [G loss: 0.945665]\n",
"[Epoch 131/200] [Batch 52/59] [D loss: 0.553874] [G loss: 0.959825]\n",
"[Epoch 131/200] [Batch 53/59] [D loss: 0.612411] [G loss: 0.800584]\n",
"[Epoch 131/200] [Batch 54/59] [D loss: 0.523663] [G loss: 0.805987]\n",
"[Epoch 131/200] [Batch 55/59] [D loss: 0.589596] [G loss: 1.002002]\n",
"[Epoch 131/200] [Batch 56/59] [D loss: 0.627215] [G loss: 0.826655]\n",
"[Epoch 131/200] [Batch 57/59] [D loss: 0.641997] [G loss: 1.109020]\n",
"[Epoch 131/200] [Batch 58/59] [D loss: 0.702423] [G loss: 0.649188]\n",
"[Epoch 132/200] [Batch 0/59] [D loss: 0.478842] [G loss: 1.095129]\n",
"[Epoch 132/200] [Batch 1/59] [D loss: 0.553353] [G loss: 0.999946]\n",
"[Epoch 132/200] [Batch 2/59] [D loss: 0.537343] [G loss: 0.778685]\n",
"[Epoch 132/200] [Batch 3/59] [D loss: 0.601090] [G loss: 1.364679]\n",
"[Epoch 132/200] [Batch 4/59] [D loss: 0.570729] [G loss: 0.734746]\n",
"[Epoch 132/200] [Batch 5/59] [D loss: 0.569108] [G loss: 0.903674]\n",
"[Epoch 132/200] [Batch 6/59] [D loss: 0.587327] [G loss: 1.118290]\n",
"[Epoch 132/200] [Batch 7/59] [D loss: 0.570973] [G loss: 0.784685]\n",
"[Epoch 132/200] [Batch 8/59] [D loss: 0.511223] [G loss: 1.038057]\n",
"[Epoch 132/200] [Batch 9/59] [D loss: 0.522740] [G loss: 0.920271]\n",
"[Epoch 132/200] [Batch 10/59] [D loss: 0.655370] [G loss: 0.604088]\n",
"[Epoch 132/200] [Batch 11/59] [D loss: 0.595725] [G loss: 1.568070]\n",
"[Epoch 132/200] [Batch 12/59] [D loss: 0.566435] [G loss: 0.728926]\n",
"[Epoch 132/200] [Batch 13/59] [D loss: 0.704829] [G loss: 0.609274]\n",
"[Epoch 132/200] [Batch 14/59] [D loss: 0.755060] [G loss: 1.588296]\n",
"[Epoch 132/200] [Batch 15/59] [D loss: 0.563451] [G loss: 0.973207]\n",
"[Epoch 132/200] [Batch 16/59] [D loss: 0.648418] [G loss: 0.593886]\n",
"[Epoch 132/200] [Batch 17/59] [D loss: 0.515741] [G loss: 1.294243]\n",
"[Epoch 132/200] [Batch 18/59] [D loss: 0.512799] [G loss: 0.896689]\n",
"[Epoch 132/200] [Batch 19/59] [D loss: 0.552462] [G loss: 0.766024]\n",
"[Epoch 132/200] [Batch 20/59] [D loss: 0.564787] [G loss: 0.905997]\n",
"[Epoch 132/200] [Batch 21/59] [D loss: 0.579663] [G loss: 1.116982]\n",
"[Epoch 132/200] [Batch 22/59] [D loss: 0.572227] [G loss: 0.788386]\n",
"[Epoch 132/200] [Batch 23/59] [D loss: 0.566086] [G loss: 0.847831]\n",
"[Epoch 132/200] [Batch 24/59] [D loss: 0.592055] [G loss: 0.909608]\n",
"[Epoch 132/200] [Batch 25/59] [D loss: 0.587951] [G loss: 1.186899]\n",
"[Epoch 132/200] [Batch 26/59] [D loss: 0.597264] [G loss: 0.714567]\n",
"[Epoch 132/200] [Batch 27/59] [D loss: 0.563862] [G loss: 0.771712]\n",
"[Epoch 132/200] [Batch 28/59] [D loss: 0.565251] [G loss: 1.302956]\n",
"[Epoch 132/200] [Batch 29/59] [D loss: 0.508272] [G loss: 1.003796]\n",
"[Epoch 132/200] [Batch 30/59] [D loss: 0.563584] [G loss: 0.826740]\n",
"[Epoch 132/200] [Batch 31/59] [D loss: 0.527354] [G loss: 1.018166]\n",
"[Epoch 132/200] [Batch 32/59] [D loss: 0.558982] [G loss: 1.067179]\n",
"[Epoch 132/200] [Batch 33/59] [D loss: 0.626212] [G loss: 0.829075]\n",
"[Epoch 132/200] [Batch 34/59] [D loss: 0.535543] [G loss: 1.115795]\n",
"[Epoch 132/200] [Batch 35/59] [D loss: 0.595785] [G loss: 0.859024]\n",
"[Epoch 132/200] [Batch 36/59] [D loss: 0.587201] [G loss: 0.856297]\n",
"[Epoch 132/200] [Batch 37/59] [D loss: 0.618042] [G loss: 0.999770]\n",
"[Epoch 132/200] [Batch 38/59] [D loss: 0.546147] [G loss: 1.006913]\n",
"[Epoch 132/200] [Batch 39/59] [D loss: 0.555421] [G loss: 0.817211]\n",
"[Epoch 132/200] [Batch 40/59] [D loss: 0.571931] [G loss: 0.983090]\n",
"[Epoch 132/200] [Batch 41/59] [D loss: 0.477096] [G loss: 0.924375]\n",
"[Epoch 132/200] [Batch 42/59] [D loss: 0.566391] [G loss: 0.783921]\n",
"[Epoch 132/200] [Batch 43/59] [D loss: 0.602713] [G loss: 0.985269]\n",
"[Epoch 132/200] [Batch 44/59] [D loss: 0.514573] [G loss: 0.955969]\n",
"[Epoch 132/200] [Batch 45/59] [D loss: 0.614278] [G loss: 0.904114]\n",
"[Epoch 132/200] [Batch 46/59] [D loss: 0.698616] [G loss: 1.008070]\n",
"[Epoch 132/200] [Batch 47/59] [D loss: 0.561457] [G loss: 0.669472]\n",
"[Epoch 132/200] [Batch 48/59] [D loss: 0.700666] [G loss: 0.924375]\n",
"[Epoch 132/200] [Batch 49/59] [D loss: 0.660175] [G loss: 1.396246]\n",
"[Epoch 132/200] [Batch 50/59] [D loss: 0.506824] [G loss: 0.873653]\n",
"[Epoch 132/200] [Batch 51/59] [D loss: 0.596953] [G loss: 1.043251]\n",
"[Epoch 132/200] [Batch 52/59] [D loss: 0.631925] [G loss: 0.745826]\n",
"[Epoch 132/200] [Batch 53/59] [D loss: 0.597960] [G loss: 1.169675]\n",
"[Epoch 132/200] [Batch 54/59] [D loss: 0.583269] [G loss: 0.723764]\n",
"[Epoch 132/200] [Batch 55/59] [D loss: 0.575834] [G loss: 0.895024]\n",
"[Epoch 132/200] [Batch 56/59] [D loss: 0.638860] [G loss: 0.828996]\n",
"[Epoch 132/200] [Batch 57/59] [D loss: 0.553228] [G loss: 0.664817]\n",
"[Epoch 132/200] [Batch 58/59] [D loss: 0.523338] [G loss: 0.985571]\n",
"[Epoch 133/200] [Batch 0/59] [D loss: 0.546443] [G loss: 0.968383]\n",
"[Epoch 133/200] [Batch 1/59] [D loss: 0.548783] [G loss: 0.787412]\n",
"[Epoch 133/200] [Batch 2/59] [D loss: 0.613454] [G loss: 0.972342]\n",
"[Epoch 133/200] [Batch 3/59] [D loss: 0.551775] [G loss: 0.956479]\n",
"[Epoch 133/200] [Batch 4/59] [D loss: 0.536261] [G loss: 0.787371]\n",
"[Epoch 133/200] [Batch 5/59] [D loss: 0.585850] [G loss: 1.019475]\n",
"[Epoch 133/200] [Batch 6/59] [D loss: 0.547966] [G loss: 1.174805]\n",
"[Epoch 133/200] [Batch 7/59] [D loss: 0.529255] [G loss: 0.794004]\n",
"[Epoch 133/200] [Batch 8/59] [D loss: 0.590798] [G loss: 0.868577]\n",
"[Epoch 133/200] [Batch 9/59] [D loss: 0.592556] [G loss: 0.884697]\n",
"[Epoch 133/200] [Batch 10/59] [D loss: 0.559151] [G loss: 1.451627]\n",
"[Epoch 133/200] [Batch 11/59] [D loss: 0.503670] [G loss: 0.893173]\n",
"[Epoch 133/200] [Batch 12/59] [D loss: 0.548221] [G loss: 0.859958]\n",
"[Epoch 133/200] [Batch 13/59] [D loss: 0.625250] [G loss: 1.055760]\n",
"[Epoch 133/200] [Batch 14/59] [D loss: 0.554494] [G loss: 0.837996]\n",
"[Epoch 133/200] [Batch 15/59] [D loss: 0.529160] [G loss: 0.768917]\n",
"[Epoch 133/200] [Batch 16/59] [D loss: 0.647588] [G loss: 1.221175]\n",
"[Epoch 133/200] [Batch 17/59] [D loss: 0.521203] [G loss: 0.745460]\n",
"[Epoch 133/200] [Batch 18/59] [D loss: 0.525983] [G loss: 0.949821]\n",
"[Epoch 133/200] [Batch 19/59] [D loss: 0.634772] [G loss: 1.181438]\n",
"[Epoch 133/200] [Batch 20/59] [D loss: 0.560673] [G loss: 0.789945]\n",
"[Epoch 133/200] [Batch 21/59] [D loss: 0.585705] [G loss: 0.647371]\n",
"[Epoch 133/200] [Batch 22/59] [D loss: 0.643456] [G loss: 1.269793]\n",
"[Epoch 133/200] [Batch 23/59] [D loss: 0.547556] [G loss: 0.974901]\n",
"[Epoch 133/200] [Batch 24/59] [D loss: 0.648527] [G loss: 0.657447]\n",
"[Epoch 133/200] [Batch 25/59] [D loss: 0.626864] [G loss: 1.492545]\n",
"[Epoch 133/200] [Batch 26/59] [D loss: 0.569444] [G loss: 0.831263]\n",
"[Epoch 133/200] [Batch 27/59] [D loss: 0.571943] [G loss: 0.801474]\n",
"[Epoch 133/200] [Batch 28/59] [D loss: 0.578041] [G loss: 1.208402]\n",
"[Epoch 133/200] [Batch 29/59] [D loss: 0.526115] [G loss: 0.913951]\n",
"[Epoch 133/200] [Batch 30/59] [D loss: 0.554416] [G loss: 0.696770]\n",
"[Epoch 133/200] [Batch 31/59] [D loss: 0.524881] [G loss: 1.040661]\n",
"[Epoch 133/200] [Batch 32/59] [D loss: 0.549744] [G loss: 0.888862]\n",
"[Epoch 133/200] [Batch 33/59] [D loss: 0.593675] [G loss: 0.946371]\n",
"[Epoch 133/200] [Batch 34/59] [D loss: 0.563832] [G loss: 0.864562]\n",
"[Epoch 133/200] [Batch 35/59] [D loss: 0.573854] [G loss: 1.104826]\n",
"[Epoch 133/200] [Batch 36/59] [D loss: 0.551798] [G loss: 0.974699]\n",
"[Epoch 133/200] [Batch 37/59] [D loss: 0.559029] [G loss: 0.754562]\n",
"[Epoch 133/200] [Batch 38/59] [D loss: 0.621972] [G loss: 0.979008]\n",
"[Epoch 133/200] [Batch 39/59] [D loss: 0.528706] [G loss: 1.109659]\n",
"[Epoch 133/200] [Batch 40/59] [D loss: 0.525092] [G loss: 0.866898]\n",
"[Epoch 133/200] [Batch 41/59] [D loss: 0.523960] [G loss: 1.116896]\n",
"[Epoch 133/200] [Batch 42/59] [D loss: 0.551015] [G loss: 0.797195]\n",
"[Epoch 133/200] [Batch 43/59] [D loss: 0.603002] [G loss: 1.485900]\n",
"[Epoch 133/200] [Batch 44/59] [D loss: 0.607354] [G loss: 0.742273]\n",
"[Epoch 133/200] [Batch 45/59] [D loss: 0.550036] [G loss: 1.020169]\n",
"[Epoch 133/200] [Batch 46/59] [D loss: 0.558411] [G loss: 0.647019]\n",
"[Epoch 133/200] [Batch 47/59] [D loss: 0.485994] [G loss: 0.979861]\n",
"[Epoch 133/200] [Batch 48/59] [D loss: 0.503795] [G loss: 0.950640]\n",
"[Epoch 133/200] [Batch 49/59] [D loss: 0.549324] [G loss: 0.767531]\n",
"[Epoch 133/200] [Batch 50/59] [D loss: 0.467733] [G loss: 0.956849]\n",
"[Epoch 133/200] [Batch 51/59] [D loss: 0.494478] [G loss: 0.875126]\n",
"[Epoch 133/200] [Batch 52/59] [D loss: 0.539323] [G loss: 1.148435]\n",
"[Epoch 133/200] [Batch 53/59] [D loss: 0.601739] [G loss: 0.800957]\n",
"[Epoch 133/200] [Batch 54/59] [D loss: 0.500046] [G loss: 0.978099]\n",
"[Epoch 133/200] [Batch 55/59] [D loss: 0.609243] [G loss: 0.837429]\n",
"[Epoch 133/200] [Batch 56/59] [D loss: 0.553659] [G loss: 1.074820]\n",
"[Epoch 133/200] [Batch 57/59] [D loss: 0.638711] [G loss: 0.787804]\n",
"[Epoch 133/200] [Batch 58/59] [D loss: 0.540099] [G loss: 1.022046]\n",
"[Epoch 134/200] [Batch 0/59] [D loss: 0.611359] [G loss: 1.168003]\n",
"[Epoch 134/200] [Batch 1/59] [D loss: 0.525739] [G loss: 0.685945]\n",
"[Epoch 134/200] [Batch 2/59] [D loss: 0.593890] [G loss: 0.737462]\n",
"[Epoch 134/200] [Batch 3/59] [D loss: 0.588968] [G loss: 0.874489]\n",
"[Epoch 134/200] [Batch 4/59] [D loss: 0.610983] [G loss: 0.802593]\n",
"[Epoch 134/200] [Batch 5/59] [D loss: 0.607828] [G loss: 1.014910]\n",
"[Epoch 134/200] [Batch 6/59] [D loss: 0.496779] [G loss: 1.002355]\n",
"[Epoch 134/200] [Batch 7/59] [D loss: 0.628328] [G loss: 0.693869]\n",
"[Epoch 134/200] [Batch 8/59] [D loss: 0.557970] [G loss: 1.129727]\n",
"[Epoch 134/200] [Batch 9/59] [D loss: 0.613736] [G loss: 0.842158]\n",
"[Epoch 134/200] [Batch 10/59] [D loss: 0.641271] [G loss: 0.740308]\n",
"[Epoch 134/200] [Batch 11/59] [D loss: 0.617266] [G loss: 1.088198]\n",
"[Epoch 134/200] [Batch 12/59] [D loss: 0.522513] [G loss: 0.797479]\n",
"[Epoch 134/200] [Batch 13/59] [D loss: 0.562319] [G loss: 0.734530]\n",
"[Epoch 134/200] [Batch 14/59] [D loss: 0.584069] [G loss: 1.227550]\n",
"[Epoch 134/200] [Batch 15/59] [D loss: 0.543110] [G loss: 0.852491]\n",
"[Epoch 134/200] [Batch 16/59] [D loss: 0.610847] [G loss: 0.644605]\n",
"[Epoch 134/200] [Batch 17/59] [D loss: 0.605416] [G loss: 1.314726]\n",
"[Epoch 134/200] [Batch 18/59] [D loss: 0.548482] [G loss: 1.001605]\n",
"[Epoch 134/200] [Batch 19/59] [D loss: 0.633093] [G loss: 0.595258]\n",
"[Epoch 134/200] [Batch 20/59] [D loss: 0.615927] [G loss: 1.356076]\n",
"[Epoch 134/200] [Batch 21/59] [D loss: 0.505318] [G loss: 0.768842]\n",
"[Epoch 134/200] [Batch 22/59] [D loss: 0.615684] [G loss: 0.733923]\n",
"[Epoch 134/200] [Batch 23/59] [D loss: 0.557906] [G loss: 1.118570]\n",
"[Epoch 134/200] [Batch 24/59] [D loss: 0.585384] [G loss: 0.862493]\n",
"[Epoch 134/200] [Batch 25/59] [D loss: 0.550818] [G loss: 0.725864]\n",
"[Epoch 134/200] [Batch 26/59] [D loss: 0.594661] [G loss: 0.996387]\n",
"[Epoch 134/200] [Batch 27/59] [D loss: 0.566264] [G loss: 1.226289]\n",
"[Epoch 134/200] [Batch 28/59] [D loss: 0.565617] [G loss: 0.631572]\n",
"[Epoch 134/200] [Batch 29/59] [D loss: 0.562899] [G loss: 0.846343]\n",
"[Epoch 134/200] [Batch 30/59] [D loss: 0.628510] [G loss: 1.124937]\n",
"[Epoch 134/200] [Batch 31/59] [D loss: 0.553553] [G loss: 0.746658]\n",
"[Epoch 134/200] [Batch 32/59] [D loss: 0.603575] [G loss: 0.835264]\n",
"[Epoch 134/200] [Batch 33/59] [D loss: 0.596786] [G loss: 0.959740]\n",
"[Epoch 134/200] [Batch 34/59] [D loss: 0.620138] [G loss: 0.881844]\n",
"[Epoch 134/200] [Batch 35/59] [D loss: 0.593393] [G loss: 0.915652]\n",
"[Epoch 134/200] [Batch 36/59] [D loss: 0.647188] [G loss: 1.199448]\n",
"[Epoch 134/200] [Batch 37/59] [D loss: 0.548893] [G loss: 0.894510]\n",
"[Epoch 134/200] [Batch 38/59] [D loss: 0.587008] [G loss: 0.959490]\n",
"[Epoch 134/200] [Batch 39/59] [D loss: 0.547146] [G loss: 0.910443]\n",
"[Epoch 134/200] [Batch 40/59] [D loss: 0.555835] [G loss: 0.890170]\n",
"[Epoch 134/200] [Batch 41/59] [D loss: 0.538801] [G loss: 0.793757]\n",
"[Epoch 134/200] [Batch 42/59] [D loss: 0.540807] [G loss: 0.930407]\n",
"[Epoch 134/200] [Batch 43/59] [D loss: 0.620837] [G loss: 0.810418]\n",
"[Epoch 134/200] [Batch 44/59] [D loss: 0.553193] [G loss: 1.233627]\n",
"[Epoch 134/200] [Batch 45/59] [D loss: 0.596212] [G loss: 0.738964]\n",
"[Epoch 134/200] [Batch 46/59] [D loss: 0.541332] [G loss: 0.622690]\n",
"[Epoch 134/200] [Batch 47/59] [D loss: 0.599827] [G loss: 1.141161]\n",
"[Epoch 134/200] [Batch 48/59] [D loss: 0.581486] [G loss: 1.031897]\n",
"[Epoch 134/200] [Batch 49/59] [D loss: 0.645809] [G loss: 0.740250]\n",
"[Epoch 134/200] [Batch 50/59] [D loss: 0.561842] [G loss: 1.362571]\n",
"[Epoch 134/200] [Batch 51/59] [D loss: 0.527964] [G loss: 1.023731]\n",
"[Epoch 134/200] [Batch 52/59] [D loss: 0.529598] [G loss: 0.604434]\n",
"[Epoch 134/200] [Batch 53/59] [D loss: 0.626828] [G loss: 1.157554]\n",
"[Epoch 134/200] [Batch 54/59] [D loss: 0.488442] [G loss: 1.087645]\n",
"[Epoch 134/200] [Batch 55/59] [D loss: 0.561460] [G loss: 0.715036]\n",
"[Epoch 134/200] [Batch 56/59] [D loss: 0.674036] [G loss: 0.936376]\n",
"[Epoch 134/200] [Batch 57/59] [D loss: 0.529225] [G loss: 1.126574]\n",
"[Epoch 134/200] [Batch 58/59] [D loss: 0.579519] [G loss: 0.809431]\n",
"[Epoch 135/200] [Batch 0/59] [D loss: 0.611171] [G loss: 1.117291]\n",
"[Epoch 135/200] [Batch 1/59] [D loss: 0.452123] [G loss: 0.909931]\n",
"[Epoch 135/200] [Batch 2/59] [D loss: 0.452983] [G loss: 1.006722]\n",
"[Epoch 135/200] [Batch 3/59] [D loss: 0.557024] [G loss: 0.929250]\n",
"[Epoch 135/200] [Batch 4/59] [D loss: 0.541647] [G loss: 1.053289]\n",
"[Epoch 135/200] [Batch 5/59] [D loss: 0.658250] [G loss: 0.823001]\n",
"[Epoch 135/200] [Batch 6/59] [D loss: 0.527322] [G loss: 0.825699]\n",
"[Epoch 135/200] [Batch 7/59] [D loss: 0.641771] [G loss: 1.194884]\n",
"[Epoch 135/200] [Batch 8/59] [D loss: 0.562124] [G loss: 1.113202]\n",
"[Epoch 135/200] [Batch 9/59] [D loss: 0.661840] [G loss: 0.637821]\n",
"[Epoch 135/200] [Batch 10/59] [D loss: 0.488848] [G loss: 1.141466]\n",
"[Epoch 135/200] [Batch 11/59] [D loss: 0.639183] [G loss: 1.162138]\n",
"[Epoch 135/200] [Batch 12/59] [D loss: 0.634450] [G loss: 0.651146]\n",
"[Epoch 135/200] [Batch 13/59] [D loss: 0.572287] [G loss: 1.141139]\n",
"[Epoch 135/200] [Batch 14/59] [D loss: 0.577998] [G loss: 0.822900]\n",
"[Epoch 135/200] [Batch 15/59] [D loss: 0.535083] [G loss: 1.041181]\n",
"[Epoch 135/200] [Batch 16/59] [D loss: 0.599437] [G loss: 0.806306]\n",
"[Epoch 135/200] [Batch 17/59] [D loss: 0.518967] [G loss: 1.074445]\n",
"[Epoch 135/200] [Batch 18/59] [D loss: 0.571033] [G loss: 1.025606]\n",
"[Epoch 135/200] [Batch 19/59] [D loss: 0.582335] [G loss: 0.774990]\n",
"[Epoch 135/200] [Batch 20/59] [D loss: 0.558566] [G loss: 1.028022]\n",
"[Epoch 135/200] [Batch 21/59] [D loss: 0.560751] [G loss: 0.947048]\n",
"[Epoch 135/200] [Batch 22/59] [D loss: 0.596591] [G loss: 0.907555]\n",
"[Epoch 135/200] [Batch 23/59] [D loss: 0.574213] [G loss: 0.914301]\n",
"[Epoch 135/200] [Batch 24/59] [D loss: 0.556071] [G loss: 0.958924]\n",
"[Epoch 135/200] [Batch 25/59] [D loss: 0.562769] [G loss: 0.729829]\n",
"[Epoch 135/200] [Batch 26/59] [D loss: 0.584032] [G loss: 1.013221]\n",
"[Epoch 135/200] [Batch 27/59] [D loss: 0.588947] [G loss: 0.966545]\n",
"[Epoch 135/200] [Batch 28/59] [D loss: 0.596944] [G loss: 0.848206]\n",
"[Epoch 135/200] [Batch 29/59] [D loss: 0.572825] [G loss: 1.079159]\n",
"[Epoch 135/200] [Batch 30/59] [D loss: 0.520292] [G loss: 0.754010]\n",
"[Epoch 135/200] [Batch 31/59] [D loss: 0.551294] [G loss: 1.209545]\n",
"[Epoch 135/200] [Batch 32/59] [D loss: 0.528584] [G loss: 0.714111]\n",
"[Epoch 135/200] [Batch 33/59] [D loss: 0.558092] [G loss: 0.956645]\n",
"[Epoch 135/200] [Batch 34/59] [D loss: 0.549909] [G loss: 0.973381]\n",
"[Epoch 135/200] [Batch 35/59] [D loss: 0.600247] [G loss: 1.149915]\n",
"[Epoch 135/200] [Batch 36/59] [D loss: 0.559175] [G loss: 0.850217]\n",
"[Epoch 135/200] [Batch 37/59] [D loss: 0.583625] [G loss: 0.795953]\n",
"[Epoch 135/200] [Batch 38/59] [D loss: 0.587499] [G loss: 1.106732]\n",
"[Epoch 135/200] [Batch 39/59] [D loss: 0.545857] [G loss: 0.784748]\n",
"[Epoch 135/200] [Batch 40/59] [D loss: 0.567302] [G loss: 0.802777]\n",
"[Epoch 135/200] [Batch 41/59] [D loss: 0.584752] [G loss: 1.215358]\n",
"[Epoch 135/200] [Batch 42/59] [D loss: 0.547061] [G loss: 0.985974]\n",
"[Epoch 135/200] [Batch 43/59] [D loss: 0.578734] [G loss: 1.005394]\n",
"[Epoch 135/200] [Batch 44/59] [D loss: 0.559855] [G loss: 1.051049]\n",
"[Epoch 135/200] [Batch 45/59] [D loss: 0.571214] [G loss: 0.976368]\n",
"[Epoch 135/200] [Batch 46/59] [D loss: 0.530764] [G loss: 0.709912]\n",
"[Epoch 135/200] [Batch 47/59] [D loss: 0.557239] [G loss: 0.999962]\n",
"[Epoch 135/200] [Batch 48/59] [D loss: 0.666448] [G loss: 0.911775]\n",
"[Epoch 135/200] [Batch 49/59] [D loss: 0.580201] [G loss: 0.740824]\n",
"[Epoch 135/200] [Batch 50/59] [D loss: 0.573463] [G loss: 1.078994]\n",
"[Epoch 135/200] [Batch 51/59] [D loss: 0.684890] [G loss: 0.981017]\n",
"[Epoch 135/200] [Batch 52/59] [D loss: 0.582140] [G loss: 0.884351]\n",
"[Epoch 135/200] [Batch 53/59] [D loss: 0.591211] [G loss: 1.272905]\n",
"[Epoch 135/200] [Batch 54/59] [D loss: 0.547964] [G loss: 0.904461]\n",
"[Epoch 135/200] [Batch 55/59] [D loss: 0.581271] [G loss: 0.951771]\n",
"[Epoch 135/200] [Batch 56/59] [D loss: 0.522321] [G loss: 0.734617]\n",
"[Epoch 135/200] [Batch 57/59] [D loss: 0.567790] [G loss: 1.190649]\n",
"[Epoch 135/200] [Batch 58/59] [D loss: 0.555900] [G loss: 0.971832]\n",
"[Epoch 136/200] [Batch 0/59] [D loss: 0.589063] [G loss: 0.491044]\n",
"[Epoch 136/200] [Batch 1/59] [D loss: 0.554455] [G loss: 1.123257]\n",
"[Epoch 136/200] [Batch 2/59] [D loss: 0.615001] [G loss: 1.175050]\n",
"[Epoch 136/200] [Batch 3/59] [D loss: 0.612128] [G loss: 0.503223]\n",
"[Epoch 136/200] [Batch 4/59] [D loss: 0.639841] [G loss: 1.173573]\n",
"[Epoch 136/200] [Batch 5/59] [D loss: 0.527805] [G loss: 1.065450]\n",
"[Epoch 136/200] [Batch 6/59] [D loss: 0.633323] [G loss: 0.562951]\n",
"[Epoch 136/200] [Batch 7/59] [D loss: 0.578644] [G loss: 1.198430]\n",
"[Epoch 136/200] [Batch 8/59] [D loss: 0.533337] [G loss: 0.955827]\n",
"[Epoch 136/200] [Batch 9/59] [D loss: 0.576647] [G loss: 0.868344]\n",
"[Epoch 136/200] [Batch 10/59] [D loss: 0.598072] [G loss: 1.032183]\n",
"[Epoch 136/200] [Batch 11/59] [D loss: 0.556846] [G loss: 0.921767]\n",
"[Epoch 136/200] [Batch 12/59] [D loss: 0.573520] [G loss: 1.223891]\n",
"[Epoch 136/200] [Batch 13/59] [D loss: 0.560242] [G loss: 0.915889]\n",
"[Epoch 136/200] [Batch 14/59] [D loss: 0.580298] [G loss: 1.084812]\n",
"[Epoch 136/200] [Batch 15/59] [D loss: 0.617548] [G loss: 0.695110]\n",
"[Epoch 136/200] [Batch 16/59] [D loss: 0.558623] [G loss: 1.200649]\n",
"[Epoch 136/200] [Batch 17/59] [D loss: 0.596286] [G loss: 1.099883]\n",
"[Epoch 136/200] [Batch 18/59] [D loss: 0.575538] [G loss: 0.691517]\n",
"[Epoch 136/200] [Batch 19/59] [D loss: 0.535417] [G loss: 1.031762]\n",
"[Epoch 136/200] [Batch 20/59] [D loss: 0.582242] [G loss: 1.039239]\n",
"[Epoch 136/200] [Batch 21/59] [D loss: 0.461098] [G loss: 1.120211]\n",
"[Epoch 136/200] [Batch 22/59] [D loss: 0.608721] [G loss: 0.765861]\n",
"[Epoch 136/200] [Batch 23/59] [D loss: 0.590603] [G loss: 0.985380]\n",
"[Epoch 136/200] [Batch 24/59] [D loss: 0.598529] [G loss: 1.147726]\n",
"[Epoch 136/200] [Batch 25/59] [D loss: 0.607706] [G loss: 0.617982]\n",
"[Epoch 136/200] [Batch 26/59] [D loss: 0.569802] [G loss: 1.285750]\n",
"[Epoch 136/200] [Batch 27/59] [D loss: 0.546839] [G loss: 0.752614]\n",
"[Epoch 136/200] [Batch 28/59] [D loss: 0.539099] [G loss: 0.727073]\n",
"[Epoch 136/200] [Batch 29/59] [D loss: 0.672266] [G loss: 0.917403]\n",
"[Epoch 136/200] [Batch 30/59] [D loss: 0.559776] [G loss: 1.146026]\n",
"[Epoch 136/200] [Batch 31/59] [D loss: 0.595023] [G loss: 0.984412]\n",
"[Epoch 136/200] [Batch 32/59] [D loss: 0.506353] [G loss: 0.939274]\n",
"[Epoch 136/200] [Batch 33/59] [D loss: 0.556266] [G loss: 1.330398]\n",
"[Epoch 136/200] [Batch 34/59] [D loss: 0.533391] [G loss: 0.854672]\n",
"[Epoch 136/200] [Batch 35/59] [D loss: 0.594431] [G loss: 0.896364]\n",
"[Epoch 136/200] [Batch 36/59] [D loss: 0.544039] [G loss: 1.478232]\n",
"[Epoch 136/200] [Batch 37/59] [D loss: 0.580862] [G loss: 0.785507]\n",
"[Epoch 136/200] [Batch 38/59] [D loss: 0.599642] [G loss: 0.662174]\n",
"[Epoch 136/200] [Batch 39/59] [D loss: 0.605032] [G loss: 1.002733]\n",
"[Epoch 136/200] [Batch 40/59] [D loss: 0.557191] [G loss: 0.925325]\n",
"[Epoch 136/200] [Batch 41/59] [D loss: 0.584835] [G loss: 0.876290]\n",
"[Epoch 136/200] [Batch 42/59] [D loss: 0.535023] [G loss: 1.047434]\n",
"[Epoch 136/200] [Batch 43/59] [D loss: 0.614216] [G loss: 1.174765]\n",
"[Epoch 136/200] [Batch 44/59] [D loss: 0.573744] [G loss: 0.597372]\n",
"[Epoch 136/200] [Batch 45/59] [D loss: 0.515451] [G loss: 0.903610]\n",
"[Epoch 136/200] [Batch 46/59] [D loss: 0.642654] [G loss: 1.363228]\n",
"[Epoch 136/200] [Batch 47/59] [D loss: 0.553760] [G loss: 0.601268]\n",
"[Epoch 136/200] [Batch 48/59] [D loss: 0.608947] [G loss: 1.012360]\n",
"[Epoch 136/200] [Batch 49/59] [D loss: 0.534557] [G loss: 1.123595]\n",
"[Epoch 136/200] [Batch 50/59] [D loss: 0.534045] [G loss: 1.223066]\n",
"[Epoch 136/200] [Batch 51/59] [D loss: 0.641754] [G loss: 0.544623]\n",
"[Epoch 136/200] [Batch 52/59] [D loss: 0.516779] [G loss: 1.069467]\n",
"[Epoch 136/200] [Batch 53/59] [D loss: 0.556150] [G loss: 0.894917]\n",
"[Epoch 136/200] [Batch 54/59] [D loss: 0.609146] [G loss: 0.888670]\n",
"[Epoch 136/200] [Batch 55/59] [D loss: 0.548059] [G loss: 0.874535]\n",
"[Epoch 136/200] [Batch 56/59] [D loss: 0.563954] [G loss: 0.956333]\n",
"[Epoch 136/200] [Batch 57/59] [D loss: 0.593228] [G loss: 0.820972]\n",
"[Epoch 136/200] [Batch 58/59] [D loss: 0.573538] [G loss: 0.667339]\n",
"[Epoch 137/200] [Batch 0/59] [D loss: 0.607809] [G loss: 1.064106]\n",
"[Epoch 137/200] [Batch 1/59] [D loss: 0.560115] [G loss: 0.878966]\n",
"[Epoch 137/200] [Batch 2/59] [D loss: 0.687433] [G loss: 0.868014]\n",
"[Epoch 137/200] [Batch 3/59] [D loss: 0.610710] [G loss: 0.977557]\n",
"[Epoch 137/200] [Batch 4/59] [D loss: 0.592230] [G loss: 0.762515]\n",
"[Epoch 137/200] [Batch 5/59] [D loss: 0.537767] [G loss: 1.282269]\n",
"[Epoch 137/200] [Batch 6/59] [D loss: 0.587378] [G loss: 1.096597]\n",
"[Epoch 137/200] [Batch 7/59] [D loss: 0.548706] [G loss: 0.692824]\n",
"[Epoch 137/200] [Batch 8/59] [D loss: 0.611817] [G loss: 1.020909]\n",
"[Epoch 137/200] [Batch 9/59] [D loss: 0.590119] [G loss: 1.092036]\n",
"[Epoch 137/200] [Batch 10/59] [D loss: 0.611751] [G loss: 0.579144]\n",
"[Epoch 137/200] [Batch 11/59] [D loss: 0.548753] [G loss: 0.959874]\n",
"[Epoch 137/200] [Batch 12/59] [D loss: 0.532720] [G loss: 1.131864]\n",
"[Epoch 137/200] [Batch 13/59] [D loss: 0.565212] [G loss: 1.000928]\n",
"[Epoch 137/200] [Batch 14/59] [D loss: 0.563249] [G loss: 0.892054]\n",
"[Epoch 137/200] [Batch 15/59] [D loss: 0.550843] [G loss: 0.964625]\n",
"[Epoch 137/200] [Batch 16/59] [D loss: 0.598575] [G loss: 1.078500]\n",
"[Epoch 137/200] [Batch 17/59] [D loss: 0.629569] [G loss: 0.829445]\n",
"[Epoch 137/200] [Batch 18/59] [D loss: 0.694158] [G loss: 0.669582]\n",
"[Epoch 137/200] [Batch 19/59] [D loss: 0.592496] [G loss: 0.996118]\n",
"[Epoch 137/200] [Batch 20/59] [D loss: 0.552416] [G loss: 0.888869]\n",
"[Epoch 137/200] [Batch 21/59] [D loss: 0.569190] [G loss: 0.841011]\n",
"[Epoch 137/200] [Batch 22/59] [D loss: 0.538418] [G loss: 1.148055]\n",
"[Epoch 137/200] [Batch 23/59] [D loss: 0.657941] [G loss: 1.050614]\n",
"[Epoch 137/200] [Batch 24/59] [D loss: 0.508657] [G loss: 0.709167]\n",
"[Epoch 137/200] [Batch 25/59] [D loss: 0.539104] [G loss: 0.948436]\n",
"[Epoch 137/200] [Batch 26/59] [D loss: 0.624426] [G loss: 0.833997]\n",
"[Epoch 137/200] [Batch 27/59] [D loss: 0.519088] [G loss: 0.952084]\n",
"[Epoch 137/200] [Batch 28/59] [D loss: 0.606103] [G loss: 0.870769]\n",
"[Epoch 137/200] [Batch 29/59] [D loss: 0.585761] [G loss: 0.956687]\n",
"[Epoch 137/200] [Batch 30/59] [D loss: 0.616051] [G loss: 0.869583]\n",
"[Epoch 137/200] [Batch 31/59] [D loss: 0.545100] [G loss: 0.886984]\n",
"[Epoch 137/200] [Batch 32/59] [D loss: 0.509062] [G loss: 0.820802]\n",
"[Epoch 137/200] [Batch 33/59] [D loss: 0.637121] [G loss: 0.940214]\n",
"[Epoch 137/200] [Batch 34/59] [D loss: 0.602704] [G loss: 1.258807]\n",
"[Epoch 137/200] [Batch 35/59] [D loss: 0.554468] [G loss: 0.723524]\n",
"[Epoch 137/200] [Batch 36/59] [D loss: 0.571986] [G loss: 0.934350]\n",
"[Epoch 137/200] [Batch 37/59] [D loss: 0.623314] [G loss: 0.801351]\n",
"[Epoch 137/200] [Batch 38/59] [D loss: 0.642792] [G loss: 0.893537]\n",
"[Epoch 137/200] [Batch 39/59] [D loss: 0.530015] [G loss: 1.030145]\n",
"[Epoch 137/200] [Batch 40/59] [D loss: 0.645860] [G loss: 0.702237]\n",
"[Epoch 137/200] [Batch 41/59] [D loss: 0.627117] [G loss: 1.175187]\n",
"[Epoch 137/200] [Batch 42/59] [D loss: 0.497186] [G loss: 0.755399]\n",
"[Epoch 137/200] [Batch 43/59] [D loss: 0.534074] [G loss: 0.984555]\n",
"[Epoch 137/200] [Batch 44/59] [D loss: 0.628509] [G loss: 1.218601]\n",
"[Epoch 137/200] [Batch 45/59] [D loss: 0.557269] [G loss: 0.926736]\n",
"[Epoch 137/200] [Batch 46/59] [D loss: 0.587396] [G loss: 0.741434]\n",
"[Epoch 137/200] [Batch 47/59] [D loss: 0.635885] [G loss: 1.065969]\n",
"[Epoch 137/200] [Batch 48/59] [D loss: 0.624519] [G loss: 0.665872]\n",
"[Epoch 137/200] [Batch 49/59] [D loss: 0.572583] [G loss: 0.968677]\n",
"[Epoch 137/200] [Batch 50/59] [D loss: 0.536898] [G loss: 0.821270]\n",
"[Epoch 137/200] [Batch 51/59] [D loss: 0.616019] [G loss: 1.234707]\n",
"[Epoch 137/200] [Batch 52/59] [D loss: 0.497046] [G loss: 0.944323]\n",
"[Epoch 137/200] [Batch 53/59] [D loss: 0.508615] [G loss: 0.775419]\n",
"[Epoch 137/200] [Batch 54/59] [D loss: 0.583848] [G loss: 0.830040]\n",
"[Epoch 137/200] [Batch 55/59] [D loss: 0.521347] [G loss: 1.044363]\n",
"[Epoch 137/200] [Batch 56/59] [D loss: 0.577405] [G loss: 0.868646]\n",
"[Epoch 137/200] [Batch 57/59] [D loss: 0.566876] [G loss: 0.979790]\n",
"[Epoch 137/200] [Batch 58/59] [D loss: 0.580283] [G loss: 0.960844]\n",
"[Epoch 138/200] [Batch 0/59] [D loss: 0.511943] [G loss: 0.958469]\n",
"[Epoch 138/200] [Batch 1/59] [D loss: 0.588581] [G loss: 0.970722]\n",
"[Epoch 138/200] [Batch 2/59] [D loss: 0.543511] [G loss: 1.172134]\n",
"[Epoch 138/200] [Batch 3/59] [D loss: 0.620430] [G loss: 0.641627]\n",
"[Epoch 138/200] [Batch 4/59] [D loss: 0.543668] [G loss: 0.937721]\n",
"[Epoch 138/200] [Batch 5/59] [D loss: 0.581505] [G loss: 1.229478]\n",
"[Epoch 138/200] [Batch 6/59] [D loss: 0.529416] [G loss: 0.760083]\n",
"[Epoch 138/200] [Batch 7/59] [D loss: 0.535679] [G loss: 0.830402]\n",
"[Epoch 138/200] [Batch 8/59] [D loss: 0.586062] [G loss: 0.816826]\n",
"[Epoch 138/200] [Batch 9/59] [D loss: 0.527382] [G loss: 0.868932]\n",
"[Epoch 138/200] [Batch 10/59] [D loss: 0.596181] [G loss: 0.844618]\n",
"[Epoch 138/200] [Batch 11/59] [D loss: 0.485979] [G loss: 0.800760]\n",
"[Epoch 138/200] [Batch 12/59] [D loss: 0.517398] [G loss: 1.062508]\n",
"[Epoch 138/200] [Batch 13/59] [D loss: 0.533555] [G loss: 1.213696]\n",
"[Epoch 138/200] [Batch 14/59] [D loss: 0.542482] [G loss: 0.737771]\n",
"[Epoch 138/200] [Batch 15/59] [D loss: 0.520079] [G loss: 0.926979]\n",
"[Epoch 138/200] [Batch 16/59] [D loss: 0.552620] [G loss: 1.190729]\n",
"[Epoch 138/200] [Batch 17/59] [D loss: 0.512100] [G loss: 0.733927]\n",
"[Epoch 138/200] [Batch 18/59] [D loss: 0.597342] [G loss: 0.829717]\n",
"[Epoch 138/200] [Batch 19/59] [D loss: 0.587071] [G loss: 1.300097]\n",
"[Epoch 138/200] [Batch 20/59] [D loss: 0.480498] [G loss: 0.813065]\n",
"[Epoch 138/200] [Batch 21/59] [D loss: 0.559496] [G loss: 0.562695]\n",
"[Epoch 138/200] [Batch 22/59] [D loss: 0.502978] [G loss: 1.522065]\n",
"[Epoch 138/200] [Batch 23/59] [D loss: 0.535198] [G loss: 0.992974]\n",
"[Epoch 138/200] [Batch 24/59] [D loss: 0.606095] [G loss: 0.635361]\n",
"[Epoch 138/200] [Batch 25/59] [D loss: 0.573374] [G loss: 1.457752]\n",
"[Epoch 138/200] [Batch 26/59] [D loss: 0.500823] [G loss: 0.629665]\n",
"[Epoch 138/200] [Batch 27/59] [D loss: 0.537465] [G loss: 0.815081]\n",
"[Epoch 138/200] [Batch 28/59] [D loss: 0.539266] [G loss: 1.252742]\n",
"[Epoch 138/200] [Batch 29/59] [D loss: 0.540012] [G loss: 0.831373]\n",
"[Epoch 138/200] [Batch 30/59] [D loss: 0.577775] [G loss: 0.909020]\n",
"[Epoch 138/200] [Batch 31/59] [D loss: 0.616726] [G loss: 1.122010]\n",
"[Epoch 138/200] [Batch 32/59] [D loss: 0.634033] [G loss: 0.680860]\n",
"[Epoch 138/200] [Batch 33/59] [D loss: 0.638121] [G loss: 0.846654]\n",
"[Epoch 138/200] [Batch 34/59] [D loss: 0.551929] [G loss: 0.716368]\n",
"[Epoch 138/200] [Batch 35/59] [D loss: 0.550183] [G loss: 0.902561]\n",
"[Epoch 138/200] [Batch 36/59] [D loss: 0.520937] [G loss: 0.924251]\n",
"[Epoch 138/200] [Batch 37/59] [D loss: 0.607759] [G loss: 0.962603]\n",
"[Epoch 138/200] [Batch 38/59] [D loss: 0.622027] [G loss: 0.805873]\n",
"[Epoch 138/200] [Batch 39/59] [D loss: 0.538638] [G loss: 1.015244]\n",
"[Epoch 138/200] [Batch 40/59] [D loss: 0.602977] [G loss: 0.727697]\n",
"[Epoch 138/200] [Batch 41/59] [D loss: 0.562879] [G loss: 1.012216]\n",
"[Epoch 138/200] [Batch 42/59] [D loss: 0.579670] [G loss: 1.097879]\n",
"[Epoch 138/200] [Batch 43/59] [D loss: 0.579634] [G loss: 1.038271]\n",
"[Epoch 138/200] [Batch 44/59] [D loss: 0.577626] [G loss: 0.733951]\n",
"[Epoch 138/200] [Batch 45/59] [D loss: 0.556160] [G loss: 0.807885]\n",
"[Epoch 138/200] [Batch 46/59] [D loss: 0.585885] [G loss: 0.996541]\n",
"[Epoch 138/200] [Batch 47/59] [D loss: 0.545621] [G loss: 1.199383]\n",
"[Epoch 138/200] [Batch 48/59] [D loss: 0.533489] [G loss: 0.887675]\n",
"[Epoch 138/200] [Batch 49/59] [D loss: 0.502417] [G loss: 0.748369]\n",
"[Epoch 138/200] [Batch 50/59] [D loss: 0.626610] [G loss: 1.001793]\n",
"[Epoch 138/200] [Batch 51/59] [D loss: 0.630353] [G loss: 0.641875]\n",
"[Epoch 138/200] [Batch 52/59] [D loss: 0.657797] [G loss: 1.194759]\n",
"[Epoch 138/200] [Batch 53/59] [D loss: 0.634805] [G loss: 0.748891]\n",
"[Epoch 138/200] [Batch 54/59] [D loss: 0.562486] [G loss: 1.273981]\n",
"[Epoch 138/200] [Batch 55/59] [D loss: 0.636470] [G loss: 1.241890]\n",
"[Epoch 138/200] [Batch 56/59] [D loss: 0.555602] [G loss: 0.643116]\n",
"[Epoch 138/200] [Batch 57/59] [D loss: 0.574050] [G loss: 0.979917]\n",
"[Epoch 138/200] [Batch 58/59] [D loss: 0.607314] [G loss: 1.011776]\n",
"[Epoch 139/200] [Batch 0/59] [D loss: 0.497419] [G loss: 0.742804]\n",
"[Epoch 139/200] [Batch 1/59] [D loss: 0.560405] [G loss: 1.062788]\n",
"[Epoch 139/200] [Batch 2/59] [D loss: 0.555545] [G loss: 1.160233]\n",
"[Epoch 139/200] [Batch 3/59] [D loss: 0.491050] [G loss: 0.773593]\n",
"[Epoch 139/200] [Batch 4/59] [D loss: 0.579890] [G loss: 0.828331]\n",
"[Epoch 139/200] [Batch 5/59] [D loss: 0.578726] [G loss: 1.449823]\n",
"[Epoch 139/200] [Batch 6/59] [D loss: 0.516827] [G loss: 0.817170]\n",
"[Epoch 139/200] [Batch 7/59] [D loss: 0.593149] [G loss: 0.824972]\n",
"[Epoch 139/200] [Batch 8/59] [D loss: 0.556319] [G loss: 1.591677]\n",
"[Epoch 139/200] [Batch 9/59] [D loss: 0.586912] [G loss: 0.923803]\n",
"[Epoch 139/200] [Batch 10/59] [D loss: 0.682399] [G loss: 0.576840]\n",
"[Epoch 139/200] [Batch 11/59] [D loss: 0.647547] [G loss: 1.806089]\n",
"[Epoch 139/200] [Batch 12/59] [D loss: 0.440909] [G loss: 0.907753]\n",
"[Epoch 139/200] [Batch 13/59] [D loss: 0.749266] [G loss: 0.351906]\n",
"[Epoch 139/200] [Batch 14/59] [D loss: 0.640384] [G loss: 1.697578]\n",
"[Epoch 139/200] [Batch 15/59] [D loss: 0.590239] [G loss: 0.910263]\n",
"[Epoch 139/200] [Batch 16/59] [D loss: 0.524426] [G loss: 0.676684]\n",
"[Epoch 139/200] [Batch 17/59] [D loss: 0.587025] [G loss: 1.115507]\n",
"[Epoch 139/200] [Batch 18/59] [D loss: 0.586437] [G loss: 1.480090]\n",
"[Epoch 139/200] [Batch 19/59] [D loss: 0.645117] [G loss: 0.585155]\n",
"[Epoch 139/200] [Batch 20/59] [D loss: 0.571376] [G loss: 0.952831]\n",
"[Epoch 139/200] [Batch 21/59] [D loss: 0.644672] [G loss: 1.713644]\n",
"[Epoch 139/200] [Batch 22/59] [D loss: 0.552011] [G loss: 0.757684]\n",
"[Epoch 139/200] [Batch 23/59] [D loss: 0.669007] [G loss: 0.549622]\n",
"[Epoch 139/200] [Batch 24/59] [D loss: 0.596880] [G loss: 1.278052]\n",
"[Epoch 139/200] [Batch 25/59] [D loss: 0.575465] [G loss: 0.848339]\n",
"[Epoch 139/200] [Batch 26/59] [D loss: 0.670924] [G loss: 0.673382]\n",
"[Epoch 139/200] [Batch 27/59] [D loss: 0.588316] [G loss: 1.155825]\n",
"[Epoch 139/200] [Batch 28/59] [D loss: 0.516186] [G loss: 1.035109]\n",
"[Epoch 139/200] [Batch 29/59] [D loss: 0.630141] [G loss: 0.912243]\n",
"[Epoch 139/200] [Batch 30/59] [D loss: 0.599157] [G loss: 0.750863]\n",
"[Epoch 139/200] [Batch 31/59] [D loss: 0.544307] [G loss: 1.069529]\n",
"[Epoch 139/200] [Batch 32/59] [D loss: 0.555996] [G loss: 0.927463]\n",
"[Epoch 139/200] [Batch 33/59] [D loss: 0.618226] [G loss: 0.668506]\n",
"[Epoch 139/200] [Batch 34/59] [D loss: 0.555080] [G loss: 1.101714]\n",
"[Epoch 139/200] [Batch 35/59] [D loss: 0.565732] [G loss: 0.866236]\n",
"[Epoch 139/200] [Batch 36/59] [D loss: 0.560938] [G loss: 0.683527]\n",
"[Epoch 139/200] [Batch 37/59] [D loss: 0.504255] [G loss: 1.021887]\n",
"[Epoch 139/200] [Batch 38/59] [D loss: 0.631624] [G loss: 0.932167]\n",
"[Epoch 139/200] [Batch 39/59] [D loss: 0.563655] [G loss: 0.720844]\n",
"[Epoch 139/200] [Batch 40/59] [D loss: 0.539180] [G loss: 1.112447]\n",
"[Epoch 139/200] [Batch 41/59] [D loss: 0.574133] [G loss: 0.830089]\n",
"[Epoch 139/200] [Batch 42/59] [D loss: 0.535451] [G loss: 1.113681]\n",
"[Epoch 139/200] [Batch 43/59] [D loss: 0.562177] [G loss: 0.938383]\n",
"[Epoch 139/200] [Batch 44/59] [D loss: 0.592966] [G loss: 1.054058]\n",
"[Epoch 139/200] [Batch 45/59] [D loss: 0.624905] [G loss: 0.873592]\n",
"[Epoch 139/200] [Batch 46/59] [D loss: 0.568607] [G loss: 1.096443]\n",
"[Epoch 139/200] [Batch 47/59] [D loss: 0.589630] [G loss: 0.726644]\n",
"[Epoch 139/200] [Batch 48/59] [D loss: 0.616006] [G loss: 1.032993]\n",
"[Epoch 139/200] [Batch 49/59] [D loss: 0.604455] [G loss: 0.905723]\n",
"[Epoch 139/200] [Batch 50/59] [D loss: 0.554178] [G loss: 0.782858]\n",
"[Epoch 139/200] [Batch 51/59] [D loss: 0.555979] [G loss: 1.178436]\n",
"[Epoch 139/200] [Batch 52/59] [D loss: 0.635987] [G loss: 0.656040]\n",
"[Epoch 139/200] [Batch 53/59] [D loss: 0.674514] [G loss: 0.934682]\n",
"[Epoch 139/200] [Batch 54/59] [D loss: 0.576279] [G loss: 1.347237]\n",
"[Epoch 139/200] [Batch 55/59] [D loss: 0.597426] [G loss: 0.989449]\n",
"[Epoch 139/200] [Batch 56/59] [D loss: 0.573028] [G loss: 0.744278]\n",
"[Epoch 139/200] [Batch 57/59] [D loss: 0.532709] [G loss: 0.872508]\n",
"[Epoch 139/200] [Batch 58/59] [D loss: 0.523290] [G loss: 1.285774]\n",
"[Epoch 140/200] [Batch 0/59] [D loss: 0.563112] [G loss: 0.918233]\n",
"[Epoch 140/200] [Batch 1/59] [D loss: 0.533891] [G loss: 0.774000]\n",
"[Epoch 140/200] [Batch 2/59] [D loss: 0.523943] [G loss: 1.246605]\n",
"[Epoch 140/200] [Batch 3/59] [D loss: 0.531708] [G loss: 1.207652]\n",
"[Epoch 140/200] [Batch 4/59] [D loss: 0.542545] [G loss: 0.586085]\n",
"[Epoch 140/200] [Batch 5/59] [D loss: 0.610109] [G loss: 0.912188]\n",
"[Epoch 140/200] [Batch 6/59] [D loss: 0.613912] [G loss: 1.130443]\n",
"[Epoch 140/200] [Batch 7/59] [D loss: 0.688795] [G loss: 0.731151]\n",
"[Epoch 140/200] [Batch 8/59] [D loss: 0.527514] [G loss: 1.063010]\n",
"[Epoch 140/200] [Batch 9/59] [D loss: 0.533002] [G loss: 1.039418]\n",
"[Epoch 140/200] [Batch 10/59] [D loss: 0.542210] [G loss: 0.803527]\n",
"[Epoch 140/200] [Batch 11/59] [D loss: 0.586509] [G loss: 0.830686]\n",
"[Epoch 140/200] [Batch 12/59] [D loss: 0.575995] [G loss: 0.930334]\n",
"[Epoch 140/200] [Batch 13/59] [D loss: 0.576796] [G loss: 1.041375]\n",
"[Epoch 140/200] [Batch 14/59] [D loss: 0.575767] [G loss: 0.634838]\n",
"[Epoch 140/200] [Batch 15/59] [D loss: 0.593914] [G loss: 1.112335]\n",
"[Epoch 140/200] [Batch 16/59] [D loss: 0.537590] [G loss: 0.909370]\n",
"[Epoch 140/200] [Batch 17/59] [D loss: 0.602703] [G loss: 0.897669]\n",
"[Epoch 140/200] [Batch 18/59] [D loss: 0.541464] [G loss: 0.810513]\n",
"[Epoch 140/200] [Batch 19/59] [D loss: 0.542975] [G loss: 1.064005]\n",
"[Epoch 140/200] [Batch 20/59] [D loss: 0.519381] [G loss: 0.707184]\n",
"[Epoch 140/200] [Batch 21/59] [D loss: 0.600019] [G loss: 0.769721]\n",
"[Epoch 140/200] [Batch 22/59] [D loss: 0.569900] [G loss: 1.135818]\n",
"[Epoch 140/200] [Batch 23/59] [D loss: 0.607462] [G loss: 1.191045]\n",
"[Epoch 140/200] [Batch 24/59] [D loss: 0.511571] [G loss: 0.792082]\n",
"[Epoch 140/200] [Batch 25/59] [D loss: 0.548398] [G loss: 0.933656]\n",
"[Epoch 140/200] [Batch 26/59] [D loss: 0.613184] [G loss: 1.103637]\n",
"[Epoch 140/200] [Batch 27/59] [D loss: 0.564492] [G loss: 0.863731]\n",
"[Epoch 140/200] [Batch 28/59] [D loss: 0.602811] [G loss: 1.005912]\n",
"[Epoch 140/200] [Batch 29/59] [D loss: 0.543329] [G loss: 0.920198]\n",
"[Epoch 140/200] [Batch 30/59] [D loss: 0.554743] [G loss: 1.078488]\n",
"[Epoch 140/200] [Batch 31/59] [D loss: 0.567674] [G loss: 0.751395]\n",
"[Epoch 140/200] [Batch 32/59] [D loss: 0.568592] [G loss: 1.088720]\n",
"[Epoch 140/200] [Batch 33/59] [D loss: 0.582374] [G loss: 1.082777]\n",
"[Epoch 140/200] [Batch 34/59] [D loss: 0.569502] [G loss: 0.681072]\n",
"[Epoch 140/200] [Batch 35/59] [D loss: 0.594123] [G loss: 1.137981]\n",
"[Epoch 140/200] [Batch 36/59] [D loss: 0.513622] [G loss: 1.061786]\n",
"[Epoch 140/200] [Batch 37/59] [D loss: 0.562918] [G loss: 0.716751]\n",
"[Epoch 140/200] [Batch 38/59] [D loss: 0.549522] [G loss: 0.997337]\n",
"[Epoch 140/200] [Batch 39/59] [D loss: 0.510522] [G loss: 0.860447]\n",
"[Epoch 140/200] [Batch 40/59] [D loss: 0.569314] [G loss: 0.739434]\n",
"[Epoch 140/200] [Batch 41/59] [D loss: 0.607785] [G loss: 1.225897]\n",
"[Epoch 140/200] [Batch 42/59] [D loss: 0.609434] [G loss: 0.726054]\n",
"[Epoch 140/200] [Batch 43/59] [D loss: 0.580258] [G loss: 0.740435]\n",
"[Epoch 140/200] [Batch 44/59] [D loss: 0.562830] [G loss: 1.130130]\n",
"[Epoch 140/200] [Batch 45/59] [D loss: 0.657886] [G loss: 1.034504]\n",
"[Epoch 140/200] [Batch 46/59] [D loss: 0.565628] [G loss: 0.650881]\n",
"[Epoch 140/200] [Batch 47/59] [D loss: 0.574793] [G loss: 0.953891]\n",
"[Epoch 140/200] [Batch 48/59] [D loss: 0.564613] [G loss: 1.134595]\n",
"[Epoch 140/200] [Batch 49/59] [D loss: 0.514635] [G loss: 0.738532]\n",
"[Epoch 140/200] [Batch 50/59] [D loss: 0.499647] [G loss: 0.886603]\n",
"[Epoch 140/200] [Batch 51/59] [D loss: 0.582802] [G loss: 1.299470]\n",
"[Epoch 140/200] [Batch 52/59] [D loss: 0.533232] [G loss: 0.802825]\n",
"[Epoch 140/200] [Batch 53/59] [D loss: 0.666002] [G loss: 0.636062]\n",
"[Epoch 140/200] [Batch 54/59] [D loss: 0.600868] [G loss: 1.208207]\n",
"[Epoch 140/200] [Batch 55/59] [D loss: 0.553047] [G loss: 1.080094]\n",
"[Epoch 140/200] [Batch 56/59] [D loss: 0.547787] [G loss: 0.820872]\n",
"[Epoch 140/200] [Batch 57/59] [D loss: 0.556363] [G loss: 1.013541]\n",
"[Epoch 140/200] [Batch 58/59] [D loss: 0.573213] [G loss: 1.154687]\n",
"[Epoch 141/200] [Batch 0/59] [D loss: 0.498444] [G loss: 0.707023]\n",
"[Epoch 141/200] [Batch 1/59] [D loss: 0.588839] [G loss: 0.601784]\n",
"[Epoch 141/200] [Batch 2/59] [D loss: 0.572890] [G loss: 1.318605]\n",
"[Epoch 141/200] [Batch 3/59] [D loss: 0.542786] [G loss: 0.702620]\n",
"[Epoch 141/200] [Batch 4/59] [D loss: 0.576801] [G loss: 0.850258]\n",
"[Epoch 141/200] [Batch 5/59] [D loss: 0.492794] [G loss: 1.121176]\n",
"[Epoch 141/200] [Batch 6/59] [D loss: 0.515455] [G loss: 0.808211]\n",
"[Epoch 141/200] [Batch 7/59] [D loss: 0.479679] [G loss: 0.779782]\n",
"[Epoch 141/200] [Batch 8/59] [D loss: 0.610016] [G loss: 1.113532]\n",
"[Epoch 141/200] [Batch 9/59] [D loss: 0.533005] [G loss: 1.086493]\n",
"[Epoch 141/200] [Batch 10/59] [D loss: 0.586405] [G loss: 0.712356]\n",
"[Epoch 141/200] [Batch 11/59] [D loss: 0.603596] [G loss: 1.060491]\n",
"[Epoch 141/200] [Batch 12/59] [D loss: 0.577860] [G loss: 0.944057]\n",
"[Epoch 141/200] [Batch 13/59] [D loss: 0.625762] [G loss: 0.592184]\n",
"[Epoch 141/200] [Batch 14/59] [D loss: 0.518165] [G loss: 1.284903]\n",
"[Epoch 141/200] [Batch 15/59] [D loss: 0.604262] [G loss: 1.096583]\n",
"[Epoch 141/200] [Batch 16/59] [D loss: 0.664217] [G loss: 0.617459]\n",
"[Epoch 141/200] [Batch 17/59] [D loss: 0.528304] [G loss: 1.202339]\n",
"[Epoch 141/200] [Batch 18/59] [D loss: 0.559401] [G loss: 0.957727]\n",
"[Epoch 141/200] [Batch 19/59] [D loss: 0.563280] [G loss: 0.937988]\n",
"[Epoch 141/200] [Batch 20/59] [D loss: 0.546553] [G loss: 0.848733]\n",
"[Epoch 141/200] [Batch 21/59] [D loss: 0.503630] [G loss: 1.108934]\n",
"[Epoch 141/200] [Batch 22/59] [D loss: 0.603341] [G loss: 0.892972]\n",
"[Epoch 141/200] [Batch 23/59] [D loss: 0.555542] [G loss: 0.996022]\n",
"[Epoch 141/200] [Batch 24/59] [D loss: 0.603045] [G loss: 0.973888]\n",
"[Epoch 141/200] [Batch 25/59] [D loss: 0.585624] [G loss: 0.847265]\n",
"[Epoch 141/200] [Batch 26/59] [D loss: 0.526555] [G loss: 1.109482]\n",
"[Epoch 141/200] [Batch 27/59] [D loss: 0.607842] [G loss: 0.850740]\n",
"[Epoch 141/200] [Batch 28/59] [D loss: 0.518258] [G loss: 1.031727]\n",
"[Epoch 141/200] [Batch 29/59] [D loss: 0.646930] [G loss: 0.859401]\n",
"[Epoch 141/200] [Batch 30/59] [D loss: 0.614980] [G loss: 0.609830]\n",
"[Epoch 141/200] [Batch 31/59] [D loss: 0.664952] [G loss: 1.431776]\n",
"[Epoch 141/200] [Batch 32/59] [D loss: 0.536452] [G loss: 1.164466]\n",
"[Epoch 141/200] [Batch 33/59] [D loss: 0.618740] [G loss: 0.913472]\n",
"[Epoch 141/200] [Batch 34/59] [D loss: 0.535098] [G loss: 1.172279]\n",
"[Epoch 141/200] [Batch 35/59] [D loss: 0.512999] [G loss: 0.908832]\n",
"[Epoch 141/200] [Batch 36/59] [D loss: 0.478278] [G loss: 0.820014]\n",
"[Epoch 141/200] [Batch 37/59] [D loss: 0.582467] [G loss: 0.935966]\n",
"[Epoch 141/200] [Batch 38/59] [D loss: 0.556239] [G loss: 0.690461]\n",
"[Epoch 141/200] [Batch 39/59] [D loss: 0.644516] [G loss: 1.096940]\n",
"[Epoch 141/200] [Batch 40/59] [D loss: 0.561856] [G loss: 0.937107]\n",
"[Epoch 141/200] [Batch 41/59] [D loss: 0.532825] [G loss: 1.154636]\n",
"[Epoch 141/200] [Batch 42/59] [D loss: 0.575389] [G loss: 0.802391]\n",
"[Epoch 141/200] [Batch 43/59] [D loss: 0.561760] [G loss: 1.142029]\n",
"[Epoch 141/200] [Batch 44/59] [D loss: 0.557377] [G loss: 0.880388]\n",
"[Epoch 141/200] [Batch 45/59] [D loss: 0.608532] [G loss: 0.994312]\n",
"[Epoch 141/200] [Batch 46/59] [D loss: 0.546819] [G loss: 1.150124]\n",
"[Epoch 141/200] [Batch 47/59] [D loss: 0.542730] [G loss: 0.798358]\n",
"[Epoch 141/200] [Batch 48/59] [D loss: 0.602854] [G loss: 0.890609]\n",
"[Epoch 141/200] [Batch 49/59] [D loss: 0.625677] [G loss: 1.458404]\n",
"[Epoch 141/200] [Batch 50/59] [D loss: 0.560123] [G loss: 0.975949]\n",
"[Epoch 141/200] [Batch 51/59] [D loss: 0.776457] [G loss: 0.552855]\n",
"[Epoch 141/200] [Batch 52/59] [D loss: 0.663161] [G loss: 1.623360]\n",
"[Epoch 141/200] [Batch 53/59] [D loss: 0.506376] [G loss: 0.959139]\n",
"[Epoch 141/200] [Batch 54/59] [D loss: 0.635631] [G loss: 0.459704]\n",
"[Epoch 141/200] [Batch 55/59] [D loss: 0.556975] [G loss: 1.079444]\n",
"[Epoch 141/200] [Batch 56/59] [D loss: 0.550464] [G loss: 1.438105]\n",
"[Epoch 141/200] [Batch 57/59] [D loss: 0.565893] [G loss: 0.741121]\n",
"[Epoch 141/200] [Batch 58/59] [D loss: 0.666860] [G loss: 0.911646]\n",
"[Epoch 142/200] [Batch 0/59] [D loss: 0.619177] [G loss: 1.269721]\n",
"[Epoch 142/200] [Batch 1/59] [D loss: 0.566054] [G loss: 0.922672]\n",
"[Epoch 142/200] [Batch 2/59] [D loss: 0.568505] [G loss: 0.752411]\n",
"[Epoch 142/200] [Batch 3/59] [D loss: 0.513265] [G loss: 1.205822]\n",
"[Epoch 142/200] [Batch 4/59] [D loss: 0.503955] [G loss: 0.770017]\n",
"[Epoch 142/200] [Batch 5/59] [D loss: 0.594700] [G loss: 0.775580]\n",
"[Epoch 142/200] [Batch 6/59] [D loss: 0.594337] [G loss: 1.159508]\n",
"[Epoch 142/200] [Batch 7/59] [D loss: 0.547485] [G loss: 0.995846]\n",
"[Epoch 142/200] [Batch 8/59] [D loss: 0.561937] [G loss: 0.659851]\n",
"[Epoch 142/200] [Batch 9/59] [D loss: 0.565166] [G loss: 1.049651]\n",
"[Epoch 142/200] [Batch 10/59] [D loss: 0.607186] [G loss: 1.341872]\n",
"[Epoch 142/200] [Batch 11/59] [D loss: 0.579163] [G loss: 0.808339]\n",
"[Epoch 142/200] [Batch 12/59] [D loss: 0.615180] [G loss: 0.817298]\n",
"[Epoch 142/200] [Batch 13/59] [D loss: 0.500099] [G loss: 1.317345]\n",
"[Epoch 142/200] [Batch 14/59] [D loss: 0.620535] [G loss: 0.909289]\n",
"[Epoch 142/200] [Batch 15/59] [D loss: 0.498189] [G loss: 0.722207]\n",
"[Epoch 142/200] [Batch 16/59] [D loss: 0.542533] [G loss: 0.725035]\n",
"[Epoch 142/200] [Batch 17/59] [D loss: 0.518675] [G loss: 1.106793]\n",
"[Epoch 142/200] [Batch 18/59] [D loss: 0.522954] [G loss: 1.002016]\n",
"[Epoch 142/200] [Batch 19/59] [D loss: 0.565500] [G loss: 0.789696]\n",
"[Epoch 142/200] [Batch 20/59] [D loss: 0.558148] [G loss: 0.911759]\n",
"[Epoch 142/200] [Batch 21/59] [D loss: 0.584504] [G loss: 1.257651]\n",
"[Epoch 142/200] [Batch 22/59] [D loss: 0.607018] [G loss: 0.722969]\n",
"[Epoch 142/200] [Batch 23/59] [D loss: 0.573272] [G loss: 0.840436]\n",
"[Epoch 142/200] [Batch 24/59] [D loss: 0.577825] [G loss: 1.018264]\n",
"[Epoch 142/200] [Batch 25/59] [D loss: 0.595502] [G loss: 0.858190]\n",
"[Epoch 142/200] [Batch 26/59] [D loss: 0.565282] [G loss: 0.872053]\n",
"[Epoch 142/200] [Batch 27/59] [D loss: 0.626710] [G loss: 0.971747]\n",
"[Epoch 142/200] [Batch 28/59] [D loss: 0.602770] [G loss: 0.877717]\n",
"[Epoch 142/200] [Batch 29/59] [D loss: 0.594352] [G loss: 0.883596]\n",
"[Epoch 142/200] [Batch 30/59] [D loss: 0.659542] [G loss: 1.069373]\n",
"[Epoch 142/200] [Batch 31/59] [D loss: 0.561015] [G loss: 0.959633]\n",
"[Epoch 142/200] [Batch 32/59] [D loss: 0.604898] [G loss: 0.717936]\n",
"[Epoch 142/200] [Batch 33/59] [D loss: 0.662042] [G loss: 1.409799]\n",
"[Epoch 142/200] [Batch 34/59] [D loss: 0.560379] [G loss: 0.884341]\n",
"[Epoch 142/200] [Batch 35/59] [D loss: 0.729720] [G loss: 0.572577]\n",
"[Epoch 142/200] [Batch 36/59] [D loss: 0.668009] [G loss: 1.326784]\n",
"[Epoch 142/200] [Batch 37/59] [D loss: 0.565901] [G loss: 0.832149]\n",
"[Epoch 142/200] [Batch 38/59] [D loss: 0.610500] [G loss: 0.938112]\n",
"[Epoch 142/200] [Batch 39/59] [D loss: 0.518943] [G loss: 1.045894]\n",
"[Epoch 142/200] [Batch 40/59] [D loss: 0.539313] [G loss: 1.012307]\n",
"[Epoch 142/200] [Batch 41/59] [D loss: 0.600905] [G loss: 0.925767]\n",
"[Epoch 142/200] [Batch 42/59] [D loss: 0.533054] [G loss: 0.841865]\n",
"[Epoch 142/200] [Batch 43/59] [D loss: 0.565006] [G loss: 1.115050]\n",
"[Epoch 142/200] [Batch 44/59] [D loss: 0.604220] [G loss: 0.924679]\n",
"[Epoch 142/200] [Batch 45/59] [D loss: 0.595771] [G loss: 0.854167]\n",
"[Epoch 142/200] [Batch 46/59] [D loss: 0.516646] [G loss: 1.165980]\n",
"[Epoch 142/200] [Batch 47/59] [D loss: 0.578183] [G loss: 1.042451]\n",
"[Epoch 142/200] [Batch 48/59] [D loss: 0.504554] [G loss: 0.778343]\n",
"[Epoch 142/200] [Batch 49/59] [D loss: 0.560856] [G loss: 0.974370]\n",
"[Epoch 142/200] [Batch 50/59] [D loss: 0.444387] [G loss: 0.998236]\n",
"[Epoch 142/200] [Batch 51/59] [D loss: 0.508143] [G loss: 0.821591]\n",
"[Epoch 142/200] [Batch 52/59] [D loss: 0.584056] [G loss: 0.966319]\n",
"[Epoch 142/200] [Batch 53/59] [D loss: 0.576095] [G loss: 0.965716]\n",
"[Epoch 142/200] [Batch 54/59] [D loss: 0.508739] [G loss: 0.872165]\n",
"[Epoch 142/200] [Batch 55/59] [D loss: 0.521776] [G loss: 0.950400]\n",
"[Epoch 142/200] [Batch 56/59] [D loss: 0.517093] [G loss: 0.871993]\n",
"[Epoch 142/200] [Batch 57/59] [D loss: 0.567810] [G loss: 0.985357]\n",
"[Epoch 142/200] [Batch 58/59] [D loss: 0.505063] [G loss: 0.826958]\n",
"[Epoch 143/200] [Batch 0/59] [D loss: 0.578277] [G loss: 0.831439]\n",
"[Epoch 143/200] [Batch 1/59] [D loss: 0.534788] [G loss: 1.323514]\n",
"[Epoch 143/200] [Batch 2/59] [D loss: 0.567931] [G loss: 0.981389]\n",
"[Epoch 143/200] [Batch 3/59] [D loss: 0.621529] [G loss: 0.604806]\n",
"[Epoch 143/200] [Batch 4/59] [D loss: 0.540429] [G loss: 1.112198]\n",
"[Epoch 143/200] [Batch 5/59] [D loss: 0.602526] [G loss: 1.210578]\n",
"[Epoch 143/200] [Batch 6/59] [D loss: 0.607023] [G loss: 0.673854]\n",
"[Epoch 143/200] [Batch 7/59] [D loss: 0.615200] [G loss: 1.063133]\n",
"[Epoch 143/200] [Batch 8/59] [D loss: 0.554432] [G loss: 1.193878]\n",
"[Epoch 143/200] [Batch 9/59] [D loss: 0.715899] [G loss: 0.586704]\n",
"[Epoch 143/200] [Batch 10/59] [D loss: 0.599104] [G loss: 1.204895]\n",
"[Epoch 143/200] [Batch 11/59] [D loss: 0.485690] [G loss: 0.854922]\n",
"[Epoch 143/200] [Batch 12/59] [D loss: 0.651595] [G loss: 0.776521]\n",
"[Epoch 143/200] [Batch 13/59] [D loss: 0.502757] [G loss: 1.243748]\n",
"[Epoch 143/200] [Batch 14/59] [D loss: 0.537425] [G loss: 0.857963]\n",
"[Epoch 143/200] [Batch 15/59] [D loss: 0.532899] [G loss: 0.844809]\n",
"[Epoch 143/200] [Batch 16/59] [D loss: 0.557553] [G loss: 1.248071]\n",
"[Epoch 143/200] [Batch 17/59] [D loss: 0.534220] [G loss: 0.910543]\n",
"[Epoch 143/200] [Batch 18/59] [D loss: 0.601871] [G loss: 0.937066]\n",
"[Epoch 143/200] [Batch 19/59] [D loss: 0.538259] [G loss: 1.077764]\n",
"[Epoch 143/200] [Batch 20/59] [D loss: 0.523903] [G loss: 0.919214]\n",
"[Epoch 143/200] [Batch 21/59] [D loss: 0.563300] [G loss: 0.875541]\n",
"[Epoch 143/200] [Batch 22/59] [D loss: 0.548304] [G loss: 0.813470]\n",
"[Epoch 143/200] [Batch 23/59] [D loss: 0.544418] [G loss: 1.098050]\n",
"[Epoch 143/200] [Batch 24/59] [D loss: 0.495485] [G loss: 0.813958]\n",
"[Epoch 143/200] [Batch 25/59] [D loss: 0.573379] [G loss: 0.914249]\n",
"[Epoch 143/200] [Batch 26/59] [D loss: 0.585917] [G loss: 0.915988]\n",
"[Epoch 143/200] [Batch 27/59] [D loss: 0.600607] [G loss: 0.941545]\n",
"[Epoch 143/200] [Batch 28/59] [D loss: 0.603994] [G loss: 0.821765]\n",
"[Epoch 143/200] [Batch 29/59] [D loss: 0.536437] [G loss: 1.123769]\n",
"[Epoch 143/200] [Batch 30/59] [D loss: 0.594499] [G loss: 0.943615]\n",
"[Epoch 143/200] [Batch 31/59] [D loss: 0.593984] [G loss: 0.795853]\n",
"[Epoch 143/200] [Batch 32/59] [D loss: 0.531014] [G loss: 0.900386]\n",
"[Epoch 143/200] [Batch 33/59] [D loss: 0.557776] [G loss: 0.944660]\n",
"[Epoch 143/200] [Batch 34/59] [D loss: 0.602897] [G loss: 1.045799]\n",
"[Epoch 143/200] [Batch 35/59] [D loss: 0.576372] [G loss: 0.883586]\n",
"[Epoch 143/200] [Batch 36/59] [D loss: 0.646523] [G loss: 1.093177]\n",
"[Epoch 143/200] [Batch 37/59] [D loss: 0.506131] [G loss: 0.911770]\n",
"[Epoch 143/200] [Batch 38/59] [D loss: 0.578153] [G loss: 0.713261]\n",
"[Epoch 143/200] [Batch 39/59] [D loss: 0.501111] [G loss: 1.239069]\n",
"[Epoch 143/200] [Batch 40/59] [D loss: 0.591700] [G loss: 1.000305]\n",
"[Epoch 143/200] [Batch 41/59] [D loss: 0.653834] [G loss: 0.816263]\n",
"[Epoch 143/200] [Batch 42/59] [D loss: 0.654021] [G loss: 1.132927]\n",
"[Epoch 143/200] [Batch 43/59] [D loss: 0.536720] [G loss: 0.804550]\n",
"[Epoch 143/200] [Batch 44/59] [D loss: 0.565540] [G loss: 0.731596]\n",
"[Epoch 143/200] [Batch 45/59] [D loss: 0.568074] [G loss: 1.082729]\n",
"[Epoch 143/200] [Batch 46/59] [D loss: 0.540264] [G loss: 1.034843]\n",
"[Epoch 143/200] [Batch 47/59] [D loss: 0.497323] [G loss: 0.705673]\n",
"[Epoch 143/200] [Batch 48/59] [D loss: 0.497646] [G loss: 0.929386]\n",
"[Epoch 143/200] [Batch 49/59] [D loss: 0.625421] [G loss: 0.938774]\n",
"[Epoch 143/200] [Batch 50/59] [D loss: 0.556913] [G loss: 0.767631]\n",
"[Epoch 143/200] [Batch 51/59] [D loss: 0.599114] [G loss: 0.976211]\n",
"[Epoch 143/200] [Batch 52/59] [D loss: 0.580288] [G loss: 1.135840]\n",
"[Epoch 143/200] [Batch 53/59] [D loss: 0.522943] [G loss: 0.730937]\n",
"[Epoch 143/200] [Batch 54/59] [D loss: 0.661419] [G loss: 0.764481]\n",
"[Epoch 143/200] [Batch 55/59] [D loss: 0.575973] [G loss: 1.131112]\n",
"[Epoch 143/200] [Batch 56/59] [D loss: 0.549998] [G loss: 0.662248]\n",
"[Epoch 143/200] [Batch 57/59] [D loss: 0.540924] [G loss: 0.945376]\n",
"[Epoch 143/200] [Batch 58/59] [D loss: 0.497892] [G loss: 1.044032]\n",
"[Epoch 144/200] [Batch 0/59] [D loss: 0.480084] [G loss: 1.216301]\n",
"[Epoch 144/200] [Batch 1/59] [D loss: 0.539911] [G loss: 0.809785]\n",
"[Epoch 144/200] [Batch 2/59] [D loss: 0.515912] [G loss: 1.048077]\n",
"[Epoch 144/200] [Batch 3/59] [D loss: 0.529534] [G loss: 0.963236]\n",
"[Epoch 144/200] [Batch 4/59] [D loss: 0.581928] [G loss: 0.932207]\n",
"[Epoch 144/200] [Batch 5/59] [D loss: 0.627229] [G loss: 1.047048]\n",
"[Epoch 144/200] [Batch 6/59] [D loss: 0.535247] [G loss: 0.738470]\n",
"[Epoch 144/200] [Batch 7/59] [D loss: 0.554220] [G loss: 0.748793]\n",
"[Epoch 144/200] [Batch 8/59] [D loss: 0.527680] [G loss: 1.010467]\n",
"[Epoch 144/200] [Batch 9/59] [D loss: 0.591091] [G loss: 1.003925]\n",
"[Epoch 144/200] [Batch 10/59] [D loss: 0.611549] [G loss: 0.666436]\n",
"[Epoch 144/200] [Batch 11/59] [D loss: 0.525747] [G loss: 1.012558]\n",
"[Epoch 144/200] [Batch 12/59] [D loss: 0.543143] [G loss: 1.077756]\n",
"[Epoch 144/200] [Batch 13/59] [D loss: 0.589339] [G loss: 0.737443]\n",
"[Epoch 144/200] [Batch 14/59] [D loss: 0.651655] [G loss: 1.073963]\n",
"[Epoch 144/200] [Batch 15/59] [D loss: 0.588994] [G loss: 0.887671]\n",
"[Epoch 144/200] [Batch 16/59] [D loss: 0.495573] [G loss: 1.015995]\n",
"[Epoch 144/200] [Batch 17/59] [D loss: 0.574597] [G loss: 0.590708]\n",
"[Epoch 144/200] [Batch 18/59] [D loss: 0.641779] [G loss: 1.702647]\n",
"[Epoch 144/200] [Batch 19/59] [D loss: 0.497071] [G loss: 0.820055]\n",
"[Epoch 144/200] [Batch 20/59] [D loss: 0.630064] [G loss: 0.561294]\n",
"[Epoch 144/200] [Batch 21/59] [D loss: 0.610363] [G loss: 1.562774]\n",
"[Epoch 144/200] [Batch 22/59] [D loss: 0.521733] [G loss: 1.153180]\n",
"[Epoch 144/200] [Batch 23/59] [D loss: 0.596629] [G loss: 0.540419]\n",
"[Epoch 144/200] [Batch 24/59] [D loss: 0.481804] [G loss: 1.241262]\n",
"[Epoch 144/200] [Batch 25/59] [D loss: 0.595159] [G loss: 1.294834]\n",
"[Epoch 144/200] [Batch 26/59] [D loss: 0.644773] [G loss: 0.778660]\n",
"[Epoch 144/200] [Batch 27/59] [D loss: 0.527492] [G loss: 0.801812]\n",
"[Epoch 144/200] [Batch 28/59] [D loss: 0.578710] [G loss: 1.271112]\n",
"[Epoch 144/200] [Batch 29/59] [D loss: 0.527188] [G loss: 0.974300]\n",
"[Epoch 144/200] [Batch 30/59] [D loss: 0.571503] [G loss: 0.654745]\n",
"[Epoch 144/200] [Batch 31/59] [D loss: 0.580990] [G loss: 1.289429]\n",
"[Epoch 144/200] [Batch 32/59] [D loss: 0.597121] [G loss: 1.043602]\n",
"[Epoch 144/200] [Batch 33/59] [D loss: 0.685511] [G loss: 0.398400]\n",
"[Epoch 144/200] [Batch 34/59] [D loss: 0.653352] [G loss: 1.380256]\n",
"[Epoch 144/200] [Batch 35/59] [D loss: 0.588049] [G loss: 1.143920]\n",
"[Epoch 144/200] [Batch 36/59] [D loss: 0.508505] [G loss: 0.786491]\n",
"[Epoch 144/200] [Batch 37/59] [D loss: 0.556967] [G loss: 1.015873]\n",
"[Epoch 144/200] [Batch 38/59] [D loss: 0.521147] [G loss: 0.782372]\n",
"[Epoch 144/200] [Batch 39/59] [D loss: 0.510237] [G loss: 0.896705]\n",
"[Epoch 144/200] [Batch 40/59] [D loss: 0.558145] [G loss: 0.913999]\n",
"[Epoch 144/200] [Batch 41/59] [D loss: 0.541634] [G loss: 1.100171]\n",
"[Epoch 144/200] [Batch 42/59] [D loss: 0.576279] [G loss: 0.848944]\n",
"[Epoch 144/200] [Batch 43/59] [D loss: 0.579215] [G loss: 1.095286]\n",
"[Epoch 144/200] [Batch 44/59] [D loss: 0.583447] [G loss: 1.108762]\n",
"[Epoch 144/200] [Batch 45/59] [D loss: 0.658128] [G loss: 0.828333]\n",
"[Epoch 144/200] [Batch 46/59] [D loss: 0.546193] [G loss: 0.829633]\n",
"[Epoch 144/200] [Batch 47/59] [D loss: 0.551606] [G loss: 1.109172]\n",
"[Epoch 144/200] [Batch 48/59] [D loss: 0.582035] [G loss: 0.934641]\n",
"[Epoch 144/200] [Batch 49/59] [D loss: 0.614527] [G loss: 0.914687]\n",
"[Epoch 144/200] [Batch 50/59] [D loss: 0.499859] [G loss: 0.971042]\n",
"[Epoch 144/200] [Batch 51/59] [D loss: 0.606078] [G loss: 1.211949]\n",
"[Epoch 144/200] [Batch 52/59] [D loss: 0.546559] [G loss: 0.972052]\n",
"[Epoch 144/200] [Batch 53/59] [D loss: 0.577613] [G loss: 1.024159]\n",
"[Epoch 144/200] [Batch 54/59] [D loss: 0.611304] [G loss: 0.703269]\n",
"[Epoch 144/200] [Batch 55/59] [D loss: 0.602386] [G loss: 0.979502]\n",
"[Epoch 144/200] [Batch 56/59] [D loss: 0.573743] [G loss: 0.893366]\n",
"[Epoch 144/200] [Batch 57/59] [D loss: 0.605784] [G loss: 0.667470]\n",
"[Epoch 144/200] [Batch 58/59] [D loss: 0.581570] [G loss: 1.119370]\n",
"[Epoch 145/200] [Batch 0/59] [D loss: 0.591742] [G loss: 0.834771]\n",
"[Epoch 145/200] [Batch 1/59] [D loss: 0.541559] [G loss: 0.752877]\n",
"[Epoch 145/200] [Batch 2/59] [D loss: 0.661872] [G loss: 1.264927]\n",
"[Epoch 145/200] [Batch 3/59] [D loss: 0.522789] [G loss: 0.566026]\n",
"[Epoch 145/200] [Batch 4/59] [D loss: 0.535740] [G loss: 0.721417]\n",
"[Epoch 145/200] [Batch 5/59] [D loss: 0.585182] [G loss: 0.937431]\n",
"[Epoch 145/200] [Batch 6/59] [D loss: 0.526143] [G loss: 1.278352]\n",
"[Epoch 145/200] [Batch 7/59] [D loss: 0.586708] [G loss: 0.781766]\n",
"[Epoch 145/200] [Batch 8/59] [D loss: 0.540296] [G loss: 0.934907]\n",
"[Epoch 145/200] [Batch 9/59] [D loss: 0.539650] [G loss: 1.121107]\n",
"[Epoch 145/200] [Batch 10/59] [D loss: 0.553604] [G loss: 0.992911]\n",
"[Epoch 145/200] [Batch 11/59] [D loss: 0.496001] [G loss: 0.811145]\n",
"[Epoch 145/200] [Batch 12/59] [D loss: 0.520945] [G loss: 0.801472]\n",
"[Epoch 145/200] [Batch 13/59] [D loss: 0.608387] [G loss: 1.175169]\n",
"[Epoch 145/200] [Batch 14/59] [D loss: 0.609044] [G loss: 0.696361]\n",
"[Epoch 145/200] [Batch 15/59] [D loss: 0.571105] [G loss: 0.820859]\n",
"[Epoch 145/200] [Batch 16/59] [D loss: 0.572708] [G loss: 1.075157]\n",
"[Epoch 145/200] [Batch 17/59] [D loss: 0.617993] [G loss: 0.749573]\n",
"[Epoch 145/200] [Batch 18/59] [D loss: 0.574578] [G loss: 0.947877]\n",
"[Epoch 145/200] [Batch 19/59] [D loss: 0.571005] [G loss: 0.983829]\n",
"[Epoch 145/200] [Batch 20/59] [D loss: 0.556408] [G loss: 1.161852]\n",
"[Epoch 145/200] [Batch 21/59] [D loss: 0.546737] [G loss: 0.765619]\n",
"[Epoch 145/200] [Batch 22/59] [D loss: 0.580298] [G loss: 0.845252]\n",
"[Epoch 145/200] [Batch 23/59] [D loss: 0.557237] [G loss: 1.251274]\n",
"[Epoch 145/200] [Batch 24/59] [D loss: 0.569518] [G loss: 0.811727]\n",
"[Epoch 145/200] [Batch 25/59] [D loss: 0.589081] [G loss: 0.779263]\n",
"[Epoch 145/200] [Batch 26/59] [D loss: 0.671934] [G loss: 1.291029]\n",
"[Epoch 145/200] [Batch 27/59] [D loss: 0.547386] [G loss: 1.195947]\n",
"[Epoch 145/200] [Batch 28/59] [D loss: 0.642560] [G loss: 0.662614]\n",
"[Epoch 145/200] [Batch 29/59] [D loss: 0.533089] [G loss: 1.080563]\n",
"[Epoch 145/200] [Batch 30/59] [D loss: 0.508488] [G loss: 1.009115]\n",
"[Epoch 145/200] [Batch 31/59] [D loss: 0.542660] [G loss: 1.007227]\n",
"[Epoch 145/200] [Batch 32/59] [D loss: 0.621074] [G loss: 0.712973]\n",
"[Epoch 145/200] [Batch 33/59] [D loss: 0.603712] [G loss: 1.198586]\n",
"[Epoch 145/200] [Batch 34/59] [D loss: 0.599254] [G loss: 0.728840]\n",
"[Epoch 145/200] [Batch 35/59] [D loss: 0.559296] [G loss: 0.855845]\n",
"[Epoch 145/200] [Batch 36/59] [D loss: 0.624362] [G loss: 1.054519]\n",
"[Epoch 145/200] [Batch 37/59] [D loss: 0.552377] [G loss: 0.774740]\n",
"[Epoch 145/200] [Batch 38/59] [D loss: 0.477489] [G loss: 1.004818]\n",
"[Epoch 145/200] [Batch 39/59] [D loss: 0.595806] [G loss: 1.126624]\n",
"[Epoch 145/200] [Batch 40/59] [D loss: 0.532990] [G loss: 0.854419]\n",
"[Epoch 145/200] [Batch 41/59] [D loss: 0.505725] [G loss: 0.774376]\n",
"[Epoch 145/200] [Batch 42/59] [D loss: 0.599361] [G loss: 0.679873]\n",
"[Epoch 145/200] [Batch 43/59] [D loss: 0.578813] [G loss: 1.334733]\n",
"[Epoch 145/200] [Batch 44/59] [D loss: 0.607272] [G loss: 1.073223]\n",
"[Epoch 145/200] [Batch 45/59] [D loss: 0.628621] [G loss: 0.743624]\n",
"[Epoch 145/200] [Batch 46/59] [D loss: 0.579475] [G loss: 0.796368]\n",
"[Epoch 145/200] [Batch 47/59] [D loss: 0.579072] [G loss: 0.997168]\n",
"[Epoch 145/200] [Batch 48/59] [D loss: 0.601662] [G loss: 1.007768]\n",
"[Epoch 145/200] [Batch 49/59] [D loss: 0.561819] [G loss: 0.777983]\n",
"[Epoch 145/200] [Batch 50/59] [D loss: 0.546802] [G loss: 0.972109]\n",
"[Epoch 145/200] [Batch 51/59] [D loss: 0.565590] [G loss: 0.819967]\n",
"[Epoch 145/200] [Batch 52/59] [D loss: 0.534060] [G loss: 0.926876]\n",
"[Epoch 145/200] [Batch 53/59] [D loss: 0.641513] [G loss: 1.255141]\n",
"[Epoch 145/200] [Batch 54/59] [D loss: 0.580684] [G loss: 0.842891]\n",
"[Epoch 145/200] [Batch 55/59] [D loss: 0.505876] [G loss: 0.733918]\n",
"[Epoch 145/200] [Batch 56/59] [D loss: 0.556456] [G loss: 0.890255]\n",
"[Epoch 145/200] [Batch 57/59] [D loss: 0.526859] [G loss: 1.292494]\n",
"[Epoch 145/200] [Batch 58/59] [D loss: 0.538611] [G loss: 0.870853]\n",
"[Epoch 146/200] [Batch 0/59] [D loss: 0.595890] [G loss: 0.703805]\n",
"[Epoch 146/200] [Batch 1/59] [D loss: 0.538190] [G loss: 0.980803]\n",
"[Epoch 146/200] [Batch 2/59] [D loss: 0.590761] [G loss: 0.880387]\n",
"[Epoch 146/200] [Batch 3/59] [D loss: 0.536760] [G loss: 0.945892]\n",
"[Epoch 146/200] [Batch 4/59] [D loss: 0.507773] [G loss: 1.031820]\n",
"[Epoch 146/200] [Batch 5/59] [D loss: 0.554893] [G loss: 0.999443]\n",
"[Epoch 146/200] [Batch 6/59] [D loss: 0.557047] [G loss: 0.875586]\n",
"[Epoch 146/200] [Batch 7/59] [D loss: 0.527678] [G loss: 0.819012]\n",
"[Epoch 146/200] [Batch 8/59] [D loss: 0.592134] [G loss: 1.265996]\n",
"[Epoch 146/200] [Batch 9/59] [D loss: 0.597905] [G loss: 0.770846]\n",
"[Epoch 146/200] [Batch 10/59] [D loss: 0.540406] [G loss: 1.007955]\n",
"[Epoch 146/200] [Batch 11/59] [D loss: 0.743097] [G loss: 1.341371]\n",
"[Epoch 146/200] [Batch 12/59] [D loss: 0.477155] [G loss: 0.779980]\n",
"[Epoch 146/200] [Batch 13/59] [D loss: 0.675391] [G loss: 0.523362]\n",
"[Epoch 146/200] [Batch 14/59] [D loss: 0.557879] [G loss: 1.779628]\n",
"[Epoch 146/200] [Batch 15/59] [D loss: 0.464124] [G loss: 1.158351]\n",
"[Epoch 146/200] [Batch 16/59] [D loss: 0.725256] [G loss: 0.428614]\n",
"[Epoch 146/200] [Batch 17/59] [D loss: 0.512046] [G loss: 1.503237]\n",
"[Epoch 146/200] [Batch 18/59] [D loss: 0.605259] [G loss: 1.117864]\n",
"[Epoch 146/200] [Batch 19/59] [D loss: 0.597320] [G loss: 0.709835]\n",
"[Epoch 146/200] [Batch 20/59] [D loss: 0.496407] [G loss: 1.041569]\n",
"[Epoch 146/200] [Batch 21/59] [D loss: 0.535616] [G loss: 1.167427]\n",
"[Epoch 146/200] [Batch 22/59] [D loss: 0.567991] [G loss: 0.780919]\n",
"[Epoch 146/200] [Batch 23/59] [D loss: 0.517781] [G loss: 0.920165]\n",
"[Epoch 146/200] [Batch 24/59] [D loss: 0.567988] [G loss: 1.073810]\n",
"[Epoch 146/200] [Batch 25/59] [D loss: 0.583287] [G loss: 0.953548]\n",
"[Epoch 146/200] [Batch 26/59] [D loss: 0.516161] [G loss: 1.000879]\n",
"[Epoch 146/200] [Batch 27/59] [D loss: 0.540100] [G loss: 0.869185]\n",
"[Epoch 146/200] [Batch 28/59] [D loss: 0.575836] [G loss: 0.771229]\n",
"[Epoch 146/200] [Batch 29/59] [D loss: 0.633957] [G loss: 1.108675]\n",
"[Epoch 146/200] [Batch 30/59] [D loss: 0.592287] [G loss: 0.810969]\n",
"[Epoch 146/200] [Batch 31/59] [D loss: 0.572665] [G loss: 0.728647]\n",
"[Epoch 146/200] [Batch 32/59] [D loss: 0.507869] [G loss: 1.073420]\n",
"[Epoch 146/200] [Batch 33/59] [D loss: 0.586386] [G loss: 1.016312]\n",
"[Epoch 146/200] [Batch 34/59] [D loss: 0.591689] [G loss: 0.962779]\n",
"[Epoch 146/200] [Batch 35/59] [D loss: 0.549281] [G loss: 1.158469]\n",
"[Epoch 146/200] [Batch 36/59] [D loss: 0.535674] [G loss: 0.951006]\n",
"[Epoch 146/200] [Batch 37/59] [D loss: 0.550713] [G loss: 0.759335]\n",
"[Epoch 146/200] [Batch 38/59] [D loss: 0.560713] [G loss: 0.775790]\n",
"[Epoch 146/200] [Batch 39/59] [D loss: 0.562572] [G loss: 1.122560]\n",
"[Epoch 146/200] [Batch 40/59] [D loss: 0.599054] [G loss: 0.983613]\n",
"[Epoch 146/200] [Batch 41/59] [D loss: 0.578336] [G loss: 0.722371]\n",
"[Epoch 146/200] [Batch 42/59] [D loss: 0.572421] [G loss: 1.121058]\n",
"[Epoch 146/200] [Batch 43/59] [D loss: 0.535680] [G loss: 0.728037]\n",
"[Epoch 146/200] [Batch 44/59] [D loss: 0.558907] [G loss: 1.211326]\n",
"[Epoch 146/200] [Batch 45/59] [D loss: 0.586200] [G loss: 0.850023]\n",
"[Epoch 146/200] [Batch 46/59] [D loss: 0.568977] [G loss: 0.652505]\n",
"[Epoch 146/200] [Batch 47/59] [D loss: 0.560172] [G loss: 0.711066]\n",
"[Epoch 146/200] [Batch 48/59] [D loss: 0.586598] [G loss: 1.063071]\n",
"[Epoch 146/200] [Batch 49/59] [D loss: 0.519777] [G loss: 0.890408]\n",
"[Epoch 146/200] [Batch 50/59] [D loss: 0.525613] [G loss: 0.919677]\n",
"[Epoch 146/200] [Batch 51/59] [D loss: 0.616363] [G loss: 1.102111]\n",
"[Epoch 146/200] [Batch 52/59] [D loss: 0.545098] [G loss: 0.792481]\n",
"[Epoch 146/200] [Batch 53/59] [D loss: 0.640192] [G loss: 0.801437]\n",
"[Epoch 146/200] [Batch 54/59] [D loss: 0.557881] [G loss: 1.667771]\n",
"[Epoch 146/200] [Batch 55/59] [D loss: 0.526339] [G loss: 0.792286]\n",
"[Epoch 146/200] [Batch 56/59] [D loss: 0.637461] [G loss: 0.441415]\n",
"[Epoch 146/200] [Batch 57/59] [D loss: 0.576114] [G loss: 1.613541]\n",
"[Epoch 146/200] [Batch 58/59] [D loss: 0.477609] [G loss: 1.056157]\n",
"[Epoch 147/200] [Batch 0/59] [D loss: 0.645602] [G loss: 0.753108]\n",
"[Epoch 147/200] [Batch 1/59] [D loss: 0.544946] [G loss: 1.338169]\n",
"[Epoch 147/200] [Batch 2/59] [D loss: 0.579515] [G loss: 0.756306]\n",
"[Epoch 147/200] [Batch 3/59] [D loss: 0.646535] [G loss: 0.826502]\n",
"[Epoch 147/200] [Batch 4/59] [D loss: 0.518092] [G loss: 1.183585]\n",
"[Epoch 147/200] [Batch 5/59] [D loss: 0.522293] [G loss: 0.972003]\n",
"[Epoch 147/200] [Batch 6/59] [D loss: 0.507541] [G loss: 0.936191]\n",
"[Epoch 147/200] [Batch 7/59] [D loss: 0.580280] [G loss: 1.044210]\n",
"[Epoch 147/200] [Batch 8/59] [D loss: 0.537414] [G loss: 1.067409]\n",
"[Epoch 147/200] [Batch 9/59] [D loss: 0.601993] [G loss: 0.703090]\n",
"[Epoch 147/200] [Batch 10/59] [D loss: 0.535226] [G loss: 1.218693]\n",
"[Epoch 147/200] [Batch 11/59] [D loss: 0.560468] [G loss: 0.637305]\n",
"[Epoch 147/200] [Batch 12/59] [D loss: 0.579043] [G loss: 1.125745]\n",
"[Epoch 147/200] [Batch 13/59] [D loss: 0.595918] [G loss: 0.798771]\n",
"[Epoch 147/200] [Batch 14/59] [D loss: 0.556271] [G loss: 0.866415]\n",
"[Epoch 147/200] [Batch 15/59] [D loss: 0.511984] [G loss: 0.923597]\n",
"[Epoch 147/200] [Batch 16/59] [D loss: 0.676624] [G loss: 0.689236]\n",
"[Epoch 147/200] [Batch 17/59] [D loss: 0.669843] [G loss: 1.110643]\n",
"[Epoch 147/200] [Batch 18/59] [D loss: 0.562662] [G loss: 0.915423]\n",
"[Epoch 147/200] [Batch 19/59] [D loss: 0.652753] [G loss: 0.724340]\n",
"[Epoch 147/200] [Batch 20/59] [D loss: 0.544623] [G loss: 1.019104]\n",
"[Epoch 147/200] [Batch 21/59] [D loss: 0.589148] [G loss: 1.084060]\n",
"[Epoch 147/200] [Batch 22/59] [D loss: 0.501588] [G loss: 0.833644]\n",
"[Epoch 147/200] [Batch 23/59] [D loss: 0.514626] [G loss: 0.891151]\n",
"[Epoch 147/200] [Batch 24/59] [D loss: 0.619052] [G loss: 1.088450]\n",
"[Epoch 147/200] [Batch 25/59] [D loss: 0.518663] [G loss: 0.728200]\n",
"[Epoch 147/200] [Batch 26/59] [D loss: 0.592497] [G loss: 1.051147]\n",
"[Epoch 147/200] [Batch 27/59] [D loss: 0.559999] [G loss: 0.726306]\n",
"[Epoch 147/200] [Batch 28/59] [D loss: 0.499193] [G loss: 0.976729]\n",
"[Epoch 147/200] [Batch 29/59] [D loss: 0.529994] [G loss: 1.023107]\n",
"[Epoch 147/200] [Batch 30/59] [D loss: 0.650895] [G loss: 0.769087]\n",
"[Epoch 147/200] [Batch 31/59] [D loss: 0.627787] [G loss: 0.851903]\n",
"[Epoch 147/200] [Batch 32/59] [D loss: 0.521976] [G loss: 1.099681]\n",
"[Epoch 147/200] [Batch 33/59] [D loss: 0.579435] [G loss: 0.776874]\n",
"[Epoch 147/200] [Batch 34/59] [D loss: 0.519494] [G loss: 1.090182]\n",
"[Epoch 147/200] [Batch 35/59] [D loss: 0.556470] [G loss: 1.119291]\n",
"[Epoch 147/200] [Batch 36/59] [D loss: 0.567975] [G loss: 0.767575]\n",
"[Epoch 147/200] [Batch 37/59] [D loss: 0.522281] [G loss: 1.186879]\n",
"[Epoch 147/200] [Batch 38/59] [D loss: 0.545408] [G loss: 0.930481]\n",
"[Epoch 147/200] [Batch 39/59] [D loss: 0.559397] [G loss: 0.770578]\n",
"[Epoch 147/200] [Batch 40/59] [D loss: 0.544796] [G loss: 0.872712]\n",
"[Epoch 147/200] [Batch 41/59] [D loss: 0.590587] [G loss: 0.804786]\n",
"[Epoch 147/200] [Batch 42/59] [D loss: 0.535141] [G loss: 1.137430]\n",
"[Epoch 147/200] [Batch 43/59] [D loss: 0.518304] [G loss: 1.100397]\n",
"[Epoch 147/200] [Batch 44/59] [D loss: 0.648941] [G loss: 0.591568]\n",
"[Epoch 147/200] [Batch 45/59] [D loss: 0.513403] [G loss: 1.102920]\n",
"[Epoch 147/200] [Batch 46/59] [D loss: 0.582888] [G loss: 1.067575]\n",
"[Epoch 147/200] [Batch 47/59] [D loss: 0.565187] [G loss: 0.595517]\n",
"[Epoch 147/200] [Batch 48/59] [D loss: 0.579144] [G loss: 0.911895]\n",
"[Epoch 147/200] [Batch 49/59] [D loss: 0.569641] [G loss: 1.243016]\n",
"[Epoch 147/200] [Batch 50/59] [D loss: 0.539986] [G loss: 0.903713]\n",
"[Epoch 147/200] [Batch 51/59] [D loss: 0.630567] [G loss: 0.630751]\n",
"[Epoch 147/200] [Batch 52/59] [D loss: 0.539650] [G loss: 1.276174]\n",
"[Epoch 147/200] [Batch 53/59] [D loss: 0.596316] [G loss: 1.024638]\n",
"[Epoch 147/200] [Batch 54/59] [D loss: 0.747755] [G loss: 0.441726]\n",
"[Epoch 147/200] [Batch 55/59] [D loss: 0.596174] [G loss: 1.729938]\n",
"[Epoch 147/200] [Batch 56/59] [D loss: 0.516585] [G loss: 1.031906]\n",
"[Epoch 147/200] [Batch 57/59] [D loss: 0.669230] [G loss: 0.551396]\n",
"[Epoch 147/200] [Batch 58/59] [D loss: 0.582219] [G loss: 1.695742]\n",
"[Epoch 148/200] [Batch 0/59] [D loss: 0.516482] [G loss: 0.972010]\n",
"[Epoch 148/200] [Batch 1/59] [D loss: 0.668872] [G loss: 0.451671]\n",
"[Epoch 148/200] [Batch 2/59] [D loss: 0.555050] [G loss: 1.731174]\n",
"[Epoch 148/200] [Batch 3/59] [D loss: 0.425544] [G loss: 1.146266]\n",
"[Epoch 148/200] [Batch 4/59] [D loss: 0.649100] [G loss: 0.607692]\n",
"[Epoch 148/200] [Batch 5/59] [D loss: 0.492246] [G loss: 1.036705]\n",
"[Epoch 148/200] [Batch 6/59] [D loss: 0.573219] [G loss: 1.240119]\n",
"[Epoch 148/200] [Batch 7/59] [D loss: 0.527279] [G loss: 0.840706]\n",
"[Epoch 148/200] [Batch 8/59] [D loss: 0.565182] [G loss: 0.874780]\n",
"[Epoch 148/200] [Batch 9/59] [D loss: 0.622580] [G loss: 1.128389]\n",
"[Epoch 148/200] [Batch 10/59] [D loss: 0.625201] [G loss: 0.924007]\n",
"[Epoch 148/200] [Batch 11/59] [D loss: 0.541317] [G loss: 1.088854]\n",
"[Epoch 148/200] [Batch 12/59] [D loss: 0.498276] [G loss: 1.159806]\n",
"[Epoch 148/200] [Batch 13/59] [D loss: 0.507236] [G loss: 0.976909]\n",
"[Epoch 148/200] [Batch 14/59] [D loss: 0.594081] [G loss: 0.934180]\n",
"[Epoch 148/200] [Batch 15/59] [D loss: 0.571836] [G loss: 0.881071]\n",
"[Epoch 148/200] [Batch 16/59] [D loss: 0.544557] [G loss: 0.937212]\n",
"[Epoch 148/200] [Batch 17/59] [D loss: 0.506475] [G loss: 1.039749]\n",
"[Epoch 148/200] [Batch 18/59] [D loss: 0.534127] [G loss: 1.022061]\n",
"[Epoch 148/200] [Batch 19/59] [D loss: 0.602896] [G loss: 0.838601]\n",
"[Epoch 148/200] [Batch 20/59] [D loss: 0.615794] [G loss: 0.714719]\n",
"[Epoch 148/200] [Batch 21/59] [D loss: 0.527824] [G loss: 1.156738]\n",
"[Epoch 148/200] [Batch 22/59] [D loss: 0.507353] [G loss: 1.167183]\n",
"[Epoch 148/200] [Batch 23/59] [D loss: 0.595679] [G loss: 0.696924]\n",
"[Epoch 148/200] [Batch 24/59] [D loss: 0.461278] [G loss: 1.063514]\n",
"[Epoch 148/200] [Batch 25/59] [D loss: 0.548721] [G loss: 0.975790]\n",
"[Epoch 148/200] [Batch 26/59] [D loss: 0.584701] [G loss: 0.923263]\n",
"[Epoch 148/200] [Batch 27/59] [D loss: 0.618406] [G loss: 0.750020]\n",
"[Epoch 148/200] [Batch 28/59] [D loss: 0.533868] [G loss: 0.848740]\n",
"[Epoch 148/200] [Batch 29/59] [D loss: 0.577710] [G loss: 1.068886]\n",
"[Epoch 148/200] [Batch 30/59] [D loss: 0.551373] [G loss: 0.615586]\n",
"[Epoch 148/200] [Batch 31/59] [D loss: 0.505314] [G loss: 0.914485]\n",
"[Epoch 148/200] [Batch 32/59] [D loss: 0.603654] [G loss: 1.162226]\n",
"[Epoch 148/200] [Batch 33/59] [D loss: 0.641063] [G loss: 0.660656]\n",
"[Epoch 148/200] [Batch 34/59] [D loss: 0.567167] [G loss: 1.097864]\n",
"[Epoch 148/200] [Batch 35/59] [D loss: 0.543785] [G loss: 1.476649]\n",
"[Epoch 148/200] [Batch 36/59] [D loss: 0.574436] [G loss: 0.782209]\n",
"[Epoch 148/200] [Batch 37/59] [D loss: 0.535359] [G loss: 0.717124]\n",
"[Epoch 148/200] [Batch 38/59] [D loss: 0.556674] [G loss: 1.368817]\n",
"[Epoch 148/200] [Batch 39/59] [D loss: 0.605247] [G loss: 0.925895]\n",
"[Epoch 148/200] [Batch 40/59] [D loss: 0.563340] [G loss: 0.739380]\n",
"[Epoch 148/200] [Batch 41/59] [D loss: 0.512470] [G loss: 0.994653]\n",
"[Epoch 148/200] [Batch 42/59] [D loss: 0.488831] [G loss: 0.995752]\n",
"[Epoch 148/200] [Batch 43/59] [D loss: 0.556731] [G loss: 0.909170]\n",
"[Epoch 148/200] [Batch 44/59] [D loss: 0.578236] [G loss: 1.071695]\n",
"[Epoch 148/200] [Batch 45/59] [D loss: 0.521988] [G loss: 0.849147]\n",
"[Epoch 148/200] [Batch 46/59] [D loss: 0.575678] [G loss: 0.783354]\n",
"[Epoch 148/200] [Batch 47/59] [D loss: 0.520642] [G loss: 1.337430]\n",
"[Epoch 148/200] [Batch 48/59] [D loss: 0.545964] [G loss: 1.098696]\n",
"[Epoch 148/200] [Batch 49/59] [D loss: 0.539080] [G loss: 0.911171]\n",
"[Epoch 148/200] [Batch 50/59] [D loss: 0.514986] [G loss: 1.037905]\n",
"[Epoch 148/200] [Batch 51/59] [D loss: 0.618566] [G loss: 0.823921]\n",
"[Epoch 148/200] [Batch 52/59] [D loss: 0.595140] [G loss: 1.177587]\n",
"[Epoch 148/200] [Batch 53/59] [D loss: 0.573632] [G loss: 0.788577]\n",
"[Epoch 148/200] [Batch 54/59] [D loss: 0.531346] [G loss: 0.909944]\n",
"[Epoch 148/200] [Batch 55/59] [D loss: 0.555424] [G loss: 0.924602]\n",
"[Epoch 148/200] [Batch 56/59] [D loss: 0.571307] [G loss: 1.160666]\n",
"[Epoch 148/200] [Batch 57/59] [D loss: 0.498146] [G loss: 0.917835]\n",
"[Epoch 148/200] [Batch 58/59] [D loss: 0.595997] [G loss: 1.046248]\n",
"[Epoch 149/200] [Batch 0/59] [D loss: 0.522777] [G loss: 0.862725]\n",
"[Epoch 149/200] [Batch 1/59] [D loss: 0.557133] [G loss: 0.754752]\n",
"[Epoch 149/200] [Batch 2/59] [D loss: 0.541396] [G loss: 0.938066]\n",
"[Epoch 149/200] [Batch 3/59] [D loss: 0.555185] [G loss: 1.261656]\n",
"[Epoch 149/200] [Batch 4/59] [D loss: 0.705712] [G loss: 0.640270]\n",
"[Epoch 149/200] [Batch 5/59] [D loss: 0.572896] [G loss: 1.220881]\n",
"[Epoch 149/200] [Batch 6/59] [D loss: 0.573962] [G loss: 0.980064]\n",
"[Epoch 149/200] [Batch 7/59] [D loss: 0.582208] [G loss: 1.119457]\n",
"[Epoch 149/200] [Batch 8/59] [D loss: 0.630906] [G loss: 0.964763]\n",
"[Epoch 149/200] [Batch 9/59] [D loss: 0.632746] [G loss: 0.829786]\n",
"[Epoch 149/200] [Batch 10/59] [D loss: 0.554735] [G loss: 1.223793]\n",
"[Epoch 149/200] [Batch 11/59] [D loss: 0.592122] [G loss: 0.970878]\n",
"[Epoch 149/200] [Batch 12/59] [D loss: 0.560610] [G loss: 0.764625]\n",
"[Epoch 149/200] [Batch 13/59] [D loss: 0.548157] [G loss: 1.290195]\n",
"[Epoch 149/200] [Batch 14/59] [D loss: 0.553634] [G loss: 0.962192]\n",
"[Epoch 149/200] [Batch 15/59] [D loss: 0.542319] [G loss: 0.879471]\n",
"[Epoch 149/200] [Batch 16/59] [D loss: 0.639950] [G loss: 1.067966]\n",
"[Epoch 149/200] [Batch 17/59] [D loss: 0.555045] [G loss: 0.900535]\n",
"[Epoch 149/200] [Batch 18/59] [D loss: 0.513715] [G loss: 0.843359]\n",
"[Epoch 149/200] [Batch 19/59] [D loss: 0.513156] [G loss: 0.958985]\n",
"[Epoch 149/200] [Batch 20/59] [D loss: 0.565924] [G loss: 0.880879]\n",
"[Epoch 149/200] [Batch 21/59] [D loss: 0.546284] [G loss: 0.956320]\n",
"[Epoch 149/200] [Batch 22/59] [D loss: 0.587679] [G loss: 0.914835]\n",
"[Epoch 149/200] [Batch 23/59] [D loss: 0.626886] [G loss: 0.715867]\n",
"[Epoch 149/200] [Batch 24/59] [D loss: 0.575942] [G loss: 0.842111]\n",
"[Epoch 149/200] [Batch 25/59] [D loss: 0.555121] [G loss: 0.976902]\n",
"[Epoch 149/200] [Batch 26/59] [D loss: 0.456993] [G loss: 1.030596]\n",
"[Epoch 149/200] [Batch 27/59] [D loss: 0.543591] [G loss: 0.674216]\n",
"[Epoch 149/200] [Batch 28/59] [D loss: 0.560712] [G loss: 0.818049]\n",
"[Epoch 149/200] [Batch 29/59] [D loss: 0.593790] [G loss: 1.194013]\n",
"[Epoch 149/200] [Batch 30/59] [D loss: 0.494773] [G loss: 0.999699]\n",
"[Epoch 149/200] [Batch 31/59] [D loss: 0.538186] [G loss: 0.775477]\n",
"[Epoch 149/200] [Batch 32/59] [D loss: 0.557088] [G loss: 1.116911]\n",
"[Epoch 149/200] [Batch 33/59] [D loss: 0.558117] [G loss: 0.921176]\n",
"[Epoch 149/200] [Batch 34/59] [D loss: 0.565060] [G loss: 0.721404]\n",
"[Epoch 149/200] [Batch 35/59] [D loss: 0.595329] [G loss: 1.127583]\n",
"[Epoch 149/200] [Batch 36/59] [D loss: 0.550189] [G loss: 0.664774]\n",
"[Epoch 149/200] [Batch 37/59] [D loss: 0.569447] [G loss: 0.855832]\n",
"[Epoch 149/200] [Batch 38/59] [D loss: 0.650305] [G loss: 1.318590]\n",
"[Epoch 149/200] [Batch 39/59] [D loss: 0.628234] [G loss: 0.845597]\n",
"[Epoch 149/200] [Batch 40/59] [D loss: 0.605879] [G loss: 0.904684]\n",
"[Epoch 149/200] [Batch 41/59] [D loss: 0.583328] [G loss: 1.272255]\n",
"[Epoch 149/200] [Batch 42/59] [D loss: 0.565398] [G loss: 0.845151]\n",
"[Epoch 149/200] [Batch 43/59] [D loss: 0.521256] [G loss: 0.857839]\n",
"[Epoch 149/200] [Batch 44/59] [D loss: 0.523023] [G loss: 1.152958]\n",
"[Epoch 149/200] [Batch 45/59] [D loss: 0.634025] [G loss: 0.821131]\n",
"[Epoch 149/200] [Batch 46/59] [D loss: 0.496853] [G loss: 1.022041]\n",
"[Epoch 149/200] [Batch 47/59] [D loss: 0.598138] [G loss: 1.186164]\n",
"[Epoch 149/200] [Batch 48/59] [D loss: 0.501889] [G loss: 0.763352]\n",
"[Epoch 149/200] [Batch 49/59] [D loss: 0.574704] [G loss: 0.738061]\n",
"[Epoch 149/200] [Batch 50/59] [D loss: 0.530086] [G loss: 1.164629]\n",
"[Epoch 149/200] [Batch 51/59] [D loss: 0.502696] [G loss: 0.972575]\n",
"[Epoch 149/200] [Batch 52/59] [D loss: 0.608745] [G loss: 0.635288]\n",
"[Epoch 149/200] [Batch 53/59] [D loss: 0.613210] [G loss: 0.892064]\n",
"[Epoch 149/200] [Batch 54/59] [D loss: 0.552760] [G loss: 1.012616]\n",
"[Epoch 149/200] [Batch 55/59] [D loss: 0.548217] [G loss: 0.996808]\n",
"[Epoch 149/200] [Batch 56/59] [D loss: 0.559901] [G loss: 0.860891]\n",
"[Epoch 149/200] [Batch 57/59] [D loss: 0.547046] [G loss: 0.626662]\n",
"[Epoch 149/200] [Batch 58/59] [D loss: 0.620161] [G loss: 1.244476]\n",
"[Epoch 150/200] [Batch 0/59] [D loss: 0.581508] [G loss: 0.668996]\n",
"[Epoch 150/200] [Batch 1/59] [D loss: 0.568878] [G loss: 1.204558]\n",
"[Epoch 150/200] [Batch 2/59] [D loss: 0.611819] [G loss: 0.935263]\n",
"[Epoch 150/200] [Batch 3/59] [D loss: 0.637017] [G loss: 0.552547]\n",
"[Epoch 150/200] [Batch 4/59] [D loss: 0.635095] [G loss: 1.347187]\n",
"[Epoch 150/200] [Batch 5/59] [D loss: 0.511421] [G loss: 0.921746]\n",
"[Epoch 150/200] [Batch 6/59] [D loss: 0.641028] [G loss: 0.909589]\n",
"[Epoch 150/200] [Batch 7/59] [D loss: 0.553639] [G loss: 1.214404]\n",
"[Epoch 150/200] [Batch 8/59] [D loss: 0.510507] [G loss: 0.935337]\n",
"[Epoch 150/200] [Batch 9/59] [D loss: 0.608857] [G loss: 0.842023]\n",
"[Epoch 150/200] [Batch 10/59] [D loss: 0.573385] [G loss: 1.450334]\n",
"[Epoch 150/200] [Batch 11/59] [D loss: 0.611230] [G loss: 0.722002]\n",
"[Epoch 150/200] [Batch 12/59] [D loss: 0.583715] [G loss: 0.675744]\n",
"[Epoch 150/200] [Batch 13/59] [D loss: 0.636414] [G loss: 1.069220]\n",
"[Epoch 150/200] [Batch 14/59] [D loss: 0.516267] [G loss: 1.143438]\n",
"[Epoch 150/200] [Batch 15/59] [D loss: 0.560454] [G loss: 0.837232]\n",
"[Epoch 150/200] [Batch 16/59] [D loss: 0.550354] [G loss: 0.902228]\n",
"[Epoch 150/200] [Batch 17/59] [D loss: 0.605472] [G loss: 1.194284]\n",
"[Epoch 150/200] [Batch 18/59] [D loss: 0.558750] [G loss: 0.815781]\n",
"[Epoch 150/200] [Batch 19/59] [D loss: 0.579766] [G loss: 0.791235]\n",
"[Epoch 150/200] [Batch 20/59] [D loss: 0.552345] [G loss: 1.012676]\n",
"[Epoch 150/200] [Batch 21/59] [D loss: 0.545204] [G loss: 0.929466]\n",
"[Epoch 150/200] [Batch 22/59] [D loss: 0.612088] [G loss: 1.248263]\n",
"[Epoch 150/200] [Batch 23/59] [D loss: 0.640967] [G loss: 0.712982]\n",
"[Epoch 150/200] [Batch 24/59] [D loss: 0.546943] [G loss: 0.961235]\n",
"[Epoch 150/200] [Batch 25/59] [D loss: 0.556409] [G loss: 1.033144]\n",
"[Epoch 150/200] [Batch 26/59] [D loss: 0.566647] [G loss: 0.845306]\n",
"[Epoch 150/200] [Batch 27/59] [D loss: 0.599249] [G loss: 0.997206]\n",
"[Epoch 150/200] [Batch 28/59] [D loss: 0.525720] [G loss: 0.993934]\n",
"[Epoch 150/200] [Batch 29/59] [D loss: 0.612887] [G loss: 0.976164]\n",
"[Epoch 150/200] [Batch 30/59] [D loss: 0.516757] [G loss: 0.883158]\n",
"[Epoch 150/200] [Batch 31/59] [D loss: 0.601758] [G loss: 0.879652]\n",
"[Epoch 150/200] [Batch 32/59] [D loss: 0.533877] [G loss: 1.100940]\n",
"[Epoch 150/200] [Batch 33/59] [D loss: 0.526874] [G loss: 0.913648]\n",
"[Epoch 150/200] [Batch 34/59] [D loss: 0.610133] [G loss: 1.274610]\n",
"[Epoch 150/200] [Batch 35/59] [D loss: 0.603148] [G loss: 0.955084]\n",
"[Epoch 150/200] [Batch 36/59] [D loss: 0.539133] [G loss: 0.767094]\n",
"[Epoch 150/200] [Batch 37/59] [D loss: 0.561273] [G loss: 1.124320]\n",
"[Epoch 150/200] [Batch 38/59] [D loss: 0.570875] [G loss: 0.924416]\n",
"[Epoch 150/200] [Batch 39/59] [D loss: 0.507196] [G loss: 0.865848]\n",
"[Epoch 150/200] [Batch 40/59] [D loss: 0.527485] [G loss: 0.860179]\n",
"[Epoch 150/200] [Batch 41/59] [D loss: 0.638943] [G loss: 1.049065]\n",
"[Epoch 150/200] [Batch 42/59] [D loss: 0.584077] [G loss: 0.794592]\n",
"[Epoch 150/200] [Batch 43/59] [D loss: 0.584016] [G loss: 0.938657]\n",
"[Epoch 150/200] [Batch 44/59] [D loss: 0.519885] [G loss: 0.988687]\n",
"[Epoch 150/200] [Batch 45/59] [D loss: 0.558081] [G loss: 1.234373]\n",
"[Epoch 150/200] [Batch 46/59] [D loss: 0.617452] [G loss: 0.600811]\n",
"[Epoch 150/200] [Batch 47/59] [D loss: 0.517377] [G loss: 1.040941]\n",
"[Epoch 150/200] [Batch 48/59] [D loss: 0.595386] [G loss: 1.223665]\n",
"[Epoch 150/200] [Batch 49/59] [D loss: 0.588375] [G loss: 0.624708]\n",
"[Epoch 150/200] [Batch 50/59] [D loss: 0.562970] [G loss: 0.712063]\n",
"[Epoch 150/200] [Batch 51/59] [D loss: 0.586950] [G loss: 1.263755]\n",
"[Epoch 150/200] [Batch 52/59] [D loss: 0.659271] [G loss: 0.635205]\n",
"[Epoch 150/200] [Batch 53/59] [D loss: 0.594053] [G loss: 1.093363]\n",
"[Epoch 150/200] [Batch 54/59] [D loss: 0.522802] [G loss: 1.006869]\n",
"[Epoch 150/200] [Batch 55/59] [D loss: 0.542639] [G loss: 0.849029]\n",
"[Epoch 150/200] [Batch 56/59] [D loss: 0.547760] [G loss: 1.035014]\n",
"[Epoch 150/200] [Batch 57/59] [D loss: 0.480706] [G loss: 0.919493]\n",
"[Epoch 150/200] [Batch 58/59] [D loss: 0.644628] [G loss: 0.770115]\n",
"[Epoch 151/200] [Batch 0/59] [D loss: 0.567588] [G loss: 1.000637]\n",
"[Epoch 151/200] [Batch 1/59] [D loss: 0.594202] [G loss: 0.807686]\n",
"[Epoch 151/200] [Batch 2/59] [D loss: 0.574924] [G loss: 1.193828]\n",
"[Epoch 151/200] [Batch 3/59] [D loss: 0.566536] [G loss: 0.926653]\n",
"[Epoch 151/200] [Batch 4/59] [D loss: 0.550824] [G loss: 0.920907]\n",
"[Epoch 151/200] [Batch 5/59] [D loss: 0.549570] [G loss: 1.114713]\n",
"[Epoch 151/200] [Batch 6/59] [D loss: 0.500257] [G loss: 1.070670]\n",
"[Epoch 151/200] [Batch 7/59] [D loss: 0.536038] [G loss: 0.818467]\n",
"[Epoch 151/200] [Batch 8/59] [D loss: 0.603748] [G loss: 0.866634]\n",
"[Epoch 151/200] [Batch 9/59] [D loss: 0.567831] [G loss: 1.020056]\n",
"[Epoch 151/200] [Batch 10/59] [D loss: 0.541402] [G loss: 0.706520]\n",
"[Epoch 151/200] [Batch 11/59] [D loss: 0.550820] [G loss: 0.906065]\n",
"[Epoch 151/200] [Batch 12/59] [D loss: 0.638017] [G loss: 1.277798]\n",
"[Epoch 151/200] [Batch 13/59] [D loss: 0.514154] [G loss: 0.712996]\n",
"[Epoch 151/200] [Batch 14/59] [D loss: 0.569593] [G loss: 0.655292]\n",
"[Epoch 151/200] [Batch 15/59] [D loss: 0.489337] [G loss: 1.030198]\n",
"[Epoch 151/200] [Batch 16/59] [D loss: 0.580430] [G loss: 0.980351]\n",
"[Epoch 151/200] [Batch 17/59] [D loss: 0.519383] [G loss: 1.090002]\n",
"[Epoch 151/200] [Batch 18/59] [D loss: 0.541187] [G loss: 1.164592]\n",
"[Epoch 151/200] [Batch 19/59] [D loss: 0.584539] [G loss: 0.740487]\n",
"[Epoch 151/200] [Batch 20/59] [D loss: 0.520542] [G loss: 0.988235]\n",
"[Epoch 151/200] [Batch 21/59] [D loss: 0.520299] [G loss: 1.232795]\n",
"[Epoch 151/200] [Batch 22/59] [D loss: 0.591379] [G loss: 0.652391]\n",
"[Epoch 151/200] [Batch 23/59] [D loss: 0.501031] [G loss: 0.979949]\n",
"[Epoch 151/200] [Batch 24/59] [D loss: 0.596753] [G loss: 1.115725]\n",
"[Epoch 151/200] [Batch 25/59] [D loss: 0.519499] [G loss: 0.669708]\n",
"[Epoch 151/200] [Batch 26/59] [D loss: 0.528028] [G loss: 0.842681]\n",
"[Epoch 151/200] [Batch 27/59] [D loss: 0.504422] [G loss: 1.104345]\n",
"[Epoch 151/200] [Batch 28/59] [D loss: 0.569920] [G loss: 0.949442]\n",
"[Epoch 151/200] [Batch 29/59] [D loss: 0.574888] [G loss: 0.966079]\n",
"[Epoch 151/200] [Batch 30/59] [D loss: 0.576603] [G loss: 0.872457]\n",
"[Epoch 151/200] [Batch 31/59] [D loss: 0.570493] [G loss: 0.923238]\n",
"[Epoch 151/200] [Batch 32/59] [D loss: 0.513670] [G loss: 0.944716]\n",
"[Epoch 151/200] [Batch 33/59] [D loss: 0.525928] [G loss: 1.155927]\n",
"[Epoch 151/200] [Batch 34/59] [D loss: 0.515199] [G loss: 0.718778]\n",
"[Epoch 151/200] [Batch 35/59] [D loss: 0.615305] [G loss: 0.968435]\n",
"[Epoch 151/200] [Batch 36/59] [D loss: 0.520833] [G loss: 0.862969]\n",
"[Epoch 151/200] [Batch 37/59] [D loss: 0.529303] [G loss: 0.869557]\n",
"[Epoch 151/200] [Batch 38/59] [D loss: 0.472598] [G loss: 1.165798]\n",
"[Epoch 151/200] [Batch 39/59] [D loss: 0.526658] [G loss: 1.129194]\n",
"[Epoch 151/200] [Batch 40/59] [D loss: 0.635815] [G loss: 1.104679]\n",
"[Epoch 151/200] [Batch 41/59] [D loss: 0.590506] [G loss: 0.803780]\n",
"[Epoch 151/200] [Batch 42/59] [D loss: 0.676222] [G loss: 1.352279]\n",
"[Epoch 151/200] [Batch 43/59] [D loss: 0.541187] [G loss: 0.928380]\n",
"[Epoch 151/200] [Batch 44/59] [D loss: 0.514827] [G loss: 0.780301]\n",
"[Epoch 151/200] [Batch 45/59] [D loss: 0.490236] [G loss: 1.106081]\n",
"[Epoch 151/200] [Batch 46/59] [D loss: 0.526219] [G loss: 0.952311]\n",
"[Epoch 151/200] [Batch 47/59] [D loss: 0.590936] [G loss: 0.885212]\n",
"[Epoch 151/200] [Batch 48/59] [D loss: 0.585103] [G loss: 0.704771]\n",
"[Epoch 151/200] [Batch 49/59] [D loss: 0.513876] [G loss: 0.903473]\n",
"[Epoch 151/200] [Batch 50/59] [D loss: 0.575367] [G loss: 1.298283]\n",
"[Epoch 151/200] [Batch 51/59] [D loss: 0.556967] [G loss: 0.779069]\n",
"[Epoch 151/200] [Batch 52/59] [D loss: 0.512128] [G loss: 1.146428]\n",
"[Epoch 151/200] [Batch 53/59] [D loss: 0.520224] [G loss: 1.199892]\n",
"[Epoch 151/200] [Batch 54/59] [D loss: 0.563177] [G loss: 0.755343]\n",
"[Epoch 151/200] [Batch 55/59] [D loss: 0.501443] [G loss: 0.805276]\n",
"[Epoch 151/200] [Batch 56/59] [D loss: 0.600710] [G loss: 0.903342]\n",
"[Epoch 151/200] [Batch 57/59] [D loss: 0.540725] [G loss: 1.268803]\n",
"[Epoch 151/200] [Batch 58/59] [D loss: 0.529824] [G loss: 0.956582]\n",
"[Epoch 152/200] [Batch 0/59] [D loss: 0.655251] [G loss: 0.948339]\n",
"[Epoch 152/200] [Batch 1/59] [D loss: 0.553549] [G loss: 1.003410]\n",
"[Epoch 152/200] [Batch 2/59] [D loss: 0.536432] [G loss: 1.143104]\n",
"[Epoch 152/200] [Batch 3/59] [D loss: 0.567429] [G loss: 0.516422]\n",
"[Epoch 152/200] [Batch 4/59] [D loss: 0.564970] [G loss: 1.088527]\n",
"[Epoch 152/200] [Batch 5/59] [D loss: 0.508228] [G loss: 1.309690]\n",
"[Epoch 152/200] [Batch 6/59] [D loss: 0.547877] [G loss: 0.708099]\n",
"[Epoch 152/200] [Batch 7/59] [D loss: 0.585088] [G loss: 0.654333]\n",
"[Epoch 152/200] [Batch 8/59] [D loss: 0.548499] [G loss: 1.388532]\n",
"[Epoch 152/200] [Batch 9/59] [D loss: 0.499140] [G loss: 1.141929]\n",
"[Epoch 152/200] [Batch 10/59] [D loss: 0.564381] [G loss: 0.752141]\n",
"[Epoch 152/200] [Batch 11/59] [D loss: 0.620360] [G loss: 0.799676]\n",
"[Epoch 152/200] [Batch 12/59] [D loss: 0.630813] [G loss: 1.071101]\n",
"[Epoch 152/200] [Batch 13/59] [D loss: 0.534821] [G loss: 1.095196]\n",
"[Epoch 152/200] [Batch 14/59] [D loss: 0.574586] [G loss: 0.712321]\n",
"[Epoch 152/200] [Batch 15/59] [D loss: 0.660693] [G loss: 1.191760]\n",
"[Epoch 152/200] [Batch 16/59] [D loss: 0.589598] [G loss: 1.153472]\n",
"[Epoch 152/200] [Batch 17/59] [D loss: 0.497089] [G loss: 0.707962]\n",
"[Epoch 152/200] [Batch 18/59] [D loss: 0.552049] [G loss: 1.063263]\n",
"[Epoch 152/200] [Batch 19/59] [D loss: 0.574027] [G loss: 0.945591]\n",
"[Epoch 152/200] [Batch 20/59] [D loss: 0.627237] [G loss: 0.776275]\n",
"[Epoch 152/200] [Batch 21/59] [D loss: 0.549902] [G loss: 0.959133]\n",
"[Epoch 152/200] [Batch 22/59] [D loss: 0.518790] [G loss: 0.757141]\n",
"[Epoch 152/200] [Batch 23/59] [D loss: 0.564194] [G loss: 0.761482]\n",
"[Epoch 152/200] [Batch 24/59] [D loss: 0.637079] [G loss: 1.492045]\n",
"[Epoch 152/200] [Batch 25/59] [D loss: 0.552051] [G loss: 0.885702]\n",
"[Epoch 152/200] [Batch 26/59] [D loss: 0.657135] [G loss: 0.730152]\n",
"[Epoch 152/200] [Batch 27/59] [D loss: 0.568629] [G loss: 1.403073]\n",
"[Epoch 152/200] [Batch 28/59] [D loss: 0.506078] [G loss: 0.853678]\n",
"[Epoch 152/200] [Batch 29/59] [D loss: 0.574294] [G loss: 1.012106]\n",
"[Epoch 152/200] [Batch 30/59] [D loss: 0.573371] [G loss: 1.022061]\n",
"[Epoch 152/200] [Batch 31/59] [D loss: 0.561098] [G loss: 1.095036]\n",
"[Epoch 152/200] [Batch 32/59] [D loss: 0.565571] [G loss: 0.889264]\n",
"[Epoch 152/200] [Batch 33/59] [D loss: 0.569288] [G loss: 0.902945]\n",
"[Epoch 152/200] [Batch 34/59] [D loss: 0.611353] [G loss: 1.152238]\n",
"[Epoch 152/200] [Batch 35/59] [D loss: 0.578028] [G loss: 1.137900]\n",
"[Epoch 152/200] [Batch 36/59] [D loss: 0.537132] [G loss: 0.687280]\n",
"[Epoch 152/200] [Batch 37/59] [D loss: 0.488663] [G loss: 0.951213]\n",
"[Epoch 152/200] [Batch 38/59] [D loss: 0.576300] [G loss: 1.099008]\n",
"[Epoch 152/200] [Batch 39/59] [D loss: 0.487183] [G loss: 0.960468]\n",
"[Epoch 152/200] [Batch 40/59] [D loss: 0.565390] [G loss: 0.889163]\n",
"[Epoch 152/200] [Batch 41/59] [D loss: 0.565553] [G loss: 1.309773]\n",
"[Epoch 152/200] [Batch 42/59] [D loss: 0.537741] [G loss: 0.885728]\n",
"[Epoch 152/200] [Batch 43/59] [D loss: 0.505841] [G loss: 0.693074]\n",
"[Epoch 152/200] [Batch 44/59] [D loss: 0.570240] [G loss: 0.949403]\n",
"[Epoch 152/200] [Batch 45/59] [D loss: 0.574517] [G loss: 0.991184]\n",
"[Epoch 152/200] [Batch 46/59] [D loss: 0.532201] [G loss: 0.955719]\n",
"[Epoch 152/200] [Batch 47/59] [D loss: 0.596592] [G loss: 0.652388]\n",
"[Epoch 152/200] [Batch 48/59] [D loss: 0.518205] [G loss: 1.506279]\n",
"[Epoch 152/200] [Batch 49/59] [D loss: 0.512569] [G loss: 0.844760]\n",
"[Epoch 152/200] [Batch 50/59] [D loss: 0.729143] [G loss: 0.391118]\n",
"[Epoch 152/200] [Batch 51/59] [D loss: 0.473239] [G loss: 1.425100]\n",
"[Epoch 152/200] [Batch 52/59] [D loss: 0.577883] [G loss: 1.200298]\n",
"[Epoch 152/200] [Batch 53/59] [D loss: 0.641194] [G loss: 0.668704]\n",
"[Epoch 152/200] [Batch 54/59] [D loss: 0.513963] [G loss: 1.241194]\n",
"[Epoch 152/200] [Batch 55/59] [D loss: 0.634965] [G loss: 1.230110]\n",
"[Epoch 152/200] [Batch 56/59] [D loss: 0.700688] [G loss: 0.483424]\n",
"[Epoch 152/200] [Batch 57/59] [D loss: 0.615560] [G loss: 1.237282]\n",
"[Epoch 152/200] [Batch 58/59] [D loss: 0.641828] [G loss: 1.018895]\n",
"[Epoch 153/200] [Batch 0/59] [D loss: 0.632726] [G loss: 0.866212]\n",
"[Epoch 153/200] [Batch 1/59] [D loss: 0.501991] [G loss: 0.782554]\n",
"[Epoch 153/200] [Batch 2/59] [D loss: 0.551318] [G loss: 1.121189]\n",
"[Epoch 153/200] [Batch 3/59] [D loss: 0.633461] [G loss: 0.797657]\n",
"[Epoch 153/200] [Batch 4/59] [D loss: 0.691978] [G loss: 0.833332]\n",
"[Epoch 153/200] [Batch 5/59] [D loss: 0.524371] [G loss: 0.855310]\n",
"[Epoch 153/200] [Batch 6/59] [D loss: 0.625711] [G loss: 1.038926]\n",
"[Epoch 153/200] [Batch 7/59] [D loss: 0.564805] [G loss: 1.146410]\n",
"[Epoch 153/200] [Batch 8/59] [D loss: 0.658267] [G loss: 0.631329]\n",
"[Epoch 153/200] [Batch 9/59] [D loss: 0.657955] [G loss: 1.436275]\n",
"[Epoch 153/200] [Batch 10/59] [D loss: 0.587294] [G loss: 0.797226]\n",
"[Epoch 153/200] [Batch 11/59] [D loss: 0.592812] [G loss: 0.807403]\n",
"[Epoch 153/200] [Batch 12/59] [D loss: 0.568216] [G loss: 1.195284]\n",
"[Epoch 153/200] [Batch 13/59] [D loss: 0.645941] [G loss: 0.793027]\n",
"[Epoch 153/200] [Batch 14/59] [D loss: 0.464200] [G loss: 0.784701]\n",
"[Epoch 153/200] [Batch 15/59] [D loss: 0.583634] [G loss: 0.781781]\n",
"[Epoch 153/200] [Batch 16/59] [D loss: 0.613899] [G loss: 1.102274]\n",
"[Epoch 153/200] [Batch 17/59] [D loss: 0.486916] [G loss: 1.264001]\n",
"[Epoch 153/200] [Batch 18/59] [D loss: 0.557352] [G loss: 0.753493]\n",
"[Epoch 153/200] [Batch 19/59] [D loss: 0.583334] [G loss: 0.820282]\n",
"[Epoch 153/200] [Batch 20/59] [D loss: 0.550979] [G loss: 0.669795]\n",
"[Epoch 153/200] [Batch 21/59] [D loss: 0.632635] [G loss: 0.986175]\n",
"[Epoch 153/200] [Batch 22/59] [D loss: 0.564576] [G loss: 0.979402]\n",
"[Epoch 153/200] [Batch 23/59] [D loss: 0.518548] [G loss: 0.853131]\n",
"[Epoch 153/200] [Batch 24/59] [D loss: 0.528601] [G loss: 0.959187]\n",
"[Epoch 153/200] [Batch 25/59] [D loss: 0.617548] [G loss: 0.752915]\n",
"[Epoch 153/200] [Batch 26/59] [D loss: 0.572597] [G loss: 1.057917]\n",
"[Epoch 153/200] [Batch 27/59] [D loss: 0.582612] [G loss: 0.760771]\n",
"[Epoch 153/200] [Batch 28/59] [D loss: 0.602605] [G loss: 0.780089]\n",
"[Epoch 153/200] [Batch 29/59] [D loss: 0.646121] [G loss: 1.077877]\n",
"[Epoch 153/200] [Batch 30/59] [D loss: 0.567658] [G loss: 0.985527]\n",
"[Epoch 153/200] [Batch 31/59] [D loss: 0.650056] [G loss: 0.607091]\n",
"[Epoch 153/200] [Batch 32/59] [D loss: 0.558583] [G loss: 1.122871]\n",
"[Epoch 153/200] [Batch 33/59] [D loss: 0.544335] [G loss: 0.892107]\n",
"[Epoch 153/200] [Batch 34/59] [D loss: 0.527598] [G loss: 1.013461]\n",
"[Epoch 153/200] [Batch 35/59] [D loss: 0.617252] [G loss: 1.234060]\n",
"[Epoch 153/200] [Batch 36/59] [D loss: 0.571126] [G loss: 0.709022]\n",
"[Epoch 153/200] [Batch 37/59] [D loss: 0.495955] [G loss: 0.924751]\n",
"[Epoch 153/200] [Batch 38/59] [D loss: 0.507787] [G loss: 1.281083]\n",
"[Epoch 153/200] [Batch 39/59] [D loss: 0.479752] [G loss: 1.023630]\n",
"[Epoch 153/200] [Batch 40/59] [D loss: 0.638423] [G loss: 0.753263]\n",
"[Epoch 153/200] [Batch 41/59] [D loss: 0.484824] [G loss: 1.294524]\n",
"[Epoch 153/200] [Batch 42/59] [D loss: 0.586735] [G loss: 1.016894]\n",
"[Epoch 153/200] [Batch 43/59] [D loss: 0.547305] [G loss: 0.813709]\n",
"[Epoch 153/200] [Batch 44/59] [D loss: 0.503809] [G loss: 1.005604]\n",
"[Epoch 153/200] [Batch 45/59] [D loss: 0.546343] [G loss: 1.057822]\n",
"[Epoch 153/200] [Batch 46/59] [D loss: 0.538962] [G loss: 0.853273]\n",
"[Epoch 153/200] [Batch 47/59] [D loss: 0.549361] [G loss: 1.003969]\n",
"[Epoch 153/200] [Batch 48/59] [D loss: 0.529091] [G loss: 0.942360]\n",
"[Epoch 153/200] [Batch 49/59] [D loss: 0.562967] [G loss: 1.016122]\n",
"[Epoch 153/200] [Batch 50/59] [D loss: 0.587384] [G loss: 1.011102]\n",
"[Epoch 153/200] [Batch 51/59] [D loss: 0.492614] [G loss: 0.854656]\n",
"[Epoch 153/200] [Batch 52/59] [D loss: 0.578779] [G loss: 0.945577]\n",
"[Epoch 153/200] [Batch 53/59] [D loss: 0.581664] [G loss: 1.192308]\n",
"[Epoch 153/200] [Batch 54/59] [D loss: 0.602647] [G loss: 0.992583]\n",
"[Epoch 153/200] [Batch 55/59] [D loss: 0.572028] [G loss: 1.074176]\n",
"[Epoch 153/200] [Batch 56/59] [D loss: 0.498445] [G loss: 0.804279]\n",
"[Epoch 153/200] [Batch 57/59] [D loss: 0.530138] [G loss: 0.797737]\n",
"[Epoch 153/200] [Batch 58/59] [D loss: 0.468190] [G loss: 0.925047]\n",
"[Epoch 154/200] [Batch 0/59] [D loss: 0.617133] [G loss: 0.974232]\n",
"[Epoch 154/200] [Batch 1/59] [D loss: 0.591087] [G loss: 0.584501]\n",
"[Epoch 154/200] [Batch 2/59] [D loss: 0.584580] [G loss: 1.250356]\n",
"[Epoch 154/200] [Batch 3/59] [D loss: 0.505612] [G loss: 0.937397]\n",
"[Epoch 154/200] [Batch 4/59] [D loss: 0.565042] [G loss: 0.684089]\n",
"[Epoch 154/200] [Batch 5/59] [D loss: 0.543592] [G loss: 1.123522]\n",
"[Epoch 154/200] [Batch 6/59] [D loss: 0.573241] [G loss: 1.072086]\n",
"[Epoch 154/200] [Batch 7/59] [D loss: 0.615973] [G loss: 1.161414]\n",
"[Epoch 154/200] [Batch 8/59] [D loss: 0.608257] [G loss: 0.540379]\n",
"[Epoch 154/200] [Batch 9/59] [D loss: 0.561385] [G loss: 1.394013]\n",
"[Epoch 154/200] [Batch 10/59] [D loss: 0.520291] [G loss: 0.928513]\n",
"[Epoch 154/200] [Batch 11/59] [D loss: 0.587282] [G loss: 0.561346]\n",
"[Epoch 154/200] [Batch 12/59] [D loss: 0.610129] [G loss: 1.322077]\n",
"[Epoch 154/200] [Batch 13/59] [D loss: 0.641880] [G loss: 1.478397]\n",
"[Epoch 154/200] [Batch 14/59] [D loss: 0.608486] [G loss: 0.577043]\n",
"[Epoch 154/200] [Batch 15/59] [D loss: 0.611237] [G loss: 1.090131]\n",
"[Epoch 154/200] [Batch 16/59] [D loss: 0.551972] [G loss: 1.081890]\n",
"[Epoch 154/200] [Batch 17/59] [D loss: 0.584636] [G loss: 0.761201]\n",
"[Epoch 154/200] [Batch 18/59] [D loss: 0.573761] [G loss: 0.862798]\n",
"[Epoch 154/200] [Batch 19/59] [D loss: 0.563862] [G loss: 0.942072]\n",
"[Epoch 154/200] [Batch 20/59] [D loss: 0.542647] [G loss: 1.031095]\n",
"[Epoch 154/200] [Batch 21/59] [D loss: 0.572580] [G loss: 0.703997]\n",
"[Epoch 154/200] [Batch 22/59] [D loss: 0.611417] [G loss: 1.411823]\n",
"[Epoch 154/200] [Batch 23/59] [D loss: 0.614662] [G loss: 0.790392]\n",
"[Epoch 154/200] [Batch 24/59] [D loss: 0.539345] [G loss: 0.761957]\n",
"[Epoch 154/200] [Batch 25/59] [D loss: 0.550420] [G loss: 1.231612]\n",
"[Epoch 154/200] [Batch 26/59] [D loss: 0.585106] [G loss: 1.135588]\n",
"[Epoch 154/200] [Batch 27/59] [D loss: 0.514368] [G loss: 0.690701]\n",
"[Epoch 154/200] [Batch 28/59] [D loss: 0.597619] [G loss: 0.998334]\n",
"[Epoch 154/200] [Batch 29/59] [D loss: 0.555862] [G loss: 0.761740]\n",
"[Epoch 154/200] [Batch 30/59] [D loss: 0.540073] [G loss: 1.300323]\n",
"[Epoch 154/200] [Batch 31/59] [D loss: 0.552409] [G loss: 0.941079]\n",
"[Epoch 154/200] [Batch 32/59] [D loss: 0.548386] [G loss: 0.902851]\n",
"[Epoch 154/200] [Batch 33/59] [D loss: 0.579589] [G loss: 1.199211]\n",
"[Epoch 154/200] [Batch 34/59] [D loss: 0.623240] [G loss: 0.763283]\n",
"[Epoch 154/200] [Batch 35/59] [D loss: 0.603551] [G loss: 1.072787]\n",
"[Epoch 154/200] [Batch 36/59] [D loss: 0.577492] [G loss: 0.723323]\n",
"[Epoch 154/200] [Batch 37/59] [D loss: 0.566020] [G loss: 0.772214]\n",
"[Epoch 154/200] [Batch 38/59] [D loss: 0.752163] [G loss: 1.284763]\n",
"[Epoch 154/200] [Batch 39/59] [D loss: 0.583976] [G loss: 0.805798]\n",
"[Epoch 154/200] [Batch 40/59] [D loss: 0.554444] [G loss: 0.817871]\n",
"[Epoch 154/200] [Batch 41/59] [D loss: 0.591695] [G loss: 1.215712]\n",
"[Epoch 154/200] [Batch 42/59] [D loss: 0.494427] [G loss: 0.808434]\n",
"[Epoch 154/200] [Batch 43/59] [D loss: 0.597879] [G loss: 0.918195]\n",
"[Epoch 154/200] [Batch 44/59] [D loss: 0.550286] [G loss: 1.407597]\n",
"[Epoch 154/200] [Batch 45/59] [D loss: 0.421858] [G loss: 1.051352]\n",
"[Epoch 154/200] [Batch 46/59] [D loss: 0.555767] [G loss: 0.602987]\n",
"[Epoch 154/200] [Batch 47/59] [D loss: 0.546848] [G loss: 1.315428]\n",
"[Epoch 154/200] [Batch 48/59] [D loss: 0.581315] [G loss: 1.209584]\n",
"[Epoch 154/200] [Batch 49/59] [D loss: 0.564262] [G loss: 0.801669]\n",
"[Epoch 154/200] [Batch 50/59] [D loss: 0.527474] [G loss: 1.338950]\n",
"[Epoch 154/200] [Batch 51/59] [D loss: 0.533396] [G loss: 0.706015]\n",
"[Epoch 154/200] [Batch 52/59] [D loss: 0.594415] [G loss: 1.073516]\n",
"[Epoch 154/200] [Batch 53/59] [D loss: 0.699681] [G loss: 1.192432]\n",
"[Epoch 154/200] [Batch 54/59] [D loss: 0.637644] [G loss: 0.650266]\n",
"[Epoch 154/200] [Batch 55/59] [D loss: 0.557678] [G loss: 0.980855]\n",
"[Epoch 154/200] [Batch 56/59] [D loss: 0.683980] [G loss: 1.519223]\n",
"[Epoch 154/200] [Batch 57/59] [D loss: 0.680730] [G loss: 0.616986]\n",
"[Epoch 154/200] [Batch 58/59] [D loss: 0.635881] [G loss: 0.595651]\n",
"[Epoch 155/200] [Batch 0/59] [D loss: 0.591619] [G loss: 1.359363]\n",
"[Epoch 155/200] [Batch 1/59] [D loss: 0.577797] [G loss: 0.961424]\n",
"[Epoch 155/200] [Batch 2/59] [D loss: 0.517800] [G loss: 0.788740]\n",
"[Epoch 155/200] [Batch 3/59] [D loss: 0.505160] [G loss: 1.012863]\n",
"[Epoch 155/200] [Batch 4/59] [D loss: 0.556746] [G loss: 1.151191]\n",
"[Epoch 155/200] [Batch 5/59] [D loss: 0.547973] [G loss: 0.831134]\n",
"[Epoch 155/200] [Batch 6/59] [D loss: 0.538853] [G loss: 1.001509]\n",
"[Epoch 155/200] [Batch 7/59] [D loss: 0.581882] [G loss: 0.944241]\n",
"[Epoch 155/200] [Batch 8/59] [D loss: 0.559624] [G loss: 0.920833]\n",
"[Epoch 155/200] [Batch 9/59] [D loss: 0.507833] [G loss: 0.957808]\n",
"[Epoch 155/200] [Batch 10/59] [D loss: 0.509403] [G loss: 1.022302]\n",
"[Epoch 155/200] [Batch 11/59] [D loss: 0.524998] [G loss: 0.987490]\n",
"[Epoch 155/200] [Batch 12/59] [D loss: 0.517861] [G loss: 0.715872]\n",
"[Epoch 155/200] [Batch 13/59] [D loss: 0.518561] [G loss: 0.748179]\n",
"[Epoch 155/200] [Batch 14/59] [D loss: 0.632352] [G loss: 1.142230]\n",
"[Epoch 155/200] [Batch 15/59] [D loss: 0.485548] [G loss: 0.823748]\n",
"[Epoch 155/200] [Batch 16/59] [D loss: 0.505940] [G loss: 0.696135]\n",
"[Epoch 155/200] [Batch 17/59] [D loss: 0.499142] [G loss: 0.859698]\n",
"[Epoch 155/200] [Batch 18/59] [D loss: 0.589998] [G loss: 0.765085]\n",
"[Epoch 155/200] [Batch 19/59] [D loss: 0.563199] [G loss: 0.996814]\n",
"[Epoch 155/200] [Batch 20/59] [D loss: 0.602538] [G loss: 0.978326]\n",
"[Epoch 155/200] [Batch 21/59] [D loss: 0.545203] [G loss: 0.915775]\n",
"[Epoch 155/200] [Batch 22/59] [D loss: 0.532240] [G loss: 0.812921]\n",
"[Epoch 155/200] [Batch 23/59] [D loss: 0.596038] [G loss: 0.989964]\n",
"[Epoch 155/200] [Batch 24/59] [D loss: 0.581119] [G loss: 0.990707]\n",
"[Epoch 155/200] [Batch 25/59] [D loss: 0.535436] [G loss: 1.052387]\n",
"[Epoch 155/200] [Batch 26/59] [D loss: 0.610304] [G loss: 0.807617]\n",
"[Epoch 155/200] [Batch 27/59] [D loss: 0.504861] [G loss: 0.980594]\n",
"[Epoch 155/200] [Batch 28/59] [D loss: 0.443319] [G loss: 0.989393]\n",
"[Epoch 155/200] [Batch 29/59] [D loss: 0.520566] [G loss: 0.926263]\n",
"[Epoch 155/200] [Batch 30/59] [D loss: 0.598707] [G loss: 0.909580]\n",
"[Epoch 155/200] [Batch 31/59] [D loss: 0.654238] [G loss: 1.305891]\n",
"[Epoch 155/200] [Batch 32/59] [D loss: 0.598366] [G loss: 0.987917]\n",
"[Epoch 155/200] [Batch 33/59] [D loss: 0.673955] [G loss: 0.713008]\n",
"[Epoch 155/200] [Batch 34/59] [D loss: 0.518475] [G loss: 1.154736]\n",
"[Epoch 155/200] [Batch 35/59] [D loss: 0.481890] [G loss: 0.958049]\n",
"[Epoch 155/200] [Batch 36/59] [D loss: 0.574516] [G loss: 0.790433]\n",
"[Epoch 155/200] [Batch 37/59] [D loss: 0.565652] [G loss: 0.746119]\n",
"[Epoch 155/200] [Batch 38/59] [D loss: 0.503830] [G loss: 1.202710]\n",
"[Epoch 155/200] [Batch 39/59] [D loss: 0.548886] [G loss: 1.276681]\n",
"[Epoch 155/200] [Batch 40/59] [D loss: 0.580908] [G loss: 0.658344]\n",
"[Epoch 155/200] [Batch 41/59] [D loss: 0.535609] [G loss: 0.844127]\n",
"[Epoch 155/200] [Batch 42/59] [D loss: 0.596899] [G loss: 1.123080]\n",
"[Epoch 155/200] [Batch 43/59] [D loss: 0.489707] [G loss: 0.874501]\n",
"[Epoch 155/200] [Batch 44/59] [D loss: 0.580396] [G loss: 0.752023]\n",
"[Epoch 155/200] [Batch 45/59] [D loss: 0.541215] [G loss: 1.010087]\n",
"[Epoch 155/200] [Batch 46/59] [D loss: 0.613953] [G loss: 0.952445]\n",
"[Epoch 155/200] [Batch 47/59] [D loss: 0.591774] [G loss: 1.016975]\n",
"[Epoch 155/200] [Batch 48/59] [D loss: 0.514239] [G loss: 0.969156]\n",
"[Epoch 155/200] [Batch 49/59] [D loss: 0.537009] [G loss: 0.829302]\n",
"[Epoch 155/200] [Batch 50/59] [D loss: 0.650189] [G loss: 1.265105]\n",
"[Epoch 155/200] [Batch 51/59] [D loss: 0.580580] [G loss: 0.676071]\n",
"[Epoch 155/200] [Batch 52/59] [D loss: 0.658444] [G loss: 0.948963]\n",
"[Epoch 155/200] [Batch 53/59] [D loss: 0.602336] [G loss: 0.896961]\n",
"[Epoch 155/200] [Batch 54/59] [D loss: 0.636230] [G loss: 1.426052]\n",
"[Epoch 155/200] [Batch 55/59] [D loss: 0.566450] [G loss: 0.826557]\n",
"[Epoch 155/200] [Batch 56/59] [D loss: 0.616212] [G loss: 0.864283]\n",
"[Epoch 155/200] [Batch 57/59] [D loss: 0.580335] [G loss: 1.480810]\n",
"[Epoch 155/200] [Batch 58/59] [D loss: 0.540150] [G loss: 0.698074]\n",
"[Epoch 156/200] [Batch 0/59] [D loss: 0.696296] [G loss: 0.679110]\n",
"[Epoch 156/200] [Batch 1/59] [D loss: 0.685165] [G loss: 1.565809]\n",
"[Epoch 156/200] [Batch 2/59] [D loss: 0.541482] [G loss: 0.777966]\n",
"[Epoch 156/200] [Batch 3/59] [D loss: 0.636370] [G loss: 0.617722]\n",
"[Epoch 156/200] [Batch 4/59] [D loss: 0.720020] [G loss: 1.674539]\n",
"[Epoch 156/200] [Batch 5/59] [D loss: 0.512546] [G loss: 1.008672]\n",
"[Epoch 156/200] [Batch 6/59] [D loss: 0.531755] [G loss: 0.749922]\n",
"[Epoch 156/200] [Batch 7/59] [D loss: 0.591879] [G loss: 1.162266]\n",
"[Epoch 156/200] [Batch 8/59] [D loss: 0.522796] [G loss: 1.353870]\n",
"[Epoch 156/200] [Batch 9/59] [D loss: 0.571801] [G loss: 0.965446]\n",
"[Epoch 156/200] [Batch 10/59] [D loss: 0.610206] [G loss: 0.816400]\n",
"[Epoch 156/200] [Batch 11/59] [D loss: 0.533954] [G loss: 0.974669]\n",
"[Epoch 156/200] [Batch 12/59] [D loss: 0.547924] [G loss: 1.044660]\n",
"[Epoch 156/200] [Batch 13/59] [D loss: 0.557782] [G loss: 1.027713]\n",
"[Epoch 156/200] [Batch 14/59] [D loss: 0.516255] [G loss: 0.961032]\n",
"[Epoch 156/200] [Batch 15/59] [D loss: 0.534788] [G loss: 0.935631]\n",
"[Epoch 156/200] [Batch 16/59] [D loss: 0.486742] [G loss: 1.103086]\n",
"[Epoch 156/200] [Batch 17/59] [D loss: 0.581490] [G loss: 0.864318]\n",
"[Epoch 156/200] [Batch 18/59] [D loss: 0.621102] [G loss: 0.869174]\n",
"[Epoch 156/200] [Batch 19/59] [D loss: 0.500232] [G loss: 0.750813]\n",
"[Epoch 156/200] [Batch 20/59] [D loss: 0.588959] [G loss: 1.312554]\n",
"[Epoch 156/200] [Batch 21/59] [D loss: 0.537992] [G loss: 1.153791]\n",
"[Epoch 156/200] [Batch 22/59] [D loss: 0.548968] [G loss: 0.861410]\n",
"[Epoch 156/200] [Batch 23/59] [D loss: 0.726116] [G loss: 1.493280]\n",
"[Epoch 156/200] [Batch 24/59] [D loss: 0.598572] [G loss: 0.757660]\n",
"[Epoch 156/200] [Batch 25/59] [D loss: 0.589237] [G loss: 0.974663]\n",
"[Epoch 156/200] [Batch 26/59] [D loss: 0.590014] [G loss: 0.750209]\n",
"[Epoch 156/200] [Batch 27/59] [D loss: 0.498731] [G loss: 0.993730]\n",
"[Epoch 156/200] [Batch 28/59] [D loss: 0.571508] [G loss: 1.065034]\n",
"[Epoch 156/200] [Batch 29/59] [D loss: 0.528315] [G loss: 0.798695]\n",
"[Epoch 156/200] [Batch 30/59] [D loss: 0.585529] [G loss: 0.761821]\n",
"[Epoch 156/200] [Batch 31/59] [D loss: 0.567707] [G loss: 0.983239]\n",
"[Epoch 156/200] [Batch 32/59] [D loss: 0.527639] [G loss: 0.995068]\n",
"[Epoch 156/200] [Batch 33/59] [D loss: 0.535300] [G loss: 0.783000]\n",
"[Epoch 156/200] [Batch 34/59] [D loss: 0.611564] [G loss: 0.920375]\n",
"[Epoch 156/200] [Batch 35/59] [D loss: 0.542533] [G loss: 0.993993]\n",
"[Epoch 156/200] [Batch 36/59] [D loss: 0.601846] [G loss: 0.783553]\n",
"[Epoch 156/200] [Batch 37/59] [D loss: 0.628298] [G loss: 0.954056]\n",
"[Epoch 156/200] [Batch 38/59] [D loss: 0.548566] [G loss: 1.242039]\n",
"[Epoch 156/200] [Batch 39/59] [D loss: 0.505607] [G loss: 0.837218]\n",
"[Epoch 156/200] [Batch 40/59] [D loss: 0.570925] [G loss: 0.991227]\n",
"[Epoch 156/200] [Batch 41/59] [D loss: 0.597516] [G loss: 0.940296]\n",
"[Epoch 156/200] [Batch 42/59] [D loss: 0.553537] [G loss: 0.876802]\n",
"[Epoch 156/200] [Batch 43/59] [D loss: 0.558386] [G loss: 1.016037]\n",
"[Epoch 156/200] [Batch 44/59] [D loss: 0.459477] [G loss: 1.096550]\n",
"[Epoch 156/200] [Batch 45/59] [D loss: 0.538709] [G loss: 1.120452]\n",
"[Epoch 156/200] [Batch 46/59] [D loss: 0.568451] [G loss: 0.779800]\n",
"[Epoch 156/200] [Batch 47/59] [D loss: 0.515586] [G loss: 0.994546]\n",
"[Epoch 156/200] [Batch 48/59] [D loss: 0.546269] [G loss: 1.258280]\n",
"[Epoch 156/200] [Batch 49/59] [D loss: 0.544937] [G loss: 0.799149]\n",
"[Epoch 156/200] [Batch 50/59] [D loss: 0.535932] [G loss: 0.859419]\n",
"[Epoch 156/200] [Batch 51/59] [D loss: 0.518102] [G loss: 0.994856]\n",
"[Epoch 156/200] [Batch 52/59] [D loss: 0.600029] [G loss: 1.127929]\n",
"[Epoch 156/200] [Batch 53/59] [D loss: 0.607511] [G loss: 0.781323]\n",
"[Epoch 156/200] [Batch 54/59] [D loss: 0.486325] [G loss: 0.978840]\n",
"[Epoch 156/200] [Batch 55/59] [D loss: 0.536861] [G loss: 1.100266]\n",
"[Epoch 156/200] [Batch 56/59] [D loss: 0.540144] [G loss: 0.683662]\n",
"[Epoch 156/200] [Batch 57/59] [D loss: 0.552425] [G loss: 0.943391]\n",
"[Epoch 156/200] [Batch 58/59] [D loss: 0.515204] [G loss: 0.784350]\n",
"[Epoch 157/200] [Batch 0/59] [D loss: 0.555612] [G loss: 0.967654]\n",
"[Epoch 157/200] [Batch 1/59] [D loss: 0.613094] [G loss: 0.976433]\n",
"[Epoch 157/200] [Batch 2/59] [D loss: 0.567943] [G loss: 0.837823]\n",
"[Epoch 157/200] [Batch 3/59] [D loss: 0.497830] [G loss: 0.974492]\n",
"[Epoch 157/200] [Batch 4/59] [D loss: 0.546007] [G loss: 0.711053]\n",
"[Epoch 157/200] [Batch 5/59] [D loss: 0.540368] [G loss: 1.007523]\n",
"[Epoch 157/200] [Batch 6/59] [D loss: 0.596696] [G loss: 0.776303]\n",
"[Epoch 157/200] [Batch 7/59] [D loss: 0.613165] [G loss: 0.903395]\n",
"[Epoch 157/200] [Batch 8/59] [D loss: 0.558968] [G loss: 1.067036]\n",
"[Epoch 157/200] [Batch 9/59] [D loss: 0.548750] [G loss: 0.939461]\n",
"[Epoch 157/200] [Batch 10/59] [D loss: 0.645759] [G loss: 0.743155]\n",
"[Epoch 157/200] [Batch 11/59] [D loss: 0.580176] [G loss: 1.048995]\n",
"[Epoch 157/200] [Batch 12/59] [D loss: 0.535083] [G loss: 0.794403]\n",
"[Epoch 157/200] [Batch 13/59] [D loss: 0.554435] [G loss: 0.854820]\n",
"[Epoch 157/200] [Batch 14/59] [D loss: 0.629617] [G loss: 1.144954]\n",
"[Epoch 157/200] [Batch 15/59] [D loss: 0.559855] [G loss: 0.855735]\n",
"[Epoch 157/200] [Batch 16/59] [D loss: 0.568789] [G loss: 0.940281]\n",
"[Epoch 157/200] [Batch 17/59] [D loss: 0.557401] [G loss: 0.824678]\n",
"[Epoch 157/200] [Batch 18/59] [D loss: 0.410888] [G loss: 0.830093]\n",
"[Epoch 157/200] [Batch 19/59] [D loss: 0.618777] [G loss: 0.865436]\n",
"[Epoch 157/200] [Batch 20/59] [D loss: 0.549324] [G loss: 0.963122]\n",
"[Epoch 157/200] [Batch 21/59] [D loss: 0.596776] [G loss: 1.260548]\n",
"[Epoch 157/200] [Batch 22/59] [D loss: 0.569667] [G loss: 0.586770]\n",
"[Epoch 157/200] [Batch 23/59] [D loss: 0.606175] [G loss: 1.323054]\n",
"[Epoch 157/200] [Batch 24/59] [D loss: 0.615817] [G loss: 0.754586]\n",
"[Epoch 157/200] [Batch 25/59] [D loss: 0.621709] [G loss: 1.049106]\n",
"[Epoch 157/200] [Batch 26/59] [D loss: 0.535853] [G loss: 0.945448]\n",
"[Epoch 157/200] [Batch 27/59] [D loss: 0.587896] [G loss: 0.961776]\n",
"[Epoch 157/200] [Batch 28/59] [D loss: 0.559907] [G loss: 0.709030]\n",
"[Epoch 157/200] [Batch 29/59] [D loss: 0.569258] [G loss: 1.158443]\n",
"[Epoch 157/200] [Batch 30/59] [D loss: 0.646981] [G loss: 0.961159]\n",
"[Epoch 157/200] [Batch 31/59] [D loss: 0.521301] [G loss: 0.735670]\n",
"[Epoch 157/200] [Batch 32/59] [D loss: 0.531026] [G loss: 1.246766]\n",
"[Epoch 157/200] [Batch 33/59] [D loss: 0.603969] [G loss: 0.999950]\n",
"[Epoch 157/200] [Batch 34/59] [D loss: 0.608351] [G loss: 1.131301]\n",
"[Epoch 157/200] [Batch 35/59] [D loss: 0.565241] [G loss: 0.809947]\n",
"[Epoch 157/200] [Batch 36/59] [D loss: 0.639103] [G loss: 1.193076]\n",
"[Epoch 157/200] [Batch 37/59] [D loss: 0.615226] [G loss: 0.677685]\n",
"[Epoch 157/200] [Batch 38/59] [D loss: 0.607602] [G loss: 1.040350]\n",
"[Epoch 157/200] [Batch 39/59] [D loss: 0.537213] [G loss: 0.989443]\n",
"[Epoch 157/200] [Batch 40/59] [D loss: 0.551962] [G loss: 0.980054]\n",
"[Epoch 157/200] [Batch 41/59] [D loss: 0.595317] [G loss: 1.292063]\n",
"[Epoch 157/200] [Batch 42/59] [D loss: 0.518292] [G loss: 0.845454]\n",
"[Epoch 157/200] [Batch 43/59] [D loss: 0.675306] [G loss: 0.694620]\n",
"[Epoch 157/200] [Batch 44/59] [D loss: 0.546657] [G loss: 1.336185]\n",
"[Epoch 157/200] [Batch 45/59] [D loss: 0.534575] [G loss: 0.731010]\n",
"[Epoch 157/200] [Batch 46/59] [D loss: 0.523977] [G loss: 0.929008]\n",
"[Epoch 157/200] [Batch 47/59] [D loss: 0.624153] [G loss: 1.515975]\n",
"[Epoch 157/200] [Batch 48/59] [D loss: 0.557667] [G loss: 0.723103]\n",
"[Epoch 157/200] [Batch 49/59] [D loss: 0.608302] [G loss: 0.549428]\n",
"[Epoch 157/200] [Batch 50/59] [D loss: 0.612036] [G loss: 1.557292]\n",
"[Epoch 157/200] [Batch 51/59] [D loss: 0.572010] [G loss: 0.805377]\n",
"[Epoch 157/200] [Batch 52/59] [D loss: 0.497786] [G loss: 0.763234]\n",
"[Epoch 157/200] [Batch 53/59] [D loss: 0.578053] [G loss: 0.829745]\n",
"[Epoch 157/200] [Batch 54/59] [D loss: 0.655226] [G loss: 1.188867]\n",
"[Epoch 157/200] [Batch 55/59] [D loss: 0.566018] [G loss: 0.785578]\n",
"[Epoch 157/200] [Batch 56/59] [D loss: 0.585057] [G loss: 0.909525]\n",
"[Epoch 157/200] [Batch 57/59] [D loss: 0.535313] [G loss: 0.973344]\n",
"[Epoch 157/200] [Batch 58/59] [D loss: 0.584326] [G loss: 1.179796]\n",
"[Epoch 158/200] [Batch 0/59] [D loss: 0.581351] [G loss: 0.856450]\n",
"[Epoch 158/200] [Batch 1/59] [D loss: 0.489148] [G loss: 0.933014]\n",
"[Epoch 158/200] [Batch 2/59] [D loss: 0.467365] [G loss: 0.958427]\n",
"[Epoch 158/200] [Batch 3/59] [D loss: 0.549071] [G loss: 0.965390]\n",
"[Epoch 158/200] [Batch 4/59] [D loss: 0.622624] [G loss: 1.006372]\n",
"[Epoch 158/200] [Batch 5/59] [D loss: 0.582710] [G loss: 1.290529]\n",
"[Epoch 158/200] [Batch 6/59] [D loss: 0.489605] [G loss: 0.833643]\n",
"[Epoch 158/200] [Batch 7/59] [D loss: 0.467432] [G loss: 0.699550]\n",
"[Epoch 158/200] [Batch 8/59] [D loss: 0.649858] [G loss: 1.214228]\n",
"[Epoch 158/200] [Batch 9/59] [D loss: 0.613827] [G loss: 1.094580]\n",
"[Epoch 158/200] [Batch 10/59] [D loss: 0.548987] [G loss: 0.831477]\n",
"[Epoch 158/200] [Batch 11/59] [D loss: 0.574691] [G loss: 1.004863]\n",
"[Epoch 158/200] [Batch 12/59] [D loss: 0.661243] [G loss: 0.748885]\n",
"[Epoch 158/200] [Batch 13/59] [D loss: 0.552589] [G loss: 1.191251]\n",
"[Epoch 158/200] [Batch 14/59] [D loss: 0.574698] [G loss: 0.766905]\n",
"[Epoch 158/200] [Batch 15/59] [D loss: 0.607875] [G loss: 0.843688]\n",
"[Epoch 158/200] [Batch 16/59] [D loss: 0.477607] [G loss: 0.995719]\n",
"[Epoch 158/200] [Batch 17/59] [D loss: 0.509611] [G loss: 0.971352]\n",
"[Epoch 158/200] [Batch 18/59] [D loss: 0.465236] [G loss: 1.126666]\n",
"[Epoch 158/200] [Batch 19/59] [D loss: 0.631740] [G loss: 1.129092]\n",
"[Epoch 158/200] [Batch 20/59] [D loss: 0.538166] [G loss: 0.864907]\n",
"[Epoch 158/200] [Batch 21/59] [D loss: 0.531634] [G loss: 0.985129]\n",
"[Epoch 158/200] [Batch 22/59] [D loss: 0.572858] [G loss: 1.360771]\n",
"[Epoch 158/200] [Batch 23/59] [D loss: 0.585635] [G loss: 0.852537]\n",
"[Epoch 158/200] [Batch 24/59] [D loss: 0.541846] [G loss: 0.829704]\n",
"[Epoch 158/200] [Batch 25/59] [D loss: 0.529577] [G loss: 0.891900]\n",
"[Epoch 158/200] [Batch 26/59] [D loss: 0.534888] [G loss: 1.182220]\n",
"[Epoch 158/200] [Batch 27/59] [D loss: 0.633569] [G loss: 1.008515]\n",
"[Epoch 158/200] [Batch 28/59] [D loss: 0.508975] [G loss: 0.886682]\n",
"[Epoch 158/200] [Batch 29/59] [D loss: 0.564843] [G loss: 1.359418]\n",
"[Epoch 158/200] [Batch 30/59] [D loss: 0.521701] [G loss: 0.987863]\n",
"[Epoch 158/200] [Batch 31/59] [D loss: 0.482338] [G loss: 0.690796]\n",
"[Epoch 158/200] [Batch 32/59] [D loss: 0.563496] [G loss: 0.851581]\n",
"[Epoch 158/200] [Batch 33/59] [D loss: 0.606836] [G loss: 1.263430]\n",
"[Epoch 158/200] [Batch 34/59] [D loss: 0.614234] [G loss: 0.778007]\n",
"[Epoch 158/200] [Batch 35/59] [D loss: 0.512326] [G loss: 0.710891]\n",
"[Epoch 158/200] [Batch 36/59] [D loss: 0.517308] [G loss: 1.375116]\n",
"[Epoch 158/200] [Batch 37/59] [D loss: 0.559034] [G loss: 1.259451]\n",
"[Epoch 158/200] [Batch 38/59] [D loss: 0.661070] [G loss: 0.684000]\n",
"[Epoch 158/200] [Batch 39/59] [D loss: 0.661735] [G loss: 1.296914]\n",
"[Epoch 158/200] [Batch 40/59] [D loss: 0.570737] [G loss: 1.288726]\n",
"[Epoch 158/200] [Batch 41/59] [D loss: 0.561072] [G loss: 0.696182]\n",
"[Epoch 158/200] [Batch 42/59] [D loss: 0.556915] [G loss: 1.362547]\n",
"[Epoch 158/200] [Batch 43/59] [D loss: 0.502614] [G loss: 1.201409]\n",
"[Epoch 158/200] [Batch 44/59] [D loss: 0.512388] [G loss: 0.657032]\n",
"[Epoch 158/200] [Batch 45/59] [D loss: 0.595233] [G loss: 0.739553]\n",
"[Epoch 158/200] [Batch 46/59] [D loss: 0.560772] [G loss: 1.011465]\n",
"[Epoch 158/200] [Batch 47/59] [D loss: 0.492422] [G loss: 0.957948]\n",
"[Epoch 158/200] [Batch 48/59] [D loss: 0.530869] [G loss: 0.905241]\n",
"[Epoch 158/200] [Batch 49/59] [D loss: 0.601448] [G loss: 0.864569]\n",
"[Epoch 158/200] [Batch 50/59] [D loss: 0.576721] [G loss: 1.249769]\n",
"[Epoch 158/200] [Batch 51/59] [D loss: 0.549111] [G loss: 0.812298]\n",
"[Epoch 158/200] [Batch 52/59] [D loss: 0.583184] [G loss: 0.809837]\n",
"[Epoch 158/200] [Batch 53/59] [D loss: 0.555871] [G loss: 1.513197]\n",
"[Epoch 158/200] [Batch 54/59] [D loss: 0.515789] [G loss: 1.072669]\n",
"[Epoch 158/200] [Batch 55/59] [D loss: 0.566766] [G loss: 0.700260]\n",
"[Epoch 158/200] [Batch 56/59] [D loss: 0.502402] [G loss: 1.078647]\n",
"[Epoch 158/200] [Batch 57/59] [D loss: 0.531251] [G loss: 1.343932]\n",
"[Epoch 158/200] [Batch 58/59] [D loss: 0.501512] [G loss: 0.764787]\n",
"[Epoch 159/200] [Batch 0/59] [D loss: 0.531872] [G loss: 0.766030]\n",
"[Epoch 159/200] [Batch 1/59] [D loss: 0.647802] [G loss: 1.303247]\n",
"[Epoch 159/200] [Batch 2/59] [D loss: 0.628146] [G loss: 0.729001]\n",
"[Epoch 159/200] [Batch 3/59] [D loss: 0.512308] [G loss: 1.025113]\n",
"[Epoch 159/200] [Batch 4/59] [D loss: 0.529774] [G loss: 1.353575]\n",
"[Epoch 159/200] [Batch 5/59] [D loss: 0.503103] [G loss: 1.022486]\n",
"[Epoch 159/200] [Batch 6/59] [D loss: 0.647314] [G loss: 0.633625]\n",
"[Epoch 159/200] [Batch 7/59] [D loss: 0.515498] [G loss: 1.146348]\n",
"[Epoch 159/200] [Batch 8/59] [D loss: 0.526269] [G loss: 0.931128]\n",
"[Epoch 159/200] [Batch 9/59] [D loss: 0.645652] [G loss: 0.828010]\n",
"[Epoch 159/200] [Batch 10/59] [D loss: 0.555907] [G loss: 1.319754]\n",
"[Epoch 159/200] [Batch 11/59] [D loss: 0.524452] [G loss: 1.009058]\n",
"[Epoch 159/200] [Batch 12/59] [D loss: 0.617062] [G loss: 0.661908]\n",
"[Epoch 159/200] [Batch 13/59] [D loss: 0.574022] [G loss: 1.340443]\n",
"[Epoch 159/200] [Batch 14/59] [D loss: 0.500414] [G loss: 0.937743]\n",
"[Epoch 159/200] [Batch 15/59] [D loss: 0.543905] [G loss: 0.795444]\n",
"[Epoch 159/200] [Batch 16/59] [D loss: 0.668332] [G loss: 1.210802]\n",
"[Epoch 159/200] [Batch 17/59] [D loss: 0.523452] [G loss: 1.020748]\n",
"[Epoch 159/200] [Batch 18/59] [D loss: 0.605198] [G loss: 0.831065]\n",
"[Epoch 159/200] [Batch 19/59] [D loss: 0.472451] [G loss: 0.917605]\n",
"[Epoch 159/200] [Batch 20/59] [D loss: 0.549226] [G loss: 1.223000]\n",
"[Epoch 159/200] [Batch 21/59] [D loss: 0.568177] [G loss: 0.863757]\n",
"[Epoch 159/200] [Batch 22/59] [D loss: 0.553753] [G loss: 1.097173]\n",
"[Epoch 159/200] [Batch 23/59] [D loss: 0.496555] [G loss: 0.929247]\n",
"[Epoch 159/200] [Batch 24/59] [D loss: 0.561213] [G loss: 0.833124]\n",
"[Epoch 159/200] [Batch 25/59] [D loss: 0.529809] [G loss: 0.888814]\n",
"[Epoch 159/200] [Batch 26/59] [D loss: 0.484773] [G loss: 1.105923]\n",
"[Epoch 159/200] [Batch 27/59] [D loss: 0.551559] [G loss: 0.837172]\n",
"[Epoch 159/200] [Batch 28/59] [D loss: 0.579781] [G loss: 0.585470]\n",
"[Epoch 159/200] [Batch 29/59] [D loss: 0.531591] [G loss: 1.273657]\n",
"[Epoch 159/200] [Batch 30/59] [D loss: 0.572170] [G loss: 0.778901]\n",
"[Epoch 159/200] [Batch 31/59] [D loss: 0.575292] [G loss: 0.848599]\n",
"[Epoch 159/200] [Batch 32/59] [D loss: 0.622055] [G loss: 1.065470]\n",
"[Epoch 159/200] [Batch 33/59] [D loss: 0.588133] [G loss: 0.630805]\n",
"[Epoch 159/200] [Batch 34/59] [D loss: 0.572984] [G loss: 1.293135]\n",
"[Epoch 159/200] [Batch 35/59] [D loss: 0.531858] [G loss: 1.112891]\n",
"[Epoch 159/200] [Batch 36/59] [D loss: 0.579533] [G loss: 0.856874]\n",
"[Epoch 159/200] [Batch 37/59] [D loss: 0.618989] [G loss: 0.865645]\n",
"[Epoch 159/200] [Batch 38/59] [D loss: 0.648415] [G loss: 0.725828]\n",
"[Epoch 159/200] [Batch 39/59] [D loss: 0.565169] [G loss: 1.158707]\n",
"[Epoch 159/200] [Batch 40/59] [D loss: 0.544788] [G loss: 0.817313]\n",
"[Epoch 159/200] [Batch 41/59] [D loss: 0.606937] [G loss: 0.867704]\n",
"[Epoch 159/200] [Batch 42/59] [D loss: 0.514720] [G loss: 1.127937]\n",
"[Epoch 159/200] [Batch 43/59] [D loss: 0.510584] [G loss: 0.673692]\n",
"[Epoch 159/200] [Batch 44/59] [D loss: 0.562843] [G loss: 1.050402]\n",
"[Epoch 159/200] [Batch 45/59] [D loss: 0.622320] [G loss: 0.680491]\n",
"[Epoch 159/200] [Batch 46/59] [D loss: 0.601293] [G loss: 1.161622]\n",
"[Epoch 159/200] [Batch 47/59] [D loss: 0.586677] [G loss: 0.971492]\n",
"[Epoch 159/200] [Batch 48/59] [D loss: 0.594214] [G loss: 1.049113]\n",
"[Epoch 159/200] [Batch 49/59] [D loss: 0.559968] [G loss: 1.115775]\n",
"[Epoch 159/200] [Batch 50/59] [D loss: 0.592079] [G loss: 0.648519]\n",
"[Epoch 159/200] [Batch 51/59] [D loss: 0.623732] [G loss: 1.207485]\n",
"[Epoch 159/200] [Batch 52/59] [D loss: 0.634403] [G loss: 0.955960]\n",
"[Epoch 159/200] [Batch 53/59] [D loss: 0.597829] [G loss: 1.033638]\n",
"[Epoch 159/200] [Batch 54/59] [D loss: 0.552015] [G loss: 1.063012]\n",
"[Epoch 159/200] [Batch 55/59] [D loss: 0.536168] [G loss: 0.941712]\n",
"[Epoch 159/200] [Batch 56/59] [D loss: 0.607738] [G loss: 0.832435]\n",
"[Epoch 159/200] [Batch 57/59] [D loss: 0.500338] [G loss: 0.965706]\n",
"[Epoch 159/200] [Batch 58/59] [D loss: 0.525884] [G loss: 1.369735]\n",
"[Epoch 160/200] [Batch 0/59] [D loss: 0.569752] [G loss: 1.032515]\n",
"[Epoch 160/200] [Batch 1/59] [D loss: 0.592573] [G loss: 0.837200]\n",
"[Epoch 160/200] [Batch 2/59] [D loss: 0.589750] [G loss: 0.976635]\n",
"[Epoch 160/200] [Batch 3/59] [D loss: 0.614211] [G loss: 0.980762]\n",
"[Epoch 160/200] [Batch 4/59] [D loss: 0.548426] [G loss: 0.740074]\n",
"[Epoch 160/200] [Batch 5/59] [D loss: 0.507838] [G loss: 1.028278]\n",
"[Epoch 160/200] [Batch 6/59] [D loss: 0.476612] [G loss: 1.127113]\n",
"[Epoch 160/200] [Batch 7/59] [D loss: 0.466249] [G loss: 0.861398]\n",
"[Epoch 160/200] [Batch 8/59] [D loss: 0.504423] [G loss: 0.742609]\n",
"[Epoch 160/200] [Batch 9/59] [D loss: 0.521434] [G loss: 1.078722]\n",
"[Epoch 160/200] [Batch 10/59] [D loss: 0.570682] [G loss: 0.895172]\n",
"[Epoch 160/200] [Batch 11/59] [D loss: 0.528338] [G loss: 1.176806]\n",
"[Epoch 160/200] [Batch 12/59] [D loss: 0.590465] [G loss: 0.823128]\n",
"[Epoch 160/200] [Batch 13/59] [D loss: 0.585728] [G loss: 0.739310]\n",
"[Epoch 160/200] [Batch 14/59] [D loss: 0.586123] [G loss: 1.045727]\n",
"[Epoch 160/200] [Batch 15/59] [D loss: 0.594292] [G loss: 0.962361]\n",
"[Epoch 160/200] [Batch 16/59] [D loss: 0.536457] [G loss: 1.186356]\n",
"[Epoch 160/200] [Batch 17/59] [D loss: 0.573524] [G loss: 0.902562]\n",
"[Epoch 160/200] [Batch 18/59] [D loss: 0.586598] [G loss: 1.165148]\n",
"[Epoch 160/200] [Batch 19/59] [D loss: 0.555161] [G loss: 1.228759]\n",
"[Epoch 160/200] [Batch 20/59] [D loss: 0.550613] [G loss: 1.158589]\n",
"[Epoch 160/200] [Batch 21/59] [D loss: 0.582654] [G loss: 0.746189]\n",
"[Epoch 160/200] [Batch 22/59] [D loss: 0.534116] [G loss: 1.180712]\n",
"[Epoch 160/200] [Batch 23/59] [D loss: 0.587263] [G loss: 0.920271]\n",
"[Epoch 160/200] [Batch 24/59] [D loss: 0.633916] [G loss: 0.632941]\n",
"[Epoch 160/200] [Batch 25/59] [D loss: 0.639460] [G loss: 1.326805]\n",
"[Epoch 160/200] [Batch 26/59] [D loss: 0.580843] [G loss: 0.949518]\n",
"[Epoch 160/200] [Batch 27/59] [D loss: 0.670726] [G loss: 0.523117]\n",
"[Epoch 160/200] [Batch 28/59] [D loss: 0.625919] [G loss: 1.383793]\n",
"[Epoch 160/200] [Batch 29/59] [D loss: 0.513619] [G loss: 0.866432]\n",
"[Epoch 160/200] [Batch 30/59] [D loss: 0.563634] [G loss: 0.743952]\n",
"[Epoch 160/200] [Batch 31/59] [D loss: 0.489189] [G loss: 1.061792]\n",
"[Epoch 160/200] [Batch 32/59] [D loss: 0.601601] [G loss: 0.928679]\n",
"[Epoch 160/200] [Batch 33/59] [D loss: 0.547069] [G loss: 0.669790]\n",
"[Epoch 160/200] [Batch 34/59] [D loss: 0.562046] [G loss: 0.967378]\n",
"[Epoch 160/200] [Batch 35/59] [D loss: 0.624573] [G loss: 1.332568]\n",
"[Epoch 160/200] [Batch 36/59] [D loss: 0.595902] [G loss: 0.741826]\n",
"[Epoch 160/200] [Batch 37/59] [D loss: 0.594231] [G loss: 1.082998]\n",
"[Epoch 160/200] [Batch 38/59] [D loss: 0.665835] [G loss: 0.865646]\n",
"[Epoch 160/200] [Batch 39/59] [D loss: 0.539549] [G loss: 0.859361]\n",
"[Epoch 160/200] [Batch 40/59] [D loss: 0.539206] [G loss: 0.844852]\n",
"[Epoch 160/200] [Batch 41/59] [D loss: 0.661217] [G loss: 0.981202]\n",
"[Epoch 160/200] [Batch 42/59] [D loss: 0.553508] [G loss: 1.008642]\n",
"[Epoch 160/200] [Batch 43/59] [D loss: 0.634025] [G loss: 1.074187]\n",
"[Epoch 160/200] [Batch 44/59] [D loss: 0.568017] [G loss: 0.463597]\n",
"[Epoch 160/200] [Batch 45/59] [D loss: 0.545627] [G loss: 1.249323]\n",
"[Epoch 160/200] [Batch 46/59] [D loss: 0.540442] [G loss: 1.145031]\n",
"[Epoch 160/200] [Batch 47/59] [D loss: 0.589667] [G loss: 0.972818]\n",
"[Epoch 160/200] [Batch 48/59] [D loss: 0.590714] [G loss: 1.103340]\n",
"[Epoch 160/200] [Batch 49/59] [D loss: 0.598763] [G loss: 0.844149]\n",
"[Epoch 160/200] [Batch 50/59] [D loss: 0.510719] [G loss: 1.280835]\n",
"[Epoch 160/200] [Batch 51/59] [D loss: 0.527562] [G loss: 0.940558]\n",
"[Epoch 160/200] [Batch 52/59] [D loss: 0.654130] [G loss: 0.842061]\n",
"[Epoch 160/200] [Batch 53/59] [D loss: 0.485405] [G loss: 1.016960]\n",
"[Epoch 160/200] [Batch 54/59] [D loss: 0.540572] [G loss: 0.785219]\n",
"[Epoch 160/200] [Batch 55/59] [D loss: 0.616394] [G loss: 1.070547]\n",
"[Epoch 160/200] [Batch 56/59] [D loss: 0.548025] [G loss: 1.128893]\n",
"[Epoch 160/200] [Batch 57/59] [D loss: 0.653667] [G loss: 0.870241]\n",
"[Epoch 160/200] [Batch 58/59] [D loss: 0.535369] [G loss: 1.509016]\n",
"[Epoch 161/200] [Batch 0/59] [D loss: 0.558127] [G loss: 1.068383]\n",
"[Epoch 161/200] [Batch 1/59] [D loss: 0.534975] [G loss: 0.726079]\n",
"[Epoch 161/200] [Batch 2/59] [D loss: 0.487326] [G loss: 1.108625]\n",
"[Epoch 161/200] [Batch 3/59] [D loss: 0.592514] [G loss: 1.570891]\n",
"[Epoch 161/200] [Batch 4/59] [D loss: 0.549633] [G loss: 0.660419]\n",
"[Epoch 161/200] [Batch 5/59] [D loss: 0.539419] [G loss: 0.933969]\n",
"[Epoch 161/200] [Batch 6/59] [D loss: 0.527077] [G loss: 1.412549]\n",
"[Epoch 161/200] [Batch 7/59] [D loss: 0.490710] [G loss: 0.779626]\n",
"[Epoch 161/200] [Batch 8/59] [D loss: 0.596187] [G loss: 0.656006]\n",
"[Epoch 161/200] [Batch 9/59] [D loss: 0.505070] [G loss: 1.056103]\n",
"[Epoch 161/200] [Batch 10/59] [D loss: 0.498069] [G loss: 0.738476]\n",
"[Epoch 161/200] [Batch 11/59] [D loss: 0.490528] [G loss: 1.007036]\n",
"[Epoch 161/200] [Batch 12/59] [D loss: 0.527449] [G loss: 1.036735]\n",
"[Epoch 161/200] [Batch 13/59] [D loss: 0.475009] [G loss: 0.800617]\n",
"[Epoch 161/200] [Batch 14/59] [D loss: 0.591822] [G loss: 0.957786]\n",
"[Epoch 161/200] [Batch 15/59] [D loss: 0.568110] [G loss: 1.012962]\n",
"[Epoch 161/200] [Batch 16/59] [D loss: 0.537583] [G loss: 1.016069]\n",
"[Epoch 161/200] [Batch 17/59] [D loss: 0.626127] [G loss: 0.954725]\n",
"[Epoch 161/200] [Batch 18/59] [D loss: 0.579216] [G loss: 1.108102]\n",
"[Epoch 161/200] [Batch 19/59] [D loss: 0.667472] [G loss: 0.645083]\n",
"[Epoch 161/200] [Batch 20/59] [D loss: 0.553423] [G loss: 1.169724]\n",
"[Epoch 161/200] [Batch 21/59] [D loss: 0.593593] [G loss: 0.943867]\n",
"[Epoch 161/200] [Batch 22/59] [D loss: 0.648715] [G loss: 1.199112]\n",
"[Epoch 161/200] [Batch 23/59] [D loss: 0.553307] [G loss: 0.635522]\n",
"[Epoch 161/200] [Batch 24/59] [D loss: 0.620246] [G loss: 0.993285]\n",
"[Epoch 161/200] [Batch 25/59] [D loss: 0.605641] [G loss: 1.040761]\n",
"[Epoch 161/200] [Batch 26/59] [D loss: 0.542394] [G loss: 0.851024]\n",
"[Epoch 161/200] [Batch 27/59] [D loss: 0.517318] [G loss: 0.880724]\n",
"[Epoch 161/200] [Batch 28/59] [D loss: 0.567052] [G loss: 0.850691]\n",
"[Epoch 161/200] [Batch 29/59] [D loss: 0.557951] [G loss: 0.990927]\n",
"[Epoch 161/200] [Batch 30/59] [D loss: 0.565567] [G loss: 1.016447]\n",
"[Epoch 161/200] [Batch 31/59] [D loss: 0.557711] [G loss: 0.905634]\n",
"[Epoch 161/200] [Batch 32/59] [D loss: 0.467234] [G loss: 0.930170]\n",
"[Epoch 161/200] [Batch 33/59] [D loss: 0.629006] [G loss: 1.190929]\n",
"[Epoch 161/200] [Batch 34/59] [D loss: 0.452962] [G loss: 0.779837]\n",
"[Epoch 161/200] [Batch 35/59] [D loss: 0.556803] [G loss: 0.819404]\n",
"[Epoch 161/200] [Batch 36/59] [D loss: 0.509806] [G loss: 1.255148]\n",
"[Epoch 161/200] [Batch 37/59] [D loss: 0.571378] [G loss: 0.952708]\n",
"[Epoch 161/200] [Batch 38/59] [D loss: 0.654685] [G loss: 0.477834]\n",
"[Epoch 161/200] [Batch 39/59] [D loss: 0.642548] [G loss: 1.187138]\n",
"[Epoch 161/200] [Batch 40/59] [D loss: 0.615861] [G loss: 0.772394]\n",
"[Epoch 161/200] [Batch 41/59] [D loss: 0.572805] [G loss: 1.007262]\n",
"[Epoch 161/200] [Batch 42/59] [D loss: 0.551531] [G loss: 1.107293]\n",
"[Epoch 161/200] [Batch 43/59] [D loss: 0.538976] [G loss: 0.749250]\n",
"[Epoch 161/200] [Batch 44/59] [D loss: 0.529340] [G loss: 1.128242]\n",
"[Epoch 161/200] [Batch 45/59] [D loss: 0.562475] [G loss: 1.337761]\n",
"[Epoch 161/200] [Batch 46/59] [D loss: 0.556102] [G loss: 0.746575]\n",
"[Epoch 161/200] [Batch 47/59] [D loss: 0.606619] [G loss: 0.835586]\n",
"[Epoch 161/200] [Batch 48/59] [D loss: 0.630062] [G loss: 1.234247]\n",
"[Epoch 161/200] [Batch 49/59] [D loss: 0.602520] [G loss: 1.193210]\n",
"[Epoch 161/200] [Batch 50/59] [D loss: 0.540208] [G loss: 0.678985]\n",
"[Epoch 161/200] [Batch 51/59] [D loss: 0.537663] [G loss: 0.945966]\n",
"[Epoch 161/200] [Batch 52/59] [D loss: 0.609352] [G loss: 1.544265]\n",
"[Epoch 161/200] [Batch 53/59] [D loss: 0.548510] [G loss: 0.784120]\n",
"[Epoch 161/200] [Batch 54/59] [D loss: 0.581043] [G loss: 0.809538]\n",
"[Epoch 161/200] [Batch 55/59] [D loss: 0.633808] [G loss: 1.310361]\n",
"[Epoch 161/200] [Batch 56/59] [D loss: 0.518921] [G loss: 1.007443]\n",
"[Epoch 161/200] [Batch 57/59] [D loss: 0.622118] [G loss: 0.753207]\n",
"[Epoch 161/200] [Batch 58/59] [D loss: 0.452882] [G loss: 1.074835]\n",
"[Epoch 162/200] [Batch 0/59] [D loss: 0.508861] [G loss: 0.649809]\n",
"[Epoch 162/200] [Batch 1/59] [D loss: 0.626909] [G loss: 0.890572]\n",
"[Epoch 162/200] [Batch 2/59] [D loss: 0.528953] [G loss: 1.130440]\n",
"[Epoch 162/200] [Batch 3/59] [D loss: 0.582286] [G loss: 0.857368]\n",
"[Epoch 162/200] [Batch 4/59] [D loss: 0.466448] [G loss: 0.893407]\n",
"[Epoch 162/200] [Batch 5/59] [D loss: 0.534103] [G loss: 0.992356]\n",
"[Epoch 162/200] [Batch 6/59] [D loss: 0.601705] [G loss: 1.050916]\n",
"[Epoch 162/200] [Batch 7/59] [D loss: 0.579550] [G loss: 0.742779]\n",
"[Epoch 162/200] [Batch 8/59] [D loss: 0.594999] [G loss: 1.548292]\n",
"[Epoch 162/200] [Batch 9/59] [D loss: 0.582674] [G loss: 0.698057]\n",
"[Epoch 162/200] [Batch 10/59] [D loss: 0.584915] [G loss: 0.886337]\n",
"[Epoch 162/200] [Batch 11/59] [D loss: 0.521446] [G loss: 0.775308]\n",
"[Epoch 162/200] [Batch 12/59] [D loss: 0.530826] [G loss: 1.237235]\n",
"[Epoch 162/200] [Batch 13/59] [D loss: 0.571452] [G loss: 0.897793]\n",
"[Epoch 162/200] [Batch 14/59] [D loss: 0.570672] [G loss: 0.854465]\n",
"[Epoch 162/200] [Batch 15/59] [D loss: 0.581568] [G loss: 1.282576]\n",
"[Epoch 162/200] [Batch 16/59] [D loss: 0.555704] [G loss: 0.788718]\n",
"[Epoch 162/200] [Batch 17/59] [D loss: 0.522431] [G loss: 0.647536]\n",
"[Epoch 162/200] [Batch 18/59] [D loss: 0.507247] [G loss: 1.130774]\n",
"[Epoch 162/200] [Batch 19/59] [D loss: 0.557060] [G loss: 1.199353]\n",
"[Epoch 162/200] [Batch 20/59] [D loss: 0.503295] [G loss: 0.649343]\n",
"[Epoch 162/200] [Batch 21/59] [D loss: 0.600506] [G loss: 0.767457]\n",
"[Epoch 162/200] [Batch 22/59] [D loss: 0.531524] [G loss: 1.228569]\n",
"[Epoch 162/200] [Batch 23/59] [D loss: 0.556894] [G loss: 1.081010]\n",
"[Epoch 162/200] [Batch 24/59] [D loss: 0.537046] [G loss: 0.894477]\n",
"[Epoch 162/200] [Batch 25/59] [D loss: 0.509791] [G loss: 1.089789]\n",
"[Epoch 162/200] [Batch 26/59] [D loss: 0.503751] [G loss: 1.054497]\n",
"[Epoch 162/200] [Batch 27/59] [D loss: 0.545444] [G loss: 0.971899]\n",
"[Epoch 162/200] [Batch 28/59] [D loss: 0.520197] [G loss: 1.098929]\n",
"[Epoch 162/200] [Batch 29/59] [D loss: 0.565421] [G loss: 0.877559]\n",
"[Epoch 162/200] [Batch 30/59] [D loss: 0.604566] [G loss: 1.250564]\n",
"[Epoch 162/200] [Batch 31/59] [D loss: 0.637376] [G loss: 0.517953]\n",
"[Epoch 162/200] [Batch 32/59] [D loss: 0.569325] [G loss: 0.939745]\n",
"[Epoch 162/200] [Batch 33/59] [D loss: 0.483686] [G loss: 1.011108]\n",
"[Epoch 162/200] [Batch 34/59] [D loss: 0.547785] [G loss: 0.922515]\n",
"[Epoch 162/200] [Batch 35/59] [D loss: 0.499112] [G loss: 0.914313]\n",
"[Epoch 162/200] [Batch 36/59] [D loss: 0.592124] [G loss: 1.368650]\n",
"[Epoch 162/200] [Batch 37/59] [D loss: 0.476213] [G loss: 0.945199]\n",
"[Epoch 162/200] [Batch 38/59] [D loss: 0.662920] [G loss: 0.783491]\n",
"[Epoch 162/200] [Batch 39/59] [D loss: 0.583966] [G loss: 1.423626]\n",
"[Epoch 162/200] [Batch 40/59] [D loss: 0.541254] [G loss: 0.852438]\n",
"[Epoch 162/200] [Batch 41/59] [D loss: 0.598379] [G loss: 1.087397]\n",
"[Epoch 162/200] [Batch 42/59] [D loss: 0.505412] [G loss: 0.728893]\n",
"[Epoch 162/200] [Batch 43/59] [D loss: 0.535560] [G loss: 0.746873]\n",
"[Epoch 162/200] [Batch 44/59] [D loss: 0.496497] [G loss: 0.921608]\n",
"[Epoch 162/200] [Batch 45/59] [D loss: 0.620019] [G loss: 1.090983]\n",
"[Epoch 162/200] [Batch 46/59] [D loss: 0.754027] [G loss: 0.521145]\n",
"[Epoch 162/200] [Batch 47/59] [D loss: 0.555830] [G loss: 1.231585]\n",
"[Epoch 162/200] [Batch 48/59] [D loss: 0.581175] [G loss: 0.972924]\n",
"[Epoch 162/200] [Batch 49/59] [D loss: 0.555647] [G loss: 0.762930]\n",
"[Epoch 162/200] [Batch 50/59] [D loss: 0.537282] [G loss: 0.752995]\n",
"[Epoch 162/200] [Batch 51/59] [D loss: 0.576321] [G loss: 1.363496]\n",
"[Epoch 162/200] [Batch 52/59] [D loss: 0.543658] [G loss: 1.161768]\n",
"[Epoch 162/200] [Batch 53/59] [D loss: 0.631856] [G loss: 0.697708]\n",
"[Epoch 162/200] [Batch 54/59] [D loss: 0.569629] [G loss: 1.667295]\n",
"[Epoch 162/200] [Batch 55/59] [D loss: 0.489861] [G loss: 1.179328]\n",
"[Epoch 162/200] [Batch 56/59] [D loss: 0.579305] [G loss: 0.695555]\n",
"[Epoch 162/200] [Batch 57/59] [D loss: 0.574502] [G loss: 1.675427]\n",
"[Epoch 162/200] [Batch 58/59] [D loss: 0.509820] [G loss: 1.258833]\n",
"[Epoch 163/200] [Batch 0/59] [D loss: 0.648349] [G loss: 0.578447]\n",
"[Epoch 163/200] [Batch 1/59] [D loss: 0.616963] [G loss: 1.349089]\n",
"[Epoch 163/200] [Batch 2/59] [D loss: 0.534931] [G loss: 1.067517]\n",
"[Epoch 163/200] [Batch 3/59] [D loss: 0.581756] [G loss: 0.745450]\n",
"[Epoch 163/200] [Batch 4/59] [D loss: 0.560538] [G loss: 1.369236]\n",
"[Epoch 163/200] [Batch 5/59] [D loss: 0.513018] [G loss: 0.858803]\n",
"[Epoch 163/200] [Batch 6/59] [D loss: 0.503696] [G loss: 0.564941]\n",
"[Epoch 163/200] [Batch 7/59] [D loss: 0.504593] [G loss: 1.391264]\n",
"[Epoch 163/200] [Batch 8/59] [D loss: 0.530448] [G loss: 1.262047]\n",
"[Epoch 163/200] [Batch 9/59] [D loss: 0.581241] [G loss: 0.748249]\n",
"[Epoch 163/200] [Batch 10/59] [D loss: 0.628750] [G loss: 0.892487]\n",
"[Epoch 163/200] [Batch 11/59] [D loss: 0.555533] [G loss: 0.898365]\n",
"[Epoch 163/200] [Batch 12/59] [D loss: 0.534990] [G loss: 1.162301]\n",
"[Epoch 163/200] [Batch 13/59] [D loss: 0.626434] [G loss: 1.126399]\n",
"[Epoch 163/200] [Batch 14/59] [D loss: 0.565585] [G loss: 0.768589]\n",
"[Epoch 163/200] [Batch 15/59] [D loss: 0.585282] [G loss: 1.225122]\n",
"[Epoch 163/200] [Batch 16/59] [D loss: 0.636153] [G loss: 0.706639]\n",
"[Epoch 163/200] [Batch 17/59] [D loss: 0.502713] [G loss: 0.836949]\n",
"[Epoch 163/200] [Batch 18/59] [D loss: 0.618568] [G loss: 1.067569]\n",
"[Epoch 163/200] [Batch 19/59] [D loss: 0.554995] [G loss: 1.126196]\n",
"[Epoch 163/200] [Batch 20/59] [D loss: 0.578673] [G loss: 0.862537]\n",
"[Epoch 163/200] [Batch 21/59] [D loss: 0.669842] [G loss: 0.904897]\n",
"[Epoch 163/200] [Batch 22/59] [D loss: 0.581212] [G loss: 0.996395]\n",
"[Epoch 163/200] [Batch 23/59] [D loss: 0.618424] [G loss: 0.745521]\n",
"[Epoch 163/200] [Batch 24/59] [D loss: 0.617943] [G loss: 0.919885]\n",
"[Epoch 163/200] [Batch 25/59] [D loss: 0.466381] [G loss: 1.036674]\n",
"[Epoch 163/200] [Batch 26/59] [D loss: 0.575934] [G loss: 1.052139]\n",
"[Epoch 163/200] [Batch 27/59] [D loss: 0.565109] [G loss: 0.823137]\n",
"[Epoch 163/200] [Batch 28/59] [D loss: 0.557651] [G loss: 1.092544]\n",
"[Epoch 163/200] [Batch 29/59] [D loss: 0.588895] [G loss: 0.921715]\n",
"[Epoch 163/200] [Batch 30/59] [D loss: 0.561987] [G loss: 0.912072]\n",
"[Epoch 163/200] [Batch 31/59] [D loss: 0.615738] [G loss: 0.883868]\n",
"[Epoch 163/200] [Batch 32/59] [D loss: 0.629928] [G loss: 0.833003]\n",
"[Epoch 163/200] [Batch 33/59] [D loss: 0.615241] [G loss: 1.328410]\n",
"[Epoch 163/200] [Batch 34/59] [D loss: 0.568270] [G loss: 0.886695]\n",
"[Epoch 163/200] [Batch 35/59] [D loss: 0.514655] [G loss: 0.673692]\n",
"[Epoch 163/200] [Batch 36/59] [D loss: 0.621511] [G loss: 1.108612]\n",
"[Epoch 163/200] [Batch 37/59] [D loss: 0.570443] [G loss: 0.936805]\n",
"[Epoch 163/200] [Batch 38/59] [D loss: 0.632228] [G loss: 0.828376]\n",
"[Epoch 163/200] [Batch 39/59] [D loss: 0.561733] [G loss: 0.743617]\n",
"[Epoch 163/200] [Batch 40/59] [D loss: 0.542864] [G loss: 0.771288]\n",
"[Epoch 163/200] [Batch 41/59] [D loss: 0.517883] [G loss: 1.202654]\n",
"[Epoch 163/200] [Batch 42/59] [D loss: 0.553512] [G loss: 0.802556]\n",
"[Epoch 163/200] [Batch 43/59] [D loss: 0.538240] [G loss: 0.642816]\n",
"[Epoch 163/200] [Batch 44/59] [D loss: 0.539626] [G loss: 1.282616]\n",
"[Epoch 163/200] [Batch 45/59] [D loss: 0.517924] [G loss: 1.203109]\n",
"[Epoch 163/200] [Batch 46/59] [D loss: 0.575157] [G loss: 0.795714]\n",
"[Epoch 163/200] [Batch 47/59] [D loss: 0.533237] [G loss: 1.380068]\n",
"[Epoch 163/200] [Batch 48/59] [D loss: 0.551211] [G loss: 1.314089]\n",
"[Epoch 163/200] [Batch 49/59] [D loss: 0.602136] [G loss: 0.479988]\n",
"[Epoch 163/200] [Batch 50/59] [D loss: 0.531509] [G loss: 1.404435]\n",
"[Epoch 163/200] [Batch 51/59] [D loss: 0.501203] [G loss: 1.350866]\n",
"[Epoch 163/200] [Batch 52/59] [D loss: 0.560859] [G loss: 0.525359]\n",
"[Epoch 163/200] [Batch 53/59] [D loss: 0.543341] [G loss: 1.008914]\n",
"[Epoch 163/200] [Batch 54/59] [D loss: 0.666366] [G loss: 1.472946]\n",
"[Epoch 163/200] [Batch 55/59] [D loss: 0.545691] [G loss: 0.380483]\n",
"[Epoch 163/200] [Batch 56/59] [D loss: 0.573045] [G loss: 0.718120]\n",
"[Epoch 163/200] [Batch 57/59] [D loss: 0.579060] [G loss: 1.516109]\n",
"[Epoch 163/200] [Batch 58/59] [D loss: 0.453235] [G loss: 0.861570]\n",
"[Epoch 164/200] [Batch 0/59] [D loss: 0.591616] [G loss: 0.666448]\n",
"[Epoch 164/200] [Batch 1/59] [D loss: 0.550453] [G loss: 1.276727]\n",
"[Epoch 164/200] [Batch 2/59] [D loss: 0.506869] [G loss: 1.186450]\n",
"[Epoch 164/200] [Batch 3/59] [D loss: 0.648590] [G loss: 0.537512]\n",
"[Epoch 164/200] [Batch 4/59] [D loss: 0.576137] [G loss: 1.348713]\n",
"[Epoch 164/200] [Batch 5/59] [D loss: 0.573711] [G loss: 0.867723]\n",
"[Epoch 164/200] [Batch 6/59] [D loss: 0.627790] [G loss: 0.784622]\n",
"[Epoch 164/200] [Batch 7/59] [D loss: 0.551920] [G loss: 1.382901]\n",
"[Epoch 164/200] [Batch 8/59] [D loss: 0.652296] [G loss: 1.040927]\n",
"[Epoch 164/200] [Batch 9/59] [D loss: 0.688203] [G loss: 0.527572]\n",
"[Epoch 164/200] [Batch 10/59] [D loss: 0.637031] [G loss: 1.046145]\n",
"[Epoch 164/200] [Batch 11/59] [D loss: 0.566288] [G loss: 1.073601]\n",
"[Epoch 164/200] [Batch 12/59] [D loss: 0.549039] [G loss: 0.599358]\n",
"[Epoch 164/200] [Batch 13/59] [D loss: 0.485994] [G loss: 1.100564]\n",
"[Epoch 164/200] [Batch 14/59] [D loss: 0.517223] [G loss: 0.858259]\n",
"[Epoch 164/200] [Batch 15/59] [D loss: 0.578594] [G loss: 1.171828]\n",
"[Epoch 164/200] [Batch 16/59] [D loss: 0.563948] [G loss: 0.853068]\n",
"[Epoch 164/200] [Batch 17/59] [D loss: 0.505490] [G loss: 0.842487]\n",
"[Epoch 164/200] [Batch 18/59] [D loss: 0.567483] [G loss: 0.964151]\n",
"[Epoch 164/200] [Batch 19/59] [D loss: 0.498170] [G loss: 0.857099]\n",
"[Epoch 164/200] [Batch 20/59] [D loss: 0.554114] [G loss: 1.069777]\n",
"[Epoch 164/200] [Batch 21/59] [D loss: 0.583664] [G loss: 1.037066]\n",
"[Epoch 164/200] [Batch 22/59] [D loss: 0.568805] [G loss: 0.752399]\n",
"[Epoch 164/200] [Batch 23/59] [D loss: 0.497462] [G loss: 0.847947]\n",
"[Epoch 164/200] [Batch 24/59] [D loss: 0.516744] [G loss: 1.086108]\n",
"[Epoch 164/200] [Batch 25/59] [D loss: 0.575363] [G loss: 0.887164]\n",
"[Epoch 164/200] [Batch 26/59] [D loss: 0.539161] [G loss: 0.816319]\n",
"[Epoch 164/200] [Batch 27/59] [D loss: 0.555623] [G loss: 0.926255]\n",
"[Epoch 164/200] [Batch 28/59] [D loss: 0.547526] [G loss: 1.122305]\n",
"[Epoch 164/200] [Batch 29/59] [D loss: 0.554103] [G loss: 0.746839]\n",
"[Epoch 164/200] [Batch 30/59] [D loss: 0.560209] [G loss: 0.675805]\n",
"[Epoch 164/200] [Batch 31/59] [D loss: 0.513004] [G loss: 1.055544]\n",
"[Epoch 164/200] [Batch 32/59] [D loss: 0.559217] [G loss: 1.079307]\n",
"[Epoch 164/200] [Batch 33/59] [D loss: 0.594201] [G loss: 0.981148]\n",
"[Epoch 164/200] [Batch 34/59] [D loss: 0.564714] [G loss: 0.727681]\n",
"[Epoch 164/200] [Batch 35/59] [D loss: 0.502720] [G loss: 1.133129]\n",
"[Epoch 164/200] [Batch 36/59] [D loss: 0.548525] [G loss: 0.887931]\n",
"[Epoch 164/200] [Batch 37/59] [D loss: 0.602429] [G loss: 0.911502]\n",
"[Epoch 164/200] [Batch 38/59] [D loss: 0.565253] [G loss: 0.881153]\n",
"[Epoch 164/200] [Batch 39/59] [D loss: 0.626141] [G loss: 0.801149]\n",
"[Epoch 164/200] [Batch 40/59] [D loss: 0.503561] [G loss: 1.173854]\n",
"[Epoch 164/200] [Batch 41/59] [D loss: 0.547039] [G loss: 0.916891]\n",
"[Epoch 164/200] [Batch 42/59] [D loss: 0.550917] [G loss: 1.038114]\n",
"[Epoch 164/200] [Batch 43/59] [D loss: 0.522593] [G loss: 0.845747]\n",
"[Epoch 164/200] [Batch 44/59] [D loss: 0.476544] [G loss: 1.041713]\n",
"[Epoch 164/200] [Batch 45/59] [D loss: 0.580827] [G loss: 1.103813]\n",
"[Epoch 164/200] [Batch 46/59] [D loss: 0.581582] [G loss: 0.967794]\n",
"[Epoch 164/200] [Batch 47/59] [D loss: 0.561383] [G loss: 0.722744]\n",
"[Epoch 164/200] [Batch 48/59] [D loss: 0.558749] [G loss: 1.327036]\n",
"[Epoch 164/200] [Batch 49/59] [D loss: 0.473599] [G loss: 1.063734]\n",
"[Epoch 164/200] [Batch 50/59] [D loss: 0.632504] [G loss: 0.994274]\n",
"[Epoch 164/200] [Batch 51/59] [D loss: 0.621204] [G loss: 1.291858]\n",
"[Epoch 164/200] [Batch 52/59] [D loss: 0.521440] [G loss: 0.589906]\n",
"[Epoch 164/200] [Batch 53/59] [D loss: 0.523995] [G loss: 0.843410]\n",
"[Epoch 164/200] [Batch 54/59] [D loss: 0.639827] [G loss: 1.207285]\n",
"[Epoch 164/200] [Batch 55/59] [D loss: 0.594911] [G loss: 0.901710]\n",
"[Epoch 164/200] [Batch 56/59] [D loss: 0.559414] [G loss: 0.828314]\n",
"[Epoch 164/200] [Batch 57/59] [D loss: 0.488771] [G loss: 0.854241]\n",
"[Epoch 164/200] [Batch 58/59] [D loss: 0.539725] [G loss: 1.025209]\n",
"[Epoch 165/200] [Batch 0/59] [D loss: 0.695309] [G loss: 0.627515]\n",
"[Epoch 165/200] [Batch 1/59] [D loss: 0.624897] [G loss: 1.227276]\n",
"[Epoch 165/200] [Batch 2/59] [D loss: 0.557066] [G loss: 1.183475]\n",
"[Epoch 165/200] [Batch 3/59] [D loss: 0.494622] [G loss: 0.694441]\n",
"[Epoch 165/200] [Batch 4/59] [D loss: 0.523409] [G loss: 1.054136]\n",
"[Epoch 165/200] [Batch 5/59] [D loss: 0.577627] [G loss: 1.234581]\n",
"[Epoch 165/200] [Batch 6/59] [D loss: 0.598864] [G loss: 0.703813]\n",
"[Epoch 165/200] [Batch 7/59] [D loss: 0.682270] [G loss: 1.025267]\n",
"[Epoch 165/200] [Batch 8/59] [D loss: 0.545797] [G loss: 0.903856]\n",
"[Epoch 165/200] [Batch 9/59] [D loss: 0.597794] [G loss: 0.724173]\n",
"[Epoch 165/200] [Batch 10/59] [D loss: 0.579149] [G loss: 1.295081]\n",
"[Epoch 165/200] [Batch 11/59] [D loss: 0.540579] [G loss: 0.947604]\n",
"[Epoch 165/200] [Batch 12/59] [D loss: 0.559450] [G loss: 0.805045]\n",
"[Epoch 165/200] [Batch 13/59] [D loss: 0.573381] [G loss: 1.101861]\n",
"[Epoch 165/200] [Batch 14/59] [D loss: 0.481336] [G loss: 0.668691]\n",
"[Epoch 165/200] [Batch 15/59] [D loss: 0.609217] [G loss: 0.801861]\n",
"[Epoch 165/200] [Batch 16/59] [D loss: 0.569698] [G loss: 1.179672]\n",
"[Epoch 165/200] [Batch 17/59] [D loss: 0.493241] [G loss: 0.787960]\n",
"[Epoch 165/200] [Batch 18/59] [D loss: 0.525255] [G loss: 0.732170]\n",
"[Epoch 165/200] [Batch 19/59] [D loss: 0.537482] [G loss: 1.324169]\n",
"[Epoch 165/200] [Batch 20/59] [D loss: 0.547786] [G loss: 0.852780]\n",
"[Epoch 165/200] [Batch 21/59] [D loss: 0.477383] [G loss: 0.815593]\n",
"[Epoch 165/200] [Batch 22/59] [D loss: 0.641384] [G loss: 1.008564]\n",
"[Epoch 165/200] [Batch 23/59] [D loss: 0.604766] [G loss: 1.128034]\n",
"[Epoch 165/200] [Batch 24/59] [D loss: 0.600824] [G loss: 0.788340]\n",
"[Epoch 165/200] [Batch 25/59] [D loss: 0.570540] [G loss: 0.748006]\n",
"[Epoch 165/200] [Batch 26/59] [D loss: 0.580492] [G loss: 1.286343]\n",
"[Epoch 165/200] [Batch 27/59] [D loss: 0.550841] [G loss: 0.859902]\n",
"[Epoch 165/200] [Batch 28/59] [D loss: 0.566604] [G loss: 0.751570]\n",
"[Epoch 165/200] [Batch 29/59] [D loss: 0.567860] [G loss: 1.167072]\n",
"[Epoch 165/200] [Batch 30/59] [D loss: 0.550113] [G loss: 1.000884]\n",
"[Epoch 165/200] [Batch 31/59] [D loss: 0.533584] [G loss: 0.913325]\n",
"[Epoch 165/200] [Batch 32/59] [D loss: 0.542276] [G loss: 1.048018]\n",
"[Epoch 165/200] [Batch 33/59] [D loss: 0.659547] [G loss: 1.180553]\n",
"[Epoch 165/200] [Batch 34/59] [D loss: 0.586003] [G loss: 0.860983]\n",
"[Epoch 165/200] [Batch 35/59] [D loss: 0.643097] [G loss: 0.808146]\n",
"[Epoch 165/200] [Batch 36/59] [D loss: 0.522293] [G loss: 0.975651]\n",
"[Epoch 165/200] [Batch 37/59] [D loss: 0.577406] [G loss: 1.449813]\n",
"[Epoch 165/200] [Batch 38/59] [D loss: 0.577942] [G loss: 0.691826]\n",
"[Epoch 165/200] [Batch 39/59] [D loss: 0.560355] [G loss: 0.768128]\n",
"[Epoch 165/200] [Batch 40/59] [D loss: 0.580693] [G loss: 1.318155]\n",
"[Epoch 165/200] [Batch 41/59] [D loss: 0.586794] [G loss: 1.132837]\n",
"[Epoch 165/200] [Batch 42/59] [D loss: 0.640226] [G loss: 0.587752]\n",
"[Epoch 165/200] [Batch 43/59] [D loss: 0.585787] [G loss: 1.317540]\n",
"[Epoch 165/200] [Batch 44/59] [D loss: 0.486292] [G loss: 1.006213]\n",
"[Epoch 165/200] [Batch 45/59] [D loss: 0.552090] [G loss: 1.013699]\n",
"[Epoch 165/200] [Batch 46/59] [D loss: 0.667751] [G loss: 1.087113]\n",
"[Epoch 165/200] [Batch 47/59] [D loss: 0.523648] [G loss: 0.910048]\n",
"[Epoch 165/200] [Batch 48/59] [D loss: 0.535215] [G loss: 1.304869]\n",
"[Epoch 165/200] [Batch 49/59] [D loss: 0.534783] [G loss: 1.099817]\n",
"[Epoch 165/200] [Batch 50/59] [D loss: 0.512258] [G loss: 1.042670]\n",
"[Epoch 165/200] [Batch 51/59] [D loss: 0.533944] [G loss: 1.136489]\n",
"[Epoch 165/200] [Batch 52/59] [D loss: 0.565040] [G loss: 0.891622]\n",
"[Epoch 165/200] [Batch 53/59] [D loss: 0.577873] [G loss: 1.004522]\n",
"[Epoch 165/200] [Batch 54/59] [D loss: 0.545724] [G loss: 0.796032]\n",
"[Epoch 165/200] [Batch 55/59] [D loss: 0.550089] [G loss: 1.143242]\n",
"[Epoch 165/200] [Batch 56/59] [D loss: 0.671982] [G loss: 1.206119]\n",
"[Epoch 165/200] [Batch 57/59] [D loss: 0.512139] [G loss: 0.971820]\n",
"[Epoch 165/200] [Batch 58/59] [D loss: 0.440239] [G loss: 0.766282]\n",
"[Epoch 166/200] [Batch 0/59] [D loss: 0.561529] [G loss: 0.670913]\n",
"[Epoch 166/200] [Batch 1/59] [D loss: 0.636664] [G loss: 1.329727]\n",
"[Epoch 166/200] [Batch 2/59] [D loss: 0.558993] [G loss: 0.893082]\n",
"[Epoch 166/200] [Batch 3/59] [D loss: 0.509034] [G loss: 0.935803]\n",
"[Epoch 166/200] [Batch 4/59] [D loss: 0.578042] [G loss: 1.196421]\n",
"[Epoch 166/200] [Batch 5/59] [D loss: 0.547177] [G loss: 0.741514]\n",
"[Epoch 166/200] [Batch 6/59] [D loss: 0.524600] [G loss: 1.170066]\n",
"[Epoch 166/200] [Batch 7/59] [D loss: 0.627482] [G loss: 1.030071]\n",
"[Epoch 166/200] [Batch 8/59] [D loss: 0.572089] [G loss: 0.737711]\n",
"[Epoch 166/200] [Batch 9/59] [D loss: 0.526703] [G loss: 1.129561]\n",
"[Epoch 166/200] [Batch 10/59] [D loss: 0.490647] [G loss: 0.941425]\n",
"[Epoch 166/200] [Batch 11/59] [D loss: 0.506084] [G loss: 0.759957]\n",
"[Epoch 166/200] [Batch 12/59] [D loss: 0.552834] [G loss: 0.800464]\n",
"[Epoch 166/200] [Batch 13/59] [D loss: 0.602567] [G loss: 1.090445]\n",
"[Epoch 166/200] [Batch 14/59] [D loss: 0.490213] [G loss: 0.766327]\n",
"[Epoch 166/200] [Batch 15/59] [D loss: 0.554289] [G loss: 0.815256]\n",
"[Epoch 166/200] [Batch 16/59] [D loss: 0.602196] [G loss: 1.180154]\n",
"[Epoch 166/200] [Batch 17/59] [D loss: 0.541304] [G loss: 0.568252]\n",
"[Epoch 166/200] [Batch 18/59] [D loss: 0.583806] [G loss: 1.134791]\n",
"[Epoch 166/200] [Batch 19/59] [D loss: 0.509081] [G loss: 0.949564]\n",
"[Epoch 166/200] [Batch 20/59] [D loss: 0.490034] [G loss: 0.897172]\n",
"[Epoch 166/200] [Batch 21/59] [D loss: 0.655392] [G loss: 1.208603]\n",
"[Epoch 166/200] [Batch 22/59] [D loss: 0.598957] [G loss: 0.889279]\n",
"[Epoch 166/200] [Batch 23/59] [D loss: 0.512995] [G loss: 0.838448]\n",
"[Epoch 166/200] [Batch 24/59] [D loss: 0.558574] [G loss: 0.975388]\n",
"[Epoch 166/200] [Batch 25/59] [D loss: 0.550075] [G loss: 0.891752]\n",
"[Epoch 166/200] [Batch 26/59] [D loss: 0.540897] [G loss: 1.365896]\n",
"[Epoch 166/200] [Batch 27/59] [D loss: 0.586181] [G loss: 0.860239]\n",
"[Epoch 166/200] [Batch 28/59] [D loss: 0.530163] [G loss: 0.569106]\n",
"[Epoch 166/200] [Batch 29/59] [D loss: 0.492773] [G loss: 1.100575]\n",
"[Epoch 166/200] [Batch 30/59] [D loss: 0.541359] [G loss: 1.012038]\n",
"[Epoch 166/200] [Batch 31/59] [D loss: 0.638560] [G loss: 0.662794]\n",
"[Epoch 166/200] [Batch 32/59] [D loss: 0.580901] [G loss: 1.383908]\n",
"[Epoch 166/200] [Batch 33/59] [D loss: 0.587632] [G loss: 0.841885]\n",
"[Epoch 166/200] [Batch 34/59] [D loss: 0.564510] [G loss: 0.636884]\n",
"[Epoch 166/200] [Batch 35/59] [D loss: 0.532073] [G loss: 1.429254]\n",
"[Epoch 166/200] [Batch 36/59] [D loss: 0.460608] [G loss: 0.976918]\n",
"[Epoch 166/200] [Batch 37/59] [D loss: 0.538217] [G loss: 0.712868]\n",
"[Epoch 166/200] [Batch 38/59] [D loss: 0.541876] [G loss: 1.227618]\n",
"[Epoch 166/200] [Batch 39/59] [D loss: 0.527447] [G loss: 1.048096]\n",
"[Epoch 166/200] [Batch 40/59] [D loss: 0.551048] [G loss: 0.788882]\n",
"[Epoch 166/200] [Batch 41/59] [D loss: 0.541594] [G loss: 0.895051]\n",
"[Epoch 166/200] [Batch 42/59] [D loss: 0.605566] [G loss: 1.099088]\n",
"[Epoch 166/200] [Batch 43/59] [D loss: 0.596864] [G loss: 1.191516]\n",
"[Epoch 166/200] [Batch 44/59] [D loss: 0.609351] [G loss: 1.053493]\n",
"[Epoch 166/200] [Batch 45/59] [D loss: 0.562942] [G loss: 0.912439]\n",
"[Epoch 166/200] [Batch 46/59] [D loss: 0.542575] [G loss: 1.005101]\n",
"[Epoch 166/200] [Batch 47/59] [D loss: 0.505579] [G loss: 1.001241]\n",
"[Epoch 166/200] [Batch 48/59] [D loss: 0.594869] [G loss: 0.821145]\n",
"[Epoch 166/200] [Batch 49/59] [D loss: 0.445672] [G loss: 0.732687]\n",
"[Epoch 166/200] [Batch 50/59] [D loss: 0.521512] [G loss: 0.849305]\n",
"[Epoch 166/200] [Batch 51/59] [D loss: 0.646188] [G loss: 0.930063]\n",
"[Epoch 166/200] [Batch 52/59] [D loss: 0.583051] [G loss: 0.760579]\n",
"[Epoch 166/200] [Batch 53/59] [D loss: 0.594084] [G loss: 1.392168]\n",
"[Epoch 166/200] [Batch 54/59] [D loss: 0.564047] [G loss: 0.854968]\n",
"[Epoch 166/200] [Batch 55/59] [D loss: 0.549508] [G loss: 0.900343]\n",
"[Epoch 166/200] [Batch 56/59] [D loss: 0.552015] [G loss: 0.937289]\n",
"[Epoch 166/200] [Batch 57/59] [D loss: 0.574223] [G loss: 1.183228]\n",
"[Epoch 166/200] [Batch 58/59] [D loss: 0.564417] [G loss: 0.542954]\n",
"[Epoch 167/200] [Batch 0/59] [D loss: 0.580856] [G loss: 1.618669]\n",
"[Epoch 167/200] [Batch 1/59] [D loss: 0.551509] [G loss: 0.838611]\n",
"[Epoch 167/200] [Batch 2/59] [D loss: 0.476159] [G loss: 0.744182]\n",
"[Epoch 167/200] [Batch 3/59] [D loss: 0.532731] [G loss: 1.337237]\n",
"[Epoch 167/200] [Batch 4/59] [D loss: 0.532998] [G loss: 1.305392]\n",
"[Epoch 167/200] [Batch 5/59] [D loss: 0.575255] [G loss: 0.902523]\n",
"[Epoch 167/200] [Batch 6/59] [D loss: 0.576044] [G loss: 0.815268]\n",
"[Epoch 167/200] [Batch 7/59] [D loss: 0.594588] [G loss: 1.388973]\n",
"[Epoch 167/200] [Batch 8/59] [D loss: 0.542206] [G loss: 1.021338]\n",
"[Epoch 167/200] [Batch 9/59] [D loss: 0.581954] [G loss: 0.740676]\n",
"[Epoch 167/200] [Batch 10/59] [D loss: 0.586795] [G loss: 0.929998]\n",
"[Epoch 167/200] [Batch 11/59] [D loss: 0.576730] [G loss: 1.065042]\n",
"[Epoch 167/200] [Batch 12/59] [D loss: 0.557082] [G loss: 0.987278]\n",
"[Epoch 167/200] [Batch 13/59] [D loss: 0.540950] [G loss: 0.703728]\n",
"[Epoch 167/200] [Batch 14/59] [D loss: 0.608678] [G loss: 1.185157]\n",
"[Epoch 167/200] [Batch 15/59] [D loss: 0.483954] [G loss: 1.096427]\n",
"[Epoch 167/200] [Batch 16/59] [D loss: 0.547289] [G loss: 0.720613]\n",
"[Epoch 167/200] [Batch 17/59] [D loss: 0.552925] [G loss: 0.941864]\n",
"[Epoch 167/200] [Batch 18/59] [D loss: 0.646737] [G loss: 1.259570]\n",
"[Epoch 167/200] [Batch 19/59] [D loss: 0.538271] [G loss: 0.589121]\n",
"[Epoch 167/200] [Batch 20/59] [D loss: 0.476601] [G loss: 0.963298]\n",
"[Epoch 167/200] [Batch 21/59] [D loss: 0.564325] [G loss: 1.193196]\n",
"[Epoch 167/200] [Batch 22/59] [D loss: 0.583787] [G loss: 0.839826]\n",
"[Epoch 167/200] [Batch 23/59] [D loss: 0.473448] [G loss: 1.035403]\n",
"[Epoch 167/200] [Batch 24/59] [D loss: 0.515944] [G loss: 1.238634]\n",
"[Epoch 167/200] [Batch 25/59] [D loss: 0.618966] [G loss: 0.705048]\n",
"[Epoch 167/200] [Batch 26/59] [D loss: 0.603308] [G loss: 1.156076]\n",
"[Epoch 167/200] [Batch 27/59] [D loss: 0.571712] [G loss: 1.084684]\n",
"[Epoch 167/200] [Batch 28/59] [D loss: 0.542310] [G loss: 0.880291]\n",
"[Epoch 167/200] [Batch 29/59] [D loss: 0.520221] [G loss: 1.145123]\n",
"[Epoch 167/200] [Batch 30/59] [D loss: 0.478960] [G loss: 0.941537]\n",
"[Epoch 167/200] [Batch 31/59] [D loss: 0.590127] [G loss: 0.816739]\n",
"[Epoch 167/200] [Batch 32/59] [D loss: 0.602734] [G loss: 0.866989]\n",
"[Epoch 167/200] [Batch 33/59] [D loss: 0.627158] [G loss: 1.662435]\n",
"[Epoch 167/200] [Batch 34/59] [D loss: 0.577809] [G loss: 0.904963]\n",
"[Epoch 167/200] [Batch 35/59] [D loss: 0.581235] [G loss: 0.641450]\n",
"[Epoch 167/200] [Batch 36/59] [D loss: 0.569324] [G loss: 1.575640]\n",
"[Epoch 167/200] [Batch 37/59] [D loss: 0.481746] [G loss: 1.114531]\n",
"[Epoch 167/200] [Batch 38/59] [D loss: 0.544771] [G loss: 0.597690]\n",
"[Epoch 167/200] [Batch 39/59] [D loss: 0.514754] [G loss: 1.061908]\n",
"[Epoch 167/200] [Batch 40/59] [D loss: 0.666216] [G loss: 1.279648]\n",
"[Epoch 167/200] [Batch 41/59] [D loss: 0.660694] [G loss: 0.683090]\n",
"[Epoch 167/200] [Batch 42/59] [D loss: 0.496584] [G loss: 1.086689]\n",
"[Epoch 167/200] [Batch 43/59] [D loss: 0.550625] [G loss: 1.231562]\n",
"[Epoch 167/200] [Batch 44/59] [D loss: 0.540697] [G loss: 0.671918]\n",
"[Epoch 167/200] [Batch 45/59] [D loss: 0.644053] [G loss: 0.801170]\n",
"[Epoch 167/200] [Batch 46/59] [D loss: 0.620128] [G loss: 1.453593]\n",
"[Epoch 167/200] [Batch 47/59] [D loss: 0.501984] [G loss: 0.984095]\n",
"[Epoch 167/200] [Batch 48/59] [D loss: 0.496074] [G loss: 0.752937]\n",
"[Epoch 167/200] [Batch 49/59] [D loss: 0.514962] [G loss: 0.752533]\n",
"[Epoch 167/200] [Batch 50/59] [D loss: 0.534117] [G loss: 1.042629]\n",
"[Epoch 167/200] [Batch 51/59] [D loss: 0.609999] [G loss: 0.888155]\n",
"[Epoch 167/200] [Batch 52/59] [D loss: 0.595496] [G loss: 1.228500]\n",
"[Epoch 167/200] [Batch 53/59] [D loss: 0.476032] [G loss: 0.725355]\n",
"[Epoch 167/200] [Batch 54/59] [D loss: 0.535159] [G loss: 0.834015]\n",
"[Epoch 167/200] [Batch 55/59] [D loss: 0.661693] [G loss: 1.263406]\n",
"[Epoch 167/200] [Batch 56/59] [D loss: 0.619083] [G loss: 0.542167]\n",
"[Epoch 167/200] [Batch 57/59] [D loss: 0.469911] [G loss: 0.971821]\n",
"[Epoch 167/200] [Batch 58/59] [D loss: 0.645971] [G loss: 1.295757]\n",
"[Epoch 168/200] [Batch 0/59] [D loss: 0.529173] [G loss: 1.532543]\n",
"[Epoch 168/200] [Batch 1/59] [D loss: 0.611286] [G loss: 0.639266]\n",
"[Epoch 168/200] [Batch 2/59] [D loss: 0.494950] [G loss: 0.859761]\n",
"[Epoch 168/200] [Batch 3/59] [D loss: 0.516604] [G loss: 0.915790]\n",
"[Epoch 168/200] [Batch 4/59] [D loss: 0.508816] [G loss: 0.706937]\n",
"[Epoch 168/200] [Batch 5/59] [D loss: 0.686952] [G loss: 1.331603]\n",
"[Epoch 168/200] [Batch 6/59] [D loss: 0.498988] [G loss: 1.038493]\n",
"[Epoch 168/200] [Batch 7/59] [D loss: 0.472593] [G loss: 0.865991]\n",
"[Epoch 168/200] [Batch 8/59] [D loss: 0.529976] [G loss: 0.933369]\n",
"[Epoch 168/200] [Batch 9/59] [D loss: 0.638476] [G loss: 1.052898]\n",
"[Epoch 168/200] [Batch 10/59] [D loss: 0.552688] [G loss: 0.917233]\n",
"[Epoch 168/200] [Batch 11/59] [D loss: 0.523787] [G loss: 1.024258]\n",
"[Epoch 168/200] [Batch 12/59] [D loss: 0.556665] [G loss: 0.883984]\n",
"[Epoch 168/200] [Batch 13/59] [D loss: 0.522519] [G loss: 1.065223]\n",
"[Epoch 168/200] [Batch 14/59] [D loss: 0.552180] [G loss: 0.806129]\n",
"[Epoch 168/200] [Batch 15/59] [D loss: 0.508268] [G loss: 0.571829]\n",
"[Epoch 168/200] [Batch 16/59] [D loss: 0.588408] [G loss: 1.437059]\n",
"[Epoch 168/200] [Batch 17/59] [D loss: 0.505181] [G loss: 1.040808]\n",
"[Epoch 168/200] [Batch 18/59] [D loss: 0.493456] [G loss: 0.793405]\n",
"[Epoch 168/200] [Batch 19/59] [D loss: 0.597061] [G loss: 0.681105]\n",
"[Epoch 168/200] [Batch 20/59] [D loss: 0.704923] [G loss: 1.729703]\n",
"[Epoch 168/200] [Batch 21/59] [D loss: 0.499784] [G loss: 0.887388]\n",
"[Epoch 168/200] [Batch 22/59] [D loss: 0.573607] [G loss: 0.713292]\n",
"[Epoch 168/200] [Batch 23/59] [D loss: 0.609749] [G loss: 1.541588]\n",
"[Epoch 168/200] [Batch 24/59] [D loss: 0.514795] [G loss: 1.274346]\n",
"[Epoch 168/200] [Batch 25/59] [D loss: 0.546722] [G loss: 0.847898]\n",
"[Epoch 168/200] [Batch 26/59] [D loss: 0.500862] [G loss: 1.001913]\n",
"[Epoch 168/200] [Batch 27/59] [D loss: 0.600009] [G loss: 1.074746]\n",
"[Epoch 168/200] [Batch 28/59] [D loss: 0.515531] [G loss: 1.005771]\n",
"[Epoch 168/200] [Batch 29/59] [D loss: 0.566922] [G loss: 0.777564]\n",
"[Epoch 168/200] [Batch 30/59] [D loss: 0.611508] [G loss: 1.176332]\n",
"[Epoch 168/200] [Batch 31/59] [D loss: 0.615560] [G loss: 0.550798]\n",
"[Epoch 168/200] [Batch 32/59] [D loss: 0.481774] [G loss: 0.913302]\n",
"[Epoch 168/200] [Batch 33/59] [D loss: 0.564736] [G loss: 1.281955]\n",
"[Epoch 168/200] [Batch 34/59] [D loss: 0.588615] [G loss: 0.830908]\n",
"[Epoch 168/200] [Batch 35/59] [D loss: 0.575380] [G loss: 0.849809]\n",
"[Epoch 168/200] [Batch 36/59] [D loss: 0.628834] [G loss: 1.533132]\n",
"[Epoch 168/200] [Batch 37/59] [D loss: 0.493576] [G loss: 0.816198]\n",
"[Epoch 168/200] [Batch 38/59] [D loss: 0.558959] [G loss: 0.797950]\n",
"[Epoch 168/200] [Batch 39/59] [D loss: 0.580555] [G loss: 1.144691]\n",
"[Epoch 168/200] [Batch 40/59] [D loss: 0.503833] [G loss: 0.845831]\n",
"[Epoch 168/200] [Batch 41/59] [D loss: 0.586108] [G loss: 0.916814]\n",
"[Epoch 168/200] [Batch 42/59] [D loss: 0.509205] [G loss: 1.285218]\n",
"[Epoch 168/200] [Batch 43/59] [D loss: 0.531222] [G loss: 0.944077]\n",
"[Epoch 168/200] [Batch 44/59] [D loss: 0.569369] [G loss: 0.946811]\n",
"[Epoch 168/200] [Batch 45/59] [D loss: 0.551591] [G loss: 0.845286]\n",
"[Epoch 168/200] [Batch 46/59] [D loss: 0.622684] [G loss: 1.315894]\n",
"[Epoch 168/200] [Batch 47/59] [D loss: 0.490349] [G loss: 1.022865]\n",
"[Epoch 168/200] [Batch 48/59] [D loss: 0.493289] [G loss: 0.711751]\n",
"[Epoch 168/200] [Batch 49/59] [D loss: 0.557071] [G loss: 1.080976]\n",
"[Epoch 168/200] [Batch 50/59] [D loss: 0.580832] [G loss: 1.139126]\n",
"[Epoch 168/200] [Batch 51/59] [D loss: 0.506995] [G loss: 1.009540]\n",
"[Epoch 168/200] [Batch 52/59] [D loss: 0.606043] [G loss: 0.849878]\n",
"[Epoch 168/200] [Batch 53/59] [D loss: 0.595744] [G loss: 1.416105]\n",
"[Epoch 168/200] [Batch 54/59] [D loss: 0.611507] [G loss: 0.769043]\n",
"[Epoch 168/200] [Batch 55/59] [D loss: 0.586362] [G loss: 1.085506]\n",
"[Epoch 168/200] [Batch 56/59] [D loss: 0.533043] [G loss: 1.188004]\n",
"[Epoch 168/200] [Batch 57/59] [D loss: 0.528371] [G loss: 0.868779]\n",
"[Epoch 168/200] [Batch 58/59] [D loss: 0.601261] [G loss: 0.993531]\n",
"[Epoch 169/200] [Batch 0/59] [D loss: 0.538909] [G loss: 1.102095]\n",
"[Epoch 169/200] [Batch 1/59] [D loss: 0.578014] [G loss: 0.988438]\n",
"[Epoch 169/200] [Batch 2/59] [D loss: 0.570037] [G loss: 0.865162]\n",
"[Epoch 169/200] [Batch 3/59] [D loss: 0.559127] [G loss: 0.859361]\n",
"[Epoch 169/200] [Batch 4/59] [D loss: 0.522666] [G loss: 1.016258]\n",
"[Epoch 169/200] [Batch 5/59] [D loss: 0.599243] [G loss: 0.692506]\n",
"[Epoch 169/200] [Batch 6/59] [D loss: 0.570106] [G loss: 1.223239]\n",
"[Epoch 169/200] [Batch 7/59] [D loss: 0.568744] [G loss: 1.043987]\n",
"[Epoch 169/200] [Batch 8/59] [D loss: 0.660651] [G loss: 0.691600]\n",
"[Epoch 169/200] [Batch 9/59] [D loss: 0.636659] [G loss: 1.592630]\n",
"[Epoch 169/200] [Batch 10/59] [D loss: 0.537359] [G loss: 0.640273]\n",
"[Epoch 169/200] [Batch 11/59] [D loss: 0.482362] [G loss: 0.545734]\n",
"[Epoch 169/200] [Batch 12/59] [D loss: 0.514346] [G loss: 1.066106]\n",
"[Epoch 169/200] [Batch 13/59] [D loss: 0.574250] [G loss: 0.846625]\n",
"[Epoch 169/200] [Batch 14/59] [D loss: 0.558161] [G loss: 0.809307]\n",
"[Epoch 169/200] [Batch 15/59] [D loss: 0.567928] [G loss: 0.765596]\n",
"[Epoch 169/200] [Batch 16/59] [D loss: 0.522182] [G loss: 1.131381]\n",
"[Epoch 169/200] [Batch 17/59] [D loss: 0.468312] [G loss: 0.862773]\n",
"[Epoch 169/200] [Batch 18/59] [D loss: 0.541972] [G loss: 0.757276]\n",
"[Epoch 169/200] [Batch 19/59] [D loss: 0.512852] [G loss: 1.140403]\n",
"[Epoch 169/200] [Batch 20/59] [D loss: 0.508544] [G loss: 0.949055]\n",
"[Epoch 169/200] [Batch 21/59] [D loss: 0.568846] [G loss: 0.698277]\n",
"[Epoch 169/200] [Batch 22/59] [D loss: 0.575523] [G loss: 1.305861]\n",
"[Epoch 169/200] [Batch 23/59] [D loss: 0.528072] [G loss: 0.887096]\n",
"[Epoch 169/200] [Batch 24/59] [D loss: 0.569399] [G loss: 0.882275]\n",
"[Epoch 169/200] [Batch 25/59] [D loss: 0.517035] [G loss: 1.194951]\n",
"[Epoch 169/200] [Batch 26/59] [D loss: 0.582755] [G loss: 1.074816]\n",
"[Epoch 169/200] [Batch 27/59] [D loss: 0.655734] [G loss: 0.815153]\n",
"[Epoch 169/200] [Batch 28/59] [D loss: 0.511190] [G loss: 1.078288]\n",
"[Epoch 169/200] [Batch 29/59] [D loss: 0.619252] [G loss: 0.971366]\n",
"[Epoch 169/200] [Batch 30/59] [D loss: 0.553473] [G loss: 0.689337]\n",
"[Epoch 169/200] [Batch 31/59] [D loss: 0.555302] [G loss: 0.885454]\n",
"[Epoch 169/200] [Batch 32/59] [D loss: 0.528586] [G loss: 0.832374]\n",
"[Epoch 169/200] [Batch 33/59] [D loss: 0.527003] [G loss: 1.183975]\n",
"[Epoch 169/200] [Batch 34/59] [D loss: 0.639603] [G loss: 0.925008]\n",
"[Epoch 169/200] [Batch 35/59] [D loss: 0.446091] [G loss: 0.912190]\n",
"[Epoch 169/200] [Batch 36/59] [D loss: 0.614085] [G loss: 0.852998]\n",
"[Epoch 169/200] [Batch 37/59] [D loss: 0.593496] [G loss: 1.038689]\n",
"[Epoch 169/200] [Batch 38/59] [D loss: 0.613143] [G loss: 0.883061]\n",
"[Epoch 169/200] [Batch 39/59] [D loss: 0.588166] [G loss: 0.941897]\n",
"[Epoch 169/200] [Batch 40/59] [D loss: 0.515526] [G loss: 1.069751]\n",
"[Epoch 169/200] [Batch 41/59] [D loss: 0.529467] [G loss: 0.899688]\n",
"[Epoch 169/200] [Batch 42/59] [D loss: 0.521320] [G loss: 0.839340]\n",
"[Epoch 169/200] [Batch 43/59] [D loss: 0.542023] [G loss: 0.808557]\n",
"[Epoch 169/200] [Batch 44/59] [D loss: 0.526922] [G loss: 1.145662]\n",
"[Epoch 169/200] [Batch 45/59] [D loss: 0.599675] [G loss: 0.713380]\n",
"[Epoch 169/200] [Batch 46/59] [D loss: 0.488056] [G loss: 1.086463]\n",
"[Epoch 169/200] [Batch 47/59] [D loss: 0.498329] [G loss: 1.169780]\n",
"[Epoch 169/200] [Batch 48/59] [D loss: 0.564581] [G loss: 0.899791]\n",
"[Epoch 169/200] [Batch 49/59] [D loss: 0.699020] [G loss: 0.790602]\n",
"[Epoch 169/200] [Batch 50/59] [D loss: 0.543046] [G loss: 1.098404]\n",
"[Epoch 169/200] [Batch 51/59] [D loss: 0.488088] [G loss: 0.946145]\n",
"[Epoch 169/200] [Batch 52/59] [D loss: 0.570539] [G loss: 0.797390]\n",
"[Epoch 169/200] [Batch 53/59] [D loss: 0.651239] [G loss: 1.078835]\n",
"[Epoch 169/200] [Batch 54/59] [D loss: 0.526497] [G loss: 0.851400]\n",
"[Epoch 169/200] [Batch 55/59] [D loss: 0.637531] [G loss: 1.116791]\n",
"[Epoch 169/200] [Batch 56/59] [D loss: 0.534585] [G loss: 0.805419]\n",
"[Epoch 169/200] [Batch 57/59] [D loss: 0.508407] [G loss: 0.852013]\n",
"[Epoch 169/200] [Batch 58/59] [D loss: 0.675778] [G loss: 1.270453]\n",
"[Epoch 170/200] [Batch 0/59] [D loss: 0.611832] [G loss: 0.546499]\n",
"[Epoch 170/200] [Batch 1/59] [D loss: 0.561917] [G loss: 0.837775]\n",
"[Epoch 170/200] [Batch 2/59] [D loss: 0.731550] [G loss: 1.593005]\n",
"[Epoch 170/200] [Batch 3/59] [D loss: 0.632448] [G loss: 0.615734]\n",
"[Epoch 170/200] [Batch 4/59] [D loss: 0.553031] [G loss: 0.855849]\n",
"[Epoch 170/200] [Batch 5/59] [D loss: 0.535926] [G loss: 1.226545]\n",
"[Epoch 170/200] [Batch 6/59] [D loss: 0.550627] [G loss: 0.896878]\n",
"[Epoch 170/200] [Batch 7/59] [D loss: 0.552215] [G loss: 0.834277]\n",
"[Epoch 170/200] [Batch 8/59] [D loss: 0.598149] [G loss: 1.093008]\n",
"[Epoch 170/200] [Batch 9/59] [D loss: 0.532096] [G loss: 0.874573]\n",
"[Epoch 170/200] [Batch 10/59] [D loss: 0.548590] [G loss: 1.229646]\n",
"[Epoch 170/200] [Batch 11/59] [D loss: 0.563792] [G loss: 0.692254]\n",
"[Epoch 170/200] [Batch 12/59] [D loss: 0.563508] [G loss: 1.118477]\n",
"[Epoch 170/200] [Batch 13/59] [D loss: 0.543195] [G loss: 1.198584]\n",
"[Epoch 170/200] [Batch 14/59] [D loss: 0.558989] [G loss: 0.507973]\n",
"[Epoch 170/200] [Batch 15/59] [D loss: 0.543011] [G loss: 1.306415]\n",
"[Epoch 170/200] [Batch 16/59] [D loss: 0.505764] [G loss: 1.325612]\n",
"[Epoch 170/200] [Batch 17/59] [D loss: 0.540017] [G loss: 0.716399]\n",
"[Epoch 170/200] [Batch 18/59] [D loss: 0.640128] [G loss: 1.072296]\n",
"[Epoch 170/200] [Batch 19/59] [D loss: 0.452316] [G loss: 0.781917]\n",
"[Epoch 170/200] [Batch 20/59] [D loss: 0.576726] [G loss: 0.804236]\n",
"[Epoch 170/200] [Batch 21/59] [D loss: 0.496565] [G loss: 1.279233]\n",
"[Epoch 170/200] [Batch 22/59] [D loss: 0.542087] [G loss: 0.961937]\n",
"[Epoch 170/200] [Batch 23/59] [D loss: 0.616959] [G loss: 0.610765]\n",
"[Epoch 170/200] [Batch 24/59] [D loss: 0.678889] [G loss: 1.320520]\n",
"[Epoch 170/200] [Batch 25/59] [D loss: 0.529208] [G loss: 0.962592]\n",
"[Epoch 170/200] [Batch 26/59] [D loss: 0.534223] [G loss: 0.830207]\n",
"[Epoch 170/200] [Batch 27/59] [D loss: 0.495120] [G loss: 1.205367]\n",
"[Epoch 170/200] [Batch 28/59] [D loss: 0.489077] [G loss: 1.155148]\n",
"[Epoch 170/200] [Batch 29/59] [D loss: 0.533805] [G loss: 0.751099]\n",
"[Epoch 170/200] [Batch 30/59] [D loss: 0.602864] [G loss: 1.169372]\n",
"[Epoch 170/200] [Batch 31/59] [D loss: 0.527428] [G loss: 0.960149]\n",
"[Epoch 170/200] [Batch 32/59] [D loss: 0.584478] [G loss: 0.760728]\n",
"[Epoch 170/200] [Batch 33/59] [D loss: 0.527530] [G loss: 0.971965]\n",
"[Epoch 170/200] [Batch 34/59] [D loss: 0.555616] [G loss: 1.078276]\n",
"[Epoch 170/200] [Batch 35/59] [D loss: 0.506710] [G loss: 0.693996]\n",
"[Epoch 170/200] [Batch 36/59] [D loss: 0.536463] [G loss: 0.910267]\n",
"[Epoch 170/200] [Batch 37/59] [D loss: 0.582505] [G loss: 1.036221]\n",
"[Epoch 170/200] [Batch 38/59] [D loss: 0.579744] [G loss: 1.122731]\n",
"[Epoch 170/200] [Batch 39/59] [D loss: 0.576719] [G loss: 0.606933]\n",
"[Epoch 170/200] [Batch 40/59] [D loss: 0.596047] [G loss: 1.285517]\n",
"[Epoch 170/200] [Batch 41/59] [D loss: 0.586457] [G loss: 0.822881]\n",
"[Epoch 170/200] [Batch 42/59] [D loss: 0.549045] [G loss: 0.665052]\n",
"[Epoch 170/200] [Batch 43/59] [D loss: 0.574944] [G loss: 1.179142]\n",
"[Epoch 170/200] [Batch 44/59] [D loss: 0.527941] [G loss: 1.116807]\n",
"[Epoch 170/200] [Batch 45/59] [D loss: 0.515314] [G loss: 0.833560]\n",
"[Epoch 170/200] [Batch 46/59] [D loss: 0.615770] [G loss: 1.031796]\n",
"[Epoch 170/200] [Batch 47/59] [D loss: 0.550024] [G loss: 0.783774]\n",
"[Epoch 170/200] [Batch 48/59] [D loss: 0.634806] [G loss: 0.938088]\n",
"[Epoch 170/200] [Batch 49/59] [D loss: 0.705657] [G loss: 1.661998]\n",
"[Epoch 170/200] [Batch 50/59] [D loss: 0.558743] [G loss: 0.638179]\n",
"[Epoch 170/200] [Batch 51/59] [D loss: 0.551103] [G loss: 0.912472]\n",
"[Epoch 170/200] [Batch 52/59] [D loss: 0.490068] [G loss: 1.163139]\n",
"[Epoch 170/200] [Batch 53/59] [D loss: 0.573779] [G loss: 0.809027]\n",
"[Epoch 170/200] [Batch 54/59] [D loss: 0.501571] [G loss: 0.992686]\n",
"[Epoch 170/200] [Batch 55/59] [D loss: 0.595811] [G loss: 1.298061]\n",
"[Epoch 170/200] [Batch 56/59] [D loss: 0.593262] [G loss: 0.869875]\n",
"[Epoch 170/200] [Batch 57/59] [D loss: 0.589252] [G loss: 0.845753]\n",
"[Epoch 170/200] [Batch 58/59] [D loss: 0.510649] [G loss: 1.113065]\n",
"[Epoch 171/200] [Batch 0/59] [D loss: 0.469572] [G loss: 1.050617]\n",
"[Epoch 171/200] [Batch 1/59] [D loss: 0.517191] [G loss: 0.851660]\n",
"[Epoch 171/200] [Batch 2/59] [D loss: 0.603934] [G loss: 1.160047]\n",
"[Epoch 171/200] [Batch 3/59] [D loss: 0.501337] [G loss: 0.764921]\n",
"[Epoch 171/200] [Batch 4/59] [D loss: 0.559841] [G loss: 1.041654]\n",
"[Epoch 171/200] [Batch 5/59] [D loss: 0.522586] [G loss: 0.892854]\n",
"[Epoch 171/200] [Batch 6/59] [D loss: 0.490955] [G loss: 0.714103]\n",
"[Epoch 171/200] [Batch 7/59] [D loss: 0.572681] [G loss: 1.002979]\n",
"[Epoch 171/200] [Batch 8/59] [D loss: 0.645916] [G loss: 1.158501]\n",
"[Epoch 171/200] [Batch 9/59] [D loss: 0.524812] [G loss: 1.097547]\n",
"[Epoch 171/200] [Batch 10/59] [D loss: 0.483197] [G loss: 0.899441]\n",
"[Epoch 171/200] [Batch 11/59] [D loss: 0.548869] [G loss: 0.838630]\n",
"[Epoch 171/200] [Batch 12/59] [D loss: 0.535084] [G loss: 1.174533]\n",
"[Epoch 171/200] [Batch 13/59] [D loss: 0.528143] [G loss: 0.875010]\n",
"[Epoch 171/200] [Batch 14/59] [D loss: 0.552412] [G loss: 0.940840]\n",
"[Epoch 171/200] [Batch 15/59] [D loss: 0.570815] [G loss: 1.292965]\n",
"[Epoch 171/200] [Batch 16/59] [D loss: 0.532645] [G loss: 0.838926]\n",
"[Epoch 171/200] [Batch 17/59] [D loss: 0.627807] [G loss: 1.083803]\n",
"[Epoch 171/200] [Batch 18/59] [D loss: 0.612957] [G loss: 0.934801]\n",
"[Epoch 171/200] [Batch 19/59] [D loss: 0.511611] [G loss: 1.180605]\n",
"[Epoch 171/200] [Batch 20/59] [D loss: 0.528447] [G loss: 0.803374]\n",
"[Epoch 171/200] [Batch 21/59] [D loss: 0.589430] [G loss: 0.757416]\n",
"[Epoch 171/200] [Batch 22/59] [D loss: 0.541721] [G loss: 1.542992]\n",
"[Epoch 171/200] [Batch 23/59] [D loss: 0.506045] [G loss: 1.278286]\n",
"[Epoch 171/200] [Batch 24/59] [D loss: 0.588301] [G loss: 0.463177]\n",
"[Epoch 171/200] [Batch 25/59] [D loss: 0.484264] [G loss: 1.453775]\n",
"[Epoch 171/200] [Batch 26/59] [D loss: 0.494635] [G loss: 0.984008]\n",
"[Epoch 171/200] [Batch 27/59] [D loss: 0.500453] [G loss: 0.661033]\n",
"[Epoch 171/200] [Batch 28/59] [D loss: 0.523078] [G loss: 1.285399]\n",
"[Epoch 171/200] [Batch 29/59] [D loss: 0.639574] [G loss: 0.885305]\n",
"[Epoch 171/200] [Batch 30/59] [D loss: 0.576065] [G loss: 0.793071]\n",
"[Epoch 171/200] [Batch 31/59] [D loss: 0.687836] [G loss: 1.253090]\n",
"[Epoch 171/200] [Batch 32/59] [D loss: 0.603606] [G loss: 0.999763]\n",
"[Epoch 171/200] [Batch 33/59] [D loss: 0.663911] [G loss: 0.400207]\n",
"[Epoch 171/200] [Batch 34/59] [D loss: 0.532560] [G loss: 1.332549]\n",
"[Epoch 171/200] [Batch 35/59] [D loss: 0.490685] [G loss: 1.224090]\n",
"[Epoch 171/200] [Batch 36/59] [D loss: 0.603760] [G loss: 0.946612]\n",
"[Epoch 171/200] [Batch 37/59] [D loss: 0.520992] [G loss: 1.108806]\n",
"[Epoch 171/200] [Batch 38/59] [D loss: 0.500752] [G loss: 1.053563]\n",
"[Epoch 171/200] [Batch 39/59] [D loss: 0.569724] [G loss: 0.991717]\n",
"[Epoch 171/200] [Batch 40/59] [D loss: 0.597411] [G loss: 0.655665]\n",
"[Epoch 171/200] [Batch 41/59] [D loss: 0.543240] [G loss: 0.885809]\n",
"[Epoch 171/200] [Batch 42/59] [D loss: 0.516302] [G loss: 1.144628]\n",
"[Epoch 171/200] [Batch 43/59] [D loss: 0.556713] [G loss: 0.893529]\n",
"[Epoch 171/200] [Batch 44/59] [D loss: 0.484583] [G loss: 0.897518]\n",
"[Epoch 171/200] [Batch 45/59] [D loss: 0.637949] [G loss: 0.778560]\n",
"[Epoch 171/200] [Batch 46/59] [D loss: 0.665616] [G loss: 1.075289]\n",
"[Epoch 171/200] [Batch 47/59] [D loss: 0.496120] [G loss: 0.930976]\n",
"[Epoch 171/200] [Batch 48/59] [D loss: 0.498303] [G loss: 0.786087]\n",
"[Epoch 171/200] [Batch 49/59] [D loss: 0.589972] [G loss: 1.189249]\n",
"[Epoch 171/200] [Batch 50/59] [D loss: 0.460500] [G loss: 1.127073]\n",
"[Epoch 171/200] [Batch 51/59] [D loss: 0.521050] [G loss: 0.750473]\n",
"[Epoch 171/200] [Batch 52/59] [D loss: 0.606279] [G loss: 0.801586]\n",
"[Epoch 171/200] [Batch 53/59] [D loss: 0.507083] [G loss: 1.152745]\n",
"[Epoch 171/200] [Batch 54/59] [D loss: 0.592415] [G loss: 0.611676]\n",
"[Epoch 171/200] [Batch 55/59] [D loss: 0.519810] [G loss: 1.118993]\n",
"[Epoch 171/200] [Batch 56/59] [D loss: 0.419826] [G loss: 0.725548]\n",
"[Epoch 171/200] [Batch 57/59] [D loss: 0.520435] [G loss: 0.924705]\n",
"[Epoch 171/200] [Batch 58/59] [D loss: 0.547165] [G loss: 1.619225]\n",
"[Epoch 172/200] [Batch 0/59] [D loss: 0.610449] [G loss: 0.736560]\n",
"[Epoch 172/200] [Batch 1/59] [D loss: 0.573451] [G loss: 0.959992]\n",
"[Epoch 172/200] [Batch 2/59] [D loss: 0.612975] [G loss: 1.335479]\n",
"[Epoch 172/200] [Batch 3/59] [D loss: 0.546621] [G loss: 1.171974]\n",
"[Epoch 172/200] [Batch 4/59] [D loss: 0.650713] [G loss: 0.731753]\n",
"[Epoch 172/200] [Batch 5/59] [D loss: 0.601716] [G loss: 1.576115]\n",
"[Epoch 172/200] [Batch 6/59] [D loss: 0.592102] [G loss: 1.219935]\n",
"[Epoch 172/200] [Batch 7/59] [D loss: 0.681518] [G loss: 0.615062]\n",
"[Epoch 172/200] [Batch 8/59] [D loss: 0.459877] [G loss: 1.496773]\n",
"[Epoch 172/200] [Batch 9/59] [D loss: 0.563303] [G loss: 1.246795]\n",
"[Epoch 172/200] [Batch 10/59] [D loss: 0.565130] [G loss: 0.665365]\n",
"[Epoch 172/200] [Batch 11/59] [D loss: 0.562275] [G loss: 1.258005]\n",
"[Epoch 172/200] [Batch 12/59] [D loss: 0.477235] [G loss: 1.154987]\n",
"[Epoch 172/200] [Batch 13/59] [D loss: 0.713934] [G loss: 0.587143]\n",
"[Epoch 172/200] [Batch 14/59] [D loss: 0.557071] [G loss: 1.029182]\n",
"[Epoch 172/200] [Batch 15/59] [D loss: 0.555569] [G loss: 1.104938]\n",
"[Epoch 172/200] [Batch 16/59] [D loss: 0.511169] [G loss: 0.665895]\n",
"[Epoch 172/200] [Batch 17/59] [D loss: 0.518129] [G loss: 1.045541]\n",
"[Epoch 172/200] [Batch 18/59] [D loss: 0.597571] [G loss: 1.012768]\n",
"[Epoch 172/200] [Batch 19/59] [D loss: 0.506933] [G loss: 0.732112]\n",
"[Epoch 172/200] [Batch 20/59] [D loss: 0.542400] [G loss: 1.066687]\n",
"[Epoch 172/200] [Batch 21/59] [D loss: 0.512724] [G loss: 0.840009]\n",
"[Epoch 172/200] [Batch 22/59] [D loss: 0.529374] [G loss: 0.982770]\n",
"[Epoch 172/200] [Batch 23/59] [D loss: 0.551518] [G loss: 1.090360]\n",
"[Epoch 172/200] [Batch 24/59] [D loss: 0.527036] [G loss: 0.847323]\n",
"[Epoch 172/200] [Batch 25/59] [D loss: 0.622196] [G loss: 0.707988]\n",
"[Epoch 172/200] [Batch 26/59] [D loss: 0.645530] [G loss: 1.207752]\n",
"[Epoch 172/200] [Batch 27/59] [D loss: 0.572910] [G loss: 1.071386]\n",
"[Epoch 172/200] [Batch 28/59] [D loss: 0.586953] [G loss: 0.838748]\n",
"[Epoch 172/200] [Batch 29/59] [D loss: 0.602789] [G loss: 1.056997]\n",
"[Epoch 172/200] [Batch 30/59] [D loss: 0.537207] [G loss: 0.678456]\n",
"[Epoch 172/200] [Batch 31/59] [D loss: 0.569935] [G loss: 1.078650]\n",
"[Epoch 172/200] [Batch 32/59] [D loss: 0.560883] [G loss: 1.060923]\n",
"[Epoch 172/200] [Batch 33/59] [D loss: 0.563491] [G loss: 0.965118]\n",
"[Epoch 172/200] [Batch 34/59] [D loss: 0.557734] [G loss: 0.879355]\n",
"[Epoch 172/200] [Batch 35/59] [D loss: 0.496057] [G loss: 1.207265]\n",
"[Epoch 172/200] [Batch 36/59] [D loss: 0.560084] [G loss: 0.857718]\n",
"[Epoch 172/200] [Batch 37/59] [D loss: 0.529826] [G loss: 0.813141]\n",
"[Epoch 172/200] [Batch 38/59] [D loss: 0.589403] [G loss: 1.258151]\n",
"[Epoch 172/200] [Batch 39/59] [D loss: 0.646013] [G loss: 1.240052]\n",
"[Epoch 172/200] [Batch 40/59] [D loss: 0.550933] [G loss: 0.812407]\n",
"[Epoch 172/200] [Batch 41/59] [D loss: 0.563948] [G loss: 1.052837]\n",
"[Epoch 172/200] [Batch 42/59] [D loss: 0.590231] [G loss: 1.189646]\n",
"[Epoch 172/200] [Batch 43/59] [D loss: 0.563137] [G loss: 0.872732]\n",
"[Epoch 172/200] [Batch 44/59] [D loss: 0.664297] [G loss: 0.789631]\n",
"[Epoch 172/200] [Batch 45/59] [D loss: 0.525875] [G loss: 1.072714]\n",
"[Epoch 172/200] [Batch 46/59] [D loss: 0.551380] [G loss: 1.286136]\n",
"[Epoch 172/200] [Batch 47/59] [D loss: 0.551589] [G loss: 0.767279]\n",
"[Epoch 172/200] [Batch 48/59] [D loss: 0.641396] [G loss: 0.779694]\n",
"[Epoch 172/200] [Batch 49/59] [D loss: 0.713575] [G loss: 1.556131]\n",
"[Epoch 172/200] [Batch 50/59] [D loss: 0.482172] [G loss: 0.560082]\n",
"[Epoch 172/200] [Batch 51/59] [D loss: 0.546711] [G loss: 0.576039]\n",
"[Epoch 172/200] [Batch 52/59] [D loss: 0.547512] [G loss: 1.538278]\n",
"[Epoch 172/200] [Batch 53/59] [D loss: 0.554528] [G loss: 0.920607]\n",
"[Epoch 172/200] [Batch 54/59] [D loss: 0.587231] [G loss: 0.879339]\n",
"[Epoch 172/200] [Batch 55/59] [D loss: 0.526146] [G loss: 1.041491]\n",
"[Epoch 172/200] [Batch 56/59] [D loss: 0.495885] [G loss: 0.995134]\n",
"[Epoch 172/200] [Batch 57/59] [D loss: 0.499708] [G loss: 0.862602]\n",
"[Epoch 172/200] [Batch 58/59] [D loss: 0.543274] [G loss: 0.686866]\n",
"[Epoch 173/200] [Batch 0/59] [D loss: 0.556188] [G loss: 1.406586]\n",
"[Epoch 173/200] [Batch 1/59] [D loss: 0.551214] [G loss: 1.009362]\n",
"[Epoch 173/200] [Batch 2/59] [D loss: 0.546094] [G loss: 0.771363]\n",
"[Epoch 173/200] [Batch 3/59] [D loss: 0.500320] [G loss: 0.855955]\n",
"[Epoch 173/200] [Batch 4/59] [D loss: 0.504455] [G loss: 1.085336]\n",
"[Epoch 173/200] [Batch 5/59] [D loss: 0.573589] [G loss: 1.288196]\n",
"[Epoch 173/200] [Batch 6/59] [D loss: 0.548148] [G loss: 1.198814]\n",
"[Epoch 173/200] [Batch 7/59] [D loss: 0.524344] [G loss: 1.001008]\n",
"[Epoch 173/200] [Batch 8/59] [D loss: 0.518459] [G loss: 0.907836]\n",
"[Epoch 173/200] [Batch 9/59] [D loss: 0.626245] [G loss: 1.386648]\n",
"[Epoch 173/200] [Batch 10/59] [D loss: 0.569176] [G loss: 0.770539]\n",
"[Epoch 173/200] [Batch 11/59] [D loss: 0.556790] [G loss: 1.031834]\n",
"[Epoch 173/200] [Batch 12/59] [D loss: 0.677520] [G loss: 1.281726]\n",
"[Epoch 173/200] [Batch 13/59] [D loss: 0.593368] [G loss: 0.586871]\n",
"[Epoch 173/200] [Batch 14/59] [D loss: 0.612221] [G loss: 0.792008]\n",
"[Epoch 173/200] [Batch 15/59] [D loss: 0.517439] [G loss: 1.369511]\n",
"[Epoch 173/200] [Batch 16/59] [D loss: 0.563993] [G loss: 1.167831]\n",
"[Epoch 173/200] [Batch 17/59] [D loss: 0.570070] [G loss: 0.692026]\n",
"[Epoch 173/200] [Batch 18/59] [D loss: 0.575739] [G loss: 1.294217]\n",
"[Epoch 173/200] [Batch 19/59] [D loss: 0.553589] [G loss: 0.719514]\n",
"[Epoch 173/200] [Batch 20/59] [D loss: 0.543009] [G loss: 1.156459]\n",
"[Epoch 173/200] [Batch 21/59] [D loss: 0.615322] [G loss: 0.591150]\n",
"[Epoch 173/200] [Batch 22/59] [D loss: 0.435840] [G loss: 1.243188]\n",
"[Epoch 173/200] [Batch 23/59] [D loss: 0.498612] [G loss: 1.284010]\n",
"[Epoch 173/200] [Batch 24/59] [D loss: 0.626950] [G loss: 0.796315]\n",
"[Epoch 173/200] [Batch 25/59] [D loss: 0.566969] [G loss: 1.309309]\n",
"[Epoch 173/200] [Batch 26/59] [D loss: 0.568439] [G loss: 1.020318]\n",
"[Epoch 173/200] [Batch 27/59] [D loss: 0.490055] [G loss: 0.968317]\n",
"[Epoch 173/200] [Batch 28/59] [D loss: 0.618593] [G loss: 0.607882]\n",
"[Epoch 173/200] [Batch 29/59] [D loss: 0.486998] [G loss: 1.075814]\n",
"[Epoch 173/200] [Batch 30/59] [D loss: 0.593502] [G loss: 1.096041]\n",
"[Epoch 173/200] [Batch 31/59] [D loss: 0.598362] [G loss: 0.822821]\n",
"[Epoch 173/200] [Batch 32/59] [D loss: 0.609151] [G loss: 1.247701]\n",
"[Epoch 173/200] [Batch 33/59] [D loss: 0.521627] [G loss: 1.139610]\n",
"[Epoch 173/200] [Batch 34/59] [D loss: 0.525571] [G loss: 0.957830]\n",
"[Epoch 173/200] [Batch 35/59] [D loss: 0.443497] [G loss: 0.940033]\n",
"[Epoch 173/200] [Batch 36/59] [D loss: 0.531828] [G loss: 1.088108]\n",
"[Epoch 173/200] [Batch 37/59] [D loss: 0.566151] [G loss: 1.333076]\n",
"[Epoch 173/200] [Batch 38/59] [D loss: 0.581061] [G loss: 0.700401]\n",
"[Epoch 173/200] [Batch 39/59] [D loss: 0.636883] [G loss: 1.128736]\n",
"[Epoch 173/200] [Batch 40/59] [D loss: 0.532251] [G loss: 0.894980]\n",
"[Epoch 173/200] [Batch 41/59] [D loss: 0.570364] [G loss: 1.005392]\n",
"[Epoch 173/200] [Batch 42/59] [D loss: 0.537308] [G loss: 1.257415]\n",
"[Epoch 173/200] [Batch 43/59] [D loss: 0.592905] [G loss: 0.973630]\n",
"[Epoch 173/200] [Batch 44/59] [D loss: 0.589650] [G loss: 0.997905]\n",
"[Epoch 173/200] [Batch 45/59] [D loss: 0.524029] [G loss: 0.871975]\n",
"[Epoch 173/200] [Batch 46/59] [D loss: 0.541092] [G loss: 1.039953]\n",
"[Epoch 173/200] [Batch 47/59] [D loss: 0.590556] [G loss: 1.113064]\n",
"[Epoch 173/200] [Batch 48/59] [D loss: 0.629785] [G loss: 0.653807]\n",
"[Epoch 173/200] [Batch 49/59] [D loss: 0.564252] [G loss: 0.987535]\n",
"[Epoch 173/200] [Batch 50/59] [D loss: 0.571226] [G loss: 0.874381]\n",
"[Epoch 173/200] [Batch 51/59] [D loss: 0.578191] [G loss: 1.293058]\n",
"[Epoch 173/200] [Batch 52/59] [D loss: 0.490557] [G loss: 0.661711]\n",
"[Epoch 173/200] [Batch 53/59] [D loss: 0.542896] [G loss: 0.685431]\n",
"[Epoch 173/200] [Batch 54/59] [D loss: 0.522314] [G loss: 1.075645]\n",
"[Epoch 173/200] [Batch 55/59] [D loss: 0.561941] [G loss: 1.147013]\n",
"[Epoch 173/200] [Batch 56/59] [D loss: 0.620392] [G loss: 0.782523]\n",
"[Epoch 173/200] [Batch 57/59] [D loss: 0.517569] [G loss: 1.198744]\n",
"[Epoch 173/200] [Batch 58/59] [D loss: 0.584058] [G loss: 1.247961]\n",
"[Epoch 174/200] [Batch 0/59] [D loss: 0.572171] [G loss: 0.893875]\n",
"[Epoch 174/200] [Batch 1/59] [D loss: 0.533209] [G loss: 1.522146]\n",
"[Epoch 174/200] [Batch 2/59] [D loss: 0.557512] [G loss: 1.176301]\n",
"[Epoch 174/200] [Batch 3/59] [D loss: 0.627710] [G loss: 0.379929]\n",
"[Epoch 174/200] [Batch 4/59] [D loss: 0.490240] [G loss: 1.067758]\n",
"[Epoch 174/200] [Batch 5/59] [D loss: 0.633455] [G loss: 1.386271]\n",
"[Epoch 174/200] [Batch 6/59] [D loss: 0.684528] [G loss: 0.530964]\n",
"[Epoch 174/200] [Batch 7/59] [D loss: 0.555279] [G loss: 1.019674]\n",
"[Epoch 174/200] [Batch 8/59] [D loss: 0.490333] [G loss: 1.261121]\n",
"[Epoch 174/200] [Batch 9/59] [D loss: 0.536974] [G loss: 0.737068]\n",
"[Epoch 174/200] [Batch 10/59] [D loss: 0.568258] [G loss: 1.246252]\n",
"[Epoch 174/200] [Batch 11/59] [D loss: 0.490548] [G loss: 0.996508]\n",
"[Epoch 174/200] [Batch 12/59] [D loss: 0.605844] [G loss: 0.863594]\n",
"[Epoch 174/200] [Batch 13/59] [D loss: 0.529080] [G loss: 1.019854]\n",
"[Epoch 174/200] [Batch 14/59] [D loss: 0.554752] [G loss: 0.864678]\n",
"[Epoch 174/200] [Batch 15/59] [D loss: 0.507924] [G loss: 0.703445]\n",
"[Epoch 174/200] [Batch 16/59] [D loss: 0.585911] [G loss: 1.459160]\n",
"[Epoch 174/200] [Batch 17/59] [D loss: 0.465296] [G loss: 1.050453]\n",
"[Epoch 174/200] [Batch 18/59] [D loss: 0.558903] [G loss: 0.803334]\n",
"[Epoch 174/200] [Batch 19/59] [D loss: 0.591694] [G loss: 0.829363]\n",
"[Epoch 174/200] [Batch 20/59] [D loss: 0.444123] [G loss: 1.095901]\n",
"[Epoch 174/200] [Batch 21/59] [D loss: 0.550015] [G loss: 0.930073]\n",
"[Epoch 174/200] [Batch 22/59] [D loss: 0.536770] [G loss: 1.016687]\n",
"[Epoch 174/200] [Batch 23/59] [D loss: 0.641094] [G loss: 1.275706]\n",
"[Epoch 174/200] [Batch 24/59] [D loss: 0.574172] [G loss: 0.950360]\n",
"[Epoch 174/200] [Batch 25/59] [D loss: 0.636262] [G loss: 0.657650]\n",
"[Epoch 174/200] [Batch 26/59] [D loss: 0.617960] [G loss: 1.294480]\n",
"[Epoch 174/200] [Batch 27/59] [D loss: 0.511786] [G loss: 0.945708]\n",
"[Epoch 174/200] [Batch 28/59] [D loss: 0.634979] [G loss: 0.641522]\n",
"[Epoch 174/200] [Batch 29/59] [D loss: 0.670391] [G loss: 1.746880]\n",
"[Epoch 174/200] [Batch 30/59] [D loss: 0.554582] [G loss: 0.664602]\n",
"[Epoch 174/200] [Batch 31/59] [D loss: 0.580777] [G loss: 0.914505]\n",
"[Epoch 174/200] [Batch 32/59] [D loss: 0.515429] [G loss: 0.890098]\n",
"[Epoch 174/200] [Batch 33/59] [D loss: 0.620264] [G loss: 0.874710]\n",
"[Epoch 174/200] [Batch 34/59] [D loss: 0.568232] [G loss: 0.979961]\n",
"[Epoch 174/200] [Batch 35/59] [D loss: 0.590304] [G loss: 0.716345]\n",
"[Epoch 174/200] [Batch 36/59] [D loss: 0.669621] [G loss: 1.473675]\n",
"[Epoch 174/200] [Batch 37/59] [D loss: 0.544039] [G loss: 0.954378]\n",
"[Epoch 174/200] [Batch 38/59] [D loss: 0.470161] [G loss: 0.922797]\n",
"[Epoch 174/200] [Batch 39/59] [D loss: 0.555383] [G loss: 0.987460]\n",
"[Epoch 174/200] [Batch 40/59] [D loss: 0.600695] [G loss: 0.700017]\n",
"[Epoch 174/200] [Batch 41/59] [D loss: 0.598380] [G loss: 1.299078]\n",
"[Epoch 174/200] [Batch 42/59] [D loss: 0.579436] [G loss: 0.859351]\n",
"[Epoch 174/200] [Batch 43/59] [D loss: 0.573977] [G loss: 0.909181]\n",
"[Epoch 174/200] [Batch 44/59] [D loss: 0.464419] [G loss: 0.933021]\n",
"[Epoch 174/200] [Batch 45/59] [D loss: 0.524648] [G loss: 1.010590]\n",
"[Epoch 174/200] [Batch 46/59] [D loss: 0.617956] [G loss: 0.807580]\n",
"[Epoch 174/200] [Batch 47/59] [D loss: 0.532803] [G loss: 1.088954]\n",
"[Epoch 174/200] [Batch 48/59] [D loss: 0.565306] [G loss: 1.287586]\n",
"[Epoch 174/200] [Batch 49/59] [D loss: 0.640859] [G loss: 0.905603]\n",
"[Epoch 174/200] [Batch 50/59] [D loss: 0.548887] [G loss: 1.009064]\n",
"[Epoch 174/200] [Batch 51/59] [D loss: 0.544665] [G loss: 1.019572]\n",
"[Epoch 174/200] [Batch 52/59] [D loss: 0.557189] [G loss: 0.956949]\n",
"[Epoch 174/200] [Batch 53/59] [D loss: 0.585085] [G loss: 0.894844]\n",
"[Epoch 174/200] [Batch 54/59] [D loss: 0.570598] [G loss: 0.984993]\n",
"[Epoch 174/200] [Batch 55/59] [D loss: 0.515427] [G loss: 0.837385]\n",
"[Epoch 174/200] [Batch 56/59] [D loss: 0.606648] [G loss: 1.014503]\n",
"[Epoch 174/200] [Batch 57/59] [D loss: 0.647875] [G loss: 0.736524]\n",
"[Epoch 174/200] [Batch 58/59] [D loss: 0.533078] [G loss: 1.314027]\n",
"[Epoch 175/200] [Batch 0/59] [D loss: 0.541024] [G loss: 0.739837]\n",
"[Epoch 175/200] [Batch 1/59] [D loss: 0.619632] [G loss: 0.914268]\n",
"[Epoch 175/200] [Batch 2/59] [D loss: 0.614653] [G loss: 0.980688]\n",
"[Epoch 175/200] [Batch 3/59] [D loss: 0.583835] [G loss: 1.167092]\n",
"[Epoch 175/200] [Batch 4/59] [D loss: 0.554888] [G loss: 0.845128]\n",
"[Epoch 175/200] [Batch 5/59] [D loss: 0.552610] [G loss: 1.028721]\n",
"[Epoch 175/200] [Batch 6/59] [D loss: 0.464519] [G loss: 0.965057]\n",
"[Epoch 175/200] [Batch 7/59] [D loss: 0.465112] [G loss: 0.886314]\n",
"[Epoch 175/200] [Batch 8/59] [D loss: 0.542795] [G loss: 0.835562]\n",
"[Epoch 175/200] [Batch 9/59] [D loss: 0.609966] [G loss: 0.672322]\n",
"[Epoch 175/200] [Batch 10/59] [D loss: 0.561817] [G loss: 1.374903]\n",
"[Epoch 175/200] [Batch 11/59] [D loss: 0.515834] [G loss: 0.926943]\n",
"[Epoch 175/200] [Batch 12/59] [D loss: 0.536840] [G loss: 0.829412]\n",
"[Epoch 175/200] [Batch 13/59] [D loss: 0.544844] [G loss: 1.353412]\n",
"[Epoch 175/200] [Batch 14/59] [D loss: 0.558044] [G loss: 0.954297]\n",
"[Epoch 175/200] [Batch 15/59] [D loss: 0.502387] [G loss: 0.868699]\n",
"[Epoch 175/200] [Batch 16/59] [D loss: 0.609307] [G loss: 0.753325]\n",
"[Epoch 175/200] [Batch 17/59] [D loss: 0.607354] [G loss: 1.389458]\n",
"[Epoch 175/200] [Batch 18/59] [D loss: 0.497836] [G loss: 0.908738]\n",
"[Epoch 175/200] [Batch 19/59] [D loss: 0.643351] [G loss: 0.622996]\n",
"[Epoch 175/200] [Batch 20/59] [D loss: 0.569122] [G loss: 1.469396]\n",
"[Epoch 175/200] [Batch 21/59] [D loss: 0.500116] [G loss: 1.039552]\n",
"[Epoch 175/200] [Batch 22/59] [D loss: 0.559362] [G loss: 0.640658]\n",
"[Epoch 175/200] [Batch 23/59] [D loss: 0.643658] [G loss: 1.400105]\n",
"[Epoch 175/200] [Batch 24/59] [D loss: 0.548984] [G loss: 0.741196]\n",
"[Epoch 175/200] [Batch 25/59] [D loss: 0.546093] [G loss: 0.741600]\n",
"[Epoch 175/200] [Batch 26/59] [D loss: 0.596519] [G loss: 1.255412]\n",
"[Epoch 175/200] [Batch 27/59] [D loss: 0.516933] [G loss: 1.201336]\n",
"[Epoch 175/200] [Batch 28/59] [D loss: 0.616727] [G loss: 0.817678]\n",
"[Epoch 175/200] [Batch 29/59] [D loss: 0.497683] [G loss: 0.899645]\n",
"[Epoch 175/200] [Batch 30/59] [D loss: 0.509572] [G loss: 0.982893]\n",
"[Epoch 175/200] [Batch 31/59] [D loss: 0.617352] [G loss: 0.851793]\n",
"[Epoch 175/200] [Batch 32/59] [D loss: 0.554264] [G loss: 0.991807]\n",
"[Epoch 175/200] [Batch 33/59] [D loss: 0.647797] [G loss: 1.340631]\n",
"[Epoch 175/200] [Batch 34/59] [D loss: 0.633072] [G loss: 0.562489]\n",
"[Epoch 175/200] [Batch 35/59] [D loss: 0.565785] [G loss: 1.018267]\n",
"[Epoch 175/200] [Batch 36/59] [D loss: 0.591312] [G loss: 1.002645]\n",
"[Epoch 175/200] [Batch 37/59] [D loss: 0.577257] [G loss: 0.887342]\n",
"[Epoch 175/200] [Batch 38/59] [D loss: 0.509932] [G loss: 0.902823]\n",
"[Epoch 175/200] [Batch 39/59] [D loss: 0.515494] [G loss: 0.884316]\n",
"[Epoch 175/200] [Batch 40/59] [D loss: 0.588923] [G loss: 1.216893]\n",
"[Epoch 175/200] [Batch 41/59] [D loss: 0.605767] [G loss: 0.802459]\n",
"[Epoch 175/200] [Batch 42/59] [D loss: 0.514946] [G loss: 0.779232]\n",
"[Epoch 175/200] [Batch 43/59] [D loss: 0.607315] [G loss: 1.242105]\n",
"[Epoch 175/200] [Batch 44/59] [D loss: 0.635580] [G loss: 1.081643]\n",
"[Epoch 175/200] [Batch 45/59] [D loss: 0.563344] [G loss: 0.665980]\n",
"[Epoch 175/200] [Batch 46/59] [D loss: 0.573961] [G loss: 1.377820]\n",
"[Epoch 175/200] [Batch 47/59] [D loss: 0.566102] [G loss: 1.082444]\n",
"[Epoch 175/200] [Batch 48/59] [D loss: 0.598187] [G loss: 0.670272]\n",
"[Epoch 175/200] [Batch 49/59] [D loss: 0.516481] [G loss: 1.220824]\n",
"[Epoch 175/200] [Batch 50/59] [D loss: 0.588006] [G loss: 0.893157]\n",
"[Epoch 175/200] [Batch 51/59] [D loss: 0.580874] [G loss: 0.789245]\n",
"[Epoch 175/200] [Batch 52/59] [D loss: 0.553758] [G loss: 1.047139]\n",
"[Epoch 175/200] [Batch 53/59] [D loss: 0.563517] [G loss: 1.160535]\n",
"[Epoch 175/200] [Batch 54/59] [D loss: 0.592339] [G loss: 0.850425]\n",
"[Epoch 175/200] [Batch 55/59] [D loss: 0.542297] [G loss: 1.031986]\n",
"[Epoch 175/200] [Batch 56/59] [D loss: 0.466033] [G loss: 0.798239]\n",
"[Epoch 175/200] [Batch 57/59] [D loss: 0.573139] [G loss: 0.922212]\n",
"[Epoch 175/200] [Batch 58/59] [D loss: 0.636214] [G loss: 0.813056]\n",
"[Epoch 176/200] [Batch 0/59] [D loss: 0.701527] [G loss: 0.829327]\n",
"[Epoch 176/200] [Batch 1/59] [D loss: 0.588203] [G loss: 1.010635]\n",
"[Epoch 176/200] [Batch 2/59] [D loss: 0.552226] [G loss: 1.180460]\n",
"[Epoch 176/200] [Batch 3/59] [D loss: 0.526528] [G loss: 0.834399]\n",
"[Epoch 176/200] [Batch 4/59] [D loss: 0.561349] [G loss: 1.199093]\n",
"[Epoch 176/200] [Batch 5/59] [D loss: 0.621898] [G loss: 1.177632]\n",
"[Epoch 176/200] [Batch 6/59] [D loss: 0.569167] [G loss: 0.843796]\n",
"[Epoch 176/200] [Batch 7/59] [D loss: 0.542378] [G loss: 0.675986]\n",
"[Epoch 176/200] [Batch 8/59] [D loss: 0.548560] [G loss: 1.145702]\n",
"[Epoch 176/200] [Batch 9/59] [D loss: 0.556476] [G loss: 0.584349]\n",
"[Epoch 176/200] [Batch 10/59] [D loss: 0.454483] [G loss: 1.071994]\n",
"[Epoch 176/200] [Batch 11/59] [D loss: 0.554726] [G loss: 0.751226]\n",
"[Epoch 176/200] [Batch 12/59] [D loss: 0.507626] [G loss: 0.925871]\n",
"[Epoch 176/200] [Batch 13/59] [D loss: 0.532932] [G loss: 1.173352]\n",
"[Epoch 176/200] [Batch 14/59] [D loss: 0.526426] [G loss: 0.923269]\n",
"[Epoch 176/200] [Batch 15/59] [D loss: 0.523706] [G loss: 1.278693]\n",
"[Epoch 176/200] [Batch 16/59] [D loss: 0.566621] [G loss: 0.883008]\n",
"[Epoch 176/200] [Batch 17/59] [D loss: 0.479880] [G loss: 0.903833]\n",
"[Epoch 176/200] [Batch 18/59] [D loss: 0.536174] [G loss: 0.981681]\n",
"[Epoch 176/200] [Batch 19/59] [D loss: 0.542558] [G loss: 1.141307]\n",
"[Epoch 176/200] [Batch 20/59] [D loss: 0.590636] [G loss: 0.782866]\n",
"[Epoch 176/200] [Batch 21/59] [D loss: 0.486457] [G loss: 1.009902]\n",
"[Epoch 176/200] [Batch 22/59] [D loss: 0.595965] [G loss: 0.740399]\n",
"[Epoch 176/200] [Batch 23/59] [D loss: 0.571089] [G loss: 1.099751]\n",
"[Epoch 176/200] [Batch 24/59] [D loss: 0.545810] [G loss: 0.937592]\n",
"[Epoch 176/200] [Batch 25/59] [D loss: 0.536871] [G loss: 1.000711]\n",
"[Epoch 176/200] [Batch 26/59] [D loss: 0.548357] [G loss: 0.933836]\n",
"[Epoch 176/200] [Batch 27/59] [D loss: 0.496134] [G loss: 0.709171]\n",
"[Epoch 176/200] [Batch 28/59] [D loss: 0.601904] [G loss: 1.026375]\n",
"[Epoch 176/200] [Batch 29/59] [D loss: 0.506690] [G loss: 1.141324]\n",
"[Epoch 176/200] [Batch 30/59] [D loss: 0.512071] [G loss: 0.950781]\n",
"[Epoch 176/200] [Batch 31/59] [D loss: 0.499962] [G loss: 0.874085]\n",
"[Epoch 176/200] [Batch 32/59] [D loss: 0.595845] [G loss: 1.408137]\n",
"[Epoch 176/200] [Batch 33/59] [D loss: 0.599853] [G loss: 0.795088]\n",
"[Epoch 176/200] [Batch 34/59] [D loss: 0.576013] [G loss: 0.854510]\n",
"[Epoch 176/200] [Batch 35/59] [D loss: 0.602273] [G loss: 0.995549]\n",
"[Epoch 176/200] [Batch 36/59] [D loss: 0.556924] [G loss: 1.032262]\n",
"[Epoch 176/200] [Batch 37/59] [D loss: 0.579107] [G loss: 1.002188]\n",
"[Epoch 176/200] [Batch 38/59] [D loss: 0.539243] [G loss: 1.101436]\n",
"[Epoch 176/200] [Batch 39/59] [D loss: 0.568932] [G loss: 1.105729]\n",
"[Epoch 176/200] [Batch 40/59] [D loss: 0.577718] [G loss: 1.008159]\n",
"[Epoch 176/200] [Batch 41/59] [D loss: 0.690794] [G loss: 1.147516]\n",
"[Epoch 176/200] [Batch 42/59] [D loss: 0.521588] [G loss: 0.913590]\n",
"[Epoch 176/200] [Batch 43/59] [D loss: 0.593138] [G loss: 0.805616]\n",
"[Epoch 176/200] [Batch 44/59] [D loss: 0.477702] [G loss: 1.171389]\n",
"[Epoch 176/200] [Batch 45/59] [D loss: 0.481811] [G loss: 0.860146]\n",
"[Epoch 176/200] [Batch 46/59] [D loss: 0.533823] [G loss: 0.904660]\n",
"[Epoch 176/200] [Batch 47/59] [D loss: 0.585668] [G loss: 1.027235]\n",
"[Epoch 176/200] [Batch 48/59] [D loss: 0.651412] [G loss: 0.936518]\n",
"[Epoch 176/200] [Batch 49/59] [D loss: 0.590091] [G loss: 0.544811]\n",
"[Epoch 176/200] [Batch 50/59] [D loss: 0.552468] [G loss: 1.184415]\n",
"[Epoch 176/200] [Batch 51/59] [D loss: 0.596186] [G loss: 1.273903]\n",
"[Epoch 176/200] [Batch 52/59] [D loss: 0.529197] [G loss: 0.911753]\n",
"[Epoch 176/200] [Batch 53/59] [D loss: 0.547637] [G loss: 0.813595]\n",
"[Epoch 176/200] [Batch 54/59] [D loss: 0.612021] [G loss: 1.227190]\n",
"[Epoch 176/200] [Batch 55/59] [D loss: 0.681351] [G loss: 1.361835]\n",
"[Epoch 176/200] [Batch 56/59] [D loss: 0.565890] [G loss: 0.947323]\n",
"[Epoch 176/200] [Batch 57/59] [D loss: 0.563063] [G loss: 0.778728]\n",
"[Epoch 176/200] [Batch 58/59] [D loss: 0.550831] [G loss: 0.977184]\n",
"[Epoch 177/200] [Batch 0/59] [D loss: 0.562813] [G loss: 1.050995]\n",
"[Epoch 177/200] [Batch 1/59] [D loss: 0.567161] [G loss: 0.826065]\n",
"[Epoch 177/200] [Batch 2/59] [D loss: 0.517767] [G loss: 1.372938]\n",
"[Epoch 177/200] [Batch 3/59] [D loss: 0.623456] [G loss: 0.959976]\n",
"[Epoch 177/200] [Batch 4/59] [D loss: 0.522094] [G loss: 1.123782]\n",
"[Epoch 177/200] [Batch 5/59] [D loss: 0.588309] [G loss: 1.025168]\n",
"[Epoch 177/200] [Batch 6/59] [D loss: 0.580168] [G loss: 0.707765]\n",
"[Epoch 177/200] [Batch 7/59] [D loss: 0.522336] [G loss: 1.251369]\n",
"[Epoch 177/200] [Batch 8/59] [D loss: 0.507716] [G loss: 1.156238]\n",
"[Epoch 177/200] [Batch 9/59] [D loss: 0.552566] [G loss: 0.911898]\n",
"[Epoch 177/200] [Batch 10/59] [D loss: 0.491400] [G loss: 0.903286]\n",
"[Epoch 177/200] [Batch 11/59] [D loss: 0.539790] [G loss: 1.162114]\n",
"[Epoch 177/200] [Batch 12/59] [D loss: 0.542365] [G loss: 0.958083]\n",
"[Epoch 177/200] [Batch 13/59] [D loss: 0.519436] [G loss: 0.565751]\n",
"[Epoch 177/200] [Batch 14/59] [D loss: 0.543476] [G loss: 1.203017]\n",
"[Epoch 177/200] [Batch 15/59] [D loss: 0.530456] [G loss: 1.257271]\n",
"[Epoch 177/200] [Batch 16/59] [D loss: 0.606064] [G loss: 0.599792]\n",
"[Epoch 177/200] [Batch 17/59] [D loss: 0.538933] [G loss: 1.028927]\n",
"[Epoch 177/200] [Batch 18/59] [D loss: 0.621004] [G loss: 1.168979]\n",
"[Epoch 177/200] [Batch 19/59] [D loss: 0.537288] [G loss: 0.640078]\n",
"[Epoch 177/200] [Batch 20/59] [D loss: 0.609219] [G loss: 0.960361]\n",
"[Epoch 177/200] [Batch 21/59] [D loss: 0.661994] [G loss: 1.566208]\n",
"[Epoch 177/200] [Batch 22/59] [D loss: 0.493413] [G loss: 0.886999]\n",
"[Epoch 177/200] [Batch 23/59] [D loss: 0.532426] [G loss: 0.585019]\n",
"[Epoch 177/200] [Batch 24/59] [D loss: 0.484620] [G loss: 0.993695]\n",
"[Epoch 177/200] [Batch 25/59] [D loss: 0.554897] [G loss: 1.133397]\n",
"[Epoch 177/200] [Batch 26/59] [D loss: 0.489349] [G loss: 0.878795]\n",
"[Epoch 177/200] [Batch 27/59] [D loss: 0.669218] [G loss: 1.093567]\n",
"[Epoch 177/200] [Batch 28/59] [D loss: 0.601961] [G loss: 0.839313]\n",
"[Epoch 177/200] [Batch 29/59] [D loss: 0.555623] [G loss: 1.032049]\n",
"[Epoch 177/200] [Batch 30/59] [D loss: 0.564285] [G loss: 1.008463]\n",
"[Epoch 177/200] [Batch 31/59] [D loss: 0.546634] [G loss: 0.984900]\n",
"[Epoch 177/200] [Batch 32/59] [D loss: 0.558947] [G loss: 0.868158]\n",
"[Epoch 177/200] [Batch 33/59] [D loss: 0.503643] [G loss: 1.008844]\n",
"[Epoch 177/200] [Batch 34/59] [D loss: 0.558143] [G loss: 1.324217]\n",
"[Epoch 177/200] [Batch 35/59] [D loss: 0.556115] [G loss: 0.651371]\n",
"[Epoch 177/200] [Batch 36/59] [D loss: 0.553149] [G loss: 1.231888]\n",
"[Epoch 177/200] [Batch 37/59] [D loss: 0.548081] [G loss: 0.826567]\n",
"[Epoch 177/200] [Batch 38/59] [D loss: 0.514311] [G loss: 0.705229]\n",
"[Epoch 177/200] [Batch 39/59] [D loss: 0.458105] [G loss: 1.043779]\n",
"[Epoch 177/200] [Batch 40/59] [D loss: 0.636877] [G loss: 0.983717]\n",
"[Epoch 177/200] [Batch 41/59] [D loss: 0.564246] [G loss: 0.745155]\n",
"[Epoch 177/200] [Batch 42/59] [D loss: 0.549793] [G loss: 1.202985]\n",
"[Epoch 177/200] [Batch 43/59] [D loss: 0.498634] [G loss: 0.578606]\n",
"[Epoch 177/200] [Batch 44/59] [D loss: 0.500456] [G loss: 1.252992]\n",
"[Epoch 177/200] [Batch 45/59] [D loss: 0.588589] [G loss: 1.165632]\n",
"[Epoch 177/200] [Batch 46/59] [D loss: 0.523634] [G loss: 0.919876]\n",
"[Epoch 177/200] [Batch 47/59] [D loss: 0.574970] [G loss: 0.846244]\n",
"[Epoch 177/200] [Batch 48/59] [D loss: 0.543736] [G loss: 0.682771]\n",
"[Epoch 177/200] [Batch 49/59] [D loss: 0.541524] [G loss: 1.046551]\n",
"[Epoch 177/200] [Batch 50/59] [D loss: 0.596305] [G loss: 0.936882]\n",
"[Epoch 177/200] [Batch 51/59] [D loss: 0.511387] [G loss: 1.097530]\n",
"[Epoch 177/200] [Batch 52/59] [D loss: 0.433167] [G loss: 1.273197]\n",
"[Epoch 177/200] [Batch 53/59] [D loss: 0.564104] [G loss: 0.647672]\n",
"[Epoch 177/200] [Batch 54/59] [D loss: 0.570632] [G loss: 0.826180]\n",
"[Epoch 177/200] [Batch 55/59] [D loss: 0.493017] [G loss: 1.181015]\n",
"[Epoch 177/200] [Batch 56/59] [D loss: 0.508857] [G loss: 0.948253]\n",
"[Epoch 177/200] [Batch 57/59] [D loss: 0.474126] [G loss: 0.861197]\n",
"[Epoch 177/200] [Batch 58/59] [D loss: 0.593482] [G loss: 1.063147]\n",
"[Epoch 178/200] [Batch 0/59] [D loss: 0.472952] [G loss: 0.767748]\n",
"[Epoch 178/200] [Batch 1/59] [D loss: 0.541843] [G loss: 1.091297]\n",
"[Epoch 178/200] [Batch 2/59] [D loss: 0.639700] [G loss: 1.480358]\n",
"[Epoch 178/200] [Batch 3/59] [D loss: 0.539315] [G loss: 0.749397]\n",
"[Epoch 178/200] [Batch 4/59] [D loss: 0.575293] [G loss: 0.775957]\n",
"[Epoch 178/200] [Batch 5/59] [D loss: 0.553227] [G loss: 1.429900]\n",
"[Epoch 178/200] [Batch 6/59] [D loss: 0.503440] [G loss: 1.043696]\n",
"[Epoch 178/200] [Batch 7/59] [D loss: 0.470255] [G loss: 0.661594]\n",
"[Epoch 178/200] [Batch 8/59] [D loss: 0.595392] [G loss: 1.245892]\n",
"[Epoch 178/200] [Batch 9/59] [D loss: 0.605668] [G loss: 1.021403]\n",
"[Epoch 178/200] [Batch 10/59] [D loss: 0.567750] [G loss: 0.819712]\n",
"[Epoch 178/200] [Batch 11/59] [D loss: 0.524063] [G loss: 1.199211]\n",
"[Epoch 178/200] [Batch 12/59] [D loss: 0.567803] [G loss: 1.308762]\n",
"[Epoch 178/200] [Batch 13/59] [D loss: 0.523154] [G loss: 0.725230]\n",
"[Epoch 178/200] [Batch 14/59] [D loss: 0.550660] [G loss: 1.294143]\n",
"[Epoch 178/200] [Batch 15/59] [D loss: 0.614259] [G loss: 0.884225]\n",
"[Epoch 178/200] [Batch 16/59] [D loss: 0.483704] [G loss: 1.061965]\n",
"[Epoch 178/200] [Batch 17/59] [D loss: 0.571359] [G loss: 0.686815]\n",
"[Epoch 178/200] [Batch 18/59] [D loss: 0.581258] [G loss: 0.719126]\n",
"[Epoch 178/200] [Batch 19/59] [D loss: 0.634427] [G loss: 1.466658]\n",
"[Epoch 178/200] [Batch 20/59] [D loss: 0.575038] [G loss: 0.686149]\n",
"[Epoch 178/200] [Batch 21/59] [D loss: 0.528738] [G loss: 0.758373]\n",
"[Epoch 178/200] [Batch 22/59] [D loss: 0.593480] [G loss: 1.519597]\n",
"[Epoch 178/200] [Batch 23/59] [D loss: 0.598644] [G loss: 0.494223]\n",
"[Epoch 178/200] [Batch 24/59] [D loss: 0.481629] [G loss: 0.953374]\n",
"[Epoch 178/200] [Batch 25/59] [D loss: 0.481652] [G loss: 1.168291]\n",
"[Epoch 178/200] [Batch 26/59] [D loss: 0.467156] [G loss: 1.162542]\n",
"[Epoch 178/200] [Batch 27/59] [D loss: 0.578724] [G loss: 0.898745]\n",
"[Epoch 178/200] [Batch 28/59] [D loss: 0.557043] [G loss: 1.206945]\n",
"[Epoch 178/200] [Batch 29/59] [D loss: 0.447305] [G loss: 0.707870]\n",
"[Epoch 178/200] [Batch 30/59] [D loss: 0.567919] [G loss: 0.842048]\n",
"[Epoch 178/200] [Batch 31/59] [D loss: 0.602714] [G loss: 1.035031]\n",
"[Epoch 178/200] [Batch 32/59] [D loss: 0.552238] [G loss: 0.966102]\n",
"[Epoch 178/200] [Batch 33/59] [D loss: 0.557699] [G loss: 0.868001]\n",
"[Epoch 178/200] [Batch 34/59] [D loss: 0.572084] [G loss: 1.049481]\n",
"[Epoch 178/200] [Batch 35/59] [D loss: 0.607972] [G loss: 0.772434]\n",
"[Epoch 178/200] [Batch 36/59] [D loss: 0.624290] [G loss: 0.873083]\n",
"[Epoch 178/200] [Batch 37/59] [D loss: 0.576725] [G loss: 0.787516]\n",
"[Epoch 178/200] [Batch 38/59] [D loss: 0.672143] [G loss: 1.402505]\n",
"[Epoch 178/200] [Batch 39/59] [D loss: 0.528790] [G loss: 0.705398]\n",
"[Epoch 178/200] [Batch 40/59] [D loss: 0.633785] [G loss: 0.628491]\n",
"[Epoch 178/200] [Batch 41/59] [D loss: 0.547786] [G loss: 1.171984]\n",
"[Epoch 178/200] [Batch 42/59] [D loss: 0.546366] [G loss: 0.855296]\n",
"[Epoch 178/200] [Batch 43/59] [D loss: 0.555749] [G loss: 1.218337]\n",
"[Epoch 178/200] [Batch 44/59] [D loss: 0.629798] [G loss: 0.691872]\n",
"[Epoch 178/200] [Batch 45/59] [D loss: 0.615167] [G loss: 1.277506]\n",
"[Epoch 178/200] [Batch 46/59] [D loss: 0.531380] [G loss: 0.875783]\n",
"[Epoch 178/200] [Batch 47/59] [D loss: 0.513130] [G loss: 0.755860]\n",
"[Epoch 178/200] [Batch 48/59] [D loss: 0.507514] [G loss: 0.875195]\n",
"[Epoch 178/200] [Batch 49/59] [D loss: 0.628602] [G loss: 1.537318]\n",
"[Epoch 178/200] [Batch 50/59] [D loss: 0.690621] [G loss: 0.912084]\n",
"[Epoch 178/200] [Batch 51/59] [D loss: 0.503898] [G loss: 1.022402]\n",
"[Epoch 178/200] [Batch 52/59] [D loss: 0.484933] [G loss: 1.091682]\n",
"[Epoch 178/200] [Batch 53/59] [D loss: 0.588278] [G loss: 0.700600]\n",
"[Epoch 178/200] [Batch 54/59] [D loss: 0.551123] [G loss: 1.232408]\n",
"[Epoch 178/200] [Batch 55/59] [D loss: 0.645338] [G loss: 0.794585]\n",
"[Epoch 178/200] [Batch 56/59] [D loss: 0.606119] [G loss: 1.231425]\n",
"[Epoch 178/200] [Batch 57/59] [D loss: 0.558399] [G loss: 1.030510]\n",
"[Epoch 178/200] [Batch 58/59] [D loss: 0.451823] [G loss: 1.045991]\n",
"[Epoch 179/200] [Batch 0/59] [D loss: 0.585141] [G loss: 1.537504]\n",
"[Epoch 179/200] [Batch 1/59] [D loss: 0.511852] [G loss: 0.819338]\n",
"[Epoch 179/200] [Batch 2/59] [D loss: 0.478471] [G loss: 1.004630]\n",
"[Epoch 179/200] [Batch 3/59] [D loss: 0.562126] [G loss: 1.394663]\n",
"[Epoch 179/200] [Batch 4/59] [D loss: 0.526470] [G loss: 1.013102]\n",
"[Epoch 179/200] [Batch 5/59] [D loss: 0.553577] [G loss: 0.779196]\n",
"[Epoch 179/200] [Batch 6/59] [D loss: 0.524952] [G loss: 1.379822]\n",
"[Epoch 179/200] [Batch 7/59] [D loss: 0.495546] [G loss: 1.216922]\n",
"[Epoch 179/200] [Batch 8/59] [D loss: 0.569978] [G loss: 0.896834]\n",
"[Epoch 179/200] [Batch 9/59] [D loss: 0.457746] [G loss: 0.841592]\n",
"[Epoch 179/200] [Batch 10/59] [D loss: 0.606708] [G loss: 1.264942]\n",
"[Epoch 179/200] [Batch 11/59] [D loss: 0.541770] [G loss: 0.811902]\n",
"[Epoch 179/200] [Batch 12/59] [D loss: 0.586508] [G loss: 1.136451]\n",
"[Epoch 179/200] [Batch 13/59] [D loss: 0.597409] [G loss: 1.059130]\n",
"[Epoch 179/200] [Batch 14/59] [D loss: 0.547512] [G loss: 0.885932]\n",
"[Epoch 179/200] [Batch 15/59] [D loss: 0.548373] [G loss: 1.187574]\n",
"[Epoch 179/200] [Batch 16/59] [D loss: 0.571351] [G loss: 1.071457]\n",
"[Epoch 179/200] [Batch 17/59] [D loss: 0.499216] [G loss: 0.818026]\n",
"[Epoch 179/200] [Batch 18/59] [D loss: 0.592503] [G loss: 1.174022]\n",
"[Epoch 179/200] [Batch 19/59] [D loss: 0.617118] [G loss: 1.346757]\n",
"[Epoch 179/200] [Batch 20/59] [D loss: 0.573314] [G loss: 0.655253]\n",
"[Epoch 179/200] [Batch 21/59] [D loss: 0.554607] [G loss: 1.046574]\n",
"[Epoch 179/200] [Batch 22/59] [D loss: 0.544412] [G loss: 1.058576]\n",
"[Epoch 179/200] [Batch 23/59] [D loss: 0.495996] [G loss: 0.769709]\n",
"[Epoch 179/200] [Batch 24/59] [D loss: 0.551136] [G loss: 1.456090]\n",
"[Epoch 179/200] [Batch 25/59] [D loss: 0.571750] [G loss: 1.038529]\n",
"[Epoch 179/200] [Batch 26/59] [D loss: 0.531380] [G loss: 0.812740]\n",
"[Epoch 179/200] [Batch 27/59] [D loss: 0.524645] [G loss: 0.772262]\n",
"[Epoch 179/200] [Batch 28/59] [D loss: 0.533731] [G loss: 1.062427]\n",
"[Epoch 179/200] [Batch 29/59] [D loss: 0.629264] [G loss: 1.306619]\n",
"[Epoch 179/200] [Batch 30/59] [D loss: 0.593840] [G loss: 0.497672]\n",
"[Epoch 179/200] [Batch 31/59] [D loss: 0.479400] [G loss: 1.177689]\n",
"[Epoch 179/200] [Batch 32/59] [D loss: 0.535327] [G loss: 1.150678]\n",
"[Epoch 179/200] [Batch 33/59] [D loss: 0.522282] [G loss: 0.798731]\n",
"[Epoch 179/200] [Batch 34/59] [D loss: 0.575555] [G loss: 1.001161]\n",
"[Epoch 179/200] [Batch 35/59] [D loss: 0.597278] [G loss: 1.365899]\n",
"[Epoch 179/200] [Batch 36/59] [D loss: 0.610711] [G loss: 0.711856]\n",
"[Epoch 179/200] [Batch 37/59] [D loss: 0.538254] [G loss: 0.699783]\n",
"[Epoch 179/200] [Batch 38/59] [D loss: 0.582828] [G loss: 1.344315]\n",
"[Epoch 179/200] [Batch 39/59] [D loss: 0.647384] [G loss: 1.205289]\n",
"[Epoch 179/200] [Batch 40/59] [D loss: 0.503495] [G loss: 0.585912]\n",
"[Epoch 179/200] [Batch 41/59] [D loss: 0.631732] [G loss: 0.720762]\n",
"[Epoch 179/200] [Batch 42/59] [D loss: 0.557276] [G loss: 0.998885]\n",
"[Epoch 179/200] [Batch 43/59] [D loss: 0.496700] [G loss: 1.251981]\n",
"[Epoch 179/200] [Batch 44/59] [D loss: 0.573172] [G loss: 1.101091]\n",
"[Epoch 179/200] [Batch 45/59] [D loss: 0.577215] [G loss: 0.728674]\n",
"[Epoch 179/200] [Batch 46/59] [D loss: 0.570101] [G loss: 1.163349]\n",
"[Epoch 179/200] [Batch 47/59] [D loss: 0.532420] [G loss: 0.636995]\n",
"[Epoch 179/200] [Batch 48/59] [D loss: 0.543394] [G loss: 0.873999]\n",
"[Epoch 179/200] [Batch 49/59] [D loss: 0.527214] [G loss: 1.144345]\n",
"[Epoch 179/200] [Batch 50/59] [D loss: 0.629947] [G loss: 0.899267]\n",
"[Epoch 179/200] [Batch 51/59] [D loss: 0.655221] [G loss: 0.516377]\n",
"[Epoch 179/200] [Batch 52/59] [D loss: 0.660600] [G loss: 1.646480]\n",
"[Epoch 179/200] [Batch 53/59] [D loss: 0.564611] [G loss: 1.060384]\n",
"[Epoch 179/200] [Batch 54/59] [D loss: 0.582914] [G loss: 0.657456]\n",
"[Epoch 179/200] [Batch 55/59] [D loss: 0.469267] [G loss: 1.434381]\n",
"[Epoch 179/200] [Batch 56/59] [D loss: 0.564478] [G loss: 1.062671]\n",
"[Epoch 179/200] [Batch 57/59] [D loss: 0.590648] [G loss: 0.695900]\n",
"[Epoch 179/200] [Batch 58/59] [D loss: 0.441969] [G loss: 1.015994]\n",
"[Epoch 180/200] [Batch 0/59] [D loss: 0.538583] [G loss: 0.875428]\n",
"[Epoch 180/200] [Batch 1/59] [D loss: 0.556577] [G loss: 1.303547]\n",
"[Epoch 180/200] [Batch 2/59] [D loss: 0.650484] [G loss: 0.508343]\n",
"[Epoch 180/200] [Batch 3/59] [D loss: 0.466982] [G loss: 1.441129]\n",
"[Epoch 180/200] [Batch 4/59] [D loss: 0.441681] [G loss: 1.400690]\n",
"[Epoch 180/200] [Batch 5/59] [D loss: 0.625108] [G loss: 0.644299]\n",
"[Epoch 180/200] [Batch 6/59] [D loss: 0.540158] [G loss: 0.785318]\n",
"[Epoch 180/200] [Batch 7/59] [D loss: 0.549684] [G loss: 1.176818]\n",
"[Epoch 180/200] [Batch 8/59] [D loss: 0.498214] [G loss: 0.599836]\n",
"[Epoch 180/200] [Batch 9/59] [D loss: 0.607271] [G loss: 1.016025]\n",
"[Epoch 180/200] [Batch 10/59] [D loss: 0.556514] [G loss: 1.204187]\n",
"[Epoch 180/200] [Batch 11/59] [D loss: 0.561129] [G loss: 1.313061]\n",
"[Epoch 180/200] [Batch 12/59] [D loss: 0.501202] [G loss: 0.747992]\n",
"[Epoch 180/200] [Batch 13/59] [D loss: 0.527374] [G loss: 1.031693]\n",
"[Epoch 180/200] [Batch 14/59] [D loss: 0.474808] [G loss: 0.856146]\n",
"[Epoch 180/200] [Batch 15/59] [D loss: 0.482540] [G loss: 1.132301]\n",
"[Epoch 180/200] [Batch 16/59] [D loss: 0.579513] [G loss: 0.823166]\n",
"[Epoch 180/200] [Batch 17/59] [D loss: 0.521261] [G loss: 1.005370]\n",
"[Epoch 180/200] [Batch 18/59] [D loss: 0.560499] [G loss: 0.893583]\n",
"[Epoch 180/200] [Batch 19/59] [D loss: 0.555945] [G loss: 0.819749]\n",
"[Epoch 180/200] [Batch 20/59] [D loss: 0.550456] [G loss: 1.074416]\n",
"[Epoch 180/200] [Batch 21/59] [D loss: 0.537913] [G loss: 0.925957]\n",
"[Epoch 180/200] [Batch 22/59] [D loss: 0.502335] [G loss: 0.986376]\n",
"[Epoch 180/200] [Batch 23/59] [D loss: 0.585067] [G loss: 1.256201]\n",
"[Epoch 180/200] [Batch 24/59] [D loss: 0.581674] [G loss: 0.800822]\n",
"[Epoch 180/200] [Batch 25/59] [D loss: 0.552117] [G loss: 1.088117]\n",
"[Epoch 180/200] [Batch 26/59] [D loss: 0.588993] [G loss: 1.383677]\n",
"[Epoch 180/200] [Batch 27/59] [D loss: 0.543847] [G loss: 0.734342]\n",
"[Epoch 180/200] [Batch 28/59] [D loss: 0.553045] [G loss: 0.682657]\n",
"[Epoch 180/200] [Batch 29/59] [D loss: 0.615309] [G loss: 1.241169]\n",
"[Epoch 180/200] [Batch 30/59] [D loss: 0.498847] [G loss: 0.801605]\n",
"[Epoch 180/200] [Batch 31/59] [D loss: 0.598063] [G loss: 0.824705]\n",
"[Epoch 180/200] [Batch 32/59] [D loss: 0.554387] [G loss: 1.079711]\n",
"[Epoch 180/200] [Batch 33/59] [D loss: 0.571612] [G loss: 1.024622]\n",
"[Epoch 180/200] [Batch 34/59] [D loss: 0.496679] [G loss: 0.973291]\n",
"[Epoch 180/200] [Batch 35/59] [D loss: 0.476941] [G loss: 1.195479]\n",
"[Epoch 180/200] [Batch 36/59] [D loss: 0.660609] [G loss: 1.426363]\n",
"[Epoch 180/200] [Batch 37/59] [D loss: 0.467176] [G loss: 0.603086]\n",
"[Epoch 180/200] [Batch 38/59] [D loss: 0.532644] [G loss: 1.087922]\n",
"[Epoch 180/200] [Batch 39/59] [D loss: 0.545474] [G loss: 1.467175]\n",
"[Epoch 180/200] [Batch 40/59] [D loss: 0.455680] [G loss: 1.229463]\n",
"[Epoch 180/200] [Batch 41/59] [D loss: 0.568287] [G loss: 0.576484]\n",
"[Epoch 180/200] [Batch 42/59] [D loss: 0.617864] [G loss: 1.616867]\n",
"[Epoch 180/200] [Batch 43/59] [D loss: 0.476839] [G loss: 0.778817]\n",
"[Epoch 180/200] [Batch 44/59] [D loss: 0.526224] [G loss: 0.614874]\n",
"[Epoch 180/200] [Batch 45/59] [D loss: 0.578769] [G loss: 1.049564]\n",
"[Epoch 180/200] [Batch 46/59] [D loss: 0.647818] [G loss: 0.878134]\n",
"[Epoch 180/200] [Batch 47/59] [D loss: 0.597399] [G loss: 0.985000]\n",
"[Epoch 180/200] [Batch 48/59] [D loss: 0.535602] [G loss: 1.257609]\n",
"[Epoch 180/200] [Batch 49/59] [D loss: 0.503052] [G loss: 1.013298]\n",
"[Epoch 180/200] [Batch 50/59] [D loss: 0.608565] [G loss: 0.832944]\n",
"[Epoch 180/200] [Batch 51/59] [D loss: 0.576220] [G loss: 1.059473]\n",
"[Epoch 180/200] [Batch 52/59] [D loss: 0.552817] [G loss: 0.867426]\n",
"[Epoch 180/200] [Batch 53/59] [D loss: 0.491425] [G loss: 0.865080]\n",
"[Epoch 180/200] [Batch 54/59] [D loss: 0.629798] [G loss: 0.990717]\n",
"[Epoch 180/200] [Batch 55/59] [D loss: 0.516973] [G loss: 0.733485]\n",
"[Epoch 180/200] [Batch 56/59] [D loss: 0.502269] [G loss: 1.016101]\n",
"[Epoch 180/200] [Batch 57/59] [D loss: 0.570449] [G loss: 1.188887]\n",
"[Epoch 180/200] [Batch 58/59] [D loss: 0.489339] [G loss: 1.045932]\n",
"[Epoch 181/200] [Batch 0/59] [D loss: 0.586235] [G loss: 0.944692]\n",
"[Epoch 181/200] [Batch 1/59] [D loss: 0.605123] [G loss: 1.396511]\n",
"[Epoch 181/200] [Batch 2/59] [D loss: 0.484954] [G loss: 0.641736]\n",
"[Epoch 181/200] [Batch 3/59] [D loss: 0.586549] [G loss: 0.789556]\n",
"[Epoch 181/200] [Batch 4/59] [D loss: 0.602902] [G loss: 1.054826]\n",
"[Epoch 181/200] [Batch 5/59] [D loss: 0.491706] [G loss: 1.041101]\n",
"[Epoch 181/200] [Batch 6/59] [D loss: 0.599936] [G loss: 0.827269]\n",
"[Epoch 181/200] [Batch 7/59] [D loss: 0.585124] [G loss: 1.503128]\n",
"[Epoch 181/200] [Batch 8/59] [D loss: 0.631748] [G loss: 1.101649]\n",
"[Epoch 181/200] [Batch 9/59] [D loss: 0.540276] [G loss: 0.712001]\n",
"[Epoch 181/200] [Batch 10/59] [D loss: 0.531247] [G loss: 0.982785]\n",
"[Epoch 181/200] [Batch 11/59] [D loss: 0.495696] [G loss: 1.449774]\n",
"[Epoch 181/200] [Batch 12/59] [D loss: 0.568743] [G loss: 0.648167]\n",
"[Epoch 181/200] [Batch 13/59] [D loss: 0.658281] [G loss: 0.839413]\n",
"[Epoch 181/200] [Batch 14/59] [D loss: 0.602027] [G loss: 1.528893]\n",
"[Epoch 181/200] [Batch 15/59] [D loss: 0.572126] [G loss: 1.076912]\n",
"[Epoch 181/200] [Batch 16/59] [D loss: 0.551574] [G loss: 0.576451]\n",
"[Epoch 181/200] [Batch 17/59] [D loss: 0.493381] [G loss: 0.880790]\n",
"[Epoch 181/200] [Batch 18/59] [D loss: 0.536145] [G loss: 1.265777]\n",
"[Epoch 181/200] [Batch 19/59] [D loss: 0.581082] [G loss: 0.791343]\n",
"[Epoch 181/200] [Batch 20/59] [D loss: 0.479444] [G loss: 0.794124]\n",
"[Epoch 181/200] [Batch 21/59] [D loss: 0.507664] [G loss: 1.158778]\n",
"[Epoch 181/200] [Batch 22/59] [D loss: 0.554525] [G loss: 0.985089]\n",
"[Epoch 181/200] [Batch 23/59] [D loss: 0.520486] [G loss: 1.175272]\n",
"[Epoch 181/200] [Batch 24/59] [D loss: 0.543277] [G loss: 1.017765]\n",
"[Epoch 181/200] [Batch 25/59] [D loss: 0.452507] [G loss: 0.934344]\n",
"[Epoch 181/200] [Batch 26/59] [D loss: 0.651859] [G loss: 1.465342]\n",
"[Epoch 181/200] [Batch 27/59] [D loss: 0.541157] [G loss: 0.497162]\n",
"[Epoch 181/200] [Batch 28/59] [D loss: 0.618289] [G loss: 0.850126]\n",
"[Epoch 181/200] [Batch 29/59] [D loss: 0.560760] [G loss: 1.649398]\n",
"[Epoch 181/200] [Batch 30/59] [D loss: 0.541783] [G loss: 0.737001]\n",
"[Epoch 181/200] [Batch 31/59] [D loss: 0.613095] [G loss: 0.650833]\n",
"[Epoch 181/200] [Batch 32/59] [D loss: 0.626118] [G loss: 1.342313]\n",
"[Epoch 181/200] [Batch 33/59] [D loss: 0.507721] [G loss: 0.828879]\n",
"[Epoch 181/200] [Batch 34/59] [D loss: 0.581414] [G loss: 0.809152]\n",
"[Epoch 181/200] [Batch 35/59] [D loss: 0.666542] [G loss: 1.280059]\n",
"[Epoch 181/200] [Batch 36/59] [D loss: 0.506436] [G loss: 1.188600]\n",
"[Epoch 181/200] [Batch 37/59] [D loss: 0.500921] [G loss: 1.176483]\n",
"[Epoch 181/200] [Batch 38/59] [D loss: 0.605761] [G loss: 1.018240]\n",
"[Epoch 181/200] [Batch 39/59] [D loss: 0.500244] [G loss: 0.838201]\n",
"[Epoch 181/200] [Batch 40/59] [D loss: 0.508816] [G loss: 1.280288]\n",
"[Epoch 181/200] [Batch 41/59] [D loss: 0.598678] [G loss: 0.786961]\n",
"[Epoch 181/200] [Batch 42/59] [D loss: 0.640482] [G loss: 1.222840]\n",
"[Epoch 181/200] [Batch 43/59] [D loss: 0.604124] [G loss: 0.947620]\n",
"[Epoch 181/200] [Batch 44/59] [D loss: 0.563382] [G loss: 1.144236]\n",
"[Epoch 181/200] [Batch 45/59] [D loss: 0.585795] [G loss: 0.815287]\n",
"[Epoch 181/200] [Batch 46/59] [D loss: 0.454586] [G loss: 1.048155]\n",
"[Epoch 181/200] [Batch 47/59] [D loss: 0.638168] [G loss: 1.034230]\n",
"[Epoch 181/200] [Batch 48/59] [D loss: 0.594792] [G loss: 0.684297]\n",
"[Epoch 181/200] [Batch 49/59] [D loss: 0.532457] [G loss: 1.308919]\n",
"[Epoch 181/200] [Batch 50/59] [D loss: 0.502461] [G loss: 0.949822]\n",
"[Epoch 181/200] [Batch 51/59] [D loss: 0.624166] [G loss: 0.727581]\n",
"[Epoch 181/200] [Batch 52/59] [D loss: 0.514700] [G loss: 1.267362]\n",
"[Epoch 181/200] [Batch 53/59] [D loss: 0.519520] [G loss: 1.050311]\n",
"[Epoch 181/200] [Batch 54/59] [D loss: 0.471841] [G loss: 0.889200]\n",
"[Epoch 181/200] [Batch 55/59] [D loss: 0.607656] [G loss: 1.033922]\n",
"[Epoch 181/200] [Batch 56/59] [D loss: 0.487801] [G loss: 0.995991]\n",
"[Epoch 181/200] [Batch 57/59] [D loss: 0.479184] [G loss: 0.679487]\n",
"[Epoch 181/200] [Batch 58/59] [D loss: 0.547276] [G loss: 1.356394]\n",
"[Epoch 182/200] [Batch 0/59] [D loss: 0.524662] [G loss: 1.171749]\n",
"[Epoch 182/200] [Batch 1/59] [D loss: 0.451294] [G loss: 1.153725]\n",
"[Epoch 182/200] [Batch 2/59] [D loss: 0.585777] [G loss: 1.106389]\n",
"[Epoch 182/200] [Batch 3/59] [D loss: 0.513188] [G loss: 0.973486]\n",
"[Epoch 182/200] [Batch 4/59] [D loss: 0.532139] [G loss: 0.694511]\n",
"[Epoch 182/200] [Batch 5/59] [D loss: 0.559193] [G loss: 1.066890]\n",
"[Epoch 182/200] [Batch 6/59] [D loss: 0.620483] [G loss: 1.098680]\n",
"[Epoch 182/200] [Batch 7/59] [D loss: 0.518688] [G loss: 0.761407]\n",
"[Epoch 182/200] [Batch 8/59] [D loss: 0.583244] [G loss: 1.081690]\n",
"[Epoch 182/200] [Batch 9/59] [D loss: 0.547636] [G loss: 0.798513]\n",
"[Epoch 182/200] [Batch 10/59] [D loss: 0.638933] [G loss: 0.908382]\n",
"[Epoch 182/200] [Batch 11/59] [D loss: 0.612475] [G loss: 0.875723]\n",
"[Epoch 182/200] [Batch 12/59] [D loss: 0.541476] [G loss: 1.294414]\n",
"[Epoch 182/200] [Batch 13/59] [D loss: 0.545600] [G loss: 0.918845]\n",
"[Epoch 182/200] [Batch 14/59] [D loss: 0.447795] [G loss: 0.974084]\n",
"[Epoch 182/200] [Batch 15/59] [D loss: 0.563644] [G loss: 1.170890]\n",
"[Epoch 182/200] [Batch 16/59] [D loss: 0.557410] [G loss: 0.987185]\n",
"[Epoch 182/200] [Batch 17/59] [D loss: 0.550168] [G loss: 1.131210]\n",
"[Epoch 182/200] [Batch 18/59] [D loss: 0.548367] [G loss: 0.904922]\n",
"[Epoch 182/200] [Batch 19/59] [D loss: 0.622133] [G loss: 1.243989]\n",
"[Epoch 182/200] [Batch 20/59] [D loss: 0.535143] [G loss: 0.846940]\n",
"[Epoch 182/200] [Batch 21/59] [D loss: 0.474924] [G loss: 0.991762]\n",
"[Epoch 182/200] [Batch 22/59] [D loss: 0.533554] [G loss: 1.132458]\n",
"[Epoch 182/200] [Batch 23/59] [D loss: 0.620978] [G loss: 0.850175]\n",
"[Epoch 182/200] [Batch 24/59] [D loss: 0.563962] [G loss: 1.334904]\n",
"[Epoch 182/200] [Batch 25/59] [D loss: 0.553259] [G loss: 0.914607]\n",
"[Epoch 182/200] [Batch 26/59] [D loss: 0.546505] [G loss: 0.977314]\n",
"[Epoch 182/200] [Batch 27/59] [D loss: 0.505699] [G loss: 1.007946]\n",
"[Epoch 182/200] [Batch 28/59] [D loss: 0.575548] [G loss: 0.851900]\n",
"[Epoch 182/200] [Batch 29/59] [D loss: 0.509567] [G loss: 1.002333]\n",
"[Epoch 182/200] [Batch 30/59] [D loss: 0.478262] [G loss: 0.968475]\n",
"[Epoch 182/200] [Batch 31/59] [D loss: 0.617335] [G loss: 0.886826]\n",
"[Epoch 182/200] [Batch 32/59] [D loss: 0.514838] [G loss: 1.393522]\n",
"[Epoch 182/200] [Batch 33/59] [D loss: 0.543845] [G loss: 1.231844]\n",
"[Epoch 182/200] [Batch 34/59] [D loss: 0.511716] [G loss: 0.816563]\n",
"[Epoch 182/200] [Batch 35/59] [D loss: 0.570214] [G loss: 1.022140]\n",
"[Epoch 182/200] [Batch 36/59] [D loss: 0.580217] [G loss: 1.173657]\n",
"[Epoch 182/200] [Batch 37/59] [D loss: 0.711823] [G loss: 0.574008]\n",
"[Epoch 182/200] [Batch 38/59] [D loss: 0.728003] [G loss: 1.719508]\n",
"[Epoch 182/200] [Batch 39/59] [D loss: 0.603690] [G loss: 0.905152]\n",
"[Epoch 182/200] [Batch 40/59] [D loss: 0.558206] [G loss: 0.971174]\n",
"[Epoch 182/200] [Batch 41/59] [D loss: 0.492688] [G loss: 1.176577]\n",
"[Epoch 182/200] [Batch 42/59] [D loss: 0.551442] [G loss: 1.291498]\n",
"[Epoch 182/200] [Batch 43/59] [D loss: 0.578647] [G loss: 0.680785]\n",
"[Epoch 182/200] [Batch 44/59] [D loss: 0.539784] [G loss: 1.004253]\n",
"[Epoch 182/200] [Batch 45/59] [D loss: 0.534170] [G loss: 0.723049]\n",
"[Epoch 182/200] [Batch 46/59] [D loss: 0.473899] [G loss: 1.238510]\n",
"[Epoch 182/200] [Batch 47/59] [D loss: 0.465503] [G loss: 0.743976]\n",
"[Epoch 182/200] [Batch 48/59] [D loss: 0.613011] [G loss: 0.488750]\n",
"[Epoch 182/200] [Batch 49/59] [D loss: 0.579806] [G loss: 1.349177]\n",
"[Epoch 182/200] [Batch 50/59] [D loss: 0.472887] [G loss: 0.869660]\n",
"[Epoch 182/200] [Batch 51/59] [D loss: 0.630014] [G loss: 0.780805]\n",
"[Epoch 182/200] [Batch 52/59] [D loss: 0.522190] [G loss: 1.394787]\n",
"[Epoch 182/200] [Batch 53/59] [D loss: 0.473230] [G loss: 0.784094]\n",
"[Epoch 182/200] [Batch 54/59] [D loss: 0.497789] [G loss: 0.805452]\n",
"[Epoch 182/200] [Batch 55/59] [D loss: 0.562774] [G loss: 1.460357]\n",
"[Epoch 182/200] [Batch 56/59] [D loss: 0.608345] [G loss: 0.839016]\n",
"[Epoch 182/200] [Batch 57/59] [D loss: 0.594056] [G loss: 0.920642]\n",
"[Epoch 182/200] [Batch 58/59] [D loss: 0.484045] [G loss: 1.068734]\n",
"[Epoch 183/200] [Batch 0/59] [D loss: 0.705311] [G loss: 1.611108]\n",
"[Epoch 183/200] [Batch 1/59] [D loss: 0.590429] [G loss: 0.969774]\n",
"[Epoch 183/200] [Batch 2/59] [D loss: 0.580772] [G loss: 0.756885]\n",
"[Epoch 183/200] [Batch 3/59] [D loss: 0.607800] [G loss: 1.315142]\n",
"[Epoch 183/200] [Batch 4/59] [D loss: 0.576125] [G loss: 1.144414]\n",
"[Epoch 183/200] [Batch 5/59] [D loss: 0.632590] [G loss: 0.436567]\n",
"[Epoch 183/200] [Batch 6/59] [D loss: 0.559294] [G loss: 1.195221]\n",
"[Epoch 183/200] [Batch 7/59] [D loss: 0.504922] [G loss: 1.307955]\n",
"[Epoch 183/200] [Batch 8/59] [D loss: 0.536480] [G loss: 0.608151]\n",
"[Epoch 183/200] [Batch 9/59] [D loss: 0.506115] [G loss: 0.947388]\n",
"[Epoch 183/200] [Batch 10/59] [D loss: 0.693649] [G loss: 1.566910]\n",
"[Epoch 183/200] [Batch 11/59] [D loss: 0.575924] [G loss: 0.806900]\n",
"[Epoch 183/200] [Batch 12/59] [D loss: 0.550615] [G loss: 1.004758]\n",
"[Epoch 183/200] [Batch 13/59] [D loss: 0.497156] [G loss: 1.323382]\n",
"[Epoch 183/200] [Batch 14/59] [D loss: 0.510083] [G loss: 0.731539]\n",
"[Epoch 183/200] [Batch 15/59] [D loss: 0.579936] [G loss: 0.998587]\n",
"[Epoch 183/200] [Batch 16/59] [D loss: 0.526816] [G loss: 0.890387]\n",
"[Epoch 183/200] [Batch 17/59] [D loss: 0.550433] [G loss: 0.794548]\n",
"[Epoch 183/200] [Batch 18/59] [D loss: 0.569411] [G loss: 0.757349]\n",
"[Epoch 183/200] [Batch 19/59] [D loss: 0.538109] [G loss: 1.294393]\n",
"[Epoch 183/200] [Batch 20/59] [D loss: 0.529746] [G loss: 0.885852]\n",
"[Epoch 183/200] [Batch 21/59] [D loss: 0.501024] [G loss: 0.866016]\n",
"[Epoch 183/200] [Batch 22/59] [D loss: 0.533759] [G loss: 0.824538]\n",
"[Epoch 183/200] [Batch 23/59] [D loss: 0.544585] [G loss: 1.144884]\n",
"[Epoch 183/200] [Batch 24/59] [D loss: 0.506504] [G loss: 0.999902]\n",
"[Epoch 183/200] [Batch 25/59] [D loss: 0.549348] [G loss: 0.988719]\n",
"[Epoch 183/200] [Batch 26/59] [D loss: 0.556767] [G loss: 1.327703]\n",
"[Epoch 183/200] [Batch 27/59] [D loss: 0.600854] [G loss: 0.908801]\n",
"[Epoch 183/200] [Batch 28/59] [D loss: 0.490520] [G loss: 0.847065]\n",
"[Epoch 183/200] [Batch 29/59] [D loss: 0.531866] [G loss: 0.923514]\n",
"[Epoch 183/200] [Batch 30/59] [D loss: 0.571327] [G loss: 1.129529]\n",
"[Epoch 183/200] [Batch 31/59] [D loss: 0.503210] [G loss: 1.052560]\n",
"[Epoch 183/200] [Batch 32/59] [D loss: 0.493469] [G loss: 1.218019]\n",
"[Epoch 183/200] [Batch 33/59] [D loss: 0.528525] [G loss: 0.859365]\n",
"[Epoch 183/200] [Batch 34/59] [D loss: 0.512709] [G loss: 1.073186]\n",
"[Epoch 183/200] [Batch 35/59] [D loss: 0.549018] [G loss: 0.886716]\n",
"[Epoch 183/200] [Batch 36/59] [D loss: 0.538476] [G loss: 0.844844]\n",
"[Epoch 183/200] [Batch 37/59] [D loss: 0.546918] [G loss: 0.964704]\n",
"[Epoch 183/200] [Batch 38/59] [D loss: 0.56787