
# Data-efficient GANs with Adaptive Discriminator Augmentation

href="/keras_tuner/" role="tab" aria-selected="">KerasTuner: Hyperparam Tuning</a> <a class="nav-link" href="/keras_hub/" role="tab" aria-selected="">KerasHub: Pretrained Models</a> </div> </div> <div class='k-main'> <div class='k-main-top'> <script> function displayDropdownMenu() { e = document.getElementById("nav-menu"); if (e.style.display == "block") { e.style.display = "none"; } else { e.style.display = "block"; document.getElementById("dropdown-nav").style.display = "block"; } } function resetMobileUI() { if (window.innerWidth <= 840) { document.getElementById("nav-menu").style.display = "none"; document.getElementById("dropdown-nav").style.display = "block"; } else { document.getElementById("nav-menu").style.display = "block"; document.getElementById("dropdown-nav").style.display = "none"; } var navmenu = document.getElementById("nav-menu"); var menuheight = navmenu.clientHeight; var kmain = document.getElementById("k-main-id"); kmain.style.minHeight = (menuheight + 100) + 'px'; } window.onresize = resetMobileUI; window.addEventListener("load", (event) => { resetMobileUI() }); </script> <div id='dropdown-nav' onclick="displayDropdownMenu();"> <svg viewBox="-20 -20 120 120" width="60" height="60"> <rect width="100" height="20"></rect> <rect y="30" width="100" height="20"></rect> <rect y="60" width="100" height="20"></rect> </svg> </div> <form class="bd-search d-flex align-items-center k-search-form" id="search-form"> <input type="search" class="k-search-input" id="search-input" placeholder="Search Keras documentation..." aria-label="Search Keras documentation..." autocomplete="off"> <button class="k-search-btn"> <svg width="13" height="13" viewBox="0 0 13 13"><title>search</title><path d="m4.8495 7.8226c0.82666 0 1.5262-0.29146 2.0985-0.87438 0.57232-0.58292 0.86378-1.2877 0.87438-2.1144 0.010599-0.82666-0.28086-1.5262-0.87438-2.0985-0.59352-0.57232-1.293-0.86378-2.0985-0.87438-0.8055-0.010599-1.5103 0.28086-2.1144 0.87438-0.60414 0.59352-0.8956 1.293-0.87438 2.0985 0.021197 0.8055 0.31266 1.5103 0.87438 2.1144 0.56172 0.60414 1.2665 0.8956 2.1144 0.87438zm4.4695 0.2115 3.681 3.6819-1.259 1.284-3.6817-3.7 0.0019784-0.69479-0.090043-0.098846c-0.87973 0.76087-1.92 1.1413-3.1207 1.1413-1.3553 0-2.5025-0.46363-3.4417-1.3909s-1.4088-2.0686-1.4088-3.4239c0-1.3553 0.4696-2.4966 1.4088-3.4239 0.9392-0.92727 2.0864-1.3969 3.4417-1.4088 1.3553-0.011889 2.4906 0.45771 3.406 1.4088 0.9154 0.95107 1.379 2.0924 1.3909 3.4239 0 1.2126-0.38043 2.2588-1.1413 3.1385l0.098834 0.090049z"></path></svg> </button> </form> <script> var form = document.getElementById('search-form'); form.onsubmit = function(e) { e.preventDefault(); var query = document.getElementById('search-input').value; window.location.href = '/search.html?query=' + query; return False } </script> </div> <div class='k-main-inner' id='k-main-id'> <div class='k-location-slug'> <span class="k-location-slug-pointer">►</span> <a href='/examples/'>Code examples</a> / <a href='/examples/generative/'>Generative Deep Learning</a> / Data-efficient GANs with Adaptive Discriminator Augmentation </div> <div class='k-content'> <h1 id="dataefficient-gans-with-adaptive-discriminator-augmentation">Data-efficient GANs with Adaptive Discriminator Augmentation</h1> <p><strong>Author:</strong> <a href="https://www.linkedin.com/in/andras-beres-789190210">András Béres</a><br> <strong>Date created:</strong> 2021/10/28<br> <strong>Last modified:</strong> 2025/01/23<br> <strong>Description:</strong> Generating images from limited data using the Caltech Birds 
ⓘ This example uses Keras 3

[**View in Colab**](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/generative/ipynb/gan_ada.ipynb) • [**GitHub source**](https://github.com/keras-team/keras-io/blob/master/examples/generative/gan_ada.py)

---

## Introduction

### GANs

[Generative Adversarial Networks (GANs)](https://arxiv.org/abs/1406.2661) are a popular class of generative deep learning models, commonly used for image generation. They consist of a pair of dueling neural networks, called the discriminator and the generator. The discriminator's task is to distinguish real images from generated (fake) ones, while the generator network tries to fool the discriminator by generating more and more realistic images. If the discriminator is too easy or too hard to fool, however, it may fail to provide a useful learning signal for the generator, which is why training GANs is usually considered difficult.

### Data augmentation for GANs

Data augmentation, a popular technique in deep learning, is the process of randomly applying semantics-preserving transformations to the input data to generate multiple realistic versions of it, thereby effectively multiplying the amount of training data available. The simplest example is left-right flipping an image, which preserves its contents while generating a second unique training sample. Data augmentation is commonly used in supervised learning to prevent overfitting and enhance generalization.

The authors of [StyleGAN2-ADA](https://arxiv.org/abs/2006.06676) show that discriminator overfitting can be an issue in GANs, especially when only a small amount of training data is available. They propose Adaptive Discriminator Augmentation to mitigate this issue.

Applying data augmentation to GANs, however, is not straightforward. Since the generator is updated using the discriminator's gradients, if the generated images are augmented, the augmentation pipeline has to be differentiable, and for computational efficiency it should also be GPU-compatible. Luckily, the [Keras image augmentation layers](https://keras.io/api/layers/preprocessing_layers/image_augmentation/) fulfill both requirements, and are therefore very well suited for this task.

### Invertible data augmentation

A possible difficulty when using data augmentation in generative models is the issue of ["leaky augmentations" (section 2.2)](https://arxiv.org/abs/2006.06676), namely when the model generates images that are already augmented. This would mean that it was not able to separate the augmentation from the underlying data distribution, which can be caused by using non-invertible data transformations. For example, if 0, 90, 180 or 270 degree rotations are performed with equal probability, the original orientation of the images is impossible to infer, and this information is destroyed.

A simple trick to make data augmentations invertible is to only apply them with some probability. That way the original version of the images will be more common, and the data distribution can be inferred. By properly choosing this probability, one can effectively regularize the discriminator without making the augmentations leaky.
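As a minimal illustration of this trick (a hypothetical standalone sketch, separate from the augmentation pipeline used later in this example):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
p = 0.3  # probability of augmenting a given image

def maybe_rotate(image):
    # a random 90/180/270 degree rotation alone is non-invertible: the original
    # orientation cannot be recovered from the augmented distribution
    if rng.uniform() < p:
        return np.rot90(image, k=rng.integers(1, 4))
    # with probability 1 - p the image is left unchanged, so the un-rotated
    # orientation stays the most common one and the distribution stays inferable
    return image
```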
---

## Setup

```python
import os

os.environ["KERAS_BACKEND"] = "tensorflow"

import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_datasets as tfds

import keras
from keras import ops
from keras import layers
```

```
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1738798965.367584   17795 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1738798965.374084   17795 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
```

---

## Hyperparameters

```python
# data
num_epochs = 10  # train for 400 epochs for good results
image_size = 64
# resolution of Kernel Inception Distance measurement, see related section
kid_image_size = 75
padding = 0.25
dataset_name = "caltech_birds2011"

# adaptive discriminator augmentation
max_translation = 0.125
max_rotation = 0.125
max_zoom = 0.25
target_accuracy = 0.85
integration_steps = 1000

# architecture
noise_size = 64
depth = 4
width = 128
leaky_relu_slope = 0.2
dropout_rate = 0.4

# optimization
batch_size = 128
learning_rate = 2e-4
beta_1 = 0.5  # not using the default value of 0.9 is important
ema = 0.99
```
class="n">noise_size</span> <span class="o">=</span> <span class="mi">64</span> <span class="n">depth</span> <span class="o">=</span> <span class="mi">4</span> <span class="n">width</span> <span class="o">=</span> <span class="mi">128</span> <span class="n">leaky_relu_slope</span> <span class="o">=</span> <span class="mf">0.2</span> <span class="n">dropout_rate</span> <span class="o">=</span> <span class="mf">0.4</span> <span class="c1"># optimization</span> <span class="n">batch_size</span> <span class="o">=</span> <span class="mi">128</span> <span class="n">learning_rate</span> <span class="o">=</span> <span class="mf">2e-4</span> <span class="n">beta_1</span> <span class="o">=</span> <span class="mf">0.5</span> <span class="c1"># not using the default value of 0.9 is important</span> <span class="n">ema</span> <span class="o">=</span> <span class="mf">0.99</span> </code></pre></div> <hr /> <h2 id="data-pipeline">Data pipeline</h2> <p>In this example, we will use the <a href="https://www.tensorflow.org/datasets/catalog/caltech_birds2011">Caltech Birds (2011)</a> dataset for generating images of birds, which is a diverse natural dataset containing less then 6000 images for training. When working with such low amounts of data, one has to take extra care to retain as high data quality as possible. In this example, we use the provided bounding boxes of the birds to cut them out with square crops while preserving their aspect ratios when possible.</p> <div class="codehilite"><pre><span></span><code><span class="k">def</span><span class="w"> </span><span class="nf">round_to_int</span><span class="p">(</span><span class="n">float_value</span><span class="p">):</span> <span class="k">return</span> <span class="n">ops</span><span class="o">.</span><span class="n">cast</span><span class="p">(</span><span class="n">ops</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="n">float_value</span><span class="p">),</span> <span class="s2">&quot;int32&quot;</span><span class="p">)</span> <span class="k">def</span><span class="w"> </span><span class="nf">preprocess_image</span><span class="p">(</span><span class="n">data</span><span class="p">):</span> <span class="c1"># unnormalize bounding box coordinates</span> <span class="n">height</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">cast</span><span class="p">(</span><span class="n">ops</span><span class="o">.</span><span class="n">shape</span><span class="p">(</span><span class="n">data</span><span class="p">[</span><span class="s2">&quot;image&quot;</span><span class="p">])[</span><span class="mi">0</span><span class="p">],</span> <span class="s2">&quot;float32&quot;</span><span class="p">)</span> <span class="n">width</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">cast</span><span class="p">(</span><span class="n">ops</span><span class="o">.</span><span class="n">shape</span><span class="p">(</span><span class="n">data</span><span class="p">[</span><span class="s2">&quot;image&quot;</span><span class="p">])[</span><span class="mi">1</span><span class="p">],</span> <span class="s2">&quot;float32&quot;</span><span class="p">)</span> <span class="n">bounding_box</span> <span class="o">=</span> <span class="n">data</span><span class="p">[</span><span class="s2">&quot;bbox&quot;</span><span class="p">]</span> <span class="o">*</span> <span class="n">ops</span><span class="o">.</span><span class="n">stack</span><span 
class="p">([</span><span class="n">height</span><span class="p">,</span> <span class="n">width</span><span class="p">,</span> <span class="n">height</span><span class="p">,</span> <span class="n">width</span><span class="p">])</span> <span class="c1"># calculate center and length of longer side, add padding</span> <span class="n">target_center_y</span> <span class="o">=</span> <span class="mf">0.5</span> <span class="o">*</span> <span class="p">(</span><span class="n">bounding_box</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span> <span class="o">+</span> <span class="n">bounding_box</span><span class="p">[</span><span class="mi">2</span><span class="p">])</span> <span class="n">target_center_x</span> <span class="o">=</span> <span class="mf">0.5</span> <span class="o">*</span> <span class="p">(</span><span class="n">bounding_box</span><span class="p">[</span><span class="mi">1</span><span class="p">]</span> <span class="o">+</span> <span class="n">bounding_box</span><span class="p">[</span><span class="mi">3</span><span class="p">])</span> <span class="n">target_size</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">maximum</span><span class="p">(</span> <span class="p">(</span><span class="mf">1.0</span> <span class="o">+</span> <span class="n">padding</span><span class="p">)</span> <span class="o">*</span> <span class="p">(</span><span class="n">bounding_box</span><span class="p">[</span><span class="mi">2</span><span class="p">]</span> <span class="o">-</span> <span class="n">bounding_box</span><span class="p">[</span><span class="mi">0</span><span class="p">]),</span> <span class="p">(</span><span class="mf">1.0</span> <span class="o">+</span> <span class="n">padding</span><span class="p">)</span> <span class="o">*</span> <span class="p">(</span><span class="n">bounding_box</span><span class="p">[</span><span class="mi">3</span><span class="p">]</span> <span class="o">-</span> <span class="n">bounding_box</span><span class="p">[</span><span class="mi">1</span><span class="p">]),</span> <span class="p">)</span> <span class="c1"># modify crop size to fit into image</span> <span class="n">target_height</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">min</span><span class="p">(</span> <span class="p">[</span><span class="n">target_size</span><span class="p">,</span> <span class="mf">2.0</span> <span class="o">*</span> <span class="n">target_center_y</span><span class="p">,</span> <span class="mf">2.0</span> <span class="o">*</span> <span class="p">(</span><span class="n">height</span> <span class="o">-</span> <span class="n">target_center_y</span><span class="p">)]</span> <span class="p">)</span> <span class="n">target_width</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">min</span><span class="p">(</span> <span class="p">[</span><span class="n">target_size</span><span class="p">,</span> <span class="mf">2.0</span> <span class="o">*</span> <span class="n">target_center_x</span><span class="p">,</span> <span class="mf">2.0</span> <span class="o">*</span> <span class="p">(</span><span class="n">width</span> <span class="o">-</span> <span class="n">target_center_x</span><span class="p">)]</span> <span class="p">)</span> <span class="c1"># crop image, `ops.image.crop_images` only works with non-tensor croppings</span> <span class="n">image</span> <span class="o">=</span> <span class="n">ops</span><span 
class="o">.</span><span class="n">slice</span><span class="p">(</span> <span class="n">data</span><span class="p">[</span><span class="s2">&quot;image&quot;</span><span class="p">],</span> <span class="n">start_indices</span><span class="o">=</span><span class="p">(</span> <span class="n">round_to_int</span><span class="p">(</span><span class="n">target_center_y</span> <span class="o">-</span> <span class="mf">0.5</span> <span class="o">*</span> <span class="n">target_height</span><span class="p">),</span> <span class="n">round_to_int</span><span class="p">(</span><span class="n">target_center_x</span> <span class="o">-</span> <span class="mf">0.5</span> <span class="o">*</span> <span class="n">target_width</span><span class="p">),</span> <span class="mi">0</span><span class="p">,</span> <span class="p">),</span> <span class="n">shape</span><span class="o">=</span><span class="p">(</span><span class="n">round_to_int</span><span class="p">(</span><span class="n">target_height</span><span class="p">),</span> <span class="n">round_to_int</span><span class="p">(</span><span class="n">target_width</span><span class="p">),</span> <span class="mi">3</span><span class="p">),</span> <span class="p">)</span> <span class="c1"># resize and clip</span> <span class="n">image</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">cast</span><span class="p">(</span><span class="n">image</span><span class="p">,</span> <span class="s2">&quot;float32&quot;</span><span class="p">)</span> <span class="n">image</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">image</span><span class="o">.</span><span class="n">resize</span><span class="p">(</span><span class="n">image</span><span class="p">,</span> <span class="p">[</span><span class="n">image_size</span><span class="p">,</span> <span class="n">image_size</span><span class="p">])</span> <span class="k">return</span> <span class="n">ops</span><span class="o">.</span><span class="n">clip</span><span class="p">(</span><span class="n">image</span> <span class="o">/</span> <span class="mf">255.0</span><span class="p">,</span> <span class="mf">0.0</span><span class="p">,</span> <span class="mf">1.0</span><span class="p">)</span> <span class="k">def</span><span class="w"> </span><span class="nf">prepare_dataset</span><span class="p">(</span><span class="n">split</span><span class="p">):</span> <span class="c1"># the validation dataset is shuffled as well, because data order matters</span> <span class="c1"># for the KID calculation</span> <span class="k">return</span> <span class="p">(</span> <span class="n">tfds</span><span class="o">.</span><span class="n">load</span><span class="p">(</span><span class="n">dataset_name</span><span class="p">,</span> <span class="n">split</span><span class="o">=</span><span class="n">split</span><span class="p">,</span> <span class="n">shuffle_files</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="o">.</span><span class="n">map</span><span class="p">(</span><span class="n">preprocess_image</span><span class="p">,</span> <span class="n">num_parallel_calls</span><span class="o">=</span><span class="n">tf</span><span class="o">.</span><span class="n">data</span><span class="o">.</span><span class="n">AUTOTUNE</span><span class="p">)</span> <span class="o">.</span><span class="n">cache</span><span class="p">()</span> <span class="o">.</span><span class="n">shuffle</span><span class="p">(</span><span 
class="mi">10</span> <span class="o">*</span> <span class="n">batch_size</span><span class="p">)</span> <span class="o">.</span><span class="n">batch</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">drop_remainder</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="o">.</span><span class="n">prefetch</span><span class="p">(</span><span class="n">buffer_size</span><span class="o">=</span><span class="n">tf</span><span class="o">.</span><span class="n">data</span><span class="o">.</span><span class="n">AUTOTUNE</span><span class="p">)</span> <span class="p">)</span> <span class="n">train_dataset</span> <span class="o">=</span> <span class="n">prepare_dataset</span><span class="p">(</span><span class="s2">&quot;train&quot;</span><span class="p">)</span> <span class="n">val_dataset</span> <span class="o">=</span> <span class="n">prepare_dataset</span><span class="p">(</span><span class="s2">&quot;test&quot;</span><span class="p">)</span> </code></pre></div> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>I0000 00:00:1738798971.054632 17795 gpu_device.cc:2022] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13840 MB memory: -&gt; device: 0, name: Tesla T4, pci bus id: 0000:00:04.0, compute capability: 7.5 </code></pre></div> </div> <p>After preprocessing the training images look like the following: <img alt="birds dataset" src="https://i.imgur.com/Ru5HgBM.png" /></p> <hr /> <h2 id="kernel-inception-distance">Kernel inception distance</h2> <p><a href="https://arxiv.org/abs/1801.01401">Kernel Inception Distance (KID)</a> was proposed as a replacement for the popular <a href="https://arxiv.org/abs/1706.08500">Frechet Inception Distance (FID)</a> metric for measuring image generation quality. Both metrics measure the difference in the generated and training distributions in the representation space of an <a href="https://keras.io/api/applications/inceptionv3/">InceptionV3</a> network pretrained on <a href="https://www.tensorflow.org/datasets/catalog/imagenet2012">ImageNet</a>.</p> <p>According to the paper, KID was proposed because FID has no unbiased estimator, its expected value is higher when it is measured on fewer images. KID is more suitable for small datasets because its expected value does not depend on the number of samples it is measured on. 
```python
class KID(keras.metrics.Metric):
    def __init__(self, name="kid", **kwargs):
        super().__init__(name=name, **kwargs)

        # KID is estimated per batch and is averaged across batches
        self.kid_tracker = keras.metrics.Mean()

        # a pretrained InceptionV3 is used without its classification layer
        # transform the pixel values to the 0-255 range, then use the same
        # preprocessing as during pretraining
        self.encoder = keras.Sequential(
            [
                layers.InputLayer(input_shape=(image_size, image_size, 3)),
                layers.Rescaling(255.0),
                layers.Resizing(height=kid_image_size, width=kid_image_size),
                layers.Lambda(keras.applications.inception_v3.preprocess_input),
                keras.applications.InceptionV3(
class="n">InceptionV3</span><span class="p">(</span> <span class="n">include_top</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">input_shape</span><span class="o">=</span><span class="p">(</span><span class="n">kid_image_size</span><span class="p">,</span> <span class="n">kid_image_size</span><span class="p">,</span> <span class="mi">3</span><span class="p">),</span> <span class="n">weights</span><span class="o">=</span><span class="s2">&quot;imagenet&quot;</span><span class="p">,</span> <span class="p">),</span> <span class="n">layers</span><span class="o">.</span><span class="n">GlobalAveragePooling2D</span><span class="p">(),</span> <span class="p">],</span> <span class="n">name</span><span class="o">=</span><span class="s2">&quot;inception_encoder&quot;</span><span class="p">,</span> <span class="p">)</span> <span class="k">def</span><span class="w"> </span><span class="nf">polynomial_kernel</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">features_1</span><span class="p">,</span> <span class="n">features_2</span><span class="p">):</span> <span class="n">feature_dimensions</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">cast</span><span class="p">(</span><span class="n">ops</span><span class="o">.</span><span class="n">shape</span><span class="p">(</span><span class="n">features_1</span><span class="p">)[</span><span class="mi">1</span><span class="p">],</span> <span class="s2">&quot;float32&quot;</span><span class="p">)</span> <span class="k">return</span> <span class="p">(</span> <span class="n">features_1</span> <span class="o">@</span> <span class="n">ops</span><span class="o">.</span><span class="n">transpose</span><span class="p">(</span><span class="n">features_2</span><span class="p">)</span> <span class="o">/</span> <span class="n">feature_dimensions</span> <span class="o">+</span> <span class="mf">1.0</span> <span class="p">)</span> <span class="o">**</span> <span class="mf">3.0</span> <span class="k">def</span><span class="w"> </span><span class="nf">update_state</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">real_images</span><span class="p">,</span> <span class="n">generated_images</span><span class="p">,</span> <span class="n">sample_weight</span><span class="o">=</span><span class="kc">None</span><span class="p">):</span> <span class="n">real_features</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">encoder</span><span class="p">(</span><span class="n">real_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">False</span><span class="p">)</span> <span class="n">generated_features</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">encoder</span><span class="p">(</span><span class="n">generated_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">False</span><span class="p">)</span> <span class="c1"># compute polynomial kernels using the two sets of features</span> <span class="n">kernel_real</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">polynomial_kernel</span><span class="p">(</span><span class="n">real_features</span><span class="p">,</span> <span class="n">real_features</span><span class="p">)</span> <span 
class="n">kernel_generated</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">polynomial_kernel</span><span class="p">(</span> <span class="n">generated_features</span><span class="p">,</span> <span class="n">generated_features</span> <span class="p">)</span> <span class="n">kernel_cross</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">polynomial_kernel</span><span class="p">(</span><span class="n">real_features</span><span class="p">,</span> <span class="n">generated_features</span><span class="p">)</span> <span class="c1"># estimate the squared maximum mean discrepancy using the average kernel values</span> <span class="n">batch_size</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">shape</span><span class="p">(</span><span class="n">real_features</span><span class="p">)[</span><span class="mi">0</span><span class="p">]</span> <span class="n">batch_size_f</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">cast</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="s2">&quot;float32&quot;</span><span class="p">)</span> <span class="n">mean_kernel_real</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">sum</span><span class="p">(</span><span class="n">kernel_real</span> <span class="o">*</span> <span class="p">(</span><span class="mf">1.0</span> <span class="o">-</span> <span class="n">ops</span><span class="o">.</span><span class="n">eye</span><span class="p">(</span><span class="n">batch_size</span><span class="p">)))</span> <span class="o">/</span> <span class="p">(</span> <span class="n">batch_size_f</span> <span class="o">*</span> <span class="p">(</span><span class="n">batch_size_f</span> <span class="o">-</span> <span class="mf">1.0</span><span class="p">)</span> <span class="p">)</span> <span class="n">mean_kernel_generated</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">sum</span><span class="p">(</span> <span class="n">kernel_generated</span> <span class="o">*</span> <span class="p">(</span><span class="mf">1.0</span> <span class="o">-</span> <span class="n">ops</span><span class="o">.</span><span class="n">eye</span><span class="p">(</span><span class="n">batch_size</span><span class="p">))</span> <span class="p">)</span> <span class="o">/</span> <span class="p">(</span><span class="n">batch_size_f</span> <span class="o">*</span> <span class="p">(</span><span class="n">batch_size_f</span> <span class="o">-</span> <span class="mf">1.0</span><span class="p">))</span> <span class="n">mean_kernel_cross</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">kernel_cross</span><span class="p">)</span> <span class="n">kid</span> <span class="o">=</span> <span class="n">mean_kernel_real</span> <span class="o">+</span> <span class="n">mean_kernel_generated</span> <span class="o">-</span> <span class="mf">2.0</span> <span class="o">*</span> <span class="n">mean_kernel_cross</span> <span class="c1"># update the average KID estimate</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid_tracker</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="n">kid</span><span class="p">)</span> <span 
class="k">def</span><span class="w"> </span><span class="nf">result</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span> <span class="k">return</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid_tracker</span><span class="o">.</span><span class="n">result</span><span class="p">()</span> <span class="k">def</span><span class="w"> </span><span class="nf">reset_state</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid_tracker</span><span class="o">.</span><span class="n">reset_state</span><span class="p">()</span> </code></pre></div> <hr /> <h2 id="adaptive-discriminator-augmentation">Adaptive discriminator augmentation</h2> <p>The authors of <a href="https://arxiv.org/abs/2006.06676">StyleGAN2-ADA</a> propose to change the augmentation probability adaptively during training. Though it is explained differently in the paper, they use <a href="https://en.wikipedia.org/wiki/PID_controller#Integral">integral control</a> on the augmentation probability to keep the discriminator's accuracy on real images close to a target value. Note, that their controlled variable is actually the average sign of the discriminator logits (r_t in the paper), which corresponds to 2 * accuracy - 1.</p> <p>This method requires two hyperparameters:</p> <ol> <li><code>target_accuracy</code>: the target value for the discriminator's accuracy on real images. I recommend selecting its value from the 80-90% range.</li> <li><a href="https://en.wikipedia.org/wiki/PID_controller#Mathematical_form"><code>integration_steps</code></a>: the number of update steps required for an accuracy error of 100% to transform into an augmentation probability increase of 100%. To give an intuition, this defines how slowly the augmentation probability is changed. 
```python
# "hard sigmoid", useful for binary accuracy calculation from logits
def step(values):
    # negative values -> 0.0, positive values -> 1.0
    return 0.5 * (1.0 + ops.sign(values))


# augments images with a probability that is dynamically updated during training
class AdaptiveAugmenter(keras.Model):
    def __init__(self):
        super().__init__()

        # stores the current probability of an image being augmented
        self.probability = keras.Variable(0.0)
        self.seed_generator = keras.random.SeedGenerator(42)

        # the corresponding augmentation names from the paper are shown above each layer
        # the authors show (see figure 4), that the blitting and geometric augmentations
        # are the most helpful in the low-data regime
        self.augmenter = keras.Sequential(
            [
                layers.InputLayer(input_shape=(image_size, image_size, 3)),
                # blitting/x-flip:
                layers.RandomFlip("horizontal"),
class="s2">&quot;horizontal&quot;</span><span class="p">),</span> <span class="c1"># blitting/integer translation:</span> <span class="n">layers</span><span class="o">.</span><span class="n">RandomTranslation</span><span class="p">(</span> <span class="n">height_factor</span><span class="o">=</span><span class="n">max_translation</span><span class="p">,</span> <span class="n">width_factor</span><span class="o">=</span><span class="n">max_translation</span><span class="p">,</span> <span class="n">interpolation</span><span class="o">=</span><span class="s2">&quot;nearest&quot;</span><span class="p">,</span> <span class="p">),</span> <span class="c1"># geometric/rotation:</span> <span class="n">layers</span><span class="o">.</span><span class="n">RandomRotation</span><span class="p">(</span><span class="n">factor</span><span class="o">=</span><span class="n">max_rotation</span><span class="p">),</span> <span class="c1"># geometric/isotropic and anisotropic scaling:</span> <span class="n">layers</span><span class="o">.</span><span class="n">RandomZoom</span><span class="p">(</span> <span class="n">height_factor</span><span class="o">=</span><span class="p">(</span><span class="o">-</span><span class="n">max_zoom</span><span class="p">,</span> <span class="mf">0.0</span><span class="p">),</span> <span class="n">width_factor</span><span class="o">=</span><span class="p">(</span><span class="o">-</span><span class="n">max_zoom</span><span class="p">,</span> <span class="mf">0.0</span><span class="p">)</span> <span class="p">),</span> <span class="p">],</span> <span class="n">name</span><span class="o">=</span><span class="s2">&quot;adaptive_augmenter&quot;</span><span class="p">,</span> <span class="p">)</span> <span class="k">def</span><span class="w"> </span><span class="nf">call</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">images</span><span class="p">,</span> <span class="n">training</span><span class="p">):</span> <span class="k">if</span> <span class="n">training</span><span class="p">:</span> <span class="n">augmented_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmenter</span><span class="p">(</span><span class="n">images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="n">training</span><span class="p">)</span> <span class="c1"># during training either the original or the augmented images are selected</span> <span class="c1"># based on self.probability</span> <span class="n">augmentation_values</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">uniform</span><span class="p">(</span> <span class="n">shape</span><span class="o">=</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span> <span class="n">seed</span><span class="o">=</span><span class="bp">self</span><span class="o">.</span><span class="n">seed_generator</span> <span class="p">)</span> <span class="n">augmentation_bools</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">less</span><span class="p">(</span><span class="n">augmentation_values</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span 
class="n">probability</span><span class="p">)</span> <span class="n">images</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">where</span><span class="p">(</span><span class="n">augmentation_bools</span><span class="p">,</span> <span class="n">augmented_images</span><span class="p">,</span> <span class="n">images</span><span class="p">)</span> <span class="k">return</span> <span class="n">images</span> <span class="k">def</span><span class="w"> </span><span class="nf">update</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">real_logits</span><span class="p">):</span> <span class="n">current_accuracy</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">step</span><span class="p">(</span><span class="n">real_logits</span><span class="p">))</span> <span class="c1"># the augmentation probability is updated based on the discriminator&#39;s</span> <span class="c1"># accuracy on real images</span> <span class="n">accuracy_error</span> <span class="o">=</span> <span class="n">current_accuracy</span> <span class="o">-</span> <span class="n">target_accuracy</span> <span class="bp">self</span><span class="o">.</span><span class="n">probability</span><span class="o">.</span><span class="n">assign</span><span class="p">(</span> <span class="n">ops</span><span class="o">.</span><span class="n">clip</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">probability</span> <span class="o">+</span> <span class="n">accuracy_error</span> <span class="o">/</span> <span class="n">integration_steps</span><span class="p">,</span> <span class="mf">0.0</span><span class="p">,</span> <span class="mf">1.0</span><span class="p">)</span> <span class="p">)</span> </code></pre></div> <hr /> <h2 id="network-architecture">Network architecture</h2> <p>Here we specify the architecture of the two networks:</p> <ul> <li>generator: maps a random vector to an image, which should be as realistic as possible</li> <li>discriminator: maps an image to a scalar score, which should be high for real and low for generated images</li> </ul> <p>GANs tend to be sensitive to the network architecture, I implemented a DCGAN architecture in this example, because it is relatively stable during training while being simple to implement. We use a constant number of filters throughout the network, use a sigmoid instead of tanh in the last layer of the generator, and use default initialization instead of random normal as further simplifications.</p> <p>As a good practice, we disable the learnable scale parameter in the batch normalization layers, because on one hand the following relu + convolutional layers make it redundant (as noted in the <a href="https://keras.io/api/layers/normalization_layers/batch_normalization/">documentation</a>). But also because it should be disabled based on theory when using <a href="https://arxiv.org/abs/1802.05957">spectral normalization (section 4.1)</a>, which is not used here, but is common in GANs. 
```python
# DCGAN generator
def get_generator():
    noise_input = keras.Input(shape=(noise_size,))
    x = layers.Dense(4 * 4 * width, use_bias=False)(noise_input)
    x = layers.BatchNormalization(scale=False)(x)
    x = layers.ReLU()(x)
    x = layers.Reshape(target_shape=(4, 4, width))(x)
    for _ in range(depth - 1):
        x = layers.Conv2DTranspose(
            width,
            kernel_size=4,
            strides=2,
            padding="same",
            use_bias=False,
        )(x)
        x = layers.BatchNormalization(scale=False)(x)
        x = layers.ReLU()(x)
class="n">x</span><span class="p">)</span> <span class="n">image_output</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">Conv2DTranspose</span><span class="p">(</span> <span class="mi">3</span><span class="p">,</span> <span class="n">kernel_size</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">strides</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span> <span class="n">padding</span><span class="o">=</span><span class="s2">&quot;same&quot;</span><span class="p">,</span> <span class="n">activation</span><span class="o">=</span><span class="s2">&quot;sigmoid&quot;</span><span class="p">,</span> <span class="p">)(</span><span class="n">x</span><span class="p">)</span> <span class="k">return</span> <span class="n">keras</span><span class="o">.</span><span class="n">Model</span><span class="p">(</span><span class="n">noise_input</span><span class="p">,</span> <span class="n">image_output</span><span class="p">,</span> <span class="n">name</span><span class="o">=</span><span class="s2">&quot;generator&quot;</span><span class="p">)</span> <span class="c1"># DCGAN discriminator</span> <span class="k">def</span><span class="w"> </span><span class="nf">get_discriminator</span><span class="p">():</span> <span class="n">image_input</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">Input</span><span class="p">(</span><span class="n">shape</span><span class="o">=</span><span class="p">(</span><span class="n">image_size</span><span class="p">,</span> <span class="n">image_size</span><span class="p">,</span> <span class="mi">3</span><span class="p">))</span> <span class="n">x</span> <span class="o">=</span> <span class="n">image_input</span> <span class="k">for</span> <span class="n">_</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">depth</span><span class="p">):</span> <span class="n">x</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">Conv2D</span><span class="p">(</span> <span class="n">width</span><span class="p">,</span> <span class="n">kernel_size</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">strides</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span> <span class="n">padding</span><span class="o">=</span><span class="s2">&quot;same&quot;</span><span class="p">,</span> <span class="n">use_bias</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="p">)(</span><span class="n">x</span><span class="p">)</span> <span class="n">x</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">BatchNormalization</span><span class="p">(</span><span class="n">scale</span><span class="o">=</span><span class="kc">False</span><span class="p">)(</span><span class="n">x</span><span class="p">)</span> <span class="n">x</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">LeakyReLU</span><span class="p">(</span><span class="n">alpha</span><span class="o">=</span><span class="n">leaky_relu_slope</span><span class="p">)(</span><span class="n">x</span><span class="p">)</span> <span class="n">x</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">Flatten</span><span class="p">()(</span><span 
class="n">x</span><span class="p">)</span> <span class="n">x</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">Dropout</span><span class="p">(</span><span class="n">dropout_rate</span><span class="p">)(</span><span class="n">x</span><span class="p">)</span> <span class="n">output_score</span> <span class="o">=</span> <span class="n">layers</span><span class="o">.</span><span class="n">Dense</span><span class="p">(</span><span class="mi">1</span><span class="p">)(</span><span class="n">x</span><span class="p">)</span> <span class="k">return</span> <span class="n">keras</span><span class="o">.</span><span class="n">Model</span><span class="p">(</span><span class="n">image_input</span><span class="p">,</span> <span class="n">output_score</span><span class="p">,</span> <span class="n">name</span><span class="o">=</span><span class="s2">&quot;discriminator&quot;</span><span class="p">)</span> </code></pre></div> <hr /> <h2 id="gan-model">GAN model</h2> <div class="codehilite"><pre><span></span><code><span class="k">class</span><span class="w"> </span><span class="nc">GAN_ADA</span><span class="p">(</span><span class="n">keras</span><span class="o">.</span><span class="n">Model</span><span class="p">):</span> <span class="k">def</span><span class="w"> </span><span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span> <span class="nb">super</span><span class="p">()</span><span class="o">.</span><span class="fm">__init__</span><span class="p">()</span> <span class="bp">self</span><span class="o">.</span><span class="n">seed_generator</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">SeedGenerator</span><span class="p">(</span><span class="n">seed</span><span class="o">=</span><span class="mi">42</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmenter</span> <span class="o">=</span> <span class="n">AdaptiveAugmenter</span><span class="p">()</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator</span> <span class="o">=</span> <span class="n">get_generator</span><span class="p">()</span> <span class="bp">self</span><span class="o">.</span><span class="n">ema_generator</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">models</span><span class="o">.</span><span class="n">clone_model</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">generator</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator</span> <span class="o">=</span> <span class="n">get_discriminator</span><span class="p">()</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator</span><span class="o">.</span><span class="n">summary</span><span class="p">()</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator</span><span class="o">.</span><span class="n">summary</span><span class="p">()</span> <span class="c1"># we have created all layers at this point, so we can mark the model</span> <span class="c1"># as having been built</span> <span class="bp">self</span><span class="o">.</span><span class="n">built</span> <span class="o">=</span> <span class="kc">True</span> <span class="k">def</span><span class="w"> </span><span 
class="nf">compile</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">generator_optimizer</span><span class="p">,</span> <span class="n">discriminator_optimizer</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span> <span class="nb">super</span><span class="p">()</span><span class="o">.</span><span class="n">compile</span><span class="p">(</span><span class="o">**</span><span class="n">kwargs</span><span class="p">)</span> <span class="c1"># separate optimizers for the two networks</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator_optimizer</span> <span class="o">=</span> <span class="n">generator_optimizer</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator_optimizer</span> <span class="o">=</span> <span class="n">discriminator_optimizer</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator_loss_tracker</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">metrics</span><span class="o">.</span><span class="n">Mean</span><span class="p">(</span><span class="n">name</span><span class="o">=</span><span class="s2">&quot;g_loss&quot;</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator_loss_tracker</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">metrics</span><span class="o">.</span><span class="n">Mean</span><span class="p">(</span><span class="n">name</span><span class="o">=</span><span class="s2">&quot;d_loss&quot;</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">real_accuracy</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">metrics</span><span class="o">.</span><span class="n">BinaryAccuracy</span><span class="p">(</span><span class="n">name</span><span class="o">=</span><span class="s2">&quot;real_acc&quot;</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">generated_accuracy</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">metrics</span><span class="o">.</span><span class="n">BinaryAccuracy</span><span class="p">(</span><span class="n">name</span><span class="o">=</span><span class="s2">&quot;gen_acc&quot;</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmentation_probability_tracker</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">metrics</span><span class="o">.</span><span class="n">Mean</span><span class="p">(</span><span class="n">name</span><span class="o">=</span><span class="s2">&quot;aug_p&quot;</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid</span> <span class="o">=</span> <span class="n">KID</span><span class="p">()</span> <span class="nd">@property</span> <span class="k">def</span><span class="w"> </span><span class="nf">metrics</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span> <span class="k">return</span> <span class="p">[</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator_loss_tracker</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span 
class="n">discriminator_loss_tracker</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">real_accuracy</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">generated_accuracy</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmentation_probability_tracker</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid</span><span class="p">,</span> <span class="p">]</span> <span class="k">def</span><span class="w"> </span><span class="nf">generate</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">batch_size</span><span class="p">,</span> <span class="n">training</span><span class="p">):</span> <span class="n">latent_samples</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">normal</span><span class="p">(</span> <span class="n">shape</span><span class="o">=</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">noise_size</span><span class="p">),</span> <span class="n">seed</span><span class="o">=</span><span class="bp">self</span><span class="o">.</span><span class="n">seed_generator</span> <span class="p">)</span> <span class="c1"># use ema_generator during inference</span> <span class="k">if</span> <span class="n">training</span><span class="p">:</span> <span class="n">generated_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator</span><span class="p">(</span><span class="n">latent_samples</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="n">training</span><span class="p">)</span> <span class="k">else</span><span class="p">:</span> <span class="n">generated_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">ema_generator</span><span class="p">(</span><span class="n">latent_samples</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="n">training</span><span class="p">)</span> <span class="k">return</span> <span class="n">generated_images</span> <span class="k">def</span><span class="w"> </span><span class="nf">adversarial_loss</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">real_logits</span><span class="p">,</span> <span class="n">generated_logits</span><span class="p">):</span> <span class="c1"># this is usually called the non-saturating GAN loss</span> <span class="n">real_labels</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">ones</span><span class="p">(</span><span class="n">shape</span><span class="o">=</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">1</span><span class="p">))</span> <span class="n">generated_labels</span> <span class="o">=</span> <span class="n">ops</span><span class="o">.</span><span class="n">zeros</span><span class="p">(</span><span class="n">shape</span><span class="o">=</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">1</span><span class="p">))</span> <span class="c1"># the generator tries to produce images that the discriminator considers as real</span> <span 
class="n">generator_loss</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">losses</span><span class="o">.</span><span class="n">binary_crossentropy</span><span class="p">(</span> <span class="n">real_labels</span><span class="p">,</span> <span class="n">generated_logits</span><span class="p">,</span> <span class="n">from_logits</span><span class="o">=</span><span class="kc">True</span> <span class="p">)</span> <span class="c1"># the discriminator tries to determine if images are real or generated</span> <span class="n">discriminator_loss</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">losses</span><span class="o">.</span><span class="n">binary_crossentropy</span><span class="p">(</span> <span class="n">ops</span><span class="o">.</span><span class="n">concatenate</span><span class="p">([</span><span class="n">real_labels</span><span class="p">,</span> <span class="n">generated_labels</span><span class="p">],</span> <span class="n">axis</span><span class="o">=</span><span class="mi">0</span><span class="p">),</span> <span class="n">ops</span><span class="o">.</span><span class="n">concatenate</span><span class="p">([</span><span class="n">real_logits</span><span class="p">,</span> <span class="n">generated_logits</span><span class="p">],</span> <span class="n">axis</span><span class="o">=</span><span class="mi">0</span><span class="p">),</span> <span class="n">from_logits</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="p">)</span> <span class="k">return</span> <span class="n">ops</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">generator_loss</span><span class="p">),</span> <span class="n">ops</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">discriminator_loss</span><span class="p">)</span> <span class="k">def</span><span class="w"> </span><span class="nf">train_step</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">real_images</span><span class="p">):</span> <span class="n">real_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmenter</span><span class="p">(</span><span class="n">real_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="c1"># use persistent gradient tape because gradients will be calculated twice</span> <span class="k">with</span> <span class="n">tf</span><span class="o">.</span><span class="n">GradientTape</span><span class="p">(</span><span class="n">persistent</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="k">as</span> <span class="n">tape</span><span class="p">:</span> <span class="n">generated_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">generate</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="c1"># gradient is calculated through the image augmentation</span> <span class="n">generated_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmenter</span><span class="p">(</span><span 
class="n">generated_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="c1"># separate forward passes for the real and generated images, meaning</span> <span class="c1"># that batch normalization is applied separately</span> <span class="n">real_logits</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator</span><span class="p">(</span><span class="n">real_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="n">generated_logits</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator</span><span class="p">(</span><span class="n">generated_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span> <span class="n">generator_loss</span><span class="p">,</span> <span class="n">discriminator_loss</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">adversarial_loss</span><span class="p">(</span> <span class="n">real_logits</span><span class="p">,</span> <span class="n">generated_logits</span> <span class="p">)</span> <span class="c1"># calculate gradients and update weights</span> <span class="n">generator_gradients</span> <span class="o">=</span> <span class="n">tape</span><span class="o">.</span><span class="n">gradient</span><span class="p">(</span> <span class="n">generator_loss</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator</span><span class="o">.</span><span class="n">trainable_weights</span> <span class="p">)</span> <span class="n">discriminator_gradients</span> <span class="o">=</span> <span class="n">tape</span><span class="o">.</span><span class="n">gradient</span><span class="p">(</span> <span class="n">discriminator_loss</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator</span><span class="o">.</span><span class="n">trainable_weights</span> <span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator_optimizer</span><span class="o">.</span><span class="n">apply_gradients</span><span class="p">(</span> <span class="nb">zip</span><span class="p">(</span><span class="n">generator_gradients</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator</span><span class="o">.</span><span class="n">trainable_weights</span><span class="p">)</span> <span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator_optimizer</span><span class="o">.</span><span class="n">apply_gradients</span><span class="p">(</span> <span class="nb">zip</span><span class="p">(</span><span class="n">discriminator_gradients</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator</span><span class="o">.</span><span class="n">trainable_weights</span><span class="p">)</span> <span class="p">)</span> <span class="c1"># update the augmentation probability based on the discriminator&#39;s performance</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmenter</span><span class="o">.</span><span class="n">update</span><span 
class="p">(</span><span class="n">real_logits</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator_loss_tracker</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="n">generator_loss</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">discriminator_loss_tracker</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="n">discriminator_loss</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">real_accuracy</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="mf">1.0</span><span class="p">,</span> <span class="n">step</span><span class="p">(</span><span class="n">real_logits</span><span class="p">))</span> <span class="bp">self</span><span class="o">.</span><span class="n">generated_accuracy</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="mf">0.0</span><span class="p">,</span> <span class="n">step</span><span class="p">(</span><span class="n">generated_logits</span><span class="p">))</span> <span class="bp">self</span><span class="o">.</span><span class="n">augmentation_probability_tracker</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">augmenter</span><span class="o">.</span><span class="n">probability</span><span class="p">)</span> <span class="c1"># track the exponential moving average of the generator&#39;s weights to decrease</span> <span class="c1"># variance in the generation quality</span> <span class="k">for</span> <span class="n">weight</span><span class="p">,</span> <span class="n">ema_weight</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span> <span class="bp">self</span><span class="o">.</span><span class="n">generator</span><span class="o">.</span><span class="n">weights</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">ema_generator</span><span class="o">.</span><span class="n">weights</span> <span class="p">):</span> <span class="n">ema_weight</span><span class="o">.</span><span class="n">assign</span><span class="p">(</span><span class="n">ema</span> <span class="o">*</span> <span class="n">ema_weight</span> <span class="o">+</span> <span class="p">(</span><span class="mi">1</span> <span class="o">-</span> <span class="n">ema</span><span class="p">)</span> <span class="o">*</span> <span class="n">weight</span><span class="p">)</span> <span class="c1"># KID is not measured during the training phase for computational efficiency</span> <span class="k">return</span> <span class="p">{</span><span class="n">m</span><span class="o">.</span><span class="n">name</span><span class="p">:</span> <span class="n">m</span><span class="o">.</span><span class="n">result</span><span class="p">()</span> <span class="k">for</span> <span class="n">m</span> <span class="ow">in</span> <span class="bp">self</span><span class="o">.</span><span class="n">metrics</span><span class="p">[:</span><span class="o">-</span><span class="mi">1</span><span class="p">]}</span> <span class="k">def</span><span class="w"> </span><span class="nf">test_step</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">real_images</span><span 
class="p">):</span> <span class="n">generated_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">generate</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">False</span><span class="p">)</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid</span><span class="o">.</span><span class="n">update_state</span><span class="p">(</span><span class="n">real_images</span><span class="p">,</span> <span class="n">generated_images</span><span class="p">)</span> <span class="c1"># only KID is measured during the evaluation phase for computational efficiency</span> <span class="k">return</span> <span class="p">{</span><span class="bp">self</span><span class="o">.</span><span class="n">kid</span><span class="o">.</span><span class="n">name</span><span class="p">:</span> <span class="bp">self</span><span class="o">.</span><span class="n">kid</span><span class="o">.</span><span class="n">result</span><span class="p">()}</span> <span class="k">def</span><span class="w"> </span><span class="nf">plot_images</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">epoch</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="n">logs</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="n">num_rows</span><span class="o">=</span><span class="mi">3</span><span class="p">,</span> <span class="n">num_cols</span><span class="o">=</span><span class="mi">6</span><span class="p">,</span> <span class="n">interval</span><span class="o">=</span><span class="mi">5</span><span class="p">):</span> <span class="c1"># plot random generated images for visual evaluation of generation quality</span> <span class="k">if</span> <span class="n">epoch</span> <span class="ow">is</span> <span class="kc">None</span> <span class="ow">or</span> <span class="p">(</span><span class="n">epoch</span> <span class="o">+</span> <span class="mi">1</span><span class="p">)</span> <span class="o">%</span> <span class="n">interval</span> <span class="o">==</span> <span class="mi">0</span><span class="p">:</span> <span class="n">num_images</span> <span class="o">=</span> <span class="n">num_rows</span> <span class="o">*</span> <span class="n">num_cols</span> <span class="n">generated_images</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">generate</span><span class="p">(</span><span class="n">num_images</span><span class="p">,</span> <span class="n">training</span><span class="o">=</span><span class="kc">False</span><span class="p">)</span> <span class="n">plt</span><span class="o">.</span><span class="n">figure</span><span class="p">(</span><span class="n">figsize</span><span class="o">=</span><span class="p">(</span><span class="n">num_cols</span> <span class="o">*</span> <span class="mf">2.0</span><span class="p">,</span> <span class="n">num_rows</span> <span class="o">*</span> <span class="mf">2.0</span><span class="p">))</span> <span class="k">for</span> <span class="n">row</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">num_rows</span><span class="p">):</span> <span class="k">for</span> <span class="n">col</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">num_cols</span><span 
class="p">):</span> <span class="n">index</span> <span class="o">=</span> <span class="n">row</span> <span class="o">*</span> <span class="n">num_cols</span> <span class="o">+</span> <span class="n">col</span> <span class="n">plt</span><span class="o">.</span><span class="n">subplot</span><span class="p">(</span><span class="n">num_rows</span><span class="p">,</span> <span class="n">num_cols</span><span class="p">,</span> <span class="n">index</span> <span class="o">+</span> <span class="mi">1</span><span class="p">)</span> <span class="n">plt</span><span class="o">.</span><span class="n">imshow</span><span class="p">(</span><span class="n">generated_images</span><span class="p">[</span><span class="n">index</span><span class="p">])</span> <span class="n">plt</span><span class="o">.</span><span class="n">axis</span><span class="p">(</span><span class="s2">&quot;off&quot;</span><span class="p">)</span> <span class="n">plt</span><span class="o">.</span><span class="n">tight_layout</span><span class="p">()</span> <span class="n">plt</span><span class="o">.</span><span class="n">show</span><span class="p">()</span> <span class="n">plt</span><span class="o">.</span><span class="n">close</span><span class="p">()</span> </code></pre></div> <hr /> <h2 id="training">Training</h2> <p>One can should see from the metrics during training, that if the real accuracy (discriminator's accuracy on real images) is below the target accuracy, the augmentation probability is increased, and vice versa. In my experience, during a healthy GAN training, the discriminator accuracy should stay in the 80-95% range. Below that, the discriminator is too weak, above that it is too strong.</p> <p>Note that we track the exponential moving average of the generator's weights, and use that for image generation and KID evaluation.</p> <div class="codehilite"><pre><span></span><code><span class="c1"># create and compile the model</span> <span class="n">model</span> <span class="o">=</span> <span class="n">GAN_ADA</span><span class="p">()</span> <span class="n">model</span><span class="o">.</span><span class="n">compile</span><span class="p">(</span> <span class="n">generator_optimizer</span><span class="o">=</span><span class="n">keras</span><span class="o">.</span><span class="n">optimizers</span><span class="o">.</span><span class="n">Adam</span><span class="p">(</span><span class="n">learning_rate</span><span class="p">,</span> <span class="n">beta_1</span><span class="p">),</span> <span class="n">discriminator_optimizer</span><span class="o">=</span><span class="n">keras</span><span class="o">.</span><span class="n">optimizers</span><span class="o">.</span><span class="n">Adam</span><span class="p">(</span><span class="n">learning_rate</span><span class="p">,</span> <span class="n">beta_1</span><span class="p">),</span> <span class="p">)</span> <span class="c1"># save the best model based on the validation KID metric</span> <span class="n">checkpoint_path</span> <span class="o">=</span> <span class="s2">&quot;gan_model.weights.h5&quot;</span> <span class="n">checkpoint_callback</span> <span class="o">=</span> <span class="n">keras</span><span class="o">.</span><span class="n">callbacks</span><span class="o">.</span><span class="n">ModelCheckpoint</span><span class="p">(</span> <span class="n">filepath</span><span class="o">=</span><span class="n">checkpoint_path</span><span class="p">,</span> <span class="n">save_weights_only</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span 
class="n">monitor</span><span class="o">=</span><span class="s2">&quot;val_kid&quot;</span><span class="p">,</span> <span class="n">mode</span><span class="o">=</span><span class="s2">&quot;min&quot;</span><span class="p">,</span> <span class="n">save_best_only</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="p">)</span> <span class="c1"># run training and plot generated images periodically</span> <span class="n">model</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span> <span class="n">train_dataset</span><span class="p">,</span> <span class="n">epochs</span><span class="o">=</span><span class="n">num_epochs</span><span class="p">,</span> <span class="n">validation_data</span><span class="o">=</span><span class="n">val_dataset</span><span class="p">,</span> <span class="n">callbacks</span><span class="o">=</span><span class="p">[</span> <span class="n">keras</span><span class="o">.</span><span class="n">callbacks</span><span class="o">.</span><span class="n">LambdaCallback</span><span class="p">(</span><span class="n">on_epoch_end</span><span class="o">=</span><span class="n">model</span><span class="o">.</span><span class="n">plot_images</span><span class="p">),</span> <span class="n">checkpoint_callback</span><span class="p">,</span> <span class="p">],</span> <span class="p">)</span> </code></pre></div> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>/usr/local/lib/python3.11/dist-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead. warnings.warn( /usr/local/lib/python3.11/dist-packages/keras/src/layers/activations/leaky_relu.py:41: UserWarning: Argument `alpha` is deprecated. Use `negative_slope` instead. 
warnings.warn( </code></pre></div> </div> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace"><span style="font-weight: bold">Model: "generator"</span> </pre> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃<span style="font-weight: bold"> Layer (type) </span>┃<span style="font-weight: bold"> Output Shape </span>┃<span style="font-weight: bold"> Param # </span>┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ input_layer_1 (<span style="color: #0087ff; text-decoration-color: #0087ff">InputLayer</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">64</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ dense (<span style="color: #0087ff; text-decoration-color: #0087ff">Dense</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">2048</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">131,072</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">2048</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">6,144</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ re_lu (<span style="color: #0087ff; text-decoration-color: #0087ff">ReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">2048</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ reshape (<span style="color: #0087ff; text-decoration-color: #0087ff">Reshape</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_transpose │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">262,144</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2DTranspose</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_1 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, 
<span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ re_lu_1 (<span style="color: #0087ff; text-decoration-color: #0087ff">ReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_transpose_1 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">262,144</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2DTranspose</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_2 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ re_lu_2 (<span style="color: #0087ff; text-decoration-color: #0087ff">ReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_transpose_2 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">262,144</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2DTranspose</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_3 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span 
style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ re_lu_3 (<span style="color: #0087ff; text-decoration-color: #0087ff">ReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_transpose_3 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">64</span>, <span style="color: #00af00; text-decoration-color: #00af00">64</span>, <span style="color: #00af00; text-decoration-color: #00af00">3</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">6,147</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2DTranspose</span>) │ │ │ └─────────────────────────────────┴────────────────────────┴───────────────┘ </pre> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace"><span style="font-weight: bold"> Total params: </span><span style="color: #00af00; text-decoration-color: #00af00">930,947</span> (3.55 MB) </pre> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace"><span style="font-weight: bold"> Trainable params: </span><span style="color: #00af00; text-decoration-color: #00af00">926,083</span> (3.53 MB) </pre> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace"><span style="font-weight: bold"> Non-trainable params: </span><span style="color: #00af00; text-decoration-color: #00af00">4,864</span> (19.00 KB) </pre> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace"><span style="font-weight: bold">Model: "discriminator"</span> </pre> <pre style="white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃<span style="font-weight: bold"> Layer (type) </span>┃<span style="font-weight: bold"> Output Shape </span>┃<span style="font-weight: bold"> Param # </span>┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ input_layer_2 (<span style="color: #0087ff; text-decoration-color: #0087ff">InputLayer</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">64</span>, <span style="color: #00af00; text-decoration-color: #00af00">64</span>, <span style="color: #00af00; text-decoration-color: #00af00">3</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2D</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span 
style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">6,144</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_4 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ leaky_re_lu (<span style="color: #0087ff; text-decoration-color: #0087ff">LeakyReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">32</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_1 (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2D</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">262,144</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_5 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ leaky_re_lu_1 (<span style="color: #0087ff; text-decoration-color: #0087ff">LeakyReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">16</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_2 (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2D</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">262,144</span> │ 
├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_6 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ leaky_re_lu_2 (<span style="color: #0087ff; text-decoration-color: #0087ff">LeakyReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">8</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ conv2d_3 (<span style="color: #0087ff; text-decoration-color: #0087ff">Conv2D</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">262,144</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ batch_normalization_7 │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">384</span> │ │ (<span style="color: #0087ff; text-decoration-color: #0087ff">BatchNormalization</span>) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ leaky_re_lu_3 (<span style="color: #0087ff; text-decoration-color: #0087ff">LeakyReLU</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">4</span>, <span style="color: #00af00; text-decoration-color: #00af00">128</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ flatten (<span style="color: #0087ff; text-decoration-color: #0087ff">Flatten</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">2048</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ dropout (<span style="color: #0087ff; text-decoration-color: #0087ff">Dropout</span>) │ (<span style="color: #00d7ff; text-decoration-color: #00d7ff">None</span>, <span style="color: #00af00; text-decoration-color: #00af00">2048</span>) │ <span style="color: #00af00; text-decoration-color: #00af00">0</span> │ 
```
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 1)              │         2,049 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 796,161 (3.04 MB)
 Trainable params: 795,137 (3.03 MB)
 Non-trainable params: 1,024 (4.00 KB)
```

```
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5
87910968/87910968 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
```

```
Epoch 1/10
E0000 00:00:1738798983.901596 17795 meta_optimizer.cc:966] layout failed: INVALID_ARGUMENT: Size of values 0 does not match size of permutation 4 @ fanin shape inStatefulPartitionedCall/gradient_tape/adaptive_augmenter_3/SelectV2_1-1-TransposeNHWCToNCHW-LayoutOptimizer
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1738798987.822990 17861 cuda_solvers.cc:178] Creating GpuSolver handles for stream 0x9f45670
I0000 00:00:1738798988.976919 17862 cuda_dnn.cc:529] Loaded cuDNN version 90300
46/46 ━━━━━━━━━━━━━━━━━━━━ 61s 958ms/step - aug_p: 7.7349e-04 - d_loss: 0.2942 - g_loss: 2.4032 - gen_acc: 0.8890 - real_acc: 0.8616 - val_kid: 9.1841
```
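The InceptionV3 weights downloaded above are used by the KID (Kernel Inception Distance) metric that shows up as `val_kid` in the logs. As a minimal sketch of what that number estimates — an unbiased squared-MMD between real and generated InceptionV3 feature batches with the cubic polynomial kernel from the KID paper — assuming the features are already extracted (the function names here are illustrative, not the tutorial's):

```python
import numpy as np


def polynomial_kernel(a, b):
    # Cubic polynomial kernel k(x, y) = (x . y / d + 1)^3 on feature rows.
    d = a.shape[1]
    return (a @ b.T / d + 1.0) ** 3


def kid_estimate(real_features, generated_features):
    # Unbiased MMD^2: within-set kernel means (diagonals dropped)
    # minus twice the cross-set kernel mean.
    k_rr = polynomial_kernel(real_features, real_features)
    k_gg = polynomial_kernel(generated_features, generated_features)
    k_rg = polynomial_kernel(real_features, generated_features)
    n, m = k_rr.shape[0], k_gg.shape[0]
    mean_rr = (k_rr.sum() - np.trace(k_rr)) / (n * (n - 1))
    mean_gg = (k_gg.sum() - np.trace(k_gg)) / (m * (m - 1))
    return mean_rr + mean_gg - 2.0 * k_rg.mean()
```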
```
Epoch 2/10
46/46 ━━━━━━━━━━━━━━━━━━━━ 43s 280ms/step - aug_p: 0.0065 - d_loss: 0.0774 - g_loss: 4.8293 - gen_acc: 0.9900 - real_acc: 0.9705 - val_kid: 8.8293
```
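Note how `aug_p`, the adaptive augmentation probability, has already crept up from zero: it is integrated upward whenever the discriminator's accuracy on real images (`real_acc`) sits above a target value, and back down when it falls below, which is why it climbs steadily as the discriminator grows confident. A minimal sketch of such an update rule, with illustrative constants that are not necessarily the tutorial's values:

```python
# Illustrative constants; the tutorial's actual values may differ.
target_accuracy = 0.85    # real-image accuracy the discriminator should hover at
integration_steps = 1000  # steps over which a constant error saturates aug_p


def update_probability(probability, real_accuracy):
    # Integrate the accuracy error: persistent overfitting (real_accuracy
    # above target) raises the augmentation probability; underfitting lowers it.
    probability += (real_accuracy - target_accuracy) / integration_steps
    return min(max(probability, 0.0), 1.0)
```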
class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>6/46 ━━━━━━━━━━━━━━━━━━━━ 7s 181ms/step - aug_p: 0.0107 - d_loss: 0.0942 - g_loss: 3.3519 - gen_acc: 0.9869 - real_acc: 0.9679</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>7/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0107 - d_loss: 0.0941 - g_loss: 3.3467 - gen_acc: 0.9877 - real_acc: 0.9690</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>8/46 ━━━━━━━━━━━━━━━━━━━━ 6s 179ms/step - aug_p: 0.0107 - d_loss: 0.0944 - g_loss: 3.3438 - gen_acc: 0.9882 - real_acc: 0.9693</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>9/46 ━━━━━━━━━━━━━━━━━━━━ 6s 179ms/step - aug_p: 0.0107 - d_loss: 0.0947 - g_loss: 3.3384 - gen_acc: 0.9886 - real_acc: 0.9696</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>10/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0108 - d_loss: 0.0948 - g_loss: 3.3468 - gen_acc: 0.9889 - real_acc: 0.9694</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>11/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0108 - d_loss: 0.0952 - g_loss: 3.3443 - gen_acc: 0.9888 - real_acc: 0.9695</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>12/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0108 - d_loss: 0.0955 - g_loss: 3.3676 - gen_acc: 0.9887 - real_acc: 0.9693</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0109 - d_loss: 0.0954 - g_loss: 3.3959 - gen_acc: 0.9888 - real_acc: 0.9693</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0109 - d_loss: 0.0957 - g_loss: 3.4125 - gen_acc: 0.9883 - real_acc: 0.9694</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0109 - d_loss: 0.0963 - g_loss: 3.4419 - gen_acc: 0.9880 - real_acc: 0.9688</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0110 - d_loss: 0.0969 - g_loss: 3.4641 - gen_acc: 0.9876 - real_acc: 0.9684</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0110 - d_loss: 0.0972 - g_loss: 3.4867 - gen_acc: 0.9873 - real_acc: 0.9681</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0110 - d_loss: 0.0975 - g_loss: 3.5046 - gen_acc: 0.9869 - real_acc: 0.9679</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0110 - d_loss: 0.0977 - g_loss: 3.5235 - gen_acc: 0.9866 - real_acc: 0.9678</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 
━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0111 - d_loss: 0.0978 - g_loss: 3.5387 - gen_acc: 0.9863 - real_acc: 0.9677</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0111 - d_loss: 0.0980 - g_loss: 3.5544 - gen_acc: 0.9861 - real_acc: 0.9676</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 179ms/step - aug_p: 0.0111 - d_loss: 0.0983 - g_loss: 3.5646 - gen_acc: 0.9857 - real_acc: 0.9675</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 179ms/step - aug_p: 0.0112 - d_loss: 0.0990 - g_loss: 3.5834 - gen_acc: 0.9853 - real_acc: 0.9670</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0112 - d_loss: 0.0995 - g_loss: 3.6027 - gen_acc: 0.9850 - real_acc: 0.9665</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0112 - d_loss: 0.1001 - g_loss: 3.6171 - gen_acc: 0.9845 - real_acc: 0.9662</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0112 - d_loss: 0.1006 - g_loss: 3.6374 - gen_acc: 0.9840 - real_acc: 0.9659</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0113 - d_loss: 0.1009 - g_loss: 3.6630 - gen_acc: 0.9836 - real_acc: 0.9656</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0113 - d_loss: 0.1012 - g_loss: 3.6907 - gen_acc: 0.9833 - real_acc: 0.9654</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0113 - d_loss: 0.1014 - g_loss: 3.7165 - gen_acc: 0.9830 - real_acc: 0.9652</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 179ms/step - aug_p: 0.0114 - d_loss: 0.1016 - g_loss: 3.7387 - gen_acc: 0.9827 - real_acc: 0.9651</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 179ms/step - aug_p: 0.0114 - d_loss: 0.1016 - g_loss: 3.7601 - gen_acc: 0.9824 - real_acc: 0.9650</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 179ms/step - aug_p: 0.0114 - d_loss: 0.1017 - g_loss: 3.7799 - gen_acc: 0.9822 - real_acc: 0.9649</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 179ms/step - aug_p: 0.0114 - d_loss: 0.1017 - g_loss: 3.7963 - gen_acc: 0.9820 - real_acc: 0.9649</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 ━━━━━━━━━━━━━━━━━━━━ 2s 179ms/step - aug_p: 0.0115 - d_loss: 0.1019 - g_loss: 3.8154 - gen_acc: 0.9818 - real_acc: 
0.9647</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0115 - d_loss: 0.1020 - g_loss: 3.8348 - gen_acc: 0.9816 - real_acc: 0.9645</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0115 - d_loss: 0.1022 - g_loss: 3.8515 - gen_acc: 0.9813 - real_acc: 0.9644</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0115 - d_loss: 0.1025 - g_loss: 3.8702 - gen_acc: 0.9810 - real_acc: 0.9642</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0116 - d_loss: 0.1027 - g_loss: 3.8891 - gen_acc: 0.9807 - real_acc: 0.9640</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0116 - d_loss: 0.1032 - g_loss: 3.9048 - gen_acc: 0.9803 - real_acc: 0.9638</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0116 - d_loss: 0.1036 - g_loss: 3.9227 - gen_acc: 0.9799 - real_acc: 0.9636</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0117 - d_loss: 0.1040 - g_loss: 3.9415 - gen_acc: 0.9796 - real_acc: 0.9633</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0117 - d_loss: 0.1044 - g_loss: 3.9588 - gen_acc: 0.9792 - real_acc: 0.9631</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0117 - d_loss: 0.1048 - g_loss: 3.9748 - gen_acc: 0.9789 - real_acc: 0.9629</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0117 - d_loss: 0.1052 - g_loss: 3.9895 - gen_acc: 0.9785 - real_acc: 0.9627</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0118 - d_loss: 0.1055 - g_loss: 4.0041 - gen_acc: 0.9782 - real_acc: 0.9626</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0118 - d_loss: 0.1058 - g_loss: 4.0177 - gen_acc: 0.9779 - real_acc: 0.9624</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 22s 315ms/step - aug_p: 0.0118 - d_loss: 0.1061 - g_loss: 4.0306 - gen_acc: 0.9776 - real_acc: 0.9623 - val_kid: 8.4585</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>Epoch 4/10 </code></pre></div> </div> <p>1/46 ━━━━━━━━━━━━━━━━━━━━ 11s 263ms/step - aug_p: 0.0154 - d_loss: 0.1223 - g_loss: 2.5203 - gen_acc: 0.9688 - real_acc: 1.0000</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> 
```
Epoch 4/10
46/46 ━━━━━━━━━━━━━━━━━━━━ 14s 316ms/step - aug_p: 0.0169 - d_loss: 0.0829 - g_loss: 4.1493 - gen_acc: 0.9863 - real_acc: 0.9817 - val_kid: 6.6764
```
class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 6s 183ms/step - aug_p: 0.0215 - d_loss: 0.1717 - g_loss: 7.2895 - gen_acc: 0.9548 - real_acc: 0.9103</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 183ms/step - aug_p: 0.0215 - d_loss: 0.1693 - g_loss: 7.1489 - gen_acc: 0.9544 - real_acc: 0.9132</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0215 - d_loss: 0.1674 - g_loss: 7.0344 - gen_acc: 0.9543 - real_acc: 0.9153</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0215 - d_loss: 0.1654 - g_loss: 6.9321 - gen_acc: 0.9544 - real_acc: 0.9173</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0216 - d_loss: 0.1637 - g_loss: 6.8304 - gen_acc: 0.9541 - real_acc: 0.9191</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 183ms/step - aug_p: 0.0216 - d_loss: 0.1620 - g_loss: 6.7449 - gen_acc: 0.9540 - real_acc: 0.9209</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 183ms/step - aug_p: 0.0216 - d_loss: 0.1603 - g_loss: 6.6702 - gen_acc: 0.9540 - real_acc: 0.9225</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0217 - d_loss: 0.1587 - g_loss: 6.5977 - gen_acc: 0.9541 - real_acc: 0.9240</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0217 - d_loss: 0.1572 - g_loss: 6.5271 - gen_acc: 0.9542 - real_acc: 0.9255</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0217 - d_loss: 0.1556 - g_loss: 6.4626 - gen_acc: 0.9544 - real_acc: 0.9269</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0217 - d_loss: 0.1540 - g_loss: 6.4028 - gen_acc: 0.9546 - real_acc: 0.9282</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0218 - d_loss: 0.1525 - g_loss: 6.3440 - gen_acc: 0.9548 - real_acc: 0.9295</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 183ms/step - aug_p: 0.0218 - d_loss: 0.1510 - g_loss: 6.2898 - gen_acc: 0.9551 - real_acc: 0.9307</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 183ms/step - aug_p: 0.0218 - d_loss: 0.1495 - g_loss: 6.2380 - gen_acc: 0.9554 - real_acc: 0.9318</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 183ms/step - aug_p: 
0.0219 - d_loss: 0.1481 - g_loss: 6.1880 - gen_acc: 0.9558 - real_acc: 0.9330</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 182ms/step - aug_p: 0.0219 - d_loss: 0.1468 - g_loss: 6.1413 - gen_acc: 0.9561 - real_acc: 0.9340</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 183ms/step - aug_p: 0.0219 - d_loss: 0.1454 - g_loss: 6.0966 - gen_acc: 0.9565 - real_acc: 0.9350</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0220 - d_loss: 0.1441 - g_loss: 6.0534 - gen_acc: 0.9569 - real_acc: 0.9360</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 183ms/step - aug_p: 0.0220 - d_loss: 0.1428 - g_loss: 6.0122 - gen_acc: 0.9573 - real_acc: 0.9370</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0220 - d_loss: 0.1415 - g_loss: 5.9738 - gen_acc: 0.9577 - real_acc: 0.9379</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0220 - d_loss: 0.1403 - g_loss: 5.9369 - gen_acc: 0.9581 - real_acc: 0.9388</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0221 - d_loss: 0.1390 - g_loss: 5.9020 - gen_acc: 0.9585 - real_acc: 0.9396</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0221 - d_loss: 0.1378 - g_loss: 5.8680 - gen_acc: 0.9589 - real_acc: 0.9404</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0221 - d_loss: 0.1366 - g_loss: 5.8355 - gen_acc: 0.9592 - real_acc: 0.9412</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0222 - d_loss: 0.1355 - g_loss: 5.8042 - gen_acc: 0.9596 - real_acc: 0.9420</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0222 - d_loss: 0.1344 - g_loss: 5.7737 - gen_acc: 0.9600 - real_acc: 0.9427</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0222 - d_loss: 0.1333 - g_loss: 5.7447 - gen_acc: 0.9604 - real_acc: 0.9434</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0223 - d_loss: 0.1323 - g_loss: 5.7161 - gen_acc: 0.9608 - real_acc: 0.9441</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0223 - d_loss: 0.1313 - g_loss: 5.6892 - gen_acc: 0.9611 - real_acc: 0.9447</p> <div class="k-default-codeblock"> <div 
class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0223 - d_loss: 0.1304 - g_loss: 5.6621 - gen_acc: 0.9615 - real_acc: 0.9453</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0223 - d_loss: 0.1296 - g_loss: 5.6390 - gen_acc: 0.9618 - real_acc: 0.9458</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0224 - d_loss: 0.1288 - g_loss: 5.6185 - gen_acc: 0.9621 - real_acc: 0.9463</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0224 - d_loss: 0.1280 - g_loss: 5.5982 - gen_acc: 0.9623 - real_acc: 0.9468</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0224 - d_loss: 0.1273 - g_loss: 5.5795 - gen_acc: 0.9626 - real_acc: 0.9473</p> <p><img alt="png" src="/img/examples/generative/gan_ada/gan_ada_18_265.png" /></p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 14s 317ms/step - aug_p: 0.0225 - d_loss: 0.1265 - g_loss: 5.5616 - gen_acc: 0.9629 - real_acc: 0.9478 - val_kid: 4.7496</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>Epoch 6/10 </code></pre></div> </div> <p>1/46 ━━━━━━━━━━━━━━━━━━━━ 10s 236ms/step - aug_p: 0.0268 - d_loss: 0.0745 - g_loss: 5.1780 - gen_acc: 0.9922 - real_acc: 0.9688</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>2/46 ━━━━━━━━━━━━━━━━━━━━ 8s 184ms/step - aug_p: 0.0269 - d_loss: 0.0774 - g_loss: 4.5412 - gen_acc: 0.9883 - real_acc: 0.9766 </p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>3/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0269 - d_loss: 0.0743 - g_loss: 4.5406 - gen_acc: 0.9887 - real_acc: 0.9783</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>4/46 ━━━━━━━━━━━━━━━━━━━━ 7s 184ms/step - aug_p: 0.0269 - d_loss: 0.0724 - g_loss: 4.5764 - gen_acc: 0.9896 - real_acc: 0.9779</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>5/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0270 - d_loss: 0.0732 - g_loss: 4.5209 - gen_acc: 0.9882 - real_acc: 0.9785</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>6/46 ━━━━━━━━━━━━━━━━━━━━ 7s 182ms/step - aug_p: 0.0270 - d_loss: 0.0738 - g_loss: 4.5449 - gen_acc: 0.9878 - real_acc: 0.9782</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>7/46 ━━━━━━━━━━━━━━━━━━━━ 7s 182ms/step - aug_p: 0.0270 - d_loss: 0.0747 - g_loss: 4.5880 - gen_acc: 0.9878 - real_acc: 0.9769</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>8/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0271 - d_loss: 0.0766 - g_loss: 4.5791 - gen_acc: 0.9857 - real_acc: 0.9763</p> <div class="k-default-codeblock"> <div 
class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>9/46 ━━━━━━━━━━━━━━━━━━━━ 6s 183ms/step - aug_p: 0.0271 - d_loss: 0.0777 - g_loss: 4.6269 - gen_acc: 0.9844 - real_acc: 0.9757</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>10/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0271 - d_loss: 0.0786 - g_loss: 4.7075 - gen_acc: 0.9836 - real_acc: 0.9749</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>11/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0271 - d_loss: 0.0792 - g_loss: 4.7786 - gen_acc: 0.9826 - real_acc: 0.9745</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>12/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0272 - d_loss: 0.0793 - g_loss: 4.8440 - gen_acc: 0.9820 - real_acc: 0.9744</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 6s 183ms/step - aug_p: 0.0272 - d_loss: 0.0792 - g_loss: 4.9001 - gen_acc: 0.9816 - real_acc: 0.9744</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0272 - d_loss: 0.0789 - g_loss: 4.9354 - gen_acc: 0.9814 - real_acc: 0.9745</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0273 - d_loss: 0.0785 - g_loss: 4.9643 - gen_acc: 0.9813 - real_acc: 0.9747</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0273 - d_loss: 0.0781 - g_loss: 4.9864 - gen_acc: 0.9814 - real_acc: 0.9749</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0273 - d_loss: 0.0778 - g_loss: 4.9973 - gen_acc: 0.9814 - real_acc: 0.9751</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 182ms/step - aug_p: 0.0274 - d_loss: 0.0774 - g_loss: 5.0125 - gen_acc: 0.9815 - real_acc: 0.9753</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0274 - d_loss: 0.0770 - g_loss: 5.0280 - gen_acc: 0.9816 - real_acc: 0.9755</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0274 - d_loss: 0.0765 - g_loss: 5.0398 - gen_acc: 0.9818 - real_acc: 0.9757</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0275 - d_loss: 0.0760 - g_loss: 5.0455 - gen_acc: 0.9819 - real_acc: 0.9759</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 182ms/step - aug_p: 0.0275 - d_loss: 0.0756 - g_loss: 5.0535 - gen_acc: 0.9820 - real_acc: 0.9760</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 181ms/step - aug_p: 
0.0275 - d_loss: 0.0752 - g_loss: 5.0590 - gen_acc: 0.9822 - real_acc: 0.9762</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0276 - d_loss: 0.0749 - g_loss: 5.0595 - gen_acc: 0.9823 - real_acc: 0.9763</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0276 - d_loss: 0.0746 - g_loss: 5.0650 - gen_acc: 0.9825 - real_acc: 0.9764</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0276 - d_loss: 0.0743 - g_loss: 5.0742 - gen_acc: 0.9826 - real_acc: 0.9765</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0277 - d_loss: 0.0740 - g_loss: 5.0823 - gen_acc: 0.9828 - real_acc: 0.9766</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0277 - d_loss: 0.0737 - g_loss: 5.0871 - gen_acc: 0.9829 - real_acc: 0.9767</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0277 - d_loss: 0.0734 - g_loss: 5.0913 - gen_acc: 0.9831 - real_acc: 0.9768</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0278 - d_loss: 0.0731 - g_loss: 5.0957 - gen_acc: 0.9832 - real_acc: 0.9769</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0278 - d_loss: 0.0727 - g_loss: 5.0986 - gen_acc: 0.9834 - real_acc: 0.9770</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0278 - d_loss: 0.0725 - g_loss: 5.0992 - gen_acc: 0.9835 - real_acc: 0.9771</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0278 - d_loss: 0.0722 - g_loss: 5.1012 - gen_acc: 0.9836 - real_acc: 0.9772</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0279 - d_loss: 0.0719 - g_loss: 5.1022 - gen_acc: 0.9838 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 1s 181ms/step - aug_p: 0.0279 - d_loss: 0.0718 - g_loss: 5.1007 - gen_acc: 0.9838 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 181ms/step - aug_p: 0.0279 - d_loss: 0.0717 - g_loss: 5.1026 - gen_acc: 0.9839 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 181ms/step - aug_p: 0.0280 - d_loss: 0.0716 - g_loss: 5.1070 - gen_acc: 0.9840 - real_acc: 0.9772</p> <div class="k-default-codeblock"> <div 
class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 181ms/step - aug_p: 0.0280 - d_loss: 0.0715 - g_loss: 5.1124 - gen_acc: 0.9840 - real_acc: 0.9772</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 181ms/step - aug_p: 0.0280 - d_loss: 0.0714 - g_loss: 5.1178 - gen_acc: 0.9841 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 181ms/step - aug_p: 0.0281 - d_loss: 0.0712 - g_loss: 5.1221 - gen_acc: 0.9842 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 181ms/step - aug_p: 0.0281 - d_loss: 0.0710 - g_loss: 5.1258 - gen_acc: 0.9843 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 181ms/step - aug_p: 0.0281 - d_loss: 0.0708 - g_loss: 5.1290 - gen_acc: 0.9843 - real_acc: 0.9773</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 181ms/step - aug_p: 0.0282 - d_loss: 0.0707 - g_loss: 5.1315 - gen_acc: 0.9844 - real_acc: 0.9774</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 181ms/step - aug_p: 0.0282 - d_loss: 0.0705 - g_loss: 5.1332 - gen_acc: 0.9845 - real_acc: 0.9774</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 181ms/step - aug_p: 0.0282 - d_loss: 0.0703 - g_loss: 5.1347 - gen_acc: 0.9845 - real_acc: 0.9775</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 181ms/step - aug_p: 0.0283 - d_loss: 0.0701 - g_loss: 5.1357 - gen_acc: 0.9846 - real_acc: 0.9775</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 12s 267ms/step - aug_p: 0.0283 - d_loss: 0.0699 - g_loss: 5.1367 - gen_acc: 0.9846 - real_acc: 0.9776 - val_kid: 6.2893</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>Epoch 7/10 </code></pre></div> </div> <p>1/46 ━━━━━━━━━━━━━━━━━━━━ 7s 174ms/step - aug_p: 0.0328 - d_loss: 0.0456 - g_loss: 3.5202 - gen_acc: 1.0000 - real_acc: 1.0000</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>2/46 ━━━━━━━━━━━━━━━━━━━━ 7s 179ms/step - aug_p: 0.0329 - d_loss: 0.0466 - g_loss: 3.7961 - gen_acc: 0.9980 - real_acc: 0.9980</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>3/46 ━━━━━━━━━━━━━━━━━━━━ 7s 179ms/step - aug_p: 0.0329 - d_loss: 0.0471 - g_loss: 3.9462 - gen_acc: 0.9970 - real_acc: 0.9961</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>4/46 ━━━━━━━━━━━━━━━━━━━━ 7s 179ms/step - aug_p: 0.0329 - d_loss: 0.0469 - g_loss: 4.0184 - gen_acc: 0.9967 - real_acc: 0.9946</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>5/46 ━━━━━━━━━━━━━━━━━━━━ 
7s 180ms/step - aug_p: 0.0330 - d_loss: 0.0463 - g_loss: 4.0670 - gen_acc: 0.9968 - real_acc: 0.9941</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>6/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0330 - d_loss: 0.0458 - g_loss: 4.1012 - gen_acc: 0.9969 - real_acc: 0.9938</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>7/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0330 - d_loss: 0.0451 - g_loss: 4.1240 - gen_acc: 0.9970 - real_acc: 0.9937</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>8/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0331 - d_loss: 0.0444 - g_loss: 4.1347 - gen_acc: 0.9971 - real_acc: 0.9938</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>9/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0331 - d_loss: 0.0438 - g_loss: 4.1433 - gen_acc: 0.9971 - real_acc: 0.9937</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>10/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0331 - d_loss: 0.0434 - g_loss: 4.1560 - gen_acc: 0.9970 - real_acc: 0.9936</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>11/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0332 - d_loss: 0.0431 - g_loss: 4.1654 - gen_acc: 0.9969 - real_acc: 0.9936</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>12/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0332 - d_loss: 0.0429 - g_loss: 4.1695 - gen_acc: 0.9969 - real_acc: 0.9935</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0333 - d_loss: 0.0428 - g_loss: 4.1758 - gen_acc: 0.9969 - real_acc: 0.9934</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0333 - d_loss: 0.0427 - g_loss: 4.1789 - gen_acc: 0.9969 - real_acc: 0.9932</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0333 - d_loss: 0.0426 - g_loss: 4.1799 - gen_acc: 0.9970 - real_acc: 0.9929</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0334 - d_loss: 0.0425 - g_loss: 4.1823 - gen_acc: 0.9970 - real_acc: 0.9927</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0334 - d_loss: 0.0425 - g_loss: 4.1836 - gen_acc: 0.9970 - real_acc: 0.9926</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0334 - d_loss: 0.0425 - g_loss: 4.1854 - gen_acc: 0.9971 - real_acc: 0.9923</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0335 - d_loss: 0.0426 - g_loss: 4.1843 - gen_acc: 0.9971 - real_acc: 0.9921</p> <div 
class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0335 - d_loss: 0.0427 - g_loss: 4.1873 - gen_acc: 0.9971 - real_acc: 0.9920</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0335 - d_loss: 0.0427 - g_loss: 4.1927 - gen_acc: 0.9972 - real_acc: 0.9918</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0336 - d_loss: 0.0428 - g_loss: 4.1952 - gen_acc: 0.9972 - real_acc: 0.9916</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0336 - d_loss: 0.0428 - g_loss: 4.2017 - gen_acc: 0.9972 - real_acc: 0.9915</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0336 - d_loss: 0.0428 - g_loss: 4.2106 - gen_acc: 0.9972 - real_acc: 0.9914</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0337 - d_loss: 0.0428 - g_loss: 4.2181 - gen_acc: 0.9972 - real_acc: 0.9913</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0337 - d_loss: 0.0428 - g_loss: 4.2229 - gen_acc: 0.9972 - real_acc: 0.9912</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0337 - d_loss: 0.0429 - g_loss: 4.2318 - gen_acc: 0.9972 - real_acc: 0.9911</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0338 - d_loss: 0.0429 - g_loss: 4.2416 - gen_acc: 0.9972 - real_acc: 0.9910</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0338 - d_loss: 0.0430 - g_loss: 4.2491 - gen_acc: 0.9971 - real_acc: 0.9909</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0338 - d_loss: 0.0430 - g_loss: 4.2604 - gen_acc: 0.9971 - real_acc: 0.9908</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0339 - d_loss: 0.0431 - g_loss: 4.2736 - gen_acc: 0.9971 - real_acc: 0.9907</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0339 - d_loss: 0.0432 - g_loss: 4.2834 - gen_acc: 0.9970 - real_acc: 0.9906</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0339 - d_loss: 0.0439 - g_loss: 4.3010 - gen_acc: 0.9968 - real_acc: 0.9901</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 
━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0340 - d_loss: 0.0444 - g_loss: 4.3187 - gen_acc: 0.9967 - real_acc: 0.9897</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0340 - d_loss: 0.0455 - g_loss: 4.3319 - gen_acc: 0.9961 - real_acc: 0.9892</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0340 - d_loss: 0.0464 - g_loss: 4.3508 - gen_acc: 0.9956 - real_acc: 0.9889</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0341 - d_loss: 0.0474 - g_loss: 4.3765 - gen_acc: 0.9951 - real_acc: 0.9884</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0341 - d_loss: 0.0483 - g_loss: 4.4070 - gen_acc: 0.9947 - real_acc: 0.9880</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0341 - d_loss: 0.0492 - g_loss: 4.4400 - gen_acc: 0.9943 - real_acc: 0.9875</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0342 - d_loss: 0.0499 - g_loss: 4.4739 - gen_acc: 0.9939 - real_acc: 0.9872</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0342 - d_loss: 0.0506 - g_loss: 4.5070 - gen_acc: 0.9935 - real_acc: 0.9868</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0342 - d_loss: 0.0513 - g_loss: 4.5375 - gen_acc: 0.9932 - real_acc: 0.9865</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0343 - d_loss: 0.0519 - g_loss: 4.5646 - gen_acc: 0.9929 - real_acc: 0.9862</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0343 - d_loss: 0.0525 - g_loss: 4.5904 - gen_acc: 0.9925 - real_acc: 0.9859</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0343 - d_loss: 0.0530 - g_loss: 4.6149 - gen_acc: 0.9923 - real_acc: 0.9857</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0344 - d_loss: 0.0536 - g_loss: 4.6368 - gen_acc: 0.9920 - real_acc: 0.9854</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 13s 294ms/step - aug_p: 0.0344 - d_loss: 0.0542 - g_loss: 4.6579 - gen_acc: 0.9917 - real_acc: 0.9852 - val_kid: 6.7378</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>Epoch 8/10 </code></pre></div> </div> <p>1/46 ━━━━━━━━━━━━━━━━━━━━ 7s 167ms/step - aug_p: 0.0384 - d_loss: 0.1191 - g_loss: 4.3279 - gen_acc: 
1.0000 - real_acc: 0.9219</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>2/46 ━━━━━━━━━━━━━━━━━━━━ 7s 179ms/step - aug_p: 0.0384 - d_loss: 0.1470 - g_loss: 3.7525 - gen_acc: 0.9590 - real_acc: 0.9219</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>3/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0384 - d_loss: 0.1768 - g_loss: 4.0819 - gen_acc: 0.9544 - real_acc: 0.8950</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>4/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0384 - d_loss: 0.1801 - g_loss: 4.1693 - gen_acc: 0.9551 - real_acc: 0.8910</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>5/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0384 - d_loss: 0.1829 - g_loss: 4.1280 - gen_acc: 0.9491 - real_acc: 0.8934</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>6/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0384 - d_loss: 0.1828 - g_loss: 4.2346 - gen_acc: 0.9471 - real_acc: 0.8949</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>7/46 ━━━━━━━━━━━━━━━━━━━━ 7s 180ms/step - aug_p: 0.0385 - d_loss: 0.1806 - g_loss: 4.3823 - gen_acc: 0.9470 - real_acc: 0.8968</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>8/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0385 - d_loss: 0.1765 - g_loss: 4.5079 - gen_acc: 0.9478 - real_acc: 0.8997</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>9/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0385 - d_loss: 0.1723 - g_loss: 4.5814 - gen_acc: 0.9486 - real_acc: 0.9028</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>10/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0385 - d_loss: 0.1679 - g_loss: 4.6213 - gen_acc: 0.9496 - real_acc: 0.9061</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>11/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0385 - d_loss: 0.1637 - g_loss: 4.6466 - gen_acc: 0.9507 - real_acc: 0.9092</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>12/46 ━━━━━━━━━━━━━━━━━━━━ 6s 180ms/step - aug_p: 0.0386 - d_loss: 0.1595 - g_loss: 4.6599 - gen_acc: 0.9520 - real_acc: 0.9122</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0386 - d_loss: 0.1561 - g_loss: 4.6625 - gen_acc: 0.9531 - real_acc: 0.9148</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0386 - d_loss: 0.1535 - g_loss: 4.6513 - gen_acc: 0.9537 - real_acc: 0.9172</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0386 - d_loss: 0.1530 - g_loss: 4.6600 - gen_acc: 0.9544 - real_acc: 0.9175</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> 
</div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0387 - d_loss: 0.1612 - g_loss: 4.6490 - gen_acc: 0.9512 - real_acc: 0.9180</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0387 - d_loss: 0.1727 - g_loss: 4.6715 - gen_acc: 0.9488 - real_acc: 0.9157</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0387 - d_loss: 0.1825 - g_loss: 4.7072 - gen_acc: 0.9469 - real_acc: 0.9135</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0387 - d_loss: 0.1904 - g_loss: 4.7428 - gen_acc: 0.9454 - real_acc: 0.9118</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 ━━━━━━━━━━━━━━━━━━━━ 4s 179ms/step - aug_p: 0.0387 - d_loss: 0.1970 - g_loss: 4.7693 - gen_acc: 0.9440 - real_acc: 0.9106</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 179ms/step - aug_p: 0.0387 - d_loss: 0.2029 - g_loss: 4.7854 - gen_acc: 0.9424 - real_acc: 0.9098</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0387 - d_loss: 0.2079 - g_loss: 4.7960 - gen_acc: 0.9409 - real_acc: 0.9092</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0388 - d_loss: 0.2119 - g_loss: 4.8033 - gen_acc: 0.9397 - real_acc: 0.9090</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0388 - d_loss: 0.2153 - g_loss: 4.8076 - gen_acc: 0.9387 - real_acc: 0.9088</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0388 - d_loss: 0.2182 - g_loss: 4.8077 - gen_acc: 0.9378 - real_acc: 0.9087</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0388 - d_loss: 0.2207 - g_loss: 4.8051 - gen_acc: 0.9371 - real_acc: 0.9087</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 179ms/step - aug_p: 0.0388 - d_loss: 0.2229 - g_loss: 4.8007 - gen_acc: 0.9365 - real_acc: 0.9086</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0388 - d_loss: 0.2249 - g_loss: 4.7934 - gen_acc: 0.9360 - real_acc: 0.9086</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0388 - d_loss: 0.2265 - g_loss: 4.7860 - gen_acc: 0.9355 - real_acc: 0.9086</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0389 - d_loss: 0.2278 - g_loss: 4.7775 - gen_acc: 0.9352 - 
real_acc: 0.9087</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0389 - d_loss: 0.2290 - g_loss: 4.7677 - gen_acc: 0.9349 - real_acc: 0.9087</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0389 - d_loss: 0.2299 - g_loss: 4.7575 - gen_acc: 0.9347 - real_acc: 0.9089</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0389 - d_loss: 0.2305 - g_loss: 4.7470 - gen_acc: 0.9346 - real_acc: 0.9091</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0389 - d_loss: 0.2310 - g_loss: 4.7363 - gen_acc: 0.9345 - real_acc: 0.9093</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0389 - d_loss: 0.2314 - g_loss: 4.7249 - gen_acc: 0.9344 - real_acc: 0.9095</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0389 - d_loss: 0.2317 - g_loss: 4.7149 - gen_acc: 0.9344 - real_acc: 0.9098</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0390 - d_loss: 0.2319 - g_loss: 4.7045 - gen_acc: 0.9345 - real_acc: 0.9101</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0390 - d_loss: 0.2319 - g_loss: 4.6937 - gen_acc: 0.9345 - real_acc: 0.9104</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0390 - d_loss: 0.2319 - g_loss: 4.6838 - gen_acc: 0.9346 - real_acc: 0.9107</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 179ms/step - aug_p: 0.0390 - d_loss: 0.2318 - g_loss: 4.6734 - gen_acc: 0.9347 - real_acc: 0.9110</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0390 - d_loss: 0.2316 - g_loss: 4.6636 - gen_acc: 0.9349 - real_acc: 0.9114</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 179ms/step - aug_p: 0.0390 - d_loss: 0.2313 - g_loss: 4.6532 - gen_acc: 0.9350 - real_acc: 0.9117</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0391 - d_loss: 0.2310 - g_loss: 4.6442 - gen_acc: 0.9352 - real_acc: 0.9120</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0391 - d_loss: 0.2306 - g_loss: 4.6361 - gen_acc: 0.9354 - real_acc: 0.9124</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> 
</div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0391 - d_loss: 0.2302 - g_loss: 4.6279 - gen_acc: 0.9356 - real_acc: 0.9127</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0391 - d_loss: 0.2297 - g_loss: 4.6201 - gen_acc: 0.9358 - real_acc: 0.9131</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 12s 266ms/step - aug_p: 0.0391 - d_loss: 0.2292 - g_loss: 4.6126 - gen_acc: 0.9361 - real_acc: 0.9134 - val_kid: 5.7109</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>Epoch 9/10 </code></pre></div> </div> <p>1/46 ━━━━━━━━━━━━━━━━━━━━ 8s 180ms/step - aug_p: 0.0422 - d_loss: 0.0668 - g_loss: 3.8939 - gen_acc: 0.9922 - real_acc: 0.9922</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>2/46 ━━━━━━━━━━━━━━━━━━━━ 8s 187ms/step - aug_p: 0.0422 - d_loss: 0.0676 - g_loss: 3.8295 - gen_acc: 0.9863 - real_acc: 0.9941</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>3/46 ━━━━━━━━━━━━━━━━━━━━ 7s 185ms/step - aug_p: 0.0422 - d_loss: 0.0659 - g_loss: 3.8676 - gen_acc: 0.9865 - real_acc: 0.9944</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>4/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0423 - d_loss: 0.0703 - g_loss: 3.8084 - gen_acc: 0.9831 - real_acc: 0.9928</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>5/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0423 - d_loss: 0.0755 - g_loss: 3.9384 - gen_acc: 0.9821 - real_acc: 0.9880</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>6/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0423 - d_loss: 0.0781 - g_loss: 4.0291 - gen_acc: 0.9818 - real_acc: 0.9846</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>7/46 ━━━━━━━━━━━━━━━━━━━━ 7s 182ms/step - aug_p: 0.0424 - d_loss: 0.0831 - g_loss: 4.0366 - gen_acc: 0.9779 - real_acc: 0.9828</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>8/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0424 - d_loss: 0.0888 - g_loss: 4.1585 - gen_acc: 0.9756 - real_acc: 0.9782</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>9/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0424 - d_loss: 0.0921 - g_loss: 4.3258 - gen_acc: 0.9744 - real_acc: 0.9750</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>10/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0424 - d_loss: 0.0937 - g_loss: 4.4967 - gen_acc: 0.9737 - real_acc: 0.9729</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>11/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0425 - d_loss: 0.0944 - g_loss: 4.6444 - gen_acc: 0.9732 - real_acc: 0.9715</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>12/46 ━━━━━━━━━━━━━━━━━━━━ 6s 181ms/step - aug_p: 0.0425 - d_loss: 0.0945 - g_loss: 4.7625 - 
gen_acc: 0.9730 - real_acc: 0.9706</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0425 - d_loss: 0.0943 - g_loss: 4.8487 - gen_acc: 0.9728 - real_acc: 0.9701</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0425 - d_loss: 0.0940 - g_loss: 4.9110 - gen_acc: 0.9726 - real_acc: 0.9698</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0426 - d_loss: 0.0935 - g_loss: 4.9645 - gen_acc: 0.9725 - real_acc: 0.9696</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0426 - d_loss: 0.0931 - g_loss: 5.0047 - gen_acc: 0.9726 - real_acc: 0.9694</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0426 - d_loss: 0.0930 - g_loss: 5.0287 - gen_acc: 0.9723 - real_acc: 0.9693</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0426 - d_loss: 0.0941 - g_loss: 5.0578 - gen_acc: 0.9721 - real_acc: 0.9679</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 181ms/step - aug_p: 0.0427 - d_loss: 0.0976 - g_loss: 5.0709 - gen_acc: 0.9702 - real_acc: 0.9669</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 ━━━━━━━━━━━━━━━━━━━━ 4s 181ms/step - aug_p: 0.0427 - d_loss: 0.1023 - g_loss: 5.0961 - gen_acc: 0.9687 - real_acc: 0.9645</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 181ms/step - aug_p: 0.0427 - d_loss: 0.1064 - g_loss: 5.1232 - gen_acc: 0.9674 - real_acc: 0.9623</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 181ms/step - aug_p: 0.0427 - d_loss: 0.1101 - g_loss: 5.1442 - gen_acc: 0.9662 - real_acc: 0.9603</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 181ms/step - aug_p: 0.0428 - d_loss: 0.1136 - g_loss: 5.1570 - gen_acc: 0.9649 - real_acc: 0.9587</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0428 - d_loss: 0.1166 - g_loss: 5.1674 - gen_acc: 0.9638 - real_acc: 0.9573</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0428 - d_loss: 0.1192 - g_loss: 5.1751 - gen_acc: 0.9628 - real_acc: 0.9561</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0428 - d_loss: 0.1216 - g_loss: 5.1786 - gen_acc: 0.9620 - real_acc: 0.9550</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> 
</code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0428 - d_loss: 0.1238 - g_loss: 5.1785 - gen_acc: 0.9612 - real_acc: 0.9539</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0429 - d_loss: 0.1258 - g_loss: 5.1765 - gen_acc: 0.9605 - real_acc: 0.9530</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0429 - d_loss: 0.1276 - g_loss: 5.1726 - gen_acc: 0.9599 - real_acc: 0.9521</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0429 - d_loss: 0.1294 - g_loss: 5.1667 - gen_acc: 0.9595 - real_acc: 0.9513</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0429 - d_loss: 0.1309 - g_loss: 5.1594 - gen_acc: 0.9590 - real_acc: 0.9506</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0429 - d_loss: 0.1323 - g_loss: 5.1512 - gen_acc: 0.9587 - real_acc: 0.9500</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0429 - d_loss: 0.1335 - g_loss: 5.1414 - gen_acc: 0.9584 - real_acc: 0.9494</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 ━━━━━━━━━━━━━━━━━━━━ 2s 180ms/step - aug_p: 0.0430 - d_loss: 0.1346 - g_loss: 5.1320 - gen_acc: 0.9582 - real_acc: 0.9489</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0430 - d_loss: 0.1356 - g_loss: 5.1216 - gen_acc: 0.9580 - real_acc: 0.9484</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0430 - d_loss: 0.1365 - g_loss: 5.1109 - gen_acc: 0.9579 - real_acc: 0.9479</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0430 - d_loss: 0.1373 - g_loss: 5.0996 - gen_acc: 0.9578 - real_acc: 0.9475</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0430 - d_loss: 0.1379 - g_loss: 5.0882 - gen_acc: 0.9577 - real_acc: 0.9472</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0431 - d_loss: 0.1385 - g_loss: 5.0769 - gen_acc: 0.9577 - real_acc: 0.9468</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 180ms/step - aug_p: 0.0431 - d_loss: 0.1391 - g_loss: 5.0648 - gen_acc: 0.9577 - real_acc: 0.9466</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0431 - d_loss: 0.1395 - g_loss: 5.0535 - 
gen_acc: 0.9577 - real_acc: 0.9463</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0431 - d_loss: 0.1400 - g_loss: 5.0419 - gen_acc: 0.9576 - real_acc: 0.9461</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0431 - d_loss: 0.1403 - g_loss: 5.0307 - gen_acc: 0.9577 - real_acc: 0.9459</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0431 - d_loss: 0.1406 - g_loss: 5.0198 - gen_acc: 0.9577 - real_acc: 0.9458</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0432 - d_loss: 0.1408 - g_loss: 5.0087 - gen_acc: 0.9577 - real_acc: 0.9456</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 180ms/step - aug_p: 0.0432 - d_loss: 0.1410 - g_loss: 4.9981 - gen_acc: 0.9578 - real_acc: 0.9455</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 14s 300ms/step - aug_p: 0.0432 - d_loss: 0.1411 - g_loss: 4.9879 - gen_acc: 0.9579 - real_acc: 0.9455 - val_kid: 3.6018</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>Epoch 10/10 </code></pre></div> </div> <p>1/46 ━━━━━━━━━━━━━━━━━━━━ 5:15 7s/step - aug_p: 0.0464 - d_loss: 0.0324 - g_loss: 4.1750 - gen_acc: 1.0000 - real_acc: 0.9922</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>2/46 ━━━━━━━━━━━━━━━━━━━━ 8s 195ms/step - aug_p: 0.0464 - d_loss: 0.0337 - g_loss: 4.0349 - gen_acc: 0.9980 - real_acc: 0.9941</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>3/46 ━━━━━━━━━━━━━━━━━━━━ 8s 186ms/step - aug_p: 0.0464 - d_loss: 0.0367 - g_loss: 4.0199 - gen_acc: 0.9978 - real_acc: 0.9918</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>4/46 ━━━━━━━━━━━━━━━━━━━━ 7s 184ms/step - aug_p: 0.0465 - d_loss: 0.0374 - g_loss: 4.0297 - gen_acc: 0.9979 - real_acc: 0.9909</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>5/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0465 - d_loss: 0.0380 - g_loss: 4.0271 - gen_acc: 0.9980 - real_acc: 0.9902</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>6/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0465 - d_loss: 0.0383 - g_loss: 4.0130 - gen_acc: 0.9981 - real_acc: 0.9901</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>7/46 ━━━━━━━━━━━━━━━━━━━━ 7s 183ms/step - aug_p: 0.0466 - d_loss: 0.0385 - g_loss: 4.0148 - gen_acc: 0.9982 - real_acc: 0.9901</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>8/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0466 - d_loss: 0.0389 - g_loss: 4.0141 - gen_acc: 0.9983 - real_acc: 0.9902</p> <div class="k-default-codeblock"> <div 
class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>9/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0467 - d_loss: 0.0393 - g_loss: 4.0076 - gen_acc: 0.9984 - real_acc: 0.9903</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>10/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0467 - d_loss: 0.0397 - g_loss: 4.0031 - gen_acc: 0.9985 - real_acc: 0.9903</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>11/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0467 - d_loss: 0.0402 - g_loss: 3.9981 - gen_acc: 0.9985 - real_acc: 0.9902</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>12/46 ━━━━━━━━━━━━━━━━━━━━ 6s 182ms/step - aug_p: 0.0468 - d_loss: 0.0406 - g_loss: 3.9968 - gen_acc: 0.9985 - real_acc: 0.9902</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>13/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0468 - d_loss: 0.0411 - g_loss: 3.9967 - gen_acc: 0.9985 - real_acc: 0.9899</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>14/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0468 - d_loss: 0.0418 - g_loss: 3.9930 - gen_acc: 0.9984 - real_acc: 0.9897</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>15/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0469 - d_loss: 0.0428 - g_loss: 3.9956 - gen_acc: 0.9982 - real_acc: 0.9893</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>16/46 ━━━━━━━━━━━━━━━━━━━━ 5s 181ms/step - aug_p: 0.0469 - d_loss: 0.0436 - g_loss: 3.9957 - gen_acc: 0.9980 - real_acc: 0.9890</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>17/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0469 - d_loss: 0.0443 - g_loss: 3.9983 - gen_acc: 0.9978 - real_acc: 0.9887</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>18/46 ━━━━━━━━━━━━━━━━━━━━ 5s 180ms/step - aug_p: 0.0470 - d_loss: 0.0450 - g_loss: 3.9978 - gen_acc: 0.9977 - real_acc: 0.9885</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>19/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0470 - d_loss: 0.0457 - g_loss: 3.9987 - gen_acc: 0.9976 - real_acc: 0.9883</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>20/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0470 - d_loss: 0.0464 - g_loss: 3.9966 - gen_acc: 0.9974 - real_acc: 0.9880</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>21/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0471 - d_loss: 0.0472 - g_loss: 3.9956 - gen_acc: 0.9973 - real_acc: 0.9877</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>22/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 0.0471 - d_loss: 0.0482 - g_loss: 3.9910 - gen_acc: 0.9969 - real_acc: 0.9874</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>23/46 ━━━━━━━━━━━━━━━━━━━━ 4s 180ms/step - aug_p: 
0.0471 - d_loss: 0.0501 - g_loss: 3.9936 - gen_acc: 0.9965 - real_acc: 0.9862</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>24/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0472 - d_loss: 0.0532 - g_loss: 3.9900 - gen_acc: 0.9949 - real_acc: 0.9853</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>25/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0472 - d_loss: 0.0576 - g_loss: 3.9964 - gen_acc: 0.9935 - real_acc: 0.9832</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>26/46 ━━━━━━━━━━━━━━━━━━━━ 3s 180ms/step - aug_p: 0.0472 - d_loss: 0.0624 - g_loss: 3.9986 - gen_acc: 0.9917 - real_acc: 0.9813</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>27/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0472 - d_loss: 0.0667 - g_loss: 4.0030 - gen_acc: 0.9901 - real_acc: 0.9795</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>28/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0473 - d_loss: 0.0707 - g_loss: 4.0083 - gen_acc: 0.9887 - real_acc: 0.9778</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>29/46 ━━━━━━━━━━━━━━━━━━━━ 3s 181ms/step - aug_p: 0.0473 - d_loss: 0.0744 - g_loss: 4.0128 - gen_acc: 0.9873 - real_acc: 0.9762</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>30/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0473 - d_loss: 0.0776 - g_loss: 4.0161 - gen_acc: 0.9862 - real_acc: 0.9748</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>31/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0473 - d_loss: 0.0806 - g_loss: 4.0186 - gen_acc: 0.9851 - real_acc: 0.9735</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>32/46 ━━━━━━━━━━━━━━━━━━━━ 2s 181ms/step - aug_p: 0.0474 - d_loss: 0.0832 - g_loss: 4.0199 - gen_acc: 0.9841 - real_acc: 0.9724</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>33/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0474 - d_loss: 0.0856 - g_loss: 4.0204 - gen_acc: 0.9832 - real_acc: 0.9714</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>34/46 ━━━━━━━━━━━━━━━━━━━━ 2s 182ms/step - aug_p: 0.0474 - d_loss: 0.0878 - g_loss: 4.0206 - gen_acc: 0.9825 - real_acc: 0.9705</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>35/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0474 - d_loss: 0.0898 - g_loss: 4.0206 - gen_acc: 0.9818 - real_acc: 0.9697</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>36/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0475 - d_loss: 0.0916 - g_loss: 4.0200 - gen_acc: 0.9811 - real_acc: 0.9690</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>37/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0475 - d_loss: 0.0933 - g_loss: 4.0193 - gen_acc: 0.9805 - real_acc: 0.9683</p> <div class="k-default-codeblock"> <div 
class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>38/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0475 - d_loss: 0.0948 - g_loss: 4.0185 - gen_acc: 0.9800 - real_acc: 0.9677</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>39/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0475 - d_loss: 0.0961 - g_loss: 4.0171 - gen_acc: 0.9796 - real_acc: 0.9672</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>40/46 ━━━━━━━━━━━━━━━━━━━━ 1s 182ms/step - aug_p: 0.0475 - d_loss: 0.0974 - g_loss: 4.0158 - gen_acc: 0.9791 - real_acc: 0.9667</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>41/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0476 - d_loss: 0.0985 - g_loss: 4.0146 - gen_acc: 0.9787 - real_acc: 0.9662</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>42/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0476 - d_loss: 0.0995 - g_loss: 4.0133 - gen_acc: 0.9784 - real_acc: 0.9658</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>43/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0476 - d_loss: 0.1005 - g_loss: 4.0119 - gen_acc: 0.9781 - real_acc: 0.9655</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>44/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0476 - d_loss: 0.1013 - g_loss: 4.0102 - gen_acc: 0.9778 - real_acc: 0.9652</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>45/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0476 - d_loss: 0.1021 - g_loss: 4.0083 - gen_acc: 0.9775 - real_acc: 0.9649</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 0s 182ms/step - aug_p: 0.0477 - d_loss: 0.1028 - g_loss: 4.0070 - gen_acc: 0.9773 - real_acc: 0.9647</p> <p><img alt="png" src="/img/examples/generative/gan_ada/gan_ada_18_506.png" /></p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code> </code></pre></div> </div> <p>46/46 ━━━━━━━━━━━━━━━━━━━━ 21s 304ms/step - aug_p: 0.0477 - d_loss: 0.1035 - g_loss: 4.0058 - gen_acc: 0.9771 - real_acc: 0.9644 - val_kid: 3.0212</p> <div class="k-default-codeblock"> <div class="codehilite"><pre><span></span><code>&lt;keras.src.callbacks.history.History at 0x794f705d3390&gt; </code></pre></div> </div> <hr /> <h2 id="inference">Inference</h2> <div class="codehilite"><pre><span></span><code><span class="c1"># load the best model and generate images</span> <span class="n">model</span><span class="o">.</span><span class="n">load_weights</span><span class="p">(</span><span class="n">checkpoint_path</span><span class="p">)</span> <span class="n">model</span><span class="o">.</span><span class="n">plot_images</span><span class="p">()</span> </code></pre></div> <p><img alt="png" src="/img/examples/generative/gan_ada/gan_ada_20_0.png" /></p> <hr /> <h2 id="results">Results</h2> <p>By running the training for 400 epochs (which takes 2-3 hours in a Colab notebook), one can get high quality image generations using this code example.</p> <p>The evolution of a random batch of images over a 400 epoch training (ema=0.999 for animation smoothness): <img alt="birds 
evolution gif" src="https://i.imgur.com/ecGuCcz.gif" /></p> <p>Latent-space interpolation between a batch of selected images: <img alt="birds interpolation gif" src="https://i.imgur.com/nGvzlsC.gif" /></p> <p>I also recommend trying out training on other datasets, such as <a href="https://www.tensorflow.org/datasets/catalog/celeb_a">CelebA</a> for example. In my experience good results can be achieved without changing any hyperparameters (though discriminator augmentation might not be necessary).</p> <hr /> <h2 id="gan-tips-and-tricks">GAN tips and tricks</h2> <p>My goal with this example was to find a good tradeoff between ease of implementation and generation quality for GANs. During preparation, I have run numerous ablations using <a href="https://github.com/beresandras/gan-flavours-keras">this repository</a>.</p> <p>In this section I list the lessons learned and my recommendations in my subjective order of importance.</p> <p>I recommend checking out the <a href="https://arxiv.org/abs/1511.06434">DCGAN paper</a>, this <a href="https://www.youtube.com/watch?v=myGAju4L7O8">NeurIPS talk</a>, and this <a href="https://arxiv.org/abs/1711.10337">large scale GAN study</a> for others' takes on this subject.</p> <h3 id="architectural-tips">Architectural tips</h3> <ul> <li><strong>resolution</strong>: Training GANs at higher resolutions tends to get more difficult, I recommend experimenting at 32x32 or 64x64 resolutions initially.</li> <li><strong>initialization</strong>: If you see strong colorful patterns early on in the training, the initialization might be the issue. Set the kernel_initializer parameters of layers to <a href="https://keras.io/api/layers/initializers/#randomnormal-class">random normal</a>, and decrease the standard deviation (recommended value: 0.02, following DCGAN) until the issue disappears.</li> <li><strong>upsampling</strong>: There are two main methods for upsampling in the generator. <a href="https://keras.io/api/layers/convolution_layers/convolution2d_transpose/">Transposed convolution</a> is faster, but can lead to <a href="https://distill.pub/2016/deconv-checkerboard/">checkerboard artifacts</a>, which can be reduced by using a kernel size that is divisible with the stride (recommended kernel size is 4 for a stride of 2). <a href="https://keras.io/api/layers/reshaping_layers/up_sampling2d/">Upsampling</a> + <a href="https://keras.io/api/layers/convolution_layers/convolution2d/">standard convolution</a> can have slightly lower quality, but checkerboard artifacts are not an issue. I recommend using nearest-neighbor interpolation over bilinear for it.</li> <li><strong>batch normalization in discriminator</strong>: Sometimes has a high impact, I recommend trying out both ways.</li> <li><strong><a href="https://www.tensorflow.org/addons/api_docs/python/tfa/layers/SpectralNormalization">spectral normalization</a></strong>: A popular technique for training GANs, can help with stability. I recommend disabling batch normalization's learnable scale parameters along with it.</li> <li><strong><a href="https://keras.io/guides/functional_api/#a-toy-resnet-model">residual connections</a></strong>: While residual discriminators behave similarly, residual generators are more difficult to train in my experience. They are however necessary for training large and deep architectures. I recommend starting with non-residual architectures.</li> <li><strong>dropout</strong>: Using dropout before the last layer of the discriminator improves generation quality in my experience. 
<h3 id="algorithmic-tips">Algorithmic tips</h3>
<ul>
<li><strong>loss functions</strong>: Numerous losses have been proposed over the years for training GANs, promising improved performance and stability. I have implemented 5 of them in <a href="https://github.com/beresandras/gan-flavours-keras">this repository</a>, and my experience is in line with <a href="https://arxiv.org/abs/1711.10337">this GAN study</a>: no loss seems to consistently outperform the default non-saturating GAN loss. I recommend using that as a default.</li>
<li><strong>Adam's beta_1 parameter</strong>: The beta_1 parameter in Adam can be interpreted as the momentum of the mean gradient estimate. Using 0.5 or even 0.0 instead of the default 0.9 was proposed in DCGAN and is important: this example would not work with the default value (see the sketch after this list).</li>
<li><strong>separate batch normalization for generated and real images</strong>: The forward pass of the discriminator should be separate for the generated and real images. Doing otherwise can lead to artifacts (45-degree stripes in my case) and decreased performance.</li>
<li><strong>exponential moving average of generator's weights</strong>: This helps to reduce the variance of the KID measurement, and helps in averaging out the rapid color palette changes during training.</li>
<li><strong><a href="https://arxiv.org/abs/1706.08500">different learning rates for generator and discriminator</a></strong>: If one has the resources, it can help to tune the learning rates of the two networks separately. A similar idea is to update one network's (usually the discriminator's) weights multiple times for each update of the other. As a default, I recommend using the same learning rate of 2e-4 (Adam) for both networks, following DCGAN, and updating each of them once per step.</li>
<li><strong>label noise</strong>: <a href="https://arxiv.org/abs/1606.03498">One-sided label smoothing</a> (using less than 1.0 for real labels) or adding noise to the labels can regularize the discriminator so that it does not become overconfident; however, in my case neither improved performance.</li>
<li><strong>adaptive data augmentation</strong>: Since it adds another dynamic component to the training process, disable it by default, and only enable it when the other components already work well.</li>
</ul>
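<p>Below is a similar hedged sketch of the optimizer, batch normalization, and weight-averaging tips. The names (generator, discriminator, ema_generator) and the EMA rate are placeholders rather than the example's actual code:</p>
<div class="codehilite"><pre><span></span><code>import keras

# DCGAN-style Adam optimizers: beta_1 lowered from the default 0.9 to 0.5,
# with the same 2e-4 learning rate for both networks
generator_optimizer = keras.optimizers.Adam(learning_rate=2e-4, beta_1=0.5)
discriminator_optimizer = keras.optimizers.Adam(learning_rate=2e-4, beta_1=0.5)


def discriminator_logits(discriminator, real_images, generated_images):
    # two separate forward passes, so that batch normalization computes its
    # statistics on real and generated images separately; concatenating the
    # batches and calling the discriminator once can cause striped artifacts
    real_logits = discriminator(real_images, training=True)
    generated_logits = discriminator(generated_images, training=True)
    return real_logits, generated_logits


def update_ema_generator(ema_generator, generator, ema=0.99):
    # exponential moving average of the generator's weights, which reduces the
    # variance of the KID measurement and averages out rapid color palette changes
    for ema_weight, weight in zip(ema_generator.weights, generator.weights):
        ema_weight.assign(ema * ema_weight + (1.0 - ema) * weight)
</code></pre></div>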
<hr /> <h2 id="related-works">Related works</h2> <p>Other GAN-related Keras code examples:</p> <ul> <li><a href="https://keras.io/examples/generative/dcgan_overriding_train_step/">DCGAN + CelebA</a></li> <li><a href="https://keras.io/examples/generative/wgan_gp/">WGAN + FashionMNIST</a></li> <li><a href="https://keras.io/examples/generative/wgan-graphs/">WGAN + Molecules</a></li> <li><a href="https://keras.io/examples/generative/conditional_gan/">ConditionalGAN + MNIST</a></li> <li><a href="https://keras.io/examples/generative/cyclegan/">CycleGAN + Horse2Zebra</a></li> <li><a href="https://keras.io/examples/generative/stylegan/">StyleGAN</a></li> </ul> <p>Modern GAN architecture lines:</p> <ul> <li><a href="https://arxiv.org/abs/1805.08318">SAGAN</a>, <a href="https://arxiv.org/abs/1809.11096">BigGAN</a></li> <li><a href="https://arxiv.org/abs/1710.10196">ProgressiveGAN</a>, <a href="https://arxiv.org/abs/1812.04948">StyleGAN</a>, <a href="https://arxiv.org/abs/1912.04958">StyleGAN2</a>, <a href="https://arxiv.org/abs/2006.06676">StyleGAN2-ADA</a>, <a href="https://arxiv.org/abs/2106.12423">AliasFreeGAN</a></li> </ul> <p>Concurrent papers on discriminator data augmentation: <a href="https://arxiv.org/abs/2006.02595">1</a>, <a href="https://arxiv.org/abs/2006.05338">2</a>, <a href="https://arxiv.org/abs/2006.10738">3</a></p> <p>Recent literature overview on GANs: <a href="https://www.youtube.com/watch?v=3ktD752xq5k">talk</a></p> </div>
</div> </div> </div> </body> </html>
