From eb069c8e68082a9b0878760cb1ad248a52dfd0f9 Mon Sep 17 00:00:00 2001
From: Tarun Chadha <tarunchadha23@gmail.com>
Date: Sun, 28 Apr 2019 22:34:47 +0200
Subject: [PATCH] Update material

---
 neural_nets_intro.ipynb | 2611 +++++++--------------------------------
 1 file changed, 432 insertions(+), 2179 deletions(-)

diff --git a/neural_nets_intro.ipynb b/neural_nets_intro.ipynb
index 74603f3..cebf4bb 100644
--- a/neural_nets_intro.ipynb
+++ b/neural_nets_intro.ipynb
@@ -1,5 +1,20 @@
 {
  "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": 35,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# IGNORE THIS CELL WHICH CUSTOMIZES LAYOUT AND STYLING OF THE NOTEBOOK !\n",
+    "import matplotlib.pyplot as plt\n",
+    "%matplotlib inline\n",
+    "%config InlineBackend.figure_format = 'retina'\n",
+    "import warnings\n",
+    "warnings.filterwarnings('ignore', category=FutureWarning)\n",
+    "#from IPython.core.display import HTML; HTML(open(\"custom.html\", \"r\").read())"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -390,6 +405,15 @@
     "</center>"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Multi-layer perceptrons\n"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -409,7 +433,7 @@
     "\n",
     "Now we know that we can compute complex functions if we stack together a number of perceptrons.\n",
     "\n",
-    "However, we can DO NOT want to set the weights and thresholds by hand as we did in the examples above.\n",
+    "However, we definitely **DO NOT** want to set the weights and thresholds by hand as we did in the examples above.\n",
     "\n",
     "We want some algorithm to do this for us!\n",
     "\n",
@@ -417,14 +441,21 @@
     "\n",
     "\n",
     "### Loss function\n",
-    "As in the case of other machine learning algorithms we need to define a so-called \"Loss function\". In simple words this function measures how close are the predictions of our network to the supplied labels. Once we have this function we need an algorithm to update the weights of the network such that this loss decreases. As one can already imagine the choice of an appropriate loss function is very important to the success of the trained model. Fortunately, for classification and regression (which comprise of a large range of probelms) these loss functions are well known. Generally **crossentropy** and **mean squared error** loss functions are chosen for classification and regression problems, respectively.\n",
+    "In order for an algorithm to learn, we need to define a quantity which measures how far the predictions of our network are from reality. This is done by choosing a so-called \"loss function\" (as is the case for other machine learning algorithms). In other words, this function measures how close the predictions of our network are to the supplied labels. Once we have this function, we need an algorithm to update the weights of the network such that this loss decreases. As one can already imagine, the choice of an appropriate loss function is very important to the success of the trained model. Fortunately, for classification and regression (which cover a large variety of problems) these loss functions are well known. \n",
+    "\n",
+    "Generally **crossentropy** and **mean squared error** loss functions are used for classification and regression problems, respectively.\n",
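+    "\n",
+    "A minimal numeric sketch of both losses, computed by hand with `numpy` (purely illustrative, not part of the notebook's training pipeline):\n",
+    "\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "# One sample, 3 classes: one-hot label vs. predicted class probabilities\n",
+    "y_true = np.array([0.0, 1.0, 0.0])\n",
+    "y_pred = np.array([0.1, 0.7, 0.2])\n",
+    "print(-np.sum(y_true * np.log(y_pred)))  # crossentropy: -log(0.7) ~ 0.357\n",
+    "\n",
+    "# Two regression targets vs. predictions\n",
+    "t = np.array([2.0, 0.5])\n",
+    "p = np.array([1.5, 1.0])\n",
+    "print(np.mean((t - p) ** 2))             # mean squared error: 0.25\n",
+    "```\n",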
     "\n",
     "### Gradient based learning\n",
-    "Once we have a loss function we want to solve an **optimization problem** which minimizes this loss by updating the weights of the network and this is how the learning actually happens.\n",
+    "As mentioned above, once we have decided upon a loss function we want to solve an **optimization problem** which minimizes this loss by updating the weights of the network. This is how the learning actually happens.\n",
+    "\n",
+    "The most popular optimization methods used in neural network training are **gradient-descent** type methods, e.g. plain gradient descent, RMSprop, Adam etc. \n",
+    "**Gradient descent** uses the partial derivatives of the loss function with respect to the network weights, together with a learning rate, to update the weights such that the loss function decreases and hopefully, after some iterations, reaches its (global) minimum.\n",
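+    "\n",
+    "A minimal sketch of the update rule on a toy one-dimensional loss (hand-rolled for illustration, not how Keras implements it):\n",
+    "\n",
+    "```python\n",
+    "# Minimize the toy loss L(w) = (w - 3)**2 by gradient descent\n",
+    "w = 0.0                # initial weight\n",
+    "learning_rate = 0.1\n",
+    "for _ in range(50):\n",
+    "    grad = 2 * (w - 3)            # dL/dw at the current w\n",
+    "    w = w - learning_rate * grad  # the gradient-descent update\n",
+    "print(w)  # close to 3, the minimum of L\n",
+    "```\n",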
+    "\n",
+    "First, the loss function and its derivative are computed at the output node, and this signal is propagated backwards through the network, using the chain rule, to compute the partial derivatives of the loss with respect to all weights. Hence, this method is called **backpropagation**.\n",
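+    "\n",
+    "A hand-rolled sketch of one backward pass through a single linear unit (purely illustrative):\n",
+    "\n",
+    "```python\n",
+    "x, w, t = 2.0, 0.5, 3.0     # input, weight, target\n",
+    "a = w * x                   # forward pass: a = 1.0\n",
+    "L = (a - t) ** 2            # squared-error loss: 4.0\n",
+    "dL_da = 2 * (a - t)         # derivative at the output: -4.0\n",
+    "dL_dw = dL_da * x           # chain rule: da/dw = x, so dL/dw = -8.0\n",
+    "print(L, dL_dw)\n",
+    "```\n",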
+    "\n",
+    "Depending on how much of the training data is used to compute the gradient in each update step, one distinguishes between batch, mini-batch and stochastic gradient descent.\n",
     "\n",
-    "One of the most popular optimization method used in machine learning is **Gradient-descent**\n",
     "\n",
-    "INSERT MORE EXPLAINATIONS HERE\n",
     "\n",
     "### Activation Functions\n",
     "\n",
@@ -502,22 +533,6 @@
     "plt.plot(pts, pts_relu) ;"
    ]
   },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Multi-layer preceptron neural network\n",
-    "Universal function theorem\n",
-    "\n",
-    "epochs\n",
-    "\n",
-    "Suggestion Uwe:\n",
-    "\n",
-    "3. way around: look at nature how neuron works and introduce non linear activation functions.\n",
-    "\n",
-    "4. theoretical background: universal approximation theorem."
-   ]
-  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -1756,7 +1771,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 10,
+   "execution_count": 148,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -1771,7 +1786,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 11,
+   "execution_count": 149,
    "metadata": {},
    "outputs": [
     {
@@ -1789,24 +1804,28 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 12,
+   "execution_count": 150,
    "metadata": {},
    "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "This digit is:  0\n"
+      "This digit is:  4\n"
      ]
     },
     {
      "data": {
-      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAADjNJREFUeJzt3W+IXfWdx/HPd20LxvaBMhMbbLLTLbJxEDbVa1iwLi7FatZCTKDS4J8slE4fVNhCH6xOhPrAP3HZmu2DJZisofnT2hYy+fNAbUUWTWApuZFSbTJpRcY0m5CZkEKNPijqtw/mTJnGub/fzT3n3HMn3/cLwtx7vvfc8+VOPnPuvb9zzs/cXQDi+ZumGwDQDMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiCoT/RzY0NDQz4yMtLPTQKhTE1N6dy5c9bNY0uF38zukvQDSVdI+h9335x6/MjIiNrtdplNAkhotVpdP7bnt/1mdoWk/5a0RtKopA1mNtrr8wHorzKf+VdLesvd33b3P0n6iaS11bQFoG5lwn+dpN/Pu3+qWPZXzGzMzNpm1p6ZmSmxOQBVKhP+hb5U+Nj5we6+zd1b7t4aHh4usTkAVSoT/lOSls+7/zlJp8u1A6BfyoT/iKTrzezzZvYpSV+XdLCatgDUreehPnf/wMwekvRzzQ717XD331TWGYBalRrnd/cXJL1QUS8A+ojDe4GgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiq1Cy9ZjYl6V1JH0r6wN1bVTSFwfHee+8l65OTkz0/9w033JCsL1mypOfnRl6p8Bf+2d3PVfA8APqIt/1AUGXD75J+YWZHzWysioYA9EfZt/23uvtpM1sq6WUzm3T31+Y/oPijMCZJK1asKLk5AFUpted399PFz2lJ+yStXuAx29y95e6t4eHhMpsDUKGew29mV5nZZ+ZuS/qKpDeragxAvcq87b9W0j4zm3ueH7v7S5V0BaB2PYff3d+W9A8V9oIaPP7448n6gQMHkvX77rsvWd+8eXOyPj093bE2OjqaXPfKK69M1nPHCezatStZj46hPiAowg8ERfiBoAg/EBThB4Ii/EBQVZzVh5ImJiaS9U2bNiXrJ06c6Fhz9+S6xXEaHbXb7VLrp7Z/7NixnteVpKNHjybrx48f71g7cuRIct0I2PMDQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCM8/fBAw88kKzv378/Wc9dPjs11r5y5crkuqljBLqRG+eva91u1k9dVvzJJ59Mrjs+Pt5TT4sJe34gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIpx/grkLo+9Z8+eZD03Xp2bqnr37t0da+vWrUuue8sttyTrMzMzyXrufP+hoaGOtTVr1iTXfemlctNAXLhwoWPt0UcfTa6b6luSxsYW/9SU7PmBoAg/EBThB4Ii/EBQhB8IivADQRF+IKjsOL+Z7ZD0VUnT7n5jsewaST+VNCJpStK97v6H+tpsXmos/+mnn06umxvHL3vt/Nw5+ym5aa5z9dx4eErq+AQpP86/b9++nuu513z79u3J+vr165P1Mq9Lv3Sz5/+hpLsuWvawpFfc/XpJrxT3ASwi2fC7+2uSzl+0eK2kncXtnZLuqbgvADXr9TP/te5+RpKKn0urawlAP9T+hZ+ZjZlZ28zauePEAfRPr+E/a2bLJKn4Od3pge6+zd1b7t4aHh7ucXMAqtZr+A9K2ljc3ijpQDXtAOiXbPjN7HlJ/yfp783slJl9Q9JmSXeY2e8k3VHcB7CIZMf53X1Dh9KXK+6lUblr66fOya/zfHyp3Dh+ziOPPJKs58b5y8iNhd9///3J+vvvv5+sT0xMXHJPc3LHVpw8eTJZv1zG+QFchgg/EBThB4Ii/EBQhB8IivADQYW5dHfu0OLDhw8n66nhvNxQX24oL3d57TrVOZRXt1zvZX5nEbDnB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgwozz56bJfuedd5J1d+9Yy12hqMlx/MvZbbfdlqynfmc5ZdZdLNjzA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQYcb5T5w4kayXOb87N10zmpE63z/3/yHniSeeSNb37t1b6vn7gT0/EBThB4Ii/EBQhB8IivADQRF+ICjCDwSVHec3sx2Svipp2t1vLJY9JumbkuYuhj/u7i/U1WQVXn311WQ9d/52hPO7Lzep8/0nJyeT60b4fXez5/+hpLsWWL7F3VcV/wY6+AA+Lht+d39N0vk+9AKgj8p85n/IzH5tZjvM7OrKOgLQF72Gf6ukL0haJemMpO93eqCZjZlZ28zaufnyAPRPT+F397Pu/qG7fyRpu6TVicduc/eWu7dyF7oE0D89hd/Mls27u07Sm9W0A6Bfuhnqe17S7ZKGzOyUpO9Jut3MVklySVOSvlVjjwBqkA2/u29YYPFzNfRSqzrP58dgSv1Oy/6+N23aVGr9QcARfkBQhB8IivADQRF+ICjCDwRF+IGgwly6Ozed86FDh5L11CmeudOF0YzU74xTuNnzA2ERfiAowg8ERfiBoAg/EBThB4Ii/EBQYcb5n3nmmWT97rvvTtanp6c71spO94zeTExMJOv79+/vWMud0pu76tTQ0FCyvhiw5weCIvxAUIQfCIrwA0ERfiAowg8ERfiBoMKM8998883J+vLly5P1s2fP9rzt9evXJ+t79uxJ1pcsWdLzthez3Ovy4IMPJuupc/Jz4/xbt25N1lesWJGsLwbs+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqOw4v5ktl7RL0mclfSRpm7v/wMyukfRTSSOSpiTd6+5/qK/Veo2PjyfrqSmZc+fzp84rl6TVq1cn67ljENatW5esp+SOQchdv35ycjJZT82XkDsfPzeOX2aa7dHR0WQ997pcDrrZ838g6bvufoOkf5T0bTMblfSwpFfc/XpJrxT3ASwS2fC7+xl3f724/a6k45Kuk7RW0s7iYTsl3VNXkwCqd0mf+c1sRNIXJf1S0rXufkaa/QMhaWnVzQGoT9fhN7NPS9or6Tvu/sdLWG/MzNpm1p6ZmemlRwA16Cr8ZvZJzQb/R+4+9y3NWTNbVtSXSVrwCpfuvs3dW+7eyl0UEUD/ZMNvs1+pPifpuLvPvwTuQUkbi9sbJR2ovj0AdbHcUI6ZfUnSIUlvaHaoT5LGNfu5/2eSVkg6Kelr7n4+9VytVsvb7XbZnhuR+sjy1FNPJdfdsmVLsp4bsurid1TLunWvX3bbS5emv2a66aabOtZ2796dXHexXpq71Wqp3W53NQaaHed398OSOj3Zly+lMQCDgyP8gKAIPxAU4QeCIvxAUIQfCIrwA0GFuXR3WamjE3PTf995553Jeu6U32effTZZL3Nqa5l169
527ojQF198MVlPjfODPT8QFuEHgiL8QFCEHwiK8ANBEX4gKMIPBJU9n79Ki/l8/kGWugT2sWPHkuseOFDvNVhSl+5euXJlct2xsbGq27nsXcr5/Oz5gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAoxvmBywjj/ACyCD8QFOEHgiL8QFCEHwiK8ANBEX4gqGz4zWy5mf2vmR03s9+Y2b8Vyx8zs/83s18V//6l/nYBVKWbSTs+kPRdd3/dzD4j6aiZvVzUtrj7f9bXHoC6ZMPv7mcknSluv2tmxyVdV3djAOp1SZ/5zWxE0hcl/bJY9JCZ/drMdpjZ1R3WGTOztpm1Z2ZmSjULoDpdh9/MPi1pr6TvuPsfJW2V9AVJqzT7zuD7C63n7tvcveXurdzcawD6p6vwm9knNRv8H7n7hCS5+1l3/9DdP5K0XdLq+toEULVuvu03Sc9JOu7uz8xbvmzew9ZJerP69gDUpZtv+2+V9ICkN8zsV8WycUkbzGyVJJc0JelbtXQIoBbdfNt/WNJC5we/UH07APqFI/yAoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANB9XWKbjObkfTOvEVDks71rYFLM6i9DWpfEr31qsre/tbdu7peXl/D/7GNm7XdvdVYAwmD2tug9iXRW6+a6o23/UBQhB8Iqunwb2t4+ymD2tug9iXRW68a6a3Rz/wAmtP0nh9AQxoJv5ndZWYnzOwtM3u4iR46MbMpM3ujmHm43XAvO8xs2szenLfsGjN72cx+V/xccJq0hnobiJmbEzNLN/raDdqM131/229mV0j6raQ7JJ2SdETSBnc/1tdGOjCzKUktd298TNjM/knSBUm73P3GYtl/SDrv7puLP5xXu/u/D0hvj0m60PTMzcWEMsvmzywt6R5J/6oGX7tEX/eqgdetiT3/aklvufvb7v4nST+RtLaBPgaeu78m6fxFi9dK2lnc3qnZ/zx916G3geDuZ9z99eL2u5LmZpZu9LVL9NWIJsJ/naTfz7t/SoM15bdL+oWZHTWzsaabWcC1xbTpc9OnL224n4tlZ27up4tmlh6Y166XGa+r1kT4F5r9Z5CGHG5195skrZH07eLtLbrT1czN/bLAzNIDodcZr6vWRPhPSVo+7/7nJJ1uoI8Fufvp4ue0pH0avNmHz85Nklr8nG64n78YpJmbF5pZWgPw2g3SjNdNhP+IpOvN7PNm9ilJX5d0sIE+PsbMriq+iJGZXSXpKxq82YcPStpY3N4o6UCDvfyVQZm5udPM0mr4tRu0Ga8bOcinGMr4L0lXSNrh7k/0vYkFmNnfaXZvL81OYvrjJnszs+cl3a7Zs77OSvqepP2SfiZphaSTkr7m7n3/4q1Db7dr9q3rX2ZunvuM3efeviTpkKQ3JH1ULB7X7Ofrxl67RF8b1MDrxhF+QFAc4QcERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+IKg/A1I8Tc1nqLidAAAAAElFTkSuQmCC\n",
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfoAAAH0CAYAAADVH+85AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAWJQAAFiUBSVIk8AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAHS1JREFUeJzt3X2wZHV5J/DvEwiCFIyBxFgWa3iJvAQDLEMU3xDHxJe8CAZYqUoMlRIqZs0qRC0txDgm2RSpokRRVy10Q0VTS1JYIeVKUIt3xLwNMSwVFFRGlyyENxmUtzD42z/6TBxv7p2Z291z+86vP5+qrnP7nPP075nD4X7v6T59TrXWAgD06Udm3QAAsPMIegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDo2O6zbmBnqKo7k+ybZOOMWwGAcR2Y5OHW2kGTvEiXQZ9k37322mu/I444Yr9ZNwIA47jtttvy2GOPTfw6Mw36qjogye8neXWS/ZPcneTyJO9rrX1ngpfeeMQRR+y3YcOGKXQJACtv7dq1ufnmmzdO+jozC/qqOiTJTUmemeSvknw1yfOTvDXJq6vqxa21B2bVHwD0YJYn4/2PjEL+La21k1tr72qtrUtyYZLDkvz3GfYGAF2YSdBX1cFJXpnRyXIfWbD4vUkeSfKGqtp7hVsDgK7M6oh+3TD9Qmvt+1svaK19N8mXkjw9yfEr3RgA9GRWn9EfNkxvX2L5HRkd8R+a5KqlXqSqljrb7vDxWwOAfszqiH7NMN20xPIt85+xAr0AQLdW6/foa5i2ba3UWlu7aPHoSP/YaTcFALuaWR3RbzliX7PE8n0XrAcAjGFWQf+1YXroEsufO0yX+gwfANgBswr6a4bpK6vqh3qoqn2SvDjJY0n+ZqUbA4CezCToW2vfSPKFjC7Y/+YFi9+XZO8kf9pae2SFWwOArszyZLz/mtElcC+qqlckuS3JC5K8PKO37N89w94AoAszuwTucFR/XJJLMgr4tyU5JMlFSV7oOvcAMLmZfr2utfZ/k/zmLHsAgJ7N8qY2AMBOJugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGO7z7oBAHbcZZddNnbtPvvsM9HYr3rVqyaqZzYc0QNAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAx9yPHmAF3XPPPRPVn3XWWWPXPuc5z5lobPej3zXN7Ii+qjZWVVviMdn/CQBAktkf0W9K8oFF5n9vpRsBgB7NOugfaq2tn3EPANAtJ+MBQMdmfUT/tKr69STPSfJIkluSXN9ae2q2bQFAH2Yd9M9K8qkF8+6sqt9srV23veKq2rDEosMn7gwAOjDLt+7/JMkrMgr7vZP8bJKPJzkwyV9X1dGzaw0A+jCzI/rW2vsWzLo1yZuq6ntJ3pZkfZLXbec11i42fzjSP3YKbQLALm01noz3sWF6wky7AIAOrMagv3eY7j3TLgCgA6sx6F84TL850y4AoAMzCfqqOrKq9ltk/k8l+fDw9NMr2xUA9GdWJ+OdluRdVXVNkjuTfDfJIUl+KcmeSa5IcsGMegOAbswq6K9JcliS/5zRW/V7J3koyY0Zfa/+U621NqPeAKAbMwn64WI4270gDkBvzjzzzInqN23aNHbtkUceOdHY7JpW48l4AMCUCHoA6JigB4COCXoA6JigB4COCXoA6JigB4COCXoA6JigB4COCXoA6JigB4COCXoA6JigB4COCXoA6JigB4COzeR+9MAPXH/99WPXHn/88RONvccee0xUP6+efPLJsWvvvvvuicZes2bN2LXvfOc7JxqbXZMjegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI4JegDomKAHgI65TS1M6PLLL5+o/h3veMfYtR//+McnGnvdunUT1c+rz33uc2PX/uM//uNEY7/73e8eu/boo4+eaGx2TY7oAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBj7kcPSb71rW+NXfuud71rip0sz3HHHTezsXdlDz744ET1Z5111pQ6gZ3PET0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DH3KaWLjz11FMT1V944YVj127cuHGisb/4xS+OXbvvvvtONPa8evLJJyeqf+CBB6bUyfKdeeaZMxubXdNUjuir6tSq+lBV3VBVD1dVq6pPb6fmRVV1RVU9WFWPVtUtVXV2Ve02jZ4AgOkd0Z+X5Ogk30tyV5LDt7VyVZ2U5DNJHk/y50keTPIrSS5M8uIkp02pLwCYa9P6jP6cJIcm2TfJb29rxaraN8nFSZ5KcmJr7Y2ttXckOSbJl5OcWlWnT6kvAJhrUwn61to1rbU7WmttB1Y/NclPJLm0tfYPW73G4xm9M5Bs548FAGDHzOKs+3XD9MpFll2f5NEkL6qqp61cSwDQp1kE/WHD9PaFC1prm5PcmdG5AwevZFMA0KNZfL1uzTDdtMTyLfOfsb0XqqoNSyza5smAADAvVuMFc2qY7sjn/QDANsziiH7LEfuaJZbvu2C9JbXW1i42fzjSP3b5rQFAX2ZxRP+1YXrowgVVtXuSg5JsTvLNlWwKAHo0i6C/epi+epFlJyR5epKbWmtPrFxLANCnWQT9ZUnuT3J6VR23ZWZV7ZnkD4enH51BXwDQnal8Rl9VJyc5eXj6rGH6wqq6ZPj5/tba25OktfZwVZ2VUeBfW1WXZnQJ3Ndm9NW7yzK6LC4AMKFpnYx3TJIzFsw7OD/4Lvy3krx9y4LW2uVV9bIk705ySpI9k3w9ye8muWgHr7AHAGzHVIK+tbY+yfpl1nwpyS9OY3wAYHHuR8+qsXnz5rFrzz333InGvuiii8auPemkkyYa+6UvfelE9SzfVVddNbOx16xZ6pvFO2afffaZUifMi9V4wRwAYEoEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0zG1qWTUuvvjisWsvuOCCKXayPL/8y788s7EZz6233jqzsX/nd35novr99ttvSp
0wLxzRA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DH3I+eqZn0Ht/vfe97p9TJ8h122GFj177+9a+fYif07oADDph1C8wZR/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdc5tafshjjz02du0pp5wy0dj33Xff2LUHHnjgRGNfc801Y9du3rx5orHvuuuusWt32223icae5L/3pNasWTNR/U033TR27QUXXDDR2K21sWuPOeaYicaG5XJEDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdcz96fsi55547du0dd9wx0dhVNXbtpk2bJhr7/e9//9i1V1111URjf+Mb3xi7do899pho7Pvvv3/s2knuyZ4kBx988ET1d95550T1k5hkX/37v//7icY+6qijxq79zne+M9HYz372s8euvfHGGyca+yUveclE9fNsKkf0VXVqVX2oqm6oqoerqlXVp5dY98Bh+VKPS6fREwAwvSP685IcneR7Se5KcvgO1PxTkssXmX/rlHoCgLk3raA/J6OA/3qSlyW5ZgdqvtJaWz+l8QGARUwl6Ftr/x7sk3x2BQBM1yxPxnt2Vf1Wkv2TPJDky621W2bYDwB0Z5ZB/wvD499V1bVJzmitfXtHXqCqNiyxaEfOEQCA7s3ie/SPJvmDJGuT/Njw2PK5/olJrqqqvWfQFwB0Z8WP6Ftr9yb5vQWzr6+qVya5MckLkpyZ5IM78FprF5s/HOkfO2GrALDLWzVXxmutbU7yieHpCbPsBQB6sWqCfnDfMPXWPQBMwWoL+uOH6Tdn2gUAdGLFg76qXlBV/+EC3VW1LqML7yTJopfPBQCWZyon41XVyUlOHp4+a5i+sKouGX6+v7X29uHnP05y5PBVuruGeUclWTf8/J7W2k3T6AsA5t20zro/JskZC+YdPDyS5FtJtgT9p5K8LsnPJXlNkh9N8q9J/iLJh1trN0ypJwCYe9O6BO76JOt3cN1PJvnkNMYFALatJr2n9GpUVRuOPfbYYzdsWOrCef268sorJ6o//fTTx659+OGHJxp7V3XMMcdMVH/cccdNqZOVdfHFF09U774Y4zn00EPHrt1jj/9wetSy/PzP//zYtdddd91EY8/j7/O1a9fm5ptvvnmpa8bsqNV21j0AMEWCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6JugBoGOCHgA6NpX70bN6nH/++RPVz/JWs8985jPHrn3rW9860divf/3rx6494IADJhp70luHzsqFF144Uf0k2zxJrrjiirFrd9ttt4nGfsc73jF27XnnnTfR2DfffPPYtWvXTnS309x1111j15577rkTjc34HNEDQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMcEPQB0TNADQMfcj74zJ5100kT1TzzxxNi1v/ZrvzbR2OvWrRu79md+5mcmGpvl23vvvSeqn/Se8JP48R//8Ynq/+iP/mhKnSzfS17ykpmN/dznPndmYzM+R/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdc5vazpxzzjkzrWd+/Mu//MtE9TfccMNE9YcccsjYtX/2Z3820diwK3FEDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdE/QA0DFBDwAdcz96YCwf+chHJqp/6KGHJqp//PHHx6593vOeN9HYsCuZ+Ii+qvavqjOr6i+r6utV9VhVbaqqG6vqjVW16BhV9aKquqKqHqyqR6vqlqo6u6p2m7QnAGBkGkf0pyX5aJK7k1yT5NtJfjLJryb5RJLXVNVprbW2paCqTkrymSSPJ/nzJA8m+ZUkFyZ58fCaAMCEphH0tyd5bZLPtda+v2VmVZ2b5O+SnJJR6H9mmL9vkouTPJXkxNbaPwzz35Pk6iSnVtXprbVLp9AbAMy1id+6b61d3Vr77NYhP8y/J8nHhqcnbrXo1CQ/keTSLSE/rP94kvOGp789aV8AwM4/6/7JYbp5q3nrhumVi6x/fZJHk7yoqp62MxsDgHmw0866r6rdk/zG8HTrUD9smN6+sKa1trmq7kxyZJKDk9y2nTE2LLHo8OV1CwB92plH9OcneV6SK1prn99q/pphummJui3zn7GzGgOAebFTjuir6i1J3pbkq0nesNzyYdq2uVaS1traJcbfkOTYZY4LAN2Z+hF9Vb05yQeT/HOSl7fWHlywypYj9jVZ3L4L1gMAxjTVoK+qs5N8OMmtGYX8PYus9rVheugi9bsnOSijk/e+Oc3eAGAeTS3oq+qdGV3w5isZhfy9S6x69TB99SLLTkjy9CQ3tdaemFZvADCvphL0w8Vuzk+yIckrWmv3b2P1y5Lcn+T0qjpuq9fYM8kfDk8/Oo2+AGDeTXwyXlWdkeT3M7rS3Q1J3lJVC1fb2Fq7JElaaw9X1VkZBf61VXVpRpfAfW1GX727LKPL4gIAE5rGWfcHDdPdkpy9xDrXJblky5PW2uVV9bIk787oErl7Jvl6kt9NctHW18UHAMY3cdC31tYnWT9G3ZeS/OKk4wOz8dnPfnam4z//+c8fu3aPPfaYYiewuu3sS+ACADMk6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADo28f3ogfn0yCOPzHT8Qw89dOza3Xf3q4/54YgeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY4IeADom6AGgY+7VCIzlqquumqj+oosumqj+TW9600T1MC8c0QNAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAxwQ9AHRM0ANAx9yPHhjLQQcdNFH9hRdeOKVOgG1xRA8AHRP0ANAxQQ8AHRP0ANAxQQ8AHRP0ANAxQQ8AHRP0ANAxQQ8AHRP0ANAxQQ8AHRP0ANAxQQ8AHRP0ANAxQQ8AHZs46Ktq/6o6s6r+sqq+XlWPVdWmqrqxqt5YVT+yYP0Dq6pt43HppD0BACO7T+E1Tkvy0SR3J7kmybeT/GSSX03yiSSvqarTWmttQd0/Jbl8kde7dQo9AQCZTtDfnuS1ST7XWvv+lplVdW6Sv0tySkah/5kFdV9pra2fwvgAwBImfuu+tXZ1a+2zW4f8MP+eJB8bnp446TgAwPJN44h+W54cppsXWfbsqvqtJPsne
SDJl1trt+zkfgBgruy0oK+q3ZP8xvD0ykVW+YXhsXXNtUnOaK19e2f1BQDzZGce0Z+f5HlJrmitfX6r+Y8m+YOMTsT75jDvqCTrk7w8yVVVdUxr7ZHtDVBVG5ZYdPi4TQNAT3bK9+ir6i1J3pbkq0nesPWy1tq9rbXfa63d3Fp7aHhcn+SVSf42yU8nOXNn9AUA82bqR/RV9eYkH0zyz0le0Vp7cEfqWmubq+oTSV6Q5IThNbZXs3aJHjYkOXaHmwaATk31iL6qzk7y4Yy+C//y4cz75bhvmO49zb4AYF5NLeir6p1JLkzylYxC/t4xXub4YfrNba4FAOyQqQR9Vb0no5PvNmT0dv3921j3BVW1xyLz1yU5Z3j66Wn0BQDzbuLP6KvqjCS/n+SpJDckeUtVLVxtY2vtkuHnP05y5PBVuruGeUclWTf8/J7W2k2T9gUATOdkvIOG6W5Jzl5ineuSXDL8/Kkkr0vyc0lek+RHk/xrkr9I8uHW2g1T6AkAyBSCfrhe/fplrP/JJJ+cdFwAYPvcjx4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBjgh4AOiboAaBj1VqbdQ9TV1UP7LXXXvsdccQRs24FAMZy22235bHHHnuwtbb/JK/Ta9DfmWTfJBuXWOXwYfrVFWmoD7bZeGy38dhuy2ebjWc1b7cDkzzcWjtokhfpMui3p6o2JElrbe2se9lV2Gbjsd3GY7stn202nnnYbj6jB4COCXoA6JigB4COCXoA6JigB4COzeVZ9wAwLxzRA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DH5iroq+qAqvqfVfX/quqJqtpYVR+oqh+bdW+r1bCN2hKPe2bd36xU1alV9aGquqGqHh62x6e3U/Oiqrqiqh6sqker6paqOruqdlupvmdtOdutqg7cxr7XqurSle5/Fqpq/6o6s6r+sqq+XlWPVdWmqrqxqt5YVYv+Hp/3/W25263n/W33WTewUqrqkCQ3JXlmkr/K6N7Dz0/y1iSvrqoXt9YemGGLq9mmJB9YZP73VrqRVeS8JEdntA3uyg/uab2oqjopyWeSPJ7kz5M8mORXklyY5MVJTtuZza4iy9pug39Kcvki82+dYl+r2WlJPprk7iTXJPl2kp9M8qtJPpHkNVV1Wtvq6mf2tyRjbLdBf/tba20uHkk+n6Ql+W8L5r9/mP+xWfe4Gh9JNibZOOs+VtsjycuTPDdJJTlx2Ic+vcS6+ya5N8kTSY7bav6eGf3x2ZKcPut/0yrcbgcOyy+Zdd8z3mbrMgrpH1kw/1kZhVdLcspW8+1v4223bve3uXjrvqoOTvLKjELrIwsWvzfJI0neUFV7r3Br7KJaa9e01u5ow2+I7Tg1yU8kubS19g9bvcbjGR3hJslv74Q2V51lbjeStNaubq19trX2/QXz70nyseHpiVstsr9lrO3WrXl5637dMP3CIv/Rv1tVX8roD4Hjk1y10s3tAp5WVb+e5DkZ/VF0S5LrW2tPzbatXcaW/e/KRZZdn+TRJC+qqqe11p5YubZ2Gc+uqt9Ksn+SB5J8ubV2y4x7Wi2eHKabt5pnf9u+xbbbFt3tb/MS9IcN09uXWH5HRkF/aAT9Yp6V5FML5t1ZVb/ZWrtuFg3tYpbc/1prm6vqziRHJjk4yW0r2dgu4heGx7+rqmuTnNFa+/ZMOloFqmr3JL8xPN061O1v27CN7bZFd/vbXLx1n2TNMN20xPIt85+xAr3sav4kySsyCvu9k/xsko9n9HnWX1fV0bNrbZdh/xvPo0n+IMnaJD82PF6W0YlVJya5as4/bjs/yfOSXNFa+/xW8+1v27bUdut2f5uXoN+eGqY+N1ygtfa+4bOuf22tPdpau7W19qaMTmLcK8n62XbYBfvfIlpr97bWfq+1dnNr7aHhcX1G7779bZKfTnLmbLucjap6S5K3ZfTtoTcst3yYzt3+tq3t1vP+Ni9Bv+Uv2DVLLN93wXps35aTWU6YaRe7BvvfFLXWNmf09ahkDve/qnpzkg8m+eckL2+tPbhgFfvbInZguy2qh/1tXoL+a8P00CWWP3eYLvUZPv/RvcN0l3wra4Utuf8NnxcelNFJQd9cyaZ2cfcN07na/6rq7CQfzug73S8fziBfyP62wA5ut23Zpfe3eQn6a4bpKxe5GtI+GV1A4rEkf7PSje3CXjhM5+aXxQSuHqavXmTZCUmenuSmOT4DehzHD9O52f+q6p0ZXfDmKxmF1b1LrGp/28oyttu27NL721wEfWvtG0m+kNEJZG9esPh9Gf2V9qettUdWuLVVraqOrKr9Fpn/Uxn9dZwk27zsK0mSy5Lcn+T0qjpuy8yq2jPJHw5PPzqLxlazqnpBVe2xyPx1Sc4Zns7F/ldV78noJLINSV7RWrt/G6vb3wbL2W497281L9etWOQSuLcleUFGV+q6PcmLmkvg/pCqWp/kXRm9I3Jnku8mOSTJL2V0la0rkryutfZvs+pxVqrq5CQnD0+fleRVGf21f8Mw7/7W2tsXrH9ZRpckvTSjS5K+NqOvQl2W5L/Mw0VklrPdhq80HZnk2owul5skR+UH3xN/T2ttS3B1q6rOSHJJkqeSfCiLf7a+sbV2yVY1c7+/LXe7db2/zfrSfCv5SPKfMvq62N1J/i3JtzI6OWO/Wfe2Gh8ZfbXkf2V0hupDGV1k4r4kX8zoe6g16x5nuG3WZ3TW8lKPjYvUvDijP46+k9FHRf8noyOF3Wb971mN2y3JG5P874yuaPm9jC7p+u2Mrt3+0ln/W1bRNmtJrrW/Tbbdet7f5uaIHgDm0Vx8Rg8A80rQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdEzQA0DHBD0AdOz/A6mZ1LW3XV/7AAAAAElFTkSuQmCC\n",
       "text/plain": [
        "<Figure size 432x288 with 1 Axes>"
       ]
      },
      "metadata": {
+      "image/png": {
+       "height": 250,
+       "width": 253
+      },
       "needs_background": "light"
      },
      "output_type": "display_data"
@@ -1823,7 +1842,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 13,
+   "execution_count": 151,
    "metadata": {},
    "outputs": [
     {
@@ -1848,7 +1867,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 14,
+   "execution_count": 152,
    "metadata": {},
    "outputs": [
     {
@@ -1866,7 +1885,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 15,
+   "execution_count": 153,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -1902,7 +1921,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 16,
+   "execution_count": 154,
    "metadata": {},
    "outputs": [
     {
@@ -2054,6 +2073,13 @@
     "plt.plot(np.arange(1, num_epochs+1), history_model[\"val_acc\"], \"red\")"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Adding regularization"
+   ]
+  },
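+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "In Keras, l2 weight regularization can be attached to individual layers via the `kernel_regularizer` argument. A minimal sketch (the factor 0.01 is an arbitrary illustrative choice):\n",
+    "\n",
+    "```python\n",
+    "from keras import regularizers\n",
+    "\n",
+    "model.add(Dense(64, activation=\"relu\",\n",
+    "                kernel_regularizer=regularizers.l2(0.01)))\n",
+    "```"
+   ]
+  },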
   {
    "cell_type": "code",
    "execution_count": 32,
@@ -2167,7 +2193,7 @@
     }
    ],
    "source": [
-    "# Adding some regularization\n",
+    "# Adding l2 regularization\n",
     "# Building the keras model\n",
     "from keras.models import Sequential\n",
     "from keras.layers import Dense\n",
@@ -2189,6 +2215,8 @@
     "                  optimizer=\"rmsprop\", metrics=[\"accuracy\"])\n",
     "    return model\n",
     "\n",
+    "model = mnist_model()\n",
+    "\n",
     "num_epochs = 50\n",
     "model_run = model.fit(X_train_prep, y_train_onehot, epochs=num_epochs,\n",
     "                      batch_size=512)"
@@ -2212,6 +2240,189 @@
     "print(\"The [loss, accuracy] on test dataset are: \" , model.evaluate(X_test_prep, y_test_onehot))"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Another way to add regularization and to make the network more robust is to add so-called \"Dropout\" layers. When we add dropout to a layer, a specified percentage of the units in that layer is randomly switched off during training. \n",
+    "\n",
+    "### Exercise: Add dropout instead of l2 regularization in the network above"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Adding dropout is easy in keras\n",
+    "# We import the Dropout layer and add it to the model as follows:\n",
+    "# model.add(Dropout(0.5)) randomly drops 50% of that layer's units\n",
+    "\n",
+    "\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 155,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Epoch 1/50\n",
+      "60000/60000 [==============================] - 5s 85us/step - loss: 0.7865 - acc: 0.7668\n",
+      "Epoch 2/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.3798 - acc: 0.8885\n",
+      "Epoch 3/50\n",
+      "60000/60000 [==============================] - 2s 26us/step - loss: 0.3082 - acc: 0.9102\n",
+      "Epoch 4/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.2685 - acc: 0.9217\n",
+      "Epoch 5/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.2442 - acc: 0.9288\n",
+      "Epoch 6/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.2250 - acc: 0.9342\n",
+      "Epoch 7/50\n",
+      "60000/60000 [==============================] - 2s 26us/step - loss: 0.2118 - acc: 0.9372\n",
+      "Epoch 8/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.2007 - acc: 0.9394\n",
+      "Epoch 9/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1931 - acc: 0.9419\n",
+      "Epoch 10/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1837 - acc: 0.9443\n",
+      "Epoch 11/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1766 - acc: 0.9462\n",
+      "Epoch 12/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.1748 - acc: 0.9477\n",
+      "Epoch 13/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1687 - acc: 0.9492\n",
+      "Epoch 14/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1607 - acc: 0.9515\n",
+      "Epoch 15/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1608 - acc: 0.9502\n",
+      "Epoch 16/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1571 - acc: 0.9528\n",
+      "Epoch 17/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.1526 - acc: 0.9538\n",
+      "Epoch 18/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1474 - acc: 0.9554\n",
+      "Epoch 19/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1471 - acc: 0.9545\n",
+      "Epoch 20/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1446 - acc: 0.9561\n",
+      "Epoch 21/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1408 - acc: 0.9572\n",
+      "Epoch 22/50\n",
+      "60000/60000 [==============================] - 2s 27us/step - loss: 0.1360 - acc: 0.9583\n",
+      "Epoch 23/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1358 - acc: 0.9581\n",
+      "Epoch 24/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1320 - acc: 0.9597\n",
+      "Epoch 25/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1315 - acc: 0.9589\n",
+      "Epoch 26/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1285 - acc: 0.9605\n",
+      "Epoch 27/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1283 - acc: 0.9602\n",
+      "Epoch 28/50\n",
+      "60000/60000 [==============================] - 2s 28us/step - loss: 0.1272 - acc: 0.9601\n",
+      "Epoch 29/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1262 - acc: 0.9603\n",
+      "Epoch 30/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1236 - acc: 0.9618\n",
+      "Epoch 31/50\n",
+      "60000/60000 [==============================] - 2s 29us/step - loss: 0.1204 - acc: 0.9623\n",
+      "Epoch 32/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1201 - acc: 0.9623\n",
+      "Epoch 33/50\n",
+      "60000/60000 [==============================] - 2s 32us/step - loss: 0.1199 - acc: 0.9618\n",
+      "Epoch 34/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1165 - acc: 0.9637\n",
+      "Epoch 35/50\n",
+      "60000/60000 [==============================] - 2s 32us/step - loss: 0.1173 - acc: 0.9630\n",
+      "Epoch 36/50\n",
+      "60000/60000 [==============================] - 2s 32us/step - loss: 0.1161 - acc: 0.9638\n",
+      "Epoch 37/50\n",
+      "60000/60000 [==============================] - 2s 34us/step - loss: 0.1152 - acc: 0.9635\n",
+      "Epoch 38/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1150 - acc: 0.9639\n",
+      "Epoch 39/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1148 - acc: 0.9631\n",
+      "Epoch 40/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1116 - acc: 0.9641\n",
+      "Epoch 41/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1106 - acc: 0.9651\n",
+      "Epoch 42/50\n",
+      "60000/60000 [==============================] - 2s 33us/step - loss: 0.1109 - acc: 0.9645\n",
+      "Epoch 43/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1105 - acc: 0.9647\n",
+      "Epoch 44/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1071 - acc: 0.9667\n",
+      "Epoch 45/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1064 - acc: 0.9662\n",
+      "Epoch 46/50\n",
+      "60000/60000 [==============================] - 2s 32us/step - loss: 0.1060 - acc: 0.9666\n",
+      "Epoch 47/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1071 - acc: 0.9668\n",
+      "Epoch 48/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1066 - acc: 0.9666\n",
+      "Epoch 49/50\n",
+      "60000/60000 [==============================] - 2s 30us/step - loss: 0.1060 - acc: 0.9660\n",
+      "Epoch 50/50\n",
+      "60000/60000 [==============================] - 2s 31us/step - loss: 0.1033 - acc: 0.9676\n"
+     ]
+    }
+   ],
+   "source": [
+    "# Solution\n",
+    "# Adding Dropout\n",
+    "# Building the keras model\n",
+    "from keras.models import Sequential\n",
+    "from keras.layers import Dense, Dropout\n",
+    "\n",
+    "def mnist_model():\n",
+    "    \n",
+    "    model = Sequential()\n",
+    "\n",
+    "    model.add(Dense(64, input_shape=(28*28,), activation=\"relu\"))\n",
+    "              \n",
+    "    model.add(Dropout(0.4))\n",
+    "\n",
+    "    model.add(Dense(64, activation=\"relu\"))\n",
+    "\n",
+    "    model.add(Dense(10, activation=\"softmax\"))\n",
+    "\n",
+    "    model.compile(loss=\"categorical_crossentropy\",\n",
+    "                  optimizer=\"rmsprop\", metrics=[\"accuracy\"])\n",
+    "              \n",
+    "    return model\n",
+    "\n",
+    "model = mnist_model()\n",
+    "\n",
+    "num_epochs = 50\n",
+    "model_run = model.fit(X_train_prep, y_train_onehot, epochs=num_epochs,\n",
+    "                      batch_size=512)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 156,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "10000/10000 [==============================] - 2s 199us/step\n",
+      "The [loss, accuracy] on test dataset are:  [0.09923268887351733, 0.9732]\n"
+     ]
+    }
+   ],
+   "source": [
+    "print(\"The [loss, accuracy] on test dataset are: \" , model.evaluate(X_test_prep, y_test_onehot))"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -2253,2225 +2464,267 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Will remove the example below."
+    "## CNN example"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "**This beer example is not good for neural networks. Basically the dataset is far too small**"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 3,
-   "metadata": {},
-   "outputs": [
-    {
-     "data": {
-      "text/plain": [
-       "(225, 4)"
-      ]
-     },
-     "execution_count": 3,
-     "metadata": {},
-     "output_type": "execute_result"
-    }
-   ],
-   "source": [
-    "# Revisiting the beer example\n",
-    "\n",
-    "import pandas as pd\n",
-    "from sklearn.model_selection import train_test_split\n",
-    "from sklearn.preprocessing import MinMaxScaler\n",
-    "from keras.models import Sequential\n",
-    "import numpy as np\n",
-    "import matplotlib.pyplot as plt\n",
+    "For this example we will work with a dataset called Fashion-MNIST, which is quite similar to the MNIST data used above.\n",
+    "> Fashion-MNIST is a dataset of Zalando's article images—consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes. We intend Fashion-MNIST to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms. It shares the same image size and structure of training and testing splits.\n",
+    "source: https://github.com/zalandoresearch/fashion-mnist\n",
     "\n",
-    "# Loading the beer data\n",
-    "beer = pd.read_csv(\"beers.csv\")\n",
+    "The 10 classes of this dataset are:\n",
     "\n",
-    "# Extracting the features and labels\n",
-    "#beer_data.describe()\n",
-    "features = beer.iloc[:, :-1]\n",
-    "labels = beer.iloc[:, -1]\n",
-    "features.shape"
+    "| Label | Item |\n",
+    "| --- | --- |\n",
+    "| 0 | T-shirt/top |\n",
+    "| 1 | Trouser |\n",
+    "| 2 | Pullover |\n",
+    "| 3 | Dress |\n",
+    "| 4 | Coat |\n",
+    "| 5 | Sandal |\n",
+    "| 6 | Shirt |\n",
+    "| 7 | Sneaker |\n",
+    "| 8 | Bag |\n",
+    "| 9 | Ankle boot |"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 75,
+   "execution_count": 137,
    "metadata": {},
    "outputs": [],
    "source": [
-    "# Revisiting the beer example\n",
-    "\n",
-    "# Loading and preparing the data\n",
-    "\n",
-    "import pandas as pd\n",
-    "from sklearn.model_selection import train_test_split\n",
-    "from sklearn.preprocessing import MinMaxScaler\n",
-    "\n",
-    "# Loading the beer data\n",
-    "beer = pd.read_csv(\"beers.csv\")\n",
+    "# Loading the dataset in keras\n",
+    "# Later you can explore and play with other datasets which come with Keras\n",
+    "from keras.datasets import fashion_mnist\n",
     "\n",
-    "# Extracting the features and labels\n",
-    "#beer_data.describe()\n",
-    "features = beer.iloc[:, :-1]\n",
-    "labels = beer.iloc[:, -1]\n",
+    "# Loading the train and test data\n",
     "\n",
-    "# Here we split the dataset into training (70%) and validation sets (30%) \n",
-    "#X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.5, random_state=42)\n",
-    "X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.3)\n",
+    "(X_train, y_train), (X_test, y_test) = fashion_mnist.load_data()\n",
     "\n",
-    "# Scaling the data\n",
-    "# NOTE: The features should be normalized before being fed into the neural network\n",
-    "scaling = MinMaxScaler()\n",
-    "scaling.fit(X_train)\n",
-    "\n",
-    "X_train_scaled = scaling.transform(X_train)\n",
-    "X_test_scaled = scaling.transform(X_test)"
+    "items =['T-shirt/top', 'Trouser', \n",
+    "        'Pullover', 'Dress', \n",
+    "        'Coat', 'Sandal', \n",
+    "        'Shirt', 'Sneaker',\n",
+    "        'Bag', 'Ankle boot']"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 82,
+   "execution_count": 138,
    "metadata": {},
    "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "Train on 157 samples, validate on 68 samples\n",
-      "Epoch 1/1000\n",
-      "157/157 [==============================] - 1s 6ms/step - loss: 0.6730 - acc: 0.5350 - val_loss: 0.6769 - val_acc: 0.5147\n",
-      "Epoch 2/1000\n",
-      "157/157 [==============================] - 0s 406us/step - loss: 0.6704 - acc: 0.5350 - val_loss: 0.6754 - val_acc: 0.5147\n",
-      "Epoch 3/1000\n",
-      "157/157 [==============================] - 0s 256us/step - loss: 0.6693 - acc: 0.5350 - val_loss: 0.6740 - val_acc: 0.5147\n",
-      "Epoch 4/1000\n",
-      "157/157 [==============================] - 0s 215us/step - loss: 0.6679 - acc: 0.5350 - val_loss: 0.6728 - val_acc: 0.5147\n",
-      "Epoch 5/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.6668 - acc: 0.5350 - val_loss: 0.6716 - val_acc: 0.5147\n",
-      "Epoch 6/1000\n",
-      "157/157 [==============================] - 0s 107us/step - loss: 0.6658 - acc: 0.5350 - val_loss: 0.6704 - val_acc: 0.5147\n",
-      "Epoch 7/1000\n",
-      "157/157 [==============================] - 0s 303us/step - loss: 0.6652 - acc: 0.5350 - val_loss: 0.6693 - val_acc: 0.5147\n",
-      "Epoch 8/1000\n",
-      "157/157 [==============================] - 0s 98us/step - loss: 0.6637 - acc: 0.5350 - val_loss: 0.6682 - val_acc: 0.5147\n",
-      "Epoch 9/1000\n",
-      "157/157 [==============================] - 0s 92us/step - loss: 0.6626 - acc: 0.5350 - val_loss: 0.6670 - val_acc: 0.5147\n",
-      "Epoch 10/1000\n",
-      "157/157 [==============================] - 0s 90us/step - loss: 0.6616 - acc: 0.5350 - val_loss: 0.6657 - val_acc: 0.5147\n",
-      "Epoch 11/1000\n",
-      "157/157 [==============================] - 0s 92us/step - loss: 0.6605 - acc: 0.5350 - val_loss: 0.6644 - val_acc: 0.5147\n",
-      "Epoch 12/1000\n",
-      "157/157 [==============================] - 0s 305us/step - loss: 0.6596 - acc: 0.5350 - val_loss: 0.6633 - val_acc: 0.5147\n",
-      "Epoch 13/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.6587 - acc: 0.5350 - val_loss: 0.6622 - val_acc: 0.5147\n",
-      "Epoch 14/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.6578 - acc: 0.5350 - val_loss: 0.6612 - val_acc: 0.5147\n",
-      "Epoch 15/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.6567 - acc: 0.5350 - val_loss: 0.6601 - val_acc: 0.5147\n",
-      "Epoch 16/1000\n",
-      "157/157 [==============================] - 0s 179us/step - loss: 0.6558 - acc: 0.5350 - val_loss: 0.6591 - val_acc: 0.5147\n",
-      "Epoch 17/1000\n",
-      "157/157 [==============================] - 0s 98us/step - loss: 0.6551 - acc: 0.5350 - val_loss: 0.6580 - val_acc: 0.5147\n",
-      "Epoch 18/1000\n",
-      "157/157 [==============================] - 0s 106us/step - loss: 0.6540 - acc: 0.5350 - val_loss: 0.6570 - val_acc: 0.5147\n",
-      "Epoch 19/1000\n",
-      "157/157 [==============================] - 0s 97us/step - loss: 0.6531 - acc: 0.5350 - val_loss: 0.6559 - val_acc: 0.5147\n",
-      "Epoch 20/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.6523 - acc: 0.5350 - val_loss: 0.6549 - val_acc: 0.5147\n",
-      "Epoch 21/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.6512 - acc: 0.5350 - val_loss: 0.6537 - val_acc: 0.5147\n",
-      "Epoch 22/1000\n",
-      "157/157 [==============================] - 0s 288us/step - loss: 0.6506 - acc: 0.5350 - val_loss: 0.6527 - val_acc: 0.5147\n",
-      "Epoch 23/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.6496 - acc: 0.5414 - val_loss: 0.6517 - val_acc: 0.5147\n",
-      "Epoch 24/1000\n",
-      "157/157 [==============================] - 0s 257us/step - loss: 0.6486 - acc: 0.5414 - val_loss: 0.6506 - val_acc: 0.5147\n",
-      "Epoch 25/1000\n",
-      "157/157 [==============================] - 0s 95us/step - loss: 0.6477 - acc: 0.5478 - val_loss: 0.6495 - val_acc: 0.5147\n",
-      "Epoch 26/1000\n",
-      "157/157 [==============================] - 0s 112us/step - loss: 0.6466 - acc: 0.5414 - val_loss: 0.6483 - val_acc: 0.5147\n",
-      "Epoch 27/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.6458 - acc: 0.5541 - val_loss: 0.6472 - val_acc: 0.5147\n",
-      "Epoch 28/1000\n",
-      "157/157 [==============================] - 0s 257us/step - loss: 0.6447 - acc: 0.5541 - val_loss: 0.6461 - val_acc: 0.5147\n",
-      "Epoch 29/1000\n",
-      "157/157 [==============================] - 0s 134us/step - loss: 0.6437 - acc: 0.5541 - val_loss: 0.6449 - val_acc: 0.5147\n",
-      "Epoch 30/1000\n",
-      "157/157 [==============================] - 0s 111us/step - loss: 0.6427 - acc: 0.5669 - val_loss: 0.6438 - val_acc: 0.5147\n",
-      "Epoch 31/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.6417 - acc: 0.5669 - val_loss: 0.6426 - val_acc: 0.5147\n",
-      "Epoch 32/1000\n",
-      "157/157 [==============================] - 0s 103us/step - loss: 0.6407 - acc: 0.5669 - val_loss: 0.6414 - val_acc: 0.5147\n",
-      "Epoch 33/1000\n",
-      "157/157 [==============================] - 0s 269us/step - loss: 0.6394 - acc: 0.5732 - val_loss: 0.6401 - val_acc: 0.5294\n",
-      "Epoch 34/1000\n",
-      "157/157 [==============================] - 0s 119us/step - loss: 0.6384 - acc: 0.5732 - val_loss: 0.6387 - val_acc: 0.5294\n",
-      "Epoch 35/1000\n",
-      "157/157 [==============================] - 0s 92us/step - loss: 0.6371 - acc: 0.5732 - val_loss: 0.6373 - val_acc: 0.5294\n",
-      "Epoch 36/1000\n",
-      "157/157 [==============================] - 0s 298us/step - loss: 0.6361 - acc: 0.5796 - val_loss: 0.6360 - val_acc: 0.5294\n",
-      "Epoch 37/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.6349 - acc: 0.5860 - val_loss: 0.6347 - val_acc: 0.5441\n",
-      "Epoch 38/1000\n",
-      "157/157 [==============================] - 0s 122us/step - loss: 0.6336 - acc: 0.5860 - val_loss: 0.6333 - val_acc: 0.5441\n",
-      "Epoch 39/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.6323 - acc: 0.5860 - val_loss: 0.6318 - val_acc: 0.5441\n",
-      "Epoch 40/1000\n",
-      "157/157 [==============================] - 0s 321us/step - loss: 0.6310 - acc: 0.5860 - val_loss: 0.6302 - val_acc: 0.5441\n",
-      "Epoch 41/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.6297 - acc: 0.5924 - val_loss: 0.6286 - val_acc: 0.5441\n",
-      "Epoch 42/1000\n",
-      "157/157 [==============================] - 0s 229us/step - loss: 0.6285 - acc: 0.5924 - val_loss: 0.6273 - val_acc: 0.5441\n",
-      "Epoch 43/1000\n",
-      "157/157 [==============================] - 0s 201us/step - loss: 0.6271 - acc: 0.5924 - val_loss: 0.6258 - val_acc: 0.5441\n",
-      "Epoch 44/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.6260 - acc: 0.5924 - val_loss: 0.6243 - val_acc: 0.5441\n",
-      "Epoch 45/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.6245 - acc: 0.5987 - val_loss: 0.6228 - val_acc: 0.5588\n",
-      "Epoch 46/1000\n",
-      "157/157 [==============================] - 0s 113us/step - loss: 0.6234 - acc: 0.6051 - val_loss: 0.6213 - val_acc: 0.5588\n",
-      "Epoch 47/1000\n",
-      "157/157 [==============================] - 0s 537us/step - loss: 0.6218 - acc: 0.6178 - val_loss: 0.6197 - val_acc: 0.5588\n",
-      "Epoch 48/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.6205 - acc: 0.6178 - val_loss: 0.6181 - val_acc: 0.5588\n",
-      "Epoch 49/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.6191 - acc: 0.6178 - val_loss: 0.6164 - val_acc: 0.5735\n",
-      "Epoch 50/1000\n",
-      "157/157 [==============================] - 0s 200us/step - loss: 0.6176 - acc: 0.6178 - val_loss: 0.6146 - val_acc: 0.5882\n",
-      "Epoch 51/1000\n",
-      "157/157 [==============================] - 0s 286us/step - loss: 0.6165 - acc: 0.6178 - val_loss: 0.6130 - val_acc: 0.5882\n",
-      "Epoch 52/1000\n",
-      "157/157 [==============================] - 0s 254us/step - loss: 0.6152 - acc: 0.6242 - val_loss: 0.6116 - val_acc: 0.6029\n",
-      "Epoch 53/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.6136 - acc: 0.6242 - val_loss: 0.6100 - val_acc: 0.6029\n",
-      "Epoch 54/1000\n",
-      "157/157 [==============================] - 0s 202us/step - loss: 0.6127 - acc: 0.6242 - val_loss: 0.6085 - val_acc: 0.6029\n",
-      "Epoch 55/1000\n",
-      "157/157 [==============================] - 0s 108us/step - loss: 0.6114 - acc: 0.6242 - val_loss: 0.6070 - val_acc: 0.6029\n",
-      "Epoch 56/1000\n",
-      "157/157 [==============================] - 0s 157us/step - loss: 0.6098 - acc: 0.6242 - val_loss: 0.6053 - val_acc: 0.6029\n",
-      "Epoch 57/1000\n",
-      "157/157 [==============================] - 0s 118us/step - loss: 0.6085 - acc: 0.6242 - val_loss: 0.6036 - val_acc: 0.6029\n",
-      "Epoch 58/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.6070 - acc: 0.6242 - val_loss: 0.6018 - val_acc: 0.6029\n",
-      "Epoch 59/1000\n",
-      "157/157 [==============================] - 0s 165us/step - loss: 0.6057 - acc: 0.6242 - val_loss: 0.6001 - val_acc: 0.6029\n",
-      "Epoch 60/1000\n",
-      "157/157 [==============================] - 0s 263us/step - loss: 0.6039 - acc: 0.6242 - val_loss: 0.5982 - val_acc: 0.6176\n",
-      "Epoch 61/1000\n",
-      "157/157 [==============================] - 0s 244us/step - loss: 0.6023 - acc: 0.6242 - val_loss: 0.5963 - val_acc: 0.6176\n",
-      "Epoch 62/1000\n",
-      "157/157 [==============================] - 0s 409us/step - loss: 0.6006 - acc: 0.6306 - val_loss: 0.5943 - val_acc: 0.6176\n",
-      "Epoch 63/1000\n",
-      "157/157 [==============================] - 0s 104us/step - loss: 0.5991 - acc: 0.6306 - val_loss: 0.5922 - val_acc: 0.6324\n",
-      "Epoch 64/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.5981 - acc: 0.6369 - val_loss: 0.5906 - val_acc: 0.6324\n",
-      "Epoch 65/1000\n",
-      "157/157 [==============================] - 0s 104us/step - loss: 0.5958 - acc: 0.6433 - val_loss: 0.5889 - val_acc: 0.6471\n",
-      "Epoch 66/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.5945 - acc: 0.6433 - val_loss: 0.5871 - val_acc: 0.6471\n",
-      "Epoch 67/1000\n",
-      "157/157 [==============================] - 0s 378us/step - loss: 0.5929 - acc: 0.6433 - val_loss: 0.5852 - val_acc: 0.6471\n",
-      "Epoch 68/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.5917 - acc: 0.6497 - val_loss: 0.5836 - val_acc: 0.6471\n",
-      "Epoch 69/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.5901 - acc: 0.6497 - val_loss: 0.5816 - val_acc: 0.6471\n",
-      "Epoch 70/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.5885 - acc: 0.6497 - val_loss: 0.5797 - val_acc: 0.6765\n",
-      "Epoch 71/1000\n",
-      "157/157 [==============================] - 0s 208us/step - loss: 0.5867 - acc: 0.6561 - val_loss: 0.5778 - val_acc: 0.6765\n",
-      "Epoch 72/1000\n",
-      "157/157 [==============================] - 0s 200us/step - loss: 0.5850 - acc: 0.6561 - val_loss: 0.5755 - val_acc: 0.6765\n",
-      "Epoch 73/1000\n",
-      "157/157 [==============================] - 0s 279us/step - loss: 0.5831 - acc: 0.6624 - val_loss: 0.5733 - val_acc: 0.6765\n",
-      "Epoch 74/1000\n",
-      "157/157 [==============================] - 0s 263us/step - loss: 0.5812 - acc: 0.6688 - val_loss: 0.5712 - val_acc: 0.6912\n",
-      "Epoch 75/1000\n",
-      "157/157 [==============================] - 0s 263us/step - loss: 0.5791 - acc: 0.6752 - val_loss: 0.5688 - val_acc: 0.7059\n",
-      "Epoch 76/1000\n",
-      "157/157 [==============================] - 0s 223us/step - loss: 0.5771 - acc: 0.6752 - val_loss: 0.5665 - val_acc: 0.7059\n",
-      "Epoch 77/1000\n",
-      "157/157 [==============================] - 0s 252us/step - loss: 0.5750 - acc: 0.6879 - val_loss: 0.5643 - val_acc: 0.7059\n",
-      "Epoch 78/1000\n",
-      "157/157 [==============================] - 0s 217us/step - loss: 0.5728 - acc: 0.6879 - val_loss: 0.5619 - val_acc: 0.7059\n",
-      "Epoch 79/1000\n",
-      "157/157 [==============================] - 0s 123us/step - loss: 0.5708 - acc: 0.6943 - val_loss: 0.5596 - val_acc: 0.7059\n",
-      "Epoch 80/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.5687 - acc: 0.7006 - val_loss: 0.5570 - val_acc: 0.7206\n",
-      "Epoch 81/1000\n",
-      "157/157 [==============================] - 0s 181us/step - loss: 0.5666 - acc: 0.7070 - val_loss: 0.5545 - val_acc: 0.7206\n",
-      "Epoch 82/1000\n",
-      "157/157 [==============================] - 0s 109us/step - loss: 0.5643 - acc: 0.7006 - val_loss: 0.5519 - val_acc: 0.7206\n",
-      "Epoch 83/1000\n",
-      "157/157 [==============================] - 0s 258us/step - loss: 0.5623 - acc: 0.7134 - val_loss: 0.5495 - val_acc: 0.7206\n",
-      "Epoch 84/1000\n",
-      "157/157 [==============================] - 0s 123us/step - loss: 0.5600 - acc: 0.7197 - val_loss: 0.5469 - val_acc: 0.7206\n",
-      "Epoch 85/1000\n",
-      "157/157 [==============================] - 0s 120us/step - loss: 0.5577 - acc: 0.7197 - val_loss: 0.5443 - val_acc: 0.7206\n",
-      "Epoch 86/1000\n",
-      "157/157 [==============================] - 0s 166us/step - loss: 0.5550 - acc: 0.7197 - val_loss: 0.5411 - val_acc: 0.7353\n",
-      "Epoch 87/1000\n",
-      "157/157 [==============================] - 0s 134us/step - loss: 0.5529 - acc: 0.7325 - val_loss: 0.5383 - val_acc: 0.7353\n",
-      "Epoch 88/1000\n",
-      "157/157 [==============================] - 0s 185us/step - loss: 0.5498 - acc: 0.7325 - val_loss: 0.5347 - val_acc: 0.7353\n",
-      "Epoch 89/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.5471 - acc: 0.7516 - val_loss: 0.5314 - val_acc: 0.7647\n",
-      "Epoch 90/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.5451 - acc: 0.7452 - val_loss: 0.5283 - val_acc: 0.7941\n",
-      "Epoch 91/1000\n",
-      "157/157 [==============================] - 0s 292us/step - loss: 0.5430 - acc: 0.7580 - val_loss: 0.5258 - val_acc: 0.8088\n",
-      "Epoch 92/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.5399 - acc: 0.7580 - val_loss: 0.5234 - val_acc: 0.8088\n",
-      "Epoch 93/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.5383 - acc: 0.7643 - val_loss: 0.5210 - val_acc: 0.8088\n",
-      "Epoch 94/1000\n",
-      "157/157 [==============================] - 0s 231us/step - loss: 0.5356 - acc: 0.7643 - val_loss: 0.5184 - val_acc: 0.8088\n",
-      "Epoch 95/1000\n",
-      "157/157 [==============================] - 0s 96us/step - loss: 0.5334 - acc: 0.7643 - val_loss: 0.5158 - val_acc: 0.8235\n",
-      "Epoch 96/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.5309 - acc: 0.7707 - val_loss: 0.5128 - val_acc: 0.8235\n",
-      "Epoch 97/1000\n",
-      "157/157 [==============================] - 0s 214us/step - loss: 0.5291 - acc: 0.7898 - val_loss: 0.5100 - val_acc: 0.8235\n",
-      "Epoch 98/1000\n",
-      "157/157 [==============================] - 0s 179us/step - loss: 0.5263 - acc: 0.7898 - val_loss: 0.5074 - val_acc: 0.8235\n",
-      "Epoch 99/1000\n",
-      "157/157 [==============================] - 0s 182us/step - loss: 0.5243 - acc: 0.7962 - val_loss: 0.5044 - val_acc: 0.8088\n",
-      "Epoch 100/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.5234 - acc: 0.7834 - val_loss: 0.5024 - val_acc: 0.8088\n",
-      "Epoch 101/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.5198 - acc: 0.8025 - val_loss: 0.5002 - val_acc: 0.8088\n",
-      "Epoch 102/1000\n",
-      "157/157 [==============================] - 0s 320us/step - loss: 0.5184 - acc: 0.7962 - val_loss: 0.4978 - val_acc: 0.8088\n",
-      "Epoch 103/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.5162 - acc: 0.8025 - val_loss: 0.4956 - val_acc: 0.8088\n",
-      "Epoch 104/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.5136 - acc: 0.8025 - val_loss: 0.4932 - val_acc: 0.8088\n",
-      "Epoch 105/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.5115 - acc: 0.7962 - val_loss: 0.4903 - val_acc: 0.8235\n",
-      "Epoch 106/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.5091 - acc: 0.8025 - val_loss: 0.4877 - val_acc: 0.8382\n",
-      "Epoch 107/1000\n",
-      "157/157 [==============================] - 0s 351us/step - loss: 0.5065 - acc: 0.8089 - val_loss: 0.4851 - val_acc: 0.8382\n",
-      "Epoch 108/1000\n",
-      "157/157 [==============================] - 0s 370us/step - loss: 0.5041 - acc: 0.8025 - val_loss: 0.4822 - val_acc: 0.8529\n",
-      "Epoch 109/1000\n",
-      "157/157 [==============================] - 0s 345us/step - loss: 0.5016 - acc: 0.8089 - val_loss: 0.4795 - val_acc: 0.8529\n",
-      "Epoch 110/1000\n",
-      "157/157 [==============================] - 0s 121us/step - loss: 0.4996 - acc: 0.8025 - val_loss: 0.4765 - val_acc: 0.8529\n",
-      "Epoch 111/1000\n",
-      "157/157 [==============================] - 0s 135us/step - loss: 0.4972 - acc: 0.8089 - val_loss: 0.4739 - val_acc: 0.8529\n",
-      "Epoch 112/1000\n",
-      "157/157 [==============================] - 0s 266us/step - loss: 0.4944 - acc: 0.8280 - val_loss: 0.4716 - val_acc: 0.8529\n",
-      "Epoch 113/1000\n",
-      "157/157 [==============================] - 0s 218us/step - loss: 0.4918 - acc: 0.8153 - val_loss: 0.4686 - val_acc: 0.8529\n",
-      "Epoch 114/1000\n",
-      "157/157 [==============================] - 0s 174us/step - loss: 0.4894 - acc: 0.8471 - val_loss: 0.4656 - val_acc: 0.8529\n",
-      "Epoch 115/1000\n",
-      "157/157 [==============================] - 0s 157us/step - loss: 0.4869 - acc: 0.8408 - val_loss: 0.4624 - val_acc: 0.8676\n",
-      "Epoch 116/1000\n",
-      "157/157 [==============================] - 0s 276us/step - loss: 0.4846 - acc: 0.8089 - val_loss: 0.4592 - val_acc: 0.8676\n",
-      "Epoch 117/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.4818 - acc: 0.8408 - val_loss: 0.4565 - val_acc: 0.8676\n",
-      "Epoch 118/1000\n",
-      "157/157 [==============================] - 0s 246us/step - loss: 0.4792 - acc: 0.8535 - val_loss: 0.4539 - val_acc: 0.8676\n",
-      "Epoch 119/1000\n",
-      "157/157 [==============================] - 0s 116us/step - loss: 0.4768 - acc: 0.8408 - val_loss: 0.4506 - val_acc: 0.8676\n"
+      "This item is a:  Pullover\n"
      ]
     },
     {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Epoch 120/1000\n",
-      "157/157 [==============================] - 0s 229us/step - loss: 0.4756 - acc: 0.8471 - val_loss: 0.4482 - val_acc: 0.8676\n",
-      "Epoch 121/1000\n",
-      "157/157 [==============================] - 0s 204us/step - loss: 0.4726 - acc: 0.8599 - val_loss: 0.4463 - val_acc: 0.8676\n",
-      "Epoch 122/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.4708 - acc: 0.8471 - val_loss: 0.4438 - val_acc: 0.8676\n",
-      "Epoch 123/1000\n",
-      "157/157 [==============================] - 0s 245us/step - loss: 0.4682 - acc: 0.8599 - val_loss: 0.4415 - val_acc: 0.8676\n",
-      "Epoch 124/1000\n",
-      "157/157 [==============================] - 0s 200us/step - loss: 0.4658 - acc: 0.8535 - val_loss: 0.4390 - val_acc: 0.8676\n",
-      "Epoch 125/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.4635 - acc: 0.8599 - val_loss: 0.4361 - val_acc: 0.8824\n",
-      "Epoch 126/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.4614 - acc: 0.8535 - val_loss: 0.4332 - val_acc: 0.8824\n",
-      "Epoch 127/1000\n",
-      "157/157 [==============================] - 0s 327us/step - loss: 0.4584 - acc: 0.8726 - val_loss: 0.4307 - val_acc: 0.8824\n",
-      "Epoch 128/1000\n",
-      "157/157 [==============================] - 0s 181us/step - loss: 0.4571 - acc: 0.8535 - val_loss: 0.4279 - val_acc: 0.8824\n",
-      "Epoch 129/1000\n",
-      "157/157 [==============================] - 0s 268us/step - loss: 0.4550 - acc: 0.8726 - val_loss: 0.4258 - val_acc: 0.8824\n",
-      "Epoch 130/1000\n",
-      "157/157 [==============================] - 0s 176us/step - loss: 0.4517 - acc: 0.8599 - val_loss: 0.4230 - val_acc: 0.8824\n",
-      "Epoch 131/1000\n",
-      "157/157 [==============================] - 0s 281us/step - loss: 0.4497 - acc: 0.8726 - val_loss: 0.4204 - val_acc: 0.8824\n",
-      "Epoch 132/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.4476 - acc: 0.8662 - val_loss: 0.4178 - val_acc: 0.8824\n",
-      "Epoch 133/1000\n",
-      "157/157 [==============================] - 0s 177us/step - loss: 0.4456 - acc: 0.8726 - val_loss: 0.4153 - val_acc: 0.8824\n",
-      "Epoch 134/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.4433 - acc: 0.8790 - val_loss: 0.4131 - val_acc: 0.8824\n",
-      "Epoch 135/1000\n",
-      "157/157 [==============================] - 0s 121us/step - loss: 0.4409 - acc: 0.8854 - val_loss: 0.4108 - val_acc: 0.8824\n",
-      "Epoch 136/1000\n",
-      "157/157 [==============================] - 0s 167us/step - loss: 0.4381 - acc: 0.8726 - val_loss: 0.4082 - val_acc: 0.8824\n",
-      "Epoch 137/1000\n",
-      "157/157 [==============================] - 0s 272us/step - loss: 0.4357 - acc: 0.8854 - val_loss: 0.4053 - val_acc: 0.8824\n",
-      "Epoch 138/1000\n",
-      "157/157 [==============================] - 0s 286us/step - loss: 0.4338 - acc: 0.8726 - val_loss: 0.4025 - val_acc: 0.8824\n",
-      "Epoch 139/1000\n",
-      "157/157 [==============================] - 0s 164us/step - loss: 0.4308 - acc: 0.8726 - val_loss: 0.3994 - val_acc: 0.8824\n",
-      "Epoch 140/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.4286 - acc: 0.8790 - val_loss: 0.3968 - val_acc: 0.8824\n",
-      "Epoch 141/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.4266 - acc: 0.8726 - val_loss: 0.3944 - val_acc: 0.8824\n",
-      "Epoch 142/1000\n",
-      "157/157 [==============================] - 0s 285us/step - loss: 0.4241 - acc: 0.8790 - val_loss: 0.3924 - val_acc: 0.8824\n",
-      "Epoch 143/1000\n",
-      "157/157 [==============================] - 0s 136us/step - loss: 0.4224 - acc: 0.8726 - val_loss: 0.3902 - val_acc: 0.8824\n",
-      "Epoch 144/1000\n",
-      "157/157 [==============================] - 0s 243us/step - loss: 0.4204 - acc: 0.8726 - val_loss: 0.3882 - val_acc: 0.8824\n",
-      "Epoch 145/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.4177 - acc: 0.8726 - val_loss: 0.3860 - val_acc: 0.8824\n",
-      "Epoch 146/1000\n",
-      "157/157 [==============================] - 0s 210us/step - loss: 0.4167 - acc: 0.8854 - val_loss: 0.3840 - val_acc: 0.8824\n",
-      "Epoch 147/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.4133 - acc: 0.8726 - val_loss: 0.3815 - val_acc: 0.8824\n",
-      "Epoch 148/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.4112 - acc: 0.8790 - val_loss: 0.3791 - val_acc: 0.8824\n",
-      "Epoch 149/1000\n",
-      "157/157 [==============================] - 0s 274us/step - loss: 0.4098 - acc: 0.8854 - val_loss: 0.3771 - val_acc: 0.8824\n",
-      "Epoch 150/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.4075 - acc: 0.8726 - val_loss: 0.3743 - val_acc: 0.8824\n",
-      "Epoch 151/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.4047 - acc: 0.8854 - val_loss: 0.3721 - val_acc: 0.8824\n",
-      "Epoch 152/1000\n",
-      "157/157 [==============================] - 0s 282us/step - loss: 0.4033 - acc: 0.8726 - val_loss: 0.3694 - val_acc: 0.8824\n",
-      "Epoch 153/1000\n",
-      "157/157 [==============================] - 0s 167us/step - loss: 0.4013 - acc: 0.9108 - val_loss: 0.3680 - val_acc: 0.8824\n",
-      "Epoch 154/1000\n",
-      "157/157 [==============================] - 0s 313us/step - loss: 0.3985 - acc: 0.8854 - val_loss: 0.3655 - val_acc: 0.8824\n",
-      "Epoch 155/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.3970 - acc: 0.8981 - val_loss: 0.3635 - val_acc: 0.8824\n",
-      "Epoch 156/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.3944 - acc: 0.8981 - val_loss: 0.3613 - val_acc: 0.8824\n",
-      "Epoch 157/1000\n",
-      "157/157 [==============================] - 0s 124us/step - loss: 0.3928 - acc: 0.8981 - val_loss: 0.3594 - val_acc: 0.8824\n",
-      "Epoch 158/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.3903 - acc: 0.8917 - val_loss: 0.3567 - val_acc: 0.8824\n",
-      "Epoch 159/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.3881 - acc: 0.8981 - val_loss: 0.3543 - val_acc: 0.8824\n",
-      "Epoch 160/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.3871 - acc: 0.8917 - val_loss: 0.3523 - val_acc: 0.8824\n",
-      "Epoch 161/1000\n",
-      "157/157 [==============================] - 0s 123us/step - loss: 0.3840 - acc: 0.9108 - val_loss: 0.3503 - val_acc: 0.8824\n",
-      "Epoch 162/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.3833 - acc: 0.8854 - val_loss: 0.3481 - val_acc: 0.8971\n",
-      "Epoch 163/1000\n",
-      "157/157 [==============================] - 0s 222us/step - loss: 0.3810 - acc: 0.8917 - val_loss: 0.3463 - val_acc: 0.8971\n",
-      "Epoch 164/1000\n",
-      "157/157 [==============================] - 0s 210us/step - loss: 0.3785 - acc: 0.9236 - val_loss: 0.3449 - val_acc: 0.8824\n",
-      "Epoch 165/1000\n",
-      "157/157 [==============================] - 0s 278us/step - loss: 0.3774 - acc: 0.9045 - val_loss: 0.3431 - val_acc: 0.8971\n",
-      "Epoch 166/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.3751 - acc: 0.8917 - val_loss: 0.3406 - val_acc: 0.8971\n",
-      "Epoch 167/1000\n",
-      "157/157 [==============================] - 0s 183us/step - loss: 0.3735 - acc: 0.8981 - val_loss: 0.3388 - val_acc: 0.8971\n",
-      "Epoch 168/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.3711 - acc: 0.9172 - val_loss: 0.3368 - val_acc: 0.8971\n",
-      "Epoch 169/1000\n",
-      "157/157 [==============================] - 0s 304us/step - loss: 0.3701 - acc: 0.9108 - val_loss: 0.3346 - val_acc: 0.8971\n",
-      "Epoch 170/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.3674 - acc: 0.9236 - val_loss: 0.3330 - val_acc: 0.8971\n",
-      "Epoch 171/1000\n",
-      "157/157 [==============================] - 0s 287us/step - loss: 0.3666 - acc: 0.9172 - val_loss: 0.3312 - val_acc: 0.8971\n",
-      "Epoch 172/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.3638 - acc: 0.9108 - val_loss: 0.3291 - val_acc: 0.8971\n",
-      "Epoch 173/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.3617 - acc: 0.9236 - val_loss: 0.3275 - val_acc: 0.8971\n",
-      "Epoch 174/1000\n",
-      "157/157 [==============================] - 0s 136us/step - loss: 0.3595 - acc: 0.9236 - val_loss: 0.3257 - val_acc: 0.8971\n",
-      "Epoch 175/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.3579 - acc: 0.9236 - val_loss: 0.3240 - val_acc: 0.8971\n",
-      "Epoch 176/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.3565 - acc: 0.9172 - val_loss: 0.3219 - val_acc: 0.8971\n",
-      "Epoch 177/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.3540 - acc: 0.9236 - val_loss: 0.3205 - val_acc: 0.8971\n",
-      "Epoch 178/1000\n",
-      "157/157 [==============================] - 0s 374us/step - loss: 0.3529 - acc: 0.9108 - val_loss: 0.3181 - val_acc: 0.8971\n",
-      "Epoch 179/1000\n",
-      "157/157 [==============================] - 0s 307us/step - loss: 0.3500 - acc: 0.9236 - val_loss: 0.3156 - val_acc: 0.8971\n",
-      "Epoch 180/1000\n",
-      "157/157 [==============================] - 0s 319us/step - loss: 0.3485 - acc: 0.9236 - val_loss: 0.3136 - val_acc: 0.8971\n",
-      "Epoch 181/1000\n",
-      "157/157 [==============================] - 0s 166us/step - loss: 0.3467 - acc: 0.9236 - val_loss: 0.3116 - val_acc: 0.8971\n",
-      "Epoch 182/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.3450 - acc: 0.9236 - val_loss: 0.3103 - val_acc: 0.8971\n",
-      "Epoch 183/1000\n",
-      "157/157 [==============================] - 0s 282us/step - loss: 0.3439 - acc: 0.9172 - val_loss: 0.3084 - val_acc: 0.8971\n",
-      "Epoch 184/1000\n",
-      "157/157 [==============================] - 0s 287us/step - loss: 0.3413 - acc: 0.9172 - val_loss: 0.3064 - val_acc: 0.8971\n",
-      "Epoch 185/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.3405 - acc: 0.9108 - val_loss: 0.3047 - val_acc: 0.9118\n",
-      "Epoch 186/1000\n",
-      "157/157 [==============================] - 0s 238us/step - loss: 0.3376 - acc: 0.9236 - val_loss: 0.3028 - val_acc: 0.9118\n",
-      "Epoch 187/1000\n",
-      "157/157 [==============================] - 0s 291us/step - loss: 0.3358 - acc: 0.9299 - val_loss: 0.3014 - val_acc: 0.9118\n",
-      "Epoch 188/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.3347 - acc: 0.9236 - val_loss: 0.2989 - val_acc: 0.9118\n",
-      "Epoch 189/1000\n",
-      "157/157 [==============================] - 0s 231us/step - loss: 0.3334 - acc: 0.9299 - val_loss: 0.2972 - val_acc: 0.9118\n",
-      "Epoch 190/1000\n",
-      "157/157 [==============================] - 0s 208us/step - loss: 0.3302 - acc: 0.9299 - val_loss: 0.2961 - val_acc: 0.8971\n",
-      "Epoch 191/1000\n",
-      "157/157 [==============================] - 0s 213us/step - loss: 0.3284 - acc: 0.9299 - val_loss: 0.2943 - val_acc: 0.8971\n",
-      "Epoch 192/1000\n",
-      "157/157 [==============================] - 0s 184us/step - loss: 0.3265 - acc: 0.9299 - val_loss: 0.2917 - val_acc: 0.9118\n",
-      "Epoch 193/1000\n",
-      "157/157 [==============================] - 0s 369us/step - loss: 0.3259 - acc: 0.9299 - val_loss: 0.2908 - val_acc: 0.8971\n",
-      "Epoch 194/1000\n",
-      "157/157 [==============================] - 0s 218us/step - loss: 0.3226 - acc: 0.9299 - val_loss: 0.2889 - val_acc: 0.8971\n",
-      "Epoch 195/1000\n",
-      "157/157 [==============================] - 0s 203us/step - loss: 0.3237 - acc: 0.9236 - val_loss: 0.2873 - val_acc: 0.8971\n",
-      "Epoch 196/1000\n",
-      "157/157 [==============================] - 0s 207us/step - loss: 0.3194 - acc: 0.9236 - val_loss: 0.2857 - val_acc: 0.8971\n",
-      "Epoch 197/1000\n",
-      "157/157 [==============================] - 0s 291us/step - loss: 0.3173 - acc: 0.9236 - val_loss: 0.2830 - val_acc: 0.9118\n",
-      "Epoch 198/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.3165 - acc: 0.9299 - val_loss: 0.2819 - val_acc: 0.9118\n",
-      "Epoch 199/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.3166 - acc: 0.9236 - val_loss: 0.2805 - val_acc: 0.8971\n",
-      "Epoch 200/1000\n",
-      "157/157 [==============================] - 0s 308us/step - loss: 0.3128 - acc: 0.9236 - val_loss: 0.2790 - val_acc: 0.9118\n",
-      "Epoch 201/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.3109 - acc: 0.9299 - val_loss: 0.2772 - val_acc: 0.9118\n",
-      "Epoch 202/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.3092 - acc: 0.9236 - val_loss: 0.2755 - val_acc: 0.9118\n",
-      "Epoch 203/1000\n",
-      "157/157 [==============================] - 0s 230us/step - loss: 0.3076 - acc: 0.9236 - val_loss: 0.2736 - val_acc: 0.9118\n",
-      "Epoch 204/1000\n",
-      "157/157 [==============================] - 0s 123us/step - loss: 0.3056 - acc: 0.9236 - val_loss: 0.2724 - val_acc: 0.9118\n",
-      "Epoch 205/1000\n",
-      "157/157 [==============================] - 0s 118us/step - loss: 0.3046 - acc: 0.9236 - val_loss: 0.2703 - val_acc: 0.9118\n",
-      "Epoch 206/1000\n",
-      "157/157 [==============================] - 0s 319us/step - loss: 0.3018 - acc: 0.9299 - val_loss: 0.2682 - val_acc: 0.9118\n",
-      "Epoch 207/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.2998 - acc: 0.9427 - val_loss: 0.2670 - val_acc: 0.9118\n",
-      "Epoch 208/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.2988 - acc: 0.9299 - val_loss: 0.2651 - val_acc: 0.9118\n",
-      "Epoch 209/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.2970 - acc: 0.9299 - val_loss: 0.2626 - val_acc: 0.9118\n",
-      "Epoch 210/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.2945 - acc: 0.9427 - val_loss: 0.2626 - val_acc: 0.8971\n",
-      "Epoch 211/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.2932 - acc: 0.9299 - val_loss: 0.2599 - val_acc: 0.9118\n",
-      "Epoch 212/1000\n",
-      "157/157 [==============================] - 0s 317us/step - loss: 0.2919 - acc: 0.9427 - val_loss: 0.2590 - val_acc: 0.8971\n",
-      "Epoch 213/1000\n",
-      "157/157 [==============================] - 0s 241us/step - loss: 0.2898 - acc: 0.9236 - val_loss: 0.2560 - val_acc: 0.9118\n",
-      "Epoch 214/1000\n",
-      "157/157 [==============================] - 0s 396us/step - loss: 0.2892 - acc: 0.9427 - val_loss: 0.2547 - val_acc: 0.9118\n",
-      "Epoch 215/1000\n",
-      "157/157 [==============================] - 0s 317us/step - loss: 0.2863 - acc: 0.9427 - val_loss: 0.2529 - val_acc: 0.9118\n",
-      "Epoch 216/1000\n",
-      "157/157 [==============================] - 0s 254us/step - loss: 0.2870 - acc: 0.9363 - val_loss: 0.2518 - val_acc: 0.9118\n",
-      "Epoch 217/1000\n",
-      "157/157 [==============================] - 0s 255us/step - loss: 0.2839 - acc: 0.9363 - val_loss: 0.2511 - val_acc: 0.9118\n",
-      "Epoch 218/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.2816 - acc: 0.9363 - val_loss: 0.2490 - val_acc: 0.9118\n",
-      "Epoch 219/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.2807 - acc: 0.9427 - val_loss: 0.2484 - val_acc: 0.9118\n",
-      "Epoch 220/1000\n",
-      "157/157 [==============================] - 0s 140us/step - loss: 0.2789 - acc: 0.9427 - val_loss: 0.2471 - val_acc: 0.9118\n",
-      "Epoch 221/1000\n",
-      "157/157 [==============================] - 0s 267us/step - loss: 0.2770 - acc: 0.9363 - val_loss: 0.2438 - val_acc: 0.9118\n",
-      "Epoch 222/1000\n",
-      "157/157 [==============================] - 0s 251us/step - loss: 0.2760 - acc: 0.9427 - val_loss: 0.2423 - val_acc: 0.9118\n",
-      "Epoch 223/1000\n",
-      "157/157 [==============================] - 0s 298us/step - loss: 0.2745 - acc: 0.9299 - val_loss: 0.2407 - val_acc: 0.9118\n",
-      "Epoch 224/1000\n",
-      "157/157 [==============================] - 0s 218us/step - loss: 0.2726 - acc: 0.9490 - val_loss: 0.2411 - val_acc: 0.9118\n",
-      "Epoch 225/1000\n",
-      "157/157 [==============================] - 0s 293us/step - loss: 0.2707 - acc: 0.9363 - val_loss: 0.2380 - val_acc: 0.9118\n",
-      "Epoch 226/1000\n",
-      "157/157 [==============================] - 0s 157us/step - loss: 0.2703 - acc: 0.9427 - val_loss: 0.2386 - val_acc: 0.9118\n",
-      "Epoch 227/1000\n",
-      "157/157 [==============================] - 0s 213us/step - loss: 0.2681 - acc: 0.9490 - val_loss: 0.2374 - val_acc: 0.9118\n",
-      "Epoch 228/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.2680 - acc: 0.9363 - val_loss: 0.2365 - val_acc: 0.9118\n",
-      "Epoch 229/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.2668 - acc: 0.9236 - val_loss: 0.2342 - val_acc: 0.9118\n",
-      "Epoch 230/1000\n",
-      "157/157 [==============================] - 0s 213us/step - loss: 0.2652 - acc: 0.9363 - val_loss: 0.2324 - val_acc: 0.9118\n",
-      "Epoch 231/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.2634 - acc: 0.9490 - val_loss: 0.2320 - val_acc: 0.9118\n",
-      "Epoch 232/1000\n",
-      "157/157 [==============================] - 0s 258us/step - loss: 0.2624 - acc: 0.9427 - val_loss: 0.2310 - val_acc: 0.9118\n",
-      "Epoch 233/1000\n",
-      "157/157 [==============================] - 0s 245us/step - loss: 0.2627 - acc: 0.9427 - val_loss: 0.2299 - val_acc: 0.9118\n",
-      "Epoch 234/1000\n",
-      "157/157 [==============================] - 0s 396us/step - loss: 0.2597 - acc: 0.9490 - val_loss: 0.2293 - val_acc: 0.9118\n",
-      "Epoch 235/1000\n",
-      "157/157 [==============================] - 0s 192us/step - loss: 0.2584 - acc: 0.9490 - val_loss: 0.2292 - val_acc: 0.9118\n",
-      "Epoch 236/1000\n",
-      "157/157 [==============================] - 0s 294us/step - loss: 0.2579 - acc: 0.9427 - val_loss: 0.2271 - val_acc: 0.9118\n",
-      "Epoch 237/1000\n",
-      "157/157 [==============================] - 0s 200us/step - loss: 0.2564 - acc: 0.9427 - val_loss: 0.2262 - val_acc: 0.9118\n",
-      "Epoch 238/1000\n"
-     ]
-    },
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "157/157 [==============================] - 0s 251us/step - loss: 0.2542 - acc: 0.9490 - val_loss: 0.2261 - val_acc: 0.9118\n",
-      "Epoch 239/1000\n",
-      "157/157 [==============================] - 0s 183us/step - loss: 0.2552 - acc: 0.9363 - val_loss: 0.2241 - val_acc: 0.9118\n",
-      "Epoch 240/1000\n",
-      "157/157 [==============================] - 0s 281us/step - loss: 0.2531 - acc: 0.9490 - val_loss: 0.2243 - val_acc: 0.9118\n",
-      "Epoch 241/1000\n",
-      "157/157 [==============================] - 0s 158us/step - loss: 0.2508 - acc: 0.9490 - val_loss: 0.2222 - val_acc: 0.9118\n",
-      "Epoch 242/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.2530 - acc: 0.9427 - val_loss: 0.2201 - val_acc: 0.9118\n",
-      "Epoch 243/1000\n",
-      "157/157 [==============================] - 0s 185us/step - loss: 0.2502 - acc: 0.9554 - val_loss: 0.2198 - val_acc: 0.9118\n",
-      "Epoch 244/1000\n",
-      "157/157 [==============================] - 0s 125us/step - loss: 0.2478 - acc: 0.9490 - val_loss: 0.2190 - val_acc: 0.9118\n",
-      "Epoch 245/1000\n",
-      "157/157 [==============================] - 0s 247us/step - loss: 0.2477 - acc: 0.9490 - val_loss: 0.2185 - val_acc: 0.9118\n",
-      "Epoch 246/1000\n",
-      "157/157 [==============================] - 0s 164us/step - loss: 0.2458 - acc: 0.9490 - val_loss: 0.2167 - val_acc: 0.9118\n",
-      "Epoch 247/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.2445 - acc: 0.9490 - val_loss: 0.2152 - val_acc: 0.9118\n",
-      "Epoch 248/1000\n",
-      "157/157 [==============================] - 0s 215us/step - loss: 0.2437 - acc: 0.9299 - val_loss: 0.2130 - val_acc: 0.9118\n",
-      "Epoch 249/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.2420 - acc: 0.9554 - val_loss: 0.2124 - val_acc: 0.9118\n",
-      "Epoch 250/1000\n",
-      "157/157 [==============================] - 0s 305us/step - loss: 0.2404 - acc: 0.9490 - val_loss: 0.2109 - val_acc: 0.9118\n",
-      "Epoch 251/1000\n",
-      "157/157 [==============================] - 0s 237us/step - loss: 0.2428 - acc: 0.9618 - val_loss: 0.2121 - val_acc: 0.9118\n",
-      "Epoch 252/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.2383 - acc: 0.9490 - val_loss: 0.2106 - val_acc: 0.9118\n",
-      "Epoch 253/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.2370 - acc: 0.9618 - val_loss: 0.2111 - val_acc: 0.9118\n",
-      "Epoch 254/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.2370 - acc: 0.9490 - val_loss: 0.2096 - val_acc: 0.9118\n",
-      "Epoch 255/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.2390 - acc: 0.9172 - val_loss: 0.2082 - val_acc: 0.9118\n",
-      "Epoch 256/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.2338 - acc: 0.9490 - val_loss: 0.2063 - val_acc: 0.9118\n",
-      "Epoch 257/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.2332 - acc: 0.9554 - val_loss: 0.2063 - val_acc: 0.9118\n",
-      "Epoch 258/1000\n",
-      "157/157 [==============================] - 0s 135us/step - loss: 0.2319 - acc: 0.9490 - val_loss: 0.2060 - val_acc: 0.9118\n",
-      "Epoch 259/1000\n",
-      "157/157 [==============================] - 0s 214us/step - loss: 0.2329 - acc: 0.9299 - val_loss: 0.2034 - val_acc: 0.9118\n",
-      "Epoch 260/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.2304 - acc: 0.9490 - val_loss: 0.2044 - val_acc: 0.9118\n",
-      "Epoch 261/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.2307 - acc: 0.9554 - val_loss: 0.2025 - val_acc: 0.9118\n",
-      "Epoch 262/1000\n",
-      "157/157 [==============================] - 0s 205us/step - loss: 0.2277 - acc: 0.9554 - val_loss: 0.2018 - val_acc: 0.9118\n",
-      "Epoch 263/1000\n",
-      "157/157 [==============================] - 0s 118us/step - loss: 0.2265 - acc: 0.9554 - val_loss: 0.2022 - val_acc: 0.9118\n",
-      "Epoch 264/1000\n",
-      "157/157 [==============================] - 0s 112us/step - loss: 0.2261 - acc: 0.9490 - val_loss: 0.2007 - val_acc: 0.9118\n",
-      "Epoch 265/1000\n",
-      "157/157 [==============================] - 0s 183us/step - loss: 0.2256 - acc: 0.9554 - val_loss: 0.1985 - val_acc: 0.9118\n",
-      "Epoch 266/1000\n",
-      "157/157 [==============================] - 0s 125us/step - loss: 0.2233 - acc: 0.9618 - val_loss: 0.1982 - val_acc: 0.9118\n",
-      "Epoch 267/1000\n",
-      "157/157 [==============================] - 0s 202us/step - loss: 0.2220 - acc: 0.9554 - val_loss: 0.1957 - val_acc: 0.9118\n",
-      "Epoch 268/1000\n",
-      "157/157 [==============================] - 0s 125us/step - loss: 0.2226 - acc: 0.9554 - val_loss: 0.1957 - val_acc: 0.9118\n",
-      "Epoch 269/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.2213 - acc: 0.9554 - val_loss: 0.1935 - val_acc: 0.9118\n",
-      "Epoch 270/1000\n",
-      "157/157 [==============================] - 0s 140us/step - loss: 0.2214 - acc: 0.9554 - val_loss: 0.1968 - val_acc: 0.9118\n",
-      "Epoch 271/1000\n",
-      "157/157 [==============================] - 0s 165us/step - loss: 0.2187 - acc: 0.9554 - val_loss: 0.1965 - val_acc: 0.9118\n",
-      "Epoch 272/1000\n",
-      "157/157 [==============================] - 0s 222us/step - loss: 0.2174 - acc: 0.9490 - val_loss: 0.1925 - val_acc: 0.9118\n",
-      "Epoch 273/1000\n",
-      "157/157 [==============================] - 0s 130us/step - loss: 0.2188 - acc: 0.9618 - val_loss: 0.1925 - val_acc: 0.9118\n",
-      "Epoch 274/1000\n",
-      "157/157 [==============================] - 0s 112us/step - loss: 0.2157 - acc: 0.9554 - val_loss: 0.1923 - val_acc: 0.9118\n",
-      "Epoch 275/1000\n",
-      "157/157 [==============================] - 0s 120us/step - loss: 0.2170 - acc: 0.9490 - val_loss: 0.1908 - val_acc: 0.9118\n",
-      "Epoch 276/1000\n",
-      "157/157 [==============================] - 0s 201us/step - loss: 0.2149 - acc: 0.9618 - val_loss: 0.1918 - val_acc: 0.9118\n",
-      "Epoch 277/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.2140 - acc: 0.9618 - val_loss: 0.1924 - val_acc: 0.9118\n",
-      "Epoch 278/1000\n",
-      "157/157 [==============================] - 0s 121us/step - loss: 0.2128 - acc: 0.9554 - val_loss: 0.1899 - val_acc: 0.9118\n",
-      "Epoch 279/1000\n",
-      "157/157 [==============================] - 0s 205us/step - loss: 0.2123 - acc: 0.9618 - val_loss: 0.1881 - val_acc: 0.9118\n",
-      "Epoch 280/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.2115 - acc: 0.9554 - val_loss: 0.1889 - val_acc: 0.9118\n",
-      "Epoch 281/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.2115 - acc: 0.9490 - val_loss: 0.1863 - val_acc: 0.9118\n",
-      "Epoch 282/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.2100 - acc: 0.9554 - val_loss: 0.1854 - val_acc: 0.9118\n",
-      "Epoch 283/1000\n",
-      "157/157 [==============================] - 0s 127us/step - loss: 0.2099 - acc: 0.9618 - val_loss: 0.1872 - val_acc: 0.9118\n",
-      "Epoch 284/1000\n",
-      "157/157 [==============================] - 0s 108us/step - loss: 0.2085 - acc: 0.9618 - val_loss: 0.1867 - val_acc: 0.9118\n",
-      "Epoch 285/1000\n",
-      "157/157 [==============================] - 0s 216us/step - loss: 0.2070 - acc: 0.9618 - val_loss: 0.1862 - val_acc: 0.9118\n",
-      "Epoch 286/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.2061 - acc: 0.9618 - val_loss: 0.1858 - val_acc: 0.9118\n",
-      "Epoch 287/1000\n",
-      "157/157 [==============================] - 0s 115us/step - loss: 0.2074 - acc: 0.9554 - val_loss: 0.1866 - val_acc: 0.9118\n",
-      "Epoch 288/1000\n",
-      "157/157 [==============================] - 0s 134us/step - loss: 0.2052 - acc: 0.9554 - val_loss: 0.1864 - val_acc: 0.9118\n",
-      "Epoch 289/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.2045 - acc: 0.9554 - val_loss: 0.1839 - val_acc: 0.9118\n",
-      "Epoch 290/1000\n",
-      "157/157 [==============================] - 0s 246us/step - loss: 0.2035 - acc: 0.9618 - val_loss: 0.1817 - val_acc: 0.9118\n",
-      "Epoch 291/1000\n",
-      "157/157 [==============================] - 0s 127us/step - loss: 0.2043 - acc: 0.9618 - val_loss: 0.1828 - val_acc: 0.9118\n",
-      "Epoch 292/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.2014 - acc: 0.9618 - val_loss: 0.1832 - val_acc: 0.9118\n",
-      "Epoch 293/1000\n",
-      "157/157 [==============================] - 0s 165us/step - loss: 0.2014 - acc: 0.9554 - val_loss: 0.1829 - val_acc: 0.9118\n",
-      "Epoch 294/1000\n",
-      "157/157 [==============================] - 0s 198us/step - loss: 0.2003 - acc: 0.9618 - val_loss: 0.1822 - val_acc: 0.9118\n",
-      "Epoch 295/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.2019 - acc: 0.9618 - val_loss: 0.1799 - val_acc: 0.9118\n",
-      "Epoch 296/1000\n",
-      "157/157 [==============================] - 0s 165us/step - loss: 0.1995 - acc: 0.9554 - val_loss: 0.1778 - val_acc: 0.9118\n",
-      "Epoch 297/1000\n",
-      "157/157 [==============================] - 0s 165us/step - loss: 0.1990 - acc: 0.9618 - val_loss: 0.1810 - val_acc: 0.9118\n",
-      "Epoch 298/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.1975 - acc: 0.9618 - val_loss: 0.1822 - val_acc: 0.9118\n",
-      "Epoch 299/1000\n",
-      "157/157 [==============================] - 0s 169us/step - loss: 0.1975 - acc: 0.9490 - val_loss: 0.1800 - val_acc: 0.9118\n",
-      "Epoch 300/1000\n",
-      "157/157 [==============================] - 0s 270us/step - loss: 0.1964 - acc: 0.9618 - val_loss: 0.1784 - val_acc: 0.9118\n",
-      "Epoch 301/1000\n",
-      "157/157 [==============================] - 0s 249us/step - loss: 0.1957 - acc: 0.9618 - val_loss: 0.1755 - val_acc: 0.9118\n",
-      "Epoch 302/1000\n",
-      "157/157 [==============================] - 0s 368us/step - loss: 0.1977 - acc: 0.9618 - val_loss: 0.1741 - val_acc: 0.9118\n",
-      "Epoch 303/1000\n",
-      "157/157 [==============================] - 0s 214us/step - loss: 0.1941 - acc: 0.9554 - val_loss: 0.1766 - val_acc: 0.9118\n",
-      "Epoch 304/1000\n",
-      "157/157 [==============================] - 0s 283us/step - loss: 0.1930 - acc: 0.9618 - val_loss: 0.1742 - val_acc: 0.9118\n",
-      "Epoch 305/1000\n",
-      "157/157 [==============================] - 0s 299us/step - loss: 0.1932 - acc: 0.9618 - val_loss: 0.1752 - val_acc: 0.9118\n",
-      "Epoch 306/1000\n",
-      "157/157 [==============================] - 0s 284us/step - loss: 0.1930 - acc: 0.9618 - val_loss: 0.1766 - val_acc: 0.9118\n",
-      "Epoch 307/1000\n",
-      "157/157 [==============================] - 0s 217us/step - loss: 0.1914 - acc: 0.9618 - val_loss: 0.1746 - val_acc: 0.9118\n",
-      "Epoch 308/1000\n",
-      "157/157 [==============================] - 0s 303us/step - loss: 0.1918 - acc: 0.9490 - val_loss: 0.1736 - val_acc: 0.9118\n",
-      "Epoch 309/1000\n",
-      "157/157 [==============================] - 0s 561us/step - loss: 0.1892 - acc: 0.9618 - val_loss: 0.1723 - val_acc: 0.9118\n",
-      "Epoch 310/1000\n",
-      "157/157 [==============================] - 0s 379us/step - loss: 0.1897 - acc: 0.9618 - val_loss: 0.1725 - val_acc: 0.9118\n",
-      "Epoch 311/1000\n",
-      "157/157 [==============================] - 0s 219us/step - loss: 0.1880 - acc: 0.9618 - val_loss: 0.1721 - val_acc: 0.9118\n",
-      "Epoch 312/1000\n",
-      "157/157 [==============================] - 0s 181us/step - loss: 0.1872 - acc: 0.9618 - val_loss: 0.1693 - val_acc: 0.9118\n",
-      "Epoch 313/1000\n",
-      "157/157 [==============================] - 0s 206us/step - loss: 0.1880 - acc: 0.9554 - val_loss: 0.1679 - val_acc: 0.9118\n",
-      "Epoch 314/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.1857 - acc: 0.9618 - val_loss: 0.1690 - val_acc: 0.9118\n",
-      "Epoch 315/1000\n",
-      "157/157 [==============================] - 0s 579us/step - loss: 0.1847 - acc: 0.9554 - val_loss: 0.1694 - val_acc: 0.9118\n",
-      "Epoch 316/1000\n",
-      "157/157 [==============================] - 0s 199us/step - loss: 0.1843 - acc: 0.9618 - val_loss: 0.1727 - val_acc: 0.9118\n",
-      "Epoch 317/1000\n",
-      "157/157 [==============================] - 0s 244us/step - loss: 0.1853 - acc: 0.9554 - val_loss: 0.1714 - val_acc: 0.9118\n",
-      "Epoch 318/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.1843 - acc: 0.9618 - val_loss: 0.1680 - val_acc: 0.9118\n",
-      "Epoch 319/1000\n",
-      "157/157 [==============================] - 0s 249us/step - loss: 0.1815 - acc: 0.9554 - val_loss: 0.1686 - val_acc: 0.9118\n",
-      "Epoch 320/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.1828 - acc: 0.9618 - val_loss: 0.1669 - val_acc: 0.9118\n",
-      "Epoch 321/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.1807 - acc: 0.9618 - val_loss: 0.1646 - val_acc: 0.9118\n",
-      "Epoch 322/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.1695 - acc: 0.968 - 0s 170us/step - loss: 0.1819 - acc: 0.9554 - val_loss: 0.1626 - val_acc: 0.9118\n",
-      "Epoch 323/1000\n",
-      "157/157 [==============================] - 0s 234us/step - loss: 0.1799 - acc: 0.9618 - val_loss: 0.1626 - val_acc: 0.9118\n",
-      "Epoch 324/1000\n",
-      "157/157 [==============================] - 0s 167us/step - loss: 0.1795 - acc: 0.9554 - val_loss: 0.1648 - val_acc: 0.9118\n",
-      "Epoch 325/1000\n",
-      "157/157 [==============================] - 0s 421us/step - loss: 0.1801 - acc: 0.9554 - val_loss: 0.1642 - val_acc: 0.9118\n",
-      "Epoch 326/1000\n",
-      "157/157 [==============================] - 0s 287us/step - loss: 0.1787 - acc: 0.9554 - val_loss: 0.1667 - val_acc: 0.9118\n",
-      "Epoch 327/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.1770 - acc: 0.9618 - val_loss: 0.1638 - val_acc: 0.9118\n",
-      "Epoch 328/1000\n",
-      "157/157 [==============================] - 0s 268us/step - loss: 0.1777 - acc: 0.9618 - val_loss: 0.1619 - val_acc: 0.9118\n",
-      "Epoch 329/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.1759 - acc: 0.9554 - val_loss: 0.1634 - val_acc: 0.9118\n",
-      "Epoch 330/1000\n",
-      "157/157 [==============================] - 0s 282us/step - loss: 0.1774 - acc: 0.9618 - val_loss: 0.1606 - val_acc: 0.9118\n",
-      "Epoch 331/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.1752 - acc: 0.9554 - val_loss: 0.1633 - val_acc: 0.9118\n",
-      "Epoch 332/1000\n",
-      "157/157 [==============================] - 0s 524us/step - loss: 0.1749 - acc: 0.9618 - val_loss: 0.1638 - val_acc: 0.9118\n",
-      "Epoch 333/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.1772 - acc: 0.9618 - val_loss: 0.1624 - val_acc: 0.9118\n",
-      "Epoch 334/1000\n",
-      "157/157 [==============================] - 0s 89us/step - loss: 0.1730 - acc: 0.9618 - val_loss: 0.1599 - val_acc: 0.9118\n",
-      "Epoch 335/1000\n",
-      "157/157 [==============================] - 0s 307us/step - loss: 0.1734 - acc: 0.9554 - val_loss: 0.1576 - val_acc: 0.9118\n",
-      "Epoch 336/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.1722 - acc: 0.9554 - val_loss: 0.1603 - val_acc: 0.9118\n",
-      "Epoch 337/1000\n",
-      "157/157 [==============================] - 0s 270us/step - loss: 0.1731 - acc: 0.9554 - val_loss: 0.1617 - val_acc: 0.9118\n",
-      "Epoch 338/1000\n",
-      "157/157 [==============================] - 0s 234us/step - loss: 0.1704 - acc: 0.9618 - val_loss: 0.1619 - val_acc: 0.9118\n",
-      "Epoch 339/1000\n",
-      "157/157 [==============================] - 0s 400us/step - loss: 0.1709 - acc: 0.9618 - val_loss: 0.1589 - val_acc: 0.9118\n",
-      "Epoch 340/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.1708 - acc: 0.9554 - val_loss: 0.1590 - val_acc: 0.9118\n",
-      "Epoch 341/1000\n",
-      "157/157 [==============================] - 0s 390us/step - loss: 0.1694 - acc: 0.9618 - val_loss: 0.1590 - val_acc: 0.9118\n",
-      "Epoch 342/1000\n",
-      "157/157 [==============================] - 0s 122us/step - loss: 0.1713 - acc: 0.9618 - val_loss: 0.1567 - val_acc: 0.9118\n",
-      "Epoch 343/1000\n",
-      "157/157 [==============================] - 0s 201us/step - loss: 0.1708 - acc: 0.9554 - val_loss: 0.1574 - val_acc: 0.9118\n",
-      "Epoch 344/1000\n",
-      "157/157 [==============================] - 0s 223us/step - loss: 0.1679 - acc: 0.9618 - val_loss: 0.1572 - val_acc: 0.9118\n",
-      "Epoch 345/1000\n",
-      "157/157 [==============================] - 0s 126us/step - loss: 0.1690 - acc: 0.9618 - val_loss: 0.1560 - val_acc: 0.9118\n",
-      "Epoch 346/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.1676 - acc: 0.9618 - val_loss: 0.1558 - val_acc: 0.9118\n",
-      "Epoch 347/1000\n",
-      "157/157 [==============================] - 0s 169us/step - loss: 0.1666 - acc: 0.9618 - val_loss: 0.1556 - val_acc: 0.9118\n",
-      "Epoch 348/1000\n",
-      "157/157 [==============================] - 0s 198us/step - loss: 0.1675 - acc: 0.9618 - val_loss: 0.1548 - val_acc: 0.9118\n",
-      "Epoch 349/1000\n",
-      "157/157 [==============================] - 0s 515us/step - loss: 0.1679 - acc: 0.9554 - val_loss: 0.1564 - val_acc: 0.9118\n",
-      "Epoch 350/1000\n",
-      "157/157 [==============================] - 0s 200us/step - loss: 0.1647 - acc: 0.9554 - val_loss: 0.1566 - val_acc: 0.9118\n",
-      "Epoch 351/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.1647 - acc: 0.9618 - val_loss: 0.1564 - val_acc: 0.9118\n",
-      "Epoch 352/1000\n",
-      "157/157 [==============================] - 0s 258us/step - loss: 0.1645 - acc: 0.9554 - val_loss: 0.1550 - val_acc: 0.9118\n",
-      "Epoch 353/1000\n",
-      "157/157 [==============================] - 0s 230us/step - loss: 0.1639 - acc: 0.9618 - val_loss: 0.1523 - val_acc: 0.9118\n",
-      "Epoch 354/1000\n",
-      "157/157 [==============================] - 0s 370us/step - loss: 0.1632 - acc: 0.9554 - val_loss: 0.1571 - val_acc: 0.9118\n",
-      "Epoch 355/1000\n",
-      "157/157 [==============================] - 0s 126us/step - loss: 0.1624 - acc: 0.9618 - val_loss: 0.1561 - val_acc: 0.9118\n",
-      "Epoch 356/1000\n"
-     ]
-    },
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "157/157 [==============================] - 0s 169us/step - loss: 0.1626 - acc: 0.9554 - val_loss: 0.1579 - val_acc: 0.9118\n",
-      "Epoch 357/1000\n",
-      "157/157 [==============================] - 0s 159us/step - loss: 0.1629 - acc: 0.9618 - val_loss: 0.1585 - val_acc: 0.9118\n",
-      "Epoch 358/1000\n",
-      "157/157 [==============================] - 0s 405us/step - loss: 0.1603 - acc: 0.9618 - val_loss: 0.1544 - val_acc: 0.9118\n",
-      "Epoch 359/1000\n",
-      "157/157 [==============================] - 0s 236us/step - loss: 0.1603 - acc: 0.9618 - val_loss: 0.1535 - val_acc: 0.9118\n",
-      "Epoch 360/1000\n",
-      "157/157 [==============================] - 0s 242us/step - loss: 0.1589 - acc: 0.9618 - val_loss: 0.1526 - val_acc: 0.9118\n",
-      "Epoch 361/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.1592 - acc: 0.9618 - val_loss: 0.1506 - val_acc: 0.9118\n",
-      "Epoch 362/1000\n",
-      "157/157 [==============================] - 0s 115us/step - loss: 0.1594 - acc: 0.9682 - val_loss: 0.1509 - val_acc: 0.9118\n",
-      "Epoch 363/1000\n",
-      "157/157 [==============================] - 0s 370us/step - loss: 0.1597 - acc: 0.9618 - val_loss: 0.1533 - val_acc: 0.9118\n",
-      "Epoch 364/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.1570 - acc: 0.9618 - val_loss: 0.1521 - val_acc: 0.9118\n",
-      "Epoch 365/1000\n",
-      "157/157 [==============================] - 0s 266us/step - loss: 0.1576 - acc: 0.9682 - val_loss: 0.1506 - val_acc: 0.9118\n",
-      "Epoch 366/1000\n",
-      "157/157 [==============================] - 0s 177us/step - loss: 0.1570 - acc: 0.9618 - val_loss: 0.1503 - val_acc: 0.9118\n",
-      "Epoch 367/1000\n",
-      "157/157 [==============================] - 0s 211us/step - loss: 0.1559 - acc: 0.9682 - val_loss: 0.1476 - val_acc: 0.9118\n",
-      "Epoch 368/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.1570 - acc: 0.9618 - val_loss: 0.1474 - val_acc: 0.9118\n",
-      "Epoch 369/1000\n",
-      "157/157 [==============================] - 0s 226us/step - loss: 0.1546 - acc: 0.9618 - val_loss: 0.1506 - val_acc: 0.9118\n",
-      "Epoch 370/1000\n",
-      "157/157 [==============================] - 0s 217us/step - loss: 0.1558 - acc: 0.9618 - val_loss: 0.1481 - val_acc: 0.9118\n",
-      "Epoch 371/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.1538 - acc: 0.9682 - val_loss: 0.1487 - val_acc: 0.9118\n",
-      "Epoch 372/1000\n",
-      "157/157 [==============================] - 0s 183us/step - loss: 0.1553 - acc: 0.9618 - val_loss: 0.1535 - val_acc: 0.9118\n",
-      "Epoch 373/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.1530 - acc: 0.9682 - val_loss: 0.1485 - val_acc: 0.9118\n",
-      "Epoch 374/1000\n",
-      "157/157 [==============================] - 0s 234us/step - loss: 0.1523 - acc: 0.9682 - val_loss: 0.1469 - val_acc: 0.9118\n",
-      "Epoch 375/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.1525 - acc: 0.9554 - val_loss: 0.1511 - val_acc: 0.9118\n",
-      "Epoch 376/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.1517 - acc: 0.9682 - val_loss: 0.1465 - val_acc: 0.9118\n",
-      "Epoch 377/1000\n",
-      "157/157 [==============================] - 0s 251us/step - loss: 0.1517 - acc: 0.9554 - val_loss: 0.1484 - val_acc: 0.9118\n",
-      "Epoch 378/1000\n",
-      "157/157 [==============================] - 0s 212us/step - loss: 0.1517 - acc: 0.9682 - val_loss: 0.1423 - val_acc: 0.9265\n",
-      "Epoch 379/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.1503 - acc: 0.9554 - val_loss: 0.1454 - val_acc: 0.9118\n",
-      "Epoch 380/1000\n",
-      "157/157 [==============================] - 0s 236us/step - loss: 0.1516 - acc: 0.9618 - val_loss: 0.1442 - val_acc: 0.9118\n",
-      "Epoch 381/1000\n",
-      "157/157 [==============================] - 0s 510us/step - loss: 0.1500 - acc: 0.9554 - val_loss: 0.1458 - val_acc: 0.9118\n",
-      "Epoch 382/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.1501 - acc: 0.9618 - val_loss: 0.1458 - val_acc: 0.9118\n",
-      "Epoch 383/1000\n",
-      "157/157 [==============================] - 0s 248us/step - loss: 0.1480 - acc: 0.9682 - val_loss: 0.1443 - val_acc: 0.9118\n",
-      "Epoch 384/1000\n",
-      "157/157 [==============================] - 0s 187us/step - loss: 0.1478 - acc: 0.9682 - val_loss: 0.1416 - val_acc: 0.9118\n",
-      "Epoch 385/1000\n",
-      "157/157 [==============================] - 0s 216us/step - loss: 0.1494 - acc: 0.9554 - val_loss: 0.1457 - val_acc: 0.9118\n",
-      "Epoch 386/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.1468 - acc: 0.9554 - val_loss: 0.1471 - val_acc: 0.9118\n",
-      "Epoch 387/1000\n",
-      "157/157 [==============================] - 0s 287us/step - loss: 0.1498 - acc: 0.9618 - val_loss: 0.1455 - val_acc: 0.9118\n",
-      "Epoch 388/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.1461 - acc: 0.9618 - val_loss: 0.1428 - val_acc: 0.9118\n",
-      "Epoch 389/1000\n",
-      "157/157 [==============================] - 0s 238us/step - loss: 0.1462 - acc: 0.9682 - val_loss: 0.1408 - val_acc: 0.9118\n",
-      "Epoch 390/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.1461 - acc: 0.9682 - val_loss: 0.1391 - val_acc: 0.9265\n",
-      "Epoch 391/1000\n",
-      "157/157 [==============================] - 0s 248us/step - loss: 0.1469 - acc: 0.9618 - val_loss: 0.1421 - val_acc: 0.9118\n",
-      "Epoch 392/1000\n",
-      "157/157 [==============================] - 0s 286us/step - loss: 0.1452 - acc: 0.9618 - val_loss: 0.1442 - val_acc: 0.9118\n",
-      "Epoch 393/1000\n",
-      "157/157 [==============================] - 0s 219us/step - loss: 0.1450 - acc: 0.9682 - val_loss: 0.1417 - val_acc: 0.9118\n",
-      "Epoch 394/1000\n",
-      "157/157 [==============================] - 0s 304us/step - loss: 0.1441 - acc: 0.9682 - val_loss: 0.1389 - val_acc: 0.9265\n",
-      "Epoch 395/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.1445 - acc: 0.9618 - val_loss: 0.1395 - val_acc: 0.9118\n",
-      "Epoch 396/1000\n",
-      "157/157 [==============================] - 0s 136us/step - loss: 0.1428 - acc: 0.9618 - val_loss: 0.1397 - val_acc: 0.9118\n",
-      "Epoch 397/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.1423 - acc: 0.9618 - val_loss: 0.1434 - val_acc: 0.9118\n",
-      "Epoch 398/1000\n",
-      "157/157 [==============================] - 0s 101us/step - loss: 0.1423 - acc: 0.9618 - val_loss: 0.1395 - val_acc: 0.9118\n",
-      "Epoch 399/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.1431 - acc: 0.9618 - val_loss: 0.1416 - val_acc: 0.9118\n",
-      "Epoch 400/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.1409 - acc: 0.9682 - val_loss: 0.1410 - val_acc: 0.9118\n",
-      "Epoch 401/1000\n",
-      "157/157 [==============================] - 0s 107us/step - loss: 0.1412 - acc: 0.9682 - val_loss: 0.1374 - val_acc: 0.9265\n",
-      "Epoch 402/1000\n",
-      "157/157 [==============================] - 0s 208us/step - loss: 0.1421 - acc: 0.9618 - val_loss: 0.1356 - val_acc: 0.9265\n",
-      "Epoch 403/1000\n",
-      "157/157 [==============================] - 0s 104us/step - loss: 0.1397 - acc: 0.9682 - val_loss: 0.1400 - val_acc: 0.9118\n",
-      "Epoch 404/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.1398 - acc: 0.9682 - val_loss: 0.1353 - val_acc: 0.9265\n",
-      "Epoch 405/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.1405 - acc: 0.9618 - val_loss: 0.1359 - val_acc: 0.9265\n",
-      "Epoch 406/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.1382 - acc: 0.9682 - val_loss: 0.1353 - val_acc: 0.9265\n",
-      "Epoch 407/1000\n",
-      "157/157 [==============================] - 0s 251us/step - loss: 0.1390 - acc: 0.9682 - val_loss: 0.1349 - val_acc: 0.9265\n",
-      "Epoch 408/1000\n",
-      "157/157 [==============================] - 0s 96us/step - loss: 0.1378 - acc: 0.9682 - val_loss: 0.1368 - val_acc: 0.9118\n",
-      "Epoch 409/1000\n",
-      "157/157 [==============================] - 0s 245us/step - loss: 0.1372 - acc: 0.9682 - val_loss: 0.1443 - val_acc: 0.9118\n",
-      "Epoch 410/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.1363 - acc: 0.9745 - val_loss: 0.1365 - val_acc: 0.9118\n",
-      "Epoch 411/1000\n",
-      "157/157 [==============================] - 0s 164us/step - loss: 0.1366 - acc: 0.9682 - val_loss: 0.1385 - val_acc: 0.9118\n",
-      "Epoch 412/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.1371 - acc: 0.9745 - val_loss: 0.1335 - val_acc: 0.9265\n",
-      "Epoch 413/1000\n",
-      "157/157 [==============================] - 0s 175us/step - loss: 0.1374 - acc: 0.9682 - val_loss: 0.1359 - val_acc: 0.9118\n",
-      "Epoch 414/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.1346 - acc: 0.9682 - val_loss: 0.1370 - val_acc: 0.9118\n",
-      "Epoch 415/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.1352 - acc: 0.9682 - val_loss: 0.1363 - val_acc: 0.9118\n",
-      "Epoch 416/1000\n",
-      "157/157 [==============================] - 0s 130us/step - loss: 0.1364 - acc: 0.9745 - val_loss: 0.1368 - val_acc: 0.9118\n",
-      "Epoch 417/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.1350 - acc: 0.9682 - val_loss: 0.1334 - val_acc: 0.9265\n",
-      "Epoch 418/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.1272 - acc: 0.968 - 0s 196us/step - loss: 0.1331 - acc: 0.9682 - val_loss: 0.1357 - val_acc: 0.9118\n",
-      "Epoch 419/1000\n",
-      "157/157 [==============================] - 0s 169us/step - loss: 0.1353 - acc: 0.9682 - val_loss: 0.1349 - val_acc: 0.9118\n",
-      "Epoch 420/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.1324 - acc: 0.9682 - val_loss: 0.1378 - val_acc: 0.9118\n",
-      "Epoch 421/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.1327 - acc: 0.9682 - val_loss: 0.1348 - val_acc: 0.9118\n",
-      "Epoch 422/1000\n",
-      "157/157 [==============================] - 0s 167us/step - loss: 0.1332 - acc: 0.9618 - val_loss: 0.1372 - val_acc: 0.9118\n",
-      "Epoch 423/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.1327 - acc: 0.9745 - val_loss: 0.1362 - val_acc: 0.9118\n",
-      "Epoch 424/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.1312 - acc: 0.9682 - val_loss: 0.1381 - val_acc: 0.9118\n",
-      "Epoch 425/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.1311 - acc: 0.9745 - val_loss: 0.1374 - val_acc: 0.9118\n",
-      "Epoch 426/1000\n",
-      "157/157 [==============================] - 0s 262us/step - loss: 0.1334 - acc: 0.9745 - val_loss: 0.1333 - val_acc: 0.9118\n",
-      "Epoch 427/1000\n",
-      "157/157 [==============================] - 0s 115us/step - loss: 0.1305 - acc: 0.9745 - val_loss: 0.1298 - val_acc: 0.9265\n",
-      "Epoch 428/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.1292 - acc: 0.9682 - val_loss: 0.1335 - val_acc: 0.9118\n",
-      "Epoch 429/1000\n",
-      "157/157 [==============================] - 0s 135us/step - loss: 0.1336 - acc: 0.9745 - val_loss: 0.1308 - val_acc: 0.9265\n",
-      "Epoch 430/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.1289 - acc: 0.9682 - val_loss: 0.1346 - val_acc: 0.9118\n",
-      "Epoch 431/1000\n",
-      "157/157 [==============================] - 0s 182us/step - loss: 0.1287 - acc: 0.9682 - val_loss: 0.1320 - val_acc: 0.9118\n",
-      "Epoch 432/1000\n",
-      "157/157 [==============================] - 0s 93us/step - loss: 0.1281 - acc: 0.9745 - val_loss: 0.1291 - val_acc: 0.9265\n",
-      "Epoch 433/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.1292 - acc: 0.9682 - val_loss: 0.1353 - val_acc: 0.9118\n",
-      "Epoch 434/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.1274 - acc: 0.9682 - val_loss: 0.1329 - val_acc: 0.9118\n",
-      "Epoch 435/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.1275 - acc: 0.968 - 0s 161us/step - loss: 0.1285 - acc: 0.9618 - val_loss: 0.1295 - val_acc: 0.9265\n",
-      "Epoch 436/1000\n",
-      "157/157 [==============================] - 0s 272us/step - loss: 0.1267 - acc: 0.9809 - val_loss: 0.1263 - val_acc: 0.9265\n",
-      "Epoch 437/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.1276 - acc: 0.9745 - val_loss: 0.1332 - val_acc: 0.9118\n",
-      "Epoch 438/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.1263 - acc: 0.9809 - val_loss: 0.1271 - val_acc: 0.9265\n",
-      "Epoch 439/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.1260 - acc: 0.9682 - val_loss: 0.1315 - val_acc: 0.9118\n",
-      "Epoch 440/1000\n",
-      "157/157 [==============================] - 0s 324us/step - loss: 0.1257 - acc: 0.9682 - val_loss: 0.1332 - val_acc: 0.9118\n",
-      "Epoch 441/1000\n",
-      "157/157 [==============================] - 0s 97us/step - loss: 0.1251 - acc: 0.9745 - val_loss: 0.1377 - val_acc: 0.9118\n",
-      "Epoch 442/1000\n",
-      "157/157 [==============================] - 0s 95us/step - loss: 0.1241 - acc: 0.9809 - val_loss: 0.1281 - val_acc: 0.9265\n",
-      "Epoch 443/1000\n",
-      "157/157 [==============================] - 0s 121us/step - loss: 0.1252 - acc: 0.9745 - val_loss: 0.1287 - val_acc: 0.9265\n",
-      "Epoch 444/1000\n",
-      "157/157 [==============================] - 0s 247us/step - loss: 0.1250 - acc: 0.9682 - val_loss: 0.1315 - val_acc: 0.9118\n",
-      "Epoch 445/1000\n",
-      "157/157 [==============================] - 0s 166us/step - loss: 0.1244 - acc: 0.9745 - val_loss: 0.1322 - val_acc: 0.9118\n",
-      "Epoch 446/1000\n",
-      "157/157 [==============================] - 0s 174us/step - loss: 0.1242 - acc: 0.9809 - val_loss: 0.1319 - val_acc: 0.9118\n",
-      "Epoch 447/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.1229 - acc: 0.9809 - val_loss: 0.1292 - val_acc: 0.9118\n",
-      "Epoch 448/1000\n",
-      "157/157 [==============================] - 0s 244us/step - loss: 0.1223 - acc: 0.9682 - val_loss: 0.1276 - val_acc: 0.9265\n",
-      "Epoch 449/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.1216 - acc: 0.9745 - val_loss: 0.1283 - val_acc: 0.9265\n",
-      "Epoch 450/1000\n",
-      "157/157 [==============================] - 0s 251us/step - loss: 0.1211 - acc: 0.9745 - val_loss: 0.1277 - val_acc: 0.9265\n",
-      "Epoch 451/1000\n",
-      "157/157 [==============================] - 0s 252us/step - loss: 0.1208 - acc: 0.9682 - val_loss: 0.1333 - val_acc: 0.9118\n",
-      "Epoch 452/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.1207 - acc: 0.9745 - val_loss: 0.1294 - val_acc: 0.9118\n",
-      "Epoch 453/1000\n",
-      "157/157 [==============================] - 0s 195us/step - loss: 0.1214 - acc: 0.9745 - val_loss: 0.1307 - val_acc: 0.9118\n",
-      "Epoch 454/1000\n",
-      "157/157 [==============================] - 0s 198us/step - loss: 0.1217 - acc: 0.9745 - val_loss: 0.1293 - val_acc: 0.9118\n",
-      "Epoch 455/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.1203 - acc: 0.9682 - val_loss: 0.1302 - val_acc: 0.9118\n",
-      "Epoch 456/1000\n",
-      "157/157 [==============================] - 0s 268us/step - loss: 0.1188 - acc: 0.9682 - val_loss: 0.1237 - val_acc: 0.9265\n",
-      "Epoch 457/1000\n",
-      "157/157 [==============================] - 0s 285us/step - loss: 0.1221 - acc: 0.9682 - val_loss: 0.1258 - val_acc: 0.9265\n",
-      "Epoch 458/1000\n",
-      "157/157 [==============================] - 0s 277us/step - loss: 0.1183 - acc: 0.9745 - val_loss: 0.1293 - val_acc: 0.9118\n",
-      "Epoch 459/1000\n",
-      "157/157 [==============================] - 0s 124us/step - loss: 0.1191 - acc: 0.9745 - val_loss: 0.1256 - val_acc: 0.9265\n",
-      "Epoch 460/1000\n",
-      "157/157 [==============================] - 0s 204us/step - loss: 0.1178 - acc: 0.9682 - val_loss: 0.1273 - val_acc: 0.9118\n",
-      "Epoch 461/1000\n",
-      "157/157 [==============================] - 0s 118us/step - loss: 0.1176 - acc: 0.9682 - val_loss: 0.1316 - val_acc: 0.9118\n",
-      "Epoch 462/1000\n",
-      "157/157 [==============================] - 0s 124us/step - loss: 0.1192 - acc: 0.9809 - val_loss: 0.1270 - val_acc: 0.9118\n",
-      "Epoch 463/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.1189 - acc: 0.9809 - val_loss: 0.1244 - val_acc: 0.9265\n",
-      "Epoch 464/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.1176 - acc: 0.9682 - val_loss: 0.1269 - val_acc: 0.9118\n",
-      "Epoch 465/1000\n",
-      "157/157 [==============================] - 0s 122us/step - loss: 0.1179 - acc: 0.9809 - val_loss: 0.1228 - val_acc: 0.9265\n",
-      "Epoch 466/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.1171 - acc: 0.9682 - val_loss: 0.1279 - val_acc: 0.9118\n",
-      "Epoch 467/1000\n",
-      "157/157 [==============================] - 0s 111us/step - loss: 0.1173 - acc: 0.9745 - val_loss: 0.1255 - val_acc: 0.9265\n",
-      "Epoch 468/1000\n",
-      "157/157 [==============================] - 0s 116us/step - loss: 0.1161 - acc: 0.9745 - val_loss: 0.1241 - val_acc: 0.9265\n",
-      "Epoch 469/1000\n",
-      "157/157 [==============================] - 0s 221us/step - loss: 0.1147 - acc: 0.9745 - val_loss: 0.1247 - val_acc: 0.9265\n",
-      "Epoch 470/1000\n",
-      "157/157 [==============================] - 0s 211us/step - loss: 0.1158 - acc: 0.9809 - val_loss: 0.1220 - val_acc: 0.9265\n",
-      "Epoch 471/1000\n",
-      "157/157 [==============================] - 0s 126us/step - loss: 0.1143 - acc: 0.9682 - val_loss: 0.1254 - val_acc: 0.9118\n",
-      "Epoch 472/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.0983 - acc: 1.000 - 0s 138us/step - loss: 0.1152 - acc: 0.9745 - val_loss: 0.1270 - val_acc: 0.9118\n",
-      "Epoch 473/1000\n"
-     ]
-    },
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "157/157 [==============================] - 0s 162us/step - loss: 0.1152 - acc: 0.9745 - val_loss: 0.1234 - val_acc: 0.9265\n",
-      "Epoch 474/1000\n",
-      "157/157 [==============================] - 0s 139us/step - loss: 0.1172 - acc: 0.9682 - val_loss: 0.1206 - val_acc: 0.9265\n",
-      "Epoch 475/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.1132 - acc: 0.9745 - val_loss: 0.1195 - val_acc: 0.9265\n",
-      "Epoch 476/1000\n",
-      "157/157 [==============================] - 0s 140us/step - loss: 0.1137 - acc: 0.9682 - val_loss: 0.1242 - val_acc: 0.9265\n",
-      "Epoch 477/1000\n",
-      "157/157 [==============================] - 0s 165us/step - loss: 0.1127 - acc: 0.9682 - val_loss: 0.1263 - val_acc: 0.9118\n",
-      "Epoch 478/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.0833 - acc: 0.968 - 0s 168us/step - loss: 0.1156 - acc: 0.9682 - val_loss: 0.1246 - val_acc: 0.9118\n",
-      "Epoch 479/1000\n",
-      "157/157 [==============================] - 0s 169us/step - loss: 0.1123 - acc: 0.9745 - val_loss: 0.1227 - val_acc: 0.9265\n",
-      "Epoch 480/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.1129 - acc: 0.9809 - val_loss: 0.1195 - val_acc: 0.9265\n",
-      "Epoch 481/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.1133 - acc: 0.9682 - val_loss: 0.1225 - val_acc: 0.9265\n",
-      "Epoch 482/1000\n",
-      "157/157 [==============================] - 0s 205us/step - loss: 0.1130 - acc: 0.9745 - val_loss: 0.1259 - val_acc: 0.9118\n",
-      "Epoch 483/1000\n",
-      "157/157 [==============================] - 0s 140us/step - loss: 0.1113 - acc: 0.9745 - val_loss: 0.1233 - val_acc: 0.9265\n",
-      "Epoch 484/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.1114 - acc: 0.9745 - val_loss: 0.1219 - val_acc: 0.9265\n",
-      "Epoch 485/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.1110 - acc: 0.9809 - val_loss: 0.1184 - val_acc: 0.9265\n",
-      "Epoch 486/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.1107 - acc: 0.9745 - val_loss: 0.1186 - val_acc: 0.9265\n",
-      "Epoch 487/1000\n",
-      "157/157 [==============================] - 0s 157us/step - loss: 0.1109 - acc: 0.9745 - val_loss: 0.1221 - val_acc: 0.9265\n",
-      "Epoch 488/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.1108 - acc: 0.9745 - val_loss: 0.1222 - val_acc: 0.9265\n",
-      "Epoch 489/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.1097 - acc: 0.9745 - val_loss: 0.1182 - val_acc: 0.9265\n",
-      "Epoch 490/1000\n",
-      "157/157 [==============================] - 0s 190us/step - loss: 0.1088 - acc: 0.9745 - val_loss: 0.1242 - val_acc: 0.9118\n",
-      "Epoch 491/1000\n",
-      "157/157 [==============================] - 0s 209us/step - loss: 0.1098 - acc: 0.9809 - val_loss: 0.1260 - val_acc: 0.9118\n",
-      "Epoch 492/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.1094 - acc: 0.9809 - val_loss: 0.1200 - val_acc: 0.9265\n",
-      "Epoch 493/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.1081 - acc: 0.9745 - val_loss: 0.1216 - val_acc: 0.9265\n",
-      "Epoch 494/1000\n",
-      "157/157 [==============================] - 0s 212us/step - loss: 0.1083 - acc: 0.9745 - val_loss: 0.1202 - val_acc: 0.9265\n",
-      "Epoch 495/1000\n",
-      "157/157 [==============================] - 0s 223us/step - loss: 0.1083 - acc: 0.9809 - val_loss: 0.1167 - val_acc: 0.9412\n",
-      "Epoch 496/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.1081 - acc: 0.9809 - val_loss: 0.1146 - val_acc: 0.9412\n",
-      "Epoch 497/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.1083 - acc: 0.9745 - val_loss: 0.1203 - val_acc: 0.9265\n",
-      "Epoch 498/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.1076 - acc: 0.9745 - val_loss: 0.1177 - val_acc: 0.9265\n",
-      "Epoch 499/1000\n",
-      "157/157 [==============================] - 0s 158us/step - loss: 0.1064 - acc: 0.9745 - val_loss: 0.1182 - val_acc: 0.9265\n",
-      "Epoch 500/1000\n",
-      "157/157 [==============================] - 0s 134us/step - loss: 0.1057 - acc: 0.9745 - val_loss: 0.1216 - val_acc: 0.9265\n",
-      "Epoch 501/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.1051 - acc: 0.9809 - val_loss: 0.1191 - val_acc: 0.9265\n",
-      "Epoch 502/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.1072 - acc: 0.9809 - val_loss: 0.1160 - val_acc: 0.9265\n",
-      "Epoch 503/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.1057 - acc: 0.9745 - val_loss: 0.1191 - val_acc: 0.9265\n",
-      "Epoch 504/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.1049 - acc: 0.9745 - val_loss: 0.1157 - val_acc: 0.9265\n",
-      "Epoch 505/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.1047 - acc: 0.9745 - val_loss: 0.1153 - val_acc: 0.9265\n",
-      "Epoch 506/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.1035 - acc: 0.9745 - val_loss: 0.1202 - val_acc: 0.9265\n",
-      "Epoch 507/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.1053 - acc: 0.9745 - val_loss: 0.1162 - val_acc: 0.9265\n",
-      "Epoch 508/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.1041 - acc: 0.9745 - val_loss: 0.1157 - val_acc: 0.9265\n",
-      "Epoch 509/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.1039 - acc: 0.9745 - val_loss: 0.1160 - val_acc: 0.9265\n",
-      "Epoch 510/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.1037 - acc: 0.9745 - val_loss: 0.1190 - val_acc: 0.9265\n",
-      "Epoch 511/1000\n",
-      "157/157 [==============================] - 0s 135us/step - loss: 0.1018 - acc: 0.9809 - val_loss: 0.1144 - val_acc: 0.9412\n",
-      "Epoch 512/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.1041 - acc: 0.9745 - val_loss: 0.1220 - val_acc: 0.9118\n",
-      "Epoch 513/1000\n",
-      "157/157 [==============================] - 0s 166us/step - loss: 0.1027 - acc: 0.9745 - val_loss: 0.1166 - val_acc: 0.9265\n",
-      "Epoch 514/1000\n",
-      "157/157 [==============================] - 0s 132us/step - loss: 0.1014 - acc: 0.9745 - val_loss: 0.1172 - val_acc: 0.9265\n",
-      "Epoch 515/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.1013 - acc: 0.9745 - val_loss: 0.1135 - val_acc: 0.9412\n",
-      "Epoch 516/1000\n",
-      "157/157 [==============================] - 0s 336us/step - loss: 0.1011 - acc: 0.9809 - val_loss: 0.1219 - val_acc: 0.9118\n",
-      "Epoch 517/1000\n",
-      "157/157 [==============================] - 0s 199us/step - loss: 0.1009 - acc: 0.9745 - val_loss: 0.1199 - val_acc: 0.9265\n",
-      "Epoch 518/1000\n",
-      "157/157 [==============================] - 0s 139us/step - loss: 0.1040 - acc: 0.9745 - val_loss: 0.1156 - val_acc: 0.9265\n",
-      "Epoch 519/1000\n",
-      "157/157 [==============================] - 0s 114us/step - loss: 0.1007 - acc: 0.9809 - val_loss: 0.1185 - val_acc: 0.9265\n",
-      "Epoch 520/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.1000 - acc: 0.9809 - val_loss: 0.1175 - val_acc: 0.9265\n",
-      "Epoch 521/1000\n",
-      "157/157 [==============================] - 0s 187us/step - loss: 0.1008 - acc: 0.9745 - val_loss: 0.1102 - val_acc: 0.9559\n",
-      "Epoch 522/1000\n",
-      "157/157 [==============================] - 0s 145us/step - loss: 0.1014 - acc: 0.9809 - val_loss: 0.1169 - val_acc: 0.9265\n",
-      "Epoch 523/1000\n",
-      "157/157 [==============================] - 0s 139us/step - loss: 0.0998 - acc: 0.9745 - val_loss: 0.1142 - val_acc: 0.9265\n",
-      "Epoch 524/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.1006 - acc: 0.9809 - val_loss: 0.1158 - val_acc: 0.9265\n",
-      "Epoch 525/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.1005 - acc: 0.9745 - val_loss: 0.1154 - val_acc: 0.9265\n",
-      "Epoch 526/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.0992 - acc: 0.9745 - val_loss: 0.1177 - val_acc: 0.9265\n",
-      "Epoch 527/1000\n",
-      "157/157 [==============================] - 0s 145us/step - loss: 0.0990 - acc: 0.9745 - val_loss: 0.1193 - val_acc: 0.9118\n",
-      "Epoch 528/1000\n",
-      "157/157 [==============================] - 0s 139us/step - loss: 0.0984 - acc: 0.9745 - val_loss: 0.1163 - val_acc: 0.9265\n",
-      "Epoch 529/1000\n",
-      "157/157 [==============================] - 0s 147us/step - loss: 0.0990 - acc: 0.9745 - val_loss: 0.1127 - val_acc: 0.9412\n",
-      "Epoch 530/1000\n",
-      "157/157 [==============================] - 0s 340us/step - loss: 0.0972 - acc: 0.9809 - val_loss: 0.1201 - val_acc: 0.9118\n",
-      "Epoch 531/1000\n",
-      "157/157 [==============================] - 0s 277us/step - loss: 0.0994 - acc: 0.9745 - val_loss: 0.1132 - val_acc: 0.9412\n",
-      "Epoch 532/1000\n",
-      "157/157 [==============================] - 0s 287us/step - loss: 0.0974 - acc: 0.9745 - val_loss: 0.1083 - val_acc: 0.9559\n",
-      "Epoch 533/1000\n",
-      "157/157 [==============================] - 0s 209us/step - loss: 0.0983 - acc: 0.9809 - val_loss: 0.1158 - val_acc: 0.9265\n",
-      "Epoch 534/1000\n",
-      "157/157 [==============================] - 0s 231us/step - loss: 0.0970 - acc: 0.9809 - val_loss: 0.1187 - val_acc: 0.9118\n",
-      "Epoch 535/1000\n",
-      "157/157 [==============================] - 0s 176us/step - loss: 0.0970 - acc: 0.9745 - val_loss: 0.1159 - val_acc: 0.9265\n",
-      "Epoch 536/1000\n",
-      "157/157 [==============================] - 0s 350us/step - loss: 0.0962 - acc: 0.9745 - val_loss: 0.1169 - val_acc: 0.9265\n",
-      "Epoch 537/1000\n",
-      "157/157 [==============================] - 0s 226us/step - loss: 0.0974 - acc: 0.9745 - val_loss: 0.1178 - val_acc: 0.9265\n",
-      "Epoch 538/1000\n",
-      "157/157 [==============================] - 0s 330us/step - loss: 0.0957 - acc: 0.9809 - val_loss: 0.1080 - val_acc: 0.9559\n",
-      "Epoch 539/1000\n",
-      "157/157 [==============================] - 0s 259us/step - loss: 0.0956 - acc: 0.9873 - val_loss: 0.1122 - val_acc: 0.9412\n",
-      "Epoch 540/1000\n",
-      "157/157 [==============================] - 0s 208us/step - loss: 0.0954 - acc: 0.9745 - val_loss: 0.1095 - val_acc: 0.9559\n",
-      "Epoch 541/1000\n",
-      "157/157 [==============================] - 0s 140us/step - loss: 0.0965 - acc: 0.9809 - val_loss: 0.1096 - val_acc: 0.9559\n",
-      "Epoch 542/1000\n",
-      "157/157 [==============================] - 0s 139us/step - loss: 0.0943 - acc: 0.9809 - val_loss: 0.1129 - val_acc: 0.9265\n",
-      "Epoch 543/1000\n",
-      "157/157 [==============================] - 0s 209us/step - loss: 0.0951 - acc: 0.9809 - val_loss: 0.1135 - val_acc: 0.9265\n",
-      "Epoch 544/1000\n",
-      "157/157 [==============================] - 0s 130us/step - loss: 0.0950 - acc: 0.9809 - val_loss: 0.1144 - val_acc: 0.9265\n",
-      "Epoch 545/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.0944 - acc: 0.9745 - val_loss: 0.1093 - val_acc: 0.9559\n",
-      "Epoch 546/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0939 - acc: 0.9745 - val_loss: 0.1086 - val_acc: 0.9559\n",
-      "Epoch 547/1000\n",
-      "157/157 [==============================] - 0s 208us/step - loss: 0.0935 - acc: 0.9745 - val_loss: 0.1047 - val_acc: 0.9559\n",
-      "Epoch 548/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0938 - acc: 0.9809 - val_loss: 0.1097 - val_acc: 0.9559\n",
-      "Epoch 549/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.0938 - acc: 0.9809 - val_loss: 0.1163 - val_acc: 0.9265\n",
-      "Epoch 550/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.1062 - acc: 0.968 - 0s 177us/step - loss: 0.0944 - acc: 0.9809 - val_loss: 0.1127 - val_acc: 0.9265\n",
-      "Epoch 551/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.0920 - acc: 0.9809 - val_loss: 0.1087 - val_acc: 0.9559\n",
-      "Epoch 552/1000\n",
-      "157/157 [==============================] - 0s 159us/step - loss: 0.0917 - acc: 0.9873 - val_loss: 0.1094 - val_acc: 0.9559\n",
-      "Epoch 553/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0946 - acc: 0.9809 - val_loss: 0.1089 - val_acc: 0.9559\n",
-      "Epoch 554/1000\n",
-      "157/157 [==============================] - 0s 181us/step - loss: 0.0914 - acc: 0.9809 - val_loss: 0.1117 - val_acc: 0.9265\n",
-      "Epoch 555/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.0918 - acc: 0.9873 - val_loss: 0.1154 - val_acc: 0.9265\n",
-      "Epoch 556/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.0916 - acc: 0.9745 - val_loss: 0.1064 - val_acc: 0.9559\n",
-      "Epoch 557/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.0930 - acc: 0.9809 - val_loss: 0.1102 - val_acc: 0.9559\n",
-      "Epoch 558/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.0911 - acc: 0.9809 - val_loss: 0.1076 - val_acc: 0.9559\n",
-      "Epoch 559/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.0908 - acc: 0.9809 - val_loss: 0.1167 - val_acc: 0.9118\n",
-      "Epoch 560/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.0902 - acc: 0.9745 - val_loss: 0.1082 - val_acc: 0.9559\n",
-      "Epoch 561/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.0918 - acc: 0.9745 - val_loss: 0.1033 - val_acc: 0.9559\n",
-      "Epoch 562/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.0896 - acc: 0.9873 - val_loss: 0.1126 - val_acc: 0.9265\n",
-      "Epoch 563/1000\n",
-      "157/157 [==============================] - 0s 132us/step - loss: 0.0903 - acc: 0.9809 - val_loss: 0.1043 - val_acc: 0.9559\n",
-      "Epoch 564/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.0901 - acc: 0.9745 - val_loss: 0.1031 - val_acc: 0.9559\n",
-      "Epoch 565/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.0889 - acc: 0.9873 - val_loss: 0.1165 - val_acc: 0.9118\n",
-      "Epoch 566/1000\n",
-      "157/157 [==============================] - 0s 120us/step - loss: 0.0889 - acc: 0.9809 - val_loss: 0.1029 - val_acc: 0.9559\n",
-      "Epoch 567/1000\n",
-      "157/157 [==============================] - 0s 138us/step - loss: 0.0892 - acc: 0.9873 - val_loss: 0.1111 - val_acc: 0.9265\n",
-      "Epoch 568/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.0887 - acc: 0.9809 - val_loss: 0.1072 - val_acc: 0.9559\n",
-      "Epoch 569/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.0903 - acc: 0.9809 - val_loss: 0.1057 - val_acc: 0.9559\n",
-      "Epoch 570/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.0879 - acc: 0.9873 - val_loss: 0.1071 - val_acc: 0.9559\n",
-      "Epoch 571/1000\n",
-      "157/157 [==============================] - 0s 254us/step - loss: 0.0902 - acc: 0.9745 - val_loss: 0.1026 - val_acc: 0.9559\n",
-      "Epoch 572/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.0894 - acc: 0.9809 - val_loss: 0.1060 - val_acc: 0.9559\n",
-      "Epoch 573/1000\n",
-      "157/157 [==============================] - 0s 233us/step - loss: 0.0878 - acc: 0.9809 - val_loss: 0.1063 - val_acc: 0.9559\n",
-      "Epoch 574/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.0903 - acc: 0.9809 - val_loss: 0.1055 - val_acc: 0.9559\n",
-      "Epoch 575/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.0873 - acc: 0.9809 - val_loss: 0.1051 - val_acc: 0.9559\n",
-      "Epoch 576/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.0896 - acc: 0.9745 - val_loss: 0.1032 - val_acc: 0.9559\n",
-      "Epoch 577/1000\n",
-      "157/157 [==============================] - 0s 190us/step - loss: 0.0873 - acc: 0.9809 - val_loss: 0.1079 - val_acc: 0.9559\n",
-      "Epoch 578/1000\n",
-      "157/157 [==============================] - 0s 176us/step - loss: 0.0868 - acc: 0.9745 - val_loss: 0.1049 - val_acc: 0.9559\n",
-      "Epoch 579/1000\n",
-      "157/157 [==============================] - 0s 605us/step - loss: 0.0879 - acc: 0.9809 - val_loss: 0.1005 - val_acc: 0.9559\n",
-      "Epoch 580/1000\n",
-      "157/157 [==============================] - 0s 201us/step - loss: 0.0862 - acc: 0.9873 - val_loss: 0.1069 - val_acc: 0.9559\n",
-      "Epoch 581/1000\n",
-      "157/157 [==============================] - 0s 175us/step - loss: 0.0891 - acc: 0.9809 - val_loss: 0.1084 - val_acc: 0.9559\n",
-      "Epoch 582/1000\n",
-      "157/157 [==============================] - 0s 262us/step - loss: 0.0857 - acc: 0.9809 - val_loss: 0.1103 - val_acc: 0.9265\n",
-      "Epoch 583/1000\n",
-      "157/157 [==============================] - 0s 147us/step - loss: 0.0861 - acc: 0.9809 - val_loss: 0.1083 - val_acc: 0.9559\n",
-      "Epoch 584/1000\n",
-      "157/157 [==============================] - 0s 243us/step - loss: 0.0890 - acc: 0.9809 - val_loss: 0.1060 - val_acc: 0.9559\n",
-      "Epoch 585/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0853 - acc: 0.9809 - val_loss: 0.1076 - val_acc: 0.9559\n",
-      "Epoch 586/1000\n",
-      "157/157 [==============================] - 0s 294us/step - loss: 0.0869 - acc: 0.9745 - val_loss: 0.1027 - val_acc: 0.9559\n",
-      "Epoch 587/1000\n",
-      "157/157 [==============================] - 0s 279us/step - loss: 0.0859 - acc: 0.9809 - val_loss: 0.1073 - val_acc: 0.9559\n",
-      "Epoch 588/1000\n",
-      "157/157 [==============================] - 0s 215us/step - loss: 0.0849 - acc: 0.9873 - val_loss: 0.1131 - val_acc: 0.9265\n",
-      "Epoch 589/1000\n",
-      "157/157 [==============================] - 0s 357us/step - loss: 0.0856 - acc: 0.9745 - val_loss: 0.1021 - val_acc: 0.9559\n",
-      "Epoch 590/1000\n",
-      "157/157 [==============================] - 0s 113us/step - loss: 0.0865 - acc: 0.9873 - val_loss: 0.1045 - val_acc: 0.9559\n",
-      "Epoch 591/1000\n"
-     ]
-    },
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "157/157 [==============================] - 0s 132us/step - loss: 0.0840 - acc: 0.9809 - val_loss: 0.1031 - val_acc: 0.9559\n",
-      "Epoch 592/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0847 - acc: 0.9873 - val_loss: 0.1088 - val_acc: 0.9559\n",
-      "Epoch 593/1000\n",
-      "157/157 [==============================] - 0s 241us/step - loss: 0.0844 - acc: 0.9809 - val_loss: 0.1104 - val_acc: 0.9412\n",
-      "Epoch 594/1000\n",
-      "157/157 [==============================] - 0s 289us/step - loss: 0.0846 - acc: 0.9873 - val_loss: 0.1109 - val_acc: 0.9265\n",
-      "Epoch 595/1000\n",
-      "157/157 [==============================] - 0s 119us/step - loss: 0.0841 - acc: 0.9809 - val_loss: 0.1089 - val_acc: 0.9559\n",
-      "Epoch 596/1000\n",
-      "157/157 [==============================] - 0s 166us/step - loss: 0.0842 - acc: 0.9809 - val_loss: 0.1101 - val_acc: 0.9265\n",
-      "Epoch 597/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.0845 - acc: 0.9809 - val_loss: 0.1052 - val_acc: 0.9559\n",
-      "Epoch 598/1000\n",
-      "157/157 [==============================] - 0s 239us/step - loss: 0.0824 - acc: 0.9809 - val_loss: 0.1039 - val_acc: 0.9559\n",
-      "Epoch 599/1000\n",
-      "157/157 [==============================] - 0s 203us/step - loss: 0.0832 - acc: 0.9936 - val_loss: 0.1032 - val_acc: 0.9559\n",
-      "Epoch 600/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0825 - acc: 0.9809 - val_loss: 0.1041 - val_acc: 0.9559\n",
-      "Epoch 601/1000\n",
-      "157/157 [==============================] - 0s 224us/step - loss: 0.0831 - acc: 0.9809 - val_loss: 0.1038 - val_acc: 0.9559\n",
-      "Epoch 602/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.0829 - acc: 0.9809 - val_loss: 0.1097 - val_acc: 0.9412\n",
-      "Epoch 603/1000\n",
-      "157/157 [==============================] - 0s 259us/step - loss: 0.0830 - acc: 0.9809 - val_loss: 0.1023 - val_acc: 0.9559\n",
-      "Epoch 604/1000\n",
-      "157/157 [==============================] - 0s 224us/step - loss: 0.0824 - acc: 0.9873 - val_loss: 0.1132 - val_acc: 0.9118\n",
-      "Epoch 605/1000\n",
-      "157/157 [==============================] - 0s 96us/step - loss: 0.0827 - acc: 0.9809 - val_loss: 0.1121 - val_acc: 0.9265\n",
-      "Epoch 606/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0831 - acc: 0.9873 - val_loss: 0.1072 - val_acc: 0.9559\n",
-      "Epoch 607/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.0813 - acc: 0.9809 - val_loss: 0.1062 - val_acc: 0.9559\n",
-      "Epoch 608/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.0817 - acc: 0.9809 - val_loss: 0.1049 - val_acc: 0.9559\n",
-      "Epoch 609/1000\n",
-      "157/157 [==============================] - 0s 112us/step - loss: 0.0817 - acc: 0.9809 - val_loss: 0.1018 - val_acc: 0.9559\n",
-      "Epoch 610/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0808 - acc: 0.9873 - val_loss: 0.1021 - val_acc: 0.9559\n",
-      "Epoch 611/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.0812 - acc: 0.9873 - val_loss: 0.1007 - val_acc: 0.9559\n",
-      "Epoch 612/1000\n",
-      "157/157 [==============================] - 0s 502us/step - loss: 0.0817 - acc: 0.9873 - val_loss: 0.1089 - val_acc: 0.9412\n",
-      "Epoch 613/1000\n",
-      "157/157 [==============================] - 0s 312us/step - loss: 0.0810 - acc: 0.9809 - val_loss: 0.1070 - val_acc: 0.9559\n",
-      "Epoch 614/1000\n",
-      "157/157 [==============================] - 0s 203us/step - loss: 0.0806 - acc: 0.9873 - val_loss: 0.1112 - val_acc: 0.9265\n",
-      "Epoch 615/1000\n",
-      "157/157 [==============================] - 0s 243us/step - loss: 0.0797 - acc: 0.9809 - val_loss: 0.1005 - val_acc: 0.9559\n",
-      "Epoch 616/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.0810 - acc: 0.9809 - val_loss: 0.1057 - val_acc: 0.9559\n",
-      "Epoch 617/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.0825 - acc: 0.9809 - val_loss: 0.1116 - val_acc: 0.9265\n",
-      "Epoch 618/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.0801 - acc: 0.9809 - val_loss: 0.1021 - val_acc: 0.9559\n",
-      "Epoch 619/1000\n",
-      "157/157 [==============================] - 0s 198us/step - loss: 0.0788 - acc: 0.9809 - val_loss: 0.1018 - val_acc: 0.9559\n",
-      "Epoch 620/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0816 - acc: 0.9809 - val_loss: 0.1011 - val_acc: 0.9559\n",
-      "Epoch 621/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.0807 - acc: 0.9809 - val_loss: 0.1004 - val_acc: 0.9559\n",
-      "Epoch 622/1000\n",
-      "157/157 [==============================] - 0s 203us/step - loss: 0.0795 - acc: 0.9809 - val_loss: 0.0986 - val_acc: 0.9559\n",
-      "Epoch 623/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.0804 - acc: 0.9809 - val_loss: 0.1027 - val_acc: 0.9559\n",
-      "Epoch 624/1000\n",
-      "157/157 [==============================] - 0s 295us/step - loss: 0.0787 - acc: 0.9873 - val_loss: 0.1072 - val_acc: 0.9559\n",
-      "Epoch 625/1000\n",
-      "157/157 [==============================] - 0s 215us/step - loss: 0.0798 - acc: 0.9809 - val_loss: 0.1022 - val_acc: 0.9559\n",
-      "Epoch 626/1000\n",
-      "157/157 [==============================] - 0s 288us/step - loss: 0.0779 - acc: 0.9873 - val_loss: 0.1063 - val_acc: 0.9559\n",
-      "Epoch 627/1000\n",
-      "157/157 [==============================] - 0s 257us/step - loss: 0.0777 - acc: 0.9809 - val_loss: 0.1014 - val_acc: 0.9559\n",
-      "Epoch 628/1000\n",
-      "157/157 [==============================] - 0s 230us/step - loss: 0.0779 - acc: 0.9873 - val_loss: 0.0965 - val_acc: 0.9706\n",
-      "Epoch 629/1000\n",
-      "157/157 [==============================] - 0s 198us/step - loss: 0.0782 - acc: 0.9809 - val_loss: 0.1018 - val_acc: 0.9559\n",
-      "Epoch 630/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.0772 - acc: 0.9873 - val_loss: 0.1039 - val_acc: 0.9559\n",
-      "Epoch 631/1000\n",
-      "157/157 [==============================] - 0s 190us/step - loss: 0.0771 - acc: 0.9873 - val_loss: 0.1081 - val_acc: 0.9559\n",
-      "Epoch 632/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.0780 - acc: 0.9809 - val_loss: 0.0956 - val_acc: 0.9706\n",
-      "Epoch 633/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.0770 - acc: 0.9809 - val_loss: 0.0983 - val_acc: 0.9559\n",
-      "Epoch 634/1000\n",
-      "157/157 [==============================] - 0s 166us/step - loss: 0.0787 - acc: 0.9809 - val_loss: 0.1033 - val_acc: 0.9559\n",
-      "Epoch 635/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.0765 - acc: 0.9809 - val_loss: 0.0989 - val_acc: 0.9559\n",
-      "Epoch 636/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0778 - acc: 0.9809 - val_loss: 0.1014 - val_acc: 0.9559\n",
-      "Epoch 637/1000\n",
-      "157/157 [==============================] - 0s 125us/step - loss: 0.0762 - acc: 0.9873 - val_loss: 0.1015 - val_acc: 0.9559\n",
-      "Epoch 638/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0757 - acc: 0.9873 - val_loss: 0.1073 - val_acc: 0.9559\n",
-      "Epoch 639/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.0777 - acc: 0.9873 - val_loss: 0.1046 - val_acc: 0.9559\n",
-      "Epoch 640/1000\n",
-      "157/157 [==============================] - 0s 128us/step - loss: 0.0768 - acc: 0.9873 - val_loss: 0.1076 - val_acc: 0.9559\n",
-      "Epoch 641/1000\n",
-      "157/157 [==============================] - 0s 147us/step - loss: 0.0783 - acc: 0.9745 - val_loss: 0.1049 - val_acc: 0.9559\n",
-      "Epoch 642/1000\n",
-      "157/157 [==============================] - 0s 256us/step - loss: 0.0751 - acc: 0.9873 - val_loss: 0.1052 - val_acc: 0.9559\n",
-      "Epoch 643/1000\n",
-      "157/157 [==============================] - 0s 252us/step - loss: 0.0759 - acc: 0.9809 - val_loss: 0.0991 - val_acc: 0.9559\n",
-      "Epoch 644/1000\n",
-      "157/157 [==============================] - 0s 302us/step - loss: 0.0753 - acc: 0.9809 - val_loss: 0.0980 - val_acc: 0.9559\n",
-      "Epoch 645/1000\n",
-      "157/157 [==============================] - 0s 325us/step - loss: 0.0758 - acc: 0.9809 - val_loss: 0.0942 - val_acc: 0.9706\n",
-      "Epoch 646/1000\n",
-      "157/157 [==============================] - 0s 242us/step - loss: 0.0751 - acc: 0.9873 - val_loss: 0.0975 - val_acc: 0.9559\n",
-      "Epoch 647/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.0740 - acc: 0.9873 - val_loss: 0.1068 - val_acc: 0.9559\n",
-      "Epoch 648/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.0754 - acc: 0.9745 - val_loss: 0.0966 - val_acc: 0.9559\n",
-      "Epoch 649/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.0736 - acc: 0.9873 - val_loss: 0.1073 - val_acc: 0.9559\n",
-      "Epoch 650/1000\n",
-      "157/157 [==============================] - 0s 119us/step - loss: 0.0754 - acc: 0.9809 - val_loss: 0.0996 - val_acc: 0.9559\n",
-      "Epoch 651/1000\n",
-      "157/157 [==============================] - 0s 125us/step - loss: 0.0730 - acc: 0.9873 - val_loss: 0.1035 - val_acc: 0.9559\n",
-      "Epoch 652/1000\n",
-      "157/157 [==============================] - 0s 100us/step - loss: 0.0745 - acc: 0.9873 - val_loss: 0.1131 - val_acc: 0.9118\n",
-      "Epoch 653/1000\n",
-      "157/157 [==============================] - 0s 176us/step - loss: 0.0778 - acc: 0.9809 - val_loss: 0.1017 - val_acc: 0.9559\n",
-      "Epoch 654/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.0732 - acc: 0.9809 - val_loss: 0.0970 - val_acc: 0.9559\n",
-      "Epoch 655/1000\n",
-      "157/157 [==============================] - 0s 214us/step - loss: 0.0734 - acc: 0.9873 - val_loss: 0.1025 - val_acc: 0.9559\n",
-      "Epoch 656/1000\n",
-      "157/157 [==============================] - 0s 240us/step - loss: 0.0732 - acc: 0.9809 - val_loss: 0.1020 - val_acc: 0.9559\n",
-      "Epoch 657/1000\n",
-      "157/157 [==============================] - 0s 255us/step - loss: 0.0733 - acc: 0.9809 - val_loss: 0.0967 - val_acc: 0.9559\n",
-      "Epoch 658/1000\n",
-      "157/157 [==============================] - 0s 187us/step - loss: 0.0724 - acc: 0.9873 - val_loss: 0.0989 - val_acc: 0.9559\n",
-      "Epoch 659/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.0740 - acc: 0.9873 - val_loss: 0.0978 - val_acc: 0.9559\n",
-      "Epoch 660/1000\n",
-      "157/157 [==============================] - 0s 378us/step - loss: 0.0744 - acc: 0.9873 - val_loss: 0.0978 - val_acc: 0.9559\n",
-      "Epoch 661/1000\n",
-      "157/157 [==============================] - 0s 243us/step - loss: 0.0719 - acc: 0.9873 - val_loss: 0.0981 - val_acc: 0.9559\n",
-      "Epoch 662/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.0727 - acc: 0.9873 - val_loss: 0.0941 - val_acc: 0.9706\n",
-      "Epoch 663/1000\n",
-      "157/157 [==============================] - 0s 114us/step - loss: 0.0730 - acc: 0.9873 - val_loss: 0.1029 - val_acc: 0.9559\n",
-      "Epoch 664/1000\n",
-      "157/157 [==============================] - 0s 411us/step - loss: 0.0738 - acc: 0.9809 - val_loss: 0.0987 - val_acc: 0.9559\n",
-      "Epoch 665/1000\n",
-      "157/157 [==============================] - 0s 329us/step - loss: 0.0722 - acc: 0.9809 - val_loss: 0.0947 - val_acc: 0.9559\n",
-      "Epoch 666/1000\n",
-      "157/157 [==============================] - 0s 335us/step - loss: 0.0732 - acc: 0.9873 - val_loss: 0.0993 - val_acc: 0.9559\n",
-      "Epoch 667/1000\n",
-      "157/157 [==============================] - 0s 206us/step - loss: 0.0710 - acc: 0.9809 - val_loss: 0.0935 - val_acc: 0.9706\n",
-      "Epoch 668/1000\n",
-      "157/157 [==============================] - 0s 101us/step - loss: 0.0709 - acc: 0.9873 - val_loss: 0.0988 - val_acc: 0.9559\n",
-      "Epoch 669/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0705 - acc: 0.9873 - val_loss: 0.1039 - val_acc: 0.9559\n",
-      "Epoch 670/1000\n",
-      "157/157 [==============================] - 0s 122us/step - loss: 0.0711 - acc: 0.9809 - val_loss: 0.1019 - val_acc: 0.9559\n",
-      "Epoch 671/1000\n",
-      "157/157 [==============================] - 0s 105us/step - loss: 0.0704 - acc: 0.9809 - val_loss: 0.1007 - val_acc: 0.9559\n",
-      "Epoch 672/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.0749 - acc: 0.9873 - val_loss: 0.0974 - val_acc: 0.9559\n",
-      "Epoch 673/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.0726 - acc: 0.9873 - val_loss: 0.1020 - val_acc: 0.9559\n",
-      "Epoch 674/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.0713 - acc: 0.9809 - val_loss: 0.0986 - val_acc: 0.9559\n",
-      "Epoch 675/1000\n",
-      "157/157 [==============================] - 0s 216us/step - loss: 0.0705 - acc: 0.9873 - val_loss: 0.1031 - val_acc: 0.9559\n",
-      "Epoch 676/1000\n",
-      "157/157 [==============================] - 0s 291us/step - loss: 0.0723 - acc: 0.9809 - val_loss: 0.1020 - val_acc: 0.9559\n",
-      "Epoch 677/1000\n",
-      "157/157 [==============================] - 0s 242us/step - loss: 0.0709 - acc: 0.9809 - val_loss: 0.0967 - val_acc: 0.9559\n",
-      "Epoch 678/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.0705 - acc: 0.9873 - val_loss: 0.1041 - val_acc: 0.9559\n",
-      "Epoch 679/1000\n",
-      "157/157 [==============================] - 0s 226us/step - loss: 0.0711 - acc: 0.9873 - val_loss: 0.1024 - val_acc: 0.9559\n",
-      "Epoch 680/1000\n",
-      "157/157 [==============================] - 0s 330us/step - loss: 0.0696 - acc: 0.9873 - val_loss: 0.0955 - val_acc: 0.9559\n",
-      "Epoch 681/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0697 - acc: 0.9809 - val_loss: 0.0930 - val_acc: 0.9706\n",
-      "Epoch 682/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.0700 - acc: 0.9809 - val_loss: 0.0955 - val_acc: 0.9559\n",
-      "Epoch 683/1000\n",
-      "157/157 [==============================] - 0s 282us/step - loss: 0.0687 - acc: 0.9873 - val_loss: 0.1031 - val_acc: 0.9559\n",
-      "Epoch 684/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0697 - acc: 0.9873 - val_loss: 0.1033 - val_acc: 0.9559\n",
-      "Epoch 685/1000\n",
-      "157/157 [==============================] - 0s 111us/step - loss: 0.0702 - acc: 0.9809 - val_loss: 0.0906 - val_acc: 0.9706\n",
-      "Epoch 686/1000\n",
-      "157/157 [==============================] - 0s 121us/step - loss: 0.0692 - acc: 0.9873 - val_loss: 0.0927 - val_acc: 0.9706\n",
-      "Epoch 687/1000\n",
-      "157/157 [==============================] - 0s 216us/step - loss: 0.0679 - acc: 0.9873 - val_loss: 0.1006 - val_acc: 0.9559\n",
-      "Epoch 688/1000\n",
-      "157/157 [==============================] - 0s 204us/step - loss: 0.0714 - acc: 0.9809 - val_loss: 0.0951 - val_acc: 0.9559\n",
-      "Epoch 689/1000\n",
-      "157/157 [==============================] - 0s 271us/step - loss: 0.0684 - acc: 0.9873 - val_loss: 0.1033 - val_acc: 0.9559\n",
-      "Epoch 690/1000\n",
-      "157/157 [==============================] - 0s 239us/step - loss: 0.0685 - acc: 0.9873 - val_loss: 0.0967 - val_acc: 0.9559\n",
-      "Epoch 691/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.0684 - acc: 0.9873 - val_loss: 0.1023 - val_acc: 0.9559\n",
-      "Epoch 692/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.0690 - acc: 0.9809 - val_loss: 0.0969 - val_acc: 0.9559\n",
-      "Epoch 693/1000\n",
-      "157/157 [==============================] - 0s 237us/step - loss: 0.0699 - acc: 0.9745 - val_loss: 0.0944 - val_acc: 0.9559\n",
-      "Epoch 694/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.0678 - acc: 0.9873 - val_loss: 0.0987 - val_acc: 0.9559\n",
-      "Epoch 695/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.0668 - acc: 0.9873 - val_loss: 0.0994 - val_acc: 0.9559\n",
-      "Epoch 696/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.0692 - acc: 0.9809 - val_loss: 0.1031 - val_acc: 0.9559\n",
-      "Epoch 697/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.0689 - acc: 0.9809 - val_loss: 0.0985 - val_acc: 0.9559\n",
-      "Epoch 698/1000\n",
-      "157/157 [==============================] - 0s 299us/step - loss: 0.0676 - acc: 0.9873 - val_loss: 0.0983 - val_acc: 0.9559\n",
-      "Epoch 699/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.0671 - acc: 0.9809 - val_loss: 0.0983 - val_acc: 0.9559\n",
-      "Epoch 700/1000\n",
-      "157/157 [==============================] - 0s 209us/step - loss: 0.0695 - acc: 0.9809 - val_loss: 0.1002 - val_acc: 0.9559\n",
-      "Epoch 701/1000\n",
-      "157/157 [==============================] - 0s 175us/step - loss: 0.0665 - acc: 0.9873 - val_loss: 0.0976 - val_acc: 0.9559\n",
-      "Epoch 702/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.0687 - acc: 0.9873 - val_loss: 0.0940 - val_acc: 0.9559\n",
-      "Epoch 703/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0663 - acc: 0.9873 - val_loss: 0.0949 - val_acc: 0.9559\n",
-      "Epoch 704/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.0661 - acc: 0.9873 - val_loss: 0.0946 - val_acc: 0.9559\n",
-      "Epoch 705/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.0677 - acc: 0.9809 - val_loss: 0.0947 - val_acc: 0.9559\n",
-      "Epoch 706/1000\n",
-      "157/157 [==============================] - 0s 127us/step - loss: 0.0674 - acc: 0.9809 - val_loss: 0.0942 - val_acc: 0.9559\n",
-      "Epoch 707/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.0664 - acc: 0.9873 - val_loss: 0.0948 - val_acc: 0.9559\n",
-      "Epoch 708/1000\n",
-      "157/157 [==============================] - 0s 125us/step - loss: 0.0664 - acc: 0.9873 - val_loss: 0.0964 - val_acc: 0.9559\n",
-      "Epoch 709/1000\n"
-     ]
-    },
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "157/157 [==============================] - 0s 150us/step - loss: 0.0652 - acc: 0.9873 - val_loss: 0.0961 - val_acc: 0.9559\n",
-      "Epoch 710/1000\n",
-      "157/157 [==============================] - 0s 120us/step - loss: 0.0680 - acc: 0.9809 - val_loss: 0.0943 - val_acc: 0.9559\n",
-      "Epoch 711/1000\n",
-      "157/157 [==============================] - 0s 138us/step - loss: 0.0652 - acc: 0.9809 - val_loss: 0.0913 - val_acc: 0.9706\n",
-      "Epoch 712/1000\n",
-      "157/157 [==============================] - 0s 127us/step - loss: 0.0655 - acc: 0.9873 - val_loss: 0.0945 - val_acc: 0.9559\n",
-      "Epoch 713/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.0666 - acc: 0.9873 - val_loss: 0.0895 - val_acc: 0.9706\n",
-      "Epoch 714/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.0658 - acc: 0.9936 - val_loss: 0.0982 - val_acc: 0.9559\n",
-      "Epoch 715/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.0653 - acc: 0.9873 - val_loss: 0.0988 - val_acc: 0.9559\n",
-      "Epoch 716/1000\n",
-      "157/157 [==============================] - 0s 310us/step - loss: 0.0642 - acc: 0.9873 - val_loss: 0.0943 - val_acc: 0.9559\n",
-      "Epoch 717/1000\n",
-      "157/157 [==============================] - 0s 223us/step - loss: 0.0660 - acc: 0.9809 - val_loss: 0.0910 - val_acc: 0.9706\n",
-      "Epoch 718/1000\n",
-      "157/157 [==============================] - 0s 237us/step - loss: 0.0649 - acc: 0.9873 - val_loss: 0.0950 - val_acc: 0.9559\n",
-      "Epoch 719/1000\n",
-      "157/157 [==============================] - 0s 221us/step - loss: 0.0636 - acc: 0.9873 - val_loss: 0.0951 - val_acc: 0.9559\n",
-      "Epoch 720/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.0661 - acc: 0.9809 - val_loss: 0.0948 - val_acc: 0.9559\n",
-      "Epoch 721/1000\n",
-      "157/157 [==============================] - 0s 219us/step - loss: 0.0644 - acc: 0.9809 - val_loss: 0.0915 - val_acc: 0.9559\n",
-      "Epoch 722/1000\n",
-      "157/157 [==============================] - 0s 127us/step - loss: 0.0644 - acc: 0.9936 - val_loss: 0.0999 - val_acc: 0.9559\n",
-      "Epoch 723/1000\n",
-      "157/157 [==============================] - 0s 183us/step - loss: 0.0641 - acc: 0.9809 - val_loss: 0.0966 - val_acc: 0.9559\n",
-      "Epoch 724/1000\n",
-      "157/157 [==============================] - 0s 190us/step - loss: 0.0634 - acc: 0.9873 - val_loss: 0.0916 - val_acc: 0.9559\n",
-      "Epoch 725/1000\n",
-      "157/157 [==============================] - 0s 139us/step - loss: 0.0638 - acc: 0.9873 - val_loss: 0.0897 - val_acc: 0.9706\n",
-      "Epoch 726/1000\n",
-      "157/157 [==============================] - 0s 229us/step - loss: 0.0631 - acc: 0.9873 - val_loss: 0.0922 - val_acc: 0.9559\n",
-      "Epoch 727/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.0635 - acc: 0.9809 - val_loss: 0.0877 - val_acc: 0.9706\n",
-      "Epoch 728/1000\n",
-      "157/157 [==============================] - 0s 222us/step - loss: 0.0632 - acc: 0.9936 - val_loss: 0.0944 - val_acc: 0.9559\n",
-      "Epoch 729/1000\n",
-      "157/157 [==============================] - 0s 143us/step - loss: 0.0631 - acc: 0.9809 - val_loss: 0.0848 - val_acc: 0.9706\n",
-      "Epoch 730/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.0641 - acc: 0.9873 - val_loss: 0.0898 - val_acc: 0.9706\n",
-      "Epoch 731/1000\n",
-      "157/157 [==============================] - 0s 132us/step - loss: 0.0624 - acc: 0.9873 - val_loss: 0.0859 - val_acc: 0.9706\n",
-      "Epoch 732/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0627 - acc: 0.9936 - val_loss: 0.0920 - val_acc: 0.9559\n",
-      "Epoch 733/1000\n",
-      "157/157 [==============================] - 0s 147us/step - loss: 0.0618 - acc: 0.9936 - val_loss: 0.0944 - val_acc: 0.9559\n",
-      "Epoch 734/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.0655 - acc: 0.9745 - val_loss: 0.0916 - val_acc: 0.9559\n",
-      "Epoch 735/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.0619 - acc: 0.9936 - val_loss: 0.0957 - val_acc: 0.9559\n",
-      "Epoch 736/1000\n",
-      "157/157 [==============================] - 0s 136us/step - loss: 0.0646 - acc: 0.9809 - val_loss: 0.0965 - val_acc: 0.9559\n",
-      "Epoch 737/1000\n",
-      "157/157 [==============================] - 0s 132us/step - loss: 0.0623 - acc: 0.9809 - val_loss: 0.0920 - val_acc: 0.9559\n",
-      "Epoch 738/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.0618 - acc: 0.9873 - val_loss: 0.0971 - val_acc: 0.9559\n",
-      "Epoch 739/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0612 - acc: 0.9936 - val_loss: 0.0957 - val_acc: 0.9559\n",
-      "Epoch 740/1000\n",
-      "157/157 [==============================] - 0s 160us/step - loss: 0.0626 - acc: 0.9873 - val_loss: 0.0922 - val_acc: 0.9559\n",
-      "Epoch 741/1000\n",
-      "157/157 [==============================] - 0s 158us/step - loss: 0.0620 - acc: 0.9936 - val_loss: 0.0915 - val_acc: 0.9559\n",
-      "Epoch 742/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.0297 - acc: 1.000 - 0s 213us/step - loss: 0.0619 - acc: 0.9936 - val_loss: 0.0895 - val_acc: 0.9706\n",
-      "Epoch 743/1000\n",
-      "157/157 [==============================] - 0s 179us/step - loss: 0.0621 - acc: 0.9936 - val_loss: 0.0893 - val_acc: 0.9706\n",
-      "Epoch 744/1000\n",
-      "157/157 [==============================] - 0s 329us/step - loss: 0.0629 - acc: 0.9809 - val_loss: 0.0915 - val_acc: 0.9559\n",
-      "Epoch 745/1000\n",
-      "157/157 [==============================] - 0s 194us/step - loss: 0.0617 - acc: 0.9809 - val_loss: 0.0920 - val_acc: 0.9559\n",
-      "Epoch 746/1000\n",
-      "157/157 [==============================] - 0s 199us/step - loss: 0.0602 - acc: 0.9936 - val_loss: 0.0965 - val_acc: 0.9559\n",
-      "Epoch 747/1000\n",
-      "157/157 [==============================] - 0s 271us/step - loss: 0.0611 - acc: 0.9745 - val_loss: 0.0925 - val_acc: 0.9559\n",
-      "Epoch 748/1000\n",
-      "157/157 [==============================] - 0s 189us/step - loss: 0.0606 - acc: 0.9936 - val_loss: 0.1003 - val_acc: 0.9559\n",
-      "Epoch 749/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.0601 - acc: 0.9873 - val_loss: 0.0938 - val_acc: 0.9559\n",
-      "Epoch 750/1000\n",
-      "157/157 [==============================] - 0s 103us/step - loss: 0.0618 - acc: 0.9809 - val_loss: 0.0885 - val_acc: 0.9706\n",
-      "Epoch 751/1000\n",
-      "157/157 [==============================] - 0s 196us/step - loss: 0.0602 - acc: 0.9936 - val_loss: 0.0954 - val_acc: 0.9559\n",
-      "Epoch 752/1000\n",
-      "157/157 [==============================] - 0s 164us/step - loss: 0.0627 - acc: 0.9873 - val_loss: 0.0938 - val_acc: 0.9559\n",
-      "Epoch 753/1000\n",
-      "157/157 [==============================] - 0s 134us/step - loss: 0.0601 - acc: 0.9873 - val_loss: 0.0922 - val_acc: 0.9559\n",
-      "Epoch 754/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0621 - acc: 0.9809 - val_loss: 0.0881 - val_acc: 0.9706\n",
-      "Epoch 755/1000\n",
-      "157/157 [==============================] - 0s 119us/step - loss: 0.0610 - acc: 0.9936 - val_loss: 0.0950 - val_acc: 0.9559\n",
-      "Epoch 756/1000\n",
-      "157/157 [==============================] - 0s 110us/step - loss: 0.0597 - acc: 0.9936 - val_loss: 0.0900 - val_acc: 0.9706\n",
-      "Epoch 757/1000\n",
-      "157/157 [==============================] - 0s 126us/step - loss: 0.0595 - acc: 0.9809 - val_loss: 0.0857 - val_acc: 0.9706\n",
-      "Epoch 758/1000\n",
-      "157/157 [==============================] - 0s 110us/step - loss: 0.0592 - acc: 0.9936 - val_loss: 0.0940 - val_acc: 0.9559\n",
-      "Epoch 759/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.0606 - acc: 0.9936 - val_loss: 0.0983 - val_acc: 0.9559\n",
-      "Epoch 760/1000\n",
-      "157/157 [==============================] - 0s 135us/step - loss: 0.0590 - acc: 0.9873 - val_loss: 0.0945 - val_acc: 0.9559\n",
-      "Epoch 761/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0595 - acc: 0.9809 - val_loss: 0.0845 - val_acc: 0.9706\n",
-      "Epoch 762/1000\n",
-      "157/157 [==============================] - 0s 130us/step - loss: 0.0599 - acc: 0.9873 - val_loss: 0.0842 - val_acc: 0.9706\n",
-      "Epoch 763/1000\n",
-      "157/157 [==============================] - 0s 136us/step - loss: 0.0598 - acc: 0.9936 - val_loss: 0.0896 - val_acc: 0.9706\n",
-      "Epoch 764/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0580 - acc: 0.9936 - val_loss: 0.0914 - val_acc: 0.9559\n",
-      "Epoch 765/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.0588 - acc: 0.9936 - val_loss: 0.0892 - val_acc: 0.9706\n",
-      "Epoch 766/1000\n",
-      "157/157 [==============================] - 0s 200us/step - loss: 0.0588 - acc: 0.9936 - val_loss: 0.0994 - val_acc: 0.9559\n",
-      "Epoch 767/1000\n",
-      "157/157 [==============================] - 0s 249us/step - loss: 0.0592 - acc: 0.9936 - val_loss: 0.0977 - val_acc: 0.9559\n",
-      "Epoch 768/1000\n",
-      "157/157 [==============================] - 0s 222us/step - loss: 0.0583 - acc: 0.9809 - val_loss: 0.0848 - val_acc: 0.9706\n",
-      "Epoch 769/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.0591 - acc: 0.9936 - val_loss: 0.0839 - val_acc: 0.9706\n",
-      "Epoch 770/1000\n",
-      "157/157 [==============================] - 0s 146us/step - loss: 0.0587 - acc: 0.9873 - val_loss: 0.0837 - val_acc: 0.9706\n",
-      "Epoch 771/1000\n",
-      "157/157 [==============================] - 0s 225us/step - loss: 0.0594 - acc: 0.9936 - val_loss: 0.0918 - val_acc: 0.9559\n",
-      "Epoch 772/1000\n",
-      "157/157 [==============================] - 0s 410us/step - loss: 0.0575 - acc: 0.9809 - val_loss: 0.0849 - val_acc: 0.9706\n",
-      "Epoch 773/1000\n",
-      "157/157 [==============================] - 0s 458us/step - loss: 0.0594 - acc: 0.9936 - val_loss: 0.0926 - val_acc: 0.9559\n",
-      "Epoch 774/1000\n",
-      "157/157 [==============================] - 0s 380us/step - loss: 0.0587 - acc: 0.9936 - val_loss: 0.0882 - val_acc: 0.9706\n",
-      "Epoch 775/1000\n",
-      "157/157 [==============================] - 0s 278us/step - loss: 0.0587 - acc: 0.9936 - val_loss: 0.0944 - val_acc: 0.9559\n",
-      "Epoch 776/1000\n",
-      "157/157 [==============================] - 0s 210us/step - loss: 0.0579 - acc: 0.9936 - val_loss: 0.0958 - val_acc: 0.9559\n",
-      "Epoch 777/1000\n",
-      "157/157 [==============================] - 0s 221us/step - loss: 0.0574 - acc: 0.9873 - val_loss: 0.0911 - val_acc: 0.9559\n",
-      "Epoch 778/1000\n",
-      "157/157 [==============================] - 0s 187us/step - loss: 0.0602 - acc: 0.9809 - val_loss: 0.0920 - val_acc: 0.9559\n",
-      "Epoch 779/1000\n",
-      "157/157 [==============================] - 0s 159us/step - loss: 0.0567 - acc: 0.9873 - val_loss: 0.0883 - val_acc: 0.9706\n",
-      "Epoch 780/1000\n",
-      "157/157 [==============================] - 0s 209us/step - loss: 0.0570 - acc: 0.9936 - val_loss: 0.0914 - val_acc: 0.9559\n",
-      "Epoch 781/1000\n",
-      "157/157 [==============================] - 0s 214us/step - loss: 0.0562 - acc: 0.9936 - val_loss: 0.0909 - val_acc: 0.9559\n",
-      "Epoch 782/1000\n",
-      "157/157 [==============================] - 0s 203us/step - loss: 0.0574 - acc: 0.9873 - val_loss: 0.0836 - val_acc: 0.9706\n",
-      "Epoch 783/1000\n",
-      "157/157 [==============================] - 0s 179us/step - loss: 0.0572 - acc: 0.9936 - val_loss: 0.0966 - val_acc: 0.9559\n",
-      "Epoch 784/1000\n",
-      "157/157 [==============================] - 0s 137us/step - loss: 0.0567 - acc: 0.9873 - val_loss: 0.0908 - val_acc: 0.9559\n",
-      "Epoch 785/1000\n",
-      "157/157 [==============================] - 0s 400us/step - loss: 0.0568 - acc: 0.9936 - val_loss: 0.0955 - val_acc: 0.9559\n",
-      "Epoch 786/1000\n",
-      "157/157 [==============================] - 0s 459us/step - loss: 0.0569 - acc: 0.9936 - val_loss: 0.0988 - val_acc: 0.9559\n",
-      "Epoch 787/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0567 - acc: 0.9745 - val_loss: 0.0899 - val_acc: 0.9559\n",
-      "Epoch 788/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.0572 - acc: 0.9936 - val_loss: 0.0927 - val_acc: 0.9559\n",
-      "Epoch 789/1000\n",
-      "157/157 [==============================] - 0s 190us/step - loss: 0.0556 - acc: 0.9936 - val_loss: 0.0914 - val_acc: 0.9559\n",
-      "Epoch 790/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.0571 - acc: 0.9809 - val_loss: 0.0939 - val_acc: 0.9559\n",
-      "Epoch 791/1000\n",
-      "157/157 [==============================] - 0s 220us/step - loss: 0.0582 - acc: 0.9745 - val_loss: 0.0904 - val_acc: 0.9559\n",
-      "Epoch 792/1000\n",
-      "157/157 [==============================] - 0s 97us/step - loss: 0.0563 - acc: 0.9936 - val_loss: 0.0966 - val_acc: 0.9559\n",
-      "Epoch 793/1000\n",
-      "157/157 [==============================] - 0s 124us/step - loss: 0.0577 - acc: 0.9873 - val_loss: 0.0959 - val_acc: 0.9559\n",
-      "Epoch 794/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0564 - acc: 0.9809 - val_loss: 0.0863 - val_acc: 0.9706\n",
-      "Epoch 795/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.0560 - acc: 0.9936 - val_loss: 0.0951 - val_acc: 0.9559\n",
-      "Epoch 796/1000\n",
-      "157/157 [==============================] - 0s 229us/step - loss: 0.0563 - acc: 0.9936 - val_loss: 0.0908 - val_acc: 0.9559\n",
-      "Epoch 797/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.0545 - acc: 0.9936 - val_loss: 0.0927 - val_acc: 0.9559\n",
-      "Epoch 798/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.0572 - acc: 0.9873 - val_loss: 0.0914 - val_acc: 0.9559\n",
-      "Epoch 799/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0559 - acc: 0.9936 - val_loss: 0.0895 - val_acc: 0.9559\n",
-      "Epoch 800/1000\n",
-      "157/157 [==============================] - 0s 220us/step - loss: 0.0555 - acc: 0.9936 - val_loss: 0.0900 - val_acc: 0.9559\n",
-      "Epoch 801/1000\n",
-      "157/157 [==============================] - 0s 103us/step - loss: 0.0555 - acc: 0.9936 - val_loss: 0.0897 - val_acc: 0.9559\n",
-      "Epoch 802/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0569 - acc: 0.9873 - val_loss: 0.0896 - val_acc: 0.9559\n",
-      "Epoch 803/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.0556 - acc: 0.9936 - val_loss: 0.0899 - val_acc: 0.9559\n",
-      "Epoch 804/1000\n",
-      "157/157 [==============================] - 0s 116us/step - loss: 0.0539 - acc: 0.9936 - val_loss: 0.0897 - val_acc: 0.9559\n",
-      "Epoch 805/1000\n",
-      "157/157 [==============================] - 0s 184us/step - loss: 0.0559 - acc: 0.9873 - val_loss: 0.0864 - val_acc: 0.9706\n",
-      "Epoch 806/1000\n",
-      "157/157 [==============================] - 0s 233us/step - loss: 0.0539 - acc: 0.9936 - val_loss: 0.0901 - val_acc: 0.9559\n",
-      "Epoch 807/1000\n",
-      "157/157 [==============================] - 0s 151us/step - loss: 0.0538 - acc: 0.9936 - val_loss: 0.0855 - val_acc: 0.9706\n",
-      "Epoch 808/1000\n",
-      "157/157 [==============================] - 0s 223us/step - loss: 0.0547 - acc: 0.9936 - val_loss: 0.0852 - val_acc: 0.9706\n",
-      "Epoch 809/1000\n",
-      "157/157 [==============================] - 0s 216us/step - loss: 0.0545 - acc: 0.9936 - val_loss: 0.0941 - val_acc: 0.9559\n",
-      "Epoch 810/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.0549 - acc: 0.9936 - val_loss: 0.0935 - val_acc: 0.9559\n",
-      "Epoch 811/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.0560 - acc: 0.9873 - val_loss: 0.0872 - val_acc: 0.9706\n",
-      "Epoch 812/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0557 - acc: 0.9809 - val_loss: 0.0842 - val_acc: 0.9706\n",
-      "Epoch 813/1000\n",
-      "157/157 [==============================] - 0s 156us/step - loss: 0.0540 - acc: 0.9936 - val_loss: 0.0859 - val_acc: 0.9706\n",
-      "Epoch 814/1000\n",
-      "157/157 [==============================] - 0s 183us/step - loss: 0.0529 - acc: 0.9936 - val_loss: 0.0885 - val_acc: 0.9559\n",
-      "Epoch 815/1000\n",
-      "157/157 [==============================] - 0s 96us/step - loss: 0.0549 - acc: 0.9873 - val_loss: 0.0855 - val_acc: 0.9706\n",
-      "Epoch 816/1000\n",
-      "157/157 [==============================] - 0s 105us/step - loss: 0.0535 - acc: 0.9936 - val_loss: 0.0943 - val_acc: 0.9559\n",
-      "Epoch 817/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.0554 - acc: 0.9809 - val_loss: 0.0882 - val_acc: 0.9559\n",
-      "Epoch 818/1000\n",
-      "157/157 [==============================] - 0s 135us/step - loss: 0.0531 - acc: 0.9936 - val_loss: 0.0934 - val_acc: 0.9559\n",
-      "Epoch 819/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.0549 - acc: 0.9936 - val_loss: 0.0899 - val_acc: 0.9559\n",
-      "Epoch 820/1000\n",
-      "157/157 [==============================] - 0s 182us/step - loss: 0.0555 - acc: 0.9936 - val_loss: 0.0914 - val_acc: 0.9559\n",
-      "Epoch 821/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.0533 - acc: 0.9873 - val_loss: 0.0841 - val_acc: 0.9706\n",
-      "Epoch 822/1000\n",
-      "157/157 [==============================] - 0s 221us/step - loss: 0.0531 - acc: 0.9936 - val_loss: 0.0844 - val_acc: 0.9706\n",
-      "Epoch 823/1000\n",
-      "157/157 [==============================] - 0s 112us/step - loss: 0.0531 - acc: 0.9873 - val_loss: 0.0826 - val_acc: 0.9706\n",
-      "Epoch 824/1000\n",
-      "157/157 [==============================] - 0s 202us/step - loss: 0.0533 - acc: 0.9936 - val_loss: 0.0897 - val_acc: 0.9559\n",
-      "Epoch 825/1000\n",
-      "157/157 [==============================] - 0s 263us/step - loss: 0.0525 - acc: 1.0000 - val_loss: 0.0800 - val_acc: 0.9706\n",
-      "Epoch 826/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.0544 - acc: 0.9936 - val_loss: 0.0937 - val_acc: 0.9559\n",
-      "Epoch 827/1000\n"
-     ]
-    },
+     "data": {
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfoAAAH0CAYAAADVH+85AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAWJQAAFiUBSVIk8AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3WuQXXWZ7/Hf05ekkyY3QiK3IwEUEnEIh8hwUy6hQHQKRYVTvBilLJlyPNZhcPSU1ihORE8VUx5PRJiDU+qRGqk6OIXKFEcGoQQkCjNTJEREIpcJIQZzIdfuTvqS7n7Oi71aO213kl7P7l7dz/5+qnat9Nr72f9/r17p3169116PubsAAEBOTVVPAAAATByCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEispeoJTAQze1XSXEmbKp4KAABlLZHU4e6nRp4kZdBLmjtr1qxjly1bdmzVEwEm0o4dO0L1XV1dpWvb2tpCY7t7qH5wcLB07cknnxwa28xC9cDR2LBhg7q7u8PPU2nQm9nJkm6TdLWkhZK2SnpA0pfcfU/gqTctW7bs2LVr19ZhlsDUtXr16lD9008/Xbr2zDPPDI3d09MTqu/r6ytd+7WvfS00dktL+V+dkRcoktTUVP4d1+iLK17gTK4VK1Zo3bp1m6LPU1nQm9npkp6StFjSP0v6jaQ/lfRXkq42s4vdfVdV8wMAIIMqT8b736qF/M3ufq27f87dV0paLelMSf+jwrkBAJBCJUFvZqdJukq1k+X+fsTdfytpv6QPm1n7JE8NAIBUqjqiX1ksH3H3Q96wcvdOSb+QNFvSBZM9MQAAMqnqPfqhs3heGuP+l1U74j9D0k/HehIzG+tsu6XlpwYAQB5VHdHPK5b7xrh/aP38SZgLAABpTdXP0Q99huOwnwVx9xWjFteO9M+t96QAAJhuqjqiHzpinzfG/XNHPA4AAJRQVdC/WCzPGOP+txbLsd7DBwAAR6GqoH+8WF5lZofMwczmSLpYUrekf53siQEAkEklQe/u/yHpEdUu2P/JEXd/SVK7pH909/2TPDUAAFKp8mS8/6raJXC/YWZXSNog6XxJl6v2J/vPVzg3AABSqOwSuMVR/Tsk3aNawH9a0umSviHpQq5zDwBAXKUfr3P330r6aJVzAAAgs6n6OXpUJNrGMqLKFpiR7zu6zSJtR7/73e+Gxj7hhBNK1z744IOhsU8//fRQfUdHR+naLVu2hMZesmRJ6dpom9rI/5Po/7HIvk6L2+pU2b0OAABMMIIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDEaFOLQ0zXVrFRVbb+fPTRR0vXtrW1hcZeunRp6drt27eHxj777LND9ZFWs+vWrQuNHWlT29JS3a/dKv+PRcemzW15HNEDAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAidGPHnUznftN//a3vy1du3r16tDYTz/9dOnaSF90SRocHCxdO3/+/NDYPT09ofrI+A8//HBo7GeeeaZ07bvf/e7Q2Jdeemnp2uj/scj/cfrJV4cjegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjDa1qJsq21DeeeedofpvfetbpWsXLVoUGnvZsmWh+oj+/v7Stfv37w+NHd1fmpubS9cee+yxobE7OztL195xxx2hsX/84x+Xrr3ttttCY7e1tZWujbRElqSmJo5Ly2LLAQCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAInRjx6HiPQnb2mJ7U7r168vXbt69erQ2O985ztL10a2WdVaW1tL10a/72h/8oGBgdK1ke9bkubPn1+69phjjgmNvXHjxtK1jz32WGjs9773vaF6VKOyI3oz22RmPsZtW1XzAgAgk6qP6PdJ+voo67smeyIAAGRUddDvdfdVFc8BAIC0OBkPAIDEqj6in2lmfy7pzZL2S3pO0pPuXv4sGwAA8HtVB/3xkr43Yt2rZvZRd//ZkYrNbO0Ydy0NzwwAgASq/NP9dyVdoVrYt0v6E0n/IGmJpH8xs+XVTQ0AgBwqO6J39y+NWPW8pL80sy5Jn5a0StIHjvAcK0ZbXxzpn1uHaQIAMK1NxZPxvlksL6l0FgAAJDAVg35HsWyvdBYAACQwFYP+wmJZ/jqPAABAUkVBb2Znmdmxo6w/RdJdxZf3Tu6sAADIp6qT8a6X9Dkze1zSq5I6JZ0u6c8ktUl6SNL/rGhuAACkUVXQPy7pTEn/WbU/1bdL2ivp56p9rv577u4VzQ0AgDQqCfriYjhHvCAOxi/6+qjK11ePPPJI6dq3ve1tobF7e3tD9RE9PT2la9va2kJjR77vSKtWqdo2tWYWGjvyM4tqby9/nvK6detCY0fa1DY1TcVTwhoDWx4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIrJJ+9Jg40X7yra2tdZrJ+K1du7Z07cGDB0Nj7927t3TtokWLQmN3dnaWrp0zZ05o7Mj+cs4551Q2thTrR3/MMceExt65c2fp2ra2ttDYg4ODpWs3b94cGrvKfTXyfTc1NfYxbWN/9wAAJEfQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGG1qUTd79uwJ1Udaf0ZaWEpSf39/ZWP39fWVrm1ubg6NvX///tK1ixcvDo0d+b6lWJvb6M8soru7O1R/4MCBOs1k/DZu3Fi6dvny5aGxaVNbXmN/9wAAJEfQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBi9KNPxswqG3v9+vWh+khP+JaW2K7c2dlZura3tzc0dk9PT+na6Pc9MDBQunb37t2hsY855phQfeR7j/7MWltbS9fu2bMnNHZXV1fp2gULFoTG/tWvflW6NtqPPrqvNzKO6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMfr+oW5+/etfh+ojbSibmmKvWQcHB0vX9vX1VT
Z2tC3xjBkzStfu3bs3NHa0TW1EpCWyJM2aNat07bZt20JjR1oLt7e3h8aO/h9HNepyRG9m15nZnWa2xsw6zMzN7N4j1FxkZg+Z2W4zO2Bmz5nZLWbWXI85AQCA+h3Rf0HSckldkrZIWnq4B5vZ+yX9QFKPpO9L2i3pGkmrJV0s6fo6zQsAgIZWr/foPyXpDElzJX3icA80s7mSviVpQNJl7v4xd//vks6R9LSk68zshjrNCwCAhlaXoHf3x939ZXf3o3j4dZIWSbrP3Z8Z9hw9qv1lQDrCiwUAAHB0qjjrfmWxfHiU+56UdEDSRWY2c/KmBABATlUE/ZnF8qWRd7h7v6RXVTt34LTJnBQAABlV8fG6ecVy3xj3D62ff6QnMrO1Y9x12JMBAQBoFFPxgjlDHww+mvf7AQDAYVRxRD90xD5vjPvnjnjcmNx9xWjriyP9c8c/NQAAcqniiP7FYnnGyDvMrEXSqZL6JW2czEkBAJBRFUH/WLG8epT7LpE0W9JT7t47eVMCACCnKoL+fkk7Jd1gZu8YWmlmbZK+Unx5dwXzAgAgnbq8R29m10q6tvjy+GJ5oZndU/x7p7t/RpLcvcPM/kK1wH/CzO5T7RK471Pto3f3q3ZZXAAAEFSvk/HOkXTjiHWn6Q+fhX9N0meG7nD3B8zsUkmfl/QhSW2SXpH015K+cZRX2AMAAEdQl6B391WSVo2z5heS3luP8QEAwOjoR4+6ifaqnj17dunaaH/xgwcPlq7dv39/aOympvKnykT6yUtSW1tb6dqenp7Q2JFtLsV6wnd3d4fGjmz36B8sIz+zlpbYr/zXXnstVI9qTMUL5gAAgDoh6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIw2tcmYWWVjb9q0KVS/YMGC0rW7d+8Ojd3c3Fy6NtrydOHChaVr+/r6QmPPnDmzdG20TW20vW9ku+3Zsyc0dmRfjbTXlWKtZqNtjSPbrbOzMzT2nDlzQvWNjCN6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDH60aNudu7cGao/4YQTSte6e2jsgYGB0rXRnvCR/uIHDx4Mjd3W1la61sxCY/f29obqIyI/bym+v1Ul+jPr6uoqXfv666+Hxl66dGmovpFxRA8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAidGmFofYunVr6dqOjo7Q2JGWqVGRlqmRNrOS1NRU/vV2lW1qW1tbQ2NHWp5WLdKmNtriNrK/RPX395eu3bBhQ2hs2tSWxxE9AACJEfQAACRG0AMAkBhBDwBAYgQ9AACJEfQAACRG0AMAkBhBDwBAYgQ9AACJEfQAACRG0AMAkBhBDwBAYgQ9AACJEfQAACRG0AMAkBj96HGI119/vXRttM92c3Nz6dpIn2xJ6unpKV0b7ctuZqVrBwYGQmO3tJT/FTBjxozQ2N3d3aH6wcHB0rWRbR6tj8w7Wh/9fzJr1qzStTt27AiNjfLqckRvZteZ2Z1mtsbMOszMzezeMR67pLh/rNt99ZgTAACo3xH9FyQtl9QlaYukpUdR80tJD4yy/vk6zQkAgIZXr6D/lGoB/4qkSyU9fhQ16919VZ3GBwAAo6hL0Lv774M9+t4XAAConypPxjvRzD4uaaGkXZKedvfnKpwPAADpVBn0Vxa33zOzJyTd6O6bj+YJzGztGHcdzTkCAACkV8Xn6A9I+rKkFZIWFLeh9/Uvk/RTM2uvYF4AAKQz6Uf07r5D0hdHrH7SzK6S9HNJ50u6SdIdR/FcK0ZbXxzpnxucKgAA096UuTKeu/dL+nbx5SVVzgUAgCymTNAX3iiW/OkeAIA6mGpBf0Gx3FjpLAAASGLSg97MzjezP7pItpmtVO3CO5I06uVzAQDA+NTlZDwzu1bStcWXxxfLC83snuLfO939M8W//07SWcVH6bYU686WtLL4963u/lQ95gUAQKOr11n350i6ccS604qbJL0maSjovyfpA5LOk/QeSa2Stkv6J0l3ufuaOs0JAICGV69L4K6StOooH/sdSd+px7gAAODw6EePQ7zwwgulayP95CVp//79pWsXLlwYGvuUU04pXbtp06bQ2JHtFu1t3tRU/jSdaF+Lvr6+UH2kt3p0X3X30rUtLbFfu5G+7vPnzw+NHfHGG28c+UGYEFPtrHsAAFBHBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAInRphaH2Lx5c+naJUuWhMbeu3dv6drly5eHxo60qX322WdDY0daxUZatUYNDAyE6qtsUxttFdvT01O6dvbs2aGxI62JI/OWpBkzZpSu3bdvX2hslMcRPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAYQQ8AQGIEPQAAiRH0AAAkRtADAJAY/ehxiO3bt5eujfSqlqTOzs7Staeeempo7G3btpWujfZVj4j0JpckMytdG+3pHuknL0nd3d2la6Nz7+3trWzsxYsXl66N9qNvbW0tXbt3797Q2CiPI3oAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIw2tTjE66+/Xrq2ubk5NHakheZb3vKW0Nhbt24tXevuobEj3/fs2bNDYw8MDITqI5qaYscZ+/btK10bafUqSR0dHaVro21qTzrppNK1zz77bGjsyP4W3dci/88i7Zgz4IgeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIx+9DjErl27StcuWLAgNPaBAwdK115wwQWhsR988MFQfUS0n31Ef39/6dpoP/lofWR/GRwcDI0d0dPTE6o/77zzSteuWbMmNPbcuXNL10b2NYl+9BHhI3ozW2hmN5nZj8zsFTPrNrN9ZvZzM/uYmY06hpldZGYPmdluMztgZs+Z2S1m1hydEwAAqKnHEf31ku6WtFXS45I2S3qTpA9K+rak95jZ9T7s5ZiZvV/SDyT1SPq+pN2SrpG0WtLFxXMCAICgegT9S5LeJ+nH7v77v4eZ2d9I+ndJH1It9H9QrJ8r6VuSBiRd5u7PFOtvlfSYpOvM7AZ3v68OcwMAoKGF/3Tv7o+5+4PDQ75Yv03SN4svLxt213WSFkm6byjki8f3SPpC8eUnovMCAAATf9b9wWI5/CyMlcXy4VEe/6SkA5IuMrOZEzkxAAAawYSddW9mL
ZI+Unw5PNTPLJYvjaxx934ze1XSWZJOk7ThCGOsHeOupeObLQAAOU3kEf3tkt4u6SF3/8mw9fOK5b4x6obWz5+oiQEA0Cgm5IjezG6W9GlJv5H04fGWF8sjfmjS3VeMMf5aSeeOc1wAANKp+xG9mX1S0h2SXpB0ubvvHvGQoSP2eRrd3BGPAwAAJdU16M3sFkl3SXpetZDfNsrDXiyWZ4xS3yLpVNVO3ttYz7kBANCI6hb0ZvZZ1S54s161kN8xxkMfK5ZXj3LfJZJmS3rK3XvrNTcAABpVXYK+uNjN7ZLWSrrC3Xce5uH3S9op6QYze8ew52iT9JXiy7vrMS8AABpd+GQ8M7tR0m2qXelujaSbR2kgsMnd75Ekd+8ws79QLfCfMLP7VLsE7vtU++jd/apdFhcAAATV46z7U4tls6RbxnjMzyTdM/SFuz9gZpdK+rxql8htk/SKpL+W9A2vsp0XAACJhIPe3VdJWlWi7heS3hsdH4eKtt/cs2dP6dr29vbQ2AMDA6Vroy1Pt2/fXro22gKzyhaakdfUra2tobFbWmK/fvbtK//BnObm6ppkdnR0hOpPOumk0rW9vbFTnzo7O0vXRn/ekRbaixYtCo093U30JXABAECFCHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASCzcjx5Ty9atW0P1kX70J554YmjsGTNmhOojenp6Stced9xxobEHBgZK1w4ODobGjtRHe7o3NcWOMyJz7+/vD40d6a0e2dck6c1vfnOoPiI694jt27eXrqUfPQAASIugBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMdrUJrNly5ZQ/cknn1y6Ntp2dPHixaH6iHnz5pWubW9vD40daZnq7qGxDx48WLq2tbU1NHa0zW1XV1fp2t7e3tDYs2fPLl3b19cXGnvJkiWla6Ptec2sdG2kHbMk7dq1K1TfyDiiBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABKjH30yL7/8cqg+0jM62ht97ty5ofqISD/6aJ/tSG/0AwcOhMaO9pSPiPQ2l2L97Hfu3Bkae/78+aVru7u7Q2O3tJT/tT1z5szQ2JGfWfTnvX///lB9I+OIHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASI+gBAEiMoAcAIDGCHgCAxAh6AAASo01tMtH2mxEzZswI1S9ZsqQ+Eynh9ttvL1370ksvhcaOtBbu6+sLjT1nzpzStZH2upLU1dUVqv/d735XujbarvW4444rXfvVr341NHbEggULQvWRn3mkva5Em9qI8BG9mS00s5vM7Edm9oqZdZvZPjP7uZl9zMyaRjx+iZn5YW73RecEAABq6nFEf72kuyVtlfS4pM2S3iTpg5K+Lek9Zna9u/uIul9KemCU53u+DnMCAACqT9C/JOl9kn7s7oNDK83sbyT9u6QPqRb6PxhRt97dV9VhfAAAMIbwn+7d/TF3f3B4yBfrt0n6ZvHlZdFxAADA+E30yXgHi2X/KPedaGYfl7RQ0i5JT7v7cxM8HwAAGsqEBb2ZtUj6SPHlw6M85MriNrzmCUk3uvvmiZoXAACNZCKP6G+X9HZJD7n7T4atPyDpy6qdiLexWHe2pFWSLpf0UzM7x92P+FkKM1s7xl1Ly04aAIBMJuSCOWZ2s6RPS/qNpA8Pv8/dd7j7F919nbvvLW5PSrpK0r9JeoukmyZiXgAANJq6H9Gb2Scl3SHpBUlXuPvuo6lz934z+7ak8yVdUjzHkWpWjDGHtZLOPepJAwCQVF2P6M3sFkl3qfZZ+MuLM+/H441i2V7PeQEA0KjqFvRm9llJqyWtVy3kd5R4mguK5cbDPgoAAByVugS9md2q2sl3a1X7c/2YF1w3s/PN7I8uim5mKyV9qvjy3nrMCwCARhd+j97MbpR0m6QBSWsk3WxmIx+2yd3vKf79d5LOKj5Kt6VYd7aklcW/b3X3p6LzAgAA9TkZ79Ri2SzpljEe8zNJ9xT//p6kD0g6T9J7JLVK2i7pnyTd5e5r6jAnAACgOgR9cb36VeN4/HckfSc6LgAAODL60Sfz2muvheoj/c07OztDYy9cuDBUX5Uf/vCHVU8BOCpz584N1b/yyiula0d5S3dcXnzxxVB9I5uQC+YAAICpgaAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAx2tQmc+2114bq9+3bV7p2cHAwNPa73vWuUD0wWdy9dG20XWvENddcE6p/9NFHS9eecsopobGvvPLKUH0j44geAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABIj6AEASIygBwAgMYIeAIDECHoAABKzSLvFqcrMds2aNevYZcuWVT2VSdfV1RWq37VrV+na6L60aNGi0rXt7e2hsYFGsHv37lB9R0dH6doZM2aExp43b17p2un6+2HDhg3q7u7e7e4LI8+TNehflTRX0qYxHrK0WP5mUiaUA9usHLZbOWy38WOblTOVt9sSSR3ufmrkSVIG/ZGY2VpJcvcVVc9lumCblcN2K4ftNn5ss3IaYbvxHj0AAIkR9AAAJEbQAwCQGEEPAEBiBD0AAIk15Fn3AAA0Co7oAQBIjKAHACAxgh4AgMQIegAAEiPoAQBIjKAHACAxgh4AgMQaKujN7GQz+z9m9jsz6zWzTWb2dTNbUPXcpqpiG/kYt21Vz68qZnadmd1pZmvMrKPYHvceoeYiM3vIzHab2QEze87MbjGz5smad9XGs93MbMlh9j03s/sme/5VMLOFZnaTmf3IzF4xs24z22dmPzezj5nZqL/HG31/G+92y7y/tVQ9gcliZqdLekrSYkn/rFrv4T+V9FeSrjazi919V4VTnMr2Sfr6KOu7JnsiU8gXJC1XbRts0R96Wo/KzN4v6QeSeiR9X9JuSddIWi3pYknXT+Rkp5BxbbfCLyU9MMr65+s4r6nsekl3S9oq6XFJmyW9SdIHJX1b0nvM7HofdvUz9jdJJbZbId/+5u4NcZP0E0ku6b+NWP+/ivXfrHqOU/EmaZOkTVXPY6rdJF0u6a2STNJlxT507xiPnStph6ReSe8Ytr5NtRefLumGqr+nKbjdlhT331P1vCveZitVC+mmEeuPVy28XNKHhq1nfyu33dLubw3xp3szO03SVaqF1t+PuPtvJe2X9GEza5/kqWGacvfH3f1lL35DHMF1khZJus/dnxn2HD2qHeFK0icmYJpTzji3GyS5+2Pu/qC7D45Yv03SN4svLxt2
F/ubSm23tBrlT/cri+Ujo/zQO83sF6q9ELhA0k8ne3LTwEwz+3NJb1btRdFzkp5094FqpzVtDO1/D49y35OSDki6yMxmunvv5E1r2jjRzD4uaaGkXZKedvfnKp7TVHGwWPYPW8f+dmSjbbch6fa3Rgn6M4vlS2Pc/7JqQX+GCPrRHC/peyPWvWpmH3X3n1UxoWlmzP3P3fvN7FVJZ0k6TdKGyZzYNHFlcfs9M3tC0o3uvrmSGU0BZtYi6SPFl8NDnf3tMA6z3Yak298a4k/3kuYVy31j3D+0fv4kzGW6+a6kK1QL+3ZJfyLpH1R7P+tfzGx5dVObNtj/yjkg6cuSVkhaUNwuVe3Eqssk/bTB3267XdLbJT3k7j8Ztp797fDG2m5p97dGCfojsWLJ+4YjuPuXive6trv7AXd/3t3/UrWTGGdJWlXtDFNg/xuFu+9w9y+6+zp331vcnlTtr2//Juktkm4OVzYcAAAC5klEQVSqdpbVMLObJX1atU8PfXi85cWy4fa3w223zPtbowT90CvYeWPcP3fE43BkQyezXFLpLKYH9r86cvd+1T4eJTXg/mdmn5R0h6QXJF3u7rtHPIT9bRRHsd1GlWF/a5Sgf7FYnjHG/W8tlmO9h48/tqNYTss/ZU2yMfe/4v3CU1U7KWjjZE5qmnujWDbU/mdmt0i6S7XPdF9enEE+EvvbCEe53Q5nWu9vjRL0jxfLq0a5GtIc1S4g0S3pXyd7YtPYhcWyYX5ZBDxWLK8e5b5LJM2W9FQDnwFdxgXFsmH2PzP7rGoXvFmvWljtGOOh7G/DjGO7Hc603t8aIujd/T8kPaLaCWSfHHH3l1R7lfaP7r5/kqc2pZnZWWZ27CjrT1Ht1bEkHfayr5Ak3S9pp6QbzOwdQyvNrE3SV4ov765iYlOZmZ1vZjNGWb9S0qeKLxti/zOzW1U7iWytpCvcfedhHs7+VhjPdsu8v1mjXLdilEvgbpB0vmpX6npJ0kXOJXAPYWarJH1Otb+IvCqpU9Lpkv5MtatsPSTpA+7eV9Ucq2Jm10q6tvjyeEnvVu3V/ppi3U53/8yIx9+v2iVJ71PtkqTvU+2jUPdL+i+NcBGZ8Wy34iNNZ0l6QrXL5UrS2frD58Rvdfeh4ErLzG6UdI+kAUl3avT31je5+z3Dahp+fxvvdku9v1V9ab7JvEn6T6p9XGyrpD5Jr6l2csaxVc9tKt5U+2jJ/1XtDNW9ql1k4g1Jj6r2OVSreo4VbptVqp21PNZt0yg1F6v24miPam8V/Uq1I4Xmqr+fqbjdJH1M0v9T7YqWXapd0nWzatduf1fV38sU2mYu6Qn2t9h2y7y/NcwRPQAAjagh3qMHAKBREfQAACRG0AMAkBhBDwBAYgQ9AACJEfQAACRG0AMAkBhBDwBAYgQ9AACJEfQAACRG0AMAkBhBDwBAYgQ9AACJEfQAACRG0AMAkBhBDwBAYgQ9AACJ/X87+RLxTU8oSwAAAABJRU5ErkJggg==\n",
+      "text/plain": [
+       "<Figure size 432x288 with 1 Axes>"
+      ]
+     },
+     "metadata": {
+      "image/png": {
+       "height": 250,
+       "width": 253
+      },
+      "needs_background": "light"
+     },
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "# We can see that the training set consists of 60,000 images of size 28x28 pixels\n",
+    "import matplotlib.pyplot as plt\n",
+    "import numpy as np\n",
+    "i=np.random.randint(0,X_train.shape[0])\n",
+    "plt.imshow(X_train[i], cmap=\"gray_r\") ; \n",
+    "print(\"This item is a: \" , items[y_train[i]])"
+   ]
+  },
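+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "*(Illustrative cell, not part of the original notebook:)* a quick sanity check of the array shapes quoted above, assuming the standard 60,000-image training split."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Sanity check (illustrative); expected output: (60000, 28, 28) (60000,)\n",
+    "print(X_train.shape, y_train.shape)"
+   ]
+  },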
+  {
+   "cell_type": "code",
+   "execution_count": 139,
+   "metadata": {},
+   "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "157/157 [==============================] - 0s 183us/step - loss: 0.0521 - acc: 0.9936 - val_loss: 0.0847 - val_acc: 0.9706\n",
-      "Epoch 828/1000\n",
-      "157/157 [==============================] - 0s 266us/step - loss: 0.0524 - acc: 0.9936 - val_loss: 0.0902 - val_acc: 0.9559\n",
-      "Epoch 829/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.0525 - acc: 0.9873 - val_loss: 0.0839 - val_acc: 0.9706\n",
-      "Epoch 830/1000\n",
-      "157/157 [==============================] - 0s 228us/step - loss: 0.0528 - acc: 0.9809 - val_loss: 0.0829 - val_acc: 0.9706\n",
-      "Epoch 831/1000\n",
-      "157/157 [==============================] - 0s 251us/step - loss: 0.0530 - acc: 0.9936 - val_loss: 0.0832 - val_acc: 0.9706\n",
-      "Epoch 832/1000\n",
-      "157/157 [==============================] - 0s 275us/step - loss: 0.0544 - acc: 0.9873 - val_loss: 0.0876 - val_acc: 0.9706\n",
-      "Epoch 833/1000\n",
-      "157/157 [==============================] - 0s 233us/step - loss: 0.0526 - acc: 0.9936 - val_loss: 0.0859 - val_acc: 0.9706\n",
-      "Epoch 834/1000\n",
-      "157/157 [==============================] - 0s 366us/step - loss: 0.0527 - acc: 0.9936 - val_loss: 0.0853 - val_acc: 0.9706\n",
-      "Epoch 835/1000\n",
-      "157/157 [==============================] - 0s 220us/step - loss: 0.0514 - acc: 0.9936 - val_loss: 0.0911 - val_acc: 0.9559\n",
-      "Epoch 836/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0529 - acc: 0.9873 - val_loss: 0.0923 - val_acc: 0.9559\n",
-      "Epoch 837/1000\n",
-      "157/157 [==============================] - 0s 255us/step - loss: 0.0513 - acc: 0.9936 - val_loss: 0.0843 - val_acc: 0.9706\n",
-      "Epoch 838/1000\n",
-      "157/157 [==============================] - 0s 117us/step - loss: 0.0514 - acc: 0.9936 - val_loss: 0.0900 - val_acc: 0.9559\n",
-      "Epoch 839/1000\n",
-      "157/157 [==============================] - 0s 104us/step - loss: 0.0509 - acc: 0.9936 - val_loss: 0.0868 - val_acc: 0.9706\n",
-      "Epoch 840/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.0530 - acc: 0.9873 - val_loss: 0.0874 - val_acc: 0.9706\n",
-      "Epoch 841/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0510 - acc: 0.9809 - val_loss: 0.0826 - val_acc: 0.9706\n",
-      "Epoch 842/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0512 - acc: 0.9936 - val_loss: 0.0835 - val_acc: 0.9706\n",
-      "Epoch 843/1000\n",
-      "157/157 [==============================] - 0s 174us/step - loss: 0.0505 - acc: 0.9936 - val_loss: 0.0914 - val_acc: 0.9559\n",
-      "Epoch 844/1000\n",
-      "157/157 [==============================] - 0s 116us/step - loss: 0.0551 - acc: 0.9936 - val_loss: 0.0913 - val_acc: 0.9559\n",
-      "Epoch 845/1000\n",
-      "157/157 [==============================] - 0s 224us/step - loss: 0.0502 - acc: 0.9936 - val_loss: 0.0861 - val_acc: 0.9706\n",
-      "Epoch 846/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.0504 - acc: 0.9936 - val_loss: 0.0863 - val_acc: 0.9706\n",
-      "Epoch 847/1000\n",
-      "157/157 [==============================] - 0s 150us/step - loss: 0.0500 - acc: 0.9873 - val_loss: 0.0803 - val_acc: 0.9706\n",
-      "Epoch 848/1000\n",
-      "157/157 [==============================] - 0s 198us/step - loss: 0.0511 - acc: 0.9936 - val_loss: 0.0844 - val_acc: 0.9706\n",
-      "Epoch 849/1000\n",
-      "157/157 [==============================] - 0s 182us/step - loss: 0.0497 - acc: 0.9936 - val_loss: 0.0830 - val_acc: 0.9706\n",
-      "Epoch 850/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.0515 - acc: 0.9936 - val_loss: 0.0926 - val_acc: 0.9559\n",
-      "Epoch 851/1000\n",
-      "157/157 [==============================] - 0s 185us/step - loss: 0.0502 - acc: 0.9809 - val_loss: 0.0829 - val_acc: 0.9706\n",
-      "Epoch 852/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.0501 - acc: 0.9936 - val_loss: 0.0916 - val_acc: 0.9559\n",
-      "Epoch 853/1000\n",
-      "157/157 [==============================] - 0s 206us/step - loss: 0.0509 - acc: 0.9809 - val_loss: 0.0803 - val_acc: 0.9706\n",
-      "Epoch 854/1000\n",
-      "157/157 [==============================] - 0s 131us/step - loss: 0.0507 - acc: 0.9936 - val_loss: 0.0835 - val_acc: 0.9706\n",
-      "Epoch 855/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.0497 - acc: 0.9936 - val_loss: 0.0848 - val_acc: 0.9706\n",
-      "Epoch 856/1000\n",
-      "157/157 [==============================] - 0s 106us/step - loss: 0.0501 - acc: 0.9936 - val_loss: 0.0889 - val_acc: 0.9559\n",
-      "Epoch 857/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.0745 - acc: 1.000 - 0s 99us/step - loss: 0.0493 - acc: 0.9936 - val_loss: 0.0840 - val_acc: 0.9706\n",
-      "Epoch 858/1000\n",
-      "157/157 [==============================] - 0s 106us/step - loss: 0.0512 - acc: 0.9936 - val_loss: 0.0918 - val_acc: 0.9559\n",
-      "Epoch 859/1000\n",
-      "157/157 [==============================] - 0s 169us/step - loss: 0.0507 - acc: 0.9936 - val_loss: 0.0846 - val_acc: 0.9706\n",
-      "Epoch 860/1000\n",
-      "157/157 [==============================] - 0s 180us/step - loss: 0.0494 - acc: 0.9873 - val_loss: 0.0875 - val_acc: 0.9559\n",
-      "Epoch 861/1000\n",
-      "157/157 [==============================] - 0s 154us/step - loss: 0.0507 - acc: 0.9873 - val_loss: 0.0861 - val_acc: 0.9706\n",
-      "Epoch 862/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.0524 - acc: 0.9873 - val_loss: 0.0799 - val_acc: 0.9706\n",
-      "Epoch 863/1000\n",
-      "157/157 [==============================] - 0s 252us/step - loss: 0.0493 - acc: 0.9936 - val_loss: 0.0896 - val_acc: 0.9559\n",
-      "Epoch 864/1000\n",
-      "157/157 [==============================] - 0s 216us/step - loss: 0.0494 - acc: 0.9936 - val_loss: 0.0929 - val_acc: 0.9559\n",
-      "Epoch 865/1000\n",
-      "157/157 [==============================] - 0s 167us/step - loss: 0.0505 - acc: 0.9936 - val_loss: 0.0885 - val_acc: 0.9559\n",
-      "Epoch 866/1000\n",
-      "157/157 [==============================] - 0s 177us/step - loss: 0.0513 - acc: 0.9936 - val_loss: 0.0919 - val_acc: 0.9559\n",
-      "Epoch 867/1000\n",
-      "157/157 [==============================] - 0s 121us/step - loss: 0.0500 - acc: 0.9873 - val_loss: 0.0859 - val_acc: 0.9706\n",
-      "Epoch 868/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0502 - acc: 0.9936 - val_loss: 0.0889 - val_acc: 0.9559\n",
-      "Epoch 869/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.0502 - acc: 0.9936 - val_loss: 0.0886 - val_acc: 0.9559\n",
-      "Epoch 870/1000\n",
-      "157/157 [==============================] - 0s 241us/step - loss: 0.0486 - acc: 0.9936 - val_loss: 0.0878 - val_acc: 0.9559\n",
-      "Epoch 871/1000\n",
-      "157/157 [==============================] - 0s 252us/step - loss: 0.0483 - acc: 0.9873 - val_loss: 0.0808 - val_acc: 0.9706\n",
-      "Epoch 872/1000\n",
-      "157/157 [==============================] - 0s 239us/step - loss: 0.0482 - acc: 0.9936 - val_loss: 0.0869 - val_acc: 0.9559\n",
-      "Epoch 873/1000\n",
-      "157/157 [==============================] - 0s 199us/step - loss: 0.0500 - acc: 0.9936 - val_loss: 0.0901 - val_acc: 0.9559\n",
-      "Epoch 874/1000\n",
-      "157/157 [==============================] - 0s 175us/step - loss: 0.0495 - acc: 0.9873 - val_loss: 0.0881 - val_acc: 0.9559\n",
-      "Epoch 875/1000\n",
-      "157/157 [==============================] - 0s 155us/step - loss: 0.0485 - acc: 0.9936 - val_loss: 0.0923 - val_acc: 0.9559\n",
-      "Epoch 876/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.0482 - acc: 0.9873 - val_loss: 0.0821 - val_acc: 0.9706\n",
-      "Epoch 877/1000\n",
-      "157/157 [==============================] - 0s 177us/step - loss: 0.0493 - acc: 0.9936 - val_loss: 0.0955 - val_acc: 0.9559\n",
-      "Epoch 878/1000\n",
-      "157/157 [==============================] - 0s 218us/step - loss: 0.0499 - acc: 0.9809 - val_loss: 0.0851 - val_acc: 0.9706\n",
-      "Epoch 879/1000\n",
-      "157/157 [==============================] - 0s 204us/step - loss: 0.0475 - acc: 0.9936 - val_loss: 0.0855 - val_acc: 0.9706\n",
-      "Epoch 880/1000\n",
-      "157/157 [==============================] - 0s 209us/step - loss: 0.0490 - acc: 0.9936 - val_loss: 0.0826 - val_acc: 0.9706\n",
-      "Epoch 881/1000\n",
-      "157/157 [==============================] - 0s 349us/step - loss: 0.0474 - acc: 0.9936 - val_loss: 0.0813 - val_acc: 0.9706\n",
-      "Epoch 882/1000\n",
-      "157/157 [==============================] - 0s 153us/step - loss: 0.0479 - acc: 0.9936 - val_loss: 0.0924 - val_acc: 0.9559\n",
-      "Epoch 883/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.0527 - acc: 0.9873 - val_loss: 0.0853 - val_acc: 0.9706\n",
-      "Epoch 884/1000\n",
-      "157/157 [==============================] - 0s 122us/step - loss: 0.0473 - acc: 0.9936 - val_loss: 0.0863 - val_acc: 0.9706\n",
-      "Epoch 885/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.0469 - acc: 0.9873 - val_loss: 0.0787 - val_acc: 0.9706\n",
-      "Epoch 886/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.0477 - acc: 0.9936 - val_loss: 0.0853 - val_acc: 0.9706\n",
-      "Epoch 887/1000\n",
-      "157/157 [==============================] - 0s 176us/step - loss: 0.0475 - acc: 0.9936 - val_loss: 0.0957 - val_acc: 0.9559\n",
-      "Epoch 888/1000\n",
-      "157/157 [==============================] - 0s 173us/step - loss: 0.0478 - acc: 0.9936 - val_loss: 0.0922 - val_acc: 0.9559\n",
-      "Epoch 889/1000\n",
-      "157/157 [==============================] - 0s 223us/step - loss: 0.0477 - acc: 0.9873 - val_loss: 0.0834 - val_acc: 0.9706\n",
-      "Epoch 890/1000\n",
-      "157/157 [==============================] - 0s 204us/step - loss: 0.0477 - acc: 0.9873 - val_loss: 0.0862 - val_acc: 0.9706\n",
-      "Epoch 891/1000\n",
-      "157/157 [==============================] - 0s 386us/step - loss: 0.0466 - acc: 0.9873 - val_loss: 0.0829 - val_acc: 0.9706\n",
-      "Epoch 892/1000\n",
-      "157/157 [==============================] - 0s 254us/step - loss: 0.0477 - acc: 0.9936 - val_loss: 0.0824 - val_acc: 0.9706\n",
-      "Epoch 893/1000\n",
-      "157/157 [==============================] - 0s 115us/step - loss: 0.0466 - acc: 0.9873 - val_loss: 0.0840 - val_acc: 0.9706\n",
-      "Epoch 894/1000\n",
-      "157/157 [==============================] - 0s 122us/step - loss: 0.0480 - acc: 0.9936 - val_loss: 0.0870 - val_acc: 0.9559\n",
-      "Epoch 895/1000\n",
-      "157/157 [==============================] - 0s 176us/step - loss: 0.0477 - acc: 0.9873 - val_loss: 0.0866 - val_acc: 0.9559\n",
-      "Epoch 896/1000\n",
-      "157/157 [==============================] - 0s 177us/step - loss: 0.0467 - acc: 0.9873 - val_loss: 0.0824 - val_acc: 0.9706\n",
-      "Epoch 897/1000\n",
-      "157/157 [==============================] - 0s 319us/step - loss: 0.0467 - acc: 0.9873 - val_loss: 0.0852 - val_acc: 0.9706\n",
-      "Epoch 898/1000\n",
-      "157/157 [==============================] - 0s 133us/step - loss: 0.0483 - acc: 0.9936 - val_loss: 0.0878 - val_acc: 0.9559\n",
-      "Epoch 899/1000\n",
-      "157/157 [==============================] - 0s 138us/step - loss: 0.0482 - acc: 0.9936 - val_loss: 0.0899 - val_acc: 0.9559\n",
-      "Epoch 900/1000\n",
-      "157/157 [==============================] - 0s 239us/step - loss: 0.0455 - acc: 0.9873 - val_loss: 0.0804 - val_acc: 0.9706\n",
-      "Epoch 901/1000\n",
-      "157/157 [==============================] - 0s 212us/step - loss: 0.0491 - acc: 0.9873 - val_loss: 0.0852 - val_acc: 0.9706\n",
-      "Epoch 902/1000\n",
-      "157/157 [==============================] - 0s 244us/step - loss: 0.0457 - acc: 0.9936 - val_loss: 0.0920 - val_acc: 0.9559\n",
-      "Epoch 903/1000\n",
-      "157/157 [==============================] - 0s 236us/step - loss: 0.0473 - acc: 0.9809 - val_loss: 0.0789 - val_acc: 0.9706\n",
-      "Epoch 904/1000\n",
-      "157/157 [==============================] - 0s 172us/step - loss: 0.0469 - acc: 0.9936 - val_loss: 0.0858 - val_acc: 0.9706\n",
-      "Epoch 905/1000\n",
-      "157/157 [==============================] - 0s 232us/step - loss: 0.0461 - acc: 0.9936 - val_loss: 0.0868 - val_acc: 0.9559\n",
-      "Epoch 906/1000\n",
-      "157/157 [==============================] - 0s 192us/step - loss: 0.0456 - acc: 0.9936 - val_loss: 0.0847 - val_acc: 0.9706\n",
-      "Epoch 907/1000\n",
-      "157/157 [==============================] - 0s 162us/step - loss: 0.0467 - acc: 0.9936 - val_loss: 0.0896 - val_acc: 0.9559\n",
-      "Epoch 908/1000\n",
-      "157/157 [==============================] - 0s 94us/step - loss: 0.0500 - acc: 0.9873 - val_loss: 0.0832 - val_acc: 0.9706\n",
-      "Epoch 909/1000\n",
-      "157/157 [==============================] - 0s 221us/step - loss: 0.0453 - acc: 0.9936 - val_loss: 0.0872 - val_acc: 0.9559\n",
-      "Epoch 910/1000\n",
-      "157/157 [==============================] - 0s 348us/step - loss: 0.0455 - acc: 0.9936 - val_loss: 0.0890 - val_acc: 0.9559\n",
-      "Epoch 911/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.0463 - acc: 0.9873 - val_loss: 0.0857 - val_acc: 0.9706\n",
-      "Epoch 912/1000\n",
-      "157/157 [==============================] - 0s 144us/step - loss: 0.0452 - acc: 0.9936 - val_loss: 0.0939 - val_acc: 0.9559\n",
-      "Epoch 913/1000\n",
-      "157/157 [==============================] - 0s 158us/step - loss: 0.0465 - acc: 0.9873 - val_loss: 0.0809 - val_acc: 0.9706\n",
-      "Epoch 914/1000\n",
-      "157/157 [==============================] - 0s 123us/step - loss: 0.0448 - acc: 0.9936 - val_loss: 0.0851 - val_acc: 0.9706\n",
-      "Epoch 915/1000\n",
-      "157/157 [==============================] - 0s 148us/step - loss: 0.0480 - acc: 0.9873 - val_loss: 0.0852 - val_acc: 0.9706\n",
-      "Epoch 916/1000\n",
-      "157/157 [==============================] - 0s 179us/step - loss: 0.0450 - acc: 0.9936 - val_loss: 0.0950 - val_acc: 0.9559\n",
-      "Epoch 917/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.0466 - acc: 0.9936 - val_loss: 0.0868 - val_acc: 0.9559\n",
-      "Epoch 918/1000\n",
-      "157/157 [==============================] - 0s 184us/step - loss: 0.0452 - acc: 0.9873 - val_loss: 0.0825 - val_acc: 0.9706\n",
-      "Epoch 919/1000\n",
-      "157/157 [==============================] - 0s 107us/step - loss: 0.0457 - acc: 0.9936 - val_loss: 0.0792 - val_acc: 0.9706\n",
-      "Epoch 920/1000\n",
-      "157/157 [==============================] - 0s 104us/step - loss: 0.0446 - acc: 0.9936 - val_loss: 0.0843 - val_acc: 0.9706\n",
-      "Epoch 921/1000\n",
-      "157/157 [==============================] - 0s 102us/step - loss: 0.0462 - acc: 0.9873 - val_loss: 0.0818 - val_acc: 0.9706\n",
-      "Epoch 922/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.0544 - acc: 1.000 - 0s 102us/step - loss: 0.0451 - acc: 0.9873 - val_loss: 0.0853 - val_acc: 0.9706\n",
-      "Epoch 923/1000\n",
-      "157/157 [==============================] - 0s 99us/step - loss: 0.0448 - acc: 0.9873 - val_loss: 0.0818 - val_acc: 0.9706\n",
-      "Epoch 924/1000\n",
-      "157/157 [==============================] - 0s 105us/step - loss: 0.0449 - acc: 0.9873 - val_loss: 0.0841 - val_acc: 0.9706\n",
-      "Epoch 925/1000\n",
-      "157/157 [==============================] - 0s 275us/step - loss: 0.0452 - acc: 0.9873 - val_loss: 0.0823 - val_acc: 0.9706\n",
-      "Epoch 926/1000\n",
-      "157/157 [==============================] - 0s 167us/step - loss: 0.0441 - acc: 0.9873 - val_loss: 0.0808 - val_acc: 0.9706\n",
-      "Epoch 927/1000\n",
-      "157/157 [==============================] - 0s 118us/step - loss: 0.0447 - acc: 0.9936 - val_loss: 0.0785 - val_acc: 0.9706\n",
-      "Epoch 928/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.0447 - acc: 0.9873 - val_loss: 0.0766 - val_acc: 0.9706\n",
-      "Epoch 929/1000\n",
-      "157/157 [==============================] - 0s 193us/step - loss: 0.0449 - acc: 0.9936 - val_loss: 0.0836 - val_acc: 0.9706\n",
-      "Epoch 930/1000\n",
-      "157/157 [==============================] - 0s 191us/step - loss: 0.0446 - acc: 0.9936 - val_loss: 0.0850 - val_acc: 0.9706\n",
-      "Epoch 931/1000\n",
-      "157/157 [==============================] - 0s 163us/step - loss: 0.0439 - acc: 0.9936 - val_loss: 0.0868 - val_acc: 0.9559\n",
-      "Epoch 932/1000\n",
-      "157/157 [==============================] - 0s 168us/step - loss: 0.0434 - acc: 0.9936 - val_loss: 0.0844 - val_acc: 0.9706\n",
-      "Epoch 933/1000\n",
-      "157/157 [==============================] - 0s 171us/step - loss: 0.0443 - acc: 0.9936 - val_loss: 0.0793 - val_acc: 0.9706\n",
-      "Epoch 934/1000\n",
-      "157/157 [==============================] - 0s 197us/step - loss: 0.0441 - acc: 0.9873 - val_loss: 0.0855 - val_acc: 0.9706\n",
-      "Epoch 935/1000\n",
-      "157/157 [==============================] - 0s 142us/step - loss: 0.0447 - acc: 0.9873 - val_loss: 0.0878 - val_acc: 0.9559\n",
-      "Epoch 936/1000\n",
-      "157/157 [==============================] - 0s 190us/step - loss: 0.0439 - acc: 0.9809 - val_loss: 0.0763 - val_acc: 0.9706\n",
-      "Epoch 937/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0427 - acc: 0.9936 - val_loss: 0.0870 - val_acc: 0.9559\n",
-      "Epoch 938/1000\n",
-      "157/157 [==============================] - ETA: 0s - loss: 0.0191 - acc: 1.000 - 0s 138us/step - loss: 0.0439 - acc: 0.9873 - val_loss: 0.0760 - val_acc: 0.9706\n",
-      "Epoch 939/1000\n",
-      "157/157 [==============================] - 0s 188us/step - loss: 0.0442 - acc: 0.9936 - val_loss: 0.0777 - val_acc: 0.9706\n",
-      "Epoch 940/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.0445 - acc: 0.9936 - val_loss: 0.0877 - val_acc: 0.9559\n",
-      "Epoch 941/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0441 - acc: 0.9873 - val_loss: 0.0819 - val_acc: 0.9706\n",
-      "Epoch 942/1000\n",
-      "157/157 [==============================] - 0s 204us/step - loss: 0.0433 - acc: 0.9936 - val_loss: 0.0863 - val_acc: 0.9559\n",
-      "Epoch 943/1000\n",
-      "157/157 [==============================] - 0s 235us/step - loss: 0.0430 - acc: 0.9873 - val_loss: 0.0843 - val_acc: 0.9706\n",
-      "Epoch 944/1000\n",
-      "157/157 [==============================] - 0s 207us/step - loss: 0.0436 - acc: 0.9873 - val_loss: 0.0817 - val_acc: 0.9706\n"
+      "(60000, 10)\n"
      ]
-    },
+    }
+   ],
+   "source": [
+    "# Also we need to reshape the input data such that each sample is a 4D matrix of dimension\n",
+    "# (num_samples, width, height, channels). Even though these images are grayscale we need to add\n",
+    "# channel dimension as this is expected by the Conv function\n",
+    "X_train_prep = X_train.reshape(X_train.shape[0],28,28,1)/255.\n",
+    "X_test_prep = X_test.reshape(X_test.shape[0],28,28,1)/255.\n",
+    "\n",
+    "from keras.utils.np_utils import to_categorical\n",
+    "\n",
+    "y_train_onehot = to_categorical(y_train, num_classes=10)\n",
+    "y_test_onehot = to_categorical(y_test, num_classes=10)\n",
+    "\n",
+    "print(y_train_onehot.shape)"
+   ]
+  },
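+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "*(Illustrative cell, not part of the original notebook:)* a minimal sketch of what `to_categorical` does -- each integer label becomes a one-hot row of length `num_classes`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Minimal illustration: the labels 0, 2 and 9 become one-hot rows\n",
+    "from keras.utils.np_utils import to_categorical\n",
+    "print(to_categorical([0, 2, 9], num_classes=10))\n",
+    "# [[1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n",
+    "#  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]\n",
+    "#  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]]"
+   ]
+  },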
+  {
+   "cell_type": "code",
+   "execution_count": 140,
+   "metadata": {},
+   "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "Epoch 945/1000\n",
-      "157/157 [==============================] - 0s 178us/step - loss: 0.0470 - acc: 0.9873 - val_loss: 0.0836 - val_acc: 0.9706\n",
-      "Epoch 946/1000\n",
-      "157/157 [==============================] - 0s 170us/step - loss: 0.0432 - acc: 0.9936 - val_loss: 0.0858 - val_acc: 0.9559\n",
-      "Epoch 947/1000\n",
-      "157/157 [==============================] - 0s 327us/step - loss: 0.0428 - acc: 0.9873 - val_loss: 0.0818 - val_acc: 0.9706\n",
-      "Epoch 948/1000\n",
-      "157/157 [==============================] - 0s 219us/step - loss: 0.0433 - acc: 0.9873 - val_loss: 0.0807 - val_acc: 0.9706\n",
-      "Epoch 949/1000\n",
-      "157/157 [==============================] - 0s 253us/step - loss: 0.0438 - acc: 0.9936 - val_loss: 0.0792 - val_acc: 0.9706\n",
-      "Epoch 950/1000\n",
-      "157/157 [==============================] - 0s 369us/step - loss: 0.0430 - acc: 0.9936 - val_loss: 0.0869 - val_acc: 0.9559\n",
-      "Epoch 951/1000\n",
-      "157/157 [==============================] - 0s 241us/step - loss: 0.0435 - acc: 0.9936 - val_loss: 0.0822 - val_acc: 0.9706\n",
-      "Epoch 952/1000\n",
-      "157/157 [==============================] - 0s 82us/step - loss: 0.0430 - acc: 0.9873 - val_loss: 0.0797 - val_acc: 0.9706\n",
-      "Epoch 953/1000\n",
-      "157/157 [==============================] - 0s 86us/step - loss: 0.0431 - acc: 0.9873 - val_loss: 0.0845 - val_acc: 0.9706\n",
-      "Epoch 954/1000\n",
-      "157/157 [==============================] - 0s 112us/step - loss: 0.0433 - acc: 0.9873 - val_loss: 0.0792 - val_acc: 0.9706\n",
-      "Epoch 955/1000\n",
-      "157/157 [==============================] - 0s 152us/step - loss: 0.0437 - acc: 0.9936 - val_loss: 0.0863 - val_acc: 0.9559\n",
-      "Epoch 956/1000\n",
-      "157/157 [==============================] - 0s 323us/step - loss: 0.0438 - acc: 0.9873 - val_loss: 0.0850 - val_acc: 0.9706\n",
-      "Epoch 957/1000\n",
-      "157/157 [==============================] - 0s 229us/step - loss: 0.0424 - acc: 0.9936 - val_loss: 0.0862 - val_acc: 0.9559\n",
-      "Epoch 958/1000\n",
-      "157/157 [==============================] - 0s 332us/step - loss: 0.0422 - acc: 0.9873 - val_loss: 0.0763 - val_acc: 0.9706\n",
-      "Epoch 959/1000\n",
-      "157/157 [==============================] - 0s 369us/step - loss: 0.0434 - acc: 0.9873 - val_loss: 0.0755 - val_acc: 0.9706\n",
-      "Epoch 960/1000\n",
-      "157/157 [==============================] - 0s 314us/step - loss: 0.0421 - acc: 0.9936 - val_loss: 0.0840 - val_acc: 0.9706\n",
-      "Epoch 961/1000\n",
-      "157/157 [==============================] - 0s 358us/step - loss: 0.0423 - acc: 0.9873 - val_loss: 0.0861 - val_acc: 0.9559\n",
-      "Epoch 962/1000\n",
-      "157/157 [==============================] - 0s 363us/step - loss: 0.0416 - acc: 0.9936 - val_loss: 0.0824 - val_acc: 0.9706\n",
-      "Epoch 963/1000\n",
-      "157/157 [==============================] - 0s 227us/step - loss: 0.0443 - acc: 0.9873 - val_loss: 0.0831 - val_acc: 0.9706\n",
-      "Epoch 964/1000\n",
-      "157/157 [==============================] - 0s 347us/step - loss: 0.0440 - acc: 0.9873 - val_loss: 0.0839 - val_acc: 0.9706\n",
-      "Epoch 965/1000\n",
-      "157/157 [==============================] - 0s 256us/step - loss: 0.0411 - acc: 0.9936 - val_loss: 0.0864 - val_acc: 0.9559\n",
-      "Epoch 966/1000\n",
-      "157/157 [==============================] - 0s 331us/step - loss: 0.0416 - acc: 0.9873 - val_loss: 0.0843 - val_acc: 0.9706\n",
-      "Epoch 967/1000\n",
-      "157/157 [==============================] - 0s 286us/step - loss: 0.0419 - acc: 0.9873 - val_loss: 0.0754 - val_acc: 0.9706\n",
-      "Epoch 968/1000\n",
-      "157/157 [==============================] - 0s 299us/step - loss: 0.0425 - acc: 0.9936 - val_loss: 0.0783 - val_acc: 0.9706\n",
-      "Epoch 969/1000\n",
-      "157/157 [==============================] - 0s 314us/step - loss: 0.0417 - acc: 0.9936 - val_loss: 0.0784 - val_acc: 0.9706\n",
-      "Epoch 970/1000\n",
-      "157/157 [==============================] - 0s 348us/step - loss: 0.0418 - acc: 0.9936 - val_loss: 0.0836 - val_acc: 0.9706\n",
-      "Epoch 971/1000\n",
-      "157/157 [==============================] - 0s 310us/step - loss: 0.0428 - acc: 0.9873 - val_loss: 0.0837 - val_acc: 0.9706\n",
-      "Epoch 972/1000\n",
-      "157/157 [==============================] - 0s 357us/step - loss: 0.0416 - acc: 0.9936 - val_loss: 0.0867 - val_acc: 0.9559\n",
-      "Epoch 973/1000\n",
-      "157/157 [==============================] - 0s 317us/step - loss: 0.0430 - acc: 0.9936 - val_loss: 0.0851 - val_acc: 0.9559\n",
-      "Epoch 974/1000\n",
-      "157/157 [==============================] - 0s 296us/step - loss: 0.0413 - acc: 0.9873 - val_loss: 0.0855 - val_acc: 0.9559\n",
-      "Epoch 975/1000\n",
-      "157/157 [==============================] - 0s 249us/step - loss: 0.0414 - acc: 0.9936 - val_loss: 0.0873 - val_acc: 0.9559\n",
-      "Epoch 976/1000\n",
-      "157/157 [==============================] - 0s 279us/step - loss: 0.0416 - acc: 0.9873 - val_loss: 0.0879 - val_acc: 0.9559\n",
-      "Epoch 977/1000\n",
-      "157/157 [==============================] - 0s 98us/step - loss: 0.0414 - acc: 0.9936 - val_loss: 0.0799 - val_acc: 0.9706\n",
-      "Epoch 978/1000\n",
-      "157/157 [==============================] - 0s 90us/step - loss: 0.0418 - acc: 0.9936 - val_loss: 0.0793 - val_acc: 0.9706\n",
-      "Epoch 979/1000\n",
-      "157/157 [==============================] - 0s 149us/step - loss: 0.0414 - acc: 0.9873 - val_loss: 0.0807 - val_acc: 0.9706\n",
-      "Epoch 980/1000\n",
-      "157/157 [==============================] - 0s 120us/step - loss: 0.0405 - acc: 0.9873 - val_loss: 0.0741 - val_acc: 0.9706\n",
-      "Epoch 981/1000\n",
-      "157/157 [==============================] - 0s 129us/step - loss: 0.0413 - acc: 0.9936 - val_loss: 0.0755 - val_acc: 0.9706\n",
-      "Epoch 982/1000\n",
-      "157/157 [==============================] - 0s 252us/step - loss: 0.0409 - acc: 0.9936 - val_loss: 0.0803 - val_acc: 0.9706\n",
-      "Epoch 983/1000\n",
-      "157/157 [==============================] - 0s 263us/step - loss: 0.0404 - acc: 0.9873 - val_loss: 0.0769 - val_acc: 0.9706\n",
-      "Epoch 984/1000\n",
-      "157/157 [==============================] - 0s 158us/step - loss: 0.0419 - acc: 0.9936 - val_loss: 0.0744 - val_acc: 0.9706\n",
-      "Epoch 985/1000\n",
-      "157/157 [==============================] - 0s 220us/step - loss: 0.0410 - acc: 0.9936 - val_loss: 0.0833 - val_acc: 0.9706\n",
-      "Epoch 986/1000\n",
-      "157/157 [==============================] - 0s 115us/step - loss: 0.0417 - acc: 0.9873 - val_loss: 0.0915 - val_acc: 0.9559\n",
-      "Epoch 987/1000\n",
-      "157/157 [==============================] - 0s 141us/step - loss: 0.0403 - acc: 0.9873 - val_loss: 0.0797 - val_acc: 0.9706\n",
-      "Epoch 988/1000\n",
-      "157/157 [==============================] - 0s 115us/step - loss: 0.0405 - acc: 0.9873 - val_loss: 0.0821 - val_acc: 0.9706\n",
-      "Epoch 989/1000\n",
-      "157/157 [==============================] - 0s 102us/step - loss: 0.0397 - acc: 0.9936 - val_loss: 0.0813 - val_acc: 0.9706\n",
-      "Epoch 990/1000\n",
-      "157/157 [==============================] - 0s 161us/step - loss: 0.0402 - acc: 0.9936 - val_loss: 0.0899 - val_acc: 0.9559\n",
-      "Epoch 991/1000\n",
-      "157/157 [==============================] - 0s 299us/step - loss: 0.0421 - acc: 0.9809 - val_loss: 0.0819 - val_acc: 0.9706\n",
-      "Epoch 992/1000\n",
-      "157/157 [==============================] - 0s 218us/step - loss: 0.0400 - acc: 0.9873 - val_loss: 0.0787 - val_acc: 0.9706\n",
-      "Epoch 993/1000\n",
-      "157/157 [==============================] - 0s 195us/step - loss: 0.0410 - acc: 0.9936 - val_loss: 0.0817 - val_acc: 0.9706\n",
-      "Epoch 994/1000\n",
-      "157/157 [==============================] - 0s 114us/step - loss: 0.0392 - acc: 0.9936 - val_loss: 0.0889 - val_acc: 0.9559\n",
-      "Epoch 995/1000\n",
-      "157/157 [==============================] - 0s 186us/step - loss: 0.0399 - acc: 0.9873 - val_loss: 0.0750 - val_acc: 0.9706\n",
-      "Epoch 996/1000\n",
-      "157/157 [==============================] - 0s 474us/step - loss: 0.0406 - acc: 0.9936 - val_loss: 0.0791 - val_acc: 0.9706\n",
-      "Epoch 997/1000\n",
-      "157/157 [==============================] - 0s 267us/step - loss: 0.0396 - acc: 0.9936 - val_loss: 0.0862 - val_acc: 0.9559\n",
-      "Epoch 998/1000\n",
-      "157/157 [==============================] - 0s 310us/step - loss: 0.0395 - acc: 0.9873 - val_loss: 0.0734 - val_acc: 0.9706\n",
-      "Epoch 999/1000\n",
-      "157/157 [==============================] - 0s 457us/step - loss: 0.0398 - acc: 0.9936 - val_loss: 0.0901 - val_acc: 0.9559\n",
-      "Epoch 1000/1000\n",
-      "157/157 [==============================] - 0s 332us/step - loss: 0.0395 - acc: 0.9873 - val_loss: 0.0871 - val_acc: 0.9559\n"
+      "_________________________________________________________________\n",
+      "Layer (type)                 Output Shape              Param #   \n",
+      "=================================================================\n",
+      "conv2d_75 (Conv2D)           (None, 26, 26, 6)         60        \n",
+      "_________________________________________________________________\n",
+      "max_pooling2d_65 (MaxPooling (None, 13, 13, 6)         0         \n",
+      "_________________________________________________________________\n",
+      "conv2d_76 (Conv2D)           (None, 11, 11, 16)        880       \n",
+      "_________________________________________________________________\n",
+      "max_pooling2d_66 (MaxPooling (None, 5, 5, 16)          0         \n",
+      "_________________________________________________________________\n",
+      "flatten_29 (Flatten)         (None, 400)               0         \n",
+      "_________________________________________________________________\n",
+      "dense_186 (Dense)            (None, 120)               48120     \n",
+      "_________________________________________________________________\n",
+      "dense_187 (Dense)            (None, 84)                10164     \n",
+      "_________________________________________________________________\n",
+      "dense_188 (Dense)            (None, 10)                850       \n",
+      "=================================================================\n",
+      "Total params: 60,074\n",
+      "Trainable params: 60,074\n",
+      "Non-trainable params: 0\n",
+      "_________________________________________________________________\n"
      ]
     }
    ],
    "source": [
+    "# Creating a CNN similar to the one shown in the figure from LeCun paper\n",
+    "# In the original implementation Average pooling was used. However, we will use maxpooling as this \n",
+    "# is what us used in the more recent architectures and is found to be a better choice\n",
+    "# Convolution -> Pooling -> Convolution -> Pooling -> Flatten -> Dense -> Dense -> Output layer\n",
     "from keras.models import Sequential\n",
-    "# Building a Keras model\n",
+    "from keras.layers import Dense, Conv2D, MaxPool2D, Flatten, Dropout, BatchNormalization\n",
     "\n",
-    "model = Sequential()\n",
-    "\n",
-    "model.add(Dense(8, input_shape = (4,), activation = \"relu\"))\n",
-    "\n",
-    "model.add(Dense(8, activation = \"relu\"))\n",
-    "\n",
-    "model.add(Dense(1, activation = \"sigmoid\"))\n",
-    "\n",
-    "model.compile(loss=\"binary_crossentropy\", optimizer=\"rmsprop\", metrics=[\"accuracy\"])\n",
-    "\n",
-    "num_epochs = 1000\n",
+    "def simple_CNN():\n",
+    "    \n",
+    "    model = Sequential()\n",
+    "    \n",
+    "    model.add(Conv2D(6, (3,3), input_shape=(28,28,1), activation='relu'))\n",
+    "    \n",
+    "    model.add(MaxPool2D((2,2)))\n",
+    "    \n",
+    "    model.add(Conv2D(16, (3,3), activation='relu'))\n",
+    "    \n",
+    "    model.add(MaxPool2D((2,2)))\n",
+    "    \n",
+    "    model.add(Flatten())\n",
+    "    \n",
+    "    model.add(Dense(120, activation='relu'))\n",
+    "    \n",
+    "    model.add(Dense(84, activation='relu'))\n",
+    "    \n",
+    "    model.add(Dense(10, activation='softmax'))\n",
+    "    \n",
+    "    model.compile(loss=\"categorical_crossentropy\", optimizer=\"rmsprop\", metrics=[\"accuracy\"])\n",
+    "    \n",
+    "    return model\n",
     "\n",
-    "model_run = model.fit(X_train_scaled, y_train, epochs=num_epochs, validation_data = (X_test_scaled,y_test))\n"
+    "model = simple_CNN()\n",
+    "model.summary()"
    ]
   },
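+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "*(Explanatory cell, derived from the summary above:)* where the output shapes and parameter counts come from.\n",
+    "\n",
+    "- A 3x3 convolution without padding shrinks each spatial dimension by 2, so 28x28 -> 26x26. With 6 filters on 1 input channel it has 6 * (3*3*1 + 1) = 60 parameters (the +1 is the bias).\n",
+    "- 2x2 max pooling halves the spatial dimensions: 26x26 -> 13x13, and later 11x11 -> 5x5.\n",
+    "- The second convolution sees 6 input channels: 16 * (3*3*6 + 1) = 880 parameters.\n",
+    "- Flattening 5x5x16 gives 400 values; the dense layers then contribute 120 * (400 + 1) = 48,120, 84 * (120 + 1) = 10,164 and 10 * (84 + 1) = 850 parameters, for 60,074 in total."
+   ]
+  },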
   {
    "cell_type": "code",
-   "execution_count": 84,
+   "execution_count": 141,
    "metadata": {},
    "outputs": [
     {
-     "data": {
-      "text/plain": [
-       "[<matplotlib.lines.Line2D at 0x7fe91c78a208>]"
-      ]
-     },
-     "execution_count": 84,
-     "metadata": {},
-     "output_type": "execute_result"
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Train on 60000 samples, validate on 10000 samples\n",
+      "Epoch 1/20\n",
+      "60000/60000 [==============================] - 39s 654us/step - loss: 0.5778 - acc: 0.7859 - val_loss: 0.4449 - val_acc: 0.8398\n",
+      "Epoch 2/20\n",
+      "60000/60000 [==============================] - 36s 593us/step - loss: 0.3826 - acc: 0.8586 - val_loss: 0.4165 - val_acc: 0.8462\n",
+      "Epoch 3/20\n",
+      "60000/60000 [==============================] - 37s 622us/step - loss: 0.3315 - acc: 0.8765 - val_loss: 0.3604 - val_acc: 0.8732\n",
+      "Epoch 4/20\n",
+      "60000/60000 [==============================] - 38s 625us/step - loss: 0.2993 - acc: 0.8876 - val_loss: 0.3470 - val_acc: 0.8768\n",
+      "Epoch 5/20\n",
+      "60000/60000 [==============================] - 37s 619us/step - loss: 0.2791 - acc: 0.8960 - val_loss: 0.3262 - val_acc: 0.8829\n",
+      "Epoch 6/20\n",
+      "60000/60000 [==============================] - 38s 625us/step - loss: 0.2614 - acc: 0.9016 - val_loss: 0.3000 - val_acc: 0.8913\n",
+      "Epoch 7/20\n",
+      "60000/60000 [==============================] - 37s 622us/step - loss: 0.2463 - acc: 0.9074 - val_loss: 0.3715 - val_acc: 0.8696\n",
+      "Epoch 8/20\n",
+      "60000/60000 [==============================] - 37s 620us/step - loss: 0.2352 - acc: 0.9126 - val_loss: 0.3060 - val_acc: 0.8888\n",
+      "Epoch 9/20\n",
+      "60000/60000 [==============================] - 37s 624us/step - loss: 0.2247 - acc: 0.9162 - val_loss: 0.3020 - val_acc: 0.8901\n",
+      "Epoch 10/20\n",
+      "60000/60000 [==============================] - 38s 625us/step - loss: 0.2152 - acc: 0.9188 - val_loss: 0.3262 - val_acc: 0.8878\n",
+      "Epoch 11/20\n",
+      "60000/60000 [==============================] - 38s 631us/step - loss: 0.2077 - acc: 0.9222 - val_loss: 0.2973 - val_acc: 0.8961\n",
+      "Epoch 12/20\n",
+      "60000/60000 [==============================] - 38s 636us/step - loss: 0.1999 - acc: 0.9245 - val_loss: 0.2893 - val_acc: 0.8997\n",
+      "Epoch 13/20\n",
+      "60000/60000 [==============================] - 40s 661us/step - loss: 0.1926 - acc: 0.9279 - val_loss: 0.3193 - val_acc: 0.8929\n",
+      "Epoch 14/20\n",
+      "59904/60000 [============================>.] - ETA: 0s - loss: 0.1851 - acc: 0.9311"
+     ]
     },
     {
-     "data": {
-      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAusAAAH0CAYAAACEkWPuAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAWJQAAFiUBSVIk8AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4xLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvAOZPmwAAIABJREFUeJzs3Xd8VeX9wPHPuSP3Zi/IDmGGPcMUEVBBFKh7FKtVa6u1rdZVraPu7ta2Wkd/tmLde9QNDlRAgaAIskdYgew97jy/P07uHrlJbgbk+3698sq9ZzznuZckfM9zv8/3UVRVRQghhBBCCNH36Hq7A0IIIYQQQojgJFgXQgghhBCij5JgXQghhBBCiD5KgnUhhBBCCCH6KAnWhRBCCCGE6KMkWBdCCCGEEKKPkmBdCCGEEEKIPkqCdSGEEEIIIfooCdaFEEIIIYTooyRYF0IIIYQQoo+SYF0IIYQQQog+SoJ1IYQQQggh+igJ1oUQQgghhOijJFgXQgghhBCij5JgXQghhBBCiD5KgnUhhBBCCCH6KENvd6AnKYqyD0gCSnq5K0IIIYQQ4vg2GKhXVXVIVxrpV8E6kBQbG5s2evTotN7uiBBCCCGEOH5t27aNlpaWLrfT34L1ktGjR6cVFxf3dj+EEEIIIcRxrKioiI0bN5Z0tR3JWRdCCCGEEKKPkmBdCCGEEEKIPkqCdSGEEEIIIfooCdaFEEIIIYTooyRYF0IIIYQQoo+SYF0IIYQQQog+SoJ1IYQQQggh+igJ1oUQQgghhOijJFgXQgghhBCij5JgXQghhBBCiD5KgnUhhBBCCCH6qKgE64qinKcoykOKonyuKEq9oiiqoijPdLKtPEVR/qMoSqmiKBZFUUoURfmboiip0eirEEIIIYQQxwpDlNq5A5gINAKHgFGdaURRlGHAGiADeBPYDkwHrgMWKYoyW1XVqqj0WAghhBBCiD4uWmkw1wOFQBLw0y608whaoH6tqqpnqap6q6qqJwMPAiOBB7rcUyGEEEIIIY4RUQnWVVX9RFXVXaqqqp1tQ1GUocBCoAT4p9/uu4Am4BJFUeI73VEhhBBCCCGOIX1pgunJbd8/VFXV6b1DVdUGYDUQB8zs6Y4JIYQQQgjRG6KVsx4NI9u+7wyxfxfayHsh8FG4hhRFKQ6xq1O59EIIIYQQQvSGvjSyntz2vS7Eftf2lB7oixBCCCGEEL2uL42st0dp+95uXryqqkVBG9BG3KdEs1NCCCGE6F92HG1gw/5qSmtbmDEknZMKB/Z2l8RxrC8F666R8+QQ+5P8jhNCCCGE6HEfbS/jj+/vAMBqd0qwLrpVX0qD2dH2vTDE/hFt30PltAshhBBCdLvclFj349La1l7siegP+tLI+idt3xcqiqLzrgijKEoiMBtoAb7sjc4JIYQ4/tkcTu7533fUNNu4a8kYMpLMvd2lY47DqXLf21tZsbWMw7UtXDA1j7E5yTzwzjasDiejshKZM2IAt50xGkVR2HK4jj99sIPpQ9KobrKydk8Vp43N4rpTR9BstXPHG1v47nA9AHsqGrE7PdmwwwbGY9DpmJifzL1njsNk0PGnD3bwyKd7SDAZaLTYyUk2k5MSS2yMHqvdic3h5GhdK6V1rYzMTGRHWQNJZgMnj8rgvrPGkWg2AvD+liM8/tlespLMzC0cyMvFh2iy2BmcHs/73x119+GdzUeoeHwt20rrabDYGZ6RwO7yRoYNjGdgogmbQ6XJYkdV4UhdC/Wtdvd1pwxK4bfnjGdXWSMPf7wbc4yeH88ZwpIJOQA0Wuzc+cYWtpbWc9bkXN7fcoRNh+qYmJfMxTMK+N+3pZTXWzDH6DmvKI/1+6rZcbSB3NRYThubydvfHmHh2CwumVnAI5/u5u1NRzAbdVjsTr4rrWfPb8+gyWrnN29s4ePt5dS32rnqpKHExRh4+9tSdpU3MiorkXOn5PHjk4ZS22zlzje/w2zQcd9Z4zAb9QC02hz85s0ttNic/GbJGB5cuZOaJisnFQ7knW+PYHU4abba0SkKl8ws4Pyp+Ty4YifF+2tottrJTonlriVj+MfHu9hZ1oiqqtS32MlPi+OupWN4bt0BHv10D4kmA4VZieh1CnaHk5/OG86CMZlsLa3nd+9tw+5Quem0QgozE/nNm9/x8fZy6lpsXHniEG5frP28HYuULpRGD96gosxDC7yfVVX1B0H2G4FhgE1V1T1++z5Aq/hyraqqD3lt/yvawkuPq6p6dRf6VjxlypQpxcWhisUIIYToz55aU8Jdb30HwJIJ2Ty8TKY5ddSKrWX8+L8b2j3u/rPGMSEvmSuWr6ey0Rqw/6WrZrHtSL3736M9fzh3PDkpsVzy73Ud7rPL7WeM5scnDcXpVJl074fUt9o73VakBqfH8aM5Q7nzjS0AJJoMFN+5gBiDjn9/sY/73t7a5Ws886MZXLF8PflpseypaHJvf/sXJ/LpjnL+/GH7SQurbp7H8jUlPLm6BICbFhby85O1pIcnPt/L/e9sC3v+7OHpHKlr5VBNC49ePIUfPeX7MxKj12F1OAPOy0wyUVZvCdpmapyRjXcu4JJ/r+OL3ZUATMpPYcmE7ID+PPaDIhaNy2r3dUZTUVERGzdu3BhqLmWkojKyrijKWcBZbU9d78QsRVGWtz2uVFX1prbHucA2YD8w2K+pa4A1wD8URTml7bgZwHy09Jfbo9FfIYQQIpivD9S4H5eHCBCEptFi53BNCw2tNhLNRpavKeHLvVXsq2xq/2TgjrbgNJQLHl/LjCFpLJ6QzTvfHmm3vdc2HuarfdURXTuUB97dxujsJMbnJfdIoA5QUtXs87zBYqesvpUPvjvabgCcmxLL4dqWdq/x7y/2YnU4uX5BIT9/7mv39tvf2MKmg7UR9XN3eSPbjzS4n//5w500Whzcevoonlpb0u75VY1WHE4Vq93JE5/vC9gfLFAHQgbqADXNNk796yqfG5BvDtYyMjMx4NjXvz7U48F6tEQrDWYS8EO/bUPbvkALzG+iHaqq7lEUZSpwL7AIOAM4AvwDuEdV1a79FgohhBBh5KZ6cpFnDx/Qiz3pW6qbrMTF6DEbtVSSGIOOlVvL+OWL3wCwdGIOzRZ7xIF6pM4ryuP8qflcf2ojZ/1zNY2WwAC6ID2O1356AsX7a7ocrAPc/MomUuJiutxOpJZOzGHJ+Gx+9+42mq0OAA7XtlDXYmv33NgYPYkmAw1B3hdvn+yoAPAJ1IGIA3WA0toWls0YxNq9Ve5tTlXl6wM1HKxu/4Zh+1FPoO/dRld5B+ouL244GLX2+4KoBOuqqt4N3B3hsSV4yjAG238QuDwa/RJCCCE6osFrNDXR3JemdfWuW1/9lg+3lrmf73rgdJ/3p6HVRkqsMerXdeWPD89IIMlsCBqsL5s+iPQEk8+NVle02hxsO1If0bFp8TFUNwWm8ARz9dxhPLZqT8D2RLOB1PgY5o/KcH+CcNG/Ipuel2AykJWf4k4BcZk9PJ3Vu6MXEAMcrm1lTI5vwb
5Xig/xr8/2RnS+Ua9gc0Q39bojQgzcHxPkL5EQQohes/1oPa9vPMziCdlMyAtc8+7rAzW8t+Uo507JY2SW56PtnWUNvFp8iNT4GDbur0FR4EcnDmX6kLSANr4rrePNb0pZOiGH8XmeYGNfZRMvrD/AvMIMth+tp9XmpMord9oVjK7ZXcmqXRWcMS7bPWlydHYil88eglEfWFRtx9EGHv9sDxUNFpZNH8Tp47Pd+97fcpTi/dXcvngMn2wvZ31JNZfMKiA7OZZdZQ28UnyI08ZlMWVQKm9/W8q2I/XoFYWDNZ6Ry5wUMwoKmUkmfjCzgD0VjTy9dj/bjzYwOD2en8wdyrCBCQCs3FpGXIyekqpm1uyp5EhdK4PS4shONmPQKZQ3WBg8IJ6aJivnT81neEYCNU1Wnlt3gCtmDyE2Rk+rzeETqAN8ubfKJ5Vh/b5qThsb/RSDF9cfQFHgtLFZ6PXBx/kSzUbue3srL0VpNLWoII2V28raPxCYWpAa8N6EkhRr4J/LpvD614d92n/uqwOMyEjwqTDTEWqQ5WeuO6WQ1bvXdqo9b6eMyuCj7eUAvLv5CEfqfEfQO/L6O+LtX5zIkoe+iGqbFQ3HbtUeCdaFEEL0mpte3sSWw/W8tamUL245Gb3OE5DZHE5+8nQxFQ0WPt1RzofXz3Xvu/b5r30+Vgf4cm81X/76FGJj9O5tqqrys2c3UlLVzNubSvnc6xq3vPot6/ZV8/iq4CODiWYjdS02fvzfDTRZHT7Hvf41JMcauXDaoIDzrn3+a3aUaX37fFclH984l6EDEyipbOKaZ4vJTo7lxycN5SdPb8DmUNlZ1sATP5zGL1/8hu9K63nuqwM8/5OZASkLweSmxvLgil1sPqwtQfLVvmp2ljfw+jWz2VPRyJX/3cCpozN9gsPi/TVB21q7t4q3fn4iLTYHf/pgB06nyi9OGcGL6wOD4Ftf3UyT1TPK3WR18NrXh9vtb0d9sqOCuBgDp43NQgnxoXyi2cD+qiafT0UisWBMJiu2lvGz+cP45yeeEe8xOUkRB+sT81OCBqtLJ+bwv02lPtsGxJtYPCGbIQPiA9pfvbuSEzuRdvXNwVoMusD3ZfqQNG5ZNIo/vL+9w216G5Wd6A7WD1Q343+pyYPCB+vZyWaO1LWSGmekprn9tB6XgvS4TvU3mBi9jpwUM3mp0Wuzp/WlOutCCCH6mf1tk+uO1LUG5OgermmhokGbXLazrJFpD6zkgsfXoqpqQKAOUNdiY84fP+EfH+3y2eaawFda1+puz2p3sq6d/Oarnynm5pc30dSWR+zvm4OBa/Q1W+3uQN3FFexuPlyHU9UmZv5o+QZ3SsDKbeWoqsp3pVrqRYPFHnEFkLve+s4dqLt8faCWqfev5JS/rALg4+2RBZ7fHqrjk+3l7vb+smInk+/90Of9dKloDD/59qJp+Tz/45mMzUny2f7qT2fx3nVzIuqPyzubj/De5iOEqrqXYDbw7aHAfwvvgM9s9A13FozJ5P8unUrJ7xdz82mjOGdyLgBDBsSzKMJPCAozE/jexByGDox3bzuvKI+S3y/moe9P5vpTfZeNcX1SkxofmC6kUxTmjczwuVm9YGqeT9ugBZ43LvBt17uUJcD5RXkAfH96Pgmm9sdkY9vKL549OZdRXp9e5aXG8pM5wxiQYPL0U6ewcEwmAGOykzh/al7Idk8Yls5LV83CqFdCBuoF6XEsm+F7wzt9SBqJZmPA6xyekcB5ba/t7xdNCvnz4O3MSTlsv28Rn948n39efOxWdpKRdSGEEL0myWx0j4g2ttpJi/dM7DtS5/uxdUWDBaNOQVEUpgxKYeOBwMlxlY0W1u6p4tpTtJJy/pUySutaSE+IofCO9yLqX7hRw892VvDZzgr36pWtNgeHagIn2n28vZyfnzyc0ra+1LXYGJmV6BNk/8+v2kmw0e95IwfyadtEQZdQE/vqWjzpPM4OpAlXN1m56ZVN7uc1zTZigqT6WO1OrPbQScCKArOGpfPOtXMoum8FVW153TkpsRzwq34SiWe+2h9y39VPF2Px6svfLpzE/FEZJHvl0FvtTk74/cdUtt1k+M9H+P25EzhnSh7j85J9Ama9TsHh9QY+edm0thcIRQWpJJmNLJs+yF21xbvdnBTfGv2FbYFwdnJswMh7k9XO4AHxfHLjPDbsryY1PoYThw+g2epg86E6pg1JZcvhOrKSY8lJNvOXFb6lFj//1Xx2VzRi1OmYMVRLBUuJi+HjG+fyXWk9zVYHDlWlqCCVzYfquPoZTwnrj2+ay57yJqYPScNid7Bhfw2oMHlQCslxRj745Ry+2F2Jw6mSYDJwUuFANpTUMGlQCgkmA8tmDOK5rw4AMDEvmUd/UMSeikamFqQRG6Pn4xvnsfFADbOGpmN3qny1r4pYowGTUceU/FTiTXqWjM/GYndisTs5qVD7hOFn84czfUgaOSmx7K9qZkJ+MrFGPYsnZDNlUCof3TCXxrb69XsqGlEUuP5F7WfXbNTxxKXTmD4kDV2QTx6ONRKsCyGE6DXewU19q+/oW2mQknSuwD4nJTZosO7a52nDN+AvrW1hyqDUTvf3tjNG8dt3tdSCw7UtXPqfdXx790KSzEYeW7WHv60MHIXeVd7I+Ls/JN4rPWdkZiIDEmLc9cWvfd435cV/tBS0kU//YL07RGP5FVfZw1abwx2o63UKGYlm9wJHHTEgwcQJwwbwpw92BOwrzPS98fnexJyAAC3GoOPquUPdQXWS2Riw/8QRgWkoDqfqE7C32hw+cxAAnxKPiV7teuegTxuc6p5HANqE3GAGpccxyOsTgeRYT7+KCjzzMUZnJ/lMgs1PiyM/LTDNIyPJHLCw10CvkXK9TiEryUx2stbXGIOO+SMzfI5PTzBx5qRcn23e79WUQanuYD0vNY6clFif30H/vp09OXA0/oQgKUA6ncKMoenuNlxc/fO+GZuYn8K0B1a6n7fanEH/PY9VEqwLIUQvs9qdfLStjPIGC4WZiQwZEM9X+6qYV5hBclz7FTYaLXbe33KUuBg9p47OJMago7LRwurdlZw0YiCp8cHL0DW02vh4ezlFBankpcaxt6KRVTsrmJSfwmSvgPZwbQsrt5YxIjOBWUPT3asAVjRYWLOnkjkjtJHlz3dVMCgtjk0HaxmTk0xeaizrS6qZPyqDJLORdfuqabE5mDN8gDuY8g7W/7ZyJ3MLB7JgTBZZyeagwXqj1c4zX+73GUn156oK0my186/PfKtvjMxM5L3N7dfsDiU93hSw7dFP95AeH8NnO8MH0t7pNC9tOBh0IaBQclNimZSfEjDSG0pHqm54l/678eVNAftD1b8Op74tpam0toVYo54Wm4OsJDN6nUKDJfLcZRe9ovCz+cO5Zt4wmqwO7A6nu7zilU+tZ7NXurzrZ2tXWQM1zTacqkpts42tbcFtXIyepA5U+vF+v3/33vaAYN078E7yGVmPxWTQkZsSG5Av3dH8+mjy7m+CydDlVT29P0F4Z/MR/mJzuFc27UmuFLfjkQTrQgjRyx76eBcPfbw7YPtJh
QP57xXT2z3/V69s4t3N2vLn1548nOsXFHL5k+vZfLiOyYNSeP2a2UHPu/W1zbzz7RGyksy89fPZnPnwanfQ9vYvTmRcbjIWu4PzHl3jTkl56orpzC0ciKqqXPLvr9h+tIFpg1NxqqEnLi4ck8mPThzChW3l6B76/mSWTsxha2m9zyIrK7eVs3JbOf/+Yh8rb5jL9ybl8PrXh9nrVbtbVeE3b24Jm9rxj492ccb4LJ5aU8L6Ek+fblpYyIjMxKCj3y4GnRJ0VBu0gHnmsPSA7Y9+GliOrz27yhsjPva96+YwOlvL/S6+41QMeh3j7vqgw9cEeP7HM7np5U0+6UGjs5NYVxJZffIYgy5s+ovL57squf31zWw8UMu43CTWl9SQmxpLXYvNnarQEaa24E9RlIA87NiY4KHMo6v28NpG30mvo7OTeO+6OXR29fZgkzkB9w2J94h9QXoc2+9bFDQYPndKns/vi66DAXOQzKSIedfCj6SWe3syEn1vYE2G3p8O2ZGbsWNB77+jQgjRzwUL1EHLiY5kFNUVqAM8tXY/h2tb3GkBXx+oxRZiZNRV0/lofStPrinxWVhldVvd5j3lTRypa2VCXjLpXiP0FY0W9yTP9SU1IQN10PK+vZeM/8XzX7O/qonLl68LuphLSVUzh2paKEiP567vjQ3YH0kO9jmPrOH5db5VTHJSYnE6VZ9VSr1NHpTCrgdOZ8mE7IB9ep3CS1fPIjcllrltOeo9IT0+hiEDPJMMU+JiSDAZ3MG7N9ckyVCGZyQwc2gao7N9V3ecP8qT9pAca+TkURn+pwJw2thM/nz+xIj6bdApPPvVAbYdqXffME0tSCXBZCAvRD303ywZw7Z7F5Hq92mSVpZzSMhrXTNvmPvxtScPB7TRcP9A3bfN8MHxbWeMcj++cGq++/H9Z40LOPaupWPZdt8idj1wOmdP8fwbKIoS8jrLZgzilkWea3g/jsTdS8d6PR7ToXNHZiWS2HbDszjIz3pHFaTHu1N+igpSuzxS31l/PG+C+/HfL5rcK33oLsfXrYcQQhxn9lY0MiLI0tku/iOEdS02TvzDJz7bGvwmbtodTo7W++Zyj8pK5JzJue7yex9vL+f7Mwa5U1G+PVTHSYUD3YHqkdqO1Sz2r5Ay90+fhj2+tLaFwQPiOWnEAB5eNjlsGcPhGQncvXQsP/j3V+5tzUEquOSkxHLB42sp9Zu4mpsSy9mTc7loej6KonDfmeMYn5tMeYMFm8OJw6mydGKOOyD543kTeH7dAY7UtpLgN4KnqlqawdyRAxmcHs+aPZWkx5vYfrQepwpGvY6TR2VwweOeGtiuUVmXxROymTk0ndLaFpZOyAmaUvD4D4p4d8sRspPNbDvSwOjsRJZMyGFO4QB2ljWyeHw2+yqb2FJax+isJI7Wt2rlDxXFJ58Y4NwpuWQnm9l+tIElE7LJTjbz4oaD6BSF08ZmsWLrUWwOlQum5pMWH4PN7qTF5sBqd1JS1YRep3D25FxKa1v4dEcFg9LjGBBv4levfuu+xj3fG8t5RXnodQrLL5/GD55Y5/4ZTDQZ+PUZozlnSi5mo55nrpzBmt1VnDomk1U7yhk6MIHhGQmEMjo7iScvm8bBmmZ3tZBQ4WJNhAsYXTprMAkmIwMSYpg3MoOiglQSzAZmBflkxSVYzf1wLp89mKRYA1lJZsblJrd/gpepg9N4/JIiqpusnDMl/E2av0SzkWd/PIN1+6o5q50bvEgY9TqWXz6Nj7eXs2RiTpfb66yz215LrFHPvJE9d0PdEyRYF0KIPmzBg58BWpm05ZdPC5gsFix3e1RWok9pw4ZWG2nxMdQ2W5n7p0+DfvRd02Tl1DGZ7mD9q33VTLj7Q59jcr1yU73zySNJjeho1sGyJ77ih7MK+Om84SyZkMN/vtgXckJpaW0Ls4enMzEvmU1BSvgBjM9NZnRWUtD8/VtOH8X3vIKM1PgYrpo7LOA4l8wkM7/0K8sXSrAgzP8G654zx/KrVzyBbV2zjUtmFoRtd1B6HFe39fHMSZ7t3pP3xuUmszRI8OQdrGckmkiNjwkI2q6ZN9z9+Ccn+b4X5xYFL9c3IS+FReO0kdqVflV0fnjCYPfj4RmJ3LFktPsGbE7hAJ/yfWNzkhnbtlLmkAGhR9S9zff7NCBUBZDq5siCdbNR79OnC6blhzm6c8xGPRfPCP/vHE5XFqGakJcSdBGyzhqRmRh2UKEnGPU6Lpga/X+nvkDSYIQQIkoaWm0crG5GVVV2lTVg90s/qWmysmZ3JTvLGthV1oCzAzX1th6pp7LRyqaDtXy5t4pWm4NGiz3o4i2xMXpGev3H6UpR+WJ3Zcgc1bc2lQaMuPpLMhtZs7uSNbsr+XJvx5YyP7sTI3hPrd2P3am9h4nm0BNtm60O6lpsYSftPXD2OJLjjOiDfET//pbOTzjtDP80Af8eBashH00zh6Zz/amF/Om8CTx1xfQOjwhHQ6zXpwUtIerYd4dI8u2F6GtkZF0IIaKgvKGVU/68igaLnZQ4I7XNNqYWpPLy1bNQFIV9lU2c9rfPfIIF18IsAxNNEVUyuOzJdZS3HWc26nCqwYOPrw/U+gRDN7y0if1VzcSbfNMpXKsLAmw8UBtQF9pfq83Bsie+CtienWx2L24UyqWzCni9gytc6hRtFBtgVZhKKyMzE6lvsfuU0PPnGkUMtjS7d85/fzApP4VJ+dEbVe2M1PgYRmcnEWvU+ZQ0FEIEkpF1IYSIgkc+2eOeLFnbtlrfhv017KtsYk9FI39fuTMgsF6xtYxGi53WCEcWy70C+lZb+EVpvHOgAf6zel9AzfFTR2e6H2cnmxkQb/JZrdBfsEmNAPeeOc49kr90Yg4DEnxTTV6/5gQmD0pl+pC0YKe7+U88LMxMdI/6hppg+PCyyXxw/UkMSo/j+gUj3Ntzkj03Hld7pbRcOmtwQBs3nzYybL+6wzCvlSlnDk3nqrlD3c+vPWV4sFOOKRPyPOk/wX6mpgxK5b3r5vDaNbO5Y0nHJkhGyr9KCcBP54VObxKir5KRdSGEiII9FcFL8ZU3WPjnJ7v5fFdl0P1HalsCAuvu0NBq51CN7+h3Tkos93xvLB9uPUpqXAw6ncLfL5rEk6v3ERdjQAUqGyzodLBgdCZTB6cxa6jvBLsThqVz0ogBHDqhgNtf38L3p+dz2QkF/O7d7eyrbOKa+cPdNdvv+d5YHv5kN3pFcbednhCDTlGINxm4YUEhr399iE93VJBkNvoEsD+bPxyL3UF6vIlBaXG88c1hxucm++TtnjsljwNVzdgcKr9cMILHV+2hvsXOT72C9ROGpXPLolFsP1qPqkJmkokrZkeWF91hTifodIGPgUd/UMRjq/Ywe9gA8lNj+fm8oVjtTsxGPRdNGxSiwfBtBlBV7UunA1sr6I2g62T967Z0JHQ6zwQE73Qep1N7rqpgbyUjVsdD35/MJ9vLuXLOUN+2rM0Q41V33LufTieobV+GGHDYwd4CxnjPNa2NYDCDovf0yfv9aPu+/PLpPPH5XuaOHMie8kZqW2xcNWewdq1gFUuc
TkAFpx0MJrA2adfQx/i+z9Zm7Xy9yes1t2h9Bq1vOoP7vUB1gDEOFJ3Wd2hr06hdz9bseX2gHa+q2usP8rp83m9XezEJWn9RtcdOh9YnAEOs9u9ua9Hed9fPgt0CxljPv6frGg6Lts11rDNUeUcFYuLb/j1itdfntGv9UvTa67C3evqmj9H657AGf/32Fu08nV47D7Q2Y7zeG1uLdg1vBrP2elWH1g9bk/bdadP2uX62jmFKZ2uNHosURSmeMmXKlOLi4vYPFkKICBXvr+bcR9cG3ffghRNZvmY/mw4GnxzZGUMHxjNjSFpAacKOWjZjEL89e3yUeiV87FoBr/0EMsdqgUTVLjj33zB0ru9xlkZYvhgay+H7z0FOmJJzH98PXz4GJ16nBTGf/xVmXAWn/Cbw2KZKWL5EC1zypsHWtyAhA77/PGRHVn7R7bWfwLcvao/HnAVHN2uPL30TUvLhcHHbtbxtA6/YAAAgAElEQVRuBnVGmH0dnHKnZ5vTAc+cA3s/1fr0w7e1wHD5YmiugYJZsPll7VhjPIxeAns+hqZ2Vm0tPB2mX6n1s7lKO3fiRbDkr77HlX0Hz10I5mS47G2I9VrJdsOT8PYvtcd6k9Yvb6OWwIXPwFu/gK+fgSDpVG6uQLUrYhLgjD/BoQ3ae1K4CPatgpQCWHi/9j5aI6/V70MfowXMYSmAqv07hgzWe8jQ+fCDV+GD22Hdvzr33l7yOgw7Ofp9a0dRUREbN27cqKpqUVfakWBdCHHcsTucGPQ69wRPg9cEukaLnTijHp1OcR8H0GSxYzLoMOh1qKpKg8VOktlIi9WBQa+40zFsDid2h0ps29LxTRY7Y8MsUHP13GF8uPUoeyuaQh4D8MJPZvKPj3aREqdd85Mwy8oPHRhPXmpcuytmRlKlpeT3i8PuF510d4hSfHf7VatZcRes/pv2ODEHbtwW/Dy7Be4PXv+c249qI6Te3rgGvnk28NipPwoMYsM5XAz/FyLIGbEQLn4Z/jQCmsoD9+uMcNthbZQaYN/n8NQSz/4Ln4H9a+DLRyLvT0dctwlSB3uePzwdKndoj/3fh1D/Xt6W/h3+d11UuygidNFz8MLFhL1Jao//714PiFawLmkwQojjymsbD3HHG1t86myfVDiQ5ZdN41evfssrxYfIT4tlVFYSX+yq5IYFhVQ2WXji830MSIjhn8umcMcbW9wVOQw6hQSzgUcunoJeUbj6mWIaLXauX1BIk8XOPz8Jv3rlY6v2+Ez2BLhkZgFPf7nfZ9t1L3zNV7edCoDTqTL0tnfDtpsYwQp9O+8/ncG3vtPucaIXHfSasNtQGvo4S5hRVEtjYLC+99PgxzZ3rIoPdYdC79vVVtozWKAO2oispcETrDf7pYI1V8H+1R3rT0c0V/kG665AHTp33YPr2zmgbTT6eBTjNwnY1tL1Tw/a5fV+Vu32PHal36hO309zjmPHdhKPEEL4ueGlTQEL4ny2s4Lla0p4pVgLPA5Wt7BiaxktNgcPvLuNx1ftxeFUKau3cPmT631K59mdKrXNNp74fB9Pri6hptmGzaHyx/d38NiqvUH78NgPprgfT8pPCchJLypI9T+FsnoLdW0TU0PViHY5e1Iufzl/ItMHaxM2X/jJzIBJna5VKBeMyQw43yUrKXz1F9GHWMOUcwy3L+DYTqZOdJbFq2/+NxzhbkCiIdLMgUiPa+99LpgdWTvHmpRB2ick3l8Tv9/91/V+Pxu8KjZljNH68Iv+kyUhwboQ4rihqmrAKLbLvW9vDXnegxdOdJeyc1V08Vda28Lfvz+JJK8RbUeQOul3LR3DwjFZnD05l5Q4I8mxvvXB/3z+RE4fn8XCIEH0oVrPKNG/fziVwswEfnTiEE4elUFGoonMJBMnj8rgyjlDMRv1DEqP47YzRjFzaDrPXDmDoW0VRooKUt256HctHcPEttcWY9AxKC2ORJOB8bnJPPHDqSHfE9HHtDeyHo12gulqqqz3zYH/jUJ33zhE2n6ko7PtvXeJnV+kqE+LCbLYkakHym16v58NXmshuK7tP9p/HJM0GCHEcaO22dapyio6ReGbdiaAlta2YDLogy4gM21wKi9ffYLPtgcv1JaVrG6yMuW+FQCkxBndy6H/61ItUD7zn6vdk09X7650r9x4yuhMThkdelQctMDfZVRWEh/fOC/gmLzUON782XE64tefhAs8OxL0djRA7mqagXeAGzCy3r2LP0V8YxLxce3093gN1oMF5j0RKPsE614j6zH9L1iXkXUhxHHjcG1Lt7Vd32qnoTV4VQT/0XNv3ucEyzOv81r+/Lfvbu9CD0Wf4eyGVTKjNrLewQC5q6kqPiPrDaH3dYdI24/0uFC5+S7mZK1k4PEmWFDcEyPrsV4LdwUbWe9oOcZjuKCKjKwLcZz4/XvbKd5fze2LxwSsTuh0qtzx5hb2lDdy/1njGJGZyHNfHeDFDQe56qShnDo6k1+9sonKRiu/P3c8ealxAe2rqsoD72zjrU2llDdY0CngVCE3JZbDtS3odQpZSWZ3wPzK1bPYdrSBO9/YAsB1p4xgb2UTH3x3FKvdyZIJ2fz5/InE6HXc/sYWnl93AIDB6XFcPKOAH580lOte+Jo3vynl9jNG8/bmI1S2LQqUaDZgdTgZPjCBifkp/OkDbeKYf952pB79NPgk0RXXn8RPni5mX6VWyeX8x9YGLc9c0Ri6DNrq3Z4JfQerA28mUuNjKGln9U9xjLGFqfzjsIO+E//19lbOekfaDqY3c9YjvTGJ9LiGdla6NSVqgaS9+wYNOkbR6ox3tT+9NbLunX7jM7IeJC0nEnYLGI/NeToSrAtxHPhqbxWPrdICzsueXMc3v1nos/+9LUd57istGP7Vq9/y3JUzue11rVbyNc9u5O6lY3jjG60Sxb3/2+pO0Wi22vngu6MMH5hIeUMrT3yxz92mK13bFZw7nKrPyPaB6mZ3oA7w9492+fTp7W+PMCEvmezkWHegDlBS1czv3tvG4gnZvNnWpwfeDV7Obm9FEx9uLXM/rwwTNIfjPaHUW3ZKLHmpse5gvbLRwuzhA9z9cpkaZMKoS7Dl7b1NH5zG1we0NJhIKryIY4A1TLBubfQdMfQXasGecG12JOgN105n2m5vtNJnZL0p9L7uEOlrjbQf9tbw+2MStK/26sL3lJgErUpQV4P1oDnrnQyYO8L7JsH7ve/sqL61UYJ1IUTvWV9S7X7sWure2/aj9e7HXx+opbbFN6j1DsI/3FqGxe5gxm8/CtpWpG54aVO7x7yw7iCjcwKXsHeqcMXy9sqkda+sJDMJJgOXzCxgfUk1rTYtteHXp4/my71VlNVro/z5abH84uTQy8NfMDWfl9YfZEdZA3+9YFLA/mtPGcGKbWWU11t49OIuleIVfUW4ANc/WLf7Lb5jbw0swxhJm5Gyt3ZsdD9s20r7Oe3e/fZvy9LYPSlDwa4X7jrRGuE3JfRMekikTG3BenvpO5G0469HRtZDXKOz17Y0QPyAzvenF0mwLsQxrqrRQk07QfXFMwp46OPdAKTFx9DQ6lvx5FC
N78iLyaAnfPHA6Nhb2cTeyuCjX+0tIhTKF7fMZ0CCidpmG+kJMTicKnUtNnSKQoLJwDXPFoddcAjg4WWTOWGY9kd94dgsiu9YQE2zFZ2ikJVsZvUtJ+NQVW1Fbr0ubKlFo17HGz+bTavN6V5IyVu8ycBHN8zF0rbUvDgOhEsd8Q8MgwWwwYL1jrQJ4AjzN8Ha4Lt6ZzjtBbLt7fd+ff7pJtaG7h1d9+6bf2qS1esmI1p9iEmMLEXDYG5/lD4q/UkI/rPUmXb89cRNSahrdGVk/RglwboQx7gz/7k6INj2NzDRhEGnYHeqVDdZ3bnfoVz4+Np2bwC6W7D0kQEJMVw6azB/XbEz6DlnT85159tnJWuBr1GPTxDsf6MSzJwRA30mjcabDMSbPH8uDXpdh/54KooSNFD33n/cBerhRm+99zkdoOjAUq9N0ANoqdHSK8wp2sIrlgZtn6pqx5mStBFdU6L23BUgWeq1hVJCcZ3ntGvXNCVBa62WdmJO0friagfAGAc6g9bX1nrtPNACIJ0RULWgWNH5jjCHy21uKIUEr5VIW+t999cfAr3R0zeXpjCLGTVXQnO177ZwgUnd4cgn27XUhNmphl80CaDJq2+tfitIttZ1b956c5Xn2o1+o8uW+tD7OktviCyQ1Jt6Jlg3JWg/w9Fox19PTKQNdePT2ZH1ukOQNb7z/elFEqwLcYx5em0JJVXNXDNvGOkJJpqC1AW/443NzBiSTnmDhbV7qkiNM2L3qgm+7ImvAs7x9tW+6rD7ByTE+OSHv3z1LJosdi57smupK6ePy+K9LVqgY3MEBhMF6fFce8oILp4xiKL7V/rsO2dyLn+9MDDNxN+G/eGCD01Dqy1shRfRjtX/gE9/B1MuhdP/4NnudMKz58GhDXDmw5A+DJ483RPEFczW0kIOb+idfveEp88Ov/9f8zre5rp/aV+ReiyKpTyfODn8/nWPa1/B1JRErx/BbH5J+wqmtRb+OCT614wkkOyJjy1Bu9GMie96O8FeU7B5FdEWcmS9k/nyz18Ed9e1f1wfJMG6EMeQ4v013Pnmd4CWm/6XCyYGPe6ZLw/wzJcHgu6Lhvy0OJ9gPSclNirB7aT8FHewHkxxW6CdFh9Y9SU7pesThx6/pIhEs4EBCaYut9WvrbhT+/7VY3DSzZ480a1vwJ6PtMcvXQIDR/uOtnbn0vNCdLeUAkjMDtxujPN88pKUC6MWd+zmqiMMXhNKk3KjMxE02GtKyu16u+Hkz4D4DO1TK/9Py7zrrw+eAyWfd29f+gAJ1oU4hjzz5X7341c3HgoZrLtcd8oIviutY+W2jn/Me9G0fF5Yf9D9XFG0T86HDYxn/sgMdwUT0KqYxMfoyUwyuSdeJpkN1AdJOTl1dAaKorDCq4oLwMT8FC6aNohHV+0JObH1sR9MaeuLwjlTcnlt42EA4mP0XDE7slEy/9flKkF582kjOW3scbqoSW/yntRVscN3X0XwKj9dojMGjiY6rF1f3AfQhkTDpI+YkrTgwqXVb6Etc4gqMK7jzCla+orT6/fGYNa+QEudsLeCMV77hbQ2+u4P166iaHnq/ukykfB+HTojOG2Br8f7mOlXQeNR2LsqsC2DSXt9epPn38R1bkyilsceN0AbFba3QmoBjFoCn/5eS4lKHgSTlsGuD6F0o6ddc0pk77f3e+JD1W4czcnatRU9TLhAG/3f/rbnHJ1e64/dAt++4Dl93q8hbQhMv1L7ZOhg26eXM66G8RfA/67T8uYX3q99glS9F3avhIRMmHYlrHnIk34VP1Drg63Fq7/Jnhtb79f6vYe0f5PVf4epV2i/ax/dox0/50bQx0D5Vq1CjaXRE/zGD9BSlFqC/DxMuFD7Hdr6FgyeDYWnBR6TnAsn3wmbX4GxZ8GWV6FypxZgo2rXy5kCIxZqN+mjl8KBL7X+LP4LvHMD7PkYEnNg6uWw9U0YMhdq92vnnvUoJAzU3tf1T2jvt6Jo7Q0+ydOPMx+G16/WfrbThsKeT7T+bH9Ha8uVxmVO0X4/j1GKegwXie8oRVGKp0yZMqW4uLi3uyJEp/x1xU7+4VUCseT3i5l874ch88t/s2QM97691Wfb3y6cxC9f/Cbg2O9NzOGtTZ6ShC/8ZCY5ybFsP1pPbmoso7OS2Hy4jsLMRF5Yf4B7/udpd+9vz0CnU/jj+9t5pK1m+Y0LClkwNpOMRDMGvcLeiiZUVWVcbjIKsG5fNS02B0UFqZRUNTMqKxGzUU9ds431JdXEmfSMyU7iUE0LGYkm6lvtDM/wfCzqcKpsPlyHAgxKiyM1yGh7MM6283SKQl5qLHq9woGqZsbmJKH0xEe7/cHdyZ7H136jBTEAK++BL/7avdcedy6c9x/fbVvfhJcu7XrbmeOhbAshA/ZfboGU/K5d4/llsOMdz/NT74ETf9m1NoUQvaKoqIiNGzduVFW1S6W+ZGRdiGPINfOGuYN1vU5h6UNfhJ0IWtHoO5E0LkbPmZNy+HxXJa9u9J0YtrfSd6JXbkos+WlxDEr3TFCa2LbYkncaSkaiyV0NxXvyZqLZwKgsz0iG/0JNJwz3lNCaFOdpLznOyKljMt3PU9r2ZfgNiuh1SkCbkdDpFPfrcBmXmxziaNFhDr9PU7w/wnZ0rg5+h+iD3LQF29YZhhhtxNMZ4ncuGtfR+U001sl/00L0d/JXQIheVtVo4f3vjjJ72AAGD4jHYnfw7uYjxMcYKKtvBUXhzEk5lNW18sF3nnxu18hyOKW1gVViFEXh12eMYtmMfM59dK17e0FaPFsOeypTZCaFzgE/c1IuZ04KzFlsaPUEMYlmmaDZLzn8Kg15B+jHerCuN4WfWGeIwnX0fr83EqwL0e/JXwEhetmNL2/i0x0V5KbE8unN83hqTQm/fXe7zzHr9lXz+a6KDi9SNHyg72z61LZR6gEJpoBJlDEGXdjnkchOiWV0dhINrTbSEqIUIIlji39A7r3oT08E64Ygk4ODbetU2+38TEdlZN3vv+VIFy8SQhy35K+AEL3s07YFeg7XtrDlcF1AoA7wv02lAdsi8Re/euQPnD3O5/mZk3J485tSkswGbj19FB9tK6O+1c5Zk3I6db1bFo3ilkWjOnWuOE7Y/QJy78V5/Pd1h24dWW8vWI/CTYF/sC4j60L0e/JXQIg+5OxH1nT63Aun5nPJrAL3ZMxT/6pVYjDoFB5eNhmjXsfcwoE+59z7vXHMGppOUUEqmUlmXr76BDYeqOGM8UFKdQkRCf/Rc0eYkXWdwbfySTT0ZrDun2/eGRKsCyH8yF8BIXqRw9n5akyvXXMCVz1dTEXbaqQ3LCx055kb9Z6gKC5Gz6JxwYPv5DgjF00f5H4+MiuRkVlRqMsr+q+AYN07Z90vn11vOr6C9WhUEwoI1mXuhxD9nQTrQnSzo3WtVDZasDmcNFkcZCaZGDIgnnX7qjEZO54X7pJkNmCxOdzPH1+1l7c2lWJzOLlwmqd8XLhl7oWIuoCcdWvwx6DlY3dsGkb7guWVRy1nvQcWywoI1uX3V4j+ToJ1IbrRrrIGFjz4Wbe0HRdjwO
rwlMVzqiqVbaUa//XZXvf2WKP8Zy86wdoMqO0vV263agGyw6YtJGMPUw3GtUCJWzfUtQ86sh6l0elotdORa0gajBD9XueH9YQQIZXXt/LpjnJO/3vXlkG+eu6woNsTTAYGJJiw2J0+24LR62ShH9FBW16DPw2HPw6FjU+HPu6D2+F3ufDcRfDX0fDwVGj0XZnWHay/cyMcWtd9fXYJNskzGhM/o9lOOP4j6T1xgyCE6NMkWBeiG6zdW8VlT67H3smc9H9dUsSe357BNfOHcdVJQ332TR+cxr8uLeJAdRPeCxBLuouImlV/1JZGt7fCx/cFP8baBGsf1oLxne9pS4RX74EP7/Q9zmGFxgptyXB//qPw0RAsuPXfZmzn0wKAvGlB2omByZd0rl+RkgmmQgg/EqwL0Q28V/IMxtDOaHdDqx29TiHJbOTXZ4z22Td/VAYnDBvA3z/a7bPdqJcRdBEldQc9j/1Hyl2sTcG3V+7wfW63QGtt8GPtgYt2dVkkddZNCYHH+Dvzn0HaiYFT7gzcHk3+E0olZ12Ifk+CdSG6QX1r+FlzDjX8iHtDO+eDb9pLgsmAUS+/zqIHWRsjO85h654R9FAiqQYTE0GwnjYUhswNbCc2FU64tvP9a49UgxFC+JH/3YXoBvUt4UfW24nV2x2ZB0iN8/wnfvXcoRKsi+jx/wEN9gNriTRYt4Ctuet9ilSwYN0/AI6Ja78dnSEwfSZaJSDDXtdvJF3SYITo9+R/dyG6QaMl9Mj4CcPS2X7fIkZkhB7dW7m9nJJKT5pBildgPnVwKuBb5aXF5iAmRLAuQbzoEFUNTE8JFmxHPLJuBUtD1/sVqWABtX/980gmiipKYKDcE8G6VIMRQviR/8WF6AahRsbfvXYOz/14JmajngcvnBTy/E0Ha3ng3W3u58svn870IWlcM28Y0wanAb4TSlusToyG4Dnrdy4Z05mXIPorWwuoTt9twUbRIw3A7dbIA/toCFZnPeCYCKu6+AfKvVFnXS/BuhD9nfwVEKIbhArWR3mtDjouNzlsG4lmz6/npPwUXrpqls9+s9/IeqgR9BarI+h2IYIKFlhbG4FM322RBusOa+QpM9EQyah5pOUQAwLnHsgfl2owQgg/MrIuRDcINUG0yRr50upJ5vCBwcEaT2rC8+sOhA7WbRKsiw4IFoQH29aRNJieHFmPJFUl0nrpAcF6b6xgKsG6EP2d/BUQIgL1rTZ+9uxGPt9VyXWnjOD6BYXufRtKqrnjjS0cqmkhM8nEeUX5IUfWd5Y1MC43GZOh/XJs3iPrwTRbfIPwmUPSiYvR0+w3ki7BuuiQkCPrfiKeYNrTOesRjH5HWg6xN3LWpRqMEMKPjKwLEYGnVpfw+a5KAP7+0S5avQLgq54uZvvRBhotdvZUNPGH97eztyJ4DepzH13Lf9fsdz+fPiTN/XjhmEyumD3E/by9kfVR2Yk+z5PjjIz0SrNZPD6b3549nqKC1AheoRBtguanRxjAB2O39OzIutKJ/9ZCneOfLx5JPnxXBQTrUmddiP5OgnUhIpCf5lvq7Xdekz+rmqwBxyfFhh4Vf3fLEffjP583kfy0WAanx3HPmWN90mfaG1m/cGo+M4akkRpn5L9XTAfAe8HUq+YOZdmMQQwbGEFN6f7KFmZRHluLVhlFVcHW6ruvqVKbOFl/BOoOefZ7HxvsPO+2Xdd2Xaf+iFaT3GH3vQaA06Fdx3uE2m7VjvHpZ1u7rfXa8c3V2rENZW39bNGuUXcIrM3ayqJ1hzxfdgs0VwX2t/aA73F1h6DhaOj3zltrLVTvi+zY3hIqvaU30mCkGowQwo/8FRAiAjkpsT7PjXodL60/yJ7K4COG9581nkXjsrA5nNgdKqN/837Q4walx7HqpvkA6HSKT/pMYjsj6wa9jhevmoXDqaJvWxH1zZ/NRlVVbA613VVS+703fw6bnoe5t8Lcm333bXkN3rgGBhZqAWzdYbjgKRh+Cvzvl1D8pO/xsalw/nJ492YtMD7/P7DibqjZp20fsUA7zumE/34PSj4P3a+4AZAxWjsmfqB2/hs/1QJmvQmW/g1yJsNTS6GpQjsnY6w2Oly22a8xBWinqH8k3rtZ++qMrW92/frdTR8TfDXVXplg6jeS3hPXFEL0aTKyLkQEclLMfs9jeeTT3Ty+am/Q40trtf/4jXqdT4nFYHQ6BV1bYN1giXxk3UXvFZQ7nSpWhxOHU3W3KYJoKIOvnwanHT65P3D/K5drwduRTVCxHawN8Mw52si2f6AO0FID/z0TKneCpQ6eOVcLnK2N8Ox5nuN2vhc+UAdorvQc01QBL1ysBeqgLTC07v9g0wueQB2g/LsggTpEJVDvCQNH+6ai6E0Qn9G5thKzg283eN1wjzvXd9+US/yObft9988Xd5VuHDwnsP2RiyPvYzj+15SRdSH6PfkrIASwZk8l//liHy02B4vGZdPYaueznRWoqIzJTuaGhYU+xw9MNJGTEktJVfCVGdeXVHPFiUOC7gu3eunq3Z4UhI4uZrSzrIGFD34GwIiMBFbcMLedM/ox/wmPqhq4cE4wrXVdu64r6O6I1lrf5y012le0KHowJQZeB2BAIViDz78AoP5w+LbNKdoNi7OdKkiL/wKHN8CG/2j9mflTSBsCK+8GYxyMORO+fUkbdT5cHLqdebdBwsDg+y57Bz64DQbNhPHnQWM5bHkV5t4CBSdA3UHtU4C0oXDOE9o5/qPcMW0pZSMWwMyfwf7V2s9S+jDtNUSDVIMRQviRvwKi31NVlZte2kRpnZZf7B0wA3y5t5qyet/c47T4mIDUGG9LJuSE3JeXGvo8b0Z9x0bGvdNebA5nmCNF4AqdLZEtQd+TVU3C9SGaEzYzx8KcG+Dly3y3L3sZCheGP/fFS2DbW4HbT7wBTr1Le/x/J4cPsOffDoNna1+zr/PdN/xUz+NZP9O+3+21PsGQubBvVeAxweQVwY8+8Dr2Gu3L5YL/Bp7jHyib2iZwKwos+m3oa3WFBOtCCD+SBiP6lbV7qvj1a9/yyxe+5qX1BwGw2J3uQD2UdzZ7JoWOykpk5tB0zpyUg3emye/PGU9qnPYR9sgs30mdT142DZ0C8TF67lgcekXRn84bBsC43KQOV3HxHokvqWrGLgF7aP7VTSINfru8uE8UUpOsjdFdZMiUCDGJQbZHMDE5VClD73Nj2mmnvf1hr28M/7yr/NvrSl8jvqYE60IIX/JXQPQreysbeX6dFqTHxhiYmJ/Ciq0RVrUAdAr87xcnotcpzBkxkDW3nsLeykYGpcWRlxrH0ok5fHuoLmDUff6oDFbfejIJJkPYiaO3LBrF+UV5DEqLQ4kkLcOLyeB7791kdZAcK/fjQfkH55YGSIggR9ra1ZH1KOSQ21uDp6x0VkxC8MA8ksA01M+od/BvCnIj4C2Sm4KQ1/dLU4l2YBswst4DwbqMrAsh/MhfAdGvxBo9/7m3WO1c82wxe0LURA9mQl6Kzwh2VrKZrGTP5NN4k4FZw9KDnpudHFn6y9BOllr0z3GP6WDOe7/in87iHbzbA0txe
s7r4oh2uPzvjoi0bGIkTAnBA/NIAlNHiPeqp0bW/XPKO1NjvSOMEaRKdZUE60IIP/K/uehX7A7PyGZFo8UnUI+keEqkFVp6g9FvZL2jOe/9SsDIemPofT7HdXJk3elov+2OiGawbowLMbLezog4hL6x8Q7A2wv62xt5D8vvZ7yDn0a1y+6XHhft9oPxv+HQyX/TQvR38ldA9CsPf7Lb/dh/IqnTL0Nh9vB04vzKLra3qmhv8q+rrpfSjaGFy1kPF5B7l0vsCFf70co1d1ii0w5oAWhnc9YjGVlvb2S4S3ng3VyaMtSiVt0pXLkoIUS/JMG66FfCLRQ0IMF3slxDq505Iwb4bOvLI+smg44JeVqljOlD0jqc896vBMtZD7XPW2dHtF1BejSruESTf2Cu6D21xsMJFaxHMiof6tp9if/IuhBC9AIJ1kW/Em60ubbZxpOXTXM/b2i18+fzJzIpP8W9rS8H64qi8NTl03no+5P5v0um9nZ3+rZwOevhRr8bjoTeF457ZL0PlH4MxrXYj4veGFnKRyQj6+3p0sh6N9+Q9kawLjfZQgg/fTfyEKIbhAvW7U6V+lbPCqL1LTYSzUZOGJbOzoNHSVfqyFXjoDVXCzAcVjDGakvI1x/WloYPFszFJGjHKTpoLOuOl+WWCizNB1oPgQwKhuY/Ql57EKr3aY+r94Q+r3Z/565XuUsrc9hc1f6xxxJ7iHScjgTgXcpZ7+40mJb2j4k2SYMRQviRYH7vF4cAACAASURBVF0cl2qbrWwtrWf6kDQMbVVRjta1sv1o+JHN/671BGNVTVbKG1rJLF/NetOviVcssAHtC8CUDBcsh7dvgJp93fNCRM/4/M/aV3tKv+5c+y9e3Lnz+jqHLfj2HhtZ72ahbkaEEKIHSRqMOO7YHE4W/e1zlj3xFfe/sw2A6iYr8/78SbvnOv1GtVqtTtJ3vagF6v4sdfD02RKo93fJg3q7B8Hp/CZDpw0LftzwBdr37EmebblFkV1j/HnBt3sH4MNODt+GfwpOe/Rex0+4sGPndtSoMzyPc6Z077VcUgt65jpCiGOGjKyL486Oow0crddyQFbt1Kp3fLm3ilZb+yt63rCgkHv+t5Xd5Y0kmAxkJZuxpzigI6nGsWmej/aDpU3ED+yZes0iPFsLNJVrwXawPOHa/VrqiisvO6UtiFIUbfKlOQVGLoLBc+CD27TUGte/d0qB72NvjeVgb0uvMMRqjweO1mqGl23RtqcO1q5hjNNSZ0xJMGkZ7P0UKrZrx8Slg61Ze+4q96e2/YwXng6L/wwf3w8H18GEC2Dk6fD+r7XXPfEi2P4OZIyG0Uu1cxb9Dlb8RptcuuDeyN7DmddAxQ5orYMhJ8HWN2HqFb4rf45YCLN+Dge/AmszpA2BQbNg21sw46qO52hf/h6svAsKToCxZ0HDH7Trzr25Y+1EYuIyOLRBS3M7I4JPXqIhMQtO/xN89xqcdFPPXFMI0acpaj/Kj1MUpXjKlClTiouLe7srohttPlTH0oe/AGBsThLvXDuHN785zHUvfBPynGtPHk52SiwXTctnT0UjrxQf5rSxmUwelIr98fkYjmyMvANnP64FQwC/zQtc9fLiV2DEgo6+LCGEEEIcQ4qKiti4ceNGVVUj/LgyOBlZF8cdm9Mzgu7KV2+xOkIev2hsFjcsHOl+PjwjkVtPH+Vpw9bBVSe9UwBi4gOD9b6coyuEEEKIPiVqOeuKouQpivIfRVFKFUWxKIpSoijK3xRFSe1gO2crivKxoii1iqK0KoqyTVGU3yiKEkHRXyF8Vyk1tlV/abGFDtZj/RY+CtDR2tjtLQjTl+tKCyGEEKJPiUqwrijKMKAYuBxYBzwI7AWuA9YqipIeYTv3Aa8B04A3gH8C9cA9wEpFUWKj0V9xfLM7PCPrumgE6x1ddbK9BWFkZF0IIYQQEYrWyPojQAZwraqqZ6mqequqqiejBe0jgQfaa0BRlMnA7UAtMFFV1ctUVb0RmAk8DMwGbolSf8VxzGL3BOvr9lUD2qRTbzcuKHQ/jjWGCdZV1TeNJZKVGb1HzoNNnutSXWkhhBBC9CddDtYVRRkKLARK0EbCvd0FNAGXKIoS305TZ6MtR/eEqqp7XRtVbQbsbWirX/xUUZR2hkFFf+dfftHucGJ3+m77y4qd7sdhg3Vbi6fCht4EsRFkdbU3ci4j60IIIYSIUDRG1l1FdD9UVdWnNp6qqg3AaiAObYQ8nKy273v9d7S1U4k2ej++S70Vxz2bwzcwt9idhKt6FDYNxjtf3ZQQWb55e8d0tK60EEIIIfqtaFSDcZXR2Bli/y60kfdC4KMw7VS2fR/iv0NRlERgQNvTUUDoGnza8aFqM44KsV0cR+xO33rqVrsThzN0sG4ON7Ju8U6BSYhsVLy9YzpaV1oIIYQQ/VY0gvXktu91Ifa7tqe0087bwK+BKxVFeURV1RKvffejpcgAdKi6jOh/7H4j6/9du58PvisDIJFmzFhxopCsNDFraDqnZdZD5a7gjVXs8Dw2JUY2sq6TTC0hhBBCREdP1Fl3BdlhV19SVXWNoiiPA1cB3yqK8ipQjTaxdBrwHTAWCF3Ww9NW0OLzbSPuPbRmtOgtNofvyPqDK7UPfU7WbeQR498xKzbPzsPAsxE2HOnIuhBCCCFElEQjZ901cp4cYn+S33Ehqap6NfAjYCtwAXA1YAVOAza3HVbe6Z6KfmHxhGyM+sBUk0v1K3wD9Y5KzoPk/I6dU/RD3+exaZ2/vhBCCCH6nWiMrLvyBApD7B/R9j1UTrsPVVX/A/zHf7uiKE+0PVzfod6JficuxsDABBOlda0+2xOV5oBjq025pCVEUL4/OR9OuklbkbR6L+x8T9uePhyMcaA3at/n+lUXnfVzqNoLm56DjLFw5sOdfVlCCCGE6IeiEax/0vZ9oaIoOu+KMG0TQ2cDLcCXnb2AoigLgQJglaqqh7vSWdE/WP1SYQBiCBxV/8OQ5fzhoukda3zZC5Efa4yFsx/VvoQQQgghOqjLaTCqqu4BPgQGAz/z230PEA/8V1XVJtdGRVFGKYoSUJlFUZSkINuGAf9Cy1W/tav9Ff2DxRYYrMfTGrDtxW8kq0oIIYQQfVe0JpheA6wB/qEoyinANmAGMB8t/eV2v+O3tX33Tyz+t6IoBUAxUAMMB5YCRuBKVVU7PTov+o8mi50Giz1ge4ISGKwH/ggKIYQQQvQd0Zhg6hpdnwosRwvSbwSGAf8AZqmqWhVhU28DNrTJpTcBJwCvAlNUVV0ejb6K498Tn+8Luj0xSLD+h3NljS0hhBBC9F1RK92oqupB4PIIjw06nKmq6lPAU9Hqk+if/BdFOnNSDredPpLYBwOD9QunDeqpbgkhhBBCdFhURtaF6EtsfosiJZmNZJoC02KEEEIIIfo6CdbFMeOrvVX85cMdHKrxLcHodKq8sO4AT3y+l1abg5c3HPTZn2g2gLWxJ7sqhBBCCBEVPbGCqRBdVtdi4/Ll62m2Ovj2UB1PXeEpt/jJjnJufU1bM2tDSQ1VTVafcwckmMAiwboQQggh
jj0SrIu+x+mEql3gdLg3fbOzgjxbCShwZNdBfv/UQW5cMBKjXseTr31JoaIF6Hu3HqTQa0bEL+YPY+mIFijbhhBCCCHEsUaCddG3OOzwr3lQttln81xgrslrwz606vvAMwAmglvT9iWEEEIIcQySnHXRtxxaHxCod5upP+qZ6wghhBBCdJKMrIu+pbXO8zgmAZLzAShraKW22ebepVMUhg2Mp8nqoKy+FYdT9W8JgOxkM0lmo2dD/ABw2CBlEJx8R7e8BCGEEEKIaJFgXfQt3lVbRiyA85cDcPtTG1i5rczn0IQKA41BVir19vCCySyZkBPtXgohhBBC9AhJgxF9i6XB8zgmwf2wtLYl4ND2AnUAw/+3d+dxclV13se/v947SS8J2ROykpCwjEAwLGFHYkBHXFAZR7aRcVBGkcFHfcRRYAYdHWUQlFGGAWTcRvRRRsQRZIcIAgZFCCQkBEL2PZ3el/P8cW9V3dp6qa6qe6vq83696nW3U7dPcrs63z753XOr+BYHAACliySDaAmOrNc3xVevPGuhlsweP+LTnbFocj56BQAAEArKYBAtwfnQAyPrZx02RXs7evTc63tGdLquvn7V1fA7KQAAKE2kGERL0si6F9a3t3WpratXbV1Dl72kyuU9AAAAUcHIOqIlpWZ95as7deHtv1dtdZU6e/uzvy+DMxdNVk2VDd0QAAAgohhZR7Sk1Kx/6r+fV9+AGzSoX3XWQi2a2pS2/7aLjtWU5oZC9BIAAKAoGFlHtARq1tfvN21v6046PKO1UZv2dmrxtGat3rJfFxw/W39z0lyduXiKbnnkVZmZ+gcG9MG3zpIZo+oAAKC0EdYRnv5eaftqSYEHGh1IzKV+9X2vSTo86S2fOOMQnb90VtqpDpverG996JgCdRQAACAchHWEo2u/dMvx0v5NWZsccI1p+6a3pu8DAAAoV9SsIxzrHho0qPe5Km11E9L2E9YBAEAlYWQd4ejal1hvHC+1zIxvtvdV6ctblmiHWtPeNr2VG0YBAEDlIKwjHMFZX/7ig9LZX41vPv/qTv3gtqfT3vL5cxZpTB3fsgAAoHJQBoNwZHlSqSR19mSepvEjJ80rZI8AAAAih7COcPQEHn5UnxLWs8ypXs0DjgAAQIWhpgDhSBlZX7/jgK755Uuac9AY/ebFreH1CwAAIEII6whHypNKr7r7j1r1xl49Fl6PAAAAIocyGIQjZWR91Rt7B21+/XuOKHCHAAAAooewjnAkjayPy97O99fHzS5gZwAAAKKJsI5wdAduMK1r0vxJY7M2/cyKQ4vQIQAAgOihZh3F075L2rPBW+/YndhfP05tXXvSmn/z/KN07lEzitM3AACACCKsozhee1z6/nul/p70Y3Xj1NbVl7a7pbG2CB0DAACILspgUBx//lmWoN6k3vrWjHOrnzh/YhE6BgAAEF2MrKM4uvYl1sfPlRrHS7WNemri+3T+NY+kNf+fv1+muhp+lwQAAJWNsI7iCM7+suIr0qFnS5K2Pb9J0vNpzae1NBapYwAAANFFWEdxpMyrvqe9Rw+8tE3fevjVjM2bGvjWBAAAIBGhOHoCUzXWj9MnfrRKT7y6M2vzhtrqInQKAAAg2igKRnEkjaw3DRrUl86dUIQOAQAARB8j6yiOlCeW1lVXqad/IK3ZdecerhWHTy1ixwAAAKKLsI7iCIysu7qxGYO6JF14wpwidQgAACD6KINB4fX3SX2d/oapr3pM1qZrtrVlPQYAAFBpCOsovJ7kmWB6B1zWplv2dRWhQwAAAKWBsI7CS6lX7+3LHtbfMrOlCB0CAAAoDYR1FN7DX06s143LWq/+nxcdq9YxdUXqFAAAQPQR1lF4m55LrNeNUW+GsD5/0liduXhKETsFAAAQfYR1FF5PR2L91M9lDOu11XwrAgAApCIhofCCTy89eGnGsH7ekplF7BAAAEBpIKyj8LqTZ4OpqUr+tjtiRrMuPXlekTsFAAAQfYR1FFZftzTQ661X1Ug19Zozcax+fcXJ8SaDzQ4DAABQyQjrKKyUUXWZSUquUc9UFgMAAADCOgotWK9e3xRfrQuE9WxTOQIAAFQ6wjoKK3VkXdKNv12jd9/yZHz3m3s6dfsTrxW7ZwAAAJFXE3YHUOZSnl66bscB3fjbtWnNVm3cW8ROAQAAlAZG1lFYKSPrG3a2Z2xWW21F6hAAAEDpYGQd+dO+U9q+2ruJdPrRUt3YlJr1carJ8vCjOh6KBAAAkIawjvzYvEq67azENI2NE6RPrkoZWW9StvFznmAKAACQjoSE/Fj9y0RQl6TO3dL6R6Tu5JH1rt7+jG8nrAMAAKQjISE/uvan7+tuS77BtG6cOrOF9Rpq1gEAAFIR1pEfwVAe3Jcysj62LnPlVXcvc60DAACkIqwjP4KhPL7vQMrIepPedtgUfXbFouL1CwAAoIQR1pEfGUfW25JvMK33HorUyxNLAQAAhoWwjvzozhDW00bWCesAAAAjQVhHfgyzZl2SfvLsxrSmR89qLVTPAAAAShZhHfkxrJH1Jv1u3S4dOrU5remhU5sK2DkAAIDSRFhHfvRkuME0Q836z/7wph5bsyOtaW+fK2DnAAAAShNhHaPn3LBr1rPNs95DHTsAAEAawjpGxznp9ScllyGEt+9IeljSc9v61d7dl/E03HQKAACQLvMTaoDh+sXHpD/+KPOxfck3kn7gjj+pX9UZmz69freOn3dQvnsHAABQ0hhZx+i8+Ivk7WlvkRpa0pq9PjA5LajPmjAmvr788CkF6R4AAEApy9vIupnNlHSdpBWSDpK0RdIvJF3rnNszgvOcJOn/SHqLpKmStkv6s6SbnHP/m6/+Ig/6e6W+zsT24e+VTvqU1LZNevb2+LHdffX69Nrj0t5+64VL9Ms/btaiqc1aPC19hhgAAIBKl5ewbmbzJa2UNFnSPZJelrRU0hWSVpjZMufcrmGc52OSbpHULunnkt6UNFPSeyWdbWZfcM5dn48+Iw+S5lBvkd5/h7c+TdLC5fFDEyQ987lfJb21rqZKh05p0qK3Lyp8PwEAAEpUvkbWb5EX1D/pnLs5ttPMbpB0paTrJV022AnMrFbSVyR1SVrinHslcOzLklZJutrMvu6c685TvzEaPcnTMqba19GrqirpQIabSsfWVcvMCtk7AACAkjfqsG5m8yQtl7RB0rdTDn9J0kclXWBmVznn2gc51QRJLZL+FAzqkuScW21mayQdKWmcJMJ6FHQnT8sYdOtj6/Tl+17O+tY9Hb2F6hUAAEDZyMcNpmf4y/udc0nz7znn2iQ9KWmMpOOHOM92STskLTSzBcEDZrZQ0gJJzw+nnAZFMsjI+mBBXZJWHD61ED0CAAAoK/kogznUX67JcnytvJH3hZIezHYS55wzs8slfV/Sc2b2c0mbJc2Q9B5JL0o6fzgdMrPnshyiQDqfgjXrdellMNmcunCS/vEvDytAhwAAAMpLPsJ6bJ6+fVmOx/a3DnUi59zdZrZZ0o8kXRg4tE3SHZLW59pJFEDSyHrTsN7y2lfOoVYdAABgmIoxz3osmbk
hG5p9WNJvJT0uabG88pnF8kbkvyXpx8P5gs65JZle8mapQb4MUrOeDUEdAABg+PIR1mMj5+lPwvE0p7TLyK9Lv11eucsFzrmXnXOdzrmXJV0g6TlJ7zez00bfZeRFlpr13v6BDI0BAAAwUvkI67GZWxZmOR67WTRbTXvMckm1kh7NcKPqgKTH/M0luXQSBZClZr2ztz9j82NmDVkJBQAAgIB81Kw/7C+Xm1lVMGibWZOkZZI6JT01xHnq/eWkLMdj+3ty7SjyLMPI+r6OXj33xm5JUnWVqabKNH/SOLV19+qr7/uLMHoJAABQskYd1p1z68zsfnkj45dLujlw+FpJYyV9NzjHupkt8t8brCF/3F+e5z/46E+B9kdJOk9e3ftDo+0z8qSnI7FeO1aS9Lv1u3TZ973JeE4/dLK+8+FjVFNdJecc9eoAAAAjlK8nmH5c0kpJN5nZmZJWSzpO0unyyl+uTmm/2l/G05tz7vdmdoekSyQ940/d+LqkOZLeLalO0o3OuRfz1GeMVn/g2VQ1dZKkzXs747uqLHFXMUEdAABg5PIS1v3R9WMlXSdphaRzJG2RdJOka51zu4d5qo/Iq02/WNLbJTVJ2i/pCUn/4Zwb1mwwKJL+wFNIq70qpuvufSm+6/6Xtun1Xe06ZPLwpnUEAABAsnyNrMs5t1HeqPhw2mYcZnXOOUl3+i9EXV9gZL26Ti+8mT7hz/6uviJ2CAAAoLwUY551lKv+wL2+NXXauKcjrcnqLfuL2CEAAIDyQlhH7oJhvbpOnT3pUzZe/fM/F7FDAAAA5YWwjtwlhfX6rPOrAwAAIDeEdeSuLxjWa9VFWAcAAMgrwjpyl1SzXp+xDKauhm8xAACAXJGkkLvgPOvVtRnLYG750DFF7BAAAEB5ydvUjahAKfOsv+foGTpiRos6e/o14JzmTRqnY2a1htc/AACAEkdYR+5S5llfMKlJC6bwACQAAIB8oQwGuQuOrNfUhdcPAACAMkVYR+76k0fWAQAAkF+EdeQuZZ51AAAA5Bc168hdyjzrH73rWa3eul+NtdW64QNH6YgZLeH1DQAAoAwQ1pG7lHnWt+zr0sbdnd6hARdSpwAAAMoHZTDIzcCANBC4wbQqeZ71xrrqEDoFAABQXgjryE1KUFdVVdITTBtrCesAAACjRVhHboJzrNd4N5fu70wE+LH1VFgBAACMFmEduUl6emmt9nf1qq27T5JUX1Ol8WNqQ+oYAABA+SCsIzdJc6zXa8vervjmjNZGmVkInQIAACgvhHXkJmmO9Tq9uacjvjm9tTGEDgEAAJQfwjpyE5hjvWOgWh/53rPx7emtDWH0CAAAoOwQ1pGbwMj6vp7kkpfZB40tdm8AAADKEmEduQnUrE+b0KxX/nmFTlk4SYunNeuDbz04xI4BAACUD+bXQ26SZoOpU31Nte76m6Xh9QcAAKAMMbKO3GSYZx0AAAD5RVhHbgIj66s2t2tf4IFIAAAAyA/COnITqFnf2Sl19/arf8CF2CEAAIDyQ1hHbgKzwfSqRh/9r+dC7AwAAEB5IqwjN4F51ntUo5e37ld1FU8tBQAAyCfCOnITHFl3NWptrAuxMwAAAOWJsI7cBGrWe1SrpgZmAQUAAMg3wjpy0tnZFV/vUQ1hHQAAoAAI68jJrn374+teWK8NsTcAAADlibCO3Awk5lXvZWQdAACgIAjryEl/T6Bm3dWquZGRdQAAgHwjrCMnA32JsM7IOgAAQGEQ1pGTgd7gbDA1GuDppQAAAHlHWEdOBlIeinTKwkkh9gYAAKA8EdaRExcog2loaNRJh0wMsTcAAADlibCOnLjAE0zfMnuyzCzE3gAAAJQnwjpyExhZr6qtD7EjAAAA5Yuwjtz0J+ZZr6mrC7EjAAAA5Yuwjpw01w7E1yc0N4fYEwAAgPJFWMfI9XRo2o4n4pvHzpscYmcAAADKF2EdI/ejDyZv11CzDgAAUAiEdYxMf6/02mOJ7apaacK88PoDAABQxgjrGJnutuTt838oNU0Npy8AAABljrCOkek5EF/dUzNJGyeeFGJnAAAAyhthHSPTnQjrO3vq9L2VG8LrCwAAQJkjrGNkAiPr7WrQuIaaEDsDAABQ3gjrGJlAzfoB16CmhtoQOwMAAFDeCOsYmaSR9UY1MbIOAABQMIR1jEygZv2AGtRMWAcAACgYwjpGJjiy7hopgwEAACggwjpGJlCz3q4GymAAAAAKiKSFoTknbXhC2r1e2vh0fPcBRtYBAAAKirCOoT13h3TvlWm7GVkHAAAoLMpgMLR1D2XcvXpgNmEdAACggEhaGFpgBhgd8japaZq+t3mGxjacovqa6vD6BQAAUOYI6xhaYAYYnfIZadZxOmnHAf31hDHh9QkAAKACENYxtODIev04SdL8SeNC6gwAAEDloGYdQwuMrN/61Hb98Ok3tHVfV4gdAgAAqAyEdQwtMLf6v/9uqz7/8xf0+q72EDsEAABQGQjrGJxzyU8tVaMkaVJTfVg9AgAAqBiEdQyur0sa6JMk9apGPfIegsTDkAAAAAqPsI7BBW4uPeAa4uvMrw4AAFB4hHUMridRr94uL6zXVVepoZb51QEAAAqN4VFkNtDvPbn0jafiuw44r16dUXUAAIDiyNvIupnNNLPbzWyzmXWb2QYzu9HMxg/z/aeZmRvG6+B89RmDWHmz9IPzpMe/Ht8VG1lvrGNUHQAAoBjyMkRqZvMlrZQ0WdI9kl6WtFTSFZJWmNky59yuIU6zQdK1WY4dKem9kl50zm3MR58xhNceTdv14sAcSVIzN5cCAAAURb7qGW6RF9Q/6Zy7ObbTzG6QdKWk6yVdNtgJnHMbJF2T6ZiZ/chfvTUPfcVwBG4s3Tr9LN31+nj9sP9MSVLrGMI6AABAMYy6DMbM5klaLm9k/Nsph78kqV3SBWY2NsfzHyTpPZI6Jf1X7j3FiATmVl8581Ld0v9u7VWTJGrWAQAAiiUfNetn+Mv7nXMDwQPOuTZJT0oaI+n4HM9/saR6SXc75/bk2kmMUGBkfWNHco36ifMnFrs3AAAAFSkfQ6SH+ss1WY6vlTfyvlDSgzmc/1J/+d3hvsHMnstyaFEOX78yBaZsfP1A4ne6m/7qaL3rLdPD6BEAAEDFycfIeou/3JfleGx/60hPbGanygvYLzrnVubQN+QqMLL+2v7Et8mM1oZMrQEAAFAAxSg+Nn/pcnjvR/3lsEfVJck5tyRjR7wR92Ny6Edl6euWBnq99apavbGvL35oemtjSJ0CAACoPPkYWY+NnLdkOd6c0m5YzGyCpPeJG0uLLzCq7urHaVd7jySppso0uYmRdQAAgGLJR1h/xV8uzHJ8gb/MVtOezUXybiz9iXNuby4dQ44C9er9NWM13p+qcWpLg6qrLNu7AAAAkGf5KIN52F8uN7Oq4IwwZtYkaZm80fGnMr15EH/rL5lbvdgCI+s1jc1addVydfT0aU9Hb4idAgAAqDyjHll3zq2TdL+kOZIuTzl8raSxku5yzrXHdprZIjPLOjOLmZ0sabGkP3NjaQi6EyPrqhsnSRpTV6
MZ1KsDAAAUVb5uMP24pJWSbjKzMyWtlnScpNPllb9cndJ+tb/MVlMRu7GUUfUwBB6IpPpx4fUDAACgwuWjZj02un6spDvlhfSrJM2XdJOkE5xzu4Z7LjMbL+k8cWNpeDKMrAMAAKD48jZ1o3Nuo6RLhtk2612K/lNKqbcIU2Bk/YH1Hdq8coPOWDRZ01oaVFOdl9/vAAAAMAwkL6QL3GD6Znu1vvQ/L+rkrz2s9TvbB3kTAAAA8o2wjnSBkfV2JeZVP2QSJTEAAADFRFhHukDNervzKpK++M7DVMUc6wAAAEVFWEe6wMj6AX9kvdV/MBIAAACKh7COdIGa9XbnhfWmBsI6AABAsRHWkS6pZt0rg2lqyNvEQQAAABgmwjrSBWrWY2UwhHUAAIDiI6wjXXBk3b/BtJkyGAAAgKJjuBQJfd3Sy7+SNq+K72JkHQAAIDwkMCQ88i/SEzck7YqNrI+r51sFAACg2EhgSHjjqaTN3rFTdcNfn6UDvVWqqaZiCgAAoNgI60jo70mszz1VtWd/VSdMnhZefwAAACocw6VI6O9OrC//J2ny4vD6AgAAAMI6AvoCI+vVdeH1AwAAAJIog0FQoAzm879co9qJ/TpkSpMuOH52iJ0CAACoXIysIyEQ1h9Zu1ff+93r+skzG0PsEAAAQGUjrCMhENZ75D0EaXprQ1i9AQAAqHiEdST0BcN6tSRpemtjWL0BAACoeIR1JGQYWZ9BWAcAAAgNYR0JgakbE2UwhHUAAICwENbhGeiX3IAkqV9VGvC/NQjrAAAA4SGsw9MXHFVPzOjJDaYAAADhIazDE6hX73VeWK+rqdLEsfVh9QgAAKDiEdbhCYT1bn9kfXpLg6qqLKweAQAAVDzCOjzBkfVYWKdeHQAAIFQ1QzdBRQjUrDePHaurTlmoKS3UqwMAAISJsA5Pf298ddyYMfrEmQtC7AwAAAAkymAQE5hjXTV14fUDAAAAcYR1ePoSNeuqJqwDAABEAWEdnsANpru6pI27O0LsDAAAo3pPvwAAGbtJREFUACTCOnwbd+yNr6/e3qUHV28LsTcAAACQCOvwfeeh1fH1HtUybSMAAEAEMBtMpetuU9fzd2vpgQelam+Xq6rVW+dMCLdfAAAAIKxXvAe+qIZnb9e51YldJx46XY1juckUAAAgbJTBVLo3nk7b1TjnrSF0BAAAAKkYWa9wA91t8d/Y7uh7u+pnH6sPLf1oqH0CAACAh5H1Ctd5YF98/ea+92jrnHdLNfUh9ggAAAAxhPUKV9vfHl9vV4PeOmd8iL0BAABAEGG9kvX1qE593qqqdf15x+qkQyaG3CkAAADEULNeyXoOxFdrGpp03rEHh9gZAAAApGJkvZJ1tyXW65vC6wcAAAAyIqxXsmBYrxsXXj8AAACQEWG9gu3dtzuxUU9YBwAAiBrCegX71TNr4+tbOrl9AQAAIGoI6xWsbs+a+LpjZB0AACByCOuVyjm9f9d34pt1Y5pD7AwAAAAyIaxXqBde25K0XTP7+JB6AgAAgGwI6xXqZ0+9nLTddMIlIfUEAAAA2RDWK9Tu3YmZYHbUTlN1TW2IvQEAAEAmhPUK1bZ/T3y9uXl8iD0BAABANoT1CtQ/4NTVvi++XcvNpQAAAJFEWK9A29u61Og649tV9U0h9gYAAADZ8CScMnf/i1s14KQZrY1aMGWcGmqr1VhbrXMXN0vr/EbMsQ4AABBJhPUy94371+iVbW2SpHs/cZKOmNGi1jF1GqvEyLrqCOsAAABRRBlMGXPOadPeRCif0doYX192cEOiIWUwAAAAkURYL2P7u/p0oLtPktRYW63WMYnpGYM164ysAwAARBNlMGVo095O3froOq3f2R7fN721QWbmbbTvkh79auIN1KwDAABEEmG9DH39N6/o56s2Je2bHiiB0X2fluQS24ysAwAARBJlMGXola1taftWHDE1sbHp2eSDs44vcI8AAACQC0bWy1Bbd298/Z/OPVxHzxqvw6cHHnzUfSCxfv6PpCmHF7F3AAAAGC7Cehlq6+qLr5995DRNHFef3KAnENbnn16kXgEAAGCkCOtl6Oa/Olr7OnvV1tWnlsba5IN9PVJ/j7du1VJNQ/oJAAAAEAmE9TJ08oJJ2Q8GR9Xrx0mxGWIAAAAQOdxgWmm6Azef1vEwJAAAgCgjrFea1JF1AAAARBZhvdIEZ4JhfnUAAIBIy1tYN7OZZna7mW02s24z22BmN5rZ+BzOdaSZ3WVmG/1zbTezR83swnz1t1w9sXanjvmnB3Tqvz6sL/zihfQGPYEyGEbWAQAAIi0vN5ia2XxJKyVNlnSPpJclLZV0haQVZrbMObdrmOe6WNJtkjok3Stpg6RWSUdIOkfSXfnoc7na29mj3e3e67BpPekNGFkHAAAoGfmaDeYWeUH9k865m2M7zewGSVdKul7SZUOdxMyOlxfU/yxphXNua8rx2oxvRNzu9kRAbx1T691Q+vwPpb1veDt3vJJoXM8NpgAAAFE26rBuZvMkLZc3Av7tlMNfkvRRSReY2VXOufYhTvc1SdWSPpwa1CXJOdeb/hYEbdrbGV+f3tIoPfo1aeVNmRszsg4AABBp+ahZP8Nf3u+cGwgecM61SXpS0hhJxw92EjObKelkSc9KetHMTjezT5vZVWZ2pplxM+wwbN7bFV+f3toobV6VvfHsE4vQIwAAAOQqH2Uwh/rLNVmOr5U38r5Q0oODnOetgfYPSTot5fgLZvZe59yrQ3XIzJ7LcmjRUO8tdZuDI+utjcnzqh/3Mal5urc+5XBp3ulF7h0AAABGIh9hvcVf7styPLa/dYjzTPaXH5C0U9J75YX7SfLKaS6Q9CszO9I5l+HOSUjJYX1Ga2PyvOrHXiJNOjTDuwAAABBF+brBdDCx59m7IdpVB5aXOufu9bf3m9lFkhZLOlbS+yT9aLATOeeWZOyIN+J+zHA6XYp6+we0bb9XBmMmTWmpZ/YXAACAEpaPOvDYyHlLluPNKe2y2eMvuyXdFzzgnHPypoSUvCkhkcG2/V0a8H8lmjSuXvU11TyxFAAAoITlI6zH5gJcmOX4An+ZraY99TxtqTeq+mJhvnEEfasoaTeXDgwkh3VG1gEAAEpKPsL6w/5yeeqMLWbWJGmZpE5JTw1xnj/Jq1WfaGZTMhw/wl9uyL2r5S2tXr03MFNm7RipqjrDuwAAABBVo65Zd86tM7P75c34crmkmwOHr5U0VtJ3g3Osm9ki/70vB87TZ2bflXS1pK+Z2SWxEXYzO1LSxZL6JP10tH0uV+96y3SdeMhB2ry3S/U1VdSrAwAAlLh83WD6cUkrJd1kZmdKWi3pOEmnyyt/uTql/Wp/aSn7vyzpTEkXSjrSzB6RNxvM+yQ1SLpqOFM3VqqqKtPkpgZNbmrwduzcljhIvToAAEDJycuDhpxz6+TN1HKnvJB+laT5km6SdIJzbtcwz9MhL6xfK+9BSpdLepe8XwTOcc7dkI/+lpu+/kwl/kqeY52RdQAAgJKTt6kbnXMbJV0yzLapI+rBYx2SrvFfGIYrf/JHPfrKdk1vb
dQ/vvMwLTtkondg++pEo/qmcDoHAACAnBVjnnUU2KY9Hdrf1af9W9tUZYHfg+79VGKdkXUAAICSQ1gvYS9t3q9vP/Kq/vDG3vi+Ga2BmS2Dk/PMPLaIPQMAAEA+ENZL2Bfv+bOefX1PfDv+1FJJ6u+T+hLzruukfyhy7wAAADBaebnBFMXnnNNLW/Yn7TvnyGneU0slqSdwc2l9s1TN72UAAAClhgRXovZ19qqjpz++/YvLl+kvZrQkGjDHOgAAQMkjrJeoTYGnlR4yeZyOOrg1uUFPIKwzEwwAAEBJogymRG3em6hHnx68qTQmOLLOA5EAAABKEmG9RG0OjKzPaG1Ib9AdqGenDAYAAKAkUQZTos45cprmTxqnzXs7NeugMekNKIMBAAAoeYT1EjWpqV6TmuqzN+AGUwAAgJJHGUy56qFmHQAAoNQR1kvQ1n1duuf5Tdp1oDt7o+7APOuMrAMAAJQkwnqJ2XmgW2d84xG9vqtDE8bWZW/IyDoAAEDJI6yXmCdf3amOnn7d8MAazf2/9+lzP/tT5oZJNevcYAoAAFCKCOslJji/uiR95KS5mRsysg4AAFDyCOslJji/+hfesVgLpmQZNadmHQAAoOQR1kuIc05PrtsZ356R6cmlMYysAwAAlDzCeolwzuniO57R+h3t8X3TBwvr1KwDAACUPMJ6iXhzT6ceXbMjvl1XXaXZmZ5cGsPIOgAAQMkjrJeIfZ29SdvXnXu4WscMMnUjTzAFAAAoeYT1EtHW1RdfXzpngs5fOmvwN/QEbjCtpwwGAACgFNWE3QEMT3Njjc4+Yqrauvq0aOoQ4ds5RtYBAADKAGG9RBw+vUX//uElw2vc1yW5fm+9uk6qGaRcBgAAAJFFWC8321+WHv9GYptRdQAAgJJFWC83d18s7Vid2GYmGAAAgJLFDablpL8vOahL0rzTwugJAAAA8oCR9RJxz/Ob9MrWNjU31urMRZO1YEqGm0yDM8BI0vk/lA45qzgdBAAAQN4R1kvAE2t36oofPx/fnt7amDmsB2eAaZouLXpHEXoHAACAQqEMpgRc/YsXkrZbGmszN+SppQAAAGWFsB5xAwNOb+zuiG/PaG3UcXMnZG7M3OoAAABlhTKYiGvv6ZNzie1H/s9pqq3O8jtW0lNLCesAAACljpH1iNvf1Rdfn9rckD2oSykj60M85RQAAACRR1iPuAdXb4uvNzUM8R8h1KwDAACUFcJ6hO1p79EX73kxvj1kWKdmHQAAoKwQ1iPssbU7krabGrLMAhNDzToAAEBZ4QbTCOvuG0jaTqtX3/O69PtbpY5d3vbWwBSP1KwDAACUPMJ6hG3b15W0/bbFk5Mb/Ooq6dUHMr+ZkXUAAICSRxlMhD30yvb4+nXnHq7zl85KbrDtRWU17/QC9QoAAADFwsh6RK3Z1qZVb+yNb09vaUxvFJz95R3fkGrHeOsHHycdNL/APQQAAEChEdYj6rE1yTeXHjGjJbmBc8lh/ZiLpWouJwAAQDmhDCaiNu3tjK+fsnCSprY0JDfo7ZCcfwNqTQNBHQAAoAyR8CLqsysW6cIT5mjTnk7NmjAmvQFzqgMAAJQ9wnqR7Ovo1as72oZuKKmhtlqHT2/R3IljNXfi2MyNeFopAABA2SOsF8kf3tijS+58ZlhtF09r1q+vOHnwRt2B4M+c6gAAAGWJmvVSxcg6AABA2WNkvUiaG2t0zKzWYbWdk630JYiadQAAgLJHWC+SJbMn6P99fFn+TsjIOgAAQNmjDKYUvfBT6WcfSWwzsg4AAFCWCOulZscryUFdkuq5wRQAAKAcEdZLzfbV6fsWnFX8fgAAAKDgqFkvNcFa9aoa6e8el6YcFl5/AAAAUDCMrJea4CwwSy4hqAMAAJQxwnqp6Qk8DIladQAAgLJGWC813UzZCAAAUCkI66UmWLNex8g6AABAOSOslxpG1gEAACoGYb3UJI2sE9YBAADKGWG91HQHbzAlrAMAAJQzwnqpoWYdAACgYvBQpGL47TXS7vX5OdfOtYl1RtYBAADKGmG9GNY/Km3+Q/7PS806AABAWaMMplRNOUJqmRl2LwAAAFBAjKwXw5lflLr25u98NQ3S3FMls/ydEwAAAJFDWC+G+aeH3QMAAACUIMpgAAAAgIgirAMAAAARlbewbmYzzex2M9tsZt1mtsHMbjSz8SM4xyNm5gZ5NeSrvwAAAEDU5aVm3czmS1opabKkeyS9LGmppCskrTCzZc65XSM45bVZ9veNqqMAAABACcnXDaa3yAvqn3TO3RzbaWY3SLpS0vWSLhvuyZxz1+SpXwAAAEDJGnUZjJnNk7Rc0gZJ3045/CVJ7ZIuMLOxo/1aAAAAQCXJx8j6Gf7yfufcQPCAc67NzJ6UF+aPl/TgcE5oZh+UNFdSj6TVkh5yznXnoa8AAABAychHWD/UX67JcnytvLC+UMMM65J+nLK93cwud879NIf+AQAAACUpH2G9xV/uy3I8tr91GOe6R9LXJa2StEvSbEkXSbpK0n+b2Tudc78e6iRm9lyWQ4uG0QcAAAAgEorxBFPzl26ohs65f0vZ9Yqkz5vZZkk3S/qypCHDOgAAAFAO8hHWYyPnLVmON6e0y8Vtkv5N0lFm1uScaxussXNuSab9/oj7MaPoBwAAAFA0+Xgo0iv+cmGW4wv8Zbaa9iE557okxQI6s8oAAACgIuQjrD/sL5ebWdL5zKxJ0jJJnZKeyvULmNmhksbLC+w7cz0PAAAAUEpGHdadc+sk3S9pjqTLUw5fK28k/C7nXHtsp5ktMrOkmz3NbJ6ZzUg9v5lNlHSHv/lj5xxPMQUAAEBFyNcNph+XtFLSTWZ2pry50Y+TdLq88perU9qv9pcW2HeKpNvM7FFJ6yTtljRL0jny6uGflfSZPPUXAAAAiLy8hHXn3DozO1bSdZJWyAvYWyTdJOla59zuYZzmOUnfl7RE0lHybkxtk/SCpJ9I+q5zricf/QUAAABKgTk35IyKZcPMdjU2Nk5YvHhx2F0BAABAGVu9erU6Ozt3O+cOGs15Ki2svyZvxH5Dkb90rD7/5SJ/XRQX17kycJ0rA9e5MnCdy1+Y13iOpP3OubmjOUlFhfWwxJ6omm3+d5QHrnNl4DpXBq5zZeA6l79yuMb5mLoRAAAAQAEQ1gEAAICIIqwDAAAAEUVYBwAAACKKsA4AAABEFLPBAAAAABHFyDoAAAAQUYR1AAAAIKII6wAAAEBEEdYBAACAiCKsAwAAABFFWAcAAAAiirAOAAAARBRhvYDMbKaZ3W5mm82s28w2mNmNZjY+7L4hmZkdZGaXmtnPzexVM+s0s31m9oSZfcTMMn5WzOxEM7vPzHabWYeZ/cnMPmVm1YN8rXea2SP++Q+Y2dNmdlHh/nQYipldYGbOf12apc2Ir5uZXWRmv/fb7/Pf/87C/CmQiZmdbGY/M7Mt/s/hLWZ2v5mdk6Etn+cSZGbv8K/pm/7P7vVmdreZnZClPdc5gszsPDO72cweN7P9/s/j7w/xnqJcy9B/ljvneBXgJWm+pG2SnKRfSPoXSQ/52y9L
OijsPvJKul6X+ddms6QfSPqKpNsl7fX3/1T+Q8QC7zlXUp+kA5L+U9K/+tfWSbo7y9f5e//4TknflvRvkjb6+74e9t9DJb4kHexf5zb/Olyaj+sm6ev+8Y1++29L2uXv+/uw/9yV8JL0Bf/ve4ekOyR9WdKtkp6R9LWUtnyeS/Al6auBa3Cb/2/tTyX1SBqQ9GGuc2m8JD3v/522SVrtr39/kPZFuZZR+Fke+sUp15ek3/gX8hMp+2/w938n7D7ySrouZ0j6S0lVKfunSnrDv2bvC+xvlrRdUrekYwP7GySt9Nufn3KuOZK6/A/5nMD+8ZJe9d9zQth/F5X0kmSSfitpnf+DPi2s53LdJJ3o739V0viUc+3yzzenUH8uXk6S3u9fgwckNWU4XhtY5/Ncgi//53O/pK2SJqccO92/Buu5zqXx8q/ZAv/n8mkaJKwX61pG5Wc5ZTAFYGbzJC2XtEHeb2BBX5LULukCMxtb5K4hC+fcQ865XzrnBlL2b5X0HX/ztMCh8yRNkvRj59yzgfZd8kbzJOljKV/mbyTVS/qWc25D4D175I34Sd4IP4rnk/J+UbtE3ucyk1yuW2z7er9d7D0b5P1MqPe/JgrAL1v7qqQOSR9yzrWltnHO9QY2+TyXptnyynmfds5tDx5wzj0sb4R2UmA31znCnHMPO+fWOj8ND6FY1zISP8sJ64Vxhr+8P0P4a5P0pKQxko4vdseQk9g/6n2BfbFr/L8Z2j8mLyScaGb1w3zPr1PaoMDMbLG8/zL/pnPusUGa5nLduNbhOlHSXEn3Sdrj1zR/1syuyFLHzOe5NK2VV+6y1MwmBg+Y2SmSmuT9z1kM17l8FOtaRuL6E9YL41B/uSbL8bX+cmER+oJRMLMaSRf6m8EPa9Zr7Jzrk/SapBpJ84b5ni3yRnZnmtmYUXYbQ/Cv63/JK3H6/BDNR3Td/P8xmyHpgH88FZ//wnurv9wm6Q+S7pX3i9mNklaa2aNmFhxx5fNcgpxzuyV9VtIUSS+Z2a1m9hUz+4mk++WVQP1d4C1c5/JR8GsZpZ/lhPXCaPGX+7Icj+1vLUJfMDr/IukISfc5534T2J/LNR7ue1qyHEf+fFHS0ZIuds51DtF2pNeNz3/4JvvLyyQ1SnqbvFHWI+TdT3SKpLsD7fk8lyjn3I2S3isvmP2tpM/Ju19ho6Q7U8pjuM7loxjXMjI/ywnr4TB/OZy6LITEzD4p6Sp5d5dfMNK3+8uRXGO+L4rAzJbKG03/hnPud/k4pb8c6XXjOhdObNo2k3Sec+5B59wB59yLkt4j6U1Jp2ab2i8DPs8RZWafkTf7y53yZmEbK2mJpPWSfmBmXxvJ6fwl17n0FfNaFvzaE9YLY6jftJtT2iFizOxySd+U9JKk0/3/bg3K5RoP9z37R9BVjECg/GWNpH8c5ttGet2Gaj/UaA1GL3Yj2Hrn3B+DB/z/SYn9L9lSf8nnuQSZ2WnybiT+H+fcPzjn1jvnOpxzf5D3S9kmSVf5kz5IXOdyUoxrGZmf5YT1wnjFX2arY1rgL7PVtCNEZvYpSd+S9Gd5QX1rhmZZr7EfCOfKuyF1/TDfM03eiNCbzrmO3HuPIYyT9/e/WFJX4EFITt5MTZL0H/6+G/3tEV0351y7vJAwzj+eis9/4cWu2d4sx2NhvjGlPZ/n0hJ7KM3DqQf8v/ffy8s5R/u7uc7lo+DXMko/ywnrhRH7wbHcUp58aWZNkpZJ6pT0VLE7hsGZ2WflPfTgeXlBfXuWpg/5yxUZjp0ib7aflc657mG+5+yUNiiMbnkPz8j0WuW3ecLfjpXI5HLduNbhekzeP9QLzKwuw/Ej/OUGf8nnuTTFZvqYlOV4bH+Pv+Q6l49iXctoXP9CT+ReqS/xUKSSe8kri3CSnpU0YYi2zfKeijiSBzLMFQ/XiOxL0jXK/FCkEV83ReRBGpX8kvR9/xr8c8r+s+Q92XKvpFZ/H5/nEnxJ+oD/97xV0oyUY2f717lT/hPDuc6l89LwHopU8GsZlZ/l5n9R5JmZzZf3DTNZ0j3yHp17nLwndK2RdKJzbld4PUSQmV0k7walfkk3K3MN2gbn3J2B97xb3o1NXZJ+LGm3pHfJmx7qp5I+4FI+YGb2CUk3yfuQ/7e8EZ/zJM2Ud8Pjp/P558Lwmdk18kph/tY5d1vKsRFfNzP7hqR/kHcz408l1Un6oKSD5P0S/62C/WEgM5ss75kWh0h6XF5JxGx5tcxO3sOS7g605/NcYvz/uf6NvNl+2iT9XF5wXyyvRMYkfco5983Ae7jOEeVfm3f7m1MlvV1eGcvj/r6dwb/rYl3LSPwsD/u3p3J+STpY0h2StvjfEK/Lu2lx0FFbXqFcq2vk/QM+2OuRDO9bJv/BK/JGcF6QdKWk6kG+1l9KelTePy7tkp6RdFHYfweV/lKWkfXRXDdJF/nt2v33PSrpnWH/WSvlJWmCvP/NfM3/GbxL3uDJ8Vna83kusZekWkmfkldWul9e+dN2eXPrL+c6l85rGP8ObwjrWob9s5yRdQAAACCiuMEUAAAAiCjCOgAAABBRhHUAAAAgogjrAAAAQEQR1gEAAICIIqwDAAAAEUVYBwAAACKKsA4AAABEFGEdAAAAiCjCOgAAABBRhHUAAAAgogjrAAAAQEQR1gEAAICIIqwDAAAAEUVYBwAAACKKsA4AAABEFGEdAAAAiKj/D6Wmg9vMbXk2AAAAAElFTkSuQmCC\n",
-      "text/plain": [
-       "<matplotlib.figure.Figure at 0x7fe91c78a748>"
-      ]
-     },
-     "metadata": {
-      "image/png": {
-       "height": 250,
-       "width": 373
-      },
-      "needs_background": "light"
-     },
-     "output_type": "display_data"
+     "ename": "KeyboardInterrupt",
+     "evalue": "",
+     "output_type": "error",
+     "traceback": [
+      "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+      "\u001b[0;31mKeyboardInterrupt\u001b[0m                         Traceback (most recent call last)",
+      "\u001b[0;32m<ipython-input-141-f774d95b4160>\u001b[0m in \u001b[0;36m<module>\u001b[0;34m\u001b[0m\n\u001b[1;32m      1\u001b[0m model_run = model.fit(X_train_prep, y_train_onehot, epochs=20, \n\u001b[0;32m----> 2\u001b[0;31m                       batch_size=64, validation_data=(X_test_prep, y_test_onehot))\n\u001b[0m",
+      "\u001b[0;32m~/anaconda3/envs/mlw-2/lib/python3.6/site-packages/keras/engine/training.py\u001b[0m in \u001b[0;36mfit\u001b[0;34m(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)\u001b[0m\n\u001b[1;32m   1037\u001b[0m                                         \u001b[0minitial_epoch\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0minitial_epoch\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   1038\u001b[0m                                         \u001b[0msteps_per_epoch\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0msteps_per_epoch\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1039\u001b[0;31m                                         validation_steps=validation_steps)\n\u001b[0m\u001b[1;32m   1040\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   1041\u001b[0m     def evaluate(self, x=None, y=None,\n",
+      "\u001b[0;32m~/anaconda3/envs/mlw-2/lib/python3.6/site-packages/keras/engine/training_arrays.py\u001b[0m in \u001b[0;36mfit_loop\u001b[0;34m(model, f, ins, out_labels, batch_size, epochs, verbose, callbacks, val_f, val_ins, shuffle, callback_metrics, initial_epoch, steps_per_epoch, validation_steps)\u001b[0m\n\u001b[1;32m    210\u001b[0m                         val_outs = test_loop(model, val_f, val_ins,\n\u001b[1;32m    211\u001b[0m                                              \u001b[0mbatch_size\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mbatch_size\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 212\u001b[0;31m                                              verbose=0)\n\u001b[0m\u001b[1;32m    213\u001b[0m                         \u001b[0mval_outs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mto_list\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval_outs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    214\u001b[0m                         \u001b[0;31m# Same labels assumed.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+      "\u001b[0;32m~/anaconda3/envs/mlw-2/lib/python3.6/site-packages/keras/engine/training_arrays.py\u001b[0m in \u001b[0;36mtest_loop\u001b[0;34m(model, f, ins, batch_size, verbose, steps)\u001b[0m\n\u001b[1;32m    390\u001b[0m                 \u001b[0mins_batch\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mins_batch\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtoarray\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    391\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 392\u001b[0;31m             \u001b[0mbatch_outs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mins_batch\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m    393\u001b[0m             \u001b[0;32mif\u001b[0m \u001b[0misinstance\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mbatch_outs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlist\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    394\u001b[0m                 \u001b[0;32mif\u001b[0m \u001b[0mbatch_index\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+      "\u001b[0;32m~/anaconda3/envs/mlw-2/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, inputs)\u001b[0m\n\u001b[1;32m   2713\u001b[0m                 \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_legacy_call\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minputs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   2714\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 2715\u001b[0;31m             \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_call\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minputs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m   2716\u001b[0m         \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   2717\u001b[0m             \u001b[0;32mif\u001b[0m \u001b[0mpy_any\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mis_tensor\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m \u001b[0minputs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+      "\u001b[0;32m~/anaconda3/envs/mlw-2/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py\u001b[0m in \u001b[0;36m_call\u001b[0;34m(self, inputs)\u001b[0m\n\u001b[1;32m   2673\u001b[0m             \u001b[0mfetched\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_callable_fn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0marray_vals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrun_metadata\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrun_metadata\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   2674\u001b[0m         \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 2675\u001b[0;31m             \u001b[0mfetched\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_callable_fn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0marray_vals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m   2676\u001b[0m         \u001b[0;32mreturn\u001b[0m \u001b[0mfetched\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0moutputs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   2677\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
+      "\u001b[0;32m~/anaconda3/envs/mlw-2/lib/python3.6/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *args, **kwargs)\u001b[0m\n\u001b[1;32m   1437\u001b[0m           ret = tf_session.TF_SessionRunCallable(\n\u001b[1;32m   1438\u001b[0m               \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_session\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_session\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_handle\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0margs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstatus\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1439\u001b[0;31m               run_metadata_ptr)\n\u001b[0m\u001b[1;32m   1440\u001b[0m         \u001b[0;32mif\u001b[0m \u001b[0mrun_metadata\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m   1441\u001b[0m           \u001b[0mproto_data\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf_session\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mTF_GetBuffer\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrun_metadata_ptr\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+      "\u001b[0;31mKeyboardInterrupt\u001b[0m: "
+     ]
     }
    ],
    "source": [
-    "import matplotlib.pyplot as plt\n",
-    "\n",
-    "history_model = model_run.history\n",
-    "\n",
-    "fig, ax = plt.subplot()\n",
-    "\n",
-    "plt.plot(np.arange(1,num_epochs+1), history_model[\"acc\"], \"--\")\n",
-    "\n",
-    "plt.plot(np.arange(1,num_epochs+1), history_model[\"val_acc\"])\n"
+    "num_epochs = 10\n",
+    "model_run = model.fit(X_train_prep, y_train_onehot, epochs=num_epochs, \n",
+    "                      batch_size=64, validation_data=(X_test_prep, y_test_onehot))"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "\n"
+    "### Exercise: Use the above model or improve it (change number of filters, add more layers etc. on the MNIST example and see if you can get a better accuracy than what we achieved with a vanilla neural network)"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "\n",
-    "      \n",
-    "      "
+    "### Exercise: Load and play with the CIFAR10 dataset also included with Keras and build+train a simple CNN using it"
    ]
   },
   {
-- 
GitLab