{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Mini-Project OCR Code\n",
"\n",
"This is a walkthrough of the process we went through to develop an OCR model that recognizes handwriting.\n",
"\n",
"Our libraries used are listed in the README, we will utilize requirements.txt to load them all at once.\n",
"\n",
"**IMPORTANT** Make sure to use `python3.6` (the version we use for class projects) to run our program with little issues. Information can be found [here](https://courses.cs.vt.edu/cs4804/Spring24/projects/project0.html#python-installation) "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"hide_output"
]
},
"outputs": [],
"source": [
"# Install libraries\n",
"%pip install -r requirements.txt"
]
},
{
"cell_type": "code",
"metadata": {},
"outputs": [],
"source": [
"# Load libraries\n",
"import tensorflow as tf\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"import cv2\n",
"\n",
"from keras import layers, models\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.preprocessing import LabelBinarizer\n",
"from sklearn.metrics import classification_report\n",
"from sklearn.model_selection import KFold"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Helper functions for loading dataset\n",
"\n",
"We need to load our dataset. We will create a helper function for the model OCR. This function will load the English Handwritten Characters dataset that should be in given path."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"def load_eng_dataset(datasetPath):\n",
"\t # initialize the list of data and labels\n",
" data = []\n",
" labels = []\n",
"\n",
" # loop over the rows of the A-Z handwritten digit dataset\n",
" for row in open(datasetPath):\n",
" # Skip the first row\n",
" if row == \"image,label\\n\":\n",
" continue\n",
"\n",
" # parse the label and image from the row\n",
" row = row.split(\",\")\n",
" imagePath = \"eng_dataset/\" + row[0] # hardcode the path\n",
" try:\n",
" image = cv2.imread(imagePath)\n",
" image = cv2.cvtColor(image, cv2.COLOR_RGB2GRAY)\n",
" except cv2.error as e:\n",
" print(\"[ERROR] loading image \", row[0], \" fail\")\n",
" continue\n",
" \n",
" label = row[1][:-1] if len(row[1]) > 1 else row[1] # remove '\\n' at end\n",
"\n",
" # update the list of data and labels\n",
" data.append(image)\n",
" labels.append(label)\n",
"\n",
" # convert the data and labels to NumPy arrays\n",
" data = np.array(data)\n",
" labels = np.array(labels, dtype=\"U1\")\n",
"\t# return a 2-tuple of the English Handwritten Characters data and labels\n",
" return (data, labels)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Dataset Pre-Processing\n",
"\n",
"Next we will pre-process the dataset in order to train the model."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"def process_dataset(data, labels):\n",
" \"\"\"\n",
" Help function to pre-process the dataset for ready to train model.\n",
" \"\"\"\n",
" # the architecture we're using is designed for 32x32 images,\n",
" # so we need to resize them to 32x32\n",
" data = [cv2.resize(image, (32, 32)) for image in data]\n",
" data = np.array(data, dtype=\"float32\")\n",
"\n",
" # add a channel dimension to every image in the dataset and \n",
" # data = np.expand_dims(data, axis=-1)\n",
"\n",
" # scale the pixel intensities of the images from [0, 255] down to [0, 1]\n",
" data /= 255.0\n",
"\n",
" # convert the labels from integers to vectors\n",
" le = LabelBinarizer()\n",
" labels = le.fit_transform(labels)\n",
"\n",
" # account for skew in the labeled data\n",
" classTotals = labels.sum(axis=0)\n",
" classWeight = {}\n",
" # loop over all classes and calculate the class weight\n",
" for i in range(0, len(classTotals)):\n",
" classWeight[i] = classTotals.max() / classTotals[i]\n",
"\n",
" return data, labels, classWeight"
]
},
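{
"cell_type": "markdown",
"metadata": {},
"source": [
"For intuition, `LabelBinarizer` one-hot encodes the string labels: each distinct label becomes one column, and `fit_transform` returns a row per sample with a 1 in that sample's column. A minimal illustration with toy labels (chosen just for demonstration, not the real dataset):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# toy example: three distinct classes become three one-hot columns\n",
"toy = LabelBinarizer()\n",
"print(toy.fit_transform([\"0\", \"A\", \"a\", \"A\"]))\n",
"print(toy.classes_)  # classes are sorted: ['0' 'A' 'a']"
]
},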
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Verification\n",
"\n",
"To verify that the dataset looks correct, let's plot the first 25 images from the training set and display the class name below each image.\n",
"\n",
"We define the helper function here:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"def show_train_data(train_images, train_labels):\n",
" class_names = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', \n",
"                   'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', \n",
" 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z',\n",
"                   'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', \n",
" 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']\n",
"\n",
" plt.figure(figsize=(10,10))\n",
" for i in range(25):\n",
" plt.subplot(5,5,i+1)\n",
" plt.xticks([])\n",
" plt.yticks([])\n",
" plt.grid(False)\n",
" plt.imshow(train_images[i])\n",
"        # the labels are one-hot vectors, so find the index of the 1\n",
" index = np.where(train_labels[i] == 1)[0][0]\n",
" plt.xlabel(class_names[index])\n",
" plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Load and Pre-Process\n",
"\n",
"First, we need to load and pre-process the data. We will use the functions we defined previously.\n",
"\n",
"Make sure the English Handwritten Characters dataset is loocated in `/eng_dataset` in the following format:\n",
"\n",
"```\n",
".\n",
"├── ocr_project.ipynb\n",
"└── eng_dataset\n",
" ├── english.csv\n",
" └── Img\n",
" ├── imgXXX-XXX.png\n",
" └── ...\n",
"```\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"# Define directories here\n",
"datasetPath = \"eng_dataset/english.csv\""
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[INFO] loading datasets...\n",
"[INFO] pre-processing datasets...\n"
]
}
],
"source": [
"# load the English Handwritten Characters datasets\n",
"print(\"[INFO] loading datasets...\")\n",
"(data, labels) = load_eng_dataset(datasetPath)\n",
"\n",
"# pre-process the data and labels for training\n",
"print(\"[INFO] pre-processing datasets...\")\n",
"data, labels, classWeight = process_dataset(data, labels)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Training\n",
"\n",
"Time to begin the training, we need to split the data for training and testing first. The training data will be shown here."
]
},
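{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sketch, the split can look like the following (the `test_size` and `random_state` values here are assumptions, not necessarily the exact ones we used):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# split the data into training and testing sets (assumed 80/20 split),\n",
"# stratifying on the one-hot labels to keep class proportions balanced\n",
"(trainX, testX, trainY, testY) = train_test_split(\n",
"    data, labels, test_size=0.20, stratify=labels, random_state=42)\n",
"\n",
"# show a sample of the training data with our helper function\n",
"show_train_data(trainX, trainY)"
]
},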
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAj0AAAI8CAYAAAAazRqkAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAACYFklEQVR4nOz9eXxcd53n+7++59RepaW075tXeY+TOI6dlRCyQFhCOkADARqapYFpmtt9Z3q5w/Rv+t5e6JlhGoYGAiEQCJAOECA72ZyEJN4Sx7stW5Ila9+l2qvO+f7+kGJL8W5LKlXV5/l48ODho1PHH+XrU/Wu892U1hohhBBCiGxnpLsAIYQQQoj5IKFHCCGEEDlBQo8QQgghcoKEHiGEEELkBAk9QgghhMgJEnqEEEIIkRMcF3JySZGpG2qdc1WLOI32ziSDw5aa7etKW6bHzt3xQa116WxfV9pz/sm9mV3m4t6UtkyPs7XlBYWehlon256qnZ2qxHnZcEvnnFxX2jI9zMojx+biutKe80/uzewyF/emtGV6nK0tpXtLCCGEEDlBQo8QQgghcoKEHiGEEELkBAk9QgghhMgJEnqEEEIIkRMk9AghhBAiJ0joEUIIIUROkNAjhBBCiJwgoUcIIYQQOUFCjxBCCCFygoQeIYQQQuQECT1CCCGEyAkSeoQQQgiREyT0CCGEECInSOgRQgghRE6Q0COEEEKInCChRwghhBA5wZHuAoQQQmS+uE4yZieIaY1PKQoMD05lprssIWaQ0COEEOKSPRUp4O/2vY9QVz7F9SN8vflhbvDa6S5LiBmke0sIIcQle3p0Nb6HC2j+/zqxf1vM86HmdJckxCnkSY9YkOI6yYAVZ8I28Bs2JYYLn+FKd1lCiDMIWy7cozaprm7cY3WMJH1E7AROZUo3l1gwJPSIBenVmJs/33MP4aMFGNUR/n+X/Y4P542kuywhxHlwxGz2jFTxRH4J1Y4RVrliBAxPussSQrq3xML0SngJjicKWfpPRyn9pZfHh1enuyQhxHlyRG26Bgt5enQlO2MNjNqpdJckBCBPekQaJLVFjxVlwDpzd9XuiWo8IxprYADPcB0to6XsrEic8fxCI0GNw41bOeeiZHGJInaCHivBqH1+XZTSnpnhrW7oiFb0RPJRlgbASNgkx9wcHisjYMaJ+w+kuVIhJknoEfPuSDLOnx68h75d5agzTO7w9iuqDo1gA+5jw4w+UsFH6v/89CcrcC0b51trfyazRRaol2N+vrLrT0gdzjv3ydKeGeONuMF/a/8QR3tLMY56aeoZm7pnh6h6rpLR3dX8al0Z1994kEXOSLrLFUJCj5h/rakihl+uYMl3WyAeP+052rLQUz+zWjso6+mn3DzDYEhD0fehFby+tIEbvK1zVba4BNsii/A9mUfZw/vB1mc/WdozY+yK1XF0Wx0lb2q8Q0mM3iFsIHXsOPk9feSbJmZyDYc2VYL/aLrLFUJCj5h/Se3ATIA9Pn4i2JyVbWGHw6ccNnw+jKIg2uMi6Vc4lTUH1YoLEddJWpNJ+qwAFurE8R0j9bjHbKzRsfO6jntMs22sgRd9rZQaEZqcTunqWoAithtHVOEeTeGYSKKTyckf2BZ2bPJ+NJJgaxk+KhYGCT0iY9mrFtF6R4BkXZyltce41ncYcKe7rJz2ZgI+veszpHYVzui69AxoKvYPc76xtODAGAceWs6fli3Ds3KU7675CRtl8s+CM2Z58QyA78gQKhrHDksXlljYJPSIjBVq8HPDLbv4p8pncSoDr5J1fNLtYLwS9VIhjfftQyenzdixLKxE8vwvdKiN6t4hlMtFx4frOLi8ko2e/tkvWFySsZQX75CNdfQYaBv0ObouhUgzCT1i3vlUnHiRRi1vwkie/O6vYgn04DDW+PiM8w2PB6O8FO33zjgeqjSo9wwTNH3zUrc4NxtjsjsjFEanTj9N+UztqcJR7L4B7Fhs8rWhMLiSGA
lIaHmrWqiUBmzpWhaZQd5JxLxb7hrh+ht382LjIvS0KcxWbxENvyvG8dzOmS9Y2kDrnUVYy2aO61le1co78/YCMtYjo5yhPc2DJTQ+7IN9h9CWNTnI3bIwUmBrdYaLCSHE+ZPQI+ZdnSPAt2peIFn97Izj/za8ll/tewfFz6sZj8ljlQEW3dDGzxf/esb5TmXiQJa3zzRnas+7yj9I7A8VOPcBWqNTKbStUeea7SWEEOdpQYSepLY4kEyyJ16Nic1qdzfLnW5MJSP+s5VbnTobx2fG4XRf6JXCYyZlGfsMY3g8GFUVWAX+GcfHG5ys9A+f0p4eM0XCOM0/AMk8QqRNTyrE64kSBlL5NDgHucwdpsDwnvuFC9SCCD2DVpQ/O3gPo89XoA2of1c7P138sIzVECKDqdoq2j5cgbl+FKVOJpemYCt3F29NY2VCiPP1k/G1fOfpm8lrMxhbm+BbN/yEd/ti6S7roi2I0DNqG/QcKGP5j1vB7WJ/YzWxRbISqxCZzCoOENg4wAtrf4qpTj7BMTBk120hMsSrw03UPW3hfmYHznuuZO9Vtbzb15Lusi5a2kJPUlvsTljsjDWwN1yNp99Ax+Io0wRbBi0KkSlCdoytcT/7YzU80b8S96hG2xqtFKZh41aO8+qqrvKN0bGoiZKr1+IYDqM7u7Fj57F4pZhzI1aErfEgLfGKGYtOvtTdhC+lMfPz0ZaFHY2BbWEGg+jaciy/m1CNosQxkcbqxXTT27LUMc4mbyd1jsCMc3pSIV6JVdGZLGLP8WoaJ5LoVOqM2wZlkrSFnjE7xl8c/hijT1XiGdbU7J/AjkQw/dKlJUQmOZQ0+NzWj5P/khf3qKZ41yCWvvB3x48Wv8Lox7y03lHM8K5SFv/EgIOydcFCsDUe5AvP30PRdgfTeipxRjXOUAprRQPmRByzswdrfJzkmgaO3u2ivGmQd5Qc5FrfUcB/xuuL+TO9LSca4CPvfpG/L90345yHJ1byv5+4jcJDiqpeC2drB6dfgCLzpC30RLSm82gpKx46Rqqre3K2RrqKEUJctK5UIZ43fZT/8HXseBzrIheo2+wx2NjwLDaamx0fJPl4ITKVYWFoiVdQtN1B6X3bJ5cTmGKWlJBcXkO0woPbaeLsn1wRPVzp5gMbt/EvFTsmz1MSeBaK6W1ZeN1atm5sgLeFntcn6qh6ycb72+0ApLJo0cm0julRempqchb9BxWzzzmR5I1jtXwtf+WM44vcfdzsa6fybY9mxfyyMVAa9Czcy6YyMGHGwGcjBccTRXSkDpGnDPINj8zsnAcjVoTnohXsjdbwRNcKPCP2ZOCZ3sbJBGYogdNpYEYSMG1BSkNpaacFyEJN3q+WhdIafYY1sJRFVn42L4iBzEKcjeNIN3UP1PFk2XUzjg+t0yRvf4RPF/SmqTIx57SNM6x5dbCRIkeYFZ4urvaMUqAyd8pspng9kcdfbvkQpS878IxZ5O3uO+Ubvx2OYnT04On3oONx7NCpGwMLsZBI6BELntXXj+upft6+s5aR2sjBGytBQk9Wc8Q1feN57AlU4zPiXOEeTndJOaEzWUzBbidFP92GTqVOO6ZDJxNYQ9IeInNI6BFpNWiFeTxcz+vhBrYcX4x/+PwHwKrse/IqhBDzxjGR4NCRSv5TwZU0eIZ4T94eljqze/yVhB6RVq/HC/na83dSucWgYMzCv787qwbNCSHEQmW0dbP4wXp2PrOe3zebjH3Qe8pMrmwjoUekVXcqSMF+B/kPn/kRuhBCiNlnDQ1jbBkmALhuvZJ9N1eeMpMr20joEfNuxIrwq1ATr4wtZkdvLYE+G32em0qapaUkV9QQDzoZWq1Y5Omf42pFulkuRZE/whJfP1XOEZwyI2jOnOneNPPzsVY0EK2YuV+aazSF+2AXqd6+NFUsxIWR0CPm3f6kh3/Y8l5qnlKUjKXwtBwnZVvnfiGQXFHD0U8rbl6+hzv9PdweOATIlPWspQwSfsW1ZW3cVf
A6hQb4lGw8O1fOeG/WVtLyER83XrUXh3HyXn22ZTn199XgkNAjMoSEHjHvBqx8Aq0O/I/vRMfjF9SlFQ86uXn5Hr5b8+rUEQk82c52QqN7IOsHWC4EZ7o3U/keypcO8L3aF2esvfMVM8n24ivkLhQZI22hxwC000bn+zHDBehYHDuWuTu3ivNXYY4RWp5g9I8uwz1mE9jXT6q1HTMYJLmmgXCl+4yvHVqtuNPfM4/VivmyM57ggeFNtIZK6NhTydLRUbJgq5+c5O1P8vCb62mZKOOywk7uKdxGo1OikUi/tIUeEzB8KeKV+bgNA2NoFLs/ma5yxDxqdiX412seYttlTbzc18TwjyrIb+9E15Zz9G4XH9i4DeMM89EXefqlSytLPTC8ied+uoGSvQmWDEzAsa50lyQukntPB0u/W8l4fi0PXtdE4139NDoH012WEOnt3jIdFimfE4ffhSP09qXnRLYqMLx8MDDOBwO7uN/bzTdK7qLAUFh+N+VNg/xLxY5zLF8vgWchMc7yPEZrhY3GPI/rtIeKKdmbwPn0DjRM7sVnnM8rxUJjDQzAwABOoKDqanqSQUBCT0ZTp9+uItOkLfR4lMGa6m52X7ME10geZW+4cfcNpKsckSYNzkHG1iZw3nMl4SrFO0oOprskcYEqHGNEVkcZ/Nh6PGM2+bsHsQ4fxTEcZnhXKTc7Pkhj/hCfKdvCZo/MvBIikzjqa5lYV0ksaDJ4hc1id2YPWk9b6Mk3PPzX2kfZ9f4a9kVreNTcRP122U8n16x1hfjf1z/I3g21lDgnuMHXIjsyZ5gVTov/c/WDvL62gWf7lzF8fxUFR9rRx3tY/FOT5BNBdq2u4v5P2Gyu/UO6yxVCXIDwygpGPzXBh5teZ7Gnl+u9PUDmvkenb0yPMljj8rDGNcgBTwe/LN4IDplMlmuCpo/3+iO8139o6kjm3ky5KmB4uNUX51bfIapcI3wjeBeFhsIOh2HfIQyg2LGe9lARSX3mpQlS2oDp6zUphTJNtJEdj9WFWOhsrUhqi6R9sls5kW9yS91B/qYkO96jJWUIIeaccyDC0a01bIx8hDNlmLG9xSwamJyxZS5pYvTyMqJFBqNXxlnilk1lhZhL7r4Ie15dzKbxjzByqIjFvSGycUMgCT1CiLnX1sni+5PYBb4znlI6PoA+PrkcwfiaUlx/0ss9NdtZ4u7lCncIkO5vIeaKOtLBkvvKsfO9lIaG0Md7JfQIIcTZGNhoA9RUV7W2LNB6sqvr8NGzvnZ6x1ciz+COyn18vvCtaesSeISYDSYa2wTlcoFlnbxHJybg0AQw817MNhJ6hBCzZpGrn8hVEbrM9XiGNSXbhrD2H053WUKIKcvc3YxuimN5L8czqCne2o91ji8k2URCjxBi1qx1JXhg4w9ov7yEB3uvYiDRQP4BBTobH5QLkXk2eSZ44Jrv035VKfd1bmYiXIlPQk+a2TBsm+TZMdzKiVPJAmVCZIKA4WGjBzZ6RhhIHeC+wiYKfT6wLuCBuWFgO8BQsgnFQmRpm7hOkcQinHKj7PMPtEltEddJLDROTHyGLEo73wKGh80e2OwZ4nDpUZ70V3PKSDulUA4nyjSwnAqnyp4OrwUXenQ4QvFOk/d4vkwgGOFTS17ji8FDuJUz3aUJIS7Aak8n4WvDdBSsveDXxldFWevpmIOqxKXalUjxtWPv42BXBWa7h8Zj5z/L56FQGd9ouYmRcR8bG9r5++pHWSR7ci04jsZ6Bq6rJFypiC6Pcbm/Pd0lzZoFF3qssXHKfnuEshfziNcF+f4XNvGpq/biNiX0CJFJNrhjPLTxe/RemY/F+a+1Y6IpNSdY5rQBz9wVKC7KlvByOn/TyNLH+1GxQeyBofMKPZa2uf/4Jjz3BVl6aJQdd69gzx/vYJEzNOc1iwsTWVqK68N9fG3x41SY4yx2WmTLZIIFF3qwrRP7trh1I7FQEVZWTpwTIrv5DBfr3ADxi3i1dHssJMqyiSWc9F
sR2mPF+PpsrENHzu+1NkRsFyEdZzDkp7wzgn3oKO6RYsK2G5DQky5OZWG5wMjLm3E8XmCyvriLd/tiZNu9uPBCjxBCiAXFMTCO/UI1mwa+gqvHSUNr+Lxf6x6zeKRtDWMpL2MdBVRGRs+yRa2YT1f42vjxzRsI166ecTzeGOOa/OycdSmhRwghxFlZx3uofjBKjc8LiST2yPkHF89AnL59hfxmcB35R0xUKDKntYrzd51ngl9v/C5DG2Z2XRUaMZockI3dyxJ6hBBCAOBUKSwXmMFCdCSKHY2hkwl0MoHV13/qCwwTw+uZXOhO22hrKgolk9jxOGiNEU/hHFdYLifOkIZU9swEynQ+w8XK0/ZeZV/YeYuEHiGEEAA0OYYpubaHQwVNeIYUVVvCqNd2n3GdJXNRPb3vLGeiHswEuMYUZhwC3Rb5WztJdXVj9o9R+qaPRJ6BvzuODp9/15gQs01CjxBCCAAWO9082PwAw8sc/HRkI88MX03JVgP06Z/OxOuC+N7Xx33LfsbhZBmPDq2lK1xI++4q/G1B6OrG6unFMzSMxzQhmcSKx1GmrL0m0mNBhB5TaWyPhpJCzGlbMFtFAUyXhXkB012FEEJcHKcyqXEEqAH2+Dp5LLgJR2U5OhbDDoXR8TjK4cAI+MHtJlTsZFHBIOvcbvKMLo7lleA1k7QGytFOAwCdSqFTqfT+YkJMWRChp8iA6y/fzwt/3owRKz5x3A6kuLt5Bz5D1ugRQoj5tNrdRclN3RyorsXTZ1L7bAhe241ZV0P37VWML7ZxVoX5ctEeAIoMg6t8R1ni7uMPJU3YbhdGmn8HId5uQYSeEtPP/6x+iuHKJ2YsYuZEU2SauFV2LIokhBCZotnp5KHmnzKxTPPtwet4qesqglsVieogztsGeGLV/fiUpshwAS6Cpo+rDQubEb4XHCHmrpTQIxacBRF6AIKmj6B08wohxILgVCZlpp8yE5b5evl9scJRV8NYuYu6/BGWOv2nfQ2YuAyLmHHqsATD58PIzwOvh2QAPCo5D7+JECctmNAjhBBiYbrc007xrV3sX1KJrzTEF8reuPCLKEXq8mV0vMtDoixF89J2Vrh64dTtLoWYMxJ6hBBCnNU6l4NfLf8ZsWU2TqXIM1zABY61VAYjyz18+L0v8vngVnyGSUBl73owYmGS0COEEOKsTGUQNM//iUy+K0pvqYOixY0nDxoG0RLFIncflQ7ZWV2kh4QeIYQQs+qukh38wx+Xc/DmopMHlWZNw1Eu93SSLTt2i8wjoUcIIcSsus03wfVrf0xSn9yhy0ThVAZe5U5jZSLXSegRQggxq5zKpECWGhELkCyjIIQQQoicIKFHCCGEEDlBQo8QQgghcoKEHiGEEELkBAk9QgghhMgJEnqEEEIIkRMk9AghhBAiJ0joEUIIIUROkNAjhBBCiJwgoUcIIYQQOUFCjxBCCCFygoQeIYQQQuQECT1CCCGEyAkSeoQQQgiREyT0CCGEECInKK31+Z+s1ABwbO7KEadRr7Uune2LSlumjbRn9pC2zC6z3p7Slmlzxra8oNAjhBBCCJGppHtLCCGEEDlBQo8QQgghckJOhB6lVCjdNYjZI+2ZHZRStyqlDimljiil/ku66xGzQyn135RSf5nuOoQ4nZwIPUKIhUUpZQL/B7gNWAF8RCm1Ir1VCSGynYQeIUQ6bACOaK1btdYJ4OfA+9Jck7hISqm/VUodVkq9DCxLdz3i4iml/kkp9cVpf86qJ3cSeoQQ6VANdE778/GpYyLDKKUuBz4MrANuB65Ma0HiUv0CuHvan++eOpYVHOkuQAghREa7Fvi11joCoJT6bZrrEZdAa/2GUqpMKVUFlAIjWuvOc70uU0joEUKkQxdQO+3PNVPHhBDp9x/AXUAFWfSUB6R7SwiRHtuBJUqpRqWUi8nuEXlCkJleBN6vlPIqpfKAO9JdkLhkv2DynryLyQCUNeRJjxBi3mmtU0qpLwFPASZwn9Z6X5
rLEhdBa/26UuoXwJtAP5OBVmQwrfW+qQDbpbXuSXc9s0m2oRBCCCFETpDuLSGEEELkBAk9QgghhMgJEnqEEEIIkRMk9AghhBAiJ0joEUIIIUROkNAjhBBCiJxwQev0lBSZuqHWOVe1iNNo70wyOGyp2b6utGV67NwdH9Ral872daU955/cm9llLu5Nacv0OFtbXlDoaah1su2p2nOfKGbNhlvmZssTacv0MCuPHJuL60p7zj+5N7PLXNyb0pbpcba2lO4tIYQQQuQECT1CCCGEyAkSeoQQQgiREyT0CCGEECInSOgRQgghRE6Q0COEEEKInCChRwghhBA5QUKPEEIIIXKChB4hhBBC5AQJPUIIIYTICRJ6hBBCCJETJPQIIYQQIidI6BFCCCFETrigXdYXIkvbhHScuLbPeI5bGfiUC6cy57EycaGkLXNDUltEdOKUdnaicCoDAwNTKdzKmaYKhRDZKuNDz75kgv+79S4OtVeCPv05FVUj/MPSR7jJa81vceKCSFvmhlfjJn9z+KN0dRXNOO4tiLGusotq7ygrfV3c7m+jzPSnqUohRDbK+NDzWrSJrifrWfFwN+jTf1L231jF419cy03e1+e5OnEhpC1zw9Pjq4n+upwVT3fPOB5eXsaOa5fzanmSXQ01rG46Tpk80BNCzKIFHXosbTNuxwhrmzN1eByNleEZ0KTajp3xg9K3spy2cDEdqRA+pSgwPNI9kkZvtevE27o3LrQtZxyXdl3QktpizI4R0ZrWcAm+AXuynafx5vvwDAWxXQ56gvkcTZZSanbhU4qg4cVUMgQxU8R1kjE7Qext97G05dw602emEygwXPgMV7pKWzAWdOjpsiJ89dj72bmvCWWp057j7jepPRQ+44ckgK99nMOPLeGGuv+L4voRvt78MDd4zzxuRMyt9lSEP2/7I/YdqEXZJ9v1QttyuvKmQf7XsofY6JmzssUleDVu8n8f/AT9bcV4j5vUtY6e8kXGGByjZI+feKeDSGsBf9vyx6TyLBqW9PG/Fj/EOrc7LbWLC/frUBn/fe/tRHsDJ45ppaUt59iZPjONYJwvrX2BLxe25nzgXNChpz0VYPcfltD8/V5UNH7ac3QiiZ6YOOOTIAB9qJW63gGU203vu+t5vraZG7z75qZocU4tyWKOvtDIivuPQzJ14viFtuV0XXc28FrdIjZ62uemaHFJXg0vIf5oGc2/aUfH49gToVPOSfX04R4ZxW2aFDgcKLcLnE46PlzHrtpa1rn701C5uBi/HVxH4S/yaPhDx8mDhiFtOcfO9JkZba7k51+9gi8WHiXXn4UvuNBjaZshO8qoDXtiy3GNKHR3H1YkctHX1MkE1uAQAO7xOiKWPOKbb2+165Cl2BNbhGsErK5edDJxQdeZ3pbTuSbqidsy22chsbRNXKdIYtGTKMA9ZpPq6j7zC2wL++33uWHinKglJm274EXsBIN2grBtcGwiiHcgMbO9pS3nXFKbOMIK3dOPFQ6fOO4qCzKYcDLZ6XXu2DO9LafLM2xKTXdGz6xccKGn34rwxfYPsGtXE64Rk6qdcXTiwj4YxcLTkYrwxda7Obi7DvewQfXuGNqSGVjZbMiO8mS4nj2RWp45toySUWnvbPZEpIS/3fU+Up1+8loNCjp7kBbPTNPbcrr8JSN8c9XP2JzBwwgWXOjps5zs3raI5m/3osfG0eEIdip17heKBa3TCnD0lXqav3ccHQpjh8JoW94Ss9mAZfDo4Fp2ddagO324RqPpLknMoWfHVhD8tZ/gM0chHj/1qZ3IGDPacpqeP1rC1qbFbPa0pqmyS7fgQg+ANkG7nSivF+X1XtSy0TqZxB4dQ8dPPxZIzD1L2/RYEQYtJzsizbhGFfbA0Iw3Q+VwYBTko7zei/+LlCKRr3AbyVmoWsyWmDbpj+SRHHXjHVcYCQsNKKcLo7BgcszOGehYbPL+tSzMOOwN17DT20apmaDS9MosvQUiYic4biWZsJ0cGC
3HM5zCGhhId1k5y6kskn6Nqq7AMRHGHp/AntbNdb6ilhPXhH1KWzoji0nqzL73FlzoqXKk2LxxPy/lLYVU8KKv4+twUPvEGLwhA5bTpceK8Nmjd3N4Rz3uEUXljtgpXZVmRTk976ljZI11SZui1Db2sMnXwuTkTLEQdKaK6DhQQdUfwD2awOwZJgUYi+vpfE8poUVnfoKbf9BBzW+7SbUdo3hviC2/uJynS9ZTuHqQ7698gDWuzH7jzRbPRAv5y513Qasf/3FFZVufdGmlUYMjxOXXHmJr8SKcQ2XUPJfE+cwb6S5rQVlwoafM9PPt2qeZqH78kq7zV8ffQ/uhZfilvdNm0HLSsq2eZd/qnPzWHouj39ZVaRfnE74xzHMbv43n9KsSnBePMggYMg12IelMFFNwyCD/0V1oyyI1FXhjtQVU3NrJD5f87LRDKi3gztpPkdxWgGrV8MYBag56UW4XHZ9exsEl5axxjc3r7yJO76WJZQQf91P82/3oZAorGkt3STmt2vTx3frHiNRa/Ca0jG/3v4+q5+ULwnQLLvQABAwPgUtcSqDEHaLNvIRPUXHJLBRGCvTYOPbExInjyuHACAZReX7CNQFKC4aocUiXRbaxUJhJjR2Nzlh7SRuKAleUGkfgjK/1uxJow48CdCqFnphARR2YcUjqBfm2lZOS2sQR01ijEkIXAlMZFCgvBQZUOMfQDsBQqKRFaNTP89EARWaIJkeCkrdt8RKyYxxJKoZsH4dHS3EnJhcPMXw+jKIg2uMiVqTwGZk9sUjePcS8M0uK6flAEyNXJAkUhfly/WsYSEAVQoi5oPqGqHy8kK+2fJpYpcWfXLeFvys5OOOcpyJl/NVrd+Fp8eDr1hQe6SEF2KsW0XpHgGRdnKW1x7jWdxjI3KfqEnrEvNPBfMY2xdhy/TcpMEx8yoUpT3mEEGJOWIOD5P1mnHzTJLlhOU8uXsFfF++fsTrzKxOLKXvKRfCRN8GySE1NAgo1+Lnhll38U+WzOJWBV2X2Onc5EXoMnw+jrATt8xAuNwg6ZCrlXElqi7ZUjJZkMTvCK3COK7TWKIcDs6QYHcwn0lhIQcE4JbIXjBBZz/D7MUqL0T4P8SKFz5AZtfNOa3Q8jgaMuIWlJ5+sj9lRDiUddKeCbB+sxz1mnzLbSxuQ74gRNH1pKHz25UTo0c2NHLkzH90UYXlVK+/M24vM8pkbg1aUL7T8Md0v1uAehcrXI+hYHCMYpOcDTYxtilFQMM5nF7+MW+XEPz8hcpq1djFH3+MjWZlgZWMbl3s6gUtYokLMmi3RYr7yyofx7/Pg7dOUHMr+2Xc58akTqfFz5Y0H+E7dkziViSPndx+ZO6O2QfveKpbfdwxrYBCdTKFtC5XnZ+SKJM9d903KTRdOZUqXlhA5YKLey8Yb9/GlimcpNeJUOTJ3PEi2eSPSQPELboof3Am2xkpl/1pnWRV6RqwIryfy6EwW80pvI77QVGZV4HfECRgZvHZ2BlEW6HhixsKQ2lAoU5NnKOnSEiJLlDhDhKoN8i5biRGOobt6z9A9EqfUiOM3FMalLMglzlueESVaqtFrl2KGE9A7gDU0jBlN0tdXyP3jVTzftxTPqH3KIr5GXh6qsgzb7yFUbVDiPHWD4EyVVaHnuWgFf7nlQxTsduIbsPEfmBx9LoQQYvbdlrebvXdVsf/6cqKHy1j0Cw/snLkgrDYVhc4IpaYDpzJlaYp5ssw5xvvesZXnli5ltDef+t/m4358BNU1QO0vG/nWjjtxj9gU7e0/pUtLL2+g5e4AviWjrCo9wC2BvWTyjK3psir07I3WUPqyg6KfbkPbmpTs7SSEEHPmcreLBxp+j91g8+mym+l8cSlvf56uFeSZMXnSPs9qHAH+pWIHqYqt/HCsge/ueR/lpok1MIDnsWG8hkLbGus0n5ORKi/XXruX79Q+h4GBU2VH4IEsCz02CmVzyqq/QoiFwxFOsburim8UNZzxnGPdxSyNTs42EQ
vb5JMbkzrvMHubHLivXotjLIru7MGemMA9YfO7rtX4jARL3L1c7RmlwJCBzPPBVAYmBh4jiZ6+FJptoe2Z556Y5ez3Eqo0KXNP4FbZN+Enq0KPEGLhcx3pofKndfyk4rYznlPXlcLoOPWxu1i43l+4k+MfKaTltlIG95Sx+EEH7NpP3u5+xu4r5/7g7QxvSPLd63/Eu3zZP2A206iaSrpuLWei0cZdO8HmwOF0lzQnJPQIIeZVqqcXz6N9p3SDvJ2l5TlPJtngdvLDuhegDt7nfTeRp6twAtaRNgJH28kzTbRxJa1Xl4GvK93lirexivyMrUyyeXULS/z9LHcNAP5zvi7TSOgRc0Y5XRgNNSSrCpiodFNcMoxTycyNXFHhGGO8EQpuXI9zIoHR3os1MDD5Qwk0WemtFX5dRoqQ00A5HGhbT3anWBaeYc0DHVfRlzzIWl8HN3oHpKtrHhSaESJVGnvDSsxQHNXVjzU0POMcbRoYvhQNviFqXMN4VHbeoxJ6xJwxCvI4fkcFee/qpSYwxofKtuPL8CXMxfnb5O3kI+9+ka0bGzjUXknjg3U4nxlId1liHrhMC8trYBQWoOMJ7HAEbIvCnX2E4qU8VnA991+X5Ec3fp/rZHzznFvh6mPdtYfZWVuH0R2k/nEf5gszQ4/lMaksHeGugh3kGUlKsnRpEQk9Ys4oj4eJpSkeX/HjaTtqy3TVXFHnCPD3pfugdB9fK17JU89dS0G6ixLzwkBjORXK7QatUVGFtie7urxH2vA5XcSDV9B5TTF4htJdbtYrNw3+pOJlNhZW89uyNUR2VpCn1IwnrrbToMI/zjq3m2yZnn46EnrEnJvNmHM0GeJ3oVW0x0ousAabq/KOcouvVx6np0G1a4SR5QrPbVfiGkvgaOk+2dU1xSwtJbWkimS+C/dgFONwB9b4eJoqFpeiyT/IG8ub0UYd/t4Ert3tM7tT3j51SMypYcvi0ZF1/KG7kdHufBpHcneGs4QekVEeHr+MH/7qZor3Xdibpu2AX918GQ03fIcN2fslZsG62XeY/vc+z+4bqtne0sCiH9diPj8z9MTX1NF2j2Z9Uxtv7FzM0vuqYLeEnkz0kcJtmHfatEWKeXlHM8uGy+FtY0jE/NmfLOGJly+j9hmLkvEkriM9pHJ0XJ2EHpFRDoYrKN+ewvPotgt6neHxMF6/noFr84DY3BQnzqjRGeDvSg5CyUG+4r2C7aVXEHjbOdFSJx9cvZWvV7zBjTE/qfygbFiQoVa6vJNdm8CNoSCpPGnLdBpI5ZPXauB5agc6lcrpnQok9Agh5tUizwBPrFaYiQ0zjk/UGDzTuYx74nkcO1TBsolxbMBRW0N0WTmJAgfjS1NUOEbTUneu6bfC/GpiKdvHGyl3j/P+wp1scF/4YnXNhX28fFU1hSUb8HZHMQ8ewwqF8ffa/OvBm3mhopPNBS28z99O0PTNwW+Sm0asCL8KNfHK2GJ29NYS6LMnZ9LlOAk9Qoh59Z7APpIfMDl0S/mM41vaF+N4qohjHfksGYyijnWDUoxtqGb84+NcU93GpwLtrHWNk43rhyw0r8eL+Jdn3kPNc5qddQ6Of6SQH9a9cGJa+vn6fOkLlN8zTke0iBdeXcWyeytg/2GCr3bhHSjjYMEqnrl1Bcvf+R02yjyHWbM/6eEftryXmqcUJWMpPC3HZWsmJPSIeSC3mZiu0Rngq0WtUNQ64/j7o/mMH3HjfHoHMPXvxjCJlBr82dIX+XzhWwvaSeCZD72pAvJbTHyPbsNz5UpabiuFugu/zhqXhzWl+0lqi02DVdj+yUF1qWOdOI514vJ48C1bz5DtR7qeZ8+AlU+g1YH/8Z3oeDynu7Smy9rQY5aWEl9TR7TUydBqxZ3+nnSXlHN0LEbgiIOPH/ootYER7irZwW2+iUvaZXlFoJtXrlpFcWAj3v4k7j0dp8wCEpljXyLK/cObaJkoY/fuBpYOh2W/rQXIMRZlcE8Z7/O+m6bAIB
8rfuWiurpE+pnBIMk1DYQr3bjGLPx7e0h1Hsc9FOP13Yt4v22yJK+fTxa9wkpX9s10zdrQk1pSRds9mg+u3soiTz+3Bw7BKUMnxVyyxyao/V0/8ddLOFxZyT/8cTnXr/0xBerib6S78t7E98EE7e8u4eE317P0u5UgoSdjPTR2BU/84mrK3kiwdDiMcaRTngwuQLqzh8UPOog8XcWWlY2kPmayofq1dJclLoKuLefo3S4+sHEbTx1rhh9U4u48jnHkOEt/WM14US2PX76I4EcirCw5lO5yZ11WhR5LG6ipaXjJfBfrm9r4esUbUz+VwDPfdDKBdegIjkNQtLiRgzcXkbzE9TkanQG+WNgJdNIyUcZ4fi3n9X3TmByHYMpzhAWlI1pE8b4Uzqd3oDlDV6g63UExn+yJCdi1HydQaq3n6HgJVpV9weN7ZBxt+ll+N+VNg/xLxQ4CjjhPFl2HG7BGRmDHCE6g2LeBY7GidJc6JzI+9LQlQ9w7vImtQw20tpfR1CO79+aKywo7efC6JgoqrybQk8TzehvW4LTVXZXC0VhPeHkp8QKTyJIERWYIzi8miTRyNNYzvq6CWKHByLoUDa7BdJckpjiHIrS8Xss77Q/MON5c2MfnS19gjWvmvhIvxuD7fdfRMVFE6M1iysf65WneHBm0wvxwbA3P9DXTNlBEyTEbLAuzsIDEZYsYr3MTqlW8o+RguktNm4wPPa/E6vnlY5upfSbOsokoxrFeuaFyxD2F26i/a5DjiSJ+sP0alg1VwIzQYzB2WTl9H4yzqKKbO8sO0uRIIKFn4RtfV0H8M8N8oHY3K7xdXOUeAWQ684LQ1sWSH0Hqt0VMX3znpSurKf/4OGtK9884/ft917Hv/pUU7Y+xaGgIfVzGV86V1pSLb790E42/tmkcT+Do7CCVSkFlGa13Ovn4dS9S7RrhBl8LpsrNCQEZH3oGUnnkHQNzyxtorSXwLFDK1mgNNmDpC38sfjqNzgCNzn6gn2frlmF5Zy6ApgxFrNDgyvpjvLdkF02ufnxKAk+6WNrGflv3YtJ+26B2pUAZxAoN7qrbxX8ubpn6gQSetFCAaYJlndinyZ6YgL0HT1lsMBjcQEe0iKSe+S7cMVFE0b4oxsu7Tn1/Vmry+kjX86V4694atgJ4uxy4X96FHYmcmLFle1wEasf5m5JduJWTGTMgFWCYk1uDTFulOWWbJLWFgZqV9+uFIuNDj8gQ8QTeVjefbb2TOv8w7wu+zg2e5JzfTNoBlZ4xap1DFBtxnJcwiFpcvH2JKN/oeye7BqpnHB85VMTi3hCayS6t0SsriRYrRtalaPZ0nf5iYl40ufoZvyKGSl2OZ9QmuHMQ69CRM57v7Y7ywqur2DRYdeKYrSH0ZjGLhgZnBh6lMFcsZfiyILGgQWpdiApznGze6HKujFgR/n3kMn7XtYr+oXzKj9ho6/y+/q/0HufHmyyS/g14B20Kt/eQau+Y0ZarSnr4i4rfn9Jtmakk9Ih5YY9PULE1wbGRRRwsa2LiXR6uqX0Oc44Xp7fciiXePta6EjiV+5Kmy4uL91x4Oa88spaaZydQ077Ql4aG0Md70UBoZTnJTwzx2cZXaHANSpdWmq13xbh3849ouaKCX/asZ9SuJu/w0RlPA6YzDx5j2b0VJ9bheUv5WP8pXVrKNBnYUETtp45wa+k+lru7WeqUEesXo9tS3PvqdSz6ucXSkRhGbzepROK8XvsObzf//s4f0X5dCfce3Uw4XI67veNEW1oBNztuWM1T93SzpujoHP8m80NCj5h12gDlMFEOx+Q3Dq0hmcTdHyHfbWAkTbrDBZf891jaJoWFpTWWbZz2H7M2IM+METCy41tKpklqCxub44kgeR02bNszoxPjxPdRpUjkGVxfeWTaIoQSeNIpYHi4yWtxk7eLpHbww8Ia8l0usKyT9/U01vg47D91g9i3P+FRpolyuYgHFXeX7+DDeSNTP3TN2e+SzWLaxDXowLXzEN
b45NYtJxgmylDgMDDVqU9/Skw/t/ri4OviYFUr2wNXTM7kmtaWeU0b6U/kz8vvMh8k9IhZVWjYVK/oo/2TTbhHGil5M4Kx4wDasjFHQ3gdBpbLz1jcg6X1JU1HfiHm5OvH7qBzpBB7dwGNAzIrZCE5kIjwL723sL27juixPJo64qecY/h8GMVFaK+baIlBgSOahkrFuSxzdzO6KY7lvRzPoKZ4az/W4Qv/5m+uWMrAhiLiQUVsQ4gG5yAgT1/ngllYQOTqpYwucRKp0NxavQOH/LeW0CNmV4np5d+XPci+xgp2hht5/MFN1Ox1Y09MYPX0oQaGCOgGjoUvfWzNzwc3MvjTOup2jGKMd2H39M3CbyBmy9ZYA1sfXU39o6OoyAD0DpwSSlVegPjiMuJBJ5EqTYljIi21irPb5JnggWu+T/tVpdzXuZmJcCW+Cw09SjF8WZDaTx3h7vIdNDgHWeOykNAzR0qK6Hi3wX+56REqHKOsc/djKlmvLrtCj1IohxNlGthOhcO4tIXwxIVzKpOVLi8rXWMscm7j18VXY/h96EQCnUyh43HMSIJEPMCwnSA57WPQROFWzrOOu7G0TVynSGLREQpS0J7AfvMA0tILz6jlw9ensd88MLMrZNp9qnxeEgUOYgUGllfjMWSdrYUoYHjY7IHNniGOlB7h0YJqAp4L7DI2TWJBg1tL9011aZlI4Ll4cZ0kplOM2gGm91wph2NyaEHAi6sswsfy2vEZLmSB3klZFXoc1VUMX1fLeL1BpD7FR4LZt4R2Jik1E5St66P1c4twj0DZjjDGtn2o0QnyXyvlOvs/YZgnPwyLCkN8dfEz0/r4T/VSzMHft36Azv4iXAe8NPQOSZdWhnHU1zJ4bTWhGkXKp0mUWhj+GIurBlju6kE+CBe2K/yt/PgdVxGpWH/Br02tC7Hc3T0HVeWWMTvKPw9cxa9a1hIf8VC1V6MTCcz8fCLXLmNopZNYseYdDbtk8sbbZFXoSdaVMHZniK+ve5hiI8wyZxwZDJk+laaX+5ofoH1xIS9MNPOYeQ1Vu1xYA4NUPaypfDowuU7HlPHVxdz/Z5v4o2W/O+NU9kdG1zPxiyqWvjSACvdjT1+MUGSEeGMJibtG+JeVj+BRSfwqgVNZFBoJKk0XEnoWtus9ozx0zXfpvfrCBreaaCrM8alZWjJo+VIMWhY/23EVS38QxxwchJExrHgcR0UZHe8y+W+3PESdc5jFznGc0qU1Q0aGHkvbRHWCiLYYSflPPNqz3Sa1wSHe7Ysx+cYpgSednMpkqdPPUmcSU+3jkcA1oBQ6lSLV2we9M8fg+Au89E3k0W9FMNXpRzh3hIMEulNnXS9EORwolwvldmO7wJTOr7RwKgvLrTDz8tDTurcihU6aS/qm7lM42c0hH4SZIGB4uNwNEDvXqach6/BcioidIKKTdKbycQw5MI+0zth6RztMKEhyu/8YJaafC+nScioLywVGXh4kk9iJJNjZ9xw9I0PP4WSMvz72Ad5srcHZ66K+JXbGtSNE5nAMjGO/UM2mga+ccVaXt81F/fHhs8eYtcvovqaAeLHGu3aY5a5e5M12/q32dBK/YZzW0lUz1uaJ1SX4ZNGB9BUmRAYK2TH++8BGfnlgHfaIm4o3NDp6McHz9DbntfDr29Yxtng1/i5N2fO9WEfaZu36C0VGhp49iUoOP72I5od6UdE49siofJfPAtbxHqofjFLjO8vMrlgca/jMY35QiuHV+Wz44zf5ZOlLlJpR6h3yBCEdrnAl+OUV32Ng/cwnroVGjHqHBmR1bCHO16id4qFdl7P03xM4+rrRY+NYkcisXf+d3kF+s/nbDF3t5e9aPkC0oxiXhJ6FIWK7cY2D3daBTqXO/QKRdpNdHRqjsADlmhZCLAs7GkMnE+hkAquv/8IuPL0bTGtQBikPXJ7XzmaPwYw9ZsS88hkuml0umk/5iSwUKcRFSRqYEzH02Dg6Fj/Rw2F4PCiPGzvPh+GyMC
9iAbSA4WHl1Ftzc7CXPYXleINBkl6F28iez9mMDD0i89SaIRqu7uSQpw6VOnlDukcVVVvCqNd2X3AXpXK6MPIDKKcTHYthh8JoW7o5hRDZJ88wuWHNQbZ8YQWukVLKt6fwPbcXTJOJd61g4DKDRJHFHUvfwK0u7aP9psL9PPe+ZQxcthzn4gmuCRyepd8i/ST0iHlR5/Bx/5KfM7DIga1Php6fjV7FM8NXU7LVAH1hg+aUxw3BAmyPGyMUQcXi6ISs8yKEyD4Fhpd/rn6Sgfc+zZ54FV9zfJhFr3pQHg891yj+7Y4f0uAYody08RmX9oT7Pb4BLt/8LSaudlJgJCk3XWTLRAMJPWJemMqg0hGg8m3H9/k6eSy4CUdl+QV3VSqXC+2Uf8JCiNxQZvopM8GpjpMstKGsGNvlQAeTrHcNUumYnenpPsPFIuOtkJNdk0DkE0Ok1Wp3FyU3dXOguhZlX1g/tL/ToPKlCRwd/eh4QsZ3CSFyQpEB11++nxf+vBmU5o5Vu8gz5OP8fMh/JZFWzU4nDzX/lIllGusCB9/9yYGPk9oVhP5B0PbkmCBDFrYTQmS3EtPP/6x+iuHKJzCAItMkYMhsyPOR8aFHORwYBfkor5dwsZMK1+ytWyDmnlOZJx7ZXqgSb5hxsygrF9ASQoizCZo+gvId74JlfOgxa6rofk8NoytT+MpCfKHsjXSXJIQQQogFKONDT6qsAOudI2xZfy8+pcgzXIAz3WUJIYQQYoHJmNCT1BZ9VpRBy8m+SDVmXKNtjTYN/O4E1abvjJtUCiGEEEJkTOg5noryt8fv4LWjjRh9buqPJiYHrwohhBBCnIeMCT19lpdXDy6i4lkH7rEUnrYhUrLJqBBCCCHOU8aEHqeycPqSxIIuLJfCjBbj9rgYq/US9AykuzwhhBBCLHAZE3pqHUn+dPXLPF+xjGjKyUDYRyzqpbhwgE9W/0HG8wghhBDirDIm9JSZfr4SPMwXCw8AYDM5nsfAmNpcTUKPEEIIIc4sY0IPTC5k51SyGpOYVOoJcbzOScllKzHHwtjdvdjxOK4JzeMDq/EZcRa5+lnrShAwPOkuVwghRJplVOgRYrqPlrxK38fzaH9PkMSbVTQ96IBDRyh6Y5i+HzTyP4JNRK6K8MDGH7BRMo8QQuQ8CT0iY93gtdm8+DFsbG723k3y9wUYh8A60ELhwaMYLidd5nraLy9ho2ck3eUKIYRIMwk9IqNNdneamMa0NZu0Bm2hUwplgyXjvYQQQiCjf4UQQgiRIyT0CCGEECInSPeWyAoNecO8ubqKIuf6E8dsUxGqsyk2Q2msTAghxEIhoUdkhc+UbeG+T2iOhwtPHHMZNp8uPspa1xAQSFttQgghFgYJPSIrbPYYbK57+Qw/lcAjhBBCxvQIIYQQIkdI6BFCCCFETpDQI4QQQoicIKFHCCGEEDlBaa3P/2SlBoBjc1eOOI16rXXpbF9U2jJtpD2zh7Rldpn19pS2TJsztuUFhR4hhBBCiEwl3VtCCCGEyAkSeoQQQgiRE7I69Cillimldk3737hS6ivprktcHKVUuVLqQaVUq1Jqp1LqVaXUB9Jdl7g4Sqk/V0rtVUrtk/sysyml/naqHXdPvddele6axIVTStUqpdqUUkVTfw5O/bkhzaXNmqxekVlrfQhYB6CUMoEu4NfprElcHKWUAh4BfqS1/uOpY/XAe9NZl7g4SqlVwJ8CG4AE8KRS6lGt9ZH0ViYulFLqauA9wHqtdVwpVQK40lyWuAha606l1L8D/wR8dur/v6e1bk9rYbMoq5/0vM1NwFGttYykz0zvABJa6++8dUBrfUxr/c001iQuXjOwVWsd0VqngC3AnWmuSVycSmBQax0H0FoPaq2701yTuHj/C9g49fT1GuBf01vO7Mql0PNh4GfpLkJctJXA6+kuQsyavcC1SqlipZQPuB2oTXNN4uI8DdQqpQ4rpb6tlLo+3QWJi6e1TgJ/xWT4+crUn7NGTo
QepZSLyW6Q/0h3LWJ2KKX+j1LqTaXU9nTXIi6c1voA8M9MfmA+CewCrHTWJC6O1joEXM5kd8gA8Aul1CfTWpS4VLcBPcCqdBcy23Ii9DDZgK9rrfvSXYi4aPuA9W/9QWv9RSa7LGd9cTgxP7TWP9BaX661vg4YAQ6nuyZxcbTWltb6Ba3114AvAR9Md03i4iil1gE3AxuBv1BKVaa3otmVK6HnI0jXVqZ7DvAopb4w7ZgvXcWIS6eUKpv6/zomx/M8mN6KxMWYmiW7ZNqhdcgqxBlpasLIvzPZrdUBfJ0sG9OT9SsyK6X8QAfQpLUeS3c94uJNfeP4X8BVTD5GDwPf0Vr/Iq2FiYuilHoJKAaSwFe11s+muSRxEZRSlwPfBAqBFHAE+KzWejCddYkLp5T6LHCT1vpDU382ge3AX2itt6S1uFmS9aFHCCGEEAJyp3tLCCGEEDlOQo8QQgghcoKEHiGEEELkBAk9QgghhMgJEnqEEEIIkRMk9AghhBAiJ1zQLuslRaZuqHXOVS3iNNo7kwwOW2q2ryttmR47d8cHtdazvoq0tOf8k3szu8zFvSltmR5na8sLCj0NtU62PSV7As6nDbd0zsl1pS3Tw6w8Micr1Up7zj+5N7PLXNyb0pbpcba2lO4tIYQQQuQECT1CCCGEyAkSeoQQQgiREyT0CCGEECInSOgRQgghRE6Q0COEEEKInCChRwghhBA5QUKPEEIIIXKChB4hhBBC5AQJPUIIIYTICRJ6hBBCCJETJPQIIYQQIidI6BFCCCFETpDQI4QQQoic4Eh3ASL3WNomqhPEtIV9CddxK4OAcmMqye5CCCHOTUKPmHcdqQh/ffy9bD3cCLa66OvU1w7y/y7+FZs9s1icEEKIrCWhR8y7Q8li3nh+Gc0/7kclUxd9ne7bqnnmc6vY7Nk/i9UJIYTIVhJ6xLyLaSeucYV97Dg6Hr/o63iHKwlZ7lmsTAghRDaTwRBCCCGEyAkSeoQQQgiRExZ095albYbsKEOWwqVsSkyTAsM7J39XyI4xYKWI6Zk5sNjUFBtemSE0iwxstAJlmmjDBNua+oGJmR8A91m6rBJJ7IkJdCqFkdJ0RoMcSETIM2xKTTdu5ZyfX0IIIUTGWdChpyMV4Yutd3Nwdx3an+KjV2zla6W7cCpzVv8eS9t8b3QF39p5A8ag68RxbcLi1cf59uKfs8gZmNW/M5e5lIXlBaO0GBWJoScmsGMxHNWV9N5ey+hyDWeY1OXvMKh5YgDrQAt5LePs+/Vy7qhcSv6SEb656mcyk0sIIcQZLejQ02kFOPpKPc3fO06yuohfB9fwNyU7Zz302Gh+dXwdDQ8oPG8ePXFcedwc/UwdR+uDLHImZ/XvzGVOlSLl1Vgl+RghNyqVglgMqyJI/JZxHl1/L6bSp33tV1r/iMiBatwHQO8/Su0xLzhd9PzRErY2LWazp3WefxshhBCZYkGHnqQ2MWMKe3AYh9tFZCTItriHUjNMlakJmr5Z+7viKQeBsQTWwMCJY4bHgxmrI4kJSOiZLU5lYQVsYmU+HHluXA4TR8BPqNxLeX4vS52eM3YnlnsnaDUnHwPpZAJrNDF5zchiknp2w7AQ4vyF7BjdlsWEfeYu5iIjQZVDuqFF+izo0DPD8CiVz5Tyud7PkSix+OTVL/N3JXtlrE0GanCEuOmKvbxYsggrZWLF8iBZgLsoylerd0ibCpGBHg1X8nc73o9x7Ax9zAr8K0b47poH2CArTYg0yZjQYw0NU/DILgofd5Fc08Rvq1fxNyV7kO/2mafa9PG/qp8lUvU0AJae7MoylaLAcAHyLVCITPP82HLKf+Om4Ml9pz/B6aDr48vZtbSeDe7u+S1OiCkZE3rQGjsWg1gMx2iMkaE8no36KDJDNDkSlJj+dFcozpOpDALKw6UODTd8PoyiINrjIlak8BmJWalPzL6QHaM1BQOWn0IjSpMjNavd0yI9QnaMI0nFkO1j73Al7tEU1vj46U
82TMyYxtYXv/WMuHhxnSSpLWLa4njKwZB98fefiabUDNPoMPEZrnO/YAHJnNAzjeobovLxQr7a8mlilRZ/ct0W/q7kYLrLEvPMXrWI1jsCJOviLK09xrW+w4A8N1+IXo4V8KVtH8F1wEe0LslfX/MYny2Qb/uZ7qlIGX/12l14Wjz4ujWFR3q4+I1lxFyJ2AkOJzVdViE7wk38aNdGPEcufqqrVqDWjHPv+h9n3IzZjAw91uAgeb8ZJ980SW5YzpOLV/DXxftlLEiOCTX4ueGWXfxT5bM4lYFXZdY3jlyyI9JI8BkvJb94k/DNK3lhxTIJPVnglYnFlD3lIvjIm2BZpC5hWxkxd+I6xdFkBXujNTzds5zyp1wU/m4P2PZFXU+ZJt2fXMW+lTVs9mTWfZyRoQet0fE4GnCMxekZLOCRcCGljnGWO8OUSVdXVhmzoxxIuOi1Ctg3VEEgNrmYoTYg3xGTbpIFKmTHOJQ06EoV8vLgItxjNnY4jGs0xb6BCh4pC1BqjrPCKW2YqSwMzCTY4fAZzzHy8jBKi9FeN7ESRZ4Zm8cKBUBY2+yN1vDKYBM9/YXUjVnYoRDo0y8Nck5K4R7VvDC8jArHKHWOEZY6VUZ0dWVm6JnG7Bmk8pEG/tubHyNca/GFdzzDXxUdPfcLRcZ4IlzFX7/8QQKHXPh6NN6j3fIIPQNsjfv53NaP43nTh3dAU7p/EAtwtw3g/Y9q/p9X7iG0KMnfX/9r7skfTHe5Yo6k1i2m7f1ujOoIV9QeYL27E5CQO5+OJPO5f8cmil9xUjVm4zvUf2ICyUXRmqJdIxz60XL+pqgZ+4px7r/8hxkxKy/jQ0+qt4/AI0METJPE9at5fvUyCT1ZZnuokYpnHRT8cifYmlRK1kzKBPtjNeS/5KX8R7vQyRTWVLulOo5T+FAfhYYidMc6Xlu/WEJPFhtv9PDhm17mv5Rsx6lM3EoCz3zrSBaRv8dF6c92oRMJLMu65Gvaew5ResCJ8rjp+uxqWleXscE9MgvVzq2MDz0AOpWCVAojaWPZMq4n21gYGCnQMl5gwQvZMd5MuGhJVPBE/0rcoxo7Gp35GF1rdHJypp2R1MQtB0ltYaBkXF4GGLEivJ7IozNZzCu9jfhCkx+gRl4eqrIM2z9tZKsBoWpFnXuIgJFhI16ziMdIksgH1VCDEYlhD41gT0zMOMfw+1FV5diBM+9vaY6Fsbt7J2dSv3Ufaxv3iOaRwctI6N0sdfWxxmUt2K6urAg9QoiF4c2Ei0+8+mn827x4hjXFuwbP+hhdWTCRcjNoRfEog4DhnvVtZsTsei5awV9u+RAFu534Bmz8ByZnbOnlDbTcHcC3ZHTG+Rsq9nCt9wjSpZU+S539FG3q5XBBGe5Bg5pnC2DbnpknLannyIcK8DSPnvE6iTeraHrQAYeOnDimLYuSHSO02sv4H8HlJDeP8/Mrvs+ahZl5JPQIIWZPS6IC/zYvVd99HTuRxNJnnx2itCZmOZnQCgsbHzbIkqML2t5oDaUvOyj66Ta0rUnZk096IlVerr12L9+pfW7G+QYGTunSSqsqR4qP121lZ7CBV443EDkYwKfUjCewsQo/Kza18ovFj57xOjd77yb5+wKMQ9MOao295xDBvQaG30eHdzXd6wpY41qYT+YzMvQYHg+qtgqrOIARTaK6+rEGhzDDSY50l/KNigaqnCNs8nRR45Dd0YWYT8qe6nK2T44bMPx+VF0VqUIf5kQcdbwHa3QMR9hif1cF3w5cz3JvD+8JHKLGIStyL2Q26mQbT6NsiFpOJuwEHmXiVS7prlwgnChKHRPUeEbI91Vgn+Fz0WFYZ90XrS5vhIPLKihOrcEcDqM7uk50daEtSCZBg8XCXYAyM0NPVQVtH64gsHGAgZ4CGh5uwvXUMOaxPmp/Xs9PXryNscXwmduf4T8Xt6S7XCFEYy1HPhakaM0A/a3FLPq5B+PlXbhbeql+sIZXSq7kd+s0nluTfDK/P93Vio
tgJGzax4rYEq2kwjHKKlecAnXm8SFi/gQMN1d6ulnkHKAvkc8OX9lFXedT5S/x75900DEeZGJ7OY0/1dDSOsvVzq2MjOFWgR9z/SgvrP0pX974HBO1k9nN6uvH/fgOir//GtVbUuwcq0tzpUIIgGSpj/orj/PS2l9w+1W7iFROzm1NdXXjeWw7wR+9RtlWOBCtSnOl4mIZSc1Y2MuhWCWdyWIi9qXPEBKzw6lM6hwB1rndLPX1Yl/keJubvBa/aHqal9Y9iO/yQezCzFsTLyOf9AAopTGVwlBvGzMw1UfpGkuys72Or+WtpNE9wDt9rdLVlSWUwwHKwDbBUJew1oSYU4bHg1FVgVXgZ6zBTbNvDKcycRgWqGmPv6fuWaWRfZkyQKVzlPEmRd6N63FMJDDaurGGhnGEEiQ6/TzsXkfQF+Xlgj6KXaFzXs9Ac4W/les9ozLDKwOYygAN9QUjdKxdTKF/Pa7+ELq1I92lnZeMDT3n4jjSTd0DdTxZdh0jK2Hijsf5cvBYussSl0g5HBg+H7icpDwKp5JvkwuVUV5K921VjDVbuMtDvDO4P90liVnwDv9h2t7/Cm9eX82hlioWP1iPsWUYs62Xpl/XkCgsIOEsZKenCts8d4i1HfDjd1zFQ9d8l8szYHE7MRl8vlj9HD/5TIyOUJC2l2to+lEcu6cv3aWdU9aGHquvH9dT/bgA9/s3sPcdVSChJ/MpA9xulNuFdoD59id9YsHQAR/ji2w2rT9Eg2+INe4uZEPYzLfU6eefy3dB+S7+U/6V7HxmPQHAGhjAGBjgQp/VGB4PkYr19F6dD8gWFZniJq/FTXUvEbETXDn2Kex8H/Sku6pzy/jQU+UcYWwxBG67AtdYEseRbqw+GQiZ6Y6nQjweXsr+SBWPH15JzfDkar5mSRGRNTXEihyMLYY611CaKxVnJb1VWa3BM8Tvm01ct16JazSO2XIca2j4gq6htcbXp/nXtlt4obiDzXktvNM7KF1dGcJUmXWTZ3zo2eTp4jO3P8POzXXsbK+j7oE6XE9J6Ml0z0Ua+JdH30f5dk3NcBLvvi5SQGJxJcc+orl++T7uDHTzDt8RQMZqCZEO78nbw9gHvey7uZKdBxpZcn8t6pULDD2JBKUv9xMeKOelgkp+fds6frP526xcoIvbicyW8aGnxhGYnJZe3MLX8lbyZNl1yL2S+Y7GyyneDYGHXgM4scFovNjJzSv28N2aV6eOSOARIl2WOv38fek+KN3Hp5zX0lrUfMHdW2iNdfgonsNH8eXlMbZ4NUNXy1R3MTcyPvRM1+geYGTl5Bge12gK98EuUr19uIeS/H7/Cj5lO1gR6OauvDdpdMqHZTbZFY/z8NgVdMUKuSK/nTsDB6iU2XpppSIxAh0Gf8hfzFZvA1vLG6jxj/Jiy2IaBmXT2GyzxNfP1stWU+LYcO6TNfi6IqgDbdjh8NwXJ2bdtniSX45eSWckSLIlHxXuJRPm0mZV6Hmnr5WJOx5n7zuqeLZlOfX31eDo7cN58DiLvl9Na2Ezr1y1Ct8HE3yxsDPd5YpZ9O8DN/Dazy6joDXFi9evpPqOEd7vOPd0WTF37L4Bqp90Ub7Nj3YoUt4KWh2VNI0lcR7uRObdZZe7Cl7HvNum9Y6Sc56bsB28vGUVS+4tgyNt81CdmE2WtvlG97vY//NmCluTNPWOo3syY1hJVoWeGkdgclp68BhfMZNsL77ilFkFxYGNtL+7BJDQk00Oj5VRtj2C8coeCio30JssACT0pJMdicCBFhST45mndztL4Mk+S53+qRXwz70KfsROcOXiOrRfBitnIhtNy0gpFa9NoLfvQQOayZl4C11WhZ5L4XUmiRcX4KsoR8diWOPygbkQeQYSPL1rFe+P5p84ZqPo2FPJ0tFRZAK7EOmxLxHl/uFNtEzM3OLgssJO7incRqMzwIsx+Ong1XSGg6T252NM9Mg9m0Fei1n8ZGgT7eEixvYWUzo+kHFfYCT0AAaKEm
+IjpoKzFg1rv4QRmxh7hCb6xwHO1j6/WrGC2tnHF8yMAHHutJUlRDiobEreOIXV1P2RmLG8Qeva6Lxrn7qHP38W9e7aPvJEgqOJmjqH8HuzYwuETHp3v7ref0nawgejLNoYBR9PAMW5nkbCT1TfI4kSb8iUejAEXajTBOsTMuw2cPSBkqfOizOGhqGoWHevg/wW49XMcx5qE4I8XYd0SKK96VwPr1jxvGCyqs5nijGpo+20SJKXw+ht++RJzwZxtI2rRPFlOyKYLy86/TtZ0xu52ku4CHNWRt6Vvi6efQqSPquxjeQwv96J6meXny9CR7efgWvL6plfVEnny9+iQaHL93lCqAtGeLe4U1sHWqgtb2Mph6Z4ZNpGpyDjK1N4LznSjyjNnm7+7FkoGpOC/QkuXfHtTxd00zozWLKx/pndokohbliKaOrgsQLFHZziFIjAsj78kLwWszi3v7raZ0o5vjrVSwZGT61/ZYuYmxNMfECg/iqKKXmBCzQxWOyNvTc4T+M/z3/QdvNpTxwYAO14SrMnl5ce9pZNlaJlV/MozfWsPxDPXwyvzvd5QrglVg9v3xsM7XPxFk2EcU41ptx/cW5bq0rxP++/kH2bqjlse6VjP2gjMDR9hObiorc43m9jWXDlVjeIIuGhk7pElEuF/2biqj5WCtXF7Vyha+VekfWfjRlnJ8MbWLnT9dQuivK4pERdPvxGT9XpsnQVaUUfqKT95YdYq23g2XOhfscL2v/ZVU6Anw0bwjyhjhaX0proBmTk90jBpDXeDWDqbx0l5qTLG1jv+0RaF+ygLxjYG55A631+QUepSb343rrj6aJNjJrWfRsEjR9vNcf4b3+Q+SZMe4vup0800Tb09pa22cMQbZWJPXMljdQkzs7i4xkDQ7B4BAGp5m1pxTK4SBaovh89Qvc6ntrLOXCfEqQS956j24PF1F0II7x0hundmkpBaZJrFjxqZo/8OG8kakfLNxZXFkbesTC1ZMK8Y2ha3i+ewlanwwog10FNHScf5eWo7Ge0SsriRadvIY2FKNXxlni7p3VmsWFW+LuZXhDEm1cyfR86x2yKdzeQ6q9Y8b5/q44v37lSl5sXDzj+NUVbfx56fMskgVFs8dUl9bwZUFiQYPUuhAV5jiyIe3CsC2e5Bvd76JlpJSxvcUsGjh1ZqzZvISRy4qJBQ1Cl0epdQ4BC//LiYQeMe/2Jwv45XMbafpVDCNx8rtfSXQC1dOPdZ5dIaGV5SQ/McRnG1+ZcXyJu5cr3CFAlrJPp6s9o3z3+h/RevXMKcz3Ht1MOFyO+22hx7mnleXDFdi+kx982lQ8ffsVbPrQERY5RxDZQZkmAxuKqP3UEW4t3cdydzdLnfKEdqH45eiV7P95MxWvTVA6PnDqLC2lGLmsmILPdPKJil0sd/ew1pVgIT/heUtOhB4DjXaAcjgmH7Pbkx+0yoaY7SSqE6S0AQbYpkKbCmUYMntrliW1hY3NQCqIt8fA2LYfnTw5vfWsUccwUdO7rZRBvMDgxqoWPl94uqnqEnjSrcDw8i5fEnwz2+dgVSvbA1ec8p3eGh2D0bGZBw0T37oNjFo+QELPQjb9ffZclMtFPKi4u3zHtC4R6dJKt5RtEtdJOiNBCluT6O17Thm0rExzsksraPCJil3T3n8XfuCBHAk9VxW08vw7V5LXsIFAl0Xhq8dJHe8iryPOj7Zcy0PVlxGPO7GWJ5lYZBDc7afimB97ZDTdpWeNfivM1weu4amO5UwMBKg9kpoc23EeHLU1jF5dQ6hq5qPT8RVJLvMdm4tyhRAXaPr77LloA2IbQjQ4BwFZZmIh8PSEOPByE+sHP0myJX9ya4m3nWMuXcTQVaXEihWhy6Msd8s6PQvSnYEWFt32fXpThfzD7tvx9ZRiHO/C9foRlneXon1ujr/Lx413b+eWwj18yfcxyl/OBwk9s6Y16eFXL17F4p9HqBkfg/4hrFTq3C8E4ovKGPvwBH+98skZx6sdI6
x1hZCprUKk3/T32XMxsWlwDrLGZSGhZ4FoOcbiH0Sx87yTm4f29J8SesbWFBP8ZAefrH6FWudQxnRpTZcToafE9HOT1wKG+I+yfsZ9tZMzCcbHYXwcDBPnxg2s8HXzTu8EvoIo2ik34myI6yQxnaLXKsXTb2DsPoIViZz7hUqhHE6UaRDLc7CqvH1yNt4pJPBkJaVQLhfK4cB2KgwlU94Xuunvs+fHRALP3Hrr/ddE4VZOnOrM/73tSAS7tf3UH7x1L5om8QKD95cdnOqSNMi0wAM5EnpEegxaYf6h/3oeb1lBcthDzQELnTy/pzuO6iqGr6tlvN4gUp/iI8FDc1ytWEjMFUvp21xErFiRWhdiuVvW0hLiQnSkQnyt+zZebl1EXiDKf1r6PB/Pu/BZrebSRfRfU0q0TBFfFWWtp+PcL1rAJPSIOdNtmfz2lctZen8IY2wAhkaxkolzvxBI1pUwdmeIr697mGIjzDJnHHmqkztGVwWp/9gRPlv1IhXm+NTMHhnoKsT5OpgI8ofnV7H4pyOEFxVw359t5qOrHr7g64Saiwj+8XH+of5pSs2JqYUHM+8Jz1tyLvQ4DAvLbeDOy4NkEjs+uRiWYcFgKo8+K04y4QAtG45eqqQ2cI4bGC2dk12Jb5nWdYHW6EQC/bbxPbbbpDY4xLt9MSYfgUvgySUpr+KqYPvUYnWydosQFyqi3bhHFLqlDb9jEW1jAXqsKCMpP+p0E5OndWO99WcMg1iByTVFx6buxcz/4pFzoeedxQf45/c14Fm/mvw2TclzHaS6uilsifPjx2/kB8XXUrjLiTE0IBvizRGzKMj4jUsYWmHiGofKP0zAjr2yVYEQQswBY2gc74t1XD/2FzgHHdS3xE55v3VUVzG2sYZQpYnthJQXtBOSSyNsDBxJU+WzL+dCz0fyj7Dxpv/DuHbzhTc+SuHhYjjehXP7IZYcKQSHiQ5HsUZGzmu9CXERioN03Wzz7zfdz6Mj63gttJ7inQZoWRdJCCFmm9XTS+Uv4lQ96oWUhT1y6grLqZpiut5lc+3qA5S7J7gy0EqFY4xiI0q9Q5HJXVrT5dyneoHhZd3U0/Ky/BC2I4gB2OEwdjg841zlcGAkoStZxPHUYfIMk4Byyz5A58lEYztBBQswzZOzBqxCH97iKNd5JujNa+Nl7+UoQ53vsj0iSymHA+X1olxOUh5wnvYZvBDiQulUCmtgAAbe9gPDxPB6UC4XkaAbb1GUK/OPUesaYpOnjzLTT7Yt9JpzoedC6FSK0t0J/vWR9/GPQYsrVx3lf9b9hhqH7AF0PsrNJM1XtbHP0YCROLmacqrQ4u5F2846fVLkHrViMd3vKCJSqfEuG+UKX2u6SxIiqznqa+i9uYrxJrAqEvzRor2s97ZRaMTxZen7s4Ses9CpFO6X97P4zQDkB3jjT5dyvMpLjfxXOy9lpo/vNj7MQL0De9rGom5lUWGCU8ngZHHSxJIClnzwMP+19lHyVIoqhxtwprssIbJWojqIfs8w/7H6x/iMFKWGwmc4MXBl7ZdS+fg+BzsSgUgEIxLFEakgIYtpnTdTGVQ6AlS+7XhSWwxaUQ4kIhyJlWMm9OSeaIaJmR8At5togYMix/nvuC4yh9tIkQgozPKZG5HGggYr83tY48qOsQNCpJNTpUh5wSwrRSdOv1RItMRFXUEP69xucmWWpIQeMe/aUjG+0PLHtO+pwj1oULM3AtrGUVNN7+21jC7XmBVR/qT8jXSXKubAjXkHeOK9zRxa0zTjeH7jCNcFDqapKiGyyxLnEHXXdXCgpBplnX4He7M8995nJfSIedeSLKb7xRqW/6AdHYliRyJorbEqgsRvGefR9ffiN2xKDBfZsC6EmOlGb4jH199L5LKZb8R5SlNkSpeWELNhkcPLA0t/wcRijcXpQ49H6Zx7n83p0JPnijNa4iKvphodjWKPjZ+ySJ6YfUntwBEDa3AIHT+5CKTtMMjzRljq9MgMuSzmVk5qHB
JshJhLpjIoM/2UyYiMGXL6k+Wu8h0MfizC/r+tpufDyzErK9JdkhBCCCHmSE4/6flQXg+3X/VdJmzNe8o+h7W1EDqPp7ssIYQQQsyBjAk9yuHACAZReX7CVT7yPUMYl/igyq2cuE0nBYaFz50ElRuj14UQQohclDGhxywppucDTYxckSRQFObL9a9hnGFwlhBCCCHE22VM6NHBfMY2xdhy/TcpMEx8yoWZpYsnCSGEEGL2ZUzoQSlMh02R4SBgzO7iZQaKQm+UUH2Q/NAS1HgYq38QnTz9gk7i0vhUnHiRRi1vwkie3F9prNZL0PP2zWGEEEKI2ZE5oWcOmcrg0zUv840/uYnDowH820upedggJYOa58Ry1wjX37ibFxsXoe2T60MUFw7wyeo/yHR1IYQQc0JCz5Q/Cgzx3jUPEtJJrnN8HvvZAHSmu6rsVOcI8K2aF0hWPzvjuIGBWznI8ZUUhBBCzJEFHXr8KkGsxMZau5houYfC/OFLnrF1JqYyJscJaYVp2qBkkPRccisnbiUL1AkhhJg/Czr0NDljfPi6V3iysZmAe5wv1b809SRACCGEEOLCLOgEUWb6+e9lu/ha2U4MDAyUjPcQQgghxEVZ0KEHJrudzHke41GWH2J0RSX53jU4+sexOrvn9e8XQgghxOyTxyZv48Dkzxuepfhzxxj4mzgdd1VhlpWkuywhhBBCXCIJPW9jKoP3+0P8ZsljPH3ZfUTWRNF+b7rLEkIIIcQlWvDdW+liKgOPMqkoGWPkijLcI0XEqpLkqzgge3QJIYQQmUZCz1l4lYu/WvQ0P//iBkJJN18t3U+9wzr3C4UQQgix4EjoOYu3urre3/jctKO+tNUjhBBCiIsnY3qEEEIIkRMk9AghhBAiJ0joEUIIIUROkNAjhBBCiJygtNbnf7JSA8CxuStHnEa91rp0ti8qbZk20p7ZQ9oyu8x6e0pbps0Z2/KCQo8QQgghRKaS7i0hhBBC5AQJPUIIIYTICVkdepRSWin1P6b9+S+VUv8tjSWJS6CUWqaU2jXtf+NKqa+kuy4hhMgGSilr6r11n1LqTaXU/6WUyqqckO0rMseBO5VS/6i1Hkx3MeLSaK0PAesAlFIm0AX8Op01CSFEFolqrdcBKKXKgAeBfOBr6SxqNmVVgjuNFPA94C/SXYiYdTcBR7XWMjMiQyml/lYpdVgp9bJS6mdKqb9Md03i4iil/Eqpx6aeDuxVSn0o3TWJS6O17gc+C3xJKaXSXc9syfYnPQD/B9itlPqXdBciZtWHgZ+luwhxcZRSlzPZhuuYfB96HdiZzprEJbkV6NZavxtAKVWQ5nrELNBat049VS8D+tJdz2zI9ic9aK3HgR8D/yndtYjZoZRyAe8F/iPdtYiLdi3wa611ZOoe/W26CxKXZA9ws1Lqn5VS12qtx9JdkBCnk/WhZ8o3gE8D/jTXIWbHbcDrWuus+OYhRKbTWh8G1jMZfv5BKfVf01ySmAVKqSbAAvrTXctsyYnQo7UeBh5iMviIzPcRpGsr070IvF8p5VVK5QF3pLsgcfGUUlVARGv9E+DrTAYgkcGUUqXAd4Bv6SxaxTgXxvS85X8AX0p3EeLSKKX8wM3A59Jdi7h4WuvXlVK/AN5k8lvk9jSXJC7NauDrSikbSAJfSHM94uJ4lVK7ACeTE4EeAP5nWiuaZbINhRAi7abWzwpprf813bUIIbJXTnRvCSGEEELIkx4hhBBC5AR50iOEEEKInCChRwghhBA5QUKPEEIIIXKChB4hhBBC5IQLWqenpMjUDbXOuapFnEZ7Z5LBYWvWN3uTtkyPnbvjg1rr0tm+rrTn/JN7M7vMxb0pbZkeZ2vLCwo9DbVOtj1VOztVifOy4ZbOObmutGV6mJVH5mRXeGnP+Sf3ZnaZi3tT2jI9ztaW0r0lhBBCiJwgoUcIIYQQOUFCjxBCCCFygoQeIYQQQuQECT1CCCGEyAkSeoQQQgiREyT0CCGEECInSOgRQg
ghRE6Q0COEEEKInCChRwghhBA54YK2oRBCCCFE7rC0TVQnsNAzjjsxcSsHpsqsZycSeoQQQghxWk9GffzjkbvoHSo4eVBp1td18vc1v6PZ5UtfcRdBQo8QQgghTuvhwStJPljO0q0DJw8ain0fWMb2e96k2TWYvuIugoQeMW9CdoyYtgDIM1y4lfOCr5HUFhGdIK5t3MogoNwZ93hVCCHm2lvdUjFtYV/CdTpCQQJdCaxDR04eNEzcwyVEbPcl1znfJPSIeXE8FeJvum7n5cOLcXmTfHbFH/hysAWnMi/oOs9HPfzXwx+ir6eQ+tpB/t/Fv2KzZ46KFkKIDNWRivDXx9/L1sONYKuLvo6/xUVd9wDWLNaWThJ6xLw4lvLx6ksrWX7/EPHKPH705av4/BUHLzj0PDq6DvuhUlZs6ab7tmqe+dwqNnv2z1HVQgiRmQ4li3nj+WU0/7gflUxd/IWiMayR0VmrK90k9Ig5NWZHmbAtWhJNuIcVuv04bquKibF82lIWRUaIAsOFz3Cd8RpJbTFmx4hoTXu4GF9filTbMbzDlYSszHu8KoQQc8HSNuN2jLC2OZqoxTOksNs60clEuktbMCT0iDnTb4X5Ssd7eHXvYpwjDqreTKKTKdToOMVbSrlj7CsYJXG+tPYFvlzYesaxOc9HPfz1gY8w0hHE325Se2wwax61CiHEbOmyInz12PvZua8J15BJ9Z442pJ3y+kk9Ig5051ysHXrMpq/O4gam8CeCGEnE1gDQ5T+Kk7ZE16iK6v5+Vev4IuFRzlTR9cLE82oR4ppfvIYOh7HHpuY199DCCEyQXsqwO4/LKH5+70wEUZPhLBtCT3TSegRc0ppwLLQqRToqcWtbAtrdAxGx3BVFDOYcGJjwxliT8R24Z7QpLq6561uIXKFpW2G7ChDlsKlbEpMkwLDOyd/V8iOMWCliGmDQsOmxPRe8Lg+cQ42KMuefMJjn2belmFi5gfA7YZUCns8dEr3l3K6MAJ+cJ1+hq0yDFJ+hVNlXqCS0CPmTLmZZP1VLWz3LsI1XEHVS0ncz+2W/mUhFpCOVIQvtt7Nwd11aH+Kj16xla+V7pr1MGJpm++NruBbO29ADbuoXtHHvy97kJWuuQlYuajBEWL55jb25tdPdm9tieN4YRdMe9rjqK6k9/ZaRpdrPP0GtU+Pwc59M65jLG3k+K3FhGtPP9ldK6hc1ss6Twdw5vGYC5GEHjFnykwf367/LWO1mmcjS/m3ifdT+5JDQo8QC0inFeDoK/U0f+84yeoifh1cw9+U7Jz10GOj+dXxdTQ8oPDua6f9k03sa6xgpWtsVv+eXFZt+riv6ZeMNWh+M7GGH43cSuVLJnpa6LEqgsRvGefR9ffyr73v4kDbKvJ2zrxOpCGfhve28q8Nvzzt32OiyTMUQSPz1guR0CPmjKkMSkw/JSZ0ubuJBzWqphLHRBh7fAI7HEZZFuGQlx1xk0IjSo2DOXu0LoSYZGmbHivCoOVkR6QZ16jCHhjC4feSTHinuptnXzzlIDCWwBoYxD3SyM5wI4uc2yg1E1RKV9clm/6eu8LTRTwIZm0VJJInzgmVeynP72Wp00ONd4S9zlPX8LGdikrvGEud/vksf15I6BHzosER4vJrD7G1eBHOoTJqnkvifOYNjP4RSp8s5E+OfZFEWYo/u/o5/qroaLrLFSKr9VgRPnv0bg7vqMc9oqjcEUMnElz8EnYXRlsWJbsjPPbzTfyq6Goq1/Xy/eU/ycoP2XRZ4hxi+Q1HebOibrI/aoq7KMpXq3fk7Er2EnrEvKg2fXy3/jEitRa/CS3j2/3vo+p5k1RfP8FfTVDkcBC/cgm/q1vDV4MtOXtDCjEfBi0nLdvqWfatTuzRMXQsPjnZYL5ojbH9ALV73Bh+H62fW0T7okKWOpPnfq04Lw0OHz9a9AiRxpmDjU2lKDBcwIVvA5QNJPSIeWEqgwLlpcCACucY2gEYCrTGjkQAcI7G6Bgq4NmoG1
OdfLy+b7QSMz75Z8PjwSguQnvdRIsNChzRdPw6Yo4NWmFaUy4m7HOPGTDR1DrGqXOcuXskqS06UlE6U/k4VYp6R4QaR2C2y84YFgojBXpsHHsiPUtA6GRicnxfMomRgCQmIKFntkx/zxUnSegRC4bZPUTJow18Zf+fMv05u69XU3Zwau+X5U20vq+QeGOcxprj3BjYD8hdnW1+OLaGb790E97uc79F2U5NyYY+Hmj+MY3O0weZPivKZw5/lO7XqrA8cP21e/hmzXNnXQlcCJF9JPSIBSPV20fBL0coNE1Q01KPZWFNDcSL1Aa47F0H+Le6R/EoE6+SD61s9ExfM42/tHC/uu+c56qAn6PGIrqX+mg8wxP7PsvF8Z1VLP12G3ZxIS9ULiFe/TS+DJtuK4S4NBJ6xLzzqTjxIo1a3oQRjsHg8ORihVqj43H0WV6rTchzxigxZcBjNhixIuxPehiw8mccbxsoonE8gR0On/MaBuAeUTw2to5R+zBNjmEWOyf3ZGtLxWhJFrMjvAL3iMIOhTG8HuyknyQaS9syfkzkrIAZI16ocDTUQTyBPTKKHYulu6w5JaFHzLvlrhGuv3E3LzYuItmXT/1jxbie3nlyxWaRM34VauIftryXQOvMt6KSYzaOznbOZ2itjsepeDXME/FreKTwGkqu7eHB5gcwgS+0/DHdL9bgHoXK1yPoWHzyRZZi1Aa3iuPDJVOlRU5a6+ngB9dOcKSsGl+vovL5Qdh/ON1lzSkJPWLe1TkCfKvmBZLVz/LDsWXcd+jdlD9rzu/sEbEgvDK2mJqnFf7H3rY6mmWROs+NEnUqhXptL+U7HZjBQg4VNDG8zIETm/a9VSy/7xjWwCA6mZpcpE1rsBUR20FEJXEaloQekZOWOEf42PLt7K6qZntLA8X7A5j7013V3JLQI9LCrZy4lZNa1xCRSg2XNWOG49AzgDUyku7yxBwatMK8Hi+kOxVkR28tJaMpdDx+Xq81i4ugohTtNDFGJrC6+yZnANkWOm6hI1E8Q4qfjmzEqSzcAwY6HJl5/anxYk5lY6r5WplGiIXHo6DSOcqYz8ubvmpsp+uMGz9nCwk9Iq3Wunq56ZY3+MOqRsZ7Cmn8dT7O30tXVzZ7PFzP156/k4L9DgJ9Np6W4+fVjYVhEr56MR3v03iDUZx/qKXm50lSPb0nTrGjMaq2hHlm+GpQULM3cmJJhBkcmjzDosBw48j6t3khTq/AcLHJ28pydzf7yioJeWuyfvUeCT0irRqdAb5Z9Qqpqpf4t5Hl/GLnuyhOd1FiTr0ebqByi0H+w9vQtiZln183ljIUo4scfO3ah7nd38bV0S+hH/NDz8lzdDKBem03JVunBidrG/22AK0NhTI0PqVwq2x/ixfizHyGi2bX5AzGxXkD7HTVprmiuSehR6SdqQxMDJzKYt7WwRfzatAK80qslNZEGVuOL6ZgzDrvMVxmSTG6ppxUwEWkSlPqGMetDJTBzKUN3qI16JlBSjkcmDVVpMoKmKj2URAcwymztuaVgaI8MMFAczGF5hrMoRB2R1e6yxJTDKVPfz9lGQk9Qog591y0iv/8zIco2mXiH7Lx7+8+vy4tpYhsaOLYBzVV1cPcWb6Lta5BuMAuKaMgn647arBvGqEyr4s/rX4Nn6zxNK9MZfCFmue599PX0zVRQPjVChp/msLu7U93aSKHSOgRQsy5fdEayl4zKfzJNtA2qfMds6UMxuscfHXj4/xZYRsApgowZl/Y9iPK62VsZYot6++l2vRNrc0jY3nm27t9MW5d/ARRnWBD7DPYj/qh99yvE2K2SOgRGcU5bvFC2xK+5h5nkbuPm33tVObwHkoL2aAV5rloFfuiNTzStobCEQvOc/zOCdrGM6L5eeflDKZOtnMo5UZ1elDxoTO+VDkcmHU1JKqDjJW7CJRP4FNKFiNMM1MZODFRSiYriPknoUdkFM+BLip/VMuTxdcxtE6TvP0RPl0gXxUXoldipfznZz
40+YRnxCKwu+f8urSm05rCnX2E4qU8mVd+4rCyNY3tUeyBM4ceI+Cn+/YqnLcNUJc/wpfKdpIne20JkdMk9IiMkurpxd3TixswUhs5eGMlSOhZkFoTZRTtmurSsq0LDzxTrCNteI+04T3Nz+yzvdDtZnyxzROr7mep861tS2S2lhC5bEGEnrhO8nw0wPMTzcTtkyUFzDjvyt/DZrfsjyNOJU/HFz6lAX3maGKWFJNaUkOyYOYTGPdgFONwB9b4+BxXKITIJQsi9PRZcf5y9yfx/i4fV/jkJ1m0WPGHO5v4XfMvCChPGisUQsyF5PJajnzSweXL2mYcf2PnYpbeVwW7JfQIIWbPggg9Ydsg1pZH3aNHsAYGThwvWNzIwatKiC23kKGqQmSfeImLd63Zw3drXp1x/MaYn1R+EHm+K4SYTWkLPXGd5KlIAU+PrqZlvBR/pwHJRLrKEWnSb4X51cRSto838urxBkoGJmf3mMEgqeY6YmXuGee7h5I4Dx7HGhjAM5zi1wfWMZgIsDavk7vy9lIjM7nSKqktBq0oE1rRHivBSF7cdZoL+3j5qmoKSzac81xlga9jAg61YcdiF/cXCiFyQtpCz5id4O/2vQ/fwwV4By2qjw1gh8LpKkekyevxIv7lmfdQ85ymaiyFu6WLlNbYTVW03OPmnZftw1Anx4T8fv8KFn2/GmNgAO/uThq/U0VrYTMvb15F6Qcm+GjemWfziLkXsuNsjVewN1rD9oE6nOGzDjU+o8+XvkD5PeN0xwvOee5Y0svuJ5fT+P0R7B4Z1C6EOLO0hZ6Y1oS68ql9voNUVzcXuHqHyBK9qQLyW0x8j25Dp1InZvik8t0sXdLNvbV/mHH+p2wHrYXNeIBUbx9Gbx8eIFh8NccTRYCEnnRKoulKBjkaKWUk5KMkeXGjzde4PKwp3X9e5/ZbYa5qaAKP+9wnCyFy2oIY0zOdcrth9RImGv1Eykwaa47jUbJyqhCZwK0MGlwDRAIujuaVYLlO7W40S0uJr6kjWupkaLXiTn/Paa4khBCzb8GFHqOwgPbb8rn63btp9A1yY2A/XtkjR4iMEFBurnIPscI5SH9pPi/4K045J7WkirZ7NB9cvZVFnn5uDxwCmaoghJgHaQs9lgY0YM/s81cuF9GaJP9Q9eTU9gIyf0O8zek2AtZgafm3km6mMigx/ZSYUO0eQZ/mHSaZ72J9Uxtfr3hj6ogEnlxjne/eayJt7Cx9P53X0JPUFr8MlfDz3g10TRSQf8hEJxInurRGlwWIFRvUN3bjkcUIxWmszevk5c2rCBZdjb8vhfeNY1h9/QR6kty741qerVvG1SVtfLboVepkJlfWsrTNI+FCftp7Fd2hAgIHXOjIhW1CKubfzniCR8fX0REtInE0HxXpQ+LPwuPpi/PsjlXcNFHM6mA3nyt+kWaXL91lzYp5DT0hO84/HriVvJ/kE+yJ4+zpxhobxyjI59jN+VzzgTdY5uvjGv8hAoYMShSnuitvL0XvD9Fxewk/fPNqFo1Xovr68bzexrLhSlKBEh66pZo1H+ygLjCW7nLFHInqBP969GbUj0rJOx6jqLcHe3g03WWJs7C0zUOjG/jlsxvx9RhUHkmBtNmC5DjQzrJ7q0kVFPP7zTXU/fEwzUWt6S5rVszvkx4040N+anb2kmo7dnIvHoeDWLnN/1X+zNQeOSf3x7G0jX2W7wIGsmtyJnqrXRPaMXM7CaVAGWhDnXYX5hpHgHvyB4FBdtTVMx6oxQlYg0MwOIRpmASWb2A4FQAk9GQrC83AcD5LXx/AOnx05uzPqX9DyjDQCkx5lrBgdEUL8XcaFLSn8HVF0LF4uksS02gFGCbWeAh2H8IA8quvpC+Zn+7SZs2CG8g83aAV5pvDG3iqqxnLPn2wKQ9M8IWa53m3TxYlyxQ9qRDfGLqG57uXMNBbQF17Cm1rzJJiIhuaGK9zMNEAHytuO+e1RG4ZsSJ0W4peK4AVcpw6JtDpQl+2jO
FVAeJBReWyXvKM0w0CE+kwmvDiGdZ4e6IYo2HsVAqlpH0WgtW+4/zu6itIBDaAAm1OhqCRNRZrfJ3pLm/WLOjQ05py8cCL17DooQRm7PR7NA80F3Pvp6/n1sVPyBOfDLE/WcAvn9tI069iFE+EUN0DWLaFrinn2Ac1X934OFXOEa70dCODXMV0nZbBc+HldMaKcIw4UKmZK3wZXg8dN+bxzru3sdp3nHWeDoKG7Nu3ENhohqI+8jrjmAePoROJk2M6Rdrd7D+CdctvabmuHLeRosgRxmfEqXUNcZV7CPCnu8RZsXBCjw0xbRLXJ9etH7Dy8PSaOHYewo5ETvuyAsda+iLywZhJJmwv3j4DY8cB7PjJx9uW10lF1Qh/Vtg2FWDP3q6GsrFNhXI40LYGe/IDUGmI2G7iOomBgVPWeUoLE41tTs7IxLLQlgVTs3ZStnne7WNpmxQWltYMWz664kE6o0HMBGBNPelRCmWa4HYTL9b8SfHLrHF5AFnuYiFJWSaOcBJrXDaSXWjKTTe3+w8T8bXgU5pS041bvTXUJDsCDyyU0BONUbRXcWf5F3B7Tu6/FRnzUnV46s3ydJQ6/fRlkROuLmrlO7csIrBsA/kdFvmvtJPq6ye/Pcm3ttzMD6s2cm1NK39V9gyNTgnG822Zu5vRTXEs7+V4BjXFW/uxDh/F0xPiwMtNrB/8JEtKBvkvtY+z0XPm4PNIuJBvtr+DgQk/0QkPasSJGVMUH9DoSGQy9K5vZuCyyS6tktV9FBmnfzIshDg9AwO/MjCx8SgDI0uXi1kQoccKhSl7vJWSbUEwT6YYlRyHgRGsxGk2IjVMlKGwDYVxmgGvIvt9omA3a9/TwYCVz9d23oG/swx6+/BuO8ry9mLsgJtn77yMW+/eTaPz9E8KxdzZ5JnggWu+T/tVpdzXuZmJcCW+w0eh5RiLfxDFzvNy7JpF/PYzl7HRs/u017C0zb3HryXxgwrqDo6hUiFIpibH8oyFsMfGUS4XPZvzuPFj29iU18JyVx/lpneef1shMptTmeQbHgLorJ4gNK+hxwCUqdEeF4bHg06l0KkU2Bap3j7o7Tvr65XDgXI4wDBOPM5OukxMCT05qcz08y5fEhji+6UjWO4gBmCNjMDICMrhwHPtBiZsLyChZ74FDA+bPbDZM8Th0qM86a/GB9iRCHZrOwD5dRvoiBYxZp9+jZ2ktukez6fqaAj7zQMzf2iYKKcD5fcTK9L8UXAbmz0GIGN4zoc2Qfm8qHgcnUyd6B4WuctUBtk+GGBeQ49HmWxc0sqOjy/HPVxK6e4E7pf3n3G8znRmMEjo+iUML3OgDU50a0UrLT5RcShrU6kQ2cx7PMz2Z5u5vKnxtD/XGny7vJhDXczosDJM2LCS3isDxINQdEUfFWYEGfh+fkrNBGXr+mj93CLcI1C2I4yxbV+6yxJizs1r6PEqF/9U+ztaPvQiRxPl/Osj72PxmwE4j9BDSZDOW+Cfb/op+cbJ6el5RpQlzijZNNBKiFyhDrWx+PvF6DPtkK41KtyH1T8483VOB71XBrj5U69yU/5+6h0j1DmkS+t8VZpe7mt+gPbFhbww0cxj5jVU7ZJB3yL7zWvoMZVBnSNAncNimfMw/xi0ID+AcR7Lx9sFPjwlUd7jG8BnTL85DSTwZDClUC4XyuUi6TFxGva5X3MaLsMi6XPgyMuDZHLGrDCRfk5lYbnAyMsDy8KOxcG2Jru6jp1/16NyOCb/vfj9xINwU/5+bvXFgexYIn++OJXJUqefpc4kptrHI4FrJieGWDappEmflSKpI/gM57QZPEJkvrQNZM4zTK5cdZQ3/nQpjsipOzG/XaLQ5o6mHTL9OMuYZaWM3tjEyFKDWGWKP63cf1FdlbeU7+ebd1bhuXo1hS02Rc+1Ta7SLBaEK3xt/PjmDYRrV+Pt1VRsGcQ60HJhF1EKvb6Zns15xIo0RVf0Ue8YQQLP7FFjEwT/UMo7Y3+BrzjCl5tf4E8LOm
X4gMgaaQs9AeXmf9b9huNVXhLnMXTKo5I0OBI4lTzVySa6tIje2xN8Z9MDFJth6h1JLubJ3acK9vKOWw4wZPv43NaPU3CgBCT0LBjXeSb49cbvMrTBy//X/m5CPdV4D5z7ddMp02TgsgA3fmwbfxTcRoUZkS6tWZbqH6T8lykqnvITXlHOA//pKj69uiPrB7eK3JG20GMqgxpHgJrzrsDJ9D25ROZyqhSWC8xgIYmgl+KiEDd6YziVi4tdTC5o+giakNQJyovGSQaDuINBLM/k3yfSy2e4WDnVtL8r7GZLYR2BYPDCLuJwEA8qNuW1TM3SkkHLs862Tuxj5wnmMRxzY2PDJcSeuE4ybMWJaIjEnRRPbR2inC4Mrwfl92G7wInMHhNzb0Gs0yNyS5NjmJJrezhU0IRVmOLTdbsxZmmVSQPFe6v38J2PXo9jaDElq/tY7upDpjEvHDfkH+B3717FSPPyC3uhgbRnBvrZRDX/+OatJAe8FO4zMPs7SSmFvWEFndf5iBfb1K7tosE5iozPFHNNQo+Yd4udbh5sfoDhZQ6c2FQ5FKaanW4KUxl8PribD7xzFzFtUmSkZKG6BeZGzzhPXPXvjF15YU9uDaWlPTPQw72XU/6gh7xtx9CxONbExGRX5RovH/jQS9xduF3aVcwbCT1i3jmVOdm1OUfXLzC8FMi4ywXLZ7hoNGR69ELhVBYpr8YoK0FNeNChMHYshrJtohE3BxI2BUaIUtNB4CI2bw0nXbiHEqR6ek8cUw4HllvR7O2e2iNNiPkhoUcIIXJYgyPE8s1t7M2vxzVkUr0ljuOFXRj9IxT+vpG7ur6CXR7nLy5/li8Hj6W7XCEuiYQeIYTIYdWmj/uafslYg+Y3E2v40citVL5kkuoboOThMKVuF9HLG/lV5Tr+rLBNpq+LjCahRwghcpipDEpMPyUmNLn7sV2AoSBpYU9MwAS4hyppH8lne1zjVCc3gPapFLUOg4DhIWTH6LYsJuyTY7UsFIMhP5XW5P6Ihs+HkZ8HXg/JwORSJELMJwk9QgghzsrsHSH/8To+0fIl9LSJllZtjP/nysf4ZH4/j4Yr+bsd78c4NnOMTsERcB6fnLGVunwZHe/ykChL0by0nRWuXmRxSTGfJPQIIYQ4q1RXN8UPjVDimPmRMfHOZp5uWskn8/t5fmw55b9xU/DkzI1LdSpFKhYHZTCy3MOH3/sinw9uxWeYBJQMYhbzS0KPEEIIAHwqTqJAYzTVoSIx7OHRyS4urbFPszG0azTFoeFSnq0w2TtciXs0hTU+fvqLGya2CZXOUSodsrCkSA8JPUIIIQBY4hzhyhsP8EptE0Z/MfVPlOF47nXQ+rTne9oGMX9ZxZe3fQ5ft6bwSA+y/rlYyCT0CCGEAKDRGeB7dU8Tr03xw7FVPNB6K+VbTHTq9FEm1d5BUXcvxQ4HWBapeHyeKxbiwkjoEUIIcYLPcOHDRb1rkFgZqOWLMab2ywJQ0Tj24PCJbi8dj6PfFnYMvx+jrATtdZ88qBSxEkWeGZuvX0WIU0joEUIIcYp17m6uvGUvry1vYPqULft4MU2PBFF/2HXG11prFnPkfT5U/f+/vTuPjrO+7z3+/j3PLNJo3yzJWi3v2BjbBBuzhZgkkBNSIFs5TUuT25M0vfe0oTntbXvbnjbt7b1tki73tCVr23DSJmlCILQQEgI4rMUGvIBZvUm2JVn7PprteX73DwmwjAReJD2amc/rHB08j58Zvjo/PzOf+W3PafOAjOVdTS+zNXoCrdiSoCj0iIjIW6wMF/ON5odJN828+/kXenfw+P7tlD0593PHmwu5/rq9/GX9ozOOh41L1CjwSHAUekREZFZREyZqZt4YtiE6xHijQ/mWDXM+b7zRoalg8Lzu1SWykBR6RETkrO0seoW9Nzezf0fDnOdsrHmZ64sPAtE5zxEJgkKPiIictU2RAv6peRd+sz/nOQ4OYaPAI0uPQo+IiJyTsHEBN+gyRM6Zbp
crIiIieUGhR0RERPKCQo+IiIjkBYUeERERyQsKPSIiIpIXFHpEREQkLyj0iIiISF5Q6BEREZG8oNAjIiIiecFYa8/+ZGP6gI6FK0dm0WKtrZnvF1VbBkbtmTvUlrll3ttTbRmYOdvynEKPiIiISLbS8JaIiIjkBYUeERERyQsKPZIVjDFNxphjxpjK6ccV049bAy5NJK8ZY1qNMQfPOPanxpjfCaomOT/GmCpjzP7pn1PGmM7THkeCrm8+hIIuQORsWGtPGGO+Avwl8Jnp/37dWtseaGEiIjnCWjsAbIap4AqMW2u/HGRN8y3ne3qMMbcZY543xhwwxnw76HrkgvwtcLkx5nbgKiCnLsZ8Y4z5vDHm4PTP7UHXIyK5L6d7eowxG4A/Aq6w1va/PjQi2clamzbG/C7wE+D91tp00DXJ+THGXAp8CtgOGGC3MeZRa+2+YCsTkVyW6z09O4EfWGv7Aay1gwHXIxfuA0A3sDHoQuSCXAXcY62dsNaOA3cDVwdck5yfufY90X4osuTkeuiRHGKM2Qy8D7gc+G1jTH2wFYkIMABUnHGsEugPoBaRt5XroecR4GPGmCoADW9lL2OMAb4C3G6tPQ58Cc3pyWaPAzcbY2LGmCLgluljkmWme+q6jTE74Y332RuAJwItTGQWOR16rLUvAn8BPGqMOQD8TcAlyfn7NHDcWvuz6cd3AOuNMe8OsCY5T9bavcC3gD3AbuCbms+T1W4D/tgYs5+pL5tfsNYeCbYkkbfSbShEREQkL+R0T4+IiIjI6xR6REREJC8o9IiIiEheUOgRERGRvKDQIyIiInlBoUdERETywjnde6u60rWtTeGFqkVm0X4iTf+gZ+b7ddWWwXju+WS/tbZmvl9X7bn4dG3mloW4NtWWwXi7tjyn0NPaFGbPT5vmpyo5K9uuP7Egr6u2DIZbf7hjIV5X7bn4dG3mloW4NtWWwXi7ttTwloiIiOQFhR4RERHJCwo9IiIikhcUekRERCQvKPSIiIhIXlDoERERkbyg0CMiIiJ5QaFHRERE8oJCj4iIiOQFhR4RERHJCwo9IiIikhcUekRERCQvKPSIiIhIXlDoERERkbyg0CMiIiJ5QaFHRERE8oJCj4iIiOQFhR4RERHJCwo9IiIikhcUekRERCQvhIIuYDHE/RT9fooJf2bGK3F8atwoURMOqDIRERFZLHkReh6IV/OH+28ic6JoxvHS1UP8/cbvcmVBQIWJiIjIosmL0PPwyEVU3FNExUNHZhzv/thqdret4sqCowFVJiIiIotlyYUez/p0e3H6vTAe5o3jYeOz3PWodove5tmzm/TCRMZ8vL6+GcfD8VWkrXvBNcv8SNo0XZkkg36EmMmwPGQocwqDLkveQdp6dHuT9HmROc9Re4rIUrDkQk+3F+czRz7Oa8+24KTfPJ4u9/jo5c/wf2qfJWwUVHLRs0mX//78p5l8uZx0XYrf2/4TPlveGXRZ8g4Op5N8+pXb6Nlfi/FnPyddr/YUkeAtudDT74U5tKeFtf9wAjsy+sZxf20L99Vs5AvLdiv05Kj9iRbMwxWs+teXiV+xih+3XawPySxwNFPJ4BN1rP7aIUgmZz0nfvVataeIBG7JhR4Pg5MBOzKKN/pm6HEnkniZQnzm+CopWS9tXUIJizc8TCjukcwsuX+eMou0DeGmwB8dxc4ReiLDaY4OVPFwo0u5M0lbKEOFG1vkSmW+xf0UHZkMI36UMidJU8ih2Jl7ZUi/N8HRTIS4H6XWHactHNbqWVlU+lQRkQUXPt5P0T1N/Oa+X2eyOc0fXHU/nynrCrosuUCPJUr4nwc/zER7GYXNY/zVph/ywVhi1nM96/P3g9v49mNXEe13CW8Z4l8uuZNLo4tctOQ1bU4oIgsuc7KTiu/vpeWv99N0v+Hng2uDLknmwd54K/bJClb++yTuY2XsmVg557k+lp92rmfl91Os+H8vYp+s4FCqdhGrFcmmnp50hvRQlHsnGqgLDb
M2PEJjqDjoquQCxf0Ur6UtxzMVPDG4ilDcBl2SXAATjuAuq8aWFmGSaWz/4NQwtbXYZBILuAlLyte8vGw14k/yajpEV6aCJwZWEh2yhEYmcRMxEv7bD1V5voObyOCPTxAdsvxsaAMxZy+t4UHWhl0NdS0RQ16cl9IF9Hml1LkjrI+kcmblZfaEnv4hmh+o5i+P/iKTtZabdu7mi3XP4hp1VmWz51Mun3z2k4T2llAwYKk5MIRvFXyylbu8lhMfbSJ+aRzbU0Drf1YReuS5oMuSefToZBW3P3UrRS8WEB2wVO8dhd4BImOVTHpnF1qs51H93Cj7QhfzdMUmQtuH+M7mf2ZDRKFnKbh7vI3//egvUHw0xPi6FF++6vt8pHj0nZ+YBbIm9HhDQxQ+sJeGB13slrU8smYNmbrduBqhy2qvpWqJPlXC8m8cwKbS+Jn0Oz9Jliy/ogTvihGe3vZ1vj60lbtf3EnVLgMKsjljX7yVqp9HqfrOc+BbrOfh+R7h+CqS/ll+pFiLPfAKyw66OEWFtDsbOLGxnA2R2SfCy+J6amQVjQ8aiu5/juGPbWHPljY+Urw/6LLmRdaEHgCbyUAmg0l6eNa88xNkyfNxMB5TQx+ZTNDlyHkocSaZrLHYS9Yw0RAjlUxy9/hqnhhYSXhiKuw4sRhmeS1+SYzRlhAbCsYCrlrOl//6CtszVuqZcw22vof1PWzCwfjM2IxWFl+/N8HeZDldmQqePdVE9XAGm0ziZMDPoc/brAo9IrL0rA2PcNPO3TyyZg3jE0lMeyF3PH0z0SFL1f5+PGsxK5o4+ouVRDYN01ZxlI9X7Q66bBE5zY8nWviTXR+m7KUQxT0+BYdOkotfQxV6ROSCNIaK+WLds2TqdnPnaAt3PH0z9V/fi59K49mpfbVSNUU0XNHJj9f/EAdHG4yKLDF7J1qpf9Sh9K49WN+S8b2gS1oQSy70FBiPVE2G5GWrCQ8ncLsGyJzqOec5Ab3eBE8lamlPVbP7ZAvLJ6Yyq1tRgW2qxSuKMt5oqA6pm32pO70t57I6eoorooPa8C4A436C3ckiXkqs4IHeDUSHLH4qDae9abrxNIe7q/hK/WrK3ThtkV5qnDhFjk+1EyHmzH3fLhE5d29el40zhg5rQmNcXtDByvBbVz8bn5yfZrDkQk9jCD674+fc37KR4wNlVN/XStkPh+bc6XUu/zm+kr/42U1UvuBQ3ecRPdRJBkhvauXIxyPUtvWzs/oVro4dAc79JqayeE5vy9lYA4NbPf72uu9wc9H4Ilcnr6Ydfn33r1D6eCHR4deHtGbunO529ND0vRb+9bEPEF9mMFtHuLium7XFPXyifA9rFHpE5tXp16VzWo4ZbzTccOMe/rpuT16ufl5yoafMKeT3qg7xO5Wv8vBklNtf+jTlrsu5rv14Id5I3ZOGku8/DUBmuqdooj7KLZfv4Yt1zwLgGgWepe7MtnwL4+D86jZevbIeig4tbnFCZ6acggMxav9lL34yiTdLr6zX00v0x31EgZpN6zgWLWdfxmV0WQHvL3mBNVqpLDKvzrwuX1e5YxO7t7VA3Z4AqwvOkgs9r3ONQ5U7wWRjhsQ1GwiPpggf7yfTefZb1xvLrMNijrF5mXCz2VxtCYD1pv5eAlHuxok3eCSv2UhoLE24vYdM96m3njjdfiaepLDXMFZYxFFj6WssBdRDl29C9XWkW2tJlIaZbPQod+LoJgHzx8fBWLDWznzvzPP3yiX9L6wtlOFzVz9I0e+f5MRv+wxd1YRxNQFSZClZF57gN3Y+RMEfdHHsNw2jl7eA8zbXaU8/dT8fZMW9k0R2l3Ag3rx4xcrS4LiMXNlC++cshb/fxW+8+yHWhieDrkrywJLt6QGocGPcXtHO7RXt/KihmD9+5jZKFXpElpRlbhG/W3mE3608wrfql/F3//VRihzDGdN63uCNjsLBURygvHobXcmyRa1Xgmccw1ijy59suZ9PlAxMH9VUA1l4Szr0zM
WJJ5k4Ws1vN1xNY3SIG0sPsClSEHRZMo9CYylePVzPb5Vdxk8OXUTjoHZqzgY1oVHG2nzKr9+COW3Fa3gsTehwF15Pb3DFiQjuWJKeI9V8rnoHIefNi/THr22Y9X02OpThPw5dTNq6XBTr4kNFr1Gfxfe9zMrQQ3cvK+8q4sB/XcLjrS7HPlrNVxsf1zydHOIc62LVd1p47qGtNA5lKHyxMyc3yso1WyP9/I/3PcjTl62YsYvrvo4mmr/dTOSnCj0iQTInu1n5vQL2PboZzJvXaONgetb32djBLhr+aTnPlL+L+7ZD6Y3/zq0lQ4ta83zKytDjjY5injpAMVC04xIO7qyDxqCrkvnkDQziPDrI698nFHiyQ32omM9XHoXKozOO/0npBn6y7Bq0MF0kWN7wCM4T+5mtr2a299lMZxfhzi7CQDq2g473VwMKPSIiIm8R91P0eCni1mUsHqXKn2Oyl8yrGneU8ZVpxj+0mcioR+GrPWQ6TgRdVuAUekREZMGc9NI8MrGW7nQ5yf5CTHI031dNL4qLwgm+8O57eHrrKp7ubiHxbw2UnOiasVN6PtIkGBERWTBxP8TxZBVHJmpw4w7Gy+8P3cVS4ca4rbSfOxqe5rfW7CK+zME4uXO39POVFz09blUl3soGUhVRhtY6NEcHgy5J5kGosYH4xuUkKlwGLrGsiGqSrMhS83yygR8d2cRkX4zyYw4mngi6pLzTFB5gZEOGgl98FwXDHkUv9pBpPz7jHLemhuSmZiZrwhQMZih8/sTUfS9zTF6EHr+1nvabirFtcdbUd7I9dhjQvvfZLr5xOad+LcGNKw+yIdbJtYVdaK8PkaXlJwMXU3R/Cc0Hx3GGxvF7+oIuKe9sjYzxf9/zA/Ztb+Ghk2vIfKueoo4TM3ZqzqxezrHbLB+5eDf3vLyZFV9djqPQExyHc5v85lvzRoN6xREyzQneu/I11hd1U+cmUejJfokKlxtXHuRLdfumjyjwLFWedTBz3UZEclp3vJTyQ5PYZ17g9YEtE8qaj56cUOHGuLVkiFtLhvi7yDDfrvoARcYB++ZQY6oswta2Y3ypbh89yRI6ytaRi7vfZc2/vGXuGONrUozesoXIiEfRK71kjnXMOOe19ARf67+a/UONHD1cx6pTKQC8sENRSZxNxSdpjfQRMxrXzFahhuWMb2lksnpqSGtDrDPokvJSvzfBv4xs4qGe9ZREEvxy3dN8KDZKtxfnG0PbeaJv5Yx9eto7amjrntr4zC0txa5oIFNewNCqEMujI0H9GiJ5Z3X0FINbPZxf3TbjnoWTNYYTHU28J3ETHa/Wsbo/N28LkjWhpy2c4LPbH2VX2xoOn6qh/gd1xDpOzjjnZxPreOBHl9Pw2CRrRydw2rvwAK/AZU1VHzcVv0yBMZQ5uZhf80NibT29vzrJr6zdQ1u0l3cXnoBZd5yQhXQ0E+GOx69jxT0+ndUhvnxbKe/f+D32JpfxnZ9eQ8sDSUL+m++oa8cncTpOTX3Tr6uh+5oKxlt9ws1jbIm1B/VriOSd7dEBvrzze7x6RT0+b34xeaDrIkL31RF6oZK1I2OY9k5yccp51oSeYhPmPcUv0RgZ5N7IZtrL1hADjLV4vkMGj85kBaXHfJxH92HhjQazIaiKTtCYxVtny5R0qcs1LYf5X9WvTh9RmwZhzC+gsCtE9MkDFDQv59CNpSRthlOZcoqPG0KPP4/NvLnV2YzrsSBMosbiNsZZUT1AlTuOFpLmhozvkrYzPyqtVc/6UlLtFvGR4lEoHgXAsz4+llPJMvb11OA8uu8cJ5Nkl6wJPWHjUuVMkg73UVcwytHpyt3BCcaeqeUK/xMMHa+grTMVbKEiecaMxSnYX8O1sV9jdKCI5vYM1p97/o4fi5Bc5nFl03FWF/VS406i+VjZr/BUgp8/vZErB+qBqbDjWxg/UMXKgf6c7DXIdi+mJvm7nveyv6+BgSOVrOrK/ZV1WRV6GkNRat0Uewt7eTxqMI
7BHu9kxb9Z/PuLqI0PQ1evLi6RReT19NH8gxDerlKWJ8cxXX14b7MBWro4TEtbL3+4/McUOT61bnQRq5WF4r7UzppvLscrjYEF4/kYC7UjvdiT3UGXJ7N4ZGIdT/3oEhofHqNmfBC6enL+8zNrQg9A1ISJmjAl7iR2ujfcTyTg0NR9fnK9sXJJ2nr4+CT8MOZc+lItpPwQSZvGwSFs3AWrUc6OTaemFhUc46077TruWzZE86MOy2JjrI/EFq1GmT8OFuuACUewngfWB2un7on4ShwHpnr6fG/GsCYAxmBcFxOJYB1wc3ogZel6/f33ZKqCkuM+7JleWee4Uz9nyqFdnLMq9EhuOJkZ54u972HXidWM9xTTcjQ99eZ5FmKdcZ54dCNbVzWzofYUv9fwAJdGdRvLpcitXcb4jlZGWkKcNl+SsTaPmyuOBFeYXJBLYsf51jVpkhXvorDPp+q/TpE52g4wdR0bZyoIzcK9aA192ypJVhgS28ZpDfcD+uKymF5Oxfniqet5pquZyY4S2o4nganrNXFxE5M1U9u5vL6yq7A3TfSF43h9ubG/kkKPLLpX02U88PC7WPmDMZyJITjVj3eWe7iYl4+x+hvLsEUFvHLdWh761HEujR5a4IrlfPjLazh5S4Y/3X437mlrY2vcUTZHh9E8nuz0nsI+7nzPNzlxVRX/cPRaJoZriE6HHqydsffLDMYwuKWCpk8d5uO1z9Ia7mdTxEOhZ3HtTrSy+76LablvGBPvg1N9eExdryfeH6H8ogF8OzUny1pD/4sVrByoBoUekfMz5hdSMGDghUN4yeTsJxmDCYUxroO1FptKgbX4ExNw+BgAxesupyNRxYj/PGFcoiaEa7QKaDG4WKwLprAAZ45eukxphOW1w3yipHeWdlHgyVZlTiHXFAAFA+xddoynymooLCjAej42k56xyy8wdS1HIphQiESFww01L3JryRBTYUeBZ7ENezFiPRb/wMsz2sovCJGpTrO9tgMfQ9p38a3hkVMl+NHciQq585tITgk1LGfwmiZGWxxiPZaaJ3rxXps5JFLSEeehh7bwYPN6Lmrs5s+a72VzVJNiF8Py0Bgll/XR/htrMZnZz5ms8/mV+lcVRHPY9pIj3P2+LYy2bKWoy1LzWNdbNo11L1pDz5WVJKoMmc3jrIt2BVStvB0bcYiVTbKj9DCeNfg4eNbhifI2/IibM5tKKPTIkpRurmbkw+N8afNdfPnY9Uz01VJwRuhxXjjC6u5KbEGUYx9q48n/torN0RMBVZxfVoQKuHPDnXSsrcBj9n1YSpwE68ITqFcnd10fO0XrtV+l7+oS/vy1G5nsrCZ8RugZ3lhByy8f5jPLH6POHWVN2ACah7fUZApcWiqHuLFoatNff7oX6HvVQ/jRZQo9QXLwsSEwhYWQSmHTmbfOLndcnEgYwmEyUYfwOS0RkkAYgxONQjjMZFmEtuqTfDCW4OdVx3m8rJ5YSQmk0/ipNPge/sTE1HAXENtWQ3uiml7vFaLGodhE1cOwgMLGZX0kxvrIHMOTb1DgyWVlTiHbogAJflTVzcvlVURLSmackywz7Kg8yg2xJKCe2CXLGGKhFGVO4YzDRaEUo27ubDCZlaGnOTzI2CVJen9pIwVDlrJ9vXjTy9ZfF2pppGfnciaaDMkVCa4qfS2gauVshepqGbiulZGVDomGNJ+veQmAK0sOcc8HNjOy6mKKOi3Ldp3Cm57X87qyI3H+44HLuav+Uja0dfJXrXezIVI42/9GRBbAteWv8PCN64huvHjGcX/9OFsL24MpSuQMWRl62sKj3HrJMzxRv5ITnVWEx6qIHj42Y1JWYkU1zof7+dr6H1DuJGgLATl5z9jc4TVUM/jBSb6+7duUO5O0hDwgxnsL+7n3yjsY2FHIHx26hcnjVUTOCD3u/kOs6ijHFkZp/+gKnvlkCxsivcH8IiJ56Maik1x87T8y7M98n61x4rSEQmhIS5aCrAw9MWNoLehnoK
KIwYkYXsFb77/kRxyaS4emVhko7GQFP+xSVT7O1QUZXPNmN3ixU8CG6ffL9RWneKG8lsKKCmwqhT+ZmBrqisfx43FwXCLDdST8cEC/hUh+KnMK2TRrrtEmlNkmadOM+CkS1jKcLMTNnN2WItkgS0NPmM0FHZS7E/RMltJTWPLOT5KccF35Szxy01r6tqyjpB3qH+x+Y2M0ERG5cD+Nl/FHL97EeGcpJYddGk7mzu0psjP0OBEuiaTZGO7jcEU7d8fagi5JFsmNsT4uvfIfGNsR5rMvf4Lkq5W4Cj0iIvPmweGLid1VRtOu49hEAn90POiS5k1Whh6Yug8XBmJukjlWzEoOijkRVjpTfeitZYOcrKqitHbZG39vHIdMkSFscuV7iYjIwjMZn954CS+n4hwaraGw3yPTmXt7KmVt6BH5UPUB/uxjjXRf9WZPnzVQv/YUmwuOo4mTIiJnp+D4MF331fOhg5+n6IRDQ0dfzgxpnU6hR7LWLUXdvHvHHcTtm119LpYSx1DhaPK6iMjZ8o92sPzb/RCOQDqFPz4RdEkLQqFHFl2BSZMuArehHlLpN46PV0WJhYfO+nViToSYo94cEZGzFXNSpEoNoeZGbCKJHRnFTySwmQze8EjQ5S04hR5ZdKvDA6y79ggH6pqnxqOmRSsn+XzDs9pJWURkgVxa0E7VDZ28tLqegq4QLQ+MwTMvBF3WolHokUXXGopx58ofEV8xc8TYNYYyJwJojx0RkYWwORLi7nXfJbHW58973su+Q5spfSboqhZP1oeeEidBotLgrloxY0fm+LIQqyKTAVYmc3GNQ5kppEwdOiIii8o1DhXu1IaRKwr7eHKZQ8WqFWf13ESlIeakFrK8BZf1oeeywnZaP3CMly6qnzFUUlvby81VzwVYmYiIyNL17qJXeOgX1vPKltp3PtlY1jR1cHXsNbL5xrFZH3o2hCN8b9U9pFf6eLzZ0xM2DjETAdzgihMREVmiLo24/HDtXaTXzPz8nI2LIWwcCk12Lx7J+tDjGodio+XJIiIi5yIfPz81q0JERETygkKPiIiI5AWFHhEREckLCj0iIiKSFxR6REREJC8o9IiIiEheUOgRERGRvKDQIyIiInlBoUdERETygkKPiIiI5AWFHhEREckLCj0iIiKSFxR6REREJC8o9IiIiEheUOgRERGRvGCstWd/sjF9QMfClSOzaLHW1sz3i6otA6P2zB1qy9wy7+2ptgzMnG15TqFHREREJFtpeEtERETygkKPiIiI5IWcDT3GmF3GmOvPOHa7MeYrQdUkF8YY02qMORh0HXLhjDEFxpg9xpgDxpgXjTFfCLomOT+zXZfGmD81xvxOUDWJzCVnQw/wXeDWM47dOn1cRIKVBHZaay8BNgM3GGMuD7YkEcl1uRx67gI+aIyJwNS3EWA58HiQRcn8MMa0GWP2GWMuC7oWOXd2yvj0w/D0j1ZViATMGPNnxpjbT3v8F8aYzwVY0rzK2dBjrR0E9gAfmD50K/B9q+VqWc8Ysxb4IfBJa+0zQdcj58cY4xpj9gO9wM+stbsDLklE4J+B2wCMMQ5Tn53/GmhF8yhnQ8+004e4NLSVG2qAe4FPWGsPBF2MnD9rrWet3Qw0AtuMMRsDLknOz1xfJPUFMwtZa9uBAWPMFuD9wD5r7UCwVc2fXA899wLXGWO2AjFr7XNBFyQXbAQ4DlwVdCEyP6y1w8Au4IaAS5HzMwBUnHGsEugPoBaZH98EPgl8iqmen5yR06Fnes7ALqYaTb08uSEF3ALcZoz5paCLkfNjjKkxxpRP/7kQeB/wSqBFyXmZfp/tNsbsBDDGVDIVYJ8ItDC5EPcw1YaXAT8NuJZ5FQq6gEXwXaYa8MyVXJKlrLUTxpgbgZ8ZY8attf8RdE1yzuqBO40xLlNfvr5vrb0v4Jrk/N0G/KMx5m+mH3/BWnskyILk/FlrU8aYXcCwtdYLup75pNtQiIiIyBumJzDvBT5mrT0UdD3zKa
eHt0REROTsGWMuAg4DD+da4AH19IiIiEieUE+PiIiI5AWFHhEREckLCj0iIiKSFxR6REREJC8o9IiIiEheUOgRERGRvPD/AdY0PdA9XBIyAAAAAElFTkSuQmCC",
"text/plain": [
"<Figure size 720x720 with 25 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": [
"'Training image Shape: (3069, 32, 32)'"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": [
"'Testing image Shape: (341, 32, 32)'"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# partition the data into training and testing splits using 90% of\n",
"# the data for training and the remaining 10% for testing\n",
"(train_images, test_images, train_labels, test_labels) = train_test_split(data,\n",
" labels, test_size=0.10, stratify=labels, random_state=42)\n",
" \n",
"# show train data in plot\n",
"show_train_data(train_images, train_labels)\n",
"\n",
"# Show shapes\n",
"display(f\"Training image Shape: {train_images.shape}\")\n",
"display(f\"Testing image Shape: {test_images.shape}\")"
]
},
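{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick arithmetic check (our addition, assuming the full English Handwritten Characters dataset of 3,410 images, 62 classes x 55 samples each): a 90/10 split should leave 3,069 images for training, matching the training shape shown above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sanity check (our addition): 90/10 split of the 3,410-image dataset\n",
"display(int(3410 * 0.9))         # 3069 training images\n",
"display(3410 - int(3410 * 0.9))  # 341 testing images"
]
},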
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Looks great!\n",
"\n",
"### Modeling\n",
"\n",
"We need to define some hyperparameters for our model. Here is what they mean:\n",
"\n",
"- `EPOCHS` - the number of full passes over the training data\n",
"- `BATCH_SIZE` - the number of images processed per gradient update\n",
"\n",
"The model itself is a small convolutional neural network (CNN): three `Conv2D`/`MaxPooling2D` stages followed by a dense classifier ending in a 62-way softmax (10 digits plus 26 uppercase and 26 lowercase letters)."
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"EPOCHS = 80\n",
"BATCH_SIZE = 50"
]
},
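{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sanity check (our addition), we can compute how many batches Keras will run per epoch from the 3,069 training images and `BATCH_SIZE`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Batches per epoch = ceil(training samples / BATCH_SIZE)\n",
"import math\n",
"display(math.ceil(3069 / BATCH_SIZE))  # 62, matching the 62/62 progress bars below"
]
},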
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[INFO] compiling model...\n"
]
}
],
"source": [
"# initialize our deep neural network (compiled in the next cell)\n",
"print(\"[INFO] compiling model...\")\n",
"model = models.Sequential([layers.Input(shape=(32, 32, 1))])\n",
"model.add(layers.Conv2D(32, (3, 3), activation='relu'))\n",
"model.add(layers.MaxPooling2D((2, 2)))\n",
"model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
"model.add(layers.MaxPooling2D((2, 2)))\n",
"model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
"model.add(layers.Flatten())\n",
"model.add(layers.Dense(128, activation='relu'))\n",
"model.add(layers.Dense(62, activation='softmax'))"
]
},
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"_________________________________________________________________\n",
"Layer (type) Output Shape Param # \n",
"=================================================================\n",
"conv2d_66 (Conv2D) (None, 30, 30, 32) 320 \n",
"_________________________________________________________________\n",
"max_pooling2d_46 (MaxPooling (None, 15, 15, 32) 0 \n",
"_________________________________________________________________\n",
"conv2d_67 (Conv2D) (None, 13, 13, 64) 18496 \n",
"_________________________________________________________________\n",
"max_pooling2d_47 (MaxPooling (None, 6, 6, 64) 0 \n",
"_________________________________________________________________\n",
"conv2d_68 (Conv2D) (None, 4, 4, 64) 36928 \n",
"_________________________________________________________________\n",
"flatten_23 (Flatten) (None, 1024) 0 \n",
"_________________________________________________________________\n",
"dense_44 (Dense) (None, 128) 131200 \n",
"_________________________________________________________________\n",
"dense_45 (Dense) (None, 62) 7998 \n",
"_________________________________________________________________\n",
"=================================================================\n",
"Total params: 194,942\n",
"Trainable params: 194,942\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n"
]
},
{
"data": {
"text/plain": [
"None"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Use categorical_crossentropy for one-hot encoded labels\n",
"model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),\n",
"              loss=tf.keras.losses.categorical_crossentropy,\n",
" metrics=['accuracy'])\n",
"\n",
"display(model.summary())"
]
},
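{
"cell_type": "markdown",
"metadata": {},
"source": [
"The parameter counts in the summary can be verified by hand (our addition): each `Conv2D` layer has `kernel_h * kernel_w * in_channels * filters` weights plus one bias per filter, and each `Dense` layer has `inputs * units` weights plus one bias per unit:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# First Conv2D: 3x3 kernels over 1 greyscale channel, 32 filters\n",
"display(3 * 3 * 1 * 32 + 32)  # 320 parameters\n",
"# Final Dense layer: 128 inputs to 62 classes\n",
"display(128 * 62 + 62)        # 7998 parameters"
]
},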
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Since we are using a CNN, we need to add a channel dimension to our training and test data. This is so each batch is represented as `(batch_size, height, width, channels)`, the input shape Keras expects."
]
},
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(341, 32, 32, 1)"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Keras needs a channel dimension for the model\n",
"# Since the images are greyscale, the channel can be 1\n",
"train_images = train_images.reshape(-1, 32, 32, 1)\n",
"test_images = test_images.reshape(-1, 32, 32, 1)\n",
"display(test_images.shape)"
]
},
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[INFO] training model...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 27ms/step - loss: 4.1115 - accuracy: 0.0257 - val_loss: 4.0033 - val_accuracy: 0.0557\n",
"Epoch 2/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 3.4961 - accuracy: 0.1310 - val_loss: 3.1457 - val_accuracy: 0.2082\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 2.4779 - accuracy: 0.3506 - val_loss: 2.2735 - val_accuracy: 0.3930\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.8061 - accuracy: 0.5122 - val_loss: 1.8761 - val_accuracy: 0.4663\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.4414 - accuracy: 0.6002 - val_loss: 1.5583 - val_accuracy: 0.5689\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 1.1454 - accuracy: 0.6693 - val_loss: 1.3188 - val_accuracy: 0.6334\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.9367 - accuracy: 0.7247 - val_loss: 1.2408 - val_accuracy: 0.6540\n",
"Epoch 8/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.7973 - accuracy: 0.7517 - val_loss: 1.2723 - val_accuracy: 0.6510\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.6792 - accuracy: 0.7937 - val_loss: 1.1206 - val_accuracy: 0.6716\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.5712 - accuracy: 0.8130 - val_loss: 1.1829 - val_accuracy: 0.6569\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.4888 - accuracy: 0.8436 - val_loss: 1.1345 - val_accuracy: 0.6774\n",
"Epoch 12/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.4121 - accuracy: 0.8729 - val_loss: 1.2073 - val_accuracy: 0.6921\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.3276 - accuracy: 0.8996 - val_loss: 1.2323 - val_accuracy: 0.6804\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2886 - accuracy: 0.9075 - val_loss: 1.2447 - val_accuracy: 0.7067\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2318 - accuracy: 0.9312 - val_loss: 1.4181 - val_accuracy: 0.6774\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2303 - accuracy: 0.9260 - val_loss: 1.2899 - val_accuracy: 0.7067\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.1555 - accuracy: 0.9537 - val_loss: 1.3898 - val_accuracy: 0.6774\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1170 - accuracy: 0.9635 - val_loss: 1.4132 - val_accuracy: 0.7097\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1247 - accuracy: 0.9612 - val_loss: 1.4860 - val_accuracy: 0.7009\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1290 - accuracy: 0.9563 - val_loss: 1.6237 - val_accuracy: 0.6774\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1078 - accuracy: 0.9651 - val_loss: 1.5645 - val_accuracy: 0.7038\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.1179 - accuracy: 0.9645 - val_loss: 1.6006 - val_accuracy: 0.6950\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0931 - accuracy: 0.9752 - val_loss: 1.6176 - val_accuracy: 0.6862\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0616 - accuracy: 0.9795 - val_loss: 1.5915 - val_accuracy: 0.7067\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0510 - accuracy: 0.9860 - val_loss: 1.6762 - val_accuracy: 0.6862\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0519 - accuracy: 0.9840 - val_loss: 1.7727 - val_accuracy: 0.7009\n",
"Epoch 27/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0439 - accuracy: 0.9883 - val_loss: 1.8385 - val_accuracy: 0.6891\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0572 - accuracy: 0.9818 - val_loss: 1.8223 - val_accuracy: 0.7009\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0627 - accuracy: 0.9814 - val_loss: 1.7373 - val_accuracy: 0.7067\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0512 - accuracy: 0.9863 - val_loss: 1.7541 - val_accuracy: 0.6891\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0436 - accuracy: 0.9889 - val_loss: 1.7925 - val_accuracy: 0.6921\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0364 - accuracy: 0.9906 - val_loss: 1.8122 - val_accuracy: 0.6950\n",
"Epoch 33/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0525 - accuracy: 0.9824 - val_loss: 1.8509 - val_accuracy: 0.6950\n",
"Epoch 34/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0816 - accuracy: 0.9775 - val_loss: 1.7586 - val_accuracy: 0.7038\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0999 - accuracy: 0.9681 - val_loss: 1.8649 - val_accuracy: 0.7009\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0824 - accuracy: 0.9713 - val_loss: 1.9866 - val_accuracy: 0.6745\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.1054 - accuracy: 0.9658 - val_loss: 1.8624 - val_accuracy: 0.6774\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0755 - accuracy: 0.9723 - val_loss: 1.7141 - val_accuracy: 0.7097\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0477 - accuracy: 0.9831 - val_loss: 1.9349 - val_accuracy: 0.6833\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0275 - accuracy: 0.9919 - val_loss: 1.9551 - val_accuracy: 0.6891\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0823 - accuracy: 0.9765 - val_loss: 1.9577 - val_accuracy: 0.6950\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0375 - accuracy: 0.9892 - val_loss: 1.9938 - val_accuracy: 0.6891\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0176 - accuracy: 0.9951 - val_loss: 1.9272 - val_accuracy: 0.7009\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0136 - accuracy: 0.9961 - val_loss: 1.9688 - val_accuracy: 0.7155\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0074 - accuracy: 0.9984 - val_loss: 2.0054 - val_accuracy: 0.7126\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0178 - accuracy: 0.9938 - val_loss: 1.9492 - val_accuracy: 0.7067\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0086 - accuracy: 0.9971 - val_loss: 2.0478 - val_accuracy: 0.6979\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0080 - accuracy: 0.9974 - val_loss: 2.0697 - val_accuracy: 0.7097\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0104 - accuracy: 0.9958 - val_loss: 2.0860 - val_accuracy: 0.7155\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0089 - accuracy: 0.9967 - val_loss: 2.0863 - val_accuracy: 0.7155\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0044 - accuracy: 0.9987 - val_loss: 2.1547 - val_accuracy: 0.6979\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0063 - accuracy: 0.9977 - val_loss: 2.1304 - val_accuracy: 0.7214\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0130 - accuracy: 0.9945 - val_loss: 2.1268 - val_accuracy: 0.7126\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0153 - accuracy: 0.9948 - val_loss: 2.0379 - val_accuracy: 0.7126\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0222 - accuracy: 0.9912 - val_loss: 2.4216 - val_accuracy: 0.6657\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0900 - accuracy: 0.9756 - val_loss: 2.1243 - val_accuracy: 0.6950\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1326 - accuracy: 0.9514 - val_loss: 2.0786 - val_accuracy: 0.6891\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0921 - accuracy: 0.9668 - val_loss: 1.9326 - val_accuracy: 0.7067\n",
"Epoch 59/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0635 - accuracy: 0.9778 - val_loss: 2.2042 - val_accuracy: 0.7067\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0336 - accuracy: 0.9876 - val_loss: 2.1325 - val_accuracy: 0.7009\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0282 - accuracy: 0.9909 - val_loss: 2.3317 - val_accuracy: 0.6862\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0510 - accuracy: 0.9827 - val_loss: 2.4462 - val_accuracy: 0.6745\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0515 - accuracy: 0.9811 - val_loss: 2.1255 - val_accuracy: 0.6862\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0338 - accuracy: 0.9906 - val_loss: 2.0464 - val_accuracy: 0.7038\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0315 - accuracy: 0.9902 - val_loss: 2.1223 - val_accuracy: 0.7097\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0337 - accuracy: 0.9915 - val_loss: 2.1998 - val_accuracy: 0.6774\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0144 - accuracy: 0.9951 - val_loss: 2.1812 - val_accuracy: 0.7067\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0174 - accuracy: 0.9958 - val_loss: 2.1777 - val_accuracy: 0.7243\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0125 - accuracy: 0.9971 - val_loss: 2.1572 - val_accuracy: 0.7038\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0073 - accuracy: 0.9977 - val_loss: 2.1636 - val_accuracy: 0.7067\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0064 - accuracy: 0.9974 - val_loss: 2.1590 - val_accuracy: 0.7214\n",
"Epoch 72/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0096 - accuracy: 0.9971 - val_loss: 2.1166 - val_accuracy: 0.7155\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0116 - accuracy: 0.9967 - val_loss: 2.2153 - val_accuracy: 0.6950\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0042 - accuracy: 0.9984 - val_loss: 2.1617 - val_accuracy: 0.7067\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0026 - accuracy: 0.9990 - val_loss: 2.2007 - val_accuracy: 0.7273\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0042 - accuracy: 0.9984 - val_loss: 2.2867 - val_accuracy: 0.7009\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0030 - accuracy: 0.9993 - val_loss: 2.2717 - val_accuracy: 0.7126\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0023 - accuracy: 0.9993 - val_loss: 2.3106 - val_accuracy: 0.7126\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0038 - accuracy: 0.9987 - val_loss: 2.3152 - val_accuracy: 0.7067\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0026 - accuracy: 0.9987 - val_loss: 2.3278 - val_accuracy: 0.7067\n"
]
}
],
"source": [
"# train the network\n",
"print(\"[INFO] training model...\")\n",
"history = model.fit(x=train_images, \n",
" y=train_labels, \n",
" validation_data=(test_images, test_labels), \n",
" batch_size=BATCH_SIZE,\n",
" epochs=EPOCHS, \n",
" class_weight=classWeight)"
]
},
{
"cell_type": "code",
"execution_count": 168,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"------------------------------------------------------------------------\n",
"Training for fold 1 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 30ms/step - loss: 4.1090 - accuracy: 0.0283 - val_loss: 3.9846 - val_accuracy: 0.0616\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 3.4496 - accuracy: 0.1466 - val_loss: 3.0225 - val_accuracy: 0.2199\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 2.6281 - accuracy: 0.3073 - val_loss: 2.5281 - val_accuracy: 0.3548\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 2.0361 - accuracy: 0.4630 - val_loss: 2.1338 - val_accuracy: 0.4252\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.5790 - accuracy: 0.5666 - val_loss: 1.8311 - val_accuracy: 0.5279\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 1.2539 - accuracy: 0.6383 - val_loss: 1.5830 - val_accuracy: 0.5777\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.0528 - accuracy: 0.6898 - val_loss: 1.4818 - val_accuracy: 0.5777\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.8619 - accuracy: 0.7387 - val_loss: 1.3882 - val_accuracy: 0.6158\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.7394 - accuracy: 0.7696 - val_loss: 1.3694 - val_accuracy: 0.6246\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.6509 - accuracy: 0.7937 - val_loss: 1.2895 - val_accuracy: 0.6540\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.5439 - accuracy: 0.8254 - val_loss: 1.4169 - val_accuracy: 0.6070\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.4559 - accuracy: 0.8495 - val_loss: 1.3238 - val_accuracy: 0.6745\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.4011 - accuracy: 0.8742 - val_loss: 1.4445 - val_accuracy: 0.6422\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.3292 - accuracy: 0.8935 - val_loss: 1.5683 - val_accuracy: 0.6276\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.3216 - accuracy: 0.8957 - val_loss: 1.5567 - val_accuracy: 0.6041\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2594 - accuracy: 0.9195 - val_loss: 1.5756 - val_accuracy: 0.6246\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2395 - accuracy: 0.9215 - val_loss: 1.6202 - val_accuracy: 0.6569\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2460 - accuracy: 0.9202 - val_loss: 1.5781 - val_accuracy: 0.6452\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1743 - accuracy: 0.9446 - val_loss: 1.5596 - val_accuracy: 0.6657\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1538 - accuracy: 0.9514 - val_loss: 1.7793 - val_accuracy: 0.6569\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1143 - accuracy: 0.9671 - val_loss: 1.6882 - val_accuracy: 0.6745\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0964 - accuracy: 0.9743 - val_loss: 1.8612 - val_accuracy: 0.6393\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0946 - accuracy: 0.9756 - val_loss: 1.7344 - val_accuracy: 0.6393\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 0.0764 - accuracy: 0.9759 - val_loss: 1.8458 - val_accuracy: 0.6569\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 17ms/step - loss: 0.0875 - accuracy: 0.9749 - val_loss: 1.9101 - val_accuracy: 0.6158\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 0.0638 - accuracy: 0.9811 - val_loss: 1.8817 - val_accuracy: 0.6510\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 0.0552 - accuracy: 0.9834 - val_loss: 1.9406 - val_accuracy: 0.6804\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0737 - accuracy: 0.9821 - val_loss: 2.0013 - val_accuracy: 0.6598\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0518 - accuracy: 0.9834 - val_loss: 2.1261 - val_accuracy: 0.6452\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0652 - accuracy: 0.9788 - val_loss: 2.0143 - val_accuracy: 0.6422\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0499 - accuracy: 0.9883 - val_loss: 2.0202 - val_accuracy: 0.6569\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0390 - accuracy: 0.9899 - val_loss: 2.1174 - val_accuracy: 0.6422\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0271 - accuracy: 0.9928 - val_loss: 2.1187 - val_accuracy: 0.6686\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0209 - accuracy: 0.9951 - val_loss: 2.1488 - val_accuracy: 0.6628\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0276 - accuracy: 0.9925 - val_loss: 2.1829 - val_accuracy: 0.6628\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0193 - accuracy: 0.9948 - val_loss: 2.2642 - val_accuracy: 0.6716\n",
"Epoch 37/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0203 - accuracy: 0.9932 - val_loss: 2.2034 - val_accuracy: 0.6686\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0278 - accuracy: 0.9915 - val_loss: 2.2266 - val_accuracy: 0.6657\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0394 - accuracy: 0.9889 - val_loss: 2.1084 - val_accuracy: 0.6657\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1516 - accuracy: 0.9531 - val_loss: 2.4824 - val_accuracy: 0.6100\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2175 - accuracy: 0.9260 - val_loss: 2.2106 - val_accuracy: 0.6041\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1473 - accuracy: 0.9505 - val_loss: 2.2446 - val_accuracy: 0.6422\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0707 - accuracy: 0.9785 - val_loss: 2.2119 - val_accuracy: 0.6129\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0458 - accuracy: 0.9866 - val_loss: 2.1403 - val_accuracy: 0.6393\n",
"Epoch 45/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0258 - accuracy: 0.9919 - val_loss: 2.3236 - val_accuracy: 0.6393\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0209 - accuracy: 0.9922 - val_loss: 2.3172 - val_accuracy: 0.6569\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0195 - accuracy: 0.9935 - val_loss: 2.3468 - val_accuracy: 0.6510\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0202 - accuracy: 0.9932 - val_loss: 2.3622 - val_accuracy: 0.6540\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0253 - accuracy: 0.9938 - val_loss: 2.2941 - val_accuracy: 0.6569\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0998 - accuracy: 0.9671 - val_loss: 2.4474 - val_accuracy: 0.6305\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1169 - accuracy: 0.9583 - val_loss: 2.1563 - val_accuracy: 0.6393\n",
"Epoch 52/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0904 - accuracy: 0.9710 - val_loss: 2.1488 - val_accuracy: 0.6510\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0526 - accuracy: 0.9834 - val_loss: 2.2409 - val_accuracy: 0.6657\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0336 - accuracy: 0.9902 - val_loss: 2.2674 - val_accuracy: 0.6540\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0148 - accuracy: 0.9951 - val_loss: 2.3492 - val_accuracy: 0.6364\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0126 - accuracy: 0.9964 - val_loss: 2.3339 - val_accuracy: 0.6540\n",
"Epoch 57/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0068 - accuracy: 0.9980 - val_loss: 2.3669 - val_accuracy: 0.6628\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0121 - accuracy: 0.9948 - val_loss: 2.4540 - val_accuracy: 0.6569\n",
"Epoch 59/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0155 - accuracy: 0.9951 - val_loss: 2.3950 - val_accuracy: 0.6452\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0105 - accuracy: 0.9967 - val_loss: 2.3566 - val_accuracy: 0.6628\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0060 - accuracy: 0.9984 - val_loss: 2.4758 - val_accuracy: 0.6569\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0067 - accuracy: 0.9980 - val_loss: 2.4640 - val_accuracy: 0.6510\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0087 - accuracy: 0.9961 - val_loss: 2.5688 - val_accuracy: 0.6569\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0059 - accuracy: 0.9990 - val_loss: 2.5526 - val_accuracy: 0.6540\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0064 - accuracy: 0.9971 - val_loss: 2.6648 - val_accuracy: 0.6481\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0076 - accuracy: 0.9974 - val_loss: 2.5262 - val_accuracy: 0.6481\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0040 - accuracy: 0.9987 - val_loss: 2.5773 - val_accuracy: 0.6598\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0047 - accuracy: 0.9980 - val_loss: 2.6086 - val_accuracy: 0.6598\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0050 - accuracy: 0.9974 - val_loss: 2.6261 - val_accuracy: 0.6569\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0064 - accuracy: 0.9974 - val_loss: 2.7467 - val_accuracy: 0.6510\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0073 - accuracy: 0.9974 - val_loss: 2.7125 - val_accuracy: 0.6452\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0050 - accuracy: 0.9984 - val_loss: 2.6094 - val_accuracy: 0.6598\n",
"Epoch 73/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0033 - accuracy: 0.9993 - val_loss: 2.6774 - val_accuracy: 0.6628\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0060 - accuracy: 0.9974 - val_loss: 2.7463 - val_accuracy: 0.6481\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0053 - accuracy: 0.9980 - val_loss: 2.7188 - val_accuracy: 0.6569\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0121 - accuracy: 0.9967 - val_loss: 2.7309 - val_accuracy: 0.6510\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0080 - accuracy: 0.9967 - val_loss: 2.5759 - val_accuracy: 0.6481\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0031 - accuracy: 0.9990 - val_loss: 2.7343 - val_accuracy: 0.6598\n",
"Epoch 79/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0055 - accuracy: 0.9974 - val_loss: 2.9240 - val_accuracy: 0.6393\n",
"Epoch 80/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0149 - accuracy: 0.9961 - val_loss: 2.8269 - val_accuracy: 0.6364\n",
"Score for fold 1: loss of 2.8269495964050293; accuracy of 63.63636255264282%\n",
"------------------------------------------------------------------------\n",
"Training for fold 2 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 32ms/step - loss: 4.1196 - accuracy: 0.0212 - val_loss: 4.0523 - val_accuracy: 0.0381\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 3.5801 - accuracy: 0.1274 - val_loss: 3.2976 - val_accuracy: 0.1554\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 2.6998 - accuracy: 0.3095 - val_loss: 2.6077 - val_accuracy: 0.3284\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 2.0276 - accuracy: 0.4646 - val_loss: 2.1007 - val_accuracy: 0.4692\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.5628 - accuracy: 0.5696 - val_loss: 1.7295 - val_accuracy: 0.5572\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 1.2572 - accuracy: 0.6429 - val_loss: 1.5298 - val_accuracy: 0.5748\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 1.0024 - accuracy: 0.7116 - val_loss: 1.4188 - val_accuracy: 0.6217\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.8527 - accuracy: 0.7331 - val_loss: 1.4124 - val_accuracy: 0.6188\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.7340 - accuracy: 0.7748 - val_loss: 1.4814 - val_accuracy: 0.5953\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.6120 - accuracy: 0.8071 - val_loss: 1.4258 - val_accuracy: 0.6364\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.5608 - accuracy: 0.8260 - val_loss: 1.3703 - val_accuracy: 0.6774\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.4628 - accuracy: 0.8511 - val_loss: 1.3545 - val_accuracy: 0.6833\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.4080 - accuracy: 0.8697 - val_loss: 1.3929 - val_accuracy: 0.6628\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.3363 - accuracy: 0.8921 - val_loss: 1.4738 - val_accuracy: 0.6804\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.3132 - accuracy: 0.9000 - val_loss: 1.4183 - val_accuracy: 0.6686\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2528 - accuracy: 0.9225 - val_loss: 1.4681 - val_accuracy: 0.6862\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1976 - accuracy: 0.9368 - val_loss: 1.5874 - val_accuracy: 0.6950\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1966 - accuracy: 0.9394 - val_loss: 1.4784 - val_accuracy: 0.6891\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1672 - accuracy: 0.9446 - val_loss: 1.6273 - val_accuracy: 0.6481\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1697 - accuracy: 0.9482 - val_loss: 1.5722 - val_accuracy: 0.6921\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1342 - accuracy: 0.9593 - val_loss: 1.6102 - val_accuracy: 0.7126\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.1189 - accuracy: 0.9616 - val_loss: 1.7595 - val_accuracy: 0.6745\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1313 - accuracy: 0.9557 - val_loss: 1.6686 - val_accuracy: 0.7009\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0879 - accuracy: 0.9769 - val_loss: 1.7893 - val_accuracy: 0.6774\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0929 - accuracy: 0.9733 - val_loss: 1.7226 - val_accuracy: 0.6569\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0851 - accuracy: 0.9752 - val_loss: 1.8638 - val_accuracy: 0.6716\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0835 - accuracy: 0.9739 - val_loss: 1.7895 - val_accuracy: 0.6833\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0668 - accuracy: 0.9788 - val_loss: 1.8133 - val_accuracy: 0.7067\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0426 - accuracy: 0.9866 - val_loss: 1.8173 - val_accuracy: 0.6950\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0692 - accuracy: 0.9775 - val_loss: 1.8854 - val_accuracy: 0.6833\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0655 - accuracy: 0.9762 - val_loss: 1.8763 - val_accuracy: 0.6979\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0418 - accuracy: 0.9863 - val_loss: 2.0026 - val_accuracy: 0.6891\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0560 - accuracy: 0.9837 - val_loss: 1.8753 - val_accuracy: 0.7067\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0429 - accuracy: 0.9850 - val_loss: 1.9350 - val_accuracy: 0.6774\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0504 - accuracy: 0.9879 - val_loss: 1.9904 - val_accuracy: 0.6950\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0426 - accuracy: 0.9896 - val_loss: 1.9476 - val_accuracy: 0.7067\n",
"Epoch 37/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0356 - accuracy: 0.9892 - val_loss: 2.1378 - val_accuracy: 0.6745\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0324 - accuracy: 0.9886 - val_loss: 2.1462 - val_accuracy: 0.6950\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0358 - accuracy: 0.9896 - val_loss: 2.0163 - val_accuracy: 0.7126\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0328 - accuracy: 0.9922 - val_loss: 2.0598 - val_accuracy: 0.6979\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0269 - accuracy: 0.9925 - val_loss: 2.1684 - val_accuracy: 0.6979\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0538 - accuracy: 0.9853 - val_loss: 2.1001 - val_accuracy: 0.6804\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1490 - accuracy: 0.9518 - val_loss: 2.0953 - val_accuracy: 0.6686\n",
"Epoch 44/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.1028 - accuracy: 0.9616 - val_loss: 2.2777 - val_accuracy: 0.6569\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0985 - accuracy: 0.9690 - val_loss: 2.0599 - val_accuracy: 0.6745\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0525 - accuracy: 0.9814 - val_loss: 2.1140 - val_accuracy: 0.6804\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0364 - accuracy: 0.9892 - val_loss: 2.0247 - val_accuracy: 0.6804\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0197 - accuracy: 0.9941 - val_loss: 2.0582 - val_accuracy: 0.6979\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0136 - accuracy: 0.9951 - val_loss: 2.1951 - val_accuracy: 0.6979\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0159 - accuracy: 0.9948 - val_loss: 2.2076 - val_accuracy: 0.6891\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0091 - accuracy: 0.9977 - val_loss: 2.2009 - val_accuracy: 0.6950\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0090 - accuracy: 0.9971 - val_loss: 2.2378 - val_accuracy: 0.6950\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0110 - accuracy: 0.9958 - val_loss: 2.2394 - val_accuracy: 0.7038\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0136 - accuracy: 0.9948 - val_loss: 2.2383 - val_accuracy: 0.7038\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0080 - accuracy: 0.9977 - val_loss: 2.2580 - val_accuracy: 0.6979\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0059 - accuracy: 0.9977 - val_loss: 2.2853 - val_accuracy: 0.6891\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0158 - accuracy: 0.9945 - val_loss: 2.3802 - val_accuracy: 0.6862\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0074 - accuracy: 0.9977 - val_loss: 2.3188 - val_accuracy: 0.6950\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0068 - accuracy: 0.9984 - val_loss: 2.3324 - val_accuracy: 0.6950\n",
"Epoch 60/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.0043 - accuracy: 0.9990 - val_loss: 2.3576 - val_accuracy: 0.7038\n",
"Epoch 61/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0075 - accuracy: 0.9964 - val_loss: 2.3535 - val_accuracy: 0.6804\n",
"Epoch 62/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0061 - accuracy: 0.9984 - val_loss: 2.3606 - val_accuracy: 0.6921\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0064 - accuracy: 0.9980 - val_loss: 2.3529 - val_accuracy: 0.6921\n",
"Epoch 64/80\n",
"62/62 [==============================] - 2s 29ms/step - loss: 0.0083 - accuracy: 0.9971 - val_loss: 2.3187 - val_accuracy: 0.6862\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0034 - accuracy: 0.9993 - val_loss: 2.3468 - val_accuracy: 0.6921\n",
"Epoch 66/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0037 - accuracy: 0.9987 - val_loss: 2.3879 - val_accuracy: 0.6950\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0058 - accuracy: 0.9980 - val_loss: 2.3879 - val_accuracy: 0.6950\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0036 - accuracy: 0.9987 - val_loss: 2.4120 - val_accuracy: 0.6950\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0032 - accuracy: 0.9993 - val_loss: 2.4616 - val_accuracy: 0.6979\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0029 - accuracy: 0.9993 - val_loss: 2.4736 - val_accuracy: 0.6891\n",
"Epoch 71/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0114 - accuracy: 0.9967 - val_loss: 2.4131 - val_accuracy: 0.6833\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0116 - accuracy: 0.9964 - val_loss: 2.3520 - val_accuracy: 0.6950\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0079 - accuracy: 0.9980 - val_loss: 2.3576 - val_accuracy: 0.7038\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0039 - accuracy: 0.9990 - val_loss: 2.3837 - val_accuracy: 0.7009\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0043 - accuracy: 0.9984 - val_loss: 2.4302 - val_accuracy: 0.7009\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0035 - accuracy: 0.9987 - val_loss: 2.4356 - val_accuracy: 0.6862\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0440 - accuracy: 0.9896 - val_loss: 2.7332 - val_accuracy: 0.6364\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.4275 - accuracy: 0.8713 - val_loss: 1.7967 - val_accuracy: 0.6452\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2304 - accuracy: 0.9221 - val_loss: 1.8456 - val_accuracy: 0.6598\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0906 - accuracy: 0.9733 - val_loss: 1.9607 - val_accuracy: 0.6862\n",
"Score for fold 2: loss of 1.9607480764389038; accuracy of 68.62170100212097%\n",
"------------------------------------------------------------------------\n",
"Training for fold 3 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 31ms/step - loss: 4.1058 - accuracy: 0.0342 - val_loss: 4.0144 - val_accuracy: 0.0381\n",
"Epoch 2/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 3.3727 - accuracy: 0.1541 - val_loss: 2.9004 - val_accuracy: 0.2522\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 2.2790 - accuracy: 0.3874 - val_loss: 2.1670 - val_accuracy: 0.4194\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.6338 - accuracy: 0.5455 - val_loss: 1.7109 - val_accuracy: 0.5367\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 1.2646 - accuracy: 0.6445 - val_loss: 1.5148 - val_accuracy: 0.5806\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.0503 - accuracy: 0.6895 - val_loss: 1.3928 - val_accuracy: 0.6041\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.8583 - accuracy: 0.7374 - val_loss: 1.3870 - val_accuracy: 0.5953\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.7073 - accuracy: 0.7794 - val_loss: 1.3297 - val_accuracy: 0.6334\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.6045 - accuracy: 0.8166 - val_loss: 1.2579 - val_accuracy: 0.6686\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4967 - accuracy: 0.8426 - val_loss: 1.2434 - val_accuracy: 0.6452\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4295 - accuracy: 0.8589 - val_loss: 1.3035 - val_accuracy: 0.6481\n",
"Epoch 12/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.3649 - accuracy: 0.8843 - val_loss: 1.3667 - val_accuracy: 0.6716\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.3028 - accuracy: 0.9104 - val_loss: 1.4333 - val_accuracy: 0.6598\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2520 - accuracy: 0.9225 - val_loss: 1.4903 - val_accuracy: 0.6979\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1981 - accuracy: 0.9381 - val_loss: 1.3998 - val_accuracy: 0.6891\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1895 - accuracy: 0.9440 - val_loss: 1.5507 - val_accuracy: 0.6657\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1510 - accuracy: 0.9514 - val_loss: 1.5462 - val_accuracy: 0.6686\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1564 - accuracy: 0.9518 - val_loss: 1.5418 - val_accuracy: 0.6716\n",
"Epoch 19/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 0.1160 - accuracy: 0.9658 - val_loss: 1.7675 - val_accuracy: 0.6657\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.1052 - accuracy: 0.9671 - val_loss: 1.6862 - val_accuracy: 0.6716\n",
"Epoch 21/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 0.1174 - accuracy: 0.9684 - val_loss: 1.6607 - val_accuracy: 0.6921\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0856 - accuracy: 0.9756 - val_loss: 1.8643 - val_accuracy: 0.6862\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0727 - accuracy: 0.9762 - val_loss: 1.6741 - val_accuracy: 0.6891\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0729 - accuracy: 0.9756 - val_loss: 1.7043 - val_accuracy: 0.6745\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0927 - accuracy: 0.9733 - val_loss: 1.7468 - val_accuracy: 0.7038\n",
"Epoch 26/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0707 - accuracy: 0.9775 - val_loss: 2.0712 - val_accuracy: 0.6657\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1110 - accuracy: 0.9707 - val_loss: 1.8174 - val_accuracy: 0.6833\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0510 - accuracy: 0.9840 - val_loss: 1.8706 - val_accuracy: 0.7097\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0301 - accuracy: 0.9925 - val_loss: 1.8959 - val_accuracy: 0.7009\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0407 - accuracy: 0.9879 - val_loss: 1.9567 - val_accuracy: 0.6862\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0300 - accuracy: 0.9915 - val_loss: 1.9592 - val_accuracy: 0.7038\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0224 - accuracy: 0.9941 - val_loss: 1.9999 - val_accuracy: 0.6862\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0156 - accuracy: 0.9948 - val_loss: 1.9928 - val_accuracy: 0.6891\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0131 - accuracy: 0.9964 - val_loss: 2.0294 - val_accuracy: 0.6950\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0191 - accuracy: 0.9941 - val_loss: 2.0577 - val_accuracy: 0.6891\n",
"Epoch 36/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0294 - accuracy: 0.9915 - val_loss: 2.1865 - val_accuracy: 0.6804\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0555 - accuracy: 0.9844 - val_loss: 2.2662 - val_accuracy: 0.6628\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0536 - accuracy: 0.9837 - val_loss: 2.1380 - val_accuracy: 0.6686\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1271 - accuracy: 0.9593 - val_loss: 2.2224 - val_accuracy: 0.6598\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1422 - accuracy: 0.9466 - val_loss: 2.0556 - val_accuracy: 0.7009\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0847 - accuracy: 0.9717 - val_loss: 2.0195 - val_accuracy: 0.6774\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0697 - accuracy: 0.9778 - val_loss: 2.1809 - val_accuracy: 0.6745\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0296 - accuracy: 0.9892 - val_loss: 2.0474 - val_accuracy: 0.7009\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0145 - accuracy: 0.9954 - val_loss: 2.1271 - val_accuracy: 0.6891\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0155 - accuracy: 0.9951 - val_loss: 2.1438 - val_accuracy: 0.6921\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0204 - accuracy: 0.9941 - val_loss: 2.1331 - val_accuracy: 0.6979\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0121 - accuracy: 0.9958 - val_loss: 2.1485 - val_accuracy: 0.6950\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0086 - accuracy: 0.9974 - val_loss: 2.1743 - val_accuracy: 0.6979\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0073 - accuracy: 0.9967 - val_loss: 2.1890 - val_accuracy: 0.6950\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0122 - accuracy: 0.9964 - val_loss: 2.1938 - val_accuracy: 0.6979\n",
"Epoch 51/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.0067 - accuracy: 0.9980 - val_loss: 2.2505 - val_accuracy: 0.6804\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0053 - accuracy: 0.9980 - val_loss: 2.2433 - val_accuracy: 0.7009\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0126 - accuracy: 0.9954 - val_loss: 2.2669 - val_accuracy: 0.6921\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0158 - accuracy: 0.9958 - val_loss: 2.2595 - val_accuracy: 0.6979\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0090 - accuracy: 0.9974 - val_loss: 2.2684 - val_accuracy: 0.6921\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0062 - accuracy: 0.9971 - val_loss: 2.3051 - val_accuracy: 0.6921\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0054 - accuracy: 0.9977 - val_loss: 2.3572 - val_accuracy: 0.6891\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0072 - accuracy: 0.9974 - val_loss: 2.3179 - val_accuracy: 0.7038\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0109 - accuracy: 0.9964 - val_loss: 2.2551 - val_accuracy: 0.6950\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0161 - accuracy: 0.9951 - val_loss: 2.4304 - val_accuracy: 0.6950\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0873 - accuracy: 0.9700 - val_loss: 2.6647 - val_accuracy: 0.6246\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.2652 - accuracy: 0.9172 - val_loss: 1.9149 - val_accuracy: 0.6745\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.1573 - accuracy: 0.9462 - val_loss: 1.9517 - val_accuracy: 0.6657\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0641 - accuracy: 0.9782 - val_loss: 1.9519 - val_accuracy: 0.6862\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0351 - accuracy: 0.9892 - val_loss: 2.1022 - val_accuracy: 0.6979\n",
"Epoch 66/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.0302 - accuracy: 0.9915 - val_loss: 2.0397 - val_accuracy: 0.7067\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0208 - accuracy: 0.9935 - val_loss: 2.1220 - val_accuracy: 0.7097\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0108 - accuracy: 0.9977 - val_loss: 2.1983 - val_accuracy: 0.6891\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0063 - accuracy: 0.9987 - val_loss: 2.1325 - val_accuracy: 0.7038\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0083 - accuracy: 0.9980 - val_loss: 2.1224 - val_accuracy: 0.7009\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0062 - accuracy: 0.9974 - val_loss: 2.1385 - val_accuracy: 0.7126\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0045 - accuracy: 0.9987 - val_loss: 2.2288 - val_accuracy: 0.7126\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0033 - accuracy: 0.9990 - val_loss: 2.2449 - val_accuracy: 0.7067\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0030 - accuracy: 0.9987 - val_loss: 2.2787 - val_accuracy: 0.7097\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0061 - accuracy: 0.9987 - val_loss: 2.2607 - val_accuracy: 0.7038\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0016 - accuracy: 0.9993 - val_loss: 2.3467 - val_accuracy: 0.6891\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0066 - accuracy: 0.9984 - val_loss: 2.3446 - val_accuracy: 0.7097\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0046 - accuracy: 0.9990 - val_loss: 2.2892 - val_accuracy: 0.7009\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0036 - accuracy: 0.9987 - val_loss: 2.2903 - val_accuracy: 0.7067\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0026 - accuracy: 0.9993 - val_loss: 2.3371 - val_accuracy: 0.7038\n",
"Score for fold 3: loss of 2.3371033668518066; accuracy of 70.3812301158905%\n",
"------------------------------------------------------------------------\n",
"Training for fold 4 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 32ms/step - loss: 4.1160 - accuracy: 0.0270 - val_loss: 4.0439 - val_accuracy: 0.0235\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 3.4941 - accuracy: 0.1349 - val_loss: 2.9183 - val_accuracy: 0.2463\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 2.4639 - accuracy: 0.3601 - val_loss: 2.1325 - val_accuracy: 0.4282\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 1.7589 - accuracy: 0.5239 - val_loss: 1.6440 - val_accuracy: 0.5572\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.3542 - accuracy: 0.6142 - val_loss: 1.3892 - val_accuracy: 0.6012\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.0875 - accuracy: 0.6921 - val_loss: 1.3272 - val_accuracy: 0.6393\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.8964 - accuracy: 0.7400 - val_loss: 1.3550 - val_accuracy: 0.6217\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.7813 - accuracy: 0.7530 - val_loss: 1.1880 - val_accuracy: 0.6481\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.6591 - accuracy: 0.7977 - val_loss: 1.1113 - val_accuracy: 0.7067\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.5565 - accuracy: 0.8188 - val_loss: 1.1950 - val_accuracy: 0.6745\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.5009 - accuracy: 0.8355 - val_loss: 1.2318 - val_accuracy: 0.6628\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.4168 - accuracy: 0.8667 - val_loss: 1.2715 - val_accuracy: 0.6422\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.3828 - accuracy: 0.8781 - val_loss: 1.1519 - val_accuracy: 0.6833\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2989 - accuracy: 0.9052 - val_loss: 1.2390 - val_accuracy: 0.7009\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2639 - accuracy: 0.9189 - val_loss: 1.2795 - val_accuracy: 0.6716\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2302 - accuracy: 0.9257 - val_loss: 1.3264 - val_accuracy: 0.6716\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 0.2023 - accuracy: 0.9420 - val_loss: 1.4151 - val_accuracy: 0.7067\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.1865 - accuracy: 0.9352 - val_loss: 1.4816 - val_accuracy: 0.6686\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1969 - accuracy: 0.9391 - val_loss: 1.3541 - val_accuracy: 0.6950\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1341 - accuracy: 0.9570 - val_loss: 1.3054 - val_accuracy: 0.7390\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1214 - accuracy: 0.9606 - val_loss: 1.4944 - val_accuracy: 0.7097\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1134 - accuracy: 0.9664 - val_loss: 1.4411 - val_accuracy: 0.6862\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1090 - accuracy: 0.9645 - val_loss: 1.6532 - val_accuracy: 0.6745\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1102 - accuracy: 0.9655 - val_loss: 1.5392 - val_accuracy: 0.6979\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0864 - accuracy: 0.9743 - val_loss: 1.4643 - val_accuracy: 0.7097\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0669 - accuracy: 0.9775 - val_loss: 1.5177 - val_accuracy: 0.6862\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0684 - accuracy: 0.9782 - val_loss: 1.5709 - val_accuracy: 0.6862\n",
"Epoch 28/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0767 - accuracy: 0.9782 - val_loss: 1.7227 - val_accuracy: 0.6686\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0594 - accuracy: 0.9824 - val_loss: 1.7858 - val_accuracy: 0.6862\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0524 - accuracy: 0.9827 - val_loss: 1.7304 - val_accuracy: 0.6833\n",
"Epoch 31/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0482 - accuracy: 0.9870 - val_loss: 1.7643 - val_accuracy: 0.6481\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0533 - accuracy: 0.9847 - val_loss: 1.8299 - val_accuracy: 0.6891\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0509 - accuracy: 0.9834 - val_loss: 1.7555 - val_accuracy: 0.6686\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0838 - accuracy: 0.9743 - val_loss: 1.7295 - val_accuracy: 0.6686\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1040 - accuracy: 0.9703 - val_loss: 1.7135 - val_accuracy: 0.6804\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0859 - accuracy: 0.9739 - val_loss: 1.8286 - val_accuracy: 0.6686\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0680 - accuracy: 0.9749 - val_loss: 1.9434 - val_accuracy: 0.6452\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0842 - accuracy: 0.9759 - val_loss: 1.8290 - val_accuracy: 0.7155\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0917 - accuracy: 0.9717 - val_loss: 1.9668 - val_accuracy: 0.6833\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0786 - accuracy: 0.9762 - val_loss: 1.9995 - val_accuracy: 0.6598\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0681 - accuracy: 0.9785 - val_loss: 1.8846 - val_accuracy: 0.6833\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0559 - accuracy: 0.9814 - val_loss: 1.9847 - val_accuracy: 0.6657\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0451 - accuracy: 0.9853 - val_loss: 1.7817 - val_accuracy: 0.7097\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0265 - accuracy: 0.9915 - val_loss: 1.9732 - val_accuracy: 0.6979\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0127 - accuracy: 0.9961 - val_loss: 1.9530 - val_accuracy: 0.7067\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0100 - accuracy: 0.9964 - val_loss: 1.9818 - val_accuracy: 0.7067\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0106 - accuracy: 0.9954 - val_loss: 2.0347 - val_accuracy: 0.7009\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0147 - accuracy: 0.9951 - val_loss: 1.9826 - val_accuracy: 0.7038\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0113 - accuracy: 0.9958 - val_loss: 2.0190 - val_accuracy: 0.7185\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0119 - accuracy: 0.9954 - val_loss: 2.0699 - val_accuracy: 0.7067\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0173 - accuracy: 0.9938 - val_loss: 1.9870 - val_accuracy: 0.7067\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0066 - accuracy: 0.9974 - val_loss: 2.0416 - val_accuracy: 0.7038\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0099 - accuracy: 0.9961 - val_loss: 2.0787 - val_accuracy: 0.6950\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0107 - accuracy: 0.9961 - val_loss: 2.0387 - val_accuracy: 0.7185\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0114 - accuracy: 0.9967 - val_loss: 2.1181 - val_accuracy: 0.6862\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0165 - accuracy: 0.9951 - val_loss: 2.0021 - val_accuracy: 0.6921\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0117 - accuracy: 0.9954 - val_loss: 2.0039 - val_accuracy: 0.6950\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0048 - accuracy: 0.9990 - val_loss: 2.0948 - val_accuracy: 0.7067\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0087 - accuracy: 0.9967 - val_loss: 2.1327 - val_accuracy: 0.6979\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0069 - accuracy: 0.9977 - val_loss: 2.1100 - val_accuracy: 0.7038\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0040 - accuracy: 0.9987 - val_loss: 2.1610 - val_accuracy: 0.7214\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0037 - accuracy: 0.9984 - val_loss: 2.1818 - val_accuracy: 0.7097\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0048 - accuracy: 0.9977 - val_loss: 2.2089 - val_accuracy: 0.7126\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0033 - accuracy: 0.9990 - val_loss: 2.1921 - val_accuracy: 0.6979\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0033 - accuracy: 0.9990 - val_loss: 2.2179 - val_accuracy: 0.7155\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0027 - accuracy: 0.9993 - val_loss: 2.2342 - val_accuracy: 0.7126\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0054 - accuracy: 0.9980 - val_loss: 2.2169 - val_accuracy: 0.7097\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0032 - accuracy: 0.9987 - val_loss: 2.2535 - val_accuracy: 0.7067\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0168 - accuracy: 0.9951 - val_loss: 2.3603 - val_accuracy: 0.6921\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4456 - accuracy: 0.8589 - val_loss: 1.8492 - val_accuracy: 0.6188\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2034 - accuracy: 0.9352 - val_loss: 1.7857 - val_accuracy: 0.6921\n",
"Epoch 72/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.0900 - accuracy: 0.9700 - val_loss: 1.7886 - val_accuracy: 0.6950\n",
"Epoch 73/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0640 - accuracy: 0.9808 - val_loss: 1.8231 - val_accuracy: 0.7067\n",
"Epoch 74/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.0406 - accuracy: 0.9870 - val_loss: 1.7896 - val_accuracy: 0.7155\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0380 - accuracy: 0.9899 - val_loss: 1.8168 - val_accuracy: 0.7126\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0139 - accuracy: 0.9958 - val_loss: 1.8488 - val_accuracy: 0.7243\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0131 - accuracy: 0.9961 - val_loss: 1.9961 - val_accuracy: 0.7126\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0102 - accuracy: 0.9958 - val_loss: 1.8737 - val_accuracy: 0.7097\n",
"Epoch 79/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0087 - accuracy: 0.9967 - val_loss: 2.0443 - val_accuracy: 0.7067\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0225 - accuracy: 0.9948 - val_loss: 2.0101 - val_accuracy: 0.6979\n",
"Score for fold 4: loss of 2.010115623474121; accuracy of 69.79472041130066%\n",
"------------------------------------------------------------------------\n",
"Training for fold 5 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 2s 29ms/step - loss: 4.0957 - accuracy: 0.0277 - val_loss: 3.9149 - val_accuracy: 0.0557\n",
"Epoch 2/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 3.3126 - accuracy: 0.1681 - val_loss: 2.7530 - val_accuracy: 0.2698\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 2.4300 - accuracy: 0.3705 - val_loss: 2.1799 - val_accuracy: 0.3959\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 1.9276 - accuracy: 0.4894 - val_loss: 1.7922 - val_accuracy: 0.5191\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 1.5082 - accuracy: 0.5940 - val_loss: 1.5284 - val_accuracy: 0.5748\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 1.2213 - accuracy: 0.6500 - val_loss: 1.4374 - val_accuracy: 0.5953\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 1.0042 - accuracy: 0.7058 - val_loss: 1.2530 - val_accuracy: 0.6276\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.8344 - accuracy: 0.7458 - val_loss: 1.3686 - val_accuracy: 0.6100\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.6988 - accuracy: 0.7827 - val_loss: 1.1704 - val_accuracy: 0.6598\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.6143 - accuracy: 0.8097 - val_loss: 1.1905 - val_accuracy: 0.6305\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.5154 - accuracy: 0.8355 - val_loss: 1.2243 - val_accuracy: 0.6540\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4702 - accuracy: 0.8475 - val_loss: 1.2174 - val_accuracy: 0.6774\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.3760 - accuracy: 0.8752 - val_loss: 1.2955 - val_accuracy: 0.6716\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.3415 - accuracy: 0.8853 - val_loss: 1.4055 - val_accuracy: 0.6598\n",
"Epoch 15/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.2726 - accuracy: 0.9176 - val_loss: 1.2981 - val_accuracy: 0.6950\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2598 - accuracy: 0.9176 - val_loss: 1.3694 - val_accuracy: 0.6657\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2196 - accuracy: 0.9293 - val_loss: 1.4253 - val_accuracy: 0.6804\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1842 - accuracy: 0.9433 - val_loss: 1.4901 - val_accuracy: 0.6686\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1414 - accuracy: 0.9547 - val_loss: 1.5075 - val_accuracy: 0.6774\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.1631 - accuracy: 0.9534 - val_loss: 1.4809 - val_accuracy: 0.6921\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1338 - accuracy: 0.9576 - val_loss: 1.6182 - val_accuracy: 0.6628\n",
"Epoch 22/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.1347 - accuracy: 0.9550 - val_loss: 1.5287 - val_accuracy: 0.6804\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0917 - accuracy: 0.9720 - val_loss: 1.6134 - val_accuracy: 0.6745\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0915 - accuracy: 0.9674 - val_loss: 1.6485 - val_accuracy: 0.6921\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0766 - accuracy: 0.9798 - val_loss: 1.6783 - val_accuracy: 0.6716\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0847 - accuracy: 0.9765 - val_loss: 1.6987 - val_accuracy: 0.6950\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0714 - accuracy: 0.9801 - val_loss: 1.6860 - val_accuracy: 0.7038\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0518 - accuracy: 0.9831 - val_loss: 1.7477 - val_accuracy: 0.6979\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0390 - accuracy: 0.9876 - val_loss: 1.8004 - val_accuracy: 0.7038\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0607 - accuracy: 0.9785 - val_loss: 1.9037 - val_accuracy: 0.6891\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0705 - accuracy: 0.9801 - val_loss: 1.8374 - val_accuracy: 0.7038\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0452 - accuracy: 0.9876 - val_loss: 1.9064 - val_accuracy: 0.6921\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0468 - accuracy: 0.9837 - val_loss: 1.9202 - val_accuracy: 0.6950\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0791 - accuracy: 0.9746 - val_loss: 1.9804 - val_accuracy: 0.6979\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0567 - accuracy: 0.9804 - val_loss: 1.9169 - val_accuracy: 0.7038\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0456 - accuracy: 0.9873 - val_loss: 1.9511 - val_accuracy: 0.6950\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0509 - accuracy: 0.9840 - val_loss: 2.0766 - val_accuracy: 0.6921\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0412 - accuracy: 0.9866 - val_loss: 2.1181 - val_accuracy: 0.7009\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0434 - accuracy: 0.9870 - val_loss: 2.0715 - val_accuracy: 0.6950\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0349 - accuracy: 0.9889 - val_loss: 1.9619 - val_accuracy: 0.7097\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0247 - accuracy: 0.9935 - val_loss: 1.9848 - val_accuracy: 0.6950\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0337 - accuracy: 0.9902 - val_loss: 2.1993 - val_accuracy: 0.6891\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0766 - accuracy: 0.9762 - val_loss: 2.2207 - val_accuracy: 0.6833\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0934 - accuracy: 0.9717 - val_loss: 2.1487 - val_accuracy: 0.6657\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1833 - accuracy: 0.9391 - val_loss: 2.1399 - val_accuracy: 0.6745\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0883 - accuracy: 0.9681 - val_loss: 2.1441 - val_accuracy: 0.6921\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0400 - accuracy: 0.9912 - val_loss: 2.0690 - val_accuracy: 0.7067\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0263 - accuracy: 0.9912 - val_loss: 2.2142 - val_accuracy: 0.6979\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0228 - accuracy: 0.9945 - val_loss: 2.2268 - val_accuracy: 0.6979\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0232 - accuracy: 0.9932 - val_loss: 2.2970 - val_accuracy: 0.6745\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0132 - accuracy: 0.9961 - val_loss: 2.1520 - val_accuracy: 0.6950\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0093 - accuracy: 0.9977 - val_loss: 2.2474 - val_accuracy: 0.7038\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0142 - accuracy: 0.9958 - val_loss: 2.2167 - val_accuracy: 0.7009\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0089 - accuracy: 0.9974 - val_loss: 2.2757 - val_accuracy: 0.7067\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0105 - accuracy: 0.9961 - val_loss: 2.2512 - val_accuracy: 0.7067\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0091 - accuracy: 0.9971 - val_loss: 2.3394 - val_accuracy: 0.6979\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0083 - accuracy: 0.9971 - val_loss: 2.2912 - val_accuracy: 0.7097\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0084 - accuracy: 0.9971 - val_loss: 2.3286 - val_accuracy: 0.7126\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0220 - accuracy: 0.9945 - val_loss: 2.2640 - val_accuracy: 0.7214\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0081 - accuracy: 0.9980 - val_loss: 2.3123 - val_accuracy: 0.7038\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0056 - accuracy: 0.9990 - val_loss: 2.3727 - val_accuracy: 0.7097\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0034 - accuracy: 0.9990 - val_loss: 2.3608 - val_accuracy: 0.7067\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0045 - accuracy: 0.9980 - val_loss: 2.3336 - val_accuracy: 0.7009\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0056 - accuracy: 0.9980 - val_loss: 2.3824 - val_accuracy: 0.7126\n",
"Epoch 65/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0060 - accuracy: 0.9980 - val_loss: 2.3902 - val_accuracy: 0.7126\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0050 - accuracy: 0.9980 - val_loss: 2.4322 - val_accuracy: 0.7097\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0075 - accuracy: 0.9977 - val_loss: 2.4015 - val_accuracy: 0.7038\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0097 - accuracy: 0.9971 - val_loss: 2.4781 - val_accuracy: 0.6979\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0059 - accuracy: 0.9980 - val_loss: 2.3712 - val_accuracy: 0.7009\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0062 - accuracy: 0.9980 - val_loss: 2.4851 - val_accuracy: 0.6950\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0043 - accuracy: 0.9987 - val_loss: 2.4434 - val_accuracy: 0.7038\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0139 - accuracy: 0.9967 - val_loss: 2.4102 - val_accuracy: 0.6979\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0048 - accuracy: 0.9993 - val_loss: 2.5112 - val_accuracy: 0.6804\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0068 - accuracy: 0.9980 - val_loss: 2.3889 - val_accuracy: 0.7097\n",
"Epoch 75/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0027 - accuracy: 0.9997 - val_loss: 2.4776 - val_accuracy: 0.7126\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0021 - accuracy: 0.9993 - val_loss: 2.5311 - val_accuracy: 0.7126\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0023 - accuracy: 0.9993 - val_loss: 2.5862 - val_accuracy: 0.7126\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0019 - accuracy: 0.9997 - val_loss: 2.6129 - val_accuracy: 0.6950\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0011 - accuracy: 1.0000 - val_loss: 2.5880 - val_accuracy: 0.7185\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0013 - accuracy: 0.9997 - val_loss: 2.6228 - val_accuracy: 0.7214\n",
"Score for fold 5: loss of 2.6227874755859375; accuracy of 72.14076519012451%\n",
"------------------------------------------------------------------------\n",
"Training for fold 6 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 36ms/step - loss: 4.1148 - accuracy: 0.0283 - val_loss: 4.0399 - val_accuracy: 0.0411\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 3.3910 - accuracy: 0.1659 - val_loss: 3.1004 - val_accuracy: 0.2434\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 2.3237 - accuracy: 0.3835 - val_loss: 2.3350 - val_accuracy: 0.3666\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.6999 - accuracy: 0.5301 - val_loss: 1.9660 - val_accuracy: 0.4692\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 1.3785 - accuracy: 0.6126 - val_loss: 1.7115 - val_accuracy: 0.5073\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 1.1542 - accuracy: 0.6631 - val_loss: 1.5212 - val_accuracy: 0.5982\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.9997 - accuracy: 0.7142 - val_loss: 1.4345 - val_accuracy: 0.6188\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.8552 - accuracy: 0.7481 - val_loss: 1.4262 - val_accuracy: 0.6129\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.7603 - accuracy: 0.7618 - val_loss: 1.3437 - val_accuracy: 0.6540\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.6438 - accuracy: 0.8084 - val_loss: 1.3825 - val_accuracy: 0.6510\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.5745 - accuracy: 0.8166 - val_loss: 1.3138 - val_accuracy: 0.6481\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4977 - accuracy: 0.8390 - val_loss: 1.3723 - val_accuracy: 0.6364\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.4482 - accuracy: 0.8693 - val_loss: 1.3061 - val_accuracy: 0.6510\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.3967 - accuracy: 0.8716 - val_loss: 1.3552 - val_accuracy: 0.6774\n",
"Epoch 15/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.3338 - accuracy: 0.8921 - val_loss: 1.3677 - val_accuracy: 0.6686\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2787 - accuracy: 0.9058 - val_loss: 1.4221 - val_accuracy: 0.6686\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2714 - accuracy: 0.9140 - val_loss: 1.4635 - val_accuracy: 0.6804\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2063 - accuracy: 0.9355 - val_loss: 1.4503 - val_accuracy: 0.6745\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1762 - accuracy: 0.9404 - val_loss: 1.5704 - val_accuracy: 0.6950\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1673 - accuracy: 0.9475 - val_loss: 1.6577 - val_accuracy: 0.6804\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1463 - accuracy: 0.9537 - val_loss: 1.6329 - val_accuracy: 0.6921\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.1376 - accuracy: 0.9557 - val_loss: 1.7712 - val_accuracy: 0.6774\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1146 - accuracy: 0.9658 - val_loss: 1.6993 - val_accuracy: 0.6979\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1143 - accuracy: 0.9671 - val_loss: 1.6955 - val_accuracy: 0.6891\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1044 - accuracy: 0.9664 - val_loss: 1.7522 - val_accuracy: 0.6804\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0917 - accuracy: 0.9765 - val_loss: 1.7470 - val_accuracy: 0.6950\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0635 - accuracy: 0.9821 - val_loss: 1.9017 - val_accuracy: 0.7038\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1112 - accuracy: 0.9690 - val_loss: 1.7850 - val_accuracy: 0.7067\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0774 - accuracy: 0.9749 - val_loss: 1.7810 - val_accuracy: 0.6862\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0619 - accuracy: 0.9837 - val_loss: 1.9436 - val_accuracy: 0.6833\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0640 - accuracy: 0.9785 - val_loss: 1.9468 - val_accuracy: 0.7009\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0485 - accuracy: 0.9850 - val_loss: 2.0082 - val_accuracy: 0.6774\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0584 - accuracy: 0.9827 - val_loss: 1.9324 - val_accuracy: 0.6979\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0628 - accuracy: 0.9801 - val_loss: 2.0684 - val_accuracy: 0.6833\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0587 - accuracy: 0.9837 - val_loss: 1.9733 - val_accuracy: 0.6862\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0531 - accuracy: 0.9814 - val_loss: 2.1672 - val_accuracy: 0.6716\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0720 - accuracy: 0.9782 - val_loss: 2.2526 - val_accuracy: 0.6745\n",
"Epoch 38/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.1128 - accuracy: 0.9619 - val_loss: 1.9593 - val_accuracy: 0.6804\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0791 - accuracy: 0.9772 - val_loss: 1.8698 - val_accuracy: 0.6950\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0655 - accuracy: 0.9834 - val_loss: 2.0168 - val_accuracy: 0.7067\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0314 - accuracy: 0.9906 - val_loss: 2.0212 - val_accuracy: 0.7302\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0227 - accuracy: 0.9915 - val_loss: 2.0997 - val_accuracy: 0.7214\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0150 - accuracy: 0.9945 - val_loss: 2.1883 - val_accuracy: 0.7009\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0147 - accuracy: 0.9948 - val_loss: 2.1351 - val_accuracy: 0.7126\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0267 - accuracy: 0.9919 - val_loss: 2.0791 - val_accuracy: 0.7038\n",
"Epoch 46/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 0.0560 - accuracy: 0.9844 - val_loss: 2.0741 - val_accuracy: 0.7097\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0423 - accuracy: 0.9866 - val_loss: 2.2812 - val_accuracy: 0.6862\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0497 - accuracy: 0.9850 - val_loss: 2.1963 - val_accuracy: 0.6862\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0375 - accuracy: 0.9876 - val_loss: 2.2636 - val_accuracy: 0.6745\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0935 - accuracy: 0.9733 - val_loss: 2.7384 - val_accuracy: 0.6217\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1078 - accuracy: 0.9661 - val_loss: 2.3401 - val_accuracy: 0.6598\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0624 - accuracy: 0.9785 - val_loss: 2.1644 - val_accuracy: 0.7067\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0560 - accuracy: 0.9827 - val_loss: 2.2475 - val_accuracy: 0.6804\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0531 - accuracy: 0.9834 - val_loss: 2.2439 - val_accuracy: 0.6833\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 0.0419 - accuracy: 0.9873 - val_loss: 2.2797 - val_accuracy: 0.6686\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0248 - accuracy: 0.9938 - val_loss: 2.1639 - val_accuracy: 0.6979\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0169 - accuracy: 0.9948 - val_loss: 2.2934 - val_accuracy: 0.6950\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0257 - accuracy: 0.9935 - val_loss: 2.3076 - val_accuracy: 0.6979\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0152 - accuracy: 0.9951 - val_loss: 2.3100 - val_accuracy: 0.7126\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0130 - accuracy: 0.9951 - val_loss: 2.3291 - val_accuracy: 0.7155\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0113 - accuracy: 0.9961 - val_loss: 2.3081 - val_accuracy: 0.7126\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0162 - accuracy: 0.9954 - val_loss: 2.4600 - val_accuracy: 0.6686\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0088 - accuracy: 0.9958 - val_loss: 2.3962 - val_accuracy: 0.7126\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0155 - accuracy: 0.9961 - val_loss: 2.5643 - val_accuracy: 0.6804\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0838 - accuracy: 0.9749 - val_loss: 2.3800 - val_accuracy: 0.6305\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1046 - accuracy: 0.9684 - val_loss: 2.4192 - val_accuracy: 0.6540\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0840 - accuracy: 0.9726 - val_loss: 2.7652 - val_accuracy: 0.6686\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1023 - accuracy: 0.9612 - val_loss: 2.3553 - val_accuracy: 0.6686\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0495 - accuracy: 0.9847 - val_loss: 2.4034 - val_accuracy: 0.6804\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0227 - accuracy: 0.9925 - val_loss: 2.4183 - val_accuracy: 0.6804\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0142 - accuracy: 0.9967 - val_loss: 2.3543 - val_accuracy: 0.6921\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0109 - accuracy: 0.9964 - val_loss: 2.5394 - val_accuracy: 0.6774\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0301 - accuracy: 0.9945 - val_loss: 2.5897 - val_accuracy: 0.6804\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0413 - accuracy: 0.9909 - val_loss: 2.4726 - val_accuracy: 0.6686\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0099 - accuracy: 0.9974 - val_loss: 2.4633 - val_accuracy: 0.6921\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0046 - accuracy: 0.9990 - val_loss: 2.6381 - val_accuracy: 0.6804\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0053 - accuracy: 0.9977 - val_loss: 2.5727 - val_accuracy: 0.6921\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0041 - accuracy: 0.9990 - val_loss: 2.6148 - val_accuracy: 0.6862\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0047 - accuracy: 0.9977 - val_loss: 2.7136 - val_accuracy: 0.6862\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0117 - accuracy: 0.9971 - val_loss: 2.5878 - val_accuracy: 0.6862\n",
"Score for fold 6: loss of 2.587796926498413; accuracy of 68.62170100212097%\n",
"------------------------------------------------------------------------\n",
"Training for fold 7 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 32ms/step - loss: 4.1187 - accuracy: 0.0293 - val_loss: 4.0362 - val_accuracy: 0.0616\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 3.4906 - accuracy: 0.1391 - val_loss: 2.9413 - val_accuracy: 0.2405\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 2.5844 - accuracy: 0.3337 - val_loss: 2.3244 - val_accuracy: 0.4106\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.9907 - accuracy: 0.4731 - val_loss: 1.9670 - val_accuracy: 0.4692\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 1.5275 - accuracy: 0.5702 - val_loss: 1.4563 - val_accuracy: 0.5689\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 1.2171 - accuracy: 0.6582 - val_loss: 1.3642 - val_accuracy: 0.5894\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.9933 - accuracy: 0.7061 - val_loss: 1.2089 - val_accuracy: 0.6686\n",
"Epoch 8/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.8493 - accuracy: 0.7475 - val_loss: 1.1941 - val_accuracy: 0.6305\n",
"Epoch 9/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.7384 - accuracy: 0.7748 - val_loss: 1.0122 - val_accuracy: 0.7038\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.6363 - accuracy: 0.8091 - val_loss: 1.0789 - val_accuracy: 0.6716\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.5308 - accuracy: 0.8390 - val_loss: 1.0410 - val_accuracy: 0.6921\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.4636 - accuracy: 0.8547 - val_loss: 1.0254 - val_accuracy: 0.7155\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4004 - accuracy: 0.8804 - val_loss: 1.0845 - val_accuracy: 0.7097\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.3436 - accuracy: 0.8938 - val_loss: 1.0772 - val_accuracy: 0.6833\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.2974 - accuracy: 0.9110 - val_loss: 1.0737 - val_accuracy: 0.6950\n",
"Epoch 16/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.2557 - accuracy: 0.9182 - val_loss: 1.0932 - val_accuracy: 0.6979\n",
"Epoch 17/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.2229 - accuracy: 0.9303 - val_loss: 1.1805 - val_accuracy: 0.6833\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1942 - accuracy: 0.9436 - val_loss: 1.1670 - val_accuracy: 0.7038\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1946 - accuracy: 0.9352 - val_loss: 1.1651 - val_accuracy: 0.7067\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1561 - accuracy: 0.9531 - val_loss: 1.1776 - val_accuracy: 0.7273\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1369 - accuracy: 0.9606 - val_loss: 1.1799 - val_accuracy: 0.7390\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1451 - accuracy: 0.9554 - val_loss: 1.1697 - val_accuracy: 0.7185\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1574 - accuracy: 0.9537 - val_loss: 1.1824 - val_accuracy: 0.6891\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1250 - accuracy: 0.9586 - val_loss: 1.2986 - val_accuracy: 0.7214\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1037 - accuracy: 0.9681 - val_loss: 1.2377 - val_accuracy: 0.7155\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0952 - accuracy: 0.9710 - val_loss: 1.1803 - val_accuracy: 0.7302\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0785 - accuracy: 0.9762 - val_loss: 1.2144 - val_accuracy: 0.7273\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0715 - accuracy: 0.9782 - val_loss: 1.3256 - val_accuracy: 0.7302\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0697 - accuracy: 0.9791 - val_loss: 1.3604 - val_accuracy: 0.7243\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0576 - accuracy: 0.9827 - val_loss: 1.3751 - val_accuracy: 0.7097\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0451 - accuracy: 0.9866 - val_loss: 1.3351 - val_accuracy: 0.7390\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0468 - accuracy: 0.9860 - val_loss: 1.3958 - val_accuracy: 0.7067\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0311 - accuracy: 0.9922 - val_loss: 1.4096 - val_accuracy: 0.7449\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0242 - accuracy: 0.9919 - val_loss: 1.4948 - val_accuracy: 0.7331\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0250 - accuracy: 0.9922 - val_loss: 1.4523 - val_accuracy: 0.7243\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0225 - accuracy: 0.9941 - val_loss: 1.5341 - val_accuracy: 0.7507\n",
"Epoch 37/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0308 - accuracy: 0.9902 - val_loss: 1.5003 - val_accuracy: 0.7595\n",
"Epoch 38/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0263 - accuracy: 0.9938 - val_loss: 1.5771 - val_accuracy: 0.7273\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0218 - accuracy: 0.9932 - val_loss: 1.6398 - val_accuracy: 0.7302\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0385 - accuracy: 0.9909 - val_loss: 1.5140 - val_accuracy: 0.7390\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0292 - accuracy: 0.9912 - val_loss: 1.6513 - val_accuracy: 0.7273\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0361 - accuracy: 0.9892 - val_loss: 1.5422 - val_accuracy: 0.7067\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0492 - accuracy: 0.9857 - val_loss: 1.5291 - val_accuracy: 0.7361\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0773 - accuracy: 0.9752 - val_loss: 1.6908 - val_accuracy: 0.7302\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1169 - accuracy: 0.9583 - val_loss: 1.7247 - val_accuracy: 0.7038\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2050 - accuracy: 0.9290 - val_loss: 1.6015 - val_accuracy: 0.7009\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0986 - accuracy: 0.9700 - val_loss: 1.6989 - val_accuracy: 0.7155\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0390 - accuracy: 0.9876 - val_loss: 1.4852 - val_accuracy: 0.7097\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0232 - accuracy: 0.9928 - val_loss: 1.5549 - val_accuracy: 0.7185\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0196 - accuracy: 0.9941 - val_loss: 1.5148 - val_accuracy: 0.7419\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0115 - accuracy: 0.9954 - val_loss: 1.6101 - val_accuracy: 0.7331\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0095 - accuracy: 0.9964 - val_loss: 1.5902 - val_accuracy: 0.7273\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0161 - accuracy: 0.9938 - val_loss: 1.6166 - val_accuracy: 0.7361\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0076 - accuracy: 0.9980 - val_loss: 1.6145 - val_accuracy: 0.7361\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0049 - accuracy: 0.9990 - val_loss: 1.6576 - val_accuracy: 0.7331\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0141 - accuracy: 0.9945 - val_loss: 1.6445 - val_accuracy: 0.7331\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 18ms/step - loss: 0.0076 - accuracy: 0.9971 - val_loss: 1.6751 - val_accuracy: 0.7126\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0065 - accuracy: 0.9977 - val_loss: 1.7235 - val_accuracy: 0.7302\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0037 - accuracy: 0.9997 - val_loss: 1.7154 - val_accuracy: 0.7331\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0047 - accuracy: 0.9977 - val_loss: 1.7104 - val_accuracy: 0.7302\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0041 - accuracy: 0.9987 - val_loss: 1.7310 - val_accuracy: 0.7302\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0027 - accuracy: 0.9993 - val_loss: 1.7724 - val_accuracy: 0.7361\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0027 - accuracy: 0.9997 - val_loss: 1.7717 - val_accuracy: 0.7302\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0027 - accuracy: 0.9993 - val_loss: 1.7782 - val_accuracy: 0.7331\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0021 - accuracy: 0.9997 - val_loss: 1.8067 - val_accuracy: 0.7302\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0026 - accuracy: 0.9990 - val_loss: 1.7722 - val_accuracy: 0.7302\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0022 - accuracy: 0.9993 - val_loss: 1.8434 - val_accuracy: 0.7419\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0029 - accuracy: 0.9993 - val_loss: 1.8424 - val_accuracy: 0.7331\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0073 - accuracy: 0.9980 - val_loss: 1.8713 - val_accuracy: 0.7302\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0162 - accuracy: 0.9954 - val_loss: 1.7701 - val_accuracy: 0.7185\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0132 - accuracy: 0.9964 - val_loss: 1.8570 - val_accuracy: 0.7243\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0388 - accuracy: 0.9876 - val_loss: 2.1655 - val_accuracy: 0.6598\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.3568 - accuracy: 0.8928 - val_loss: 1.3760 - val_accuracy: 0.7097\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.1236 - accuracy: 0.9554 - val_loss: 1.5120 - val_accuracy: 0.6891\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0555 - accuracy: 0.9808 - val_loss: 1.6058 - val_accuracy: 0.7155\n",
"Epoch 76/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0329 - accuracy: 0.9889 - val_loss: 1.6033 - val_accuracy: 0.7126\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0152 - accuracy: 0.9954 - val_loss: 1.7478 - val_accuracy: 0.6979\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0068 - accuracy: 0.9987 - val_loss: 1.7064 - val_accuracy: 0.7038\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0051 - accuracy: 0.9990 - val_loss: 1.6698 - val_accuracy: 0.7097\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0051 - accuracy: 0.9980 - val_loss: 1.7486 - val_accuracy: 0.7067\n",
"Score for fold 7: loss of 1.7485564947128296; accuracy of 70.67448496818542%\n",
"------------------------------------------------------------------------\n",
"Training for fold 8 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 2s 30ms/step - loss: 4.1014 - accuracy: 0.0235 - val_loss: 3.9662 - val_accuracy: 0.0352\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 3.3675 - accuracy: 0.1610 - val_loss: 2.8151 - val_accuracy: 0.2933\n",
"Epoch 3/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 2.3748 - accuracy: 0.3708 - val_loss: 2.0167 - val_accuracy: 0.4663\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 1.7164 - accuracy: 0.5360 - val_loss: 1.6771 - val_accuracy: 0.5601\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.2984 - accuracy: 0.6259 - val_loss: 1.4134 - val_accuracy: 0.6188\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 1.0550 - accuracy: 0.6924 - val_loss: 1.2426 - val_accuracy: 0.6686\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.8872 - accuracy: 0.7302 - val_loss: 1.3604 - val_accuracy: 0.6276\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.7470 - accuracy: 0.7742 - val_loss: 1.2228 - val_accuracy: 0.6452\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.6179 - accuracy: 0.8198 - val_loss: 1.2048 - val_accuracy: 0.6745\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.5448 - accuracy: 0.8280 - val_loss: 1.1237 - val_accuracy: 0.7038\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.4397 - accuracy: 0.8583 - val_loss: 1.1684 - val_accuracy: 0.6804\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.3713 - accuracy: 0.8847 - val_loss: 1.1747 - val_accuracy: 0.7009\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.3088 - accuracy: 0.9088 - val_loss: 1.1648 - val_accuracy: 0.7273\n",
"Epoch 14/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.2471 - accuracy: 0.9277 - val_loss: 1.3007 - val_accuracy: 0.6950\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.2676 - accuracy: 0.9140 - val_loss: 1.2716 - val_accuracy: 0.6921\n",
"Epoch 16/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.2137 - accuracy: 0.9296 - val_loss: 1.3441 - val_accuracy: 0.6950\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1642 - accuracy: 0.9521 - val_loss: 1.3759 - val_accuracy: 0.7009\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1493 - accuracy: 0.9537 - val_loss: 1.5861 - val_accuracy: 0.6804\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1373 - accuracy: 0.9580 - val_loss: 1.4002 - val_accuracy: 0.7097\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0969 - accuracy: 0.9717 - val_loss: 1.4645 - val_accuracy: 0.7185\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1161 - accuracy: 0.9651 - val_loss: 1.5355 - val_accuracy: 0.6950\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1162 - accuracy: 0.9668 - val_loss: 1.5424 - val_accuracy: 0.6950\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1059 - accuracy: 0.9677 - val_loss: 1.6042 - val_accuracy: 0.6862\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0552 - accuracy: 0.9837 - val_loss: 1.7489 - val_accuracy: 0.6833\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0721 - accuracy: 0.9785 - val_loss: 1.6987 - val_accuracy: 0.6774\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0589 - accuracy: 0.9798 - val_loss: 1.6675 - val_accuracy: 0.6979\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0447 - accuracy: 0.9870 - val_loss: 1.6711 - val_accuracy: 0.7185\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0453 - accuracy: 0.9860 - val_loss: 1.9124 - val_accuracy: 0.6628\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0511 - accuracy: 0.9837 - val_loss: 1.7993 - val_accuracy: 0.6891\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0460 - accuracy: 0.9860 - val_loss: 1.8997 - val_accuracy: 0.6628\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0449 - accuracy: 0.9866 - val_loss: 1.9569 - val_accuracy: 0.6950\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0612 - accuracy: 0.9827 - val_loss: 1.8769 - val_accuracy: 0.6891\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0427 - accuracy: 0.9870 - val_loss: 2.0371 - val_accuracy: 0.6804\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.1277 - accuracy: 0.9622 - val_loss: 1.9033 - val_accuracy: 0.6657\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1007 - accuracy: 0.9668 - val_loss: 2.0515 - val_accuracy: 0.6774\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0603 - accuracy: 0.9795 - val_loss: 1.9887 - val_accuracy: 0.6686\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0368 - accuracy: 0.9860 - val_loss: 1.9042 - val_accuracy: 0.7009\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0508 - accuracy: 0.9850 - val_loss: 1.8537 - val_accuracy: 0.7185\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0470 - accuracy: 0.9879 - val_loss: 1.9760 - val_accuracy: 0.7185\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0324 - accuracy: 0.9925 - val_loss: 1.9539 - val_accuracy: 0.7038\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0337 - accuracy: 0.9899 - val_loss: 1.9607 - val_accuracy: 0.7067\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0166 - accuracy: 0.9951 - val_loss: 1.9701 - val_accuracy: 0.7009\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0133 - accuracy: 0.9951 - val_loss: 1.9870 - val_accuracy: 0.7185\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0322 - accuracy: 0.9912 - val_loss: 2.0917 - val_accuracy: 0.6716\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0465 - accuracy: 0.9847 - val_loss: 2.3282 - val_accuracy: 0.6657\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0549 - accuracy: 0.9834 - val_loss: 2.2477 - val_accuracy: 0.6716\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1012 - accuracy: 0.9694 - val_loss: 2.0189 - val_accuracy: 0.6686\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0741 - accuracy: 0.9808 - val_loss: 1.9903 - val_accuracy: 0.7009\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0744 - accuracy: 0.9762 - val_loss: 2.1765 - val_accuracy: 0.6774\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0396 - accuracy: 0.9863 - val_loss: 2.0859 - val_accuracy: 0.7155\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0404 - accuracy: 0.9879 - val_loss: 2.1643 - val_accuracy: 0.6657\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0499 - accuracy: 0.9837 - val_loss: 1.9819 - val_accuracy: 0.6979\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0317 - accuracy: 0.9899 - val_loss: 2.2071 - val_accuracy: 0.6979\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0315 - accuracy: 0.9915 - val_loss: 2.2613 - val_accuracy: 0.6862\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0171 - accuracy: 0.9958 - val_loss: 2.0460 - val_accuracy: 0.7097\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0254 - accuracy: 0.9915 - val_loss: 2.1479 - val_accuracy: 0.6950\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0154 - accuracy: 0.9954 - val_loss: 2.1073 - val_accuracy: 0.7185\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0171 - accuracy: 0.9948 - val_loss: 2.1768 - val_accuracy: 0.7243\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0127 - accuracy: 0.9967 - val_loss: 2.0410 - val_accuracy: 0.7009\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0222 - accuracy: 0.9928 - val_loss: 2.0760 - val_accuracy: 0.6921\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0139 - accuracy: 0.9951 - val_loss: 2.1351 - val_accuracy: 0.6804\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0221 - accuracy: 0.9945 - val_loss: 1.9774 - val_accuracy: 0.7067\n",
"Epoch 63/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0095 - accuracy: 0.9961 - val_loss: 2.1441 - val_accuracy: 0.7185\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0150 - accuracy: 0.9948 - val_loss: 2.3554 - val_accuracy: 0.7009\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0288 - accuracy: 0.9919 - val_loss: 2.5546 - val_accuracy: 0.6891\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0522 - accuracy: 0.9837 - val_loss: 2.3763 - val_accuracy: 0.6628\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1294 - accuracy: 0.9599 - val_loss: 2.2744 - val_accuracy: 0.6657\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1175 - accuracy: 0.9661 - val_loss: 2.1870 - val_accuracy: 0.6804\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0571 - accuracy: 0.9847 - val_loss: 2.0602 - val_accuracy: 0.7273\n",
"Epoch 70/80\n",
"62/62 [==============================] - 2s 28ms/step - loss: 0.0299 - accuracy: 0.9912 - val_loss: 2.3616 - val_accuracy: 0.6774\n",
"Epoch 71/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0224 - accuracy: 0.9928 - val_loss: 2.2492 - val_accuracy: 0.7038\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0077 - accuracy: 0.9987 - val_loss: 2.2623 - val_accuracy: 0.7126\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0080 - accuracy: 0.9977 - val_loss: 2.2675 - val_accuracy: 0.7067\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0127 - accuracy: 0.9954 - val_loss: 2.2141 - val_accuracy: 0.7155\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0051 - accuracy: 0.9984 - val_loss: 2.2032 - val_accuracy: 0.7155\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0087 - accuracy: 0.9977 - val_loss: 2.2635 - val_accuracy: 0.7038\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0104 - accuracy: 0.9974 - val_loss: 2.2820 - val_accuracy: 0.6950\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0042 - accuracy: 0.9987 - val_loss: 2.2935 - val_accuracy: 0.7009\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0029 - accuracy: 0.9993 - val_loss: 2.3007 - val_accuracy: 0.7038\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0032 - accuracy: 0.9993 - val_loss: 2.2872 - val_accuracy: 0.7067\n",
"Score for fold 8: loss of 2.2871928215026855; accuracy of 70.67448496818542%\n",
"------------------------------------------------------------------------\n",
"Training for fold 9 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 2s 30ms/step - loss: 4.1220 - accuracy: 0.0222 - val_loss: 4.0618 - val_accuracy: 0.0499\n",
"Epoch 2/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 3.4239 - accuracy: 0.1580 - val_loss: 2.7325 - val_accuracy: 0.2815\n",
"Epoch 3/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 2.2464 - accuracy: 0.4070 - val_loss: 2.0154 - val_accuracy: 0.4839\n",
"Epoch 4/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.5865 - accuracy: 0.5598 - val_loss: 1.7177 - val_accuracy: 0.5396\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.1898 - accuracy: 0.6618 - val_loss: 1.4536 - val_accuracy: 0.6041\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.9746 - accuracy: 0.7155 - val_loss: 1.3604 - val_accuracy: 0.6305\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.8137 - accuracy: 0.7530 - val_loss: 1.2402 - val_accuracy: 0.6481\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.6484 - accuracy: 0.7996 - val_loss: 1.1603 - val_accuracy: 0.6774\n",
"Epoch 9/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.5635 - accuracy: 0.8260 - val_loss: 1.2853 - val_accuracy: 0.6510\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.4706 - accuracy: 0.8540 - val_loss: 1.3185 - val_accuracy: 0.6510\n",
"Epoch 11/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.3932 - accuracy: 0.8788 - val_loss: 1.2248 - val_accuracy: 0.6686\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.3314 - accuracy: 0.8987 - val_loss: 1.2374 - val_accuracy: 0.6598\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2939 - accuracy: 0.9078 - val_loss: 1.3277 - val_accuracy: 0.6716\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.2808 - accuracy: 0.9130 - val_loss: 1.3950 - val_accuracy: 0.6686\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1852 - accuracy: 0.9466 - val_loss: 1.4783 - val_accuracy: 0.6686\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1973 - accuracy: 0.9397 - val_loss: 1.4524 - val_accuracy: 0.6628\n",
"Epoch 17/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1621 - accuracy: 0.9479 - val_loss: 1.5335 - val_accuracy: 0.6569\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1391 - accuracy: 0.9583 - val_loss: 1.5065 - val_accuracy: 0.6862\n",
"Epoch 19/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1073 - accuracy: 0.9713 - val_loss: 1.5400 - val_accuracy: 0.6891\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1044 - accuracy: 0.9700 - val_loss: 1.6070 - val_accuracy: 0.6628\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0819 - accuracy: 0.9752 - val_loss: 1.6398 - val_accuracy: 0.6540\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0811 - accuracy: 0.9769 - val_loss: 1.7297 - val_accuracy: 0.6804\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0870 - accuracy: 0.9775 - val_loss: 1.6981 - val_accuracy: 0.6686\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1074 - accuracy: 0.9720 - val_loss: 1.5424 - val_accuracy: 0.7126\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0692 - accuracy: 0.9811 - val_loss: 1.6157 - val_accuracy: 0.6833\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0401 - accuracy: 0.9879 - val_loss: 1.7492 - val_accuracy: 0.6862\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0473 - accuracy: 0.9860 - val_loss: 1.7887 - val_accuracy: 0.6833\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0510 - accuracy: 0.9853 - val_loss: 1.7385 - val_accuracy: 0.6833\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0486 - accuracy: 0.9860 - val_loss: 1.9022 - val_accuracy: 0.6686\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0577 - accuracy: 0.9840 - val_loss: 1.8568 - val_accuracy: 0.6804\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0619 - accuracy: 0.9834 - val_loss: 1.8916 - val_accuracy: 0.6774\n",
"Epoch 32/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0758 - accuracy: 0.9765 - val_loss: 1.8975 - val_accuracy: 0.6628\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0875 - accuracy: 0.9756 - val_loss: 2.1030 - val_accuracy: 0.6657\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0893 - accuracy: 0.9668 - val_loss: 1.9254 - val_accuracy: 0.6657\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0622 - accuracy: 0.9798 - val_loss: 1.8718 - val_accuracy: 0.6891\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0418 - accuracy: 0.9876 - val_loss: 1.9749 - val_accuracy: 0.6921\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0465 - accuracy: 0.9853 - val_loss: 1.9971 - val_accuracy: 0.6716\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0718 - accuracy: 0.9775 - val_loss: 2.0241 - val_accuracy: 0.6862\n",
"Epoch 39/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0584 - accuracy: 0.9801 - val_loss: 1.9995 - val_accuracy: 0.6833\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0342 - accuracy: 0.9899 - val_loss: 2.1054 - val_accuracy: 0.6657\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0263 - accuracy: 0.9925 - val_loss: 2.1175 - val_accuracy: 0.6833\n",
"Epoch 42/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0124 - accuracy: 0.9964 - val_loss: 2.0999 - val_accuracy: 0.6862\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0403 - accuracy: 0.9889 - val_loss: 2.2080 - val_accuracy: 0.6716\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0590 - accuracy: 0.9837 - val_loss: 2.2236 - val_accuracy: 0.6774\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0852 - accuracy: 0.9775 - val_loss: 1.9087 - val_accuracy: 0.6686\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0298 - accuracy: 0.9932 - val_loss: 1.9759 - val_accuracy: 0.6804\n",
"Epoch 47/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0225 - accuracy: 0.9935 - val_loss: 2.0122 - val_accuracy: 0.6950\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0143 - accuracy: 0.9961 - val_loss: 1.9644 - val_accuracy: 0.6950\n",
"Epoch 49/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0059 - accuracy: 0.9980 - val_loss: 2.0185 - val_accuracy: 0.6921\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0058 - accuracy: 0.9984 - val_loss: 2.0507 - val_accuracy: 0.6979\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0055 - accuracy: 0.9984 - val_loss: 2.1037 - val_accuracy: 0.6979\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0065 - accuracy: 0.9984 - val_loss: 2.1036 - val_accuracy: 0.6950\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0042 - accuracy: 0.9987 - val_loss: 2.1285 - val_accuracy: 0.6921\n",
"Epoch 54/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0044 - accuracy: 0.9990 - val_loss: 2.1856 - val_accuracy: 0.6950\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0044 - accuracy: 0.9984 - val_loss: 2.2159 - val_accuracy: 0.6950\n",
"Epoch 56/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0050 - accuracy: 0.9987 - val_loss: 2.2115 - val_accuracy: 0.6921\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0054 - accuracy: 0.9984 - val_loss: 2.2214 - val_accuracy: 0.6891\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0054 - accuracy: 0.9984 - val_loss: 2.1961 - val_accuracy: 0.7067\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0056 - accuracy: 0.9984 - val_loss: 2.1579 - val_accuracy: 0.6950\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0050 - accuracy: 0.9987 - val_loss: 2.2094 - val_accuracy: 0.6833\n",
"Epoch 61/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0063 - accuracy: 0.9980 - val_loss: 2.2343 - val_accuracy: 0.6891\n",
"Epoch 62/80\n",
"62/62 [==============================] - 2s 26ms/step - loss: 0.0042 - accuracy: 0.9984 - val_loss: 2.2284 - val_accuracy: 0.7038\n",
"Epoch 63/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0049 - accuracy: 0.9980 - val_loss: 2.2338 - val_accuracy: 0.6979\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0091 - accuracy: 0.9974 - val_loss: 2.6446 - val_accuracy: 0.6686\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.2005 - accuracy: 0.9342 - val_loss: 2.0209 - val_accuracy: 0.6481\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2535 - accuracy: 0.9185 - val_loss: 2.1616 - val_accuracy: 0.6305\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1401 - accuracy: 0.9508 - val_loss: 1.9564 - val_accuracy: 0.6979\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0759 - accuracy: 0.9788 - val_loss: 1.9610 - val_accuracy: 0.6833\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0220 - accuracy: 0.9935 - val_loss: 1.8584 - val_accuracy: 0.7243\n",
"Epoch 70/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0138 - accuracy: 0.9971 - val_loss: 1.9887 - val_accuracy: 0.7273\n",
"Epoch 71/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0078 - accuracy: 0.9977 - val_loss: 2.0536 - val_accuracy: 0.7126\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0107 - accuracy: 0.9980 - val_loss: 2.0608 - val_accuracy: 0.7009\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0021 - accuracy: 1.0000 - val_loss: 2.0809 - val_accuracy: 0.7038\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0020 - accuracy: 0.9997 - val_loss: 2.1431 - val_accuracy: 0.7214\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0027 - accuracy: 0.9993 - val_loss: 2.1357 - val_accuracy: 0.7155\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0018 - accuracy: 0.9997 - val_loss: 2.1865 - val_accuracy: 0.7155\n",
"Epoch 77/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0013 - accuracy: 1.0000 - val_loss: 2.2132 - val_accuracy: 0.7155\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0021 - accuracy: 0.9990 - val_loss: 2.2929 - val_accuracy: 0.7273\n",
"Epoch 79/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0037 - accuracy: 0.9987 - val_loss: 2.2580 - val_accuracy: 0.7038\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0137 - accuracy: 0.9958 - val_loss: 2.1615 - val_accuracy: 0.7214\n",
"Score for fold 9: loss of 2.161499500274658; accuracy of 72.14076519012451%\n",
"------------------------------------------------------------------------\n",
"Training for fold 10 ...\n",
"Epoch 1/80\n",
"62/62 [==============================] - 3s 32ms/step - loss: 4.0988 - accuracy: 0.0316 - val_loss: 3.9318 - val_accuracy: 0.0850\n",
"Epoch 2/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 3.2523 - accuracy: 0.1965 - val_loss: 2.8015 - val_accuracy: 0.2551\n",
"Epoch 3/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 2.1609 - accuracy: 0.4174 - val_loss: 2.1322 - val_accuracy: 0.3988\n",
"Epoch 4/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 1.6546 - accuracy: 0.5419 - val_loss: 1.7082 - val_accuracy: 0.5367\n",
"Epoch 5/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 1.2903 - accuracy: 0.6344 - val_loss: 1.5384 - val_accuracy: 0.5630\n",
"Epoch 6/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 1.0680 - accuracy: 0.6830 - val_loss: 1.5125 - val_accuracy: 0.5572\n",
"Epoch 7/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.9057 - accuracy: 0.7263 - val_loss: 1.3382 - val_accuracy: 0.6276\n",
"Epoch 8/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.7582 - accuracy: 0.7638 - val_loss: 1.3581 - val_accuracy: 0.6129\n",
"Epoch 9/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.6591 - accuracy: 0.7908 - val_loss: 1.3036 - val_accuracy: 0.6510\n",
"Epoch 10/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.5628 - accuracy: 0.8254 - val_loss: 1.2775 - val_accuracy: 0.6569\n",
"Epoch 11/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.4706 - accuracy: 0.8543 - val_loss: 1.2692 - val_accuracy: 0.6540\n",
"Epoch 12/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.4159 - accuracy: 0.8615 - val_loss: 1.2941 - val_accuracy: 0.6716\n",
"Epoch 13/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.3521 - accuracy: 0.8869 - val_loss: 1.3421 - val_accuracy: 0.6422\n",
"Epoch 14/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.3041 - accuracy: 0.9045 - val_loss: 1.2835 - val_accuracy: 0.6950\n",
"Epoch 15/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.2570 - accuracy: 0.9198 - val_loss: 1.4327 - val_accuracy: 0.6686\n",
"Epoch 16/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.2081 - accuracy: 0.9358 - val_loss: 1.4217 - val_accuracy: 0.6628\n",
"Epoch 17/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.1817 - accuracy: 0.9423 - val_loss: 1.5014 - val_accuracy: 0.6598\n",
"Epoch 18/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.1632 - accuracy: 0.9479 - val_loss: 1.6336 - val_accuracy: 0.6833\n",
"Epoch 19/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.1646 - accuracy: 0.9466 - val_loss: 1.4928 - val_accuracy: 0.6716\n",
"Epoch 20/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1249 - accuracy: 0.9664 - val_loss: 1.5707 - val_accuracy: 0.6598\n",
"Epoch 21/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.1014 - accuracy: 0.9707 - val_loss: 1.7214 - val_accuracy: 0.6921\n",
"Epoch 22/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1041 - accuracy: 0.9645 - val_loss: 1.6541 - val_accuracy: 0.6833\n",
"Epoch 23/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1005 - accuracy: 0.9717 - val_loss: 1.6808 - val_accuracy: 0.6628\n",
"Epoch 24/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0876 - accuracy: 0.9743 - val_loss: 1.6991 - val_accuracy: 0.6628\n",
"Epoch 25/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0730 - accuracy: 0.9814 - val_loss: 1.7868 - val_accuracy: 0.6862\n",
"Epoch 26/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0861 - accuracy: 0.9730 - val_loss: 1.9939 - val_accuracy: 0.6657\n",
"Epoch 27/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.1211 - accuracy: 0.9580 - val_loss: 1.8650 - val_accuracy: 0.6481\n",
"Epoch 28/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.1213 - accuracy: 0.9619 - val_loss: 1.7981 - val_accuracy: 0.6745\n",
"Epoch 29/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0621 - accuracy: 0.9795 - val_loss: 1.8225 - val_accuracy: 0.6950\n",
"Epoch 30/80\n",
"62/62 [==============================] - 1s 19ms/step - loss: 0.0417 - accuracy: 0.9886 - val_loss: 1.8746 - val_accuracy: 0.6862\n",
"Epoch 31/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0287 - accuracy: 0.9915 - val_loss: 1.8727 - val_accuracy: 0.6950\n",
"Epoch 32/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0322 - accuracy: 0.9902 - val_loss: 1.9873 - val_accuracy: 0.6950\n",
"Epoch 33/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0477 - accuracy: 0.9883 - val_loss: 2.0559 - val_accuracy: 0.6774\n",
"Epoch 34/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0594 - accuracy: 0.9795 - val_loss: 1.9762 - val_accuracy: 0.6950\n",
"Epoch 35/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0662 - accuracy: 0.9808 - val_loss: 1.9717 - val_accuracy: 0.6804\n",
"Epoch 36/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0337 - accuracy: 0.9909 - val_loss: 2.0102 - val_accuracy: 0.6804\n",
"Epoch 37/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0250 - accuracy: 0.9935 - val_loss: 2.1170 - val_accuracy: 0.6774\n",
"Epoch 38/80\n",
"62/62 [==============================] - 1s 20ms/step - loss: 0.0213 - accuracy: 0.9941 - val_loss: 2.0537 - val_accuracy: 0.6833\n",
"Epoch 39/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0221 - accuracy: 0.9925 - val_loss: 2.0892 - val_accuracy: 0.6921\n",
"Epoch 40/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0373 - accuracy: 0.9879 - val_loss: 2.1625 - val_accuracy: 0.6686\n",
"Epoch 41/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0380 - accuracy: 0.9870 - val_loss: 2.0608 - val_accuracy: 0.6833\n",
"Epoch 42/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0629 - accuracy: 0.9814 - val_loss: 2.2050 - val_accuracy: 0.6745\n",
"Epoch 43/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0960 - accuracy: 0.9681 - val_loss: 2.2725 - val_accuracy: 0.6598\n",
"Epoch 44/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0980 - accuracy: 0.9694 - val_loss: 2.1557 - val_accuracy: 0.6540\n",
"Epoch 45/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0654 - accuracy: 0.9746 - val_loss: 2.2921 - val_accuracy: 0.6481\n",
"Epoch 46/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0513 - accuracy: 0.9840 - val_loss: 2.1890 - val_accuracy: 0.6862\n",
"Epoch 47/80\n",
"62/62 [==============================] - 2s 24ms/step - loss: 0.0279 - accuracy: 0.9925 - val_loss: 2.1796 - val_accuracy: 0.6921\n",
"Epoch 48/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0114 - accuracy: 0.9971 - val_loss: 2.2145 - val_accuracy: 0.6862\n",
"Epoch 49/80\n",
"62/62 [==============================] - 2s 27ms/step - loss: 0.0084 - accuracy: 0.9977 - val_loss: 2.2101 - val_accuracy: 0.6921\n",
"Epoch 50/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0129 - accuracy: 0.9961 - val_loss: 2.1808 - val_accuracy: 0.6921\n",
"Epoch 51/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0098 - accuracy: 0.9971 - val_loss: 2.1699 - val_accuracy: 0.7038\n",
"Epoch 52/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0074 - accuracy: 0.9980 - val_loss: 2.2066 - val_accuracy: 0.6950\n",
"Epoch 53/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0068 - accuracy: 0.9974 - val_loss: 2.2522 - val_accuracy: 0.6950\n",
"Epoch 54/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0092 - accuracy: 0.9967 - val_loss: 2.2748 - val_accuracy: 0.6862\n",
"Epoch 55/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0112 - accuracy: 0.9964 - val_loss: 2.2931 - val_accuracy: 0.6950\n",
"Epoch 56/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0083 - accuracy: 0.9971 - val_loss: 2.2434 - val_accuracy: 0.6979\n",
"Epoch 57/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0062 - accuracy: 0.9971 - val_loss: 2.2799 - val_accuracy: 0.6891\n",
"Epoch 58/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0106 - accuracy: 0.9961 - val_loss: 2.2980 - val_accuracy: 0.6950\n",
"Epoch 59/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0084 - accuracy: 0.9974 - val_loss: 2.2835 - val_accuracy: 0.7097\n",
"Epoch 60/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0084 - accuracy: 0.9964 - val_loss: 2.3406 - val_accuracy: 0.6862\n",
"Epoch 61/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0117 - accuracy: 0.9961 - val_loss: 2.3019 - val_accuracy: 0.6921\n",
"Epoch 62/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0071 - accuracy: 0.9980 - val_loss: 2.2639 - val_accuracy: 0.7038\n",
"Epoch 63/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0130 - accuracy: 0.9967 - val_loss: 2.2944 - val_accuracy: 0.6950\n",
"Epoch 64/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1294 - accuracy: 0.9658 - val_loss: 2.5276 - val_accuracy: 0.5953\n",
"Epoch 65/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.2799 - accuracy: 0.9065 - val_loss: 2.0809 - val_accuracy: 0.6129\n",
"Epoch 66/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.1344 - accuracy: 0.9573 - val_loss: 2.1698 - val_accuracy: 0.6598\n",
"Epoch 67/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0675 - accuracy: 0.9769 - val_loss: 2.4213 - val_accuracy: 0.6745\n",
"Epoch 68/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0564 - accuracy: 0.9834 - val_loss: 2.0034 - val_accuracy: 0.6745\n",
"Epoch 69/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0314 - accuracy: 0.9899 - val_loss: 2.0989 - val_accuracy: 0.6862\n",
"Epoch 70/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0160 - accuracy: 0.9951 - val_loss: 2.1941 - val_accuracy: 0.6950\n",
"Epoch 71/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0143 - accuracy: 0.9967 - val_loss: 2.2414 - val_accuracy: 0.6774\n",
"Epoch 72/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0110 - accuracy: 0.9977 - val_loss: 2.2457 - val_accuracy: 0.6833\n",
"Epoch 73/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0142 - accuracy: 0.9954 - val_loss: 2.3332 - val_accuracy: 0.6979\n",
"Epoch 74/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0168 - accuracy: 0.9951 - val_loss: 2.2853 - val_accuracy: 0.6862\n",
"Epoch 75/80\n",
"62/62 [==============================] - 1s 23ms/step - loss: 0.0283 - accuracy: 0.9915 - val_loss: 2.4522 - val_accuracy: 0.6862\n",
"Epoch 76/80\n",
"62/62 [==============================] - 1s 21ms/step - loss: 0.0166 - accuracy: 0.9964 - val_loss: 2.4504 - val_accuracy: 0.6862\n",
"Epoch 77/80\n",
"62/62 [==============================] - 2s 25ms/step - loss: 0.0255 - accuracy: 0.9932 - val_loss: 2.4046 - val_accuracy: 0.6774\n",
"Epoch 78/80\n",
"62/62 [==============================] - 1s 24ms/step - loss: 0.0214 - accuracy: 0.9948 - val_loss: 2.4315 - val_accuracy: 0.6804\n",
"Epoch 79/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0083 - accuracy: 0.9984 - val_loss: 2.4370 - val_accuracy: 0.6921\n",
"Epoch 80/80\n",
"62/62 [==============================] - 1s 22ms/step - loss: 0.0029 - accuracy: 0.9993 - val_loss: 2.4862 - val_accuracy: 0.7009\n",
"Score for fold 10: loss of 2.486210823059082; accuracy of 70.08797526359558%\n"
]
}
],
"source": [
"acc_per_fold = []\n",
"loss_per_fold = []\n",
"kf = KFold(n_splits=10, random_state=42, shuffle=True)\n",
"fold = 1\n",
"for train_index, test_index in kf.split(data, labels):\n",
"    model = models.Sequential([tf.keras.Input(shape=(32, 32, 1))])\n",
" model.add(layers.Conv2D(32, (3, 3), activation='relu'))\n",
" model.add(layers.MaxPooling2D((2, 2)))\n",
" model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
" model.add(layers.MaxPooling2D((2, 2)))\n",
" model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
" model.add(layers.Flatten())\n",
" model.add(layers.Dense(128, activation='relu'))\n",
" model.add(layers.Dense(62, activation='softmax'))\n",
"\n",
" model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),\n",
"                  loss=tf.keras.losses.categorical_crossentropy,\n",
" metrics=['accuracy'])\n",
" \n",
" print('------------------------------------------------------------------------')\n",
" print(f'Training for fold {fold} ...')\n",
" \n",
" train_images, test_images = data[train_index], data[test_index]\n",
" train_labels, test_labels = labels[train_index], labels[test_index]\n",
"\n",
" train_images = train_images.reshape(-1, 32, 32, 1)\n",
" test_images = test_images.reshape(-1, 32, 32, 1)\n",
"\n",
" history = model.fit(x=train_images, \n",
" y=train_labels, \n",
" validation_data=(test_images, test_labels), \n",
" batch_size=BATCH_SIZE,\n",
" epochs=EPOCHS, \n",
" class_weight=classWeight)\n",
" \n",
" # Generate generalization metrics\n",
" scores = model.evaluate(test_images, test_labels, verbose=0)\n",
" print(f'Score for fold {fold}: {model.metrics_names[0]} of {scores[0]}; {model.metrics_names[1]} of {scores[1]*100}%')\n",
" acc_per_fold.append(scores[1] * 100)\n",
" loss_per_fold.append(scores[0])\n",
" \n",
"    fold += 1"
]
},
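{
"cell_type": "markdown",
"metadata": {},
"source": [
"An aside on the training logs above: within each fold, training accuracy climbs toward 1.0 while validation loss keeps rising after roughly epoch 15, a classic sign of overfitting. A minimal sketch (not part of the run above, just an assumption about how one could cap it) using Keras's `EarlyStopping` callback:\n",
"\n",
"```python\n",
"import tensorflow as tf\n",
"\n",
"# Hypothetical tweak: stop once val_loss fails to improve for 10 epochs\n",
"# and roll back to the best weights seen so far.\n",
"early_stop = tf.keras.callbacks.EarlyStopping(\n",
"    monitor='val_loss', patience=10, restore_best_weights=True)\n",
"\n",
"# Would be passed to the fit call inside the fold loop:\n",
"# model.fit(..., callbacks=[early_stop])\n",
"```"
]
},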
{
"cell_type": "code",
"execution_count": 169,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"------------------------------------------------------------------------\n",
"Score per fold\n",
"------------------------------------------------------------------------\n",
"> Fold 1 - Loss: 2.8269495964050293 - Accuracy: 63.63636255264282%\n",
"------------------------------------------------------------------------\n",
"> Fold 2 - Loss: 1.9607480764389038 - Accuracy: 68.62170100212097%\n",
"------------------------------------------------------------------------\n",
"> Fold 3 - Loss: 2.3371033668518066 - Accuracy: 70.3812301158905%\n",
"------------------------------------------------------------------------\n",
"> Fold 4 - Loss: 2.010115623474121 - Accuracy: 69.79472041130066%\n",
"------------------------------------------------------------------------\n",
"> Fold 5 - Loss: 2.6227874755859375 - Accuracy: 72.14076519012451%\n",
"------------------------------------------------------------------------\n",
"> Fold 6 - Loss: 2.587796926498413 - Accuracy: 68.62170100212097%\n",
"------------------------------------------------------------------------\n",
"> Fold 7 - Loss: 1.7485564947128296 - Accuracy: 70.67448496818542%\n",
"------------------------------------------------------------------------\n",
"> Fold 8 - Loss: 2.2871928215026855 - Accuracy: 70.67448496818542%\n",
"------------------------------------------------------------------------\n",
"> Fold 9 - Loss: 2.161499500274658 - Accuracy: 72.14076519012451%\n",
"------------------------------------------------------------------------\n",
"> Fold 10 - Loss: 2.486210823059082 - Accuracy: 70.08797526359558%\n",
"------------------------------------------------------------------------\n",
"Average scores for all folds:\n",
"> Accuracy: 69.67741906642914 (+- 2.3135587334320644)\n",
"> Loss: 2.3028960704803465\n",
"------------------------------------------------------------------------\n"
]
}
],
"source": [
"print('------------------------------------------------------------------------')\n",
"print('Score per fold')\n",
"for i in range(len(acc_per_fold)):\n",
" print('------------------------------------------------------------------------')\n",
" print(f'> Fold {i+1} - Loss: {loss_per_fold[i]} - Accuracy: {acc_per_fold[i]}%')\n",
"print('------------------------------------------------------------------------')\n",
"print('Average scores for all folds:')\n",
"print(f'> Accuracy: {np.mean(acc_per_fold)} (+- {np.std(acc_per_fold)})')\n",
"print(f'> Loss: {np.mean(loss_per_fold)}')\n",
"print('------------------------------------------------------------------------')\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Validation\n",
"\n",
"Now we need to validate the model. Using the test data we split off earlier, we can evaluate how well the model generalizes. Let's start by plotting the training accuracy."
]
},
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEWCAYAAABrDZDcAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAABEaUlEQVR4nO3dd3hUZfbA8e9Jb5BCQi8B6Sg1AgIKdlQUu2DFta+uouvuz3VX11W36u5a1nUXOxawICwqioIUFQRC79ICBJKQ3vu8vz/eCUxCEpKQyQyZ83mePMwtc+fMncs99y33vWKMQSmllO/y83QASimlPEsTgVJK+ThNBEop5eM0ESillI/TRKCUUj5OE4FSSvk4TQTqpInIlyJyW3Ov60kikiQiF7hhu0tF5M7m3q5SJ0MTgY8SkQKXP4eIFLtM39SYbRljLjHGvNPc63ojEXlMRJbXMj9WRMpE5HRPxOUSxzQRMSJygyfjUKcWTQQ+yhgTUfUHHAAud5n3ftV6IhLguSi90nvAGBHpWWP+FGCzMWaLB2JydRuQBdzakh+qx8mpTROBqkZEJohIsoj8n4ikAm+JSLSIfC4i6SKS7Xzd1eU9R6s7nFek34vI885194nIJU1ct6eILBeRfBFZJCKviMh7dcTdkBifEZEfnNv7WkRiXZbfIiL7RSRTRH5b1/4xxiQD3wK31Fh0KzDzRHHUiPkp1+8jIvHOq/kA53SkiLwhIikickhEnhUR/7piE5EewHjgbuBiEenossxfRB4XkT3O779WRLo5lw0SkW9EJEtE0kTkcef8t0XkWZdtTBCRZJfpJOdxsgkoFJEAZ4mp6jO2ichVNWK8S0S2uywfLiK/EpE5NdZ7SURerOu7qualiUDVpiMQA/TAnlT8gLec092BYuBf9bx/FLATiAX+BrwhItKEdT8AVgPtgKc4/uTrqiEx3gjcDrQHgoBHAURkIPCqc/udnZ9X68nb6R3XWESkHzDUGW9j91V93gYqgN7AMOAioL72hVuBRGPMHGA74FrF9wgwFbgUaAv8DCgSkTbAIuAr7HfvDSxuRIxTgcuAKGNMBbAHOBuIBP4AvCcinQBE5Drs73irM4YrgExsKWuiiEQ51wvAlrBmNiIOdTKMMfrn439AEnCB8/UEoAwIqWf9oUC2y/RS4E7n62nAbpdlYYABOjZmXexJtAIIc1n+HvBeA79TbTH+zmX658BXztdPArNdloU798EFdWw7DMgDxjin/wj8rwn76inX7wPEO79/ANABKAVCXZZPBZbU8513AdOdr38DbHRZthOYXMt7pgLr69je28CzLtMTgOQax83PTvA7bKj6XGAh8FAd630J3OV8PQnY5un/F770pyUCVZt0Y0xJ1YSIhInIf51VJ3nAciCqnmqK1KoXxpgi58uIRq7bGchymQdwsK6AGxhjqsvrIpeYOrtu2xhTiL1SrZUzpo+BW52ll5twXr02YV/VpQcQCKSISI6I5AD/xZZmjiMiY4GewGznrA+AM0RkqHO6G/Zqvaa65jdUtd9ERG4VkQ0uMZ+OLe2d6LPeAW52vr4ZePckYlKNpIlA1abmkLS/BPoBo4wxbYFznPPrqu5pDilAjIiEuczrVs/6JxNjiuu2nZ/Z7gTveQe4HrgQaAN81oQ4CrGliyodXV4fxJYIYo0xUc6/tsaYQXXEc5vzMzaIbdtZ5TK/anun1fK+g0CvOrZZX3xVjh4rzjaK14AHgHbGmChgC8e+e10xAMwDBovtdTUJeL+O9ZQbaCJQDdEGW9edIyIxwO/d/YHGmP1AIvCUiASJyFnA5W6K8RNgkoiME5Eg4GlO/H/jOyAHmIGtViprQhwbgHNEpLuIRGKrcwAwxqQAXwN/F5G2IuInIqeJyPiaGxGREGxSuhtbFVX19wvgRmed++vAMyLSR6zBItIO+BzoJCLTRSRYRNqIyCiX+C4VkRhnw/P0E+yTcGxiSHfGdTu2RFDlde
BRERnhjKG3M3ngLIF+grNdyBhz4ASfpZqRJgLVEC8AoUAG8CO2YbEl3AScha2meRb4EHuVXJsXaGKMxpitwP3Yk1AKkA0kn+A9Blsd1IPqjZoNjsMY8w32O20C1mJPyq5uxTZqb3PG9AnQqZZNXYlNPjONMalVf8Cb2PaGicA/gI+wySUPeAPb/pCPLdVcjq062wWc69zuu8BGbFvA185Y62SM2Qb8HVgJpAFnAD+4LP8Y257yAZCPLQXEuGziHed7tFqohYmzcUYpryciHwI7jDFuL5Golici3YEd2I4FeZ6Ox5doiUB5LRE501kd4iciE4HJ2KtI1cqIiB+2i+tsTQItz22JQETeFJEjIlLrnZbOOsKXRGS3iGwSkeHuikWdsjpiu1sWAC8B9xlj1ns0ItXsRCQcW111IS3Q/qSO57aqIRE5B/sfeKYx5rjxV0TkUmxj1qXYm4peNMaMqrmeUkop93JbicAYsxw75kldJmOThDHG/Ijta11bQ5hSSik38uRAUV2ofjNKsnNeSs0VReRubNc4wsPDR/Tv379FAlSquZWUOygpr8RhDJUOQ6UxVFQayioclFU6KK90EOjvR3iwPxHBAYQHBVDhMJSUV1JSXklphQOHMRhzrAN/gJ8Q4C8E+PnhL0KlMce27zCUVzoor7Tzqgjg7ycE+vsRGuRPSKA/IYF++CEYqLrbl5BAf/z93Hm7iHuVVdj9XencZw5jcBiO7psKhwOHw86rWoYBxO4jERCEqkFP7K4Q53y73O5jQ4Wj+j52hy5RocSEBzXpvWvXrs0wxsTVtuyUGDHQGDMD21+bhIQEk5iY6OGIlDcoLK0go6CU9PxSsovKGdotirg2wS0aw5ZDuczfeJhbRvegW0zYccuNMexIzefLzSl8sTmFjPTCo8sECPYTukQE0S06jK7RoXSOCmV/ZhEr92aSVViGa6tpTHAAfTpEEB4UcPTEb4whs7CM9PxSMgpKKa1wEBboR0RwIG1CAogKC6RzZCgdI0PoFBlCoL8f+SXl5JdWkFdcwd70Aran5JFXUkFxLd+vGOgWE8rgLlGM6hXDTaN6NGtiyC0uZ3tKHml5JZzbvz1tQwJP+J7dRwrYmZpPgL8Q6NwPRWWV5BSVkV1UTlZhKTtS89l8KJe8ovLj3u8HRAT5Ex0WRFRYIG1DAgkN8ic00J/gQD8C/IQKh03QFQ6H81+bUCsqncnVZX5kaACxEcHEtQmmXXgQ4cEBhATa7QX4C2UVDorLKykpt0knIiSAiOAAIkICCA7wQ1zuNaxwOCguq6SkwkFJmU1grhJ6RNOnQ5sm7WsR2V/XMk8mgkNUv1O0q3OeagXKKx2s2puFwdApMoSOkaGEB/mzJ72QNUlZrN6Xxd6MQsb3ieWq4V3pGRte57YKSiv4cU8mWw7nkpRRyL6MQvZmFJJfUlFtvaAAP64d0ZW7zu5V7/aaw+bkXF5c/BOLth8B4NN1ybx2awLDukcfXSersIzH5mzi621p+AmM7BnDtDHxjO7VjsjQQNqEBBIS6Edt4/E5HIadafms3Z9NbEQwAzu1pWt0KH71nISNsxQQ4N+4Gl9jDIdzS9iVln/0/QF+Qnmlg+0p+Ww5lMvG5By+2JzC8p8yeHnqMEKDGjtixjGbknP4z7I9bDyYy6GcY+knIjiAG0d15/ax8XSKDD3ufQezivjnNz8xd8Mh6rvwDgrwo3dcBBMHdeT0LpEM6tyW6LAgW/IJ8CckyI/ggKbH3xq59T4CEYkHPq+jsfgy7K3oVY3FLxljRp5om1oi8F7GGLYezmPOumTmbzhMZmFZteVBAX6UVTgAiI0IoltMGBsP5uAwMLx7FBcP6khESACBfn74+wmpeSUs/ymddQeyKa80iEDnyFB6xoYTHxtGl6gw4toEExthr8Lmrj/EJ2uTKa90cNHADkwa3JlxvWOJbmJRujb7Mgr54xfbWLT9CJGhgdwxrifn9I3jwVnrScsr4cUpQ5l4eid+2J
3BIx9tILuwnIcu6MP1Cd1avLTS3GauTOL387cypGsUb9yWQLuIxn2fg1lFPLdwJ/M3HiYmPIixvWMZ0KkNAzu1JSI4gJkr9/PF5hQEuHBgB3q0Cz/6+244mMP7Px5ABKaNiWfy0C44jHFeuTsICfQnOjyI6LBAQgP9a02uvk5E1hpjEmpd5sZeQ7OwoxXGYu8y/D12EC2MMf9xDtb1L+xdj0XA7caYE57hNRF4D2MMB7KKWL0vizVJWazal8X+zCKC/P24YGB7rhzahcjQQFLzSjicU0JGQSl92kcwsmcMPWPDERFSc0v434ZDzFmXzE9pBcd9xsBObTmnbxzn9IlleI9oQgLrv5JLzy/lnRVJvL9qP9lF5YjA4C6RTOjXnlvO6kFsI09eVUrKK/n30j38Z+keggL8uOecXkwbG08bZ1VGZkEpd85MZMPBHM7v34HFO9LoFRvOi1OGcXqXyCZ9pjdauDWVB2etp1NkCG/fPpL4BpS8CksreGnxLt76IQkRuOvsXtwzvtfRfefqYFYRb3y/j6+3ppJeUEp5pT0/+Qlcn9CNhy7oU2tpQZ2YRxKBu2gi8A6bknN4Yt4WNibnAhAVFkhCjxjO7R/HpDM6Exl24rpeV8YYcorKbcOm8yqvTUhgkxvGKh2Gjck5fPdTBst3pbP+QDahgf7ceXYv7jqnFxHBx2pF80rKqaw0tZYcyisdLNqWxp++3M7BrGImD+3Mby8dQPu2IcetW1JeycMfbuDLLancOKo7T1w28KSqULzV2v3Z3PnOGgL8/Vjw4Nn1lnS+2ZbG7/+3hcO5JVw7oiu/vKhvg0/kxhhyi8vJKCglPDhAE8BJ0kSgmk1uUTl/W7iDD1YfIDYimJ9POI2xvWPpHRdRb/21p+1JL+DvX+9kweZUYsKDuHJoFw5mF7E9JY/k7GJEYET3aC4a1IGLBnakoLSCT9cdYv7GQ2QUlHFaXDjPTD6dMb1j6/0ch8NwOLeYrtHHNxy3JttT8rjylR8Y2TOGd24fedxvn5pbwlPzt/LV1lT6dWjDn64+nRE9YurYmmoJmggUABWVDhZsSaW0vJJuMWF0iwmjY9uQWnuB5JeU886KJL7ZlkaAvx8hgX6EBPiz4WAO2UVl3DYmnocv7NugXh7eZOPBHP62cAcr92TSMzacAZ3aMqBTW8orHXyzLY2th4/10wny9+P8Ae25enhXJvSLI7CRjbCt3QerDvD43M08dkl/7h1/bHTpNUlZ3D0zkeLySh46vy93nt1T950X0ETg44wxLNp+hL98uZ09Lt0XwZ7shnaPYnzfOM7uE0uPmHDeWZnEG9/vI7e4nBE9ogkJ9KOk3HZri2sTzK8n9mNQ51O73rvSYWpNgAezili8PY3AAD8uO6MTUWHN19Dc2hhjeOCD9SzcmspH957F8O7RzN94mEc/2kjX6FBevy2BXnF1PY9ItTRNBD5s2+E8/vDZVlbty6JXbDj/d0l/+nVow8HsIg5mFbMvo4AVezKPXgmLgDFwwYD2PHh+HwZ3jfLsF1BeLa+knEtf/A6Aa4Z35cXFuxgZH8N/bxnRrL211MnTROCjMgpKueAfy/AXYfoFfZgysnudRfT0/FK+353OjpR8Jg3uzBldT+0rftVy1h/I5rr/rKTCYZg8tDN/u3aw9tP3QvUlglPizmLVNE9/to2i0kq+eHDcCe9GjGsTzFXDusKwFgpOtRrDukfzzxuGkp5fyu1j47UP/ylIE0ErtWTHEeZvPMzDF/Rt8i3pSjXU5UM6ezoEdRK0Kb8VKiit4LdzN9OnfQT3TajrWeFKKWVpIjjFpeQWM2dtMqm5JUfnPb9wJyl5JfzlmsEEBehPrJSqn1YNncJKyiu5/a017EjNB2BQ57acGR/DOyuTuHV0D0b0iD7BFpRSShPBKe3pz7exIzWfv1x9BtlF5SzZcYSZK5PoHBnKrybqMxuUUg2jieAU9dnGw3yw6gD3jj+NKSO7A3DfhNPIKSpDkGpj6S
ilVH30bHEKSsoo5DefbmZEj2h+eVHfasv0TlilVGNpS+Ippriskvs/WIe/n/DS1GE6hotS6qRpieAUcSSvhJkr9x8dZ/+1WxPoEqXD8iqlTp4mAi9XWFrB7+dv5X8bDlHhMFwwoAN3n9OLM+N1SF+lVPPQRODl/vzlduasS+bW0T24fWzPBj0RSimlGkMTgRdbsTuD9348wB3jevLEpIGeDkcp1UppS6OXKiyt4NdzNhHfLoxHL+rn6XCUUq2Ylgi81F+/2sGhnGI+vPusVvncW6WU99ASgRdauSeTmSv3M21MPCN7aqOwUsq9NBF4meKySv5vziZ6tAvjVxdrlZBSyv20asjL/Hvpbg5kFfHBXaMIC9KfRynlfloi8CL7Mgr577K9XDm0M2NOi/V0OEopH6GJwEsYY3jyf1sIDvDj8csGeDocpZQP0UTgJb7aksp3uzJ45KK+tG8T4ulwlFI+RBOBFygsreDpz7cxsFNbbhndw9PhKKV8jLZGeoGXv91NSm4J/7pxGAE6mqhSqoXpWcfD9qYX8Mb3e7luRFdG9NB7BpRSLU8TgYc9+8V2QgL8+bU+WlIp5SGaCDxoyc4jfLvjCA+e34e4NsGeDkcp5aM0EXhIeaWDZz7fRs/YcG4bE+/pcJRSPkwTgYfMXLmfvemFPDFpAEEB+jMopTxHz0AekFlQyguLfmJ83zjO7dfe0+EopXycJoIWZozhTwt2UFxWyROTBiAing5JKeXj3JoIRGSiiOwUkd0i8lgty7uLyBIRWS8im0TkUnfG4w1e/nY3c9Ylc+/40+jdvo2nw1FKKfclAhHxB14BLgEGAlNFpObzFn8HfGSMGQZMAf7trni8wbsrk/jHNz9xzfCuPHJhX0+Ho5RSgHtLBCOB3caYvcaYMmA2MLnGOgZo63wdCRx2YzweNX/jYZ6cv5ULBnTgr9ecgZ+fVgkppbyDOxNBF+Cgy3Syc56rp4CbRSQZWAD8orYNicjdIpIoIonp6enuiNWtvt+VwSMfbuDM+BgdRkIp5XU8fUaaCrxtjOkKXAq8KyLHxWSMmWGMSTDGJMTFxbV4kCfDGMMzn2+je7swXr8tgZBAff6wUsq7uDMRHAK6uUx3dc5zdQfwEYAxZiUQArSqJ7Ks2pfFzrR87j3nNNqGBHo6HKWUOo47E8EaoI+I9BSRIGxj8Pwa6xwAzgcQkQHYRHDq1f3UY+bKJKLCArliaGdPh6KUUrVyWyIwxlQADwALge3Y3kFbReRpEbnCudovgbtEZCMwC5hmjDHuiqmlpeQWs3BrGjckdNMqIaWU13Lr8wiMMQuwjcCu8550eb0NGOvOGDzpg1UHcBjDzfqwGaWUF/N0Y3GrVVpRyazVBzi/f3u6xYR5OhyllKqTJgI3+XJzKhkFZdx6VrynQ1FKqXppInCTd1Ym0Ss2nHG9W1UnKKVUK6SJwA02J+ey/kAON4/uoXcQK6W8niaCZmaM4c9fbiciOIBrRnT1dDhKKXVCmgia2ew1B1mxJ5PfXNqfyFC9gUwp5f00ETSjlNxi/vTFdkb3imHqmd09HY5SSjWIJoJmYozh8U83U+5w8NdrBmvbgFLqlKGJoJnM23CIJTvT+dXF/enRLtzT4SilvEFlhacjaBBNBM0go6CUP3y2jeHdo5g2Jt7T4Sil6lKSC1vmtMwJesXL8Nd4OLTW/Z91kjQRNIOPE5PJKSrnL9cMxl+rhNSpoLwEPr4dvv4dFBzxdDQtozgHZl4Jn/wMvvwVnGhYs/IS+PIx+PAW2PQxlOQ1/LO2zrX7tqwA5v3cbsuLuXWsIV/x1dZUhnSNpG8HfQaxaqTtn0PmbojrD+37Q2R38GvC9VlpAexaCCkbq5/gup4JA684fv2Fj8PWT0H8YPXrcOYdMOZBaNOh6d/FmxVnw7tXQeoW6HcZJL4J0T1h7IO1r5+fCrNvtFfz4e1h+3zwD4be58P4X0PnYXV/1sHV8Ok90G203f7sG2Hpn+HCP7jnuz
UDTQQn6XBOMRsP5vDrif08HUrDZO+H0CgIifR0JKeO/FQIjYaA4ObbZmUFfPME/FjjMd2BYdBjDAycDP0nQVhM3duoKLMnqK1zYfciqCgB/yAQ50i3phJWvAQX/xnO+vmx923+BBLfsCf+4bfBd8/bOFbPgFCXzwsKg5s/hZiezfe9m5sxkLHLJrC6junibFsSOLINbngP+lwEn9xu9390D7uvXR1eD7NutNVIN7xnE0fyatj2P9j8sU0odyyC2N7Hf1bWXpg1Bdp2hikfQHg7GH6r/R0GXA5dE+x6hRk2Ge9dVv39MT1tPAMn2220EDnVRn1OSEgwiYmJng7jqLd/2MdTn23j21+Op1dchPs/MOcgbJpt/xPXdmLau8yeuIbccPyy3GR4OcGeLEbfB6PvtSe45pL4FkR2gz4XnNx2cg7AtvngKIcxDzXtCrm5FGbAy8Oh42C4dX7zxFKcY6sn9iyGUffCOb+2pYL07ZC2FX76yu4DvwDoeQ6MvBv6TgRxqXY8vAH+dz+kbYE2nWDAFTDoSug2CvyciaCy3J7wtn8GF/8JzrrfnjRnTIAOg2DaF+DvvNclcw+sfcul+sPA+vdhzC8afyVrDKx+zZ5Qq4jYq+gBl0Objk3abdW2n7oJts6zJ+esPdDrXLhlbvV9BFBWCG9dAke225N634vt/PJieOcKu52bPoagCEjfaffnmjcgPBamzoKOZ1TfXtY+eP0CCG4Ddy6y67kue/86KMqonihK8uDfZ9nEes9y+/t+8ahNNKdfDQEhzu/lsCWQI9vsdLdREHNa9c8fOtUeE00gImuNMQm1LtNEcHKmzFhJVmEZXz883v0f5qiENyfaq5Pzfgfn/Kr68oIj9kRfmgd3Lzm++Dr3PttQ1vt82LkAgtvaE9G4h+1BWtOmj+DrJ+wVzVk/rz9prH4NFjwKfoFw6zyIH1d9uTGwbzl0GQHBtSTM4mxY9y5sm1e9cW34rTDpxaafgLP2wvK/25NhZBdbBRPX38YX34AR0Bf8Glb/176+5DkYdXfT4qiSsQtmTYXsfXDZ32HEtOPXMQZSNjivQD+B3IPQaQiMfwxOOw+WPwff/xPC4+Cy5+0Va137p7LcJp3t8+H8J2HLp5B3GO793u6P+nwwxZ7MH94K/g2sPDDG1o2v/JdNUH7ORFNZBgWpgED3s2DQVTD8FggMbdh2q5QV2qqWvUttyafnORDRHjZ9CDfPgd41LkK+eRJ+eBGmzoZ+l1RfVpgBr58P2UnH5vkF2m1e9R+73docXAPvTLK/ya3z7YXVmtdg0VM2ed/4oS3VudqzBN690lZHZe+DTkPhylehw8Djt5/+E2z/n602LMqqvuz8J2Dw9SfcTbXRROAmmQWlnPnHRTxwbm8euagFqoZ+eMkWZ2N6QV4K3L/KFm2rzL3PFl1D2kJ0vL0qqTpBpG6G/5xtr/AuesbWlS77qz1BDLoarn2z+tVUzkF4dYy9Wik84kwa98Donx9fXfHTQlscPu18eyVbkGavlmL72OXlxfbqdcsciOoBV7wMvVwS584v4bPp9kTRaQgMvNJe3W6YBcv/BsNuhstfrj8ZlBVCRemx6YI0u782fWj/cw660l6Bpe+w1WMY+5+4Vz0JPHMPvDISht1iS1P7f7An0Han1f0esFU2AUHV5zkqYdV/YPEz9uR3w7vHJ8vaVJbb77D8eXsCCYqwDZBDb4KL/9iwEl1lOcy50yZZgJs+gT4Xnvh92z+DD2+GGz+GvhdVX2YMOCqOlSiq5i38Lfz4ii3FXPK36sfUkR02hq3zbOmn01B71d3QKpDSAvjgejiwEi58GobcaKteKsrglTMhMBzu/e5YiSh9pz2GB0+BK1+pfZs5B23CjepuLxBielb/TnXZOhc+nmaTcHE2HFgBvS+Ey1+sO8F+8Sisewcm/MaW6BuaXJuJJgI3+XDNAf5vzma+eHAcgzo3U527w2GvBrqNhradjs1P32lP5H0uhEv+Cv8aCb
0mwNQP7PL9K+GtiTDuEYjrB3PvsQdl1RXnu1fBoXXw0IbqJ4/v/g6Ln7ZXp2feaecZY9c/uBru+8GeZJf/zf6HCWpjr4rPesAmhJSN8OYl9uR4+5e2WPz6BRAUDncutleCs2+0VRln3W9P+ll7IOFntiSy+BnY/BG0HwST/wVdhh+LzRjbyLbsrzD0ZptAyvLtvkjfYf89st3+m5d8/L4MCLGfM/ah6tURJXnw2rn2BPLzFbaYX5sPb4Hdi+HB9ba+/ZXR9gpu2oLjk1L6Trt/ts6zRfseY2xCG3gFlObbRHhwFfS5GC5/ofH1v5UVdj/t+MLW69c8MTfk/d88YS8QRt3TsPdUlME/+kP82XD9O9WXffW4bWfofYH9nn0vhiV/glWv2lLmxL8cX03jascC+PQum9imfABdR9QfS2m+rXY5uAqufg3OuLb68i1zbMln8r9h2E322HnncnsB9Iu11atwmsv3L8Ci30NwJFzyFxgytf7vbIwtrXuofU4TgZvc/tZqdh0p4Ltfn4vUdwA0VGm+7W2w8wt7cE38k73yc1TCmxfZOsj7V9kia9VBeONH9kr8v+fYg+z+VbbB8e3L7AnpgbW2muG9q4/VE7tyOOxV1r5lcMfXtjop8S34fHr15AC2/nr5c/ZkFxRuT7KbP7ZF9DsXHUtcyYn28+P6QX6avYK95nVbNC8rgiV/hJWvAMZerZ/9Szj70eOvoqss+TMs+4tNYMXZx+YHhEBsX2g/wJY+gtseW+YXYBtb6+oFc3A1vHmxPale/kLty9+40F69TXjMztswC+bde6zxteYVLgLdR9vqr92Lj83zC7D765K/wuAb6j9ZeJsvH7Mn/F/uPFYSPLzBJtLOwyHvEOSn2GPAVMKo+2Dinxv2HdO2wawb7DEy+RUYfF3t65Xmw3vXQvIauOY1OP2a49cxxlbz5KXYE//OBTDnDrjsH7ZHlDsYYy9sOg+rftHmpTQRuEF+STkjnlnErWf14HeTaqnna6ycA7ZONn27PfnsWeIsbl5gT3QrXoZr3jh2JVRRBv8ZZ3uKjLjNXtXf8D4MmGSXp22zy4dOhcMbbZJ4YE3tDcyFmfDfs22R+Ib3bDtElxFwy7zaq2OObIdlf7PF46AIuGOhbXx0tW0+fHQrRHWDqR8eXxd6YBWsfds2WncafOL9s+YNeyKI6+es5+9nq5mqqgGa4usnbG+Omz+17SZVjLFJIjsJfrHuWJuGMbZ+f+8Se2WdvoNjdd5X2gbbmqW4rfNsKensX558I6knpG62x1FV+4jDYRNkzgH4RaItIVb1qAmPtSXSxiS6wgx7nOz/ofbqrgOr4H8/txdB175h2xbqkvQDvH2pLWlumGV/izsXn9wx0opoInCD+RsP8+Cs9Xxy71kkxNfTxa8h9q+0dbGV5XDdW/ak5HDAmtftVX95kT3JXD+z+n+yfctt8Rds/eRNH1dfvvC3ttEOqieR2hxYZf8Tib9NCD9faetN65OxGzDH2gJqOrzBtmE0Z8+k5lReYhNgWZGtIgqJtPO2fGKrclyr1qrkp9peKG06O0/+zdALxtv9Z5y93+Ce5TZ5f/YQXDWj9p5pTVFRZkt8379gG8AvfwF6jodvn7XdWiO72mrDXhNOvK1ZU21pAIG7FtsLGgVoInCL+99fx+qkLFb95vyTG2Auea2tRonsYns21DypZu2D9e/C6Pttw1hNc+6yDb73rTi+EbM0H14ZZXtv3PHNiXverHjZ9viY9AIk3N7073QqSV4Lb1xgSxkVpbZB1jggboBtGG7hBj2v9ON/4Kv/g9s+h49ugfYDbdfT5q7iOrwe5t0PR7ZCWDsoyoSEO2z31bracWpK/8k2EA+/BSb9s3njO8VpImhmJeWVDH/mG64a1oU/XnVG/SsbA58/bBtVr32z+s052Um2YTUwzBZhI+IaH0xlBRSm111HWZRlq4OCGjgQXvb+6j2RfMH3/4SNs217Q1W102nn1X8zly8pzIS/97
NtMuVFNkHW1u2xOVSU2Rvcdn1tewY1pc983mGI6OjZ+0+8UH2JQC93mmDVviyKyiq5cGADbsdf+md7o45/MLx23rFug8XZ8P71tjpo2idNSwJgr1jra6hq7MnM15IA2DrlcQ97OgrvFd4O+k203UnPesB9SQBsh4FzH7d/TdWCd+S2Fpoym2DFngwC/YVRPWupqnG14YNjXR9/vtIWd2dOtjdffXiLvdlpyvsQ17dlAleqqcY9bHthVfWgUq2Klgia4Mc9mQzrFk1okEtvhKIs29hY1UNh33KY/6Bt9Lr8BdsAe+ci26VtwaN2natmNOymIqU8rcsIe9GiWiVNBI2UW1zO5kO5/OI8l0bdZX+zfeP9g531zP1g9ze28fb6mcfuVAyNsv3+v/uHrbJprl4XSil1EjQRNNLqfVk4DJx1mrNaaONsmwT6T7INwek77c1IER1td87QqOob8POH8b86brtKKeUpmggaacWeDIID/BjWPQr2fQf/e8D2bLj2rbrvjFVKKS+mjcWNtHJPJmfGxxCcvQc+vMkOAHf9u5oElFKnLE0EjZBZUMqO1HzO6REM719rh5+trfpHKaVOIVo11Ag/7rVjg18YuBFy9tsHYfhiv3ulVKuiJYJGWLk3g4jgAHpkr4KQKNs1VCmlTnGaCBphxZ5MzuwRhd/eb+0AWDqqoVKqFdBE0EBpeSXsTS/kso55dvz1087zdEhKKdUs3JoIRGSiiOwUkd0iUuu96SJyvYhsE5GtIvKBO+M5GSv3ZAIw1m+TnaGJQCnVSritsVhE/IFXgAuBZGCNiMw3xmxzWacP8BtgrDEmW0TqeFq0563Yk0FkaCAd01fYu4ejunk6JKWUahbuLBGMBHYbY/YaY8qA2cDkGuvcBbxijMkGMMYccWM8J2XFnkzGxUcg+3/Q0oBSqlVxZyLoAhx0mU52znPVF+grIj+IyI8iMrG2DYnI3SKSKCKJ6enpbgq3bqm5JSRnF3N51H6oKNZEoJRqVTzdWBwA9AEmAFOB10QkquZKxpgZxpgEY0xCXFwTx+0/CZuScwAYWr4e/AJ1xFClVKtywkQgIpeLSFMSxiHAtSK9q3Oeq2RgvjGm3BizD/gJmxi8ypZDufgJtD/yPXQf3fCnfSml1CmgISf4G4BdIvI3EenfiG2vAfqISE8RCQKmAPNrrDMPWxpARGKxVUV7G/EZLWLToVxGx5Xjd2SrVgsppVqdEyYCY8zNwDBgD/C2iKx01tnX+zRpY0wF8ACwENgOfGSM2SoiT4vIFc7VFgKZIrINWAL8yhiTeRLfp9kZY9hyKJdJbX6yM3qf79mAlFKqmTWo+6gxJk9EPgFCgenAVcCvROQlY8zL9bxvAbCgxrwnXV4b4BHnn1dKyS0ho6CMUXEbICwWOpzgYfVKKXWKaUgbwRUiMhdYCgQCI40xlwBDgF+6NzzP23woFzB0z14Fp50Lfp5uX1dKqebVkBLBNcA/jTHLXWcaY4pE5A73hOU9Nifn0sMvg8CSDOgxxtPhKKVUs2vI5e1TwOqqCREJFZF4AGPMYveE5T02H8plQnSGnehwumeDUUopN2hIIvgYcLhMVzrntXrGGDYfymV0RJqdEdeYTlNKKXVqaEgiCHAOEQGA87VPPJfxUE4xWYVl9Pc7CJHdIaStp0NSSqlm15BEkO7S3RMRmQxkuC8k77HlUC4AHUv2QfsBHo5GKaXcoyGNxfcC74vIvwDBjh90q1uj8hKbknMJ8askJHcPDKx1GCSllDrlnTARGGP2AKNFJMI5XeD2qLzE5kO5jI/NR/LKof1AT4ejlFJu0aAbykTkMmAQECIiABhjnnZjXB5X1VD8qy7pkIdWDSmlWq2G3FD2H+x4Q7/AVg1dB/Rwc1wel5xdTE5ROYODDoH424fRKKVUK9SQxuIxxphbgWxjzB+As7CDw7Vqm50Nxd0r9kO70yAwxMMRKaWUezQkEZQ4/y0Skc5AOdDJfSF5h82Hcgn0F9rk7d
JqIaVUq9aQRPCZ82ExzwHrgCTAax8y31w2J+cyuEMgftn7tKFYKdWq1dtY7HwgzWJjTA4wR0Q+B0KMMbktEZwn7UjN45Ye2ZBltESglGrV6i0RGGMcwCsu06W+kARKyivJKChjgL/zgWrtB3k2IKWUcqOGVA0tFpFrpKrfqA84klcKQI+KfeAfDDE9PRyRUkq5T0MSwT3YQeZKRSRPRPJFJM/NcXlUSm4xAO2L90JcP/Dz93BESinlPg25s7jeR1K2Rql5tqNUm7xdcNoEzwajlFJudsJEICLn1Da/5oNqWpOU3BLaUkBAYao2FCulWr2GDDHxK5fXIcBIYC1wnlsi8gIpOcUMDUm1Ex20oVgp1bo1pGroctdpEekGvOCugLxBSm4JCSGH7a10WiJQSrVyTXkSezLQqs+OqXklDAw4BMFtoW0XT4ejlFJu1ZA2gpcB45z0A4Zi7zButVJyS+gVeMCWBnyn16xSykc1pI0g0eV1BTDLGPODm+LxuLIKB1kFxXQO2wPtr/N0OEop5XYNSQSfACXGmEoAEfEXkTBjTJF7Q/OMI/kl9OUgIZUF0P0sT4ejlFJu16A7i4FQl+lQYJF7wvG81NwSRvrtsBM9xng2GKWUagENSQQhro+ndL4Oc19InpWSW8JIv+2UR3SFqG6eDkcppdyuIYmgUESGV02IyAig2H0heVZqTjEj/XZgtDSglPIRDWkjmA58LCKHsY+q7Ih9dGWrVHbkJ+IkD9NrnKdDUUqpFtGQG8rWiEh/oJ9z1k5jTLl7w/KcqPTVAEiPsR6ORCmlWkZDHl5/PxBujNlijNkCRIjIz90fmmd0zttAjl+0fU6xUkr5gIa0EdzlfEIZAMaYbOAut0XkYf1LNrM/YojeSKaU8hkNSQT+rg+lERF/IMh9IXlORWYSnUgns90IT4eilFItpiGNxV8BH4rIf53T9wBfui8kzynY9R1RQHHn0Z4ORSmlWkxDEsH/AXcD9zqnN2F7DrU6FXt/INeEEdrldE+HopRSLeaEVUPOB9ivApKwzyI4D9jekI2LyEQR2Skiu0XksXrWu0ZEjIgkNCxs9wg5/CNrHP3oGBXhyTCUUqpF1ZkIRKSviPxeRHYALwMHAIwx5xpj/nWiDTvbEl4BLgEGAlNFZGAt67UBHsImG88pOEJEwT5WO/rTKTLEo6EopVRLqq9EsAN79T/JGDPOGPMyUNmIbY8Edhtj9hpjyoDZwORa1nsG+Cv2MTCes38FAOtlIFFhgR4NRSmlWlJ9ieBqIAVYIiKvicj52DuLG6oLcNBlOtk57yjn0BXdjDFf1LchEblbRBJFJDE9Pb0RITTC/hWUSghZbQcg2nVUKeVD6kwExph5xpgpQH9gCXaoifYi8qqIXHSyHywifsA/gF+eaF1jzAxjTIIxJiEuLu5kP7p2KRvZHdCbOG0fUEr5mIY0FhcaYz5wPru4K7Ae25PoRA4BrsN3dnXOq9IGOB1YKiJJwGhgvscajPNTSK6MoVNk6InXVUqpVqRRzyw2xmQ7r87Pb8Dqa4A+ItJTRIKAKcB8l23lGmNijTHxxph44EfgCmNMYu2bcyNjMAVHOFDelo7aUKyU8jFNeXh9gxhjKoAHgIXY7qYfGWO2isjTInKFuz63SUrzkIpi0hyR2mNIKeVzGnJDWZMZYxYAC2rMe7KOdSe4M5Z6FRwB4IiJYmRbTQRKKd/ithLBKSU/FYB0orSNQCnlczQRABSkAbZEoG0ESilfo4kAjiaCHL8Y2oW3yoFVlVKqTpoIAPJTKZdAwtpG4+enN5MppXyLJgKAgiNk+8XQUdsHlFI+SBMBQEEqGUTRLjzY05EopVSL00QAUHCE1MpIYiK0fUAp5Xs0EQAmP5XDlW2JCdNEoJTyPZoIKsqQ4izSHFHEaI8hpZQP0kRQaO8qTieKdlo1pJTyQZoI8o/dTBatVUNKKR+kicB5M1m60aohpZRv0kRQYMcZOmK0akgp5Zs0ERQcwSBk0l
arhpRSPkkTQX4qRQFRBAcFExLo7+lolFKqxWkiKEgjxz9GbyZTSvksTQQFaWRKlN5MppTyWZoI8tM4oj2GlFI+zLcTgTFQkEZKRVtidMA5pZSPcuszi71ecTY4yjlQ2YaY8EBPR6OUUh7h2yUC581kKZWRWiJQSvks304E+S43k2kbgVLKR/l2Iqh6aD1RRGsiUEr5KE0E6DhDSinf5tuJID+NCv9QCgnVqiGllM/y7URQkEZhUCyAVg0ppXyWzyeCvIAYAv2FtiG+3ZNWKeW7fD4RZEs00WFBiIino1FKKY/w7USQn0Y60dpQrJTyab6bCMqLoTSXVEdbTQRKKZ/mu4nA2XX0ULkmAqWUb/PdROB8aH1SaRtNBEopn+a7iaCgKhFEaCJQSvk0n08ER4w2FiulfJtbE4GITBSRnSKyW0Qeq2X5IyKyTUQ2ichiEenhzniqKUjDiB9ZaNWQUsq3uS0RiIg/8ApwCTAQmCoiA2usth5IMMYMBj4B/uaueI6Tn0J5SCwO/DQRKKV8mjtLBCOB3caYvcaYMmA2MNl1BWPMEmNMkXPyR6CrG+OpLvcQhSEdATQRKKV8mjsTQRfgoMt0snNeXe4AvqxtgYjcLSKJIpKYnp7ePNHlJpMX1AHQRKCU8m1e0VgsIjcDCcBztS03xswwxiQYYxLi4uJO/gONgbxDZPrbbUWHaSJQSvkud460dgjo5jLd1TmvGhG5APgtMN4YU+rGeI4pzobyItIklrYhAQT6e0U+VEopj3DnGXAN0EdEeopIEDAFmO+6gogMA/4LXGGMOeLGWKrLtTVWyY52tIvQZxUrpXyb2xKBMaYCeABYCGwHPjLGbBWRp0XkCudqzwERwMciskFE5texueaVmwzA/soYbR9QSvk8tw7Cb4xZACyoMe9Jl9cXuPPz65Rra6h2l0QS3U4TgVLKt/nm01hyD4J/MPuKQpnQXROBOjWVl5eTnJxMSUmJp0NRXiQkJISuXbsSGBjY4Pf4aCJIxkR2ISutgpgITQTq1JScnEybNm2Ij4/XByspAIwxZGZmkpycTM+ePRv8Pt/sLpObTGVEZ8orDTHadVSdokpKSmjXrp0mAXWUiNCuXbtGlxJ9MxHkHaIkrDOgN5OpU5smAVVTU44J30sEleWQn0J+1fASWjWklPJxvpcI8lPAOMgOaA+gVUNKNVFmZiZDhw5l6NChdOzYkS5duhydLisrq/e9iYmJPPjggyf8jDFjxjRXuABMnz6dLl264HA4mnW7pzrfayx2dh1N97PDS2jVkFJN065dOzZs2ADAU089RUREBI8++ujR5RUVFQQE1H6KSUhIICEh4YSfsWLFimaJFcDhcDB37ly6devGsmXLOPfcc5tt267q+97e6tSKtjk4byZLoR1QRjutGlKtwB8+28q2w3nNus2Bndvy+8sHNeo906ZNIyQkhPXr1zN27FimTJnCQw89RElJCaGhobz11lv069ePpUuX8vzzz/P555/z1FNPceDAAfbu3cuBAweYPn360dJCREQEBQUFLF26lKeeeorY2Fi2bNnCiBEjeO+99xARFixYwCOPPEJ4eDhjx45l7969fP7558fFtnTpUgYNGsQNN9zArFmzjiaCtLQ07r33Xvbu3QvAq6++ypgxY5g5cybPP/88IsLgwYN59913mTZtGpMmTeLaa689Lr4nnniC6OhoduzYwU8//cSVV17JwYMHKSkp4aGHHuLuu+8G4KuvvuLxxx+nsrKS2NhYvvnmG/r168eKFSuIi4vD4XDQt29fVq5cSbOMrdYAPpgI7PASBytjCA44Qmigv4cDUqp1SU5OZsWKFfj7+5OXl8d3331HQEAAixYt4vHHH2fOnDnHvWfHjh0sWbKE/Px8+vXrx3333XdcP/j169ezdetWOnfuzNixY/nhhx9ISEjgnnvuYfny5fTs2ZOpU6fWGdesWbOYOnUqkydP5vHHH6e8vJzAwEAefPBBxo8fz9y5c6msrK
SgoICtW7fy7LPPsmLFCmJjY8nKyjrh9163bh1btmw52m3zzTffJCYmhuLiYs4880yuueYaHA4Hd91119F4s7Ky8PPz4+abb+b9999n+vTpLFq0iCFDhrRYEgCfTATJEBrNwQI/2rcN1l4XqlVo7JW7O1133XX4+9sLrNzcXG677TZ27dqFiFBeXl7rey677DKCg4MJDg6mffv2pKWl0bVr9ceTjBw58ui8oUOHkpSUREREBL169Tp68p06dSozZsw4bvtlZWUsWLCAf/zjH7Rp04ZRo0axcOFCJk2axLfffsvMmTMB8Pf3JzIykpkzZ3LdddcRGxsLQExMzAm/98iRI6v13X/ppZeYO3cuAAcPHmTXrl2kp6dzzjnnHF2vars/+9nPmDx5MtOnT+fNN9/k9ttvP+HnNSffSwR5h6BtV35Ky6d3XISno1Gq1QkPDz/6+oknnuDcc89l7ty5JCUlMWHChFrfExx8bPBHf39/KioqmrROXRYuXEhOTg5nnHEGAEVFRYSGhjJp0qQGbwMgICDgaEOzw+Go1iju+r2XLl3KokWLWLlyJWFhYUyYMKHevv3dunWjQ4cOfPvtt6xevZr333+/UXGdLN/rNZSbjKNtF/amF9K3QxtPR6NUq5abm0uXLvZ5VG+//Xazb79fv37s3buXpKQkAD788MNa15s1axavv/46SUlJJCUlsW/fPr755huKioo4//zzefXVVwGorKwkNzeX8847j48//pjMzEyAo1VD8fHxrF27FoD58+fXWcLJzc0lOjqasLAwduzYwY8//gjA6NGjWb58Ofv27au2XYA777yTm2++uVqJqqX4YCI4SH5IR8oqHfTRRKCUW/3617/mN7/5DcOGDWvUFXxDhYaG8u9//5uJEycyYsQI2rRpQ2RkZLV1ioqK+Oqrr7jsssuOzgsPD2fcuHF89tlnvPjiiyxZsoQzzjiDESNGsG3bNgYNGsRvf/tbxo8fz5AhQ3jkkUcAuOuuu1i2bBlDhgxh5cqV1UoBriZOnEhFRQUDBgzgscceY/To0QDExcUxY8YMrr76aoYMGcINN9xw9D1XXHEFBQUFLV4tBCDGmBb/0JORkJBgEhMTm/bm0nz4c1d2nv5LLk4cwfwHxjK4a1SzxqdUS9m+fTsDBgzwdBgeV1BQQEREBMYY7r//fvr06cPDDz/s6bAaLTExkYcffpjvvvvupLdV27EhImuNMbX22fWtEoHzHoJ95dEA9G6vbQRKnepee+01hg4dyqBBg8jNzeWee+7xdEiN9pe//IVrrrmGP//5zx75fN9qLHbeQ7CtqC3dYkIJC/Ktr69Ua/Twww+fkiUAV4899hiPPfaYxz7fx0oE9h6CtTnh9G2v7QNKKQU+lwiSMeLH2qxg+nbURKCUUuBriSDvEBXhHSmp9KNvB20fUEop8LVEkJtMfrAdfrqPVg0ppRTgc4ngIEckDj/RHkNKnaxzzz2XhQsXVpv3wgsvcN9999X5ngkTJlDV/fvSSy8lJyfnuHWeeuopnn/++Xo/e968eWzbtu3o9JNPPsmiRYsaEX39fG24at9JBA4H5B7iQGU03WPCCNHB5pQ6KVOnTmX27NnV5s2ePbvegd9cLViwgKioqCZ9ds1E8PTTT3PBBRc0aVs11Ryu2l3ccYNdU/lO/8nCdHCUs6M4kj6dtVpItTJfPgapm5t3mx3PgEv+Uufia6+9lt/97neUlZURFBREUlIShw8f5uyzz+a+++5jzZo1FBcXc+211/KHP/zhuPfHx8eTmJhIbGwsf/zjH3nnnXdo37493bp1Y8SIEYC9R2DGjBmUlZXRu3dv3n33XTZs2MD8+fNZtmwZzz77LHPmzOGZZ545Ojz04sWLefTRR6moqODMM8/k1VdfJTg4mPj4eG677TY+++wzysvL+fjjj+nfv/9xcfnicNW+UyJw3kOwJb+NNhQr1QxiYmIYOXIkX375JWBLA9dffz0iwh//+EcSExPZtGkTy5
YtY9OmTXVuZ+3atcyePZsNGzawYMEC1qxZc3TZ1VdfzZo1a9i4cSMDBgzgjTfeYMyYMVxxxRU899xzbNiwgdNOO+3o+iUlJUybNo0PP/yQzZs3U1FRcXQcIYDY2FjWrVvHfffdV2f1U9Vw1VdddRVffPHF0fGEqoar3rhxI+vWrWPQoEFHh6v+9ttv2bhxIy+++OIJ99u6det48cUX+emnnwA7XPXatWtJTEzkpZdeIjMzk/T0dO666y7mzJnDxo0b+fjjj6sNVw0063DVvlMicN5DkOyI4TIdY0i1NvVcubtTVfXQ5MmTmT17Nm+88QYAH330ETNmzKCiooKUlBS2bdvG4MGDa93Gd999x1VXXUVYWBhgx9ypsmXLFn73u9+Rk5NDQUEBF198cb3x7Ny5k549e9K3b18AbrvtNl555RWmT58O2MQCMGLECD799NPj3u+rw1X7UCKwJYJDJlZ7DCnVTCZPnszDDz/MunXrKCoqYsSIEezbt4/nn3+eNWvWEB0dzbRp0+odgrk+06ZNY968eQwZMoS3336bpUuXnlS8VUNZ1zWMta8OV+07VUPx41jW40HyJZxecbWPGKiUapyIiAjOPfdcfvaznx1tJM7LyyM8PJzIyEjS0tKOVh3V5ZxzzmHevHkUFxeTn5/PZ599dnRZfn4+nTp1ory8vNpJr02bNuTn5x+3rX79+pGUlMTu3bsBePfddxk/fnyDv4+vDlftO4mg81A+CJhMfLsI7TGkVDOaOnUqGzduPJoIhgwZwrBhw+jfvz833ngjY8eOrff9w4cP54YbbmDIkCFccsklnHnmmUeXPfPMM4waNYqxY8dWa9idMmUKzz33HMOGDWPPnj1H54eEhPDWW29x3XXXccYZZ+Dn58e9997boO/hy8NV+9Qw1Oc9v5Q+HSL47y21jsSq1ClFh6H2TQ0ZrlqHoa5DSXklSZn6VDKl1KnLXcNV+0wi2JteiMOgTyVTSp2yHnvsMfbv38+4ceOadbs+kwh2HbENS/00EahW5FSr2lXu15RjwmcSQUZBGaGB/vSM1R5DqnUICQkhMzNTk4E6yhhDZmYmISEhjXqfTzUWV1Q6CPD3mdynWrny8nKSk5Ob3EdftU4hISF07dqVwMDAavPrayz2nRvKQJOAalUCAwOr3aGqVFO59cwoIhNFZKeI7BaR4x7IKSLBIvKhc/kqEYl3ZzxKKaWO57ZEICL+wCvAJcBAYKqIDKyx2h1AtjGmN/BP4K/uikcppVTt3FkiGAnsNsbsNcaUAbOByTXWmQy843z9CXC+iIgbY1JKKVWDO9sIugAHXaaTgVF1rWOMqRCRXKAdkOG6kojcDdztnCwQkZ1NjCm25ra9iLfG5q1xgffG5q1xgffG5q1xQeuJrUddC06JxmJjzAxgxsluR0QS62o19zRvjc1b4wLvjc1b4wLvjc1b4wLfiM2dVUOHgG4u012d82pdR0QCgEgg040xKaWUqsGdiWAN0EdEeopIEDAFmF9jnfnAbc7X1wLfmlPtxgallDrFua1qyFnn/wCwEPAH3jTGbBWRp4FEY8x84A3gXRHZDWRhk4U7nXT1kht5a2zeGhd4b2zeGhd4b2zeGhf4QGyn3J3FSimlmpfeaquUUj5OE4FSSvk4n0kEJxruooVjeVNEjojIFpd5MSLyjYjscv4b7YG4uonIEhHZJiJbReQhb4hNREJEZLWIbHTG9Qfn/J7OoUl2O4cqCWrJuGrE6C8i60Xkc2+JTUSSRGSziGwQkUTnPI8fZ844okTkExHZISLbReQsT8cmIv2c+6rqL09Epns6Lpf4HnYe/1tEZJbz/0WzHGc+kQgaONxFS3obmFhj3mPAYmNMH2Cxc7qlVQC/NMYMBEYD9zv3k6djKwXOM8YMAYYCE0VkNHZIkn86hyjJxg5Z4ikPAdtdpr0ltnONMUNd+pp7+res8iLwlTGmPzAEu+88GpsxZqdzXw0FRgBFwFxPxwUgIl2AB4EEY8
zp2A44U2iu48wY0+r/gLOAhS7TvwF+4+GY4oEtLtM7gU7O152AnV6w3/4HXOhNsQFhwDrsXeoZQEBtv3ELx9QVe4I4D/gcEG+IDUgCYmvM8/hvib1faB/OzireFJtLLBcBP3hLXBwbhSEG29vzc+Di5jrOfKJEQO3DXXTxUCx16WCMSXG+TgU6eDIY50iww4BVeEFszqqXDcAR4BtgD5BjjKlwruLJ3/QF4NeAwzndDu+IzQBfi8ha5zAt4AW/JdATSAfeclanvS4i4V4SW5UpwCzna4/HZYw5BDwPHABSgFxgLc10nPlKIjilGJvePdavV0QigDnAdGNMnusyT8VmjKk0tsjeFTugYf+WjqE2IjIJOGKMWevpWGoxzhgzHFsler+InOO60IPHWQAwHHjVGDMMKKRGdYsn/w8469mvAD6uucxTcTnbJSZjk2hnIJzjq5ebzFcSQUOGu/C0NBHpBOD894gnghCRQGwSeN8Y86k3xQZgjMkBlmCLwVHOoUnAc7/pWOAKEUnCjrB7Hrb+2+OxOa8iMcYcwdZ1j8Q7fstkINkYs8o5/Qk2MXhDbGAT5zpjTJpz2hviugDYZ4xJN8aUA59ij71mOc58JRE0ZLgLT3MdbuM2bP18ixIRwd7tvd0Y8w9viU1E4kQkyvk6FNtusR2bEK71VFwAxpjfGGO6GmPiscfVt8aYmzwdm4iEi0ibqtfYOu8teMFxZoxJBQ6KSD/nrPOBbd4Qm9NUjlULgXfEdQAYLSJhzv+nVfuseY4zTzXGeKCx5VLgJ2zd8m89HMssbD1fOfbq6A5svfJiYBewCIjxQFzjsMXeTcAG59+lno4NGAysd8a1BXjSOb8XsBrYjS3GB3v4d50AfO4NsTk/f6Pzb2vVMe/p39IlvqFAovM3nQdEe0Ns2CqXTCDSZZ7H43LG8Qdgh/P/wLtAcHMdZzrEhFJK+ThfqRpSSilVB00ESinl4zQRKKWUj9NEoJRSPk4TgVJK+ThNBErVICKVNUahbLZBxkQkXlxGnVXKG7jtUZVKncKKjR3OQimfoCUCpRrIOb7/35xj/K8Wkd7O+fEi8q2IbBKRxSLS3Tm/g4jMdT5HYaOIjHFuyl9EXnOOLf+1825ppTxGE4FSxwutUTV0g8uyXGPMGcC/sKOOArwMvGOMGQy8D7zknP8SsMzY5ygMx97hC9AHeMUYMwjIAa5x67dR6gT0zmKlahCRAmNMRC3zk7APyNnrHJwv1RjTTkQysOPVlzvnpxhjYkUkHehqjCl12UY88I2xDzlBRP4PCDTGPNsCX02pWmmJQKnGMXW8boxSl9eVaFud8jBNBEo1zg0u/650vl6BHXkU4CbgO+frxcB9cPTBOpEtFaRSjaFXIkodL9T5NLQqXxljqrqQRovIJuxV/VTnvF9gn7b1K+yTt253zn8ImCEid2Cv/O/DjjqrlFfRNgKlGsjZRpBgjMnwdCxKNSetGlJKKR+nJQKllPJxWiJQSikfp4lAKaV8nCYCpZTycZoIlFLKx2kiUEopH/f/mPJbVnKixOsAAAAASUVORK5CYII=",
2371
2372
2373
2374
2375
2376
2377
2378
2379
2380
2381
2382
2383
2384
2385
2386
2387
2388
2389
2390
2391
2392
2393
2394
2395
2396
2397
2398
2399
2400
2401
2402
2403
2404
2405
2406
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# Plot accuracies\n",
"plt.plot(history.history['accuracy'], label='Training Accuracy')\n",
"plt.plot(history.history['val_accuracy'], label = 'Validation Accuracy')\n",
"plt.xlabel('Epoch')\n",
"plt.ylabel('Accuracy')\n",
"plt.ylim([0, 1])\n",
"plt.legend(loc='lower right')\n",
"plt.title(\"Training and Value Accuracy\")\n",
"plt.savefig(\"accuracy_plot\")\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A big difference between training accuracy and validation accuracy means we could have potentially overfitted the model.\n",
"\n",
"### Evaluation\n",
"\n",
"We can evaluate the loss and accuracy with the `evaluate` function"
]
},
{
"cell_type": "code",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[INFO] evaluating network...\n",
"7/7 [==============================] - 0s 12ms/step - loss: 2.5202 - accuracy: 0.7097\n"
"data": {
"text/plain": [
"'test loss, test acc:'"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": [
"[2.520172357559204, 0.7096773982048035]"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# evaluate the network\n",
"\n",
"print(\"[INFO] evaluating network...\")\n",
"testmodel = load_model(\"./model.keras\")\n",
"results = testmodel.evaluate(test_images, test_labels, batch_size=BATCH_SIZE)\n",
"display(f\"test loss, test acc:\", results)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Predictions\n",
"\n",
"Let's predict with our model."
]
},
{
"cell_type": "code",
2458
2459
2460
2461
2462
2463
2464
2465
2466
2467
2468
2469
2470
2471
2472
2473
2474
2475
2476
2477
2478
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[INFO] predicting test samples...\n"
]
},
{
"data": {
"text/plain": [
"'predictions shape:'"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": [
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
" precision recall f1-score support\n",
"\n",
2491
2492
2493
2494
2495
2496
2497
2498
2499
2500
2501
2502
2503
2504
2505
2506
2507
2508
2509
2510
2511
2512
2513
2514
2515
2516
2517
2518
2519
2520
2521
2522
2523
2524
2525
2526
2527
2528
2529
2530
2531
2532
2533
2534
2535
2536
2537
2538
2539
2540
2541
2542
2543
2544
2545
2546
2547
2548
2549
2550
2551
2552
" 0 0.67 0.67 0.67 6\n",
" 1 0.75 0.50 0.60 6\n",
" 2 0.67 0.80 0.73 5\n",
" 3 1.00 0.80 0.89 5\n",
" 4 1.00 0.40 0.57 5\n",
" 5 0.75 0.60 0.67 5\n",
" 6 0.71 0.83 0.77 6\n",
" 7 0.80 0.80 0.80 5\n",
" 8 0.50 0.60 0.55 5\n",
" 9 1.00 0.60 0.75 5\n",
" A 0.80 0.67 0.73 6\n",
" B 0.60 0.60 0.60 5\n",
" C 0.67 0.67 0.67 6\n",
" D 0.83 1.00 0.91 5\n",
" E 1.00 0.60 0.75 5\n",
" F 0.83 0.83 0.83 6\n",
" G 1.00 0.80 0.89 5\n",
" H 0.83 0.83 0.83 6\n",
" I 0.50 0.50 0.50 6\n",
" G 0.57 0.67 0.62 6\n",
" K 0.60 0.60 0.60 5\n",
" L 1.00 0.83 0.91 6\n",
" M 1.00 0.67 0.80 6\n",
" N 0.67 1.00 0.80 6\n",
" O 0.67 0.67 0.67 6\n",
" P 0.80 0.80 0.80 5\n",
" Q 0.80 0.67 0.73 6\n",
" R 0.67 0.80 0.73 5\n",
" S 0.57 0.80 0.67 5\n",
" T 0.83 0.83 0.83 6\n",
" U 1.00 0.80 0.89 5\n",
" V 0.43 0.60 0.50 5\n",
" W 0.75 0.50 0.60 6\n",
" X 0.83 0.83 0.83 6\n",
" Y 1.00 0.83 0.91 6\n",
" Z 0.75 0.50 0.60 6\n",
" a 0.80 0.80 0.80 5\n",
" b 0.80 0.67 0.73 6\n",
" c 1.00 0.17 0.29 6\n",
" d 0.71 0.83 0.77 6\n",
" e 0.60 1.00 0.75 6\n",
" f 1.00 0.83 0.91 6\n",
" g 0.60 0.60 0.60 5\n",
" h 0.80 0.67 0.73 6\n",
" i 0.83 1.00 0.91 5\n",
" g 0.75 0.60 0.67 5\n",
" k 1.00 0.80 0.89 5\n",
" l 0.33 0.33 0.33 6\n",
" m 1.00 1.00 1.00 5\n",
" n 0.50 0.40 0.44 5\n",
" o 0.62 0.83 0.71 6\n",
" p 0.86 1.00 0.92 6\n",
" q 0.60 0.60 0.60 5\n",
" r 0.56 1.00 0.71 5\n",
" s 0.40 0.67 0.50 6\n",
" t 0.71 1.00 0.83 5\n",
" u 0.71 1.00 0.83 5\n",
" v 0.33 0.20 0.25 5\n",
" w 0.80 0.80 0.80 5\n",
" x 0.50 0.33 0.40 6\n",
" y 0.67 0.67 0.67 6\n",
" z 0.44 0.80 0.57 5\n",
" accuracy 0.71 341\n",
" macro avg 0.74 0.71 0.71 341\n",
"weighted avg 0.74 0.71 0.71 341\n",
"\n"
]
}
],
"source": [
"print(\"[INFO] predicting test samples...\")\n",
"predictions = testmodel.predict(test_images, batch_size=BATCH_SIZE)\n",
2564
2565
2566
2567
2568
2569
2570
2571
2572
2573
2574
2575
2576
2577
2578
2579
2580
2581
2582
2583
2584
2585
2586
2587
2588
2589
2590
2591
2592
2593
2594
2595
2596
2597
2598
2599
2600
2601
2602
2603
2604
2605
2606
2607
2608
2609
2610
2611
2612
2613
2614
2615
2616
2617
2618
2619
2620
2621
2622
2623
2624
2625
2626
2627
"display(\"predictions shape:\", predictions.shape)\n",
"\n",
"# labels for readability\n",
"labelNames = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', \n",
" 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'G', 'K', 'L', 'M', \n",
" 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z',\n",
" 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'g', 'k', 'l', 'm', \n",
" 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']\n",
"\n",
"print(classification_report(test_labels.argmax(axis=1),\n",
" predictions.argmax(axis=1), target_names=labelNames))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Saving\n",
"\n",
"Lets save the model."
]
},
{
"cell_type": "code",
"execution_count": 98,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[INFO] saving trained model...\n"
]
}
],
"source": [
" # save the model to disk\n",
"print(\"[INFO] saving trained model...\")\n",
"model.save(\"OCR_CNN.h5\")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Intro_to_AI",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.13"
}
},
"nbformat": 4,
"nbformat_minor": 2
}