Hands-on 08: Model Compression#

! pip install --user --quiet tensorflow-model-optimization
from tensorflow.keras.utils import to_categorical
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder, StandardScaler
import numpy as np
import matplotlib.pyplot as plt

%matplotlib inline
seed = 42
np.random.seed(seed)
import tensorflow as tf

tf.random.set_seed(seed)
2023-04-07 02:54:13.491520: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  SSE4.1 SSE4.2 AVX AVX2 AVX512F FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.

Fetch the jet tagging dataset from Open ML#

data = fetch_openml("hls4ml_lhc_jets_hlf", parser="auto")
X, y = data["data"], data["target"]

le = LabelEncoder()
y_onehot = le.fit_transform(y)          # class names -> integer codes
y_onehot = to_categorical(y_onehot, 5)  # integer codes -> one-hot vectors
classes = le.classes_

X_train_val, X_test, y_train_val, y_test = train_test_split(X, y_onehot, test_size=0.2, random_state=42)


scaler = StandardScaler()
X_train_val = scaler.fit_transform(X_train_val)
X_test = scaler.transform(X_test)
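To see what the label pipeline above produces, here is a minimal sketch on toy labels (the label names here are hypothetical stand-ins, not the actual jet classes). `np.eye(...)[codes]` has the same effect as `to_categorical`:

```python
import numpy as np
from sklearn.preprocessing import LabelEncoder

# Toy stand-in labels (hypothetical names, not the real jet classes).
labels = np.array(["t", "g", "w", "q", "z", "g"])
le = LabelEncoder()
codes = le.fit_transform(labels)          # class names -> integer codes, in sorted order
onehot = np.eye(len(le.classes_))[codes]  # same effect as to_categorical(codes, 5)
print(le.classes_)   # ['g' 'q' 't' 'w' 'z']
print(onehot.shape)  # (6, 5)
```

Note that `LabelEncoder` assigns codes in sorted order of the class names, so `classes[i]` recovers the name for one-hot position `i`.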

Now construct a model#

We’ll use the same architecture as in part 1: three hidden layers with 64, 32, and 32 neurons, each using a ReLU activation. Then add an output layer with 5 neurons (one for each class) and finish with a softmax activation.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation, BatchNormalization
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.regularizers import l1
from callbacks import all_callbacks
model = Sequential()
model.add(Dense(64, input_shape=(16,), name="fc1", kernel_initializer="lecun_uniform"))
model.add(Activation(activation="relu", name="relu1"))
model.add(Dense(32, name="fc2", kernel_initializer="lecun_uniform"))
model.add(Activation(activation="relu", name="relu2"))
model.add(Dense(32, name="fc3", kernel_initializer="lecun_uniform"))
model.add(Activation(activation="relu", name="relu3"))
model.add(Dense(5, name="output", kernel_initializer="lecun_uniform"))
model.add(Activation(activation="softmax", name="softmax"))
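As a sanity check on the size of this model, the parameter count can be worked out by hand, since each `Dense` layer contributes `fan_in * fan_out` weights plus `fan_out` biases:

```python
# (fan_in, fan_out) for fc1, fc2, fc3 and the output layer above.
layer_shapes = [(16, 64), (64, 32), (32, 32), (32, 5)]

# Each Dense layer has fan_in * fan_out weights plus fan_out biases.
n_params = sum(n_in * n_out + n_out for n_in, n_out in layer_shapes)
print(n_params)  # 4389
```

This should agree with `model.count_params()`, and gives a baseline against which to measure the compression achieved later.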

Train the unpruned model#

adam = Adam(learning_rate=0.0001)
model.compile(optimizer=adam, loss="categorical_crossentropy", metrics=["accuracy"])
callbacks = all_callbacks(
    stop_patience=1000,
    lr_factor=0.5,
    lr_patience=10,
    lr_epsilon=0.000001,
    lr_cooldown=2,
    lr_minimum=0.0000001,
    outputDir="unpruned_model",
)
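The `all_callbacks` helper ships with this tutorial's repository; a rough equivalent built from stock `tf.keras.callbacks` (the argument mapping here is an assumption, not the helper's actual implementation) would look something like:

```python
import tensorflow as tf

# Hypothetical stand-in for the tutorial's all_callbacks helper:
# early stopping, LR reduction on plateau, and best/last checkpoints.
def make_callbacks(output_dir="unpruned_model"):
    return [
        tf.keras.callbacks.EarlyStopping(patience=1000),      # stop_patience
        tf.keras.callbacks.ReduceLROnPlateau(
            factor=0.5, patience=10, min_delta=1e-6,          # lr_factor / lr_patience / lr_epsilon
            cooldown=2, min_lr=1e-7,                          # lr_cooldown / lr_minimum
        ),
        tf.keras.callbacks.ModelCheckpoint(
            f"{output_dir}/model_best.h5", save_best_only=True
        ),
        tf.keras.callbacks.ModelCheckpoint(f"{output_dir}/model_last.h5"),
    ]
```

With `stop_patience=1000` and only 30 epochs, early stopping effectively never fires; the learning-rate schedule and the checkpoints (visible in the log below as `model_best.h5` / `model_last.h5`) do the real work.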
model.fit(
    X_train_val, y_train_val, batch_size=1024, epochs=30, validation_split=0.25, shuffle=True, callbacks=callbacks.callbacks
)
Epoch 1/30
Epoch 1: val_loss improved from inf to 1.06875, saving model to unpruned_model/model_best.h5
Epoch 1: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 3s 4ms/step - loss: 1.2729 - accuracy: 0.5063 - val_loss: 1.0687 - val_accuracy: 0.6039 - lr: 1.0000e-04
Epoch 2/30
Epoch 2: val_loss improved from 1.06875 to 0.94310, saving model to unpruned_model/model_best.h5
Epoch 2: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.9979 - accuracy: 0.6412 - val_loss: 0.9431 - val_accuracy: 0.6744 - lr: 1.0000e-04
Epoch 3/30
Epoch 3: val_loss improved from 0.94310 to 0.86604, saving model to unpruned_model/model_best.h5
Epoch 3: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.8986 - accuracy: 0.6963 - val_loss: 0.8660 - val_accuracy: 0.7113 - lr: 1.0000e-04
Epoch 4/30
Epoch 4: val_loss improved from 0.86604 to 0.82260, saving model to unpruned_model/model_best.h5
Epoch 4: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.8387 - accuracy: 0.7186 - val_loss: 0.8226 - val_accuracy: 0.7235 - lr: 1.0000e-04
Epoch 5/30
Epoch 5: val_loss improved from 0.82260 to 0.79679, saving model to unpruned_model/model_best.h5
Epoch 5: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.8047 - accuracy: 0.7268 - val_loss: 0.7968 - val_accuracy: 0.7288 - lr: 1.0000e-04
Epoch 6/30
Epoch 6: val_loss improved from 0.79679 to 0.78066, saving model to unpruned_model/model_best.h5
Epoch 6: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7837 - accuracy: 0.7310 - val_loss: 0.7807 - val_accuracy: 0.7319 - lr: 1.0000e-04
Epoch 7/30
Epoch 7: val_loss improved from 0.78066 to 0.76875, saving model to unpruned_model/model_best.h5
Epoch 7: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7695 - accuracy: 0.7338 - val_loss: 0.7688 - val_accuracy: 0.7347 - lr: 1.0000e-04
Epoch 8/30
Epoch 8: val_loss improved from 0.76875 to 0.75925, saving model to unpruned_model/model_best.h5
Epoch 8: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 3ms/step - loss: 0.7586 - accuracy: 0.7363 - val_loss: 0.7593 - val_accuracy: 0.7367 - lr: 1.0000e-04
Epoch 9/30
Epoch 9: val_loss improved from 0.75925 to 0.75193, saving model to unpruned_model/model_best.h5
Epoch 9: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7499 - accuracy: 0.7383 - val_loss: 0.7519 - val_accuracy: 0.7389 - lr: 1.0000e-04
Epoch 10/30
Epoch 10: val_loss improved from 0.75193 to 0.74531, saving model to unpruned_model/model_best.h5
Epoch 10: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7428 - accuracy: 0.7400 - val_loss: 0.7453 - val_accuracy: 0.7403 - lr: 1.0000e-04
Epoch 11/30
Epoch 11: val_loss improved from 0.74531 to 0.73959, saving model to unpruned_model/model_best.h5
Epoch 11: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7368 - accuracy: 0.7416 - val_loss: 0.7396 - val_accuracy: 0.7413 - lr: 1.0000e-04
Epoch 12/30
  1/487 [..............................] - ETA: 2s - loss: 0.7585 - accuracy: 0.7324

 19/487 [>.............................] - ETA: 1s - loss: 0.7439 - accuracy: 0.7392

 37/487 [=>............................] - ETA: 1s - loss: 0.7378 - accuracy: 0.7411

 53/487 [==>...........................] - ETA: 1s - loss: 0.7335 - accuracy: 0.7421

 70/487 [===>..........................] - ETA: 1s - loss: 0.7293 - accuracy: 0.7438

 87/487 [====>.........................] - ETA: 1s - loss: 0.7305 - accuracy: 0.7428

106/487 [=====>........................] - ETA: 1s - loss: 0.7325 - accuracy: 0.7422

125/487 [======>.......................] - ETA: 1s - loss: 0.7328 - accuracy: 0.7422

143/487 [=======>......................] - ETA: 0s - loss: 0.7333 - accuracy: 0.7421

160/487 [========>.....................] - ETA: 0s - loss: 0.7334 - accuracy: 0.7423

179/487 [==========>...................] - ETA: 0s - loss: 0.7317 - accuracy: 0.7429

198/487 [===========>..................] - ETA: 0s - loss: 0.7323 - accuracy: 0.7425

218/487 [============>.................] - ETA: 0s - loss: 0.7319 - accuracy: 0.7425

237/487 [=============>................] - ETA: 0s - loss: 0.7307 - accuracy: 0.7431

256/487 [==============>...............] - ETA: 0s - loss: 0.7306 - accuracy: 0.7432

276/487 [================>.............] - ETA: 0s - loss: 0.7314 - accuracy: 0.7430

295/487 [=================>............] - ETA: 0s - loss: 0.7309 - accuracy: 0.7432

312/487 [==================>...........] - ETA: 0s - loss: 0.7311 - accuracy: 0.7431

329/487 [===================>..........] - ETA: 0s - loss: 0.7311 - accuracy: 0.7433

349/487 [====================>.........] - ETA: 0s - loss: 0.7317 - accuracy: 0.7429

369/487 [=====================>........] - ETA: 0s - loss: 0.7315 - accuracy: 0.7430

388/487 [======================>.......] - ETA: 0s - loss: 0.7317 - accuracy: 0.7429

405/487 [=======================>......] - ETA: 0s - loss: 0.7324 - accuracy: 0.7426

423/487 [=========================>....] - ETA: 0s - loss: 0.7319 - accuracy: 0.7427

442/487 [==========================>...] - ETA: 0s - loss: 0.7318 - accuracy: 0.7426

461/487 [===========================>..] - ETA: 0s - loss: 0.7317 - accuracy: 0.7427

481/487 [============================>.] - ETA: 0s - loss: 0.7315 - accuracy: 0.7428
Epoch 12: val_loss improved from 0.73959 to 0.73502, saving model to unpruned_model/model_best.h5
Epoch 12: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 3ms/step - loss: 0.7317 - accuracy: 0.7428 - val_loss: 0.7350 - val_accuracy: 0.7427 - lr: 1.0000e-04
Epoch 13/30
  1/487 [..............................] - ETA: 2s - loss: 0.7012 - accuracy: 0.7666

 20/487 [>.............................] - ETA: 1s - loss: 0.7248 - accuracy: 0.7469

 39/487 [=>............................] - ETA: 1s - loss: 0.7249 - accuracy: 0.7451

 57/487 [==>...........................] - ETA: 1s - loss: 0.7263 - accuracy: 0.7444

 75/487 [===>..........................] - ETA: 1s - loss: 0.7276 - accuracy: 0.7437

 93/487 [====>.........................] - ETA: 1s - loss: 0.7276 - accuracy: 0.7436

112/487 [=====>........................] - ETA: 1s - loss: 0.7270 - accuracy: 0.7439

131/487 [=======>......................] - ETA: 0s - loss: 0.7255 - accuracy: 0.7448

149/487 [========>.....................] - ETA: 0s - loss: 0.7262 - accuracy: 0.7442

169/487 [=========>....................] - ETA: 0s - loss: 0.7265 - accuracy: 0.7442

188/487 [==========>...................] - ETA: 0s - loss: 0.7265 - accuracy: 0.7443

207/487 [===========>..................] - ETA: 0s - loss: 0.7269 - accuracy: 0.7443

227/487 [============>.................] - ETA: 0s - loss: 0.7267 - accuracy: 0.7442

247/487 [==============>...............] - ETA: 0s - loss: 0.7273 - accuracy: 0.7440

267/487 [===============>..............] - ETA: 0s - loss: 0.7265 - accuracy: 0.7445

285/487 [================>.............] - ETA: 0s - loss: 0.7258 - accuracy: 0.7447

304/487 [=================>............] - ETA: 0s - loss: 0.7260 - accuracy: 0.7445

324/487 [==================>...........] - ETA: 0s - loss: 0.7266 - accuracy: 0.7442

343/487 [====================>.........] - ETA: 0s - loss: 0.7262 - accuracy: 0.7444

362/487 [=====================>........] - ETA: 0s - loss: 0.7264 - accuracy: 0.7443

381/487 [======================>.......] - ETA: 0s - loss: 0.7266 - accuracy: 0.7444

400/487 [=======================>......] - ETA: 0s - loss: 0.7264 - accuracy: 0.7445

419/487 [========================>.....] - ETA: 0s - loss: 0.7265 - accuracy: 0.7444

439/487 [==========================>...] - ETA: 0s - loss: 0.7268 - accuracy: 0.7444

458/487 [===========================>..] - ETA: 0s - loss: 0.7268 - accuracy: 0.7443

477/487 [============================>.] - ETA: 0s - loss: 0.7271 - accuracy: 0.7440
Epoch 13: val_loss improved from 0.73502 to 0.73083, saving model to unpruned_model/model_best.h5
Epoch 13: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 3ms/step - loss: 0.7271 - accuracy: 0.7439 - val_loss: 0.7308 - val_accuracy: 0.7433 - lr: 1.0000e-04
Epoch 14/30
  1/487 [..............................] - ETA: 2s - loss: 0.7656 - accuracy: 0.7256

 20/487 [>.............................] - ETA: 1s - loss: 0.7261 - accuracy: 0.7454

 38/487 [=>............................] - ETA: 1s - loss: 0.7203 - accuracy: 0.7478

 55/487 [==>...........................] - ETA: 1s - loss: 0.7211 - accuracy: 0.7474

 72/487 [===>..........................] - ETA: 1s - loss: 0.7195 - accuracy: 0.7476

 89/487 [====>.........................] - ETA: 1s - loss: 0.7210 - accuracy: 0.7470

106/487 [=====>........................] - ETA: 1s - loss: 0.7214 - accuracy: 0.7469

122/487 [======>.......................] - ETA: 1s - loss: 0.7231 - accuracy: 0.7457

139/487 [=======>......................] - ETA: 1s - loss: 0.7235 - accuracy: 0.7458

157/487 [========>.....................] - ETA: 0s - loss: 0.7242 - accuracy: 0.7454

174/487 [=========>....................] - ETA: 0s - loss: 0.7234 - accuracy: 0.7457

191/487 [==========>...................] - ETA: 0s - loss: 0.7233 - accuracy: 0.7455

208/487 [===========>..................] - ETA: 0s - loss: 0.7238 - accuracy: 0.7454

227/487 [============>.................] - ETA: 0s - loss: 0.7232 - accuracy: 0.7455

244/487 [==============>...............] - ETA: 0s - loss: 0.7230 - accuracy: 0.7454

259/487 [==============>...............] - ETA: 0s - loss: 0.7234 - accuracy: 0.7451

276/487 [================>.............] - ETA: 0s - loss: 0.7229 - accuracy: 0.7455

293/487 [=================>............] - ETA: 0s - loss: 0.7234 - accuracy: 0.7454

311/487 [==================>...........] - ETA: 0s - loss: 0.7239 - accuracy: 0.7451

326/487 [===================>..........] - ETA: 0s - loss: 0.7237 - accuracy: 0.7451

343/487 [====================>.........] - ETA: 0s - loss: 0.7239 - accuracy: 0.7452

361/487 [=====================>........] - ETA: 0s - loss: 0.7229 - accuracy: 0.7454

379/487 [======================>.......] - ETA: 0s - loss: 0.7225 - accuracy: 0.7457

396/487 [=======================>......] - ETA: 0s - loss: 0.7231 - accuracy: 0.7453

414/487 [========================>.....] - ETA: 0s - loss: 0.7230 - accuracy: 0.7453

431/487 [=========================>....] - ETA: 0s - loss: 0.7230 - accuracy: 0.7452

448/487 [==========================>...] - ETA: 0s - loss: 0.7232 - accuracy: 0.7452

465/487 [===========================>..] - ETA: 0s - loss: 0.7233 - accuracy: 0.7452

481/487 [============================>.] - ETA: 0s - loss: 0.7230 - accuracy: 0.7451
Epoch 14: val_loss improved from 0.73083 to 0.72710, saving model to unpruned_model/model_best.h5
Epoch 14: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7231 - accuracy: 0.7451 - val_loss: 0.7271 - val_accuracy: 0.7450 - lr: 1.0000e-04
Epoch 15/30
  1/487 [..............................] - ETA: 2s - loss: 0.7426 - accuracy: 0.7354

 19/487 [>.............................] - ETA: 1s - loss: 0.7218 - accuracy: 0.7459

 38/487 [=>............................] - ETA: 1s - loss: 0.7187 - accuracy: 0.7467

 54/487 [==>...........................] - ETA: 1s - loss: 0.7209 - accuracy: 0.7458

 73/487 [===>..........................] - ETA: 1s - loss: 0.7222 - accuracy: 0.7454

 92/487 [====>.........................] - ETA: 1s - loss: 0.7222 - accuracy: 0.7456

108/487 [=====>........................] - ETA: 1s - loss: 0.7204 - accuracy: 0.7458

124/487 [======>.......................] - ETA: 1s - loss: 0.7213 - accuracy: 0.7454

142/487 [=======>......................] - ETA: 1s - loss: 0.7198 - accuracy: 0.7461

159/487 [========>.....................] - ETA: 0s - loss: 0.7202 - accuracy: 0.7458

177/487 [=========>....................] - ETA: 0s - loss: 0.7202 - accuracy: 0.7457

194/487 [==========>...................] - ETA: 0s - loss: 0.7198 - accuracy: 0.7457

211/487 [===========>..................] - ETA: 0s - loss: 0.7196 - accuracy: 0.7459

226/487 [============>.................] - ETA: 0s - loss: 0.7202 - accuracy: 0.7457

244/487 [==============>...............] - ETA: 0s - loss: 0.7203 - accuracy: 0.7456

261/487 [===============>..............] - ETA: 0s - loss: 0.7212 - accuracy: 0.7453

278/487 [================>.............] - ETA: 0s - loss: 0.7211 - accuracy: 0.7453

296/487 [=================>............] - ETA: 0s - loss: 0.7216 - accuracy: 0.7451

313/487 [==================>...........] - ETA: 0s - loss: 0.7220 - accuracy: 0.7449

331/487 [===================>..........] - ETA: 0s - loss: 0.7217 - accuracy: 0.7452

348/487 [====================>.........] - ETA: 0s - loss: 0.7212 - accuracy: 0.7455

366/487 [=====================>........] - ETA: 0s - loss: 0.7212 - accuracy: 0.7455

385/487 [======================>.......] - ETA: 0s - loss: 0.7210 - accuracy: 0.7456

403/487 [=======================>......] - ETA: 0s - loss: 0.7207 - accuracy: 0.7457

421/487 [========================>.....] - ETA: 0s - loss: 0.7207 - accuracy: 0.7456

440/487 [==========================>...] - ETA: 0s - loss: 0.7200 - accuracy: 0.7458

458/487 [===========================>..] - ETA: 0s - loss: 0.7196 - accuracy: 0.7458

475/487 [============================>.] - ETA: 0s - loss: 0.7195 - accuracy: 0.7459
Epoch 15: val_loss improved from 0.72710 to 0.72379, saving model to unpruned_model/model_best.h5
Epoch 15: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7196 - accuracy: 0.7459 - val_loss: 0.7238 - val_accuracy: 0.7452 - lr: 1.0000e-04
Epoch 16/30
  1/487 [..............................] - ETA: 1s - loss: 0.6914 - accuracy: 0.7578

 20/487 [>.............................] - ETA: 1s - loss: 0.7224 - accuracy: 0.7442

 38/487 [=>............................] - ETA: 1s - loss: 0.7225 - accuracy: 0.7448

 56/487 [==>...........................] - ETA: 1s - loss: 0.7253 - accuracy: 0.7438

 73/487 [===>..........................] - ETA: 1s - loss: 0.7220 - accuracy: 0.7450

 91/487 [====>.........................] - ETA: 1s - loss: 0.7218 - accuracy: 0.7447

110/487 [=====>........................] - ETA: 1s - loss: 0.7235 - accuracy: 0.7443

129/487 [======>.......................] - ETA: 1s - loss: 0.7222 - accuracy: 0.7446

147/487 [========>.....................] - ETA: 0s - loss: 0.7219 - accuracy: 0.7448

165/487 [=========>....................] - ETA: 0s - loss: 0.7209 - accuracy: 0.7452

184/487 [==========>...................] - ETA: 0s - loss: 0.7193 - accuracy: 0.7457

201/487 [===========>..................] - ETA: 0s - loss: 0.7191 - accuracy: 0.7457

218/487 [============>.................] - ETA: 0s - loss: 0.7193 - accuracy: 0.7456

236/487 [=============>................] - ETA: 0s - loss: 0.7187 - accuracy: 0.7460

254/487 [==============>...............] - ETA: 0s - loss: 0.7189 - accuracy: 0.7457

272/487 [===============>..............] - ETA: 0s - loss: 0.7181 - accuracy: 0.7462

290/487 [================>.............] - ETA: 0s - loss: 0.7185 - accuracy: 0.7459

308/487 [=================>............] - ETA: 0s - loss: 0.7180 - accuracy: 0.7462

325/487 [===================>..........] - ETA: 0s - loss: 0.7176 - accuracy: 0.7463

343/487 [====================>.........] - ETA: 0s - loss: 0.7178 - accuracy: 0.7462

359/487 [=====================>........] - ETA: 0s - loss: 0.7174 - accuracy: 0.7464

376/487 [======================>.......] - ETA: 0s - loss: 0.7174 - accuracy: 0.7464

392/487 [=======================>......] - ETA: 0s - loss: 0.7173 - accuracy: 0.7464

408/487 [========================>.....] - ETA: 0s - loss: 0.7169 - accuracy: 0.7465

423/487 [=========================>....] - ETA: 0s - loss: 0.7170 - accuracy: 0.7465

439/487 [==========================>...] - ETA: 0s - loss: 0.7174 - accuracy: 0.7464

456/487 [===========================>..] - ETA: 0s - loss: 0.7170 - accuracy: 0.7465

473/487 [============================>.] - ETA: 0s - loss: 0.7168 - accuracy: 0.7467
Epoch 16: val_loss improved from 0.72379 to 0.72092, saving model to unpruned_model/model_best.h5
Epoch 16: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7164 - accuracy: 0.7468 - val_loss: 0.7209 - val_accuracy: 0.7461 - lr: 1.0000e-04
Epoch 17/30
  1/487 [..............................] - ETA: 2s - loss: 0.7394 - accuracy: 0.7334

 18/487 [>.............................] - ETA: 1s - loss: 0.7117 - accuracy: 0.7493

 35/487 [=>............................] - ETA: 1s - loss: 0.7126 - accuracy: 0.7482

 52/487 [==>...........................] - ETA: 1s - loss: 0.7163 - accuracy: 0.7465

 70/487 [===>..........................] - ETA: 1s - loss: 0.7183 - accuracy: 0.7461

 88/487 [====>.........................] - ETA: 1s - loss: 0.7176 - accuracy: 0.7461

105/487 [=====>........................] - ETA: 1s - loss: 0.7187 - accuracy: 0.7458

122/487 [======>.......................] - ETA: 1s - loss: 0.7187 - accuracy: 0.7456

139/487 [=======>......................] - ETA: 1s - loss: 0.7164 - accuracy: 0.7465

156/487 [========>.....................] - ETA: 0s - loss: 0.7148 - accuracy: 0.7471

174/487 [=========>....................] - ETA: 0s - loss: 0.7135 - accuracy: 0.7479

191/487 [==========>...................] - ETA: 0s - loss: 0.7137 - accuracy: 0.7478

210/487 [===========>..................] - ETA: 0s - loss: 0.7127 - accuracy: 0.7483

229/487 [=============>................] - ETA: 0s - loss: 0.7116 - accuracy: 0.7486

245/487 [==============>...............] - ETA: 0s - loss: 0.7116 - accuracy: 0.7486

263/487 [===============>..............] - ETA: 0s - loss: 0.7125 - accuracy: 0.7482

282/487 [================>.............] - ETA: 0s - loss: 0.7127 - accuracy: 0.7481

300/487 [=================>............] - ETA: 0s - loss: 0.7131 - accuracy: 0.7478

316/487 [==================>...........] - ETA: 0s - loss: 0.7137 - accuracy: 0.7476

333/487 [===================>..........] - ETA: 0s - loss: 0.7136 - accuracy: 0.7477

351/487 [====================>.........] - ETA: 0s - loss: 0.7138 - accuracy: 0.7476

368/487 [=====================>........] - ETA: 0s - loss: 0.7139 - accuracy: 0.7476

386/487 [======================>.......] - ETA: 0s - loss: 0.7137 - accuracy: 0.7477

405/487 [=======================>......] - ETA: 0s - loss: 0.7136 - accuracy: 0.7477

424/487 [=========================>....] - ETA: 0s - loss: 0.7136 - accuracy: 0.7476

442/487 [==========================>...] - ETA: 0s - loss: 0.7136 - accuracy: 0.7475

459/487 [===========================>..] - ETA: 0s - loss: 0.7138 - accuracy: 0.7474

476/487 [============================>.] - ETA: 0s - loss: 0.7136 - accuracy: 0.7475
Epoch 17: val_loss improved from 0.72092 to 0.71820, saving model to unpruned_model/model_best.h5
Epoch 17: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7134 - accuracy: 0.7476 - val_loss: 0.7182 - val_accuracy: 0.7470 - lr: 1.0000e-04
Epoch 18/30
  1/487 [..............................] - ETA: 2s - loss: 0.7148 - accuracy: 0.7471

 20/487 [>.............................] - ETA: 1s - loss: 0.7127 - accuracy: 0.7485

 38/487 [=>............................] - ETA: 1s - loss: 0.7079 - accuracy: 0.7491

 54/487 [==>...........................] - ETA: 1s - loss: 0.7087 - accuracy: 0.7490

 68/487 [===>..........................] - ETA: 1s - loss: 0.7092 - accuracy: 0.7488

 83/487 [====>.........................] - ETA: 1s - loss: 0.7093 - accuracy: 0.7490

 98/487 [=====>........................] - ETA: 1s - loss: 0.7100 - accuracy: 0.7487

113/487 [=====>........................] - ETA: 1s - loss: 0.7095 - accuracy: 0.7490

130/487 [=======>......................] - ETA: 1s - loss: 0.7108 - accuracy: 0.7486

147/487 [========>.....................] - ETA: 1s - loss: 0.7107 - accuracy: 0.7485

164/487 [=========>....................] - ETA: 1s - loss: 0.7113 - accuracy: 0.7488

182/487 [==========>...................] - ETA: 0s - loss: 0.7109 - accuracy: 0.7490

198/487 [===========>..................] - ETA: 0s - loss: 0.7113 - accuracy: 0.7487

216/487 [============>.................] - ETA: 0s - loss: 0.7112 - accuracy: 0.7488

234/487 [=============>................] - ETA: 0s - loss: 0.7110 - accuracy: 0.7489

252/487 [==============>...............] - ETA: 0s - loss: 0.7108 - accuracy: 0.7487

271/487 [===============>..............] - ETA: 0s - loss: 0.7099 - accuracy: 0.7490

289/487 [================>.............] - ETA: 0s - loss: 0.7103 - accuracy: 0.7490

308/487 [=================>............] - ETA: 0s - loss: 0.7101 - accuracy: 0.7489

326/487 [===================>..........] - ETA: 0s - loss: 0.7099 - accuracy: 0.7488

345/487 [====================>.........] - ETA: 0s - loss: 0.7094 - accuracy: 0.7490

364/487 [=====================>........] - ETA: 0s - loss: 0.7093 - accuracy: 0.7491

383/487 [======================>.......] - ETA: 0s - loss: 0.7089 - accuracy: 0.7492

399/487 [=======================>......] - ETA: 0s - loss: 0.7097 - accuracy: 0.7489

416/487 [========================>.....] - ETA: 0s - loss: 0.7097 - accuracy: 0.7487

435/487 [=========================>....] - ETA: 0s - loss: 0.7096 - accuracy: 0.7487

453/487 [==========================>...] - ETA: 0s - loss: 0.7102 - accuracy: 0.7484

470/487 [===========================>..] - ETA: 0s - loss: 0.7105 - accuracy: 0.7483

486/487 [============================>.] - ETA: 0s - loss: 0.7108 - accuracy: 0.7483
Epoch 18: val_loss improved from 0.71820 to 0.71597, saving model to unpruned_model/model_best.h5
Epoch 18: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7108 - accuracy: 0.7483 - val_loss: 0.7160 - val_accuracy: 0.7469 - lr: 1.0000e-04
Epoch 19/30
  1/487 [..............................] - ETA: 2s - loss: 0.7118 - accuracy: 0.7471

 20/487 [>.............................] - ETA: 1s - loss: 0.7072 - accuracy: 0.7491

 38/487 [=>............................] - ETA: 1s - loss: 0.7052 - accuracy: 0.7493

 55/487 [==>...........................] - ETA: 1s - loss: 0.7059 - accuracy: 0.7498

 72/487 [===>..........................] - ETA: 1s - loss: 0.7064 - accuracy: 0.7496

 89/487 [====>.........................] - ETA: 1s - loss: 0.7082 - accuracy: 0.7486

106/487 [=====>........................] - ETA: 1s - loss: 0.7076 - accuracy: 0.7487

123/487 [======>.......................] - ETA: 1s - loss: 0.7070 - accuracy: 0.7488

140/487 [=======>......................] - ETA: 1s - loss: 0.7077 - accuracy: 0.7490

158/487 [========>.....................] - ETA: 0s - loss: 0.7075 - accuracy: 0.7493

175/487 [=========>....................] - ETA: 0s - loss: 0.7074 - accuracy: 0.7494

192/487 [==========>...................] - ETA: 0s - loss: 0.7079 - accuracy: 0.7492

210/487 [===========>..................] - ETA: 0s - loss: 0.7079 - accuracy: 0.7494

227/487 [============>.................] - ETA: 0s - loss: 0.7083 - accuracy: 0.7492

245/487 [==============>...............] - ETA: 0s - loss: 0.7080 - accuracy: 0.7493

263/487 [===============>..............] - ETA: 0s - loss: 0.7082 - accuracy: 0.7491

282/487 [================>.............] - ETA: 0s - loss: 0.7084 - accuracy: 0.7491

299/487 [=================>............] - ETA: 0s - loss: 0.7082 - accuracy: 0.7492

316/487 [==================>...........] - ETA: 0s - loss: 0.7081 - accuracy: 0.7494

333/487 [===================>..........] - ETA: 0s - loss: 0.7080 - accuracy: 0.7495

353/487 [====================>.........] - ETA: 0s - loss: 0.7082 - accuracy: 0.7493

373/487 [=====================>........] - ETA: 0s - loss: 0.7081 - accuracy: 0.7493

391/487 [=======================>......] - ETA: 0s - loss: 0.7081 - accuracy: 0.7492

409/487 [========================>.....] - ETA: 0s - loss: 0.7084 - accuracy: 0.7492

428/487 [=========================>....] - ETA: 0s - loss: 0.7087 - accuracy: 0.7491

448/487 [==========================>...] - ETA: 0s - loss: 0.7080 - accuracy: 0.7494

466/487 [===========================>..] - ETA: 0s - loss: 0.7080 - accuracy: 0.7494

485/487 [============================>.] - ETA: 0s - loss: 0.7083 - accuracy: 0.7492
Epoch 19: val_loss improved from 0.71597 to 0.71352, saving model to unpruned_model/model_best.h5
Epoch 19: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7084 - accuracy: 0.7491 - val_loss: 0.7135 - val_accuracy: 0.7479 - lr: 1.0000e-04
Epoch 20/30
  1/487 [..............................] - ETA: 2s - loss: 0.7066 - accuracy: 0.7451

 19/487 [>.............................] - ETA: 1s - loss: 0.6931 - accuracy: 0.7586

 37/487 [=>............................] - ETA: 1s - loss: 0.6986 - accuracy: 0.7536

 54/487 [==>...........................] - ETA: 1s - loss: 0.7009 - accuracy: 0.7523

 72/487 [===>..........................] - ETA: 1s - loss: 0.7046 - accuracy: 0.7505

 89/487 [====>.........................] - ETA: 1s - loss: 0.7054 - accuracy: 0.7501

106/487 [=====>........................] - ETA: 1s - loss: 0.7050 - accuracy: 0.7500

123/487 [======>.......................] - ETA: 1s - loss: 0.7044 - accuracy: 0.7502

139/487 [=======>......................] - ETA: 1s - loss: 0.7046 - accuracy: 0.7503

157/487 [========>.....................] - ETA: 0s - loss: 0.7065 - accuracy: 0.7498

173/487 [=========>....................] - ETA: 0s - loss: 0.7065 - accuracy: 0.7496

190/487 [==========>...................] - ETA: 0s - loss: 0.7073 - accuracy: 0.7492

205/487 [===========>..................] - ETA: 0s - loss: 0.7070 - accuracy: 0.7495

222/487 [============>.................] - ETA: 0s - loss: 0.7076 - accuracy: 0.7492

238/487 [=============>................] - ETA: 0s - loss: 0.7083 - accuracy: 0.7488

254/487 [==============>...............] - ETA: 0s - loss: 0.7078 - accuracy: 0.7490

271/487 [===============>..............] - ETA: 0s - loss: 0.7079 - accuracy: 0.7489

288/487 [================>.............] - ETA: 0s - loss: 0.7076 - accuracy: 0.7491

304/487 [=================>............] - ETA: 0s - loss: 0.7070 - accuracy: 0.7493

320/487 [==================>...........] - ETA: 0s - loss: 0.7069 - accuracy: 0.7492

338/487 [===================>..........] - ETA: 0s - loss: 0.7067 - accuracy: 0.7494

356/487 [====================>.........] - ETA: 0s - loss: 0.7062 - accuracy: 0.7496

375/487 [======================>.......] - ETA: 0s - loss: 0.7060 - accuracy: 0.7494

393/487 [=======================>......] - ETA: 0s - loss: 0.7058 - accuracy: 0.7497

411/487 [========================>.....] - ETA: 0s - loss: 0.7063 - accuracy: 0.7495

428/487 [=========================>....] - ETA: 0s - loss: 0.7059 - accuracy: 0.7498

446/487 [==========================>...] - ETA: 0s - loss: 0.7062 - accuracy: 0.7496

464/487 [===========================>..] - ETA: 0s - loss: 0.7063 - accuracy: 0.7495

482/487 [============================>.] - ETA: 0s - loss: 0.7064 - accuracy: 0.7495
Epoch 20: val_loss improved from 0.71352 to 0.71122, saving model to unpruned_model/model_best.h5
Epoch 20: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7062 - accuracy: 0.7496 - val_loss: 0.7112 - val_accuracy: 0.7483 - lr: 1.0000e-04
Epoch 21/30
  1/487 [..............................] - ETA: 2s - loss: 0.7345 - accuracy: 0.7334

 19/487 [>.............................] - ETA: 1s - loss: 0.7164 - accuracy: 0.7442

 38/487 [=>............................] - ETA: 1s - loss: 0.7096 - accuracy: 0.7473

 57/487 [==>...........................] - ETA: 1s - loss: 0.7053 - accuracy: 0.7489

 74/487 [===>..........................] - ETA: 1s - loss: 0.7046 - accuracy: 0.7498

 88/487 [====>.........................] - ETA: 1s - loss: 0.7039 - accuracy: 0.7497

105/487 [=====>........................] - ETA: 1s - loss: 0.7036 - accuracy: 0.7497

121/487 [======>.......................] - ETA: 1s - loss: 0.7032 - accuracy: 0.7500

139/487 [=======>......................] - ETA: 1s - loss: 0.7041 - accuracy: 0.7500

158/487 [========>.....................] - ETA: 0s - loss: 0.7044 - accuracy: 0.7497

177/487 [=========>....................] - ETA: 0s - loss: 0.7060 - accuracy: 0.7494

191/487 [==========>...................] - ETA: 0s - loss: 0.7058 - accuracy: 0.7493

210/487 [===========>..................] - ETA: 0s - loss: 0.7053 - accuracy: 0.7495

228/487 [=============>................] - ETA: 0s - loss: 0.7059 - accuracy: 0.7492

247/487 [==============>...............] - ETA: 0s - loss: 0.7058 - accuracy: 0.7492

266/487 [===============>..............] - ETA: 0s - loss: 0.7063 - accuracy: 0.7489

284/487 [================>.............] - ETA: 0s - loss: 0.7059 - accuracy: 0.7492

300/487 [=================>............] - ETA: 0s - loss: 0.7060 - accuracy: 0.7492

319/487 [==================>...........] - ETA: 0s - loss: 0.7056 - accuracy: 0.7494

338/487 [===================>..........] - ETA: 0s - loss: 0.7058 - accuracy: 0.7494

355/487 [====================>.........] - ETA: 0s - loss: 0.7059 - accuracy: 0.7494

374/487 [======================>.......] - ETA: 0s - loss: 0.7057 - accuracy: 0.7494

391/487 [=======================>......] - ETA: 0s - loss: 0.7053 - accuracy: 0.7496

408/487 [========================>.....] - ETA: 0s - loss: 0.7049 - accuracy: 0.7498

426/487 [=========================>....] - ETA: 0s - loss: 0.7049 - accuracy: 0.7499

444/487 [==========================>...] - ETA: 0s - loss: 0.7045 - accuracy: 0.7500

461/487 [===========================>..] - ETA: 0s - loss: 0.7043 - accuracy: 0.7501

479/487 [============================>.] - ETA: 0s - loss: 0.7041 - accuracy: 0.7502
Epoch 21: val_loss improved from 0.71122 to 0.70908, saving model to unpruned_model/model_best.h5
Epoch 21: saving model to unpruned_model/model_last.h5

487/487 [==============================] - 2s 4ms/step - loss: 0.7041 - accuracy: 0.7502 - val_loss: 0.7091 - val_accuracy: 0.7487 - lr: 1.0000e-04
Epoch 22/30
  1/487 [..............................] - ETA: 2s - loss: 0.7125 - accuracy: 0.7344

 20/487 [>.............................] - ETA: 1s - loss: 0.6987 - accuracy: 0.7520

 39/487 [=>............................] - ETA: 1s - loss: 0.7022 - accuracy: 0.7491

 57/487 [==>...........................] - ETA: 1s - loss: 0.7057 - accuracy: 0.7481

 73/487 [===>..........................] - ETA: 1s - loss: 0.7075 - accuracy: 0.7476

 87/487 [====>.........................] - ETA: 1s - loss: 0.7089 - accuracy: 0.7470

106/487 [=====>........................] - ETA: 1s - loss: 0.7078 - accuracy: 0.7479

125/487 [======>.......................] - ETA: 1s - loss: 0.7062 - accuracy: 0.7489

143/487 [=======>......................] - ETA: 0s - loss: 0.7064 - accuracy: 0.7490

162/487 [========>.....................] - ETA: 0s - loss: 0.7060 - accuracy: 0.7490

182/487 [==========>...................] - ETA: 0s - loss: 0.7051 - accuracy: 0.7493

201/487 [===========>..................] - ETA: 0s - loss: 0.7039 - accuracy: 0.7496

219/487 [============>.................] - ETA: 0s - loss: 0.7034 - accuracy: 0.7502

238/487 [=============>................] - ETA: 0s - loss: 0.7027 - accuracy: 0.7506

255/487 [==============>...............] - ETA: 0s - loss: 0.7029 - accuracy: 0.7504

271/487 [===============>..............] - ETA: 0s - loss: 0.7025 - accuracy: 0.7506

290/487 [================>.............] - ETA: 0s - loss: 0.7020 - accuracy: 0.7506

307/487 [=================>............] - ETA: 0s - loss: 0.7019 - accuracy: 0.7506

323/487 [==================>...........] - ETA: 0s - loss: 0.7018 - accuracy: 0.7508

341/487 [====================>.........] - ETA: 0s - loss: 0.7024 - accuracy: 0.7505

Epoch 22: val_loss improved from 0.70908 to 0.70751, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.7021 - accuracy: 0.7506 - val_loss: 0.7075 - val_accuracy: 0.7494 - lr: 1.0000e-04
Epoch 23: val_loss improved from 0.70751 to 0.70550, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 3ms/step - loss: 0.7003 - accuracy: 0.7511 - val_loss: 0.7055 - val_accuracy: 0.7498 - lr: 1.0000e-04
Epoch 24: val_loss improved from 0.70550 to 0.70379, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6986 - accuracy: 0.7517 - val_loss: 0.7038 - val_accuracy: 0.7502 - lr: 1.0000e-04
Epoch 25: val_loss improved from 0.70379 to 0.70247, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6969 - accuracy: 0.7520 - val_loss: 0.7025 - val_accuracy: 0.7505 - lr: 1.0000e-04
Epoch 26: val_loss improved from 0.70247 to 0.70146, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6953 - accuracy: 0.7524 - val_loss: 0.7015 - val_accuracy: 0.7510 - lr: 1.0000e-04
Epoch 27: val_loss improved from 0.70146 to 0.69965, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6939 - accuracy: 0.7529 - val_loss: 0.6996 - val_accuracy: 0.7514 - lr: 1.0000e-04
Epoch 28: val_loss improved from 0.69965 to 0.69813, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6924 - accuracy: 0.7531 - val_loss: 0.6981 - val_accuracy: 0.7519 - lr: 1.0000e-04
Epoch 29: val_loss improved from 0.69813 to 0.69696, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6909 - accuracy: 0.7538 - val_loss: 0.6970 - val_accuracy: 0.7519 - lr: 1.0000e-04
Epoch 30: val_loss improved from 0.69696 to 0.69557, saving model to unpruned_model/model_best.h5
487/487 [==============================] - 2s 4ms/step - loss: 0.6895 - accuracy: 0.7542 - val_loss: 0.6956 - val_accuracy: 0.7526 - lr: 1.0000e-04
<keras.callbacks.History at 0x7f02b5614b50>

Train the pruned model#

This time we’ll use the TensorFlow Model Optimization sparsity API to train a sparse model (forcing many weights to ‘0’). In this instance, the target sparsity is 75%.
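Before reaching for the library, it helps to see what magnitude pruning does at its core: weights with the smallest absolute values are set to zero until the target sparsity is reached. A minimal NumPy sketch of a one-shot 75% magnitude prune (on a toy weight matrix, not the actual schedule the library applies during training):

```python
import numpy as np

rng = np.random.default_rng(42)
w = rng.normal(size=(64, 16))  # toy stand-in for a Dense layer's kernel

# One-shot magnitude pruning: zero out the smallest 75% of weights by |value|
sparsity = 0.75
threshold = np.quantile(np.abs(w), sparsity)
mask = np.abs(w) > threshold
w_pruned = w * mask

print(f"Fraction of zeros: {np.mean(w_pruned == 0):.2f}")
```

The library's `ConstantSparsity` schedule does essentially this, but re-applies the mask every `frequency` steps during training (starting at `begin_step`), so the surviving weights can adapt to the removals.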

from tensorflow_model_optimization.python.core.sparsity.keras import prune, pruning_callbacks, pruning_schedule
from tensorflow_model_optimization.sparsity.keras import strip_pruning

pruned_model = Sequential()
pruned_model.add(Dense(64, input_shape=(16,), name="fc1", kernel_initializer="lecun_uniform", kernel_regularizer=l1(0.0001)))
pruned_model.add(Activation(activation="relu", name="relu1"))
pruned_model.add(Dense(32, name="fc2", kernel_initializer="lecun_uniform", kernel_regularizer=l1(0.0001)))
pruned_model.add(Activation(activation="relu", name="relu2"))
pruned_model.add(Dense(32, name="fc3", kernel_initializer="lecun_uniform", kernel_regularizer=l1(0.0001)))
pruned_model.add(Activation(activation="relu", name="relu3"))
pruned_model.add(Dense(5, name="output", kernel_initializer="lecun_uniform", kernel_regularizer=l1(0.0001)))
pruned_model.add(Activation(activation="softmax", name="softmax"))

pruning_params = {"pruning_schedule": pruning_schedule.ConstantSparsity(0.75, begin_step=2000, frequency=100)}
pruned_model = prune.prune_low_magnitude(pruned_model, **pruning_params)

We’ll use the same settings as before: Adam optimizer with categorical crossentropy loss. The callbacks will decay the learning rate and save the model into a directory pruned_model.

adam = Adam(learning_rate=0.0001)  # 'lr' is a deprecated alias for 'learning_rate'
pruned_model.compile(optimizer=adam, loss=["categorical_crossentropy"], metrics=["accuracy"])
callbacks = all_callbacks(
    stop_patience=1000,
    lr_factor=0.5,
    lr_patience=10,
    lr_epsilon=0.000001,
    lr_cooldown=2,
    lr_minimum=0.0000001,
    outputDir="pruned_model",
)
callbacks.callbacks.append(pruning_callbacks.UpdatePruningStep())
pruned_model.fit(
    X_train_val,
    y_train_val,
    batch_size=1024,
    epochs=30,
    validation_split=0.25,
    shuffle=True,
    callbacks=callbacks.callbacks,
    verbose=0,
)
# Save the model again but with the pruning 'stripped' to use the regular layer types
pruned_model = strip_pruning(pruned_model)
pruned_model.save("pruned_model/model_best.h5")

Check sparsity#

Let’s quickly check that the model was indeed trained sparse. We’ll make a histogram of the weights of the first layer and hopefully observe a large peak in the bin containing ‘0’. Note the logarithmic y-axis.

bins = np.arange(-2, 2, 0.04)
w_unpruned = model.layers[0].weights[0].numpy().flatten()
w_pruned = pruned_model.layers[0].weights[0].numpy().flatten()

plt.figure(figsize=(7, 7))

plt.hist(w_unpruned, bins=bins, alpha=0.7, label="Unpruned layer 1")
plt.hist(w_pruned, bins=bins, alpha=0.7, label="Pruned layer 1")

plt.xlabel("Weight value")
plt.ylabel("Number of weights")
plt.semilogy()
plt.legend()

print(f"Sparsity of unpruned model layer 1: {np.sum(w_unpruned == 0) * 100 / np.size(w_unpruned):.1f}% zeros")
print(f"Sparsity of pruned model layer 1: {np.sum(w_pruned == 0) * 100 / np.size(w_pruned):.1f}% zeros")
plt.show()

Check performance#

How does this 75% sparse model compare against the unpruned model? Let’s report the accuracy and make a ROC curve. The unpruned model is shown with solid lines, the pruned model with dashed lines.

import plotting
import matplotlib.pyplot as plt
from sklearn.metrics import accuracy_score
from tensorflow.keras.models import load_model

unpruned_model = load_model("unpruned_model/model_best.h5")

y_ref = unpruned_model.predict(X_test, verbose=0)
y_prune = pruned_model.predict(X_test, verbose=0)

print("Accuracy unpruned: {}".format(accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_ref, axis=1))))
print("Accuracy pruned:   {}".format(accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_prune, axis=1))))
fig, ax = plt.subplots(figsize=(9, 9))
_ = plotting.make_roc(y_test, y_ref, classes)
plt.gca().set_prop_cycle(None)  # reset the colors
_ = plotting.make_roc(y_test, y_prune, classes, linestyle="--")

from matplotlib.lines import Line2D

lines = [Line2D([0], [0], ls="-"), Line2D([0], [0], ls="--")]
from matplotlib.legend import Legend

leg = Legend(ax, lines, labels=["Unpruned", "Pruned"], loc="lower right", frameon=False)
ax.add_artist(leg)
plt.show()