Advanced Configuration
!apt-get -qq install -y graphviz && pip install pydot
!pip install -U matplotlib
!pip install git+https://github.com/fastmachinelearning/hls4ml.git@main#egg=hls4ml[profiling]
!pip install qkeras==0.9.0
!pip install wget
import wget
import os.path
# Download helper modules (needed on Colab)
if not os.path.exists("callbacks.py"):
    url = "https://raw.githubusercontent.com/jmduarte/iaifi-summer-school/main/book/callbacks.py"
    callbacksFile = wget.download(url)
if not os.path.exists("plotting.py"):
    urlPlot = "https://raw.githubusercontent.com/jmduarte/iaifi-summer-school/main/book/plotting.py"
    plotFile = wget.download(urlPlot)
# FOR COLAB ONLY
if not os.path.exists("./iaifi-summer-school"):
    !git clone https://github.com/jmduarte/iaifi-summer-school.git
Cloning into 'iaifi-summer-school'...
remote: Enumerating objects: 1755, done.
remote: Counting objects: 100% (418/418), done.
remote: Compressing objects: 100% (308/308), done.
remote: Total 1755 (delta 153), reused 354 (delta 108), pack-reused 1337
Receiving objects: 100% (1755/1755), 114.49 MiB | 49.05 MiB/s, done.
Resolving deltas: 100% (912/912), done.
Advanced Configuration
Load the dataset and model (if you are restarting from this point)
from tensorflow.keras.utils import to_categorical
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder, StandardScaler
from sklearn.metrics import accuracy_score
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
import plotting
# import os
# os.environ['PATH'] = '/opt/Xilinx/Vivado/2019.2/bin:' + os.environ['PATH']
# For this tutorial we won't actually be running Vivado, so these lines are commented out,
# but if you want to run on an FPGA, simply uncomment them.
"""
#UNCOMMENT THIS IF LOCAL
X_train_val = np.load("X_train_val.npy")
X_test = np.ascontiguousarray(np.load("X_test.npy"))
y_train_val = np.load("y_train_val.npy")
y_test = np.load("y_test.npy", allow_pickle=True)
classes = np.load("classes.npy", allow_pickle=True)
from tensorflow.keras.models import load_model
model = load_model("model_1/KERAS_check_best_model.h5")
y_keras = model.predict(X_test)
"""
#IF YOU'RE ON COLAB
X_train_val = np.load("iaifi-summer-school/book/X_train_val.npy")
X_test = np.ascontiguousarray(np.load("iaifi-summer-school/book/X_test.npy"))
y_train_val = np.load("iaifi-summer-school/book/y_train_val.npy")
y_test = np.load("iaifi-summer-school/book/y_test.npy", allow_pickle=True)
classes = np.load("iaifi-summer-school/book/classes.npy", allow_pickle=True)
from tensorflow.keras.models import load_model
model = load_model("iaifi-summer-school/book/model_1/KERAS_check_best_model.h5")
y_keras = model.predict(X_test)
5188/5188 [==============================] - 5s 923us/step
Make a new hls4ml config & model
This time, we'll create a config with finer granularity. When we print the config dictionary, you'll notice that an entry is created for each named layer of the model. For the first layer, for example:
fc1:
    Precision:
        weight: ap_fixed<10,4>
        bias: ap_fixed<10,4>
        result: ap_fixed<10,4>
    ReuseFactor: 1
We will also set the default_precision smaller than we know to be adequate, just to demonstrate the effect.
import hls4ml
config = hls4ml.utils.config_from_keras_model(
    model, granularity="name", default_precision="ap_fixed<10,4>"
)
print("-----------------------------------")
plotting.print_dict(config)
print("-----------------------------------")
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: Input
Layer name: fc1, layer type: Dense
-> Activation (linear), layer name: fc1
Layer name: relu1, layer type: Activation
Layer name: fc2, layer type: Dense
-> Activation (linear), layer name: fc2
Layer name: relu2, layer type: Activation
Layer name: fc3, layer type: Dense
-> Activation (linear), layer name: fc3
Layer name: relu3, layer type: Activation
Layer name: output, layer type: Dense
-> Activation (linear), layer name: output
Layer name: softmax, layer type: Activation
-----------------------------------
Model
    Precision: ap_fixed<10,4>
    ReuseFactor: 1
    Strategy: Latency
LayerName
    fc1_input
        Precision
            result: ap_fixed<10,4>
    fc1
        Precision
            weight: ap_fixed<10,4>
            bias: ap_fixed<10,4>
            result: ap_fixed<10,4>
        ReuseFactor: 1
    fc1_linear
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    relu1
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    fc2
        Precision
            weight: ap_fixed<10,4>
            bias: ap_fixed<10,4>
            result: ap_fixed<10,4>
        ReuseFactor: 1
    fc2_linear
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    relu2
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    fc3
        Precision
            weight: ap_fixed<10,4>
            bias: ap_fixed<10,4>
            result: ap_fixed<10,4>
        ReuseFactor: 1
    fc3_linear
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    relu3
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    output
        Precision
            weight: ap_fixed<10,4>
            bias: ap_fixed<10,4>
            result: ap_fixed<10,4>
        ReuseFactor: 1
    output_linear
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        table_t: ap_fixed<18,8>
    softmax
        Precision: ap_fixed<10,4>
        ReuseFactor: 1
        table_size: 1024
        exp_table_t: ap_fixed<18,8,AP_RND,AP_SAT>
        inv_table_t: ap_fixed<18,8,AP_RND,AP_SAT>
-----------------------------------
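Because this config is just a nested Python dictionary, any of the entries above can be overridden before conversion. A minimal sketch of per-layer tuning (the width and ReuseFactor values here are arbitrary, chosen only for illustration; we work on a copy so the tutorial's config stays unchanged):

import copy

# Illustrative only: tune one layer independently of the model default.
cfg = copy.deepcopy(config)
cfg["LayerName"]["fc1"]["Precision"]["weight"] = "ap_fixed<12,4>"  # arbitrary example width
cfg["LayerName"]["fc1"]["Precision"]["bias"] = "ap_fixed<12,4>"
cfg["LayerName"]["fc1"]["ReuseFactor"] = 2  # reuse each multiplier twice: fewer DSPs, more latency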
Profiling
As you can see, hls4ml allows us to choose the precision of everything in our neural network. This is a powerful way to tune the performance, but it's also complicated. Luckily there are tools in hls4ml.model.profiling that will help choose the right precision for a given model.
The first thing we will do is numerically profile the model. This method plots the distribution of the weights (and biases) as a box-and-whisker plot. The grey boxes show the values which can be represented with the data types used in the hls_model. Generally, you need the box to overlap completely with the whisker 'to the right' (large values), otherwise you'll get saturation and wrap-around issues from exceeding the top of the fixed-point range. It can be okay for the box not to overlap completely 'to the left' (small values), but finding how small you can go is a matter of trial and error.
Providing data, in this case just the first 1000 examples for speed, will also show the same distributions captured at the output of each layer.
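For reference, the representable range of an ap_fixed&lt;W,I&gt; type (I integer bits including the sign, W-I fractional bits) can be worked out directly. A small sketch in plain Python (not an hls4ml API):

def ap_fixed_range(W, I):
    # Smallest representable step, most negative and most positive values
    step = 2.0 ** -(W - I)
    lo = -(2.0 ** (I - 1))
    hi = 2.0 ** (I - 1) - step
    return lo, hi, step

print(ap_fixed_range(10, 4))  # the default_precision above: (-8.0, 7.984375, 0.015625)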
%matplotlib inline
for layer in config["LayerName"].keys():
    config["LayerName"][layer]["Trace"] = True

hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="model_1/hls4ml_prj_2",
    part="xcu250-figd2104-2L-e",
)
hls4ml.model.profiling.numerical(model=model, hls_model=hls_model, X=X_test[:1000])
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: InputLayer, input shapes: [[None, 16]], output shape: [None, 16]
Layer name: fc1, layer type: Dense, input shapes: [[None, 16]], output shape: [None, 64]
Layer name: relu1, layer type: Activation, input shapes: [[None, 64]], output shape: [None, 64]
Layer name: fc2, layer type: Dense, input shapes: [[None, 64]], output shape: [None, 32]
Layer name: relu2, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: fc3, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: relu3, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: output, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 5]
Layer name: softmax, layer type: Softmax, input shapes: [[None, 5]], output shape: [None, 5]
Creating HLS model
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: InputLayer, input shapes: [[None, 16]], output shape: [None, 16]
Layer name: fc1, layer type: Dense, input shapes: [[None, 16]], output shape: [None, 64]
Layer name: relu1, layer type: Activation, input shapes: [[None, 64]], output shape: [None, 64]
Layer name: fc2, layer type: Dense, input shapes: [[None, 64]], output shape: [None, 32]
Layer name: relu2, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: fc3, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: relu3, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: output, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 5]
Layer name: softmax, layer type: Softmax, input shapes: [[None, 5]], output shape: [None, 5]
Creating HLS model
Profiling weights (before optimization)
Profiling weights (final / after optimization)
Profiling activations (before optimization)
fc1
32/32 [==============================] - 0s 903us/step
relu1
32/32 [==============================] - 0s 934us/step
fc2
32/32 [==============================] - 0s 942us/step
relu2
32/32 [==============================] - 0s 941us/step
fc3
32/32 [==============================] - 0s 953us/step
relu3
32/32 [==============================] - 0s 931us/step
output
32/32 [==============================] - 0s 978us/step
softmax
32/32 [==============================] - 0s 1ms/step
Profiling activations (final / after optimization)
Recompiling myproject with tracing
Writing HLS project
Done
fc1
relu1
fc2
relu2
fc3
relu3
output
softmax
(<Figure size 640x480 with 1 Axes>,
<Figure size 640x480 with 1 Axes>,
<Figure size 640x480 with 1 Axes>,
<Figure size 640x480 with 1 Axes>)
We can see that the default precision of ap_fixed<16,6> would fully cover the upper range of the outputs from each layer. This is consistent with what we saw earlier from the ROC curve, where the fixed-point model was capable of reproducing the floating-point result. However, reducing the integer or fractional precision will begin to degrade performance, and the model above has been converted with the much narrower ap_fixed<10,4>, so let’s check what that does to the accuracy.
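# Build the C simulation of the reduced-precision model and compare its
# accuracy with the floating-point Keras model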
hls_model.compile()
y_hls = hls_model.predict(X_test)
print(
"Keras Accuracy: {}".format(
accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_keras, axis=1))
)
)
print(
"hls4ml Accuracy: {}".format(
accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_hls, axis=1))
)
)
fig, ax = plt.subplots(figsize=(9, 9))
_ = plotting.makeRoc(y_test, y_keras, classes)
plt.gca().set_prop_cycle(None) # reset the colors
_ = plotting.makeRoc(y_test, y_hls, classes, linestyle="--")
from matplotlib.lines import Line2D
lines = [Line2D([0], [0], ls="-"), Line2D([0], [0], ls="--")]
from matplotlib.legend import Legend
leg = Legend(ax, lines, labels=["keras", "hls4ml"], loc="lower right", frameon=False)
ax.add_artist(leg)
Writing HLS project
Done
Keras Accuracy: 0.7516506024096385
hls4ml Accuracy: 0.5122831325301205
<matplotlib.legend.Legend at 0x7fe82b00ad50>
Not good at all! Let’s see if we can figure out how to create a model that will work at these lower precisions.
The first thing we can try is adding some regularizers. This will penalize the model for using large weights, which can help to reduce the number of bits that are necessary.
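Concretely, the l1(0.0001) regularizer attached to each Dense layer below adds 0.0001 * sum(|w|) over that layer’s kernel to the training loss; a quick sketch (illustration only, with made-up weights):
import numpy as np

# Hedged illustration of the per-layer L1 penalty added by l1(0.0001)
l1_lambda = 0.0001
kernel = np.array([0.53, -0.02, 0.0, 1.31])   # made-up weights
penalty = l1_lambda * np.abs(kernel).sum()    # added to the cross-entropy loss
print(penalty)  # ~0.000186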
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation, BatchNormalization
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.regularizers import l1
from callbacks import all_callbacks
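# Same architecture as before, but every Dense layer now carries an L1
# kernel regularizer to keep the weights small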
model = Sequential()
model.add(
Dense(
64,
input_shape=(16,),
name="fc1",
kernel_initializer="lecun_uniform",
kernel_regularizer=l1(0.0001),
)
)
model.add(Activation(activation="relu", name="relu1"))
model.add(
Dense(
32,
name="fc2",
kernel_initializer="lecun_uniform",
kernel_regularizer=l1(0.0001),
)
)
model.add(Activation(activation="relu", name="relu2"))
model.add(
Dense(
32,
name="fc3",
kernel_initializer="lecun_uniform",
kernel_regularizer=l1(0.0001),
)
)
model.add(Activation(activation="relu", name="relu3"))
model.add(
Dense(
5,
name="output",
kernel_initializer="lecun_uniform",
kernel_regularizer=l1(0.0001),
)
)
model.add(Activation(activation="softmax", name="softmax"))
train = True
if train:
adam = Adam(lr=0.0001)
model.compile(
optimizer=adam, loss=["categorical_crossentropy"], metrics=["accuracy"]
)
callbacks = all_callbacks(
stop_patience=1000,
lr_factor=0.5,
lr_patience=10,
lr_epsilon=0.000001,
lr_cooldown=2,
lr_minimum=0.0000001,
outputDir="model_2",
)
model.fit(
X_train_val,
y_train_val,
batch_size=1024,
epochs=30,
validation_split=0.25,
shuffle=True,
callbacks=callbacks.callbacks,
)
else:
from tensorflow.keras.models import load_model
model = load_model("model_2/KERAS_check_best_model.h5")
WARNING:tensorflow:`epsilon` argument is deprecated and will be removed, use `min_delta` instead.
WARNING:tensorflow:`period` argument is deprecated. Please use `save_freq` to specify the frequency in number of batches seen.
Epoch 1/30
/opt/hostedtoolcache/Python/3.7.15/x64/lib/python3.7/site-packages/keras/optimizers/optimizer_v2/adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
super().__init__(name, **kwargs)
WARNING:tensorflow:Callback method `on_train_batch_end` is slow compared to the batch time (batch time: 0.0025s vs `on_train_batch_end` time: 0.0028s). Check your callbacks.
***callbacks***
saving losses to model_2/losses.log
Epoch 1: val_loss improved from inf to 1.14627, saving model to model_2/KERAS_check_best_model.h5
Epoch 1: val_loss improved from inf to 1.14627, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 1: saving model to model_2/KERAS_check_model_last.h5
Epoch 1: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 3ms/step - loss: 1.3320 - accuracy: 0.4989 - val_loss: 1.1463 - val_accuracy: 0.6224 - lr: 1.0000e-04
Epoch 2/30
***callbacks***
saving losses to model_2/losses.log
Epoch 2: val_loss improved from 1.14627 to 1.01507, saving model to model_2/KERAS_check_best_model.h5
Epoch 2: val_loss improved from 1.14627 to 1.01507, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 2: saving model to model_2/KERAS_check_model_last.h5
Epoch 2: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 1.0694 - accuracy: 0.6598 - val_loss: 1.0151 - val_accuracy: 0.6797 - lr: 1.0000e-04
Epoch 3/30
***callbacks***
saving losses to model_2/losses.log
Epoch 3: val_loss improved from 1.01507 to 0.95476, saving model to model_2/KERAS_check_best_model.h5
Epoch 3: val_loss improved from 1.01507 to 0.95476, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 3: saving model to model_2/KERAS_check_model_last.h5
Epoch 3: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.9792 - accuracy: 0.6909 - val_loss: 0.9548 - val_accuracy: 0.6995 - lr: 1.0000e-04
Epoch 4/30
***callbacks***
saving losses to model_2/losses.log
Epoch 4: val_loss improved from 0.95476 to 0.91388, saving model to model_2/KERAS_check_best_model.h5
Epoch 4: val_loss improved from 0.95476 to 0.91388, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 4: saving model to model_2/KERAS_check_model_last.h5
Epoch 4: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.9294 - accuracy: 0.7078 - val_loss: 0.9139 - val_accuracy: 0.7124 - lr: 1.0000e-04
Epoch 5/30
***callbacks***
saving losses to model_2/losses.log
Epoch 5: val_loss improved from 0.91388 to 0.88526, saving model to model_2/KERAS_check_best_model.h5
Epoch 5: val_loss improved from 0.91388 to 0.88526, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 5: saving model to model_2/KERAS_check_model_last.h5
Epoch 5: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8947 - accuracy: 0.7174 - val_loss: 0.8853 - val_accuracy: 0.7198 - lr: 1.0000e-04
Epoch 6/30
***callbacks***
saving losses to model_2/losses.log
Epoch 6: val_loss improved from 0.88526 to 0.86540, saving model to model_2/KERAS_check_best_model.h5
Epoch 6: val_loss improved from 0.88526 to 0.86540, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 6: saving model to model_2/KERAS_check_model_last.h5
Epoch 6: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8705 - accuracy: 0.7225 - val_loss: 0.8654 - val_accuracy: 0.7244 - lr: 1.0000e-04
Epoch 7/30
***callbacks***
saving losses to model_2/losses.log
Epoch 7: val_loss improved from 0.86540 to 0.85146, saving model to model_2/KERAS_check_best_model.h5
Epoch 7: val_loss improved from 0.86540 to 0.85146, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 7: saving model to model_2/KERAS_check_model_last.h5
Epoch 7: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8535 - accuracy: 0.7264 - val_loss: 0.8515 - val_accuracy: 0.7268 - lr: 1.0000e-04
Epoch 8/30
***callbacks***
saving losses to model_2/losses.log
Epoch 8: val_loss improved from 0.85146 to 0.84035, saving model to model_2/KERAS_check_best_model.h5
Epoch 8: val_loss improved from 0.85146 to 0.84035, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 8: saving model to model_2/KERAS_check_model_last.h5
Epoch 8: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8407 - accuracy: 0.7293 - val_loss: 0.8404 - val_accuracy: 0.7300 - lr: 1.0000e-04
Epoch 9/30
***callbacks***
saving losses to model_2/losses.log
Epoch 9: val_loss improved from 0.84035 to 0.83038, saving model to model_2/KERAS_check_best_model.h5
Epoch 9: val_loss improved from 0.84035 to 0.83038, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 9: saving model to model_2/KERAS_check_model_last.h5
Epoch 9: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8303 - accuracy: 0.7317 - val_loss: 0.8304 - val_accuracy: 0.7317 - lr: 1.0000e-04
Epoch 10/30
***callbacks***
saving losses to model_2/losses.log
Epoch 10: val_loss improved from 0.83038 to 0.82210, saving model to model_2/KERAS_check_best_model.h5
Epoch 10: val_loss improved from 0.83038 to 0.82210, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 10: saving model to model_2/KERAS_check_model_last.h5
Epoch 10: saving model to model_2/KERAS_check_model_last_weights.h5
Epoch 10: saving model to model_2/KERAS_check_model_epoch10.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8212 - accuracy: 0.7340 - val_loss: 0.8221 - val_accuracy: 0.7342 - lr: 1.0000e-04
Epoch 11/30
***callbacks***
saving losses to model_2/losses.log
Epoch 11: val_loss improved from 0.82210 to 0.81447, saving model to model_2/KERAS_check_best_model.h5
Epoch 11: val_loss improved from 0.82210 to 0.81447, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 11: saving model to model_2/KERAS_check_model_last.h5
Epoch 11: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 3ms/step - loss: 0.8130 - accuracy: 0.7363 - val_loss: 0.8145 - val_accuracy: 0.7361 - lr: 1.0000e-04
Epoch 12/30
***callbacks***
saving losses to model_2/losses.log
Epoch 12: val_loss improved from 0.81447 to 0.80775, saving model to model_2/KERAS_check_best_model.h5
Epoch 12: val_loss improved from 0.81447 to 0.80775, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 12: saving model to model_2/KERAS_check_model_last.h5
Epoch 12: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.8057 - accuracy: 0.7381 - val_loss: 0.8078 - val_accuracy: 0.7385 - lr: 1.0000e-04
Epoch 13/30
***callbacks***
saving losses to model_2/losses.log
Epoch 13: val_loss improved from 0.80775 to 0.80149, saving model to model_2/KERAS_check_best_model.h5
Epoch 13: val_loss improved from 0.80775 to 0.80149, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 13: saving model to model_2/KERAS_check_model_last.h5
Epoch 13: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 3ms/step - loss: 0.7992 - accuracy: 0.7400 - val_loss: 0.8015 - val_accuracy: 0.7400 - lr: 1.0000e-04
Epoch 14/30
***callbacks***
saving losses to model_2/losses.log
Epoch 14: val_loss improved from 0.80149 to 0.79622, saving model to model_2/KERAS_check_best_model.h5
Epoch 14: val_loss improved from 0.80149 to 0.79622, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 14: saving model to model_2/KERAS_check_model_last.h5
Epoch 14: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.7935 - accuracy: 0.7416 - val_loss: 0.7962 - val_accuracy: 0.7416 - lr: 1.0000e-04
Epoch 15/30
***callbacks***
saving losses to model_2/losses.log
Epoch 15: val_loss improved from 0.79622 to 0.79153, saving model to model_2/KERAS_check_best_model.h5
Epoch 15: val_loss improved from 0.79622 to 0.79153, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 15: saving model to model_2/KERAS_check_model_last.h5
Epoch 15: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.7885 - accuracy: 0.7430 - val_loss: 0.7915 - val_accuracy: 0.7427 - lr: 1.0000e-04
Epoch 16/30
***callbacks***
saving losses to model_2/losses.log
Epoch 16: val_loss improved from 0.79153 to 0.78753, saving model to model_2/KERAS_check_best_model.h5
Epoch 16: val_loss improved from 0.79153 to 0.78753, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 16: saving model to model_2/KERAS_check_model_last.h5
Epoch 16: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 3ms/step - loss: 0.7843 - accuracy: 0.7442 - val_loss: 0.7875 - val_accuracy: 0.7437 - lr: 1.0000e-04
Epoch 17/30
***callbacks***
saving losses to model_2/losses.log
Epoch 17: val_loss improved from 0.78753 to 0.78392, saving model to model_2/KERAS_check_best_model.h5
Epoch 17: val_loss improved from 0.78753 to 0.78392, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 17: saving model to model_2/KERAS_check_model_last.h5
Epoch 17: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.7806 - accuracy: 0.7453 - val_loss: 0.7839 - val_accuracy: 0.7448 - lr: 1.0000e-04
Epoch 18/30
***callbacks***
saving losses to model_2/losses.log
Epoch 18: val_loss improved from 0.78392 to 0.78110, saving model to model_2/KERAS_check_best_model.h5
Epoch 18: val_loss improved from 0.78392 to 0.78110, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 18: saving model to model_2/KERAS_check_model_last.h5
Epoch 18: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.7773 - accuracy: 0.7463 - val_loss: 0.7811 - val_accuracy: 0.7453 - lr: 1.0000e-04
Epoch 19/30
***callbacks***
saving losses to model_2/losses.log
Epoch 19: val_loss improved from 0.78110 to 0.77818, saving model to model_2/KERAS_check_best_model.h5
Epoch 19: val_loss improved from 0.78110 to 0.77818, saving model to model_2/KERAS_check_best_model_weights.h5
Epoch 19: saving model to model_2/KERAS_check_model_last.h5
Epoch 19: saving model to model_2/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 1s 3ms/step - loss: 0.7743 - accuracy: 0.7471 - val_loss: 0.7782 - val_accuracy: 0.7459 - lr: 1.0000e-04
Epoch 20/30
307/487 [=================>............] - ETA: 0s - loss: 0.7729 - accuracy: 0.7473
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
/tmp/ipykernel_5613/3039212493.py in <module>
     21     validation_split=0.25,
     22     shuffle=True,
---> 23     callbacks=callbacks.callbacks,
     24 )
KeyboardInterrupt:
Again, we will set the default precision to ap_fixed<10,4>.
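The config dictionary below is the per-layer one built earlier in the notebook. If you are re-running from this point, a minimal sketch to recreate it (hedged: default_precision is a keyword of config_from_keras_model in the hls4ml version installed above) would be:

# sketch: rebuild a per-layer config with the lower default precision
config = hls4ml.utils.config_from_keras_model(
    model, granularity="name", default_precision="ap_fixed<10,4>"
)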
# enable tracing on every layer so we can capture its output later
for layer in config["LayerName"].keys():
    config["LayerName"][layer]["Trace"] = True
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="model_2/hls4ml_prj_1",
    part="xcu250-figd2104-2L-e",
)
hls4ml.model.profiling.numerical(model=model, hls_model=hls_model, X=X_test[:1000])
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: InputLayer, input shapes: [[None, 16]], output shape: [None, 16]
Layer name: fc1, layer type: Dense, input shapes: [[None, 16]], output shape: [None, 64]
Layer name: relu1, layer type: Activation, input shapes: [[None, 64]], output shape: [None, 64]
Layer name: fc2, layer type: Dense, input shapes: [[None, 64]], output shape: [None, 32]
Layer name: relu2, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: fc3, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: relu3, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: output, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 5]
Layer name: softmax, layer type: Softmax, input shapes: [[None, 5]], output shape: [None, 5]
Creating HLS model
Profiling weights (before optimization)
Profiling weights (final / after optimization)
Profiling activations (before optimization)
fc1
32/32 [==============================] - 0s 595us/step
relu1
32/32 [==============================] - 0s 664us/step
fc2
32/32 [==============================] - 0s 760us/step
relu2
32/32 [==============================] - 0s 924us/step
fc3
32/32 [==============================] - 0s 743us/step
relu3
32/32 [==============================] - 0s 676us/step
output
32/32 [==============================] - 0s 684us/step
softmax
32/32 [==============================] - 0s 794us/step
Profiling activations (final / after optimization)
Recompiling myproject with tracing
Writing HLS project
Done
fc1
relu1
fc2
relu2
fc3
relu3
output
softmax
(<Figure size 432x288 with 1 Axes>,
<Figure size 432x288 with 1 Axes>,
<Figure size 432x288 with 1 Axes>,
<Figure size 432x288 with 1 Axes>)
You can see the difference in the weight profile plots between this model and the previous one quite clearly. Whereas before the smallest weight in the first layer was approximately \(10^{-14}\), now it's almost \(10^{-24}\)! However, this hasn't markedly improved the upper bound of the post-activation distributions, so we will need to try something else.
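If you prefer numbers to plots, the same weight ranges can be read directly from the trained Keras model. A short sketch, assuming only numpy and the model object used above:

import numpy as np

# print the smallest and largest non-zero weight magnitude per layer;
# activation layers carry no weights and are skipped
for layer in model.layers:
    weights = layer.get_weights()
    if not weights:
        continue
    w = np.abs(np.concatenate([x.flatten() for x in weights]))
    w = w[w > 0]
    print(f"{layer.name}: min |w| = {w.min():.3g}, max |w| = {w.max():.3g}")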
Trace#
Another thing we can try is to use different precisions in different layers. In this case, it seems that the third layer is the one with the largest output, so perhaps we can increase only that layer's precision and leave the others as they are. Recall that ap_fixed<12,6> means 12 bits in total, 6 of which (including the sign bit) sit above the binary point, leaving 6 fractional bits.
config["LayerName"]["fc3"]["Precision"]["weight"] = "ap_fixed<12,6>"
hls_model = hls4ml.converters.convert_from_keras_model(
model,
hls_config=config,
output_dir="model_2/hls4ml_prj_2",
part="xcu250-figd2104-2L-e",
)
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: InputLayer, input shapes: [[None, 16]], output shape: [None, 16]
Layer name: fc1, layer type: Dense, input shapes: [[None, 16]], output shape: [None, 64]
Layer name: relu1, layer type: Activation, input shapes: [[None, 64]], output shape: [None, 64]
Layer name: fc2, layer type: Dense, input shapes: [[None, 64]], output shape: [None, 32]
Layer name: relu2, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: fc3, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: relu3, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: output, layer type: Dense, input shapes: [[None, 32]], output shape: [None, 5]
Layer name: softmax, layer type: Softmax, input shapes: [[None, 5]], output shape: [None, 5]
Creating HLS model
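The per-layer Precision entry is itself a dictionary, so the same override works for the other quantities the layer computes. For example (a sketch; the "result" key name is assumed from this hls4ml version's name-granularity config), you could also widen the accumulated output of fc3:

# sketch: widen the layer's result (accumulator output) precision as well
config["LayerName"]["fc3"]["Precision"]["result"] = "ap_fixed<12,6>"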
Now let's check how this model performs. We are also going to enable functionality that extracts the intermediate network values from each layer, for both the hls4ml model and the Keras model.
hls_model.compile()
# capture every traced layer's output on the first 1000 test samples,
# once for the hls4ml model and once for the original Keras model
hls4ml_pred, hls4ml_trace = hls_model.trace(X_test[:1000])
keras_trace = hls4ml.model.profiling.get_ymodel_keras(model, X_test[:1000])
y_hls = hls_model.predict(X_test)
Writing HLS project
Done
Recompiling myproject with tracing
Writing HLS project
Done
Processing fc1 in Keras model...
32/32 [==============================] - 0s 940us/step
32/32 [==============================] - 0s 660us/step
Processing relu1 in Keras model...
32/32 [==============================] - 0s 703us/step
Processing fc2 in Keras model...
32/32 [==============================] - 0s 3ms/step
32/32 [==============================] - 0s 814us/step
Processing relu2 in Keras model...
32/32 [==============================] - 0s 614us/step
Processing fc3 in Keras model...
32/32 [==============================] - 0s 637us/step
32/32 [==============================] - 0s 768us/step
Processing relu3 in Keras model...
32/32 [==============================] - 0s 560us/step
Processing output in Keras model...
32/32 [==============================] - 0s 601us/step
32/32 [==============================] - 0s 610us/step
Processing softmax in Keras model...
32/32 [==============================] - 0s 801us/step
Done taking outputs for Keras model.
Inspect#
Now we can print out, make plots of, or run any other more detailed analysis on the output of each layer to understand the performance we see. Let's print the output of that third layer for the first sample, for both the Keras and hls4ml models, and also make a plot of the mean difference per sample.
print("Keras layer 'fc3', first sample:")
print(keras_trace["fc3"][0])
print("hls4ml layer 'fc3', first sample:")
print(hls4ml_trace["fc3"][0])
print("layer fc3 diff, first sample:")
print(hls4ml_trace["fc3"][0] - keras_trace["fc3"][0])
Keras layer 'fc3', first sample:
[ 0.88930064 -3.8859491 1.5530485 1.7167063 -0.96600306 0.2857443
0.21472323 0.30293974 -0.9408086 0.9514654 1.6882076 2.0520456
1.8682225 0.01729417 1.457709 0.5599562 0.3849154 1.9730089
0.11493069 2.945932 1.6105483 4.3975835 -1.0648595 -0.04774496
-0.79464626 -1.3470746 -2.6663346 -0.90749 -0.02055568 -0.9426051
0.61609864 2.0077007 ]
hls4ml layer 'fc3', first sample:
[ 0.359375 -2.859375 0.875 0.828125 -0.78125 -0.09375 0.09375
-0.28125 -1.109375 0.390625 0.625 1.078125 0.796875 -0.125
0.703125 0.0625 0.171875 0.90625 -0.328125 1.5 0.921875
2.390625 -1.03125 -0.1875 -0.796875 -1.328125 -2.078125 -0.765625
-0.15625 -0.875 0.265625 0.9375 ]
layer fc3 diff, first sample:
[-0.52992564 1.02657413 -0.67804849 -0.88858128 0.18475306 -0.37949431
-0.12097323 -0.58418974 -0.16856641 -0.56084043 -1.06320763 -0.97392058
-1.07134748 -0.14229417 -0.75458395 -0.49745619 -0.21304041 -1.06675887
-0.44305569 -1.44593191 -0.68867326 -2.00695848 0.03360951 -0.13975504
-0.00222874 0.01894963 0.58820963 0.14186502 -0.13569432 0.06760508
-0.35047364 -1.07020068]
plt.hist(
    np.mean(hls4ml_trace["fc3"] - keras_trace["fc3"], axis=-1),
    bins=np.linspace(-1.0, 1.0, 51),
    density=True,
)
plt.xlabel("mean difference (hls4ml - keras)")
plt.show()
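Since the trace dictionaries cover every layer, the same comparison can be summarized for the whole network at once. A sketch using only the objects created above (iterating over the layer names the two dictionaries share):

# per-layer summary of the hls4ml-vs-Keras disagreement
for name in sorted(set(keras_trace) & set(hls4ml_trace)):
    diff = np.abs(hls4ml_trace[name] - keras_trace[name])
    print(f"{name}: mean |diff| = {diff.mean():.4f}, max |diff| = {diff.max():.4f}")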
Compare#
It’s not looking great. Let’s check the accuracy and ROC curve.
print(
    "Keras Accuracy: {}".format(
        accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_keras, axis=1))
    )
)
print(
    "hls4ml Accuracy: {}".format(
        accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_hls, axis=1))
    )
)
fig, ax = plt.subplots(figsize=(9, 9))
_ = plotting.makeRoc(y_test, y_keras, classes)
plt.gca().set_prop_cycle(None)  # reset the colors so the dashed curves match the solid ones
_ = plotting.makeRoc(y_test, y_hls, classes, linestyle="--")

from matplotlib.lines import Line2D
from matplotlib.legend import Legend

lines = [Line2D([0], [0], ls="-"), Line2D([0], [0], ls="--")]
leg = Legend(ax, lines, labels=["keras", "hls4ml"], loc="lower right", frameon=False)
ax.add_artist(leg)
Keras Accuracy: 0.7516506024096385
hls4ml Accuracy: 0.633421686746988
<matplotlib.legend.Legend at 0x14fb4e5b0>
Improving#
Better, but still not great, especially depending on which class we look at. In principle we could try this for other layers, but eventually we may find we are just back to a larger model. Let’s look at some other methods for reducing the size of the network.