Quantization
!apt-get -qq install -y graphviz && pip install pydot
!pip install -U matplotlib
!pip install git+https://github.com/fastmachinelearning/hls4ml.git@main#egg=hls4ml[profiling]
!pip install qkeras==0.9.0
COLAB = True
!pip install wget
import wget
import os.path
# WGET for colab
if not os.path.exists("callbacks.py"):
url = "https://raw.githubusercontent.com/jmduarte/iaifi-summer-school/main/book/callbacks.py"
callbacksFile = wget.download(url)
if not os.path.exists("plotting.py"):
urlPlot = "https://raw.githubusercontent.com/jmduarte/iaifi-summer-school/main/book/plotting.py"
plotFile = wget.download(urlPlot)
if COLAB:
    if not os.path.exists("./iaifi-summer-school"):
        !git clone https://github.com/jmduarte/iaifi-summer-school.git
from tensorflow.keras.utils import to_categorical
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder, StandardScaler
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
seed = 0
np.random.seed(seed)
import tensorflow as tf
tf.random.set_seed(seed)
# import os
# os.environ['PATH'] = '/opt/Xilinx/Vivado/2019.2/bin:' + os.environ['PATH']
# For this tutorial we won't actually be running Vivado, so these lines are commented out,
# but if you want to look into actually running on an FPGA, simply uncomment them.
if not COLAB:
    # FOR LOCAL PULL
    X_train_val = np.load("X_train_val.npy")
    X_test = np.load("X_test.npy")
    y_train_val = np.load("y_train_val.npy")
    y_test = np.load("y_test.npy")
    classes = np.load("classes.npy", allow_pickle=True)

if COLAB:
    # FOR COLAB
    X_train_val = np.load("iaifi-summer-school/book/X_train_val.npy")
    X_test = np.load("iaifi-summer-school/book/X_test.npy")
    y_train_val = np.load("iaifi-summer-school/book/y_train_val.npy")
    y_test = np.load("iaifi-summer-school/book/y_test.npy")
    classes = np.load("iaifi-summer-school/book/classes.npy", allow_pickle=True)
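As a quick sanity check (a minimal sketch, not part of the original notebook), confirm the loaded arrays have the expected shapes: 16 input features per jet and 5 one-hot encoded classes, matching the model we build below.

# Illustrative sanity check of the loaded dataset
print("X_train_val:", X_train_val.shape)  # expect (n_samples, 16)
print("y_train_val:", y_train_val.shape)  # expect (n_samples, 5), one-hot
print("classes:", classes)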
Construct a new model
This time we’re going to use QKeras layers. QKeras (“Quantized Keras”), maintained by Google, implements deep heterogeneous quantization of ML models: https://github.com/google/qkeras
The convenient thing here is that the QKeras layers can be used as drop-in replacements for the corresponding Keras layers.
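As a minimal sketch of what “drop-in” means (illustrative only; the layer names here are hypothetical), a `QDense` layer takes the same constructor arguments as `Dense`, plus quantizers for the kernel and bias:

from tensorflow.keras.layers import Dense
from qkeras.qlayers import QDense
from qkeras.quantizers import quantized_bits

# The float layer and its quantized counterpart differ only by the quantizers.
float_fc = Dense(64, kernel_initializer="lecun_uniform", name="fc_float")
quant_fc = QDense(
    64,
    kernel_quantizer=quantized_bits(6, 0, alpha=1),
    bias_quantizer=quantized_bits(6, 0, alpha=1),
    kernel_initializer="lecun_uniform",
    name="fc_quant",
)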
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.regularizers import l1
from callbacks import all_callbacks
from tensorflow.keras.layers import Activation
from qkeras.qlayers import QDense, QActivation
from qkeras.quantizers import quantized_bits, quantized_relu
We’re using `QDense` layers instead of `Dense`, and `QActivation` instead of `Activation`. We’re also specifying `kernel_quantizer=quantized_bits(6, 0, ...)`: this will use 6 bits (of which 0 are integer bits) for the weights. We use the same quantization for the biases, and `quantized_relu(6)` for 6-bit ReLU activations.
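To see what these quantizers actually do to values, here is a small sketch (assuming TensorFlow 2 eager execution; not part of the original notebook):

import numpy as np
from qkeras.quantizers import quantized_bits, quantized_relu

q = quantized_bits(6, 0, alpha=1)  # 6 bits total, 0 integer bits, signed
x = np.array([-1.0, -0.26, 0.0, 0.1234, 0.5, 0.99], dtype=np.float32)
print(q(x).numpy())  # values snap to a fixed-point grid with step 2**-5 = 0.03125

r = quantized_relu(6)  # 6-bit unsigned ReLU quantizer
print(r(x).numpy())  # negatives map to 0; positives snap to the quantized grid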
model = Sequential()
model.add(
    QDense(
        64,
        input_shape=(16,),
        name="fc1",
        kernel_quantizer=quantized_bits(6, 0, alpha=1),
        bias_quantizer=quantized_bits(6, 0, alpha=1),
        kernel_initializer="lecun_uniform",
        kernel_regularizer=l1(0.0001),
    )
)
model.add(QActivation(activation=quantized_relu(6), name="relu1"))
model.add(
    QDense(
        32,
        name="fc2",
        kernel_quantizer=quantized_bits(6, 0, alpha=1),
        bias_quantizer=quantized_bits(6, 0, alpha=1),
        kernel_initializer="lecun_uniform",
        kernel_regularizer=l1(0.0001),
    )
)
model.add(QActivation(activation=quantized_relu(6), name="relu2"))
model.add(
    QDense(
        32,
        name="fc3",
        kernel_quantizer=quantized_bits(6, 0, alpha=1),
        bias_quantizer=quantized_bits(6, 0, alpha=1),
        kernel_initializer="lecun_uniform",
        kernel_regularizer=l1(0.0001),
    )
)
model.add(QActivation(activation=quantized_relu(6), name="relu3"))
model.add(
    QDense(
        5,
        name="output",
        kernel_quantizer=quantized_bits(6, 0, alpha=1),
        bias_quantizer=quantized_bits(6, 0, alpha=1),
        kernel_initializer="lecun_uniform",
        kernel_regularizer=l1(0.0001),
    )
)
model.add(Activation(activation="softmax", name="softmax"))
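Optionally (not in the original flow), print the architecture to verify the layer types and parameter counts before pruning is applied:

# Inspect the quantized architecture; layer names will gain pruning wrappers below.
model.summary()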
Train sparse
Let’s train with model sparsity again, since QKeras layers are also prunable.
from tensorflow_model_optimization.python.core.sparsity.keras import (
    prune,
    pruning_callbacks,
    pruning_schedule,
)
from tensorflow_model_optimization.sparsity.keras import strip_pruning

pruning_params = {
    "pruning_schedule": pruning_schedule.ConstantSparsity(
        0.75, begin_step=2000, frequency=100
    )
}
model = prune.prune_low_magnitude(model, **pruning_params)
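`ConstantSparsity(0.75, begin_step=2000, frequency=100)` targets 75% sparsity: starting at optimizer step 2000, the pruning callback updates the masks every 100 steps, zeroing the smallest-magnitude weights. After wrapping, each layer sits inside a pruning wrapper, which you can verify (a sketch; the printed names are illustrative):

# prune_low_magnitude wraps every prunable layer and tracks a binary mask.
print([layer.name for layer in model.layers])
# e.g. ['prune_low_magnitude_fc1', 'prune_low_magnitude_relu1', ...]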
Train the model
We’ll use the same settings as the original model: Adam optimizer with categorical crossentropy loss.
The callbacks will decay the learning rate and save the model into the directory `model_5`.
The model isn’t very complex, so this should take just a few minutes even on the CPU.
If you’ve restarted the notebook kernel after training once, set `train = False` to load the trained model rather than training again.
train = True

if train:
    adam = Adam(learning_rate=0.0001)  # `lr` is deprecated in favor of `learning_rate`
    model.compile(
        optimizer=adam, loss=["categorical_crossentropy"], metrics=["accuracy"]
    )
    callbacks = all_callbacks(
        stop_patience=1000,
        lr_factor=0.5,
        lr_patience=10,
        lr_epsilon=0.000001,
        lr_cooldown=2,
        lr_minimum=0.0000001,
        outputDir="model_5",
    )
    callbacks.callbacks.append(pruning_callbacks.UpdatePruningStep())
    model.fit(
        X_train_val,
        y_train_val,
        batch_size=1024,
        epochs=30,
        validation_split=0.25,
        shuffle=True,
        callbacks=callbacks.callbacks,
    )
    # Save the model again but with the pruning 'stripped' to use the regular layer types
    model = strip_pruning(model)
    model.save("model_5/KERAS_check_best_model.h5")
else:
    from tensorflow.keras.models import load_model
    from qkeras.utils import _add_supported_quantized_objects

    # QKeras layers are custom objects, so they must be registered when loading
    co = {}
    _add_supported_quantized_objects(co)
    model = load_model("model_5/KERAS_check_best_model.h5", custom_objects=co)
WARNING:tensorflow:`epsilon` argument is deprecated and will be removed, use `min_delta` instead.
WARNING:tensorflow:`period` argument is deprecated. Please use `save_freq` to specify the frequency in number of batches seen.
Epoch 1/30
/opt/hostedtoolcache/Python/3.7.15/x64/lib/python3.7/site-packages/keras/optimizers/optimizer_v2/adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
super().__init__(name, **kwargs)
1/487 [..............................] - ETA: 28:41 - loss: 1.6823 - accuracy: 0.1895
WARNING:tensorflow:Callback method `on_train_batch_end` is slow compared to the batch time (batch time: 0.0044s vs `on_train_batch_end` time: 0.0120s). Check your callbacks.
12/487 [..............................] - ETA: 2s - loss: 1.6677 - accuracy: 0.2107
25/487 [>.............................] - ETA: 1s - loss: 1.6547 - accuracy: 0.2312
38/487 [=>............................] - ETA: 1s - loss: 1.6419 - accuracy: 0.2493
52/487 [==>...........................] - ETA: 1s - loss: 1.6315 - accuracy: 0.2663
65/487 [===>..........................] - ETA: 1s - loss: 1.6218 - accuracy: 0.2861
78/487 [===>..........................] - ETA: 1s - loss: 1.6132 - accuracy: 0.3048
91/487 [====>.........................] - ETA: 1s - loss: 1.6045 - accuracy: 0.3214
104/487 [=====>........................] - ETA: 1s - loss: 1.5953 - accuracy: 0.3369
117/487 [======>.......................] - ETA: 1s - loss: 1.5866 - accuracy: 0.3504
130/487 [=======>......................] - ETA: 1s - loss: 1.5779 - accuracy: 0.3628
143/487 [=======>......................] - ETA: 1s - loss: 1.5696 - accuracy: 0.3738
156/487 [========>.....................] - ETA: 1s - loss: 1.5611 - accuracy: 0.3839
169/487 [=========>....................] - ETA: 1s - loss: 1.5531 - accuracy: 0.3922
182/487 [==========>...................] - ETA: 1s - loss: 1.5450 - accuracy: 0.3999
195/487 [===========>..................] - ETA: 1s - loss: 1.5366 - accuracy: 0.4081
208/487 [===========>..................] - ETA: 1s - loss: 1.5284 - accuracy: 0.4156
221/487 [============>.................] - ETA: 1s - loss: 1.5206 - accuracy: 0.4224
234/487 [=============>................] - ETA: 1s - loss: 1.5129 - accuracy: 0.4288
247/487 [==============>...............] - ETA: 0s - loss: 1.5053 - accuracy: 0.4347
260/487 [===============>..............] - ETA: 0s - loss: 1.4982 - accuracy: 0.4400
273/487 [===============>..............] - ETA: 0s - loss: 1.4905 - accuracy: 0.4458
286/487 [================>.............] - ETA: 0s - loss: 1.4835 - accuracy: 0.4505
299/487 [=================>............] - ETA: 0s - loss: 1.4761 - accuracy: 0.4553
312/487 [==================>...........] - ETA: 0s - loss: 1.4691 - accuracy: 0.4596
325/487 [===================>..........] - ETA: 0s - loss: 1.4620 - accuracy: 0.4638
338/487 [===================>..........] - ETA: 0s - loss: 1.4552 - accuracy: 0.4674
351/487 [====================>.........] - ETA: 0s - loss: 1.4488 - accuracy: 0.4710
364/487 [=====================>........] - ETA: 0s - loss: 1.4423 - accuracy: 0.4744
377/487 [======================>.......] - ETA: 0s - loss: 1.4359 - accuracy: 0.4782
390/487 [=======================>......] - ETA: 0s - loss: 1.4297 - accuracy: 0.4817
403/487 [=======================>......] - ETA: 0s - loss: 1.4235 - accuracy: 0.4853
416/487 [========================>.....] - ETA: 0s - loss: 1.4174 - accuracy: 0.4891
429/487 [=========================>....] - ETA: 0s - loss: 1.4114 - accuracy: 0.4924
442/487 [==========================>...] - ETA: 0s - loss: 1.4057 - accuracy: 0.4958
454/487 [==========================>...] - ETA: 0s - loss: 1.4003 - accuracy: 0.4990
466/487 [===========================>..] - ETA: 0s - loss: 1.3955 - accuracy: 0.5018
478/487 [============================>.] - ETA: 0s - loss: 1.3905 - accuracy: 0.5046
***callbacks***
saving losses to model_5/losses.log
Epoch 1: val_loss improved from inf to 1.19194, saving model to model_5/KERAS_check_best_model.h5
Epoch 1: val_loss improved from inf to 1.19194, saving model to model_5/KERAS_check_best_model_weights.h5
Epoch 1: saving model to model_5/KERAS_check_model_last.h5
Epoch 1: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 7s 6ms/step - loss: 1.3871 - accuracy: 0.5065 - val_loss: 1.1919 - val_accuracy: 0.6190 - lr: 1.0000e-04
Epoch 2/30
1/487 [..............................] - ETA: 3s - loss: 1.1709 - accuracy: 0.6367
14/487 [..............................] - ETA: 1s - loss: 1.1848 - accuracy: 0.6272
27/487 [>.............................] - ETA: 1s - loss: 1.1822 - accuracy: 0.6260
40/487 [=>............................] - ETA: 1s - loss: 1.1818 - accuracy: 0.6232
53/487 [==>...........................] - ETA: 1s - loss: 1.1795 - accuracy: 0.6225
66/487 [===>..........................] - ETA: 1s - loss: 1.1746 - accuracy: 0.6248
79/487 [===>..........................] - ETA: 1s - loss: 1.1705 - accuracy: 0.6275
92/487 [====>.........................] - ETA: 1s - loss: 1.1679 - accuracy: 0.6297
105/487 [=====>........................] - ETA: 1s - loss: 1.1649 - accuracy: 0.6319
117/487 [======>.......................] - ETA: 1s - loss: 1.1622 - accuracy: 0.6339
131/487 [=======>......................] - ETA: 1s - loss: 1.1590 - accuracy: 0.6357
144/487 [=======>......................] - ETA: 1s - loss: 1.1552 - accuracy: 0.6380
157/487 [========>.....................] - ETA: 1s - loss: 1.1527 - accuracy: 0.6392
170/487 [=========>....................] - ETA: 1s - loss: 1.1503 - accuracy: 0.6407
183/487 [==========>...................] - ETA: 1s - loss: 1.1476 - accuracy: 0.6419
196/487 [===========>..................] - ETA: 1s - loss: 1.1446 - accuracy: 0.6433
209/487 [===========>..................] - ETA: 1s - loss: 1.1412 - accuracy: 0.6449
222/487 [============>.................] - ETA: 1s - loss: 1.1382 - accuracy: 0.6463
235/487 [=============>................] - ETA: 0s - loss: 1.1352 - accuracy: 0.6479
248/487 [==============>...............] - ETA: 0s - loss: 1.1326 - accuracy: 0.6490
261/487 [===============>..............] - ETA: 0s - loss: 1.1303 - accuracy: 0.6499
274/487 [===============>..............] - ETA: 0s - loss: 1.1279 - accuracy: 0.6509
287/487 [================>.............] - ETA: 0s - loss: 1.1252 - accuracy: 0.6521
300/487 [=================>............] - ETA: 0s - loss: 1.1226 - accuracy: 0.6533
314/487 [==================>...........] - ETA: 0s - loss: 1.1202 - accuracy: 0.6543
327/487 [===================>..........] - ETA: 0s - loss: 1.1181 - accuracy: 0.6553
340/487 [===================>..........] - ETA: 0s - loss: 1.1158 - accuracy: 0.6564
353/487 [====================>.........] - ETA: 0s - loss: 1.1131 - accuracy: 0.6575
366/487 [=====================>........] - ETA: 0s - loss: 1.1105 - accuracy: 0.6587
379/487 [======================>.......] - ETA: 0s - loss: 1.1081 - accuracy: 0.6596
392/487 [=======================>......] - ETA: 0s - loss: 1.1055 - accuracy: 0.6606
406/487 [========================>.....] - ETA: 0s - loss: 1.1032 - accuracy: 0.6613
419/487 [========================>.....] - ETA: 0s - loss: 1.1011 - accuracy: 0.6621
433/487 [=========================>....] - ETA: 0s - loss: 1.0985 - accuracy: 0.6630
446/487 [==========================>...] - ETA: 0s - loss: 1.0966 - accuracy: 0.6637
459/487 [===========================>..] - ETA: 0s - loss: 1.0945 - accuracy: 0.6644
472/487 [============================>.] - ETA: 0s - loss: 1.0925 - accuracy: 0.6651
485/487 [============================>.] - ETA: 0s - loss: 1.0903 - accuracy: 0.6658
***callbacks***
saving losses to model_5/losses.log
Epoch 2: val_loss improved from 1.19194 to 1.01508, saving model to model_5/KERAS_check_best_model.h5
Epoch 2: val_loss improved from 1.19194 to 1.01508, saving model to model_5/KERAS_check_best_model_weights.h5
Epoch 2: saving model to model_5/KERAS_check_model_last.h5
Epoch 2: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 5ms/step - loss: 1.0900 - accuracy: 0.6659 - val_loss: 1.0151 - val_accuracy: 0.6934 - lr: 1.0000e-04
Epoch 3/30
1/487 [..............................] - ETA: 2s - loss: 0.9978 - accuracy: 0.7139
14/487 [..............................] - ETA: 1s - loss: 1.0115 - accuracy: 0.6961
27/487 [>.............................] - ETA: 1s - loss: 1.0109 - accuracy: 0.6946
40/487 [=>............................] - ETA: 1s - loss: 1.0061 - accuracy: 0.6946
53/487 [==>...........................] - ETA: 1s - loss: 1.0054 - accuracy: 0.6945
66/487 [===>..........................] - ETA: 1s - loss: 1.0043 - accuracy: 0.6945
79/487 [===>..........................] - ETA: 1s - loss: 1.0030 - accuracy: 0.6952
92/487 [====>.........................] - ETA: 1s - loss: 1.0014 - accuracy: 0.6962
105/487 [=====>........................] - ETA: 1s - loss: 1.0006 - accuracy: 0.6964
118/487 [======>.......................] - ETA: 1s - loss: 1.0004 - accuracy: 0.6964
131/487 [=======>......................] - ETA: 1s - loss: 0.9994 - accuracy: 0.6968
144/487 [=======>......................] - ETA: 1s - loss: 0.9979 - accuracy: 0.6977
157/487 [========>.....................] - ETA: 1s - loss: 0.9956 - accuracy: 0.6985
170/487 [=========>....................] - ETA: 1s - loss: 0.9944 - accuracy: 0.6989
183/487 [==========>...................] - ETA: 1s - loss: 0.9925 - accuracy: 0.6994
196/487 [===========>..................] - ETA: 1s - loss: 0.9916 - accuracy: 0.6997
209/487 [===========>..................] - ETA: 1s - loss: 0.9908 - accuracy: 0.6998
222/487 [============>.................] - ETA: 1s - loss: 0.9898 - accuracy: 0.7000
235/487 [=============>................] - ETA: 1s - loss: 0.9887 - accuracy: 0.7005
248/487 [==============>...............] - ETA: 0s - loss: 0.9871 - accuracy: 0.7013
261/487 [===============>..............] - ETA: 0s - loss: 0.9857 - accuracy: 0.7019
274/487 [===============>..............] - ETA: 0s - loss: 0.9847 - accuracy: 0.7021
287/487 [================>.............] - ETA: 0s - loss: 0.9831 - accuracy: 0.7026
300/487 [=================>............] - ETA: 0s - loss: 0.9817 - accuracy: 0.7032
313/487 [==================>...........] - ETA: 0s - loss: 0.9808 - accuracy: 0.7034
325/487 [===================>..........] - ETA: 0s - loss: 0.9799 - accuracy: 0.7037
335/487 [===================>..........] - ETA: 0s - loss: 0.9793 - accuracy: 0.7038
347/487 [====================>.........] - ETA: 0s - loss: 0.9782 - accuracy: 0.7041
360/487 [=====================>........] - ETA: 0s - loss: 0.9769 - accuracy: 0.7046
373/487 [=====================>........] - ETA: 0s - loss: 0.9757 - accuracy: 0.7050
386/487 [======================>.......] - ETA: 0s - loss: 0.9746 - accuracy: 0.7052
399/487 [=======================>......] - ETA: 0s - loss: 0.9732 - accuracy: 0.7057
412/487 [========================>.....] - ETA: 0s - loss: 0.9724 - accuracy: 0.7059
425/487 [=========================>....] - ETA: 0s - loss: 0.9711 - accuracy: 0.7062
438/487 [=========================>....] - ETA: 0s - loss: 0.9697 - accuracy: 0.7066
451/487 [==========================>...] - ETA: 0s - loss: 0.9682 - accuracy: 0.7071
464/487 [===========================>..] - ETA: 0s - loss: 0.9673 - accuracy: 0.7072
477/487 [============================>.] - ETA: 0s - loss: 0.9660 - accuracy: 0.7076
***callbacks***
saving losses to model_5/losses.log
Epoch 3: val_loss improved from 1.01508 to 0.92650, saving model to model_5/KERAS_check_best_model.h5
Epoch 3: val_loss improved from 1.01508 to 0.92650, saving model to model_5/KERAS_check_best_model_weights.h5
Epoch 3: saving model to model_5/KERAS_check_model_last.h5
Epoch 3: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 3s 6ms/step - loss: 0.9652 - accuracy: 0.7078 - val_loss: 0.9265 - val_accuracy: 0.7176 - lr: 1.0000e-04
Epoch 4/30
1/487 [..............................] - ETA: 3s - loss: 0.9231 - accuracy: 0.7090
14/487 [..............................] - ETA: 1s - loss: 0.9242 - accuracy: 0.7180
27/487 [>.............................] - ETA: 1s - loss: 0.9231 - accuracy: 0.7185
40/487 [=>............................] - ETA: 1s - loss: 0.9201 - accuracy: 0.7196
53/487 [==>...........................] - ETA: 1s - loss: 0.9223 - accuracy: 0.7193
66/487 [===>..........................] - ETA: 1s - loss: 0.9214 - accuracy: 0.7197
79/487 [===>..........................] - ETA: 1s - loss: 0.9201 - accuracy: 0.7199
92/487 [====>.........................] - ETA: 1s - loss: 0.9200 - accuracy: 0.7197
105/487 [=====>........................] - ETA: 1s - loss: 0.9194 - accuracy: 0.7197
118/487 [======>.......................] - ETA: 1s - loss: 0.9178 - accuracy: 0.7203
131/487 [=======>......................] - ETA: 1s - loss: 0.9155 - accuracy: 0.7209
144/487 [=======>......................] - ETA: 1s - loss: 0.9144 - accuracy: 0.7212
157/487 [========>.....................] - ETA: 1s - loss: 0.9131 - accuracy: 0.7214
170/487 [=========>....................] - ETA: 1s - loss: 0.9123 - accuracy: 0.7215
183/487 [==========>...................] - ETA: 1s - loss: 0.9117 - accuracy: 0.7216
196/487 [===========>..................] - ETA: 1s - loss: 0.9108 - accuracy: 0.7219
209/487 [===========>..................] - ETA: 1s - loss: 0.9099 - accuracy: 0.7220
222/487 [============>.................] - ETA: 1s - loss: 0.9083 - accuracy: 0.7224
235/487 [=============>................] - ETA: 0s - loss: 0.9076 - accuracy: 0.7225
248/487 [==============>...............] - ETA: 0s - loss: 0.9070 - accuracy: 0.7224
262/487 [===============>..............] - ETA: 0s - loss: 0.9064 - accuracy: 0.7225
276/487 [================>.............] - ETA: 0s - loss: 0.9057 - accuracy: 0.7226
289/487 [================>.............] - ETA: 0s - loss: 0.9051 - accuracy: 0.7227
302/487 [=================>............] - ETA: 0s - loss: 0.9044 - accuracy: 0.7229
315/487 [==================>...........] - ETA: 0s - loss: 0.9040 - accuracy: 0.7228
328/487 [===================>..........] - ETA: 0s - loss: 0.9030 - accuracy: 0.7231
341/487 [====================>.........] - ETA: 0s - loss: 0.9024 - accuracy: 0.7232
353/487 [====================>.........] - ETA: 0s - loss: 0.9018 - accuracy: 0.7234
366/487 [=====================>........] - ETA: 0s - loss: 0.9012 - accuracy: 0.7234
379/487 [======================>.......] - ETA: 0s - loss: 0.9005 - accuracy: 0.7236
392/487 [=======================>......] - ETA: 0s - loss: 0.9001 - accuracy: 0.7236
405/487 [=======================>......] - ETA: 0s - loss: 0.8997 - accuracy: 0.7236
418/487 [========================>.....] - ETA: 0s - loss: 0.8992 - accuracy: 0.7236
431/487 [=========================>....] - ETA: 0s - loss: 0.8987 - accuracy: 0.7236
444/487 [==========================>...] - ETA: 0s - loss: 0.8979 - accuracy: 0.7237
457/487 [===========================>..] - ETA: 0s - loss: 0.8973 - accuracy: 0.7239
471/487 [============================>.] - ETA: 0s - loss: 0.8965 - accuracy: 0.7240
484/487 [============================>.] - ETA: 0s - loss: 0.8962 - accuracy: 0.7239
***callbacks***
saving losses to model_5/losses.log
Epoch 4: val_loss improved from 0.92650 to 0.87623, saving model to model_5/KERAS_check_best_model.h5
Epoch 4: val_loss improved from 0.92650 to 0.87623, saving model to model_5/KERAS_check_best_model_weights.h5
Epoch 4: saving model to model_5/KERAS_check_model_last.h5
Epoch 4: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 3s 5ms/step - loss: 0.8961 - accuracy: 0.7240 - val_loss: 0.8762 - val_accuracy: 0.7272 - lr: 1.0000e-04
Epoch 5/30
1/487 [..............................] - ETA: 2s - loss: 0.8560 - accuracy: 0.7422
14/487 [..............................] - ETA: 1s - loss: 0.8748 - accuracy: 0.7246
27/487 [>.............................] - ETA: 1s - loss: 0.8730 - accuracy: 0.7270
40/487 [=>............................] - ETA: 1s - loss: 0.8702 - accuracy: 0.7266
53/487 [==>...........................] - ETA: 1s - loss: 0.8881 - accuracy: 0.7139
66/487 [===>..........................] - ETA: 1s - loss: 0.9895 - accuracy: 0.6477
79/487 [===>..........................] - ETA: 1s - loss: 1.0517 - accuracy: 0.6206
92/487 [====>.........................] - ETA: 1s - loss: 1.0909 - accuracy: 0.6140
105/487 [=====>........................] - ETA: 1s - loss: 1.1186 - accuracy: 0.6127
118/487 [======>.......................] - ETA: 1s - loss: 1.1367 - accuracy: 0.6161
131/487 [=======>......................] - ETA: 1s - loss: 1.1497 - accuracy: 0.6204
144/487 [=======>......................] - ETA: 1s - loss: 1.1586 - accuracy: 0.6246
157/487 [========>.....................] - ETA: 1s - loss: 1.1650 - accuracy: 0.6283
170/487 [=========>....................] - ETA: 1s - loss: 1.1686 - accuracy: 0.6323
183/487 [==========>...................] - ETA: 1s - loss: 1.1720 - accuracy: 0.6349
197/487 [===========>..................] - ETA: 1s - loss: 1.1733 - accuracy: 0.6383
210/487 [===========>..................] - ETA: 1s - loss: 1.1739 - accuracy: 0.6408
223/487 [============>.................] - ETA: 1s - loss: 1.1740 - accuracy: 0.6433
237/487 [=============>................] - ETA: 0s - loss: 1.1741 - accuracy: 0.6452
250/487 [==============>...............] - ETA: 0s - loss: 1.1738 - accuracy: 0.6468
263/487 [===============>..............] - ETA: 0s - loss: 1.1729 - accuracy: 0.6485
274/487 [===============>..............] - ETA: 0s - loss: 1.1719 - accuracy: 0.6501
287/487 [================>.............] - ETA: 0s - loss: 1.1706 - accuracy: 0.6516
299/487 [=================>............] - ETA: 0s - loss: 1.1698 - accuracy: 0.6521
312/487 [==================>...........] - ETA: 0s - loss: 1.1684 - accuracy: 0.6531
325/487 [===================>..........] - ETA: 0s - loss: 1.1671 - accuracy: 0.6541
338/487 [===================>..........] - ETA: 0s - loss: 1.1655 - accuracy: 0.6553
351/487 [====================>.........] - ETA: 0s - loss: 1.1634 - accuracy: 0.6566
364/487 [=====================>........] - ETA: 0s - loss: 1.1618 - accuracy: 0.6575
377/487 [======================>.......] - ETA: 0s - loss: 1.1600 - accuracy: 0.6586
390/487 [=======================>......] - ETA: 0s - loss: 1.1579 - accuracy: 0.6598
403/487 [=======================>......] - ETA: 0s - loss: 1.1562 - accuracy: 0.6607
416/487 [========================>.....] - ETA: 0s - loss: 1.1542 - accuracy: 0.6618
429/487 [=========================>....] - ETA: 0s - loss: 1.1524 - accuracy: 0.6627
442/487 [==========================>...] - ETA: 0s - loss: 1.1503 - accuracy: 0.6634
455/487 [===========================>..] - ETA: 0s - loss: 1.1486 - accuracy: 0.6642
468/487 [===========================>..] - ETA: 0s - loss: 1.1466 - accuracy: 0.6652
481/487 [============================>.] - ETA: 0s - loss: 1.1446 - accuracy: 0.6661
***callbacks***
saving losses to model_5/losses.log
Epoch 5: val_loss did not improve from 0.87623
Epoch 5: val_loss did not improve from 0.87623
Epoch 5: saving model to model_5/KERAS_check_model_last.h5
Epoch 5: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 3s 5ms/step - loss: 1.1439 - accuracy: 0.6664 - val_loss: 1.0731 - val_accuracy: 0.6967 - lr: 1.0000e-04
Epoch 6/30
***callbacks***
saving losses to model_5/losses.log
Epoch 6: val_loss did not improve from 0.87623
Epoch 6: val_loss did not improve from 0.87623
Epoch 6: saving model to model_5/KERAS_check_model_last.h5
Epoch 6: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 5ms/step - loss: 1.0172 - accuracy: 0.7072 - val_loss: 0.9764 - val_accuracy: 0.7131 - lr: 1.0000e-04
Epoch 7/30
***callbacks***
saving losses to model_5/losses.log
Epoch 7: val_loss did not improve from 0.87623
Epoch 7: val_loss did not improve from 0.87623
Epoch 7: saving model to model_5/KERAS_check_model_last.h5
Epoch 7: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 5ms/step - loss: 0.9428 - accuracy: 0.7174 - val_loss: 0.9199 - val_accuracy: 0.7201 - lr: 1.0000e-04
Epoch 8/30
***callbacks***
saving losses to model_5/losses.log
Epoch 8: val_loss did not improve from 0.87623
Epoch 8: val_loss did not improve from 0.87623
Epoch 8: saving model to model_5/KERAS_check_model_last.h5
Epoch 8: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 2s 5ms/step - loss: 0.9001 - accuracy: 0.7225 - val_loss: 0.8878 - val_accuracy: 0.7234 - lr: 1.0000e-04
Epoch 9/30
***callbacks***
saving losses to model_5/losses.log
Epoch 9: val_loss improved from 0.87623 to 0.86555, saving model to model_5/KERAS_check_best_model.h5
Epoch 9: val_loss improved from 0.87623 to 0.86555, saving model to model_5/KERAS_check_best_model_weights.h5
Epoch 9: saving model to model_5/KERAS_check_model_last.h5
Epoch 9: saving model to model_5/KERAS_check_model_last_weights.h5
***callbacks end***
487/487 [==============================] - 3s 6ms/step - loss: 0.8739 - accuracy: 0.7260 - val_loss: 0.8655 - val_accuracy: 0.7272 - lr: 1.0000e-04
Epoch 10/30
***callbacks***
saving losses to model_5/losses.log
Epoch 10: val_loss improved from 0.86555 to 0.84897, saving model to model_5/KERAS_check_best_model.h5
Epoch 10: val_loss improved from 0.86555 to 0.84897, saving model to model_5/KERAS_check_best_model_weights.h5
Epoch 10: saving model to model_5/KERAS_check_model_last.h5
Epoch 10: saving model to model_5/KERAS_check_model_last_weights.h5
Epoch 10: saving model to model_5/KERAS_check_model_epoch10.h5
***callbacks end***
487/487 [==============================] - 2s 5ms/step - loss: 0.8545 - accuracy: 0.7282 - val_loss: 0.8490 - val_accuracy: 0.7287 - lr: 1.0000e-04
Epoch 11/30
118/487 [======>.......................] - ETA: 1s - loss: 0.8434 - accuracy: 0.7294
---------------------------------------------------------------------------
KeyboardInterrupt Traceback (most recent call last)
/tmp/ipykernel_6286/2535795537.py in <module>
22 validation_split=0.25,
23 shuffle=True,
---> 24 callbacks=callbacks.callbacks,
25 )
26 # Save the model again but with the pruning 'stripped' to use the regular layer types
KeyboardInterrupt:
Check performance
How does this model, trained with 6-bit quantization and 75% sparsity, compare against the original model? Let's report the accuracy and make a ROC curve. The unpruned, full-precision baseline model is shown with solid lines, and the quantized, pruned model with dashed lines.
We can also check that this performance isn't specific to some special setup in QKeras. We will see that hls4ml can respect the choice to use 6 bits throughout the model and match the accuracy. We'll generate a configuration from this quantized model and plot its performance as the dotted line. The generated configuration is printed out. Depending on the hls4ml version, you may notice the emitted type carrying one more bit than the 6 we specified: QKeras doesn't count the sign bit when we specify the number of bits, so the type that actually gets used can need one more.
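As a quick back-of-the-envelope check of that bit accounting (a sketch only, not hls4ml's converter logic), a signed quantizer with 6 bits and 0 integer bits has a step of 2**-6 and integer codes from -63 to 63, which take 7 bits of two's complement, i.e. ap_fixed<7,1> with the single integer bit holding the sign:
import numpy as np

bits, integer = 6, 0  # as in quantized_bits(6, 0)
step = 2.0 ** (integer - bits)  # quantization step: 0.015625
codes = np.arange(-(2**bits - 1), 2**bits)  # 127 representable codes
width = int(np.ceil(np.log2(codes.size)))  # two's-complement bits needed: 7
print(f"step={step}, levels={codes.size}, ap_fixed<{width},{integer + 1}>")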
We also use the OutputRoundingSaturationMode optimizer pass of hls4ml to make the Activation layers round, rather than truncate, when casting. This is important for getting good model accuracy when using small-bit-precision activations. And we'll set a different data type for the tables used in the Softmax, just for a bit of extra performance.
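To see why rounding matters at these widths, here is a minimal NumPy sketch (plain floats standing in for ap_fixed, so only illustrative) comparing truncation with rounding on a grid with 2**-3 resolution; the last value also shows the overflow that AP_SAT would clamp rather than letting the cast wrap around:
import numpy as np

step = 2.0 ** -3  # pretend fixed-point resolution
x = np.array([0.57, -0.56, 0.9999])
trn = np.floor(x / step) * step  # truncation (AP_TRN): round toward -inf
rnd = np.round(x / step) * step  # rounding (AP_RND): round to nearest
print(trn)  # [ 0.5   -0.625  0.875]
print(rnd)  # [ 0.625 -0.5    1.   ] -> 1.0 overflows e.g. ap_ufixed<6,0>; AP_SAT clamps it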
import hls4ml
import plotting

# Configure the optimizer pass: Activation layer outputs will round (AP_RND)
# and saturate (AP_SAT) when cast, rather than truncate and wrap
hls4ml.model.optimizer.get_optimizer("output_rounding_saturation_mode").layers = [
    "Activation"
]
hls4ml.model.optimizer.get_optimizer(
    "output_rounding_saturation_mode"
).rounding_mode = "AP_RND"
hls4ml.model.optimizer.get_optimizer(
    "output_rounding_saturation_mode"
).saturation_mode = "AP_SAT"

# Per-layer ('name' granularity) configuration derived from the QKeras model
config = hls4ml.utils.config_from_keras_model(
    model, granularity="name", default_precision="ap_fixed<10,4>"
)
# Wider types for the softmax lookup tables, for a bit of extra performance
config["LayerName"]["softmax"]["exp_table_t"] = "ap_fixed<18,8>"
config["LayerName"]["softmax"]["inv_table_t"] = "ap_fixed<18,4>"
print("-----------------------------------")
plotting.print_dict(config)
print("-----------------------------------")

hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="model_5/hls4ml_prj",
    part="xcu250-figd2104-2L-e",
)
hls_model.compile()

# Predict with both the QKeras model and the hls4ml emulation
y_qkeras = model.predict(np.ascontiguousarray(X_test))
y_hls = hls_model.predict(np.ascontiguousarray(X_test))
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: Input
Layer name: fc1, layer type: QDense
-> Activation (linear), layer name: fc1
Layer name: relu1, layer type: QActivation
-> Activation (quantized_relu), layer name: relu1
Layer name: fc2, layer type: QDense
-> Activation (linear), layer name: fc2
Layer name: relu2, layer type: QActivation
-> Activation (quantized_relu), layer name: relu2
Layer name: fc3, layer type: QDense
-> Activation (linear), layer name: fc3
Layer name: relu3, layer type: QActivation
-> Activation (quantized_relu), layer name: relu3
Layer name: output, layer type: QDense
-> Activation (linear), layer name: output
Layer name: softmax, layer type: Activation
-----------------------------------
Model
  Precision: ap_fixed<10,4>
  ReuseFactor: 1
  Strategy: Latency
LayerName
  fc1_input
    Precision
      result: ap_fixed<10,4>
  fc1
    Precision
      weight: ap_fixed<6,1>
      bias: ap_fixed<6,1>
    ReuseFactor: 1
  fc1_linear
    Precision: ap_fixed<10,4>
    ReuseFactor: 1
    table_size: 1024
    table_t: ap_fixed<18,8>
  relu1
    Precision
      result: ap_ufixed<6,0>
    ReuseFactor: 1
  relu1_quantized_relu
    Precision
      result: ap_ufixed<6,0>
    ReuseFactor: 1
  fc2
    Precision
      weight: ap_fixed<6,1>
      bias: ap_fixed<6,1>
    ReuseFactor: 1
  fc2_linear
    Precision: ap_fixed<10,4>
    ReuseFactor: 1
    table_size: 1024
    table_t: ap_fixed<18,8>
  relu2
    Precision
      result: ap_ufixed<6,0>
    ReuseFactor: 1
  relu2_quantized_relu
    Precision
      result: ap_ufixed<6,0>
    ReuseFactor: 1
  fc3
    Precision
      weight: ap_fixed<6,1>
      bias: ap_fixed<6,1>
    ReuseFactor: 1
  fc3_linear
    Precision: ap_fixed<10,4>
    ReuseFactor: 1
    table_size: 1024
    table_t: ap_fixed<18,8>
  relu3
    Precision
      result: ap_ufixed<6,0>
    ReuseFactor: 1
  relu3_quantized_relu
    Precision
      result: ap_ufixed<6,0>
    ReuseFactor: 1
  output
    Precision
      weight: ap_fixed<6,1>
      bias: ap_fixed<6,1>
    ReuseFactor: 1
  output_linear
    Precision: ap_fixed<10,4>
    ReuseFactor: 1
    table_size: 1024
    table_t: ap_fixed<18,8>
  softmax
    Precision: ap_fixed<10,4>
    ReuseFactor: 1
    table_size: 1024
    exp_table_t: ap_fixed<18,8>
    inv_table_t: ap_fixed<18,4>
-----------------------------------
Interpreting Sequential
Topology:
Layer name: fc1_input, layer type: InputLayer, input shapes: [[None, 16]], output shape: [None, 16]
Layer name: fc1, layer type: QDense, input shapes: [[None, 16]], output shape: [None, 64]
Layer name: relu1, layer type: Activation, input shapes: [[None, 64]], output shape: [None, 64]
Layer name: fc2, layer type: QDense, input shapes: [[None, 64]], output shape: [None, 32]
Layer name: relu2, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: fc3, layer type: QDense, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: relu3, layer type: Activation, input shapes: [[None, 32]], output shape: [None, 32]
Layer name: output, layer type: QDense, input shapes: [[None, 32]], output shape: [None, 5]
Layer name: softmax, layer type: Softmax, input shapes: [[None, 5]], output shape: [None, 5]
Creating HLS model
Writing HLS project
WARNING:tensorflow:Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model.
Done
5188/5188 [==============================] - 4s 730us/step
%matplotlib inline
from sklearn.metrics import accuracy_score
from tensorflow.keras.models import load_model

# Load the unpruned, full-precision baseline for comparison
if not COLAB:
    model_ref = load_model("model_1/KERAS_check_best_model.h5")  # running locally
else:
    model_ref = load_model("iaifi-summer-school/book/model_1/KERAS_check_best_model.h5")  # on Colab

y_ref = model_ref.predict(X_test)

print(
    "Accuracy baseline: {}".format(
        accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_ref, axis=1))
    )
)
print(
    "Accuracy pruned, quantized: {}".format(
        accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_qkeras, axis=1))
    )
)
print(
    "Accuracy hls4ml: {}".format(
        accuracy_score(np.argmax(y_test, axis=1), np.argmax(y_hls, axis=1))
    )
)
# Solid: baseline; dashed: pruned + quantized QKeras; dotted: hls4ml
fig, ax = plt.subplots(figsize=(9, 9))
_ = plotting.makeRoc(y_test, y_ref, classes)
plt.gca().set_prop_cycle(None)  # reset the colors
_ = plotting.makeRoc(y_test, y_qkeras, classes, linestyle="--")
plt.gca().set_prop_cycle(None)  # reset the colors
_ = plotting.makeRoc(y_test, y_hls, classes, linestyle=":")

# Add a second legend mapping the line styles to the three models
from matplotlib.lines import Line2D
from matplotlib.legend import Legend

lines = [Line2D([0], [0], ls="-"), Line2D([0], [0], ls="--"), Line2D([0], [0], ls=":")]
leg = Legend(
    ax,
    lines,
    labels=["baseline", "pruned, quantized", "hls4ml"],
    loc="lower right",
    frameon=False,
)
ax.add_artist(leg)
5188/5188 [==============================] - 3s 577us/step
Accuracy baseline: 0.7516506024096385
Accuracy pruned, quantized: 0.7460240963855421
Accuracy hls4ml: 0.7423975903614458
<matplotlib.legend.Legend at 0x16c139730>