
TensorFlow notes (checking the contents of a checkpoint)

Notes on how to inspect a TensorFlow model's internal parameters from its checkpoint files.

The environment used here is the TensorFlow 2.x series.

% python -c 'import tensorflow as tf; print(tf.__version__)'
2.1.0


Preparing the data for verification

Following the guide below, build a simple model and create checkpoint files.
Save and load models | TensorFlow Core

Python code
import tensorflow as tf
from tensorflow import keras

checkpoint_path = '/tmp/checkpoint_test/test'

'/tmp/checkpoint_test/' is the destination directory, and "test" is the prefix added to the checkpoint file names.
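
If needed, the directory and prefix can be pulled apart with the standard library (a small sketch, my addition, not in the original post):

Python code (sketch)
import os

# Split the checkpoint path into its directory and file-name prefix.
checkpoint_dir = os.path.dirname(checkpoint_path)      # '/tmp/checkpoint_test'
checkpoint_prefix = os.path.basename(checkpoint_path)  # 'test'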

Python code (continued)
# Load the MNIST data
(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data()

train_labels = train_labels[:1000]
test_labels = test_labels[:1000]

train_images = train_images[:1000].reshape(-1, 28 * 28) / 255.0
test_images = test_images[:1000].reshape(-1, 28 * 28) / 255.0


# Model definition
def create_model():
  model = tf.keras.models.Sequential([
    keras.layers.Dense(512, activation='relu', input_shape=(784,), name='my_dense_1'),
    keras.layers.Dropout(0.2, name='my_dropout'),
    keras.layers.Dense(10, activation='softmax', name='my_dense_2')
  ])
    
  return model

# Create a model instance
model = create_model()

# Compile the model
model.compile(optimizer='adam', 
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.summary()
Model summary
Layer (type)                 Output Shape              Param #   
=================================================================
my_dense_1 (Dense)           (None, 512)               401920    
_________________________________________________________________
my_dropout (Dropout)         (None, 512)               0         
_________________________________________________________________
my_dense_2 (Dense)           (None, 10)                5130      
=================================================================
Total params: 407,050
Trainable params: 407,050
Non-trainable params: 0

The parameter count breaks down as follows (a quick way to verify this is sketched after the list).

  • Layer 1 (my_dense_1) kernel: 401,408 (= 784 * 512)
  • Layer 1 (my_dense_1) bias: 512
  • Layer 3 (my_dense_2) kernel: 5,120 (= 512 * 10)
  • Layer 3 (my_dense_2) bias: 10
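
As a cross-check (my addition, not in the original post), the same breakdown can be printed from the model's weight variables:

Python code (sketch)
# Print each weight variable's name, shape, and element count.
for w in model.weights:
    print(w.name, w.shape, w.shape.num_elements())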


Python code (continued)
# Define the checkpoint callback
cp_callback = tf.keras.callbacks.ModelCheckpoint(checkpoint_path, 
                                                 save_weights_only=True,
                                                 verbose=1)

# Train the model
model.fit(train_images,
          train_labels,
          epochs = 10, 
          validation_data = (test_images, test_labels),
          callbacks = [cp_callback])

(As an aside, accuracy on the validation data comes out to around 86%.)

Epoch 00010: saving model to /tmp/checkpoint_test/test
1000/1000 [==============================] - 0s 200us/sample - loss: 0.0380 - accuracy: 0.9990 - val_loss: 0.4392 - val_accuracy: 0.8640
Generated files
% ls -l /tmp/checkpoint_test/
total 4784
-rw-rw-r-- 1 ichou1 ichou1      65 May  3 11:20 checkpoint
-rw-rw-r-- 1 ichou1 ichou1 4886740 May  3 11:20 test.data-00000-of-00001
-rw-rw-r-- 1 ichou1 ichou1    1222 May  3 11:20 test.index
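
Before inspecting the checkpoint directly, here is a minimal sketch (my addition, following the same TensorFlow guide) of restoring the saved weights into a freshly built model to confirm the files are usable:

Python code (sketch)
# Rebuild the architecture, load the saved weights, and evaluate on the test split.
model = create_model()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.load_weights('/tmp/checkpoint_test/test')
loss, acc = model.evaluate(test_images, test_labels, verbose=2)
print('Restored model accuracy: {:5.2f}%'.format(100 * acc))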


Loading the checkpoint data

Python code
import tensorflow as tf

checkpoint_path = tf.train.latest_checkpoint('/tmp/checkpoint_test/')

# Open the checkpoint and build a map of variable name -> shape
ckpt = tf.compat.v1.train.load_checkpoint(checkpoint_path)

var_to_shape_map = ckpt.get_variable_to_shape_map()

for key in sorted(var_to_shape_map.keys()):
    print('-----------')
    print('[param name]: ', key)
    # '_CHECKPOINTABLE_OBJECT_GRAPH' stores the object graph, not tensor data
    if key != '_CHECKPOINTABLE_OBJECT_GRAPH':
        print('[param shape]: ', ckpt.get_tensor(key).shape)

Output:
The optimizer's state from training is also saved.
Since Adam was specified here, the stored parameters include "m", "v", "beta_1", "beta_2", "learning_rate", and others.
Kerasメモ(Optimizer) - ichou1のブログ

-----------
[param name]:  _CHECKPOINTABLE_OBJECT_GRAPH
-----------
[param name]:  layer_with_weights-0/bias/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (512,)
-----------
[param name]:  layer_with_weights-0/bias/.OPTIMIZER_SLOT/optimizer/m/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (512,)
-----------
[param name]:  layer_with_weights-0/bias/.OPTIMIZER_SLOT/optimizer/v/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (512,)
-----------
[param name]:  layer_with_weights-0/kernel/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (784, 512)
-----------
[param name]:  layer_with_weights-0/kernel/.OPTIMIZER_SLOT/optimizer/m/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (784, 512)
-----------
[param name]:  layer_with_weights-0/kernel/.OPTIMIZER_SLOT/optimizer/v/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (784, 512)
-----------
[param name]:  layer_with_weights-1/bias/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (10,)
-----------
[param name]:  layer_with_weights-1/bias/.OPTIMIZER_SLOT/optimizer/m/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (10,)
-----------
[param name]:  layer_with_weights-1/bias/.OPTIMIZER_SLOT/optimizer/v/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (10,)
-----------
[param name]:  layer_with_weights-1/kernel/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (512, 10)
-----------
[param name]:  layer_with_weights-1/kernel/.OPTIMIZER_SLOT/optimizer/m/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (512, 10)
-----------
[param name]:  layer_with_weights-1/kernel/.OPTIMIZER_SLOT/optimizer/v/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  (512, 10)
-----------
[param name]:  optimizer/beta_1/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  ()
-----------
[param name]:  optimizer/beta_2/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  ()
-----------
[param name]:  optimizer/decay/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  ()
-----------
[param name]:  optimizer/iter/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  ()
-----------
[param name]:  optimizer/learning_rate/.ATTRIBUTES/VARIABLE_VALUE
[param shape]:  ()
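
To read the stored values themselves, get_tensor() accepts any of the names listed above. A minimal sketch (my addition, using names taken from the output):

Python code (sketch)
# Read a scalar optimizer hyperparameter and a weight matrix from the checkpoint.
lr = ckpt.get_tensor('optimizer/learning_rate/.ATTRIBUTES/VARIABLE_VALUE')
print(lr)  # e.g. 0.001, Adam's default learning rate

kernel = ckpt.get_tensor('layer_with_weights-0/kernel/.ATTRIBUTES/VARIABLE_VALUE')
print(kernel.shape)  # (784, 512)

# The same name/shape listing is also available in a single call.
for name, shape in tf.train.list_variables(checkpoint_path):
    print(name, shape)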