Analysis of the Keras example file mnist_net2net.py
This example shows how to take a shallow convolutional neural network and make it deeper or wider.
For example, first build a simple network with the following structure:
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv1 (Conv2D) (None, 28, 28, 64) 640
_________________________________________________________________
pool1 (MaxPooling2D) (None, 14, 14, 64) 0
_________________________________________________________________
conv2 (Conv2D) (None, 14, 14, 64) 36928
_________________________________________________________________
pool2 (MaxPooling2D) (None, 7, 7, 64) 0
_________________________________________________________________
flatten (Flatten) (None, 3136) 0
_________________________________________________________________
fc1 (Dense) (None, 64) 200768
_________________________________________________________________
fc2 (Dense) (None, 10) 650
=================================================================
Total params: 238,986
Trainable params: 238,986
Non-trainable params: 0
_________________________________________________________________
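Below is a minimal sketch of how such a teacher model could be defined in Keras. The kernel size, padding, and activations are assumptions chosen only to reproduce the summary above and may differ from the actual mnist_net2net.py code.

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Teacher model: a shallow CNN whose summary matches the table above.
# Assumed settings: 3x3 kernels, 'same' padding, relu activations, MNIST input (28, 28, 1).
teacher_model = Sequential([
    Conv2D(64, 3, padding='same', activation='relu',
           input_shape=(28, 28, 1), name='conv1'),
    MaxPooling2D(2, name='pool1'),
    Conv2D(64, 3, padding='same', activation='relu', name='conv2'),
    MaxPooling2D(2, name='pool2'),
    Flatten(name='flatten'),
    Dense(64, activation='relu', name='fc1'),
    Dense(10, activation='softmax', name='fc2'),
])
teacher_model.summary()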
After training it, widen it into something like the following:
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv1 (Conv2D) (None, 28, 28, 128) 1280
_________________________________________________________________
pool1 (MaxPooling2D) (None, 14, 14, 128) 0
_________________________________________________________________
conv2 (Conv2D) (None, 14, 14, 64) 73792
_________________________________________________________________
pool2 (MaxPooling2D) (None, 7, 7, 64) 0
_________________________________________________________________
flatten (Flatten) (None, 3136) 0
_________________________________________________________________
fc1 (Dense) (None, 128) 401536
_________________________________________________________________
fc2 (Dense) (None, 10) 1290
=================================================================
Total params: 477,898
Trainable params: 477,898
Non-trainable params: 0
_________________________________________________________________
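The wider student model can be sketched the same way; only the number of conv1 filters and fc1 units changes (again, the settings are assumptions meant only to match the summary above). The variable name model is used here because this is the network whose weights are overwritten further down.

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Wider student model: conv1 goes from 64 to 128 filters, fc1 from 64 to 128 units.
model = Sequential([
    Conv2D(128, 3, padding='same', activation='relu',
           input_shape=(28, 28, 1), name='conv1'),
    MaxPooling2D(2, name='pool1'),
    Conv2D(64, 3, padding='same', activation='relu', name='conv2'),
    MaxPooling2D(2, name='pool2'),
    Flatten(name='flatten'),
    Dense(128, activation='relu', name='fc1'),
    Dense(10, activation='softmax', name='fc2'),
])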
Or deepen it into something like the following:
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv1 (Conv2D) (None, 28, 28, 64) 640
_________________________________________________________________
pool1 (MaxPooling2D) (None, 14, 14, 64) 0
_________________________________________________________________
conv2 (Conv2D) (None, 14, 14, 64) 36928
_________________________________________________________________
conv2-deeper (Conv2D) (None, 14, 14, 64) 36928
_________________________________________________________________
pool2 (MaxPooling2D) (None, 7, 7, 64) 0
_________________________________________________________________
flatten (Flatten) (None, 3136) 0
_________________________________________________________________
fc1 (Dense) (None, 64) 200768
_________________________________________________________________
fc1-deeper (Dense) (None, 64) 4160
_________________________________________________________________
fc2 (Dense) (None, 10) 650
=================================================================
Total params: 280,074
Trainable params: 280,074
Non-trainable params: 0
_________________________________________________________________
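And a corresponding sketch of the deeper student model, with one extra convolutional layer (conv2-deeper) and one extra dense layer (fc1-deeper); the settings are again assumptions chosen to match the summary above.

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Deeper student model: same as the teacher, plus conv2-deeper and fc1-deeper.
deeper_model = Sequential([
    Conv2D(64, 3, padding='same', activation='relu',
           input_shape=(28, 28, 1), name='conv1'),
    MaxPooling2D(2, name='pool1'),
    Conv2D(64, 3, padding='same', activation='relu', name='conv2'),
    Conv2D(64, 3, padding='same', activation='relu', name='conv2-deeper'),
    MaxPooling2D(2, name='pool2'),
    Flatten(name='flatten'),
    Dense(64, activation='relu', name='fc1'),
    Dense(64, activation='relu', name='fc1-deeper'),
    Dense(10, activation='softmax', name='fc2'),
])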
In short, the example shows how to read, modify, and add network parameters.
First, reading the parameters. Getting the weights of a convolutional layer and of a fully connected layer takes just these two lines:
w_conv1, b_conv1 = teacher_model.get_layer('conv1').get_weights()
w_fc1, b_fc1 = teacher_model.get_layer('fc1').get_weights()
For widening, the weights of the convolutional layer and the fully connected layer in the new model are overwritten with these two lines:
model.get_layer('conv1').set_weights([new_w_conv1, new_b_conv1])
model.get_layer('fc1').set_weights([new_w_fc1, new_b_fc1])
As for what values to put into the new weights, you are free to choose: either concatenate some randomly initialised filters onto the original weights, or copy the original weights and add a little noise, as in the sketch below.
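Here is a minimal sketch of the second option (copy a random selection of the existing filters/units and add a little noise) for widening conv1 from 64 to 128 filters and fc1 from 64 to 128 units. The sampling and noise scale are illustrative choices, not necessarily what mnist_net2net.py itself uses.

import numpy as np

# w_conv1 has shape (3, 3, 1, 64), b_conv1 has shape (64,).
# Duplicate 64 randomly chosen filters and add small noise to break symmetry.
idx = np.random.choice(w_conv1.shape[3], size=64)
noise = np.random.normal(scale=5e-2 * w_conv1.std(), size=w_conv1[:, :, :, idx].shape)
new_w_conv1 = np.concatenate([w_conv1, w_conv1[:, :, :, idx] + noise], axis=3)
new_b_conv1 = np.concatenate([b_conv1, b_conv1[idx]])

# The same idea for fc1 (w_fc1 has shape (3136, 64)).
idx_fc = np.random.choice(w_fc1.shape[1], size=64)
noise_fc = np.random.normal(scale=5e-2 * w_fc1.std(), size=w_fc1[:, idx_fc].shape)
new_w_fc1 = np.concatenate([w_fc1, w_fc1[:, idx_fc] + noise_fc], axis=1)
new_b_fc1 = np.concatenate([b_fc1, b_fc1[idx_fc]])

Note that to preserve the teacher's function exactly, as in the Net2Net paper, the weights of the following layers (conv2 and fc2) would also have to be adapted along their input dimension; since the widened network is retrained afterwards anyway, the rough copy-plus-noise initialisation above is only meant to give it a warm start.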
For deepening, build a new network, fetch the weights of the existing layers and copy them over, and initialise the newly added layers however you like, for example as an identity mapping, as in the sketch below.
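A minimal sketch of that deepening step, initialising the new layers as (approximate) identity mappings so that the deeper model initially computes roughly the same function as the teacher. The loop and the identity initialisation are illustrative assumptions, not necessarily how the example file does it.

import numpy as np

# New 3x3 conv layer with 64 input and 64 output channels, initialised as identity:
# the centre tap of each kernel is 1 for its own channel and 0 elsewhere, so the
# layer initially passes its (non-negative, post-relu) input through unchanged.
w_deeper = np.zeros((3, 3, 64, 64))
for ch in range(64):
    w_deeper[1, 1, ch, ch] = 1.0

# Copy the trained layers from the teacher, then set the newly added layers.
for name in ['conv1', 'conv2', 'fc1', 'fc2']:
    deeper_model.get_layer(name).set_weights(
        teacher_model.get_layer(name).get_weights())
deeper_model.get_layer('conv2-deeper').set_weights([w_deeper, np.zeros(64)])
deeper_model.get_layer('fc1-deeper').set_weights([np.eye(64), np.zeros(64)])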
The modified network is then trained again.
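Retraining is the usual compile/fit cycle; a minimal sketch, assuming x_train and y_train are the standard MNIST arrays reshaped to (N, 28, 28, 1) with one-hot labels:

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.1)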