Analysis of the Keras example deep_dream.py

Roughly speaking, this script has a neural network produce dream-like effects: it takes a perfectly ordinary picture and turns it into something gaudy and surreal. This post only walks through the code; for the underlying theory, see
https://blog.csdn.net/accepthjp/article/details/77882814
Using the same dog picture as last time, the generated result looks roughly like this:
(omitted)
Sorry, the image is not lost; I deleted it. Some people may find the generated pictures dreamy and enchanting, but to me they were a bit unsettling.
The base network is InceptionV3 (keras.applications.inception_v3); a short snippet showing how the example loads it follows this paragraph, and the full structure printed by model.summary() is listed below:
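As a minimal sketch (assuming Keras 2.x with the pre-trained ImageNet weights available), this mirrors the opening of deep_dream.py and reproduces the summary that follows; the model.summary() call is added here purely for illustration:

from keras import backend as K
from keras.applications import inception_v3

# The example never trains the model, so training-specific behaviour is disabled.
K.set_learning_phase(0)

# InceptionV3 with ImageNet weights and without the classification head;
# dropping the top is why every spatial dimension shows up as None in the table.
model = inception_v3.InceptionV3(weights='imagenet', include_top=False)

# Printing the summary reproduces the layer-by-layer listing below.
model.summary()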
____________________________________________________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
============================================================================================================================================
input_1 (InputLayer) (None, None, None, 3) 0
____________________________________________________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, None, None, 32) 864 input_1[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_1 (BatchNormalization) (None, None, None, 32) 96 conv2d_1[0][0]
____________________________________________________________________________________________________________________________________________
activation_1 (Activation) (None, None, None, 32) 0 batch_normalization_1[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, None, None, 32) 9216 activation_1[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_2 (BatchNormalization) (None, None, None, 32) 96 conv2d_2[0][0]
____________________________________________________________________________________________________________________________________________
activation_2 (Activation) (None, None, None, 32) 0 batch_normalization_2[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, None, None, 64) 18432 activation_2[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_3 (BatchNormalization) (None, None, None, 64) 192 conv2d_3[0][0]
____________________________________________________________________________________________________________________________________________
activation_3 (Activation) (None, None, None, 64) 0 batch_normalization_3[0][0]
____________________________________________________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, None, None, 64) 0 activation_3[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, None, None, 80) 5120 max_pooling2d_1[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_4 (BatchNormalization) (None, None, None, 80) 240 conv2d_4[0][0]
____________________________________________________________________________________________________________________________________________
activation_4 (Activation) (None, None, None, 80) 0 batch_normalization_4[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, None, None, 192) 138240 activation_4[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_5 (BatchNormalization) (None, None, None, 192) 576 conv2d_5[0][0]
____________________________________________________________________________________________________________________________________________
activation_5 (Activation) (None, None, None, 192) 0 batch_normalization_5[0][0]
____________________________________________________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, None, None, 192) 0 activation_5[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, None, None, 64) 12288 max_pooling2d_2[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_9 (BatchNormalization) (None, None, None, 64) 192 conv2d_9[0][0]
____________________________________________________________________________________________________________________________________________
activation_9 (Activation) (None, None, None, 64) 0 batch_normalization_9[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, None, None, 48) 9216 max_pooling2d_2[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, None, None, 96) 55296 activation_9[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_7 (BatchNormalization) (None, None, None, 48) 144 conv2d_7[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_10 (BatchNormalization) (None, None, None, 96) 288 conv2d_10[0][0]
____________________________________________________________________________________________________________________________________________
activation_7 (Activation) (None, None, None, 48) 0 batch_normalization_7[0][0]
____________________________________________________________________________________________________________________________________________
activation_10 (Activation) (None, None, None, 96) 0 batch_normalization_10[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_1 (AveragePooling2D) (None, None, None, 192) 0 max_pooling2d_2[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, None, None, 64) 12288 max_pooling2d_2[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, None, None, 64) 76800 activation_7[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, None, None, 96) 82944 activation_10[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, None, None, 32) 6144 average_pooling2d_1[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_6 (BatchNormalization) (None, None, None, 64) 192 conv2d_6[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_8 (BatchNormalization) (None, None, None, 64) 192 conv2d_8[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_11 (BatchNormalization) (None, None, None, 96) 288 conv2d_11[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_12 (BatchNormalization) (None, None, None, 32) 96 conv2d_12[0][0]
____________________________________________________________________________________________________________________________________________
activation_6 (Activation) (None, None, None, 64) 0 batch_normalization_6[0][0]
____________________________________________________________________________________________________________________________________________
activation_8 (Activation) (None, None, None, 64) 0 batch_normalization_8[0][0]
____________________________________________________________________________________________________________________________________________
activation_11 (Activation) (None, None, None, 96) 0 batch_normalization_11[0][0]
____________________________________________________________________________________________________________________________________________
activation_12 (Activation) (None, None, None, 32) 0 batch_normalization_12[0][0]
____________________________________________________________________________________________________________________________________________
mixed0 (Concatenate) (None, None, None, 256) 0 activation_6[0][0] activation_8[0][0] activation_11[0][0] activation_12[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, None, None, 64) 16384 mixed0[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_16 (BatchNormalization) (None, None, None, 64) 192 conv2d_16[0][0]
____________________________________________________________________________________________________________________________________________
activation_16 (Activation) (None, None, None, 64) 0 batch_normalization_16[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, None, None, 48) 12288 mixed0[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, None, None, 96) 55296 activation_16[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_14 (BatchNormalization) (None, None, None, 48) 144 conv2d_14[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_17 (BatchNormalization) (None, None, None, 96) 288 conv2d_17[0][0]
____________________________________________________________________________________________________________________________________________
activation_14 (Activation) (None, None, None, 48) 0 batch_normalization_14[0][0]
____________________________________________________________________________________________________________________________________________
activation_17 (Activation) (None, None, None, 96) 0 batch_normalization_17[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_2 (AveragePooling2D) (None, None, None, 256) 0 mixed0[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, None, None, 64) 16384 mixed0[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, None, None, 64) 76800 activation_14[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, None, None, 96) 82944 activation_17[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, None, None, 64) 16384 average_pooling2d_2[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_13 (BatchNormalization) (None, None, None, 64) 192 conv2d_13[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_15 (BatchNormalization) (None, None, None, 64) 192 conv2d_15[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_18 (BatchNormalization) (None, None, None, 96) 288 conv2d_18[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_19 (BatchNormalization) (None, None, None, 64) 192 conv2d_19[0][0]
____________________________________________________________________________________________________________________________________________
activation_13 (Activation) (None, None, None, 64) 0 batch_normalization_13[0][0]
____________________________________________________________________________________________________________________________________________
activation_15 (Activation) (None, None, None, 64) 0 batch_normalization_15[0][0]
____________________________________________________________________________________________________________________________________________
activation_18 (Activation) (None, None, None, 96) 0 batch_normalization_18[0][0]
____________________________________________________________________________________________________________________________________________
activation_19 (Activation) (None, None, None, 64) 0 batch_normalization_19[0][0]
____________________________________________________________________________________________________________________________________________
mixed1 (Concatenate) (None, None, None, 288) 0 activation_13[0][0] activation_15[0][0] activation_18[0][0] activation_19[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, None, None, 64) 18432 mixed1[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_23 (BatchNormalization) (None, None, None, 64) 192 conv2d_23[0][0]
____________________________________________________________________________________________________________________________________________
activation_23 (Activation) (None, None, None, 64) 0 batch_normalization_23[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, None, None, 48) 13824 mixed1[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, None, None, 96) 55296 activation_23[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_21 (BatchNormalization) (None, None, None, 48) 144 conv2d_21[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_24 (BatchNormalization) (None, None, None, 96) 288 conv2d_24[0][0]
____________________________________________________________________________________________________________________________________________
activation_21 (Activation) (None, None, None, 48) 0 batch_normalization_21[0][0]
____________________________________________________________________________________________________________________________________________
activation_24 (Activation) (None, None, None, 96) 0 batch_normalization_24[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_3 (AveragePooling2D) (None, None, None, 288) 0 mixed1[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, None, None, 64) 18432 mixed1[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, None, None, 64) 76800 activation_21[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, None, None, 96) 82944 activation_24[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, None, None, 64) 18432 average_pooling2d_3[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_20 (BatchNormalization) (None, None, None, 64) 192 conv2d_20[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_22 (BatchNormalization) (None, None, None, 64) 192 conv2d_22[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_25 (BatchNormalization) (None, None, None, 96) 288 conv2d_25[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_26 (BatchNormalization) (None, None, None, 64) 192 conv2d_26[0][0]
____________________________________________________________________________________________________________________________________________
activation_20 (Activation) (None, None, None, 64) 0 batch_normalization_20[0][0]
____________________________________________________________________________________________________________________________________________
activation_22 (Activation) (None, None, None, 64) 0 batch_normalization_22[0][0]
____________________________________________________________________________________________________________________________________________
activation_25 (Activation) (None, None, None, 96) 0 batch_normalization_25[0][0]
____________________________________________________________________________________________________________________________________________
activation_26 (Activation) (None, None, None, 64) 0 batch_normalization_26[0][0]
____________________________________________________________________________________________________________________________________________
mixed2 (Concatenate) (None, None, None, 288) 0 activation_20[0][0] activation_22[0][0] activation_25[0][0] activation_26[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, None, None, 64) 18432 mixed2[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_28 (BatchNormalization) (None, None, None, 64) 192 conv2d_28[0][0]
____________________________________________________________________________________________________________________________________________
activation_28 (Activation) (None, None, None, 64) 0 batch_normalization_28[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, None, None, 96) 55296 activation_28[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_29 (BatchNormalization) (None, None, None, 96) 288 conv2d_29[0][0]
____________________________________________________________________________________________________________________________________________
activation_29 (Activation) (None, None, None, 96) 0 batch_normalization_29[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, None, None, 384) 995328 mixed2[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, None, None, 96) 82944 activation_29[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_27 (BatchNormalization) (None, None, None, 384) 1152 conv2d_27[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_30 (BatchNormalization) (None, None, None, 96) 288 conv2d_30[0][0]
____________________________________________________________________________________________________________________________________________
activation_27 (Activation) (None, None, None, 384) 0 batch_normalization_27[0][0]
____________________________________________________________________________________________________________________________________________
activation_30 (Activation) (None, None, None, 96) 0 batch_normalization_30[0][0]
____________________________________________________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D) (None, None, None, 288) 0 mixed2[0][0]
____________________________________________________________________________________________________________________________________________
mixed3 (Concatenate) (None, None, None, 768) 0 activation_27[0][0] activation_30[0][0] max_pooling2d_3[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, None, None, 128) 98304 mixed3[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_35 (BatchNormalization) (None, None, None, 128) 384 conv2d_35[0][0]
____________________________________________________________________________________________________________________________________________
activation_35 (Activation) (None, None, None, 128) 0 batch_normalization_35[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, None, None, 128) 114688 activation_35[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_36 (BatchNormalization) (None, None, None, 128) 384 conv2d_36[0][0]
____________________________________________________________________________________________________________________________________________
activation_36 (Activation) (None, None, None, 128) 0 batch_normalization_36[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, None, None, 128) 98304 mixed3[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, None, None, 128) 114688 activation_36[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_32 (BatchNormalization) (None, None, None, 128) 384 conv2d_32[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_37 (BatchNormalization) (None, None, None, 128) 384 conv2d_37[0][0]
____________________________________________________________________________________________________________________________________________
activation_32 (Activation) (None, None, None, 128) 0 batch_normalization_32[0][0]
____________________________________________________________________________________________________________________________________________
activation_37 (Activation) (None, None, None, 128) 0 batch_normalization_37[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, None, None, 128) 114688 activation_32[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, None, None, 128) 114688 activation_37[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_33 (BatchNormalization) (None, None, None, 128) 384 conv2d_33[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_38 (BatchNormalization) (None, None, None, 128) 384 conv2d_38[0][0]
____________________________________________________________________________________________________________________________________________
activation_33 (Activation) (None, None, None, 128) 0 batch_normalization_33[0][0]
____________________________________________________________________________________________________________________________________________
activation_38 (Activation) (None, None, None, 128) 0 batch_normalization_38[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_4 (AveragePooling2D) (None, None, None, 768) 0 mixed3[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, None, None, 192) 147456 mixed3[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, None, None, 192) 172032 activation_33[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, None, None, 192) 172032 activation_38[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, None, None, 192) 147456 average_pooling2d_4[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_31 (BatchNormalization) (None, None, None, 192) 576 conv2d_31[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_34 (BatchNormalization) (None, None, None, 192) 576 conv2d_34[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_39 (BatchNormalization) (None, None, None, 192) 576 conv2d_39[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_40 (BatchNormalization) (None, None, None, 192) 576 conv2d_40[0][0]
____________________________________________________________________________________________________________________________________________
activation_31 (Activation) (None, None, None, 192) 0 batch_normalization_31[0][0]
____________________________________________________________________________________________________________________________________________
activation_34 (Activation) (None, None, None, 192) 0 batch_normalization_34[0][0]
____________________________________________________________________________________________________________________________________________
activation_39 (Activation) (None, None, None, 192) 0 batch_normalization_39[0][0]
____________________________________________________________________________________________________________________________________________
activation_40 (Activation) (None, None, None, 192) 0 batch_normalization_40[0][0]
____________________________________________________________________________________________________________________________________________
mixed4 (Concatenate) (None, None, None, 768) 0 activation_31[0][0] activation_34[0][0] activation_39[0][0] activation_40[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_45 (Conv2D) (None, None, None, 160) 122880 mixed4[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_45 (BatchNormalization) (None, None, None, 160) 480 conv2d_45[0][0]
____________________________________________________________________________________________________________________________________________
activation_45 (Activation) (None, None, None, 160) 0 batch_normalization_45[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_46 (Conv2D) (None, None, None, 160) 179200 activation_45[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_46 (BatchNormalization) (None, None, None, 160) 480 conv2d_46[0][0]
____________________________________________________________________________________________________________________________________________
activation_46 (Activation) (None, None, None, 160) 0 batch_normalization_46[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, None, None, 160) 122880 mixed4[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_47 (Conv2D) (None, None, None, 160) 179200 activation_46[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_42 (BatchNormalization) (None, None, None, 160) 480 conv2d_42[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_47 (BatchNormalization) (None, None, None, 160) 480 conv2d_47[0][0]
____________________________________________________________________________________________________________________________________________
activation_42 (Activation) (None, None, None, 160) 0 batch_normalization_42[0][0]
____________________________________________________________________________________________________________________________________________
activation_47 (Activation) (None, None, None, 160) 0 batch_normalization_47[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, None, None, 160) 179200 activation_42[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_48 (Conv2D) (None, None, None, 160) 179200 activation_47[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_43 (BatchNormalization) (None, None, None, 160) 480 conv2d_43[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_48 (BatchNormalization) (None, None, None, 160) 480 conv2d_48[0][0]
____________________________________________________________________________________________________________________________________________
activation_43 (Activation) (None, None, None, 160) 0 batch_normalization_43[0][0]
____________________________________________________________________________________________________________________________________________
activation_48 (Activation) (None, None, None, 160) 0 batch_normalization_48[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_5 (AveragePooling2D) (None, None, None, 768) 0 mixed4[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, None, None, 192) 147456 mixed4[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_44 (Conv2D) (None, None, None, 192) 215040 activation_43[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_49 (Conv2D) (None, None, None, 192) 215040 activation_48[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_50 (Conv2D) (None, None, None, 192) 147456 average_pooling2d_5[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_41 (BatchNormalization) (None, None, None, 192) 576 conv2d_41[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_44 (BatchNormalization) (None, None, None, 192) 576 conv2d_44[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_49 (BatchNormalization) (None, None, None, 192) 576 conv2d_49[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_50 (BatchNormalization) (None, None, None, 192) 576 conv2d_50[0][0]
____________________________________________________________________________________________________________________________________________
activation_41 (Activation) (None, None, None, 192) 0 batch_normalization_41[0][0]
____________________________________________________________________________________________________________________________________________
activation_44 (Activation) (None, None, None, 192) 0 batch_normalization_44[0][0]
____________________________________________________________________________________________________________________________________________
activation_49 (Activation) (None, None, None, 192) 0 batch_normalization_49[0][0]
____________________________________________________________________________________________________________________________________________
activation_50 (Activation) (None, None, None, 192) 0 batch_normalization_50[0][0]
____________________________________________________________________________________________________________________________________________
mixed5 (Concatenate) (None, None, None, 768) 0 activation_41[0][0] activation_44[0][0] activation_49[0][0] activation_50[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, None, None, 160) 122880 mixed5[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_55 (BatchNormalization) (None, None, None, 160) 480 conv2d_55[0][0]
____________________________________________________________________________________________________________________________________________
activation_55 (Activation) (None, None, None, 160) 0 batch_normalization_55[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, None, None, 160) 179200 activation_55[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_56 (BatchNormalization) (None, None, None, 160) 480 conv2d_56[0][0]
____________________________________________________________________________________________________________________________________________
activation_56 (Activation) (None, None, None, 160) 0 batch_normalization_56[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_52 (Conv2D) (None, None, None, 160) 122880 mixed5[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, None, None, 160) 179200 activation_56[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_52 (BatchNormalization) (None, None, None, 160) 480 conv2d_52[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_57 (BatchNormalization) (None, None, None, 160) 480 conv2d_57[0][0]
____________________________________________________________________________________________________________________________________________
activation_52 (Activation) (None, None, None, 160) 0 batch_normalization_52[0][0]
____________________________________________________________________________________________________________________________________________
activation_57 (Activation) (None, None, None, 160) 0 batch_normalization_57[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_53 (Conv2D) (None, None, None, 160) 179200 activation_52[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, None, None, 160) 179200 activation_57[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_53 (BatchNormalization) (None, None, None, 160) 480 conv2d_53[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_58 (BatchNormalization) (None, None, None, 160) 480 conv2d_58[0][0]
____________________________________________________________________________________________________________________________________________
activation_53 (Activation) (None, None, None, 160) 0 batch_normalization_53[0][0]
____________________________________________________________________________________________________________________________________________
activation_58 (Activation) (None, None, None, 160) 0 batch_normalization_58[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_6 (AveragePooling2D) (None, None, None, 768) 0 mixed5[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_51 (Conv2D) (None, None, None, 192) 147456 mixed5[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, None, None, 192) 215040 activation_53[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, None, None, 192) 215040 activation_58[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, None, None, 192) 147456 average_pooling2d_6[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_51 (BatchNormalization) (None, None, None, 192) 576 conv2d_51[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_54 (BatchNormalization) (None, None, None, 192) 576 conv2d_54[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_59 (BatchNormalization) (None, None, None, 192) 576 conv2d_59[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_60 (BatchNormalization) (None, None, None, 192) 576 conv2d_60[0][0]
____________________________________________________________________________________________________________________________________________
activation_51 (Activation) (None, None, None, 192) 0 batch_normalization_51[0][0]
____________________________________________________________________________________________________________________________________________
activation_54 (Activation) (None, None, None, 192) 0 batch_normalization_54[0][0]
____________________________________________________________________________________________________________________________________________
activation_59 (Activation) (None, None, None, 192) 0 batch_normalization_59[0][0]
____________________________________________________________________________________________________________________________________________
activation_60 (Activation) (None, None, None, 192) 0 batch_normalization_60[0][0]
____________________________________________________________________________________________________________________________________________
mixed6 (Concatenate) (None, None, None, 768) 0 activation_51[0][0] activation_54[0][0] activation_59[0][0] activation_60[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, None, None, 192) 147456 mixed6[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_65 (BatchNormalization) (None, None, None, 192) 576 conv2d_65[0][0]
____________________________________________________________________________________________________________________________________________
activation_65 (Activation) (None, None, None, 192) 0 batch_normalization_65[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_66 (Conv2D) (None, None, None, 192) 258048 activation_65[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_66 (BatchNormalization) (None, None, None, 192) 576 conv2d_66[0][0]
____________________________________________________________________________________________________________________________________________
activation_66 (Activation) (None, None, None, 192) 0 batch_normalization_66[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, None, None, 192) 147456 mixed6[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_67 (Conv2D) (None, None, None, 192) 258048 activation_66[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_62 (BatchNormalization) (None, None, None, 192) 576 conv2d_62[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_67 (BatchNormalization) (None, None, None, 192) 576 conv2d_67[0][0]
____________________________________________________________________________________________________________________________________________
activation_62 (Activation) (None, None, None, 192) 0 batch_normalization_62[0][0]
____________________________________________________________________________________________________________________________________________
activation_67 (Activation) (None, None, None, 192) 0 batch_normalization_67[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, None, None, 192) 258048 activation_62[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_68 (Conv2D) (None, None, None, 192) 258048 activation_67[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_63 (BatchNormalization) (None, None, None, 192) 576 conv2d_63[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_68 (BatchNormalization) (None, None, None, 192) 576 conv2d_68[0][0]
____________________________________________________________________________________________________________________________________________
activation_63 (Activation) (None, None, None, 192) 0 batch_normalization_63[0][0]
____________________________________________________________________________________________________________________________________________
activation_68 (Activation) (None, None, None, 192) 0 batch_normalization_68[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_7 (AveragePooling2D) (None, None, None, 768) 0 mixed6[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, None, None, 192) 147456 mixed6[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, None, None, 192) 258048 activation_63[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_69 (Conv2D) (None, None, None, 192) 258048 activation_68[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_70 (Conv2D) (None, None, None, 192) 147456 average_pooling2d_7[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_61 (BatchNormalization) (None, None, None, 192) 576 conv2d_61[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_64 (BatchNormalization) (None, None, None, 192) 576 conv2d_64[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_69 (BatchNormalization) (None, None, None, 192) 576 conv2d_69[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_70 (BatchNormalization) (None, None, None, 192) 576 conv2d_70[0][0]
____________________________________________________________________________________________________________________________________________
activation_61 (Activation) (None, None, None, 192) 0 batch_normalization_61[0][0]
____________________________________________________________________________________________________________________________________________
activation_64 (Activation) (None, None, None, 192) 0 batch_normalization_64[0][0]
____________________________________________________________________________________________________________________________________________
activation_69 (Activation) (None, None, None, 192) 0 batch_normalization_69[0][0]
____________________________________________________________________________________________________________________________________________
activation_70 (Activation) (None, None, None, 192) 0 batch_normalization_70[0][0]
____________________________________________________________________________________________________________________________________________
mixed7 (Concatenate) (None, None, None, 768) 0 activation_61[0][0] activation_64[0][0] activation_69[0][0] activation_70[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_73 (Conv2D) (None, None, None, 192) 147456 mixed7[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_73 (BatchNormalization) (None, None, None, 192) 576 conv2d_73[0][0]
____________________________________________________________________________________________________________________________________________
activation_73 (Activation) (None, None, None, 192) 0 batch_normalization_73[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_74 (Conv2D) (None, None, None, 192) 258048 activation_73[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_74 (BatchNormalization) (None, None, None, 192) 576 conv2d_74[0][0]
____________________________________________________________________________________________________________________________________________
activation_74 (Activation) (None, None, None, 192) 0 batch_normalization_74[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_71 (Conv2D) (None, None, None, 192) 147456 mixed7[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_75 (Conv2D) (None, None, None, 192) 258048 activation_74[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_71 (BatchNormalization) (None, None, None, 192) 576 conv2d_71[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_75 (BatchNormalization) (None, None, None, 192) 576 conv2d_75[0][0]
____________________________________________________________________________________________________________________________________________
activation_71 (Activation) (None, None, None, 192) 0 batch_normalization_71[0][0]
____________________________________________________________________________________________________________________________________________
activation_75 (Activation) (None, None, None, 192) 0 batch_normalization_75[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_72 (Conv2D) (None, None, None, 320) 552960 activation_71[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_76 (Conv2D) (None, None, None, 192) 331776 activation_75[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_72 (BatchNormalization) (None, None, None, 320) 960 conv2d_72[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_76 (BatchNormalization) (None, None, None, 192) 576 conv2d_76[0][0]
____________________________________________________________________________________________________________________________________________
activation_72 (Activation) (None, None, None, 320) 0 batch_normalization_72[0][0]
____________________________________________________________________________________________________________________________________________
activation_76 (Activation) (None, None, None, 192) 0 batch_normalization_76[0][0]
____________________________________________________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, None, None, 768) 0 mixed7[0][0]
____________________________________________________________________________________________________________________________________________
mixed8 (Concatenate) (None, None, None, 1280) 0 activation_72[0][0] activation_76[0][0] max_pooling2d_4[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_81 (Conv2D) (None, None, None, 448) 573440 mixed8[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_81 (BatchNormalization) (None, None, None, 448) 1344 conv2d_81[0][0]
____________________________________________________________________________________________________________________________________________
activation_81 (Activation) (None, None, None, 448) 0 batch_normalization_81[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_78 (Conv2D) (None, None, None, 384) 491520 mixed8[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_82 (Conv2D) (None, None, None, 384) 1548288 activation_81[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_78 (BatchNormalization) (None, None, None, 384) 1152 conv2d_78[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_82 (BatchNormalization) (None, None, None, 384) 1152 conv2d_82[0][0]
____________________________________________________________________________________________________________________________________________
activation_78 (Activation) (None, None, None, 384) 0 batch_normalization_78[0][0]
____________________________________________________________________________________________________________________________________________
activation_82 (Activation) (None, None, None, 384) 0 batch_normalization_82[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_79 (Conv2D) (None, None, None, 384) 442368 activation_78[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_80 (Conv2D) (None, None, None, 384) 442368 activation_78[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_83 (Conv2D) (None, None, None, 384) 442368 activation_82[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_84 (Conv2D) (None, None, None, 384) 442368 activation_82[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_8 (AveragePooling2D) (None, None, None, 1280) 0 mixed8[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_77 (Conv2D) (None, None, None, 320) 409600 mixed8[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_79 (BatchNormalization) (None, None, None, 384) 1152 conv2d_79[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_80 (BatchNormalization) (None, None, None, 384) 1152 conv2d_80[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_83 (BatchNormalization) (None, None, None, 384) 1152 conv2d_83[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_84 (BatchNormalization) (None, None, None, 384) 1152 conv2d_84[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_85 (Conv2D) (None, None, None, 192) 245760 average_pooling2d_8[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_77 (BatchNormalization) (None, None, None, 320) 960 conv2d_77[0][0]
____________________________________________________________________________________________________________________________________________
activation_79 (Activation) (None, None, None, 384) 0 batch_normalization_79[0][0]
____________________________________________________________________________________________________________________________________________
activation_80 (Activation) (None, None, None, 384) 0 batch_normalization_80[0][0]
____________________________________________________________________________________________________________________________________________
activation_83 (Activation) (None, None, None, 384) 0 batch_normalization_83[0][0]
____________________________________________________________________________________________________________________________________________
activation_84 (Activation) (None, None, None, 384) 0 batch_normalization_84[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_85 (BatchNormalization) (None, None, None, 192) 576 conv2d_85[0][0]
____________________________________________________________________________________________________________________________________________
activation_77 (Activation) (None, None, None, 320) 0 batch_normalization_77[0][0]
____________________________________________________________________________________________________________________________________________
mixed9_0 (Concatenate) (None, None, None, 768) 0 activation_79[0][0] activation_80[0][0]
____________________________________________________________________________________________________________________________________________
concatenate_1 (Concatenate) (None, None, None, 768) 0 activation_83[0][0] activation_84[0][0]
____________________________________________________________________________________________________________________________________________
activation_85 (Activation) (None, None, None, 192) 0 batch_normalization_85[0][0]
____________________________________________________________________________________________________________________________________________
mixed9 (Concatenate) (None, None, None, 2048) 0 activation_77[0][0] mixed9_0[0][0] concatenate_1[0][0] activation_85[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_90 (Conv2D) (None, None, None, 448) 917504 mixed9[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_90 (BatchNormalization) (None, None, None, 448) 1344 conv2d_90[0][0]
____________________________________________________________________________________________________________________________________________
activation_90 (Activation) (None, None, None, 448) 0 batch_normalization_90[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_87 (Conv2D) (None, None, None, 384) 786432 mixed9[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_91 (Conv2D) (None, None, None, 384) 1548288 activation_90[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_87 (BatchNormalization) (None, None, None, 384) 1152 conv2d_87[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_91 (BatchNormalization) (None, None, None, 384) 1152 conv2d_91[0][0]
____________________________________________________________________________________________________________________________________________
activation_87 (Activation) (None, None, None, 384) 0 batch_normalization_87[0][0]
____________________________________________________________________________________________________________________________________________
activation_91 (Activation) (None, None, None, 384) 0 batch_normalization_91[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_88 (Conv2D) (None, None, None, 384) 442368 activation_87[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_89 (Conv2D) (None, None, None, 384) 442368 activation_87[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_92 (Conv2D) (None, None, None, 384) 442368 activation_91[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_93 (Conv2D) (None, None, None, 384) 442368 activation_91[0][0]
____________________________________________________________________________________________________________________________________________
average_pooling2d_9 (AveragePooling2D) (None, None, None, 2048) 0 mixed9[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_86 (Conv2D) (None, None, None, 320) 655360 mixed9[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_88 (BatchNormalization) (None, None, None, 384) 1152 conv2d_88[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_89 (BatchNormalization) (None, None, None, 384) 1152 conv2d_89[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_92 (BatchNormalization) (None, None, None, 384) 1152 conv2d_92[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_93 (BatchNormalization) (None, None, None, 384) 1152 conv2d_93[0][0]
____________________________________________________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, None, None, 192) 393216 average_pooling2d_9[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_86 (BatchNormalization) (None, None, None, 320) 960 conv2d_86[0][0]
____________________________________________________________________________________________________________________________________________
activation_88 (Activation) (None, None, None, 384) 0 batch_normalization_88[0][0]
____________________________________________________________________________________________________________________________________________
activation_89 (Activation) (None, None, None, 384) 0 batch_normalization_89[0][0]
____________________________________________________________________________________________________________________________________________
activation_92 (Activation) (None, None, None, 384) 0 batch_normalization_92[0][0]
____________________________________________________________________________________________________________________________________________
activation_93 (Activation) (None, None, None, 384) 0 batch_normalization_93[0][0]
____________________________________________________________________________________________________________________________________________
batch_normalization_94 (BatchNormalization) (None, None, None, 192) 576 conv2d_94[0][0]
____________________________________________________________________________________________________________________________________________
activation_86 (Activation) (None, None, None, 320) 0 batch_normalization_86[0][0]
____________________________________________________________________________________________________________________________________________
mixed9_1 (Concatenate) (None, None, None, 768) 0 activation_88[0][0] activation_89[0][0]
____________________________________________________________________________________________________________________________________________
concatenate_2 (Concatenate) (None, None, None, 768) 0 activation_92[0][0] activation_93[0][0]
____________________________________________________________________________________________________________________________________________
activation_94 (Activation) (None, None, None, 192) 0 batch_normalization_94[0][0]
____________________________________________________________________________________________________________________________________________
mixed10 (Concatenate) (None, None, None, 2048) 0 activation_86[0][0] mixed9_1[0][0] concatenate_2[0][0] activation_94[0][0]
============================================================================================================================================
Total params: 21,802,784
Trainable params: 21,768,352
Non-trainable params: 34,432
____________________________________________________________________________________________________________________________________________
Let's start reading from this line of code:
img = preprocess_image(base_image_path)
This just reads an image in the usual way; the only change is that an extra dimension is added at the front so the image can be processed as a batch of one. Nothing else happens here.
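As a minimal sketch of what that means for the array shape (the file name here is hypothetical, and this just spells out what preprocess_image effectively does):
from keras.preprocessing.image import load_img, img_to_array
from keras.applications import inception_v3
import numpy as np

img = img_to_array(load_img('dog.png'))    # hypothetical 400x500 image -> (400, 500, 3)
img = np.expand_dims(img, axis=0)          # add the batch dimension -> (1, 400, 500, 3)
img = inception_v3.preprocess_input(img)   # scale pixel values into the [-1, 1] range
print(img.shape)                           # (1, 400, 500, 3)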
Next comes the computation of successive_shapes, which simply derives two smaller sizes: one at 1/1.4 of the original image size and one at 1/(1.4**2).
For example, my original image is 400*500, so it computes 285*357 and then 204*255, i.e.
successive_shapes [(204, 255), (285, 357), (400, 500)]
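For reference, this is roughly how the example derives those sizes (a sketch based on the original deep_dream.py; num_octave = 3 and octave_scale = 1.4 are the script's hyperparameters):
original_shape = img.shape[1:3]                  # e.g. (400, 500)
successive_shapes = [original_shape]
for i in range(1, num_octave):
    shape = tuple([int(dim / (octave_scale ** i)) for dim in original_shape])
    successive_shapes.append(shape)
successive_shapes = successive_shapes[::-1]      # smallest scale first
print(successive_shapes)                         # [(204, 255), (285, 357), (400, 500)]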
The code that follows can then be boiled down to:
for shape in successive_shapes:
    print('Processing image shape', shape)
    img = resize_img(img, shape)
    img = gradient_ascent(img,
                          iterations=iterations,
                          step=step,
                          max_loss=max_loss)
The code we are leaving out computes a lost_detail term, which slightly compensates for the small amount of detail lost when the image is scaled up and down. The effect still appears without it, the result is just a little blurrier, so we will not worry about it for now (the full loop, including that compensation, is sketched below).
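For reference, the full loop in the original example looks roughly like this (reconstructed from the example, so treat it as a sketch rather than a verbatim copy):
original_img = np.copy(img)
shrunk_original_img = resize_img(img, successive_shapes[0])

for shape in successive_shapes:
    print('Processing image shape', shape)
    img = resize_img(img, shape)
    img = gradient_ascent(img,
                          iterations=iterations,
                          step=step,
                          max_loss=max_loss)
    # Compare a clean original at this size with a version that was shrunk and
    # re-enlarged; the difference is the detail lost to resizing, so add it back.
    upscaled_shrunk_original_img = resize_img(shrunk_original_img, shape)
    same_size_original = resize_img(original_img, shape)
    lost_detail = same_size_original - upscaled_shrunk_original_img
    img += lost_detail
    shrunk_original_img = resize_img(original_img, shape)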
This loop first resizes the image down to the smallest size, (204, 255) in my case, then calls gradient_ascent; gradient_ascent calls eval_loss_and_grads, which in turn calls fetch_loss_and_grads. In short, it computes a loss value and the gradient of that loss with respect to the input image.
The loss is built from the outputs of the mixed2, mixed3, mixed4 and mixed5 layers of inception_v3: for each layer, the output (with a 2-pixel border trimmed off on all sides) is squared and summed, divided by the total number of elements, and multiplied by that layer's coefficient; the per-layer terms are then added together. The loss is therefore always positive.
The gradient is then normalized (divided by the mean of its absolute values, kept away from zero with K.epsilon()), multiplied by the step size 0.01, and added onto the image; that is how the original image gets modified (see the sketch below).
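In the original example, the loss and gradient setup looks roughly like this (sketched from the example and assuming channels_last data format; settings, layer_dict and dream are the names the script defines):
# Define the loss as a weighted sum of layer activations.
loss = K.variable(0.)
for layer_name in settings['features']:
    coeff = settings['features'][layer_name]
    x = layer_dict[layer_name].output
    # Only non-border pixels contribute, to avoid border artifacts.
    scaling = K.prod(K.cast(K.shape(x), 'float32'))
    loss = loss + coeff * K.sum(K.square(x[:, 2: -2, 2: -2, :])) / scaling

# Gradient of the loss with respect to the input image, normalized by its
# mean absolute value so the update size stays stable.
grads = K.gradients(loss, dream)[0]
grads /= K.maximum(K.mean(K.abs(grads)), K.epsilon())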
Now think of a convex (or concave) function: if you take the derivative of y with respect to x at some point and add it back onto x, x moves in the direction that makes y larger.
Here x is the image and y is the loss, so the loss keeps getting bigger; this is plain gradient ascent.
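A toy illustration of that idea, unrelated to the network itself:
# Gradient ascent on y = x**2: adding the derivative back onto x makes y grow.
x = 3.0
step = 0.1
for i in range(5):
    grad = 2 * x           # dy/dx at the current x
    x += step * grad       # move x in the direction that increases y
    print(i, x, x ** 2)    # y keeps increasing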
The for loop then repeats this: after resizing to the next scale, several more rounds of gradient ascent are run.
That is the basic logic of the code. After reading it, you may still wonder: what is the point of all this?
Honestly, it is hard to see much meaning here, mainly because the loss is defined in a fairly arbitrary way.
But if we instead define the loss as a "how dog-like is this image" score and keep iterating in the same way, then even if the input is a fish, we can gradually modify the original image into a fish that the network considers dog-like. It is worth trying when you get the chance.
I made a small modification along those lines: an image originally recognized as a dog gets modified until it is recognized as a cat. The picture barely changes and still looks like the original, but where it was previously predicted as class 207 (golden retriever), after the modification it is predicted as class 283 (Persian cat):
from __future__ import print_function

from keras.preprocessing.image import load_img, save_img, img_to_array
import numpy as np
import scipy
import argparse

from keras.applications import inception_v3
from keras import backend as K

parser = argparse.ArgumentParser(description='Deep Dreams with Keras.')
parser.add_argument('base_image_path', metavar='base', type=str,
                    help='Path to the image to transform.')
parser.add_argument('result_prefix', metavar='res_prefix', type=str,
                    help='Prefix for the saved results.')

args = parser.parse_args()
base_image_path = args.base_image_path
result_prefix = args.result_prefix

# These are the names of the layers
# for which we try to maximize activation,
# as well as their weight in the final loss
# we try to maximize.
# You can tweak these setting to obtain new visual effects.
settings = {'features': {'mixed2': 0.2,
                         'mixed3': 0.5,
                         'mixed4': 2.,
                         'mixed5': 1.5},
}


def preprocess_image(image_path):
    # Util function to open, resize and format pictures
    # into appropriate tensors.
    img = load_img(image_path)
    img = img_to_array(img)
    img = np.expand_dims(img, axis=0)
    img = inception_v3.preprocess_input(img)
    return img


def deprocess_image(x):
    # Util function to convert a tensor into a valid image.
    if K.image_data_format() == 'channels_first':
        x = x.reshape((3, x.shape[2], x.shape[3]))
        x = x.transpose((1, 2, 0))
    else:
        x = x.reshape((x.shape[1], x.shape[2], 3))
    x /= 2.
    x += 0.5
    x *= 255.
    x = np.clip(x, 0, 255).astype('uint8')
    return x


K.set_learning_phase(0)

# Build the InceptionV3 network with our placeholder.
# The model will be loaded with pre-trained ImageNet weights.
model = inception_v3.InceptionV3()
dream = model.input
print('Model loaded.')

# Get the symbolic outputs of each "key" layer (we gave them unique names).
layer_dict = dict([(layer.name, layer) for layer in model.layers])

# Define the loss.
loss = layer_dict['predictions'].output[0][283]

# Compute the gradients of the dream wrt the loss.
grads = K.gradients(loss, dream)[0]
# Normalize gradients.
grads /= K.maximum(K.mean(K.abs(grads)), K.epsilon())

# Set up function to retrieve the value
# of the loss and gradients given an input image.
outputs = [loss, grads]
fetch_loss_and_grads = K.function([dream], outputs)


def eval_loss_and_grads(x):
    outs = fetch_loss_and_grads([x])
    loss_value = outs[0]
    grad_values = outs[1]
    return loss_value, grad_values


def resize_img(img, size):
    img = np.copy(img)
    if K.image_data_format() == 'channels_first':
        factors = (1, 1,
                   float(size[0]) / img.shape[2],
                   float(size[1]) / img.shape[3])
    else:
        factors = (1,
                   float(size[0]) / img.shape[1],
                   float(size[1]) / img.shape[2],
                   1)
    return scipy.ndimage.zoom(img, factors, order=1)


def gradient_ascent(x, iterations, step, max_loss=None):
    for i in range(iterations):
        loss_value, grad_values = eval_loss_and_grads(x)
        if max_loss is not None and loss_value > max_loss:
            break
        print('..Loss value at', i, ':', loss_value)
        x += step * grad_values
    return x


# Playing with these hyperparameters will also allow you to achieve new effects
step = 0.0001  # Gradient ascent step size
num_octave = 3  # Number of scales at which to run gradient ascent
octave_scale = 1.4  # Size ratio between scales
iterations = 2000  # Number of ascent steps per scale
max_loss = 0.9

img = preprocess_image(base_image_path)
shape = (299, 299, 3)
print('Processing image shape', shape)
img = resize_img(img, shape)
img = gradient_ascent(img,
                      iterations=iterations,
                      step=step,
                      max_loss=max_loss)
save_img(result_prefix + '.png', deprocess_image(np.copy(img)))
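Compared with the stock deep_dream.py, the key changes are: the model is loaded with its classification head so that the 'predictions' layer exists; the loss is now the softmax output for a single class, layer_dict['predictions'].output[0][283] (Persian cat), rather than a weighted sum of intermediate-layer activations; and gradient ascent runs with a much smaller step (0.0001) over many more iterations (2000) at a single 299x299 scale, so the pixel changes stay subtle while the predicted class flips.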
The modified image is: (image omitted)
If you run prediction on it with the default inception_v3 settings, it is predicted as a Persian cat.
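To check the result, something along these lines should work (a sketch; it assumes the saved image is the 299x299 output from the script above and reuses model and preprocess_image from it):
from keras.applications.inception_v3 import decode_predictions

preds = model.predict(preprocess_image(result_prefix + '.png'))
print(np.argmax(preds[0]))               # expected to be 283 instead of 207
print(decode_predictions(preds, top=3))  # human-readable class names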
——————————————————————
Master index
Analysis of the keras example files