

Deep Learning: Implementing the Xception Network with TensorFlow 2.0

Published: 2023/12/15 · 豆豆

1. Introduction to the Xception Network

Xception is a lightweight network proposed in 2017 that balances accuracy against parameter count. The name is short for "Extreme Inception."

2. Key Innovation

The core idea is an operation similar to a depthwise separable convolution (Depthwise Separable Convolution).

Why only "similar to" a depthwise separable convolution?

1) A standard depthwise separable convolution first applies a depthwise (per-channel) spatial convolution, then a 1x1 pointwise convolution.

2) Xception's variant, the "extreme Inception" module, reverses the order: the 1x1 pointwise convolution comes first, followed by the depthwise spatial convolution.

So Xception's depthwise separable convolution runs its two steps in the opposite order from the traditional one. The paper's author notes, however, that the two orderings perform almost identically, so in practice (including the implementation below) the traditional depthwise separable convolution is used.
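Either ordering yields the same parameter savings, which is the real point of the factorization. A minimal pure-Python sketch of the parameter counts (bias terms ignored; the 3x3, 728-channel configuration is chosen to match the middle-flow blocks in the implementation below):

```python
def standard_conv_params(k, c_in, c_out):
    # A regular k x k convolution mixes space and channels in one step.
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    # Depthwise k x k (one filter per input channel) + 1x1 pointwise mixing.
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 728, 728
std = standard_conv_params(k, c_in, c_out)        # 4,769,856
sep = depthwise_separable_params(k, c_in, c_out)  # 536,536
print(std, sep, round(std / sep, 1))              # roughly 8.9x fewer parameters
```

For a 3x3 kernel the separable form costs close to 1/9 of the standard convolution once the channel count is large, which is why Xception can stack so many of these blocks cheaply.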

3. Network Architecture

4. Implementation

from tensorflow.keras import layers, Model
from tensorflow.keras.layers import (Input, Conv2D, BatchNormalization,
                                     Activation, SeparableConv2D,
                                     MaxPooling2D, GlobalAveragePooling2D,
                                     Dense)

def Xception(nb_class, input_shape):
    input_ten = Input(shape=input_shape)

    # block 1 (entry flow): 299,299,3 -> 147,147,64
    x = Conv2D(32, (3, 3), strides=(2, 2), use_bias=False)(input_ten)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = Conv2D(64, (3, 3), use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    # block 2: 147,147,64 -> 74,74,128
    residual = Conv2D(128, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = SeparableConv2D(128, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(128, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # block 3: 74,74,128 -> 37,37,256
    residual = Conv2D(256, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = Activation('relu')(x)
    x = SeparableConv2D(256, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(256, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # block 4: 37,37,256 -> 19,19,728
    residual = Conv2D(728, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = Activation('relu')(x)
    x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # blocks 5-12 (middle flow, repeated 8 times): 19,19,728 -> 19,19,728
    for i in range(8):
        residual = x
        x = Activation('relu')(x)
        x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
        x = BatchNormalization()(x)
        x = layers.add([x, residual])

    # block 13 (exit flow): 19,19,728 -> 10,10,1024
    residual = Conv2D(1024, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = Activation('relu')(x)
    x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(1024, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # block 14: 10,10,1024 -> 10,10,2048
    x = SeparableConv2D(1536, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(2048, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = GlobalAveragePooling2D()(x)
    output_ten = Dense(nb_class, activation='softmax')(x)
    model = Model(input_ten, output_ten)
    return model

img_height, img_width = 299, 299  # standard Xception input size
model_xception = Xception(24, (img_height, img_width, 3))
model_xception.summary()
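The feature-map sizes noted in the block comments can be verified without building the model, using the standard convolution output-size formula. A small sketch (the `conv_out` helper is introduced here for illustration; 'valid' and 'same' follow Keras semantics):

```python
import math

def conv_out(size, kernel, stride=1, padding='valid'):
    # Keras: 'same' -> ceil(size / stride); 'valid' -> floor((size - kernel) / stride) + 1
    if padding == 'same':
        return math.ceil(size / stride)
    return (size - kernel) // stride + 1

s = 299
s = conv_out(s, 3, stride=2)                   # block 1, first conv:  149
s = conv_out(s, 3)                             # block 1, second conv: 147
s = conv_out(s, 3, stride=2, padding='same')   # block 2 pooling:      74
s = conv_out(s, 3, stride=2, padding='same')   # block 3 pooling:      37
s = conv_out(s, 3, stride=2, padding='same')   # block 4 pooling:      19
s = conv_out(s, 3, stride=2, padding='same')   # block 13 pooling:     10
print(s)  # 10
```

The 'same'-padded separable convolutions inside each block keep the spatial size unchanged, so only the strided layers above affect it; the final 10x10x2048 map is then collapsed by global average pooling.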


The trainable parameter count is modest; this is a classic lightweight network.

Summary

That concludes this walkthrough of implementing the Xception network with TensorFlow 2.0; hopefully it helps you solve the problem you ran into.
