

TF RNN: a worked example of using scope.reuse_variables() to tell TensorFlow to reuse an RNN's parameters


Contents

Output

Code design

Output

To be updated later…
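The author did not post the run output. As a rough expectation only (assuming TensorFlow 1.x and the code below exactly as written), the print(Wi.name) call inside _built_RNN should print the same variable name once per RNN build, confirming that the test RNN reuses the training RNN's parameters:

rnn/input_layer/weights:0
rnn/input_layer/weights:0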


Code design

The idea: the commented-out "wrong" method builds the training and test RNNs in two separate variable scopes, which creates two independent sets of parameters. The "right" method builds both RNNs inside a single variable_scope and calls scope.reuse_variables() after the first build, so the test RNN fetches exactly the variables the training RNN created.

from __future__ import print_function

import tensorflow as tf

# 22 scope (name_scope/variable_scope)

class TrainConfig:
    batch_size = 20
    time_steps = 20
    input_size = 10
    output_size = 2
    cell_size = 11
    learning_rate = 0.01


class TestConfig(TrainConfig):
    time_steps = 1


class RNN(object):
    def __init__(self, config):
        self._batch_size = config.batch_size
        self._time_steps = config.time_steps
        self._input_size = config.input_size
        self._output_size = config.output_size
        self._cell_size = config.cell_size
        self._lr = config.learning_rate
        self._built_RNN()

    def _built_RNN(self):
        with tf.variable_scope('inputs'):
            self._xs = tf.placeholder(tf.float32, [self._batch_size, self._time_steps, self._input_size], name='xs')
            self._ys = tf.placeholder(tf.float32, [self._batch_size, self._time_steps, self._output_size], name='ys')
        with tf.name_scope('RNN'):
            with tf.variable_scope('input_layer'):
                # reshape to 2D so one weight matrix applies to every time step
                l_in_x = tf.reshape(self._xs, [-1, self._input_size], name='2_2D')  # (batch * n_steps, in_size)
                # Wi (in_size, cell_size)
                Wi = self._weight_variable([self._input_size, self._cell_size])
                print(Wi.name)
                # bi (cell_size, )
                bi = self._bias_variable([self._cell_size, ])
                # l_in_y (batch * n_steps, cell_size)
                with tf.name_scope('Wx_plus_b'):
                    l_in_y = tf.matmul(l_in_x, Wi) + bi
                l_in_y = tf.reshape(l_in_y, [-1, self._time_steps, self._cell_size], name='2_3D')

            with tf.variable_scope('cell'):
                cell = tf.contrib.rnn.BasicLSTMCell(self._cell_size)
                with tf.name_scope('initial_state'):
                    self._cell_initial_state = cell.zero_state(self._batch_size, dtype=tf.float32)

                self.cell_outputs = []
                cell_state = self._cell_initial_state
                for t in range(self._time_steps):
                    # after the first step, reuse the cell's variables instead of creating new ones
                    if t > 0:
                        tf.get_variable_scope().reuse_variables()
                    cell_output, cell_state = cell(l_in_y[:, t, :], cell_state)
                    self.cell_outputs.append(cell_output)
                self._cell_final_state = cell_state

            with tf.variable_scope('output_layer'):
                # cell_outputs_reshaped (batch * time_steps, cell_size)
                cell_outputs_reshaped = tf.reshape(tf.concat(self.cell_outputs, 1), [-1, self._cell_size])
                Wo = self._weight_variable((self._cell_size, self._output_size))
                bo = self._bias_variable((self._output_size,))
                product = tf.matmul(cell_outputs_reshaped, Wo) + bo
                # _pred shape (batch * time_steps, output_size)
                self._pred = tf.nn.relu(product)  # for displacement

        with tf.name_scope('cost'):
            _pred = tf.reshape(self._pred, [self._batch_size, self._time_steps, self._output_size])
            mse = self.ms_error(_pred, self._ys)
            mse_ave_across_batch = tf.reduce_mean(mse, 0)
            mse_sum_across_time = tf.reduce_sum(mse_ave_across_batch, 0)
            self._cost = mse_sum_across_time
            self._cost_ave_time = self._cost / self._time_steps

        with tf.variable_scope('train'):
            self._lr = tf.convert_to_tensor(self._lr)
            self.train_op = tf.train.AdamOptimizer(self._lr).minimize(self._cost)

    @staticmethod
    def ms_error(y_target, y_pre):
        return tf.square(tf.subtract(y_target, y_pre))

    @staticmethod
    def _weight_variable(shape, name='weights'):
        initializer = tf.random_normal_initializer(mean=0., stddev=0.5)
        return tf.get_variable(shape=shape, initializer=initializer, name=name)

    @staticmethod
    def _bias_variable(shape, name='biases'):
        initializer = tf.constant_initializer(0.1)
        return tf.get_variable(name=name, shape=shape, initializer=initializer)


if __name__ == '__main__':
    train_config = TrainConfig()  # define the training configuration
    test_config = TestConfig()

    # # the wrong method to reuse parameters in train rnn:
    # # separate scopes create two independent sets of variables
    # with tf.variable_scope('train_rnn'):
    #     train_rnn1 = RNN(train_config)
    # with tf.variable_scope('test_rnn'):
    #     test_rnn1 = RNN(test_config)

    # the right method to reuse parameters:
    # build the training RNN first, then call scope.reuse_variables() so the
    # test RNN fetches exactly the same parameters inside the shared scope
    with tf.variable_scope('rnn') as scope:
        sess = tf.Session()
        train_rnn2 = RNN(train_config)
        scope.reuse_variables()  # tell TF that we want to reuse the RNN's parameters
        test_rnn2 = RNN(test_config)

    # tf.initialize_all_variables() is no longer valid as of 2017-03-02
    # if using tensorflow >= 0.12
    if int((tf.__version__).split('.')[1]) < 12 and int((tf.__version__).split('.')[0]) < 1:
        init = tf.initialize_all_variables()
    else:
        init = tf.global_variables_initializer()
    sess.run(init)
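To isolate the mechanism the example relies on, here is a minimal sketch of variable sharing with scope.reuse_variables() (assuming TensorFlow 1.x; the scope and variable names are illustrative and not part of the article's code):

import tensorflow as tf

with tf.variable_scope('demo') as scope:
    v1 = tf.get_variable('w', shape=[2, 3])  # creates the variable demo/w
    scope.reuse_variables()                  # switch the scope into reuse mode
    v2 = tf.get_variable('w', shape=[2, 3])  # fetches the existing demo/w

print(v1 is v2)  # True: both names point to the same Variable object
print(v1.name)   # demo/w:0

This is what the main script does at a larger scale: building train_rnn2 creates every variable under the 'rnn' scope, and after scope.reuse_variables() the test_rnn2 build fetches those same variables instead of raising a "variable already exists" error.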


Summary

The above is the full content of "TF RNN: a worked example of using scope.reuse_variables() to tell TensorFlow to reuse an RNN's parameters", as collected and organized by 生活随笔. I hope the article helps you solve the problems you ran into.
