
Tricks for Training RNNs


1. Initialization

Orthogonal initialization works better than initializing all weights to zero.
The following code initializes a bidirectional RNN with orthogonal initialization:

    # Forward and backward LSTM cells, both with orthogonal weight initialization
    lstm_fw_cell = tf.nn.rnn_cell.LSTMCell(num_units=nhidden, forget_bias=1.0, initializer=tf.orthogonal_initializer())
    lstm_bw_cell = tf.nn.rnn_cell.LSTMCell(num_units=nhidden, forget_bias=1.0, initializer=tf.orthogonal_initializer())
    # Run both cells over the input x; hiddens is a (forward, backward) pair of output tensors
    hiddens, state = tf.nn.bidirectional_dynamic_rnn(lstm_fw_cell, lstm_bw_cell, x, dtype=tf.float32)
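
In TF 1.x, hiddens here is a pair (output_fw, output_bw); a typical next step is to concatenate the two directions along the feature axis. A minimal follow-up sketch, using the variable names from the snippet above:

    output_fw, output_bw = hiddens                       # each: (batch, time, nhidden)
    outputs = tf.concat([output_fw, output_bw], axis=2)  # (batch, time, 2 * nhidden)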

YJango's answer on Zhihu:
https://zhuanlan.zhihu.com/p/28981495

Wufei's answer on Zhihu:
use orthogonal initialization together with PReLU/ELU activations.
See also: Explaining and illustrating orthogonal initialization for recurrent neural networks
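
The intuition behind these references: an orthogonal recurrent weight matrix preserves the norm of the hidden state under repeated multiplication, so gradients through the recurrence are less likely to vanish or explode. A minimal NumPy sketch of the idea (not TensorFlow's actual implementation; the QR-based construction is just one common way to sample an orthogonal matrix):

    import numpy as np

    def orthogonal(shape):
        # Sample a random Gaussian matrix and orthogonalize it via QR decomposition
        a = np.random.randn(*shape)
        q, r = np.linalg.qr(a)
        return q

    W = orthogonal((128, 128))
    h = np.random.randn(128)
    # An orthogonal matrix preserves vector norms, so repeatedly applying it
    # neither blows up nor shrinks the hidden state
    print(np.linalg.norm(h), np.linalg.norm(W @ h))  # roughly equal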

2. Dropout

Dropout is applied to the connections between cells, i.e. when a cell's output is passed to the cell in the next layer;
the memory (cell state) carried across time steps is not dropped out, only the outputs passed between cells of different layers.

# Step 1: the RNN input has shape (batch_size, timestep_size, input_size)
X = tf.reshape(_X, [-1, 28, 28])
# Step 2: define the LSTM cell; only hidden_size needs to be specified, the input dimension of X is matched automatically
# Step 3: add a dropout layer; usually only output_keep_prob is set
def lstm_cell():
    cell = rnn.BasicLSTMCell(num_units=hidden_size, forget_bias=1.0, state_is_tuple=True)
    return rnn.DropoutWrapper(cell=cell, input_keep_prob=1.0, output_keep_prob=keep_prob)
# Step 4: stack the layers with MultiRNNCell, creating a fresh cell object per layer
# (reusing one cell object via [lstm_cell] * layer_num shares weights and raises a variable-reuse error in newer TF 1.x)
mlstm_cell = rnn.MultiRNNCell([lstm_cell() for _ in range(layer_num)], state_is_tuple=True)

Remember to set keep_prob to the desired keep probability during training,
and set it to 1.0 at test time to disable dropout.
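
One common way to do this is to make keep_prob a placeholder and feed a different value at train and test time. A rough sketch, assuming the usual placeholders and ops (_X, y, train_op, accuracy, sess, and the data batches) from the surrounding training script:

keep_prob = tf.placeholder(tf.float32, name='keep_prob')
# ... build the DropoutWrapper cells above with output_keep_prob=keep_prob ...
# Training: keep e.g. 50% of the cell outputs (0.5 is just an example value)
sess.run(train_op, feed_dict={_X: batch_x, y: batch_y, keep_prob: 0.5})
# Testing: keep everything, i.e. disable dropout
sess.run(accuracy, feed_dict={_X: test_x, y: test_y, keep_prob: 1.0})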
