TensorFlow: loss, dropout, and AdamOptimizer
程序员文章站
2022-07-13 10:37:46
Quadratic cost function (mean squared error):
loss = tf.reduce_mean(tf.square(y - prediction))
Cross-entropy cost function (prediction holds raw logits):
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=prediction))
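The two losses above can be checked numerically. A minimal NumPy sketch (the toy `y` and `prediction` values are made up for illustration; for the cross-entropy case, `prediction` is treated as raw logits, matching `tf.nn.softmax_cross_entropy_with_logits`):

```python
import numpy as np

# Toy 3-class batch: one-hot labels and model outputs.
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
prediction = np.array([[2.0, 1.0, 0.1],
                       [0.5, 2.5, 0.3]])

# Quadratic cost: mean of element-wise squared differences,
# as in tf.reduce_mean(tf.square(y - prediction)).
quadratic_loss = np.mean(np.square(y - prediction))

# Softmax cross-entropy: prediction is treated as raw logits.
shifted = prediction - prediction.max(axis=1, keepdims=True)  # numerical stability
softmax = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
cross_entropy_loss = np.mean(-np.sum(y * np.log(softmax), axis=1))

print(quadratic_loss, cross_entropy_loss)
```

The quadratic cost penalizes every output unit, while the cross-entropy only cares about the probability assigned to the correct class; for classification with softmax outputs, cross-entropy usually trains faster.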
Gradient descent optimizer (learning rate 0.2):
train_step = tf.train.GradientDescentOptimizer(0.2).minimize(loss)
Adam optimizer (learning rate 1e-2):
train_step = tf.train.AdamOptimizer(1e-2).minimize(loss)
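The difference between the two optimizers is the update rule. A minimal NumPy sketch of one plain gradient-descent step versus one Adam step on a single scalar parameter (textbook Adam with default betas, not the `tf.train.AdamOptimizer` internals):

```python
import numpy as np

def sgd_step(w, grad, lr=0.2):
    # Plain gradient descent: move against the gradient.
    return w - lr * grad

def adam_step(w, grad, m, v, t, lr=1e-2, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam keeps running averages of the gradient (m) and its
    # square (v), with bias correction for the early steps t.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# One step on w = 1.0 with gradient 0.5:
w_sgd = sgd_step(1.0, 0.5)
w_adam, m, v = adam_step(1.0, 0.5, m=0.0, v=0.0, t=1)
print(w_sgd, w_adam)
```

Note that Adam's effective step size is roughly the learning rate itself regardless of the gradient's magnitude (the gradient is normalized by its running RMS), which is why a much smaller learning rate like 1e-2 is typical for Adam.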
Dropout (keep_prob is the probability of keeping each activation):
drop = tf.nn.dropout(a, keep_prob)
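`tf.nn.dropout` uses "inverted" dropout: each element is kept with probability `keep_prob` and the kept elements are scaled up by `1/keep_prob`, so the expected value of the activations is unchanged and nothing needs rescaling at test time. A NumPy sketch of the same behavior:

```python
import numpy as np

def dropout(a, keep_prob, seed=0):
    # Inverted dropout, as in tf.nn.dropout: keep each element with
    # probability keep_prob, then scale survivors by 1/keep_prob.
    rng = np.random.default_rng(seed)
    mask = rng.random(a.shape) < keep_prob
    return a * mask / keep_prob

a = np.ones((4, 4))
drop = dropout(a, keep_prob=0.5)
print(drop)  # each entry is either 0.0 or 2.0
```

Dropout is applied only during training (keep_prob < 1); at evaluation time keep_prob is set to 1.0 so the full network is used.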