
tf.nn.dropout

程序员文章站 2022-07-13 10:53:00

Source: https://www.jianshu.com/p/c9f66bc8f96c

def dropout(x, keep_prob, noise_shape=None, seed=None, name=None)

Inputs:

  • x — your tensor (training data, test data, etc.)
  • keep_prob — the probability that each element is kept (note: the keep probability, not the drop probability)
  • the remaining parameters (noise_shape, seed, name) are rarely needed

Outputs:

  • A Tensor of the same shape as x

Each non-zero element of the output is the original value scaled by 1/keep_prob (e.g. with keep_prob = 0.4, surviving elements become 1/0.4 = 2.5 times larger).
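Why scale by 1/keep_prob? Each element survives with probability keep_prob, so scaling the survivors keeps the expected value of every element unchanged. A tiny arithmetic check (plain Python, not TensorFlow):

```python
# Each element is kept with probability keep_prob and scaled by
# 1/keep_prob, so the expected value of an element is unchanged.
keep_prob = 0.4
x = 1.0

# E[output] = keep_prob * (x / keep_prob) + (1 - keep_prob) * 0
expected = keep_prob * (x / keep_prob) + (1 - keep_prob) * 0.0
print(expected)  # equals x, i.e. 1.0
```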

 

Example:

import tensorflow as tf

dropout = tf.placeholder(tf.float32)      # keep_prob, fed at run time
x = tf.Variable(tf.ones([10, 10]))
y = tf.nn.dropout(x, dropout)

init = tf.global_variables_initializer()  # initialize_all_variables() is deprecated
sess = tf.Session()
sess.run(init)

print(sess.run(y, feed_dict={dropout: 0.4}))

A typical run produces (the exact pattern of zeros is random):

[[ 0.   0.   2.5  2.5  0.   0.   2.5  2.5  2.5  2.5]
 [ 0.   2.5  2.5  2.5  2.5  2.5  0.   2.5  0.   2.5]
 [ 2.5  0.   0.   2.5  0.   0.   2.5  0.   2.5  0. ]
 [ 0.   2.5  2.5  2.5  2.5  0.   0.   2.5  0.   2.5]
 [ 0.   0.   0.   0.   0.   0.   0.   0.   2.5  2.5]
 [ 2.5  2.5  2.5  0.   2.5  0.   0.   2.5  2.5  2.5]
 [ 0.   2.5  2.5  2.5  0.   2.5  2.5  0.   0.   0. ]
 [ 0.   2.5  0.   2.5  0.   0.   2.5  2.5  0.   0. ]
 [ 2.5  2.5  2.5  2.5  2.5  0.   0.   2.5  0.   0. ]
 [ 2.5  0.   0.   0.   0.   0.   2.5  2.5  0.   2.5]]

Looking at the result:

  • The output tensor has the same shape as the input, as expected
  • Every non-zero element has been scaled to 1/keep_prob times its original value (1/0.4 = 2.5)

In short, dropout in TensorFlow sets some elements of the input tensor to 0 and scales every element that is kept by 1/keep_prob.
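The behavior just described can be reproduced with a short NumPy sketch (my own re-implementation for illustration; tf.nn.dropout's internals may differ in detail):

```python
import numpy as np

def dropout_sketch(x, keep_prob, rng):
    # Keep each element with probability keep_prob ...
    mask = rng.random(x.shape) < keep_prob
    # ... and scale the survivors by 1/keep_prob; the rest become 0.
    return np.where(mask, x / keep_prob, 0.0)

rng = np.random.default_rng(0)
x = np.ones((10, 10))
y = dropout_sketch(x, 0.4, rng)

print(y.shape)       # same shape as x: (10, 10)
print(np.unique(y))  # only 0.0 and 2.5 (= 1/0.4) appear
```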

The original blog author notes that dropout tends to help little on small networks, so use it with care.