
TensorBoard


1: Introduction to TensorBoard
TensorBoard is TensorFlow's visualization tool: it can display the computation graph and show intermediate results. See: https://tensorflow.google.cn/get_started/summaries_and_tensorboard
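
As a minimal sketch (TensorFlow 1.x API; the log directory is illustrative and not from the original post), writing the graph to a log directory is all that is needed to inspect it in TensorBoard's Graphs tab:

import tensorflow as tf

# Build a trivial graph
a = tf.constant(2.0, name="a")
b = tf.constant(3.0, name="b")
c = tf.add(a, b, name="c")

# Write the graph definition to a log directory so TensorBoard can render it
with tf.Session() as sess:
    writer = tf.summary.FileWriter("/tmp/graph_example", sess.graph)
    print(sess.run(c))
    writer.close()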
2: tf.summary.histogram
tf.summary.histogram records histograms.
The TensorBoard histogram dashboard shows how the distribution of some Tensor in the TensorFlow graph changes over time, by displaying many histograms of that tensor recorded at different points in time.
tf.summary.histogram takes a tensor of arbitrary size and shape and compresses it into a histogram data structure made up of many bins, each with its own width and count.

import tensorflow as tf

k = tf.placeholder(tf.float32)

# Make a normal distribution, with a shifting mean
mean_moving_normal = tf.random_normal(shape=[1000], mean=(5*k), stddev=1)
# Record that distribution into a histogram summary
tf.summary.histogram("normal/moving_mean", mean_moving_normal)

# Setup a session and summary writer
sess = tf.Session()
writer = tf.summary.FileWriter("/tmp/histogram_example")
#writer = tf.summary.FileWriter("/Users/dudu/Desktop/studykeras")
summaries = tf.summary.merge_all()

# Setup a loop and write the summaries to disk
N = 400
for step in range(N):
  k_val = step/float(N)
  summ = sess.run(summaries, feed_dict={k: k_val})
  writer.add_summary(summ, global_step=step)

After running the script, launch TensorBoard with the command matching the log directory you used:
tensorboard --logdir=/tmp/histogram_example
tensorboard --logdir=/Users/dudu/Desktop/studykeras
[Screenshots: the resulting histograms in TensorBoard's histogram dashboard]
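
As an optional extension (a sketch not in the original post; the tag names and log directory are made up), several distributions can be logged under different tags and then compared side by side in the histogram dashboard:

import tensorflow as tf

k = tf.placeholder(tf.float32)

# Two distributions logged under different tags so they can be compared in TensorBoard
moving_mean = tf.random_normal(shape=[1000], mean=(5 * k), stddev=1)
shrinking_stddev = tf.random_normal(shape=[1000], mean=0, stddev=1 - (0.9 * k))
tf.summary.histogram("compare/moving_mean", moving_mean)
tf.summary.histogram("compare/shrinking_stddev", shrinking_stddev)

summaries = tf.summary.merge_all()
writer = tf.summary.FileWriter("/tmp/histogram_compare")

with tf.Session() as sess:
    N = 400
    for step in range(N):
        summ = sess.run(summaries, feed_dict={k: step / float(N)})
        writer.add_summary(summ, global_step=step)
writer.close()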
3: tf.summary.scalar
This function is typically used to plot scalar curves such as the loss and the accuracy. The example below logs the loss; a sketch for logging accuracy follows at the end of this section.

import tensorflow as tf
import numpy as np

## prepare the original data
with tf.name_scope('data'):
    x_data = np.random.rand(100).astype(np.float32)
    y_data = 0.3 * x_data + 0.1
## create parameters
with tf.name_scope('parameters'):
    with tf.name_scope('weights'):
        weight = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
        tf.summary.histogram('weight', weight)
    with tf.name_scope('biases'):
        bias = tf.Variable(tf.zeros([1]))
        tf.summary.histogram('bias', bias)
## get y_prediction
with tf.name_scope('y_prediction'):
    y_prediction = weight * x_data + bias
## compute the loss
with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(y_data - y_prediction))
    tf.summary.scalar('loss', loss)
## create the optimizer
optimizer = tf.train.GradientDescentOptimizer(0.5)
# create the train op, which minimizes the loss
with tf.name_scope('train'):
    train = optimizer.minimize(loss)
# create the init op
with tf.name_scope('init'):
    init = tf.global_variables_initializer()
## create a Session
sess = tf.Session()
# merge all summaries
merged = tf.summary.merge_all()
## create the writer and initialize
writer = tf.summary.FileWriter("/Users/dudu/Desktop/studykeras", sess.graph)
sess.run(init)
## training loop
for step in range(101):
    sess.run(train)
    rs = sess.run(merged)
    writer.add_summary(rs, step)

After running the script, launch TensorBoard with:
tensorboard --logdir=/Users/dudu/Desktop/studykeras
[Screenshot: the loss curve in TensorBoard's scalar dashboard]
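
The example above only logs the loss. Below is a hedged sketch for logging accuracy with tf.summary.scalar; the classifier inputs here are hypothetical (random data, just to make the snippet runnable), not part of the original example:

import tensorflow as tf
import numpy as np

# Hypothetical classifier outputs and labels, only to illustrate logging accuracy
logits = tf.placeholder(tf.float32, [None, 10])
labels = tf.placeholder(tf.int64, [None])

with tf.name_scope('accuracy'):
    correct = tf.equal(tf.argmax(logits, 1), labels)
    accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))
    tf.summary.scalar('accuracy', accuracy)

merged = tf.summary.merge_all()
writer = tf.summary.FileWriter('/tmp/accuracy_example')

with tf.Session() as sess:
    for step in range(10):
        fake_logits = np.random.rand(32, 10).astype(np.float32)
        fake_labels = np.random.randint(0, 10, size=32).astype(np.int64)
        summ = sess.run(merged, feed_dict={logits: fake_logits, labels: fake_labels})
        writer.add_summary(summ, step)
writer.close()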
4: tf.summary.image
tf.summary.image writes image data to the event log so it can be viewed in TensorBoard's image dashboard.

import tensorflow as tf

# Read the image file
file = open('/Users/dudu/Desktop/studykeras/4.png', 'rb')
data = file.read()
file.close()

# Decode the PNG and add a batch dimension
image = tf.image.decode_png(data, channels=3)
image = tf.expand_dims(image, 0)

# Add the image to the log
sess = tf.Session()
writer = tf.summary.FileWriter('/Users/dudu/Desktop/studykeras')
summary_op = tf.summary.image("image1", image)

# Run the op and write the summary
summary = sess.run(summary_op)
writer.add_summary(summary)

# Clean up
writer.close()
sess.close()

Save the image in PNG format.
[Screenshots: the image shown in TensorBoard's image dashboard]
After running the script, launch TensorBoard with:
tensorboard --logdir=/Users/dudu/Desktop/studykeras
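
As an optional extension (a sketch not in the original post; the shapes and log directory are illustrative), tf.summary.image also accepts a batch of images, and its max_outputs argument controls how many of them are displayed in TensorBoard:

import tensorflow as tf
import numpy as np

# A batch of random grayscale images, only to illustrate the max_outputs argument
images = tf.placeholder(tf.float32, shape=[None, 28, 28, 1])
summary_op = tf.summary.image("random_images", images, max_outputs=4)

writer = tf.summary.FileWriter("/tmp/image_example")
with tf.Session() as sess:
    batch = np.random.rand(8, 28, 28, 1).astype(np.float32)
    summary = sess.run(summary_op, feed_dict={images: batch})
    writer.add_summary(summary, global_step=0)
writer.close()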

This article references the following posts:
https://www.cnblogs.com/fydeblog/p/7429344.html
https://tensorflow.google.cn/programmers_guide/summaries_and_tensorboard
https://blog.csdn.net/smf0504/article/details/56369758
among others.

