# Using TensorBoard

Once TensorBoard is set up, it should look like the figure below:

(image: TensorBoard graph view)

## Example 1: Matrix multiplication (tfboard1.py)

```python
import tensorflow as tf

with tf.name_scope('graph') as scope:
    matrix1 = tf.constant([[3., 3.]], name='matrix1')    # one row, two columns
    matrix2 = tf.constant([[2.], [2.]], name='matrix2')  # two rows, one column
    product = tf.matmul(matrix1, matrix2, name='product')

sess = tf.Session()

# Write the graph so TensorBoard can display it
writer = tf.summary.FileWriter("logs1/", sess.graph)

init = tf.global_variables_initializer()
sess.run(init)
```

tf.name_scope opens a named scope: in the code above, three ops (matrix1, matrix2, and product) are nested under the graph scope. Each op is named through the name argument of the corresponding tf function, and these names are what TensorBoard displays in the graph view.

```shell
tensorboard --logdir logs1
```

## Example 2: Linear fitting (1) (tfboard2.py)

```python
import tensorflow as tf
import numpy as np

# Prepare the raw data
with tf.name_scope('data'):
    x_data = np.random.rand(100).astype(np.float32)
    y_data = 0.3 * x_data + 0.1

# Set up the trainable parameters
with tf.name_scope('parameters'):
    weight = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
    bias = tf.Variable(tf.zeros([1]))

# Compute the prediction y_prediction
with tf.name_scope('y_prediction'):
    y_prediction = weight * x_data + bias

# Compute the loss
with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(y_data - y_prediction))

# The optimizer definition was missing here; 0.5 is a typical learning rate
optimizer = tf.train.GradientDescentOptimizer(0.5)

with tf.name_scope('train'):
    train = optimizer.minimize(loss)

with tf.name_scope('init'):
    init = tf.global_variables_initializer()

sess = tf.Session()
writer = tf.summary.FileWriter("logs2/", sess.graph)
sess.run(init)

for step in range(101):
    sess.run(train)
    if step % 10 == 0:
        print(step, 'weight:', sess.run(weight), 'bias:', sess.run(bias))
```
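As a sanity check on what the graph above computes, the same fit can be sketched in plain NumPy (an illustration only, not part of the TensorBoard workflow; the 0.5 learning rate is an assumption, since the original listing does not show how the optimizer was configured):

```python
import numpy as np

np.random.seed(0)
x_data = np.random.rand(100).astype(np.float32)
y_data = 0.3 * x_data + 0.1

weight = np.random.uniform(-1.0, 1.0)  # random start, as in the TF example
bias = 0.0
lr = 0.5  # assumed learning rate
for step in range(200):
    y_prediction = weight * x_data + bias
    err = y_prediction - y_data
    # gradients of the mean squared error w.r.t. weight and bias
    weight -= lr * np.mean(2 * err * x_data)
    bias -= lr * np.mean(2 * err)

print(round(float(weight), 2), round(float(bias), 2))  # → 0.3 0.1
```

Gradient descent recovers the true slope 0.3 and intercept 0.1 used to generate the data, which is exactly what the printed weight and bias in the TensorFlow loop converge to.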

## Example 3: Linear fitting (2) (tfboard3.py)

```python
import tensorflow as tf
import numpy as np

with tf.name_scope('data'):
    x_data = np.random.rand(100).astype(np.float32)
    y_data = 0.3 * x_data + 0.1

with tf.name_scope('parameters'):
    with tf.name_scope('weights'):
        weight = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
        tf.summary.histogram('weight', weight)
    with tf.name_scope('biases'):
        bias = tf.Variable(tf.zeros([1]))
        tf.summary.histogram('bias', bias)

with tf.name_scope('y_prediction'):
    y_prediction = weight * x_data + bias

with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(y_data - y_prediction))
    tf.summary.scalar('loss', loss)

# The optimizer definition was missing here; 0.5 is a typical learning rate
optimizer = tf.train.GradientDescentOptimizer(0.5)

with tf.name_scope('train'):
    train = optimizer.minimize(loss)

with tf.name_scope('init'):
    init = tf.global_variables_initializer()

sess = tf.Session()
merged = tf.summary.merge_all()  # collect all histogram/scalar summaries into one op
writer = tf.summary.FileWriter("logs3/", sess.graph)
sess.run(init)

for step in range(101):
    sess.run(train)
    rs = sess.run(merged)
    writer.add_summary(rs, step)  # write the summaries so TensorBoard can plot them
```
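To see the kind of curve that tf.summary.scalar('loss', ...) hands to TensorBoard, the loss trajectory of this fit can be sketched in plain NumPy (an illustration under an assumed learning rate of 0.5 and an arbitrary starting weight, not the TensorBoard pipeline itself):

```python
import numpy as np

np.random.seed(1)
x_data = np.random.rand(100).astype(np.float32)
y_data = 0.3 * x_data + 0.1

weight, bias, lr = 0.8, 0.0, 0.5  # arbitrary start, assumed learning rate
losses = []  # the per-step values tf.summary.scalar('loss', ...) would log
for step in range(101):
    y_prediction = weight * x_data + bias
    losses.append(float(np.mean((y_data - y_prediction) ** 2)))
    err = y_prediction - y_data
    weight -= lr * np.mean(2 * err * x_data)
    bias -= lr * np.mean(2 * err)

print(losses[0] > losses[-1])  # → True: the logged loss decreases over training
```

Plotting losses against step reproduces the downward-sloping curve that appears in TensorBoard's SCALARS tab after the merged summaries are written.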
