An example of visualizing a network built with TensorFlow (code included)

keer_zu (OP) | posted 2019-01-10 18:47


Running the code creates a logs/ directory locally. Then execute:

    tensorboard --logdir logs


The console then prints something like:

    TensorBoard 1.12.0 at http://ubuntu:6006 (Press CTRL+C to quit)


Open the address shown above in a local browser (here "ubuntu" is simply the hostname of the machine running TensorBoard; http://localhost:6006 should also work when browsing from the same machine):

    http://ubuntu:6006

You should now see the visualization.
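If the page comes up empty, one quick sanity check (a small sketch, not part of the original post) is to confirm that the training run actually wrote TensorBoard event files under logs/:

# Hedged sketch: list the contents of logs/ and expect one or more files
# named events.out.tfevents.* produced by tf.summary.FileWriter.
import os
print(os.listdir("logs"))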


Code:
from __future__ import print_function
import tensorflow as tf
import numpy as np


def add_layer(inputs, in_size, out_size, n_layer, activation_function=None):
    # add one more layer and return the output of this layer
    layer_name = 'layer%s' % n_layer
    with tf.name_scope(layer_name):
        with tf.name_scope('weights'):
            Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W')
            tf.summary.histogram(layer_name + '/weights', Weights)
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b')
            tf.summary.histogram(layer_name + '/biases', biases)
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.add(tf.matmul(inputs, Weights), biases)
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        tf.summary.histogram(layer_name + '/outputs', outputs)
    return outputs


# Make up some real data
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
noise = np.random.normal(0, 0.05, x_data.shape)
y_data = np.square(x_data) - 0.5 + noise

# define placeholder for inputs to network
with tf.name_scope('inputs'):
    xs = tf.placeholder(tf.float32, [None, 1], name='x_input')
    ys = tf.placeholder(tf.float32, [None, 1], name='y_input')

# add hidden layer
l1 = add_layer(xs, 1, 10, n_layer=1, activation_function=tf.nn.relu)

# add output layer
prediction = add_layer(l1, 10, 1, n_layer=2, activation_function=None)

# the error between prediction and real data
with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))
    tf.summary.scalar('loss', loss)

with tf.name_scope('train'):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

sess = tf.Session()
merged = tf.summary.merge_all()
writer = tf.summary.FileWriter("logs/", sess.graph)
init = tf.global_variables_initializer()
sess.run(init)

for i in range(1000):
    sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
    if i % 50 == 0:
        result = sess.run(merged, feed_dict={xs: x_data, ys: y_data})
        writer.add_summary(result, i)
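One small addition worth making at the end of the script (a hedged sketch, not part of the original post): flush and close the FileWriter so any buffered summaries are guaranteed to be on disk before TensorBoard reads them, and release the session.

# Sketch of cleanup after the training loop (assumes the writer and sess
# objects defined above).
writer.flush()   # push any buffered event records out to logs/
writer.close()   # close the event file
sess.close()     # release the TF 1.x session

In the TensorBoard UI, the 'loss' scalar appears under the SCALARS tab, the weight/bias/output histograms under HISTOGRAMS (and DISTRIBUTIONS), and the name_scope structure of the network under GRAPHS.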



keer_zu (OP) | posted 2019-01-11 08:48

@gaoyang9992006 You can have a play with TensorFlow.