A first small TensorFlow program: implementing an XOR gate

Code:

#author@chengxiaona
import tensorflow as tf
import numpy as np

# Training inputs: a Python list here; a NumPy ndarray would also work
x_data = [[1., 0.], [0., 1.], [0., 0.], [1., 1.]] 
# Define a placeholder; placeholders must be fed data when the graph is run
x = tf.placeholder(tf.float32, shape = [None, 2])
# Labels for the training data; mind the dimensions
y_data = [[1], [1], [0], [0]]
y = tf.placeholder(tf.float32, shape = [None, 1])
# Define the Variables; they are updated according to the optimization objective and saved while the graph runs
weights = {'w1': tf.Variable(tf.random_normal([2, 16])), 'w2': tf.Variable(tf.random_normal([16, 1]))}
bias = {'b1': tf.Variable(tf.zeros([1])), 'b2': tf.Variable(tf.zeros([1]))}
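# note: b1 and b2 above are 1-D (shape [1]); broadcasting adds them to every element (see note 1 at the end)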
# Define the forward pass over the nodes: a 2-16-1 network (ReLU hidden layer, sigmoid output)
def nn(x, weights, bias):
    d1 = tf.matmul(x, weights['w1']) + bias['b1']
    d1 = tf.nn.relu(d1)
    d2 = tf.matmul(d1, weights['w2']) + bias['b2']
    d2 = tf.nn.sigmoid(d2)
    return d2
# Predictions
pred = nn(x, weights, bias)
# Loss function (mean squared error)
cost = tf.reduce_mean(tf.square(y - pred))
# Learning rate
learning_rate = 0.01
# Define the training op from tf.train
# train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)  ## max_step: 20000, loss: 0.002638
train_step = tf.train.AdamOptimizer(learning_rate).minimize(cost)  ## max_step: 2000, loss: 0.000014
# Initialize the parameters; all variables must be initialized before the graph starts running
init = tf.global_variables_initializer()
# correct_pred = tf.equal(tf.argmax(y, 1), tf.argmax(pred, 1))
# accuracy = tf.reduce_mean(tf.cast(correct_pred, 'float'))
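# (with a single output column, tf.argmax(..., 1) is always 0 for both y and pred,
# so this accuracy metric would always read 1.0)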

# Run the graph. The with statement calls the __enter__() method of the object
# after it, binds the return value to the name after `as`, and calls __exit__()
# at the end of the block, which is equivalent to `sess = tf.Session(); ...; sess.close()`.
with tf.Session() as sess:
    sess.run(init)
    max_step = 2000
    for i in range(max_step + 1):
        sess.run(train_step, feed_dict = {x: x_data, y: y_data})
        loss = sess.run(cost, feed_dict = {x: x_data, y: y_data})
        # acc = sess.run(accuracy, feed_dict = {x: x_data, y: y_data})
        # Print the training loss and the current predictions on the training data
        if i % 100 == 0:
            print('step: '+ str(i) + '  loss:' + "{:.6f}".format(loss)) #+ '    accuracy:' + "{:.6f}".format(acc))
            print(sess.run(pred, feed_dict = {x: x_data}))
    print('end')
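
To read the sigmoid outputs above as hard XOR labels, one can round them at 0.5. A minimal sketch (the threshold and the comparison against y_data are illustrative additions, not part of the original program), to be placed inside the `with tf.Session() as sess:` block after the training loop:

probs = sess.run(pred, feed_dict = {x: x_data})  # ndarray of shape (4, 1)
hard = (probs > 0.5).astype(np.float32)          # round at the 0.5 threshold
print('hard labels:', hard.ravel())              # should approach [1. 1. 0. 0.]
print('all correct:', (hard == np.array(y_data)).all())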

Notes:

  1. When a multi-dimensional matrix is added directly to a one-dimensional one, broadcasting adds the number to every element of the matrix; there is no dimension error, so the biases can simply all be defined as one-dimensional (see the sketch after this list).

  2. On the data types that can be fed and fetched in a graph:

Feedable types: Python scalars, strings, lists, numpy ndarrays, or TensorHandles.
Fetchable types: Tensor, string.
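
A minimal sketch of point 1 (the constants here are made up for illustration):

import tensorflow as tf

m = tf.constant([[1., 2.], [3., 4.]])  # shape (2, 2)
b = tf.constant([10.])                 # shape (1,), 1-D like b1/b2 above
with tf.Session() as sess:
    print(sess.run(m + b))             # [[11. 12.] [13. 14.]]: 10 is added to every element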
Now look at the output of the following snippet:

import tensorflow as tf

x_data = [[1., 0.], [0., 1.], [0., 0.], [1., 1.]]
# batch_xs = tf.reshape(x_data, shape = [-1, 2])
# print('1', type(batch_xs)) #<class 'tensorflow.python.framework.ops.Tensor'>
# print('2', batch_xs)       #<tf.Tensor 'Reshape:0' shape=(4, 2) dtype=float32>
# y_data = [[1.], [2.]]
x = tf.placeholder(tf.float32, shape = [None, 2])
y = tf.constant([[1.], [2.]])
res = tf.matmul(x, y)
print(type(x), type(y), type(res))  ## all are <class 'tensorflow.python.framework.ops.Tensor'>

with tf.Session() as sess:
    batch_xs = tf.reshape(x_data, shape = [-1, 2])
    print('1', type(batch_xs)) #<class 'tensorflow.python.framework.ops.Tensor'>
    print('2', batch_xs)       #<tf.Tensor 'Reshape:0' shape=(4, 2) dtype=float32>
    print('3', type(x_data))       #<type 'list'>
    print('4', type(batch_xs))     # <class 'tensorflow.python.framework.ops.Tensor'>
    batch_xs = sess.run(batch_xs) 
    print('5', type(batch_xs))     #<type 'numpy.ndarray'>
    print('6', batch_xs)
    # print('7', type(sess.run(batch_xs)))
    # TypeError: Fetch argument array([[ 1.,  0.],
    #        [ 0.,  1.],
    #        [ 0.,  0.],
    #        [ 1.,  1.]], dtype=float32) has invalid type <type 'numpy.ndarray'>,
    # must be a string or Tensor. (Can not convert a ndarray into a Tensor or Operation.)
    print('7', type(sess.run(res, feed_dict = {x: batch_xs, y: sess.run(y)})))
    print('8', sess.run(res, feed_dict = {x: batch_xs, y: sess.run(y)}))

From the print() output we can see:

  • Anything defined through tf is a Tensor
  • A variable's type does not depend on whether it is created inside or outside the tf.Session() block
  • The first argument to sess.run() is the fetch target, which can only be a Tensor or a string. If a feed_dict = {} follows, the fed values may be Python scalars, strings, lists, numpy ndarrays, or TensorHandles, but not Tensors; the fetched result has type <type 'numpy.ndarray'>. In short, while the graph is running, pull a tensor out with sess.run() and then feed the result back in (see the sketch below).
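
As a minimal sketch of these rules (the op names a, b, c are made up for illustration): a fetch may be a Tensor or its string name, a feed may be a plain Python list, and here the fetched result is a numpy ndarray.

import tensorflow as tf

a = tf.constant(3., name = 'a')
b = tf.placeholder(tf.float32, shape = [2], name = 'b')
c = tf.add(a, b, name = 'c')
with tf.Session() as sess:
    print(sess.run(c, feed_dict = {b: [4., 5.]}))          # fetch a Tensor, feed a list -> [7. 8.]
    print(sess.run('c:0', feed_dict = {'b:0': [4., 5.]}))  # fetch and feed by string name -> [7. 8.]
    print(type(sess.run(c, feed_dict = {b: [4., 5.]})))    # <class 'numpy.ndarray'>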

The implementation borrows from TensorFlow代码实现(二)[实现异或门(XOR)]; many thanks to that blogger!

