Multivariable linear regression
- Linear regression where the input x consists of several features rather than a single value.
```python
import tensorflow as tf

# Input data: three features (x1, x2, x3) and the target y.
x1_data = [73., 93., 89., 96., 73.]
x2_data = [80., 88., 91., 98., 66.]
x3_data = [75., 93., 90., 100., 70.]
y_data = [152., 185., 180., 196., 142.]

# Placeholders: tensors that the input values will be fed into.
x1 = tf.placeholder(tf.float32)
x2 = tf.placeholder(tf.float32)
x3 = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)

# The values we want to learn: the weights w and the bias b.
w1 = tf.Variable(tf.random_normal([1]), name='weight1')
w2 = tf.Variable(tf.random_normal([1]), name='weight2')
w3 = tf.Variable(tf.random_normal([1]), name='weight3')
b = tf.Variable(tf.random_normal([1]), name='bias')

hypothesis = x1 * w1 + x2 * w2 + x3 * w3 + b

# cost/loss function
cost = tf.reduce_mean(tf.square(hypothesis - Y))

# Minimize. A very small learning rate is needed for this data set.
optimizer = tf.train.GradientDescentOptimizer(learning_rate=1e-5)
train = optimizer.minimize(cost)

# Launch the graph in a session.
sess = tf.Session()
# Initialize global variables in the graph.
sess.run(tf.global_variables_initializer())

# Train for 2000 steps; every 1000 steps, print the current step,
# the cost, and the prediction computed from w and b.
for step in range(2001):
    cost_val, hy_val, _ = sess.run([cost, hypothesis, train],
                                   feed_dict={x1: x1_data, x2: x2_data,
                                              x3: x3_data, Y: y_data})
    if step % 1000 == 0:
        print(step, "Cost: ", cost_val, "\nPrediction:\n", hy_val)
```
- Execution result
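The same training loop can be sketched without TensorFlow, using plain NumPy gradient descent on the identical data. This is only an illustration of what the graph above computes, not the post's code; the learning rate and step count mirror the TF example.

```python
import numpy as np

# Same data as in the TensorFlow example above.
x1 = np.array([73., 93., 89., 96., 73.])
x2 = np.array([80., 88., 91., 98., 66.])
x3 = np.array([75., 93., 90., 100., 70.])
y = np.array([152., 185., 180., 196., 142.])

# Start weights and bias at zero (the TF code draws them randomly).
w1 = w2 = w3 = b = 0.0
lr = 1e-5  # small learning rate, as in the post

for step in range(2001):
    pred = x1 * w1 + x2 * w2 + x3 * w3 + b
    err = pred - y
    cost = np.mean(err ** 2)
    # Gradient of the mean-squared-error cost w.r.t. each parameter.
    w1 -= lr * 2 * np.mean(err * x1)
    w2 -= lr * 2 * np.mean(err * x2)
    w3 -= lr * 2 * np.mean(err * x3)
    b -= lr * 2 * np.mean(err)

print("final cost:", cost)
print("prediction:", pred)
```

After 2000 steps the predictions sit close to the target y values, just as in the TensorFlow run.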
Multivariable linear regression using Matrix
- This time, the same task is done using a matrix.
```python
import tensorflow as tf

x_data = [[73., 80., 75.],
          [93., 88., 93.],
          [89., 91., 90.],
          [96., 98., 100.],
          [73., 66., 70.]]
y_data = [[152.], [185.], [180.], [196.], [142.]]

# placeholders for a tensor that will be always fed.
X = tf.placeholder(tf.float32, shape=[None, 3])
Y = tf.placeholder(tf.float32, shape=[None, 1])

W = tf.Variable(tf.random_normal([3, 1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis
hypothesis = tf.matmul(X, W) + b

# Simplified cost/loss function
cost = tf.reduce_mean(tf.square(hypothesis - Y))

# Minimize
optimizer = tf.train.GradientDescentOptimizer(learning_rate=1e-5)
train = optimizer.minimize(cost)

# Launch the graph in a session.
sess = tf.Session()
# Initializes global variables in the graph.
sess.run(tf.global_variables_initializer())

for step in range(2001):
    cost_val, hy_val, _ = sess.run([cost, hypothesis, train],
                                   feed_dict={X: x_data, Y: y_data})
    if step % 1000 == 0:
        print(step, "Cost: ", cost_val, "\nPrediction:\n", hy_val)
```
- Execution result
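For this small data set, the matrix form also has a closed-form least-squares solution, which is the point gradient descent converges toward. A minimal NumPy sketch (not part of the original post) that appends a ones column so the bias is learned as an extra weight:

```python
import numpy as np

# Same matrix-form data as above.
X = np.array([[73., 80., 75.],
              [93., 88., 93.],
              [89., 91., 90.],
              [96., 98., 100.],
              [73., 66., 70.]])
y = np.array([[152.], [185.], [180.], [196.], [142.]])

# Append a column of ones so the bias b becomes a fourth weight.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])

# Closed-form least-squares fit of y = Xb @ theta.
theta, residuals, rank, _ = np.linalg.lstsq(Xb, y, rcond=None)
pred = Xb @ theta
print("prediction:\n", pred.ravel())
```

This fits the five rows almost exactly, which is why the iterative version above can drive the cost so low.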