# Lab 4 Multi-variable linear regression
import tensorflow as tf
import numpy as np
x_data = [[73., 80., 75.],
          [93., 88., 93.],
          [89., 91., 90.],
          [96., 98., 100.],
          [73., 66., 70.]]
y_data = [[152.],
          [185.],
          [180.],
          [196.],
          [142.]]
tf.model = tf.keras.Sequential()
tf.model.add(tf.keras.layers.Dense(units=1, input_dim=3)) # input_dim=3 gives multi-variable regression
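# units=1 means a single output node; with input_dim=3 the layer holds a (3, 1) weight
# matrix plus one bias, i.e. 4 trainable parameters (matching the model.summary() output below)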
tf.model.add(tf.keras.layers.Activation('linear')) # this line can be omitted, as linear activation is default
# advanced reading https://towardsdatascience.com/activation-functions-neural-networks-1cbd9f8d91d6
tf.model.compile(loss='mse', optimizer=tf.keras.optimizers.SGD(learning_rate=1e-5))  # learning rate of 1e-5 (10 to the -5th power)
tf.model.summary()
history = tf.model.fit(x_data, y_data, epochs=100)
y_predict = tf.model.predict(np.array([[72., 93., 90.]]))
print(y_predict)
The code above is the basic version.
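As a quick sanity check (a minimal sketch, assuming the snippet above has just been run so tf.model is already trained), the Dense layer's parameters can be read back with get_weights(): the first array is the (3, 1) weight matrix W and the second is the bias b, so the Keras prediction is just the matrix product X @ W + b.
# Sketch: inspect the parameters learned by the Dense layer (assumes tf.model was fit above)
W, b = tf.model.layers[0].get_weights()  # W has shape (3, 1), b has shape (1,)
print("W =\n", W, "\nb =", b)
# the predict() call above is equivalent to the explicit hypothesis H(X) = X @ W + b
print(np.array([[72., 93., 90.]]) @ W + b)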
Below is data-01-test-score.csv:
73,80,75,152
93,88,93,185
89,91,90,180
96,98,100,196
73,66,70,142
53,46,55,101
69,74,77,149
47,56,60,115
87,79,90,175
79,70,88,164
69,70,73,141
70,65,74,141
93,95,91,184
79,80,73,152
70,73,78,148
93,89,96,192
78,75,68,147
81,90,93,183
88,92,86,177
78,83,77,159
82,86,90,177
86,82,89,175
78,83,85,175
76,83,71,149
96,93,95,192
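The loading code below reads ./data-01-test-score.csv from the working directory, so the 25 rows listed above need to be saved to that path first (plain text, no header row). A minimal sketch, where csv_rows is a hypothetical string holding exactly the rows shown above:
# Sketch: write the rows listed above (assumed to be pasted into the string csv_rows)
# to disk so that the np.loadtxt call below can find the file
with open('./data-01-test-score.csv', 'w') as f:
    f.write(csv_rows)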
import tensorflow as tf
import numpy as np
xy = np.loadtxt('./data-01-test-score.csv', delimiter=',', dtype=np.float32)
x_data = xy[:, 0:-1]
y_data = xy[:, [-1]]
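# note: xy[:, [-1]] keeps a 2-D column shape (25, 1), matching the model's (None, 1) output,
# whereas xy[:, -1] would give a flat array of shape (25,)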
# Make sure the shape and data are OK
print(x_data, "\nx_data shape:", x_data.shape)
print(y_data, "\ny_data shape:", y_data.shape)
# data output
'''
[[ 73. 80. 75.]
[ 93. 88. 93.]
...
[ 76. 83. 71.]
[ 96. 93. 95.]]
x_data shape: (25, 3)
[[152.]
[185.]
...
[149.]
[192.]]
y_data shape: (25, 1)
'''
tf.model = tf.keras.Sequential()
# the activation function doesn't have to be added as a separate layer; pass it as the activation argument of Dense() instead
tf.model.add(tf.keras.layers.Dense(units=1, input_dim=3, activation='linear'))
# tf.model.add(tf.keras.layers.Activation('linear'))
tf.model.summary()
tf.model.compile(loss='mse', optimizer=tf.keras.optimizers.SGD(learning_rate=1e-5))
history = tf.model.fit(x_data, y_data, epochs=2000)
# Ask my score
print("Your score will be ", tf.model.predict([[100, 70, 101]]))
print("Other scores will be ", tf.model.predict([[60, 70, 110], [90, 100, 80]]))
Output from the final epochs and the predictions:
Epoch 1998/2000
1/1 [==============================] - 0s 737us/step - loss: 11.8143
Epoch 1999/2000
1/1 [==============================] - 0s 692us/step - loss: 11.8100
Epoch 2000/2000
1/1 [==============================] - 0s 808us/step - loss: 11.8057
Your score will be [[172.88719]]
Other scores will be [[179.6662 ]
[181.46442]]
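The loss is still decreasing slowly at epoch 2000. The history object returned by fit() records the loss per epoch, so convergence can be visualized; a minimal sketch, assuming matplotlib is installed (it is not imported in the original snippet):
# Sketch: plot the per-epoch training loss recorded by model.fit (assumes matplotlib is available)
import matplotlib.pyplot as plt
plt.plot(history.history['loss'])
plt.xlabel('epoch')
plt.ylabel('mse loss')
plt.show()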