hojeomi blog

4-2. Implementing multi-variable linear regression in TensorFlow



호저미 2021. 1. 15. 12:45
4-2. Implementing multi-variable linear regression in TensorFlow
In [4]:
import tensorflow as tf
import numpy as np

x_data = [[73., 80., 75.],
          [93., 88., 93.],
          [89., 91., 90.],
          [96., 98., 100.],
          [73., 66., 70.]]
y_data = [[152.],
          [185.],
          [180.],
          [196.],
          [142.]]
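Before handing this data to Keras, it helps to see the shapes involved. A minimal NumPy sketch (the weight values below are placeholders for illustration, not trained values): the hypothesis is H(X) = XW + b, where X is (5, 3), W is (3, 1), and b is a scalar.

```python
import numpy as np

# Same data as above, as NumPy arrays: X is (5, 3) -- 5 samples, 3 features.
x_data = np.array([[73., 80., 75.],
                   [93., 88., 93.],
                   [89., 91., 90.],
                   [96., 98., 100.],
                   [73., 66., 70.]])
y_data = np.array([[152.], [185.], [180.], [196.], [142.]])

# Placeholder weights, NOT trained values: W has one row per input feature.
W = np.ones((3, 1))
b = 0.0

# (5, 3) @ (3, 1) -> (5, 1): one prediction per sample.
hypothesis = x_data @ W + b
print(hypothesis.shape)  # (5, 1)
```

This matrix form is exactly what `Dense(units=1, input_dim=3)` computes internally.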
In [5]:
tf.model = tf.keras.Sequential()

# units: dimensionality of the output (here 1: a single predicted score)
tf.model.add(tf.keras.layers.Dense(units=1, input_dim=3))  # input_dim=3 gives multi-variable regression
tf.model.add(tf.keras.layers.Activation('linear'))  # this line can be omitted, as linear activation is default
# advanced reading https://towardsdatascience.com/activation-functions-neural-networks-1cbd9f8d91d6

# learning_rate: step size for gradient descent (the old `lr` argument is deprecated)
tf.model.compile(loss='mse', optimizer=tf.keras.optimizers.SGD(learning_rate=1e-5))
tf.model.summary()
# epochs=100: iterate over the full training set 100 times
history = tf.model.fit(x_data, y_data, epochs=100)
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 1)                 4         
_________________________________________________________________
activation_1 (Activation)    (None, 1)                 0         
=================================================================
Total params: 4
Trainable params: 4
Non-trainable params: 0
_________________________________________________________________
Epoch 1/100
1/1 [==============================] - 0s 265ms/step - loss: 640.0825
Epoch 2/100
1/1 [==============================] - 0s 2ms/step - loss: 202.1772
Epoch 3/100
1/1 [==============================] - 0s 2ms/step - loss: 64.9163
Epoch 4/100
1/1 [==============================] - 0s 3ms/step - loss: 21.8915
Epoch 5/100
1/1 [==============================] - 0s 4ms/step - loss: 8.4049
Epoch 6/100
1/1 [==============================] - 0s 4ms/step - loss: 4.1770
Epoch 7/100
1/1 [==============================] - 0s 3ms/step - loss: 2.8511
Epoch 8/100
1/1 [==============================] - 0s 2ms/step - loss: 2.4349
Epoch 9/100
1/1 [==============================] - 0s 3ms/step - loss: 2.3038
Epoch 10/100
1/1 [==============================] - 0s 4ms/step - loss: 2.2621
...
Epoch 100/100
1/1 [==============================] - 0s 2ms/step - loss: 2.1616
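The 4 parameters in the summary are the (3, 1) kernel (one weight per feature) plus a single bias. As a sanity check on where the loss above is heading, the same regression can be solved in closed form with ordinary least squares; this NumPy-only sketch is independent of the Keras model, and its MSE is a lower bound on what SGD can reach with this architecture.

```python
import numpy as np

x_data = np.array([[73., 80., 75.],
                   [93., 88., 93.],
                   [89., 91., 90.],
                   [96., 98., 100.],
                   [73., 66., 70.]])
y_data = np.array([[152.], [185.], [180.], [196.], [142.]])

# Append a column of ones so the bias is fitted together with the 3 weights.
X = np.hstack([x_data, np.ones((5, 1))])
theta, *_ = np.linalg.lstsq(X, y_data, rcond=None)  # theta: (4, 1) = 3 weights + bias

mse = np.mean((X @ theta - y_data) ** 2)
print(mse)  # the optimal MSE for a linear model on this data
```

Since the SGD run plateaus around 2.16 and is still slowly decreasing, the closed-form MSE must be at or below that value.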
In [6]:
y_predict = tf.model.predict(np.array([[72., 93., 90.]]))
print(y_predict)
[[163.68098]]
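For intuition, the whole training loop above can be reproduced by hand: full-batch gradient descent on the MSE loss with the same learning rate (the dataset is a single batch of 5 samples, which is the "1/1" in the log). This is a NumPy sketch under an assumed random initialization (Keras uses its own initializer), so the exact numbers will differ from the run above, but the loss should settle in the same neighborhood.

```python
import numpy as np

x_data = np.array([[73., 80., 75.],
                   [93., 88., 93.],
                   [89., 91., 90.],
                   [96., 98., 100.],
                   [73., 66., 70.]])
y_data = np.array([[152.], [185.], [180.], [196.], [142.]])

rng = np.random.default_rng(42)           # assumed seed, for reproducibility only
W = rng.normal(scale=0.05, size=(3, 1))   # small random start, (3, 1) like the Dense kernel
b = np.zeros((1,))
lr = 1e-5                                 # same learning rate as the SGD optimizer above
n = len(x_data)

for epoch in range(100):                  # 100 epochs, one full-batch step per epoch
    err = x_data @ W + b - y_data         # residuals, shape (5, 1)
    loss = np.mean(err ** 2)              # MSE, what loss='mse' computes
    W -= lr * (2.0 / n) * (x_data.T @ err)   # gradient of MSE w.r.t. W
    b -= lr * (2.0 / n) * err.sum(axis=0)    # gradient of MSE w.r.t. b

print(loss)
print(np.array([[72., 93., 90.]]) @ W + b)  # prediction for the same new sample
```

The tiny learning rate is necessary here because the raw features are in the 70-100 range; without it the gradient steps would overshoot and the loss would diverge.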