hojeomi blog

3-1. TensorFlow Implementation of Cost Minimization for Linear Regression

AI/Machine & Deep Learning

호저미 2021. 1. 14. 15:58


1. Build graph using TF operations

In [2]:
# Lab 3 Minimizing Cost
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

x_train = [1, 2, 3, 4]
y_train = [0, -1, -2, -3]

# A Sequential model is a linear stack of layers; pass layer instances to the
# constructor or add them one by one with add().
# units: dimensionality of the output
# input_dim: size of each 1-D input sample (input_shape is useful for recurrent neural networks)
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(units=1, input_dim=1))

# SGD: Stochastic Gradient Descent (the lr argument is deprecated; use learning_rate)
sgd = tf.keras.optimizers.SGD(learning_rate=0.1)
model.compile(loss='mse', optimizer=sgd)

model.summary()

# fit() trains the model and returns the training history
history = model.fit(x_train, y_train, epochs=100)

y_predict = model.predict(np.array([5, 4]))
print(y_predict)
 
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 1)                 2         
=================================================================
Total params: 2
Trainable params: 2
Non-trainable params: 0
_________________________________________________________________
Epoch 1/100
1/1 [==============================] - 1s 1s/step - loss: 16.0200
Epoch 2/100
1/1 [==============================] - 0s 7ms/step - loss: 7.3634
Epoch 3/100
1/1 [==============================] - 0s 5ms/step - loss: 3.4666
...
Epoch 98/100
1/1 [==============================] - 0s 11ms/step - loss: 9.5320e-04
Epoch 99/100
1/1 [==============================] - 0s 12ms/step - loss: 8.9698e-04
Epoch 100/100
1/1 [==============================] - 0s 14ms/step - loss: 8.4407e-04
[[-3.951684]
 [-2.97514 ]]
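The cost minimization that Keras performs above can also be sketched by hand. The following is a minimal NumPy sketch, not part of the original lab: it runs plain gradient descent on the MSE cost for the same training data, using the analytic gradients of mean((W·x + b − y)²). Since the data satisfy y = −x + 1, W should approach −1 and b should approach 1.

```python
import numpy as np

# Same training data as above; the underlying relation is y = -x + 1
x = np.array([1, 2, 3, 4], dtype=float)
y = np.array([0, -1, -2, -3], dtype=float)

W, b = 0.0, 0.0   # initial parameters
lr = 0.1          # same learning rate as the SGD optimizer above
n = len(x)

for _ in range(1000):
    pred = W * x + b
    # Gradients of MSE = mean((pred - y)^2) with respect to W and b
    grad_W = (2.0 / n) * np.sum((pred - y) * x)
    grad_b = (2.0 / n) * np.sum(pred - y)
    W -= lr * grad_W
    b -= lr * grad_b

print(W, b)  # converges toward W = -1, b = 1
```

This is exactly what `optimizer=sgd` with `loss='mse'` does internally for this one-layer model, minus Keras's bookkeeping.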
In [3]:
# Plot training loss values (only the train loss is recorded here)
plt.plot(history.history['loss'])
plt.title('Model loss')
plt.ylabel('Loss')
# epoch: one full pass over the training data
plt.xlabel('Epoch')
plt.legend(['Train'], loc='upper left')
plt.show()
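Besides the loss-per-epoch curve, the lecture's point is that the MSE cost is convex in the weight, so gradient descent finds the global minimum. The following sketch (not from the original post) sweeps W over a range while holding the bias fixed at its true value b = 1 for this data, and plots the resulting bowl-shaped cost curve.

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([1, 2, 3, 4], dtype=float)
y = np.array([0, -1, -2, -3], dtype=float)

# Sweep W, holding the bias fixed at b = 1 (the true intercept for this data)
W_range = np.linspace(-3, 1, 81)
costs = [np.mean((W * x + 1.0 - y) ** 2) for W in W_range]

plt.plot(W_range, costs)
plt.xlabel('W')
plt.ylabel('cost (MSE)')
plt.title('Cost as a function of W')
plt.show()

best_W = W_range[np.argmin(costs)]
print(best_W)  # the minimum sits at W = -1
```

The curve has a single minimum at W = −1, which is why following the negative gradient from any starting point reaches it.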
 