
TrainData2algorthms


1. Read in the dataset

import pandas as pd
import matplotlib.pyplot as plt

# Load the rolling-mill measurements; the first row holds the column names
with open('sourcedata2.csv') as f:
    df = pd.read_csv(f, header=0)
df
time RollingSpeed RollingForce EntranceThickness OutletThickness Post-tensionForce Pre-tensionForce Vibration
0 18:35:56.72 2.1168 16360 0.011031 0.004393 0.000 0.00 0.113
1 18:35:57.58 2.1413 25590000 0.010997 0.004457 4.698 0.00 0.056
2 18:35:59.08 2.1762 26870000 0.010958 0.004502 9.316 0.00 0.052
3 18:36:00.37 2.1728 27310000 0.010955 0.004313 7.353 16.12 0.050
4 18:36:03.38 2.1765 27680000 0.010947 0.004321 7.178 14.50 0.052
5 18:36:05.09 2.1680 27940000 0.011038 0.004295 7.784 12.53 0.034
6 18:36:07.45 2.1492 27630000 0.011050 0.004345 7.746 12.99 0.063
7 18:36:11.53 2.1469 27230000 0.010986 0.004423 7.864 13.43 0.094
8 18:36:16.68 2.1963 27330000 0.011060 0.004481 7.398 13.36 0.081
9 18:36:18.61 2.1896 26980000 0.011050 0.004526 7.623 12.17 0.050
10 18:36:21.18 2.2060 26830000 0.011009 0.004546 8.426 15.38 0.065
11 18:36:22.69 2.2119 27270000 0.011033 0.004520 7.457 13.13 0.090
12 18:36:23.97 2.2040 27120000 0.010981 0.004513 7.608 11.52 0.062
13 18:36:26.12 2.1947 27550000 0.011029 0.004463 8.150 12.07 0.044
14 18:36:29.55 2.1801 27300000 0.010957 0.004483 7.410 14.78 0.032
15 18:36:31.91 2.1771 27230000 0.011039 0.004496 9.115 13.25 0.050
16 18:36:36.64 2.1487 27540000 0.010938 0.004494 7.223 13.15 0.024
17 18:36:39.21 2.1390 27580000 0.010720 0.004462 8.005 15.70 0.026
18 18:36:43.07 2.1655 28190000 0.010941 0.004474 6.874 11.99 0.054
19 18:36:51.01 2.1704 27640000 0.011107 0.004507 7.239 13.12 0.031
20 18:36:54.02 2.1745 27550000 0.011176 0.004547 7.681 13.47 0.060
21 18:37:00.24 2.2496 26380000 0.011207 0.004855 8.709 14.29 0.043
22 18:37:02.82 2.2178 26720000 0.011256 0.004795 7.886 13.03 0.051
23 18:37:05.18 2.1716 26840000 0.011273 0.004778 8.051 11.99 0.041
24 18:37:09.90 2.1326 26920000 0.011255 0.004772 7.511 12.42 0.034
25 18:37:13.12 2.0962 26920000 0.011222 0.004755 7.942 14.49 0.103
26 18:37:19.13 2.0932 26980000 0.011250 0.004734 7.409 12.75 0.005
27 18:37:26.64 2.0942 27140000 0.011221 0.004751 8.116 12.67 0.005
28 18:37:29.21 2.0957 27360000 0.011238 0.004729 7.147 13.37 0.009
29 18:37:31.79 2.0925 26930000 0.011235 0.004761 7.634 12.88 0.002
30 18:37:32.86 2.0994 27210000 0.011311 0.004763 7.907 13.25 NaN
31 18:37:34.36 2.0975 27080000 0.011286 0.004786 6.594 13.94 NaN
32 18:37:36.94 2.0948 26810000 0.011304 0.004779 7.913 13.54 NaN
33 18:37:45.73 2.0981 26520000 0.011316 0.004875 7.796 12.86 NaN
34 18:37:49.60 2.0885 26280000 0.011334 0.004884 7.628 13.52 NaN
35 18:37:52.17 2.0939 26350000 0.011173 0.004885 7.024 13.23 NaN
36 18:37:53.67 2.0945 26240000 0.011148 0.004885 7.785 10.98 NaN
37 18:37:55.18 2.0875 28590000 0.010276 0.004455 0.000 12.96 NaN
38 18:37:56.08 2.0772 29270000 0.010272 0.004367 0.000 11.81 NaN
39 18:37:57.97 2.0788 31680000 0.010272 0.003505 0.000 12.80 NaN
40 18:37:58.53 2.1316 19210000 0.010272 0.003232 0.000 0.00 NaN
41 18:38:02.04 0.5007 60390 0.010273 0.003830 0.000 0.00 NaN
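
Before plotting, it is worth checking what the tail of this table already hints at: the Vibration column turns to NaN from row 30 onward, and the RollingForce reading in row 0 (16360) sits three orders of magnitude below the steady-state values. A minimal sanity check on the df loaded above:

# Count missing values per column; only Vibration should report any (12 NaNs)
print(df.isnull().sum())

# Summary statistics make start-up/slow-down outliers such as the
# row-0 RollingForce easy to spot
print(df.describe())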

2. Visual analysis of the data

First, plot each variable against the vibration signal.

y = df['Vibration']
# One scatter subplot per input variable (columns 1..6) against Vibration,
# laid out on a 3x2 grid
plt.figure(figsize=(8, 6))
for i in range(1, 7):
    ax = plt.subplot(3, 2, i)
    ax.locator_params(nbins=3)
    ax.set_title(df.columns[i])
    ax.scatter(df[df.columns[i]], y)
plt.tight_layout(pad=0.4, w_pad=0.5, h_pad=1.0)
plt.show()

[Figure: scatter plots of each input variable against Vibration]

Plotting the relationships between the input variables

Entrance thickness vs. outlet thickness

fig = plt.figure()
ax = fig.add_subplot(111)
# Restrict the axes to the operating range of both thicknesses
plt.axis([0.010, 0.012, 0.003, 0.005])
ax.set_xlabel('EntranceThickness')
ax.set_ylabel('OutletThickness')
ax.scatter(df['EntranceThickness'], df['OutletThickness'])
plt.show()

[Figure: scatter plot of EntranceThickness vs. OutletThickness]

These two variables do not show a strong correlation either; the scatter is still quite pronounced.

Rolling force vs. rolling speed

fig = plt.figure()
ax = fig.add_subplot(111)
# Restrict the axes to the steady-state operating range
plt.axis([2.0, 2.3, 25000000, 30000000])
ax.set_xlabel("RollingSpeed")
ax.set_ylabel("RollingForce")
ax.scatter(df['RollingSpeed'], df['RollingForce'])
plt.show()

[Figure: scatter plot of RollingSpeed vs. RollingForce]

With the axis ranges restricted, the scatter plot shows no obvious linear relationship between the two.
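
The visual impression can be backed with numbers. A short sketch computing the pairwise Pearson coefficients for the two plots above (pandas' Series.corr uses Pearson by default):

# Pearson correlation for the two variable pairs plotted above
r_thick = df['EntranceThickness'].corr(df['OutletThickness'])
r_speed = df['RollingSpeed'].corr(df['RollingForce'])
print('EntranceThickness vs OutletThickness: r = %.3f' % r_thick)
print('RollingSpeed vs RollingForce: r = %.3f' % r_speed)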

3. Plot the correlation heatmap
That is, the matrix of correlation coefficients between the features.
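
For reference, each cell of the map holds the Pearson coefficient of one feature pair:

r_{xy} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}}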

import seaborn as sns
import matplotlib.pyplot as plt

# Drop the timestamp and the target; keep the six input features
drop_elements = ['time', 'Vibration']
train = df.drop(drop_elements, axis=1)

plt.figure(figsize=(8, 6))
plt.title("Pearson Correlation of Features", y=1.0, size=15)
sns.heatmap(train.astype(float).corr(), linewidths=0.1, vmax=1.0,
            square=True, linecolor='white', annot=True)
plt.xticks(rotation=90)
plt.yticks(rotation=0)
plt.show()

[Figure: Pearson correlation heatmap of the six input features]

The heatmap does reveal some structure: rolling force correlates strongly with pre-tension force, and outlet thickness correlates strongly with entrance thickness.
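
Rather than reading pairs off the map by eye, the strongest correlations can also be ranked programmatically. A small sketch, assuming the train frame built for the heatmap above:

import numpy as np

# Absolute correlations with the diagonal and upper triangle masked out,
# so each pair appears exactly once
corr = train.astype(float).corr().abs()
mask = np.triu(np.ones(corr.shape, dtype=bool))
pairs = corr.mask(mask).stack().sort_values(ascending=False)
print(pairs.head(5))  # the top entries should match the pairs noted above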

4. Train the model

Choosing the model architecture:

We need a feed-forward neural network with six inputs and one output.

The network uses two hidden layers, four layers in total: 6x10x5x1.

# Keep only the first 30 rows, i.e. those with a Vibration label
df = df[0:30]
df
(The output now shows just these first 30 rows, identical to rows 0-29 of the table above.)
# Imports
from sklearn.model_selection import train_test_split
from sklearn import preprocessing

from keras.models import Sequential
from keras.layers import Dense

# Six inputs (columns 1..6) and the Vibration target
x = df[df.columns[1:7]]
y = df['Vibration']
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)

# Standardize the inputs (fit the scaler on the training split only)
scaler = preprocessing.StandardScaler()
x_train = scaler.fit_transform(x_train)

# 6-10-5-1 feed-forward network (Keras 2 API)
model = Sequential()
model.add(Dense(10, input_dim=6, kernel_initializer='normal', activation='relu'))
model.add(Dense(5, kernel_initializer='normal', activation='relu'))
model.add(Dense(1, kernel_initializer='normal'))
model.compile(loss='mean_squared_error', optimizer='adam')

# Train the model and watch the training/validation loss
model.fit(x_train, y_train, epochs=500, validation_split=0.33,
          shuffle=True, verbose=2)

Train on 16 samples, validate on 8 samples
Epoch 1/500
 - 1s - loss: 6375226.0000 - val_loss: 5214159.0000
Epoch 2/500
 - 0s - loss: 4847505.0000 - val_loss: 3922125.0000
Epoch 3/500
 - 0s - loss: 3646332.5000 - val_loss: 2923160.0000
Epoch 4/500
 - 0s - loss: 2717618.5000 - val_loss: 2159714.7500
Epoch 5/500
 - 0s - loss: 2007860.5000 - val_loss: 1594420.0000
Epoch 6/500
 - 0s - loss: 1482317.7500 - val_loss: 1154666.0000
Epoch 7/500
 - 0s - loss: 1073487.0000 - val_loss: 811432.8750
Epoch 8/500
 - 0s - loss: 754388.8750 - val_loss: 546497.8750
Epoch 9/500
 - 0s - loss: 508082.7812 - val_loss: 345815.8750
Epoch 10/500
 - 0s - loss: 321510.5938 - val_loss: 198714.0938
Epoch 11/500
 - 0s - loss: 184750.6875 - val_loss: 96965.6719
Epoch 12/500
 - 0s - loss: 90154.6172 - val_loss: 33976.9688
Epoch 13/500
 - 0s - loss: 31592.4570 - val_loss: 4241.9888
Epoch 14/500
 - 0s - loss: 3945.4534 - val_loss: 2016.1428
Epoch 15/500
 - 0s - loss: 1873.0994 - val_loss: 20512.2969
Epoch 16/500
 - 0s - loss: 19065.6641 - val_loss: 51329.8047
Epoch 17/500
 - 0s - loss: 47713.4180 - val_loss: 85135.9219
Epoch 18/500
 - 0s - loss: 79140.1094 - val_loss: 113658.8438
Epoch 19/500
 - 0s - loss: 105655.7188 - val_loss: 131461.5312
Epoch 20/500
 - 0s - loss: 122205.6016 - val_loss: 136798.5000
Epoch 21/500
 - 0s - loss: 127167.0469 - val_loss: 130197.8281
Epoch 22/500
 - 0s - loss: 121030.8203 - val_loss: 110119.7500
Epoch 23/500
 - 0s - loss: 102365.6562 - val_loss: 89022.8125
Epoch 24/500
 - 0s - loss: 82753.4375 - val_loss: 69066.6484
Epoch 25/500
 - 0s - loss: 64201.7891 - val_loss: 51627.1367
Epoch 26/500
 - 0s - loss: 47989.8242 - val_loss: 37236.5195
Epoch 27/500
 - 0s - loss: 34612.2656 - val_loss: 25887.8535
Epoch 28/500
 - 0s - loss: 24062.6543 - val_loss: 17334.3555
Epoch 29/500
 - 0s - loss: 16111.5479 - val_loss: 11116.8428
Epoch 30/500
 - 0s - loss: 10332.0498 - val_loss: 6768.4287
Epoch 31/500
 - 0s - loss: 6290.1055 - val_loss: 3850.2249
Epoch 32/500
 - 0s - loss: 3577.7085 - val_loss: 1986.2985
Epoch 33/500
 - 0s - loss: 1845.3606 - val_loss: 874.1240
Epoch 34/500
 - 0s - loss: 811.8243 - val_loss: 281.2087
Epoch 35/500
 - 0s - loss: 260.9658 - val_loss: 34.9176
Epoch 36/500
 - 0s - loss: 32.2987 - val_loss: 0.0055
Epoch 37/500
 - 0s - loss: 0.0045 - val_loss: 0.0055
...
(epochs 38-499 repeat with loss steady at ~0.0044-0.0045 and val_loss at ~0.0054-0.0055)
Epoch 500/500
 - 0s - loss: 0.0044 - val_loss: 0.0054





<keras.callbacks.History at 0x7fd4e6a34c50>

Both the training loss and the validation loss fall rapidly and then settle at stable values (about 0.0044 and 0.0054 respectively).
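
What the fit log does not show is performance on the 20% held-out split. A hedged sketch of that final check, reusing the scaler and model defined above; the test inputs must pass through the same StandardScaler transform that was fitted on the training inputs. (Given how early the losses plateau, a keras.callbacks.EarlyStopping callback on val_loss could also end the run long before epoch 500.)

from sklearn.metrics import mean_absolute_error

# Apply the training-set transform to the held-out inputs
x_test_scaled = scaler.transform(x_test)

# MSE comes from the compiled loss; MAE is easier to read against
# Vibration values of roughly 0.002-0.113
test_mse = model.evaluate(x_test_scaled, y_test, verbose=0)
y_pred = model.predict(x_test_scaled).ravel()

print('Test MSE: %.4f' % test_mse)
print('Test MAE: %.4f' % mean_absolute_error(y_test, y_pred))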
