
Keras Learning (4): Convolutional Neural Networks (CNN)

Programmer Article Station 2022-07-13 11:57:21

This article shows how to build a CNN with Keras to classify the MNIST handwritten-digit dataset.

Example code:

import numpy as np
from keras.datasets import mnist
from keras.utils import to_categorical
from keras.models import Sequential
from keras.layers import Dense, Activation, Conv2D, MaxPooling2D, Flatten
from keras.optimizers import Adam

# Fix the seed so the random numbers are the same on every run
np.random.seed(1337)

# Load the dataset
# X_train shape: (60000, 28, 28); X_test shape: (10000, 28, 28)
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Preprocess the data
# reshape(-1, 28, 28, 1): -1 infers the sample count, 28x28 is height x width,
# and the trailing 1 is the single grayscale channel (channels_last format)
# /255. scales the pixel values into [0, 1]
X_train = X_train.reshape(-1, 28, 28, 1) / 255.
X_test = X_test.reshape(-1, 28, 28, 1) / 255.
# Convert the integer labels to one-hot vectors
y_train = to_categorical(y_train, num_classes=10)
y_test = to_categorical(y_test, num_classes=10)

# Build the network
model = Sequential()
# Conv layer 1: 32 filters of size 5x5; 'same' padding keeps the 28x28 size
model.add(Conv2D(
    filters=32,
    kernel_size=(5, 5),
    padding='same',
    input_shape=(28, 28, 1),  # height, width, channels
))
model.add(Activation('relu'))
# Pooling layer 1: downsample 28x28 -> 14x14
model.add(MaxPooling2D(
    pool_size=(2, 2),
    strides=(2, 2),
    padding='same',
))

# Conv layer 2: 64 filters of size 5x5
model.add(Conv2D(64, (5, 5), padding='same'))
model.add(Activation('relu'))

# Pooling layer 2: downsample 14x14 -> 7x7
model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))

# Flatten the 7x7x64 feature maps into a vector
model.add(Flatten())
# Fully connected layer 1
model.add(Dense(1024))
model.add(Activation('relu'))

# Fully connected layer 2: one output per digit class
model.add(Dense(10))
model.add(Activation('softmax'))  # softmax activation for classification
# Optimizer
adam = Adam(learning_rate=1e-4)

# Compile the model
model.compile(optimizer=adam,
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Train
print('Training...')
model.fit(X_train, y_train, epochs=2, batch_size=32)

# Evaluate on the held-out test set
print('\nTesting...\n')
loss, accuracy = model.evaluate(X_test, y_test)

print('\ntest loss', loss)
print('\ntest accuracy', accuracy)
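The `to_categorical` call in the listing turns each integer label into a one-hot vector, so the softmax output and the target have the same shape. A minimal NumPy sketch of the same idea (the `one_hot` helper here is illustrative, not a Keras API):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Pick one identity-matrix row per label: result shape (n, num_classes)."""
    return np.eye(num_classes)[labels]

labels = np.array([3, 0, 9])
encoded = one_hot(labels, 10)
# Each row contains a single 1 at the index of its class label,
# e.g. label 3 -> [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
```

This is why the model is compiled with `categorical_crossentropy`; with raw integer labels one would use `sparse_categorical_crossentropy` instead.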

Output (progress logs abridged):

...
60000/60000 [==============================] - 64s 1ms/step - loss: 0.0989 - acc: 0.9694

Testing...


...
10000/10000 [==============================] - 3s 269us/step

test loss 0.08740828046086244

test accuracy 0.9717
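As a sanity check on the architecture, the layer output sizes can be traced by hand: with `'same'` padding a layer's spatial output is `ceil(input / stride)`, so the two conv/pool pairs take 28x28 down to 7x7 before the flatten. A small pure-Python sketch (not part of the Keras code above):

```python
def same_out(size, stride=1):
    # 'same' padding: output size = ceil(input / stride)
    return -(-size // stride)

h = w = 28
h, w = same_out(h), same_out(w)          # Conv1, 5x5, stride 1 -> 28 x 28 x 32
h, w = same_out(h, 2), same_out(w, 2)    # MaxPool 2x2, stride 2 -> 14 x 14 x 32
h, w = same_out(h), same_out(w)          # Conv2, 5x5, stride 1 -> 14 x 14 x 64
h, w = same_out(h, 2), same_out(w, 2)    # MaxPool 2x2, stride 2 ->  7 x  7 x 64
flat = h * w * 64                        # Flatten -> 3136
```

The `Flatten` layer therefore feeds a 3136-dimensional vector into the 1024-unit dense layer.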

 
