
DL/CNN: Multi-class prediction on the MNIST handwritten-digit dataset with a convolutional neural network (2→2, based on the Keras Functional API)

Published 2019-08-20 12:51 · Reads (502) · Comments (0) · Likes (24) · Favorites (4)


Contents

Output

Design Approach

Core Code

Output

Checking the two figures below against each other, we can see that 965 images of the digit 0 were correctly recognized!
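The 98.20% accuracy can be cross-checked against the confusion matrix printed near the end of the log below: the diagonal entries are the correctly classified samples, and each row sums to the test set's count for that digit. A minimal pure-Python check (matrix values copied from the log):

```python
# Confusion matrix from the evaluation log (row = true digit, column = predicted digit)
conf_matrix = [
    [965,    0,    4,   0,   0,   0,   4,   1,   2,   4],
    [  0, 1128,    3,   0,   0,   0,   0,   1,   3,   0],
    [  0,    0, 1028,   0,   0,   0,   0,   1,   3,   0],
    [  0,    0,   10, 991,   0,   2,   0,   2,   3,   2],
    [  0,    0,    3,   0, 967,   0,   1,   1,   1,   9],
    [  2,    0,    1,   7,   1, 863,   5,   1,   4,   8],
    [  2,    3,    0,   0,   3,   2, 946,   0,   2,   0],
    [  0,    1,   17,   1,   1,   0,   0, 987,   2,  19],
    [  2,    0,    9,   2,   0,   1,   0,   1, 955,   4],
    [  1,    4,    3,   2,   8,   0,   0,   0,   1, 990],
]

correct = sum(conf_matrix[i][i] for i in range(10))  # diagonal: correct predictions
total = sum(sum(row) for row in conf_matrix)         # all test samples
accuracy = correct / total
print(correct, total, accuracy)                      # 9820 10000 0.982
```

For digit 0 specifically, the first row sums to 980 test samples, of which 965 sit on the diagonal, i.e. about 98.5% of the zeros were recognized correctly.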

 

1.10.0
Size of:
- Training-set:   55000
- Validation-set: 5000
- Test-set:       10000
Epoch 1/1
  128/55000 [..............................] - ETA: 14:24 - loss: 2.3439 - acc: 0.0938
  256/55000 [..............................] - ETA: 14:05 - loss: 2.2695 - acc: 0.1016
  384/55000 [..............................] - ETA: 13:20 - loss: 2.2176 - acc: 0.1302
  512/55000 [..............................] - ETA: 13:30 - loss: 2.1608 - acc: 0.2109
  640/55000 [..............................] - ETA: 13:29 - loss: 2.0849 - acc: 0.2500
  768/55000 [..............................] - ETA: 13:23 - loss: 2.0309 - acc: 0.2734
  896/55000 [..............................] - ETA: 13:30 - loss: 1.9793 - acc: 0.2946
 1024/55000 [..............................] - ETA: 13:23 - loss: 1.9105 - acc: 0.3369
 1152/55000 [..............................] - ETA: 13:22 - loss: 1.8257 - acc: 0.3776
……
53760/55000 [============================>.] - ETA: 18s - loss: 0.2106 - acc: 0.9329
53888/55000 [============================>.] - ETA: 16s - loss: 0.2103 - acc: 0.9330
54016/55000 [============================>.] - ETA: 14s - loss: 0.2100 - acc: 0.9331
54144/55000 [============================>.] - ETA: 13s - loss: 0.2096 - acc: 0.9333
54272/55000 [============================>.] - ETA: 11s - loss: 0.2092 - acc: 0.9334
54400/55000 [============================>.] - ETA: 9s - loss: 0.2089 - acc: 0.9335
54528/55000 [============================>.] - ETA: 7s - loss: 0.2086 - acc: 0.9336
54656/55000 [============================>.] - ETA: 5s - loss: 0.2082 - acc: 0.9337
54784/55000 [============================>.] - ETA: 3s - loss: 0.2083 - acc: 0.9337
54912/55000 [============================>.] - ETA: 1s - loss: 0.2082 - acc: 0.9337
55000/55000 [==============================] - 837s 15ms/step - loss: 0.2080 - acc: 0.9338
   32/10000 [..............................] - ETA: 21s
  160/10000 [..............................] - ETA: 8s
  288/10000 [..............................] - ETA: 6s
  448/10000 [>.............................] - ETA: 5s
  576/10000 [>.............................] - ETA: 5s
  736/10000 [=>............................] - ETA: 4s
  864/10000 [=>............................] - ETA: 4s
 1024/10000 [==>...........................] - ETA: 4s
 1152/10000 [==>...........................] - ETA: 4s
 1312/10000 [==>...........................] - ETA: 4s
 1440/10000 [===>..........................] - ETA: 4s
 1600/10000 [===>..........................] - ETA: 3s
 1728/10000 [====>.........................] - ETA: 3s
……
 3008/10000 [========>.....................] - ETA: 3s
 3168/10000 [========>.....................] - ETA: 3s
 3296/10000 [========>.....................] - ETA: 3s
 3456/10000 [=========>....................] - ETA: 2s
……
 5248/10000 [==============>...............] - ETA: 2s
 5376/10000 [===============>..............] - ETA: 2s
 5536/10000 [===============>..............] - ETA: 2s
 5664/10000 [===============>..............] - ETA: 1s
 5792/10000 [================>.............] - ETA: 1s
……
 7360/10000 [=====================>........] - ETA: 1s
 7488/10000 [=====================>........] - ETA: 1s
 7648/10000 [=====================>........] - ETA: 1s
 7776/10000 [======================>.......] - ETA: 1s
 7936/10000 [======================>.......] - ETA: 0s
 8064/10000 [=======================>......] - ETA: 0s
 8224/10000 [=======================>......] - ETA: 0s
……
 9760/10000 [============================>.] - ETA: 0s
 9920/10000 [============================>.] - ETA: 0s
10000/10000 [==============================] - 4s 449us/step
loss 0.05686537345089018
acc 0.982
acc: 98.20%
[[ 965    0    4    0    0    0    4    1    2    4]
 [   0 1128    3    0    0    0    0    1    3    0]
 [   0    0 1028    0    0    0    0    1    3    0]
 [   0    0   10  991    0    2    0    2    3    2]
 [   0    0    3    0  967    0    1    1    1    9]
 [   2    0    1    7    1  863    5    1    4    8]
 [   2    3    0    0    3    2  946    0    2    0]
 [   0    1   17    1    1    0    0  987    2   19]
 [   2    0    9    2    0    1    0    1  955    4]
 [   1    4    3    2    8    0    0    0    1  990]]
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         (None, 784)               0
_________________________________________________________________
reshape (Reshape)            (None, 28, 28, 1)         0
_________________________________________________________________
layer_conv1 (Conv2D)         (None, 28, 28, 16)        416
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 14, 14, 16)        0
_________________________________________________________________
layer_conv2 (Conv2D)         (None, 14, 14, 36)        14436
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 36)          0
_________________________________________________________________
flatten (Flatten)            (None, 1764)              0
_________________________________________________________________
dense (Dense)                (None, 128)               225920
_________________________________________________________________
dense_1 (Dense)              (None, 10)                1290
=================================================================
Total params: 242,062
Trainable params: 242,062
Non-trainable params: 0
_________________________________________________________________
(5, 5, 1, 16)
(1, 28, 28, 16)
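The parameter counts in the summary above can be reproduced by hand: a Conv2D layer has kh·kw·in_channels·filters kernel weights plus one bias per filter, and a Dense layer has in_units·out_units weights plus one bias per unit. The 5×5 kernel size is inferred from the printed counts (416 = 5·5·1·16 + 16) and from the (5, 5, 1, 16) weight shape at the bottom of the log. A quick pure-Python sanity check:

```python
def conv2d_params(kh, kw, in_ch, filters):
    # kernel weights + one bias per filter
    return kh * kw * in_ch * filters + filters

def dense_params(in_units, out_units):
    # weight matrix + one bias per output unit
    return in_units * out_units + out_units

# Shapes taken from the model summary above
params = {
    "layer_conv1": conv2d_params(5, 5, 1, 16),     # 416
    "layer_conv2": conv2d_params(5, 5, 16, 36),    # 14436
    "dense":       dense_params(7 * 7 * 36, 128),  # 225920 (flatten yields 1764 inputs)
    "dense_1":     dense_params(128, 10),          # 1290
}
print(params, sum(params.values()))  # total: 242062
```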

 

Design Approach

Core Code

More to come in later updates……

# Reload the trained Functional model, then evaluate and visualize it
import numpy as np
from tensorflow.python.keras.models import load_model

# Load the whole model (architecture + weights) from disk
path_model = 'Functional_model.keras'
model2_1 = load_model(path_model)

# Save only the weights, then load them back
model_weights_path = 'Functional_model_weights.keras'
model2_1.save_weights(model_weights_path)
model2_1.load_weights(model_weights_path, by_name=True)  # match layers by name
model2_1.load_weights(model_weights_path)                # match layers by topological order

# Evaluate on the test set
result = model.evaluate(x=data.x_test, y=data.y_test)
for name, value in zip(model.metrics_names, result):
    print(name, value)
print("{0}: {1:.2%}".format(model.metrics_names[1], result[1]))

# Predict on the test set, then plot misclassified examples and the confusion matrix
y_pred = model.predict(x=data.x_test)
cls_pred = np.argmax(y_pred, axis=1)
plot_example_errors(cls_pred)
plot_confusion_matrix(cls_pred)

# Plot the first 9 test images: true class vs predicted class
images = data.x_test[0:9]
cls_true = data.y_test_cls[0:9]
y_pred = model.predict(x=images)
cls_pred = np.argmax(y_pred, axis=1)
title = 'MNIST(Functional Model): plot predicted example, real VS predict'
plot_images(title, images=images,
            cls_true=cls_true,
            cls_pred=cls_pred)
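In the code above, `model.predict` returns one softmax probability vector per sample, and `np.argmax(y_pred, axis=1)` collapses each vector to the index of its largest entry, i.e. the predicted class. A small self-contained illustration of that step (the probability values here are invented for demonstration):

```python
import numpy as np

# Invented softmax outputs for 3 samples over 10 digit classes (each row sums to 1)
y_pred = np.array([
    [0.01, 0.02, 0.85, 0.02, 0.02, 0.02, 0.02, 0.02, 0.01, 0.01],  # most mass on class 2
    [0.90, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.02, 0.01, 0.01],  # most mass on class 0
    [0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.02, 0.01, 0.90],  # most mass on class 9
])

cls_pred = np.argmax(y_pred, axis=1)   # highest-probability class per row
cls_true = np.array([2, 0, 4])         # pretend ground-truth labels
accuracy = float((cls_pred == cls_true).mean())
print(cls_pred, accuracy)              # [2 0 9] 0.6666666666666666
```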

Author: 就是不给你

Link: https://www.pythonheidong.com/blog/article/49319/08ffe0eeb42175503951/

Source: python黑洞网

Please credit the source for any form of reposting; any infringement, once discovered, will be pursued legally.
