Topic classification of Reuters newswires with Keras

Reference: "Deep Learning with Python".
Just for fun!!!

import keras
from keras.datasets import reuters
import matplotlib.pyplot as plt
import numpy as np

Using TensorFlow backend.

1 Loading the Reuters dataset

(train_data,train_label),(test_data,test_label)=reuters.load_data(num_words=10000)

1.1 Splitting off a validation set (slicing)

val_data=train_data[:1000]
train_data=train_data[1000:]
val_label=train_label[:1000]
train_label=train_label[1000:]
print(train_data.shape)
print(test_data.shape)
len(train_data[0])

(7982,)
(2246,)
626

1.2 Decoding the indices back into news text

word_index=reuters.get_word_index()
rev_word_index=dict([(value,key) for (key,value) in word_index.items()])
dec=' '.join([rev_word_index.get(i-3,'?') for i in train_data[1]])
dec

'? qtly div 19 cts vs 19 cts prior pay april 15 record april one reuter 3'
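
The `i-3` offset is needed because the Reuters encoding reserves indices 0, 1, and 2 for the padding, start-of-sequence, and unknown-word markers. A minimal sketch with a hypothetical three-word index:

```python
# Toy word index for illustration only; the real one comes from
# reuters.get_word_index() and has tens of thousands of entries.
word_index = {'the': 1, 'of': 2, 'reuter': 3}
rev_word_index = {v: k for k, v in word_index.items()}

encoded = [1, 4, 5, 6]  # 1 is the start marker; real words are shifted by 3
decoded = ' '.join(rev_word_index.get(i - 3, '?') for i in encoded)
print(decoded)  # -> '? the of reuter'
```

The start marker maps to a negative key, so `.get(..., '?')` renders it as `?`, which is exactly the leading `?` in the decoded output above.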

2 Data encoding (one-hot)

def one_hot(seq,dim=10000):
    res=np.zeros((len(seq),dim))
    for i,j in enumerate(seq):
        res[i,j]=1
    return res
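
As a quick sanity check of this helper (restated below so the snippet runs on its own), each sample becomes a multi-hot row: a 1 at every word index that occurs, no matter how often. `dim=5` is a toy vocabulary size for illustration:

```python
import numpy as np

def one_hot(seq, dim=10000):
    res = np.zeros((len(seq), dim))
    for i, j in enumerate(seq):
        res[i, j] = 1  # j is a list of indices; fancy indexing sets them all
    return res

samples = [[0, 2, 2], [1, 3]]  # two toy samples over a 5-word vocabulary
encoded = one_hot(samples, dim=5)
print(encoded)
# [[1. 0. 1. 0. 0.]
#  [0. 1. 0. 1. 0.]]
```

Note that the repeated index 2 in the first sample still yields a single 1: this encoding records presence, not counts.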

2.1 Encoding the data

train_data=one_hot(train_data)
val_data=one_hot(val_data)
test_data=one_hot(test_data)

2.2 Encoding the labels

train_label=keras.utils.to_categorical(train_label)
val_label=keras.utils.to_categorical(val_label)
test_label=keras.utils.to_categorical(test_label)
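
`keras.utils.to_categorical` one-hot encodes the integer topic labels (0–45) into length-46 vectors, the shape that `categorical_crossentropy` expects. A NumPy sketch of what it does, on a toy 5-class example:

```python
import numpy as np

def to_one_hot(labels, num_classes):
    res = np.zeros((len(labels), num_classes))
    res[np.arange(len(labels)), labels] = 1.0  # one 1 per row, at the label's index
    return res

labels = np.array([3, 4, 3])
encoded = to_one_hot(labels, num_classes=5)
print(encoded)
# [[0. 0. 0. 1. 0.]
#  [0. 0. 0. 0. 1.]
#  [0. 0. 0. 1. 0.]]
```

Alternatively, Keras lets you skip this step entirely by keeping integer labels and compiling with `loss='sparse_categorical_crossentropy'`.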

3 Building the model architecture

model=keras.models.Sequential()
model.add(keras.layers.Dense(64,activation='relu',input_shape=(10000,)))
model.add(keras.layers.Dense(64,activation='relu'))
model.add(keras.layers.Dense(46,activation='softmax'))
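
Each `Dense` layer contributes `in_dim * out_dim` weights plus `out_dim` biases, so the size of this stack can be checked by hand (the total should agree with `model.count_params()`):

```python
def dense_params(in_dim, out_dim):
    return in_dim * out_dim + out_dim  # weight matrix + bias vector

total = (dense_params(10000, 64)   # input layer:    640064
         + dense_params(64, 64)    # hidden layer:     4160
         + dense_params(64, 46))   # softmax output:   2990
print(total)  # -> 647214
```

Almost all of the capacity sits in the first layer, a direct consequence of the 10000-dimensional multi-hot input.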

4 Defining the optimizer and loss function

model.compile(optimizer='rmsprop',loss='categorical_crossentropy',metrics=['accuracy'])
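
`categorical_crossentropy` scores each sample as the negative log of the probability the softmax assigns to the true class. A minimal NumPy illustration on a made-up 3-class prediction:

```python
import numpy as np

def categorical_crossentropy(y_true_onehot, y_pred):
    # -sum(t * log(p)) reduces to -log(p[true_class]) for one-hot targets
    return -np.sum(y_true_onehot * np.log(y_pred))

y_true = np.array([0.0, 1.0, 0.0])  # true class is index 1
y_pred = np.array([0.2, 0.7, 0.1])  # hypothetical softmax output
loss = categorical_crossentropy(y_true, y_pred)
print(loss)  # -log(0.7), roughly 0.357
```

The loss is 0 only when the model puts probability 1 on the correct topic, and grows without bound as that probability approaches 0.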

5 Training + validation

his=model.fit(train_data,train_label,epochs=20,batch_size=512,validation_data=(val_data,val_label))

Train on 7982 samples, validate on 1000 samples
Epoch 1/20
7982/7982 [==============================] - 2s 292us/step - loss: 2.5309 - acc: 0.4959 - val_loss: 1.7227 - val_acc: 0.6110
Epoch 2/20
7982/7982 [==============================] - 1s 179us/step - loss: 1.4463 - acc: 0.6877 - val_loss: 1.3463 - val_acc: 0.7060
Epoch 3/20
7982/7982 [==============================] - 1s 170us/step - loss: 1.0953 - acc: 0.7648 - val_loss: 1.1710 - val_acc: 0.7440
Epoch 4/20
7982/7982 [==============================] - 1s 168us/step - loss: 0.8697 - acc: 0.8161 - val_loss: 1.0806 - val_acc: 0.7580
Epoch 5/20
7982/7982 [==============================] - 1s 174us/step - loss: 0.7030 - acc: 0.8472 - val_loss: 0.9834 - val_acc: 0.7820
Epoch 6/20
7982/7982 [==============================] - 2s 192us/step - loss: 0.5660 - acc: 0.8796 - val_loss: 0.9419 - val_acc: 0.8020
Epoch 7/20
7982/7982 [==============================] - 1s 181us/step - loss: 0.4578 - acc: 0.9048 - val_loss: 0.9090 - val_acc: 0.8010
Epoch 8/20
7982/7982 [==============================] - 1s 167us/step - loss: 0.3691 - acc: 0.9231 - val_loss: 0.9381 - val_acc: 0.7890
Epoch 9/20
7982/7982 [==============================] - 1s 165us/step - loss: 0.3030 - acc: 0.9312 - val_loss: 0.8910 - val_acc: 0.8090
Epoch 10/20
7982/7982 [==============================] - 1s 165us/step - loss: 0.2537 - acc: 0.9416 - val_loss: 0.9066 - val_acc: 0.8120
Epoch 11/20
7982/7982 [==============================] - 1s 168us/step - loss: 0.2182 - acc: 0.9469 - val_loss: 0.9192 - val_acc: 0.8140
Epoch 12/20
7982/7982 [==============================] - 1s 163us/step - loss: 0.1873 - acc: 0.9511 - val_loss: 0.9070 - val_acc: 0.8130
Epoch 13/20
7982/7982 [==============================] - 1s 171us/step - loss: 0.1699 - acc: 0.9523 - val_loss: 0.9364 - val_acc: 0.8070
Epoch 14/20
7982/7982 [==============================] - 1s 167us/step - loss: 0.1535 - acc: 0.9555 - val_loss: 0.9675 - val_acc: 0.8060
Epoch 15/20
7982/7982 [==============================] - 1s 172us/step - loss: 0.1389 - acc: 0.9559 - val_loss: 0.9707 - val_acc: 0.8150
Epoch 16/20
7982/7982 [==============================] - 1s 165us/step - loss: 0.1313 - acc: 0.9559 - val_loss: 1.0249 - val_acc: 0.8050
Epoch 17/20
7982/7982 [==============================] - 1s 173us/step - loss: 0.1218 - acc: 0.9582 - val_loss: 1.0294 - val_acc: 0.7960
Epoch 18/20
7982/7982 [==============================] - 1s 164us/step - loss: 0.1198 - acc: 0.9579 - val_loss: 1.0454 - val_acc: 0.8030
Epoch 19/20
7982/7982 [==============================] - 1s 166us/step - loss: 0.1139 - acc: 0.9598 - val_loss: 1.0980 - val_acc: 0.7980
Epoch 20/20
7982/7982 [==============================] - 1s 172us/step - loss: 0.1112 - acc: 0.9595 - val_loss: 1.0721 - val_acc: 0.8010

6 Processing and visualizing the results

6.1 Extracting the returned metrics

his_dict=his.history
loss=his_dict['loss']
val_loss=his_dict['val_loss']
acc=his_dict['acc']
val_acc=his_dict['val_acc']
epoch=range(1,len(loss)+1)
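
`his.history` is a plain dict mapping each metric name to a per-epoch list. Note the keys here follow the older Keras naming (`acc`/`val_acc`); recent releases use `accuracy`/`val_accuracy` instead. A mock of the structure (values abridged from the training log above):

```python
his_dict = {
    'loss':     [2.5309, 1.4463],
    'val_loss': [1.7227, 1.3463],
    'acc':      [0.4959, 0.6877],
    'val_acc':  [0.6110, 0.7060],
}
epoch = range(1, len(his_dict['loss']) + 1)  # x-axis for the plots: 1-based epochs
print(list(epoch))  # -> [1, 2]
```

Because every list has one entry per epoch, the same `epoch` range serves as the x-axis for both the loss and the accuracy plots below.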

6.2 Plotting the loss curves

plt.plot(epoch,loss,'b',label='train_loss')
plt.plot(epoch,val_loss,'r',label='val_loss')
plt.title('train and validation')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()

6.3 Plotting the accuracy curves

plt.clf()
plt.plot(epoch,acc,'k',label='train_acc')
plt.plot(epoch,val_acc,'g',label='val_acc')
plt.title('train and validation')
plt.xlabel('epoch')
plt.ylabel('acc')
plt.legend()
plt.show()

7 Testing and prediction (checking the results)

7.1 Testing

test_loss,test_acc=model.evaluate(test_data,test_label)

2246/2246 [==============================] - 0s 197us/step

print('test_loss=',test_loss,'\ntest_acc=',test_acc)

test_loss= 1.216040284741912
test_acc= 0.778717720444884

7.2 Prediction

prediction=model.predict(test_data)
print('predict_result=',np.argmax(prediction[0]))
print('correct_result=',np.argmax(test_label[0]))

predict_result= 3
correct_result= 3
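
Each row of `prediction` is a softmax distribution over the 46 topics, so its entries sum to 1 and can be ranked rather than just arg-maxed. A toy 5-class row for illustration:

```python
import numpy as np

pred = np.array([0.05, 0.10, 0.05, 0.60, 0.20])  # hypothetical softmax row

print(abs(pred.sum() - 1.0) < 1e-6)  # -> True: probabilities sum to 1
top3 = np.argsort(pred)[::-1][:3]    # classes ranked by confidence
print(top3.tolist())  # -> [3, 4, 1]
```

Ranking like this is handy when a single hard decision is too coarse, e.g. for showing the model's top-3 candidate topics per newswire.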

The final result is decent enough, all things considered...

Published on 2024-02-02 04:44:44.

Link: https://www.4u4v.net/it/170682028541436.html
