• Python Deep Learning Notes 09 -- Building a Recurrent Neural Network with Keras


    6.2 Understanding recurrent neural networks

    6.2.1 Recurrent layers in Keras:

    from keras.models import Sequential
    from keras.layers import Embedding, SimpleRNN

    # model = Sequential()
    # model.add(Embedding(10000, 32))
    # model.add(SimpleRNN(32))
    # print(model.summary())
    '''
    Parameter counts in model.summary():
    Embedding layer: 10000 * 32 = 320,000
    SimpleRNN layer: input_dim*units + units*units + units
    input_dim is the dimensionality of the input vectors, here 32 (the Embedding output dimension)
    units is the dimensionality of the recurrent state, here 32 (the SimpleRNN units argument)
    The last term is the bias, one value per unit.
    So: 32*32 + 32*32 + 32 = 2080
    '''
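    # A quick arithmetic check of the formulas above (added as a small sanity
    # check, not part of the model code):
    assert 10000 * 32 == 320000             # Embedding parameters
    assert 32 * 32 + 32 * 32 + 32 == 2080   # SimpleRNN parameters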
    # model = Sequential()
    # model.add(Embedding(10000, 32))
    # model.add(SimpleRNN(32, return_sequences=True))
    # print(model.summary())
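    # With return_sequences=True the SimpleRNN outputs its state at every
    # timestep, shape (batch, timesteps, 32), instead of only the last state,
    # shape (batch, 32).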


    # model = Sequential()
    # model.add(Embedding(10000, 32))
    # model.add(SimpleRNN(32, return_sequences=True))
    # model.add(SimpleRNN(32, return_sequences=True))
    # model.add(SimpleRNN(32, return_sequences=True))  # returns the full sequence of states
    # model.add(SimpleRNN(32))  # the last layer returns only the final output
    # print(model.summary())
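    # When stacking recurrent layers, every layer except the last one needs
    # return_sequences=True so that the next layer receives a 3D sequence as input.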


    from keras.datasets import imdb
    from keras.preprocessing import sequence

    max_features = 10000  # number of words to consider as features
    maxlen = 500  # cut texts after this number of words (among top max_features most common words)
    batch_size = 32

    print('Loading data...')
    (input_train, y_train), (input_test, y_test) = imdb.load_data(num_words=max_features)
    print(len(input_train), 'train sequences')
    print(len(input_test), 'test sequences')

    print('Pad sequences (samples x time)')
    input_train = sequence.pad_sequences(input_train, maxlen=maxlen)
    input_test = sequence.pad_sequences(input_test, maxlen=maxlen)
    print('input_train shape:', input_train.shape)
    print('input_test shape:', input_test.shape)
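    # pad_sequences truncates reviews longer than maxlen and pads shorter ones
    # with zeros (both at the front by default), so both arrays end up with
    # shape (25000, 500).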

    from keras.layers import Dense

    model = Sequential()
    model.add(Embedding(max_features, 32))
    model.add(SimpleRNN(32))
    model.add(Dense(1, activation='sigmoid'))
    # print(model.summary())


    model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['acc'])
    history = model.fit(input_train, y_train,
                        epochs=10,
                        batch_size=128,
                        validation_split=0.2)


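    # Optionally, the trained model could also be scored on the held-out test
    # set loaded above, e.g.:
    # test_loss, test_acc = model.evaluate(input_test, y_test, batch_size=batch_size)
    # print('test_acc:', test_acc)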
    import matplotlib.pyplot as plt

    acc = history.history['acc']
    val_acc = history.history['val_acc']
    loss = history.history['loss']
    val_loss = history.history['val_loss']

    epochs = range(len(acc))

    plt.plot(epochs, acc, 'bo', label='Training acc')
    plt.plot(epochs, val_acc, 'b', label='Validation acc')
    plt.title('Training and validation accuracy')
    plt.legend()

    plt.figure()

    plt.plot(epochs, loss, 'bo', label='Training loss')
    plt.plot(epochs, val_loss, 'b', label='Validation loss')
    plt.title('Training and validation loss')
    plt.legend()

    plt.show()

    # loss: 0.0107 - acc: 0.9970 - val_loss: 0.6327 - val_acc: 0.8286
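    # Note the gap between ~99.7% training accuracy and ~83% validation
    # accuracy: the model overfits, and a plain SimpleRNN also handles long
    # sequences (here 500 timesteps) relatively poorly.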
• Original post: https://www.cnblogs.com/asenyang/p/14325307.html