• Keras module study: notes on the Sequential model


        These notes were compiled and published by the cnblogs blogger 圆柱模板. Please credit the source when reposting. Thank you!

      Sequential is a linear stack of network layers.

      You can construct the model by passing a list of layers to Sequential:

        

    from keras.models import Sequential
    from keras.layers import Dense, Activation

    model = Sequential([
        Dense(32, input_dim=784),
        Activation('relu'),
        Dense(10),
        Activation('softmax'),
    ])
    

      You can also add layers to the model one at a time with the .add() method:

       

    model = Sequential()
    model.add(Dense(32, input_dim=784))
    model.add(Activation('relu'))
    

      

    You can also combine two Sequential models into one via a Merge layer, as sketched below.
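    A minimal sketch of such a merge, assuming the Keras 1.x Merge layer that matches the era of the API in these notes (it was removed in Keras 2 in favor of the functional API); the branch and output sizes are illustrative only:

    from keras.models import Sequential
    from keras.layers import Dense, Merge

    # Two independent branches over the same 784-dimensional input
    left_branch = Sequential()
    left_branch.add(Dense(32, input_dim=784))

    right_branch = Sequential()
    right_branch.add(Dense(32, input_dim=784))

    # Concatenate the two branch outputs into a single 64-dimensional tensor
    merged = Merge([left_branch, right_branch], mode='concat')

    final_model = Sequential()
    final_model.add(merged)
    final_model.add(Dense(10, activation='softmax'))
    # When training, this model takes a list of two inputs:
    # final_model.fit([x_left, x_right], y, ...)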

    Methods of the Sequential model:

      

    # Configure the learning process before training
    compile(self, optimizer, loss, metrics=[], sample_weight_mode=None)

    # Train the model for a fixed number of epochs
    fit(self, x, y, batch_size=32, nb_epoch=10, verbose=1, callbacks=[], validation_split=0.0, validation_data=None, shuffle=True, class_weight=None, sample_weight=None)

    # Compute the loss (and any metrics) on test data, batch by batch
    evaluate(self, x, y, batch_size=32, verbose=1, sample_weight=None)

    # Generate output predictions for the input data, batch by batch; returns a numpy array of predictions
    predict(self, x, batch_size=32, verbose=0)

    # Generate class predictions for the input data, batch by batch; returns a numpy array of predicted classes
    predict_classes(self, x, batch_size=32, verbose=1)

    # Generate per-class probability predictions for the input data, batch by batch; returns a numpy array of class probabilities
    predict_proba(self, x, batch_size=32, verbose=1)

    # Run a single gradient update on one batch of data
    train_on_batch(self, x, y, class_weight=None, sample_weight=None)

    # Evaluate the model on a single batch of data
    test_on_batch(self, x, y, sample_weight=None)

    # Return predictions for a single batch of data
    predict_on_batch(self, x)

    # Train the model on batches yielded indefinitely by a Python generator
    fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose=1, callbacks=[], validation_data=None, nb_val_samples=None, class_weight=None, max_q_size=10)

    # Evaluate the model on batches yielded by a generator
    evaluate_generator(self, generator, val_samples, max_q_size=10)
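
      The snippet below strings several of these methods together; it is a minimal sketch in the same Keras 1.x-era API as the signatures above (the random dummy data, layer sizes, and 'sgd' optimizer are assumptions for illustration only):

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense, Activation

    model = Sequential()
    model.add(Dense(32, input_dim=784))
    model.add(Activation('relu'))
    model.add(Dense(10))
    model.add(Activation('softmax'))

    # Configure the learning process
    model.compile(optimizer='sgd', loss='categorical_crossentropy', metrics=['accuracy'])

    # Dummy data: 1000 samples with 784 features, one-hot labels over 10 classes
    x_train = np.random.random((1000, 784))
    y_train = np.eye(10)[np.random.randint(10, size=1000)]

    # Train, evaluate, and predict batch by batch
    model.fit(x_train, y_train, batch_size=32, nb_epoch=10)
    loss_and_metrics = model.evaluate(x_train, y_train, batch_size=32)
    classes = model.predict_classes(x_train, batch_size=32)
    probas = model.predict_proba(x_train, batch_size=32)

    # A generator must yield (x, y) batches forever; in the Keras 1.x
    # signature, samples_per_epoch counts samples rather than batches
    def batch_generator(x, y, batch_size=32):
        while True:
            for i in range(0, len(x), batch_size):
                yield x[i:i + batch_size], y[i:i + batch_size]

    model.fit_generator(batch_generator(x_train, y_train), samples_per_epoch=1000, nb_epoch=2)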
    

      

  • Original post: https://www.cnblogs.com/68xi/p/8661250.html