• "Classifying plankton with deep neural networks" notes


    • Cross-entropy loss is not quite the same as optimizing classification accuracy, although the two are correlated.
    • Deep learning approaches are often said to require enormous amounts of data to work well, but that is not necessarily true: in this competition there are only about 30,000 examples for 121 classes.
      To make this work with limited data, some tricks are:
      • dropout
      • weight decay
      • data augmentation
      • pre-training
      • pseudo-labeling
      • parameter sharing
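      The post's code isn't reproduced in these notes; as one example from the list, dropout can be sketched in plain numpy. This is the common "inverted dropout" convention (scale at training time so no rescaling is needed at test time); the function and parameter names are illustrative, not from the post:

      ```python
      import numpy as np

      def dropout(x, p=0.5, rng=None, train=True):
          """Inverted dropout: zero each activation with probability p during
          training and rescale the survivors by 1/(1-p), so the expected
          activation matches test time (when the input passes through unchanged)."""
          if not train or p == 0.0:
              return x
          rng = rng if rng is not None else np.random.default_rng(0)
          mask = rng.random(x.shape) >= p  # True = keep this unit
          return x * mask / (1.0 - p)
      ```

      At test time (`train=False`) the input is returned untouched, which is what makes the training-time rescaling convenient.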
    • The method is implemented based on Theano:
      • Python, Numpy, Theano, cuDNN, PyCUDA, Lasagne
      • scikit-image: pre-processing and data augmentation
      • ghalton: quasi-random number generation
    • Hardware:
      • GTX 980, GTX 680, Tesla K40
    • Pre-processing and data augmentation:
      • Normalization: per-pixel zero mean, unit variance
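      Per-pixel normalization means the mean and standard deviation are computed for each pixel position over the training set, not one scalar per image. A minimal numpy sketch (array shapes and the epsilon are illustrative assumptions, not from the post):

      ```python
      import numpy as np

      # Hypothetical training batch: (num_examples, height, width).
      train = np.random.default_rng(0).normal(5.0, 2.0, size=(100, 8, 8))

      # Per-pixel statistics over the training set (axis 0 = examples),
      # giving one mean and one std per (height, width) position.
      pixel_mean = train.mean(axis=0)
      pixel_std = train.std(axis=0)

      # Small epsilon guards against division by zero for constant pixels.
      normalized = (train - pixel_mean) / (pixel_std + 1e-8)
      ```

      The same `pixel_mean` and `pixel_std` would then be reused to normalize validation and test images.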

    to be finished

  • Original article: https://www.cnblogs.com/nn0p/p/4346076.html