• TensorFlow Notes 11: Using tf.nn.dropout()


    The tf.nn.dropout function
    From the official documentation:
    
    tf.nn.dropout(
        x,
        keep_prob,
        noise_shape=None,
        seed=None,
        name=None
    )
    Defined in tensorflow/python/ops/nn_ops.py.
    
    See the guides: Layers (contrib) > Higher level ops for building neural network layers, Neural Network > Activation Functions
    
    Computes dropout.
    
    With probability keep_prob, outputs the input element scaled up by 1 / keep_prob, otherwise outputs 0. The scaling is so that the expected sum is unchanged.
    
    By default, each element is kept or dropped independently. If noise_shape is specified, 
    it must be broadcastable to the shape of x, and only dimensions with noise_shape[i] == shape(x)[i] will make independent decisions. For example,

    if shape(x) = [k, l, m, n] and noise_shape = [k, 1, 1, n], each batch and channel component will be kept independently and each row and column will be kept or not kept together.

    Args:
        x: A floating point tensor.
        keep_prob: A scalar Tensor with the same type as x. The probability that each element is kept.
        noise_shape: A 1-D Tensor of type int32, representing the shape for randomly generated keep/drop flags.
        seed: A Python integer. Used to create random seeds. See tf.set_random_seed for behavior.
        name: A name for this operation (optional).

    Returns:
        A Tensor of the same shape of x.

    Raises:
        ValueError: If keep_prob is not in (0, 1] or if x is not a floating point tensor.
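    To make the rule concrete before the TensorFlow examples, here is a minimal NumPy sketch (not from the original post) of what the op computes: each element is kept independently with probability keep_prob, survivors are scaled by 1/keep_prob, and the expected sum is therefore unchanged.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.ones((10, 10), dtype=np.float32)
    keep_prob = 0.8

    # keep each element independently with probability keep_prob
    keep_mask = rng.random(x.shape) < keep_prob
    # scale survivors by 1/keep_prob so the expected sum stays the same
    dropped = np.where(keep_mask, x / keep_prob, 0.0)

    print(x.sum())        # 100.0
    print(dropped.sum())  # close to 100.0 on average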

    Usage notes:

    The keep_prob argument is the fraction of elements to keep. With keep_prob = 0.8, each element is zeroed out with probability 0.2 (so roughly 20% of the values become 0), and the surviving elements are multiplied by 1/keep_prob. The larger keep_prob is, the more elements are kept.

    The noise_shape argument is the shape of the randomly generated keep/drop mask. It defaults to None, in which case every element is kept or dropped independently. For example, if the data has shape(x) = [k, l, m, n] and noise_shape = [k, 1, 1, n], then the 1st and 4th dimensions make independent keep/drop decisions, while along the 2nd and 3rd dimensions the elements are either all kept or all dropped together, because the mask is broadcast across those dimensions.

    Code example:

    import os
    import numpy as np
    import tensorflow as tf

    # a 10x10 variable filled with ones
    x = tf.Variable(tf.ones([10, 10]))

    # keep each element with probability 0.8; survivors are scaled by 1/0.8 = 1.25
    inputs = tf.nn.dropout(x, 0.8)

    # tf.initialize_all_variables() is deprecated; use tf.global_variables_initializer()
    init = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init)
        print(x.eval())
        print(inputs.eval())

    Output:

    [[1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
     [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]]
    [[1.25 1.25 0.   1.25 1.25 1.25 1.25 1.25 1.25 0.  ]
     [0.   1.25 1.25 1.25 1.25 1.25 1.25 1.25 1.25 1.25]
     [1.25 1.25 1.25 1.25 1.25 1.25 1.25 1.25 1.25 1.25]
     [1.25 1.25 1.25 1.25 1.25 0.   1.25 1.25 1.25 1.25]
     [1.25 1.25 1.25 1.25 1.25 0.   1.25 1.25 0.   1.25]
     [1.25 1.25 1.25 1.25 1.25 0.   0.   1.25 1.25 1.25]
     [0.   1.25 1.25 0.   1.25 1.25 1.25 0.   1.25 0.  ]
     [1.25 0.   0.   1.25 1.25 1.25 1.25 1.25 1.25 1.25]
     [1.25 1.25 1.25 0.   1.25 1.25 1.25 0.   0.   0.  ]
     [1.25 1.25 0.   0.   0.   0.   1.25 1.25 1.25 1.25]]
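    Surviving elements in the output are scaled to 1/0.8 = 1.25, so the expected sum of the matrix is unchanged. In a real model, keep_prob is usually fed through a placeholder so that dropout is active during training and disabled (keep_prob = 1.0) at evaluation time. A rough TF 1.x-style sketch of that common pattern (the layer and names here are illustrative, not from the original post):

    import tensorflow as tf

    keep_prob = tf.placeholder(tf.float32, name="keep_prob")
    hidden = tf.layers.dense(tf.ones([4, 16]), 32, activation=tf.nn.relu)
    hidden_drop = tf.nn.dropout(hidden, keep_prob)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # training step: drop roughly 30% of the activations
        train_out = sess.run(hidden_drop, feed_dict={keep_prob: 0.7})
        # evaluation: keep_prob = 1.0 keeps everything (dropout disabled)
        eval_out = sess.run(hidden_drop, feed_dict={keep_prob: 1.0})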

    Adding noise_shape:

    import os
    import numpy as np
    import tensorflow as tf

    # a 3x3x3 variable filled with ones
    x = tf.Variable(tf.ones([3, 3, 3]))

    # noise_shape=[3, 1, 3]: the keep/drop decision is shared along the second
    # dimension, so within each 3x3 slice every column is kept or dropped as a whole
    inputs = tf.nn.dropout(x, 0.5, noise_shape=[3, 1, 3])

    init = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init)
        print(x.eval())
        print(inputs.eval())

    Output:

    [[[1. 1. 1.]
      [1. 1. 1.]
      [1. 1. 1.]]
    
     [[1. 1. 1.]
      [1. 1. 1.]
      [1. 1. 1.]]
    
     [[1. 1. 1.]
      [1. 1. 1.]
      [1. 1. 1.]]]
    [[[0. 2. 2.]
      [0. 2. 2.]
      [0. 2. 2.]]
    
     [[2. 2. 2.]
      [2. 2. 2.]
      [2. 2. 2.]]
    
     [[0. 2. 2.]
      [0. 2. 2.]
      [0. 2. 2.]]]
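    The examples above use the TensorFlow 1.x API. In TensorFlow 2.x, tf.nn.dropout takes a rate argument (the fraction of elements to drop, i.e. 1 - keep_prob) instead of keep_prob, and no session or initializer is needed under eager execution. A rough equivalent of the noise_shape example, assuming TF 2.x is installed:

    import tensorflow as tf  # assumes TensorFlow 2.x

    x = tf.ones([3, 3, 3])
    # rate=0.5 drops each element with probability 0.5; noise_shape=[3, 1, 3]
    # shares the keep/drop decision across the second dimension, as above
    outputs = tf.nn.dropout(x, rate=0.5, noise_shape=[3, 1, 3])

    print(x.numpy())
    print(outputs.numpy())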

  • Original post: https://www.cnblogs.com/lovychen/p/9445563.html