• Andrew Ng Deep Learning Notes: Chapter 2 Quiz


    Reference: https://blog.csdn.net/u013733326/article/details/79865858




    Week 2 Quiz - Neural Network Basics

    1. What does a neuron compute?

      • [ ] A neuron computes an activation function followed by a linear function (z = Wx + b)

      • [x] A neuron computes a linear function (z = Wx + b) followed by an activation function

      • [ ] A neuron computes a function g that scales the input x linearly (Wx + b)

      • [ ] A neuron computes the mean of all features before applying the output to an activation function

      Note: The output of a neuron is a = g(Wx + b) where g is the activation function (sigmoid, tanh, ReLU, …).
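
      A minimal NumPy sketch of that order of operations (the sigmoid choice and the shapes are illustrative assumptions, not part of the quiz):

      import numpy as np

      def sigmoid(z):
          return 1 / (1 + np.exp(-z))

      W = np.random.randn(1, 4)           # weights for 4 input features
      b = 0.5
      x = np.random.randn(4, 1)           # a single example
      z = np.dot(W, x) + b                # first the linear function
      a = sigmoid(z)                      # then the activation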

    2. Which of these is the “Logistic Loss”?

      Note: We are using a cross-entropy loss function: L(ŷ, y) = -(y * log(ŷ) + (1 - y) * log(1 - ŷ)).
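
      The loss for a single example can be evaluated directly; a minimal sketch with made-up values:

      import numpy as np

      y, y_hat = 1, 0.9                   # assumed label and predicted probability
      loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
      print(loss)                         # about 0.105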

    3. Suppose img is a (32,32,3) array, representing a 32x32 image with 3 color channels red, green and blue. How do you reshape this into a column vector?

      • x = img.reshape((32 * 32 * 3, 1))
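
      A quick shape check of this reshape, using a random array in place of a real image:

      import numpy as np

      img = np.random.randn(32, 32, 3)    # stand-in for the image
      x = img.reshape((32 * 32 * 3, 1))
      print(x.shape)                      # (3072, 1): a single column vector
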
    4. Consider the two following random arrays “a” and “b”:

      a = np.random.randn(2, 3) # a.shape = (2, 3)
      b = np.random.randn(2, 1) # b.shape = (2, 1)
      c = a + b

      What will be the shape of “c”?

      b (column vector) is copied 3 times so that it can be summed to each column of a. Therefore, c.shape = (2, 3).
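
      Broadcasting here can be verified against an explicit copy of b; a minimal sketch:

      import numpy as np

      a = np.random.randn(2, 3)
      b = np.random.randn(2, 1)
      c = a + b                                       # b broadcasts across the 3 columns
      print(c.shape)                                  # (2, 3)
      print(np.allclose(c, a + np.tile(b, (1, 3))))   # True: identical to copying b 3 times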

    5. Consider the two following random arrays “a” and “b”:

      a = np.random.randn(4, 3) # a.shape = (4, 3)
      b = np.random.randn(3, 2) # b.shape = (3, 2)
      c = a * b

      What will be the shape of “c”?

      The "*" operator indicates element-wise multiplication, which requires the two shapes to match or be broadcastable; (4, 3) and (3, 2) are neither, so this raises an error.
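
      Both the failure and the contrast with a matrix product can be seen directly; a minimal sketch:

      import numpy as np

      a = np.random.randn(4, 3)
      b = np.random.randn(3, 2)
      try:
          c = a * b                       # element-wise: (4, 3) and (3, 2) do not broadcast
      except ValueError as e:
          print("error:", e)
      print(np.dot(a, b).shape)           # (4, 2): the matrix product is fine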

    6. Suppose you have n_x input features per example. Recall that X=[x^(1), x^(2)…x^(m)]. What is the dimension of X?

      (n_x, m)

      Note: A simple way to validate this is to use the formula Z^(l) = W^(l) A^(l-1) with l = 1; then we have (see the shape check after this list):

      • A^(0) = X
      • X.shape = (n_x, m)
      • Z^(1).shape = (n^(1), m)
      • W^(1).shape = (n^(1), n_x)
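
      A quick shape check of this layout, stacking m column vectors into X (the sizes below are arbitrary):

      import numpy as np

      n_x, m = 5, 10                                  # arbitrary sizes for illustration
      examples = [np.random.randn(n_x, 1) for _ in range(m)]
      X = np.hstack(examples)                         # each x^(i) becomes one column
      print(X.shape)                                  # (5, 10), i.e. (n_x, m)
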
    7. Recall that np.dot(a, b) performs a matrix multiplication on a and b, whereas a * b performs an element-wise multiplication.

      Consider the two following random arrays “a” and “b”:

      a = np.random.randn(12288, 150) # a.shape = (12288, 150)
      b = np.random.randn(150, 45) # b.shape = (150, 45)
      c = np.dot(a, b)

      What is the shape of c?

      c.shape = (12288, 45), this is a simple matrix multiplication example.
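
      As a quick confirmation of the shape rule (rows of a by columns of b; the inner dimension 150 is consumed):

      import numpy as np

      a = np.random.randn(12288, 150)
      b = np.random.randn(150, 45)
      print(np.dot(a, b).shape)           # (12288, 45)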

    8. Consider the following code snippet:

      
      a = np.random.randn(3, 4)   # a.shape = (3,4)
      b = np.random.randn(4, 1)   # b.shape = (4,1)
      c = np.zeros((3, 4))
      for i in range(3):
        for j in range(4):
          c[i][j] = a[i][j] + b[j]

      How do you vectorize this?

      c = a + b.T
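
      Continuing the snippet above (with import numpy as np in scope), the equivalence can be confirmed directly:

      print(np.allclose(c, a + b.T))   # True: b.T has shape (1, 4) and broadcasts over the 3 rows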

    9. Consider the following code:

      a = np.random.randn(3, 3)
      b = np.random.randn(3, 1)
      c = a * b

      What will be c?

      This will invoke broadcasting: b is copied three times to become (3, 3), and "*" is an element-wise product, so c.shape = (3, 3).
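
      The broadcast can be checked against an explicit copy of b (a minimal sketch):

      import numpy as np

      a = np.random.randn(3, 3)
      b = np.random.randn(3, 1)
      print(np.allclose(a * b, a * np.tile(b, (1, 3))))   # True: b is expanded to (3, 3)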

    10. Consider the following computation graph. (Blogger's note: the original figure could not be included; from the derivation below, the graph's intermediate nodes are u = a * b, v = a * c, and w = b + c.)

      J = u + v - w
        = a * b + a * c - (b + c)
        = a * (b + c) - (b + c)
        = (a - 1) * (b + c)

      Answer: (a - 1) * (b + c)
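
      A numeric spot check of the simplification, with arbitrarily chosen values:

      a, b, c = 3.0, 2.0, 5.0
      u, v, w = a * b, a * c, b + c       # the intermediate nodes of the graph
      J = u + v - w
      print(J, (a - 1) * (b + c))         # 14.0 14.0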

  • Original post: https://www.cnblogs.com/Dar-/p/9325697.html