[Andrew Ng Course Quiz] Course 1 - Neural Networks and Deep Learning - Week 2 Quiz



Week 2 Quiz - Neural Network Basics

  1. What does a neuron compute?

    • [ ] A neuron computes an activation function followed by a linear function (z = Wx + b)

    • [x] A neuron computes a linear function (z = Wx + b) followed by an activation function

    • [ ] A neuron computes a function g that scales the input x linearly (Wx + b)

    • [ ] A neuron computes the mean of all features before applying the output to an activation function

    Note: The output of a neuron is a = g(Wx + b) where g is the activation function (sigmoid, tanh, ReLU, …).
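
    As a minimal NumPy sketch of this computation (sigmoid is chosen as g purely for illustration):

    import numpy as np

    def neuron(W, x, b):
        z = np.dot(W, x) + b        # linear function first
        a = 1 / (1 + np.exp(-z))    # then the activation (sigmoid here)
        return a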

  2. Which of these is the “Logistic Loss”?

    Answer: L(ŷ, y) = -( y log(ŷ) + (1 - y) log(1 - ŷ) )

    Note: This is the cross-entropy loss function.
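
    A minimal NumPy sketch of this loss (y_hat and y are assumed to be arrays of predictions and labels):

    import numpy as np

    def logistic_loss(y_hat, y):
        # cross-entropy loss, averaged over the m examples
        return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))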

  3. Suppose img is a (32,32,3) array, representing a 32x32 image with 3 color channels red, green and blue. How do you reshape this into a column vector?

    • x = img.reshape((32 * 32 * 3, 1))
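
    A quick check of the resulting shape (a minimal sketch; the pixel values here are random):

    import numpy as np

    img = np.random.randn(32, 32, 3)
    x = img.reshape((32 * 32 * 3, 1))
    assert x.shape == (3072, 1)  # 32 * 32 * 3 = 3072
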
  4. Consider the two following random arrays “a” and “b”:

    a = np.random.randn(2, 3) # a.shape = (2, 3)
    b = np.random.randn(2, 1) # b.shape = (2, 1)
    c = a + b

    What will be the shape of “c”?

    b (a column vector) is broadcast, i.e. copied 3 times, so that it can be added to each column of a. Therefore, c.shape = (2, 3).
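
    The broadcast is equivalent to explicitly tiling b across the 3 columns (a minimal check):

    import numpy as np

    a = np.random.randn(2, 3)
    b = np.random.randn(2, 1)
    # broadcasting copies b across columns, matching an explicit np.tile
    assert np.allclose(a + b, a + np.tile(b, (1, 3)))
    assert (a + b).shape == (2, 3)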

  5. Consider the two following random arrays “a” and “b”:

    a = np.random.randn(4, 3) # a.shape = (4, 3)
    b = np.random.randn(3, 2) # b.shape = (3, 2)
    c = a * b

    What will be the shape of “c”?

    The “*” operator performs element-wise multiplication, which requires the two arrays to have the same (or broadcast-compatible) shapes. Since (4, 3) and (3, 2) are neither equal nor broadcastable, this raises an error.
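
    A minimal check that this fails (the exact error message may vary by NumPy version):

    import numpy as np

    a = np.random.randn(4, 3)
    b = np.random.randn(3, 2)
    try:
        c = a * b
    except ValueError as e:
        print(e)  # e.g. operands could not be broadcast together with shapes (4,3) (3,2)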

  6. Suppose you have n_x input features per example. Recall that X=[x^(1), x^(2)…x^(m)]. What is the dimension of X?

    (n_x, m)

    Note: A quick way to validate this is to use the formula Z^(l) = W^(l)A^(l-1) + b^(l) with l = 1, where A^(0) = X. Then we have (see the sketch after this list):

    • A^(0) = X
    • X.shape = (n_x, m)
    • Z^(1).shape = (n^(1), m)
    • W^(1).shape = (n^(1), n_x)
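
    As a quick sanity check of the shape (a minimal sketch; n_x = 5 and m = 10 are arbitrary):

    import numpy as np

    n_x, m = 5, 10
    # stack m examples x^(i), each of shape (n_x, 1), as the columns of X
    X = np.hstack([np.random.randn(n_x, 1) for _ in range(m)])
    assert X.shape == (n_x, m)
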
  7. Recall that np.dot(a,b) performs a matrix multiplication on a and b, whereas a*b performs an element-wise multiplication.

    Consider the two following random arrays “a” and “b”:

    a = np.random.randn(12288, 150) # a.shape = (12288, 150)
    b = np.random.randn(150, 45) # b.shape = (150, 45)
    c = np.dot(a, b)

    What is the shape of c?

    c.shape = (12288, 45). This is a simple matrix multiplication: (12288, 150) x (150, 45) -> (12288, 45).
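
    A minimal check of the shape rule (m, n) x (n, p) -> (m, p):

    import numpy as np

    a = np.random.randn(12288, 150)
    b = np.random.randn(150, 45)
    c = np.dot(a, b)  # inner dimensions (150 and 150) match, so this works
    assert c.shape == (12288, 45)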

  8. Consider the following code snippet:

    # a.shape = (3, 4)
    # b.shape = (4, 1)
    c = np.zeros((3, 4))  # c must be initialized before the loop runs
    for i in range(3):
      for j in range(4):
        c[i][j] = a[i][j] + b[j]

    How do you vectorize this?

    c = a + b.T
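
    A minimal check that the vectorized form matches the loop (b[j, 0] extracts the scalar entry of b):

    import numpy as np

    a = np.random.randn(3, 4)
    b = np.random.randn(4, 1)
    c_loop = np.zeros((3, 4))
    for i in range(3):
        for j in range(4):
            c_loop[i, j] = a[i, j] + b[j, 0]
    assert np.allclose(c_loop, a + b.T)  # b.T has shape (1, 4) and broadcasts over the rows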

  9. Consider the following code:

    a = np.random.randn(3, 3)
    b = np.random.randn(3, 1)
    c = a * b

    What will be the shape of c?

    This will invoke broadcasting: b is copied three times to become (3, 3), and “*” is an element-wise product, so c.shape = (3, 3).
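
    A minimal check (broadcasting the element-wise product matches explicit tiling):

    import numpy as np

    a = np.random.randn(3, 3)
    b = np.random.randn(3, 1)
    assert np.allclose(a * b, a * np.tile(b, (1, 3)))
    assert (a * b).shape == (3, 3)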

  10. Consider the following computation graph, where u = a * b, v = a * c, and w = b + c. (The original figure is not reproduced here.)

    J = u + v - w
      = a * b + a * c - (b + c)
      = a * (b + c) - (b + c)
      = (a - 1) * (b + c)

    Answer: (a - 1) * (b + c)
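
    A numeric spot-check of the simplification (the values of a, b, and c are arbitrary):

    a, b, c = 2.0, 3.0, 4.0
    u, v, w = a * b, a * c, b + c
    J = u + v - w
    assert J == (a - 1) * (b + c)  # 7.0 == 7.0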
