Coursera Andrew Ng Machine Learning Course, Week 5 Quiz

Programming assignment code on GitHub: https://github.com/coco-1998-2/Andrew-NG-Machine-Learning-Coursera

Everything passes 100% when run locally. Please don't copy the code directly; just use it as a reference when you get stuck debugging. If you find it helpful, don't forget to give it a Star~


  • The normalized feature values can be negative.
  • Mean normalization: subtract the mean, then divide by (max − min); see the sketch below.
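
A minimal NumPy sketch of these two notes (the sample values are made up for illustration, not taken from the course data):

```python
import numpy as np

# Toy feature column (e.g. house sizes); values are made up for illustration.
x = np.array([2104.0, 1600.0, 2400.0, 1416.0, 3000.0])

# Mean normalization: subtract the mean, divide by the range (max - min).
x_norm = (x - x.mean()) / (x.max() - x.min())

print(x_norm)         # some entries are negative, which is expected
print(x_norm.mean())  # roughly zero after normalization
```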


  • If the value of J increases as the iterations proceed, the learning rate α is too large; the toy example below illustrates this.
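
A toy sketch of this behaviour (the cost function and learning rates below are made up for illustration): gradient descent on J(θ) = θ² converges with a small α, but makes J grow on every step when α is too large.

```python
import numpy as np

def run_gradient_descent(alpha, steps=5):
    """Gradient descent on the toy cost J(theta) = theta^2, whose gradient is 2*theta."""
    theta = 1.0
    costs = []
    for _ in range(steps):
        theta -= alpha * 2 * theta
        costs.append(theta ** 2)
    return costs

print(run_gradient_descent(alpha=0.1))  # J shrinks toward 0: converging
print(run_gradient_descent(alpha=1.5))  # J grows every step: alpha is too large
```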


  • When the number of training examples and the number of features are both small, the normal equation can be used directly.

  • When the number of features exceeds roughly 10,000, solving the normal equation becomes slow, and gradient descent should be considered instead; a small sketch of the normal equation follows below.
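
For reference, a minimal NumPy sketch of the normal equation θ = (XᵀX)⁻¹Xᵀy on made-up data; a pseudo-inverse is used so that a non-invertible XᵀX still yields a solution:

```python
import numpy as np

# Made-up design matrix (first column of ones for the intercept term) and targets.
X = np.array([[1.0, 2104.0],
              [1.0, 1600.0],
              [1.0, 2400.0],
              [1.0, 1416.0]])
y = np.array([400.0, 330.0, 369.0, 232.0])

# Normal equation: theta = pinv(X'X) X'y  (closed form, no iterations, no alpha to tune)
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)
```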


In-video question from the Backpropagation Algorithm video:


Exercise 1
Which of the following statements about diagnostics are true? Check all that apply.
A.It's hard to tell what will work to improve a learning algorithm, so the best approach is to go with gut feeling and just see what works.
B.Diagnostics can give guidance as to what might be more fruitful things to try to improve a learning algorithm.
C.Diagnostics can be time-consuming to implement and try, but they can still be a very good use of your time.
D.A diagnostic can sometimes rule out certain courses of action (changes to your learning algorithm) as being unlikely to improve its performance significantly.
Answer: B, C, D
Analysis:
A. Wrong: blindly guessing at how to improve the algorithm and just seeing what works is not a sound approach.
B. Correct: diagnostics give guidance on which changes are likely to be worth trying.
C. Correct: diagnostics can take quite a bit of time to implement, but that time is well spent.
D. Correct: a diagnostic can rule out certain changes to the learning algorithm as unlikely to improve its performance significantly, which saves time that would otherwise be wasted.


Gradient Checking:
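
A minimal sketch of the idea, assuming a generic cost function rather than the course's neural-network cost: each partial derivative is approximated numerically as (J(θ + ε·eᵢ) − J(θ − ε·eᵢ)) / (2ε) with a small ε (about 10⁻⁴) and compared against the gradient that backpropagation produced.

```python
import numpy as np

def numerical_gradient(cost, theta, eps=1e-4):
    """Approximate each partial derivative of cost(theta) with a centered difference."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grad[i] = (cost(theta + step) - cost(theta - step)) / (2 * eps)
    return grad

def toy_cost(theta):
    """Toy cost J(theta) = sum(theta_i^2); its exact gradient is 2*theta."""
    return np.sum(theta ** 2)

theta = np.array([1.0, -2.0, 0.5])
backprop_grad = 2 * theta  # stands in for the gradient computed by backpropagation
diff = np.linalg.norm(numerical_gradient(toy_cost, theta) - backprop_grad)
print(diff)  # should be very small if the two gradients agree
```

In the course exercise the same comparison is made against the unrolled gradient from the backpropagation code, and gradient checking is switched off before actual training because the numerical approximation is slow.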


Random Initialization: 
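
A minimal sketch of the symmetry-breaking initialization, with made-up layer sizes: every entry of a layer's Θ matrix is drawn uniformly from [−ε_init, ε_init] instead of being set to zero, so that the hidden units do not all compute the same function.

```python
import numpy as np

def rand_initialize_weights(l_in, l_out, epsilon_init=0.12):
    """Weights for a layer with l_in inputs (+1 bias unit) and l_out units,
    drawn uniformly from [-epsilon_init, epsilon_init] to break symmetry."""
    return np.random.rand(l_out, l_in + 1) * 2 * epsilon_init - epsilon_init

# Example layer sizes (made up): 400 input units, 25 hidden units.
Theta1 = rand_initialize_weights(400, 25)
print(Theta1.shape)                 # (25, 401)
print(Theta1.min(), Theta1.max())   # all values lie within [-0.12, 0.12]
```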


Suppose you are using gradient descent together with backpropagation to try to minimize J(Θ) as a function of Θ. Which of the following would be a useful step for verifying that the learning algorithm is running correctly?

A) Plot J(Θ) as a function of Θ, to make sure gradient descent is going downhill.

B) Plot J(Θ) as a function of the number of iterations and make sure it is increasing (or at least non-decreasing) with every iteration.

C) Plot J(Θ) as a function of the number of iterations and make sure it is decreasing (or at least non-increasing) with every iteration.

D) Plot J(Θ) as a function of the number of iterations to make sure the parameter values are improving in classification accuracy.

The answer is C. Plotting J(Θ) against the number of iterations and checking that it decreases (or at least does not increase) on every iteration is the standard way to verify that gradient descent is running correctly. Option A is impractical because Θ usually has far too many components to plot J against it directly, B has the desired behaviour backwards, and D confuses minimizing J(Θ) with maximizing classification accuracy. The toy learning-rate example earlier in this post shows the same check: with a suitable α the recorded costs shrink on each step, and with a too-large α they grow.
