Andrew Ng's Machine Learning on Coursera (II)

Logistic Regression

1. Classification / Hypothesis Representation

In the previous class we talked about linear regression; today we move into a new field: classification.
Suppose we have tumor sizes and we need to predict whether each tumor is malignant or benign.
  
We fit a linear regression line and define a threshold of 0.5: if the hypothesis value is greater than 0.5 we predict malignant, otherwise we predict benign.

But what if we add one more training example with a very large tumor size? The fitted line changes, and the 0.5 threshold no longer separates the two classes correctly.

Clearly, linear regression is not a good method for classification.

So we need something better: a new model appears!

The so-called sigmoid function and the logistic function are the same thing: h(x) = g(θᵀx), where g(z) = 1 / (1 + e^(-z)).
The graph of g is shown at the bottom right: 0 ≤ g(z) ≤ 1, with g(z) ≥ 0.5 when z ≥ 0 and g(z) < 0.5 when z < 0.
 
From the picture above, we know that given x and θ, h(x) is interpreted as the probability P(y = 1 | x; θ), and P(y = 0 | x; θ) + P(y = 1 | x; θ) = 1.
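A minimal NumPy sketch of the sigmoid and the hypothesis; the values of theta and x below are made up purely for demonstration:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^{-z}), always between 0 and 1
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    # h(x) = g(theta^T x), read as P(y = 1 | x; theta)
    return sigmoid(np.dot(theta, x))

# Made-up example; x[0] = 1 is the intercept feature x0.
theta = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 0.3, 0.8])
p_y1 = hypothesis(theta, x)
print(p_y1, 1.0 - p_y1)  # P(y=1) and P(y=0) sum to 1
```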

2. Decision Boundary

The decision boundary is the boundary where h(x) = 0.5, i.e. θᵀx = 0; it separates the region where we predict y = 1 from the region where we predict y = 0.
For example, when h(x) = g(θ0 + θ1x1 + θ2x2) and the parameter θ = [-3, 1, 1]ᵀ:
predict y = 1 if -3 + x1 + x2 ≥ 0
predict y = 0 if -3 + x1 + x2 < 0
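A minimal sketch of this decision rule, using the θ = [-3, 1, 1]ᵀ above (the test points are made up):

```python
import numpy as np

def predict(theta, x):
    # predict y = 1 when theta^T x >= 0, i.e. h(x) >= 0.5
    return 1 if np.dot(theta, x) >= 0 else 0

theta = np.array([-3.0, 1.0, 1.0])                # theta = [-3, 1, 1]^T
print(predict(theta, np.array([1.0, 2.0, 2.0])))  # -3 + 2 + 2 >= 0 -> predicts 1
print(predict(theta, np.array([1.0, 1.0, 1.0])))  # -3 + 1 + 1 <  0 -> predicts 0
```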
 
Apart from linear boundaries, we can also have non-linear boundaries. For example, with h(x) = g(θ0 + θ1x1 + θ2x2 + θ3x1² + θ4x2²) and θ = [-1, 0, 0, 1, 1]ᵀ, we predict y = 1 whenever x1² + x2² ≥ 1, so the decision boundary is the circle x1² + x2² = 1.

3. Cost Function

For a logistic regression model we have h(x) = 1 / (1 + e^(-θᵀx)), so the cost for a single training example is:

Cost(h(x), y) = -log(h(x))      if y = 1
Cost(h(x), y) = -log(1 - h(x))  if y = 0

Since y = 0 or y = 1, we can rewrite the cost function into a single expression:

J(θ) = -(1/m) * Σ [ y(i) * log h(x(i)) + (1 - y(i)) * log(1 - h(x(i))) ]   (summing i = 1 .. m)
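A minimal vectorized sketch of J(θ), assuming a design matrix X that already includes an intercept column of ones and a 0/1 label vector y:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J(theta) = -(1/m) * sum( y*log(h) + (1-y)*log(1-h) ), with h = g(X @ theta)
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1.0 - y) * np.log(1.0 - h))
```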
And we use gradient descent to minimize the cost function J(θ), repeatedly updating all the parameters θj simultaneously.

So we have

θj := θj - (α/m) * Σ (h(x(i)) - y(i)) * xj(i)   (for every j, summing i = 1 .. m)
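And a sketch of one batch gradient descent step under the same assumptions (X with an intercept column, 0/1 labels y, some learning rate alpha):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(theta, X, y, alpha):
    # theta_j := theta_j - (alpha/m) * sum_i (h(x_i) - y_i) * x_ij, for all j at once
    m = len(y)
    h = sigmoid(X @ theta)
    grad = (X.T @ (h - y)) / m   # gradient of J(theta)
    return theta - alpha * grad
```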

Do you notice that this update rule looks exactly the same as the one we derived for linear regression in week 2? The only difference is the definition of h(x).

4. Advanced Optimization


Besides gradient descent, there are other optimization algorithms such as conjugate gradient, BFGS, and L-BFGS. They are more complex, but they usually converge faster and do not require manually picking a learning rate α.
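As a sketch of what this can look like in practice, here is SciPy's BFGS minimizing the logistic cost on a tiny made-up dataset (the data and helper functions are illustrative assumptions, not from the course):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1.0 - y) * np.log(1.0 - h))

def grad(theta, X, y):
    m = len(y)
    return (X.T @ (sigmoid(X @ theta) - y)) / m

# Made-up toy data; the first column is the intercept feature x0 = 1.
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 2.0], [1.0, 2.5], [1.0, 3.5]])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0])

result = minimize(cost, x0=np.zeros(X.shape[1]), args=(X, y), jac=grad, method='BFGS')
print(result.x)  # fitted theta, with no learning rate chosen by hand
```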



