Pattern Recognition course notes: combining, bagging and boosting

For personal notes only.


The basic questions

  • how to design a combiner?
  • how to generate base classifiers?

combination


combiner types

  • fixed rules based on crisp labels or confidences [estimated posterior probabilities]
  • specially trained rules based on classifier confidences
  • generally trained rules interpreting base-classifier outputs as features; e.g., assuming the feature sets $f$ and $\phi$ are conditionally independent given the class $\omega$:
    $p(f, \phi \mid \omega) = p(f \mid \omega)\, p(\phi \mid \omega)$

With this factorization, the full set of parameters does not have to be estimated in one go, which reduces the computation.
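
As a hedged illustration (my own, not from the course): a minimal numpy sketch of a fixed product-rule combiner, assuming each base classifier outputs estimated posteriors and that class priors are equal; the name `product_rule` is an assumption.

```python
import numpy as np

def product_rule(posteriors):
    """Fixed product-rule combiner over base-classifier confidences.

    posteriors: list of (n_samples, n_classes) arrays of estimated
    posterior probabilities, one per base classifier. Multiplying them
    mirrors the factorization p(f, phi | omega) = p(f | omega) p(phi | omega),
    up to a constant, under the equal-priors assumption.
    """
    combined = np.prod(np.stack(posteriors), axis=0)
    return combined / combined.sum(axis=1, keepdims=True)  # renormalize rows

# toy example: two classifiers, one object, two classes
p1 = np.array([[0.8, 0.2]])
p2 = np.array([[0.6, 0.4]])
print(product_rule([p1, p2]))  # [[0.857, 0.143]] -> class 0 wins
```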

combination → regularization

Combining classifiers often has a regularizing effect.

Bagging [Bootstrap Aggregating]

  1. select a training set size $m' < m$
  2. select at random $n$ subsets of $s'$ training objects [originally: bootstrap]
  3. train a classifier on each subset [originally: a decision tree]
  4. combine [originally: majority vote]
  5. effect: stabilizes volatile classifiers [a sketch follows below]
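
A minimal sketch of these steps (mine, not the course code), using scikit-learn decision trees as base classifiers, full-size bootstrap samples, and majority vote; the names `bagging_fit` and `bagging_predict` are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_classifiers=25, seed=0):
    """Train one tree per bootstrap sample (drawn with replacement)."""
    rng = np.random.default_rng(seed)
    ensemble = []
    for _ in range(n_classifiers):
        idx = rng.integers(0, len(X), size=len(X))  # bootstrap indices
        ensemble.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return ensemble

def bagging_predict(ensemble, X):
    """Majority vote over crisp labels; assumes non-negative integer labels."""
    votes = np.stack([clf.predict(X) for clf in ensemble])  # (n_clf, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

Because every tree sees a perturbed training set, the majority vote averages out their individual fluctuations, which is the stabilization named in step 5.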

boosting

  1. initialize all objects with an equal weight
  2. select a training set of size $m' < m$ according to the object weights
  3. train a weak classifier
  4. increase the weights of the erroneously classified objects
  5. repeat as long as needed
  6. combine
  7. effect: improves the performance of weak classifiers [a sketch of the reweighting step follows below]
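
The upweighting factor in step 4 is variant-specific; AdaBoost's exact rule is given in the next section. A hedged sketch of the generic reweighting step, with the name `reweight` and the factor of 2 as illustrative assumptions:

```python
import numpy as np

def reweight(weights, misclassified, factor=2.0):
    """One generic boosting round: upweight the errors, renormalize.

    weights       : (n,) current object weights, summing to 1
    misclassified : (n,) boolean mask from the weak classifier
    factor        : how strongly errors are emphasized (variant-specific)
    """
    weights = weights * np.where(misclassified, factor, 1.0)
    return weights / weights.sum()

w = np.full(4, 0.25)                                   # step 1: equal weights
w = reweight(w, np.array([True, False, False, False])) # step 4
print(w)                                               # [0.4, 0.2, 0.2, 0.2]
```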

AdaBoost

  1. sample a training set according to the set of object weights [initially equal]
  2. use it to train a simple [weak] classifier $\omega_i$
  3. classify the entire data set, using the weights, to get the weighted error estimate $\xi_i$
  4. store the classifier weight $a_i = 0.5 \log((1 - \xi_i)/\xi_i)$
  5. multiply the weights of erroneously classified objects by $\exp(a_i)$ and of correctly classified objects by $\exp(-a_i)$, then renormalize
  6. go to 1 as long as needed
  7. Final classifier: weighted voting with the weights $a_i$
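
As a worked number: a round with weighted error $\xi_i = 0.1$ gets weight $a_i = 0.5 \log(0.9/0.1) \approx 1.1$, so its misclassified objects are upweighted by $\exp(1.1) \approx 3$. Below is a minimal sketch of the resampling variant described above, assuming two classes labelled $\pm 1$ and decision stumps as weak classifiers; names like `adaboost_fit` are mine, not from the course.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=20, seed=0):
    """Minimal AdaBoost (resampling variant) for labels y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)                   # step 1: initially equal weights
    classifiers, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=w)      # sample according to weights
        clf = DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])
        pred = clf.predict(X)                 # classify the entire data set
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # xi_i
        a = 0.5 * np.log((1 - err) / err)     # classifier weight a_i
        w *= np.exp(np.where(pred != y, a, -a))  # step 5: reweight objects
        w /= w.sum()                          # renormalize
        classifiers.append(clf)
        alphas.append(a)
    return classifiers, alphas

def adaboost_predict(classifiers, alphas, X):
    """Final classifier: sign of the a_i-weighted vote."""
    scores = sum(a * clf.predict(X) for a, clf in zip(alphas, classifiers))
    return np.sign(scores)
```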