For personal note use only
The basic questions
- how to design a combiner?
- how to generate base classifiers?
combination
combiner types
- fixed rules based on crisp labels or confidences [estimated posterior probabilities]
- specially trained rules based on classifier confidences
- generally trained rules interpreting base-classifier outputs as features
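The two fixed-rule families above can be sketched with toy data: majority voting on crisp labels, and averaging estimated posterior probabilities. All inputs below are hypothetical classifier outputs invented for illustration.

```python
import numpy as np

# Hypothetical crisp labels: 4 objects x 3 base classifiers (class indices).
crisp = np.array([[0, 1, 1],
                  [1, 1, 0],
                  [0, 0, 0],
                  [1, 0, 1]])

def majority_vote(labels):
    """Fixed rule on crisp labels: most frequent class per object."""
    return np.array([np.bincount(row).argmax() for row in labels])

# Hypothetical posteriors: 2 objects x 3 classifiers x 2 classes.
posteriors = np.array([
    [[0.9, 0.1], [0.4, 0.6], [0.3, 0.7]],
    [[0.2, 0.8], [0.1, 0.9], [0.6, 0.4]],
])

def mean_rule(p):
    """Fixed rule on confidences: average posteriors, then take argmax."""
    return p.mean(axis=1).argmax(axis=1)

print(majority_vote(crisp))      # -> [1 1 0 1]
print(mean_rule(posteriors))     # -> [0 1]
```

Note that the two rules can disagree: for the first posterior object, two of three classifiers vote for class 1, but the averaged posteriors favour class 0.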
do not need to estimate all parameters in one go,
which reduces the computation
combination —> regularization
combining classifiers often leads to a regularization effect
Bagging [Bootstrap Aggregating]
- select a training set size
- select random subsets of training objects [originally: a bootstrap sample]
- train a classifier [originally: a decision tree]
- combine [originally: majority vote]
- stabilize volatile classifiers
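The bagging steps above can be sketched end to end. This is a minimal illustration on invented 1-D data, using a decision stump as the base classifier instead of a full decision tree, to keep it self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D two-class data: two overlapping Gaussians.
X = np.concatenate([rng.normal(-1, 1, 50), rng.normal(1, 1, 50)])
y = np.concatenate([np.zeros(50, int), np.ones(50, int)])

def train_stump(X, y):
    """Base classifier: decision stump (best threshold on the feature)."""
    best = None
    for t in np.unique(X):
        for sign in (1, -1):
            pred = (sign * (X - t) > 0).astype(int)
            err = np.mean(pred != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda Z: (sign * (Z - t) > 0).astype(int)

def bagging(X, y, n_rounds=25):
    """Bagging: bootstrap-sample the training set, train one base
    classifier per sample, combine by majority vote."""
    stumps = []
    for _ in range(n_rounds):
        # bootstrap: draw n objects with replacement
        idx = rng.integers(0, len(X), len(X))
        stumps.append(train_stump(X[idx], y[idx]))
    def predict(Z):
        votes = np.stack([s(Z) for s in stumps])
        return (votes.mean(axis=0) > 0.5).astype(int)
    return predict

clf = bagging(X, y)
print(np.mean(clf(X) == y))   # training accuracy of the ensemble
```

Stumps on 1-D data are already fairly stable, so the gain here is modest; the stabilizing effect is strongest for volatile base classifiers such as deep, unpruned trees.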
boosting
- initialize all objects with an equal weight
- select a training set size according to the object weights
- train a weak classifier
- increase the weights of the erroneously classified objects
- repeat as long as needed
- combine
- improves performance of weak classifiers
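The key step distinguishing boosting from bagging is the second one: sampling the next training set according to the current object weights, so that previously misclassified objects are more likely to be drawn. A minimal sketch of just that step, with hypothetical weights:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical object weights after a few boosting rounds:
# objects 2 and 3 were misclassified and carry most of the mass.
weights = np.array([0.05, 0.05, 0.40, 0.40, 0.10])
weights = weights / weights.sum()   # weights must form a distribution

# Draw a training set of the same size, favouring high-weight objects.
idx = rng.choice(len(weights), size=len(weights), p=weights)
print(sorted(idx))
```

Objects 2 and 3 will dominate the drawn sample on average, which forces the next weak classifier to concentrate on the hard cases.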
AdaBoost
- sample a training set according to the set of object weights [initially equal]
- use it to train a simple [weak] classifier
- classify the entire data set, using the weights, to get the error estimate ε_t
- compute and store the classifier weight α_t = ½ ln((1 − ε_t)/ε_t)
- multiply the weights of erroneously classified objects by e^{α_t} and of correctly classified objects by e^{−α_t}
- go back to the first step as long as needed
- Final classifier: weighted voting with weights α_t
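The AdaBoost loop above can be sketched on invented 1-D data with ±1 labels. One simplification compared to the notes: this sketch uses the common reweighting variant (the stump minimizes the weighted error directly) rather than resampling the training set, which avoids sampling noise in a short example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D two-class data with labels in {-1, +1}.
X = np.concatenate([rng.normal(-1, 1, 50), rng.normal(1, 1, 50)])
y = np.concatenate([-np.ones(50), np.ones(50)])

def train_stump(X, y, w):
    """Weak classifier: decision stump minimizing the weighted error."""
    best = None
    for t in np.unique(X):
        for sign in (1, -1):
            pred = np.where(sign * (X - t) > 0, 1.0, -1.0)
            err = np.sum(w * (pred != y))
            if best is None or err < best[1]:
                best = ((t, sign), err)
    (t, sign), err = best
    return (lambda Z: np.where(sign * (Z - t) > 0, 1.0, -1.0)), err

def adaboost(X, y, n_rounds=20):
    n = len(X)
    w = np.full(n, 1.0 / n)                 # initially equal object weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        h, eps = train_stump(X, y, w)       # weak classifier + weighted error
        eps = max(eps, 1e-10)               # guard against log(1/0)
        alpha = 0.5 * np.log((1 - eps) / eps)   # classifier weight
        pred = h(X)
        # errors get e^{alpha}, correct objects get e^{-alpha}
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                        # renormalize to a distribution
        stumps.append(h)
        alphas.append(alpha)
    def predict(Z):
        # final classifier: sign of the alpha-weighted vote
        F = sum(a * h(Z) for a, h in zip(alphas, stumps))
        return np.sign(F)
    return predict

clf = adaboost(X, y)
print(np.mean(clf(X) == y))   # training accuracy of the boosted ensemble
```

The update `w *= exp(-alpha * y * pred)` matches the step in the notes: for a misclassified object `y * pred = -1`, so its weight is multiplied by e^{α_t}; for a correct one by e^{−α_t}.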