ML notes personal 4

Gradient Boosting


How does it work?

step 1 : model 1 is trivial - it always predicts the average of the output column; call this pred1

step 2 : calculate residual 1 -> res1 = actual - pred1

step 3 : train a decision tree on the input features (iq, cgpa) with res1 as the target

step 4 : use that tree to predict the residuals; call its output pred2

step 5 : res2 = actual - (pred1 + LR*pred2), where LR is the learning rate

step 6 : again train a decision tree on iq, cgpa with res2 as the target, and repeat till res is nearly 0 (a code sketch of the full loop follows below)
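
A minimal sketch of these steps in Python, assuming scikit-learn for the trees. The iq/cgpa feature values and the toy target are made-up placeholders, not data from these notes:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# toy data: two features (iq, cgpa) and a numeric target - illustrative only
X = np.array([[90, 8.1], [110, 6.5], [120, 9.2], [100, 7.4]])
y = np.array([30.0, 42.0, 58.0, 36.0])

lr = 0.1        # learning rate (LR in the steps above)
n_trees = 100

# step 1: model 1 always predicts the average of the output column
pred = np.full_like(y, y.mean())
trees = []

for _ in range(n_trees):
    res = y - pred                        # steps 2/5: residual = actual - predicted
    tree = DecisionTreeRegressor(max_leaf_nodes=8).fit(X, res)  # steps 3/6: tree on residuals
    pred = pred + lr * tree.predict(X)    # step 4/5: add the tree's output, shrunk by LR
    trees.append(tree)

def predict(X_new):
    out = np.full(len(X_new), y.mean())
    for t in trees:
        out = out + lr * t.predict(X_new)
    return out

print(predict(X))  # approaches y as the residuals shrink toward 0
```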

A. AdaBoost vs B. Gradient Boosting

- A - weak learners are decision stumps (max depth of each tree = 1)

- B - deeper trees, typically limited to 8 to 32 leaf nodes (see the sketch below)
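
A rough illustration of this difference using scikit-learn's built-in ensembles; the hyperparameter values are just the typical ranges noted above, not tuned settings:

```python
from sklearn.ensemble import AdaBoostRegressor, GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor

# A. AdaBoost: each weak learner is a stump (max_depth=1)
# (the argument is named base_estimator in scikit-learn < 1.2)
ada = AdaBoostRegressor(estimator=DecisionTreeRegressor(max_depth=1),
                        n_estimators=100)

# B. Gradient boosting: slightly larger trees, ~8 to 32 leaf nodes each
gb = GradientBoostingRegressor(max_leaf_nodes=32, n_estimators=100)
```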

