Naive Bayes Algorithm
Naive Bayes is used to solve classification problems. The algorithm is based on Bayes' theorem, but before understanding Bayes' theorem, let's revise some probability concepts:
- Independent Events :
ex, rolling a die
Probability of 1 on top : 1/6
Probability of 2 on top : 1/6
Probability of 5 on top : 1/6
Here, each roll is independent of the others: one outcome does not change the probability of the next.
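A quick check of this in code, assuming a fair six-sided die: each face has probability 1/6, and for independent events the probabilities simply multiply.

```python
from fractions import Fraction

# A fair six-sided die: every face has the same probability.
p_face = Fraction(1, 6)

# Independent events multiply: P(roll a 1, then roll a 2) = P(1) * P(2).
p_one_then_two = p_face * p_face

print(p_face)          # 1/6
print(p_one_then_two)  # 1/36
```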
- Dependent Events :
ex, a box has 3 red balls and 2 blue balls.
Probability of drawing a red ball : 3/5
Probability of then drawing a blue ball : 2/4
Once one ball is removed, the total number of balls drops by 1 for the next draw, so the second probability depends on the first outcome. These are dependent events.
P(A and B) = P(A)P(B|A)
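The ball example can be checked against this chain rule directly: the chance of drawing a red ball and then a blue ball is the product of the first probability and the conditional second one.

```python
from fractions import Fraction

# Box with 3 red and 2 blue balls, drawing without replacement.
p_red = Fraction(3, 5)             # P(first ball is red)
p_blue_given_red = Fraction(2, 4)  # P(second is blue | first was red)

# Chain rule for dependent events: P(A and B) = P(A) * P(B|A)
p_red_then_blue = p_red * p_blue_given_red
print(p_red_then_blue)  # 3/10
```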
We know that,
P(A and B) = P(B and A)
P(A)P(B|A) = P(B)P(A|B)
Bayes' Theorem :
P(A|B) = P(B|A) P(A) / P(B)
where:
A, B = events
P(A|B) = probability of A given B
P(B|A) = probability of B given A
P(A), P(B) = the independent probabilities of A and B
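A minimal sketch of the theorem as a function; the numbers plugged in below are made-up values for illustration, not taken from any dataset in this post.

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Illustrative numbers (assumed): P(B|A) = 0.8, P(A) = 0.3, P(B) = 0.5
print(bayes(0.8, 0.3, 0.5))  # ≈ 0.48
```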
For,
Independent features : X1, X2, ..., Xn
Dependent feature (target) : y
To classify, we compare P(Yes | X1,...,Xn) and P(No | X1,...,Xn). Here, the denominator P(X1,...,Xn) is the same for both and is constant, so we can ignore it when comparing.
These scores involve multiplying the probabilities of the individual features. Because the denominator was dropped, the two scores may not sum to 1 directly, so we need to normalize them: divide each score by the sum of both.
P(Yes | sunny,Hot) = P(Yes) P(Sunny | Yes)P(Hot | Yes)
P(No | sunny,Hot) = P(No) P(Sunny | No)P(Hot | No)
P(Yes | sunny,Hot) = P(Yes) P(Sunny | Yes)P(Hot | Yes)
= (9/14)(2/9)(2/9)
= 0.0317
P(No | sunny,Hot) = P(No) P(Sunny | No)P(Hot | No)
= (5/14)(3/5)(2/5)
= 0.0857
Normalizing: P(Yes | Sunny,Hot) = 0.0317 / (0.0317 + 0.0857) ≈ 0.27, so P(No | Sunny,Hot) ≈ 0.73 and the prediction is No.
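Putting the worked example into code: the fractions 9/14 and 2/9 above match the classic 14-row "play tennis" weather dataset, so the remaining counts below are assumed from that dataset; treat them as an illustration rather than the post's exact table.

```python
from fractions import Fraction

# Assumed priors from the classic weather ("play tennis") dataset:
# 9 "Yes" days and 5 "No" days out of 14.
p_yes = Fraction(9, 14)
p_no = Fraction(5, 14)

# Conditional probabilities of each feature given the class.
p_sunny_given_yes = Fraction(2, 9)
p_hot_given_yes = Fraction(2, 9)
p_sunny_given_no = Fraction(3, 5)  # assumed from the same dataset
p_hot_given_no = Fraction(2, 5)    # assumed from the same dataset

# Unnormalized Naive Bayes scores: prior times product of feature likelihoods.
score_yes = p_yes * p_sunny_given_yes * p_hot_given_yes
score_no = p_no * p_sunny_given_no * p_hot_given_no

# Normalize so the two posteriors sum to 1.
total = score_yes + score_no
print(float(score_yes))          # ≈ 0.0317
print(float(score_no))           # ≈ 0.0857
print(float(score_yes / total))  # ≈ 0.27 -> predict "No" for (Sunny, Hot)
```

The normalization step is exactly the one described above: each raw score is divided by the sum of both scores, turning them into probabilities that add up to 1.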