Both classification models have decision boundaries that are linear functions of the features
Joint Distribution VS Conditional Distribution
Naive Bayes models the joint distribution: P(X, Y)
Logistic Regression models the conditional distribution: P(Y|X)
Correlated VS Independent features
Naive Bayes assumes the features are conditionally independent given the class, so correlated or repeated features get double-counted as evidence.
Logistic Regression implicitly captures correlation between features when training its weights.
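The double-counting point above can be illustrated with a toy sketch (the per-feature probability 0.7 is a made-up number): because Naive Bayes sums independent per-feature evidence, duplicating a feature doubles its contribution to the log-odds, whereas a trained logistic regression would simply split the learned weight across the copies.

```python
import math

# Hypothetical setup: one feature whose presence favors the positive class
# with P(feature | y=1) = 0.7 vs P(feature | y=0) = 0.3.
log_odds_per_count = math.log(0.7 / 0.3)

def nb_log_odds(counts):
    """Naive Bayes log-odds: each feature count contributes independently."""
    return sum(c * log_odds_per_count for c in counts)

original   = [3]      # one feature observed 3 times
duplicated = [3, 3]   # the same feature copied: NB sees twice the evidence

print(nb_log_odds(original))
print(nb_log_odds(duplicated))  # exactly double the original log-odds
```

Logistic regression, trained on the duplicated data, would learn half the weight for each copy and produce the same prediction as before.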
Generative VS Discriminative
Naive Bayes is a generative model.
Logistic Regression is a discriminative model.
Notes
Naive Bayes:
Predict the class y_n with the highest posterior probability:
log P(y_n | x_n) ∝ log(π_{y_n}) + ∑_{k=1}^{K} x_{nk} · log(θ_{y_n, k})
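The Naive Bayes decision rule above can be sketched directly in Python. The priors π and per-class feature probabilities θ below are made-up numbers for illustration:

```python
import math

# Hypothetical parameters: 2 classes, 3 features (multinomial Naive Bayes).
log_prior = [math.log(0.6), math.log(0.4)]   # log(pi_y)
theta = [[0.5, 0.3, 0.2],                    # theta_{y,k} = P(feature k | class y)
         [0.1, 0.2, 0.7]]

def nb_log_posterior(x, y):
    """log P(y | x) up to a constant: log(pi_y) + sum_k x_k * log(theta_{y,k})."""
    return log_prior[y] + sum(xk * math.log(t) for xk, t in zip(x, theta[y]))

def predict(x):
    """Return the class with the highest log-posterior score."""
    scores = [nb_log_posterior(x, y) for y in range(len(log_prior))]
    return max(range(len(scores)), key=scores.__getitem__)

print(predict([0, 1, 4]))   # counts concentrated on feature 3
print(predict([4, 1, 0]))   # counts concentrated on feature 1
```

Working in log space avoids underflow when many small probabilities are multiplied together.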
Logistic Classification:
P(y_n | x) = g(ω_{y_n}^T x)
g = softmax function if multiclass, sigmoid if binary
MLE vs MAP Notes
Cross Entropy VS MSE Notes