Tuesday, July 21, 2020

Naive Bayes Classifier VS Logistic Regression


Both classifiers score classes with functions that are linear in the features: the Naive Bayes log-posterior and the Logistic Regression logit are both linear in $x$.

Joint Distribution VS Conditional Distribution


Naive Bayes models the joint distribution:  $P(X,Y) = P(Y)P(X|Y)$

Logistic Regression models the conditional distribution: $P(Y|X)$
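
The two views are connected by Bayes' rule: Naive Bayes fits the joint distribution and then normalizes it to obtain the conditional actually used for prediction,

$P(Y|X) = \frac{P(X,Y)}{P(X)} = \frac{P(Y)P(X|Y)}{\sum_{y'} P(Y=y')P(X|Y=y')}$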

Correlated VS Independent features



Naive Bayes assumes the features are conditionally independent given the class, and (in the multinomial variant) treats repeated occurrences of the same feature as independent events.

Logistic Regression implicitly accounts for correlated features when training its weights, since the weights are fit jointly and redundant features end up sharing weight.
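
A small NumPy sketch of why correlated (here: duplicated) features hurt Naive Bayes but not a jointly trained Logistic Regression. The values of log_prior and log_theta below are made-up placeholders, not fitted parameters: duplicating a feature doubles its contribution to the Naive Bayes score, whereas Logistic Regression, trained jointly, would typically split the weight between the two copies.

    import numpy as np

    # Hypothetical multinomial Naive Bayes parameters (2 classes, 3 features);
    # values are made up for illustration only.
    log_prior = np.log(np.array([0.5, 0.5]))
    log_theta = np.log(np.array([[0.6, 0.3, 0.1],
                                 [0.2, 0.3, 0.5]]))

    x = np.array([3.0, 1.0, 2.0])          # feature counts for one sample

    # Class scores: log(pi_c) + sum_k x_k * log(theta_ck)
    print(log_prior + log_theta @ x)

    # Duplicate feature 0: its log-probability is simply added a second time,
    # so its evidence is counted twice (a refit model would renormalize theta,
    # but the double counting of correlated evidence persists).
    x_dup = np.array([3.0, 3.0, 1.0, 2.0])
    log_theta_dup = np.log(np.array([[0.6, 0.6, 0.3, 0.1],
                                     [0.2, 0.2, 0.3, 0.5]]))
    print(log_prior + log_theta_dup @ x_dup)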

Generative VS Discriminative


Naive Bayes is a generative model.

Logistic Regression is a discriminative model.


Notes


Naive Bayes:

Predict the class $c$ with the highest log-posterior for sample $n$:
$\hat{y}_n = \arg\max_{c}\left[\log(\pi_{c}) + \sum_{k=1}^{K} x_{nk}\cdot \log(\theta_{ck})\right]$
where $\pi_c$ is the class prior and $\theta_{ck}$ is the probability of feature $k$ under class $c$.
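
A minimal NumPy sketch of this decision rule, assuming the prior pi and per-class parameters theta have already been estimated (all names and values below are placeholders):

    import numpy as np

    def naive_bayes_predict(X, pi, theta):
        """Return the arg-max class for each row of X.

        X:     (N, K) matrix of feature counts
        pi:    (C,)   class priors
        theta: (C, K) per-class feature probabilities
        """
        # Log-posterior up to a constant: log(pi_c) + sum_k x_nk * log(theta_ck)
        log_scores = np.log(pi) + X @ np.log(theta).T   # shape (N, C)
        return np.argmax(log_scores, axis=1)

    # Tiny usage example with made-up parameters.
    pi = np.array([0.6, 0.4])
    theta = np.array([[0.7, 0.2, 0.1],
                      [0.1, 0.3, 0.6]])
    X = np.array([[5, 1, 0],
                  [0, 2, 4]])
    print(naive_bayes_predict(X, pi, theta))   # -> [0 1]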

Logistic Classification:

$P(Y_n|x_n) = g(W^{T}x_n)$
where $g$ is the softmax function in the multiclass case and the sigmoid in the binary case.
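
A matching sketch of the prediction step, again with untrained placeholder weights W and bias b; softmax handles the multiclass case and the code falls back to the sigmoid when there is a single output column:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)   # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def logistic_predict_proba(X, W, b):
        """P(Y | x) for each row of X.

        X: (N, K) features, W: (K, C) weights, b: (C,) biases.
        """
        scores = X @ W + b
        if scores.shape[1] == 1:                 # binary: P(Y=1 | x)
            return sigmoid(scores)
        return softmax(scores)

    # Usage with made-up weights.
    W = np.array([[0.5, -0.5], [1.0, 0.0], [-1.0, 1.0]])
    b = np.array([0.1, -0.1])
    X = np.array([[1.0, 2.0, 0.5]])
    print(logistic_predict_proba(X, W, b))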




MLE vs MAP Notes

Cross Entropy VS MSE Notes

