by Francesco Boi
Last Updated September 11, 2019 09:19 AM

In The Elements of Statistical Learning, when introducing Linear Discriminant Analysis, the book says:

A simple application of Bayes theorem gives us

$Pr(G=k|X=x) = \frac{f_k(x)\pi_k}{\sum_{l=1}^K f_l(x)\pi_l}$

where $\pi_k$ is the prior probability of class $k$ and $f_k(x)$ is the class-conditional probability.

- What is the class-conditional probability? Is it $Pr(X=x|G=k)$?
- How is the above equation derived from Bayes' theorem? I know that $Pr(G=k|X=x) = \frac{Pr(X=x|G=k)Pr(G=k)}{Pr(X=x)}$.

I also know that $Pr(G=k)=\pi_k$, but I do not know how to derive the rest of the equation.
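To make the quantities in the question concrete, here is a toy sketch (my own assumed setup, not from the book) with two classes whose class-conditional densities $f_k(x)$ are Gaussians and whose priors are $\pi_k$; it evaluates $f_k(x)\pi_k$ for each class and normalises by the sum over all classes, exactly as in the quoted formula:

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) at x — plays the role of f_k(x)."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

# Assumed toy parameters: pi_k = Pr(G = k), and each f_k is N(mean_k, std_k^2).
priors = [0.6, 0.4]
means = [0.0, 2.0]
stds = [1.0, 1.0]

x = 1.0
# Numerator f_k(x) * pi_k for each class k, then normalise by the sum over l.
numerators = [gaussian_pdf(x, m, s) * p for m, s, p in zip(means, stds, priors)]
posteriors = [n / sum(numerators) for n in numerators]

print(posteriors)  # Pr(G = k | X = x) for k = 0, 1; the values sum to 1
```

At $x=1$ the two Gaussian densities are equal, so the posterior reduces to the priors $[0.6, 0.4]$, which shows how the priors and the densities each contribute.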
