Commit

Merge branch 'main' of github.com:thomasgassmann/eth-cheatsheets
thomasgassmann committed Jun 22, 2024
2 parents c501960 + 6f0046c commit 764ddd1
Showing 2 changed files with 3 additions and 3 deletions.
iml/chapters/generative-modeling.tex: 2 changes (1 addition, 1 deletion)
@@ -53,4 +53,4 @@ \subsection*{Generative vs. Discriminative}

\textbf{Generative models}:

-$p(x,y)$, can be more powerful (dectect outliers, missing values) if assumptions are met, are typically less robust against outliers
+$p(x,y)$, can be more powerful (detect outliers, missing values) if assumptions are met, are typically less robust against outliers
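
A one-line gloss on the outlier claim in this hunk: a generative model of $p(x,y)$ also yields the marginal $p(x) = \sum_y p(x,y)$, so inputs with unusually small $p(x)$ can be flagged as outliers, whereas a discriminative model only fits $p(y \mid x)$ and says nothing about how likely $x$ itself is.
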
iml/chapters/various.tex: 4 changes (2 additions, 2 deletions)
@@ -50,5 +50,5 @@ \section*{Various}
$M \in \mathbb{R}^{n\times n}$ PSD $\Leftrightarrow \forall x \in \mathbb{R}^n: x^\top Mx \geq 0 \\
\Leftrightarrow$ all principal minors of $M$ have non-negative determinant $\Leftrightarrow \lambda \geq 0 \ \forall \lambda\in\sigma(M)$

-\textbf{CLT} For $X_i$ iid with $m = \E[X_1]$ and $\Var[X_1] = \sigma^2$: $\mathbb{P}\left[\frac{\sum_{i=1}^n X_i - n m}{\sqrt{\sigma^2 n}} \leq a\right] \xrightarrow[n \to \infty]{} \Phi(a)$.
-\textbf{KL Divergence} $D_{KL}(P||Q) = \mathbb{E}_p[\log(\frac{p(x)}{q(x)})]$, 0 iff $P = Q$, always non-negative
+\textbf{CLT} For $X_i$ iid with $m = \E[X_1]$ and $\text{Var}(X_1) = \sigma^2$: $\mathbb{P}\left[\frac{\sum_{i=1}^n X_i - n m}{\sqrt{\sigma^2 n}} \leq a\right] \xrightarrow[n \to \infty]{} \Phi(a)$.
+\textbf{KL Divergence} $D_{KL}(P||Q) = \mathbb{E}_p[\log(\frac{p(x)}{q(x)})]$, 0 iff $P = Q$, always non-negative
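
For concreteness, a standard worked instance of the PSD characterization in this hunk: take $M = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$. All principal minors have non-negative determinant ($2$, $2$, and $\det M = 3$), the spectrum is $\sigma(M) = \{1, 3\} \subseteq [0, \infty)$, and directly $x^\top M x = x_1^2 + x_2^2 + (x_1 + x_2)^2 \geq 0$, so all three equivalent conditions agree.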
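
A familiar special case of the CLT statement (de Moivre-Laplace): for $X_i \sim \mathrm{Ber}(p)$ iid, $m = p$ and $\text{Var}(X_1) = p(1-p)$, so $\mathbb{P}\left[\frac{\sum_{i=1}^n X_i - np}{\sqrt{np(1-p)}} \leq a\right] \xrightarrow[n \to \infty]{} \Phi(a)$.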
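
And a worked check of the KL line via the standard closed form for univariate Gaussians: $D_{KL}(\mathcal{N}(\mu_1, \sigma_1^2) \,\|\, \mathcal{N}(\mu_2, \sigma_2^2)) = \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} - \frac{1}{2}$, which is non-negative and vanishes exactly when $\mu_1 = \mu_2$ and $\sigma_1 = \sigma_2$, i.e. when $P = Q$.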
