From 43db8aca3f26b7c090a143cc89355c223e1acc1f Mon Sep 17 00:00:00 2001
From: Thomas Gassmann
Date: Fri, 9 Aug 2024 16:21:26 +0200
Subject: [PATCH] Fix dropout probability

Signed-off-by: Thomas Gassmann
---
 iml/chapters/neural-networks.tex | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/iml/chapters/neural-networks.tex b/iml/chapters/neural-networks.tex
index 240d074..dc43353 100644
--- a/iml/chapters/neural-networks.tex
+++ b/iml/chapters/neural-networks.tex
@@ -21,7 +21,7 @@ \subsection*{Backpropagation}
 Only compute \color{Red} \textbf{the gradient}\color{Black}. Rand. init. weights by distr. assumption for $\varphi$. ( $2 / n_{in}$ for ReLu and $1/n_{in}$ or $ 1/ (n_{in} + n_{out})$ for Tanh)
 
 \subsection*{Overfitting}
-\textbf{Regularization}; \textbf{Early Stopping}; \textbf{Dropout}: ignore hidden units with prob. $p$, after training use all units and scale weights by $p$; \textbf{Batch Normalization}: normalize the input data (mean 0, variance 1) in each layer
+\textbf{Regularization}; \textbf{Early Stopping}; \textbf{Dropout}: keep hidden units with prob. $p$, after training use all units and scale weights by $p$; \textbf{Batch Normalization}: normalize the input data (mean 0, variance 1) in each layer
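 
 \subsection*{CNN \quad \color{Black}$\varphi(W * v^{(l)})$}

Note (not part of the patch): the fix makes the two halves of the sentence use the same probability. If units were dropped with prob. $p$ (as the old wording said), test-time weights would have to be scaled by $1-p$; with "keep with prob. $p$", scaling by $p$ is correct, since the expected training-time activation of a unit is $p$ times its full value. A minimal NumPy sketch of that convention, with illustrative function names and values only:

    # Sketch of the dropout convention the corrected sentence describes:
    # keep each hidden unit with probability p during training; at test
    # time use all units and scale by p to match the training expectation.
    import numpy as np

    rng = np.random.default_rng(0)

    def dropout_train(h, p):
        # Bernoulli mask: 1 with prob. p (keep), 0 with prob. 1-p (drop)
        mask = rng.random(h.shape) < p
        return h * mask

    def dropout_test(h, p):
        # All units active, scaled by the keep probability p
        return h * p

    h = rng.standard_normal(5)
    p = 0.8
    print(dropout_train(h, p))  # some entries zeroed out
    print(dropout_test(h, p))   # every entry scaled by 0.8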