
Commit

Fix dropout probability
Signed-off-by: Thomas Gassmann <tgassmann@student.ethz.ch>
thomasgassmann committed Aug 9, 2024
1 parent 4e3360f commit 43db8ac
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion iml/chapters/neural-networks.tex
@@ -21,7 +21,7 @@ \subsection*{Backpropagation}
Only compute \color{Red} \textbf{the gradient}\color{Black}. Rand. init. weights by distr. assumption for $\varphi$. ( $2 / n_{in}$ for ReLu and $1/n_{in}$ or $ 1/ (n_{in} + n_{out})$ for Tanh)

\subsection*{Overfitting}
- \textbf{Regularization}; \textbf{Early Stopping}; \textbf{Dropout}: ignore hidden units with prob. $p$, after training use all units and scale weights by $p$; \textbf{Batch Normalization}: normalize the input data (mean 0, variance 1) in each layer
+ \textbf{Regularization}; \textbf{Early Stopping}; \textbf{Dropout}: keep hidden units with prob. $p$, after training use all units and scale weights by $p$; \textbf{Batch Normalization}: normalize the input data (mean 0, variance 1) in each layer

\subsection*{CNN \quad \color{Black}$\varphi(W * v^{(l)})$}

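As a reference for the corrected description (not part of the commit itself), here is a minimal NumPy sketch of the dropout convention the new line states: each hidden unit is kept with probability $p$ during training, and at test time all units are used with the weights (equivalently, the activations) scaled by $p$. Function names and shapes are illustrative assumptions, not code from the repository.

    import numpy as np

    def dropout_train(h, p, rng=np.random.default_rng(0)):
        # Keep each hidden unit with probability p, zero out the rest.
        mask = rng.random(h.shape) < p
        return h * mask

    def dropout_test(h, p):
        # Use all units, but scale by p so the expected activation
        # matches the training-time expectation E[mask * h] = p * h.
        return h * p

Scaling by $p$ at test time works because a unit kept with probability $p$ contributes $p \cdot h$ in expectation during training, so downstream layers see activations of consistent magnitude.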

