
Layer-sequential unit-variance (LSUV) initialization for tf.keras

This repository provides sample code for LSUV initialization, implemented as a Python script using the Keras framework.

LSUV initialization, proposed by Dmytro Mishkin and Jiri Matas in the paper All you need is a good init, consists of two steps:

  • First, pre-initialize weights of each convolution or inner-product layer with orthonormal matrices.
  • Second, proceed from the first to the final layer, normalizing the variance of the output of each layer to be equal to one.

Original implementation can be found at ducha-aiki/LSUVinit.
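The two steps above can be sketched in plain NumPy for a simple dense ReLU network. This is a minimal illustration of the algorithm, not the repository's actual tf.keras code; the function names and the layer structure are assumptions made for the example.

```python
import numpy as np

def orthonormal(shape, rng):
    # Step 1: pre-initialize with an orthonormal matrix, obtained from
    # the QR decomposition of a Gaussian random matrix.
    a = rng.standard_normal(shape)
    q, r = np.linalg.qr(a)
    # Fix column signs so the distribution over orthogonal matrices is uniform.
    q *= np.sign(np.diag(r))
    return q

def lsuv_init(weights, x, tol=0.01, max_iter=10):
    # Step 2: proceed from the first to the final layer, rescaling each
    # weight matrix until its layer's output variance on the batch x is ~1.
    h = x
    for i, w in enumerate(weights):
        for _ in range(max_iter):
            v = (h @ w).var()
            if abs(v - 1.0) < tol:
                break
            w /= np.sqrt(v)  # dividing by the output std drives the variance to 1
        weights[i] = w
        h = np.maximum(h @ w, 0.0)  # ReLU before the next layer (assumed here)
    return weights

rng = np.random.default_rng(0)
x = rng.standard_normal((256, 64))          # a batch of unit-variance inputs
ws = [orthonormal((64, 64), rng) for _ in range(3)]
ws = lsuv_init(ws, x)
```

Because each rescaling divides the layer output by its standard deviation, one correction already brings the variance to 1; the loop exists only to re-check against the tolerance.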

Result Comparison

|               | Default Init | LSUV Init |
| ------------- | ------------ | --------- |
| Fashion-MNIST | 83.15 %      | 85.65 %   |

References

  • Mishkin, D., Matas, J. All you need is a good init. ICLR 2016.