Batching for recurrent SOMs #2

Open
stephantul opened this issue Apr 10, 2017 · 0 comments
@stephantul
Owner

All recurrent SOMs would benefit from some kind of batching scheme. I currently have batched code, but it doesn't work. My hunch is that batching the recurrent/recursive/merging SOM is non-trivial because there is no way to make sure that the items processed concurrently within a batch converge on the same BMU.
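To make the sequential dependency concrete, here is a minimal sketch (hypothetical names and shapes, not the repo's actual API) of a recurrent SOM's BMU search: the context at step t is derived from the BMU at step t-1, so the steps of a sequence cannot be processed independently.

```python
import numpy as np

def recurrent_bmu(x, context, weights, context_weights, alpha=0.5, beta=0.5):
    """Find the BMU for one input, given the context from the previous step."""
    dist = (alpha * ((weights - x) ** 2).sum(1)
            + beta * ((context_weights - context) ** 2).sum(1))
    return int(dist.argmin())

rng = np.random.default_rng(0)
n_units, dim = 16, 3
weights = rng.normal(size=(n_units, dim))
context_weights = rng.normal(size=(n_units, dim))

sequence = rng.normal(size=(5, dim))
context = np.zeros(dim)
bmus = []
for x in sequence:           # strictly sequential: step t needs the BMU of step t-1
    bmu = recurrent_bmu(x, context, weights, context_weights)
    context = weights[bmu]   # the next step's context is this step's BMU weight
    bmus.append(bmu)
print(bmus)
```

Naively batching this loop means every sequence in the batch evolves its own context, and nothing forces those contexts (or the resulting BMUs) to agree.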

Because we take the mean over the batch, updates generally become noisier as the batch size increases.
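A sketch of that averaged update (hypothetical names, neighborhood function omitted for brevity): each item pulls the map toward its own BMU, and the per-item updates are averaged. When the BMUs disagree, the mean update is a compromise that matches none of the individual items.

```python
import numpy as np

def batch_update(weights, batch, lr=0.1):
    """Apply one mean-over-batch update to a plain (non-recurrent) SOM."""
    updates = np.zeros_like(weights)
    for x in batch:
        bmu = ((weights - x) ** 2).sum(1).argmin()  # each item's own BMU
        updates[bmu] += lr * (x - weights[bmu])     # pull that unit toward x
    return weights + updates / len(batch)           # mean over the batch
```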

Two starting ideas:

  1. A possible solution could be to initialize the SOM's context weights and weights to values that allow for convergence (e.g. weights skewed in such a way that the concurrently processed batches all move in the same direction), but this entails setting the parameters in advance, which defeats much of the purpose of the SOM.

  2. A second solution would be to start with a small batch size, let the SOM converge to something stable first, and then slowly increase the batch size.
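Idea 2 could be as simple as a schedule that grows the batch size per epoch (the start value, growth factor, and cap below are arbitrary placeholders):

```python
def batch_size_schedule(epoch, start=1, growth=2, max_size=256):
    """Batch size for a given epoch: geometric growth up to a cap."""
    return min(start * growth ** epoch, max_size)

# epochs 0..5 -> batch sizes 1, 2, 4, 8, 16, 32
```

Early epochs then behave like plain online training, so the map can settle before the averaged updates start to blur individual BMU choices.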
