
'Keras-like' model creation #15

Closed

wagenaartje opened this issue May 12, 2017 · 1 comment

wagenaartje commented May 12, 2017

Recently, I brought up an idea in this comment: #12 (comment). The way custom networks are created currently still requires some knowledge of network architectures, which may be too complicated for some users. I want to revamp model creation, making it layer-based like Synaptic, but with extra flexibility.

My current idea of network creation:

```js
// Create a network
var model = new Model();
model.add(Dense(10)); // 10 inputs
model.add(LSTM(5, { activation: Activation.RELU }));
model.add(NARX(10));
model.add(Dense(3, { activation: Activation.TANH })); // 3 outputs

// To be discussed: adding custom (recurrent) connections
model.layers[0].connect(model.layers[2]);
model.layers[3].connect(model.layers[1]);

var network = model.compile();
network.train(...
```
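As a rough illustration of how the proposed API could hang together, here is a minimal sketch of a `Model` skeleton. All of it is hypothetical, not the library's actual code: `Dense` is shown as a plain factory that records size and options, and `compile()` only returns a placeholder summary instead of a real flattened network.

```javascript
// Hypothetical sketch of the proposed API -- not actual library code.
// A layer factory just records its size and options.
function Dense (size, options) {
  return { type: 'Dense', size: size, options: options || {} };
}

function Model () {
  this.layers = [];
}

// add() appends a layer description to the model
Model.prototype.add = function (layer) {
  this.layers.push(layer);
  return this; // allow chaining
};

// compile() would flatten the layers into a single node array;
// here it only returns a placeholder summary of that array.
Model.prototype.compile = function () {
  var last = this.layers[this.layers.length - 1];
  return {
    inputs: this.layers[0].size,
    outputs: last.size,
    nodes: this.layers.reduce(function (sum, l) { return sum + l.size; }, 0)
  };
};

var model = new Model();
model.add(Dense(10));
model.add(Dense(3));
var network = model.compile(); // { inputs: 10, outputs: 3, nodes: 13 }
```

The point of the sketch is only the shape of the workflow: layers are cheap descriptions until `compile()` turns them into the node-level representation that training and genetic algorithms operate on.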

Some basic points:

  • A model should always start and end with a Dense() layer.
  • Optional arguments can be specified in a dictionary, e.g. activation, bias, weight initialisation, and the connection type to the next layer (e.g. ALL_TO_ALL, ONE_TO_ONE)
  • After setting up the layers, the network must be compiled. This turns the network into an array of nodes, making it more amenable to genetic algorithms and optimisation
  • The training function will remain the same as now
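To make the connection-type option concrete, here is an illustrative sketch of how ALL_TO_ALL and ONE_TO_ONE between two layers could be enumerated as pairs of node indices. The function name and string constants are assumptions for the example, not the library's API:

```javascript
// Illustrative only: enumerate connections between two layers of
// sizes `from` and `to` as [sourceIndex, targetIndex] pairs.
function connections (from, to, type) {
  var pairs = [];
  if (type === 'ALL_TO_ALL') {
    // every node in the first layer connects to every node in the second
    for (var i = 0; i < from; i++) {
      for (var j = 0; j < to; j++) pairs.push([i, j]);
    }
  } else if (type === 'ONE_TO_ONE') {
    // node k connects only to node k; requires equal layer sizes
    if (from !== to) throw new Error('ONE_TO_ONE needs equal layer sizes');
    for (var k = 0; k < from; k++) pairs.push([k, k]);
  }
  return pairs;
}

connections(2, 3, 'ALL_TO_ALL'); // 6 pairs (2 × 3)
connections(3, 3, 'ONE_TO_ONE'); // 3 pairs
```

ALL_TO_ALL grows as the product of the layer sizes, while ONE_TO_ONE stays linear, which is one reason the connection type is worth exposing as an option.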

Things to discuss:

  • What is a good way to allow custom connections? See the code above for my first idea on how to do this

Layer types I want to embed:

  • LSTM
  • NARX
  • Dense
  • Random
  • GRU
  • Clock
  • Softmax
  • Convolution
  • Pooling
@wagenaartje wagenaartje mentioned this issue May 12, 2017
wagenaartje (Owner) commented:
I've run into a problem with two LSTM layers stacked directly after each other. Feel free to help: https://stats.stackexchange.com/questions/279529/which-gate-gates-an-inter-lstm-connection
