Add primitive for Sequence classification with LSTM #150

Closed

Hector-hedb12 opened this issue Apr 4, 2019 · 0 comments · Fixed by #153
@Hector-hedb12
Contributor

Related to #121

The architecture would be:

Embedding --> LSTM --> Dropout --> Dense (sigmoid)

You can find an example of this in the Keras documentation, in the "Sequence classification with LSTM" section:

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.layers import Embedding
from keras.layers import LSTM

max_features = 1024  # vocabulary size (number of distinct token indices)

model = Sequential()
# Map each token index to a dense 256-dimensional vector
model.add(Embedding(max_features, output_dim=256))
# Encode the whole sequence into a single 128-dimensional vector
model.add(LSTM(128))
# Dropout for regularization
model.add(Dropout(0.5))
# Single sigmoid unit for binary classification
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

# x_train / x_test are padded integer sequences, y_train / y_test binary labels
model.fit(x_train, y_train, batch_size=16, epochs=10)
score = model.evaluate(x_test, y_test, batch_size=16)
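
As a rough sketch of how inputs for such a primitive could be prepared (not part of the original snippet; the variable names here are illustrative only), the raw token sequences can be padded to a fixed length with keras.preprocessing.sequence.pad_sequences before being fed to the model above:

import numpy as np
from keras.preprocessing.sequence import pad_sequences

# Hypothetical example data: each sample is a list of token indices < max_features
raw_sequences = [[12, 7, 503], [88, 1, 999, 42, 3]]
labels = np.array([0, 1])

# Pad/truncate every sequence to the same length so Keras can batch them
x = pad_sequences(raw_sequences, maxlen=100)

# x and labels can now be passed to model.fit(x, labels, ...) as in the snippet above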
@csala added the labels: approved (The issue is approved and someone can start working on it), new primitives (A new primitive is being requested) on Apr 4, 2019
@csala modified the milestones: 0.1.8, 0.1.9 on Apr 22, 2019