
Model A is not supposed to continue from what Model B has trained #604

Answered by ammarsaf
ammarsaf asked this question in Q&A

Solved ✅ (edited)

Okay, I have solved this problem (kudos to my colleague who raised the state issue). In my code, I shared the same layers.Embedding object between both models, which was the culprit.

To solve this, I need to instantiate a new layers.Embedding object for every model I want to train. Instantiating a fresh layers.Embedding resets the weights of the layer, instead of reusing the state left over from the previous model's training.
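The mechanism behind this can be sketched without Keras at all: a layer object holds its weights as mutable state, so passing the same object to two "models" means the second one starts from the first one's trained weights. Below is a minimal toy stand-in (ToyEmbedding and train are hypothetical names, not the Keras API) illustrating the buggy shared-object pattern versus the fix of instantiating per model.

```python
import numpy as np

class ToyEmbedding:
    """Toy stand-in for layers.Embedding: a mutable weight matrix as state."""
    def __init__(self, vocab_size, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(vocab_size, dim))

def train(embedding):
    """Pretend training: mutates the weight matrix in place."""
    embedding.weights += 1.0

# Buggy pattern: one embedding object shared by "model A" and "model B".
shared = ToyEmbedding(10, 4)
fresh_init = shared.weights.copy()
train(shared)                    # "model A" trains
model_b_start = shared.weights   # "model B" now starts from A's trained weights
assert not np.allclose(model_b_start, fresh_init)

# Fix: instantiate a new embedding per model, so each starts from a reset state.
emb_a = ToyEmbedding(10, 4)
train(emb_a)
emb_b = ToyEmbedding(10, 4)      # fresh object, untouched by A's training
assert np.allclose(emb_b.weights, fresh_init)  # same seed, same fresh init
```

The same reasoning applies to a real layers.Embedding: Keras layers build and keep their variables on the object, so each model that should train from scratch needs its own instance.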

Note

@mrdbourke I think there is an issue in your NLP course: in the TensorFlow NLP learning series, you use a single layers.Embedding object for every model you train. Hence, the next model that will be train…

Answer selected by ammarsaf