
Added PerDimWeightedAverage component for chain models in SWBD, works as good as jesus (tdnn_4f) model #461

Closed
vijayaditya wants to merge 1 commit
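
The component code itself is not quoted in this conversation, so as a rough illustration only, the sketch below shows one way a per-dimension weighted average over a temporal context window could be computed (in the spirit of the low-pass filtering discussed later in the thread). The function name, offsets, edge padding, and weight normalization are assumptions, not kaldi's actual PerDimWeightedAverage implementation.

```python
# Illustrative sketch (NOT kaldi's actual PerDimWeightedAverage C++ component):
# a learned, per-dimension weighted average over a temporal context window,
# i.e. a per-dimension low-pass filter over the feature stream.
import numpy as np

def per_dim_weighted_average(frames, weights, offsets):
    """frames: (T, D) features; weights: (D, K) per-dimension filter taps;
    offsets: K relative frame offsets. Returns the (T, D) filtered output."""
    T, _ = frames.shape
    out = np.zeros_like(frames)
    for k, off in enumerate(offsets):
        # Clamp out-of-range frame indexes to the edges (a padding assumption).
        idx = np.clip(np.arange(T) + off, 0, T - 1)
        out += frames[idx] * weights[:, k]   # broadcasts over the time axis
    return out

# Example: a 5-tap window with weights normalized per dimension, so each output
# frame is a convex combination of its temporal context.
T, D, offsets = 100, 40, [-2, -1, 0, 1, 2]
w = np.random.rand(D, len(offsets))
w /= w.sum(axis=1, keepdims=True)
y = per_dim_weighted_average(np.random.randn(T, D), w, offsets)
```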

Conversation


@vijayaditya (Contributor Author)

I have created a new config generator for TDNNs similar to those available for LSTM. This will not affect any existing TDNN/Jesus scripts.
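
As a rough illustration of the config-generation pattern being referred to (not the actual steps/nnet3/tdnn/make_configs.py), the sketch below turns per-layer splice indexes into nnet3 config lines; the function name and layer structure are assumptions.

```python
# Hypothetical sketch of a TDNN config generator: turn per-layer splice
# indexes into nnet3 config lines (affine + ReLU per layer).
def tdnn_config_lines(feat_dim, hidden_dim, splice_indexes):
    lines, cur_node, cur_dim = [], "input", feat_dim
    for i, offsets in enumerate(splice_indexes):
        spliced = ", ".join("Offset({0}, {1})".format(cur_node, o) for o in offsets)
        lines.append("component name=affine{0} type=NaturalGradientAffineComponent "
                     "input-dim={1} output-dim={2}".format(i, cur_dim * len(offsets), hidden_dim))
        lines.append("component-node name=affine{0} component=affine{0} "
                     "input=Append({1})".format(i, spliced))
        lines.append("component name=relu{0} type=RectifiedLinearComponent "
                     "dim={1}".format(i, hidden_dim))
        lines.append("component-node name=relu{0} component=relu{0} "
                     "input=affine{0}".format(i))
        cur_node, cur_dim = "relu{0}".format(i), hidden_dim
    return lines

# e.g. a splice spec like "-2,-1,0,1,2 -1,2 -3,3" parsed into lists of offsets:
print("\n".join(tdnn_config_lines(40, 512, [[-2, -1, 0, 1, 2], [-1, 2], [-3, 3]])))
```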

@danpovey (Contributor)

Why do you need a separate train_tdnn.sh script for this? Can't you use the same pattern as for the jesus-configs, to create the configs?

@danpovey (Contributor)

It's the duplication of code that I'm not crazy about, and the name of the script train_tdnn_b.sh. Why couldn't you just slightly extend the existing train_tdnn.sh script?
BTW, did the low-pass filtering help?
Dan


@vijayaditya (Contributor Author)

The only thing different in the new train_tdnn.sh script is the make_configs.py that it calls. This new steps/nnet3/tdnn/make_configs.py script uses components defined in steps/nnet3/components.py.
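
As a rough sketch of that factoring, a shared layer-building helper that both the TDNN and LSTM config generators could call might look like the following; the helper name and signature are assumptions for illustration, not the actual steps/nnet3/components.py API.

```python
# Rough sketch of a shared helper in the spirit of steps/nnet3/components.py
# (function name and signature are assumptions, not the real API).
def add_affine_relu_layer(config_lines, name, input_desc, input_dim, output_dim):
    """Append nnet3 config lines for one affine + ReLU layer and return the
    output node name and dimension, so callers can chain layers."""
    config_lines.append("component name={0}_affine type=NaturalGradientAffineComponent "
                        "input-dim={1} output-dim={2}".format(name, input_dim, output_dim))
    config_lines.append("component-node name={0}_affine component={0}_affine "
                        "input={1}".format(name, input_desc))
    config_lines.append("component name={0}_relu type=RectifiedLinearComponent "
                        "dim={1}".format(name, output_dim))
    config_lines.append("component-node name={0}_relu component={0}_relu "
                        "input={0}_affine".format(name))
    return "{0}_relu".format(name), output_dim

# Per-architecture make_configs.py scripts would then differ only in how they
# wire layers together, not in how each component's config lines are emitted.
```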

@vijayaditya (Contributor Author)

Low-pass filtering didn't help, but I would like to keep it: the model was just overtraining a lot, and I hope it will improve with l2 regularization. I will start this experiment tomorrow.

@danpovey (Contributor)

OK.
Better to just make changes to train_tdnn.sh than to duplicate the script.
Dan


@vijayaditya (Contributor Author)

OK, will do.

@vijayaditya (Contributor Author)

Just realized I am using a tracking branch to push this commit, so I will close this and recreate the pull request.
