
Document pretrained word embeddings #2310

Closed
lambdaofgod opened this issue Dec 25, 2018 · 2 comments

Comments

@lambdaofgod

Description

The documentation doesn't seem to contain any information on the pretrained models. The only way I can find to list them is

gensim.downloader.models.keys()

Is there a reason not to include this information in the documentation?

If there is no such reason, then wouldn't it be beneficial to document these models?

I'd be happy to help with adding this information. But where should we put it? I think downloader documentation would be the correct place.

@menshikh-iv
Contributor

@lambdaofgod

Is there a reason to not contain this information in documentation?

Yes, because this is really dynamic (an independent repo) + all the info is stored in the gensim-data repo (no reason to duplicate it). See https://radimrehurek.com/gensim/downloader.html for usage.

If there is no such reason, then wouldn't it be beneficial to document these models?

All additional information about the models / datasets is available through .info() or in the https://github.com/RaRe-Technologies/gensim-data repository.

@horpto
Contributor

horpto commented Jan 10, 2019

@menshikh-iv Shouldn't the gensim.downloader description have a link to the gensim-data repo?

Also, the current https://radimrehurek.com/gensim/downloader.html page has broken layout in the text near the footer.
