Infer model type from config #600

Merged: 4 commits into master on Oct 27, 2020

Conversation

@bogdankostic (Contributor) commented Oct 21, 2020:

Until now, model types have been inferred from the model name, which can cause problems such as deepset-ai/haystack#506. This PR makes it possible to infer the model type from the config files instead.
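For context, here is a minimal sketch of the approach, assuming the transformers library (illustrative only, not the PR's actual code; `NAME_HINTS` and `infer_model_type` are hypothetical names): every model's config.json carries a `model_type` field that `AutoConfig` exposes, and the old name-based heuristic can remain as a fallback.

```python
from transformers import AutoConfig

# Hypothetical fallback map mirroring the old name-based heuristic.
# Order matters: "bert" is a substring of "roberta" and "albert",
# so those must be checked first.
NAME_HINTS = {"roberta": "roberta", "albert": "albert", "bert": "bert"}

def infer_model_type(name_or_path: str) -> str:
    """Return the architecture family, e.g. 'bert' or 'roberta'."""
    try:
        # The config.json of every transformers model stores a `model_type` field.
        return AutoConfig.from_pretrained(name_or_path).model_type
    except (OSError, ValueError):
        # No config available: fall back to guessing from the model name.
        lowered = name_or_path.lower()
        for hint, model_type in NAME_HINTS.items():
            if hint in lowered:
                return model_type
        raise ValueError(f"Could not infer model type for '{name_or_path}'")

print(infer_model_type("deepset/roberta-base-squad2"))  # -> "roberta"
```

Reading `model_type` from the config rather than parsing the model name means arbitrarily named checkpoints (e.g. a RoBERTa model without "roberta" in its name) still resolve to the right class.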

@Timoeller (Contributor) left a comment:

Looking good already.

I made a suggestion, though it might not be possible to include dprReader for now (we are waiting for the transformers implementation).

farm/modeling/language_model.py (review thread resolved)
farm/modeling/tokenization.py (review thread resolved)
@Timoeller (Contributor) left a comment:

Looking good.

@Timoeller merged commit b7ecb37 into master on Oct 27, 2020.
tholor added a commit referencing this pull request on Oct 28, 2020, with the following commit message:

* DPRProcessor, BiAdaptiveModel modified for optional query/passage

* TextSimilarityProcessor, BiAdaptiveModel haystack inference compatibility added

* TextSimilarityProcessor query/passage features optional

* prediction head modified

* bugfix in BiAdaptiveModel init

* Fix removal of yes no answers (#540)

* fix removal of yes no answers

* Make use of the answer_type linked to each answer.

* pin seqeval version

* remove hardcoded answer types list

Co-authored-by: Fabio Tesser <fabio.tesser@gmail.com>
Co-authored-by: Malte Pietsch <malte.pietsch@deepset.ai>

* DPR test modified for max_seq_len_query/max_seq_len_context

* Infer model type from config (#600)

* Infer model and tokenizer type from config

* Infer type from model name as fallback

* BiAdaptiveModel output type modified from dict to tuple

* DPR tests reflect BiAdaptiveModel output type (tuple)

* renamed variables from 'context' to 'passage'

* DPR language model loading refactored

* DPR Language model comments fix

Co-authored-by: Branden Chan <33759007+brandenchan@users.noreply.github.com>
Co-authored-by: Fabio Tesser <fabio.tesser@gmail.com>
Co-authored-by: Malte Pietsch <malte.pietsch@deepset.ai>
Co-authored-by: bogdankostic <bogdankostic@web.de>