
[RAG] Bumping up transformers version to 3.3.x #579

Merged
merged 5 commits into deepset-ai:master on Oct 19, 2020

Conversation

lalitpagaria
Contributor

Towards deepset-ai/haystack#443

@tholor Please review

@lalitpagaria
Contributor Author

It seems the model conversion test is failing, but I am not able to pinpoint the issue:

    def test_conversion_adaptive_model(caplog):
        if caplog:
            caplog.set_level(logging.CRITICAL)
    
        model = AdaptiveModel.convert_from_transformers("deepset/bert-base-cased-squad2", device="cpu", task_type="question_answering")
        transformer_model = model.convert_to_transformers()
        transformer_model2 = AutoModelForQuestionAnswering.from_pretrained("deepset/bert-base-cased-squad2")
        # compare weights
        for p1, p2 in zip(transformer_model.parameters(), transformer_model2.parameters()):
            if p1.shape != p2.shape:
                print("diff")
>           assert(p1.data.ne(p2.data).sum() == 0)
E           RuntimeError: The size of tensor a (768) must match the size of tensor b (2) at non-singleton dimension 0

These changes might give some clue: huggingface/transformers@v3.2.0...master (I don't suspect that the changes in v3.3.0 or v3.3.1 are causing this).
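As a debugging sketch (my suggestion, not code from this PR), one way to pinpoint the diverging parameter is to compare the two models' state dicts by name instead of zipping `.parameters()`, since `zip` silently misaligns the pairs as soon as one model carries layers the other lacks:

    # Debugging sketch (not from this PR): diff state_dicts by parameter name.
    # Zipping .parameters() misaligns the comparison as soon as one model has
    # extra layers (e.g. a pooler), which produces exactly this kind of
    # shape-mismatch RuntimeError.
    from farm.modeling.adaptive_model import AdaptiveModel
    from transformers import AutoModelForQuestionAnswering

    model = AdaptiveModel.convert_from_transformers(
        "deepset/bert-base-cased-squad2", device="cpu", task_type="question_answering"
    )
    sd1 = model.convert_to_transformers().state_dict()
    sd2 = AutoModelForQuestionAnswering.from_pretrained(
        "deepset/bert-base-cased-squad2"
    ).state_dict()

    print("only in converted model:", sorted(set(sd1) - set(sd2)))
    print("only in reference model:", sorted(set(sd2) - set(sd1)))
    for name in sorted(set(sd1) & set(sd2)):
        if sd1[name].shape != sd2[name].shape:
            print("shape mismatch:", name, tuple(sd1[name].shape), tuple(sd2[name].shape))
        elif sd1[name].ne(sd2[name]).any():
            print("value mismatch:", name)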

@tholor
Member

tholor commented Oct 14, 2020

@bogdankostic Can you please have a look at this failing model conversion test? As you are currently refactoring the conversion (#576), we should make sure that it also works for transformers==3.3.1. So we should probably merge #576 first, then rebase or merge here and fix the test if it is still failing.

@bogdankostic
Contributor

I think I found the problem with the failing model conversion test. It seems to be related to this PR in transformers: huggingface/transformers#7272
The consequence of that PR is that FARM models still contain pooling layers, while transformers models no longer do. That seems to be why the comparison of parameters fails.

I'm not completely sure about the purpose of the pooling layer, but their PR describes it as not needed and in fact not even used. How should we proceed?

@bogdankostic
Contributor

Pooling layers are now removed for QA, NER and LM before converting FARM models to transformers in order to conform to the new transformers version.
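For illustration, a minimal sketch of the idea (a hypothetical helper, not the actual FARM implementation): filter the pooler weights out of the converted state dict so it matches a transformers model that no longer instantiates a pooling layer:

    # Hypothetical sketch, not the actual FARM code: drop pooler weights so the
    # converted state dict matches a transformers model built without the
    # pooling layer (see huggingface/transformers#7272).
    def strip_pooler(state_dict):
        return {k: v for k, v in state_dict.items() if "pooler" not in k}

    # usage sketch:
    # transformers_qa_model.load_state_dict(strip_pooler(farm_state_dict))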

@tholor
Member

tholor commented Oct 14, 2020

Great, I will run our QA performance benchmarks tomorrow. If they pass, we can merge.

@Timoeller
Contributor

The QA accuracy benchmark shows some slight changes in performance. In particular, the Evaluator F1 drops by about 0.7 percentage points:

| Date | FARM commit | train EM | train F1 | train top5 | train elapsed | Evaluator EM | Evaluator F1 | Evaluator top1 | Evaluator elapsed | infer EM | infer F1 | infer elapsed |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 14.09.2020 | ce34cc2 | 0.780257 | 0.82286 | 0.974395 | 1123.208 | 0.777478 | 0.82156 | 0.84064 | 39 | 78.489 | 81.710402 | 26.472 |
| 16.10.2020 | 3ce5a97 | 0.78017 | 0.8234 | 0.97447 | 1120 | 0.770066 | 0.81432 | 0.83348 | 40 | 78.489 | 81.710402 | 26.48 |

@tholor
Member

tholor commented Oct 16, 2020

Hmm... not a big diff, but I still wonder where it comes from. What do you think, @Timoeller? Merge now and try to understand the small diff later if we find some time?

@Timoeller
Contributor

I am fine with merging. We will do a deep dive into checking QA predictions and other QA-related topics soon anyway.

How about merging this Monday morning, or do you depend on it now?

@tholor
Member

tholor commented Oct 16, 2020

Monday is fine

@lalitpagaria
Contributor Author

Can we please merge this and create a new FARM release? I'm not able to run transformers and Haystack together properly, even after many dirty hacks:

ERROR: farm 0.4.9 has requirement transformers==3.1.0, but you'll have transformers 3.3.1 which is incompatible.
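Until a new FARM release is published, one possible workaround (my suggestion, not from this thread) is to install FARM straight from master, so pip resolves against the bumped transformers pin:

    # install FARM from the master branch, which includes the transformers==3.3.1 bump
    pip install git+https://github.com/deepset-ai/FARM.git@master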

@tholor tholor merged commit 0844df5 into deepset-ai:master Oct 19, 2020