From ba52fe69d5022dec5ab9a3df855918edb27cc213 Mon Sep 17 00:00:00 2001
From: thomwolf
Date: Tue, 23 Jul 2019 15:10:02 +0200
Subject: [PATCH] update breaking change section regarding from_pretrained
 keyword arguments

---
 README.md | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 81bc1ab6bddd8f..aae27cc8eeb644 100644
--- a/README.md
+++ b/README.md
@@ -310,8 +310,11 @@ loss, logits, attentions = outputs
 
 ### Serialization
 
-Breaking change: Models are now set in evaluation mode by default when instantiated with the `from_pretrained()` method.
-To train them don't forget to set them back in training mode (`model.train()`) to activate the dropout modules.
+Breaking changes in the `from_pretrained()` method:
+
+1. Models are now set in evaluation mode by default when instantiated with the `from_pretrained()` method. To train them, don't forget to set them back in training mode (`model.train()`) to activate the dropout modules.
+
+2. The additional `*input` and `**kwargs` arguments supplied to the `from_pretrained()` method used to be passed directly to the underlying model class's `__init__()` method. They are now used to update the model configuration attributes instead, which can break derived model classes built on the previous `BertForSequenceClassification` examples. We are working on a way to mitigate this breaking change in [#866](https://github.com/huggingface/pytorch-transformers/pull/866) by forwarding to the model's `__init__()` method (i) the provided positional arguments and (ii) the keyword arguments which do not match any configuration class attribute.
 
 Also, while not a breaking change, the serialization methods have been standardized and you probably should switch to the new method `save_pretrained(save_directory)` if you were using any other serialization method before.
 
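For reference, a minimal sketch of the two behaviors the patch describes, assuming `pytorch-transformers` is installed; `num_labels` is used here only as an illustrative configuration attribute:

```python
from pytorch_transformers import BertForSequenceClassification

# Extra keyword arguments such as `num_labels` are now used to update the
# model configuration instead of being forwarded to the model's __init__().
model = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)

# Breaking change (1): the model is returned in evaluation mode by default.
assert not model.training

# Switch back to training mode before fine-tuning to re-activate dropout.
model.train()
assert model.training
```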