[docstring] Fix docstring for RwkvConfig (huggingface#26782)
* update check_docstrings

* update docstring
Bojun-Feng authored and blbadger committed Nov 8, 2023
1 parent cfb8312 commit c7380cf
Showing 2 changed files with 2 additions and 3 deletions.
4 changes: 2 additions & 2 deletions src/transformers/models/rwkv/configuration_rwkv.py
@@ -61,15 +61,15 @@ class RwkvConfig(PretrainedConfig):
Dimensionality of the attention hidden states. Will default to `hidden_size` if unset.
intermediate_size (`int`, *optional*):
Dimensionality of the inner feed-forward layers. Will default to 4 times `hidden_size` if unset.
-    layer_norm_eps (`float`, *optional*, defaults to 1e-5):
+    layer_norm_epsilon (`float`, *optional*, defaults to 1e-05):
The epsilon to use in the layer normalization layers.
bos_token_id (`int`, *optional*, defaults to 0):
The id of the beginning of sentence token in the vocabulary. Defaults to 0 as RWKV uses the same tokenizer
as GPTNeoX.
eos_token_id (`int`, *optional*, defaults to 0):
The id of the end of sentence token in the vocabulary. Defaults to 0 as RWKV uses the same tokenizer as
GPTNeoX.
-    rescale_every (`int`, *optional*, default to 6):
+    rescale_every (`int`, *optional*, defaults to 6):
At inference, the hidden states (and weights of the corresponding output layers) are divided by 2 every
`rescale_every` layer. If set to 0 or a negative number, no rescale is done.
tie_word_embeddings (`bool`, *optional*, defaults to `False`):
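For context, here is the renamed argument in actual use — a minimal sketch assuming a recent `transformers` release where `RwkvConfig` is exported; the values shown are the documented defaults, not tuned settings:

```python
from transformers import RwkvConfig

# Both arguments are passed at their documented defaults; the docstring
# previously called the first one `layer_norm_eps`, which does not match
# the real keyword argument name.
config = RwkvConfig(
    layer_norm_epsilon=1e-05,  # epsilon used by the layer normalization layers
    rescale_every=6,           # hidden states halved every 6 layers at inference
)
print(config.layer_norm_epsilon)  # 1e-05
```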
1 change: 0 additions & 1 deletion utils/check_docstrings.py
@@ -471,7 +471,6 @@
"RobertaPreLayerNormConfig",
"RobertaPreLayerNormModel",
"RobertaTokenizerFast",
"RwkvConfig",
"SEWConfig",
"SEWDConfig",
"SEWDForCTC",
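Dropping `"RwkvConfig"` from this ignore list opts the class back into automated docstring validation, which is what made the docstring fix above necessary. A hedged sketch of re-running the checker — assuming invocation from the repository root; exact flags may vary across `transformers` versions:

```python
import subprocess

# With RwkvConfig no longer ignored, the docstring checker validates its
# argument names and defaults; after the fix above it should exit cleanly.
subprocess.run(["python", "utils/check_docstrings.py"], check=True)
```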
