
HELP! An error occurs when running 'from elmoformanylangs import Embedder' #101

Open
BaiMeiyingxue opened this issue Jun 27, 2022 · 2 comments

@BaiMeiyingxue

When I installed it and ran ‘from elmoformanylangs import Embedder’ (allennlp 2.9.3, Python 3.8), an error occurred:

File E:\Anaconda3\envs\Elmol\lib\site-packages\elmoformanylangs-0.0.4.post2-py3.8.egg\elmoformanylangs\elmo.py:12, in
10 from .modules.embedding_layer import EmbeddingLayer
11 from .utils import dict2namedtuple
---> 12 from .frontend import create_one_batch
13 from .frontend import Model
14 import numpy as np

File E:\Anaconda3\envs\Elmol\lib\site-packages\elmoformanylangs-0.0.4.post2-py3.8.egg\elmoformanylangs\frontend.py:10, in
8 from .modules.elmo import ElmobiLm
9 from .modules.lstm import LstmbiLm
---> 10 from .modules.token_embedder import ConvTokenEmbedder, LstmTokenEmbedder
12 logger = logging.getLogger('elmoformanylangs')
14 def create_one_batch(x, word2id, char2id, config, oov='&lt;oov&gt;', pad='&lt;pad&gt;', sort=True):

File E:\Anaconda3\envs\Elmol\lib\site-packages\elmoformanylangs-0.0.4.post2-py3.8.egg\elmoformanylangs\modules\token_embedder.py:8, in
6 from torch.autograd import Variable
7 import copy
----> 8 from .highway import Highway
11 class LstmTokenEmbedder(nn.Module):
12 def __init__(self, config, word_emb_layer, char_emb_layer, use_cuda=False):

File E:\Anaconda3\envs\Elmol\lib\site-packages\elmoformanylangs-0.0.4.post2-py3.8.egg\elmoformanylangs\modules\highway.py:12, in
8 import torch
9 from overrides import overrides
---> 12 class Highway(torch.nn.Module):
13 """
14 A `Highway layer <https://arxiv.org/abs/1505.00387>`_ does a gated combination of a linear
15 transformation and a non-linear transformation of its input. :math:`y = g * x + (1 - g) *
(...)
28 The non-linearity to use in the highway layers.
29 """
30 def __init__(self,
31 input_dim: int,
32 num_layers: int = 1,
33 activation: Callable[[torch.Tensor], torch.Tensor] = torch.nn.functional.relu) -> None:

File E:\Anaconda3\envs\Elmol\lib\site-packages\elmoformanylangs-0.0.4.post2-py3.8.egg\elmoformanylangs\modules\highway.py:47, in Highway()
39 for layer in self._layers:
40 # We should bias the highway layer to just carry its input forward. We do that by
41 # setting the bias on B(x) to be positive, because that means g will be biased to
42 # be high, so we will carry the input forward. The bias on B(x) is the second half
43 # of the bias vector in each Linear layer.
44 layer.bias[input_dim:].data.fill_(1)
46 @overrides
---> 47 def forward(self, inputs: torch.Tensor) -> torch.Tensor: # pylint: disable=arguments-differ
48 current_input = inputs
49 for layer in self._layers:

File E:\Anaconda3\envs\Elmol\lib\site-packages\overrides-6.1.0-py3.8.egg\overrides\overrides.py:88, in overrides(method, check_signature, check_at_runtime)
58 """Decorator to indicate that the decorated method overrides a method in
59 superclass.
60 The decorator code is executed while loading class. Using this method
(...)
85 docstring from super class
86 """
87 if method is not None:
---> 88 return _overrides(method, check_signature, check_at_runtime)
89 else:
90 return functools.partial(
91 overrides,
92 check_signature=check_signature,
93 check_at_runtime=check_at_runtime,
94 )

File E:\Anaconda3\envs\Elmol\lib\site-packages\overrides-6.1.0-py3.8.egg\overrides\overrides.py:114, in _overrides(method, check_signature, check_at_runtime)
112 return wrapper # type: ignore
113 else:
--> 114 _validate_method(method, super_class, check_signature)
115 return method
116 raise TypeError(f"{method.__qualname__}: No super class method found")

File E:\Anaconda3\envs\Elmol\lib\site-packages\overrides-6.1.0-py3.8.egg\overrides\overrides.py:135, in _validate_method(method, super_class, check_signature)
129 method.__doc__ = super_method.__doc__
130 if (
131 check_signature
132 and not method.__name__.startswith("__")
133 and not isinstance(super_method, property)
134 ):
--> 135 ensure_signature_is_compatible(super_method, method, is_static)

File E:\Anaconda3\envs\Elmol\lib\site-packages\overrides-6.1.0-py3.8.egg\overrides\signature.py:93, in ensure_signature_is_compatible(super_callable, sub_callable, is_static)
90 same_main_module = _is_same_module(sub_callable, super_callable)
92 if super_type_hints is not None and sub_type_hints is not None:
---> 93 ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
94 ensure_all_kwargs_defined_in_sub(
95 super_sig, sub_sig, super_type_hints, sub_type_hints, is_static, method_name
96 )
97 ensure_all_positional_args_defined_in_sub(
98 super_sig,
99 sub_sig,
(...)
104 method_name,
105 )

File E:\Anaconda3\envs\Elmol\lib\site-packages\overrides-6.1.0-py3.8.egg\overrides\signature.py:287, in ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
285 sub_return = sub_type_hints.get("return", None)
286 if not _issubtype(sub_return, super_return) and super_return is not None:
--> 287 raise TypeError(
288 f"{method_name}: return type {sub_return} is not a {super_return}."
289 )

TypeError: Highway.forward: return type <class 'torch.Tensor'> is not a <class 'NoneType'>.

Can you help me? Thank you.
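For anyone who lands here: the traceback shows overrides 6.x enforcing that an overriding method's return annotation is a subtype of the superclass's annotation, and in this environment `torch.nn.Module.forward` resolves to a `None` return hint, so Highway's `-> torch.Tensor` annotation is rejected. A minimal stdlib-only sketch of that check (the `Base`/`Sub` classes and the `check_return_compatible` helper are illustrative, not the library's actual code):

```python
import typing

class Base:
    # Plays the role of torch.nn.Module.forward, whose return hint
    # resolves to NoneType in the environment from the traceback.
    def forward(self, inputs) -> None:
        pass

class Sub(Base):
    # Plays the role of Highway.forward, annotated with a concrete type.
    def forward(self, inputs) -> int:
        return 1

def check_return_compatible(super_method, sub_method):
    """Simplified version of the return-type check overrides 6.x performs."""
    super_ret = typing.get_type_hints(super_method).get("return")
    sub_ret = typing.get_type_hints(sub_method).get("return")
    if super_ret is not None and not (
        sub_ret is not None and issubclass(sub_ret, super_ret)
    ):
        raise TypeError(f"return type {sub_ret} is not a {super_ret}")

try:
    check_return_compatible(Base.forward, Sub.forward)
except TypeError as exc:
    print(exc)  # return type <class 'int'> is not a <class 'NoneType'>
```

Older overrides releases (such as 3.1.0) did not perform this runtime signature validation, which is why downgrading avoids the error.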

@disperaller

Indeed, I ran into the same issue when running 'from elmoformanylangs import Embedder'.
Could someone help with this?

@Scallions

I used this command to override the overrides. XD
pip install overrides==3.1.0
It worked for me.
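To confirm the pin took effect before re-trying the import, a quick stdlib check can help (the `installed_version` helper is my own sketch; `importlib.metadata` is in the standard library on Python 3.8+):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name):
    """Return the installed version string for a distribution, or None if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

# After running the pip command above, this should report 3.1.0.
print(installed_version("overrides"))
```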


3 participants