
[Bug]: Container.__read_io not found if ContainerSubclass.__init__ raises error #913

Closed
3 tasks done
rly opened this issue Jul 24, 2023 · 4 comments · Fixed by #916
Assignees: rly
Labels: `category: bug` (errors in the code or code behavior), `priority: low` (alternative solution already working and/or relevant to only specific user(s))

Comments

@rly
Contributor

rly commented Jul 24, 2023

What happened?

Given a subclass of AbstractContainer called ContainerSubclass, if `ContainerSubclass.__init__` raises an error, then `AbstractContainer.__read_io` will never be initialized. During error handling, `__del__` is apparently called on the partially initialized ContainerSubclass object, which results in an AttributeError that `__read_io` does not exist. The exception is ignored, but we should avoid it by checking whether the attribute exists before deleting it.
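The failure mode and one possible guard can be sketched as follows. The class and attribute names mirror hdmf's, but this is an illustrative sketch, not the actual hdmf code, and the `hasattr` guard is one possible fix among several (a try/except around the `del` would also work):

```python
class AbstractContainer:
    """Illustrative stand-in for hdmf's AbstractContainer (not the real code)."""

    def __init__(self):
        # Stored under the name-mangled key _AbstractContainer__read_io
        self.__read_io = None

    def __del__(self):
        # If a subclass __init__ raises before super().__init__() runs,
        # __read_io was never set and a bare `del self.__read_io` would
        # raise AttributeError. Guarding with hasattr avoids that.
        if hasattr(self, "_AbstractContainer__read_io"):
            del self.__read_io


class ContainerSubclass(AbstractContainer):
    """Subclass whose argument validation can fail before the base init."""

    def __init__(self, continuity):
        if continuity == "wrong":
            # super().__init__() never runs, so __read_io is never set
            raise ValueError("forbidden value for 'continuity'")
        super().__init__()


try:
    ContainerSubclass("wrong")
except ValueError:
    pass  # with the guard, no "Exception ignored in __del__" is printed
```

With the guard in place, garbage-collecting the partially initialized object no longer triggers the ignored AttributeError.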

Steps to Reproduce

>>> import pynwb
>>> from pynwb import TimeSeries
>>> TimeSeries(
...     name="test_ts1",
...     data=[0, 1, 2, 3, 4, 5],
...     unit="grams",
...     timestamps=[0.0, 0.1, 0.2, 0.3, 0.4, 0.5],
...     continuity="wrong",
... )
Exception ignored in: <function AbstractContainer.__del__ at 0x117e95990>
Traceback (most recent call last):
  File "/Users/rly/Documents/NWB/hdmf/src/hdmf/container.py", line 197, in __del__
    del self.__read_io
AttributeError: _AbstractContainer__read_io
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/rly/Documents/NWB/hdmf/src/hdmf/utils.py", line 643, in func_call
    pargs = _check_args(args, kwargs)
  File "/Users/rly/Documents/NWB/hdmf/src/hdmf/utils.py", line 636, in _check_args
    raise ExceptionType(msg)
ValueError: TimeSeries.__init__: forbidden value for 'continuity' (got 'wrong', expected ['continuous', 'instantaneous', 'step'])
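The mangled name in the AttributeError (`_AbstractContainer__read_io` rather than `__read_io`) comes from Python's name mangling: inside a class body, an identifier with two leading underscores is rewritten with the defining class's name prefixed. A standalone illustration (the class name `Base` is made up for this sketch):

```python
class Base:
    def __init__(self):
        # Inside the class body, __secret is compiled to _Base__secret
        self.__secret = 1


b = Base()
print(vars(b))                      # {'_Base__secret': 1}
print(hasattr(b, "__secret"))       # False: the unmangled name does not exist
print(hasattr(b, "_Base__secret"))  # True: only the mangled name is stored
```

This is why the guard in the fix must test for the mangled attribute name (or equivalently use `del self.__read_io` inside the class, where the compiler applies the same mangling).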

Traceback

No response

Operating System

macOS

Python Executable

Conda

Python Version

3.11

Package Versions

No response

Code of Conduct

@rly rly added category: bug errors in the code or code behavior priority: low alternative solution already working and/or relevant to only specific user(s) labels Jul 24, 2023
@rly rly self-assigned this Jul 24, 2023
@rly
Contributor Author

rly commented Jul 24, 2023

This is blocking allensdk from using the latest hdmf and pynwb.

@mikejhuang

mikejhuang commented Jul 24, 2023

@rly Would you like to elaborate on this further?

The PR blocks allensdk from using the latest versions because they create breaking changes. Until those breaking changes are resolved, please use the versions indicated in the pyproject.toml file in that PR.

@rly
Contributor Author

rly commented Jul 25, 2023

@mikejhuang Apologies! I read your PR too hastily and thought it referenced an issue similar to the one described here, but it is different. (If it were the same, I would have prioritized resolving this issue more highly.) We would like to help allensdk use the latest versions of pynwb and hdmf while also catching any accidental breaking changes introduced by releases. Please let us know if something changed in hdmf 3.7.0 that creates issues.

@mikejhuang

@rly No problem! We would appreciate your help resolving the hdmf breakage issues. Here's an example of a test that fails with hdmf >= 3.5.0.
You can also run the allensdk tests to check for breakages by installing the test dependencies from test_requirements.txt in the root allensdk directory and then running pytest.
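The test-run instructions above might look like the following (a hypothetical sketch; it assumes a clone of allensdk with test_requirements.txt at its root, as described, and the test-file path is the one shown in the traceback below):

```shell
# From the root of an allensdk checkout (sketch of the steps described above)
pip install -r test_requirements.txt   # install the test dependencies

# Run the failing test module shown below, or plain `pytest` for the full suite
pytest allensdk/test/brain_observatory/ecephys/test_write_nwb.py
```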

_____________________________________________________________________________________________________________________________________________________ test_add_invalid_times ______________________________________________________________________________________________________________________________________________________

invalid_epochs = [{'end_time': 2005.0, 'id': 739448407, 'label': 'stimulus', 'start_time': 1998.0, ...}, {'end_time': 2121.0, 'id': 739...imulus', 'start_time': 2114.0, ...}, {'end_time': 211.0, 'id': 123448407, 'label': 'ProbeB', 'start_time': 114.0, ...}]
tmpdir_factory = TempdirFactory(_tmppath_factory=TempPathFactory(_given_basetemp=None, _trace=<pluggy._tracing.TagTracerSub object at 0x7fcca5ed2d30>, _basetemp=PosixPath('/tmp/pytest-of-mike/pytest-1464'), _retention_count=3, _retention_policy='all'))

    def test_add_invalid_times(invalid_epochs, tmpdir_factory):
        nwbfile_name = \
            str(tmpdir_factory.mktemp("test").join("test_invalid_times.nwb"))
    
        nwbfile = NWBFile(
            session_description="EcephysSession",
            identifier="{}".format(739448407),
            session_start_time=datetime.now()
        )
    
        nwbfile = write_nwb.add_invalid_times(nwbfile, invalid_epochs)
    
        with NWBHDF5IO(nwbfile_name, mode="w") as io:
            io.write(nwbfile)
        nwbfile_in = NWBHDF5IO(nwbfile_name, mode="r").read()
    
        df = nwbfile.invalid_times.to_dataframe()
>       df_in = nwbfile_in.invalid_times.to_dataframe()

allensdk/test/brain_observatory/ecephys/test_write_nwb.py:998: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/hdmf/utils.py:645: in func_call
    return func(args[0], **pargs)
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/hdmf/common/table.py:1112: in to_dataframe
    sel = self.__get_selection_as_dict(arg, df=True, **kwargs)
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/hdmf/common/table.py:963: in __get_selection_as_dict
    raise ve
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/hdmf/common/table.py:940: in __get_selection_as_dict
    ret['id'] = self.id[arg]
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/hdmf/container.py:590: in __getitem__
    return self.get(args)
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/hdmf/container.py:598: in get
    return self.data[args]
h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper
    ???
h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper
    ???
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/h5py/_hl/dataset.py:756: in __getitem__
    if self._fast_read_ok and (new_dtype is None):
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/h5py/_hl/base.py:536: in __get__
    value = obj.__dict__[self.func.__name__] = self.func(obj)
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/h5py/_hl/dataset.py:738: in _fast_read_ok
    self._extent_type == h5s.SIMPLE
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/h5py/_hl/base.py:536: in __get__
    value = obj.__dict__[self.func.__name__] = self.func(obj)
h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper
    ???
h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper
    ???
../../anaconda3/envs/allensdk38/lib/python3.8/site-packages/h5py/_hl/dataset.py:629: in _extent_type
    return self.id.get_space().get_simple_extent_type()
h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper
    ???
h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???
E   ValueError: Invalid dataset identifier (invalid dataset identifier)

h5py/h5d.pyx:350: ValueError
