Bump pyiron-base from 0.9.12 to 0.10.0 #1531

Merged: 12 commits into main on Aug 24, 2024

Conversation

@dependabot[bot] (Contributor) commented on behalf of github on Aug 20, 2024

Bumps pyiron-base from 0.9.12 to 0.10.0.

Release notes

Sourced from pyiron-base's releases.

pyiron_base 0.10.0

What's Changed

New Contributors

Full Changelog: pyiron/pyiron_base@pyiron_base-0.9.12...pyiron_base-0.10.0

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [pyiron-base](https://github.com/pyiron/pyiron_base) from 0.9.12 to 0.10.0.
- [Release notes](https://github.com/pyiron/pyiron_base/releases)
- [Changelog](https://github.com/pyiron/pyiron_base/blob/main/CHANGELOG.md)
- [Commits](pyiron/pyiron_base@pyiron_base-0.9.12...pyiron_base-0.10.0)

---
updated-dependencies:
- dependency-name: pyiron-base
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot added labels on Aug 20, 2024: dependencies (Pull requests that update a dependency file), minor (add functionality in a backward compatible manner)
@coveralls commented Aug 20, 2024

Pull Request Test Coverage Report for Build 10533586232

Details

  • 60 of 68 (88.24%) changed or added relevant lines in 7 files are covered.
  • 1 unchanged line in 1 file lost coverage.
  • Overall coverage remained the same at 70.93%

Changes Missing Coverage                                   Covered Lines   Changed/Added Lines   %
pyiron_atomistics/lammps/base.py                                       9                    10   90.0%
pyiron_atomistics/lammps/lammps.py                                     0                     1   0.0%
pyiron_atomistics/atomistics/structure/periodic_table.py               8                    10   80.0%
pyiron_atomistics/atomistics/structure/atoms.py                       24                    28   85.71%

Files with Coverage Reduction                              New Missed Lines   %
pyiron_atomistics/atomistics/structure/atoms.py                           1   73.26%

Totals Coverage Status
  Change from base Build 10469606254: 0.0%
  Covered Lines: 10680
  Relevant Lines: 15057

💛 - Coveralls

@jan-janssen (Member)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/generic.py:1245, in GenericJob.from_hdf(self, hdf, group_name)
   1243     exe_dict["READ_ONLY"] = self._hdf5["executable/executable/READ_ONLY"]
   1244     job_dict["executable"] = {"executable": exe_dict}
-> 1245 self.from_dict(obj_dict=job_dict)

TypeError: LammpsBase.from_dict() got an unexpected keyword argument 'obj_dict'
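
The failure is a keyword-name mismatch: pyiron_base 0.10.0 now calls from_dict(obj_dict=...), while the override in pyiron_atomistics still declares the old parameter name. A minimal sketch of this failure mode, with hypothetical class names:

# Minimal sketch of the mismatch (hypothetical class names, not the actual
# pyiron classes): the base class passes the new keyword obj_dict, while the
# subclass override still declares the old parameter name.
class Base:
    def from_hdf(self):
        self.from_dict(obj_dict={"input": {}})  # new-style call in 0.10.0

class Sub(Base):
    def from_dict(self, job_dict):  # old-style signature
        pass

try:
    Sub().from_hdf()
except TypeError as err:
    print(err)  # Sub.from_dict() got an unexpected keyword argument 'obj_dict'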

@jan-janssen (Member)

======================================================================
ERROR: test_transform_trajectory (atomic.job.test_transform_trajectory.TestTransformTrajectory.test_transform_trajectory)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/runner/work/pyiron_atomistics/pyiron_atomistics/tests/atomic/job/test_transform_trajectory.py", line 37, in test_transform_trajectory
    self.project.unpack("get_structure_test")
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/generic.py", line 2004, in unpack
    import_archive.import_jobs(self, archive_directory=origin_path)
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/archiving/import_archive.py", line 54, in import_jobs
    df, common_path = transfer_files(
                      ^^^^^^^^^^^^^^^
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/archiving/import_archive.py", line 109, in transfer_files
    copytree(os.path.join(origin_path, common_path), project_path, dirs_exist_ok=True)
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/shutil.py", line 571, in copytree
    with os.scandir(src) as itr:
         ^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'get_structure_test/get_structure_test/lmp'

@jan-janssen (Member)

@samwaseda There seems to be an issue with the backwards compatibility of the unpack() function when reloading a previously packed calculation.

@jan-janssen (Member)

======================================================================
ERROR: test_output (calphy.test_base.TestCalphy.test_output)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/runner/work/pyiron_atomistics/pyiron_atomistics/tests/calphy/test_base.py", line 164, in test_output
    float(self.output_project["solid_job"].output.spring_constant),
          ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/generic.py", line 1807, in __getitem__
    return self._get_item_helper(item=item, convert_to_object=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/generic.py", line 1867, in _get_item_helper
    return self.load(item)
           ^^^^^^^^^^^^^^^
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/jobloader.py", line 105, in __call__
    return super().__call__(job_specifier, convert_to_object=convert_to_object)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/jobloader.py", line 76, in __call__
    return self._project.load_from_jobpath(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/project.py", line 333, in load_from_jobpath
    job = super(Project, self).load_from_jobpath(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/project/generic.py", line 1149, in load_from_jobpath
    job.set_input_to_read_only()
  File "/usr/share/miniconda3/envs/my-env/lib/python3.11/site-packages/pyiron_base/jobs/job/generic.py", line 1434, in set_input_to_read_only
    self.server.lock()
    ^^^^^^^^^^^^^^^^
AttributeError: 'dict' object has no attribute 'lock'

----------------------------------------------------------------------

@samwaseda (Member)

I guess the remaining errors are not related to unpack

@jan-janssen (Member)

@pmrv I guess the lock error is related to me switching from the DataContainer to a dataclass, but I am not exactly sure how to fix it.
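
A hedged sketch of that failure mode follows; the field names are hypothetical, not the actual pyiron server object. If the dataclass is serialized to a plain dict and reloaded without being converted back, method calls such as lock() fail exactly as in the traceback above.

# Hedged sketch (hypothetical field names, not the actual pyiron server
# object): serializing a dataclass with asdict() and restoring the raw dict
# without reconstructing the dataclass loses its methods.
from dataclasses import asdict, dataclass

@dataclass
class Server:
    run_mode: str = "modal"
    read_only: bool = False

    def lock(self):
        self.read_only = True

stored = asdict(Server())   # what ends up on disk: a plain dict
restored = stored           # reloaded without converting back to Server
try:
    restored.lock()
except AttributeError as err:
    print(err)              # 'dict' object has no attribute 'lock'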

@pmrv (Contributor) commented Aug 20, 2024

> @pmrv I guess the lock error is related to me switching from the DataContainer to a dataclass, but I am not exactly sure how to fix it.

Am looking at it.

@dependabot[bot] (Contributor, Author) commented on behalf of github on Aug 20, 2024

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

@jan-janssen (Member)

---------------------------------------------------------------------------
Exception encountered at "In [19]":
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[19], line 1
----> 1 ham_spx_bs = ham_spx_chg.restart_for_band_structure_calculations(job_name="Fe_spx_BS")

File ~/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/sphinx/base.py:715, in SphinxBase.restart_for_band_structure_calculations(self, job_name)
    704 def restart_for_band_structure_calculations(self, job_name=None):
    705     """
    706     Restart a new job created from an existing calculation
    707     by reading the charge density for band structures.
   (...)
    713         pyiron_atomistics.sphinx.sphinx.sphinx: new job instance
    714     """
--> 715     return self.restart_from_charge_density(
    716         job_name=job_name, band_structure_calc=True
    717     )

File ~/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/sphinx/base.py:735, in SphinxBase.restart_from_charge_density(self, job_name, job_type, band_structure_calc)
    719 def restart_from_charge_density(
    720     self, job_name=None, job_type="Sphinx", band_structure_calc=False
    721 ):
    722     """
    723     Restart a new job created from an existing calculation
    724     by reading the charge density.
   (...)
    733         pyiron_atomistics.sphinx.sphinx.sphinx: new job instance
    734     """
--> 735     ham_new = self.restart(
    736         job_name=job_name,
    737         job_type=job_type,
    738         from_wave_functions=False,
    739         from_charge_density=True,
    740     )
    741     if band_structure_calc:
    742         ham_new._generic_input["restart_for_band_structure"] = True

File ~/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/sphinx/base.py:800, in SphinxBase.restart(self, job_name, job_type, from_charge_density, from_wave_functions)
    798         if len(w) > 0:
    799             self.status.not_converged = True
--> 800 new_job = super(SphinxBase, self).restart(job_name=job_name, job_type=job_type)
    802 new_job.input = self.input.copy()
    804 recreate_guess = False

File ~/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/atomistics/job/atomistic.py:454, in AtomisticGenericJob.restart(self, job_name, job_type)
    443 def restart(self, job_name=None, job_type=None):
    444     """
    445     Restart a new job created from an existing calculation.
    446     Args:
   (...)
    452         new_ham: New job
    453     """
--> 454     new_ham = super(AtomisticGenericJob, self).restart(
    455         job_name=job_name, job_type=job_type
    456     )
    457     if isinstance(new_ham, GenericMaster) and not isinstance(self, GenericMaster):
    458         new_child = self.restart(job_name=None, job_type=None)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/generic.py:1343, in GenericJob.restart(self, job_name, job_type)
   1341     job_type = self.__name__
   1342 if job_type == self.__name__ and job_name not in self.project.list_nodes():
-> 1343     new_ham = self.copy_to(
   1344         new_job_name=job_name,
   1345         new_database_entry=False,
   1346         input_only=True,
   1347         copy_files=False,
   1348     )
   1349 else:
   1350     new_ham = self.create_job(job_type, job_name)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/generic.py:709, in GenericJob.copy_to(self, project, new_job_name, input_only, new_database_entry, delete_existing_job, copy_files)
    706     new_database_entry = False
    708 # Call the copy_to() function defined in the JobCore
--> 709 new_job_core, file_project, hdf5_project, reloaded = self._internal_copy_to(
    710     project=project,
    711     new_job_name=new_job_name,
    712     new_database_entry=new_database_entry,
    713     copy_files=copy_files,
    714     delete_existing_job=delete_existing_job,
    715 )
    717 # Remove output if it should not be copied
    718 if input_only:

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/generic.py:654, in GenericJob._internal_copy_to(self, project, new_job_name, new_database_entry, copy_files, delete_existing_job)
    649 delete_file_after_copy = _job_store_before_copy(job=self)
    651 # Call the copy_to() function defined in the JobCore
    652 new_job_core, file_project, hdf5_project, reloaded = super(
    653     GenericJob, self
--> 654 )._internal_copy_to(
    655     project=project,
    656     new_job_name=new_job_name,
    657     new_database_entry=new_database_entry,
    658     copy_files=copy_files,
    659     delete_existing_job=delete_existing_job,
    660 )
    661 if reloaded:
    662     return new_job_core, file_project, hdf5_project, reloaded

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/core.py:790, in JobCore._internal_copy_to(self, project, new_job_name, new_database_entry, copy_files, delete_existing_job)
    786     return job_return, file_project, hdf5_project, True
    788 # Create a new job by copying the current python object, move the content
    789 # of the HDF5 file and then attach the new HDF5 link to the new python object.
--> 790 new_job_core = self.copy()
    791 new_job_core._name = new_job_name
    792 new_job_core._hdf5 = hdf5_project

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/generic.py:490, in GenericJob.copy(self)
    487 copied_self.reset_job_id()
    489 # Reload object from HDF5 file
--> 490 _job_reload_after_copy(
    491     job=copied_self, delete_file_after_copy=delete_file_after_copy
    492 )
    494 # Copy executor - it cannot be copied and is just linked instead
    495 if self.server.executor is not None:

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/util.py:620, in _job_reload_after_copy(job, delete_file_after_copy)
    612 def _job_reload_after_copy(job, delete_file_after_copy):
    613     """
    614     Reload job from HDF5 file after copying
    615 
   (...)
    618         delete_file_after_copy (bool): delete HDF5 file after reload
    619     """
--> 620     job.from_hdf()
    621     if delete_file_after_copy:
    622         job.project_hdf5.remove_file()

File ~/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/sphinx/base.py:877, in SphinxBase.from_hdf(self, hdf, group_name)
    875         self.input[k] = gp[k]
    876 elif self._hdf5["HDF_VERSION"] == "0.1.0":
--> 877     super(SphinxBase, self).from_hdf(hdf=hdf, group_name=group_name)
    878     self._structure_from_hdf()
    879     with self._hdf5.open("input") as hdf:

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/interactive.py:364, in InteractiveBase.from_hdf(self, hdf, group_name)
    356 def from_hdf(self, hdf=None, group_name=None):
    357     """
    358     Restore the InteractiveBase object in the HDF5 File
    359 
   (...)
    362         group_name (str): HDF5 subgroup name - optional
    363     """
--> 364     super(InteractiveBase, self).from_hdf(hdf=hdf, group_name=group_name)
    365     with self.project_hdf5.open("input") as hdf5_input:
    366         if "interactive" in hdf5_input.list_nodes():

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/jobs/job/generic.py:1256, in GenericJob.from_hdf(self, hdf, group_name)
   1254     exe_dict["READ_ONLY"] = self._hdf5["executable/executable/READ_ONLY"]
   1255     job_dict["executable"] = {"executable": exe_dict}
-> 1256 self.from_dict(obj_dict=job_dict)

File ~/work/pyiron_atomistics/pyiron_atomistics/pyiron_atomistics/atomistics/job/atomistic.py:301, in AtomisticGenericJob.from_dict(self, obj_dict)
    300 def from_dict(self, obj_dict):
--> 301     super().from_dict(obj_dict=obj_dict)
    302     self._generic_input.from_dict(obj_dict=obj_dict["input"]["generic"])

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:125, in HasDict.from_dict(self, obj_dict, version)
    122         return {k: load(v) for k, v in inner_dict.items()}
    123     return create_from_dict(inner_dict)
--> 125 self._from_dict({k: load(v) for k, v in obj_dict.items()}, version)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:122, in HasDict.from_dict.<locals>.load(inner_dict)
    118     return inner_dict
    119 if not all(
    120     k in inner_dict for k in ("NAME", "TYPE", "OBJECT", "DICT_VERSION")
    121 ):
--> 122     return {k: load(v) for k, v in inner_dict.items()}
    123 return create_from_dict(inner_dict)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:123, in HasDict.from_dict.<locals>.load(inner_dict)
    119 if not all(
    120     k in inner_dict for k in ("NAME", "TYPE", "OBJECT", "DICT_VERSION")
    121 ):
    122     return {k: load(v) for k, v in inner_dict.items()}
--> 123 return create_from_dict(inner_dict)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:59, in create_from_dict(obj_dict)
     57 version = obj_dict.get("VERSION", None)
     58 obj = class_object.instantiate(obj_dict, version)
---> 59 obj.from_dict(obj_dict, version)
     60 return obj

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:125, in HasDict.from_dict(self, obj_dict, version)
    122         return {k: load(v) for k, v in inner_dict.items()}
    123     return create_from_dict(inner_dict)
--> 125 self._from_dict({k: load(v) for k, v in obj_dict.items()}, version)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:123, in HasDict.from_dict.<locals>.load(inner_dict)
    119 if not all(
    120     k in inner_dict for k in ("NAME", "TYPE", "OBJECT", "DICT_VERSION")
    121 ):
    122     return {k: load(v) for k, v in inner_dict.items()}
--> 123 return create_from_dict(inner_dict)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:59, in create_from_dict(obj_dict)
     57 version = obj_dict.get("VERSION", None)
     58 obj = class_object.instantiate(obj_dict, version)
---> 59 obj.from_dict(obj_dict, version)
     60 return obj

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:125, in HasDict.from_dict(self, obj_dict, version)
    122         return {k: load(v) for k, v in inner_dict.items()}
    123     return create_from_dict(inner_dict)
--> 125 self._from_dict({k: load(v) for k, v in obj_dict.items()}, version)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:123, in HasDict.from_dict.<locals>.load(inner_dict)
    119 if not all(
    120     k in inner_dict for k in ("NAME", "TYPE", "OBJECT", "DICT_VERSION")
    121 ):
    122     return {k: load(v) for k, v in inner_dict.items()}
--> 123 return create_from_dict(inner_dict)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:59, in create_from_dict(obj_dict)
     57 version = obj_dict.get("VERSION", None)
     58 obj = class_object.instantiate(obj_dict, version)
---> 59 obj.from_dict(obj_dict, version)
     60 return obj

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/interfaces/has_dict.py:125, in HasDict.from_dict(self, obj_dict, version)
    122         return {k: load(v) for k, v in inner_dict.items()}
    123     return create_from_dict(inner_dict)
--> 125 self._from_dict({k: load(v) for k, v in obj_dict.items()}, version)

File /usr/share/miniconda3/envs/my-env/lib/python3.12/site-packages/pyiron_base/storage/datacontainer.py:1035, in DataContainer._from_dict(self, obj_dict, version)
   1033 with self.unlocked():
   1034     self.clear()
-> 1035     self.update(obj_dict["data"], wrap=True)
   1036 self.read_only = obj_dict.get("READ_ONLY", False)

KeyError: 'data'
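
The last frame shows what the new DataContainer._from_dict expects: the container entries nested under a "data" key. A sketch of the suspected shape mismatch; the legacy layout below is an inference from the KeyError, not copied from the pyiron sources:

# Sketch of the two dict shapes involved. The legacy layout is an assumption
# inferred from the KeyError, not taken from the pyiron sources.
expected = {"data": {"symbols": ["Fe"]}, "READ_ONLY": False}  # what _from_dict reads
legacy = {"symbols": ["Fe"]}                                  # entries at the top level

try:
    legacy["data"]
except KeyError as err:
    print("KeyError:", err)  # KeyError: 'data', as in the traceback above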

@jan-janssen (Member)

@pmrv I patched the conda package under the assumption that pyiron_base=0.10.0 is broken at the moment anyway and nobody should install it. The general unit tests work fine, but there still seems to be an issue with copying a job that contains a data container, and the Windows tests also fail with the following error message:

======================================================================
ERROR: test_output (calphy.test_base.TestCalphy.test_output)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\a\pyiron_atomistics\pyiron_atomistics\tests\calphy\test_base.py", line 164, in test_output
    float(self.output_project["solid_job"].output.spring_constant),
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'ProjectHDFio' object has no attribute 'output'
----------------------------------------------------------------------

@jan-janssen marked this pull request as a draft on August 20, 2024, 19:40
@jan-janssen (Member)

@samwaseda The Windows error of the calphy job seems to be related to the unpack() command. Can you take a look at it again?

@samwaseda (Member)

> @samwaseda The Windows error of the calphy job seems to be related to the unpack() command. Can you take a look at it again?

Hmmm, I’m not really sure it’s related, because if it were, it shouldn’t be able to load the job in the first place.

@jan-janssen (Member) commented Aug 22, 2024

> Hmmm, I’m not really sure it’s related, because if it were, it shouldn’t be able to load the job in the first place.

Looking at the code briefly, I feel the issue might be related to the transfer_files() function:

# Excerpt from pyiron_base/project/archiving/import_archive.py; the imports
# are added here for readability (get_dataframe is defined in the same module).
import os
from shutil import copytree

def transfer_files(origin_path: str, project_path: str):
    """
    Transfer files from the origin path to the project path.

    Args:
        origin_path (str): Path to the origin directory.
        project_path (str): Path to the project directory.

    Returns:
        pandas.DataFrame: Job table.
        str: Common path.
    """
    df = get_dataframe(origin_path=origin_path)
    common_path = os.path.commonpath(list(df["project"]))
    copytree(os.path.join(origin_path, common_path), project_path, dirs_exist_ok=True)
    return df, common_path

Shouldn't this be posixpath.join() rather than os.path.join()?
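
A hedged illustration of the suspected separator mismatch (the input paths are made up, not taken from the actual archive): on Windows, os.path is ntpath, and ntpath.commonpath returns a backslash-separated prefix even when the project paths stored in the archive use forward slashes, so later comparisons against the POSIX-style paths no longer line up.

# Hedged sketch of the suspected Windows behaviour (made-up inputs, not the
# actual archive contents). os.path resolves to ntpath on Windows.
import ntpath

projects = [
    "get_structure_test/get_structure_test/lmp",
    "get_structure_test/get_structure_test/other",
]
common = ntpath.commonpath(projects)
print(common)                          # get_structure_test\get_structure_test
print(projects[0].startswith(common))  # False: the separators differ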

@samwaseda (Member) commented Aug 22, 2024

> Looking at the code briefly, I feel the issue might be related to the transfer_files() function:

It still doesn’t sound very plausible, because the error comes from the fact that the class simply doesn’t have an output attribute. Is it possible that the inspect mode is invoked for some reason?

@samwaseda (Member)

I just checked pr.inspect, but that error would also be a bit different.

@samwaseda (Member) commented Aug 22, 2024

I’m actually not sure right now how output is defined. Is the attribute dynamically generated if there’s an output folder in the HDF5 file?

@pmrv (Contributor) commented Aug 22, 2024

> I’m actually not sure right now how output is defined. Is the attribute dynamically generated if there’s an output folder in the HDF5 file?

I'm assuming the issue is with the database entries after importing the project. If you follow the logic in here, called from Project.__getitem__: when the project cannot find the item among the jobs (as queried from the database), but an {item}.h5 file is present in the project folder, it returns a bare ProjectHDFio instance, which is also what the error message indicates. So the h5 files of the imported jobs are in the correct location, but the project path entries in the database are not.
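
A simplified paraphrase of that lookup as code; this is not the actual pyiron_base implementation, and the helper names are hypothetical stand-ins for the real lookup methods.

# Simplified paraphrase of the fallback described above (hypothetical helper
# names, not the real pyiron_base implementation).
import os

def get_item(project, item):
    if item in project.list_nodes():       # job known to the database
        return project.load(item)          # full job object, has .output
    if os.path.exists(os.path.join(project.path, item + ".h5")):
        # Orphaned HDF5 file without a database entry: a bare ProjectHDFio
        # handle comes back here, which has no .output attribute.
        return project.create_hdf(project.path, item)  # hypothetical helper
    raise KeyError(item)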

@samwaseda (Member)

Ok I’ll look into it then

@jan-janssen (Member)

@pmrv and @samwaseda I was able to fix the Windows tests with the fix in pyiron/pyiron_base#1616. So now it is primarily the Jupyter notebook tests, and within those the copying of jobs, that fail.

@samwaseda (Member)

> @pmrv and @samwaseda I was able to fix the Windows tests with the fix in pyiron/pyiron_base#1616. So now it is primarily the Jupyter notebook tests, and within those the copying of jobs, that fail.

Nice! I guess the remaining problems are not related to pack/unpack, right?

@jan-janssen reopened this on Aug 22, 2024
@jan-janssen (Member)

> Nice! I guess the remaining problems are not related to pack/unpack, right?

Yes, I patched the pyiron_base version on conda-forge and the Windows tests pass.

@jan-janssen reopened this on Aug 23, 2024
@jan-janssen marked this pull request as ready for review on August 23, 2024, 23:16
@jan-janssen merged commit e906a41 into main on Aug 24, 2024
77 of 81 checks passed
@jan-janssen deleted the dependabot/pip/pyiron-base-0.10.0 branch on August 24, 2024, 06:14
@pmrv restored the dependabot/pip/pyiron-base-0.10.0 branch on September 1, 2024, 06:57
@pmrv deleted the dependabot/pip/pyiron-base-0.10.0 branch on September 1, 2024, 09:07
Labels
dependencies (Pull requests that update a dependency file), minor (add functionality in a backward compatible manner)