the code currently writes some types of information (event ID, Euclidean & planar coordinates) as 2D HDF5 groups, which causes issues when loading into a dataframe. we should definitely fix the event ID to be separate columns (`run`, `subrun`, `event`) to bring this format into line with the MultiIndex approach utilised by PandAna, but it would be good to find solutions for coordinate systems too.
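as a sketch of the event ID fix, the 2D array could be split into named columns and promoted to a MultiIndex. everything here is illustrative (the function name and the assumption that the array arrives as an `(N, 3)` block of `[run, subrun, event]` triples read from the HDF5 group), not the actual numl API:

```python
import numpy as np
import pandas as pd

def split_event_id(event_id):
    """Split an (N, 3) event ID array into run/subrun/event columns.

    `event_id` is assumed to be the 2D array as read from the HDF5
    group, one [run, subrun, event] triple per row.
    """
    df = pd.DataFrame(np.asarray(event_id), columns=["run", "subrun", "event"])
    # A (run, subrun, event) MultiIndex matches the PandAna convention.
    return df.set_index(["run", "subrun", "event"])

# example with a few fabricated event IDs:
ids = np.array([[1, 0, 5], [1, 0, 6], [2, 1, 3]])
df = split_event_id(ids)
# df.index is now a MultiIndex named ("run", "subrun", "event")
```

with the index in this shape, joining per-event tables reduces to an index-aligned merge, which is the main thing the current 2D layout blocks.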
ideally we would find a way to abstract away planar information automatically (i.e. handle `[u,v,y]` for DUNE or `[x,y]` for NOvA using the same code) when separating into columns, or to just work with multi-element columns natively. whatever works. for Euclidean coordinates we can just do this the old-fashioned way, but if we find a neat solution for planar coordinates, perhaps it'll make sense to use it for Euclidean too.
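one way the abstraction could look: keep the column-splitting code label-agnostic and push the detector-specific axis names into a per-experiment list. the label table, column prefix, and function name below are all hypothetical placeholders, just to show the shape of the idea:

```python
import numpy as np
import pandas as pd

# hypothetical per-experiment plane labels; the splitting code never
# hard-codes them, so DUNE and NOvA go through the same path.
PLANE_LABELS = {"dune": ["u", "v", "y"], "nova": ["x", "y"]}

def split_coords(coords, labels, prefix="wire_"):
    """Expand an (N, k) coordinate array into k named columns.

    `labels` carries the detector-specific axis names, so the same
    code handles [u, v, y] for DUNE and [x, y] for NOvA.
    """
    coords = np.asarray(coords)
    if coords.shape[1] != len(labels):
        raise ValueError("coordinate width does not match label count")
    return pd.DataFrame(coords, columns=[prefix + lab for lab in labels])

planar = np.array([[10.0, 11.0, 12.0], [20.0, 21.0, 22.0]])
df = split_coords(planar, PLANE_LABELS["dune"])
# columns: wire_u, wire_v, wire_y
```

the same helper would cover Euclidean coordinates by passing `["x", "y", "z"]` as the label list, which is the "use it for Euclidean too" case mentioned above.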
cerati pushed a commit to cerati/numl that referenced this issue on Feb 20, 2023.