Errors merging files of run 2031 #412
Do the files (> subrun 80) contain the parameters table key?
Yes, they do contain the table of parameters. However, the coordinate columns are not there because of the invalid timestamps used to interpolate the pointing.

In [5]: get_dataset_keys("/fefs/aswg/data/real/DL1/20200227/v0.4.4_v00/dl1_LST-1.1.Run02031.0101.fits.h5")
Out[5]:
['dl1/event/telescope/image/LST_LSTCam',
'dl1/event/telescope/parameters/LST_LSTCam',
'instrument/subarray/layout',
'instrument/subarray/layout.__table_column_meta__',
'instrument/telescope/camera/LSTCam',
'instrument/telescope/camera/LSTCam.__table_column_meta__',
'instrument/telescope/optics',
'instrument/telescope/optics.__table_column_meta__']
In [7]: df.columns
Out[7]:
Index(['dragon_time', 'event_id', 'intensity', 'intercept', 'kurtosis',
'leakage', 'length', 'log_intensity', 'mc_core_distance', 'n_islands',
'num_trig_pix', 'obs_id', 'phi', 'psi', 'r', 'skewness', 'tel_id',
'tel_pos_x', 'tel_pos_y', 'tel_pos_z', 'tib_time', 'time_gradient',
'trigger_time', 'trigger_type', 'ucts_time', 'width', 'wl', 'x', 'y'],
dtype='object')

At DL2 those columns should be added if they did not exist previously, with their values set to -90 deg when pointing info is missing for the whole subrun. In principle, merging the files at DL2 level instead should be fine.
I'm doing this to produce DL2 myself (using aict-tools). So how could I go about this? And why are the coordinate columns missing completely instead of being filled with NaN?
Agreed. Looking at the code again, I think this should not happen anymore:

if pointing_file_path and event_timestamps > 0:
    # interpolate the drive log to get the pointing at this event's timestamp
    azimuth, altitude = pointings.cal_pointingposition(event_timestamps, drive_data)
    event.pointing[telescope_id].azimuth = azimuth
    event.pointing[telescope_id].altitude = altitude
    dl1_container.az_tel = azimuth
    dl1_container.alt_tel = altitude
else:
    # no drive file or invalid timestamp: store NaN instead of dropping the columns
    dl1_container.az_tel = u.Quantity(np.nan, u.rad)
    dl1_container.alt_tel = u.Quantity(np.nan, u.rad)
OK, so maybe I'll just wait until the 0.5 processing is done.
@maxnoe, the v0.5.1 processing is finished. Could you try to do the merging now?
Awesome! Yes, will do.
@morcuended I tried with:
But now the files do not contain any dl1 parameters, only the instrument and dl1datacheck groups.
Probably because it is also trying to merge the datacheck files:

In [3]: get_dataset_keys("/fefs/aswg/data/real/DL1/20200227/v0.5.1_v03/datacheck_dl1_LST-1.Run02031.0000.h5")
Out[3]:
['dl1datacheck/cosmics',
'dl1datacheck/flatfield',
'dl1datacheck/histogram_binning',
'dl1datacheck/pedestals',
'dl1datacheck/used_trigger_tag',
'instrument/telescope/camera/LSTCam',
'instrument/telescope/camera/LSTCam.__table_column_meta__']

But the dl1 files themselves do contain the image table:

In [2]: get_dataset_keys("/fefs/aswg/data/real/DL1/20200227/v0.5.1_v03/dl1_LST-1.Run02031.0000.h5")
Out[2]:
['dl1/event/telescope/image/LST_LSTCam',
'dl1/event/telescope/monitoring/calibration',
'dl1/event/telescope/monitoring/flatfield',
'dl1/event/telescope/monitoring/pedestal',
'dl1/event/telescope/parameters/LST_LSTCam',
'instrument/subarray/layout',
'instrument/subarray/layout.__table_column_meta__',
'instrument/telescope/camera/LSTCam',
'instrument/telescope/camera/LSTCam.__table_column_meta__',
'instrument/telescope/optics',
'instrument/telescope/optics.__table_column_meta__']

I guess the merge script should skip those datacheck files.
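Until that happens, a workaround is to collect only the per-subrun dl1_* files when building the list to merge, skipping the datacheck_* files. A rough sketch, assuming the directory and naming pattern from the paths quoted above:

from pathlib import Path

# directory taken from the paths above; adjust run number and version as needed
dl1_dir = Path("/fefs/aswg/data/real/DL1/20200227/v0.5.1_v03")

# matches dl1_LST-1.Run02031.0000.h5 etc., but not datacheck_dl1_LST-1.Run02031.*.h5
subrun_files = sorted(dl1_dir.glob("dl1_LST-1.Run02031.*.h5"))

for f in subrun_files:
    print(f.name)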
I wanted to open a more general issue anyway about taking a variadic number of files as arguments instead of a directory. In my experience, that leads to unexpected behaviour much less often.
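For illustration, that could be a positional argument with nargs='+' instead of a directory option (just a sketch of the argument handling, not the actual lstchain script):

import argparse

parser = argparse.ArgumentParser(description="Merge DL1 subrun files")
# the caller lists the exact files, e.g. via shell globbing, so stray files
# in the directory (like the datacheck files) are never picked up by accident
parser.add_argument("input_files", nargs="+", help="DL1 subrun files to merge")
parser.add_argument("--output-file", "-o", required=True, help="path of the merged output file")
args = parser.parse_args()

print(f"merging {len(args.input_files)} files into {args.output_file}")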
When merging the files of run 2031, after some 80 subruns no parameters can be written anymore:
The script finishes successfully, though.
Runs 2032 and 2033 worked without problems.
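A quick way to check whether a merged file actually contains the parameters table is to list its datasets, as in the snippets above. A sketch, assuming get_dataset_keys can be imported from lstchain.io and using a hypothetical name for the merged output:

from lstchain.io import get_dataset_keys

# hypothetical path of the merged output file; adjust to wherever the merge wrote it
merged_file = "dl1_LST-1.Run02031.merged.h5"

keys = get_dataset_keys(merged_file)
print("\n".join(keys))

# the merge is only usable for DL2 if the parameters table made it into the output
assert "dl1/event/telescope/parameters/LST_LSTCam" in keys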