deploy multi stimulus raw running speed code (LIMS release) and run jobs #2324
We need to shepherd this PR through review and merging: http://stash.corp.alleninstitute.org/projects/TECH/repos/lims/pull-requests/838/overview
This also depends on having the list of sessions for release from Corbett. The ecephys_session_ids that Wayne gave to Scott are:
Corbett has not verified this list yet.
The ecephys session NWB writer will calculate the running speed and running acquisition dataframes internally, without needing to rely on a separate LIMS queue (#2337).
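For orientation, below is a minimal sketch of how a running speed dataframe can be derived from raw wheel rotation samples. The column names, wheel radius value, and function name are illustrative assumptions and do not reflect the actual NWB writer internals.

```python
import numpy as np
import pandas as pd

# Assumed wheel radius; the real value comes from the rig configuration.
WHEEL_RADIUS_CM = 8.255


def running_speed_from_rotation(frame_time: np.ndarray,
                                wheel_angle_rad: np.ndarray) -> pd.DataFrame:
    """Convert wheel angle samples (radians) to linear running speed (cm/s)."""
    # Unwrap so that 2*pi rollovers do not produce spurious speed spikes.
    angle = np.unwrap(wheel_angle_rad)
    # Angular velocity via finite differences, then scale by the wheel radius.
    dtheta = np.diff(angle)
    dt = np.diff(frame_time)
    speed = (dtheta / dt) * WHEEL_RADIUS_CM
    # Report each speed at the midpoint of its sample interval.
    mid_times = frame_time[:-1] + dt / 2
    return pd.DataFrame({"timestamps": mid_times, "speed": speed})
```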
Once the LIMS changes (multi stimulus raw running speed code) have been deployed (http://stash.corp.alleninstitute.org/projects/TECH/repos/lims/commits/2ba271fbba16c0aff8d3d6989cc2074b10e0a4d1), the ecephys session data will need to be run through the LIMS multi stimulus raw running speed queue. This queue uses the same module as the existing multi stimulus running speed queue, but with a boolean filter set to false. The queue, transitions, and executable have already been created in LIMS.
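As a rough illustration of the boolean filter, the raw queue's strategy input might look like the sketch below. The key name "use_lowpass_filter" and the output filename are assumptions standing in for the real field names, which are defined by the LIMS strategy and executable.

```python
import json

# Hypothetical input settings for the raw running speed queue. The key
# "use_lowpass_filter" stands in for the boolean filter mentioned above.
raw_queue_input = {
    "output_path": "raw_running_speed.h5",  # illustrative output filename
    "use_lowpass_filter": False,            # False => keep the raw (unfiltered) speed
}

with open("input.json", "w") as f:
    json.dump(raw_queue_input, f, indent=2)
```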
The well known files created by this queue are consumed by the behavior ecephys create nwb module through the input field named 'raw_running_speed_path'.
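For concreteness, the field might appear in the create nwb module's input as in the fragment below; only 'raw_running_speed_path' comes from this issue, and the other key and both values are hypothetical placeholders.

```python
# Only "raw_running_speed_path" is named in this issue; the session id and the
# file path shown here are hypothetical.
create_nwb_input_fragment = {
    "ecephys_session_id": 1234567890,
    "raw_running_speed_path": "/path/to/raw_running_speed.h5",
}
```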
Acceptance criteria:
Ask the LIMS team to deploy the new LIMS code
Run all of the ecephys sessions through this queue (use the reprocessing page in LIMS to create the jobs in a 'pending' state, or rerun all of the jobs in the previous queue; on finishing, they will create new jobs in the next queue)
Check that the jobs run without errors (see the sketch after this list)
Run the behavior ecephys create nwb jobs and make sure there are no errors
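A hedged sketch of the job-state check, assuming direct read access to the internal LIMS Postgres database; the queue name, table names, and column names are assumptions about the LIMS schema and should be verified before use.

```python
import psycopg2  # assumes network access to the internal LIMS Postgres database

# Count jobs in each state for the (assumed) raw running speed queue.
QUERY = """
SELECT js.name AS job_state, COUNT(*) AS n_jobs
FROM jobs j
JOIN job_queues jq ON jq.id = j.job_queue_id
JOIN job_states js ON js.id = j.job_state_id
WHERE jq.name = 'MULTI_STIMULUS_RAW_RUNNING_SPEED_QUEUE'  -- assumed queue name
GROUP BY js.name;
"""


def summarize_queue_jobs(connection_string: str) -> None:
    """Print a per-state job count so failures stand out at a glance."""
    with psycopg2.connect(connection_string) as conn, conn.cursor() as cur:
        cur.execute(QUERY)
        for state, count in cur.fetchall():
            print(f"{state}: {count}")
```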