Clean up previously generated CSV files #10279
Conversation
Build Artifacts
This new task looks great and accurately removes the correct session and summary files. One thing I noticed is that the folder containing the exported session and summary CSV files also contains the exported users CSV files. So when `log_exports_cleanup` runs, it also removes any previously generated/exported users CSV file. However, this file should remain untouched. The exported users CSV file is stored in the `log_exports` folder with the naming convention (one possible way to exclude these files is sketched below):
- {facility_name}_{last 4 digits of facility ID}_users.csv
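For illustration only, here is a minimal sketch of how such an exclusion could look; the function and parameter names (`cleanup_log_exports`, `log_exports_dir`) are assumptions, not the actual PR code:

```python
# Hedged sketch: delete stale log export CSVs while leaving exported users
# CSVs untouched. All names here are illustrative, not taken from the PR.
import os


def cleanup_log_exports(log_exports_dir, valid_filenames):
    for filename in os.listdir(log_exports_dir):
        if filename.endswith("_users.csv"):
            # Users exports follow {facility_name}_{last 4 digits}_users.csv
            # and must remain untouched.
            continue
        if filename not in valid_filenames:
            os.remove(os.path.join(log_exports_dir, filename))
```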
Sure, will exclude that from the deletion.
Finally, that test case is fixed 🙌 Hello @LianaHarris360, please have a look and let me know if any other changes are needed.
Hi @Jaspreet-singh-1032 I manually tested everything and the CSV file for users is no longer being deleted, nice work 🎉 I left a couple of comments and once this is updated, it should be good to go!
Also, if you add the GitHub keyword "closes" or "fixes" before the referenced issue, it will automatically close the issue once this PR is merged.
kolibri/core/logger/tasks.py
Outdated
@@ -116,3 +169,18 @@ def exportsummarylogcsv(facility_id, **kwargs):
        kwargs.get("end_date"),
        kwargs.get("locale"),
    )


@register_task()
It would be useful to add a job ID to this task. The `register_task` decorator takes `job_id` as an argument, which will always be used when the task is registered. Using this predictable ID deduplicates the task and ensures that only one cleanup task runs at a time.
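As a rough illustration of the suggestion (the import path and constant name are assumptions, not taken from the PR):

```python
# Hedged sketch: register the cleanup task with a fixed job_id so repeated
# scheduling deduplicates to a single job. The import path and constant name
# are assumptions based on the comment above, not the actual PR code.
from kolibri.core.tasks.decorators import register_task

LOG_EXPORTS_CLEANUP_JOB_ID = "log_exports_cleanup"


@register_task(job_id=LOG_EXPORTS_CLEANUP_JOB_ID)
def log_exports_cleanup():
    # With a predictable job_id, only one cleanup job can be queued or
    # running at any given time.
    ...
```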
kolibri/core/logger/tasks.py
Outdated
    valid_users_filenames = get_valid_users_csv_filenames()
    valid_filenames = valid_filenames.union(valid_logs_filenames)
    valid_filenames = valid_filenames.union(valid_users_filenames)
    return valid_filenames
This is nothing major, but I think `valid_filenames` could be set in one line using `valid_logs_filenames.union(valid_users_filenames)`.
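Spelled out, the suggestion could look something like this; `get_valid_logs_csv_filenames` is a hypothetical helper standing in for however the log filenames are collected in `tasks.py` (only `get_valid_users_csv_filenames` appears in the diff above):

```python
# Hedged sketch of the one-line suggestion. Both helpers are assumed to be
# defined elsewhere in the module; this only illustrates the union call.
def get_valid_filenames():
    valid_logs_filenames = get_valid_logs_csv_filenames()
    valid_users_filenames = get_valid_users_csv_filenames()
    return valid_logs_filenames.union(valid_users_filenames)
```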
It seems the task failed because GitHub has updated its host keys.
Everything looks good to me, thanks @Jaspreet-singh-1032! Before approving the changes, our QA team will take a look at it for testing @radinamatic
Thanks @Jaspreet-singh-1032 - manually tested this with the .deb and .exe assets and it's functioning correctly as described. Existing .csv files are replaced if they have a matching record in the database; if not, they are removed from the log_export folder when a new file is generated.
Thank you, everyone ❤️
Summary
Remove session and summary log export CSV files that do not have a related record in `GenerateCSVLogRequest`.
…
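For context, the cleanup described above amounts to keeping only filenames that are still backed by a `GenerateCSVLogRequest` record, roughly along these lines; the model import path and the `selected_file_name` attribute are assumptions, not the actual implementation:

```python
# Hedged sketch only: collect the filenames that still have a matching
# GenerateCSVLogRequest record; anything else in the export folder is stale.
# The import path and attribute name below are assumptions.
from kolibri.core.logger.models import GenerateCSVLogRequest


def get_valid_logs_csv_filenames():
    return {
        request.selected_file_name
        for request in GenerateCSVLogRequest.objects.all()
    }
```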
References
#10123
…
Reviewer guidance
…
Testing checklist
PR process
Reviewer checklist
(yarn and pip)