Replies: 6 comments 5 replies
-
I'd personally not recommend using the built-in queuing: each chunk has to re-open the file, which costs more time and memory on every chunk job. I believe it's better/easier to just wrap the export in a job of your own, increase the timeout, and run it on a long-running Redis queue (using Horizon, as you are already doing). The next release will demonstrate this approach in the docs; I don't have that info ready yet, but I hope you can figure it out with articles like https://medium.com/@williamvicary/long-running-jobs-with-laravel-horizon-7655e34752f7
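A minimal sketch of what that wrapper job could look like. The class names (`ExportUsers`, `UsersExport`) and the queue name are assumptions for illustration, not names taken from this thread:

```php
<?php

namespace App\Jobs;

use App\Exports\UsersExport; // hypothetical export class
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Facades\Excel;

class ExportUsers implements ShouldQueue
{
    use Queueable;

    // Allow this single job to run for up to 10 minutes.
    public $timeout = 600;

    public function handle()
    {
        // Write the whole file in one long-running job, instead of
        // letting the package queue one short job per chunk.
        Excel::store(new UsersExport(), 'users.xlsx');
    }
}
```

You would then dispatch it with something like `ExportUsers::dispatch()->onQueue('long-running-queue')` and point a dedicated Horizon supervisor at that queue. Note that the job's `$timeout` must stay shorter than the queue connection's `retry_after`, or the job will be retried while still running.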
-
I've got the same problem. Seems to me the Laravel job property
-
Any solution? I have the same problem.
-
Any solution? I am having the same issue.
-
I'm also having the same issue. `php artisan queue:work` is not working.
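If the worker itself is killing the job, the timeout can also be raised on the worker side. This is a guess at the setup, since the thread doesn't show the actual worker command:

```shell
# Give each job up to 10 minutes before the worker kills it.
php artisan queue:work redis --timeout=600
```

Keep in mind that when Horizon manages the workers, this flag is not what applies; Horizon reads the timeout from its own supervisor configuration instead.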
-
Any chance of an answer for this one? I have the export running in its own job with the timeout set to 10 minutes, but it seems like the package's export functionality is still trying to get it done within the 1-minute default, and it therefore fails with the error below because of the large dataset.
-
Here is my controller action:
Super simple: we queue up the export (there are over 5k records; it can be more).
Again nothing fancy.
Issue: Horizon fails
I get this error in horizon:
From the docs, this is supposed to chunk the export and spin up many jobs. However, in Horizon I see 2-3 jobs go through pending like no tomorrow, then one that stays before dying because of the above error.
Am I doing this wrong?
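One thing worth checking in a setup like this (an assumption about the config, since it isn't shown in the thread): Horizon takes the job timeout from its supervisor configuration, and that value must stay several seconds shorter than `retry_after` in `config/queue.php`, or long jobs get re-dispatched mid-run and then fail. The values here are illustrative only:

```php
// config/horizon.php
'environments' => [
    'production' => [
        'supervisor-1' => [
            'connection' => 'redis',
            'queue'      => ['default'],
            'processes'  => 3,
            'tries'      => 1,
            // Must be shorter than retry_after in config/queue.php,
            // otherwise a still-running job is retried and collides
            // with itself.
            'timeout'    => 600,
        ],
    ],
],
```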