BigQuery: Errno::EPIPE on loading csv file #266
Thanks again for opening the issue. We'll get right on it.
It's strange: I succeeded only with a CSV file smaller than 5_000_000 bytes (39,250 rows).
Hi @vitaliel, and thank you for reporting this!
A solution, documented in two of the issues above as well as in a Stack Overflow answer, is to change the default Faraday adapter by adding one line before your code, right after the require.
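The exact line was lost from this copy of the thread, but since a later comment attributes the problem to Faraday's default provider, the fix referred to here was presumably switching Faraday away from its default Net::HTTP adapter. A minimal sketch, assuming the `httpclient` gem is installed:

```ruby
require "faraday"
require "httpclient" # gem install httpclient

# Replace Faraday's default Net::HTTP adapter, which can raise
# Errno::EPIPE when the server closes the connection before a
# large upload finishes writing.
Faraday.default_adapter = :httpclient
```

This is a process-wide setting, so it affects every Faraday connection that does not explicitly pick its own adapter.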
Can you give this a try and let us know if it solves the problem? If so, I will add the solution to the documentation.
Thank you @blowmage for providing the background story on this.
@quartzmo Thanks, it worked.
@vitaliel Great. I will add documentation of this issue to the API docs.
There is no fix we can make for this file upload issue, so add documentation instead. [closes googleapis#266]
Update Storage and BigQuery docs with broken pipe solution [closes #266]
FYI, the updated docs will be included in the next point release (0.3.1), but the release after that (0.4.0) will most likely switch dependencies from Faraday to Hurley, meaning this guidance will change. Hopefully Hurley will be an improvement on Faraday and not have this issue in the default provider. :)
Hi,
I'm trying to upload a 100 MB CSV file to BigQuery, but I get Errno::EPIPE errors.
Snippet:
I get the error after 10 seconds, but if I do not pass chunk_size, it fails after 50 seconds.
Exception: