bug: client doesn't retry "Job exceeded rate limits" for DDL query jobs that exceed quota for table update operations #1790
Labels
- api: bigquery — Issues related to the googleapis/python-bigquery API.
- priority: p2 — Moderately-important priority. Fix may not be included in next release.
- type: bug — Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
As reported in googleapis/python-bigquery-sqlalchemy#1009 (comment), the query in https://btx-internal.corp.google.com/invocations/ffafb866-6bc0-423f-a86b-df69fb270d57/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-bigquery-sqlalchemy%2Fpresubmit%2Fprerelease-deps;config=default/log fails with "rate limits exceeded" errors that the client does not retry.
Environment details
- Python version: `python --version`
- pip version: `pip --version`
- `google-cloud-bigquery` version: `pip show google-cloud-bigquery`
Steps to reproduce
Run a DDL query against the same table more than 5 times within 10 seconds, violating the limit of five table metadata update operations per 10 seconds per table (https://cloud.google.com/bigquery/quotas#standard_tables).
Code example
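This section was left empty; the sketch below is an assumption about the fix, not the library's actual code. It shows the kind of message-based classification the client would need in order to treat "Job exceeded rate limits" as retryable, followed by the reproduction loop from "Steps to reproduce" as commented pseudocode (the table and dataset names are hypothetical, and running it requires real credentials):

```python
# Sketch of a retry predicate for this issue (an assumption, not the
# library's actual implementation). BigQuery surfaces the DDL
# table-metadata quota violation as a 403 whose message contains
# "Job exceeded rate limits"; the client currently does not retry it.

def is_retryable_rate_limit(message: str) -> bool:
    """Return True when an error message indicates a retryable rate limit."""
    retryable_markers = (
        "Job exceeded rate limits",   # DDL table-metadata quota (this issue)
        "rateLimitExceeded",          # generic BigQuery rate-limit reason
    )
    return any(marker in message for marker in retryable_markers)


# Reproduction outline (requires credentials and a real table, so it is
# shown only as commented pseudocode; names are hypothetical):
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   ddl = "ALTER TABLE my_dataset.my_table SET OPTIONS (description = 'x')"
#   for _ in range(6):  # >5 metadata updates on one table within 10 seconds
#       client.query(ddl).result()  # 403 "Job exceeded rate limits", not retried
```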
Stack trace