
improve test latency #191

Closed
shollyman opened this issue Jul 24, 2020 · 9 comments
Assignees
Labels
api: bigquery Issues related to the googleapis/python-bigquery API. type: feature request ‘Nice-to-have’ improvement, new feature or different behavior or design.

Comments

@shollyman
Contributor

shollyman commented Jul 24, 2020

Recent presubmit Kokoro runs are taking on the order of 37-48 minutes for typical invocations, with a couple significantly higher than that. This is a significant impediment to actually getting things done.

We can definitely test much faster than this.

Some things to look at:

  • Decompose the monolithic Kokoro job that runs all the nox sessions serially into parallel invocations, the same way we decompose the samples invocations. Fan out to get better wall time.

  • Look for obviously slow tests and improve them, reducing the tail of the (to-be) parallelized test runs. Based on logging, the system tests are unsurprisingly the longest session, typically.

  • Look at coverage overlap between samples and library; we may be able to reduce redundancy in testing.

Here's some quick info from running locally on 3.8:

$ nox -s system-3.8 -- --durations=0

44.84s call     tests/system.py::TestBigQuery::test_dbapi_w_standard_sql_types
29.28s call     tests/system.py::TestBigQuery::test_dbapi_w_query_parameters
27.39s call     tests/system.py::TestBigQuery::test_query_many_columns
15.91s call     tests/system.py::TestBigQuery::test_query_w_query_params
15.04s call     tests/system.py::TestBigQuery::test_load_table_from_file_w_explicit_location
13.98s call     tests/system.py::TestBigQuery::test_load_table_from_dataframe_w_automatic_schema
13.79s call     tests/system.py::TestBigQuery::test_query_w_standard_sql_types
13.19s call     tests/system.py::TestBigQuery::test_load_table_from_dataframe_w_nullable_int64_datatype
11.90s call     tests/system.py::TestBigQuery::test_load_table_from_local_avro_file_then_dump_table
11.19s call     tests/system.py::TestBigQuery::test_load_table_from_dataframe_w_required
10.83s call     tests/system.py::TestBigQuery::test_nested_table_to_dataframe
10.50s call     tests/system.py::TestBigQuery::test_load_table_from_dataframe_w_nulls
10.30s call     tests/system.py::TestBigQuery::test_load_table_from_json_schema_autodetect
10.11s call     tests/system.py::TestBigQuery::test_copy_table
9.60s call     tests/system.py::TestBigQuery::test_load_table_from_dataframe_w_explicit_schema
8.56s call     tests/system.py::TestBigQuery::test_dbapi_fetchall
8.33s call     tests/system.py::TestBigQuery::test_extract_table
7.61s call     tests/system.py::TestBigQuery::test_nested_table_to_arrow
7.20s call     tests/system.py::TestBigQuery::test_load_table_from_dataframe_w_nullable_int64_datatype_automatic_schema
7.09s call     tests/system.py::TestBigQuery::test_load_table_from_uri_then_dump_table
7.03s call     tests/system.py::TestBigQuery::test_list_rows_page_size
6.76s call     tests/system.py::TestBigQuery::test_query_w_legacy_sql_types
6.67s call     tests/system.py::TestBigQuery::test_dbapi_connection_does_not_leak_sockets
6.34s call     tests/system.py::TestBigQuery::test_query_w_dml
6.23s call     tests/system.py::TestBigQuery::test_dbapi_w_dml
5.71s call     tests/system.py::TestBigQuery::test_load_avro_from_uri_then_dump_table
4.26s call     tests/system.py::TestBigQuery::test_load_table_from_json_basic_use
3.32s call     tests/system.py::TestBigQuery::test_querying_data_w_timeout
3.12s call     tests/system.py::test_bigquery_magic
2.89s call     tests/system.py::TestBigQuery::test_list_tables
2.87s call     tests/system.py::TestBigQuery::test_query_results_to_dataframe
2.81s call     tests/system.py::TestBigQuery::test_list_datasets
2.81s call     tests/system.py::TestBigQuery::test_job_cancel
2.69s call     tests/system.py::TestBigQuery::test_update_table
2.69s call     tests/system.py::TestBigQuery::test_insert_rows_from_dataframe
2.50s call     tests/system.py::TestBigQuery::test_create_routine
2.46s call     tests/system.py::TestBigQuery::test_query_w_page_size
2.22s call     tests/system.py::TestBigQuery::test_query_results_to_dataframe_w_bqstorage_v1beta1
2.19s call     tests/system.py::TestBigQuery::test_query_results_to_dataframe_w_bqstorage
2.18s call     tests/system.py::TestBigQuery::test_large_query_w_public_data
2.10s call     tests/system.py::TestBigQuery::test_create_table_rows_fetch_nested_schema
2.00s call     tests/system.py::TestBigQuery::test_insert_rows_then_dump_table
1.98s call     tests/system.py::TestBigQuery::test_dbapi_fetch_w_bqstorage_client_large_result_set
1.90s call     tests/system.py::TestBigQuery::test_update_table_schema
1.89s call     tests/system.py::TestBigQuery::test_create_table
1.84s call     tests/system.py::TestBigQuery::test_dbapi_fetch_w_bqstorage_client_v1beta1_large_result_set
1.60s call     tests/system.py::TestBigQuery::test_insert_rows_nested_nested_dictionary
1.60s call     tests/system.py::TestBigQuery::test_create_table_w_time_partitioning_w_clustering_fields
1.59s call     tests/system.py::TestBigQuery::test_insert_rows_nested_nested
1.57s call     tests/system.py::TestBigQuery::test_query_statistics
1.55s call     tests/system.py::TestBigQuery::test_update_dataset
1.46s call     tests/system.py::TestBigQuery::test_query_w_timeout
1.44s call     tests/system.py::TestBigQuery::test_list_rows_empty_table
1.30s call     tests/system.py::TestBigQuery::test_get_dataset
1.18s call     tests/system.py::TestBigQuery::test_delete_dataset_delete_contents_true
1.11s call     tests/system.py::TestBigQuery::test_create_table_with_policy
1.07s call     tests/system.py::TestBigQuery::test_delete_dataset_delete_contents_false
1.00s call     tests/system.py::TestBigQuery::test_create_dataset
0.99s call     tests/system.py::TestBigQuery::test_delete_dataset_with_string
0.93s call     tests/system.py::TestBigQuery::test_query_w_wrong_config
0.88s call     tests/system.py::TestBigQuery::test_list_datasets_w_project
0.77s call     tests/system.py::TestBigQuery::test_close_releases_open_sockets
0.75s call     tests/system.py::TestBigQuery::test_query_w_start_index
0.75s call     tests/system.py::TestBigQuery::test_query_future
0.73s call     tests/system.py::TestBigQuery::test_query_iter
0.47s call     tests/system.py::TestBigQuery::test_list_partitions
0.39s call     tests/system.py::TestBigQuery::test_dbapi_dry_run_query
0.27s call     tests/system.py::TestBigQuery::test_get_failed_job
0.26s call     tests/system.py::TestBigQuery::test_query_w_failed_query
0.17s call     tests/system.py::TestBigQuery::test_get_table_w_public_dataset
0.17s call     tests/system.py::TestBigQuery::test_list_rows_max_results_w_bqstorage
0.16s call     tests/system.py::TestBigQuery::test_get_service_account_email
0.06s setup    tests/system.py::test_bigquery_magic
0.01s setup    tests/system.py::TestBigQuery::test_close_releases_open_sockets

Here are the slowest unit-3.8 timings (many more tests ran than are shown here):

1.73s call     tests/unit/test_job.py::TestQueryJob::test_result_invokes_begins
1.45s call     tests/unit/test_client.py::TestClient::test_get_dataset
1.00s call     tests/unit/test_client.py::TestClient::test_create_job_query_config_w_rateLimitExceeded_error
1.00s call     tests/unit/test_client.py::TestClient::test_get_service_account_email_w_custom_retry
0.44s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_bqstorage_updates_progress_bar
0.37s call     tests/unit/test_job.py::Test_AsyncJob::test_cancel_w_custom_retry
0.31s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_bqstorage_empty_streams
0.28s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_error_if_pandas_is_none
0.27s call     tests/unit/test_dbapi_connection.py::TestConnection::test_does_not_keep_cursor_instances_alive
0.26s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow
0.26s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar
0.25s call     tests/unit/test_table.py::TestRowIterator::test_to_arrow_progress_bar
0.25s call     tests/unit/test_dbapi_cursor.py::TestCursor::test_fetchmany_w_row
0.25s call     tests/unit/test_dbapi_cursor.py::TestCursor::test_execute_w_query_dry_run
0.25s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_tqdm_error
0.25s call     tests/unit/test_job.py::TestQueryJob::test_result_w_timeout
0.24s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_bqstorage_snapshot
0.22s call     tests/unit/test_table.py::TestRowIterator::test_to_dataframe_tabledata_list_w_multiple_pages_return_unique_index
0.21s call     tests/unit/test_table.py::TestRowIterator::test_to_arrow_max_results_w_create_bqstorage_warning
0.21s call     tests/unit/test_magics.py::test_bigquery_magic_w_table_id_and_destination_var
0.21s call     tests/unit/test_client.py::TestClient::test__call_api_applying_custom_retry_on_timeout
0.20s call     tests/unit/test_dbapi_connection.py::TestConnection::test_connect_w_client
@shollyman shollyman added api: bigquery Issues related to the googleapis/python-bigquery API. type: feature request ‘Nice-to-have’ improvement, new feature or different behavior or design. labels Jul 24, 2020
@shollyman shollyman self-assigned this Jul 24, 2020
@tseaver
Contributor

tseaver commented Jul 28, 2020

Other latency sources include installing prerequisites, e.g.:

$ python3.8 -m venv /tmp/bq-191
$ /tmp/bq-191/bin/pip install --upgrade pip setuptools wheel
...
Successfully installed pip-20.1.1 setuptools-49.2.0 wheel-0.34.2
$ time /tmp/bq-191/bin/pip install mock pytest google-cloud-testutils pytest-cov freezegun
...
Successfully installed attrs-19.3.0 cachetools-4.1.1 coverage-5.2.1 freezegun-0.3.15 google-auth-1.19.2 google-cloud-testutils-0.1.0 mock-4.0.2 more-itertools-8.4.0 packaging-20.4 pluggy-0.13.1 py-1.9.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pyparsing-2.4.7 pytest-5.4.3 pytest-cov-2.10.0 python-dateutil-2.8.1 rsa-4.6 six-1.15.0 wcwidth-0.2.5

real	0m4.353s
user	0m3.295s
sys	0m0.241s
$ time /tmp/bq-191/bin/pip install grpcio
...
Successfully installed grpcio-1.30.0

real	0m1.762s
user	0m1.549s
sys	0m0.125s
$ time /tmp/bq-191/bin/pip install -e .[all,fastparquet]
...
Successfully installed certifi-2020.6.20 cffi-1.14.1 chardet-3.0.4 fastparquet-0.4.1 google-api-core-1.22.0 google-cloud-bigquery google-cloud-bigquery-storage-1.0.0 google-cloud-core-1.3.0 google-crc32c-0.1.0 google-resumable-media-0.7.0 googleapis-common-protos-1.52.0 idna-2.10 llvmlite-0.31.0 numba-0.50.1 numpy-1.19.1 pandas-1.0.5 protobuf-3.12.2 pyarrow-1.0.0 pycparser-2.20 python-snappy-0.5.4 pytz-2020.1 requests-2.24.0 thrift-0.13.0 tqdm-4.48.0 urllib3-1.25.10

real	0m20.075s
user	0m16.632s
sys	0m2.186s
$ time /tmp/bq-191/bin/pip install ipython
...
Successfully installed backcall-0.2.0 decorator-4.4.2 ipython-7.16.1 ipython-genutils-0.2.0 jedi-0.17.2 parso-0.7.1 pexpect-4.8.0 pickleshare-0.7.5 prompt-toolkit-3.0.5 ptyprocess-0.6.0 pygments-2.6.1 traitlets-4.3.3

real	0m5.102s
user	0m3.953s
sys	0m0.648s

In particular, the 20 seconds to install the `.[all,fastparquet]` extras in each environment adds up.

@tseaver
Contributor

tseaver commented Jul 28, 2020

We should check as well whether the Kokoro environment has everything in place to allow installing binary wheels: compilation from source for the numpy-related packages can run really long. Maybe we need to make the prereq installation more verbose to test that?
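One cheap way to sanity-check this is to compute the wheel tag the interpreter would look for. The sketch below is a simplification (it ignores ABI flags and manylinux compatibility tags, which `pip debug --verbose` reports properly), but if no published wheel matches a compatible tag, pip falls back to a slow from-source build.

```python
import sys
import sysconfig


def current_wheel_tag():
    """Approximate the wheel tag this interpreter looks for,
    e.g. 'cp38-cp38-linux_x86_64'. Simplified: real resolution also
    considers ABI flags and the manylinux compatibility tags."""
    impl = "cp" if sys.implementation.name == "cpython" else sys.implementation.name
    ver = "{}{}".format(sys.version_info.major, sys.version_info.minor)
    plat = sysconfig.get_platform().replace("-", "_").replace(".", "_")
    return "{0}{1}-{0}{1}-{2}".format(impl, ver, plat)


print(current_wheel_tag())
```

Running `pip install --only-binary=:all:` for the numpy-related packages in the Kokoro image would give a definitive answer: it fails fast instead of silently compiling from source.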

@tseaver
Contributor

tseaver commented Jul 28, 2020

Also, we should check that current versions of the prereqs still ship binary wheels for Python 2.7: if not, we should consider pinning to older versions (for testing under 2.7) to reduce latency introduced by from-source installs.
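Such pins could live in a constraints file passed to pip with `-c`. The file below is hypothetical, and the exact versions are assumptions about the last releases that shipped Python 2.7 wheels; they would need to be verified before use:

```
# constraints-2.7.txt (hypothetical; versions unverified)
numpy==1.16.6 ; python_version < "3"
pandas==0.24.2 ; python_version < "3"
pyarrow==0.16.0 ; python_version < "3"
```

Installed as, e.g., `pip install -c constraints-2.7.txt -e .[all]`, so the pins only constrain the 2.7 sessions and leave the Python 3 sessions free to use current releases.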

@tseaver
Contributor

tseaver commented Jul 30, 2020

Testing the non-snippet, non-systest sessions, both with reused environments and with freshly created ones, and then with all environment setup skipped:

blacken:

$ time nox -re blacken
nox > Running session blacken
nox > Re-using existing virtual environment at .nox/blacken.
...
nox > Session blacken was successful.

real	0m6.815s
user	0m22.461s
sys	0m0.585s

$ time nox -e blacken
nox > Running session blacken
nox > Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
...
nox > Session blacken was successful.

real	0m3.907s
user	0m3.333s
sys	0m0.269s

$ time .nox/blacken/bin/black docs google samples tests noxfile.py setup.py
...

real	0m0.321s
user	0m0.293s
sys	0m0.029s

docs:

$ time nox -re docs
nox > Running session docs
nox > Re-using existing virtual environment at .nox/docs.
...
nox > Session docs was successful.

real	0m30.668s
user	0m29.761s
sys	0m0.969s

$ time nox -e docs
nox > Running session docs
nox > Creating virtual environment (virtualenv) using python3.8 in .nox/docs
...
nox > Session docs was successful.

real	0m56.376s
user	0m49.791s
sys	0m3.795s

$ time .nox/docs/bin/sphinx-build -W -T -N -b html -d docs/_build/doctrees/ docs/ docs/_build/html/

real	0m1.914s
user	0m1.989s
sys	0m0.612s

$ rm -r docs/_build  # Sphinx short-cuts the build if everything is unchanged
$ time .nox/docs/bin/sphinx-build -W -T -N -b html -d docs/_build/doctrees/ docs/ docs/_build/html/
...

real	0m26.755s
user	0m26.556s
sys	0m0.686s

lint:

$ time nox -re lint
nox > Running session lint
nox > Re-using existing virtual environment at .nox/lint.
...
nox > Session lint was successful.

real	0m10.518s
user	0m26.818s
sys	0m0.629s

$ time nox -e lint
nox > Running session lint
nox > Creating virtual environment (virtualenv) using python3.8 in .nox/lint
...
nox > Session lint was successful.

real	0m18.020s
user	0m32.498s
sys	0m1.193s

$ time .nox/lint/bin/flake8 google/cloud/bigquery

real	0m1.346s
user	0m5.750s
sys	0m0.063s

$ time .nox/lint/bin/flake8 tests

real	0m5.634s
user	0m14.962s
sys	0m0.209s

$ time .nox/lint/bin/flake8 docs/samples

real	0m0.652s
user	0m3.161s
sys	0m0.044s

$ time .nox/lint/bin/flake8 docs/snippets.py

real	0m0.492s
user	0m0.479s
sys	0m0.013s

$ time .nox/lint/bin/black --check docs google samples tests noxfile.py setup.py
...

real	0m0.276s
user	0m0.247s
sys	0m0.030s

lint_setup_py:

$ time nox -re lint_setup_py
nox > Running session lint_setup_py
nox > Re-using existing virtual environment at .nox/lint_setup_py.
...
nox > Session lint_setup_py was successful.

real	0m0.955s
user	0m0.868s
sys	0m0.091s

$ time nox -e lint_setup_py
nox > Running session lint_setup_py
nox > Creating virtual environment (virtualenv) using python3.8 in .nox/lint_setup_py
...
nox > Session lint_setup_py was successful.

real	0m3.097s
user	0m2.637s
sys	0m0.369s

$ time .nox/lint_setup_py/bin/python setup.py check --restructuredtext --strict
...

real	0m0.350s
user	0m0.338s
sys	0m0.013s

unit-2.7:

$ time nox -re unit-2.7
nox > Running session unit-2.7
nox > Re-using existing virtual environment at .nox/unit-2-7.
...
1442 passed, 4 skipped, 16 warnings in 37.40 seconds
nox > Session unit-2.7 was successful.

real	0m41.718s
user	0m36.481s
sys	0m1.364s

$ time nox -e unit-2.7
nox > Running session unit-2.7
nox > Creating virtual environment (virtualenv) using python2.7 in .nox/unit-2-7
...
1442 passed, 4 skipped, 16 warnings in 37.65 seconds
nox > Session unit-2.7 was successful.

real	1m9.504s
user	0m55.786s
sys	0m4.076s

$ time .nox/unit-2-7/bin/py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit
...
1442 passed, 4 skipped, 16 warnings in 37.11 seconds

real	0m37.630s
user	0m31.659s
sys	0m1.027s

unit-3.5:

$ time nox -re unit-3.5
nox > Running session unit-3.5
nox > Re-using existing virtual environment at .nox/unit-3-5.
...
1444 passed, 2 skipped, 25 warnings in 32.97s
nox > Session unit-3.5 was successful.

real	0m38.333s
user	0m33.206s
sys	0m1.306s

$ time nox -e unit-3.5
nox > Running session unit-3.5
nox > Creating virtual environment (virtualenv) using python3.5 in .nox/unit-3-5
...
1444 passed, 2 skipped, 25 warnings in 35.06s
nox > Session unit-3.5 was successful.

real	1m9.469s
user	0m56.697s
sys	0m4.432s

$ time .nox/unit-3-5/bin/py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit
...
1444 passed, 2 skipped, 25 warnings in 34.61s

real	0m35.397s
user	0m28.574s
sys	0m0.914s

unit-3.6:

$ time nox -re unit-3.6
nox > Running session unit-3.6
nox > Re-using existing virtual environment at .nox/unit-3-6.
...
1443 passed, 3 skipped, 25 warnings in 37.41s
nox > Session unit-3.6 was successful.

real	0m42.646s
user	0m37.763s
sys	0m1.296s

$ time nox -e unit-3.6
nox > Running session unit-3.6
nox > Creating virtual environment (virtualenv) using python3.6 in .nox/unit-3-6
...
1446 passed, 25 warnings in 39.22s
nox > Session unit-3.6 was successful.

real	1m16.043s
user	1m1.647s
sys	0m4.568s

$ time .nox/unit-3-6/bin/py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit
...
1446 passed, 25 warnings in 35.93s

real	0m36.703s
user	0m31.705s
sys	0m0.897s

unit-3.7

$ time nox -re unit-3.7
nox > Running session unit-3.7
nox > Re-using existing virtual environment at .nox/unit-3-7.
...
1443 passed, 3 skipped, 25 warnings in 36.55s
nox > Session unit-3.7 was successful.

real	0m41.393s
user	0m34.920s
sys	0m1.398s

$ time nox -e unit-3.7
nox > Running session unit-3.7
nox > Creating virtual environment (virtualenv) using python3.7 in .nox/unit-3-7
...
1446 passed, 25 warnings in 35.38s
nox > Session unit-3.7 was successful.

real	1m10.936s
user	0m58.853s
sys	0m4.746s

$ time .nox/unit-3-7/bin/py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit
...
1446 passed, 25 warnings in 36.82s

real	0m37.508s
user	0m30.038s
sys	0m0.946s

unit-3.8:

$ time nox -re unit-3.8
nox > Running session unit-3.8
nox > Re-using existing virtual environment at .nox/unit-3-8.
...
1443 passed, 3 skipped, 25 warnings in 36.83s
nox > Session unit-3.8 was successful.

real	0m41.571s
user	0m33.208s
sys	0m1.267s

$ time nox -e unit-3.8
nox > Running session unit-3.8
nox > Creating virtual environment (virtualenv) using python3.8 in .nox/unit-3-8
...
1446 passed, 25 warnings in 36.23s
nox > Session unit-3.8 was successful.

real	1m7.317s
user	0m55.346s
sys	0m4.168s

$ time .nox/unit-3-8/bin/py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit
...
1446 passed, 25 warnings in 33.87s

real	0m34.597s
user	0m29.128s
sys	0m1.006s

@tseaver
Copy link
Contributor

tseaver commented Jul 30, 2020

Conclusions re: environment creation overhead:

Session        Env creation (s)  Check run (s)  Total (s)
blacken        3.6               0.3            3.9
docs           29.5              26.8           56.4
lint           9.6               8.4            18.0
lint_setup_py  2.7               0.4            3.1
unit-2.7       31.9              37.6           69.5
unit-3.5       34.1              35.4           69.5
unit-3.6       39.3              36.7           76.0
unit-3.7       32.9              37.5           70.9
unit-3.8       32.7              34.6           67.3

Overall total time: 437.6 seconds.

@tseaver
Contributor

tseaver commented Jul 30, 2020

One big thing I note is that the snippets / samples sessions are running in the main Kokoro build, which seems redundant given that we have split them out as well. They take:

Session                 Runtime (s)
docs/snippets.py (2.7)  30.8
samples/ (2.7)          252.2
docs/snippets.py (3.8)  35.4
samples/ (3.8)          255.6

For a total of 579.0 seconds.

System tests:

Session      Runtime (s)
systest-2.7  318.8
systest-3.8  340.1

For a total of 658.9 seconds.

I haven't measured it directly, but the environment creation overhead for these tests should be similar to that of the unit test environments (~35 seconds x 4 environments, ~140 seconds). Altogether, these alone account for ~1748 seconds of the reported 1910-second build time.

@tmatsuo
Contributor

tmatsuo commented Jul 31, 2020

I think splitting out the system tests will be the most valuable change. I thought about how to achieve this, and here is my current thinking.

Here are the preparation steps:

  1. introduce an envvar RUN_SYSTEM_TESTS for presubmit builds (need to add it to presubmit/common.cfg in google3)
  2. change our noxfile to skip system tests if RUN_SYSTEM_TESTS == false
  3. add RUN_SYSTEM_TESTS=true in .kokoro/presubmit/common.cfg in the synthtool template
    (or we can default to RUN_SYSTEM_TESTS=true so that we don't need to add them)

For actually splitting

  1. add logic to synth.py in the bigquery repo to generate .kokoro/presubmit/systests-PY_VER.cfg files, as well as the actual build script that runs only the systest session for the specific Python version.
  2. Once added, add google3 Kokoro configs for those builds
  3. set RUN_SYSTEM_TESTS=false on the main presubmit
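In noxfile terms, the skip in step 2 of the preparation might look like the following. The helper and the session shown in the comment are a sketch under assumed names, not the actual synthtool template:

```python
import os


def run_system_tests(environ=os.environ):
    """True unless RUN_SYSTEM_TESTS is explicitly 'false'.

    Defaulting to true matches the parenthetical in step 3 above: repos
    whose Kokoro configs never set the variable keep running system tests.
    """
    return environ.get("RUN_SYSTEM_TESTS", "true").lower() != "false"


# Inside noxfile.py the guard would look something like (names assumed):
#
# @nox.session(python=["2.7", "3.8"])
# def system(session):
#     if not run_system_tests():
#         session.skip("RUN_SYSTEM_TESTS is false, skipping system tests")
#     ...
```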

@tseaver
Contributor

tseaver commented Aug 10, 2020

With the split-out of systests into their own builds, the slowest remaining builds are (based on #218):

Build           Time
(main)          24m 23s
system-3.8      9m 16s
system-2.7      8m 35s
docs-presubmit  4m 11s
samples-3.8     4m 7s

@tseaver
Contributor

tseaver commented Aug 10, 2020

I believe the main remaining culprits in the main Kokoro build are the snippets-2.7 / snippets-3.8 sessions, which take ~5 minutes each, mostly exercising the samples/tests snippets (which are not exercised in the Samples-x.y Kokoro builds).

The docs / docsfx sessions each tack on a minute or so: perhaps we should be suppressing them, too, since the docs-presubmit build exercises them?

tseaver added a commit that referenced this issue Aug 10, 2020
gcf-merge-on-green bot pushed a commit that referenced this issue Oct 16, 2020
@tmatsuo Emulating PR #207. I don't know if I'm missing anything: e.g., I don't quite understand what `split_system_tests=True` does in the `synth.py` there.

Toward #191

3 participants