Table not found though it exists on bigtable emulator #715

Closed
acscott opened this issue Jan 6, 2023 · 1 comment

Labels
api: bigtable Issues related to the googleapis/python-bigtable API.

acscott commented Jan 6, 2023

Environment details

  • OS type and version: Linux 11.1.0ubuntu4

  • Python version: 3.9.9 (main, Nov 16 2021, 18:43:35)

  • pip version: 22.3.1 from /usr/local/lib/python3.9/site-packages/pip (python 3.9)

  • google-cloud-bigtable version: 2.11.3

Output of pip show google-cloud-bigtable:

Name: google-cloud-bigtable
Version: 2.11.3
Summary: Google Cloud Bigtable API client library
Home-page: https://github.com/googleapis/python-bigtable
Author: Google LLC
Author-email: googleapis-packages@google.com
License: Apache 2.0
Location: /usr/local/lib/python3.9/site-packages
Requires: google-api-core, google-cloud-core, grpc-google-iam-v1, proto-plus, protobuf
Required-by: 

Steps to reproduce

  1. Running in Docker through PyCharm, with the Bigtable service running on port 8086 in another container (bigtruedata/gcloud-bigtable-emulator)
  2. Running in Docker through PyCharm, with the Bigtable service running on the host network on port 8086; gcloud components installed on the host:
    Google Cloud SDK 404.0.0
    beta 2022.09.23
    bigtable
    bq 2.0.78
    bundled-python3-unix 3.9.12
    cbt 0.12.0
    core 2022.09.23
    gcloud-crc32c 1.0.0
    gke-gcloud-auth-plugin 0.3.0
    gsutil 5.14
    kubectl 1.22.14

I have verified that the table catalog_lut exists with column family C:

adam@pop-os:~$ export BIGTABLE_EMULATOR_HOST=localhost:8086
adam@pop-os:~$ cbt ls
alert
alert_lut
catalog_lut
catalog_table
locus
locus_by_alert_id
locus_by_day
storage
watch_list
watch_object

adam@pop-os:~$ cbt ls catalog_lut
Family Name	GC Policy
-----------	---------
C		versions() > 1

We are using the workaround from GoogleCloudPlatform/cloud-sdk-docker#253 (comment) to work within Docker.

Code example

        # from create_catalog_lookup_row in antares/adapters/repository/bigtable/catalog.py (see the stack trace below)
        log.info([x.name + ":" + str(x) for x in self.lookup_table._instance.list_tables()])
        log.info(f"self.lookup_table_cf = {self.lookup_table_cf}")
        log.info(f"self.lookup_table.name = {self.lookup_table.name}")
        log.info(f"self.lookup_table.table_id.table_id = {self.lookup_table.table_id.table_id}")
        row = self.lookup_table.direct_row("test")
        # insert dummy data to satisfy BT design constraints: you cannot store just a rowkey
        row.set_cell(self.lookup_table_cf, "d", ".")
        row.commit()
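
Note: in the log output below, self.lookup_table.name embeds a Table repr and self.lookup_table.table_id.table_id resolves to catalog_lut, which suggests the lookup table object was constructed with another Table object as its table_id. A minimal sketch of how such a request path can arise (illustrative only, not the project's code), assuming the emulator host and names from this report:

    import os
    from google.cloud import bigtable

    # Use the emulator from this report; no real credentials are needed.
    os.environ["BIGTABLE_EMULATOR_HOST"] = "localhost:8086"

    client = bigtable.Client(project="antares_dev")
    instance = client.instance("antares_dev")

    inner = instance.table("catalog_lut")  # name ends in .../tables/catalog_lut
    outer = instance.table(inner)          # table_id is itself a Table object

    print(outer.table_id.table_id)  # "catalog_lut", as in the log below
    print(outer.name)               # .../tables/<google.cloud.bigtable.table.Table object at 0x...>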

Stack trace

['projects/antares_dev/instances/antares_dev/tables/alert:<google.cloud.bigtable.table.Table object at 0x7f4271091070>', 'projects/antares_dev/instances/antares_dev/tables/alert_lut:<google.cloud.bigtable.table.Table object at 0x7f4271091e80>', 'projects/antares_dev/instances/antares_dev/tables/storage:<google.cloud.bigtable.table.Table object at 0x7f4271091400>', 'projects/antares_dev/instances/antares_dev/tables/locus_by_alert_id:<google.cloud.bigtable.table.Table object at 0x7f4271091b20>', 'projects/antares_dev/instances/antares_dev/tables/watch_list:<google.cloud.bigtable.table.Table object at 0x7f4271091340>', 'projects/antares_dev/instances/antares_dev/tables/catalog_table:<google.cloud.bigtable.table.Table object at 0x7f4271091460>', 'projects/antares_dev/instances/antares_dev/tables/locus:<google.cloud.bigtable.table.Table object at 0x7f4271091520>', 'projects/antares_dev/instances/antares_dev/tables/catalog_lut:<google.cloud.bigtable.table.Table object at 0x7f4271091760>', 'projects/antares_dev/instances/antares_dev/tables/locus_by_day:<google.cloud.bigtable.table.Table object at 0x7f4271091610>', 'projects/antares_dev/instances/antares_dev/tables/watch_object:<google.cloud.bigtable.table.Table object at 0x7f42710916d0>']
2023-01-06 23:21:23,381 - INFO MainThread catalog.py:create_catalog_lookup_row:195 - self.lookup_table_cf = C
2023-01-06 23:21:23,381 - INFO MainThread catalog.py:create_catalog_lookup_row:196 - self.lookup_table.name = projects/antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>
2023-01-06 23:21:23,381 - INFO MainThread catalog.py:create_catalog_lookup_row:197 - self.lookup_table.table_id.table_id = catalog_lut
test_1           | 
test/integration/test_adapters/test_bigtable/test_catalog_repository.py:76 (TestBigtableCatalogObjectRepository.test_can_add_catalog_object)
args = (table_name: "projects/antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710...et_cell {
      family_name: "C"
      column_qualifier: "d"
      timestamp_micros: -1
      value: "."
    }
  }
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'table_name=projects/antares_dev/instances/antares_dev/tables/%3Cgoogle.cloud....ble.Table+object+at+0x7f42710b9520%3E'), ('x-goog-api-client', 'gl-python/3.9.9 grpc/1.50.0 gax/2.10.2 gapic/2.11.3')]}
result = <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.NOT_FOUND
	details = "table "projects/anta.../antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>\" not found"}"
>
prefetch_first = True

    @functools.wraps(callable_)
    def error_remapped_callable(*args, **kwargs):
        try:
            result = callable_(*args, **kwargs)
            # Auto-fetching the first result causes PubSub client's streaming pull
            # to hang when re-opening the stream, thus we need examine the hacky
            # hidden flag to see if pre-fetching is disabled.
            # https://github.com/googleapis/python-pubsub/issues/93#issuecomment-630762257
            prefetch_first = getattr(callable_, "_prefetch_first_result_", True)
>           return _StreamingResponseIterator(
                result, prefetch_first_result=prefetch_first
            )

/usr/local/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <google.api_core.grpc_helpers._StreamingResponseIterator object at 0x7f4271091580>
wrapped = <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.NOT_FOUND
	details = "table "projects/anta.../antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>\" not found"}"
>
prefetch_first_result = True

    def __init__(self, wrapped, prefetch_first_result=True):
        self._wrapped = wrapped
    
        # This iterator is used in a retry context, and returned outside after init.
        # gRPC will not throw an exception until the stream is consumed, so we need
        # to retrieve the first result, in order to fail, in order to trigger a retry.
        try:
            if prefetch_first_result:
>               self._stored_first_result = next(self._wrapped)

/usr/local/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:88: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.NOT_FOUND
	details = "table "projects/anta.../antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>\" not found"}"
>

    def __next__(self):
>       return self._next()

/usr/local/lib/python3.9/site-packages/grpc/_channel.py:426: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.NOT_FOUND
	details = "table "projects/anta.../antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>\" not found"}"
>

    def _next(self):
        with self._state.condition:
            if self._state.code is None:
                event_handler = _event_handler(self._state,
                                               self._response_deserializer)
                self._state.due.add(cygrpc.OperationType.receive_message)
                operating = self._call.operate(
                    (cygrpc.ReceiveMessageOperation(_EMPTY_FLAGS),),
                    event_handler)
                if not operating:
                    self._state.due.remove(cygrpc.OperationType.receive_message)
            elif self._state.code is grpc.StatusCode.OK:
                raise StopIteration()
            else:
                raise self
    
            def _response_ready():
                return (self._state.response is not None or
                        (cygrpc.OperationType.receive_message
                         not in self._state.due and
                         self._state.code is not None))
    
            _common.wait(self._state.condition.wait, _response_ready)
            if self._state.response is not None:
                response = self._state.response
                self._state.response = None
                return response
            elif cygrpc.OperationType.receive_message not in self._state.due:
                if self._state.code is grpc.StatusCode.OK:
                    raise StopIteration()
                elif self._state.code is not None:
>                   raise self
E                   grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
E                   	status = StatusCode.NOT_FOUND
E                   	details = "table "projects/antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>" not found"
E                   	debug_error_string = "UNKNOWN:Error received from peer ipv4:172.19.0.2:8086 {created_time:"2023-01-06T23:21:23.382899688+00:00", grpc_status:5, grpc_message:"table \"projects/antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>\" not found"}"
E                   >

/usr/local/lib/python3.9/site-packages/grpc/_channel.py:826: _MultiThreadedRendezvous

The above exception was the direct cause of the following exception:

self = <test.integration.test_adapters.test_bigtable.test_catalog_repository.TestBigtableCatalogObjectRepository object at 0x7f423a35c640>
int_bt_catalog_object_repo = <antares.adapters.repository.bigtable.catalog.BigtableCatalogObjectRepository object at 0x7f42710b91f0>

    def test_can_add_catalog_object(self, int_bt_catalog_object_repo):
        # create_bigtable_table(
        #     *COMMON_CATALOG_TABLE_DESCRIPTIONS,
        #     instance=int_bt_catalog_object_repo.instance
        # )
        repo = BigtableCatalogObjectRepository(
            int_bt_catalog_object_repo.instance,
            int_bt_catalog_object_repo.lookup_table,
            int_bt_catalog_object_repo.lookup_table_cf,
            COMMON_CATALOG_TABLE_DESCRIPTIONS,
            int_bt_catalog_object_repo.catalog_table_cf,
        )
        catalog_object = build_catalog_object(
            id="1",
            catalog_id="1",
            location=SkyCoord("0d 0d"),
            radius=Angle("1s"),
            properties={
                "ra_deg": 100,
                "dec_deg": 40,
                "id": 1,
                "name": "Catalog Object 001",
            },
        )
    
>       repo.add(catalog_object)

test_catalog_repository.py:102: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../../../antares/adapters/repository/bigtable/catalog.py:155: in add
    self.insert_htm_ids_into_catalog_lookup_table(htm_ids, value)
../../../../antares/adapters/repository/bigtable/catalog.py:175: in insert_htm_ids_into_catalog_lookup_table
    self.insert_catalog_lookup_table(htm_level, value)
../../../../antares/adapters/repository/bigtable/catalog.py:179: in insert_catalog_lookup_table
    row = self.create_catalog_lookup_row(htm_level, value)
../../../../antares/adapters/repository/bigtable/catalog.py:201: in create_catalog_lookup_row
    row.commit()
/usr/local/lib/python3.9/site-packages/google/cloud/bigtable/row.py:473: in commit
    response = self._table.mutate_rows([self])
/usr/local/lib/python3.9/site-packages/google/cloud/bigtable/table.py:729: in mutate_rows
    return retryable_mutate_rows(retry=retry)
/usr/local/lib/python3.9/site-packages/google/cloud/bigtable/table.py:1087: in __call__
    mutate_rows()
/usr/local/lib/python3.9/site-packages/google/api_core/retry.py:283: in retry_wrapped_func
    return retry_target(
/usr/local/lib/python3.9/site-packages/google/api_core/retry.py:190: in retry_target
    return target()
/usr/local/lib/python3.9/site-packages/google/cloud/bigtable/table.py:1136: in _do_mutate_retryable_rows
    responses = data_client.mutate_rows(
/usr/local/lib/python3.9/site-packages/google/cloud/bigtable_v2/services/bigtable/client.py:890: in mutate_rows
    response = rpc(
/usr/local/lib/python3.9/site-packages/google/api_core/gapic_v1/method.py:154: in __call__
    return wrapped_func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (table_name: "projects/antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710...et_cell {
      family_name: "C"
      column_qualifier: "d"
      timestamp_micros: -1
      value: "."
    }
  }
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'table_name=projects/antares_dev/instances/antares_dev/tables/%3Cgoogle.cloud....ble.Table+object+at+0x7f42710b9520%3E'), ('x-goog-api-client', 'gl-python/3.9.9 grpc/1.50.0 gax/2.10.2 gapic/2.11.3')]}
result = <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.NOT_FOUND
	details = "table "projects/anta.../antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>\" not found"}"
>
prefetch_first = True

    @functools.wraps(callable_)
    def error_remapped_callable(*args, **kwargs):
        try:
            result = callable_(*args, **kwargs)
            # Auto-fetching the first result causes PubSub client's streaming pull
            # to hang when re-opening the stream, thus we need examine the hacky
            # hidden flag to see if pre-fetching is disabled.
            # https://github.com/googleapis/python-pubsub/issues/93#issuecomment-630762257
            prefetch_first = getattr(callable_, "_prefetch_first_result_", True)
            return _StreamingResponseIterator(
                result, prefetch_first_result=prefetch_first
            )
        except grpc.RpcError as exc:
>           raise exceptions.from_grpc_error(exc) from exc
E           google.api_core.exceptions.NotFound: 404 table "projects/antares_dev/instances/antares_dev/tables/<google.cloud.bigtable.table.Table object at 0x7f42710b9520>" not found

/usr/local/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:166: NotFound

acscott commented Jan 17, 2023

Thank you @parthea and @Mariatta. The problem was on my side; my apologies. There was code creating another Bigtable instance and creating the table in that separate instance, so you can close this.
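
For anyone hitting the same NOT_FOUND against the emulator: a minimal sketch of the write path that works once the client, instance, and table ID all point at where the table was actually created (assuming the emulator host, project/instance, table, and column family from this report):

    import os
    from google.cloud import bigtable

    # Point the client at the same emulator the table was created in with cbt.
    os.environ["BIGTABLE_EMULATOR_HOST"] = "localhost:8086"

    client = bigtable.Client(project="antares_dev")
    instance = client.instance("antares_dev")

    # table() takes the table ID as a plain string; the resulting name must be
    # projects/antares_dev/instances/antares_dev/tables/catalog_lut.
    table = instance.table("catalog_lut")

    row = table.direct_row(b"test")
    row.set_cell("C", b"d", b".")
    row.commit()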

acscott closed this as completed Jan 17, 2023