
fix: add ability to use single-row transactions #10021

Merged

Conversation

@mackenziestarr (Contributor) commented Dec 26, 2019

Fixes #10018 🦕

Tested manually with the following script: I'm no longer getting google.api_core.exceptions.FailedPrecondition: 400 Single-row transactions are not allowed by this app profile, and I confirmed data was written to my test table.

#!/usr/bin/env python

from google.cloud.bigtable.client import Client
from google.cloud.bigtable.row import ConditionalRow
from google.cloud.bigtable.row_filters import PassAllFilter

PROJECT_ID = "<PROJECT_ID>"
INSTANCE_ID = "<INSTANCE_ID>"
TABLE_ID = "<TABLE_ID>"
APP_PROFILE_ID = "<APP_PROFILE_ID>"

client = Client(project=PROJECT_ID)
instance = client.instance(instance_id=INSTANCE_ID)
table = instance.table(table_id=TABLE_ID, app_profile_id=APP_PROFILE_ID)

row_cond = ConditionalRow(b'test_conditional', table, PassAllFilter(True))
row_cond.set_cell('test_column_family', 'test_column_qualifier', 'test_cell_value', timestamp=None, state=False)
row_cond.commit()

row = table.row("test_append", append=True)
row.append_cell_value(u'test_column_family', b'test_column_qualifier', b'test-value')
row.commit()

@googlebot added the cla: yes label (This human has signed the Contributor License Agreement.) Dec 26, 2019
@mackenziestarr force-pushed the mstarr/fix-single-row-transactions branch from 401e048 to c164573 on December 26, 2019
@plamut added the api: bigtable label (Issues related to the Bigtable API.) Dec 26, 2019
@igorbernstein2 added the priority: p1 (Important issue which blocks shipping the next release.) and type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.) labels Dec 27, 2019
@igorbernstein2 commented:

Thanks for catching this!

google/cloud/bigtable/row.py
@@ -582,6 +582,7 @@ def commit(self):
table_name=self._table.name,
row_key=self._row_key,
predicate_filter=self._filter.to_pb(),
app_profile_id=self._table._app_profile_id,
@kolea2 commented:
Thanks for adding this! Would you be able to add a unit test for this as well?

@mackenziestarr (Contributor, Author) commented:
Hey @kolea2, see 20a4838; let me know if there is a better way to access the arguments passed to the mock.
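For anyone following along: a common pattern for accessing the arguments passed to a mock (a hedged sketch with illustrative names, not necessarily what commit 20a4838 does) is to read the mock's recorded call_args:

```python
from unittest import mock

# Hypothetical stand-in for the GAPIC data client used by the unit tests;
# the table name and app profile values here are illustrative.
data_client = mock.Mock()

# The code under test would invoke the client roughly like this:
data_client.check_and_mutate_row(
    table_name="projects/p/instances/i/tables/t",
    row_key=b"row-key",
    app_profile_id="my-app-profile",
)

# call_args records the positional and keyword arguments of the last call,
# so the newly threaded-through app_profile_id can be asserted on directly.
args, kwargs = data_client.check_and_mutate_row.call_args
assert kwargs["app_profile_id"] == "my-app-profile"
```

Using call_args (or assert_called_once_with) avoids reaching into the transport stub by hand.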

@crwilcox added the kokoro:run label (Add this label to force Kokoro to re-run the tests.) Jan 2, 2020
@yoshi-kokoro removed the kokoro:run label Jan 2, 2020
@crwilcox (Contributor) commented Jan 2, 2020

Thanks for this contribution. It seems two of the test cases are failing as well:

=================================== FAILURES ===================================
________________________ TestConditionalRow.test_commit ________________________

self = <tests.unit.test_row.TestConditionalRow testMethod=test_commit>

    def test_commit(self):
        from google.cloud.bigtable.row_filters import RowSampleFilter
        from google.cloud.bigtable_v2.gapic import bigtable_client

        project_id = "project-id"
        row_key = b"row_key"
        table_name = "projects/more-stuff"
        column_family_id1 = u"column_family_id1"
        column_family_id2 = u"column_family_id2"
        column_family_id3 = u"column_family_id3"
        column1 = b"column1"
        column2 = b"column2"

        api = bigtable_client.BigtableClient(mock.Mock())
        credentials = _make_credentials()
        client = self._make_client(
            project=project_id, credentials=credentials, admin=True
        )
        table = _Table(table_name, client=client)
        row_filter = RowSampleFilter(0.33)
        row = self._make_one(row_key, table, filter_=row_filter)

        # Create request_pb
        value1 = b"bytes-value"

        # Create response_pb
        predicate_matched = True
        response_pb = _CheckAndMutateRowResponsePB(predicate_matched=predicate_matched)

        # Patch the stub used by the API method.
        api.transport.check_and_mutate_row.side_effect = [response_pb]
        client._table_data_client = api

        # Create expected_result.
        expected_result = predicate_matched

        # Perform the method and check the result.
        row.set_cell(column_family_id1, column1, value1, state=True)
        row.delete(state=False)
        row.delete_cell(column_family_id2, column2, state=True)
        row.delete_cells(column_family_id3, row.ALL_COLUMNS, state=True)
>       result = row.commit()

tests/unit/test_row.py:446:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <google.cloud.bigtable.row.ConditionalRow object at 0x7fef09396f60>

    def commit(self):
        """Makes a ``CheckAndMutateRow`` API request.

        If no mutations have been created in the row, no request is made.

        The mutations will be applied conditionally, based on whether the
        filter matches any cells in the :class:`ConditionalRow` or not. (Each
        method which adds a mutation has a ``state`` parameter for this
        purpose.)

        Mutations are applied atomically and in order, meaning that earlier
        mutations can be masked / negated by later ones. Cells already present
        in the row are left unchanged unless explicitly changed by a mutation.

        After committing the accumulated mutations, resets the local
        mutations.

        For example:

        .. literalinclude:: snippets_table.py
            :start-after: [START bigtable_row_commit]
            :end-before: [END bigtable_row_commit]

        :rtype: bool
        :returns: Flag indicating if the filter was matched (which also
                  indicates which set of mutations were applied by the server).
        :raises: :class:`ValueError <exceptions.ValueError>` if the number of
                 mutations exceeds the :data:`MAX_MUTATIONS`.
        """
        true_mutations = self._get_mutations(state=True)
        false_mutations = self._get_mutations(state=False)
        num_true_mutations = len(true_mutations)
        num_false_mutations = len(false_mutations)
        if num_true_mutations == 0 and num_false_mutations == 0:
            return
        if num_true_mutations > MAX_MUTATIONS or num_false_mutations > MAX_MUTATIONS:
            raise ValueError(
                "Exceed the maximum allowable mutations (%d). Had %s true "
                "mutations and %d false mutations."
                % (MAX_MUTATIONS, num_true_mutations, num_false_mutations)
            )

        data_client = self._table._instance._client.table_data_client
        resp = data_client.check_and_mutate_row(
            table_name=self._table.name,
            row_key=self._row_key,
            predicate_filter=self._filter.to_pb(),
>           app_profile_id=self._table._app_profile_id,
            true_mutations=true_mutations,
            false_mutations=false_mutations,
        )
E       AttributeError: '_Table' object has no attribute '_app_profile_id'

google/cloud/bigtable/row.py:585: AttributeError
__________________________ TestAppendRow.test_commit ___________________________

self = <tests.unit.test_row.TestAppendRow testMethod=test_commit>

    def test_commit(self):
        from google.cloud._testing import _Monkey
        from google.cloud.bigtable import row as MUT
        from google.cloud.bigtable_v2.gapic import bigtable_client

        project_id = "project-id"
        row_key = b"row_key"
        table_name = "projects/more-stuff"
        column_family_id = u"column_family_id"
        column = b"column"

        api = bigtable_client.BigtableClient(mock.Mock())
        credentials = _make_credentials()
        client = self._make_client(
            project=project_id, credentials=credentials, admin=True
        )
        table = _Table(table_name, client=client)
        row = self._make_one(row_key, table)

        # Create request_pb
        value = b"bytes-value"

        # Create expected_result.
        row_responses = []
        expected_result = object()

        # Patch API calls
        client._table_data_client = api

        def mock_parse_rmw_row_response(row_response):
            row_responses.append(row_response)
            return expected_result

        # Perform the method and check the result.
        with _Monkey(MUT, _parse_rmw_row_response=mock_parse_rmw_row_response):
            row.append_cell_value(column_family_id, column, value)
>           result = row.commit()

tests/unit/test_row.py:595:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <google.cloud.bigtable.row.AppendRow object at 0x7fef0938d278>

    def commit(self):
        """Makes a ``ReadModifyWriteRow`` API request.

        This commits modifications made by :meth:`append_cell_value` and
        :meth:`increment_cell_value`. If no modifications were made, makes
        no API request and just returns ``{}``.

        Modifies a row atomically, reading the latest existing
        timestamp / value from the specified columns and writing a new value by
        appending / incrementing. The new cell created uses either the current
        server time or the highest timestamp of a cell in that column (if it
        exceeds the server time).

        After committing the accumulated mutations, resets the local mutations.

        For example:

        .. literalinclude:: snippets_table.py
            :start-after: [START bigtable_row_commit]
            :end-before: [END bigtable_row_commit]

        :rtype: dict
        :returns: The new contents of all modified cells. Returned as a
                  dictionary of column families, each of which holds a
                  dictionary of columns. Each column contains a list of cells
                  modified. Each cell is represented with a two-tuple with the
                  value (in bytes) and the timestamp for the cell.
        :raises: :class:`ValueError <exceptions.ValueError>` if the number of
                 mutations exceeds the :data:`MAX_MUTATIONS`.
        """
        num_mutations = len(self._rule_pb_list)
        if num_mutations == 0:
            return {}
        if num_mutations > MAX_MUTATIONS:
            raise ValueError(
                "%d total append mutations exceed the maximum "
                "allowable %d." % (num_mutations, MAX_MUTATIONS)
            )

        data_client = self._table._instance._client.table_data_client
        row_response = data_client.read_modify_write_row(
            table_name=self._table.name,
            row_key=self._row_key,
            rules=self._rule_pb_list,
>           app_profile_id=self._table._app_profile_id,
        )
E       AttributeError: '_Table' object has no attribute '_app_profile_id'

google/cloud/bigtable/row.py:915: AttributeError

Also, it isn't currently passing lint. If you run nox -s blacken, it will autoformat.

@mackenziestarr (Contributor, Author) commented:

Thanks @crwilcox, updating now.

@mackenziestarr force-pushed the mstarr/fix-single-row-transactions branch from 7a13fd1 to 20a4838 on January 3, 2020
@mackenziestarr changed the title from "add ability to use single-row transactions" to "fix: add ability to use single-row transactions" Jan 3, 2020
@mackenziestarr (Contributor, Author) commented:

@crwilcox I added unit tests for the changes in 20a4838. unit-2.7 and unit-3.6 pass locally for me (still waiting on Kokoro). Assuming this gets merged shortly, when can we expect it to make it into a release? I'm currently blocked on not being able to use single-row transactions.

@kolea2 added the kokoro:run label (Add this label to force Kokoro to re-run the tests.) Jan 3, 2020
@yoshi-kokoro removed the kokoro:run label Jan 3, 2020
@crwilcox (Contributor) commented Jan 3, 2020

Thanks @mackenziestarr for tidying up the tests and your contribution!

@crwilcox crwilcox merged commit 1dcb95f into googleapis:master Jan 3, 2020
@kolea2 commented Jan 3, 2020

@mackenziestarr (Contributor, Author) commented:

Thanks @kolea2 and @crwilcox for the quick turnaround on the release!

Successfully merging this pull request may close these issues.

BigTable: single-row transactions are unsupported