Commit 8a5536c

[low-code connectors] Rename decoded_response reference to response (#14877)

* checkout files from test branch

* read_incremental works

* reset to master

* remove dead code

* comment

* fix

* Add test

* comments

* utc

* format

* small fix

* Add test with rfc3339

* remove unused param

* fix test

* configurable state checkpointing

* update test

* start working on retrier

* retry predicate

* return response status

* look in error message

* cleanup test

* constant backoff strategy

* chain backoff strategy

* chain retrier

* Add to class types registry

* extract backoff time from header

* wait until

* update

* split file

* parse_records

* classmethod

* delete dead code

* comment

* comment

* comments

* fix

* test for instantiating chain retrier

* fix parsing

* cleanup

* fix

* reset

* never raise on http error

* remove print

* comment

* comment

* comment

* comment

* remove prints

* add declarative stream to registry

* start working on limit paginator

* support for offset pagination

* tests

* move limit value

* extract request option

* boilerplate

* page increment

* delete offset paginator

* update conditional paginator

* refactor and fix test

* fix test

* small fix

* Delete dead code

* Add docstrings

* quick fix

* exponential backoff

* fix test

* fix

* delete unused properties

* fix

* missing unit tests

* uppercase

* docstrings

* rename to success

* compare full request instead of just url

* rename module

* rename test file

* rename interface

* rename default retrier

* rename to compositeerrorhandler

* fix missing renames

* move action to filter

* str -> minmaxdatetime

* small fixes

* plural

* add example

* handle header variations

* also fix wait time from

* allow using a regex to extract the value

* group()

* docstring

* add docs

* update comment

* docstrings

* fix tests

* rename param

* cleanup stop_condition

* cleanup

* Add examples

* interpolated pagination strategy

* don't need duplicate class

* docstrings

* more docstrings

* docstrings

* update comment

* Update airbyte-cdk/python/airbyte_cdk/sources/declarative/requesters/http_requester.py

Co-authored-by: Sherif A. Nada <snadalive@gmail.com>

* version: Update Parquet library to latest release (#14502)

The upstream Parquet library that is currently pinned for use in the S3 destination plugin is over a year old. The current version generates invalid schemas for date-time with time-zone fields, which appears to be addressed in the `1.12.3` release of the library in commit apache/parquet-java@c72862b

* merge

* 🎉 Source Github: improve schema for stream `pull_request_commits` added "null" (#14613)

Signed-off-by: Sergey Chvalyuk <grubberr@gmail.com>

* Docs: Fixed broken links (#14622)

* fixing broken links

* more broken links

* source-hubspot: change mention of Mailchimp to HubSpot in the doc (#14620)

* Helm Chart: Add external temporal option (#14597)

* conflict env configmap and chart lock

* reverting lock

* add eof lines and documentation on values yaml

* conflict json file

* rollback json

* solve conflict

* correct minio with new version

Co-authored-by: Guy Feldman <gfeldman@86labs.com>

* 🎉 Add YAML format to source-file reader (#14588)

* Add yaml reader

* Update docs

* Bump version of connector

* bump docs

* Update pyarrow dependency

* Upgrade pandas dependency

* auto-bump connector version

Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>

* 🎉 Source Okta: add GroupMembers stream (#14380)

* add Group_Members stream to okta source

- Group_Members returns a list of users, with the same schema as the Users stream.
- Create a shared users schema, and have both the group_members and users schemas reference it.
- Add Group_Members stream to source connector

* add tests and fix logs schema

- fix the test error "None is not one of enums": even though the enum type includes both string and null, the error comes from the json schema validator
https://github.com/python-jsonschema/jsonschema/blob/ddb87afad8f5d5c40600b5ede0ab96e4d4bdf7d3/jsonschema/_validators.py#L279-L285
- change group_members to use id as the cursor field since `filter` is not supported in the query string
- fix the abnormal state test on the logs stream: when `since` is abnormally large, `until` has to be defined with an equal or larger value
- remove the logs stream from the full sync test, because two full syncs always have a gap -- at least one new log entry about the users or groups API.

* last polish before submitting the PR

- bump docker version
- update changelog
- add the right abnormal value for logs stream
- correct the sample catalog

* address comments:

- improve comments for until parameter under the logs stream
- add use_cache on groupMembers

* add use_cache to Group_Members

* change configured_catalog to test

* auto-bump connector version

Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>

* split test files

* renames

* missing unit test

* add missing unit tests

* rename

* assert isinstance

* start extracting to their own files

* use final instead of classmethod

* assert we retry 429 errors

* Add log

* replace asserts with valueexceptions

* delete superfluous print statement

* fix factory so we don't need to union everything with strings

* get class_name from type

* remove from class types registry

* process error handlers one at a time

* sort

* delete print statement

* comment

* comment

* format

* delete unused file

* comment

* interpolatedboolean

* comment

* not optional

* not optional

* unit tests

* fix request body data

* add test

* move file to right module

* update

* reset to master

* format

* rename to pass_by

* rename to page size

* fix

* add test

* fix body data

* delete extra newlines

* move to subpackage

* fix imports

* handle str body data

* simplify

* fix typing

* always return a map

* rename to inject_into

* only accept enum

* delete conditional paginator

* only return body data

* rename decoded response to response

* decoded_response -> response

Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
Co-authored-by: Tobias Macey <tmacey@boundlessnotions.com>
Co-authored-by: Serhii Chvaliuk <grubberr@gmail.com>
Co-authored-by: Amruta Ranade <11484018+Amruta-Ranade@users.noreply.github.com>
Co-authored-by: Bas Beelen <bjgbeelen@gmail.com>
Co-authored-by: Marcos Marx <marcosmarxm@users.noreply.github.com>
Co-authored-by: Guy Feldman <gfeldman@86labs.com>
Co-authored-by: Christophe Duong <christophe.duong@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
Co-authored-by: Yiyang Li <yiyangli2010@gmail.com>
Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>

12 people authored and mfsiega-airbyte committed Jul 21, 2022
1 parent 11a78c1 commit 8a5536c
Showing 9 changed files with 18 additions and 20 deletions.
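
The diff below is a mechanical rename of the `decoded_response` interpolation variable to `response`: only the variable name at each call site changes, while the values handed to the templates stay what they were (`response.json()` in the response filter, `self._decoder.decode(response)` in the paginator). As a rough illustration of how the renamed variable is used, here is a minimal sketch with plain Jinja2 templates rather than the CDK's own interpolation classes, and with an invented JSON body:

from jinja2 import Template

# Invented decoded JSON body standing in for the decoded HTTP response.
decoded_body = {"code": 429, "_metadata": {"next": "https://api.example.com/v3/page/2"}}

# Response filter predicate: decides whether the filter's action (RETRY/IGNORE) applies.
predicate = Template("{{ 'code' in response }}")
print(predicate.render(response=decoded_body))     # prints: True

# Cursor pagination strategy: extracts the next-page token from the decoded body.
cursor_value = Template("{{ response._metadata.next }}")
print(cursor_value.render(response=decoded_body))  # prints: https://api.example.com/v3/page/2

Any config or test that interpolated `decoded_response` needs the same one-word update, which is exactly what the hunks below do.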

@@ -18,7 +18,7 @@ class CompositeErrorHandler(ErrorHandler):
 type: "CompositeErrorHandler"
 error_handlers:
 - response_filters:
- - predicate: "{{ 'codase' in decoded_response }}"
+ - predicate: "{{ 'codase' in response }}"
 action: RETRY
 backoff_strategies:
 - type: "ConstantBackoffStrategy"

@@ -64,7 +64,7 @@ class DefaultErrorHandler(ErrorHandler):
 `
 error_handler:
 response_filters:
- - predicate: "{{ 'code' in decoded_response }}"
+ - predicate: "{{ 'code' in response }}"
 action: IGNORE
 `

@@ -46,7 +46,7 @@ def matches(self, response: requests.Response) -> Optional[ResponseAction]:
 return None

 def _response_matches_predicate(self, response: requests.Response) -> bool:
- return self._predicate and self._predicate.eval(None, decoded_response=response.json())
+ return self._predicate and self._predicate.eval(None, response=response.json())

 def _response_contains_error_message(self, response: requests.Response) -> bool:
 if not self._error_message_contains:

@@ -22,7 +22,7 @@ class LimitPaginator(Paginator):
 Examples:
 1.
 * fetches up to 10 records at a time by setting the "limit" request param to 10
- * updates the request path with "{{ decoded_response._metadata.next }}"
+ * updates the request path with "{{ response._metadata.next }}"
 paginator:
 type: "LimitPaginator"
 limit_value: 10
@@ -33,7 +33,7 @@ class LimitPaginator(Paginator):
 option_type: path
 pagination_strategy:
 type: "CursorPagination"
- cursor_value: "{{ decoded_response._metadata.next }}"
+ cursor_value: "{{ response._metadata.next }}"
 `
 2.

@@ -43,10 +43,8 @@ def next_page_token(self, response: requests.Response, last_records: List[Mappin
 decoded_response = self._decoder.decode(response)
 headers = response.headers
 if self._stop_condition:
- should_stop = self._stop_condition.eval(
- self._config, decoded_response=decoded_response, headers=headers, last_records=last_records
- )
+ should_stop = self._stop_condition.eval(self._config, response=decoded_response, headers=headers, last_records=last_records)
 if should_stop:
 return None
- token = self._cursor_value.eval(config=self._config, last_records=last_records, decoded_response=self._decoder.decode(response))
+ token = self._cursor_value.eval(config=self._config, last_records=last_records, response=decoded_response)
 return token if token else None

@@ -86,7 +86,7 @@
 (
 "test_403_with_predicate",
 HTTPStatus.FORBIDDEN,
- HttpResponseFilter(action=ResponseAction.RETRY, predicate="{{ 'code' in decoded_response }}"),
+ HttpResponseFilter(action=ResponseAction.RETRY, predicate="{{ 'code' in response }}"),
 None,
 {},
 ResponseStatus.retry(10),
@@ -95,7 +95,7 @@
 (
 "test_403_with_predicate",
 HTTPStatus.FORBIDDEN,
- HttpResponseFilter(action=ResponseAction.RETRY, predicate="{{ 'some_absent_field' in decoded_response }}"),
+ HttpResponseFilter(action=ResponseAction.RETRY, predicate="{{ 'some_absent_field' in response }}"),
 None,
 {},
 response_status.FAIL,

@@ -17,8 +17,8 @@
 ("test_static_token", "token", None, "token"),
 ("test_token_from_config", "{{ config.config_key }}", None, "config_value"),
 ("test_token_from_last_record", "{{ last_records[-1].id }}", None, 1),
- ("test_token_from_decoded_response", "{{ decoded_response._metadata.content }}", None, "content_value"),
- ("test_token_not_found", "{{ decoded_response.invalid_key }}", None, None),
+ ("test_token_from_response", "{{ response._metadata.content }}", None, "content_value"),
+ ("test_token_not_found", "{{ response.invalid_key }}", None, None),
 ("test_static_token_with_stop_condition_false", "token", InterpolatedBoolean("{{False}}"), "token"),
 ("test_static_token_with_stop_condition_true", "token", InterpolatedBoolean("{{True}}"), None),
 ],

@@ -102,7 +102,7 @@ def test_limit_paginator(
 expected_next_page_token,
 ):
 limit_request_option = RequestOption(inject_into=RequestOptionType.request_parameter, field_name="limit")
- cursor_value = "{{ decoded_response.next }}"
+ cursor_value = "{{ response.next }}"
 url_base = "https://airbyte.io"
 config = {}
 strategy = CursorPaginationStrategy(cursor_value, stop_condition=stop_condition, decoder=JsonDecoder(), config=config)
@@ -130,7 +130,7 @@ def test_limit_paginator(
 def test_limit_cannot_be_set_in_path():
 limit_request_option = RequestOption(inject_into=RequestOptionType.path)
 page_token_request_option = RequestOption(inject_into=RequestOptionType.request_parameter, field_name="offset")
- cursor_value = "{{ decoded_response.next }}"
+ cursor_value = "{{ response.next }}"
 url_base = "https://airbyte.io"
 config = {}
 strategy = CursorPaginationStrategy(cursor_value, config)

@@ -159,7 +159,7 @@ def test_full_config():
 inject_into: path
 pagination_strategy:
 type: "CursorPagination"
- cursor_value: "{{ decoded_response._metadata.next }}"
+ cursor_value: "{{ response._metadata.next }}"
 url_base: "https://api.sendgrid.com/v3/"
 next_page_url_from_token_partial:
 class_name: "airbyte_cdk.sources.declarative.interpolation.interpolated_string.InterpolatedString"
@@ -305,7 +305,7 @@ def test_create_composite_error_handler():
 type: "CompositeErrorHandler"
 error_handlers:
 - response_filters:
- - predicate: "{{ 'code' in decoded_response }}"
+ - predicate: "{{ 'code' in response }}"
 action: RETRY
 - response_filters:
 - http_codes: [ 403 ]
@@ -316,7 +316,7 @@
 assert len(component._error_handlers) == 2
 assert isinstance(component._error_handlers[0], DefaultErrorHandler)
 assert isinstance(component._error_handlers[0]._response_filters[0], HttpResponseFilter)
- assert component._error_handlers[0]._response_filters[0]._predicate._condition == "{{ 'code' in decoded_response }}"
+ assert component._error_handlers[0]._response_filters[0]._predicate._condition == "{{ 'code' in response }}"
 assert component._error_handlers[1]._response_filters[0]._http_codes == [403]
 assert isinstance(component, CompositeErrorHandler)

@@ -342,7 +342,7 @@ def test_config_with_defaults():
 inject_into: path
 pagination_strategy:
 type: "CursorPagination"
- cursor_value: "{{ decoded_response._metadata.next }}"
+ cursor_value: "{{ response._metadata.next }}"
 requester:
 path: "/v3/marketing/lists"
 authenticator:
@@ -388,7 +388,7 @@ def test_create_limit_paginator():
 inject_into: path
 pagination_strategy:
 type: "CursorPagination"
- cursor_value: "{{ decoded_response._metadata.next }}"
+ cursor_value: "{{ response._metadata.next }}"
 """
 config = parser.parse(content)

