
🎉 New Source: SAP Fieldglass [low-code cdk] #18656

Merged
merged 19 commits into from
Nov 11, 2022

Conversation

Sid-Lais
Contributor

@Sid-Lais Sid-Lais commented Oct 30, 2022

What

Added a source connector for SAP Fieldglass. Issue

How

Using the low-code CDK

Recommended reading order

  1. docs/integrations/sources/sap-fieldglass.md
  2. spec.yaml
  3. sap.yaml
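For context, a low-code CDK source is defined declaratively in YAML rather than in Python. The sketch below shows the general shape of such a manifest under the 2022-era low-code CDK syntax; it is illustrative only (the base URL, path, header, and stream layout are assumptions, not the contents of the actual sap.yaml):

```yaml
version: "0.1.0"

definitions:
  requester:
    # Hypothetical base URL; the real connector targets the SAP Fieldglass API.
    url_base: "https://sandbox.api.sap.com/fieldglass"
    http_method: "GET"
    request_options_provider:
      request_headers:
        # API key is read from the user-supplied config (see spec.yaml).
        X-ApplicationKey: "{{ config['api_key'] }}"
  data_stream:
    $options:
      name: "data"
      path: "/connectors"
    retriever:
      requester:
        $ref: "*ref(definitions.requester)"
      record_selector:
        extractor:
          # Empty pointer: records are taken from the response root.
          field_pointer: []

streams:
  - "*ref(definitions.data_stream)"

check:
  stream_names: ["data"]
```

The `check` block tells the connector which stream to read during a connection test.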

🚨 User Impact 🚨

No changes to the base code

Pre-merge Checklist

Expand the relevant checklist and delete the others.

New Connector

Community member or Airbyter

  • Community member? Grant edit access to maintainers (instructions)
  • Secrets in the connector's spec are annotated with airbyte_secret
  • Unit & integration tests added and passing. Community members, please provide proof of success locally e.g: screenshot or copy-paste unit, integration, and acceptance test output. To run acceptance tests for a Python connector, follow instructions in the README. For java connectors run ./gradlew :airbyte-integrations:connectors:<name>:integrationTest.
  • Code reviews completed
  • Documentation updated
    • Connector's README.md
    • Connector's bootstrap.md. See description and examples
    • docs/integrations/<source or destination>/<name>.md including changelog. See changelog example
    • docs/integrations/README.md
    • airbyte-integrations/builds.md
  • PR name follows PR naming conventions

Airbyter

If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.

  • Create a non-forked branch based on this PR and test the below items on it
  • Build is successful
  • If new credentials are required for use in CI, add them to GSM. Instructions.
  • /test connector=connectors/<name> command is passing
  • New Connector version released on Dockerhub by running the /publish command described here
  • After the connector is published, connector added to connector index as described here
  • Seed specs have been re-generated by building the platform and committing the changes to the seed spec files, as described here

Tests

Acceptance image

@CLAassistant

CLAassistant commented Oct 30, 2022

CLA assistant check
All committers have signed the CLA.

@github-actions github-actions bot added the area/connectors Connector related issues label Oct 30, 2022
@github-actions github-actions bot added the area/documentation Improvements or additions to documentation label Oct 30, 2022
@Sid-Lais Sid-Lais marked this pull request as ready for review October 31, 2022 04:16
@Sid-Lais
Contributor Author

Tested the connector using the sandbox URL and it works. Do I need to switch to the production URL? Production requires login details.

Member

@marcosmarxm marcosmarxm left a comment


Hello @Sid-Lais, Marcos from Airbyte here 👋. We received more than 25 new contributions over the weekend, one of them yours 🎉 thank you so much! Our team's capacity is limited, so the review process may take longer than expected. As described in Airbyte's Hacktoberfest announcement, your contribution was submitted before November 2nd and is eligible to win the prize; the review process will validate the other requirements. I ask for your patience until someone from the team reviews it.

From the contributions I have reviewed for Hacktoberfest so far, I've noticed some common issues you can check in advance:

  • Make sure you have added connector documentation to /docs/integrations/
  • Remove the file catalog from /integration_tests
  • Edit the sample_config.json inside /integration_tests
  • For the configured_catalog you can use only json_schema: {}
  • Add title to all properties in the spec.yaml
  • Make sure the documentationUrl in the spec.yaml redirects to Airbyte's future connector page for this source, e.g. for the Airtable connector the documentationUrl is https://docs.airbyte.com/integrations/sources/airtable
  • Make sure every file ends with a newline at EOF (end of file).

If possible, send me a DM on Slack with the test credentials; this will make it easier for us to run the integration tests and publish your connector. If you only have production keys, make sure to create a bootstrap.md explaining how to get the keys.
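To illustrate a few of the checklist items above (a title on every property, airbyte_secret on credentials, and a documentationUrl that points at the future connector page), a spec.yaml might look like the following sketch; the api_key property name and description are assumptions for illustration, not the connector's actual spec:

```yaml
documentationUrl: https://docs.airbyte.com/integrations/sources/sap-fieldglass
connectionSpecification:
  $schema: http://json-schema.org/draft-07/schema#
  title: SAP Fieldglass Spec
  type: object
  required:
    - api_key
  additionalProperties: true
  properties:
    api_key:
      type: string
      title: API Key
      description: API key for the SAP Fieldglass account.
      # Marks the value as a secret so the UI masks it and it is never logged.
      airbyte_secret: true
```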

@marcosmarxm
Member

Tested the connector using the sandbox URL and it works. Do I need to switch to the production URL? Production requires login details.

It isn't necessary to use production keys.

@marcosmarxm marcosmarxm changed the title 🎉 New Source: SAP 🎉 New Source: SAP [low-code cdk] Oct 31, 2022
@Sid-Lais
Contributor Author

Sid-Lais commented Oct 31, 2022

Tested the connector using the sandbox URL and it works. Do I need to switch to the production URL? Production requires login details.

It isn't necessary to use production keys.

So, I've tested using the sandbox URL and it works. Testing gives 2 errors for now. I can't test against production because it needs login details, I guess; it gives a connection error.

―――――――――――――――――――――――――――――――――――――――― TestDiscovery.test_primary_keys_exist_in_schema[inputs0] ―――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_core.TestDiscovery object at 0x7fed07ed4490>
discovered_catalog = {'data': AirbyteStream(name='data', json_schema={'Data': {'type': 'array', 'items': {'type': 'object', 'properties': {...source_defined_cursor=None, default_cursor_field=None, source_defined_primary_key=[['Job Seeker ID']], namespace=None)}

    def test_primary_keys_exist_in_schema(self, discovered_catalog: Mapping[str, Any]):
        """Check that all primary keys are present in catalog."""
        for stream_name, stream in discovered_catalog.items():
            for pk in stream.source_defined_primary_key or []:
                schema = stream.json_schema
                pk_path = "/properties/".join(pk)
>               pk_field_location = dpath.util.search(schema["properties"], pk_path)
E               KeyError: 'properties'

../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:288: KeyError
―――――――――――――――――――――――――――――――――――――――――――――――――――― TestBasicRead.test_read[inputs0] ―――――――――――――――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_core.TestBasicRead object at 0x7fed07c3f1f0>, connector_config = SecretDict(******)
configured_catalog = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='data', json_schema={'Data': {'typ...l_refresh'>, cursor_field=None, destination_sync_mode=<DestinationSyncMode.overwrite: 'overwrite'>, primary_key=None)])
inputs = BasicReadTestConfig(config_path='secrets/config.json', configured_catalog_path='integration_tests/configured_catalog.j...rds=None, validate_schema=True, validate_data_points=False, expect_trace_message_on_failure=True, timeout_seconds=None)
expected_records_by_stream = {}, docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7fed07c3ff70>
detailed_logger = <Logger detailed_logger acceptance_tests_logs/test_core.py__TestBasicRead__test_read[inputs0].txt (DEBUG)>

    def test_read(
        self,
        connector_config,
        configured_catalog,
        inputs: BasicReadTestConfig,
        expected_records_by_stream: MutableMapping[str, List[MutableMapping]],
        docker_runner: ConnectorRunner,
        detailed_logger,
    ):
        output = docker_runner.call_read(connector_config, configured_catalog)
        records = [message.record for message in filter_output(output, Type.RECORD)]
    
        assert records, "At least one record should be read using provided catalog"
    
        if inputs.validate_schema:
            self._validate_schema(records=records, configured_catalog=configured_catalog)
    
        self._validate_empty_streams(records=records, configured_catalog=configured_catalog, allowed_empty_streams=inputs.empty_streams)
        for pks, record in primary_keys_for_records(streams=configured_catalog.streams, records=records):
            for pk_path, pk_value in pks.items():
>               assert (
                    pk_value is not None
                ), f"Primary key subkeys {repr(pk_path)} have null values or not present in {record.stream} stream records."
E               AssertionError: Primary key subkeys ('Job Seeker ID',) have null values or not present in data stream records.
E               assert None is not None

../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:485: AssertionError

The second error says the primary key is null. I've used the primary key given in the documentation, but the sandbox URL does return some records with null primary keys.
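The two failures above share a root cause in the schema shape. The snippet below is a minimal reproduction with hypothetical schema fragments (not the connector's actual files): the discovered schema nests everything under a top-level "Data" key, so there is no root-level "properties" entry, which is exactly what the acceptance test looks up.

```python
# Hypothetical "broken" shape: fields hidden under a top-level "Data" wrapper,
# so schema["properties"] raises KeyError in the acceptance test.
broken_schema = {"Data": {"type": "array", "items": {"type": "object"}}}

# The shape the acceptance test expects: record fields listed directly under
# "properties", with the primary key present among them.
fixed_schema = {
    "type": "object",
    "properties": {"Job Seeker ID": {"type": "string"}},
}


def pk_in_schema(schema: dict, pk: str = "Job Seeker ID") -> bool:
    """Return True if the (single-field) primary key exists under 'properties'."""
    return pk in schema.get("properties", {})
```

Under these assumptions, `pk_in_schema(broken_schema)` is False while `pk_in_schema(fixed_schema)` is True, mirroring the KeyError in TestDiscovery above.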

@marcosmarxm
Member

The second error says the primary key is null. I've used the primary key given in the documentation, but the sandbox URL does return some records with null primary keys.

@girarda is it allowed to use nullable primary keys?

@marcosmarxm
Member

@Sid-Lais could you share the test account credentials with me?

@Sid-Lais
Contributor Author

Sid-Lais commented Nov 2, 2022

@Sid-Lais could you share the test account credentials with me?

Shared the credentials on Slack.

@marcosmarxm
Member

@Sid-Lais I'm having issues running ./gradlew airbyte-integrations:connectors:source-sap:integrationTest; can you share the output of this command?

@Sid-Lais
Contributor Author

Sid-Lais commented Nov 3, 2022

@Sid-Lais I'm having issues running ./gradlew airbyte-integrations:connectors:source-sap:integrationTest; can you share the output of this command?

The build completed with 3 failures. It says an old Python is used even though I have Python 3.10:

Building all of Airbyte.
/home/sid/projects/airbyte/airbyte-integrations/connectors
Type-safe dependency accessors is an incubating feature.

> Configure project :
configuring docker task for airbyte-bootloader
configuring docker task for airbyte-container-orchestrator
configuring docker task for airbyte-cron
configuring docker task for airbyte-proxy
configuring docker task for airbyte-server
configuring docker task for airbyte-temporal
configuring docker task for airbyte-webapp
configuring docker task for airbyte-workers
configuring docker task for init
configuring docker task for db-lib
configuring docker task for reporter

> Task :airbyte-integrations:bases:base-normalization:checkPython FAILED
> Task :airbyte-integrations:bases:source-acceptance-test:checkPython FAILED
> Task :airbyte-commons:compileJava FAILED

> Task :airbyte-integrations:connectors:source-sap:checkPython
Using python 3.10.8 from /home/sid/projects/airbyte/airbyte-integrations/connectors/source-sap/.venv (.venv/bin/python)
Using pip 22.3 from /home/sid/projects/airbyte/airbyte-integrations/connectors/source-sap/.venv/lib/python3.10/site-packages/pip (python 3.10)

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':airbyte-integrations:bases:base-normalization:checkPython'.
> Python (/usr) version 3.8.10 does not match minimal required version: 3.9

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':airbyte-integrations:bases:source-acceptance-test:checkPython'.
> Python (/usr) version 3.8.10 does not match minimal required version: 3.9

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':airbyte-commons:compileJava'.
> error: invalid source release: 17

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7s
14 actionable tasks: 4 executed, 10 up-to-date
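A note on the failure above: based on the error message, Gradle's checkPython task appears to inspect the system interpreter under /usr, not the connector's .venv, so having Python 3.10 inside the venv is not enough. This is an assumption from the log, not documented behaviour; the comparison it performs can be sketched as:

```python
import sys

def meets_minimum(version: tuple, minimum: tuple = (3, 9)) -> bool:
    """Return True if (major, minor) meets the required minimum version.

    Tuple comparison handles this correctly: (3, 8) < (3, 9) < (3, 10).
    """
    return tuple(version[:2]) >= minimum

# The system interpreter would be checked roughly like this:
system_ok = meets_minimum(sys.version_info)
```

Under this reading, the venv's 3.10.8 passes for source-sap, while the system's 3.8.10 fails the base-normalization and source-acceptance-test tasks.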

@girarda
Contributor

girarda commented Nov 4, 2022

@marcosmarxm @Sid-Lais - it's not possible to have a nullable primary key, but it's possible to not set a primary key

@Sid-Lais
Contributor Author

Sid-Lais commented Nov 4, 2022

@marcosmarxm @Sid-Lais - it's not possible to have a nullable primary key, but it's possible to not set a primary key

It doesn't let me not set a primary key

―――――――――――――――――――――――――――――――――――――――― TestDiscovery.test_primary_keys_exist_in_schema[inputs0] ―――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_core.TestDiscovery object at 0x7f9b31554ac0>
discovered_catalog = {'data': AirbyteStream(name='data', json_schema={'Data': {'type': 'array', 'items': {'type': 'object', 'properties': {...refresh'>], source_defined_cursor=None, default_cursor_field=None, source_defined_primary_key=[[' ']], namespace=None)}

    def test_primary_keys_exist_in_schema(self, discovered_catalog: Mapping[str, Any]):
        """Check that all primary keys are present in catalog."""
        for stream_name, stream in discovered_catalog.items():
            for pk in stream.source_defined_primary_key or []:
                schema = stream.json_schema
                pk_path = "/properties/".join(pk)
>               pk_field_location = dpath.util.search(schema["properties"], pk_path)
E               KeyError: 'properties'

../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:288: KeyError

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_primary_keys_exist_in_schema[inputs0] ⨯77% ███████▊  
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_streams_has_sync_modes[inputs0] ✓81% ████████▏ 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_additional_properties_is_true[inputs0] ✓85% ████████▌ Pulling docker image airbyte/source-sap:latest
{"type": "LOG", "log": {"level": "WARN", "message": "\n We did not find the airbyte/source-sap:latest image for this connector. This probably means this version has not yet been published to an accessible docker registry like DockerHub."}}

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_backward_compatibility[inputs0] s88% ████████▉ 

―――――――――――――――――――――――――――――――――――――――――――――――――――― TestBasicRead.test_read[inputs0] ―――――――――――――――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_core.TestBasicRead object at 0x7f9b31462dd0>, connector_config = SecretDict(******)
configured_catalog = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='data', json_schema={'Data': {'typ...l_refresh'>, cursor_field=None, destination_sync_mode=<DestinationSyncMode.overwrite: 'overwrite'>, primary_key=None)])
inputs = BasicReadTestConfig(config_path='secrets/config.json', configured_catalog_path='integration_tests/configured_catalog.j...rds=None, validate_schema=True, validate_data_points=False, expect_trace_message_on_failure=True, timeout_seconds=None)
expected_records_by_stream = {}, docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7f9b312b4820>
detailed_logger = <Logger detailed_logger acceptance_tests_logs/test_core.py__TestBasicRead__test_read[inputs0].txt (DEBUG)>

    def test_read(
        self,
        connector_config,
        configured_catalog,
        inputs: BasicReadTestConfig,
        expected_records_by_stream: MutableMapping[str, List[MutableMapping]],
        docker_runner: ConnectorRunner,
        detailed_logger,
    ):
        output = docker_runner.call_read(connector_config, configured_catalog)
        records = [message.record for message in filter_output(output, Type.RECORD)]
    
        assert records, "At least one record should be read using provided catalog"
    
        if inputs.validate_schema:
            self._validate_schema(records=records, configured_catalog=configured_catalog)
    
        self._validate_empty_streams(records=records, configured_catalog=configured_catalog, allowed_empty_streams=inputs.empty_streams)
        for pks, record in primary_keys_for_records(streams=configured_catalog.streams, records=records):
            for pk_path, pk_value in pks.items():
>               assert (
                    pk_value is not None
                ), f"Primary key subkeys {repr(pk_path)} have null values or not present in {record.stream} stream records."
E               AssertionError: Primary key subkeys (' ',) have null values or not present in data stream records.
E               assert None is not None

../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:485: AssertionError

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead.test_read[inputs0] ⨯92% █████████▎{"type": "LOG", "log": {"level": "ERROR", "message": "Docker container failed, code 1, error:\n{\"type\": \"TRACE\", \"trace\": {\"type\": \"ERROR\", \"emitted_at\": 1667523563970.8928, \"error\": {\"message\": \"Something went wrong in the connector. See the logs for more details.\", \"internal_message\": \"2 validation errors for ConfiguredAirbyteCatalog\\nstreams -> 0 -> sync_mode\\n  value is not a valid enumeration member; permitted: 'full_refresh', 'incremental' (type=type_error.enum; enum_values=[<SyncMode.full_refresh: 'full_refresh'>, <SyncMode.incremental: 'incremental'>])\\nstreams -> 0 -> destination_sync_mode\\n  value is not a valid enumeration member; permitted: 'append', 'overwrite', 'append_dedup' (type=type_error.enum; enum_values=[<DestinationSyncMode.append: 'append'>, <DestinationSyncMode.overwrite: 'overwrite'>, <DestinationSyncMode.append_dedup: 'append_dedup'>])\", \"stack_trace\": \"Traceback (most recent call last):\\n  File \\\"/airbyte/integration_code/main.py\\\", line 13, in <module>\\n    launch(source, sys.argv[1:])\\n  File \\\"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\\\", line 131, in launch\\n    for message in source_entrypoint.run(parsed_args):\\n  File \\\"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\\\", line 119, in run\\n    config_catalog = self.source.read_catalog(parsed_args.catalog)\\n  File \\\"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/source.py\\\", line 90, in read_catalog\\n    return ConfiguredAirbyteCatalog.parse_obj(self.read_config(catalog_path))\\n  File \\\"pydantic/main.py\\\", line 521, in pydantic.main.BaseModel.parse_obj\\n  File \\\"pydantic/main.py\\\", line 341, in pydantic.main.BaseModel.__init__\\npydantic.error_wrappers.ValidationError: 2 validation errors for ConfiguredAirbyteCatalog\\nstreams -> 0 -> 
sync_mode\\n  value is not a valid enumeration member; permitted: 'full_refresh', 'incremental' (type=type_error.enum; enum_values=[<SyncMode.full_refresh: 'full_refresh'>, <SyncMode.incremental: 'incremental'>])\\nstreams -> 0 -> destination_sync_mode\\n  value is not a valid enumeration member; permitted: 'append', 'overwrite', 'append_dedup' (type=type_error.enum; enum_values=[<DestinationSyncMode.append: 'append'>, <DestinationSyncMode.overwrite: 'overwrite'>, <DestinationSyncMode.append_dedup: 'append_dedup'>])\\n\", \"failure_type\": \"system_error\"}}}\n"}}

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead.test_airbyte_trace_message_on_failure[inputs0] ✓96% █████████▋
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py::TestFullRefresh.test_sequential_reads[inputs0] ✓100% ██████████
{"type": "LOG", "log": {"level": "INFO", "message": "/home/sid/projects/airbyte/airbyte-integrations/connectors/source-sap - SAT run - 520f662ba4a4df483f8f0e119941ac2d4ed84a24 - FAILED"}}

============================================================ warnings summary =============================================================
../../bases/source-acceptance-test/source_acceptance_test/config.py:246: 26 warnings
airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py: 1 warning
  /home/sid/projects/airbyte/airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/config.py:246: DeprecationWarning: The 'warn' function is deprecated, use 'warning' instead
    logging.warn("The acceptance-test-config.yml file is in a legacy format. Please migrate to the latest format.")

airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py: 22 warnings
airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py: 1 warning
  /home/sid/projects/airbyte/airbyte-integrations/connectors/source-sap/.venv/lib/python3.10/site-packages/docker/utils/utils.py:52: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    s1 = StrictVersion(v1)

airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py: 22 warnings
airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py: 1 warning
  /home/sid/projects/airbyte/airbyte-integrations/connectors/source-sap/.venv/lib/python3.10/site-packages/docker/utils/utils.py:53: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    s2 = StrictVersion(v2)

-- Docs: https://docs.pytest.org/en/stable/warnings.html
========================================================= short test summary info =========================================================
SKIPPED [1] ../../bases/source-acceptance-test/source_acceptance_test/plugin.py:63: Skipping TestIncremental.test_two_sequential_reads: not found in the config.
SKIPPED [1] ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:51: The previous connector image could not be retrieved.
SKIPPED [1] ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:229: The previous connector image could not be retrieved.
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery::test_primary_keys_exist_in_schema[inputs0]
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead::test_read[inputs0] - AssertionError:...

Results (72.67s):
      22 passed
       2 failed
         - airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:282 TestDiscovery.test_primary_keys_exist_in_schema[inputs0]
         - airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:465 TestBasicRead.test_read[inputs0]
       2 skipped
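Separately from the primary-key failures, the ValidationError buried in the log above ("value is not a valid enumeration member") comes from the configured catalog file itself: sync_mode must be one of full_refresh/incremental and destination_sync_mode one of append/overwrite/append_dedup. A minimal valid integration_tests/configured_catalog.json for this stream, following the standard Airbyte protocol shape and the earlier tip to use an empty json_schema, would look roughly like:

```json
{
  "streams": [
    {
      "stream": {
        "name": "data",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh"]
      },
      "sync_mode": "full_refresh",
      "destination_sync_mode": "overwrite"
    }
  ]
}
```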

@marcosmarxm
Member

Hello! I'm going to be out of the office this Friday and won't be able to review your contribution again today; I return next Monday. So far, most contributions look solid and are almost ready to be approved. As said in Chris' comment, all contributions made before November 2nd are eligible to receive the prize, and there are 2 weeks to merge the contributions. I assure you that next week we're going to have your contribution merged. If you have questions about the implementation you can send them in #hacktoberfest-2022 in Airbyte's Slack.

Sorry for the inconvenience, and see you again next week. Thank you so much for your contribution!

@Sid-Lais
Contributor Author

Sid-Lais commented Nov 4, 2022

Hello! I'm going to be out of the office this Friday and won't be able to review your contribution again today; I return next Monday. So far, most contributions look solid and are almost ready to be approved. As said in Chris' comment, all contributions made before November 2nd are eligible to receive the prize, and there are 2 weeks to merge the contributions. I assure you that next week we're going to have your contribution merged. If you have questions about the implementation you can send them in #hacktoberfest-2022 in Airbyte's Slack.

Sorry for the inconvenience, and see you again next week. Thank you so much for your contribution!

No worries. Enjoy your weekend. (⁠◍⁠•⁠ᴗ⁠•⁠◍⁠)

Member

@marcosmarxm marcosmarxm left a comment


@Sid-Lais I think this source's name should change to SAP Fieldglass to make it clear that it covers only one specific module of SAP.

Member

@marcosmarxm marcosmarxm left a comment


A few comments: the error you pointed out is related to your schema.

@Sid-Lais
Contributor Author

Sid-Lais commented Nov 8, 2022

@Sid-Lais I think this source's name should change to SAP Fieldglass to make it clear that it covers only one specific module of SAP.

So all the files also have to be changed from sap to sap_fieldglass?

@marcosmarxm
Member

@Sid-Lais I think this source's name should change to SAP Fieldglass to make it clear that it covers only one specific module of SAP.

So all the files also have to be changed from sap to sap_fieldglass?

Yes.

@Sid-Lais Sid-Lais changed the title 🎉 New Source: SAP [low-code cdk] 🎉 New Source: SAP-FIELDGlASS [low-code cdk] Nov 9, 2022
@Sid-Lais
Contributor Author

Sid-Lais commented Nov 9, 2022

@marcosmarxm I rebuilt the whole connector from the ground up, and it turns out I had used an old generator. Everything works now and all tests pass. Test results have been added to the PR in the acceptance section.

@Sid-Lais
Contributor Author

@Sid-Lais small comment about an unrelated change, and then it is ready to publish.

Done

@marcosmarxm
Member

marcosmarxm commented Nov 10, 2022

/test connector=connectors/source-sap-fieldglass

🕑 connectors/source-sap-fieldglass https://github.com/airbytehq/airbyte/actions/runs/3441285516
✅ connectors/source-sap-fieldglass https://github.com/airbytehq/airbyte/actions/runs/3441285516
Python tests coverage:

	 Name                                                 Stmts   Miss  Cover   Missing
	 ----------------------------------------------------------------------------------
	 source_acceptance_test/base.py                          12      4    67%   16-19
	 source_acceptance_test/config.py                       133      3    98%   87, 93, 230
	 source_acceptance_test/conftest.py                     196     97    51%   35, 41-43, 48, 54, 60, 66, 72-74, 80-95, 100, 105-107, 113-115, 121-122, 127-128, 133, 139, 148-157, 163-168, 232, 238, 244-250, 258-263, 271-284, 289-295, 302-313, 320-336
	 source_acceptance_test/plugin.py                        69     25    64%   22-23, 31, 36, 120-140, 144-148
	 source_acceptance_test/tests/test_core.py              329    106    68%   39, 50-58, 63-70, 74-75, 79-80, 164, 202-219, 228-236, 240-245, 251, 284-289, 327-334, 377-379, 382, 447-455, 484-485, 491, 494, 530-540, 553-578
	 source_acceptance_test/tests/test_incremental.py       145     20    86%   21-23, 29-31, 36-43, 48-61, 224
	 source_acceptance_test/utils/asserts.py                 37      2    95%   57-58
	 source_acceptance_test/utils/common.py                  77     10    87%   15-16, 24-30, 64, 67
	 source_acceptance_test/utils/compare.py                 62     23    63%   21-51, 68, 97-99
	 source_acceptance_test/utils/config_migration.py        23     23     0%   5-37
	 source_acceptance_test/utils/connector_runner.py       112     50    55%   23-26, 32, 36, 39-68, 71-73, 76-78, 81-83, 86-88, 91-93, 96-114, 148-150
	 source_acceptance_test/utils/json_schema_helper.py     105     13    88%   30-31, 38, 41, 65-68, 96, 120, 190-192
	 ----------------------------------------------------------------------------------
	 TOTAL                                                 1479    376    75%

Build Passed

Test summary info:

=========================== short test summary info ============================
SKIPPED [1] ../usr/local/lib/python3.9/site-packages/source_acceptance_test/plugin.py:63: Skipping TestIncremental.test_two_sequential_reads: This connector does not implement incremental sync
SKIPPED [1] ../usr/local/lib/python3.9/site-packages/source_acceptance_test/tests/test_core.py:51: The previous connector image could not be retrieved.
SKIPPED [1] ../usr/local/lib/python3.9/site-packages/source_acceptance_test/tests/test_core.py:229: The previous connector image could not be retrieved.
======================== 24 passed, 3 skipped in 29.55s ========================

@marcosmarxm
Member

marcosmarxm commented Nov 11, 2022

/publish connector=connectors/source-sap-fieldglass

🕑 Publishing the following connectors:
connectors/source-sap-fieldglass
https://github.com/airbytehq/airbyte/actions/runs/3441583629


Connector Did it publish? Were definitions generated?
connectors/source-sap-fieldglass

If you have connectors that published successfully but failed definition generation, follow step 4 here ▶️

@marcosmarxm marcosmarxm changed the title 🎉 New Source: SAP-FIELDGLASS [low-code cdk] 🎉 New Source: SAP Fieldglass [low-code cdk] Nov 11, 2022
Member

@marcosmarxm marcosmarxm left a comment


Thanks @Sid-Lais

@marcosmarxm marcosmarxm merged commit 9eee819 into airbytehq:master Nov 11, 2022
akashkulk pushed a commit that referenced this pull request Dec 2, 2022
* Generated boilerplate

* Connector woring in sandbox

* Connector working with 1 record from stream

* Sample state

* Added docs

* Revert "Added docs"

This reverts commit 413dedff6d8eced2c6c07bf3aa849eb6dce5b30a.

* Added docs finally

* Updated config

* Fixed data.json and cleaned some files

* Changed name from sap to sap-fieldglass and fixed test errors

* Re-added Primary key for future use

* Removed primary key as test failed

* Update settings.json

Fixed formating

* add sap fieldglass to source def

* auto-bump connector version

* remove oss_catalog.json

* revert change to linting file

* run format

Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
Co-authored-by: Marcos Marx <marcosmarxm@users.noreply.github.com>