From 85b0bb22d27da975fce2c15de5c7a11bc684aaa4 Mon Sep 17 00:00:00 2001
From: Maksym Pavlenok
Date: Mon, 2 Aug 2021 17:20:39 +0300
Subject: [PATCH] =?UTF-8?q?=F0=9F=8E=89=20Source=20Zendesk:=20Migration=20?=
 =?UTF-8?q?from=20Singer=20to=20CDK=20(#4861)?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* init the new connector source-zendesk-support
* Finished development of the ZenDesk streams
* Source ZenDesk: finished
* Source ZenDesk: remove unused test files
* Source ZenDesk: format and validate code
* Source Zendesk: update docs
* Remove unused files
* add stream_slices logic for the ticket_comments stream
* 🎉 Python CDK: Allow setting network adapter args on outgoing HTTP requests (#4493)
* 🎉 Destination S3: support `anyOf`, `allOf` and `oneOf` (#4613)
* Support combined restrictions in json schema
* Bump s3 version
* Add more test cases
* Update changelog
* Add more test cases
* Update documentation
* Format code
* SAT: verify `AIRBYTE_ENTRYPOINT` is defined (#4478)
* save changes required for work; TODO locate all places that need to be updated to make the test work
* move new test inside test_spec
* apply suggestions
* change return type + add check env = space_joined_entrypoint
* requested
* add check entrypoint with env
* bump SAT version && changelog update
* merge && fix changelog
* changes
* add dynamic docker runner creator + test having properties
* update the names
* change names
* make fixtures
* upd text
* Update airbyte-integrations/bases/source-acceptance-test/unit_tests/test_spec_unit.py
Co-authored-by: Eugene Kulak
* requested changes
* Update airbyte-integrations/bases/source-acceptance-test/unit_tests/test_spec_unit.py
Co-authored-by: Eugene Kulak
* Update airbyte-integrations/bases/source-acceptance-test/unit_tests/test_spec_unit.py
Co-authored-by: Eugene Kulak
* apply requested changes
* change names (requested)
* move binary strings to standard with conversion in builder
* fixing merge-conflict side effect
Co-authored-by: Eugene Kulak
* Migrate Quickstart to use PokeAPI (#4615)
* Migrate Quickstart to use PokeAPI
* Words words words
Co-authored-by: Abhi Vaidyanatha
* Left isn't right (#4616)
Co-authored-by: Abhi Vaidyanatha
* Create on-oci-vm.md (#4468): deployment guide for Airbyte on Oracle Cloud Infrastructure (OCI) VM
* Update on-oci-vm.md: add the image links and upload images to the repository
* Update docs/deploying-airbyte/on-oci-vm.md (10 review-suggestion commits)
Co-authored-by: Abhi Vaidyanatha
* Update on-oci-vm.md
* Add files via upload
* Update on-oci-vm.md
* Add files via upload
* Update on-oci-vm.md
* Update on-oci-vm.md
Co-authored-by: Abhi Vaidyanatha
* 🐛 platform: Fix silent failures in sources (#4617)
* add oracle deployment guide to summary.md (#4619)
* Mailchimp fix url-base (#4621)
* minimal change to show acceptance test failure
* exact fix
* bump version and readme
* upd
* 🎉 New Source: Paypal Transaction (#4240)
* Added spec.json
* Initialization
* added oauth2 authorization
* added spec, check, discover + catalogs/configured_catalogs
* updated request_params
* added paging, slicing (1d)
* Use oauth2 for paypal
* incremental sync, acceptance test
* incremental sync, acceptance test
* updated slices and api limits, added validation for input dates
* added tests, fixed cursor related information in schemas and configured catalogs, removed old comments, re-arranged Base PaypalTransactionStream class
* added input param 'env' to support production and sandbox envs
* added support for sandbox option, updated pattern for optional end date option
* added github secrets
* added support for sandbox option, updated pattern for optional end date option
* fixed Copyright date, removed debug messages
* added docs
* fix for test failure - The sync should produce at least one STATE message
* removed optional parameter 'end_date'
* removed detailed info about balances schema
* Delete employees.json
* Delete customers.json
* Added requests_per_minute rate limit
* added unit tests, added custom backoff
* added test for stream slices with stream state
* removed comments
* updated docs pages
* fixed format for json files
* fixed types in schemas and link to the schema. fixed primary key for Transactions stream
* updated stream slices
* Updated tests, unified stream_slices for both streams, all instance variables instantiated directly in __init__ method
* added CHANGELOG.md
* Added build seeds
* fixed closing double quotation mark
* added paypal entry in builds.md
* add fixture helper
* added paypal transaction generator script
* fixed styling
* maximum allowed start_date is extracted from API response now.
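The day-sized slicing the Paypal commits describe ("added paging, slicing (1d)", "unified stream_slices for both streams") can be sketched as a generator of date-range slices. This is a minimal illustration, not the connector's actual code; the `date_slices` helper and the slice-dict keys are assumed names.

```python
from datetime import date, timedelta

def date_slices(start: date, end: date, step_days: int = 1):
    """Yield consecutive [slice_start, slice_end) windows covering start..end."""
    cursor = start
    while cursor < end:
        nxt = min(cursor + timedelta(days=step_days), end)
        yield {"start_date": cursor.isoformat(), "end_date": nxt.isoformat()}
        cursor = nxt
```

Each emitted slice becomes one bounded API request, which keeps individual responses small and lets state advance per slice.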
* fixed schemas
* fixed schemas - removed datetime
* now maximum_allowed_start_date is identified by the last_refreshed_datetime attr in the API response.
* added possibility to specify additional properties
Co-authored-by: Sherif Nada
* set db version after full import is complete (#4626)
* set db version after full import is complete
* check db version in the last step
* add comment
* Fix docs formatting
* Redirect old link to upgrading tutorial (#4635)
Co-authored-by: Abhi Vaidyanatha
* Fix broken link in SUMMARY.md
* Airflow Demo: Remove superset in down.sh (#4638)
* Remove superset in down.sh
* Clean up superset containers before creating them in up.sh
Co-authored-by: Abhi Vaidyanatha
* Airflow demo: Clean up scripts and more clearly describe actions (#4639)
* Airflow demo: Script cleanup
* Correct docker compose name for airflow file
* Final fixes
* Clean up airbyte destination
Co-authored-by: Abhi Vaidyanatha
* :tada: Add documentation for configuring Kube GCS logging. (#4622)
* Bump version: 0.27.0-alpha → 0.27.1-alpha (#4640)
* 0.27.1 Platform Patch Notes (#4644)
Co-authored-by: Abhi Vaidyanatha
* 🎉 New Source: Zendesk Sunshine (#4359)
* pre-PR
* add git config
* format
* Update airbyte-integrations/connectors/source-zendesk-sunshine/requirements.txt: remove extra requirement
Co-authored-by: Eugene Kulak
* Update airbyte-integrations/connectors/source-zendesk-sunshine/source_zendesk_sunshine/streams.py: backoff time int to float (the real return type in the headers is integer)
Co-authored-by: Eugene Kulak
* requested changes
* fix newline absence && rm unnecessary temp file
* url_base to property
* rm extra var coming property
* rm extra var coming property
* save
* finishing updating the documentation
* forgotten definition
* add nullable to pass the test
* fix date in the log
Co-authored-by: Eugene Kulak
* 0.27.1 Connector Patch Notes (#4646)
Co-authored-by: Abhi Vaidyanatha
* Update connector certification table. (#4647)
Co-authored-by: Abhi Vaidyanatha
* :bug: Stub out the GCP Env Var in Docker to prevent noisy and harmless errors. (#4642)
* Add this to prevent noisy errors.
* Add hint to Airflow guide about local example (#4656)
Co-authored-by: Abhi Vaidyanatha
* fix version for kube automatic migration support (#4649)
* format zendesk sunshine connector (#4658)
* 🎉 New source: Dixa (#4358)
* Turn on MYSQL normalization flag. (#4651)
* Turn on normalization flag. Bump versions
* Combine admin and settings (#4525)
* Add side menu component
* Add side menu to settings page. Remove admin link from sidebar
* Move NotificationPage
* Move ConfigurationPage
* Add Sources and Destinations pages to Settings. Delete Admin page
* Add MetricsPage
* Edit Notifications and Metrics pages
* Update feedback for metrics and notification pages
* Add update icons data to side menu
* Add AccountPage
* Job history purging (#4575)
* WIP: Job history purging
* Created test cases that handle variations of job history purging configuration
* Typo fix
* Expanded test cases to control for job history on multiple connections at once.
* Handle latest job with saved state correctly regardless of order of ids
* Whitespace
* Externalized sql. Cleaned up constants.
* Cleaned up test case persistence code and structure
* Whitespace and formatting per standard tooling.
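The job-history-purging rule described above (keep the most recent jobs per connection, but always preserve the latest job with saved state regardless of id ordering) can be sketched in plain Python. The record shape and function name here are hypothetical; the actual implementation lives in SQL in the platform.

```python
def jobs_to_purge(jobs, keep_latest=10):
    """jobs: dicts with id, connection_id, created_at, has_saved_state.
    Returns ids eligible for purging: everything past the newest `keep_latest`
    per connection, except the most recent job carrying saved state."""
    purged = []
    by_conn = {}
    for job in jobs:
        by_conn.setdefault(job["connection_id"], []).append(job)
    for conn_jobs in by_conn.values():
        conn_jobs.sort(key=lambda j: j["created_at"], reverse=True)
        keep = {j["id"] for j in conn_jobs[:keep_latest]}
        with_state = [j for j in conn_jobs if j["has_saved_state"]]
        if with_state:
            keep.add(with_state[0]["id"])  # newest job with state always survives
        purged.extend(j["id"] for j in conn_jobs if j["id"] not in keep)
    return purged
```

Keying on `created_at` rather than `id` is the point of the "regardless of order of ids" commit: ids are not guaranteed to be monotonic with creation time.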
* 0.27.1 Announcement Summary (#4678)
Co-authored-by: Abhi Vaidyanatha
* 🐛 Source Sendgrid: add start_time config and correct primary_key (#4682)
* add start_time config and correct primary_key
* correct integration tests
* correct type
* config txt and primary_key
* test to show how automatic migration handles deprecated definitions (#4655)
* test to show definitions not present in latest seed would be deleted in automatic migration
* format
* add deprecated config being used scenario
* Source dixa: fix unit tests (#4690)
* introduce common abstraction for CDC via debezium (#4580)
* wip
* add file
* final structure
* few more updates
* undo unwanted changes
* add abstract test + more refinement
* remove CDC metadata to debezium
* rename class + add missing property
* move debezium to bases + upgrade debezium version + review comments
* downgrade version + minor fixes
* reset to minutes
* fix build
* address review comments
* should return Optional
* use common abstraction for CDC via debezium for mysql (#4604)
* use new cdc abstraction for mysql
* undo unwanted change
* pull in latest changes
* use renamed class + move constants to MySqlSource
* bring in latest changes from cdc abstraction
* format
* bring in latest changes
* pull in latest changes
* use common abstraction for CDC via debezium for postgres (#4607)
* use cdc abstraction for postgres
* add files
* ready
* use renamed class + move constants to PostgresSource
* bring in the latest changes
* bring in latest changes
* pull in latest changes
* Source Dixa: Pin tz in ConversationExport.ms_timestamp_to_datetime (#4696)
* Source Dixa: add to connector index (#4701)
* allow injecting filters for server (#4677)
* allow injecting filters
* fmt
* upgrade postgres version for new cdc abstraction (#4702)
* Fix dependencies for Superset demo (#4705)
* Fix superset dependency location
* Add some Superset setup
Co-authored-by: Abhi Vaidyanatha
* 📚 add SSH instructions for OCI VM setup (#4684)
Co-authored-by: Sherif A. Nada
* upgrade mysql version for new cdc abstraction (#4703)
* Update with ALTER TABLE statements (#4707)
Co-authored-by: Abhi Vaidyanatha
* remove unused deps (#4512)
Co-authored-by: Davin Chia
* fix config init race condition (#4679)
* 🐛 Destination S3: fix minio output for parquet format
* Bump destination s3 version (#4718)
* Fix scheduler race condition. (#4691)
* Periodic connector tests workflow: add `Accept` header per github docs recommendation (#4722)
* allow launching integration tests from workflow dispatch (#4723)
* Bump version: 0.27.1-alpha → 0.27.2-alpha (#4724)
* 🐛 Source Square: Update _send_request method due to changes in Airbyte CDK (#4645)
* 🎉 Destination Snowflake: tag snowflake traffic with airbyte ID to enable optimizations from Snowflake (#4713)
* 🎉 New source: Typeform (#4541): Forms and Responses streams
* Upgrade postgres and redshift destination to remove basic_normalization attribute (#4725)
* upgrade snowflake,redshift,postgres to remove basic_normalization
* undo snowflake
* undo snowflaketest
* fix broken assertions for automatic migration tests (#4732)
* Slightly improve sed-based yaml parsing (#4721): the previous sed did not handle the valid `profile: foo`
* throw exception if we close engine before snapshot is complete + increase timeout for subsequent records (#4730)
* throw exception if we close engine before snapshot is complete + increase timeout for subsequent records
* add comment + bump postgres version to use new changes
* allow publishing airbyte-server to local maven repo (#4717)
* allow publishing airbyte-server to local maven repo
* Stub this out so the name that is created is airbyte-server-0.27.1-alpha.jar and not airbyte-server-0.27.1-alpha-all.jar.
* Add comments.
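The `profile: foo` fix above is about a pattern that only matched quoted YAML values, missing the equally valid bare form. Shown here in Python rather than sed, purely as an illustration of the tolerant pattern; the `yaml_scalar` helper is an assumed name, not the repo's script.

```python
import re

def yaml_scalar(text: str, key: str):
    """Extract a top-level scalar that may be bare, double-quoted, or single-quoted.
    A quote-only pattern would reject the perfectly valid `profile: foo`."""
    m = re.search(rf'^\s*{re.escape(key)}:\s*(["\']?)(.*?)\1\s*$', text, re.M)
    return m.group(2) if m else None
```

The backreference `\1` requires the closing quote to match the opening one (or be absent), so all three spellings yield the same value.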
* see if this fixes build
Co-authored-by: Davin Chia
* CDK: Add initial Destination abstraction and tests (#4719)
Co-authored-by: Eugene Kulak
* Update docs on GitHub connector now that it's Airbyte native (#4739)
Co-authored-by: Abhi Vaidyanatha
* Remove statement about Postgres connector being based on Singer (#4740)
Co-authored-by: Abhi Vaidyanatha
* fix flaky migration acceptance test (#4743)
* upgrade fabric8 client (#4738)
* 🎉 Source MSSQL: implementation for CDC (#4689)
* first few classes for mssql cdc
* wip
* mssql cdc working against unit tests
* increment version
* add cdc acceptance test
* tweaks
* add file
* working on comprehensive tests
* change isolation from snapshot to read_committed_snapshot
* finalised type tests
* Revert "change isolation from snapshot to read_committed_snapshot" (reverts commit 20c67680714a74ce3489f44e17feeec8905be52f)
* small docstring fix
* remove unused imports
* stress test fixes
* minor formatting improvements
* mssql cdc docs
* finish off cdc docs
* format fix
* update connector version
* add to changelog
* fix for sql server agent offline failing cdc enable on tables
* final structure
* few more updates
* undo unwanted changes
* add abstract test + more refinement
* remove CDC metadata to debezium
* use new cdc abstraction for mysql
* undo unwanted change
* use cdc abstraction for postgres
* add files
* pull in latest changes
* ready
* rename class + add missing property
* use renamed class + move constants to MySqlSource
* use renamed class + move constants to PostgresSource
* move debezium to bases + upgrade debezium version + review comments
* downgrade version + minor fixes
* bring in latest changes from cdc abstraction
* reset to minutes
* bring in the latest changes
* format
* fix build
* address review comments
* bring in latest changes
* bring in latest changes
* use common abstraction for CDC via debezium for sql server
* remove debezium from build
* finalise PR
* should return Optional
* pull in latest changes
* pull in latest changes
* address review comments
* use common abstraction for CDC via debezium for mysql (#4604)
* use common abstraction for CDC via debezium for postgres (#4607)
* lower version for tests to run on CI
* format
* Update docs/integrations/sources/mssql.md
Co-authored-by: Sherif A. Nada
* addressing review comments
* fix for testGetTargetPosition
* format changes
Co-authored-by: George Claireaux
Co-authored-by: Sherif A. Nada
* bump up MSSQL version for cdc (#4694)
* bump up mssql version for cdc
* format
* Update docs/integrations/sources/mssql.md
Co-authored-by: Sherif A. Nada
* addressing review comments
* fix for testGetTargetPosition
* format changes
Co-authored-by: George Claireaux
Co-authored-by: Sherif A. Nada
* fixed broken links and styling (#4747)
* Fix enabling connection in refresh catalog mode (#4527)
* Fix enabling connection in refresh catalog mode
* Do not update deprecated connectors (#4674)
* Do not update deprecated connectors
* Fix various connectorDefinition issues: disappearing button, wrong id used for destination update
* 🐛 Source Slack: add float_ts field (#4683)
* rename float_ts to ts cursor_field
* add float_ts
* change float_ts to number
* change channel_msg
* bump version
* increase default timeout_seconds slack acc test
* timeout_seconds to 1750
* timeout_seconds to 3600 :p
* add changelog for slack connector
* copy docs to webapp docker image (#4522)
* use kube service user for pod sweeper (#4737)
* use kube service user for pod sweeper
* add pod sweeper logs
* temporarily switch to stable for testing
* temporarily remove building steps for kube testing since it can use prod images
* output date strings from date command
* load stable images
* remove loading since it can pull the images
* increase window for success storage to two hours
* revert test logging changes
* 🐛 Source GitHub: fix bug with `IssueEvents` stream and add handling for rate limiting (#4708)
* Few updates for the GitHub source: set correct `cursor_field` for the `IssueEvents` stream, add rate limit handling, add handling for 403 and 502 errors
Co-authored-by: Eugene Kulak
Co-authored-by: Sherif A. Nada
* :bug: Fix some api-spec errors. (#4742)
* Source PostHog: Use account information for checking the connection (#4692)
* this should fix the check if no records in annotations stream
* update schemas for new SAT requirements && apply user hint upgrade on wrong api key
* save schema upd
* upd insights schema
* upd insights schema2
* upd insights schema3
* upd insights schema4
* upd insights schema5 (null is joking)
* upd insights schema6 (null is joking)
* upd insights schema7
* upd insights schema8
* upd insights schema8
* bump version && docs
* SAT: Improve error message when data mismatches schema (#4753)
* improve message when data mismatches schema
Co-authored-by: Eugene Kulak
* increase sleep duration + show logs in CI (#4756)
* Fixed cockroachdb repo image (#4758)
* Bump version: 0.27.2-alpha → 0.27.3-alpha (#4761)
* update kube docs (#4749)
* fix kube overlay version (#4765)
* Split Platform and Connector Builds (#4514)
* remove second docs check in build (#4766)
* Restore template generator and fix formatting. (#4768)
* connector generate: fix chown logic (#4774)
* Remove example use cases from docs (#4775)
Co-authored-by: Abhi Vaidyanatha
* Update README.md
* 🎉 All java connectors: Added configValidator to check, discover, read and write calls (#4699)
* Added configValidator to java connectors
* 🎉 Stripe Source: Fix subscriptions stream to return all kinds of subscriptions (including expired and canceled) (#4669)
Co-authored-by: Oleksandr Bazarnov
* Add note about orphaned Airbyte configs preventing automatic upgrades (#4709)
* Add note about removing orphaned Airbyte configs
* Remove excess baggage
* Add a resetting section to make this more clear.
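The rate-limit handling added to the GitHub source above (retries on 403/502) typically boils down to choosing a sleep interval from the response. This sketch is illustrative, not the connector's code; the header names mirror GitHub's documented `Retry-After` and `X-RateLimit-Reset`, and the status set and cap are assumptions.

```python
import time
from typing import Optional

RETRYABLE = {403, 429, 502}

def backoff_seconds(status: int, headers: dict, attempt: int) -> Optional[float]:
    """Return seconds to sleep before retrying, or None for non-retryable statuses."""
    if status not in RETRYABLE:
        return None
    retry_after = headers.get("Retry-After")
    if retry_after is not None:
        return float(retry_after)  # server told us exactly how long to wait
    reset = headers.get("X-RateLimit-Reset")
    if reset is not None:
        return max(float(reset) - time.time(), 0.0)  # wait until the quota window resets
    return min(2.0 ** attempt, 60.0)  # capped exponential fallback for e.g. 502s
```

Returning `None` lets the caller distinguish "give up and raise" from "sleep and retry", which matches how CDK-style `should_retry`/`backoff_time` hooks split the decision.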
Co-authored-by: Abhi Vaidyanatha
* Patch 0.27.2 and 0.27.3 platform notes (#4792)
Co-authored-by: Abhi Vaidyanatha
* Connector notes for 0.27.3 (#4794)
Co-authored-by: Abhi Vaidyanatha
* Add new logo to GitHub page (#4796)
Co-authored-by: Abhi Vaidyanatha
* 🎉 New Destination: Google Cloud Storage (#4784)
* Adding Google Cloud Storage as destination
* Removed few comments and amended the version
* Added documentation in docs/integrations/destinations/gcs.md
* Amended gcs.md with the right pull id
* Implemented all the fixes requested by tuliren as per https://github.com/airbytehq/airbyte/pull/4329
* Renaming all the files
* Branch aligned to S3 0.1.7 (with Avro and Jsonl). Removed redundant file by making S3 a dependency for GCS
* Removed some additional duplicates between GCS and S3
* Revert changes in the root files
* Revert jdbc files
* Fix package names
* Refactor gcs config
* Format code
* Fix gcs connection
* Format code
* Add acceptance tests
* Fix parquet acceptance test
* Add ci credentials
* Register the connector and update documentation
* Fix typo
* Format code
* Add unit test
* Add comments
* Update readme
Co-authored-by: Sherif A. Nada
Co-authored-by: Marco Fontana
Co-authored-by: marcofontana.ing@gmail.com
Co-authored-by: Marco Fontana
Co-authored-by: Sherif A. Nada
* 🐛 CDK: Fix logging of initial state value (#4795)
* Update abstract_source.py
* bump
* CHANGELOG.md
Co-authored-by: Eugene Kulak
* bug fix: use register api (#4811)
* 🐛 Add missing dependencies for acceptance tests to run. (#4808)
* 🎉 Add Python Destination Template (#4771)
* Format. (#4814)
* 🎉 Migrate config persistence to database (#4670)
* Implement db config persistence
* Fix database readiness check
* Reduce logging noise
* Setup config database in config persistence factory
* Update documentation
* Load seed from yaml files
* Refactor config persistence factory
* Add one more test to mimic migration
* Remove unnecessary changes
* Run code formatter
* Update placeholder env values
* Set default config database parameters in docker compose
Co-authored-by: Christophe Duong
* Default setupDatabase to false
* Rename variable
* Set default config db parameters for server
* Remove config db parameters from the env file
* Remove unnecessary environment statements
* Hide config persistence factory (#4772)
* Remove CONFIG_DATABASE_HOST
* Use builder in the test
* Simplify config persistence builder
* Clarify config db connection readiness
* Format code
* Add logging
* Fix typo
Co-authored-by: Christophe Duong
* Add a config_id only index
* Reuse record insertion code
* Add id field name to config schema
* Support data loading from legacy config schemas
* Log missing logs in migration test
* Move airbyte configs table to separate directory
* Update exception message
* Dump specific tables from the job database
* Remove postgres specific uuid extension
* Comment out future branch
* Default configs db variables to empty (when defaulting them to the jobs db variables, it somehow does not work)
* Log inserted config records
* Log all db write operations
* Add back config db variables in env file to mute warnings
* Log connection exception to debug flaky e2e test
* Leave config db variables empty (the `.env` file does not support variable expansion)
Co-authored-by: Christophe Duong
Co-authored-by: Charles
* 🎉 Source intercom: migration to CDK (#4676)
* Added Intercom implementation
* Updated segments docs
* Updated _send_request method to new airbyte-cdk version
* Updated cursor field to datetime string
* Added filtering by state for incremental sync
* Updated cursor paths for test incremental sync
* Added dict type validation to get_data method
* Updated catalog
* Updated typing for start_date
* Updated singer seed to cdk seed
* Updated connector docs
* Updated sample config file
* Sorted streams alphabetically
* Removed placeholder comments
* Renamed rate_limit to queries_per_hour
* Updated common sleep time to backoff_time method
* 🎉 New source: Pipedrive connector (#4686)
* Add pipedrive source initial
* Add initial schemas. Add MVP source implementation.
* Implement MVP streams
* Complete MVP streams implementation
* Apply schema format
* Add test creds
* Update streams.py: fix schemas
* Update replication_start_date format. Add extra pagination condition
* Refactor streams, remove unused classes.
* Add pipedrive.md docs file. Add Pipedrive source definitions.
* Add json source definition.
* Update spec.json
* Add docs mentions throughout the project files
* Make number of Concurrent Jobs configurable. (#4687)
* Explicitly pin ec2 runner version to 2.2.1. (#4823): this was a mishmash before, partially my fault; explicitly pinning for now.
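The "extra pagination condition" mentioned for the Pipedrive source above is the usual cursor-paging guard: stop when the API no longer advertises more items, and also when the pagination block is absent entirely. A sketch against the response shape Pipedrive documents (`additional_data.pagination` with `more_items_in_collection` and `next_start`); the function name is illustrative, not the connector's code.

```python
def next_page_token(response_json: dict):
    """Return request params for the next page, or None when paging is exhausted."""
    pagination = (response_json.get("additional_data") or {}).get("pagination") or {}
    # Extra condition: some responses omit pagination entirely; treat that as the last page.
    if pagination.get("more_items_in_collection"):
        return {"start": pagination.get("next_start")}
    return None
```

Returning `None` is the conventional CDK signal that the stream has no further pages to fetch.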
* 🐛 Source Facebook: Improve rate limit management (#4820)
* Improve rate limit management
* bump version
* facebook-marketing.md: update the changelog
* format and fix
* Source Facebook: fix formatting and publish new version (#4826)
* format
* disable schema validation
* fix urls in AdCreatives stream, enable SAT for creatives
* format
Co-authored-by: Eugene Kulak
* Code generator: Update generator to chown docs and config definition directories (#4819)
* Python Demo Destination: KVDB (#4786)
* 📚 CDK: Add python destination tutorial (#4800)
* 📚 Source Shopify: migrate to new sandbox, update API version to 2021-07 (#4830)
Co-authored-by: Oleksandr Bazarnov
* 🐛 Source Instagram: Read previous state format and upgrade it (#4805)
* few fixes for user_insights state
* support old state format
* format
* bump
Co-authored-by: Eugene Kulak
* Add placeholder (#4816)
* Add update button (#4809)
* Point to new location for connector build status history (#4840)
* Update GAds docs to indicate incremental support
* Add openreplay (#4685)
* Add openreplay
* Add env variables for openreplay
* Add openreplay env for k8s
* 🎉 Source mixpanel: migration to CDK (#4566)
* Mixpanel initiation
* copied schemas and specs file from singer connector
* authentication and a few streams
* Added Funnels + FunnelsList
* Added example of funnel response
* added incremental Funnels stream with tests
* added Annotations, CohortMembers, Engage, Cohorts, Funnels
* added Revenue
* fixed formatting
* fixed variable names
* fixed cohort_members and updated export streams
* moved start_date and date checks into SourceMixpanel class
* added error handling
* added unit test, updated docs and ci creds
* fix url base for export stream
* added full and incremental read for export stream
* updated acceptance tests, added limit correction based on number of streams, export cursor is stored as a datetime string
* Funnel stream - added complex state which contains state for each funnel
* added attribution windows support and project timezone config
* fixed formatting
* added default timezone
* added dynamic schema generation for Engage and Export streams
* fixed formatting
* fixed ability to pass start_date in datetime format as well
* fixed ability to pass start_date in datetime format as well
* added additional_properties field for dynamic schemas; updated regex for start_date matching to support old config file
* fixed formatting
* export stream - convert all values to default type - string
* added schema ref
* added new properties for funnel stream
* fixed formatting in funnel schema
* added build related files
* update changelog
* fixed and added comments, renamed rate_limit variable
* fixed formatting
* changed normalization for reserved mixpanel attributes like $browser
* alphabetise spec fields
* added description about API limit handling
* updated comment
* Add openreplay variable (#4844)
* 🐛 Sendgrid source: Gracefully handle malformed responses from sendgrid API (#4839)
* Update job description (#4848)
* Update job description
* Create senior-product-manager
* Create founding-account-executive
* Update senior-product-manager
* Update SUMMARY.md
* Add py destination tutorial to summary.md (#4853)
* Update CHANGELOG.md
* 🐛 Kube: Fix Source Ports not releasing. (#4822)

Closes #4660. On further investigation, it turns out we were not releasing the source ports. This is because of how the Process abstraction works: waitFor calls close under the hood. We were only calling waitFor if the process was still alive, which is determined by the exitValue coming from the Kubernetes pod's termination status. However, these ports are a local resource, and no close calls meant they were left dangling, leading to the behaviour we see today. The fix is to explicitly call close after retrieving the exit value of the Kubernetes pod. This better follows traditional assumptions around Processes: if the process returns some exit value, it means all resources associated with that process have been cleaned up. Also:
- add in a bunch of debug logging for the future
- have better names for Kubernetes workers to make operations easier

* use new AMI ID for connector builds (#4855)
* Wait for config volume to be ready (#4835)
* Do not create config directory in fs persistence construction
* Run kube acceptance test only for testing purpose
* Wait for config volume to be ready
* Move config volume wait to fs persistence construction
* Restore ci workflow
* Prune imports
* 🎉 New source: US census (#4228)
Co-authored-by: Sherif Nada
* publish US Census (connector) (#4857)
Co-authored-by: Daniel Mateus Pires
Co-authored-by: Daniel Mateus Pires
* 🐛 Source JIRA: Fix DBT failing normalization on `Labels` schema. (#4817)
Co-authored-by: Oleksandr Bazarnov
* Rename founding-account-executive to founding-account-executive.md
* Tweak ConfigNotFoundException class (#4821)
* Use internal_api_host env variable
* Source ZenDesk: format and validate code
* refactor import / export endpoints to use the same code path as auto migration (#4797)
* fix build (#4865)
* 📝 Add server version requirement for mysql normalization (#4856)
* 🐛 Destination MySQL: fix problem if source has a column with json (#4825)
* [4583] Fixed MySQL destination failure if source has a column with json data
* hotfix: rename senior PM file to add .md
* 📚 improve mongo docs and param descriptions (#4870)
* Remove duplicated seed repository (#4869)
* add workspace helper (#4868)
* add workspace helper
* fmt
* switch to a fixed limit
* 🐛 Fix Oracle spec to declare `sid` instead of `database` param, Redshift to allow `additionalProperties`, MSSQL test and spec to declare spec type correctly (#4874)
* Kube: Better Port Abstraction. (#4829)

Introduce a better port abstraction whose primary purpose is to confirm that ports are released when the Kube Pod Process is closed. This prevents issues like #4660. I'm also opening more ports so we can run at least 10 syncs in parallel.

* Source Zendesk: update docs
* Remove unused files
* add stream_slices logic for the ticket_comments stream
* remove changes of other connections
* add secret Zendesk keys to command configs
* :bug: Source Zendesk Support: add dummy unit test
* add dummy integration test
* fix Zendesk not loading username and facebook/twitter id #4373
* sort streams alphabetically
* fix test issue with the unsupported field validate_output_from_all_streams
* add info to source_definitions.yaml
* remove json_schema from configured_catalog.json
* add backoff logic
* add unit tests
* move part of unit tests to integration tests
* fix test dependencies
* add a build status

Co-authored-by: Maksym Pavlenok
Co-authored-by: Sherif A. Nada
Co-authored-by: LiRen Tu
Co-authored-by: vovavovavovavova <39351371+vovavovavovavova@users.noreply.github.com>
Co-authored-by: Eugene Kulak
Co-authored-by: Abhi Vaidyanatha
Co-authored-by: Abhi Vaidyanatha
Co-authored-by: Shadab Mohammad <39692236+shadabshaukat@users.noreply.github.com>
Co-authored-by: midavadim
Co-authored-by: Subodh Kant Chaturvedi
Co-authored-by: Davin Chia
Co-authored-by: Oliver Meyer <42039965+olivermeyer@users.noreply.github.com>
Co-authored-by: Artem Astapenko <3767150+Jamakase@users.noreply.github.com>
Co-authored-by: Jenny Brown <85510829+airbyte-jenny@users.noreply.github.com>
Co-authored-by: Marcos Marx
Co-authored-by: Jared Rhizor
Co-authored-by: Charles
Co-authored-by: Varun B Patil
Co-authored-by: Dmytro <46269553+TymoshokDmytro@users.noreply.github.com>
Co-authored-by: Yaroslav Dudar
Co-authored-by: Brian Krausz
Co-authored-by: George Claireaux
Co-authored-by: oleh.zorenko <19872253+Zirochkaa@users.noreply.github.com>
Co-authored-by: Eugene Kulak
Co-authored-by: Eugene
Co-authored-by:
John Lafleur Co-authored-by: Anna Lvova <37615075+annalvova05@users.noreply.github.com> Co-authored-by: Marco Fontana Co-authored-by: marcofontana.ing@gmail.com Co-authored-by: Marco Fontana Co-authored-by: Christophe Duong Co-authored-by: Serhii Lazebnyi <53845333+lazebnyi@users.noreply.github.com> Co-authored-by: Vadym Co-authored-by: Vladimir remar Co-authored-by: Oleksandr Co-authored-by: Oleksandr Bazarnov Co-authored-by: Daniel Mateus Pires Co-authored-by: Daniel Mateus Pires Co-authored-by: jrhizor --- .github/workflows/publish-command.yml | 2 +- .github/workflows/test-command.yml | 2 +- .../79c1aa37-dae3-42ae-b333-d1c105477715.json | 8 + .../d29764f8-80d7-4dd7-acbe-1a42005ee5aa.json | 8 - .../resources/seed/source_definitions.yaml | 9 +- airbyte-integrations/builds.md | 2 +- .../source-zendesk-support/.dockerignore | 7 + .../source-zendesk-support/Dockerfile | 25 + .../source-zendesk-support/README.md | 131 +++++ .../acceptance-test-config.yml | 25 + .../acceptance-test-docker.sh | 8 + .../source-zendesk-support/build.gradle | 9 + .../integration_tests/__init__.py | 0 .../integration_tests/abnormal_state.json | 38 ++ .../integration_tests/acceptance.py | 34 ++ .../integration_tests/configured_catalog.json | 168 ++++++ .../integration_tests/integration_test.py | 106 ++++ .../integration_tests/invalid_config.json | 6 + .../connectors/source-zendesk-support/main.py | 33 ++ .../source-zendesk-support/requirements.txt | 2 + .../source-zendesk-support/setup.py | 44 ++ .../source_zendesk_support/__init__.py | 27 + .../schemas/group_memberships.json | 28 + .../schemas/groups.json | 25 + .../schemas/macros.json | 62 +++ .../schemas/organizations.json | 60 +++ .../schemas/satisfaction_ratings.json | 43 ++ .../schemas/shared/attachments.json | 76 +++ .../schemas/shared/metadata.json | 79 +++ .../schemas/shared/via.json | 65 +++ .../schemas/shared/via_channel.json | 101 ++++ .../schemas/sla_policies.json | 85 ++++ .../source_zendesk_support/schemas/tags.json | 11 + 
.../schemas/ticket_audits.json | 415 +++++++++++++++ .../schemas/ticket_comments.json | 39 ++ .../schemas/ticket_fields.json | 103 ++++ .../schemas/ticket_forms.json | 58 +++ .../schemas/ticket_metrics.json | 151 ++++++ .../schemas/tickets.json | 183 +++++++ .../source_zendesk_support/schemas/users.json | 196 +++++++ .../source_zendesk_support/source.py | 127 +++++ .../source_zendesk_support/spec.json | 46 ++ .../source_zendesk_support/streams.py | 478 ++++++++++++++++++ .../unit_tests/unit_test.py | 59 +++ docs/integrations/sources/zendesk-support.md | 47 +- tools/bin/ci_credentials.sh | 2 +- 46 files changed, 3208 insertions(+), 25 deletions(-) create mode 100644 airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/79c1aa37-dae3-42ae-b333-d1c105477715.json delete mode 100644 airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/d29764f8-80d7-4dd7-acbe-1a42005ee5aa.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/.dockerignore create mode 100644 airbyte-integrations/connectors/source-zendesk-support/Dockerfile create mode 100644 airbyte-integrations/connectors/source-zendesk-support/README.md create mode 100644 airbyte-integrations/connectors/source-zendesk-support/acceptance-test-config.yml create mode 100644 airbyte-integrations/connectors/source-zendesk-support/acceptance-test-docker.sh create mode 100644 airbyte-integrations/connectors/source-zendesk-support/build.gradle create mode 100644 airbyte-integrations/connectors/source-zendesk-support/integration_tests/__init__.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/integration_tests/abnormal_state.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/integration_tests/acceptance.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/integration_tests/configured_catalog.json create mode 100644 
airbyte-integrations/connectors/source-zendesk-support/integration_tests/integration_test.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/integration_tests/invalid_config.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/main.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/requirements.txt create mode 100644 airbyte-integrations/connectors/source-zendesk-support/setup.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/__init__.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/group_memberships.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/groups.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/macros.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/organizations.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/satisfaction_ratings.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/attachments.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/metadata.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via_channel.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/sla_policies.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tags.json create mode 100644 
airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_audits.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_comments.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_fields.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_forms.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_metrics.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tickets.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/users.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/source.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/spec.json create mode 100644 airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/streams.py create mode 100644 airbyte-integrations/connectors/source-zendesk-support/unit_tests/unit_test.py diff --git a/.github/workflows/publish-command.yml b/.github/workflows/publish-command.yml index b88402234b70..c1d93a03bb88 100644 --- a/.github/workflows/publish-command.yml +++ b/.github/workflows/publish-command.yml @@ -142,9 +142,9 @@ jobs: TWILIO_TEST_CREDS: ${{ secrets.TWILIO_TEST_CREDS }} SOURCE_TYPEFORM_CREDS: ${{ secrets.SOURCE_TYPEFORM_CREDS }} ZENDESK_CHAT_INTEGRATION_TEST_CREDS: ${{ secrets.ZENDESK_CHAT_INTEGRATION_TEST_CREDS }} - ZENDESK_SECRETS_CREDS: ${{ secrets.ZENDESK_SECRETS_CREDS }} ZENDESK_SUNSHINE_TEST_CREDS: ${{ secrets.ZENDESK_SUNSHINE_TEST_CREDS }} ZENDESK_TALK_TEST_CREDS: ${{ secrets.ZENDESK_TALK_TEST_CREDS }} + ZENDESK_SUPPORT_TEST_CREDS: ${{ secrets.ZENDESK_SUPPORT_TEST_CREDS }} 
ZOOM_INTEGRATION_TEST_CREDS: ${{ secrets.ZOOM_INTEGRATION_TEST_CREDS }} PLAID_INTEGRATION_TEST_CREDS: ${{ secrets.PLAID_INTEGRATION_TEST_CREDS }} DESTINATION_S3_INTEGRATION_TEST_CREDS: ${{ secrets.DESTINATION_S3_INTEGRATION_TEST_CREDS }} diff --git a/.github/workflows/test-command.yml b/.github/workflows/test-command.yml index 07ffb44aedcf..b0cb65d378f5 100644 --- a/.github/workflows/test-command.yml +++ b/.github/workflows/test-command.yml @@ -140,9 +140,9 @@ jobs: TWILIO_TEST_CREDS: ${{ secrets.TWILIO_TEST_CREDS }} SOURCE_TYPEFORM_CREDS: ${{ secrets.SOURCE_TYPEFORM_CREDS }} ZENDESK_CHAT_INTEGRATION_TEST_CREDS: ${{ secrets.ZENDESK_CHAT_INTEGRATION_TEST_CREDS }} - ZENDESK_SECRETS_CREDS: ${{ secrets.ZENDESK_SECRETS_CREDS }} ZENDESK_SUNSHINE_TEST_CREDS: ${{ secrets.ZENDESK_SUNSHINE_TEST_CREDS }} ZENDESK_TALK_TEST_CREDS: ${{ secrets.ZENDESK_TALK_TEST_CREDS }} + ZENDESK_SUPPORT_TEST_CREDS: ${{ secrets.ZENDESK_SUPPORT_TEST_CREDS }} ZOOM_INTEGRATION_TEST_CREDS: ${{ secrets.ZOOM_INTEGRATION_TEST_CREDS }} PLAID_INTEGRATION_TEST_CREDS: ${{ secrets.PLAID_INTEGRATION_TEST_CREDS }} DESTINATION_S3_INTEGRATION_TEST_CREDS: ${{ secrets.DESTINATION_S3_INTEGRATION_TEST_CREDS }} diff --git a/airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/79c1aa37-dae3-42ae-b333-d1c105477715.json b/airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/79c1aa37-dae3-42ae-b333-d1c105477715.json new file mode 100644 index 000000000000..a5c7ba76f845 --- /dev/null +++ b/airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/79c1aa37-dae3-42ae-b333-d1c105477715.json @@ -0,0 +1,8 @@ +{ + "sourceDefinitionId": "79c1aa37-dae3-42ae-b333-d1c105477715", + "name": "Zendesk Support", + "dockerRepository": "airbyte/source-zendesk-support", + "dockerImageTag": "0.1.0", + "documentationUrl": "https://hub.docker.com/r/airbyte/source-zendesk-support", + "icon": "zendesk.svg" +} diff --git 
a/airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/d29764f8-80d7-4dd7-acbe-1a42005ee5aa.json b/airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/d29764f8-80d7-4dd7-acbe-1a42005ee5aa.json deleted file mode 100644 index 72951dc8ba82..000000000000 --- a/airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/d29764f8-80d7-4dd7-acbe-1a42005ee5aa.json +++ /dev/null @@ -1,8 +0,0 @@ -{ - "sourceDefinitionId": "d29764f8-80d7-4dd7-acbe-1a42005ee5aa", - "name": "Zendesk Support", - "dockerRepository": "airbyte/source-zendesk-support-singer", - "dockerImageTag": "0.2.3", - "documentationUrl": "https://hub.docker.com/r/airbyte/source-zendesk-support-singer", - "icon": "zendesk.svg" -} diff --git a/airbyte-config/init/src/main/resources/seed/source_definitions.yaml b/airbyte-config/init/src/main/resources/seed/source_definitions.yaml index 824212ba91d1..428565e566d9 100644 --- a/airbyte-config/init/src/main/resources/seed/source_definitions.yaml +++ b/airbyte-config/init/src/main/resources/seed/source_definitions.yaml @@ -191,12 +191,13 @@ dockerImageTag: 0.1.1 documentationUrl: https://hub.docker.com/r/airbyte/source-zendesk-chat icon: zendesk.svg -- sourceDefinitionId: d29764f8-80d7-4dd7-acbe-1a42005ee5aa +- sourceDefinitionId: 79c1aa37-dae3-42ae-b333-d1c105477715 name: Zendesk Support - dockerRepository: airbyte/source-zendesk-support-singer - dockerImageTag: 0.2.3 - documentationUrl: https://hub.docker.com/r/airbyte/source-zendesk-support-singer + dockerRepository: airbyte/source-zendesk-support + dockerImageTag: 0.1.0 + documentationUrl: https://hub.docker.com/r/airbyte/source-zendesk-support icon: zendesk.svg + - sourceDefinitionId: d8313939-3782-41b0-be29-b3ca20d8dd3a name: Intercom dockerRepository: airbyte/source-intercom diff --git a/airbyte-integrations/builds.md b/airbyte-integrations/builds.md index 34c151b8c38f..e387189a95c2 100644 --- a/airbyte-integrations/builds.md +++ 
b/airbyte-integrations/builds.md @@ -70,7 +70,7 @@ | Typeform | [![source-typeform](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-typeform%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-typeform) | | US Census | [![source-us-census](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-us-census%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/2Fsource-us-census) | | Zendesk Chat | [![source-zendesk-chat](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-zendesk-chat%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-zendesk-chat) | -| Zendesk Support | [![source-zendesk-support-singer](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-zendesk-support-singer%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-zendesk-support-singer) | +| Zendesk Support | [![source-zendesk-support](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-zendesk-support%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-zendesk-support) | | Zendesk Talk | [![source-zendesk-talk](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-zendesk-talk%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-zendesk-talk) | | Zoom | [![source-zoom-singer](https://img.shields.io/endpoint?url=https%3A%2F%2Fdnsgjos7lj2fu.cloudfront.net%2Ftests%2Fsummary%2Fsource-zoom-singer%2Fbadge.json)](https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-zoom-singer) | diff --git a/airbyte-integrations/connectors/source-zendesk-support/.dockerignore b/airbyte-integrations/connectors/source-zendesk-support/.dockerignore new file mode 100644 index 
000000000000..d2414c888088 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/.dockerignore @@ -0,0 +1,7 @@ +* +!Dockerfile +!Dockerfile.test +!main.py +!source_zendesk_support +!setup.py +!secrets diff --git a/airbyte-integrations/connectors/source-zendesk-support/Dockerfile b/airbyte-integrations/connectors/source-zendesk-support/Dockerfile new file mode 100644 index 000000000000..7d8ff019a95c --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/Dockerfile @@ -0,0 +1,25 @@ +FROM python:3.7.11-alpine3.14 as base +FROM base as builder + + +RUN apk --no-cache upgrade \ + && pip install --upgrade pip + +WORKDIR /airbyte/integration_code +COPY setup.py ./ +RUN pip install --prefix=/install . + + +FROM base +COPY --from=builder /install /usr/local + +WORKDIR /airbyte/integration_code +COPY main.py ./ +COPY source_zendesk_support ./source_zendesk_support + + +ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" +ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] + +LABEL io.airbyte.version=0.1.0 +LABEL io.airbyte.name=airbyte/source-zendesk-support diff --git a/airbyte-integrations/connectors/source-zendesk-support/README.md b/airbyte-integrations/connectors/source-zendesk-support/README.md new file mode 100644 index 000000000000..0b77093156da --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/README.md @@ -0,0 +1,131 @@ +# Zendesk Support Source + +This is the repository for the Zendesk Support source connector, written in Python. +For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/source-zendesk-support).
+ +## Local development + +### Prerequisites +**To iterate on this connector, make sure to complete this prerequisites section.** + +#### Minimum Python version required `= 3.7.0` + +#### Build & Activate Virtual Environment and install dependencies +From this connector directory, create a virtual environment: +``` +python -m venv .venv +``` + +This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your +development environment of choice. To activate it from the terminal, run: +``` +source .venv/bin/activate +pip install -r requirements.txt +``` +If you are in an IDE, follow your IDE's instructions to activate the virtualenv. + +Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is +used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. +If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything +should work as you expect. + +#### Building via Gradle +You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow. + +To build using Gradle, from the Airbyte repository root, run: +``` +./gradlew :airbyte-integrations:connectors:source-zendesk-support:build +``` + +#### Create credentials +**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/source-zendesk-support) +to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zendesk_support/spec.json` file. +Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. +See `integration_tests/sample_config.json` for a sample config file. 
+ +**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source source-zendesk-support test creds` +and place them into `secrets/config.json`. + +### Locally running the connector +``` +python main.py spec +python main.py check --config secrets/config.json +python main.py discover --config secrets/config.json +python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json +``` + +### Locally running the connector docker image + +#### Build +First, make sure you build the latest Docker image: +``` +docker build . -t airbyte/source-zendesk-support:dev +``` + +You can also build the connector image via Gradle: +``` +./gradlew :airbyte-integrations:connectors:source-zendesk-support:airbyteDocker +``` +When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in +the Dockerfile. + +#### Run +Then run any of the connector commands as follows: +``` +docker run --rm airbyte/source-zendesk-support:dev spec +docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-support:dev check --config /secrets/config.json +docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-support:dev discover --config /secrets/config.json +docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zendesk-support:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json +``` +## Testing +Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named. 
+First install test dependencies into your virtual environment: +``` +pip install .[tests] +``` +### Unit Tests +To run unit tests locally, from the connector directory run: +``` +python -m pytest unit_tests +``` + +### Integration Tests +There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector). +#### Custom Integration tests +Place custom tests inside the `integration_tests/` folder, then, from the connector root, run +``` +python -m pytest integration_tests +``` +#### Acceptance Tests +Customize the `acceptance-test-config.yml` file to configure tests. See [Source Acceptance Tests](source-acceptance-tests.md) for more information. +If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`. +To run your integration tests with acceptance tests, from the connector root, run +``` +python -m pytest integration_tests -p integration_tests.acceptance +``` +To run your integration tests with Docker, use the `acceptance-test-docker.sh` script from the connector root. + +### Using gradle to run tests +All commands should be run from airbyte project root. +To run unit tests: +``` +./gradlew :airbyte-integrations:connectors:source-zendesk-support:unitTest +``` +To run acceptance and custom integration tests: +``` +./gradlew :airbyte-integrations:connectors:source-zendesk-support:integrationTest +``` + +## Dependency Management +All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. +We split dependencies between two groups, dependencies that are: +* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
+* required for the testing need to go to `TEST_REQUIREMENTS` list + +### Publishing a new version of the connector +You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? +1. Make sure your changes are passing unit and integration tests. +1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)). +1. Create a Pull Request. +1. Pat yourself on the back for being an awesome contributor. +1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. diff --git a/airbyte-integrations/connectors/source-zendesk-support/acceptance-test-config.yml b/airbyte-integrations/connectors/source-zendesk-support/acceptance-test-config.yml new file mode 100644 index 000000000000..8eaac6aadf86 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/acceptance-test-config.yml @@ -0,0 +1,25 @@ +# See [Source Acceptance Tests](https://docs.airbyte.io/contributing-to-airbyte/building-new-connector/source-acceptance-tests.md) +# for more information about how to configure these tests +connector_image: airbyte/source-zendesk-support:dev +tests: + spec: + - spec_path: "source_zendesk_support/spec.json" + connection: + - config_path: "secrets/config.json" + status: "succeed" + - config_path: "integration_tests/invalid_config.json" + status: "failed" + discovery: + - config_path: "secrets/config.json" + basic_read: + - config_path: "secrets/config.json" + configured_catalog_path: "integration_tests/configured_catalog.json" + incremental: + - config_path: "secrets/config.json" + configured_catalog_path: "integration_tests/configured_catalog.json" + future_state_path: "integration_tests/abnormal_state.json" + cursor_paths: + ticket_comments: ["created_at"] + full_refresh: + - config_path: "secrets/config.json" + configured_catalog_path: 
"integration_tests/configured_catalog.json" diff --git a/airbyte-integrations/connectors/source-zendesk-support/acceptance-test-docker.sh b/airbyte-integrations/connectors/source-zendesk-support/acceptance-test-docker.sh new file mode 100644 index 000000000000..db28f196367c --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/acceptance-test-docker.sh @@ -0,0 +1,8 @@ +#!/usr/bin/env sh + +docker run --rm -it \ + -v /var/run/docker.sock:/var/run/docker.sock \ + -v /tmp:/tmp \ + -v $(pwd):/test_input \ + airbyte/source-acceptance-test \ + --acceptance-test-config /test_input diff --git a/airbyte-integrations/connectors/source-zendesk-support/build.gradle b/airbyte-integrations/connectors/source-zendesk-support/build.gradle new file mode 100644 index 000000000000..f612915490f1 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/build.gradle @@ -0,0 +1,9 @@ +plugins { + id 'airbyte-python' + id 'airbyte-docker' + id 'airbyte-source-acceptance-test' +} + +airbytePython { + moduleDirectory 'source_zendesk_support' +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/integration_tests/__init__.py b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/__init__.py new file mode 100644 index 000000000000..e69de29bb2d1 diff --git a/airbyte-integrations/connectors/source-zendesk-support/integration_tests/abnormal_state.json b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/abnormal_state.json new file mode 100644 index 000000000000..278b1f781b21 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/abnormal_state.json @@ -0,0 +1,38 @@ +{ + "users": { + "updated_at": "2022-07-19T22:21:37Z" + }, + "groups": { + "updated_at": "2022-07-15T22:19:01Z" + }, + "organizations": { + "updated_at": "2022-07-15T19:29:14Z" + }, + "satisfaction_ratings": { + "updated_at": "2022-07-20T10:05:18Z" + }, + "tickets": { + "generated_timestamp": 
1816817368 + }, + "group_memberships": { + "updated_at": "2022-04-23T15:34:20Z" + }, + "ticket_fields": { + "updated_at": "2022-12-11T19:34:05Z" + }, + "ticket_forms": { + "updated_at": "2022-12-11T20:34:37Z" + }, + "ticket_metrics": { + "updated_at": "2022-07-19T22:21:26Z" + }, + "macros": { + "updated_at": "2022-12-11T19:34:06Z" + }, + "ticket_comments": { + "created_at": "2022-07-19T22:21:26Z" + }, + "ticket_audits": { + "created_at": "2022-07-19T22:21:26Z" + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/integration_tests/acceptance.py b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/acceptance.py new file mode 100644 index 000000000000..d6cbdc97c495 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/acceptance.py @@ -0,0 +1,34 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. 
+# + + +import pytest + +pytest_plugins = ("source_acceptance_test.plugin",) + + +@pytest.fixture(scope="session", autouse=True) +def connector_setup(): + """ This fixture is a placeholder for external resources that acceptance test might require.""" + yield diff --git a/airbyte-integrations/connectors/source-zendesk-support/integration_tests/configured_catalog.json b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/configured_catalog.json new file mode 100644 index 000000000000..2b51f2022b23 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/configured_catalog.json @@ -0,0 +1,168 @@ +{ + "streams": [ + { + "stream": { + "name": "group_memberships", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "groups", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "macros", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "organizations", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "satisfaction_ratings", + 
"json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "sla_policies", + "json_schema": {}, + "supported_sync_modes": ["full_refresh"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "full_refresh", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "tags", + "json_schema": {}, + "supported_sync_modes": ["full_refresh"], + "source_defined_primary_key": [["name"]] + }, + "sync_mode": "full_refresh", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "ticket_audits", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["created_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "ticket_comments", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["created_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "ticket_fields", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "ticket_forms", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + 
"stream": { + "name": "ticket_metrics", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "tickets", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["generated_timestamp"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + }, + { + "stream": { + "name": "users", + "json_schema": {}, + "supported_sync_modes": ["full_refresh", "incremental"], + "source_defined_cursor": true, + "default_cursor_field": ["updated_at"], + "source_defined_primary_key": [["id"]] + }, + "sync_mode": "incremental", + "destination_sync_mode": "append" + } + ] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/integration_tests/integration_test.py b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/integration_test.py new file mode 100644 index 000000000000..17ae998f7808 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/integration_test.py @@ -0,0 +1,106 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. 
+# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. +# + +import json + +import pendulum +import requests_mock +from source_zendesk_support import SourceZendeskSupport +from source_zendesk_support.streams import Macros, TicketAudits, TicketMetrics, Tickets, Users + +CONFIG_FILE = "secrets/config.json" + + +class TestIntegrationZendeskSupport: + """This test class provides a set of tests for different Zendesk streams. + The Zendesk API has different pagination and sorting mechanisms for its streams. + Let's try to check each of them. + """ + + @staticmethod + def prepare_stream_args(): + """Generates stream settings from a config file""" + with open(CONFIG_FILE, "r") as f: + return SourceZendeskSupport.convert_config2stream_args(json.loads(f.read())) + + def _test_export_stream(self, stream_cls: type): + stream = stream_cls(**self.prepare_stream_args()) + record_timestamps = {} + for record in stream.read_records(sync_mode=None): + # save the first few records + if len(record_timestamps) > 5: + break + record_timestamps[record["id"]] = record[stream.cursor_field] + for record_id, timestamp in record_timestamps.items(): + state = {stream.cursor_field: timestamp} + for record in stream.read_records(sync_mode=None, stream_state=state): + assert record["id"] != record_id + break + + def test_export_with_unixtime(self): + """The Tickets stream has 'generated_timestamp' as its cursor_field, in unixtime format""" + self._test_export_stream(Tickets) + + def test_export_with_str_datetime(self): + """Other export streams have 'updated_at' as their cursor_field, in
datetime string format """ + self._test_export_stream(Users) + + def _test_insertion(self, stream_cls: type, index: int = None): + """Try to update an item and verify that incremental reads return only it""" + stream = stream_cls(**self.prepare_stream_args()) + all_records = list(stream.read_records(sync_mode=None)) + state = stream.get_updated_state(current_stream_state=None, latest_record=all_records[-1]) + + incremental_records = list(stream_cls(**self.prepare_stream_args()).read_records(sync_mode=None, stream_state=state)) + assert len(incremental_records) == 0 + + if index is None: + # select a middle index + index = int(len(all_records) / 2) + updated_record_id = all_records[index]["id"] + all_records[index][stream.cursor_field] = stream.datetime2str(pendulum.now().astimezone()) + + with requests_mock.Mocker() as m: + url = stream.url_base + stream.path() + data = { + (stream.response_list_name or stream.name): all_records, + "next_page": None, + } + m.get(url, text=json.dumps(data)) + incremental_records = list(stream_cls(**self.prepare_stream_args()).read_records(sync_mode=None, stream_state=state)) + + assert len(incremental_records) == 1 + assert incremental_records[0]["id"] == updated_record_id + + def test_not_sorted_stream(self): + """for streams without sorting but with pagination""" + self._test_insertion(TicketMetrics) + + def test_sorted_page_stream(self): + """for streams with pagination and a sorting mechanism""" + self._test_insertion(Macros, 0) + + def test_sorted_cursor_stream(self): + """for streams with cursor pagination and a sorting mechanism""" + self._test_insertion(TicketAudits, 0) diff --git a/airbyte-integrations/connectors/source-zendesk-support/integration_tests/invalid_config.json b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/invalid_config.json new file mode 100644 index 000000000000..b0855267d841 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/integration_tests/invalid_config.json @@ -0,0 +1,6 @@ +{ + "email":
"broken.email@invalid.config", + "api_token": "", + "subdomain": "test-failure-airbyte", + "start_date": "2030-01-01T00:00:00Z" +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/main.py b/airbyte-integrations/connectors/source-zendesk-support/main.py new file mode 100644 index 000000000000..05ea934e3103 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/main.py @@ -0,0 +1,33 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. 
+# + + +import sys + +from airbyte_cdk.entrypoint import launch +from source_zendesk_support import SourceZendeskSupport + +if __name__ == "__main__": + source = SourceZendeskSupport() + launch(source, sys.argv[1:]) diff --git a/airbyte-integrations/connectors/source-zendesk-support/requirements.txt b/airbyte-integrations/connectors/source-zendesk-support/requirements.txt new file mode 100644 index 000000000000..0411042aa091 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/requirements.txt @@ -0,0 +1,2 @@ +-e ../../bases/source-acceptance-test +-e . diff --git a/airbyte-integrations/connectors/source-zendesk-support/setup.py b/airbyte-integrations/connectors/source-zendesk-support/setup.py new file mode 100644 index 000000000000..90c4a34f1c54 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/setup.py @@ -0,0 +1,44 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. +# + + +from setuptools import find_packages, setup + +MAIN_REQUIREMENTS = ["airbyte-cdk", "pytz"] + +TEST_REQUIREMENTS = ["pytest~=6.1", "source-acceptance-test", "requests-mock==1.9.3", "timeout-decorator==0.5.0"] + +setup( + version="0.1.0", + name="source_zendesk_support", + description="Source implementation for Zendesk Support.", + author="Airbyte", + author_email="contact@airbyte.io", + packages=find_packages(), + install_requires=MAIN_REQUIREMENTS, + package_data={"": ["*.json", "schemas/*.json", "schemas/shared/*.json"]}, + extras_require={ + "tests": TEST_REQUIREMENTS, + }, +) diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/__init__.py b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/__init__.py new file mode 100644 index 000000000000..b9df6c98610b --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/__init__.py @@ -0,0 +1,27 @@ +""" +MIT License + +Copyright (c) 2020 Airbyte + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. +""" + +from .source import SourceZendeskSupport + +__all__ = ("SourceZendeskSupport",) diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/group_memberships.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/group_memberships.json new file mode 100644 index 000000000000..2e8bfa5440bc --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/group_memberships.json @@ -0,0 +1,28 @@ +{ + "properties": { + "default": { + "type": ["null", "boolean"] + }, + "url": { + "type": ["null", "string"] + }, + "user_id": { + "type": ["null", "integer"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "group_id": { + "type": ["null", "integer"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "id": { + "type": ["null", "integer"] + } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/groups.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/groups.json new file mode 100644 index 000000000000..b10e430d0375 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/groups.json @@ -0,0 +1,25 @@ +{ + "type": ["null", "object"], + "properties": { + "name": { + "type": ["null", "string"] + }, + "created_at": { + "type": ["null", "string"], + "format": 
"date-time" + }, + "url": { + "type": ["null", "string"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "deleted": { + "type": ["null", "boolean"] + }, + "id": { + "type": ["null", "integer"] + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/macros.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/macros.json new file mode 100644 index 000000000000..1110d1e1bbb9 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/macros.json @@ -0,0 +1,62 @@ +{ + "properties": { + "id": { + "type": ["null", "integer"] + }, + "position": { + "type": ["null", "integer"] + }, + "restriction": { + "properties": { + "id": { + "type": ["null", "integer"] + }, + "ids": { + "items": { + "type": ["null", "integer"] + }, + "type": ["null", "array"] + }, + "type": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + }, + "title": { + "type": ["null", "string"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "url": { + "type": ["null", "string"] + }, + "description": { + "type": ["null", "string"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "active": { + "type": ["null", "boolean"] + }, + "actions": { + "items": { + "properties": { + "field": { + "type": ["null", "string"] + }, + "value": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + }, + "type": ["null", "array"] + } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/organizations.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/organizations.json new file mode 100644 index 000000000000..f01e405d5843 --- /dev/null +++ 
b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/organizations.json @@ -0,0 +1,60 @@ +{ + "type": ["null", "object"], + "properties": { + "group_id": { + "type": ["null", "integer"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "tags": { + "type": ["null", "array"], + "items": { + "type": ["null", "string"] + } + }, + "shared_tickets": { + "type": ["null", "boolean"] + }, + "organization_fields": { + "type": ["null", "object"], + "additionalProperties": true + }, + "notes": { + "type": ["null", "string"] + }, + "domain_names": { + "type": ["null", "array"], + "items": { + "type": ["null", "string"] + } + }, + "shared_comments": { + "type": ["null", "boolean"] + }, + "details": { + "type": ["null", "string"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "name": { + "type": ["null", "string"] + }, + "external_id": { + "type": ["null", "string"] + }, + "url": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "deleted_at": { + "type": ["null", "string"], + "format": "date-time" + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/satisfaction_ratings.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/satisfaction_ratings.json new file mode 100644 index 000000000000..fcf319896d20 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/satisfaction_ratings.json @@ -0,0 +1,43 @@ +{ + "type": "object", + "properties": { + "id": { + "type": ["null", "integer"] + }, + "assignee_id": { + "type": ["null", "integer"] + }, + "group_id": { + "type": ["null", "integer"] + }, + "reason_id": { + "type": ["null", "integer"] + }, + "requester_id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "updated_at": { + "type": ["null", "string"], + "format": 
"date-time" + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "url": { + "type": ["null", "string"] + }, + "score": { + "type": ["null", "string"] + }, + "reason": { + "type": ["null", "string"] + }, + "comment": { + "type": ["null", "string"] + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/attachments.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/attachments.json new file mode 100644 index 000000000000..5c235ba83a1c --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/attachments.json @@ -0,0 +1,76 @@ +{ + "type": ["null", "array"], + "items": { + "properties": { + "id": { + "type": ["null", "integer"] + }, + "size": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "inline": { + "type": ["null", "boolean"] + }, + "height": { + "type": ["null", "integer"] + }, + "width": { + "type": ["null", "integer"] + }, + "content_url": { + "type": ["null", "string"] + }, + "mapped_content_url": { + "type": ["null", "string"] + }, + "content_type": { + "type": ["null", "string"] + }, + "file_name": { + "type": ["null", "string"] + }, + "thumbnails": { + "items": { + "properties": { + "id": { + "type": ["null", "integer"] + }, + "size": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "inline": { + "type": ["null", "boolean"] + }, + "height": { + "type": ["null", "integer"] + }, + "width": { + "type": ["null", "integer"] + }, + "content_url": { + "type": ["null", "string"] + }, + "mapped_content_url": { + "type": ["null", "string"] + }, + "content_type": { + "type": ["null", "string"] + }, + "file_name": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + }, + "type": ["null", "array"] + } + }, + "type": ["null", "object"] + } +} diff --git 
a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/metadata.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/metadata.json new file mode 100644 index 000000000000..b68e6ae7fa1e --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/metadata.json @@ -0,0 +1,79 @@ +{ + "type": ["null", "object"], + "properties": { + "custom": {}, + "trusted": { + "type": ["null", "boolean"] + }, + "notifications_suppressed_for": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "flags_options": { + "type": ["null", "object"], + "properties": { + "2": { + "type": ["null", "object"], + "properties": { + "trusted": { + "type": ["null", "boolean"] + } + } + }, + "11": { + "type": ["null", "object"], + "properties": { + "trusted": { + "type": ["null", "boolean"] + }, + "message": { + "type": ["null", "object"], + "properties": { + "user": { + "type": ["null", "string"] + } + } + } + } + } + } + }, + "flags": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "system": { + "type": ["null", "object"], + "properties": { + "location": { + "type": ["null", "string"] + }, + "longitude": { + "type": ["null", "number"] + }, + "message_id": { + "type": ["null", "string"] + }, + "raw_email_identifier": { + "type": ["null", "string"] + }, + "ip_address": { + "type": ["null", "string"] + }, + "json_email_identifier": { + "type": ["null", "string"] + }, + "client": { + "type": ["null", "string"] + }, + "latitude": { + "type": ["null", "number"] + } + } + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via.json new file mode 100644 index 000000000000..4fb4506bb191 --- /dev/null +++ 
b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via.json @@ -0,0 +1,65 @@ +{ + "type": ["null", "object"], + "properties": { + "channel": { + "type": ["null", "string"] + }, + "source": { + "type": ["null", "object"], + "properties": { + "from": { + "type": ["null", "object"], + "properties": { + "ticket_ids": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "subject": { + "type": ["null", "string"] + }, + "name": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + }, + "original_recipients": { + "type": ["null", "array"], + "items": { + "type": ["null", "string"] + } + }, + "id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "deleted": { + "type": ["null", "boolean"] + }, + "title": { + "type": ["null", "string"] + } + } + }, + "to": { + "type": ["null", "object"], + "properties": { + "name": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + } + } + }, + "rel": { + "type": ["null", "string"] + } + } + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via_channel.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via_channel.json new file mode 100644 index 000000000000..d37cc65685bf --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/shared/via_channel.json @@ -0,0 +1,101 @@ +{ + "type": ["null", "object"], + "properties": { + "channel": { + "type": ["null", "string"] + }, + "source": { + "type": ["null", "object"], + "properties": { + "from": { + "type": ["null", "object"], + "properties": { + "name": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + }, + "original_recipients": { + "type": ["null", "array"], + "items": { + "type": ["null", "string"] + } + }, + "ticket_id": { + 
"type": ["null", "integer"] + }, + "subject": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "title": { + "type": ["null", "string"] + }, + "deleted": { + "type": ["null", "boolean"] + }, + "revision_id": { + "type": ["null", "integer"] + }, + "topic_id": { + "type": ["null", "integer"] + }, + "topic_name": { + "type": ["null", "string"] + }, + "profile_url": { + "type": ["null", "string"] + }, + "username": { + "type": ["null", "string"] + }, + "phone": { + "type": ["null", "string"] + }, + "formatted_phone": { + "type": ["null", "string"] + }, + "facebook_id": { + "type": ["null", "string"] + } + } + }, + "to": { + "type": ["null", "object"], + "properties": { + "name": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + }, + "email_ccs": { + "type": ["null", "string"] + }, + "profile_url": { + "type": ["null", "string"] + }, + "username": { + "type": ["null", "string"] + }, + "phone": { + "type": ["null", "string"] + }, + "formatted_phone": { + "type": ["null", "string"] + }, + "facebook_id": { + "type": ["null", "string"] + } + } + }, + "rel": { + "type": ["null", "string"] + } + } + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/sla_policies.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/sla_policies.json new file mode 100644 index 000000000000..22b176617629 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/sla_policies.json @@ -0,0 +1,85 @@ +{ + "properties": { + "id": { + "type": ["integer"] + }, + "url": { + "type": ["null", "string"] + }, + "title": { + "type": ["null", "string"] + }, + "description": { + "type": ["null", "string"] + }, + "position": { + "type": ["null", "integer"] + }, + "filter": { + "properties": { + "all": { + "type": ["null", "array"], + "items": { + "properties": { + "field": { + "type": ["null", "string"] + 
}, + "operator": { + "type": ["null", "string"] + }, + "value": { + "type": ["null", "string", "number", "boolean"] + } + }, + "type": ["object"] + } + }, + "any": { + "type": ["null", "array"], + "items": { + "properties": { + "field": { + "type": ["null", "string"] + }, + "operator": { + "type": ["null", "string"] + }, + "value": { + "type": ["null", "string"] + } + }, + "type": ["object"] + } + } + }, + "type": ["null", "object"] + }, + "policy_metrics": { + "type": ["null", "array"], + "items": { + "properties": { + "priority": { + "type": ["null", "string"] + }, + "target": { + "type": ["null", "integer"] + }, + "business_hours": { + "type": ["null", "boolean"] + }, + "metric": {} + }, + "type": ["null", "object"] + } + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + } + }, + "type": ["object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tags.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tags.json new file mode 100644 index 000000000000..437ff323b1b7 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tags.json @@ -0,0 +1,11 @@ +{ + "type": ["null", "object"], + "properties": { + "count": { + "type": ["null", "integer"] + }, + "name": { + "type": ["null", "string"] + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_audits.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_audits.json new file mode 100644 index 000000000000..eba361129080 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_audits.json @@ -0,0 +1,415 @@ +{ + "type": ["null", "object"], + "properties": { + "events": { + "type": ["null", "array"], + "items": { + 
"type": ["null", "object"], + "properties": { + "attachments": { + "items": { + "properties": { + "id": { + "type": ["null", "integer"] + }, + "size": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "inline": { + "type": ["null", "boolean"] + }, + "height": { + "type": ["null", "integer"] + }, + "width": { + "type": ["null", "integer"] + }, + "content_url": { + "type": ["null", "string"] + }, + "mapped_content_url": { + "type": ["null", "string"] + }, + "content_type": { + "type": ["null", "string"] + }, + "file_name": { + "type": ["null", "string"] + }, + "thumbnails": { + "items": { + "properties": { + "id": { + "type": ["null", "integer"] + }, + "size": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "inline": { + "type": ["null", "boolean"] + }, + "height": { + "type": ["null", "integer"] + }, + "width": { + "type": ["null", "integer"] + }, + "content_url": { + "type": ["null", "string"] + }, + "mapped_content_url": { + "type": ["null", "string"] + }, + "content_type": { + "type": ["null", "string"] + }, + "file_name": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + }, + "type": ["null", "array"] + } + }, + "type": ["null", "object"] + }, + "type": ["null", "array"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "data": { + "type": ["null", "object"], + "properties": { + "transcription_status": { + "type": ["null", "string"] + }, + "transcription_text": { + "type": ["null", "string"] + }, + "to": { + "type": ["null", "string"] + }, + "call_duration": { + "type": ["null", "string"] + }, + "answered_by_name": { + "type": ["null", "string"] + }, + "recording_url": { + "type": ["null", "string"] + }, + "started_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "answered_by_id": { + "type": ["null", "integer"] + }, + "from": { + "type": ["null", "string"] + } + } + }, + "formatted_from": { + "type": ["null", "string"] 
+ }, + "formatted_to": { + "type": ["null", "string"] + }, + "transcription_visible": {}, + "trusted": { + "type": ["null", "boolean"] + }, + "html_body": { + "type": ["null", "string"] + }, + "subject": { + "type": ["null", "string"] + }, + "field_name": { + "type": ["null", "string"] + }, + "audit_id": { + "type": ["null", "integer"] + }, + "value": { + "type": ["null", "array", "string"], + "items": { + "type": ["null", "string"] + } + }, + "author_id": { + "type": ["null", "integer"] + }, + "via": { + "properties": { + "channel": { + "type": ["null", "string"] + }, + "source": { + "properties": { + "to": { + "properties": { + "address": { + "type": ["null", "string"] + }, + "name": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + }, + "from": { + "properties": { + "title": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + }, + "subject": { + "type": ["null", "string"] + }, + "deleted": { + "type": ["null", "boolean"] + }, + "name": { + "type": ["null", "string"] + }, + "original_recipients": { + "items": { + "type": ["null", "string"] + }, + "type": ["null", "array"] + }, + "id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "revision_id": { + "type": ["null", "integer"] + } + }, + "type": ["null", "object"] + }, + "rel": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + } + }, + "type": ["null", "object"] + }, + "type": { + "type": ["null", "string"] + }, + "macro_id": { + "type": ["null", "string"] + }, + "body": { + "type": ["null", "string"] + }, + "recipients": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "macro_deleted": { + "type": ["null", "boolean"] + }, + "plain_body": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "previous_value": { + "type": ["null", "array", "string"], + "items": { + "type": ["null", "string"] + } + }, + "macro_title": { + "type": 
["null", "string"] + }, + "public": { + "type": ["null", "boolean"] + }, + "resource": { + "type": ["null", "string"] + } + } + } + }, + "author_id": { + "type": ["null", "integer"] + }, + "metadata": { + "type": ["null", "object"], + "properties": { + "custom": {}, + "trusted": { + "type": ["null", "boolean"] + }, + "notifications_suppressed_for": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "flags_options": { + "type": ["null", "object"], + "properties": { + "2": { + "type": ["null", "object"], + "properties": { + "trusted": { + "type": ["null", "boolean"] + } + } + }, + "11": { + "type": ["null", "object"], + "properties": { + "trusted": { + "type": ["null", "boolean"] + }, + "message": { + "type": ["null", "object"], + "properties": { + "user": { + "type": ["null", "string"] + } + } + } + } + } + } + }, + "flags": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "system": { + "type": ["null", "object"], + "properties": { + "location": { + "type": ["null", "string"] + }, + "longitude": { + "type": ["null", "number"] + }, + "message_id": { + "type": ["null", "string"] + }, + "raw_email_identifier": { + "type": ["null", "string"] + }, + "ip_address": { + "type": ["null", "string"] + }, + "json_email_identifier": { + "type": ["null", "string"] + }, + "client": { + "type": ["null", "string"] + }, + "latitude": { + "type": ["null", "number"] + } + } + } + } + }, + "id": { + "type": ["null", "integer"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "via": { + "type": ["null", "object"], + "properties": { + "channel": { + "type": ["null", "string"] + }, + "source": { + "type": ["null", "object"], + "properties": { + "from": { + "type": ["null", "object"], + "properties": { + "ticket_ids": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "subject": { + "type": ["null", 
"string"] + }, + "name": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + }, + "original_recipients": { + "type": ["null", "array"], + "items": { + "type": ["null", "string"] + } + }, + "id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "deleted": { + "type": ["null", "boolean"] + }, + "title": { + "type": ["null", "string"] + } + } + }, + "to": { + "type": ["null", "object"], + "properties": { + "name": { + "type": ["null", "string"] + }, + "address": { + "type": ["null", "string"] + } + } + }, + "rel": { + "type": ["null", "string"] + } + } + } + } + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_comments.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_comments.json new file mode 100644 index 000000000000..df3aa01c3bb6 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_comments.json @@ -0,0 +1,39 @@ +{ + "properties": { + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "body": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "type": { + "type": ["null", "string"] + }, + "html_body": { + "type": ["null", "string"] + }, + "plain_body": { + "type": ["null", "string"] + }, + "public": { + "type": ["null", "boolean"] + }, + "audit_id": { + "type": ["null", "integer"] + }, + "author_id": { + "type": ["null", "integer"] + }, + "via": { "$ref": "via.json" }, + "metadata": { "$ref": "metadata.json" }, + "attachments": { "$ref": "attachments.json" } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_fields.json 
b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_fields.json new file mode 100644 index 000000000000..b84b9afdb894 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_fields.json @@ -0,0 +1,103 @@ +{ + "properties": { + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "title_in_portal": { + "type": ["null", "string"] + }, + "visible_in_portal": { + "type": ["null", "boolean"] + }, + "collapsed_for_agents": { + "type": ["null", "boolean"] + }, + "regexp_for_validation": { + "type": ["null", "string"] + }, + "title": { + "type": ["null", "string"] + }, + "position": { + "type": ["null", "integer"] + }, + "type": { + "type": ["null", "string"] + }, + "editable_in_portal": { + "type": ["null", "boolean"] + }, + "raw_title_in_portal": { + "type": ["null", "string"] + }, + "raw_description": { + "type": ["null", "string"] + }, + "custom_field_options": { + "items": { + "properties": { + "name": { + "type": ["null", "string"] + }, + "value": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "default": { + "type": ["null", "boolean"] + }, + "raw_name": { + "type": ["null", "string"] + } + }, + "type": ["null", "object"] + }, + "type": ["null", "array"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "tag": { + "type": ["null", "string"] + }, + "removable": { + "type": ["null", "boolean"] + }, + "active": { + "type": ["null", "boolean"] + }, + "url": { + "type": ["null", "string"] + }, + "raw_title": { + "type": ["null", "string"] + }, + "required": { + "type": ["null", "boolean"] + }, + "id": { + "type": ["null", "integer"] + }, + "description": { + "type": ["null", "string"] + }, + "agent_description": { + "type": ["null", "string"] + }, + "required_in_portal": { + "type": ["null", "boolean"] + }, + "system_field_options": { + "type": ["null", "array"], + "items": {} + }, 
+ "sub_type_id": { + "type": ["null", "integer"] + } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_forms.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_forms.json new file mode 100644 index 000000000000..0c94cb05c689 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_forms.json @@ -0,0 +1,58 @@ +{ + "properties": { + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "name": { + "type": ["null", "string"] + }, + "display_name": { + "type": ["null", "string"] + }, + "raw_display_name": { + "type": ["null", "string"] + }, + "position": { + "type": ["null", "integer"] + }, + "raw_name": { + "type": ["null", "string"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "active": { + "type": ["null", "boolean"] + }, + "default": { + "type": ["null", "boolean"] + }, + "in_all_brands": { + "type": ["null", "boolean"] + }, + "end_user_visible": { + "type": ["null", "boolean"] + }, + "url": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "restricted_brand_ids": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "ticket_field_ids": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_metrics.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_metrics.json new file mode 100644 index 000000000000..a139c863d2b9 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/ticket_metrics.json @@ -0,0 +1,151 @@ +{ + "properties": { + "metric": { + "type": ["null", "string"] + }, + "id": { 
+ "type": ["null", "integer"] + }, + "time": { + "type": ["null", "string"] + }, + "instance_id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "status": { + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + "type": ["null", "integer"] + } + }, + "type": ["null", "object"] + }, + "type": { + "type": ["null", "string"] + }, + "agent_wait_time_in_minutes": { + "type": ["null", "object"], + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + "type": ["null", "integer"] + } + } + }, + "assignee_stations": { + "type": ["null", "integer"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "first_resolution_time_in_minutes": { + "type": ["null", "object"], + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + "type": ["null", "integer"] + } + } + }, + "full_resolution_time_in_minutes": { + "type": ["null", "object"], + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + "type": ["null", "integer"] + } + } + }, + "group_stations": { + "type": ["null", "integer"] + }, + "latest_comment_added_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "on_hold_time_in_minutes": { + "type": ["null", "object"], + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + "type": ["null", "integer"] + } + } + }, + "reopens": { + "type": ["null", "integer"] + }, + "replies": { + "type": ["null", "integer"] + }, + "reply_time_in_minutes": { + "type": ["null", "object"], + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + "type": ["null", "integer"] + } + } + }, + "requester_updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "requester_wait_time_in_minutes": { + "type": ["null", "object"], + "properties": { + "calendar": { + "type": ["null", "integer"] + }, + "business": { + 
"type": ["null", "integer"] + } + } + }, + "status_updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "url": { + "type": ["null", "string"] + }, + "initially_assigned_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "assigned_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "solved_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "assignee_updated_at": { + "type": ["null", "string"], + "format": "date-time" + } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tickets.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tickets.json new file mode 100644 index 000000000000..20bfc48b7074 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/tickets.json @@ -0,0 +1,183 @@ +{ + "properties": { + "organization_id": { + "type": ["null", "integer"] + }, + "requester_id": { + "type": ["null", "integer"] + }, + "problem_id": { + "type": ["null", "integer"] + }, + "is_public": { + "type": ["null", "boolean"] + }, + "description": { + "type": ["null", "string"] + }, + "follower_ids": { + "items": { + "type": ["null", "integer"] + }, + "type": ["null", "array"] + }, + "submitter_id": { + "type": ["null", "integer"] + }, + "generated_timestamp": { + "type": ["null", "integer"] + }, + "brand_id": { + "type": ["null", "integer"] + }, + "id": { + "type": ["null", "integer"] + }, + "group_id": { + "type": ["null", "integer"] + }, + "type": { + "type": ["null", "string"] + }, + "recipient": { + "type": ["null", "string"] + }, + "collaborator_ids": { + "items": { + "type": ["null", "integer"] + }, + "type": ["null", "array"] + }, + "tags": { + "items": { + "type": ["null", "string"] + }, + "type": ["null", "array"] + }, + "has_incidents": { + "type": 
["null", "boolean"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "raw_subject": { + "type": ["null", "string"] + }, + "status": { + "type": ["null", "string"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "custom_fields": { + "items": { + "properties": { + "id": { + "type": ["null", "integer"] + }, + "value": {} + }, + "type": ["null", "object"] + }, + "type": ["null", "array"] + }, + "url": { + "type": ["null", "string"] + }, + "allow_channelback": { + "type": ["null", "boolean"] + }, + "allow_attachments": { + "type": ["null", "boolean"] + }, + "due_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "followup_ids": { + "items": { + "type": ["null", "integer"] + }, + "type": ["null", "array"] + }, + "priority": { + "type": ["null", "string"] + }, + "assignee_id": { + "type": ["null", "integer"] + }, + "subject": { + "type": ["null", "string"] + }, + "external_id": { + "type": ["null", "string"] + }, + "via": { + "$ref": "via_channel.json" + }, + "ticket_form_id": { + "type": ["null", "integer"] + }, + "satisfaction_rating": { + "type": ["null", "object", "string"], + "properties": { + "id": { + "type": ["null", "integer"] + }, + "assignee_id": { + "type": ["null", "integer"] + }, + "group_id": { + "type": ["null", "integer"] + }, + "reason_id": { + "type": ["null", "integer"] + }, + "requester_id": { + "type": ["null", "integer"] + }, + "ticket_id": { + "type": ["null", "integer"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "url": { + "type": ["null", "string"] + }, + "score": { + "type": ["null", "string"] + }, + "reason": { + "type": ["null", "string"] + }, + "comment": { + "type": ["null", "string"] + } + } + }, + "sharing_agreement_ids": { + "type": ["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "email_cc_ids": { + "type": 
["null", "array"], + "items": { + "type": ["null", "integer"] + } + }, + "forum_topic_id": { + "type": ["null", "integer"] + } + }, + "type": ["null", "object"] +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/users.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/users.json new file mode 100644 index 000000000000..11df801acee3 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/schemas/users.json @@ -0,0 +1,196 @@ +{ + "type": ["null", "object"], + "properties": { + "verified": { + "type": ["null", "boolean"] + }, + "role": { + "type": ["null", "string"] + }, + "tags": { + "items": { + "type": ["null", "string"] + }, + "type": ["null", "array"] + }, + "chat_only": { + "type": ["null", "boolean"] + }, + "role_type": { + "type": ["null", "integer"] + }, + "phone": { + "type": ["null", "string"] + }, + "organization_id": { + "type": ["null", "integer"] + }, + "details": { + "type": ["null", "string"] + }, + "email": { + "type": ["null", "string"] + }, + "only_private_comments": { + "type": ["null", "boolean"] + }, + "signature": { + "type": ["null", "string"] + }, + "restricted_agent": { + "type": ["null", "boolean"] + }, + "moderator": { + "type": ["null", "boolean"] + }, + "updated_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "external_id": { + "type": ["null", "string"] + }, + "time_zone": { + "type": ["null", "string"] + }, + "photo": { + "type": ["null", "object"], + "properties": { + "thumbnails": { + "items": { + "type": ["null", "object"], + "properties": { + "width": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "inline": { + "type": ["null", "boolean"] + }, + "content_url": { + "type": ["null", "string"] + }, + "content_type": { + "type": ["null", "string"] + }, + "file_name": { + "type": ["null", "string"] + }, + "size": { + "type": ["null", "integer"] + 
}, + "mapped_content_url": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "height": { + "type": ["null", "integer"] + } + } + }, + "type": ["null", "array"] + }, + "width": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "inline": { + "type": ["null", "boolean"] + }, + "content_url": { + "type": ["null", "string"] + }, + "content_type": { + "type": ["null", "string"] + }, + "file_name": { + "type": ["null", "string"] + }, + "size": { + "type": ["null", "integer"] + }, + "mapped_content_url": { + "type": ["null", "string"] + }, + "id": { + "type": ["null", "integer"] + }, + "height": { + "type": ["null", "integer"] + } + } + }, + "name": { + "type": ["null", "string"] + }, + "shared": { + "type": ["null", "boolean"] + }, + "id": { + "type": ["null", "integer"] + }, + "created_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "suspended": { + "type": ["null", "boolean"] + }, + "shared_agent": { + "type": ["null", "boolean"] + }, + "shared_phone_number": { + "type": ["null", "boolean"] + }, + "user_fields": { + "type": ["null", "object"], + "additionalProperties": true + }, + "last_login_at": { + "type": ["null", "string"], + "format": "date-time" + }, + "alias": { + "type": ["null", "string"] + }, + "two_factor_auth_enabled": { + "type": ["null", "boolean"] + }, + "notes": { + "type": ["null", "string"] + }, + "default_group_id": { + "type": ["null", "integer"] + }, + "url": { + "type": ["null", "string"] + }, + "active": { + "type": ["null", "boolean"] + }, + "permanently_deleted": { + "type": ["null", "boolean"] + }, + "locale_id": { + "type": ["null", "integer"] + }, + "custom_role_id": { + "type": ["null", "integer"] + }, + "ticket_restriction": { + "type": ["null", "string"] + }, + "locale": { + "type": ["null", "string"] + }, + "report_csv": { + "type": ["null", "boolean"] + } + } +} diff --git 
a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/source.py b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/source.py new file mode 100644 index 000000000000..86d62b21d328 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/source.py @@ -0,0 +1,127 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. 
+# + +import base64 +from typing import Any, List, Mapping, Tuple + +import requests +from airbyte_cdk.sources import AbstractSource +from airbyte_cdk.sources.streams import Stream +from airbyte_cdk.sources.streams.http.auth import TokenAuthenticator + +from .streams import ( + GroupMemberships, + Groups, + Macros, + Organizations, + SatisfactionRatings, + SlaPolicies, + SourceZendeskException, + Tags, + TicketAudits, + TicketComments, + TicketFields, + TicketForms, + TicketMetrics, + Tickets, + Users, + UserSettingsStream, +) + + +class BasicApiTokenAuthenticator(TokenAuthenticator): + """Basic Authorization header""" + + def __init__(self, email: str, password: str): + # for API token auth we need to add the suffix '/token' to the end of the email value + email_login = email + "/token" + token = base64.b64encode(f"{email_login}:{password}".encode("utf-8")) + super().__init__(token.decode("utf-8"), auth_method="Basic") + + +class SourceZendeskSupport(AbstractSource): + """Source Zendesk Support fetches data from Zendesk, a CRM that builds customer + support and sales software aimed at quick implementation and adaptation at scale. + """ + + @classmethod + def get_authenticator(cls, config: Mapping[str, Any]) -> BasicApiTokenAuthenticator: + if config["auth_method"].get("email") and config["auth_method"].get("api_token"): + return BasicApiTokenAuthenticator(config["auth_method"]["email"], config["auth_method"]["api_token"]) + raise SourceZendeskException(f"Not implemented authorization method: {config['auth_method']}") + + def check_connection(self, logger, config) -> Tuple[bool, any]: + """Connection check to validate that the user-provided config can be used to connect to the underlying API + + :param config: the user-input config object conforming to the connector's spec.json + :param logger: logger object + :return Tuple[bool, any]: (True, None) if the input config can be used to connect to the API successfully, + (False, error) otherwise.
+ """ + auth = self.get_authenticator(config) + settings = None + try: + settings = UserSettingsStream(config["subdomain"], authenticator=auth).get_settings() + except requests.exceptions.RequestException as e: + return False, e + + active_features = [k for k, v in settings.get("active_features", {}).items() if v] + logger.info("available features: %s" % active_features) + if "organization_access_enabled" not in active_features: + return False, "Organization access is not enabled. Please check admin permission of the current account" + return True, None + + @classmethod + def convert_config2stream_args(cls, config: Mapping[str, Any]) -> Mapping[str, Any]: + """Convert input configs to parameters of the future streams + This function is used by unit tests too + """ + return { + "subdomain": config["subdomain"], + "start_date": config["start_date"], + "authenticator": cls.get_authenticator(config), + } + + def streams(self, config: Mapping[str, Any]) -> List[Stream]: + """Returns relevant a list of available streams + :param config: A Mapping of the user input configuration as defined in the connector spec. 
+ """ + args = self.convert_config2stream_args(config) + # sorted in alphabet order + return [ + GroupMemberships(**args), + Groups(**args), + Macros(**args), + Organizations(**args), + SatisfactionRatings(**args), + SlaPolicies(**args), + Tags(**args), + TicketAudits(**args), + TicketComments(**args), + TicketFields(**args), + TicketForms(**args), + TicketMetrics(**args), + Tickets(**args), + Users(**args), + ] diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/spec.json b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/spec.json new file mode 100644 index 000000000000..20f4af4f65e3 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/spec.json @@ -0,0 +1,46 @@ +{ + "documentationUrl": "https://docs.airbyte.io/integrations/sources/zendesk-support", + "connectionSpecification": { + "$schema": "http://json-schema.org/draft-07/schema#", + "title": "Source Zendesk Support Spec", + "type": "object", + "required": ["start_date", "subdomain", "auth_method"], + "additionalProperties": false, + "properties": { + "start_date": { + "type": "string", + "description": "The date from which you'd like to replicate data for Zendesk Support API, in the format YYYY-MM-DDT00:00:00Z. All data generated after this date will be replicated.", + "examples": ["2020-10-15T00:00:00Z"], + "pattern": "^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}Z$" + }, + "subdomain": { + "type": "string", + "description": "The subdomain for your Zendesk Support" + }, + "auth_method": { + "title": "ZenDesk Authorization Method", + "type": "object", + "description": "Zendesk service provides 2 auth method: API token and oAuth2. Now only the first one is available. 
Another one will be added in the future", + "oneOf": [ + { + "title": "API Token", + "type": "object", + "required": ["email", "api_token"], + "additionalProperties": false, + "properties": { + "email": { + "type": "string", + "description": "The user email for your Zendesk account" + }, + "api_token": { + "type": "string", + "description": "The value of the API token generated. See the docs for more information", + "airbyte_secret": true + } + } + } + ] + } + } + } +} diff --git a/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/streams.py b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/streams.py new file mode 100644 index 000000000000..8eb0da8623d1 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/source_zendesk_support/streams.py @@ -0,0 +1,478 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. 
+# + + +import calendar +import time +from abc import ABC, abstractmethod +from datetime import datetime +from typing import Any, Iterable, List, Mapping, MutableMapping, Optional +from urllib.parse import parse_qsl, urlparse + +import pytz +import requests +from airbyte_cdk.models import SyncMode +from airbyte_cdk.sources.streams.http import HttpStream + +DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%SZ" + + +class SourceZendeskException(Exception): + """Default exception for custom SourceZendesk logic""" + + +class SourceZendeskSupportStream(HttpStream, ABC): + """Basic Zendesk class""" + + primary_key = "id" + + page_size = 100 + created_at_field = "created_at" + updated_at_field = "updated_at" + + def __init__(self, subdomain: str, **kwargs): + super().__init__(**kwargs) + + # custom value used to generate the Zendesk domain + self._subdomain = subdomain + + @property + def url_base(self) -> str: + return f"https://{self._subdomain}.zendesk.com/api/v2/" + + @staticmethod + def _parse_next_page_number(response: requests.Response) -> Optional[int]: + """Parses a response and tries to find the next page number""" + next_page = response.json()["next_page"] + if next_page: + return dict(parse_qsl(urlparse(next_page).query)).get("page") + return None + + def backoff_time(self, response: requests.Response) -> int: + """ + The rate limit is 700 requests per minute. + See https://developer.zendesk.com/api-reference/ticketing/account-configuration/usage_limits/#monitoring-your-request-activity + The response has a Retry-After header that tells you for how many seconds to wait before retrying.
+ """ + retry_after = response.headers.get("Retry-After") + if retry_after: + return int(retry_after) + # the header X-Rate-Limit returns a amount of requests per minute + # we try to wait twice as long + rate_limit = float(response.headers.get("X-Rate-Limit") or 0) + if rate_limit: + return (60.0 / rate_limit) * 2 + # default value if there is not any headers + return 60 + + @staticmethod + def str2datetime(str_dt: str) -> datetime: + """convert string to datetime object + Input example: '2021-07-22T06:55:55Z' FROMAT : "%Y-%m-%dT%H:%M:%SZ" + """ + if not str_dt: + return None + return datetime.strptime(str_dt, DATETIME_FORMAT) + + @staticmethod + def datetime2str(dt: datetime) -> str: + """convert datetime object to string + Output example: '2021-07-22T06:55:55Z' FROMAT : "%Y-%m-%dT%H:%M:%SZ" + """ + return datetime.strftime(dt.replace(tzinfo=pytz.UTC), DATETIME_FORMAT) + + +class UserSettingsStream(SourceZendeskSupportStream): + """Stream for checking of a request token and permissions""" + + def path(self, *args, **kwargs) -> str: + return "account/settings.json" + + def next_page_token(self, *args, **kwargs) -> Optional[Mapping[str, Any]]: + # this data without listing + return None + + def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]: + """returns data from API""" + settings = response.json().get("settings") + if settings: + yield settings + + def get_settings(self) -> Mapping[str, Any]: + for resp in self.read_records(SyncMode.full_refresh): + return resp + raise SourceZendeskException("not found settings") + + +class IncrementalEntityStream(SourceZendeskSupportStream, ABC): + """Stream for endpoints where an entity name can be used in a path value + https://.zendesk.com/api/v2/.json + """ + + # default sorted field + cursor_field = SourceZendeskSupportStream.updated_at_field + + # for partial cases when JSON root name of responses is not equal a name value + response_list_name: str = None + + def __init__(self, start_date: 
 str, **kwargs): + super().__init__(**kwargs) + # add the custom value for skipping irrelevant records + self._start_date = self.str2datetime(start_date) if isinstance(start_date, str) else start_date + + def path(self, **kwargs) -> str: + return f"{self.name}.json" + + def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]: + """returns a list of records""" + # filter by start date + for record in response.json().get(self.response_list_name or self.name) or []: + if record.get(self.created_at_field) and self.str2datetime(record[self.created_at_field]) < self._start_date: + continue + yield record + yield from [] + + def get_updated_state(self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]) -> Mapping[str, Any]: + # try to save the maximum value of the cursor field + return { + self.cursor_field: max( + str((latest_record or {}).get(self.cursor_field, "")), str((current_stream_state or {}).get(self.cursor_field, "")) + ) + } + + +class IncrementalExportStream(IncrementalEntityStream, ABC): + """Use the incremental export API to get items that changed or + were created in Zendesk Support since the last request + See: https://developer.zendesk.com/api-reference/ticketing/ticket-management/incremental_exports/ + + You can make up to 10 requests per minute to these endpoints. + """ + + # maximum of 1,000 + page_size = 1000 + + # try to save the state after every 100 records + # this endpoint provides responses in ascending order.
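The `parse_response` filter above drops every record whose `created_at` predates the configured start date. A minimal standalone sketch of that filter, outside the patch (the function name and sample records are hypothetical):

```python
from datetime import datetime

DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%SZ"


def filter_by_start_date(records, start_date):
    # skip records whose created_at predates the configured start date;
    # records without created_at are kept, mirroring the connector's check
    for record in records:
        created_at = record.get("created_at")
        if created_at and datetime.strptime(created_at, DATETIME_FORMAT) < start_date:
            continue
        yield record


records = [
    {"id": 1, "created_at": "2020-05-01T10:00:00Z"},
    {"id": 2, "created_at": "2021-08-01T10:00:00Z"},
]
kept = list(filter_by_start_date(records, datetime(2021, 1, 1)))
print([r["id"] for r in kept])  # [2]
```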
+ state_checkpoint_interval = 100 + + @staticmethod + def str2unixtime(str_dt: str) -> int: + """convert a string to a Unix time number + Input example: '2021-07-22T06:55:55Z' FORMAT: "%Y-%m-%dT%H:%M:%SZ" + Output example: 1626936955 + """ + if not str_dt: + return None + dt = datetime.strptime(str_dt, DATETIME_FORMAT) + return calendar.timegm(dt.utctimetuple()) + + def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]: + data = response.json() + if data["end_of_stream"]: + # true if the current request has returned all the results up to the current time; false otherwise + return None + return {"start_time": data["end_time"]} + + def path(self, *args, **kwargs) -> str: + return f"incremental/{self.name}.json" + + def request_params( + self, stream_state: Mapping[str, Any] = None, next_page_token: Mapping[str, Any] = None, **kwargs + ) -> MutableMapping[str, Any]: + + params = {"per_page": self.page_size} + if not next_page_token: + # try to fetch all records with generated_timestamp > start_time + current_state = stream_state.get(self.cursor_field) + if current_state and isinstance(current_state, str) and not current_state.isdigit(): + # convert a saved string state to the Unix time format + current_state = self.str2unixtime(current_state) + start_time = int(current_state or time.mktime(self._start_date.timetuple())) + 1 + # +1 because the API returns all records where generated_timestamp >= start_time + + now = calendar.timegm(datetime.now().utctimetuple()) + if start_time > now - 60: + # start_time must be more than 60 seconds ago + start_time = now - 61 + params["start_time"] = start_time + + else: + params.update(next_page_token) + return params + + +class IncrementalUnsortedStream(IncrementalEntityStream, ABC): + """Stream for loading without sorting + + Some endpoints don't provide a way to filter data, + so we load all records and select only the updated ones + """ + + def __init__(self, **kwargs): + super().__init__(**kwargs) +
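The `request_params` logic above derives `start_time` for the incremental export API: an ISO-string state is converted to Unix time, one second is added (because the endpoint returns records with `generated_timestamp >= start_time`), and the value is clamped to at least 61 seconds in the past. A standalone sketch under those same rules (the function name is hypothetical, and timestamps are interpreted as UTC, as `calendar.timegm` does):

```python
import calendar
from datetime import datetime

DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%SZ"


def export_start_time(state_value: str) -> int:
    # convert an ISO-8601 state value to Unix time and add one second,
    # because the endpoint returns records where generated_timestamp >= start_time
    dt = datetime.strptime(state_value, DATETIME_FORMAT)
    start_time = calendar.timegm(dt.utctimetuple()) + 1
    # Zendesk rejects start_time values newer than 60 seconds ago
    now = calendar.timegm(datetime.utcnow().utctimetuple())
    if start_time > now - 60:
        start_time = now - 61
    return start_time


print(export_start_time("2021-07-22T06:55:55Z"))  # 1626936956
```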
# flag marking that the loading process has finished + self._finished = False + # for saving the most recent cursor date + self._max_cursor_date = None + + def _get_stream_date(self, stream_state: Mapping[str, Any], **kwargs) -> datetime: + """Returns the comparison date; subclasses can override it""" + return self.str2datetime((stream_state or {}).get(self.cursor_field)) + + def parse_response(self, response: requests.Response, stream_state: Mapping[str, Any], **kwargs) -> Iterable[Mapping]: + """try to select relevant data only""" + + if not self.cursor_field: + yield from super().parse_response(response, stream_state=stream_state, **kwargs) + else: + send_cnt = 0 + cursor_date = self._get_stream_date(stream_state, **kwargs) + + for record in super().parse_response(response, stream_state=stream_state, **kwargs): + updated = self.str2datetime(record[self.cursor_field]) + if not self._max_cursor_date or self._max_cursor_date < updated: + self._max_cursor_date = updated + if not cursor_date or updated > cursor_date: + send_cnt += 1 + yield record + if not send_cnt: + self._finished = True + yield from [] + + def get_updated_state(self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]) -> Mapping[str, Any]: + + max_updated_at = self.datetime2str(self._max_cursor_date) if self._max_cursor_date else "" + return {self.cursor_field: max(max_updated_at, (current_stream_state or {}).get(self.cursor_field, ""))} + + @property + def is_finished(self): + return self._finished + + @abstractmethod + def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]: + """can be different for each case""" + + +class IncrementalUnsortedPageStream(IncrementalUnsortedStream, ABC): + """Stream for loading without sorting but with pagination + This logic can be used for small datasets that are loaded quickly + """ + + def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]: + next_page = self._parse_next_page_number(response) +
if not next_page: + self._finished = True + return None + return next_page + + def request_params(self, next_page_token: Mapping[str, Any] = None, **kwargs) -> MutableMapping[str, Any]: + params = super().request_params(next_page_token=next_page_token, **kwargs) + params["page"] = next_page_token or 1 + return params + + +class FullRefreshStream(IncrementalUnsortedPageStream, ABC): + """Stream for endpoints that don't have any created_at or updated_at fields""" + + # reset to default value + cursor_field = SourceZendeskSupportStream.cursor_field + + +class IncrementalSortedCursorStream(IncrementalUnsortedStream, ABC): + """Stream for loading sorted data with cursor-based pagination""" + + def request_params(self, next_page_token: Mapping[str, Any] = None, **kwargs) -> MutableMapping[str, Any]: + params = super().request_params(next_page_token=next_page_token, **kwargs) + params.update({"sort_by": self.cursor_field, "sort_order": "desc", "limit": self.page_size}) + + if next_page_token: + params["cursor"] = next_page_token + return params + + def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]: + if self.is_finished: + return None + return response.json().get("before_cursor") + + +class IncrementalSortedPageStream(IncrementalUnsortedPageStream, ABC): + """Stream for loading sorted data with page-number pagination""" + + def request_params(self, **kwargs) -> MutableMapping[str, Any]: + params = super().request_params(**kwargs) + if params: + params.update({"sort_by": self.cursor_field, "sort_order": "desc", "limit": self.page_size}) + return params + + +class TicketComments(IncrementalSortedPageStream): + """TicketComments stream: https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_comments/ + Zendesk doesn't provide an API for loading all comments via one direct endpoint. 
+ Thus we first load all updated tickets and then try to load all created/updated + comments for every ticket""" + + response_list_name = "comments" + cursor_field = IncrementalSortedPageStream.created_at_field + + def path(self, stream_state: Mapping[str, Any] = None, stream_slice: Mapping[str, Any] = None, **kwargs) -> str: + ticket_id = stream_slice["id"] + return f"tickets/{ticket_id}/comments.json" + + def stream_slices( + self, sync_mode, cursor_field: List[str] = None, stream_state: Mapping[str, Any] = None + ) -> Iterable[Optional[Mapping[str, Any]]]: + """Loads all tickets updated after the last stream state""" + stream_state = stream_state or {} + # convert a comment state value to a ticket one + # Comment state: {"created_at": "2021-07-30T12:30:09Z"} => Ticket state {"generated_timestamp": 1627637409} + ticket_stream_value = Tickets.str2unixtime(stream_state.get(self.cursor_field)) + + tickets = Tickets(self._start_date, subdomain=self._subdomain, authenticator=self.authenticator).read_records( + sync_mode=sync_mode, cursor_field=cursor_field, stream_state={Tickets.cursor_field: ticket_stream_value} + ) + stream_state_dt = self.str2datetime(stream_state.get(self.cursor_field)) + + # select all tickets that have at least one comment + ticket_ids = [ + { + "id": ticket["id"], + "start_stream_state": stream_state_dt, + Tickets.cursor_field: ticket[Tickets.cursor_field], + } + for ticket in tickets + if ticket["comment_count"] + ] + self.logger.info(f"Found {len(ticket_ids)} updated ticket(s) with comments") + # sort slices by generated_timestamp + ticket_ids.sort(key=lambda ticket: ticket[Tickets.cursor_field]) + return ticket_ids + + def _get_stream_date(self, stream_state: Mapping[str, Any], stream_slice: Mapping[str, Any], **kwargs) -> datetime: + """For each ticket, all comments must be compared with the start value of the stream state""" + return stream_slice["start_stream_state"] + + +# NOTE: all Zendesk endpoints can be split into several 
patterns of data loading. +# 1) with API built-in incremental approach +# 2) pagination and sorting mechanism +# 3) cursor pagination and sorting mechanism +# 4) without sorting but with pagination +# 5) without created_at/updated_at fields + +# endpoints that provide a built-in incremental approach +class Users(IncrementalExportStream): + """Users stream: https://developer.zendesk.com/api-reference/ticketing/ticket-management/incremental_exports/""" + + +class Organizations(IncrementalExportStream): + """Organizations stream: https://developer.zendesk.com/api-reference/ticketing/ticket-management/incremental_exports/""" + + +class Tickets(IncrementalExportStream): + """Tickets stream: https://developer.zendesk.com/api-reference/ticketing/ticket-management/incremental_exports/""" + + # The API compares the start_time with the ticket's generated_timestamp value, not its updated_at value. + # The generated_timestamp value is updated for all entity updates, including system updates. + # If a system update occurs after an event, the unchanged updated_at time will become earlier relative to the updated generated_timestamp time. 
+ cursor_field = "generated_timestamp" + + def get_updated_state(self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]) -> Mapping[str, Any]: + """Save the state as an integer""" + state = super().get_updated_state(current_stream_state, latest_record) + if state: + state[self.cursor_field] = int(state[self.cursor_field]) + return state + + def request_params(self, **kwargs) -> MutableMapping[str, Any]: + """Adds the field 'comment_count'""" + params = super().request_params(**kwargs) + params["include"] = "comment_count" + return params + + +# endpoints that provide pagination but don't let us control the response order + + +class Groups(IncrementalUnsortedPageStream): + """Groups stream: https://developer.zendesk.com/api-reference/ticketing/groups/groups/""" + + +class GroupMemberships(IncrementalUnsortedPageStream): + """GroupMemberships stream: https://developer.zendesk.com/api-reference/ticketing/groups/group_memberships/""" + + +class SatisfactionRatings(IncrementalUnsortedPageStream): + """SatisfactionRatings stream: https://developer.zendesk.com/api-reference/ticketing/ticket-management/satisfaction_ratings/""" + + +class TicketFields(IncrementalUnsortedPageStream): + """TicketFields stream: https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_fields/""" + + +class TicketForms(IncrementalUnsortedPageStream): + """TicketForms stream: https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_forms/""" + + +class TicketMetrics(IncrementalUnsortedPageStream): + """TicketMetrics stream: https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_metrics/""" + + +# endpoints that provide a pagination and sorting mechanism + + +class Macros(IncrementalSortedPageStream): + """Macros stream: https://developer.zendesk.com/api-reference/ticketing/business-rules/macros/""" + + +# endpoints that provide cursor pagination and sorting + + +class TicketAudits(IncrementalSortedCursorStream): + """TicketAudits 
stream: https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_audits/""" + + # ticket audits don't have the 'updated_at' field + cursor_field = "created_at" + + # The response root is 'audits'. As a rule, the endpoint name matches the response list name + response_list_name = "audits" + + +# endpoints that don't provide the updated_at/created_at fields, +# so we can't implement incremental logic for them + + +class Tags(FullRefreshStream): + """Tags stream: https://developer.zendesk.com/api-reference/ticketing/ticket-management/tags/""" + + # doesn't have the 'id' field + primary_key = "name" + + +class SlaPolicies(FullRefreshStream): + """SlaPolicies stream: https://developer.zendesk.com/api-reference/ticketing/business-rules/sla_policies/""" + + def path(self, *args, **kwargs) -> str: + return "slas/policies.json" diff --git a/airbyte-integrations/connectors/source-zendesk-support/unit_tests/unit_test.py b/airbyte-integrations/connectors/source-zendesk-support/unit_tests/unit_test.py new file mode 100644 index 000000000000..2fecd7df1ce6 --- /dev/null +++ b/airbyte-integrations/connectors/source-zendesk-support/unit_tests/unit_test.py @@ -0,0 +1,59 @@ +# +# MIT License +# +# Copyright (c) 2020 Airbyte +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. 
+# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. +# + +import json +from unittest import TestCase + +import requests_mock +import timeout_decorator +from airbyte_cdk.sources.streams.http.exceptions import UserDefinedBackoffException +from source_zendesk_support import SourceZendeskSupport +from source_zendesk_support.streams import Tags + +CONFIG_FILE = "secrets/config.json" + + +class TestZendeskSupport(TestCase): + """This test class provides a set of tests for different Zendesk streams. + The Zendesk API has different pagination and sorting mechanisms for its streams. + These tests verify them + """ + + @staticmethod + def prepare_stream_args(): + """Generates stream settings from a config file""" + with open(CONFIG_FILE, "r") as f: + return SourceZendeskSupport.convert_config2stream_args(json.loads(f.read())) + + @timeout_decorator.timeout(10) + def test_backoff(self): + """Zendesk sends the 'Retry-After' header to indicate the required delay. 
+ All streams must handle it""" + timeout = 1 + stream = Tags(**self.prepare_stream_args()) + with requests_mock.Mocker() as m: + url = stream.url_base + stream.path() + m.get(url, text=json.dumps({}), status_code=429, headers={"Retry-After": str(timeout)}) + with self.assertRaises(UserDefinedBackoffException): + list(stream.read_records(sync_mode=None)) diff --git a/docs/integrations/sources/zendesk-support.md b/docs/integrations/sources/zendesk-support.md index 5cd911aaad87..0c155596b615 100644 --- a/docs/integrations/sources/zendesk-support.md +++ b/docs/integrations/sources/zendesk-support.md @@ -5,9 +5,8 @@ The Zendesk Support source supports both Full Refresh and Incremental syncs. You can choose if this connector will copy only the new or updated data, or all rows in the tables and columns you set up for replication, every time a sync is run. This source can sync data for the [Zendesk Support API](https://developer.zendesk.com/rest_api/docs/support). - -This Source Connector is based on a [Singer Tap](https://github.com/singer-io/tap-zendesk). - +This Source Connector is based on the [Airbyte CDK](https://docs.airbyte.io/contributing-to-airbyte/python). +Incremental syncs are implemented on the API side via its filters. ### Output schema This Source is capable of syncing the following core Streams: @@ -27,6 +26,29 @@ This Source is capable of syncing the following core Streams: * [Tags](https://developer.zendesk.com/rest_api/docs/support/tags) * [SLA Policies](https://developer.zendesk.com/rest_api/docs/support/sla_policies) + ### Not implemented schema + These Zendesk endpoints are also available, but syncing them will be implemented in the future. 
+ #### Tickets +* [Ticket Attachments](https://developer.zendesk.com/api-reference/ticketing/tickets/ticket-attachments/) +* [Ticket Requests](https://developer.zendesk.com/api-reference/ticketing/tickets/ticket-requests/) +* [Ticket Metric Events](https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_metric_events/) +* [Ticket Activities](https://developer.zendesk.com/api-reference/ticketing/tickets/activity_stream/) +* [Ticket Skips](https://developer.zendesk.com/api-reference/ticketing/tickets/ticket_skips/) + + #### Help Center +* [Articles](https://developer.zendesk.com/api-reference/help_center/help-center-api/articles/) +* [Article Attachments](https://developer.zendesk.com/api-reference/help_center/help-center-api/article_attachments/) +* [Article Comments](https://developer.zendesk.com/api-reference/help_center/help-center-api/article_comments/) +* [Categories](https://developer.zendesk.com/api-reference/help_center/help-center-api/categories/) +* [Management Permission Groups](https://developer.zendesk.com/api-reference/help_center/help-center-api/permission_groups/) +* [Translations](https://developer.zendesk.com/api-reference/help_center/help-center-api/translations/) +* [Sections](https://developer.zendesk.com/api-reference/help_center/help-center-api/sections/) +* [Topics](https://developer.zendesk.com/api-reference/help_center/help-center-api/topics) +* [Themes](https://developer.zendesk.com/api-reference/help_center/help-center-api/theming) +* [Posts](https://developer.zendesk.com/api-reference/help_center/help-center-api/posts) +* [Post Comments](https://developer.zendesk.com/api-reference/help_center/help-center-api/post_comments/) + ### Data type mapping | Integration Type | Airbyte Type | Notes | @@ -35,13 +57,13 @@ This Source is capable of syncing the following core Streams: | `number` | `number` | | | `array` | `array` | | | `object` | 
`object` | | - ### Features | Feature | Supported?\(Yes/No\) | Notes | | :--- | :--- | :--- | | Full Refresh Sync | Yes | | | Incremental - Append Sync | Yes | | +| Incremental - Deduped + History Sync | Yes | Enabled according to the type of destination | | Namespaces | No | | ### Performance considerations @@ -51,16 +73,23 @@ The connector is restricted by normal Zendesk [requests limitation](https://deve The Zendesk connector should not run into Zendesk API limitations under normal usage. Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see any rate limit issues that are not automatically retried successfully. ## Getting started - ### Requirements +* Zendesk Subdomain +* Auth Method + * API Token + * Zendesk API Token + * Zendesk Email + * OAuth2 (not implemented) -* Zendesk API Token -* Zendesk Email -* Zendesk Subdomain ### Setup guide -Generate a API access token using the [Zendesk support](https://support.zendesk.com/hc/en-us/articles/226022787-Generating-a-new-API-token-) +Generate an API access token using the [Zendesk support](https://support.zendesk.com/hc/en-us/articles/226022787-Generating-a-new-API-token) We recommend creating a restricted, read-only key specifically for Airbyte access. This will allow you to control which resources Airbyte should be able to access. 
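+As a sketch of how the token is then used (assuming Zendesk's standard API-token basic-auth scheme, where the username is `{email}/token` and the password is the API token; the helper name below is hypothetical, not part of the connector):

```python
import base64


def zendesk_auth_header(email: str, api_token: str) -> dict:
    """Build the HTTP Basic auth header for Zendesk's API-token scheme.

    Zendesk expects the basic-auth username to be "<email>/token"
    and the password to be the API token itself.
    """
    creds = f"{email}/token:{api_token}".encode("utf-8")
    return {"Authorization": "Basic " + base64.b64encode(creds).decode("utf-8")}
```

+With `requests`, the same thing can be expressed as `requests.get(url, auth=(f"{email}/token", api_token))`.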
+### CHANGELOG +| Version | Date | Pull Request | Subject | +| :------ | :-------- | :----- | :------ | +| `0.1.0` | 2021-07-21 | [4861](https://github.com/airbytehq/airbyte/pull/4861) | Created CDK-native Zendesk connector | + diff --git a/tools/bin/ci_credentials.sh b/tools/bin/ci_credentials.sh index d6ea6c56d4fd..24f164f53e6e 100755 --- a/tools/bin/ci_credentials.sh +++ b/tools/bin/ci_credentials.sh @@ -102,7 +102,7 @@ write_standard_creds source-typeform "$SOURCE_TYPEFORM_CREDS" write_standard_creds source-us-census "$SOURCE_US_CENSUS_TEST_CREDS" write_standard_creds source-zendesk-chat "$ZENDESK_CHAT_INTEGRATION_TEST_CREDS" write_standard_creds source-zendesk-sunshine "$ZENDESK_SUNSHINE_TEST_CREDS" -write_standard_creds source-zendesk-support-singer "$ZENDESK_SECRETS_CREDS" +write_standard_creds source-zendesk-support "$ZENDESK_SUPPORT_TEST_CREDS" write_standard_creds source-zendesk-talk "$ZENDESK_TALK_TEST_CREDS" write_standard_creds source-zoom-singer "$ZOOM_INTEGRATION_TEST_CREDS"
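The client-side incremental filtering used by `parse_response`/`get_updated_state` in the connector code above can be sketched in isolation (a simplified standalone sketch with plain dict records and an ISO-8601 string cursor; function and field names here are illustrative, not the connector's API):

```python
from datetime import datetime
from typing import Iterable, List, Mapping, Optional, Tuple

ISO = "%Y-%m-%dT%H:%M:%SZ"


def filter_new_records(
    records: Iterable[Mapping], cursor_field: str, state: Optional[str]
) -> Tuple[List[Mapping], Optional[str]]:
    """Keep only records strictly newer than the saved state.

    Mirrors the pattern in the streams above: records at or before the
    state value are dropped, and the largest cursor value seen becomes
    the new state.
    """
    cursor_date = datetime.strptime(state, ISO) if state else None
    max_cursor: Optional[datetime] = None
    out: List[Mapping] = []
    for record in records:
        updated = datetime.strptime(record[cursor_field], ISO)
        # track the maximum cursor across ALL records, even skipped ones
        if max_cursor is None or updated > max_cursor:
            max_cursor = updated
        # emit only records newer than the saved state
        if cursor_date is None or updated > cursor_date:
            out.append(record)
    new_state = max_cursor.strftime(ISO) if max_cursor else state
    return out, new_state
```

Note that ISO-8601 strings in a fixed format compare correctly as plain strings, which is why the connector's `get_updated_state` can use `max()` on the string values directly.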