🎉 New source: Coda [python cdk] #18675
Conversation
Hello @himanshuc3, Marcos from Airbyte here 👋. We received more than 25 new contributions over the weekend, and one of them is yours 🎉 thank you so much for it! Our team is small, so the review process may take longer than expected. As described in Airbyte's Hacktoberfest, your contribution was submitted before November 2nd and is eligible to win the prize; the review process will validate the other requirements. I ask for your patience until someone from the team reviews it.
Having reviewed several Hacktoberfest contributions so far, I've seen some common patterns you can check in advance:
- Make sure you have added connector documentation to `/docs/integrations/`
- Remove the `catalog` file from `/integration_tests`
- Edit the `sample_config.json` inside `/integration_tests`
- For the `configured_catalog` you can use only `json_schema: {}`
- Add a title to all properties in the `spec.yaml` (see the sketch after this list)
- Make sure the `documentationUrl` in the `spec.yaml` redirects to Airbyte's future connector page, e.g. for the Airtable connector `documentationUrl: https://docs.airbyte.com/integrations/sources/airtable`
- Review the new line at EOF (end-of-file) for all files.
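For reference, a minimal sketch of a `spec.yaml` covering the title and `documentationUrl` points above. This is illustrative only: the `auth_token` property name and its description are assumptions, not the connector's actual spec.

```yaml
# Illustrative sketch only; the property name and description are assumed, not taken from the PR.
documentationUrl: "https://docs.airbyte.com/integrations/sources/coda"
connectionSpecification:
  $schema: "http://json-schema.org/draft-07/schema#"
  title: Coda Spec
  type: object
  required:
    - auth_token
  properties:
    auth_token:
      title: Authentication Token   # every property carries a title
      type: string
      description: Bearer token for the Coda API.
      airbyte_secret: true
```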
If possible, send me a DM in Slack with test credentials; this will make it easier for us to run integration tests and publish your connector. If you only have production keys, make sure to create a `bootstrap.md` explaining how to get the keys.
Hello! I'm going to be out of the office this Friday and won't be able to review your contribution again today; I return next Monday. So far, most contributions look solid and are almost ready to be approved. As said in Chris' comment, all contributions made before November 2nd are eligible to receive the prize, and we have 2 weeks to merge them. But I assure you we're going to have your contribution merged next week. If you have questions about the implementation, you can still send them in the meantime. Sorry for the inconvenience and see you again next week, thank you so much for your contribution!
@marcosmarxm Thanks for the update. I'll make sure most of the items are checked off by then, since I still need to go through the list of items you mentioned.
@himanshuc3 did you have time to check the list?
Apologies, I didn't get the bandwidth to work on this due to university coursework.
Force-pushed from 4062f75 to 797c495
@marcosmarxm Verified and pushed changes mentioned in the checklist.
I'll take a look after today @himanshuc3
Some comments, @himanshuc3: there is some missing code needed to finish the connector. Let me know if you need assistance with it.
# TODO: Define your stream schemas
Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org).

The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it.

The schema of a stream is the return value of `Stream.get_json_schema`.
Remove this file.
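As context for the TODO text above, here is a minimal sketch of defining a stream schema in code via `get_json_schema`. The `Docs` class and its properties are hypothetical, not the PR's actual code; by default the Python CDK instead loads `schemas/<stream_name>.json` for each stream.

```python
from typing import Any, Mapping

from airbyte_cdk.sources.streams.http import HttpStream


class Docs(HttpStream):
    """Hypothetical stream, used only to illustrate schema definition."""

    url_base = "https://coda.io/apis/v1/"  # assumed Coda API base URL
    primary_key = "id"

    # path(), parse_response() and next_page_token() are omitted here for brevity.

    def get_json_schema(self) -> Mapping[str, Any]:
        # Return the schema built in code instead of reading a schemas/docs.json file.
        return {
            "$schema": "http://json-schema.org/draft-07/schema#",
            "type": "object",
            "properties": {
                "id": {"type": "string"},
                "name": {"type": ["null", "string"]},
                "href": {"type": ["null", "string"]},
            },
        }
```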
"streams": [ | ||
{ | ||
"stream": { | ||
"name": "docs", | ||
"json_schema": { | ||
"$schema": "http://json-schema.org/draft-07/schema#", | ||
"type": "object", | ||
"properties": { | ||
"access_key": { | ||
"type": "string" |
Remove this file.
@property
def cursor_field(self) -> str:
    """
    TODO
    Override to return the cursor field used by this stream, e.g. an API entity might always use created_at as the cursor field. This is
    usually id or date based. This field's presence tells the framework this is an incremental stream. Required for incremental.

    :return str: The name of the cursor field.
    """
    return []

def get_updated_state(self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]) -> Mapping[str, Any]:
    """
    Override to determine the latest state after reading the latest record. This typically compares the cursor_field from the latest record and
    the current state and picks the 'most' recent cursor. This is how a stream's state is determined. Required for incremental.
    """
    return {}
This probably won't work for getting an incremental sync.
Haven't yet implemented it; is it required for the creation of a data source? It would probably be better to do it in another PR as an incremental feature addition ;)
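If incremental sync does land in a follow-up PR, here is a minimal sketch of the two overrides from the excerpt above. It assumes Coda records expose an `updatedAt` ISO-8601 timestamp, which is an assumption about the API rather than the PR's actual code.

```python
from typing import Any, Mapping, MutableMapping

from airbyte_cdk.sources.streams.http import HttpStream


class IncrementalDocs(HttpStream):
    """Hypothetical incremental stream, for illustration only."""

    url_base = "https://coda.io/apis/v1/"  # assumed Coda API base URL
    primary_key = "id"

    # path(), parse_response() and next_page_token() are omitted here for brevity.

    @property
    def cursor_field(self) -> str:
        # The record field used to track sync progress between runs (assumed name).
        return "updatedAt"

    def get_updated_state(
        self, current_stream_state: MutableMapping[str, Any], latest_record: Mapping[str, Any]
    ) -> Mapping[str, Any]:
        # Keep whichever cursor value is more recent; ISO-8601 strings compare lexically.
        latest = latest_record.get(self.cursor_field, "")
        current = (current_stream_state or {}).get(self.cursor_field, "")
        return {self.cursor_field: max(latest, current)}
```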
Remove it if it isn't used.
@himanshuc3 please implement the pagination for streams
def request_params(
    self, stream_state: Mapping[str, Any], stream_slice: Mapping[str, any] = None, next_page_token: Mapping[str, Any] = None
) -> MutableMapping[str, Any]:
    """
    TODO: Override this method to define any query parameters to be set. Remove this method if you don't need to define request params.
    Usually contains common params e.g. pagination size etc.
    """
    return {}
The source is lacking pagination, which is required even for full refresh sync mode. Please check the API documentation and implement it.
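A minimal sketch of how pagination could be wired into an Airbyte `HttpStream`. It assumes the Coda list endpoints accept `limit`/`pageToken` query parameters and return `items` plus a `nextPageToken` field; this is a hedged reading of the Coda API, not the PR's actual implementation.

```python
from typing import Any, Iterable, Mapping, MutableMapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class Docs(HttpStream):
    """Hypothetical paginated stream, for illustration only."""

    url_base = "https://coda.io/apis/v1/"  # assumed Coda API base URL
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "docs"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # Assumed: list endpoints return nextPageToken while more pages remain.
        token = response.json().get("nextPageToken")
        return {"pageToken": token} if token else None

    def request_params(
        self, stream_state: Mapping[str, Any], stream_slice: Mapping[str, Any] = None, next_page_token: Mapping[str, Any] = None
    ) -> MutableMapping[str, Any]:
        params: MutableMapping[str, Any] = {"limit": 25}
        if next_page_token:
            params.update(next_page_token)  # carries the pageToken into the next request
        return params

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping[str, Any]]:
        yield from response.json().get("items", [])
```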
@himanshuc3 please also share the output of integration tests.
Force-pushed from 2266b62 to df987b3
/publish connector=connectors/source-coda
If you have connectors that successfully published but failed definition generation, follow step 4 here.
Thanks @himanshuc3
Thanks @marcosmarxm for helping me with all my doubts. Hope to return and contribute further whenever bandwidth is available.
Not sure why the "Add labels" GitHub action is failing. It throws the following error:
Thanks @himanshuc3 for the contribution
Hi all, thank you for this integration!! When will we see it in Airbyte Cloud?
@YowanR can you check this request?
Thanks @marcosmarxm! @apolo-damasco We are working through our backlog to get connectors onto Airbyte Cloud. If you want to see this connector on Cloud sooner, it would help if you could give this comment a 👍. Thanks!
So good news, the connection is already showing up and working!
Ha, give me some time. I'm free after 10 Dec. Will connect post that.
…On Thu, Dec 1, 2022, 3:27 AM Apolodoro de Damasco wrote:
> So good news, the connection is already showing up and working! However, it is lacking the ability to sync the rows of tables, which is the main use case for this integration :( At the moment it is only possible to sync the metadata of your docs... @himanshuc3 any chance you could implement it?
* init coda
* fix: Streams added
* fix: Common review comments
* fix: definition
* fix: Remove sample files
* fix: pagination
* integration test configs
* fix: remove validations
* fix: unit test
* fix: unit test removed
* fix: unit tests added
* linting
* fix: pytest to 6.2.5
* remove columns
* update connector
* add coda to source def
* remove coda from source def
* readd missing file
* rollback change to index.html
* correct some files
* rollback change in cart.com file
* fix sample files
* run format
* auto-bump connector version

Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
@himanshuc3 any news on this?
@apolo-damasco I think we should create a new issue to track what you are asking above. Can you do this and tag me? I'll work on getting it prioritized.
What
The following PR adds a Python CDK-based Coda connector. For more context, please refer to airbytehq/connector-contest#210.
How
Using a Python CDK-based connector for Coda, the following streams have been baked in:
Recommended reading order
1. `spec.yaml`
2. `schemas/` folder

🚨 User Impact 🚨
Not applicable
Pre-merge Checklist
Expand the relevant checklist and delete the others.
New Connector
Community member or Airbyter
- Secrets in the connector's spec are annotated with `airbyte_secret`
- Unit & integration tests added and passing: `./gradlew :airbyte-integrations:connectors:<name>:integrationTest`
- Documentation updated:
  - Connector's `README.md`
  - Connector's `bootstrap.md`. See description and examples
  - `docs/integrations/<source or destination>/<name>.md` including changelog. See changelog example
  - `docs/integrations/README.md`
  - `airbyte-integrations/builds.md`
Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- `/test connector=connectors/<name>` command is passing
- `/publish` command described here

Tests
Unit
Put your unit tests output here.
Integration
Put your integration tests output here.
Acceptance
Put your acceptance tests output here.