
Airbyte feature requests #4037

Closed
dmdmishra opened this issue Jun 10, 2021 · 6 comments

Comments

@dmdmishra

Hi Team,

Does Airbyte provide, or plan to provide, any of the following features/connectors?

  1. An API to start jobs and view job status in Airbyte.
  2. A connector for Denodo, a data virtualization platform.
  3. A Kafka consumer connector.
  4. Data quality or data validation within Airbyte, producing results with good and bad records after validation checks, similar to what Great Expectations does.
  5. CDC support for Oracle and SQL Server, including all DML operations: primary keys, partitions, updates, deletes, and truncates.
  6. Converting input files into desired output formats, e.g. JSON to Avro or JSON to CSV.
  7. Calling a custom API as source and destination connectors.
  8. Data encryption at rest and in transit.
  9. Running transformations within Snowflake, using Snowflake's processing power without moving the data out of it.
  10. Using Spark's MPP power to move historical data from a source to S3.
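For point 1, a minimal sketch of what driving Airbyte over HTTP could look like, using only the Python standard library. The base URL, connection id, and job id are assumptions for illustration; the `/api/v1/connections/sync` and `/api/v1/jobs/get` endpoint paths are from Airbyte's Config API, but check the API reference for your deployed version:

```python
import json
import urllib.request

# Assumed local Airbyte deployment; adjust for your setup.
AIRBYTE_URL = "http://localhost:8000"

def api_request(path: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for the Airbyte Config API (its endpoints are POST)."""
    return urllib.request.Request(
        f"{AIRBYTE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Trigger a sync for a connection (the connection id here is hypothetical).
sync_req = api_request("/api/v1/connections/sync", {"connectionId": "my-connection-id"})

# Fetch the status of a job by id (the job id here is hypothetical).
status_req = api_request("/api/v1/jobs/get", {"id": 42})

# Sending a request (commented out so the sketch runs without a server):
# with urllib.request.urlopen(sync_req) as resp:
#     job = json.load(resp)["job"]
#     print(job["id"], job["status"])
```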
@dmdmishra
Author

Any thoughts?

@marcosmarxm
Member

marcosmarxm commented Jun 17, 2021

@dmdmishra all of these are valid suggestions, and some of them already have specific issues open. Could you open a specific issue for each one so they can be discussed individually? (Please search first to avoid creating duplicate issues.)

@marcosmarxm
Member

For point 3: #2655

@marcosmarxm
Member

marcosmarxm commented Jun 17, 2021

convert input files into desired output files, like JSON to avro or JSON to CSV etc..

We have source connectors for JSON and CSV files, and you can choose whichever destination you prefer. At the moment we support JSON and CSV; you can create a specific issue to request Avro support.
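Outside of a connector, the JSON-to-CSV conversion itself is straightforward. A minimal sketch using only the Python standard library, assuming the input is a JSON array of flat objects with uniform keys (nested objects would need flattening first):

```python
import csv
import io
import json

def json_records_to_csv(json_text: str) -> str:
    """Convert a JSON array of flat objects into CSV text with a header row."""
    records = json.loads(json_text)
    buf = io.StringIO()
    # Take the column order from the first record's keys.
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

print(json_records_to_csv('[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]'))
```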

@marcosmarxm
Member

running transformation within snowflake without moving the data out of it and using the processing power of snowflake.

We recently released custom operators, which let you use dbt with Snowflake to do this. You can read more about it here.

@marcosmarxm
Member

Closing this because it is not a single unit of request and needs to be broken into individual points. @dmdmishra feel free to open them individually.
