A Terraform module for provisioning a Google BigQuery sink connector onto an Aiven-managed Kafka Connect cluster. This module depends on the Aiven Kafka init module for access to basic information about Aiven's Kafka Connect cluster.
Access to the Aiven Terraform provider requires an API authentication token, which can be generated from the Aiven console. The token can be supplied as an environment variable with the `TF_VAR_` prefix or in a `.tfvars` file; if you are provisioning from Harness, it can instead come from Harness Secrets Manager.
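For example, a token generated in the Aiven console can be exported before running Terraform. This is a minimal sketch; the variable name `aiven_api_token` is an assumption and must match the variable actually declared in your Terraform configuration:

```shell
# Sketch only: "aiven_api_token" is an assumed variable name -- match it to
# the variable declared in your Terraform configuration.
export TF_VAR_aiven_api_token="<token-from-aiven-console>"

# Terraform now resolves var.aiven_api_token from the environment, e.g.:
#   terraform plan
echo "TF_VAR_aiven_api_token is set: ${TF_VAR_aiven_api_token:+yes}"
```

Keeping the token in an environment variable or a git-ignored `.tfvars` file avoids committing the secret to version control.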
To sink to BigQuery, you need a BigQuery project and a dataset created beforehand, plus a service account with BigQueryEditor access so the connector can create tables inside that dataset.
- When `service_account_id` is provided, each connector adds a key to that service account and passes the key as JSON to the connector for authentication. The key is destroyed along with the connector.
- When `key_file` is provided, the connector creates no new key and uses the provided one instead.
- When both are provided, the first option applies.
- When neither is provided, the connector does not sink any data.
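As an illustration of the two options, a module block might set one of these inputs. The names `service_account_id` and `key_file` come from the text above; the `google_service_account.sink` reference and the key path are hypothetical placeholders:

```hcl
module "bigquery-sink" {
  source = "github.com/entur/terraform-aiven-kafka-connect-bigquery-sink//modules/bigquery-sink?ref=v0.2.1"

  # Option 1: the module adds a key to this service account and hands the
  # key JSON to the connector; the key is destroyed with the connector.
  # ("google_service_account.sink" is a hypothetical resource.)
  service_account_id = google_service_account.sink.id

  # Option 2: supply an existing key instead; no new key is created.
  # If both are set, option 1 wins.
  # key_file = file("service-account-key.json")
}
```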
```hcl
module "bigquery-sink" {
  source = "github.com/entur/terraform-aiven-kafka-connect-bigquery-sink//modules/bigquery-sink?ref=v0.2.1"
  ...
}
```
See the README.md
under module's subfolder for a list of supported inputs and
outputs. For examples showing how they're implemented, check the examples subfolder.
You can control the version of a module dependency by adding `?ref=TAG` at the end of the source argument, as shown in the example above. This is highly recommended. You can find a list of available versions here.
Dependency automation tools such as Renovate Bot will be able to discover new releases and suggest updates automatically.
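As a sketch, a minimal `renovate.json` using Renovate's recommended preset is enough for its built-in Terraform manager to detect `?ref=` tags in module sources and propose version bumps (adjust to your own Renovate setup):

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"]
}
```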