
kubernetes-public: BigQuery Data transfer to k8s-infra-kettle #2747

Merged
merged 1 commit into kubernetes:main on Sep 16, 2021

Conversation

ameukam
Member

@ameukam ameukam commented Sep 16, 2021

Part of: #1308
Ref: #787

Add a service account with workload identity to ensure the k8s service
account kettle can push data to the BQ dataset k8s-infra-kettle.

Add a BQ data transfer job to copy data from k8s-gubernator:build to the
dataset k8s-infra-kettle. The job is not periodically triggered.

Add a script to auto-deploy kettle on the GKE cluster aaa.

Signed-off-by: Arnaud Meukam ameukam@gmail.com
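
For context, a minimal Terraform sketch of the workload identity binding described above. This is illustrative only: the PR provisions the account through a shared module (module.aaa_kettle_sa), and the Kubernetes namespace and service account names below are assumptions, not taken from this change.

  # GCP service account that the in-cluster kettle workload will impersonate.
  resource "google_service_account" "kettle" {
    project      = "kubernetes-public"
    account_id   = "kettle"
    display_name = "kettle"
  }

  # Workload Identity: let the Kubernetes service account (assumed here to be
  # kettle/kettle) act as the GCP service account above.
  resource "google_service_account_iam_member" "kettle_workload_identity" {
    service_account_id = google_service_account.kettle.name
    role               = "roles/iam.workloadIdentityUser"
    member             = "serviceAccount:kubernetes-public.svc.id.goog[kettle/kettle]"
  }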

@k8s-ci-robot k8s-ci-robot added cncf-cla: yes Indicates the PR's author has signed the CNCF CLA. size/M Denotes a PR that changes 30-99 lines, ignoring generated files. labels Sep 16, 2021
@ameukam
Member Author

ameukam commented Sep 16, 2021

cc @spiffxp

@k8s-ci-robot k8s-ci-robot added area/terraform Terraform modules, testing them, writing more of them, code in infra/gcp/clusters/ approved Indicates a PR has been approved by an approver from all required OWNERS files. wg/k8s-infra labels Sep 16, 2021
@k8s-ci-robot k8s-ci-robot added size/L Denotes a PR that changes 100-499 lines, ignoring generated files. area/apps Application management, code in apps/ and removed size/M Denotes a PR that changes 30-99 lines, ignoring generated files. labels Sep 16, 2021
Member

@spiffxp spiffxp left a comment


/approve
/lgtm

@k8s-ci-robot k8s-ci-robot added the lgtm "Looks good to me", indicates that a PR is ready to be merged. label Sep 16, 2021
@k8s-ci-robot
Contributor

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: ameukam, spiffxp

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@k8s-ci-robot k8s-ci-robot merged commit 2002eff into kubernetes:main Sep 16, 2021
@k8s-ci-robot k8s-ci-robot added this to the v1.23 milestone Sep 16, 2021
@ameukam
Member Author

ameukam commented Sep 17, 2021

This is partially deployed:

module.aaa_kettle_sa.google_service_account.serviceaccount: Creating...
google_service_account.bq_kettle_data_transfer_writer: Creating...
module.aaa_kettle_sa.google_service_account.serviceaccount: Creation complete after 2s [id=projects/kubernetes-public/serviceAccounts/kettle@kubernetes-public.iam.gserviceaccount.com]
google_service_account.bq_kettle_data_transfer_writer: Creation complete after 2s [id=projects/kubernetes-public/serviceAccounts/bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com]
data.google_iam_policy.prod_kettle_dataset_iam_policy: Reading... [id=3302437639]
module.aaa_kettle_sa.google_service_account_iam_policy.serviceaccount_iam: Creating...
google_project_iam_member.bq_kettle_data_transfer_jobuser_binding: Creating...
google_bigquery_data_transfer_config.bq_data_transfer_kettle: Creating...
data.google_iam_policy.prod_kettle_dataset_iam_policy: Read complete after 0s [id=551688621]
google_bigquery_dataset_iam_policy.prod_kettle_dataset: Modifying... [id=projects/kubernetes-public/datasets/k8s_infra_kettle]
module.aaa_kettle_sa.google_service_account_iam_policy.serviceaccount_iam: Creation complete after 1s [id=projects/kubernetes-public/serviceAccounts/kettle@kubernetes-public.iam.gserviceaccount.com]
google_bigquery_dataset_iam_policy.prod_kettle_dataset: Modifications complete after 1s [id=projects/kubernetes-public/datasets/k8s_infra_kettle]
google_project_iam_member.bq_kettle_data_transfer_jobuser_binding: Creation complete after 9s [id=kubernetes-public/roles/bigquery.jobUser/serviceAccount:bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com]
╷
│ Error: Error creating Config: googleapi: Error 403: BigQuery Data Transfer service has not been used in project 127754664067 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/bigquerydatatransfer.googleapis.com/overview?project=127754664067 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
│ 
│   with google_bigquery_data_transfer_config.bq_data_transfer_kettle,
│   on k8s-kettle.tf line 84, in resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle":
│   84: resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle" {

ameukam added a commit to ameukam/k8s.io that referenced this pull request Sep 17, 2021
Part of : kubernetes#1308
Ref: kubernetes#787
Followup of : kubernetes#2747

Enable GCP required service for data transfer

Signed-off-by: Arnaud Meukam <ameukam@gmail.com>
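
In other words, the fix is to enable the BigQuery Data Transfer API on the kubernetes-public project before the transfer config is created. A minimal Terraform sketch of that, assuming a standalone google_project_service resource (the actual change in #2749 may instead add the service to an existing services list):

  # Enable the API that google_bigquery_data_transfer_config depends on.
  resource "google_project_service" "bigquerydatatransfer" {
    project = "kubernetes-public"
    service = "bigquerydatatransfer.googleapis.com"
  }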
@ameukam
Member Author

ameukam commented Sep 17, 2021

Opened and deployed #2749

Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
  + create

Terraform will perform the following actions:

  # google_bigquery_data_transfer_config.bq_data_transfer_kettle will be created
  + resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle" {
      + data_source_id         = "cross_region_copy"
      + destination_dataset_id = "k8s_infra_kettle"
      + disabled               = true
      + display_name           = "BigQuery data transfer to k8s_infra_kettle"
      + id                     = (known after apply)
      + location               = "US"
      + name                   = (known after apply)
      + params                 = {
          + "overwrite_destination_table" = "true"
          + "source_dataset_id"           = "build"
          + "source_project_id"           = "k8s-gubernator"
        }
      + project                = "kubernetes-public"
      + service_account_name   = "bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com"

      + email_preferences {
          + enable_failure_email = false
        }
    }

Plan: 1 to add, 0 to change, 0 to destroy.
google_bigquery_data_transfer_config.bq_data_transfer_kettle: Creating...
google_bigquery_data_transfer_config.bq_data_transfer_kettle: Creation complete after 4s [id=projects/127754664067/locations/us/transferConfigs/617ffed3-0000-274f-bb87-d4f547e654cc]
Releasing state lock. This may take a few moments...
