
Module doesn't honor specified GKE master version #1460

Closed
kkapoor1987 opened this issue Nov 15, 2022 · 3 comments · Fixed by #1625
Labels
bug Something isn't working

Comments

@kkapoor1987

TL;DR

When both the release channel and the master version are specified, the module ignores the master version, creating a discrepancy between the intended cluster version and what the module actually provisions.

https://github.com/terraform-google-modules/terraform-google-kubernetes-engine/blob/master/modules/private-cluster/cluster.tf#L57
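For context, the version-selection logic at the referenced line behaves roughly as follows. This is a paraphrased sketch, not the module's verbatim source; the exact expression and local names may differ between releases:

```hcl
# Sketch of the behavior in modules/private-cluster/cluster.tf (paraphrased).
# In the real module, local.master_version is resolved from
# var.kubernetes_version and a google_container_engine_versions data source.
resource "google_container_cluster" "primary" {
  # When a release channel is configured, min_master_version is forced to
  # null, so an explicitly pinned kubernetes_version is silently ignored.
  min_master_version = var.release_channel == null ? local.master_version : null
  # ...
}
```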

Expected behavior

If the user specifies a version, the module should either use that version or fail provisioning with an error stating that the version is not supported.

Observed behavior

We use this module to deploy GKE clusters. We use a Stable release channel but control the master version so that we can reliably deploy the same K8s version across our fleet.

Terraform Configuration

module "gke" {
  source                     = "terraform-google-modules/kubernetes-engine/google//modules/private-cluster"
  version                    = "23.3.0"
  project_id                 = var.AccountID
  name                       = var.ClusterName
  kubernetes_version         = local.latest_gke_patched_version
  regional                   = true
  region                     = var.Region
  network                    = var.ClusterName
  subnetwork                 = var.ServiceSubnets[0]
  ip_range_pods              = var.PodRange
  ip_range_services          = var.ClusterServiceRangeName
  horizontal_pod_autoscaling = true
  database_encryption        = local.cluster_encryption_config
  enable_private_endpoint    = false
  enable_private_nodes       = true
  remove_default_node_pool   = true
  network_policy             = true
  monitoring_service         = "none"
  master_ipv4_cidr_block     = "10.0.64.0/28"
  logging_service            = "logging.googleapis.com/kubernetes"
  cluster_resource_labels    = local.tags
  node_pools                 = local.node_pools
  master_authorized_networks = local.master_authorized_networks

  # Release Channel configuration
  release_channel = var.release_channel_version
  maintenance_exclusions = [{
    name            = "minor_upgrade_exclusion"
    start_time      = timestamp()
    end_time        = timeadd(timestamp(), local.end_time_in_hrs)
    exclusion_scope = "NO_MINOR_UPGRADES" # Scope should be one of: NO_UPGRADES | NO_MINOR_UPGRADES | NO_MINOR_OR_NODE_UPGRADES
  }]
}
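Until the fix lands, a possible workaround (an untested sketch, assuming the module treats an unset release channel as UNSPECIFIED and then passes the pinned version through as `min_master_version`) is to drop the release channel wherever the control-plane version must be pinned:

```hcl
  # Hypothetical workaround: omit the release channel so the pinned version
  # is honored. This trades channel-managed auto-upgrades for a
  # deterministic control-plane version.
  release_channel    = null
  kubernetes_version = local.latest_gke_patched_version
```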

Terraform Version

1.0.11

Additional information

No response

@kkapoor1987 kkapoor1987 added the bug Something isn't working label Nov 15, 2022
@github-actions

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 7 days

@github-actions github-actions bot added the Stale label Jan 14, 2023
@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Jan 22, 2023
@slimatic
Contributor

Can this be reopened?

@apeabody apeabody reopened this Apr 27, 2023
@apeabody apeabody removed the Stale label Apr 27, 2023
@slimatic
Contributor

slimatic commented May 1, 2023

A new pull request, #1625, has been opened to address this issue.
