
Support Managed Prometheus #1298

Closed
wants to merge 13 commits

Conversation

abousias commented Jun 20, 2022

With the release of the 4.25.0 "beta" provider, it is now possible to enable Managed Prometheus, so we can now expose this feature in the beta submodules.

google-cla bot commented Jun 20, 2022

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

@abousias abousias marked this pull request as ready for review June 21, 2022 13:58
@abousias abousias requested review from a team, Jberlinsky and bharathkkb as code owners June 21, 2022 13:58
abousias (Author) commented:
Resolves #1290 and #1167

bharathkkb (Member) left a comment:

Thanks for the PR @abousias

@@ -95,6 +95,15 @@ resource "google_container_cluster" "primary" {
enable_components = var.monitoring_enabled_components
}
}
dynamic "monitoring_config" {
bharathkkb (Member) commented:
There is an existing dynamic monitoring_config, I think we can nest managed_prometheus as a dynamic under that.
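The nesting suggested here could be sketched roughly as follows. This is only a sketch, with variable and block names taken from the discussion, not the actual template in autogen/main/cluster.tf.tmpl; as later comments in this thread show, this shape runs into trouble when var.monitoring_enabled_components is empty.

```hcl
# Sketch only: nest managed_prometheus as a dynamic inside the
# existing dynamic monitoring_config block.
dynamic "monitoring_config" {
  for_each = length(var.monitoring_enabled_components) > 0 ? [1] : []

  content {
    enable_components = var.monitoring_enabled_components

    # Emit the inner block only when the feature flag is set.
    dynamic "managed_prometheus" {
      for_each = var.enable_managed_prometheus ? [1] : []

      content {
        enabled = var.enable_managed_prometheus
      }
    }
  }
}
```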

abousias (Author) commented:
Thanks for the quick response and the review comment @bharathkkb! I kept it separate so it has its own for_each check: in one case var.monitoring_enabled_components is [] but we still want managed Prometheus enabled. I hadn't thought of nesting another dynamic! I will try that!
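The separate-block variant described here might look something like the sketch below (variable names assumed from the module, not the exact PR code). Each dynamic carries its own for_each, which is the point being made: managed Prometheus can be enabled independently of the components list.

```hcl
# Sketch of the separate-block variant: two dynamics over the same
# block type, each with its own condition, so managed Prometheus can
# be turned on even when var.monitoring_enabled_components is [].
dynamic "monitoring_config" {
  for_each = length(var.monitoring_enabled_components) > 0 ? [1] : []

  content {
    enable_components = var.monitoring_enabled_components
  }
}

dynamic "monitoring_config" {
  for_each = var.enable_managed_prometheus ? [1] : []

  content {
    managed_prometheus {
      enabled = true
    }
  }
}
```

This relies on Terraform merging the generated blocks after the for_each expressions are evaluated, as a later comment in the thread notes.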

abousias (Author) commented:
@bharathkkb the nested dynamic, the way I have it now, doesn't seem to work as expected. With var.enable_managed_prometheus set to true, the service didn't get enabled and the tfstate looks like this:

...
"monitoring_config": [
  {
    "enable_components": [
      "SYSTEM_COMPONENTS"
    ],
    "managed_prometheus": []
  }
],
...

I am looking into it.

abousias (Author) commented Jun 22, 2022:
Interestingly, switching to a static block:

https://github.com/abousias/terraform-google-kubernetes-engine/blob/c5170bb58cab6706377baf24e58cadc28e685205/autogen/main/cluster.tf.tmpl#L91-L100

gives an API error:

[screenshot: API error from the apply]

I am reverting to the initial version, as I don't have a working solution at the moment.

abousias (Author) commented:
I am getting the error with the reverted code too. The issue didn't appear earlier because in my test I was applying the change to an existing cluster; now I am getting it with a brand new cluster.

https://github.com/abousias/terraform-google-kubernetes-engine/blob/c5170bb58cab6706377baf24e58cadc28e685205/autogen/main/cluster.tf.tmpl#L81

I am looking into it.

abousias (Author) commented:
Hey @jroiseux I was about to check exactly the same conditional 😄! I will take a look and copy! Many thanks for the help! Nice to know that it will work 😉!

abousias (Author) commented:
@bharathkkb ready for re-review. It works fine now.

abousias (Author) commented:
⚠️ I discovered that the current version is trying to disable SYSTEM_COMPONENTS; here is the plan against an existing cluster:

[screenshot: plan showing SYSTEM_COMPONENTS being removed]

whereas the separate-block version leaves enable_components untouched:

[screenshot: plan leaving enable_components unchanged]

This is because the dynamic is "forced" to produce the block even though var.monitoring_enabled_components is []. My existing cluster is also based on the main module rather than the beta ones; I am checking whether that is an issue for me.

@jroiseux this is most probably happening with your version, isn't it?

jroiseux (Contributor) commented:
@abousias Yes, it happens on my version too.

This is what the statefile looks like when I use the current "beta-private-cluster" module (without the changes from this PR) and don't set var.monitoring_enabled_components:

[screenshot: statefile, 2022-06-23]

I think it might be an issue with the default value, as enable_components seems to end up as ["SYSTEM_COMPONENTS"] even though the default is [].

abousias (Author) commented:
@bharathkkb I am reverting back to the original implementation, leaving Terraform to merge the blocks into one well after the evaluation of the for_each expressions. As @jroiseux also points out, the other option would be to change the default value of var.monitoring_enabled_components to something like ["SYSTEM_COMPONENTS"], which I am not sure is a good idea.

@abousias abousias requested a review from bharathkkb June 21, 2022 21:50
comment-bot-dev commented:
@abousias
Thanks for the PR! 🚀
✅ Lint checks have passed.

bharathkkb (Member) commented:

Thanks for working on this @abousias and collaborating with @jroiseux
I merged #1307

@bharathkkb bharathkkb closed this Jun 27, 2022
abousias (Author) commented:

Thank you @bharathkkb!

abousias (Author) commented:

@bharathkkb and @jroiseux, was the issue where the SYSTEM_COMPONENTS were getting disabled solved? I see that the code in #1307 is similar to a previous version of this PR. I think this is a blocker!

jroiseux (Contributor) commented:

> was the issue where the SYSTEM_COMPONENTS were getting disabled solved?

Yes, it is no longer trying to disable it, because of enable_components = length(var.monitoring_enabled_components) > 0 ? var.monitoring_enabled_components : null. When the value is null, it doesn't try to modify the existing value.
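The fix described here could look roughly like the following in the cluster resource. This is a sketch based on the expression quoted above, not the exact code merged in #1307.

```hcl
# Sketch of the merged fix: falling back to null means "leave
# enable_components alone", so an empty var.monitoring_enabled_components
# no longer force-disables SYSTEM_COMPONENTS on existing clusters.
monitoring_config {
  enable_components = length(var.monitoring_enabled_components) > 0 ? var.monitoring_enabled_components : null

  managed_prometheus {
    enabled = var.enable_managed_prometheus
  }
}
```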

abousias (Author) commented:

Indeed, sorry I missed the PR description! I double-checked in my scenarios too; it works. Thanks again @jroiseux!

4 participants