
About GCS Lifecycle Rules: Adding Unexpected Conditions #14044

Closed
heiyohei opened this issue Mar 20, 2023 · 16 comments · Fixed by GoogleCloudPlatform/magic-modules#9547, hashicorp/terraform-provider-google-beta#6711 or #16683

Comments

@heiyohei

heiyohei commented Mar 20, 2023

The condition "after 0 days since the object was created" is added to the lifecycle rule's conditions even though it is not specified in the configuration. Please advise how to remove this condition.

■ Terraform Version
1.37

■ Terraform Configuration Files

Maximum number of versions per object

lifecycle_rule {
  action {
    type = var.gcs_param_con.type
  }
  condition {
    num_newer_versions = each.value.num_newer_versions
    with_state         = var.gcs_param_con.with_state_ver
  }
}

Non-Current Version Expiration Date

lifecycle_rule {
  action {
    type = var.gcs_param_con.type
  }
  condition {
    days_since_noncurrent_time = each.value.days_since_noncurrent_time
  }
}

Affected Resource(s)

google_storage_bucket

@heiyohei heiyohei added the bug label Mar 20, 2023
@edwardmedia edwardmedia self-assigned this Mar 21, 2023
@edwardmedia
Contributor

@heiyohei which resource are you referring to? Can you share a config that demonstrates your question?

@heiyohei
Author

@edwardmedia
The resource is at the following URL
https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/storage_bucket
As a beginner, let me ask an additional question: what exactly do you mean by config information?

@edwardmedia
Contributor

@heiyohei does the config below work for you? Here is the doc regarding the condition field. What problem are you facing?

resource "google_storage_bucket" "auto-expire" {
  name          = "auto-expiring-bucket"
  location      = "US"
  force_destroy = true

  lifecycle_rule {
    condition {
      age = 3
    }
    action {
      type = "Delete"
    }
  }

  lifecycle_rule {
    condition {
      age = 1
    }
    action {
      type = "AbortIncompleteMultipartUpload"
    }
  }
}

@zenit89

zenit89 commented Apr 7, 2023

I have the same issue. The below configuration creates the lifecycle rule with an additional and unwanted age = 0 condition.

resource "google_storage_bucket" "bucket" {
  name          = "auto-expiring-bucket"
  location      = "US"
  force_destroy = true

  lifecycle_rule {
    condition {
      num_newer_versions = 8
    }
    action {
      type = "Delete"
    }
  }
}

@eraac
Contributor

eraac commented Apr 12, 2023

Same here; my configuration:

  lifecycle_rule {
    action {
      type = "Delete"
    }

    condition {
      days_since_noncurrent_time = 30
      num_newer_versions         = 2
    }
  }

The rule from the console
[screenshot]

@edwardmedia
Contributor

@zenit89 @eraac I see what you presented. But I guess 0+ was added by the API, and the provider does not have much control over it. After apply, I don't see a permadiff. Do you see anything wrong from the terraform perspective?

Also, are you able to create buckets with the lifecycle using other methods that avoid the 0+?

@eraac
Contributor

eraac commented Apr 16, 2023

Also, are you able to create buckets with the lifecycle using other methods that avoid the 0+?

Yes, from the console the 0+ isn't added
[screenshot]

I don't think the 0+ can lead to deleting files that the other 2 conditions wouldn't delete; it's just confusing.

@edwardmedia
Contributor

edwardmedia commented Apr 17, 2023

I see where the problem is. This is a bug in the provider. The provider sends 0 for age when it is not set.

Related to #12917
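
To make the symptom concrete, here is a minimal, hedged sketch (my reconstruction, not provider code) of what the difference looks like on the wire, using the google.golang.org/api/storage/v1 types the provider ultimately talks to; the "buggy" struct below only illustrates the behaviour described in this thread, not the exact request the provider builds.

package main

import (
	"encoding/json"
	"fmt"

	"google.golang.org/api/googleapi"
	storage "google.golang.org/api/storage/v1"
)

func main() {
	// What zenit89's config above asks for: delete once 8 newer versions exist.
	intended := &storage.BucketLifecycleRuleCondition{NumNewerVersions: 8}

	// Reconstruction of what the provider effectively sends when it also
	// populates the unset age with its zero value (Age is *int64 in this
	// client, so a non-nil pointer to 0 means "explicitly 0", not "unset").
	buggy := &storage.BucketLifecycleRuleCondition{
		Age:              googleapi.Int64(0),
		NumNewerVersions: 8,
	}

	a, _ := json.Marshal(intended)
	b, _ := json.Marshal(buggy)
	fmt.Println(string(a)) // {"numNewerVersions":8}
	fmt.Println(string(b)) // {"age":0,"numNewerVersions":8}  <- the extra "0 days since created" condition
}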

@SarahFrench
Member

SarahFrench commented Apr 18, 2023

Here are some notes from when I investigated.

Config I used:

resource "google_storage_bucket" "test" {
  name                     = "test-issue-14044"
  location                 = "US"
  force_destroy            = true
  public_access_prevention = "enforced"

  lifecycle_rule {
    condition {
      with_state = "ARCHIVED"
    }
    action {
      type = "Delete"
    }
  }
}

When creating the bucket using the local backend for Terraform I see the age value is saved as null because it's not set. Here's the state for the bucket:

{
    ... omitted some stuff ...
            "lifecycle_rule": [
              {
                "action": [
                  {
                    "storage_class": "",
                    "type": "Delete"
                  }
                ],
                "condition": [
                  {
                    "age": null,
                    "created_before": "",
                    "custom_time_before": "",
                    "days_since_custom_time": null,
                    "days_since_noncurrent_time": null,
                    "matches_prefix": [],
                    "matches_storage_class": [],
                    "matches_suffix": [],
                    "noncurrent_time_before": "",
                    "num_newer_versions": null,
                    "with_state": "ARCHIVED"
                  }
                ]
              }
            ],
}

The issue appears to be at this point:

conditions := v.(*schema.Set).List()

This creates a value that sets age to 0.

[]interface {}{map[string]interface {}{"age":0, "created_before":"", "custom_time_before":"", "days_since_custom_time":0, "days_since_noncurrent_time":0, "matches_prefix":[]interface {}{}, "matches_storage_class":[]interface {}{}, "matches_suffix":[]interface {}{}, "noncurrent_time_before":"", "num_newer_versions":0, "with_state":"ARCHIVED"}}

This function is invoked by the create and update functions for the resource.
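
As a hedged illustration of that finding (a standalone sketch, not the provider's actual expand code): once the set element has been flattened into a plain map, age is always present as an int, so copying it straight into the client's *int64 field turns "unset" into an explicit 0. expandCondition below is a hypothetical, trimmed-down stand-in for expandStorageBucketLifecycleRuleCondition, limited to the two fields relevant here.

package main

import (
	"fmt"

	"google.golang.org/api/googleapi"
	storage "google.golang.org/api/storage/v1"
)

// expandCondition is a hypothetical stand-in for the provider's expand helper:
// it copies the flattened map into the API struct without asking whether "age"
// was ever configured.
func expandCondition(condition map[string]interface{}) *storage.BucketLifecycleRuleCondition {
	c := &storage.BucketLifecycleRuleCondition{}
	// "age" is always present in the flattened set element (zero value 0 when
	// unset), so this assignment cannot tell "unset" apart from an explicit 0.
	c.Age = googleapi.Int64(int64(condition["age"].(int)))
	if ws, ok := condition["with_state"].(string); ok && ws == "ARCHIVED" {
		c.IsLive = googleapi.Bool(false)
	}
	return c
}

func main() {
	// The flattened value printed above, trimmed to the fields used here.
	condition := map[string]interface{}{"age": 0, "with_state": "ARCHIVED"}
	c := expandCondition(condition)
	fmt.Printf("Age pointer nil? %v, value: %d\n", c.Age == nil, *c.Age) // false, 0
}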

@edwardmedia
Contributor

b/278745929

@kustodian

We have this same issue. Does anyone know whether the extra condition makes a difference to the rule? A lifecycle rule that reads like this:

[screenshot]

looks scary, and I don't know if it works as expected.

@zouyc1990

It hasn't been fixed yet.

@thomasmaclean

I've created a draft PR that uses a failing test to confirm the problem exists, but I can't find a way to solve it. Re: an earlier comment from @SarahFrench:

The issue appears to be at this point:

conditions := v.(*schema.Set).List()

This creates a value that sets age to 0.

It looks like the rule condition comes into the referenced function with that value already set, which you can confirm by adding the line log.Printf("(%v, %T)\n", v, v) after the initial nil check in expandStorageBucketLifecycleRuleCondition().

So far my testing has confirmed that you can't unset the Age field on update or leave it unset during create; any attempt to do so results in the field being set to 0.

Notably, days_since_custom_time also comes into expandStorageBucketLifecycleRuleCondition() with a value of 0, but it doesn't have the same issue as the age field, i.e. it isn't automatically included in the API request with a zero value. My best guess for why comes down to Age being *int64 rather than int64. One option would be to always treat an Age of 0 as nil, but unfortunately an age of 0 appears to be considered valid, so this isn't tenable.

This issue will have to be solved higher up in the stack than I'm immediately familiar with, as it appears that the current data handling model isn't set up to work with *int64 fields that could be nil.
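
Purely as an illustration of the direction this points in (a hedged sketch; the fix referenced in the linked PRs may have taken a different route), the three states only stay distinguishable if something other than the int itself records whether age was configured. Here that something is a hypothetical no_age marker in the flattened map.

package main

import (
	"encoding/json"
	"fmt"

	"google.golang.org/api/googleapi"
	storage "google.golang.org/api/storage/v1"
)

// expandAge populates the *int64 only when the hypothetical "no_age" marker
// says the user really configured age, so "unset", "explicitly 0" and
// "explicitly N" all survive the trip into the API struct.
func expandAge(condition map[string]interface{}) *storage.BucketLifecycleRuleCondition {
	c := &storage.BucketLifecycleRuleCondition{}
	if noAge, _ := condition["no_age"].(bool); !noAge {
		c.Age = googleapi.Int64(int64(condition["age"].(int)))
	}
	return c
}

func main() {
	for _, cond := range []map[string]interface{}{
		{"age": 0, "no_age": true},  // age never configured: keep the pointer nil
		{"age": 0, "no_age": false}, // age explicitly 0: still a valid condition
		{"age": 3, "no_age": false}, // regular case
	} {
		b, _ := json.Marshal(expandAge(cond))
		fmt.Println(string(b)) // {}, {"age":0}, {"age":3}
	}
}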

@kustodian

As far as I can see, this issue should be fixed in provider version 5.9.0 (see the last item in the release notes).

I just set the provider version to 5.9.0, deleted one lifecycle rule, and ran terraform apply to recreate it, and I still see the "0+ days since object was created" condition:

[screenshot]

So this issue is not fixed and should be reopened.

@somethingnew2-0

somethingnew2-0 commented Dec 13, 2023

Can confirm: the issue is still not fixed in 5.9.0. Please re-open this issue.

github-actions (bot)

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jan 13, 2024
@hashicorp hashicorp unlocked this conversation Apr 1, 2024
@hashicorp hashicorp locked and limited conversation to collaborators Apr 1, 2024