
Duplicating Deployments using Datasource #329

Open
phergoualch opened this issue Dec 16, 2024 · 4 comments

Comments

@phergoualch

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
  • Please do not leave +1 or me too comments, they generate extra noise for issue followers and do not help prioritize the request.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Terraform Version

1.9.8

Affected Resource(s)

data.prefect_deployment
resource.prefect_deployment

Terraform Configuration Files

data "prefect_deployment" "existing" {
  id = "9e1eb232-4f15-404d-8a84-3727cc68f584"
}

resource "prefect_deployment" "copy" {
  name                     = "copy"
  description              = "A copy of an existing deployment"
  flow_id                  = data.prefect_deployment.existing.flow_id
  work_pool_name           = data.prefect_deployment.existing.work_pool_name
  work_queue_name          = data.prefect_deployment.existing.work_queue_name
  entrypoint               = data.prefect_deployment.existing.entrypoint
  path                     = data.prefect_deployment.existing.path
  job_variables            = data.prefect_deployment.existing.job_variables
  parameter_openapi_schema = data.prefect_deployment.existing.parameter_openapi_schema
  enforce_parameter_schema = data.prefect_deployment.existing.enforce_parameter_schema
}

Debug Output

Panic Output

https://gist.github.com/phergoualch/a0837a64b6f5246cd75be8b233c5512e

Expected Behavior

I would like to be able to read an existing deployment through a datasource and create a copy of it with a different name and parameters. I was able to do this with a Python script without issues, but I would like to do it with Terraform.
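For example, something like the following is what I have in mind. This is only a sketch: the `parameters` attribute on the resource is an assumption about the provider schema (the datasource does expose a `parameters` output), and the id is a placeholder.

```hcl
# Sketch only: `parameters` on the resource is assumed to exist,
# and the id below is a hypothetical placeholder.
data "prefect_deployment" "template" {
  id = "00000000-0000-0000-0000-000000000000"
}

resource "prefect_deployment" "copy" {
  name       = "copy-with-overrides"
  flow_id    = data.prefect_deployment.template.flow_id
  entrypoint = data.prefect_deployment.template.entrypoint

  # Override the parameters on the copy instead of carrying them over
  parameters = jsonencode({
    "some-parameter" = "new-value"
  })
}
```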

Actual Behavior

Steps to Reproduce

  1. terraform apply

Important Factoids

References

  • #0000
@phergoualch phergoualch added the bug Something isn't working label Dec 16, 2024
@jamiezieziula jamiezieziula added the feature label Dec 16, 2024 — with Linear
@jamiezieziula jamiezieziula removed the bug Something isn't working label Dec 16, 2024
@mitchnielsen mitchnielsen self-assigned this Dec 18, 2024
@mitchnielsen
Contributor

Thanks for reaching out @phergoualch.

I created a manifest based on your setup and was able to duplicate the deployment. I used version 2.12.0 of the provider, which is what I saw in your output gist.

Could you provide your full (relevant) configuration here so I can try again to replicate the problem? As a reference, here's what I used:

resource "prefect_flow" "example" {
  name = "example-flow"
  tags = ["example"]
}

resource "prefect_deployment" "example" {
  name        = "example-deployment"
  description = "Example deployment"
  flow_id     = prefect_flow.example.id

  enforce_parameter_schema = true
  entrypoint               = "flow.py:hello_world"
  job_variables = jsonencode({
    "env" : { "some-key" : "some-value" }
  })
  parameter_openapi_schema = jsonencode({
    "type" : "object",
    "properties" : {
      "some-parameter" : { "type" : "string" }
      "some-parameter2" : { "type" : "string" }
    }
  })
  path            = "./foo/bar"
  tags            = ["example"]
  work_pool_name  = "mitch-testing-pool"
  work_queue_name = "default"
}

data "prefect_deployment" "existing" {
  id = prefect_deployment.example.id
}

resource "prefect_deployment" "copy" {
  name        = "copy"
  description = "A copy of an existing deployment"

  enforce_parameter_schema = data.prefect_deployment.existing.enforce_parameter_schema
  entrypoint               = data.prefect_deployment.existing.entrypoint
  flow_id                  = data.prefect_deployment.existing.flow_id
  job_variables            = data.prefect_deployment.existing.job_variables
  parameter_openapi_schema = data.prefect_deployment.existing.parameter_openapi_schema
  path                     = data.prefect_deployment.existing.path
  tags                     = data.prefect_deployment.existing.tags
  work_pool_name           = data.prefect_deployment.existing.work_pool_name
  work_queue_name          = data.prefect_deployment.existing.work_queue_name
}

@phergoualch
Author

Hey @mitchnielsen, thanks for your answer. It's the same as my answer in #330, since both of my issues are linked.

Here is the deployment output when reading the datasource by id:

{
  "account_id" = tostring(null)
  "concurrency_limit" = tonumber(null)
  "concurrency_options" = null /* object */
  "created" = "2024-12-13T14:19:12Z"
  "description" = "Main flow for MySQL backup operations"
  "enforce_parameter_schema" = tobool(null)
  "entrypoint" = "flow.py:mysql_backup"
  "flow_id" = "79685e32-e1cd-46f5-b401-b4dfaa1b8427"
  "id" = "9e1eb232-4f15-404d-8a84-3727cc68f584"
  "job_variables" = "{\"image\":\"503532123506.dkr.ecr.eu-west-1.amazonaws.com/prefect-flows:mysql-backup\"}"
  "manifest_path" = tostring(null)
  "name" = "mysql-backup"
  "parameter_openapi_schema" = "{\"definitions\":{\"BackupConfig\":{\"description\":\"Backup configuration\",\"properties\":{\"backup_dir\":{\"default\":\"/tmp/sqlbackup\",\"description\":\"Backup directory path\",\"title\":\"Backup Dir\",\"type\":\"string\"},\"db_exclude\":{\"default\":[\"information_schema\",\"innodb\",\"sys\",\"tmp\",\"mysql\",\"performance_schema\",\"awsdms_control\"],\"description\":\"Databases to exclude from backup\",\"items\":{\"type\":\"string\"},\"title\":\"Db Exclude\",\"type\":\"array\"},\"s3_bucket\":{\"default\":\"rs-databases-backup\",\"description\":\"S3 bucket name\",\"title\":\"S3 Bucket\",\"type\":\"string\"}},\"title\":\"BackupConfig\",\"type\":\"object\"},\"DBCredentials\":{\"description\":\"Database connection credentials\",\"properties\":{\"host\":{\"description\":\"Database host\",\"title\":\"Host\",\"type\":\"string\"},\"password\":{\"anyOf\":[{\"type\":\"string\"},{\"type\":\"null\"}],\"default\":null,\"description\":\"Database password\",\"title\":\"Password\"},\"port\":{\"default\":3306,\"description\":\"Database port\",\"title\":\"Port\",\"type\":\"integer\"},\"secret_arn\":{\"anyOf\":[{\"type\":\"string\"},{\"type\":\"null\"}],\"default\":null,\"description\":\"AWS Secrets Manager ARN\",\"title\":\"Secret Arn\"},\"user\":{\"anyOf\":[{\"type\":\"string\"},{\"type\":\"null\"}],\"default\":null,\"description\":\"Database username\",\"title\":\"User\"}},\"required\":[\"host\"],\"title\":\"DBCredentials\",\"type\":\"object\"},\"MySQLOptions\":{\"description\":\"MySQL client options configuration\",\"properties\":{\"backup\":{\"default\":\"--opt --skip-lock-tables --single-transaction --max_allowed_packet=256M --routines\",\"description\":\"MySQL dump options\",\"title\":\"Backup\",\"type\":\"string\"}},\"title\":\"MySQLOptions\",\"type\":\"object\"}},\"properties\":{\"config\":{\"$ref\":\"#/definitions/BackupConfig\",\"position\":1,\"title\":\"config\"},\"credentials\":{\"$ref\":\"#/definitions/DBCredentials\",\"position\":0,\"title\":\"credentials\"},\"mysql_opts\":{\"anyOf\":[{\"$ref\":\"#/definitions/MySQLOptions\"},{\"type\":\"null\"}],\"default\":null,\"position\":2,\"title\":\"mysql_opts\"}},\"required\":[\"credentials\",\"config\"],\"title\":\"Parameters\",\"type\":\"object\"}"
  "parameters" = "{}"
  "path" = "."
  "paused" = false
  "pull_steps" = tolist([])
  "storage_document_id" = "00000000-0000-0000-0000-000000000000"
  "tags" = tolist(null) /* of string */
  "updated" = "2024-12-19T15:45:23Z"
  "version" = "1ff9fcc2f5f4c5a9f97fec0a6bde0e83"
  "work_pool_name" = "ecs-fargate-spot"
  "work_queue_name" = "default"
  "workspace_id" = tostring(null)
}

Note that the existing deployment was not created with Terraform but with Python, which may be the cause. Maybe one of the attributes is not being mapped by the Terraform provider?
I want to be able to create a "template" deployment with the flow.deploy() function and then duplicate it using Terraform.
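Since several attributes come back null in the output above (e.g. `tags = tolist(null)`, `enforce_parameter_schema = tobool(null)`), one thing worth trying is guarding the copy against nulls. A workaround sketch, assuming the resource rejects null values for these fields; the fallback defaults are illustrative:

```hcl
resource "prefect_deployment" "copy" {
  name    = "copy"
  flow_id = data.prefect_deployment.existing.flow_id

  # Fall back to safe defaults when the datasource returns null
  tags                     = data.prefect_deployment.existing.tags != null ? data.prefect_deployment.existing.tags : []
  enforce_parameter_schema = data.prefect_deployment.existing.enforce_parameter_schema != null ? data.prefect_deployment.existing.enforce_parameter_schema : false
}
```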

@mitchnielsen
Contributor

Thanks. I was able to duplicate a deployment using a datasource, so I'm not yet sure how to replicate your issue.

Some things we can try:

  • Use a newer version of the provider than your current (2.12.0)
  • See if fix(deployments datasource): get by name #343 helps (or confirm you're using id in the datasource for now; I can see you are from the issue description, but just confirming)

@mitchnielsen
Contributor

@phergoualch - give 2.13.4 a try when you have a chance, we pushed a fix for datasourcing Deployments.
