Question: different buckets #6581

Closed
fade2black opened this issue Jan 19, 2024 · 6 comments

fade2black commented Jan 19, 2024

When I generate a pipeline using sam pipeline ..., it guides me through a process by asking questions. One of them is about a bucket (for the prod env). I enter a bucket name, mybucket123, and SAM adds the following lines inside the codepipeline.yml file:

...
Parameters:
  ProdArtifactBucket:
    Type: String
    Default: "mybucket123"
.... 

and associates ProdArtifactBucket with the ENV_BUCKET env var.
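
For reference, that association looks roughly like the snippet below. This is a minimal sketch, not the exact generated template: the build project name is assumed, and the codepipeline.yml produced by sam pipeline init may pass the variable through differently.

  # Sketch only: deploy-action configuration passing the parameter through
  # as an environment variable (project/action names are assumed)
  Configuration:
    ProjectName: !Ref CodeBuildProjectDeploy
    EnvironmentVariables: !Sub |
      [
        {"name": "ENV_BUCKET", "value": "${ProdArtifactBucket}"}
      ]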

It also creates two buckets (with corresponding policies):

  ...
  PipelineArtifactsBucket:
    Type: AWS::S3::Bucket
    DeletionPolicy: Retain
    UpdateReplacePolicy: Retain
    Properties:
    ...

  PipelineArtifactsLoggingBucket:
    Type: AWS::S3::Bucket
    DeletionPolicy: Retain
    UpdateReplacePolicy: Retain
    Properties:
    ...

and uses the bucket I entered for the sam deploy command:

sam deploy --stack-name ${ENV_STACK_NAME} \
    --template ${ENV_TEMPLATE} \
    --capabilities CAPABILITY_IAM \
    --region ${ENV_REGION} \
    --s3-bucket ${ENV_BUCKET} \
    ...

So my question is: why does SAM ask for a bucket (for artifacts) if it creates a new one for the pipeline to download code, etc.? Can't we just use the newly created bucket for sam deploy?

fade2black added the stage/needs-triage label Jan 19, 2024
jysheng123 added the type/question and area/deploy labels and removed the stage/needs-triage label Jan 24, 2024
jysheng123 (Contributor) commented

Hi, thanks for asking the question. sam deploy does not know which bucket to upload to initially for the deployment, hence the --s3-bucket flag. The reason we can't use the buckets defined in the template file is that they have not been created yet; the template first needs to be uploaded and then deployed. Before that happens, an existing bucket needs to be provided for the deployment artifacts to be uploaded to, hence the ProdArtifactBucket. Let me know if you have any other questions or if that explanation did not make sense.
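
In other words, the deployment is effectively a two-step flow: the local template and code are first packaged and uploaded to a bucket that already exists, and only then is the packaged template deployed, which is when any buckets declared inside it get created. A rough sketch of that flow (the bucket name reuses the example above; the stack name is a placeholder):

# Step 1: upload local artifacts to a bucket that must already exist
sam package --s3-bucket mybucket123 \
    --output-template-file packaged.yaml

# Step 2: deploy the packaged template; resources it declares (including
# any S3 buckets) are only created at this point
sam deploy --template-file packaged.yaml \
    --stack-name my-stack \
    --capabilities CAPABILITY_IAM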

fade2black (Author) commented Jan 24, 2024

@jysheng123 Thank you for the reply. Yes, the explanation makes sense.
I am simply worried about the fact that just to create a simple Hello Lambda SAM project you need:

  1. S3 bucket for deployment pipeline
  2. S3 bucket for logging of the previous bucket, and
  3. S3 bucket for --s3-bucket flag.

This seems to be overkill, doesn't it?

So what I do is first run the pipeline without providing a bucket (inside samconfig.toml). Upon successful deployment of the pipeline stack, the pipeline starts (for the first time) but fails to build because no bucket is provided. Then I simply copy the name of the newly created S3 bucket, update samconfig.toml with it, commit, and push. This triggers the pipeline again and it deploys successfully. Thus I reduce the number of buckets to 2 at the cost of one initial failure. Does that make sense, or does it look like an antipattern?

Actually, I could not find a flag that tells the pipeline not to run when it is first created. If it existed, I could just turn off the first run and avoid that failure. I think running the pipeline as soon as it is deployed does not make much sense; it should only be deployed, not triggered. What do you think?
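
For illustration, after the first (failing) run the relevant samconfig.toml section ends up looking something like the sketch below; the config environment, stack name, region, and bucket name are assumed placeholders, with the bucket value copied from the pipeline stack.

# samconfig.toml (sketch with placeholder values)
[prod.deploy.parameters]
stack_name   = "my-app-prod"
region       = "us-east-1"
capabilities = "CAPABILITY_IAM"
# copied from the PipelineArtifactsBucket created on the first pipeline run
s3_bucket    = "my-pipeline-pipelineartifactsbucket-abc123"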

jysheng123 (Contributor) commented

Hmm, I understand your logic, but creating multiple buckets doesn't cost many resources at all, and to us this many buckets is normal for a deployment pipeline, since each bucket serves its own unique purpose. That is why we don't really have functionality for not running the pipeline when it is first created. Is there a use case for wanting only 2 buckets? If you specify a bucket that was deployed before, you would limit the buckets created in the process to 2 as well.

fade2black (Author) commented Jan 30, 2024

@jysheng123 No specific use case, I just want to keep the S3 dashboard clean :-). But I found another solution.
When I first create a pipeline, I store the bucket as an env var for a specific stage/action, like

EnvironmentVariables: !Sub |
   [
     {"name": "ENV_BUCKET", "value": "${PipelineArtifactsBucket}"}
   ]

and later use it in the deploy command:

sam deploy --config-env ${CONFIGENV} \
             --s3-bucket ${ENV_BUCKET} \
             --resolve-image-repos \
             --no-confirm-changeset \
             --no-fail-on-empty-changeset

jysheng123 (Contributor) commented

Ah, sounds good. Thanks for bringing up the question, and apologies if I was not helpful enough on this one.
