
helm_pull: Silence false no_log warning #796

Merged: 5 commits into ansible-collections:main on Jan 17, 2025

Conversation

@colshine1 (Contributor) commented Nov 22, 2024

SUMMARY

Apply no_log=True to pass_credentials to silence a false-positive warning.

Fixes similar issue to: #423

ISSUE TYPE
  • Bugfix Pull Request
COMPONENT NAME

changelog/fragements/796-false-positive-helmull.yaml
plugins/modules/helm_pull.py
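
For quick context, here is a minimal sketch of the resulting change in plugins/modules/helm_pull.py. Note that although the summary above says no_log=True, the review discussion below converges on an explicit no_log=False, which is what was ultimately merged (ecf31d8):

    # Sketch of the relevant argument_spec entries (surrounding parameters
    # omitted). repo_password is a real secret and stays no_log=True;
    # pass_credentials is a plain boolean flag, so it gets an explicit
    # no_log=False to tell ansible-core the author considered it, which
    # suppresses the name-based "Module did not set no_log" warning.
    argument_spec = dict(
        repo_password=dict(
            type="str", no_log=True, aliases=["password", "chart_repo_password"]
        ),
        pass_credentials=dict(type="bool", default=False, no_log=False),
    )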

@yurnov (Contributor) left a comment

Confirmed that even in module_utils we have no_log=True and the password does not appear in the module output; it's a false-positive warning.

@yurnov (Contributor) commented Nov 24, 2024

Hi @colshine1,

please update the integration test with the following:
https://github.com/ansible-collections/kubernetes.core/blob/cd686316e9b3af6df67c19027b573b27468234e3/tests/integration/targets/helm_pull/tasks/main.yml#L170C1-L182C61

please add

              - '"Module did not set no_log for pass_credentials" not in _result.warnings'

@yurnov (Contributor) left a comment

LGTM

@@ -189,7 +189,7 @@ def main():
        repo_password=dict(
            type="str", no_log=True, aliases=["password", "chart_repo_password"]
        ),
-       pass_credentials=dict(type="bool", default=False),
+       pass_credentials=dict(type="bool", default=False, no_log=True),
A Member left a comment

Suggested change:
-       pass_credentials=dict(type="bool", default=False, no_log=True),
+       pass_credentials=dict(type="bool", default=False, no_log=False),

This should be False since this is a false positive.

A Contributor replied

> This should be False since this is a false positive.

False is the default, and it will behave the same way as now: it throws the warning Module did not set no_log for pass_credentials. So it should be no_log=True.

The Member replied

No, explicitly setting no_log=False is the correct way to deal with a false positive.
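
(Background, as a rough paraphrase rather than the exact ansible-core source: the warning comes from a name-based heuristic in AnsibleModule._log_invocation in ansible.module_utils.basic. When a parameter's name matches a password-like pattern and its no_log option was never set at all, the module warns; an explicit False skips the warning, while an explicit True masks the value. Roughly:)

    # Rough paraphrase of ansible-core's heuristic; the real regex and
    # surrounding logic differ. "pass_credentials" trips the pattern
    # because it starts with "pass".
    import re

    PASSWORD_MATCH = re.compile(r"pass", re.IGNORECASE)  # simplified stand-in

    def log_param(name, arg_opts, value):
        no_log = arg_opts.get("no_log", None)
        if no_log:
            return "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER"  # True: mask the value
        if no_log is None and PASSWORD_MATCH.search(name):
            # Unset no_log plus a password-like name: the false-positive warning.
            print("[WARNING]: Module did not set no_log for %s" % name)
        return value  # False (or a non-matching name): log as-is, no warning

This matches the behavior demonstrated below: with no_log unset the warning fires, and with an explicit no_log=False it does not, while the boolean's value remains visible in the invocation output.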

The Contributor replied

> No, explicitly setting no_log=False is the correct way to deal with a false positive.

Hm... I will check how it works

@colshine1 (Contributor, Author) replied

I set up a sample test playbook:

- hosts: localhost
  tasks:
    - name: Download chart to controlhost
      kubernetes.core.helm_pull:
        chart_ref: "https://domain.com/helm_chart-0.0.1.tgz"
        destination: "/tmp/"
        untar_chart: false
        repo_username: user
        repo_password: password
        pass_credentials: false

I ran the playbook with -vvv to get the module parameters in the output; the task fails because I set fake credentials for this test.

Without any changes to the module the output is:

[WARNING]: Module did not set no_log for pass_credentials
fatal: [localhost]: FAILED! => {
    "changed": false,
    "command": "helm pull https://domain.com/helm_chart-0.0.1.tgz --username user --******** ******** --destination /tmp/",
    "invocation": {
        "module_args": {
            "binary_path": null,
            "chart_ca_cert": null,
            "chart_devel": null,
            "chart_ref": "https://domain.com/helm_chart-0.0.1.tgz",
            "chart_ssl_cert_file": null,
            "chart_ssl_key_file": null,
            "chart_version": null,
            "destination": "/tmp/",
            "pass_credentials": false,
            "provenance": false,
            "repo_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "repo_url": null,
            "repo_username": "user",
            "skip_tls_certs_check": false,
            "untar_chart": false,
            "verify_chart": false,
            "verify_chart_keyring": null
        }
    },
    "msg": "Failure when executing Helm command.",
    "rc": 1,
    "stderr": "Error: failed to fetch https://domain.com/helm_chart-0.0.1.tgz : 401 Unauthorized\n",
    "stderr_lines": [
        "Error: failed to fetch https://domain.com/helm_chart-0.0.1.tgz : 401 Unauthorized"
    ],
    "stdout": "",
    "stdout_lines": []
}

Setting no_log to False on line 192 of the module:

fatal: [localhost]: FAILED! => {
    "changed": false,
    "command": "helm pull https://domain.com/helm_chart-0.0.1.tgz --username user --******** ******** --destination /tmp/",
    "invocation": {
        "module_args": {
            "binary_path": null,
            "chart_ca_cert": null,
            "chart_devel": null,
            "chart_ref": "https://domain.com/helm_chart-0.0.1.tgz",
            "chart_ssl_cert_file": null,
            "chart_ssl_key_file": null,
            "chart_version": null,
            "destination": "/tmp/",
            "pass_credentials": false,
            "provenance": false,
            "repo_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "repo_url": null,
            "repo_username": "user",
            "skip_tls_certs_check": false,
            "untar_chart": false,
            "verify_chart": false,
            "verify_chart_keyring": null
        }
    },
    "msg": "Failure when executing Helm command.",
    "rc": 1,
    "stderr": "Error: failed to fetch https://domain.com/helm_chart-0.0.1.tgz : 401 Unauthorized\n",
    "stderr_lines": [
        "Error: failed to fetch https://domain.com/helm_chart-0.0.1.tgz : 401 Unauthorized"
    ],
    "stdout": "",
    "stdout_lines": []
}

I originally added no_log=True to be consistent with this: https://github.com/Akasurde/kubernetes.core/blob/22013686e7f2f735d5de5850b612dce4daa04b1a/plugins/modules/helm_repository.py#L231. However, it looks like that no longer exists in the module, so I'm happy to update to whatever the correct solution is.

@@ -180,6 +180,7 @@
        - '"--username ansible" in _result.command'
        - '"--password ***" in _result.command'
        - '"--keyring pubring.gpg" in _result.command'
+       - '"Module did not set no_log for pass_credentials" not in _result.warnings'
A Contributor commented

This should be _result.stderr instead of _result.warnings; the helm_pull module returns its warnings in stderr.
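
With that change applied (see "Updated as requested" below), the added assertion from the diff above would read:

    - '"Module did not set no_log for pass_credentials" not in _result.stderr'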

@colshine1 (Contributor, Author) replied

Updated as requested

@yurnov mentioned this pull request Dec 10, 2024

@yurnov (Contributor) commented Dec 20, 2024

Hi @colshine1 or @gravesm, could you please rebase to master?

@yurnov (Contributor) commented Dec 21, 2024

Strangely, I saw that message in my testing environment some time ago, but I'm not sure about the package versions. Today I tried to reproduce the issue with ansible-core 2.16.14 and kubernetes.core versions 3.2.0 and 5.0.0, and it is not reproducible. So this PR is probably not required anymore.

@yurnov (Contributor) commented Dec 25, 2024

I tested the behavior again and was able to reproduce the issue with 5.0.0:

root@b3336f683ab1:/# cat playbook.yaml
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Download chart to controlhost
      kubernetes.core.helm_pull:
        chart_ref: "oci://registry-1.docker.io/bitnamicharts/wordpress"
        destination: "/tmp/"
        untar_chart: false
        repo_username: username
        repo_password: password
root@b3336f683ab1:/# ansible --version
ansible [core 2.16.13]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/local/bin/ansible
  python version = 3.12.4 (main, Jun 27 2024, 00:07:37) [GCC 12.2.0] (/usr/local/bin/python)
  jinja version = 3.1.4
  libyaml = True
root@b3336f683ab1:/# ansible-galaxy collection list

# /root/.ansible/collections/ansible_collections
Collection      Version
--------------- -------
kubernetes.core 5.0.0
root@b3336f683ab1:/# 
root@b3336f683ab1:/# ansible-playbook playbook.yaml
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match
'all'

PLAY [localhost] *******************************************************************************************************

TASK [Download chart to controlhost] ***********************************************************************************
[WARNING]: Module did not set no_log for pass_credentials
changed: [localhost]

PLAY RECAP *************************************************************************************************************
localhost                  : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

root@b3336f683ab1:/#

Same with main:

root@b3336f683ab1:/# rm -rf /root/.ansible/collections/ansible_collections/kubernetes
root@b3336f683ab1:/# ansible-galaxy collection install git+https://github.com/ansible-collections/kubernetes.core.git
Cloning into '/root/.ansible/tmp/ansible-local-37507k343jvb/tmpwjkkb14a/kubernetes.coreasfqq6kh'...
remote: Enumerating objects: 631, done.
remote: Counting objects: 100% (631/631), done.
remote: Compressing objects: 100% (390/390), done.
remote: Total 631 (delta 79), reused 466 (delta 63), pack-reused 0 (from 0)
Receiving objects: 100% (631/631), 351.77 KiB | 2.16 MiB/s, done.
Resolving deltas: 100% (79/79), done.
Your branch is up to date with 'origin/main'.
Starting galaxy collection install process
Process install dependency map
Starting collection install process
Installing 'kubernetes.core:5.0.0' to '/root/.ansible/collections/ansible_collections/kubernetes/core'
Created collection for kubernetes.core:5.0.0 at /root/.ansible/collections/ansible_collections/kubernetes/core
kubernetes.core:5.0.0 was installed successfully
root@b3336f683ab1:/# ansible-playbook playbook.yaml
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match
'all'

PLAY [localhost] *******************************************************************************************************

TASK [Download chart to controlhost] ***********************************************************************************
[WARNING]: Module did not set no_log for pass_credentials
changed: [localhost]

PLAY RECAP *************************************************************************************************************
localhost                  : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

root@b3336f683ab1:/#

So, the WARNING is still in place.

And finally, this PR (with no_log=False as in ecf31d8):

root@b3336f683ab1:/# rm -rf /root/.ansible/collections/ansible_collections/kubernetes
root@b3336f683ab1:/# ansible-galaxy collection install git+https://github.com/colshine1/kubernetes.core.git
Cloning into '/root/.ansible/tmp/ansible-local-3924g37iv8s5/tmposrgl53f/kubernetes.corepchwkorq'...
remote: Enumerating objects: 599, done.
remote: Counting objects: 100% (599/599), done.
remote: Compressing objects: 100% (369/369), done.
remote: Total 599 (delta 76), reused 429 (delta 63), pack-reused 0 (from 0)
Receiving objects: 100% (599/599), 342.95 KiB | 3.65 MiB/s, done.
Resolving deltas: 100% (76/76), done.
Your branch is up to date with 'origin/main'.
Starting galaxy collection install process
Process install dependency map
Starting collection install process
Installing 'kubernetes.core:5.0.0' to '/root/.ansible/collections/ansible_collections/kubernetes/core'
Created collection for kubernetes.core:5.0.0 at /root/.ansible/collections/ansible_collections/kubernetes/core
kubernetes.core:5.0.0 was installed successfully
root@b3336f683ab1:/# ansible-playbook playbook.yaml
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match
'all'

PLAY [localhost] *******************************************************************************************************

TASK [Download chart to controlhost] ***********************************************************************************
changed: [localhost]

PLAY RECAP *************************************************************************************************************
localhost                  : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

root@b3336f683ab1:/#

So, @gravesm, please rebase this PR to main and backport to stable-3 and stable-5

@abikouo mentioned this pull request Jan 15, 2025

Build succeeded (gate pipeline).
https://ansible.softwarefactory-project.io/zuul/buildset/9d602ae2e5a84cfcb38b23bc551a4caf

✔️ ansible-galaxy-importer SUCCESS in 3m 47s
✔️ build-ansible-collection SUCCESS in 5m 25s

The softwarefactory-project-zuul bot merged commit ecc64ca into ansible-collections:main on Jan 17, 2025
50 checks passed

patchback bot commented Jan 17, 2025

Backport to stable-3: 💚 backport PR created

✅ Backport PR branch: patchback/backports/stable-3/ecc64cace1d18bc3617f373a94bc23203c160532/pr-796

Backported as #857


patchback bot pushed a commit that referenced this pull request Jan 17, 2025
SUMMARY
Apply no_log=True to pass_credentials to silence false positive warning.
Fixes similar issue to: #423
ISSUE TYPE

Bugfix Pull Request

COMPONENT NAME
changelog/fragements/796-false-positive-helmull.yaml
plugins/modules/helm_pull.py

Reviewed-by: Yuriy Novostavskiy
Reviewed-by: Mike Graves <mgraves@redhat.com>
Reviewed-by: Irum Malik
(cherry picked from commit ecc64ca)

patchback bot commented Jan 17, 2025

Backport to stable-5: 💚 backport PR created

✅ Backport PR branch: patchback/backports/stable-5/ecc64cace1d18bc3617f373a94bc23203c160532/pr-796

Backported as #858


patchback bot pushed a commit that referenced this pull request Jan 17, 2025
SUMMARY
Apply no_log=True to pass_credentials to silence false positive warning.
Fixes similar issue to: #423
ISSUE TYPE

Bugfix Pull Request

COMPONENT NAME
changelog/fragements/796-false-positive-helmull.yaml
plugins/modules/helm_pull.py

Reviewed-by: Yuriy Novostavskiy
Reviewed-by: Mike Graves <mgraves@redhat.com>
Reviewed-by: Irum Malik
(cherry picked from commit ecc64ca)
softwarefactory-project-zuul bot pushed a commit that referenced this pull request Jan 17, 2025
This is a backport of PR #796 as merged into main (ecc64ca).
SUMMARY
Apply no_log=True to pass_credentials to silence false positive warning.
Fixes similar issue to: #423
ISSUE TYPE

Bugfix Pull Request

COMPONENT NAME
changelog/fragements/796-false-positive-helmull.yaml
plugins/modules/helm_pull.py
softwarefactory-project-zuul bot pushed a commit that referenced this pull request Jan 17, 2025
This is a backport of PR #796 as merged into main (ecc64ca).
SUMMARY
Apply no_log=True to pass_credentials to silence false positive warning.
Fixes similar issue to: #423
ISSUE TYPE

Bugfix Pull Request

COMPONENT NAME
changelog/fragements/796-false-positive-helmull.yaml
plugins/modules/helm_pull.py
@yurnov mentioned this pull request Jan 20, 2025
softwarefactory-project-zuul bot pushed a commit that referenced this pull request Jan 20, 2025
SUMMARY
Version 3.3.0 of ansible-collection kubernetes.core came with several improvements and bugfixes
ISSUE TYPE

New release pull request

Changelog
Minor Changes

k8s_drain - Improve error message for pod disruption budget when draining a node (#797).

Bugfixes

helm - Helm version checks did not support RC versions. They now accept any version tags. (#745).
helm_pull - Apply no_log=True to pass_credentials to silence false positive warning (#796).
k8s_drain - Fix k8s_drain does not wait for single pod (#769).
k8s_drain - Fix k8s_drain runs into a timeout when evicting a pod which is part of a stateful set  (#792).
kubeconfig option should not appear in module invocation log (#782).
kustomize - kustomize plugin fails with deprecation warnings (#639).
waiter - Fix waiting for daemonset when desired number of pods is 0. (#756).

ADDITIONAL INFORMATION
Collection kubernetes.core version 3.3.0 is compatible with ansible-core>=2.14.0

Reviewed-by: Alina Buzachis
Reviewed-by: Yuriy Novostavskiy
Reviewed-by: Mike Graves <mgraves@redhat.com>
softwarefactory-project-zuul bot pushed a commit that referenced this pull request Jan 20, 2025
SUMMARY
This release came with new module helm_registry_auth, and improvements to the error messages in the k8s_drain module, new parameter insecure_registry for helm_template module and several bug fixes.
ISSUE TYPE

New release pull request

Changelog
Minor Changes

Bump version of ansible-lint to minimum 24.7.0 (#765).
Parameter insecure_registry added to helm_template as equivalent of insecure-skip-tls-verify (#805).
connection/kubectl.py - Added an example of using the kubectl connection plugin to the documentation (#741).
k8s_drain - Improve error message for pod disruption budget when draining a node (#797).

Bugfixes

helm - Helm version checks did not support RC versions. They now accept any version tags. (#745).
helm_pull - Apply no_log=True to pass_credentials to silence false positive warning (#796).
k8s_drain - Fix k8s_drain does not wait for single pod (#769).
k8s_drain - Fix k8s_drain runs into a timeout when evicting a pod which is part of a stateful set  (#792).
kubeconfig option should not appear in module invocation log (#782).
kustomize - kustomize plugin fails with deprecation warnings (#639).
waiter - Fix waiting for daemonset when desired number of pods is 0. (#756).

New Modules

helm_registry_auth - Helm registry authentication module

ADDITIONAL INFORMATION
Collection kubernetes.core version 3.1.0 is compatible with ansible-core>=2.15.0

Reviewed-by: Mike Graves <mgraves@redhat.com>
yurnov added a commit to yurnov/kubernetes.core that referenced this pull request Jan 20, 2025
SUMMARY
This release came with new module helm_registry_auth, and improvements to the error messages in the k8s_drain module, new parameter insecure_registry for helm_template module and several bug fixes.
ISSUE TYPE

New release pull request

Changelog
Minor Changes

Bump version of ansible-lint to minimum 24.7.0 (ansible-collections#765).
Parameter insecure_registry added to helm_template as equivalent of insecure-skip-tls-verify (ansible-collections#805).
connection/kubectl.py - Added an example of using the kubectl connection plugin to the documentation (ansible-collections#741).
k8s_drain - Improve error message for pod disruption budget when draining a node (ansible-collections#797).

Bugfixes

helm - Helm version checks did not support RC versions. They now accept any version tags. (ansible-collections#745).
helm_pull - Apply no_log=True to pass_credentials to silence false positive warning (ansible-collections#796).
k8s_drain - Fix k8s_drain does not wait for single pod (ansible-collections#769).
k8s_drain - Fix k8s_drain runs into a timeout when evicting a pod which is part of a stateful set  (ansible-collections#792).
kubeconfig option should not appear in module invocation log (ansible-collections#782).
kustomize - kustomize plugin fails with deprecation warnings (ansible-collections#639).
waiter - Fix waiting for daemonset when desired number of pods is 0. (ansible-collections#756).

New Modules

helm_registry_auth - Helm registry authentication module

ADDITIONAL INFORMATION
Collection kubernetes.core version 3.1.0 is compatible with ansible-core>=2.15.0

Reviewed-by: Mike Graves <mgraves@redhat.com>