Fix: collection ansible-test linting issues #1520

Merged
7 changes: 4 additions & 3 deletions .github/workflows/pull-request-management.yml
@@ -297,8 +297,9 @@ jobs:
ansible_test:
name: Run ansible-test validation
runs-on: ubuntu-20.04
needs: [ molecule_eos_designs, molecule_cloudvision ]
if: needs.cloudvision.status != 'failed' && needs.molecule_eos_designs.status != 'failed' && needs.file-changes.outputs.plugins == 'true'
needs: [ ]
#needs: [ molecule_eos_designs, molecule_cloudvision ]
#if: needs.cloudvision.status != 'failed' && needs.molecule_eos_designs.status != 'failed' && needs.file-changes.outputs.plugins == 'true'
strategy:
fail-fast: true
matrix:
@@ -320,7 +321,7 @@ jobs:
- name: 'ansible-test linting'
run: |
cd ansible_collections/arista/avd/
ansible-test sanity -v --requirements --docker
ansible-test sanity -v --requirements --docker --python ${{ matrix.python_version }}
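The added `--python ${{ matrix.python_version }}` flag pins each sanity run to the Python version of the current matrix entry. The matrix itself is collapsed in this diff, so the following is only a sketch with assumed version values:

```yaml
strategy:
  fail-fast: true
  matrix:
    # Assumed values; the real matrix entries are not shown in this diff.
    python_version: [ "3.6", "3.8" ]
steps:
  - name: 'ansible-test linting'
    run: |
      cd ansible_collections/arista/avd/
      ansible-test sanity -v --requirements --docker --python ${{ matrix.python_version }}
```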

galaxy_importer:
name: Test galaxy-importer
@@ -78,9 +78,9 @@ on using Ansible to provision Arista EOS devices either with or without Arista C
## What are the requirements to run Ansible?

Ansible can run on almost anything, but in production scenarios Ansible is typically deployed on a virtual Linux server,
running on the customers preferred hypervisor. This Ansible server then communicates either directly with the
running on the customer's preferred hypervisor. This Ansible server then communicates either directly with the
Arista network devices via eAPI or with Arista CloudVision Portal, which in turn communicates with the Arista network devices.
Controlling what Ansible does is typically done using an SSH terminal session to the Ansible server from the Operators computer.
Controlling what Ansible does is typically done using an SSH terminal session to the Ansible server from the Operator's computer.

![Figure: Ansible and CVP](../_media/getting-started/Ansible-and-CVP-httpapi.png)

@@ -121,7 +121,7 @@ AVD also uses the information provided to produce complete documentation of the

## When and when not to use AVD

Its important to note when and perhaps more importantly when not to use AVD.
It's important to note when and perhaps more importantly when not to use AVD.

AVD is designed to generate and deploy complete configuration files in a manner where the network device's running-configuration is
completely replaced. As such, caution should be exercised when running AVD against an existing manually-configured network. Various
@@ -207,15 +207,15 @@ all:
ansible_host: 10.255.0.16
```

Dont confuse ***hosts*** with servers or similar. A host can be anything that can be accessed via SSH or an API, to be managed by Ansible,
Don't confuse ***hosts*** with servers or similar. A host can be anything that can be accessed via SSH or an API, to be managed by Ansible,
including Arista switches.

The settings inside the inventory.yml file are defined in a tree-like structure using ***groups***. Groups can contain hosts or other groups -
making it easier to apply common configuration to a group of devices.

The ***all*** line at the top is a default group that contains all ***hosts*** i.e. all switches. Dont worry too much about that for now.
The ***all*** line at the top is a default group that contains all ***hosts*** i.e. all switches. Don't worry too much about that for now.

The ***children:*** keyword is used to define groups of groups i.e. just an internal keyword to differentiate between hosts and groups.
The ***children:*** keyword is used to define "groups of groups" i.e. just an internal keyword to differentiate between hosts and groups.

The multi-colored figure below is just a visualization of the same text file, showing the different grouping and parent/child relationships:

@@ -324,7 +324,7 @@ Group variables can be overridden by specifying host variables for specific devi
(See [DEFAULT_HASH_BEHAVIOUR](https://docs.ansible.com/ansible/latest/reference_appendices/config.html#default-hash-behaviour)).
The order of precedence is (from lowest to highest):

- 'All' group (because it is the parent of all other groups)
- 'All' group (because it is the 'parent' of all other groups)
- Parent group
- Child group
- Host
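
A minimal sketch of that precedence in practice (group and host names here are illustrative only): the host value overrides its child group, which overrides the parent group, which overrides `all`:

```yaml
all:
  vars:
    ntp_server: 10.10.10.1        # 'all' group - lowest precedence
  children:
    DC1_FABRIC:                   # parent group
      vars:
        ntp_server: 10.10.20.1
      children:
        DC1_SPINES:               # child group
          vars:
            ntp_server: 10.10.30.1
          hosts:
            DC1-SPINE1:
              ntp_server: 10.10.40.1   # host variable - highest precedence, this value wins
```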
@@ -9,7 +9,6 @@
from ansible.plugins.loader import lookup_loader
from ansible_collections.arista.avd.plugins.module_utils.strip_empties import strip_null_from_data
from datetime import datetime
import yaml


class ActionModule(ActionBase):
@@ -27,8 +26,8 @@ def run(self, tmp=None, task_vars=None):
n = self._task.args.get("root_key")
n = self._templar.template(n)
if not isidentifier(n):
raise AnsibleActionFail(f"The argument 'root_key' value of '{n}' is not valid. Keys must start with a letter or underscore character, "
"and contain only letters, numbers and underscores.")
raise AnsibleActionFail(f"The argument 'root_key' value of '{n}' is not valid. Keys must start with a letter or underscore character, \
and contain only letters, numbers and underscores.")
root_key = n

if "templates" in self._task.args:
@@ -40,9 +39,9 @@ def run(self, tmp=None, task_vars=None):
else:
raise AnsibleActionFail("The argument 'templates' must be set")

dest = self._task.args.get("dest",False)
template_output = self._task.args.get("template_output",False)
debug = self._task.args.get("debug",False)
dest = self._task.args.get("dest", False)
template_output = self._task.args.get("template_output", False)
debug = self._task.args.get("debug", False)

else:
raise AnsibleActionFail("The argument 'templates' must be set")
@@ -57,12 +56,12 @@ def run(self, tmp=None, task_vars=None):
# This list contains timestamps from every step for every template. This is useful for identifying slow templates.
# Here we pull in the list from any previous tasks, so we can just add to the list.
if debug:
avd_yaml_templates_to_facts_debug = template_vars.get('avd_yaml_templates_to_facts_debug',[])
avd_yaml_templates_to_facts_debug = template_vars.get('avd_yaml_templates_to_facts_debug', [])

for template_item in template_list:
if debug:
debug_item = template_item
debug_item['timestamps'] = { "starting": datetime.now() }
debug_item['timestamps'] = {"starting": datetime.now()}

template = template_item.get('template')
if not template:
@@ -113,7 +112,7 @@ def run(self, tmp=None, task_vars=None):
# This is to resolve any input values with inline jinja using variables/facts set by the input templates.
if template_output:
if debug:
debug_item = { 'action': 'template_output', 'timestamps': { 'combine_data': datetime.now() } }
debug_item = {'action': 'template_output', 'timestamps': {'combine_data': datetime.now()}}

if root_key:
template_vars[root_key] = output
@@ -143,7 +142,7 @@ def run(self, tmp=None, task_vars=None):

# Depending on the file suffix of 'dest' (default: 'json') we will format the data to yaml or just write the output data directly.
# The Copy module used in 'write_file' will convert the output data to json automatically.
if dest.split('.')[-1] in ["yml", "yaml"] :
if dest.split('.')[-1] in ["yml", "yaml"]:
write_file_result = self.write_file(yaml.dump(output, indent=2, sort_keys=False, width=130), task_vars)
else:
write_file_result = self.write_file(output, task_vars)
@@ -169,9 +168,8 @@ def run(self, tmp=None, task_vars=None):
result['ansible_facts'] = output
return result


# The write_file function is implementing the Ansible 'copy' action_module, to benefit from Ansible builtin functionality like 'changed'.
def write_file(self, content, task_vars):
# The write_file function is implementing the Ansible 'copy' action_module, to benefit from Ansible builtin functionality like 'changed'.
# Reuse task data
new_task = self._task.copy()

@@ -47,6 +47,7 @@
configlet_prefix:
description: Prefix to put on configlet.
required: false
default: 'AVD'
type: str
destination:
description: Optional path to save variable.
@@ -63,7 +63,7 @@
Autodetects data format based on file suffix. '.yml', '.yaml' -> YAML, default -> JSON
required: false
type: str
template_output
template_output:
description: |
If true the output data will be run through another jinja2 rendering before returning.
This is to resolve any input values with inline jinja using variables/facts set by the input templates.
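
As a usage sketch of the two options documented above (the module name, template path, and destination are assumptions for illustration, not taken from this PR):

```yaml
- name: Build structured configuration from templates
  # Hypothetical task; only options documented above are used.
  arista.avd.yaml_templates_to_facts:
    root_key: structured_config
    templates:
      - template: "base/main.j2"           # assumed template path
    dest: "{{ inventory_hostname }}.yml"   # '.yml' suffix -> data is written as YAML
    template_output: true                  # re-render the output once more through Jinja2
```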
@@ -1417,7 +1417,7 @@ vxlan_interface:
virtual_router_encapsulation_mac_address: < mlag-system-id | ethernet_address (H.H.H) >
qos:
# !!!Warning, only a few hardware types with software version >= 4.26.0 support the below knobs to configure Vxlan DSCP mapping.
# For the Traffic Class to be derived based on the outer DSCP field of the incoming VxLan packet, the core ports must be in DSCP Trust mode.
# For the Traffic Class to be derived based on the outer DSCP field of the incoming VxLan packet, the core ports must be in "DSCP Trust" mode.
dscp_propagation_encapsulation: < true | false >
map_dscp_to_traffic_class_decapsulation: < true | false >
vlans:
@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

import re
from natsort import os_sorted

@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.filter.add_md_toc import add_md_toc
import pytest
import md_toc
@@ -7,10 +10,10 @@


DIR_PATH = os.path.dirname(os.path.realpath(
__file__))+'/toc_files'
MD_INPUTS = [None, DIR_PATH+'/valid_file.md']
MD_INPUT_INVALID = DIR_PATH+'/invalid_file.md'
EXPECTED_TOC = DIR_PATH+'/expected_toc.md'
__file__)) + '/toc_files'
MD_INPUTS = [None, DIR_PATH + '/valid_file.md']
MD_INPUT_INVALID = DIR_PATH + '/invalid_file.md'
EXPECTED_TOC = DIR_PATH + '/expected_toc.md'
TOC_MARKER = '<!-- toc -->'
SKIP_LINES_LIST = [0, 1, 2]
TOC_LEVELS = [1, 2, 3]
@@ -58,7 +61,7 @@ def test_add_md_toc_invalid(self):
assert resp.strip() != expected_toc.strip()

def test_add_md_toc_btw_specific_markers(self):
with open(DIR_PATH+'/markers_at_bottom.md', "r") as input_file:
with open(DIR_PATH + '/markers_at_bottom.md', "r") as input_file:
resp = add_md_toc(input_file.read(), skip_lines=0,
toc_levels=2, toc_marker=TOC_MARKER)

@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.filter.default import default, FilterModule
import pytest
from jinja2.runtime import Undefined
@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.filter.esi_management import FilterModule
import logging
import pytest
@@ -17,12 +20,12 @@
class TestEsiManagementFilter():
def test_generate_esi_without_prefix(self):
resp = f.generate_esi(ESI_SHORT)
assert resp == "0000:0000:"+ESI_SHORT
assert resp == "0000:0000:" + ESI_SHORT

def test_generate_esi_with_prefix(self):
assert ESI_PREFIX is not None and ESI_PREFIX != ""
resp = f.generate_esi(ESI_SHORT, ESI_PREFIX)
assert resp == ESI_PREFIX+ESI_SHORT
assert resp == ESI_PREFIX + ESI_SHORT

def test_lacp_id(self):
assert ESI_SHORT_1 is not None and ESI_SHORT_1 != ""
@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.filter.list_compress import FilterModule, AnsibleFilterError
import pytest

@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.filter.markdown_rendering import FilterModule
import pytest

@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.filter.natural_sort import FilterModule, convert
import pytest
from jinja2.runtime import Undefined
@@ -1,5 +1,6 @@

from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.module_utils.strip_empties import strip_null_from_data
import pytest
import logging
@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.modules.configlet_build_config import get_configlet
import os
import logging
@@ -1,4 +1,8 @@
from ansible_collections.arista.avd.plugins.modules.inventory_to_container import is_in_filter, isIterable, get_device_option_value, serialize, get_devices, isLeaf, get_containers
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.modules.inventory_to_container import is_in_filter, isIterable, get_device_option_value
from ansible_collections.arista.avd.plugins.modules.inventory_to_container import serialize, get_devices, isLeaf, get_containers
import os
import logging
import pytest
@@ -20,10 +24,37 @@

PARENT_CONTAINER = {
"default_parent": {"parent": "Tenant",
"expected_output": {'all': {}, 'CVP': {'devices': ['cv_ztp', 'cv_server'], 'parent_container': 'all'}, 'DC1': {'parent_container': 'all'}, 'DC1_FABRIC': {'parent_container': 'DC1'}, 'DC1_L2LEAFS': {'parent_container': 'DC1_FABRIC'}, 'DC1_L2LEAF1': {'devices': ['DC1-L2LEAF1A'], 'parent_container': 'DC1_L2LEAFS'}, 'DC1_L2LEAF2': {'devices': ['DC1-L2LEAF2A'], 'parent_container': 'DC1_L2LEAFS'}, 'DC1_L3LEAFS': {'parent_container': 'DC1_FABRIC'}, 'DC1_LEAF1': {'devices': ['DC1-LEAF1A', 'DC1-LEAF1B'], 'parent_container': 'DC1_L3LEAFS'}, 'DC1_LEAF2': {'devices': ['DC1-LEAF2A', 'DC1-LEAF2B'], 'parent_container': 'DC1_L3LEAFS'}, 'DC1_SPINES': {'devices': ['DC1-SPINE1', 'DC1-SPINE2'], 'parent_container': 'DC1_FABRIC'}, 'DC1_SERVERS': {'devices': [], 'parent_container': 'DC1'}, 'DC1_TENANTS_NETWORKS': {'devices': [], 'parent_container': 'DC1'}}
"expected_output": {'all': {},
'CVP': {'devices': ['cv_ztp', 'cv_server'], 'parent_container': 'all'},
'DC1': {'parent_container': 'all'},
'DC1_FABRIC': {'parent_container': 'DC1'},
'DC1_L2LEAFS': {'parent_container': 'DC1_FABRIC'},
'DC1_L2LEAF1': {'devices': ['DC1-L2LEAF1A'],
'parent_container': 'DC1_L2LEAFS'
},
'DC1_L2LEAF2': {'devices': ['DC1-L2LEAF2A'],
'parent_container': 'DC1_L2LEAFS'
},
'DC1_L3LEAFS': {'parent_container': 'DC1_FABRIC'},
'DC1_LEAF1': {'devices': ['DC1-LEAF1A', 'DC1-LEAF1B'],
'parent_container': 'DC1_L3LEAFS'
},
'DC1_LEAF2': {'devices': ['DC1-LEAF2A', 'DC1-LEAF2B'],
'parent_container': 'DC1_L3LEAFS'
},
'DC1_SPINES': {'devices': ['DC1-SPINE1', 'DC1-SPINE2'],
'parent_container': 'DC1_FABRIC'
},
'DC1_SERVERS': {'devices': [],
'parent_container': 'DC1'
},
'DC1_TENANTS_NETWORKS': {'devices': [],
'parent_container': 'DC1'
}
}
},
"non_default_parent": {"parent": "CVP",
"expected_output": {'CVP': {'parent_container': 'Tenant'}}
"expected_output": {'CVP': {'parent_container': 'Tenant'}}
}
}

@@ -1,3 +1,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible_collections.arista.avd.plugins.test.defined import defined, TestModule
from jinja2.runtime import Undefined
from ansible.errors import AnsibleError