
tests fail after 2020-12-31 #1705

Closed
bmwiedemann opened this issue Sep 21, 2020 · 5 comments · Fixed by #1777

Comments

@bmwiedemann

bmwiedemann commented Sep 21, 2020

cfn-lint version: (cfn-lint --version) 0.36.0

Description of issue.

While working on reproducible builds for openSUSE, I found that our python-cfn-lint package fails its tests after 2020-12-31.

To reproduce:

osc checkout openSUSE:Factory/python-cfn-lint && cd $_
osc build --noservice --vm-type=kvm --build-opt=--vm-custom-opt="-rtc base=2020-12-31T00:00:00" standard

Possibly caused by this entry in src/cfnlint/data/AdditionalSpecs/LmbdRuntimeLifecycle.json:

    "python2.7": {
        "eol": "2020-12-31",
        "deprecated": "2020-12-31",
        "successor": "python3.8"
    }
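To illustrate why the fixture counts drift, here is a minimal sketch (hypothetical, not cfn-lint's actual rule code) of how a lifecycle check keyed on the `"deprecated"` date above produces one extra finding once the wall clock passes 2020-12-31, shifting the expected match counts in the integration tests:

```python
from datetime import date

# Entry copied from LmbdRuntimeLifecycle.json; the check logic below
# is an illustrative sketch, not cfn-lint's real implementation.
LIFECYCLE = {
    "python2.7": {
        "eol": "2020-12-31",
        "deprecated": "2020-12-31",
        "successor": "python3.8",
    }
}

def runtime_findings(runtime, today):
    """Return lint findings for `runtime` as of the date `today`."""
    info = LIFECYCLE.get(runtime)
    if info is None:
        return []
    findings = []
    if today > date.fromisoformat(info["deprecated"]):
        findings.append(
            f"{runtime} is deprecated; migrate to {info['successor']}"
        )
    return findings

# On or before the deprecation date the template is clean...
assert runtime_findings("python2.7", date(2020, 12, 31)) == []
# ...but afterwards one extra finding appears, turning the expected
# 40 matches into 41 (and 19 into 20) in the fixture comparisons.
assert len(runtime_findings("python2.7", date(2021, 1, 1))) == 1
```

Because the comparison uses the real current date at test time, the same fixtures pass before the cutoff and fail after it, which is exactly what the time-shifted kvm build exposes.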

error log:

 ======================================================================
 FAIL: test_templates (integration.test_quickstart_templates.TestQuickStartTemplates)
 Test same templates using integration approach
 ----------------------------------------------------------------------
 Traceback (most recent call last):
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/test_quickstart_templates.py", line 79, in test_templates
     self.run_module_integration_scenarios(rules)
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/__init__.py", line 90, in run_module_integration_scenarios
     self.assertEqual(len(expected_results), len(matches), 'Expected {} failures, got {} on {}'.format(
 AssertionError: 40 != 41 : Expected 40 failures, got 41 on test/fixtures/templates/quickstart/openshift.yaml
 
 ======================================================================
 FAIL: test_module_integration (integration.test_quickstart_templates_non_strict.TestQuickStartTemplates)
 Test same templates using integration approach
 ----------------------------------------------------------------------
 Traceback (most recent call last):
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/test_quickstart_templates_non_strict.py", line 53, in test_module_integration
     self.run_module_integration_scenarios(rules)
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/__init__.py", line 90, in run_module_integration_scenarios
     self.assertEqual(len(expected_results), len(matches), 'Expected {} failures, got {} on {}'.format(
 AssertionError: 19 != 20 : Expected 19 failures, got 20 on test/fixtures/templates/quickstart/openshift.yaml
 
 ======================================================================
 FAIL: test_templates (integration.test_quickstart_templates_non_strict.TestQuickStartTemplates)
 Test Successful JSON Parsing
 ----------------------------------------------------------------------
 Traceback (most recent call last):
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/__init__.py", line 34, in run_scenarios
     result = subprocess.check_output(['cfn-lint'] + extra_params + ['--format', 'json',
   File "/usr/lib64/python3.8/subprocess.py", line 411, in check_output
     return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
 subprocess.CalledProcessError: Command '['cfn-lint', '--include-checks', 'I', '--include-expiremental', '--configure-rule', 'E3012:strict=false', '--format', 'json', '--', 'test/fixtures/templates/quickstart/openshift.yaml']' returned non-zero exit status 14.
 
 During handling of the above exception, another exception occurred:
 
 Traceback (most recent call last):
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/test_quickstart_templates_non_strict.py", line 38, in test_templates
     self.run_scenarios([
   File "/home/abuild/rpmbuild/BUILD/cfn-python-lint-0.30.1/test/integration/__init__.py", line 52, in run_scenarios
     self.assertEqual(error.returncode, exit_code, 'Expected {} exit code, got {} on {}'.format(
 AssertionError: 14 != 12 : Expected 12 exit code, got 14 on test/fixtures/templates/quickstart/openshift.yaml
 
 ----------------------------------------------------------------------
 Ran 486 tests in 50.118s
 
 FAILED (failures=3)
@kddejong
Contributor

Thanks for submitting this. I agree. We need to figure out a way to parameterize the date in our tests so they don't start failing based on the day they run. Some errors are time based: as AWS EOLs a certain Lambda runtime, we want to surface that information to our users. @PatMyron The other thing here is that the allowed values for a Lambda runtime would also change as a runtime is EOLed. I wonder if we should change some of this to rely on AllowedValues rather than a specific rule for Lambda.
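One common way to keep such tests stable (a sketch only; the function and module names here are illustrative, not cfn-lint's real layout) is to read "today" through a seam that tests can patch, so assertions pin the clock instead of depending on the real date:

```python
from datetime import date
from unittest.mock import patch

# Hypothetical seam: production code asks current_date() for "today"
# instead of calling date.today() inline, so tests can override it.
def current_date():
    return date.today()

def is_runtime_deprecated(deprecated_iso):
    """True once the (patchable) current date passes the deprecation date."""
    return current_date() > date.fromisoformat(deprecated_iso)

# In tests, freeze the clock on either side of the cutoff:
with patch(f"{__name__}.current_date", return_value=date(2020, 6, 1)):
    assert is_runtime_deprecated("2020-12-31") is False

with patch(f"{__name__}.current_date", return_value=date(2021, 6, 1)):
    assert is_runtime_deprecated("2020-12-31") is True
```

With the date injected this way, the integration fixtures can assert both the pre- and post-EOL behavior explicitly, and the suite gives the same result no matter when it runs.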

@PatMyron
Contributor

> The other thing here is that the allowed values for a Lambda runtime would also change as a runtime is EOLed. I wonder if we should change some of this to rely on AllowedValues rather than a specific rule for Lambda.

As the AllowedValues switch to being sourced from botocore, they'll automatically include all the runtimes, including the older ones that are no longer allowed.

botocore also carries multiple dates for each runtime, at both warning and error levels, which is nicer than everything turning into an error on a single day.

@PatMyron
Contributor

@bmwiedemann how far into the future does the project need tests to pass? Or is the expectation that current tests pass forever?

@PatMyron
Contributor

PatMyron commented Oct 5, 2020

@bmwiedemann
Author

bmwiedemann commented Nov 13, 2020

> how far into the future does the project need tests to pass?

I usually test +15 years from now, because some software will be used that long in some places. But if you can make it work forever, that is even better.
