
Use ctx instead of konfig for integration test #465

Merged

Conversation

chriskim06
Member

Fixes #410

@k8s-ci-robot k8s-ci-robot added the cncf-cla: yes Indicates the PR's author has signed the CNCF CLA. label Jan 14, 2020
@k8s-ci-robot k8s-ci-robot added the size/M Denotes a PR that changes 30-99 lines, ignoring generated files. label Jan 14, 2020
@codecov-io

codecov-io commented Jan 14, 2020

Codecov Report

Merging #465 into master will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #465   +/-   ##
=======================================
  Coverage   56.52%   56.52%           
=======================================
  Files          19       19           
  Lines         927      927           
=======================================
  Hits          524      524           
  Misses        350      350           
  Partials       53       53

Powered by Codecov. Last update d3103f3...1a9a1e4.

@ahmetb
Member

ahmetb commented Jan 14, 2020

It seems like this is not making any performance improvement.
Both the old and new builds on Travis show 15 seconds for hack/run-integration-tests.sh. That is so weird. Any ideas why this might be?

Maybe the improvement is minuscule on a cloud-based machine (e.g. Travis) compared to running the tests over a residential network on your laptop.

@chriskim06
Member Author

There wasn't much of a difference when I ran it locally either. I think using konfig took around 12 seconds for hack/run-integration-tests.sh and ctx took about 11 seconds. I can double-check later, but I don't remember seeing any significant performance improvement.

@ahmetb
Member

ahmetb commented Jan 15, 2020

That is so odd. In how many places do we install this plugin? Every run should be a clean download.

@chriskim06
Member Author

chriskim06 commented Jan 15, 2020

It seems to get installed a fair amount:

$ rg install integration_test/
integration_test/upgrade_test.go
41:             Krew("install", "--manifest", filepath.Join("testdata", validPlugin+constants.ManifestExtension)).
60:             Krew("install", validPlugin).
78:func TestKrewUpgrade_ValidPluginInstalledFromManifest(t *testing.T) {
85:             Krew("install", validPlugin).

integration_test/install_test.go
31:func TestKrewInstall(t *testing.T) {
36:     if err := test.Krew("install", validPlugin); err == nil {
41:     if err := test.Krew("install"); err == nil {
45:     test.Krew("install", validPlugin).RunOrFailOutput()
49:func TestKrewInstallReRun(t *testing.T) {
55:     test.Krew("install", validPlugin).RunOrFail()
56:     test.Krew("install", validPlugin).RunOrFail()
60:func TestKrewInstall_MultiplePositionalArgs(t *testing.T) {
66:     test.WithIndex().Krew("install", validPlugin, validPlugin2).RunOrFailOutput()
71:func TestKrewInstall_Stdin(t *testing.T) {
78:             Krew("install").RunOrFailOutput()
84:func TestKrewInstall_StdinAndPositionalArguments(t *testing.T) {
93:             Krew("install", validPlugin).RunOrFail()
98:func TestKrewInstall_Manifest(t *testing.T) {
104:    test.Krew("install",
110:func TestKrewInstall_ManifestURL(t *testing.T) {
118:    test.Krew("install",
124:func TestKrewInstall_ManifestAndArchive(t *testing.T) {
130:    test.Krew("install",
137:func TestKrewInstall_OnlyArchive(t *testing.T) {
143:    err := test.Krew("install",
147:            t.Errorf("Expected install to fail but was successful")
151:func TestKrewInstall_ManifestArgsAreMutuallyExclusive(t *testing.T) {
159:    if err := test.Krew("install",
167:func TestKrewInstall_NoManifestArgsWhenPositionalArgsSpecified(t *testing.T) {
173:    err := test.Krew("install", validPlugin,
181:    err = test.Krew("install", validPlugin,

integration_test/version_test.go
37:             "InstallPath",

integration_test/system_test.go
33:     test.WithIndex().Krew("install", validPlugin).RunOrFail()
35:     // needs to be after initial installation
81:     test.Krew("install",

integration_test/list_test.go
36:     test.Krew("install", validPlugin).RunOrFail()
43:     // TODO(ahmetb): install multiple plugins and see if the output is sorted

integration_test/uninstall_test.go
26:func TestKrewUninstall(t *testing.T) {
34:     if err := test.Krew("uninstall").Run(); err == nil {
37:     if err := test.Krew("uninstall", validPlugin).Run(); err == nil {
38:             t.Fatal("expected failure deleting non-installed plugin")
40:     test.Krew("install", validPlugin).RunOrFailOutput()
41:     test.Krew("uninstall", validPlugin).RunOrFailOutput()
44:     if err := test.Krew("uninstall", validPlugin).Run(); err == nil {
45:             t.Fatal("expected failure for uninstalled plugin")
55:     test.WithIndex().Krew("install", validPlugin).RunOrFailOutput()
72:     test.Krew("install", validPlugin).RunOrFail()

integration_test/testdata/ctx.yaml
13:    If fzf is installed on your machine, you can interactively choose

A few of the installs in install_test.go install fooPlugin from the foo.yaml manifest in the testdata directory.

@chriskim06
Member Author

From the issue:

... konfig creates a dedicated dist archive for distribution and does not use the repo tarball which is created automatically for github releases. This is ok, but github keeps track of download counts for release assets. As we run our tests often, this distorts the download numbers of konfig quite a bit. It would be better to switch to a small plugin which is distributed purely as repo tarball, for example ctx.

Based on what was mentioned in the issue, I wasn't expecting this to be about a performance improvement for the integration tests, but rather about release download statistics that get skewed for konfig but not for ctx. Does that sound right @corneliusweig?
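The distinction the issue describes can be illustrated with the GitHub releases API: only uploaded release *assets* carry a `download_count`, while the auto-generated source tarball (`tarball_url`) is not an asset and has no counter. A minimal sketch of that distinction follows; the payload below is illustrative sample data, not real konfig statistics:

```python
# Sketch: in the GitHub releases API, only uploaded release assets have a
# download counter. The auto-generated repo tarball (tarball_url) does not,
# so CI re-installs of a plugin distributed purely as a repo tarball (like
# ctx) never inflate its numbers. Sample data below is hypothetical.

sample_release = {
    "tag_name": "v0.1.0",
    "tarball_url": "https://api.github.com/repos/example/konfig/tarball/v0.1.0",
    "assets": [
        {"name": "konfig.tar.gz", "download_count": 1234},
    ],
}

def counted_downloads(release):
    """Sum the download counters of uploaded assets only."""
    return sum(a["download_count"] for a in release.get("assets", []))

print(counted_downloads(sample_release))  # -> 1234
```

A dist archive attached to a release (konfig's approach) increments its asset counter on every test run, whereas fetching the repo tarball touches no counter at all.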

@ahmetb
Member

ahmetb commented Jan 15, 2020

Yeah I think that's sensible.

/lgtm
/approve

@k8s-ci-robot k8s-ci-robot added the lgtm "Looks good to me", indicates that a PR is ready to be merged. label Jan 15, 2020
@k8s-ci-robot
Contributor

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: ahmetb, chriskim06

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@k8s-ci-robot k8s-ci-robot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label Jan 15, 2020
@k8s-ci-robot k8s-ci-robot merged commit 1579013 into kubernetes-sigs:master Jan 15, 2020
@corneliusweig
Contributor

@chriskim06 Yes, that's right. This was mostly about skewed download statistics.

Labels
approved Indicates a PR has been approved by an approver from all required OWNERS files. cncf-cla: yes Indicates the PR's author has signed the CNCF CLA. lgtm "Looks good to me", indicates that a PR is ready to be merged. size/M Denotes a PR that changes 30-99 lines, ignoring generated files.

Successfully merging this pull request may close these issues.

Use small shell-based plugin for integration tests instead of konfig
5 participants