
Help needed: launch integration tests from master branch #859

Closed
antechrestos opened this issue Nov 13, 2019 · 12 comments
Labels
  • area/internal: bugs related to kaniko development workflow
  • area/testing: Issues related to testing kaniko itself
  • kind/bug: Something isn't working
  • priority/p3: agreed that this would be good to have, but no one is available at the moment

Comments

@antechrestos
Contributor

Actual behavior

I created a GCS bucket and a repository to launch the integration tests. I am running on Ubuntu 18.04 with Go 1.12.

I am trying to validate my development environment, as the integration tests failed on my PR and I want to understand what happened.

Every time I launch them from the master branch, I keep getting random errors such as:

Error building images: Failed to build image gcr.io/silent-caster-258821/antechrestos-test-kaniko/docker-dockerfile_test_cache_copy with kaniko command "[docker run -v /home/buce8373/.config/gcloud:/root/.config/gcloud -v /tmp/buce8373/983610448:/kaniko/benchmarks -v /home/buce8373/go/src/github.com/GoogleContainerTools/kaniko/integration:/workspace -e BENCHMARK_FILE=false executor-image -f /workspace/dockerfiles/Dockerfile_test_cache_copy -d gcr.io/silent-caster-258821/antechrestos-test-kaniko/kaniko-dockerfile_test_cache_copy  -c /workspace]": exit status 1 INFO[0000] Resolved base name google/cloud-sdk:256.0.0-alpine to google/cloud-sdk:256.0.0-alpine 
INFO[0000] Using dockerignore file: /workspace/.dockerignore 
INFO[0000] Resolved base name google/cloud-sdk:256.0.0-alpine to google/cloud-sdk:256.0.0-alpine 
INFO[0000] Downloading base image google/cloud-sdk:256.0.0-alpine 
INFO[0002] Error while retrieving image from cache: getting file info: stat /cache/sha256:46cac9de07e621720947afd83a5d6c4a67fc6de8ce710b89f79b9d3f25e12e06: no such file or directory 
INFO[0002] Downloading base image google/cloud-sdk:256.0.0-alpine 
INFO[0003] Built cross stage deps: map[]                
INFO[0003] Downloading base image google/cloud-sdk:256.0.0-alpine 
INFO[0004] Error while retrieving image from cache: getting file info: stat /cache/sha256:46cac9de07e621720947afd83a5d6c4a67fc6de8ce710b89f79b9d3f25e12e06: no such file or directory 
INFO[0004] Downloading base image google/cloud-sdk:256.0.0-alpine 
INFO[0005] Unpacking rootfs as cmd COPY context/foo /usr/bin requires it. 
error building image: error building stage: removing whiteout etc/ca-certificates/.wh..wh..opq: fstatat /etc/ca-certificates/.wh..opq: operation not permitted
exit status 1

or

Error building images: Failed to build image gcr.io/silent-caster-258821/antechrestos-test-kaniko/docker-dockerfile_test_extract_fs with kaniko command "[docker run -v /home/buce8373/.config/gcloud:/root/.config/gcloud -v /tmp/buce8373/989993316:/kaniko/benchmarks -v /home/buce8373/go/src/github.com/GoogleContainerTools/kaniko/integration:/workspace -e BENCHMARK_FILE=false executor-image -f /workspace/dockerfiles/Dockerfile_test_extract_fs -d gcr.io/silent-caster-258821/antechrestos-test-kaniko/kaniko-dockerfile_test_extract_fs  -c /workspace]": exit status 125 docker: Error response from daemon: device or resource busy.
See 'docker run --help'.
exit status 1

or else

Error building images: Failed to build image gcr.io/silent-caster-258821/antechrestos-test-kaniko/docker-dockerfile_test_copy_bucket with docker command "[docker build -t gcr.io/silent-caster-258821/antechrestos-test-kaniko/docker-dockerfile_test_copy_bucket -f dockerfiles/Dockerfile_test_copy_bucket .]": exit status 1 Sending build context to Docker daemon  129.5kB
Step 1/19 : FROM alpine@sha256:5ce5f501c457015c4b91f91a15ac69157d9b06f1a75cf9107bf2b62e0843983a
 ---> 791c3e2ebfcb
Step 2/19 : COPY context/foo foo
 ---> Using cache
 ---> 03e1fbeab82b
Step 3/19 : COPY context/foo /foodir/
 ---> Using cache
 ---> 311d856ca3d8
Step 4/19 : COPY context/bar/b* bar/
 ---> Using cache
 ---> 5922bc04d50b
Step 5/19 : COPY context/fo? /foo2
 ---> Using cache
 ---> 31e916267c63
Step 6/19 : COPY context/bar/doesnotexist* context/foo hello
 ---> Using cache
 ---> 50aa2af5cac8
Step 7/19 : COPY ./context/empty /empty
 ---> Using cache
 ---> 6783045fe248
Step 8/19 : COPY ./ dir/
 ---> b0fbbd565939
Removing intermediate container 0bb06c5b65ba
Step 9/19 : COPY . newdir
 ---> 05680af47e19
Removing intermediate container aea9ebe6c973
Step 10/19 : COPY context/bar /baz/
device or resource busy
exit status 1
FAIL	github.com/GoogleContainerTools/kaniko/integration	215.794s
Makefile:55: recipe for target 'integration-test' failed
make: *** [integration-test] Error 1

Is there anything I am missing?

@antechrestos
Contributor Author

antechrestos commented Nov 13, 2019

I suspected a problem with parallelism, but that is not it.
After disabling parallelism and forcing the integration tests to run sequentially, I still ended up with an error, which shows parallelism is not the cause:

Building images for Dockerfile Dockerfile_test_volume_4
Error building images: Failed to build image gcr.io/silent-caster-258821/antechrestos-test-kaniko/docker-dockerfile_test_volume_4 with kaniko command "[docker run -v /home/buce8373/.config/gcloud:/root/.config/gcloud -v /tmp/buce8373/457398607:/kaniko/benchmarks -v /home/buce8373/go/src/github.com/GoogleContainerTools/kaniko/integration:/workspace -e BENCHMARK_FILE=false executor-image -f /workspace/dockerfiles/Dockerfile_test_volume_4 -d gcr.io/silent-caster-258821/antechrestos-test-kaniko/kaniko-dockerfile_test_volume_4  -c /workspace]": exit status 1 INFO[0000] Resolved base name rabbitmq@sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2 to rabbitmq@sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2 
INFO[0000] Using dockerignore file: /workspace/.dockerignore 
INFO[0000] Resolved base name rabbitmq@sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2 to rabbitmq@sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2 
INFO[0000] Error while retrieving image from cache: getting file info: stat /cache/sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2: no such file or directory 
INFO[0000] Downloading base image rabbitmq@sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2 
INFO[0003] Built cross stage deps: map[]                
INFO[0003] Error while retrieving image from cache: getting file info: stat /cache/sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2: no such file or directory 
INFO[0003] Downloading base image rabbitmq@sha256:57b028a4bb9592ece3915e3e9cdbbaecb3eb82b753aaaf5250f8d25d81d318e2 
INFO[0003] Unpacking rootfs as cmd COPY context/foo /usr/local/bin/ requires it. 
error building image: error building stage: removing whiteout var/lib/apt/lists/auxfiles/.wh..wh..opq: fstatat /var/lib/apt/lists/auxfiles/.wh..opq: operation not permitted
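
For reference, one way to force such a sequential run, assuming the suite is a standard go test package as in this repository (the bucket/repo flags the suite needs are omitted here), is to cap Go's test parallelism; this is illustrative rather than necessarily the exact command I used:

go test -p 1 -parallel 1 -timeout 30m ./integration/...   # illustrative invocation only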

@cvgw added the area/internal, kind/bug, and more-information-needed labels on Nov 15, 2019
@cvgw
Contributor

cvgw commented Nov 15, 2019

@antechrestos could you share the commands you are using to execute the integration tests? Thanks

@cvgw
Contributor

cvgw commented Nov 15, 2019

Can you also share the git ref against which you are executing the tests? I'm slightly confused because you mention master, but the log output looks like the failure is on a newly introduced test.

@antechrestos
Contributor Author

@cvgw I launch the integration tests as specified in the documentation:

make integration-test
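
For completeness, the environment I set up before that command was roughly the following; the GCS_BUCKET and IMAGE_REPO variable names are what I believe DEVELOPMENT.md and the Makefile expect, and the bucket name below is only a placeholder:

export GCS_BUCKET="gs://my-test-bucket"                                  # placeholder bucket name
export IMAGE_REPO="gcr.io/silent-caster-258821/antechrestos-test-kaniko" # the repo visible in the logs above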

Thank you

@cvgw added the priority/p3 label on Nov 15, 2019
@antechrestos
Contributor Author

@cvgw I am attaching the log of the last test I made. In it you can see that:

  • I started from a clean repo with HEAD pointing to commit 4f789a0
  • I cleaned all local Docker images
  • I am running Go 1.12
  • my repository was clean

Also, before running the tests:

  • my out directory was cleaned
  • my buckets were also wiped clean, as if newly created

Thanks!

@cvgw
Contributor

cvgw commented Nov 23, 2019

@antechrestos I'm fairly certain what you are seeing is a known race condition in docker. I sometimes see it on my own machine as well. Here is an issue that I believe is relevant: docker/for-linux#711

Unfortunately I don't have any answers on how to fix it; I'm still working on resolving it on my own machine.

@cvgw added the area/testing label and removed the more-information-needed label on Nov 23, 2019
@antechrestos
Contributor Author

@cvgw Thanks, I will give it a try tomorrow!

@antechrestos
Contributor Author

@cvgw Adding "storage-driver": "overlay" in /etc/docker/daemon.json did the trick. Thank you very much; I was then able to launch the integration tests.
Shouldn't DEVELOPMENT.md be amended to suggest it?
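
For anyone else hitting this, the resulting /etc/docker/daemon.json is just the snippet below (merged with whatever settings are already in the file), followed by a daemon restart, e.g. sudo systemctl restart docker:

{
  "storage-driver": "overlay"
}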

@cvgw
Contributor

cvgw commented Dec 2, 2019

@antechrestos I'm glad to hear that fixed it for you! Unfortunately that doesn't seem to fix it for my machine; perhaps it is dependent on the platform or something like that?

@tejal29
Member

tejal29 commented Jan 10, 2020

Looks like this issue is resolved.

@tejal29 closed this as completed on Jan 10, 2020
@antechrestos
Contributor Author

@tejal29 Yeah sorry, I forgot to close it. Thanks

@antechrestos
Contributor Author

@tejal29 @cvgw Should I open a PR to update the documentation?
