
feat(context): add podman support #1142

Draft · wants to merge 2 commits into main
Conversation

TaylorHere

Now we can use envd with podman.socket:

envd context create --name podman --builder podman-container --use
DOCKER_HOST=unix:///run/user/1000/podman/podman.sock envd bootstrap
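
For anyone trying this locally: the socket in the DOCKER_HOST line is created by podman's systemd socket unit, not by the podman binary itself. A minimal sketch of the rootless setup, assuming a systemd user session:

# enable the rootless podman API socket (systemd user unit)
systemctl --user enable --now podman.socket
# the socket should now exist under the user's runtime dir ($UID is 1000 here)
ls /run/user/$UID/podman/podman.sock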

@muniu-bot

muniu-bot bot commented Nov 2, 2022

Welcome @TaylorHere! It looks like this is your first PR to tensorchord/envd 🎉

@TaylorHere
Author

Trying to figure out how to add a podman socket in the CI tests; I haven't used GitHub Actions before.

@TaylorHere TaylorHere changed the title from feat(builder): add podman support to WIP: feat(builder): add podman support Nov 2, 2022
@muniu-bot

muniu-bot bot commented Nov 2, 2022

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: TaylorHere
Once this PR has been reviewed and has the lgtm label, please assign terrytangyuan for approval by writing /assign @terrytangyuan in a comment. For more information see: The Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@TaylorHere TaylorHere marked this pull request as draft November 7, 2022 13:41
@TaylorHere TaylorHere force-pushed the main branch 2 times, most recently from 254841c to 2561f10 November 7, 2022 14:39
feat(CI): add setup podman

Signed-off-by: taylorhere <taylorherelee@gmail.com>
@TaylorHere TaylorHere marked this pull request as ready for review November 8, 2022 15:33
@TaylorHere TaylorHere changed the title from WIP: feat(builder): add podman support to feat(builder): add podman support Nov 8, 2022
@TaylorHere TaylorHere changed the title from feat(builder): add podman support to feat(context): add podman support Nov 8, 2022
@muniu-bot muniu-bot bot requested a review from gaocegege November 8, 2022 15:37
@aseaday
Member

aseaday commented Nov 9, 2022

LGTM

@gaocegege
Member

Thanks for your contribution! 🎉 👍

Does podman load work now?

/cc @kemingy @VoVAllen

@VoVAllen
Member

VoVAllen commented Nov 9, 2022

From what I understood here, podman has the same implementation as docker. And we are adding a new context name for podman, right?

@TaylorHere
Author

> From what I understood here, podman has the same implementation as docker. And we are adding a new context name for podman, right?

Yes, I'm adding a new context name for podman and also a buildkit connection string for podman, which is what makes envd work with podman.
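
For illustration, the flow a user would follow with this change (assuming envd's context subcommands behave as they do for docker) is:

# create a podman context backed by the podman-container builder and switch to it
envd context create --name podman --builder podman-container --use
# list contexts; the podman entry should be marked as the one in use
envd context ls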

@TaylorHere
Author

> Thanks for your contribution! 🎉 👍
>
> Does podman load work now?
>
> /cc @kemingy @VoVAllen

The CI only works in this PR; a locally built envd does not work, and the CI in my forked repo does not pass either.

@gaocegege
Member

I got this error: Cannot connect to the Docker daemon at unix:///run/user/1000/podman/podman.sock. Is the docker daemon running?

Did you encounter this before?

@gaocegege
Member

> I got this error: Cannot connect to the Docker daemon at unix:///run/user/1000/podman/podman.sock. Is the docker daemon running?
>
> Did you encounter this before?

Weird. I restarted the podman service and then the file was created. Before the restart there was no /run/user/1000/podman/podman.sock file, but podman itself worked fine.
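
For reference, on systemd hosts that socket file is owned by the podman.socket unit, so a sketch of the check implied here would be:

# see whether the user-level socket unit is active
systemctl --user status podman.socket
# (re)start it if the socket file is missing
systemctl --user restart podman.socket
ls -l /run/user/1000/podman/podman.sock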

@gaocegege
Member

gaocegege commented Nov 10, 2022

After the restart, I got a new error:

DEBU[2022-11-10T10:34:20+08:00] pulling image container=test mirror= tag="docker.io/moby/buildkit:v0.10.3"
error: failed to create buildkit client: failed to bootstrap the buildkitd: failed to pull image: Error response from daemon: failed to resolve image name: short-name "moby/buildkit:v0.10.3" did not resolve to an alias and no unqualified-search registries are defined in "/etc/containers/registries.conf"
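
This short-name failure is podman-specific: unlike docker, podman will not expand moby/buildkit to docker.io/moby/buildkit unless a search registry is configured. One possible workaround (assuming root access; this appends to the system-wide config) is:

# let podman resolve short names such as moby/buildkit against docker.io
echo 'unqualified-search-registries = ["docker.io"]' | sudo tee -a /etc/containers/registries.conf

Alternatively, using the fully qualified name docker.io/moby/buildkit:v0.10.3 avoids the short-name lookup entirely.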

@gaocegege
Member

gaocegege commented Nov 10, 2022

Got this error when loading the image:

error: failed to receive status: rpc error: code = Unavailable desc = error reading from server: command [podman exec -i test buildctl dial-stdio] has exited with exit status 255, please make sure the URL is valid, and Docker 18.09 or later is installed on the remote host: stderr=Error: timed out waiting for file /home/gaocegege/.local/share/containers/storage/overlay-containers/0ba7a89450b410d52693393dedb9302df8e9983c3cfe165b51f58e10730017bf/userdata/5cc1397b230dd3147c997c03a1fa5493c3a334038311135ebde54d03656224db/exit/0ba7a89450b410d52693393dedb9302df8e9983c3cfe165b51f58e10730017bf: internal libpod error

@TaylorHere TaylorHere closed this Nov 10, 2022
@TaylorHere TaylorHere reopened this Nov 10, 2022
@TaylorHere
Author

TaylorHere commented Nov 10, 2022

> Got this error when loading the image:
>
> error: failed to receive status: rpc error: code = Unavailable desc = error reading from server: command [podman exec -i test buildctl dial-stdio] has exited with exit status 255, please make sure the URL is valid, and Docker 18.09 or later is installed on the remote host: stderr=Error: timed out waiting for file /home/gaocegege/.local/share/containers/storage/overlay-containers/0ba7a89450b410d52693393dedb9302df8e9983c3cfe165b51f58e10730017bf/userdata/5cc1397b230dd3147c997c03a1fa5493c3a334038311135ebde54d03656224db/exit/0ba7a89450b410d52693393dedb9302df8e9983c3cfe165b51f58e10730017bf: internal libpod error

I got this error too but don't know why; it may be a bug in podman. Podman's debug log shows the loaded image is an empty file, yet make e2e-test-cli sometimes passes, which is weird; maybe the CI test is not actually exercising podman but is connected to docker instead.


@TaylorHere
Author

TaylorHere commented Nov 10, 2022

 => exporting to oci image format                                                                                                                             2.7s
 => => exporting layers                                                                                                                                       1.5s
 => => exporting manifest sha256:d455a0ef6f205f908117953e36db96d41845bb9cec01bf325ba2babdeb1b8b77                                                             0.0s
 => => exporting config sha256:19009efecb6ec386bc52b0676e581b59144e600c5d8cbd3786ecc0c54a33528c                                                               0.0s
 => => sending tarball                                                                                                                                        1.1s
error: No such image: envd:dev: image not known

while running DOCKER_HOST=unix:///run/user/1000/podman/podman.sock envd up. The podman API service debug log for the same run:

@ - - [10/Nov/2022:15:38:16 +0800] "GET /v1.40/images/json?filters=%7B%22label%22%3A%7B%22ai.tensorchord.envd.build.digest%3D931dc819db3e572%22%3Atrue%7D%2C%22reference%22%3A%7B%22envd%3Adev%22%3Atrue%7D%7D HTTP/1.1" 200 3 "" "Go-http-client/1.1"
DEBU[0004] IdleTracker:idle 4m+0h/4t connection(s)       X-Reference-Id=0xc000010ff8
DEBU[0004] IdleTracker:new 4m+0h/4t connection(s)        X-Reference-Id=0xc000604018
DEBU[0004] IdleTracker:active 4m+0h/5t connection(s)     X-Reference-Id=0xc000604018
@ - - [10/Nov/2022:15:38:17 +0800] "HEAD /_ping HTTP/1.1" 200 0 "" "Go-http-client/1.1"
DEBU[0004] IdleTracker:idle 5m+0h/5t connection(s)       X-Reference-Id=0xc000604018
DEBU[0004] IdleTracker:active 5m+0h/5t connection(s)     X-Reference-Id=0xc000604018
@ - - [10/Nov/2022:15:38:17 +0800] "HEAD /_ping HTTP/1.1" 200 0 "" "Go-http-client/1.1"
DEBU[0004] IdleTracker:idle 5m+0h/5t connection(s)       X-Reference-Id=0xc000604018
DEBU[0004] IdleTracker:active 5m+0h/5t connection(s)     X-Reference-Id=0xc000604018
DEBU[0027] Loading image from "/var/tmp/api_load.tar4068737496" 
DEBU[0027] -> Attempting to load "/var/tmp/api_load.tar4068737496" as an OCI directory 
DEBU[0027] parsed reference into "[overlay@/home/taylor/.local/share/containers/storage+/run/user/1000/containers]localhost/var/tmp/api_load.tar4068737496:latest" 
DEBU[0027] Copying source image /var/tmp/api_load.tar4068737496: to destination image [overlay@/home/taylor/.local/share/containers/storage+/run/user/1000/containers]localhost/var/tmp/api_load.tar4068737496:latest 
DEBU[0027] Error loading /var/tmp/api_load.tar4068737496: initializing source oci:/var/tmp/api_load.tar4068737496:: open /var/tmp/api_load.tar4068737496/index.json: not a directory 
DEBU[0027] -> Attempting to load "/var/tmp/api_load.tar4068737496" as an OCI archive 
DEBU[0028] Error deleting temporary directory: <nil>    
DEBU[0028] parsed reference into "[overlay@/home/taylor/.local/share/containers/storage+/run/user/1000/containers]localhost/dev:latest" 
DEBU[0028] Copying source image /var/tmp/api_load.tar4068737496: to destination image [overlay@/home/taylor/.local/share/containers/storage+/run/user/1000/containers]localhost/dev:latest 
DEBU[0028] Using blob info cache at /home/taylor/.local/share/containers/cache/blob-info-cache-v1.boltdb 
DEBU[0028] IsRunningImageAllowed for image oci-archive:/var/tmp/api_load.tar4068737496 
DEBU[0028]  Using default policy section                
DEBU[0028]  Requirement 0: allowed                      
DEBU[0028] Overall: allowed                             
Getting image source signatures
DEBU[0028] Manifest has MIME type application/vnd.docker.distribution.manifest.v2+json, ordered candidate list [application/vnd.docker.distribution.manifest.v2+json, application/vnd.docker.distribution.manifest.v1+prettyjws, application/vnd.oci.image.manifest.v1+json, application/vnd.docker.distribution.manifest.v1+json] 
DEBU[0028] ... will first try using the original manifest unmodified 
DEBU[0028] Skipping blob sha256:c00545c54abe685118a9477bd95d99c1c4f98abd430b1fe8f67bb7a4a4de1d54 (already present): 
Copying blob c00545c54abe skipped: already exists
Copying blob 7a0805df587b skipped: already exists
Copying blob 1cdae5e5c7ff [--------------------------------------] 0.0b / 0.0b
Copying blob eaead16dc43b skipped: already exists
Copying blob c83403c23aee skipped: already exists
Copying blob e33b1614b502 skipped: already exists
Copying blob 77de15f6a868 [--------------------------------------] 0.0b / 0.0b
Copying blob f02229881901 [--------------------------------------] 0.0b / 0.0b
Copying blob 63d959beab39 [--------------------------------------] 0.0b / 0.0b
Copying blob d4eeb435307c [--------------------------------------] 0.0b / 0.0b
Copying blob 7f02186baa14 skipped: already exists
Copying blob 9504ef966851 skipped: already exists
Copying blob de49f3d1169c skipped: already exists
Copying blob 7348bb41d6ab [--------------------------------------] 0.0b / 0.0b
Copying blob 24c1dc02675d [--------------------------------------] 0.0b / 0.0b
Copying blob e0752d42ad07 done
Copying blob 0de0cb551022 [--------------------------------------] 0.0b / 0.0b
Copying blob f245bca5033c skipped: already exists
Copying blob dae5bef954c2 done
Copying blob 287c63037080 done
Copying blob 7192df8f072f done
Copying blob 391fb798110b done
Copying blob e9cec4a5a81e done
Copying blob 4f4fb700ef54 skipped: already exists
Copying blob 1a65025e6c29 done
Copying blob 9c720a6add6d done
Copying blob 90d05e8f4a9a done
DEBU[0029] No compression detected                      
DEBU[0029] Using original blob without modification     
Copying config b060cf2a73 done  
Writing manifest to image destination
Storing signatures
DEBU[0029] setting image creation date to 2022-11-10 07:38:37.635963342 +0000 UTC 
DEBU[0029] created new image ID "b060cf2a73ebe1ca7e5151e63c880a1e486e54ad3f7f18c27b44722fc67dfb00" 
DEBU[0029] saved image metadata "{\"signatures-sizes\":{\"sha256:a5e9392809164a7f3e184b4260bc24e0d86322713a377c46ad15f569999a91ef\":[]}}" 
DEBU[0029] set names of image "b060cf2a73ebe1ca7e5151e63c880a1e486e54ad3f7f18c27b44722fc67dfb00" to [localhost/dev:latest] 
DEBU[0030] error deleting tmp dir: <nil>                
@ - - [10/Nov/2022:15:38:17 +0800] "POST /v1.40/images/load?quiet=1 HTTP/1.1" 200 50 "" "Go-http-client/1.1"
DEBU[0030] IdleTracker:idle 5m+0h/5t connection(s)       X-Reference-Id=0xc000604018
DEBU[0030] IdleTracker:closed 5m+0h/5t connection(s)     X-Reference-Id=0xc000604018
DEBU[0030] IdleTracker:new 4m+0h/5t connection(s)        X-Reference-Id=0xc0005a8370
DEBU[0030] IdleTracker:active 4m+0h/6t connection(s)     X-Reference-Id=0xc0005a8370
@ - - [10/Nov/2022:15:38:42 +0800] "HEAD /_ping HTTP/1.1" 200 0 "" "Go-http-client/1.1"
DEBU[0030] IdleTracker:idle 5m+0h/6t connection(s)       X-Reference-Id=0xc0005a8370
DEBU[0030] IdleTracker:active 5m+0h/6t connection(s)     X-Reference-Id=0xc0005a8370
INFO[0030] Request Failed(Not Found): no container with name or ID "envd" found: no such container 
@ - - [10/Nov/2022:15:38:42 +0800] "GET /v1.40/containers/envd/json HTTP/1.1" 404 120 "" "Go-http-client/1.1"
DEBU[0030] IdleTracker:idle 5m+0h/6t connection(s)       X-Reference-Id=0xc0005a8370
DEBU[0030] IdleTracker:active 5m+0h/6t connection(s)     X-Reference-Id=0xc0005a8370
DEBU[0030] Looking up image "envd:dev" in local containers storage 
DEBU[0030] Trying "envd:dev" ...                        
DEBU[0030] Loading registries configuration "/etc/containers/registries.conf" 
DEBU[0030] Loading registries configuration "/etc/containers/registries.conf.d/shortnames.conf" 
DEBU[0030] Trying "localhost/envd:dev" ...              
DEBU[0030] Trying "docker.io/library/envd:dev" ...      
DEBU[0030] Trying "docker.io/library/envd:dev" ...      
INFO[0030] Request Failed(Not Found): No such image: envd:dev: image not known 
@ - - [10/Nov/2022:15:38:42 +0800] "POST /v1.40/containers/create?name=envd HTTP/1.1" 404 96 "" "Go-http-client/1.1"
DEBU[0030] IdleTracker:idle 5m+0h/6t connection(s)       X-Reference-Id=0xc0005a8370
DEBU[0030] IdleTracker:closed 5m+0h/6t connection(s)     X-Reference-Id=0xc000010ff8
DEBU[0030] IdleTracker:closed 4m+0h/6t connection(s)     X-Reference-Id=0xc000604040
DEBU[0030] IdleTracker:closed 3m+0h/6t connection(s)     X-Reference-Id=0xc000604010
DEBU[0030] IdleTracker:closed 2m+0h/6t connection(s)     X-Reference-Id=0xc0005a8370
DEBU[0030] IdleTracker:closed 1m+0h/6t connection(s)     X-Reference-Id=0xc000604048

It seems podman treats the image name as localhost/dev; envd up works after manually re-tagging the image to envd:dev.
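
For readers hitting the same thing, a sketch of the manual re-tag described above:

# buildkit's tarball is loaded as localhost/dev:latest under podman; re-tag it so the
# lookup for envd:dev (which podman expands to localhost/envd:dev) succeeds
podman tag localhost/dev:latest envd:dev
DOCKER_HOST=unix:///run/user/1000/podman/podman.sock envd up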

@TaylorHere
Author

TaylorHere commented Nov 10, 2022

> It seems podman treats the image name as localhost/dev; envd up works after manually re-tagging the image to envd:dev.

Here is the index.json that buildkit produces:

{
    "schemaVersion": 2,
    "manifests": [
        {
            "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
            "digest": "sha256:f11ea37a75710ce5de012179bdd931851aef4b3549b6e07f99fa5df0c26803b5",
            "size": 6364,
            "annotations": {
                "io.containerd.image.name": "docker.io/library/envd:dev",
                "org.opencontainers.image.created": "2022-11-10T17:59:24Z",
                "org.opencontainers.image.ref.name": "dev"
            }
        }
    ]
}

Note that buildkit records the full name only in the io.containerd.image.name annotation, while org.opencontainers.image.ref.name carries just the bare tag "dev"; podman apparently uses the latter, which is why the image ends up as localhost/dev:latest. Here are some conversations around this:
containers/podman#10809
containers/podman#11619
containers/podman#12560

moby/buildkit#1218

containers/common#793
containers/common#853

https://github.com/opencontainers/image-spec/blob/main/annotations.md#pre-defined-annotation-keys

@gaocegege
Member

Maybe it is related to the OCI image spec. The Docker image spec is slightly different from the OCI image spec, I think.

@TaylorHere
Author

TaylorHere commented Nov 11, 2022

podman fixed the image-loading issue in containers/common#793, but Ubuntu 22 does not ship a newer version of podman yet.
I manually tested envd up with podman 4.2.0 and everything works fine.
I will commit more tests later.
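
A sketch of that manual test, assuming the fix ships in podman >= 4.2.0 as noted:

# confirm the client is new enough to contain the containers/common#793 fix
podman --version
# then run envd against the rootless socket as before
DOCKER_HOST=unix:///run/user/1000/podman/podman.sock envd up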
