This repository has been archived by the owner on Apr 24, 2023. It is now read-only.

Support Docker containers in the spark client #951

Open
pschorf opened this issue Sep 4, 2018 · 3 comments

Comments

@pschorf
Contributor

pschorf commented Sep 4, 2018

No description provided.

@PerilousApricot

Is it possible that the (undocumented) option spark.executor.cook.container [1] could be used to emulate "native" Docker support?

https://github.com/twosigma/spark/blob/a72488df9e89d2570a0dbbb80767b398a87e1d8d/resource-managers/cook/src/main/scala/org/apache/spark/scheduler/cluster/cook/CookSchedulerContext.scala#L47

@PerilousApricot

Okay, I've started work on this. My goal is to wire the existing Mesos/Docker config options into Cook rather than adding Cook-specific keys for the same functionality. Is that acceptable?
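A minimal sketch of what reusing the existing Mesos/Docker keys could look like from the user's side (the image name, volume mapping, class, and jar are placeholders; whether the Cook scheduler honors these exact keys depends on the eventual patch, not on anything shipped today):

```shell
# Sketch: submitting a job with the existing spark.mesos.executor.docker.*
# configuration keys, as proposed above. All values here are hypothetical;
# Cook support for these keys is the subject of this issue.
spark-submit \
  --conf spark.mesos.executor.docker.image=myorg/spark-executor:2.3.1 \
  --conf spark.mesos.executor.docker.volumes=/data:/data:ro \
  --class com.example.Main \
  app.jar
```

Reusing the Mesos key names would let existing Mesos-based deployments move to Cook without rewriting their submission configs.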

@pschorf
Copy link
Contributor Author

pschorf commented Sep 10, 2018

Sounds good to me.
