
Cloud edge model parallel training #231

Open
skrlin opened this issue Nov 3, 2021 · 2 comments
Labels
kind/question Indicates an issue that is a support question.

Comments


skrlin commented Nov 3, 2021

It is unclear whether Sedna supports traditional distributed training methods, such as model parallelism.
For example, I could divide a model into layers and distribute the training of different layers to different edge nodes or central cloud nodes.
Does Sedna support this?
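The scheme described above — partitioning a model by layers across nodes — can be sketched in plain Python. This is purely illustrative and not part of Sedna's API; the `Node` class below only simulates an edge or cloud worker that owns a contiguous slice of layers and hands its activations to the next worker.

```python
# Illustrative sketch (not Sedna API): layer-wise model parallelism.
# Each simulated "node" owns a contiguous slice of layers; activations
# are handed off node-to-node during the forward pass.

def make_layer(w):
    # A toy layer: scale the activation by a weight w.
    return lambda x: w * x

class Node:
    """Simulates an edge/cloud node that owns a partition of layers."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# Split a 4-layer "model" across two nodes.
layers = [make_layer(w) for w in (2, 3, 4, 5)]
edge_node = Node(layers[:2])     # first half runs at the edge
cloud_node = Node(layers[2:])    # second half runs in the cloud

activation = edge_node.forward(1.0)      # edge computes layers 0-1
output = cloud_node.forward(activation)  # cloud computes layers 2-3
print(output)  # 120.0 == 2*3*4*5
```

In a real deployment the activation hand-off would be a network transfer between pods, which is exactly the coupling that makes split models fragile when a node goes offline.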

@skrlin skrlin added the kind/question Indicates an issue that is a support question. label Nov 3, 2021
JoeyHwong-gk (Contributor) commented:

Distributed training falls under the category of machine learning frameworks, so support for it depends on the framework you use.


MooreZheng commented Feb 7, 2022

Our team had quite a few discussions on this issue. We understand that traditional distributed learning can help reduce training time. Sedna aims to support both edge-cloud collaborative training and inference, but at present it focuses on stand-alone model execution. Stand-alone models help ensure robust service at runtime on each node, especially when some nodes go offline, which makes them a better choice than model splitting. We can discuss the pros and cons of stand-alone models versus split models in a weekly meeting if you are interested. We also warmly welcome new proposals in KubeEdge SIG AI.

1. If one is looking for tools
As for distributed learning, Sedna supports both edge-cloud collaborative training and inference. For example, you can approach it via federated learning, which is intrinsically distributed multi-task learning when privacy is not a major concern.
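The aggregation step at the heart of federated learning can be shown in a few lines. This is a generic federated-averaging (FedAvg) sketch, not Sedna's implementation: each client trains locally, and the server averages the client weights, weighted by local dataset size.

```python
# Illustrative sketch (not Sedna's implementation): federated
# averaging (FedAvg). The server aggregates per-client weight
# vectors, weighting each client by its local sample count.

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client weight vectors."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    avg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            avg[i] += w[i] * n / total
    return avg

# Two edge clients with different data volumes.
w_a = [1.0, 2.0]   # client A's locally trained weights (1 sample)
w_b = [3.0, 6.0]   # client B's locally trained weights (3 samples)
global_w = fed_avg([w_a, w_b], client_sizes=[1, 3])
print(global_w)  # [2.5, 5.0]
```

Because only weights travel between nodes, each client keeps serving from its own full model even if the aggregation round is missed — which is why this fits Sedna's stand-alone-model emphasis better than layer splitting does.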

One can also share real-world requirements or projects to help community members identify further technical issues to tackle. We welcome real-world applications built with Sedna.

2. If one is planning for a related proposal
We also welcome more proposals on distributed learning in Sedna. One can introduce related techniques in routine meetings.

Would you mind telling community members more about why Sedna needs this, e.g., applications, requirements, or techniques? Sedna routine meetings are held weekly (every Thursday) at https://zoom.us/my/kubeedge.

Besides, a similar issue was posted recently as #276.
