
Canvas widget #32

Closed
hecrj opened this issue Oct 23, 2019 · 9 comments
Labels: feature (New feature or request) · help wanted (Extra attention is needed) · widget

Comments

@hecrj
Member

hecrj commented Oct 23, 2019

A widget to draw freely in 2D or 3D. It could be used to draw charts, implement a Paint clone, a CAD application, etc.

As a first approach, we could expose the underlying renderer directly here and couple this widget with it (wgpu for now). Once wgpu gets WebGL or WebGPU support, this widget will be able to run on the web too. The renderer primitive could be a simple texture that the widget draws to.

In the long run, we could expose a renderer-agnostic abstraction to perform the drawing.

@hecrj hecrj added feature New feature or request help wanted Extra attention is needed labels Oct 23, 2019
@hecrj hecrj added this to the 0.3.0 milestone Oct 23, 2019
@Xaeroxe
Contributor

Xaeroxe commented Nov 30, 2019

Just gonna drop my two cents here.

Rendering a custom widget doesn't make much sense unless you know both the renderer and the widget.

This is where low-level rendering access becomes important: for example, if I'm rendering a virtual scene in my widget, I'll probably want to drive that via the graphics card. However, not all rendering stacks will have a GPU available to them, so it may be hard to present this in a platform-agnostic way. Additionally, the tech used to access the GPU may differ from platform to platform.

So to create this, I'd define the drawing trait in terms of the rendering stack, which is actually really similar to what you've already done with the myriad "Renderer" traits. So I suppose the takeaway of my comment is based on this sentence in the initial comment:

In the long run, we could expose a renderer-agnostic abstraction to perform the drawing.

That's probably not worth doing, as the means by which a custom widget could be rendered are vast and unknowable from the perspective of iced.

@parasyte

parasyte commented Feb 3, 2020

I started working on a first pass at this. I'm having problems with the Renderer trait exported from iced_native, and Application exported from iced_core.

I need some way to update the Renderer with a custom wgpu pipeline (created by user code, outside of iced). I thought about passing a closure into Application::run, but specifying the types is causing me grief; what's the input? With the wgpu backend, we want a wgpu::Device, and the output is a CanvasPipeline type that the Renderer can use. This really complicates the public API.

The only improvement I can think of trying is using Box&lt;dyn Any&gt; for both the input and output types, and downcasting them where needed.
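To make the idea concrete, here is a minimal, self-contained sketch of that type-erasure approach. The types (`CanvasPipeline`, `FakeDevice`) and the function name are illustrative stand-ins, not iced's or wgpu's actual API; in a real backend the downcast target would be `wgpu::Device`:

```rust
use std::any::Any;

// Hypothetical pipeline type that user code would build from the device.
struct CanvasPipeline {
    label: String,
}

// Stand-in for the backend's device, to keep the sketch self-contained.
struct FakeDevice {
    name: String,
}

// The type-erased hook: the closure-like function takes the backend's device
// as `&dyn Any` and returns the pipeline as `Box<dyn Any>`, so the public API
// never has to name backend-specific types directly.
fn build_pipeline(device: &dyn Any) -> Box<dyn Any> {
    let device = device
        .downcast_ref::<FakeDevice>()
        .expect("expected the wgpu backend's device");
    Box::new(CanvasPipeline {
        label: format!("pipeline on {}", device.name),
    })
}

fn main() {
    let device = FakeDevice { name: "gpu0".into() };
    let erased = build_pipeline(&device);
    // The renderer downcasts back to the concrete pipeline when drawing.
    let pipeline = erased.downcast::<CanvasPipeline>().unwrap();
    println!("{}", pipeline.label); // prints "pipeline on gpu0"
}
```

The downside, as noted, is that every boundary crossing needs a downcast that can fail at runtime instead of being checked at compile time.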

Does anyone have any other suggestions? This would probably be a lot easier without the multiple abstraction layers spread across several crates. I only care about supporting wgpu for now (see the comment directly above for rationale; I completely agree with that sentiment.)

Because of these challenges, I'm not comfortable sharing the code. I just don't think it's reviewable in this state.

@hecrj
Member Author

hecrj commented Feb 3, 2020

The first step here is to create a Canvas widget with an API similar to the CanvasRenderingContext2D on the Web.

I think an implementation of this API can be achieved without exposing internal rendering details, like pipelines or shaders.

The cool thing about this approach is that we could make the Canvas widget work on the Web almost for free with a trivial implementation.
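As a rough illustration of that direction, a renderer-agnostic 2D API could record drawing commands that any backend replays. This is a hypothetical sketch in the spirit of CanvasRenderingContext2D; the names (`Frame`, `fill_rect`, etc.) are illustrative, not iced's actual API:

```rust
// Commands a backend-neutral 2D canvas might record.
#[derive(Debug, PartialEq)]
enum Command {
    FillRect { x: f32, y: f32, w: f32, h: f32 },
    MoveTo { x: f32, y: f32 },
    LineTo { x: f32, y: f32 },
    Stroke,
}

// A frame records commands; the active backend (wgpu, web canvas, ...)
// would later translate them into actual draw calls.
#[derive(Default)]
struct Frame {
    commands: Vec<Command>,
}

impl Frame {
    fn fill_rect(&mut self, x: f32, y: f32, w: f32, h: f32) {
        self.commands.push(Command::FillRect { x, y, w, h });
    }
    fn move_to(&mut self, x: f32, y: f32) {
        self.commands.push(Command::MoveTo { x, y });
    }
    fn line_to(&mut self, x: f32, y: f32) {
        self.commands.push(Command::LineTo { x, y });
    }
    fn stroke(&mut self) {
        self.commands.push(Command::Stroke);
    }
}

fn main() {
    let mut frame = Frame::default();
    frame.fill_rect(0.0, 0.0, 100.0, 50.0);
    frame.move_to(0.0, 0.0);
    frame.line_to(100.0, 50.0);
    frame.stroke();
    assert_eq!(frame.commands.len(), 4);
}
```

On the web, such a command list could map almost one-to-one onto the real CanvasRenderingContext2D, which is what makes the "almost for free" claim plausible.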

There has been some work done in this direction and it looks promising. I think I'll be able to share something soon!


@parasyte Could you describe your use case? Why do you need to create custom pipelines? I believe you would need a different kind of widget than the one described in this issue, at least for now.

If you want to render your scene with Iced on top, I believe the way to go for that is to integrate your current wgpu renderer with the iced_wgpu crate. As you probably are aware, this is currently not possible. However, I think it's better to invest time exploring in that direction rather than trying to create a Canvas widget that does too much without clear boundaries.

@parasyte

parasyte commented Feb 3, 2020

The specific use case I have in mind is building a tile map editor (like Tiled). The CanvasRenderingContext2D API isn't of much value to me, since I would almost exclusively make use of the ImageData equivalent. The Canvas2D API has a lot of power, but it is a poor fit for applications like this, not to mention CAD and 3D modeling.

ImageData isn’t a great fit either. WebGL is much closer to what I would like. It would just make things like indexed palette rendering and dithering so much easier.

These reasons (and others: caching, and no support for animations) rule out the Image widget for me, too. Hope that helps you understand where I'm coming from.

@hecrj
Member Author

hecrj commented Feb 20, 2020

@parasyte Now that #183 and #193 have landed, I think we should be able to start exploring more powerful solutions to satisfy use cases similar to yours.

While #193 only supports basic 2D graphics, I think it should be possible to build a similar widget for advanced graphics (custom shaders, 3D, etc.) by exposing wgpu directly.

@hecrj
Member Author

hecrj commented Feb 22, 2020

@n826vnl8 I think the seeds are planted: an advanced canvas widget can be implemented using the current Canvas implementation as a guide, while adding a Primitive::Texture or similar to iced_wgpu.

Because of this, I am moving on to working on other features that may challenge the current design of the library (like #27, #30, and #31) and can help us notice any shortcomings.

Thus, no plans yet! That said, I will happily discuss use cases and review any design ideas. Everyone is welcome to join the Zulip server and start a topic there!

@dhardy

dhardy commented Apr 3, 2020

Possibly of interest: I achieved this in KAS via a CustomPipe trait — the user implements this plus a CustomWindow trait, and uses the latter to parametrise the toolkit's pipeline. This only works because the rendering pipeline is extensible, and it still requires a downcast to do the actual drawing. Perhaps a similar method could be used in Iced?
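The shape of that extension point can be sketched in a few lines: the toolkit is generic over a user-supplied pipe trait and stores it type-erased, so user draw code has to downcast back to the concrete type. All names here (`CustomPipe`, `Toolkit`, `MyPipe`) are illustrative, not KAS's or iced's actual API:

```rust
use std::any::Any;

// User-implemented extension trait; `Any` as a supertrait enables downcasting.
trait CustomPipe: Any {
    fn render(&mut self) -> &'static str;
}

struct MyPipe;

impl CustomPipe for MyPipe {
    fn render(&mut self) -> &'static str {
        "drew custom scene"
    }
}

// The toolkit stores the pipe type-erased and hands it back for drawing.
struct Toolkit {
    pipe: Box<dyn Any>,
}

impl Toolkit {
    fn new<P: CustomPipe>(pipe: P) -> Self {
        Toolkit { pipe: Box::new(pipe) }
    }

    fn draw<P: CustomPipe>(&mut self) -> &'static str {
        // The downcast is the price of keeping the toolkit's pipeline generic.
        let pipe = self.pipe.downcast_mut::<P>().expect("wrong pipe type");
        pipe.render()
    }
}

fn main() {
    let mut toolkit = Toolkit::new(MyPipe);
    assert_eq!(toolkit.draw::<MyPipe>(), "drew custom scene");
}
```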

BTW I think iced_wgpu docs are missing the "canvas" feature?

@pacmancoder
Contributor

pacmancoder commented Oct 18, 2021

Hi! I am currently searching for a GUI framework for my existing project (RustZX), and iced feels like it would be a good fit, allowing me to build a GUI for both desktop and the web.

However, it is still missing the core feature I need: the ability to draw a texture on the canvas with a given filtering mode (specifically, nearest neighbor). As I understand it, this issue is the blocker for that to happen.
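For context on why the filtering mode matters for an emulator like this: nearest-neighbor sampling picks the single closest source pixel per destination pixel, so scaled-up pixel art stays crisp instead of being blurred by linear filtering. On the GPU this is just a sampler setting; the function below is a CPU stand-in to illustrate the semantics, not iced or wgpu code:

```rust
// Scale an sw x sh image of packed pixels to dw x dh using nearest-neighbor
// sampling: each destination pixel copies the closest source pixel.
fn scale_nearest(src: &[u32], sw: usize, sh: usize, dw: usize, dh: usize) -> Vec<u32> {
    let mut dst = vec![0u32; dw * dh];
    for dy in 0..dh {
        let sy = dy * sh / dh; // nearest source row
        for dx in 0..dw {
            let sx = dx * sw / dw; // nearest source column
            dst[dy * dw + dx] = src[sy * sw + sx];
        }
    }
    dst
}

fn main() {
    // A 2x2 image scaled to 4x4: each source pixel becomes a 2x2 block,
    // with hard edges preserved (no blending between neighbors).
    let src = [1, 2, 3, 4];
    let dst = scale_nearest(&src, 2, 2, 4, 4);
    assert_eq!(&dst[0..4], &[1, 1, 2, 2]);
    assert_eq!(&dst[12..16], &[3, 3, 4, 4]);
}
```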

The original post mentions that wgpu lacks support for WebGL, but I believe this has changed recently: I have successfully compiled some wgpu examples with the WebGL backend.

What steps should we take to make this possible, @hecrj?

@hecrj hecrj added the widget label Jan 18, 2022
@hecrj
Member Author

hecrj commented Jan 18, 2022

Since we already have a 2D Canvas widget, we can close this and keep discussing the 3D widget in #343.

@hecrj hecrj closed this as completed Jan 18, 2022

6 participants
@dhardy @parasyte @hecrj @pacmancoder @Xaeroxe and others