
Canvas widget · iced (CLOSED, 9 comments)

iced-rs commented on May 6, 2024
Canvas widget

from iced.

Comments (9)

hecrj commented on May 6, 2024

@parasyte Now that #183 and #193 have landed, I think we should be able to start exploring more powerful solutions to satisfy use cases similar to yours.

While #193 only supports basic 2D graphics, I think it should be possible to build a similar widget for advanced graphics (custom shaders, 3D, etc.) by exposing wgpu directly.


dhardy commented on May 6, 2024

Possibly of interest, I achieved this in KAS via use of a CustomPipe trait — the user implements this plus a CustomWindow trait, and uses the latter to parametrise the toolkit's pipeline. This only works because the rendering pipeline is extensible and still requires a downcast to do the actual drawing. Perhaps a similar method could be used in Iced?
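For illustration, a minimal, self-contained sketch of that pattern; all of the names here (`CustomPipe`, `DrawContext`, `Pipeline`) are placeholders rather than KAS's or iced's actual types, but they show how a toolkit's pipeline can be parametrised over a user-implemented draw trait:

```rust
// Hypothetical sketch of a CustomPipe-style extension point.
// These names are stand-ins, not KAS or iced types; they only
// illustrate parametrising a toolkit pipeline over user code.

/// Stand-in for whatever draw context the toolkit would hand out.
struct DrawContext {
    commands: Vec<String>,
}

/// User-implemented trait: the toolkit calls this at render time.
trait CustomPipe {
    fn draw(&self, ctx: &mut DrawContext);
}

/// The toolkit's pipeline, generic over the user's pipe.
struct Pipeline<P: CustomPipe> {
    pipe: P,
}

impl<P: CustomPipe> Pipeline<P> {
    fn render(&self) -> Vec<String> {
        let mut ctx = DrawContext { commands: Vec::new() };
        self.pipe.draw(&mut ctx);
        ctx.commands
    }
}

/// The user's side: a scene that knows how to draw itself.
struct MyScene;

impl CustomPipe for MyScene {
    fn draw(&self, ctx: &mut DrawContext) {
        ctx.commands.push("draw triangle".to_string());
    }
}

fn main() {
    let pipeline = Pipeline { pipe: MyScene };
    println!("{:?}", pipeline.render());
}
```

The key property is that the toolkit never needs to know the concrete scene type; it only requires the trait bound, so no downcast is needed on the toolkit side for this half of the design.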

BTW I think iced_wgpu docs are missing the "canvas" feature?


hecrj commented on May 6, 2024

Since we already have a 2D Canvas widget, we can close this and keep discussing the 3D widget in #343.


Xaeroxe commented on May 6, 2024

Just gonna drop my two cents here.

Rendering a custom widget doesn't make much sense unless you know both the renderer and the widget.

This is where low-level rendering access becomes important: for example, if I'm rendering a virtual scene in my widget, I'll probably want to drive that with the GPU. However, not all rendering stacks will have a GPU available to them, so it may be hard to present this in a platform-agnostic way. Additionally, the tech used to access the GPU may differ from platform to platform.

So to create this I'd define the drawing trait in terms of the rendering stack, which is actually quite similar to what you've already done with the myriad "Renderer" traits. So I suppose the takeaway of my comment comes down to this sentence from the initial comment:

In the long run, we could expose a renderer-agnostic abstraction to perform the drawing.

That's probably not worth doing, as the means by which a custom widget could be rendered are vast and unknowable from the perspective of iced.


parasyte commented on May 6, 2024

I started working on a first pass at this, but I'm having problems with the Renderer trait exported from iced_native and the Application trait exported from iced_core.

I need some way to update the Renderer with a custom wgpu::Pipeline (created by user code, outside of iced). I thought about passing a closure into Application::run, but specifying the types is causing me grief: what's the input? With the wgpu backend, we want a wgpu::Device, and the output is a CanvasPipeline type that the Renderer can use. This really complicates the public API.

The only improvement I can think of trying is using Box&lt;dyn Any&gt; for both the input and output types and downcasting them where needed.
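To make the type-erasure idea concrete, here is a minimal sketch using only the standard library. `FakeDevice` and `CanvasPipeline` below are stand-ins for the real wgpu/iced types; only the `Box<dyn Any>` plumbing is the point:

```rust
// Sketch of the Box<dyn Any> approach: the user's setup closure takes
// an opaque input, downcasts it to the backend-specific type, and
// returns an opaque output that the renderer downcasts back.
// FakeDevice and CanvasPipeline are stand-ins, not iced/wgpu types.

use std::any::Any;

struct FakeDevice {
    name: String,
}

struct CanvasPipeline {
    label: String,
}

/// The user-supplied setup function, type-erased at the API boundary.
fn build_pipeline(input: Box<dyn Any>) -> Box<dyn Any> {
    // Downcast to the concrete device type this backend expects;
    // a mismatched type is a programmer error, so we panic loudly.
    let device = input
        .downcast::<FakeDevice>()
        .unwrap_or_else(|_| panic!("expected a FakeDevice"));

    Box::new(CanvasPipeline {
        label: format!("pipeline for {}", device.name),
    })
}

fn main() {
    let device = Box::new(FakeDevice { name: "wgpu".into() });
    let output = build_pipeline(device);

    // The renderer downcasts the opaque output back to its pipeline type.
    let pipeline = output.downcast::<CanvasPipeline>().unwrap();
    println!("{}", pipeline.label);
}
```

The trade-off is exactly the one described above: both sides must agree out of band on the concrete types, and the compiler can no longer check that agreement, so mismatches surface only at runtime.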

Does anyone have any other suggestions? This would probably be a lot easier without the multiple abstraction layers spread across several crates. I only care about supporting wgpu for now (see the comment directly above for the rationale; I completely agree with that sentiment).

Because of these challenges, I'm not comfortable sharing the code. I just don't think it's reviewable in this state.


hecrj commented on May 6, 2024

The first step here is to create a Canvas widget with an API similar to the CanvasRenderingContext2D on the Web.

I think an implementation of this API can be achieved without exposing internal rendering details, like pipelines or shaders.

The cool thing about this approach is that we could make the Canvas widget work on the Web almost for free with a trivial implementation.

There has been some work done in this direction and it looks promising. I think I'll be able to share something soon!
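One way such an API can avoid exposing pipelines or shaders is to record drawing commands into a frame and let each backend replay them. A hypothetical sketch of that shape (these names are illustrative, not iced's actual API):

```rust
// Hypothetical sketch of a CanvasRenderingContext2D-flavoured API.
// `Frame` records drawing commands instead of touching pipelines or
// shaders; a backend (wgpu, or a trivial Web <canvas> shim) could
// replay the recorded list. None of these names are iced's real API.

#[derive(Debug, PartialEq)]
enum Command {
    MoveTo(f32, f32),
    LineTo(f32, f32),
    Fill,
}

#[derive(Default)]
struct Frame {
    commands: Vec<Command>,
}

impl Frame {
    fn move_to(&mut self, x: f32, y: f32) {
        self.commands.push(Command::MoveTo(x, y));
    }

    fn line_to(&mut self, x: f32, y: f32) {
        self.commands.push(Command::LineTo(x, y));
    }

    fn fill(&mut self) {
        self.commands.push(Command::Fill);
    }
}

fn main() {
    // Record a small path, mirroring the Web API's moveTo/lineTo/fill flow.
    let mut frame = Frame::default();
    frame.move_to(0.0, 0.0);
    frame.line_to(10.0, 0.0);
    frame.fill();

    // A backend would now translate these commands into draw calls.
    println!("{:?}", frame.commands);
}
```

Because the recorded commands are backend-agnostic data, the Web implementation really is almost free: it just forwards each command to the corresponding CanvasRenderingContext2D method.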


@parasyte Could you describe your use case? Why do you need to create custom pipelines? I believe you would need a different kind of widget than the one described in this issue, at least for now.

If you want to render your scene with Iced on top, I believe the way to go for that is to integrate your current wgpu renderer with the iced_wgpu crate. As you probably are aware, this is currently not possible. However, I think it's better to invest time exploring in that direction rather than trying to create a Canvas widget that does too much without clear boundaries.


parasyte commented on May 6, 2024

The specific use case I have in mind is building a tile map editor (like Tiled). The CanvasRenderingContext2D API isn't of much value to me, since I would almost exclusively use the ImageData equivalent. The Canvas2D API has a lot of power, but it is a poor fit for applications like this, not to mention CAD and 3D modeling.

ImageData isn’t a great fit either. WebGL is much closer to what I would like. It would just make things like indexed palette rendering and dithering so much easier.

These reasons (and others, like caching and the lack of animation support) rule out the Image widget for me, too. Hope that helps you understand where I'm coming from.


hecrj commented on May 6, 2024

@n826vnl8 I think the seeds are planted and an advanced canvas widget can be implemented using the current Canvas implementation as a guide while adding a Primitive::Texture or similar to iced_wgpu.

Because of this, I am moving on to working on other features that may challenge the current design of the library (like #27, #30, and #31) and can help us notice any shortcomings.

Thus, no plans yet! That said, I will happily discuss use cases and review any design ideas. Everyone is welcome to join the Zulip server and start a topic there!


pacmancoder commented on May 6, 2024

Hi! I am currently searching for a GUI framework for my existing project (RustZX), and iced feels like a good fit, allowing me to build a GUI for both desktop and the web.

However, it is still missing the core feature I need: the ability to draw a texture on the canvas with a given filtering mode (specifically, nearest neighbor). As I understand it, this issue is the blocker for that to happen.
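For context, nearest-neighbor filtering copies the closest source pixel for each destination pixel, preserving the hard pixel edges an emulator framebuffer needs (in wgpu this corresponds to a sampler created with `FilterMode::Nearest`). The CPU sketch below only illustrates the sampling math, not any actual iced or wgpu API:

```rust
// Nearest-neighbor upscaling on the CPU: each destination pixel copies
// the nearest source pixel, so hard edges stay hard (no blurring, as
// linear filtering would produce). On the GPU this is what a sampler
// with wgpu::FilterMode::Nearest does per fragment.

fn scale_nearest(src: &[u32], sw: usize, sh: usize, dw: usize, dh: usize) -> Vec<u32> {
    let mut dst = vec![0u32; dw * dh];
    for y in 0..dh {
        for x in 0..dw {
            // Map the destination coordinate back to the nearest source texel.
            let sx = x * sw / dw;
            let sy = y * sh / dh;
            dst[y * dw + x] = src[sy * sw + sx];
        }
    }
    dst
}

fn main() {
    // A 2x1 source (red, green) scaled to 4x1: each source pixel
    // covers exactly two destination columns, with no blending.
    let src = [0x00FF_0000u32, 0x0000_FF00u32];
    let dst = scale_nearest(&src, 2, 1, 4, 1);
    println!("{:08X?}", dst);
}
```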

The original task mentions that wgpu lacks support for WebGL, but I believe this has changed recently: I have successfully compiled some wgpu examples with the WebGL backend.

What steps should we take to make this possible, @hecrj?

