<lethalbit>
Here is a quick question: I know that you went with dearimgui for perf reasons for the waveform displays, right? Have you thought about maybe moving to something like Qt for the GUI? You could keep the waveform displays as realtime vulkan/opengl surfaces and get the performance you need out of them, but also not have to deal with the pain that is dearimgui
<lethalbit>
Not sure if this was discussed already or not,
<lethalbit>
was just thinking about it and wondering
<azonenberg>
lethalbit: We moved away from GTK due to poor performance and imgui worked much better with all of the complex window management etc
<lethalbit>
Yeah fair, it is GTK after all :v (Qt is much better imo)
<azonenberg>
it's a very nice development flow
<azonenberg>
I remember we looked at qt, i dont recall all the factors that went into the decision
<lethalbit>
and fair, the issue I have with imgui is it's rough and really badly breaks system cohesion
<azonenberg>
Yes, it doesnt integrate as much with the OS visuals as qt does
<lethalbit>
and things like imgui's docking, no matter the application, always break for me when popping out external windows
<azonenberg>
(also a11y isn't great either, although for such a visually centric app that isnt a huge deal)
<lethalbit>
yeah
<azonenberg>
i.e. if you're blind enough to need a screen reader, it doesnt really matter if you can have it read the gui chrome to you
<azonenberg>
the app is still going to be pretty useless since you can't see the waveforms
<azonenberg>
and there's no practical way around that
<lethalbit>
I partially ask because if it were in Qt, I would likely be more able/willing to help with the UI stuff; not only do I know it much, much better, but immediate-mode GUIs just irk me in a way I can't explain, so I just don't ever touch them code-wise if I can avoid it
<lethalbit>
but you're the project lead so it's all up to you! Was just curious so I wanted to ask
<lethalbit>
no worries either way
<azonenberg>
Yeah i remember it was carefully considered and rejected
<azonenberg>
as was wxwidgets
<lethalbit>
wx I can understand, but why Qt, if I may ask?
<lethalbit>
(not saying Qt is good, all GUI toolkits suck major eggs, but Qt is better than most:tm:)
<azonenberg>
Dont remember, it was years ago and i dont think we wrote it all up in one place; if you check chat logs from 5 years ago you might get some hints :p
<lethalbit>
Ough
<lethalbit>
Anyway, fair, fair
<azonenberg>
sorry i dont have a better answer lol
<lethalbit>
Nah, it's all good,
<lethalbit>
it was a silly question on my part
<azonenberg>
What i can say is that long term ngscopeclient does not have to be the only frontend for libscopehal/libscopeprotocols
<azonenberg>
it's roughly 34K of 225K lines of code in the project
<azonenberg>
so it's comparatively simple to replace/supplement with e.g. a touch optimized and/or mobile frontend
<azonenberg>
like, given how much ngscopeclient absolutely screams on Macs
<azonenberg>
how well would it run on an iPad?
<azonenberg>
you obviously would not want to run ngscopeclient as-is
<azonenberg>
but the same backend with a touch optimized UI in front of it could be amazingly performant
<lethalbit>
Yeah
<lethalbit>
maybe one day a big $VENDOR will ship it on their big fancy DSOs :v
<lethalbit>
but I might try to hack around a Qt variant, just for fun then
<lethalbit>
*shrug*
<lethalbit>
(because ofc I need an additional project /j lol)
<azonenberg>
Lol. Well I won't stop you, and it might ultimately result in some refactorings that lead to better front/back end decoupling across the project lol
<lethalbit>
*shrug*
<lethalbit>
I actually,
<lethalbit>
wanted to write an OBS plugin for it, so I could add things as video sources when streaming
<lethalbit>
which could be neat
<azonenberg>
oh interesting, like a single WaveformArea into OBS?
<azonenberg>
(I've done full-window captures of ngscopeclient before, that's how i film demo videos at the moment, but i normally am doing pre-recorded not live streaming)
<lethalbit>
Yeah
<lethalbit>
Well, hmm, so it might actually be better if ngscopeclient could expose a shared texture to pipewire via DMA-BUF
<lethalbit>
that could then be added to OBS
<lethalbit>
that way you don't need to deal with the nightmare of setup for the instrument and such in OBS
<azonenberg>
This would likely work well for an analog waveform, which is drawn to a texture by a custom compute shader
<lethalbit>
yeah
<lethalbit>
if it renders to a texture it could be shared via DMA-BUF
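(As a reference for the idea above: a minimal sketch of the Vulkan side of exporting a texture's backing memory as a DMA-BUF fd, assuming the image and its memory were created as exportable via VK_EXT_external_memory_dma_buf; the helper name is hypothetical, not existing ngscopeclient code.)

```cpp
// Hypothetical helper: export the device memory backing a waveform texture as a
// DMA-BUF file descriptor, so a PipeWire/OBS consumer can import it zero-copy.
// Assumes the VkImage was created with VkExternalMemoryImageCreateInfo and its
// memory allocated with VkExportMemoryAllocateInfo, both requesting
// VK_EXTERNAL_MEMORY_HANDLE_TYPE_DMA_BUF_BIT_EXT (VK_EXT_external_memory_dma_buf).
#include <vulkan/vulkan.h>
#include <stdexcept>

int ExportTextureAsDmaBuf(VkDevice device, VkDeviceMemory textureMemory)
{
    auto pfnGetMemoryFd = reinterpret_cast<PFN_vkGetMemoryFdKHR>(
        vkGetDeviceProcAddr(device, "vkGetMemoryFdKHR"));
    if(!pfnGetMemoryFd)
        throw std::runtime_error("VK_KHR_external_memory_fd not available");

    VkMemoryGetFdInfoKHR getFdInfo = {};
    getFdInfo.sType      = VK_STRUCTURE_TYPE_MEMORY_GET_FD_INFO_KHR;
    getFdInfo.memory     = textureMemory;
    getFdInfo.handleType = VK_EXTERNAL_MEMORY_HANDLE_TYPE_DMA_BUF_BIT_EXT;

    int fd = -1;
    if(pfnGetMemoryFd(device, &getFdInfo, &fd) != VK_SUCCESS)
        throw std::runtime_error("vkGetMemoryFdKHR failed");

    // Ownership of the fd passes to the caller; hand it (plus format/stride/modifier
    // metadata) to a PipeWire stream so OBS can import it as a video source.
    return fd;
}
```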
<azonenberg>
Digital waveforms and protocol decodes are currently drawn via ImDrawList to the main application window
<lethalbit>
aaah, okay
<azonenberg>
basically the way things work is (simplifying a bit)
<azonenberg>
first we rasterize all of the waveforms with the big crazy-tuned compute shader software renderer, producing one fp32 grayscale texture per DisplayedWaveform (i.e. a WaveformArea's view of a channel)
<azonenberg>
so if you have two channels displayed in a single WaveformArea you get two monochrome textures
<azonenberg>
then we tone map them to per-DisplayedWaveform RGBA textures in a separate shader
<azonenberg>
then we use imgui to composite all of these textures and draw the GUI chrome and decode overlays on top
<azonenberg>
and sorry, i misspoke
<azonenberg>
digital waveforms use the same flow but a different shader for the waveform -> fp32 step
<azonenberg>
it's only protocol decodes (and things like cursors, markers, etc) along with the rest of the gui chrome that are drawn in imgui
<azonenberg>
but the point is, this all gets composited at the end
<azonenberg>
there is no single texture in ngscopeclient containing the entire content of a WaveformArea minus the chrome
<azonenberg>
the final compositing happens as the window is being drawn so we'd splat a floating dialog over the plot in the same render pass we draw the decodes
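(A rough structural paraphrase of the flow described above, under the stated simplifications; all names below are illustrative stand-ins, not the actual ngscopeclient classes or shaders.)

```cpp
// Illustrative sketch only -- not real ngscopeclient code.
#include <cstdint>
#include <vector>

using TextureHandle = uint64_t;     // stand-in for a GPU texture handle

// A WaveformArea's view of one channel
struct DisplayedWaveform
{
    TextureHandle m_grayscale;      // fp32 grayscale, written by the compute-shader renderer
    TextureHandle m_rgba;           // RGBA output of the tone-mapping shader
};

void RasterizeWaveform(DisplayedWaveform& w)
{
    // dispatch the compute-shader software renderer into w.m_grayscale
    // (analog and digital waveforms use different shaders for this step)
}

void ToneMapWaveform(DisplayedWaveform& w)
{
    // tone map w.m_grayscale into the per-waveform RGBA texture w.m_rgba
}

void CompositeWithImGui(std::vector<DisplayedWaveform>& waveforms)
{
    // imgui composites every RGBA texture and draws GUI chrome, protocol decode
    // overlays, cursors, markers, etc. on top in the same render pass; there is
    // never a single "WaveformArea minus chrome" texture anywhere in this flow
}

void RenderFrame(std::vector<DisplayedWaveform>& waveforms)
{
    for(auto& w : waveforms)
        RasterizeWaveform(w);       // step 1: waveform -> fp32 grayscale
    for(auto& w : waveforms)
        ToneMapWaveform(w);         // step 2: grayscale -> per-waveform RGBA
    CompositeWithImGui(waveforms);  // step 3: composite + chrome via imgui
}
```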
<lethalbit>
Mmm
<azonenberg>
and then eye patterns and constellation diagrams are similar... the big difference is that as of now the rasterizing / integration is done in software (with some AVX if available)
<azonenberg>
but it still ends up as a fp32 texture on the gpu which is tone mapped to rgba by a shader
<azonenberg>
We've thought about making the decode overlays follow the same flow, at which point it would be straightforward to (if we wanted to) alpha blend them all to a single texture in yet another shader before drawing the gui chrome
<lethalbit>
yeh, that would be nice
<azonenberg>
gpu rendering the decodes, at least the outer bubbles separate from the text, would be a speedup
<lethalbit>
mhm
<azonenberg>
the final compositing shader step seems mostly pointless; it would be a slight waste of gpu memory capacity and bandwidth for no obvious benefit other than your use case (i.e. it might be better to expose the channels as separate textures and composite in OBS or something?)
<azonenberg>
but anyway, if it's something you want to explore i'm curious, just not gonna rearchitect the app around that use case
<azonenberg>
i'm certainly open to better integrations with other tooling in general
<lethalbit>
Hmm, depends? Having a final single texture makes the pipewire DMA-BUF interface easier, as we only need to deal with a single texture
<lethalbit>
doing the post-compositing in OBS would be possible, but not sure if it would be ideal, I guess it depends
<lethalbit>
would need a lot of thinking
<azonenberg>
yeah we could always add a compositing shader on our side for whatever reason and only invoke it when doing the bridge plugin
<azonenberg>
it doesnt have to be part of the normal render path
<lethalbit>
yeah,
<azonenberg>
(but if you want decode bubbles we'd have to shader-ify that which is on the long term roadmap but not any time soon)
<lethalbit>
yeah,
<lethalbit>
I think the decoders would be nice in the long term, but like, them not being there wouldn't be a huge dealbreaker
<lethalbit>
but unifying the waveform displays as much as possible would help things for sure,
<lethalbit>
I'm mainly thinking of analog waveform/eye diagram usage for the OBS embed, so it could be worked around in the meantime
<lethalbit>
and, wrt the OBS side of things, if it was a single unified texture I think OBS (or really, anything that can use DMA-BUF) would ingest it without a dedicated plugin
<lethalbit>
but if the textures are all separate, or however it wants to be done, then yeah we'd need an OBS plugin to integrate it
<lethalbit>
but if that's the case for things like measurements and such, then we might need to do that anyway so we can re-render the UI into OBS on top of the waveform
<azonenberg>
yeah measurements are a different story, those output scalars etc
<lethalbit>
in that case we would also need to establish a metadata sideband between OBS and ngscopeclient anyway over something like unix sockets or w/e
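(A minimal sketch of that sideband idea, assuming a hypothetical newline-delimited name=value record pushed over a Unix domain socket; the socket path and message format are invented for illustration.)

```cpp
// Push one scalar measurement update to a consumer (e.g. an OBS plugin) over a
// Unix domain socket. Not part of ngscopeclient; path and format are made up.
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <string>

bool SendMeasurement(const char* socketPath, const std::string& name, double value)
{
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if(fd < 0)
        return false;

    sockaddr_un addr = {};
    addr.sun_family = AF_UNIX;
    strncpy(addr.sun_path, socketPath, sizeof(addr.sun_path) - 1);

    if(connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0)
    {
        close(fd);
        return false;
    }

    // One newline-delimited "name=value" record per measurement (hypothetical format)
    char msg[256];
    int len = snprintf(msg, sizeof(msg), "%s=%g\n", name.c_str(), value);
    bool ok = (len > 0 && len < (int)sizeof(msg) && write(fd, msg, len) == (ssize_t)len);
    close(fd);
    return ok;
}

// Example: SendMeasurement("/tmp/ngscopeclient-obs.sock", "ch1.vpp", 1.234);
```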
<lethalbit>
so I guess in general, however it's done isn't a huge deal, just depends on which side of things most of the effort will occur on
<lethalbit>
*shrug*
<lethalbit>
it's a silly idea in general though, just doing a window capture would work well enough for most things
<lethalbit>
but having dedicated integration for waveforms would be nice in a "super extra extra" nice way
<lethalbit>
I could try to play with it eventually