How to make the pixel buffer larger than the window #319
What actually happens when your surface texture does not match the window size depends on the backend. The Vulkan backend (the default on Windows and Linux) refuses to do anything, so you get the default background color decided by the window manager. The DirectX 12 backend on Windows will stretch the image to fit the window, Metal on macOS does something similar to DirectX 12, and so on. Basically, doing this is "unspecified behavior".

I do have to question why you want to do this, because it sounds like an XY problem. Whatever extra information you are including in the texture arguably does not need to be there, because by definition it will be unused when you hide it from being displayed. The only thing I can think of is that you might be trying to code-golf away a few lines of code for copying a small region from a larger texture. If that is the case, we have better ways...

For terminology used in the code and documentation:

The "render texture" is the name given to the render target that the scaling shader draws to. It is window-sized and is what ends up on screen.

On the other hand, the "backing texture" is the name given to the texture that you physically upload pixel color values to. This texture can have any format you want, and its size is exactly what you specify. It is a "texture" in the GPU sense and is distinct from the raw pixel data. The byte data that you upload has a 1:1 mapping onto the backing texture. In other words, this is the "pre-scaled" texture without a border.

Finally, the "surface texture" is not a real GPU texture. The documentation calls it a "logical texture" for this reason. It is just a conceptual wrapper around the render texture. It allows us to maintain a reference to the window and its size in a generic way. We cannot query the window size directly, because the only thing we know about it is that it is a system-defined window pointer that implements the raw window handle trait.
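Here is a minimal sketch of how those three pieces relate in code, assuming the `pixels` and `winit` crates; exact method names (e.g. `frame_mut` vs. `get_frame_mut`) vary slightly between pixels versions, so treat this as approximate. The surface texture is sized to the window, while the backing texture can be any size:

```rust
use pixels::{Error, Pixels, SurfaceTexture};
use winit::window::Window;

fn setup(window: &Window) -> Result<(), Error> {
    // "Surface texture": a logical wrapper that records the *window* size.
    let size = window.inner_size();
    let surface_texture = SurfaceTexture::new(size.width, size.height, window);

    // "Backing texture": the texture you upload pixels to. Its size is
    // whatever you ask for (320x240 here), independent of the window size.
    let mut pixels = Pixels::new(320, 240, surface_texture)?;

    // The byte buffer you write into maps 1:1 onto the backing texture.
    let frame = pixels.frame_mut(); // `get_frame_mut()` on older versions
    assert_eq!(frame.len(), 320 * 240 * 4); // RGBA8

    // Rendering scales the backing texture onto the window-sized render
    // texture, adding a border when the aspect ratios do not match.
    pixels.render()?;
    Ok(())
}
```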
This should give you a much clearer sense of what the two textures look like. I captured these with RenderDoc on Windows 11; the first screenshot is the "backing texture" and the second is the "render texture". Both textures were captured on the same frame, and the window was resized slightly bigger than the default (to give it a border). The render texture is "what you actually see" in the window.
Yeah, I told you it was an "edge case", so yes, code golf. I wanted to reduce the number of texture uploads :-) To explain it fully: my main texture is a composite PAL/NTSC signal. As you know, there is a notion of "color burst" which tells how to decode the composite signal: that is, with or without colour. I wanted to put the colour burst on/off flag in my texture, on its right edge, "a bit like" what is done with a TV. If I don't do that, then I have to put that information in another texture, which means more code and more GPU transfers... I wanted to avoid that :-) So yes, definitely code golf (but not sub-par, I admit)
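As a concrete illustration of the idea (a hedged sketch, not the actual code from this project): with a `pixels`-style RGBA8 frame buffer, one could allocate one extra column on the right and use it to carry the per-scanline colour-burst flag, writing the remaining columns as the visible picture. The dimensions and the channel layout below are assumptions:

```rust
/// Hypothetical layout: VISIBLE_WIDTH visible columns plus one extra column
/// on the right edge carrying the colour-burst on/off flag per scanline.
const VISIBLE_WIDTH: usize = 768;
const BUFFER_WIDTH: usize = VISIBLE_WIDTH + 1;

fn write_scanline(frame: &mut [u8], y: usize, samples: &[u8], burst_on: bool) {
    let row = &mut frame[y * BUFFER_WIDTH * 4..(y + 1) * BUFFER_WIDTH * 4];

    // Visible picture: one composite sample per pixel (stored in the R channel).
    for (x, &s) in samples.iter().take(VISIBLE_WIDTH).enumerate() {
        row[x * 4] = s;        // composite sample
        row[x * 4 + 3] = 0xFF; // opaque alpha
    }

    // Right-edge metadata column: the colour-burst flag for this scanline.
    let meta = &mut row[VISIBLE_WIDTH * 4..];
    meta[0] = if burst_on { 0xFF } else { 0x00 };
    meta[3] = 0xFF;
}
```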
For the backing/render texture... If it's to get a border, why didn't you use a regular texture border? That would have avoided the additional texture. But sure, you know what you're doing better than I do :-)
I use the
This kind of compositing is almost certainly what you want to do, instead of trying to clip the surface texture. I don't know about other crates, but with
What is a "regular texture border"? Regardless, you always need at least two textures. The source texture that you upload to, and the destination render target that the shader writes to. The render target has to be window-sized (for reasons described above). Scaling and adding a border are just two things that are done for convenient/nice handling of the size difference between the two textures. |
hmmm... I forgot to mention that my conversion from the "pixels" buffer (composite signal) to the displayed RGB signal is done with a WGSL shader. Hence my willingness to keep the pipeline simple by not multiplying GPU textures. In my case, the "color burst" signal is a piece of information I use to change the behavior of the shader (which also goes in the direction of having everything crammed into a single texture, although that's not 100% needed). As for the regular texture border, I was referring to the fact that one can tell a shader to use a given border color when it accesses texels whose coordinates are outside the texture (that's part of the choice to wrap the texture, clamp it, mirror it, etc.)
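For completeness, that border-color behaviour maps onto the sampler's address mode in wgpu. A minimal sketch, assuming a device created with the clamp-to-border feature enabled; the descriptor fields are real wgpu names, but backend support for `ClampToBorder` varies:

```rust
use wgpu::{
    AddressMode, Device, FilterMode, Sampler, SamplerBorderColor, SamplerDescriptor,
};

/// Sketch: a sampler that returns an opaque black "border" for any texel
/// coordinate outside [0, 1], instead of clamping to the edge texel.
/// Requires Features::ADDRESS_MODE_CLAMP_TO_BORDER on the device.
fn border_sampler(device: &Device) -> Sampler {
    device.create_sampler(&SamplerDescriptor {
        label: Some("border-sampler"),
        address_mode_u: AddressMode::ClampToBorder,
        address_mode_v: AddressMode::ClampToBorder,
        address_mode_w: AddressMode::ClampToBorder,
        mag_filter: FilterMode::Nearest,
        min_filter: FilterMode::Nearest,
        border_color: Some(SamplerBorderColor::OpaqueBlack),
        ..Default::default()
    })
}
```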
That is an important detail, yeah. I was under the impression that the "analog video signal" was constructed on the CPU side. But then the good news is that you should be able to do the clipping in your shader, right?
I follow. You can see that the pipeline clamps the texture coordinates on the edges; see lines 34 to 36 in bc8235f.
Hello,
I've been using pixels for a while now, and I'm stuck on an edge case. Basically, I'd like to put a bit more information in the pixels buffer than what will actually be drawn. This ends up with my "buffer" being 1 pixel wider than the actual window's width. In the documentation of the SurfaceTexture, it is written: "It is recommended (but not required) that the width and height are equivalent to the physical dimensions of the surface." But it's not 100% clear to me what happens behind the scenes. Of course, I have tried to make a surface texture wider than the actual window size, but this produces a white picture (I was expecting a distorted picture, maybe zoomed). I thought that maybe the scaling matrix was at fault, especially the constraint over the scale factor, so I set it to one. But that didn't succeed...
If you could just tell me where I should start looking, that would be very kind. I've also looked in other places in the code and saw there was a "render texture" and a "backing texture", but I fail to understand the difference.
Thank you for your time...