I built a blue noise generator and dithering library in Rust and TypeScript. It generates blue noise textures and applies blue noise dithering to images. There’s a small web demo to try it out [1]. The code is open source [2] [3].

[1] https://blue-noise.blode.co

[2] https://github.com/mblode/blue-noise-rust

[3] https://github.com/mblode/blue-noise-typescript
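For anyone curious how the dithering side works: blue noise dithering is typically just thresholding each pixel against a tiled blue-noise texture. A minimal Rust sketch of that idea (hypothetical names, not this library's actual API):

    // Threshold each grayscale pixel against a tiled NxN blue-noise
    // texture. `pixels` is row-major, values 0..=255.
    fn dither_bw(pixels: &[u8], width: usize, noise: &[u8], n: usize) -> Vec<u8> {
        pixels
            .iter()
            .enumerate()
            .map(|(i, &p)| {
                let (x, y) = (i % width, i / width);
                let t = noise[(y % n) * n + (x % n)]; // tile the noise texture
                if p > t { 255 } else { 0 }           // per-pixel threshold
            })
            .collect()
    }

Because blue noise has little low-frequency energy, the output reads as even grain rather than the crosshatch pattern of an ordered matrix.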
Half the posts here are people promoting their own projects without even mentioning the (really impressive) OP. Bit weird.
PMunch 32 minutes ago
Just did a bit of a deep dive into dithering myself, for my project of creating an epaper laptop: https://peterme.net/building-an-epaper-laptop-dithering.html

It compares error diffusion algorithms as well as Bayer, blue noise, and some more novel approaches. Just in case anyone wants to read a lot more about dithering!
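For anyone who hasn't seen it, error diffusion quantizes each pixel and pushes the quantization error onto not-yet-visited neighbors. A sketch of the classic Floyd-Steinberg variant in Rust (the standard algorithm, not code from the post):

    // Floyd-Steinberg error diffusion to 1-bit. `img` is row-major
    // grayscale, values 0.0..=255.0; f32 carries the diffused error.
    fn floyd_steinberg(img: &mut [f32], w: usize, h: usize) {
        for y in 0..h {
            for x in 0..w {
                let i = y * w + x;
                let old = img[i];
                let new = if old < 128.0 { 0.0 } else { 255.0 };
                img[i] = new;
                let err = old - new;
                // Distribute the error with the classic 7/3/5/1 weights.
                if x + 1 < w { img[i + 1] += err * 7.0 / 16.0; }
                if y + 1 < h {
                    if x > 0 { img[i + w - 1] += err * 3.0 / 16.0; }
                    img[i + w] += err * 5.0 / 16.0;
                    if x + 1 < w { img[i + w + 1] += err * 1.0 / 16.0; }
                }
            }
        }
    }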
ivanjermakov 21 minutes ago
There is something very satisfying about viewing media at 100% of your screen's resolution. Every pixel is crisp and plays a role. It's a joy you don't get from watching videos or viewing scaled images.
ggambetta 2 hours ago
I used ordered dithering in my ZX Spectrum raytracer (https://gabrielgambetta.com/zx-raytracer.html#fourth-iterati...). In this case it's applied to a color image, but since every 8x8-pixel block can only have one of two colors (one of these fun limitations of the Spectrum), it's effectively monochrome dithering.
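For reference, ordered dithering just compares each pixel against a position-dependent threshold from a repeating matrix. A generic 4x4 Bayer version in Rust (I don't know which matrix the raytracer actually uses):

    // Canonical 4x4 Bayer matrix, entries 0..16.
    const BAYER4: [[u8; 4]; 4] = [
        [ 0,  8,  2, 10],
        [12,  4, 14,  6],
        [ 3, 11,  1,  9],
        [15,  7, 13,  5],
    ];

    fn ordered_dither(p: u8, x: usize, y: usize) -> u8 {
        // Spread the matrix entries over the 0..=255 range as thresholds.
        let t = BAYER4[y % 4][x % 4] * 16 + 8;
        if p > t { 255 } else { 0 }
    }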
a_shovel 1 hour ago
Bayer dithering in particular is part of the signature look of Flipnote Studio animations, which you may recognize from animators like kekeflipnote (e.g. https://youtu.be/Ut-fJCc0zS4).
spicyjpeg 21 minutes ago
Bayer dithering was also employed heavily on the original PlayStation. The PS1's GPU was capable of Gouraud shading with 24-bit color precision, but the limited capacity (1 MB) and bandwidth of VRAM made it preferable to use 16-bit framebuffers and textures. To make the resulting color bands less noticeable, Sony added the ability to dither pixels written to the framebuffer on the fly, using a 4x4 Bayer matrix hardcoded in the GPU [1] (a rough sketch of the effect follows below). On a period-accurate CRT TV using a cheap composite video cable, the picture would get blurred enough to hide the dithering artifacts; obviously an emulator or a modern LCD TV will quickly reveal them, resulting in a distinct grainy look that is often replicated in modern "PS1-style" indie games.
Interestingly enough, despite the GPU being completely incapable of "true" 24-bit rendering, Sony decided to ship the PS1 with a 24-bit video DAC and the ability to display 24-bit framebuffers regardless. This ended up being used mainly for title screens and video playback, as the PS1's hardware MJPEG decoder retained support for 24-bit output.
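For anyone who wants to replicate that look, here's a rough Rust sketch of the idea: add a per-pixel offset from the 4x4 matrix (values as documented at psx-spx [1], if I'm reading it right), clamp, then truncate each 8-bit channel to the 5 bits a 16-bit framebuffer keeps:

    // Offsets as documented at psx-spx [1]; treat as approximate.
    const PS1_DITHER: [[i16; 4]; 4] = [
        [-4,  0, -3,  1],
        [ 2, -2,  3, -1],
        [-3,  1, -4,  0],
        [ 3, -1,  2, -2],
    ];

    fn dither_channel(c: u8, x: usize, y: usize) -> u8 {
        let d = (c as i16 + PS1_DITHER[y % 4][x % 4]).clamp(0, 255);
        (d >> 3) as u8 // keep the 5 bits a 16-bit framebuffer stores
    }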
[1]: https://psx-spx.consoledev.net/graphicsprocessingunitgpu/#24...
If the author stops by, I'd be interested to hear about the tech used.
Dithering - Part 1: https://news.ycombinator.com/item?id=45750954