
Am I the only person who thinks the Floyd-Steinberg dithering is superior in clarity and detail?

The Atkinson dithering makes the image appear overexposed/blown-out (not true to the original image).



The point of Atkinson is that it's much faster than Floyd-Steinberg on old CPUs. Floyd-Steinberg requires multiplying each pixel's error by a small number (3, 5, or 7) and doing a right shift by 4. Atkinson is just doing a right shift by 3. On the original Macintosh's Motorola 68000, I could see Atkinson being more than twice as fast.
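The arithmetic difference is easy to see side by side. Here's a rough sketch of both error-diffusion kernels on an 8-bit grayscale image (names and structure are my own, not from either original implementation): Floyd-Steinberg divides by 16 (>>4) after a multiply, while every Atkinson tap is a plain >>3.

```python
# Each kernel entry is (dx, dy, weight). Floyd-Steinberg's weights sum
# to 16/16; Atkinson's six taps of 1/8 sum to only 6/8, which is why
# it loses some error (and looks "blown out" in highlights).
FS_KERNEL = [(1, 0, 7), (-1, 1, 3), (0, 1, 5), (1, 1, 1)]
ATKINSON_KERNEL = [(1, 0, 1), (2, 0, 1), (-1, 1, 1),
                   (0, 1, 1), (1, 1, 1), (0, 2, 1)]

def dither(img, kernel, shift):
    """Dither a list-of-lists grayscale image (0..255) in place to 0/255.
    `shift` is the divisor as a power of two: 4 for Floyd-Steinberg
    (>>4 == /16), 3 for Atkinson (>>3 == /8)."""
    h, w = len(img), len(img[0])
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 255 if old >= 128 else 0
            img[y][x] = new
            err = old - new
            for dx, dy, weight in kernel:
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    # For Atkinson, weight == 1, so this is just err >> 3:
                    # no multiply needed at all.
                    img[ny][nx] += (err * weight) >> shift
    return img
```

In the Atkinson case the multiply by `weight` disappears entirely, which is the 68000-era speed argument.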


Was that actually the goal, or was it just a happy side effect? (I don't know the history.)

Was a 2x speedup for dithering even important at the time, especially if it involved a sacrifice of quality? It's not like dithering images was something people did regularly, in the first couple generations of Macs.

You'd dither a few images for a game or something you were building, that you were lucky to get from a scanner. It was a pretty rare thing. Speed wasn't really relevant, as far as I remember.


I could have sworn I read on folklore.org that it was for speed, but I'm not finding it. You do have a point. Maybe Atkinson thought his dithering was more elegant without the multiplies or preferred how it looked on his test images.

I only used a B&W Mac a few times, but I do remember Windows 3.1 doing on-the-fly ordered dithering when running with palettized color (and being very surprised at NOT seeing the dithering on the blue gradient of a setup program once I started using high color modes). Windows 1.0 apparently was capable of doing it as well.
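Ordered dithering is a natural fit for on-the-fly UI drawing because, unlike error diffusion, it keeps no state between pixels: each pixel is just compared to a fixed threshold from a small tiled matrix. A minimal sketch with the classic 4x4 Bayer matrix (this is the general technique, not Windows' actual code):

```python
# 4x4 Bayer threshold matrix, values 0..15. Tiling it over the image
# gives evenly distributed thresholds, producing the characteristic
# crosshatch pattern of ordered dithering.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def ordered_dither_pixel(value, x, y):
    """Map one 8-bit value to 0/255 using the tiled Bayer threshold.
    Purely local: needs only the pixel's own value and coordinates."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) * 16  # scale 0..15 up to 0..255
    return 255 if value >= threshold else 0
```

Because each output pixel depends only on `(value, x, y)`, a GDI-style renderer could apply it per-pixel while filling a gradient, with no error buffer.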


> Windows 3.1 doing on-the-fly ordered dithering when running with palettized color

Do you have a source for that? That's very much the opposite of what I remember. If you had 16 colors or even 256 colors, I don't remember anything in the Windows UX being dithered. Like I don't think you could pass an RGB color to GDI to draw a light pink line and it would dither it for you.

The only dithering I remember was indeed the background of blue gradients in Setup, and I always assumed that was custom code. After all, it's not like GDI had commands for gradients either.


It's also a lot less local.


As another comment mentioned, the dithering is being done naively w.r.t. the color space. Treating RGB (or gray) values as if they were linear light is usually wrong.
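Concretely, stored RGB values are usually sRGB-encoded, so they should be converted to linear light before thresholding and diffusing error, then converted back for display. The standard sRGB transfer functions look like this (a sketch; values normalized to 0..1):

```python
def srgb_to_linear(c):
    """Standard sRGB decoding (IEC 61966-2-1), c in 0..1."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Inverse of the above, c in 0..1."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
```

The practical point: an encoded 50% gray is only about 21% linear light, so dithering the encoded values directly systematically misjudges how much "ink" a region needs.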


I think Atkinson might have the edge if you were looking at it on a blurry CRT instead of a modern LCD/LED screen.


I prefer Atkinson dithering. I think it preserves more detail when the resolution is very low. For higher resolutions, Floyd-Steinberg is better though.


Completely agreed.

I don't get the appeal of Atkinson dithering at all -- it makes the noise more "clumpy" or "textured" and thereby inherently reduces the amount of detail you can perceive. I don't think that's subjective.

And if you want the "richer contrast" that the article describes Atkinson as providing, then easy -- just increase the contrast of the grayscale image before dithering. Then you actually have control over whatever final contrast you want -- it can be whatever you want! But you won't lose detail the way Atkinson does.
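The "just raise the contrast first" suggestion amounts to a one-line preprocessing step before whichever dither you run. A hypothetical sketch (function name and the choice of scaling about mid-gray are mine):

```python
def adjust_contrast(img, factor):
    """Scale 8-bit pixels about mid-gray (128) by `factor`, clamped
    to 0..255. factor > 1 boosts contrast, factor < 1 reduces it.
    Run this on the grayscale image, then dither as usual."""
    return [[min(255, max(0, int((p - 128) * factor + 128))) for p in row]
            for row in img]
```

This keeps the contrast decision independent of (and tunable separately from) the choice of dithering kernel.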


I agree that the area above the nostrils appears blown-out, but I prefer the eyes in the Atkinson version. So neither algorithm is superior to me.



