Single-pass palette refinement and ordered dithering

Lobsters Hottest Papers

Summary

A single-pass method combines online k-means palette refinement with ordered Bayer dithering, eliminating the separate pixel-mapping step and yielding slight speedups while producing visually interesting results.



# Single-pass palette refinement and ordered dithering

Source: [https://30fps.net/pages/bayer-order-online-kmeans/](https://30fps.net/pages/bayer-order-online-kmeans/)

*Bayer-order pixel traversal for online k-means clustering.*

![](https://30fps.net/pages/bayer-order-online-kmeans/results/shuffled/kodim23_256_bayer2x2_k32.png)

The Bayer dither pattern in this image is a result of the pixel processing order, not a distinct dithering algorithm.

Usually, image color reduction is done in two independent phases: first a *color quantization* algorithm finds a palette, which is then used by a *pixel mapping* phase to assign every output pixel a color from the palette. Ignoring any possible dithering for now, the chosen palette color is the one with the shortest Euclidean distance to the original pixel color. But if we know both the old and new colors for each pixel, it's possible to *refine* the palette.

Palette refinement can be done with repeated ["k-means"](https://en.wikipedia.org/wiki/K-means_clustering) iterations that take all pixels assigned the same palette color index and compute new palette colors as averages of those pixels. The same can also be done sequentially, which results in [*online k-means* (OKM)](https://www.cs.princeton.edu/courses/archive/fall08/cos436/Duda/C/sk_means.htm), where each pixel slightly nudges its chosen palette color towards its own original color. The palette subtly changes after each processed pixel, not only at the end of an iteration as in regular "batch" k-means.

What I've got to show is **a method that modifies the online k-means algorithm to produce an ordered dithering pattern as a side effect.** My method doesn't generate the highest-quality images, but it allows you to skip the final pixel-mapping step, which yields a small speedup, in theory. The main reason for publishing this experiment, though, is that I think the results look neat.
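To make the per-pixel "nudge" concrete, here is a minimal sketch of one online k-means step. The `1/n` per-cluster learning rate follows MacQueen's classic scheme; the function name and signature are mine, not the article's:

```python
import numpy as np

def okm_update(palette, counts, pixel):
    """One online k-means step: assign `pixel` to its nearest palette
    color and nudge that color toward the pixel's original color."""
    # Nearest palette entry by squared Euclidean distance.
    d = np.sum((palette - pixel) ** 2, axis=1)
    k = int(np.argmin(d))
    # Per-cluster learning rate 1/n (MacQueen): each new member
    # pulls the center toward itself by a shrinking step.
    counts[k] += 1
    palette[k] += (pixel - palette[k]) / counts[k]
    return k  # the palette index assigned to this pixel
```

Calling this once per pixel, in some traversal order, is the whole refinement loop; the order of those calls is what the rest of the article is about.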
## Skipping pixel mapping

You see, both online and batch k-means are meant to be run as a post-processing pass over an already color-quantized image. A bad palette goes in, a slightly better one comes out, pixels are assigned to its colors, and *tada!* – color reduction happens. In other words, k-means is followed by a final pixel-mapping step using Euclidean distances.

I misunderstood this mechanism when I first implemented color quantization using online k-means, and presented the intermediate online k-means image as is, *without* Euclidean pixel mapping. This mistake left the images noisy and yours truly confused about why my results looked so much worse than in the 2019 paper [*Fast color quantization using MacQueen's k‑means algorithm*](https://link.springer.com/article/10.1007/s11554-019-00914-6) I was reading. Let me demonstrate why it was an interesting mishap.

Let's say we are quantizing the ["two macaws" Kodak test image](https://r0k.us/graphics/kodak/kodim23.html):

![](https://30fps.net/pages/bayer-order-online-kmeans/originals/kodim23_256.png)

Let's do it properly first: compute a palette using the maximin k-means initialization algorithm (see the appendix), run one iteration of online k-means, and finish with Euclidean pixel mapping. This is what we get:

![](https://30fps.net/pages/bayer-order-online-kmeans/results/kodim23_256_raster_k32_euclidean.png)

It's alright, but the red parrot looks a bit rough. As it turns out, online k-means is sensitive to the point processing order, so the raster order (left to right, top to bottom) used above is not the best choice. Traversing the pixels in a random order ("shuffling") is enough to improve the results:

![](https://30fps.net/pages/bayer-order-online-kmeans/results/kodim23_256_random_k32_euclidean.png)

Notice how the red parrot's head looks softer now. This is how you're supposed to apply OKM to color quantization. Now we finally get to the point of this article.
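The final pixel-mapping step is just a nearest-neighbor search in RGB space; a vectorized NumPy sketch (the function name is mine):

```python
import numpy as np

def map_pixels(image, palette):
    """Assign every pixel the index of its nearest palette color
    by squared Euclidean distance in RGB space."""
    flat = image.reshape(-1, 3).astype(np.float64)
    # Pairwise squared distances, shape (num_pixels, num_colors).
    d = np.sum((flat[:, None, :] - palette[None, :, :]) ** 2, axis=2)
    idx = np.argmin(d, axis=1)
    return idx.reshape(image.shape[:2])
```

The broadcasting builds an N×K distance table, which is fine for small palettes; the article's linked script is the authoritative implementation.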
What do the *intermediate* images look like? What if we do not assign each pixel the closest palette color at the end, but keep the indices chosen while the palette was still being formed? The results are shown below for both raster and random order.

| Order | Intermediate OKM result (32 colors) |
| --- | --- |
| Raster | ![](https://30fps.net/pages/bayer-order-online-kmeans/results/kodim23_256_raster_k32.png) |
| Uniform random | ![](https://30fps.net/pages/bayer-order-online-kmeans/results/kodim23_256_random_k32.png) |

For the raster order, the top and bottom halves of the image look different; the palette clearly changed between scanlines and only reached its final colors at the bottom. I find the result surprisingly cool. Note, for example, the brown-green gradient created on the beak of the yellow-blue macaw.

The uniform random result could even be considered *superior* to the "proper" pixel-mapped ones, because the added noise acts as dithering. Alternating palette colors break up some of the false contours, resulting in smoother-looking gradients. Which brings us to…

## Bayer matrix order

The uniform random order created a white noise pattern. The problem is that white noise is clumpy; see the uneven pattern in the green bokeh gradient between the parrots. There are all kinds of noise patterns that improve on uniformly random noise; in fact, the paper recommends [the Sobol sequence](https://en.wikipedia.org/wiki/Sobol_sequence) to visit only a fraction of the pixels, for speed. One very well-dispersed pattern is the one generated by Bayer matrices. They [create mostly high frequencies](https://bartwronski.com/2016/10/30/dithering-part-three-real-world-2d-quantization-dithering/#attachment_2251) and thus smooth out the clumps. The Bayer pattern also gives classic retro videogame vibes, which is why I chose it for this experiment.
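Bayer matrices of any power-of-two size can be built with the standard recursive doubling construction; a quick sketch (the function name and 1-based convention are mine):

```python
import numpy as np

def bayer_matrix(n):
    """Return the 2^n x 2^n Bayer matrix with values 1..4^n,
    built by the standard recursive doubling construction."""
    M = np.array([[0, 2], [3, 1]])
    for _ in range(n - 1):
        # Each doubling step tiles four shifted copies of M.
        M = np.block([[4 * M + 0, 4 * M + 2],
                      [4 * M + 3, 4 * M + 1]])
    return M + 1  # 1-based, matching the matrices quoted in the text
```

`bayer_matrix(1)` and `bayer_matrix(2)` reproduce exactly the 2x2 and 4x4 matrices used in this experiment.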
The matrices store a sequence of integers that [in 1-bit quantization act as thresholds](https://tannerhelland.com/2012/12/28/dithering-eleven-algorithms-source-code.html#i-count-9-algorithms-but-you-promised-11--where-are-the-other-two) to bias the choice between a black and a white pixel when representing a continuous greyscale tone. For color palettes [it's more involved](https://matejlou.blog/2023/12/06/ordered-dithering-for-arbitrary-or-irregular-palettes/), but luckily we don't have to worry about that, because in this experiment **we'll interpret the numbers as the pixel processing order**. So, for example, given the 2x2 matrix below

```
[1 3]
[4 2]
```

an online k-means iteration will first visit the top-left pixel (`1`) of each 2x2 block in the image, then the bottom-right pixel (`2`), and so on. The same can be done for the 4x4 matrix:

```
[ 1  9  3 11]
[13  5 15  7]
[ 4 12  2 10]
[16  8 14  6]
```

Hopefully the modified traversal order results in a checkerboard pattern of some sort. And guess what? It works! Not perfect by any means, but it seems interesting already:

| Order | Intermediate OKM result (32 colors) |
| --- | --- |
| Bayer 2x2 (unshuffled) | ![](https://30fps.net/pages/bayer-order-online-kmeans/results/unshuffled/kodim23_256_bayer2x2_k32.png) |
| Bayer 4x4 (unshuffled) | ![](https://30fps.net/pages/bayer-order-online-kmeans/results/unshuffled/kodim23_256_bayer4x4_k32.png) |

The pattern is quite stylized and far from what's usually considered dithering. We can tone it down by adding some randomness to the process. The first attempt still processed the image in a raster order in a sense: even though a single pixel was processed in each 2x2 or 4x4 block, the blocks were still handled from left to right and top to bottom. Let's make the block order uniformly random within each matrix element's pixels.
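The traversal generators in the next section rely on a `dither_matrix_to_offsets` helper that isn't shown in this excerpt. A plausible reimplementation (my assumption, not the article's code) returns the matrix cells' (y, x) offsets ordered by their threshold value, so the cell holding `1` is visited first:

```python
import numpy as np

def dither_matrix_to_offsets(M):
    """Return the (y, x) offsets of M's cells ordered by their stored
    rank: offsets[0] is the cell holding 1, offsets[1] the cell
    holding 2, and so on. Hypothetical reimplementation of the
    helper referenced by the traversal code."""
    ys, xs = np.unravel_index(np.argsort(M, axis=None), M.shape)
    return np.stack([ys, xs], axis=1)
```

For the 2x2 matrix above this yields (0, 0), (1, 1), (0, 1), (1, 0): top-left first, then bottom-right, matching the visiting order described in the text.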
It's hard to explain and I'm not sure showing code helps, but I mean going from this:

```python
def gen_matrix_pattern(M, data_shape):
    S = M.shape[0]
    M_yx = dither_matrix_to_offsets(M)
    H = data_shape[0] // S
    W = data_shape[1] // S
    for mi in range(S*S):
        for yt in range(H):
            for xt in range(W):
                y = S*yt + int(M_yx[mi][0])
                x = S*xt + int(M_yx[mi][1])
                yield y, x
```

to this:

```python
def gen_matrix_pattern_shuffled(M, data_shape):
    S = M.shape[0]
    M_yx = dither_matrix_to_offsets(M)
    H = data_shape[0] // S
    W = data_shape[1] // S
    for mi in range(S*S):
        coords = []
        for yt in range(H):
            for xt in range(W):
                y = S*yt + int(M_yx[mi][0])
                x = S*xt + int(M_yx[mi][1])
                coords.append((y, x))  # collect pixels to visit into a list
        random.shuffle(coords)         # randomize order within this matrix element
        for y, x in coords:
            yield y, x
```

You should be able to appreciate how the result looks much more like ordered dithering now:

| Order | Intermediate OKM result (32 colors) |
| --- | --- |
| Bayer 2x2 (shuffled) | ![](https://30fps.net/pages/bayer-order-online-kmeans/results/shuffled/kodim23_256_bayer2x2_k32.png) |
| Bayer 4x4 (shuffled) | ![](https://30fps.net/pages/bayer-order-online-kmeans/results/shuffled/kodim23_256_bayer4x4_k32.png) |

This result is what I wanted to show. This algorithm is unique in its ability to create ordered dither patterns while refining the palette at the same time. As is often the case, its effectiveness depends on the image: for some images the shuffled Bayer 2x2 result is almost indistinguishable from the random one, and the unshuffled image is almost beautiful in its brokenness. See more result images [on their own page](https://30fps.net/pages/bayer-order-online-kmeans/result_table.html).

I hope you enjoyed this little experiment. If you have questions or ideas for improvements, feel free to message me on [Mastodon](https://mastodon.gamedev.place/@pekkavaa) or [Bluesky](https://bsky.app/profile/pekkavaa.bsky.social).
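Putting the pieces together, the whole single-pass method fits in a short loop: walk the pixels in the chosen order, update the palette online, and keep the index chosen *during* refinement instead of running a final mapping pass. This driver is my sketch of the idea, not the article's script; names and structure are assumptions:

```python
import numpy as np

def reduce_single_pass(image, palette, order):
    """Single-pass color reduction sketch: online k-means palette
    refinement whose intermediate assignments ARE the output image.
    `order` is an iterable of (y, x) pixel coordinates, e.g. a
    shuffled Bayer-order traversal."""
    counts = np.ones(len(palette))
    indices = np.zeros(image.shape[:2], dtype=np.int32)
    for y, x in order:
        p = image[y, x].astype(np.float64)
        # Assign the currently-nearest palette color...
        k = int(np.argmin(np.sum((palette - p) ** 2, axis=1)))
        # ...nudge it toward this pixel (online k-means update)...
        counts[k] += 1
        palette[k] += (p - palette[k]) / counts[k]
        # ...and keep this intermediate assignment as the result.
        indices[y, x] = k
    return indices, palette
```

Because the assignment and the refinement happen in the same pass, the Bayer-ordered traversal leaves its pattern in `indices` for free.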
*I'm writing a book on color reduction algorithms. [Sign up here](https://paletteprogramming.com/) if you're interested.*

## Code

A standalone script that implements the discussed method:

- [shuffled_bayer_okm_reduction.py](https://codeberg.org/pekkavaa/shuffled-bayer-okm-color-reduction/src/branch/main/shuffled_bayer_okm_reduction.py) ([local copy](https://30fps.net/pages/bayer-order-online-kmeans/shuffled_bayer_okm_reduction.py))

Usage: `uv run shuffled_bayer_okm_reduction.py image.png 16 --order bayer2x2`

## Appendix

The raw, pre-refinement palettes in this experiment came from the "maximin" algorithm. I chose it for its deterministic behavior and easy implementation.

Maximin, also known as the "maximum coverage" color quantizer, is a deterministic variant of the standard [k-means++ initializer](https://en.wikipedia.org/wiki/K-means%2B%2B). It first adds the mean point as the first cluster center and then iteratively picks the data point furthest from any existing cluster as the next center, until *K* centers have been found. A straightforward NumPy implementation looks like this:

```python
import numpy as np

def maximin_init(X, K):
    clusters = []
    N = X.shape[0]
    # The first center is the mean of all points.
    clusters.append(np.mean(X, axis=0))
    while len(clusters) < K:
        max_dist = 0
        max_dist_i = None
        for i in range(N):
            x = X[i]
            # Distance from x to its nearest existing center.
            min_dist = 1e9
            for cj in clusters:
                dist = np.sum((cj - x)**2)
                if dist < min_dist:
                    min_dist = dist
            # Track the point furthest from all existing centers.
            if min_dist > max_dist:
                max_dist = min_dist
                max_dist_i = i
        clusters.append(X[max_dist_i])
    return clusters
```

See the full script for a vectorized version.
