- Repeatedly refit palettes, since k-means only finds a local
optimum. This can produce incremental improvements in image quality
but may also overfit, especially on complex images.
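A minimal sketch of such a refit loop, using scikit-learn's KMeans
purely for illustration (the project's own clustering code would
differ); the perturbation scale and round count are arbitrary
assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def refit_palette(pixels, n_colours=16, max_rounds=10, seed=0):
    """Repeatedly refit a palette, keeping a refit only if it lowers total error.

    k-means only finds a local optimum, so refitting from slightly perturbed
    starting points can escape poor local minima.  max_rounds caps the loop
    to limit overfitting on complex images (illustrative value).
    """
    rng = np.random.default_rng(seed)
    best = KMeans(n_clusters=n_colours, n_init=1, random_state=0).fit(pixels)
    best_err = best.inertia_
    for _ in range(max_rounds):
        # Restart from the current best centroids, slightly perturbed.
        init = best.cluster_centers_ + rng.normal(scale=1.0, size=(n_colours, 3))
        candidate = KMeans(n_clusters=n_colours, init=init, n_init=1).fit(pixels)
        if candidate.inertia_ < best_err:
            best, best_err = candidate, candidate.inertia_
    return best.cluster_centers_
```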
- Use pygame to render incremental images
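A hedged sketch of what an incremental pygame preview could look
like; the frame size, line-by-line update, and placeholder data are
assumptions rather than the converter's actual rendering code:

```python
import numpy as np
import pygame

def preview_lines(width=320, height=200):
    """Show an image line by line as it is produced (illustrative only)."""
    pygame.init()
    screen = pygame.display.set_mode((width, height))
    frame = np.zeros((width, height, 3), dtype=np.uint8)  # surfarray uses (x, y, rgb)
    for y in range(height):
        pygame.event.pump()  # keep the window responsive
        # Placeholder data; the real converter would write the dithered line here.
        frame[:, y, :] = np.random.randint(0, 256, size=(width, 3), dtype=np.uint8)
        screen.blit(pygame.surfarray.make_surface(frame), (0, 0))
        pygame.display.flip()
    pygame.quit()
```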
- Fix off-by-one in palette striping
- When fitting palettes, first cluster a 16-colour palette for the
entire image and use it to initialize the centroids for the
individual palettes. This improves quality when fitting images with
large blocks of colour: if each palette is fit separately, such
blocks may end up with slightly different colours in each palette,
whereas with a global initializer they tend to converge to the same
values. This also improves performance.
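A sketch of the global-initialization idea, using scikit-learn's
KMeans for illustration and assuming one palette per horizontal band
of the image (the real palette-to-line assignment may differ):

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_palettes(image, n_palettes=16, n_colours=16):
    """Fit a 16-colour palette per horizontal band, seeded from a global fit.

    Seeding every per-band fit with the same global centroids keeps large
    flat-coloured areas mapped to the same centroid in neighbouring bands,
    and the per-band fits converge faster.
    """
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(float)

    # Global 16-colour fit over the whole image.
    global_fit = KMeans(n_clusters=n_colours, n_init=4, random_state=0).fit(pixels)

    palettes = []
    band = h // n_palettes
    for i in range(n_palettes):
        rows = image[i * band:(i + 1) * band].reshape(-1, 3).astype(float)
        # Seed each band's fit with the global centroids.
        local = KMeans(n_clusters=n_colours, init=global_fit.cluster_centers_,
                       n_init=1).fit(rows)
        palettes.append(local.cluster_centers_)
    return palettes
```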
- Switch to pyclustering for k-medians
- Allow choosing the same palette as the previous line, applying a
multiplicative penalty to its distance so it is only chosen when it
is substantially better (see the selection sketch below)
- Iterate k-medians multiple times and choose the best result, since
it only finds a local optimum
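A sketch of the restart idea using pyclustering's kmedians
(constructed as kmedians(data, initial_medians), then
process()/get_medians()); the restart count and the total-L1-error
selection criterion are assumptions:

```python
import numpy as np
from pyclustering.cluster.kmedians import kmedians

def best_kmedians(pixels, n_colours=16, n_restarts=5, seed=0):
    """Run k-medians from several random starts and keep the lowest-error fit."""
    rng = np.random.default_rng(seed)
    data = pixels.tolist()
    best_medians, best_error = None, np.inf
    for _ in range(n_restarts):
        # Random pixels as this restart's initial medians.
        init = pixels[rng.choice(len(pixels), size=n_colours, replace=False)]
        km = kmedians(data, init.tolist())
        km.process()
        medians = np.array(km.get_medians())
        # Total L1 error of assigning each pixel to its nearest median.
        dists = np.abs(pixels[:, None, :] - medians[None, :, :]).sum(axis=2)
        error = dists.min(axis=1).sum()
        if error < best_error:
            best_medians, best_error = medians, error
    return best_medians
```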
When dithering, choose the palette for each line subject to two
limitations:
- it cannot be the same palette as the previous line (this avoids
banding)
- it must be within +/- 1 of the "base" palette for the line number
This gives pretty good results!
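A sketch that combines the +/- 1 constraint with the later
multiplicative-penalty rule for re-using the previous line's palette;
the error measure, penalty factor, and data layout are illustrative
assumptions:

```python
import numpy as np

def choose_palette(line_pixels, palettes, base_idx, prev_idx, reuse_penalty=1.25):
    """Pick the palette for one line from {base-1, base, base+1}.

    Re-using the previous line's palette is allowed, but its error is
    multiplied by reuse_penalty so it only wins when it is clearly better.
    """
    best_idx, best_cost = None, np.inf
    for idx in (base_idx - 1, base_idx, base_idx + 1):
        if not 0 <= idx < len(palettes):
            continue
        # Total error of mapping each pixel on the line to its nearest entry.
        dists = np.linalg.norm(line_pixels[:, None, :] - palettes[idx][None, :, :], axis=2)
        cost = dists.min(axis=1).sum()
        if idx == prev_idx:
            cost *= reuse_penalty
        if cost < best_cost:
            best_idx, best_cost = idx, cost
    return best_idx
```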
direction. Otherwise, errors can accumulate in an RGB channel if
there are no palette colours with an extremal value, and then when a
new palette is introduced all of the accumulated error suddenly
discharges in a spurious horizontal line. This now gives quite good
results!
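As a hedged illustration of one standard way to bound this kind of
error accumulation (not necessarily the exact fix described here):
clip the error-adjusted value to the valid channel range before
quantizing, so the diffused error cannot grow without limit:

```python
import numpy as np

def quantize_with_clamp(pixel, accumulated_error, palette):
    """Quantize one pixel with its diffusion error, clamping so that error
    cannot build up in a channel the palette cannot reach (illustrative only;
    the converter's actual clamping point may differ)."""
    target = np.clip(pixel + accumulated_error, 0.0, 255.0)
    chosen = palette[np.argmin(np.linalg.norm(palette - target, axis=1))]
    # Measure the error against the clipped target so it stays bounded.
    return chosen, target - chosen
```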
* Switch to using the L1 norm (i.e. k-medians) for palette
clustering, per a suggestion from Lucas Scharenbroich: "A k-medians
effectively uses an L1 distance metric
instead of L2 for k-means. Using a squared distance metric causes
the fit to "fall off" too quickly and allows too many of the k
centroids to cluster around areas of high density, which results in
many similar colors being selected. A linear cost function forces
the centroids to spread out since the error influence has a broader
range."
This avoids the banding, but it's not clear whether it's better
overall.
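For reference, the two cost functions imply different optimal
centroids for a cluster: the L2 (k-means) optimum is the mean, while
the L1 (k-medians) optimum is the per-channel median, which is not
dragged around by a handful of outlying pixels. A tiny illustration
of that mechanical difference (not project code):

```python
import numpy as np

# A cluster dominated by near-identical dark pixels plus a few bright ones.
cluster = np.array([[10, 10, 10]] * 90 + [[200, 200, 200]] * 10, dtype=float)

print(cluster.mean(axis=0))        # L2 update: [29. 29. 29.], pulled off the dense value
print(np.median(cluster, axis=0))  # L1 update: [10. 10. 10.], stays on the dense value
```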
Also implement my own k-means clustering which can keep some
centroids fixed, e.g. to retain some fixed palette entries while
swapping out others. I was hoping this would improve colour blending
across neighbouring palettes, but it's also not clear whether it
does.
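A minimal sketch of clustering with some centroids held fixed
(written here as standard k-means with squared distances and mean
updates; the real code might use the L1 variant instead). All names
are hypothetical:

```python
import numpy as np

def kmeans_with_fixed(pixels, fixed_centroids, n_free, n_iters=20, seed=0):
    """k-means in which fixed_centroids never move; only n_free extra
    centroids are updated.  Useful for retaining some palette entries
    while refitting the rest."""
    rng = np.random.default_rng(seed)
    fixed = np.asarray(fixed_centroids, dtype=float)
    free = pixels[rng.choice(len(pixels), size=n_free, replace=False)].astype(float)
    for _ in range(n_iters):
        centroids = np.vstack([fixed, free])
        # Assign every pixel to its nearest centroid, fixed or free.
        dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update only the free centroids; the fixed ones are left untouched.
        for j in range(n_free):
            members = pixels[labels == len(fixed) + j]
            if len(members):
                free[j] = members.mean(axis=0)
    return np.vstack([fixed, free])
```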
This gives the best of both worlds: dithering in a linear space, with
good (and fast) perceptual error differences.
TBD: would linear RGB work as well as XYZ?
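For comparing the two candidate spaces, linear RGB to CIE XYZ is just
a fixed matrix multiply (sRGB primaries, D65 white point), so the XYZ
version costs essentially nothing extra; a sketch of the two distance
computations (not project code):

```python
import numpy as np

# Standard linear sRGB -> CIE XYZ matrix (D65 white point).
RGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])

def dist_linear_rgb(a, b):
    """Euclidean distance directly in linear RGB."""
    return np.linalg.norm(np.asarray(a, float) - np.asarray(b, float))

def dist_xyz(a, b):
    """Euclidean distance after converting linear RGB to XYZ."""
    a_xyz = RGB_TO_XYZ @ np.asarray(a, float)
    b_xyz = RGB_TO_XYZ @ np.asarray(b, float)
    return np.linalg.norm(a_xyz - b_xyz)
```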