- Repeatedly refit palettes, since k-means only finds a local optimum.
This can produce incremental improvements in image quality, but may
also overfit, especially on complex images.
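A minimal sketch of the refit loop, using scikit-learn's KMeans purely
for illustration (the project's own clustering code differs; helper
names are made up):

```python
import numpy as np
from sklearn.cluster import KMeans

def refit_palette(pixels, n_colours=16, n_restarts=10):
    # pixels: (N, 3) array of RGB values for the region being fitted.
    # Rerun k-means from different seeds; each run is only a local
    # optimum, so keep whichever palette has the lowest total error.
    best_error, best_palette = np.inf, None
    for seed in range(n_restarts):
        fit = KMeans(n_clusters=n_colours, n_init=1, random_state=seed).fit(pixels)
        if fit.inertia_ < best_error:  # inertia_ = summed squared quantisation error
            best_error, best_palette = fit.inertia_, fit.cluster_centers_
    return best_palette
```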
- Use pygame to render incremental images
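For the incremental preview, something along these lines is enough
(standard pygame calls; the helper name is made up):

```python
import numpy as np
import pygame

def show_preview(rgb):
    # rgb: (height, width, 3) uint8 array of the current dithered output.
    if not pygame.get_init():
        pygame.init()
    h, w = rgb.shape[:2]
    screen = pygame.display.set_mode((w, h))
    # surfarray expects (width, height, 3), so swap the image axes
    surface = pygame.surfarray.make_surface(
        np.ascontiguousarray(rgb.transpose(1, 0, 2)))
    screen.blit(surface, (0, 0))
    pygame.display.flip()
    pygame.event.pump()  # keep the window responsive between updates
```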
- Fix off-by-one in palette striping
- When fitting palettes, first cluster a 16-colour palette for the
entire image and use it to initialize the centroids for the individual
palettes. This improves quality when fitting images with large blocks
of colour: fitted separately, those blocks may come out with slight
differences, whereas with a global initializer they tend to converge
to the same colours. This also improves performance.
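Roughly (again using scikit-learn's KMeans just to illustrate; the
per-band split and names are placeholders):

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_palettes(all_pixels, band_pixels, n_colours=16):
    # Fit a global 16-colour palette over the whole image...
    global_fit = KMeans(n_clusters=n_colours, n_init=1).fit(all_pixels)
    init = global_fit.cluster_centers_
    # ...then seed each per-band fit with those centroids, so bands that
    # share large flat areas converge to the same palette entries.
    palettes = []
    for pixels in band_pixels:
        fit = KMeans(n_clusters=n_colours, init=init, n_init=1).fit(pixels)
        palettes.append(fit.cluster_centers_)
    return palettes
```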
- Switch to pyclustering for k-medians
- Allow choosing the same palette as the previous line, with a multiplicative penalty applied to its distance in case it is much better than the alternatives
- Iterate k-medians multiple times and choose the best result, since each run only finds a local optimum
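A sketch of that restart loop with pyclustering (assuming its kmedians
class and a k-means++ seed; the error metric used to compare runs is
illustrative):

```python
import numpy as np
from pyclustering.cluster.center_initializer import kmeans_plusplus_initializer
from pyclustering.cluster.kmedians import kmedians

def best_kmedians_palette(pixels, n_colours=16, n_restarts=5):
    # pixels: (N, 3) array of RGB values.
    data = pixels.tolist()  # pyclustering works on plain Python lists
    best_error, best_palette = np.inf, None
    for _ in range(n_restarts):
        initial = kmeans_plusplus_initializer(data, n_colours).initialize()
        fit = kmedians(data, initial)
        fit.process()
        palette = np.array(fit.get_medians())
        # Total L1 distance of every pixel to its nearest median
        error = np.abs(pixels[:, None, :] - palette[None, :, :]).sum(axis=2).min(axis=1).sum()
        if error < best_error:
            best_error, best_palette = error, palette
    return best_palette
```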
When dithering, the palette chosen for each line is subject to two limitations:
- it cannot be the same palette as the previous line (this avoids banding)
- it must be within +/- 1 of the "base" palette for the line number
This gives pretty good results!
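A sketch of the per-line palette choice under these constraints (the
later variant instead allows reusing the previous line's palette with
the multiplicative penalty mentioned above; names and the error metric
here are illustrative):

```python
import numpy as np

def choose_palette(line_pixels, palettes, base_idx, prev_idx):
    # Candidates are base-1, base and base+1, excluding the palette the
    # previous line used, so neighbouring lines can't band together.
    best_idx, best_error = None, np.inf
    for idx in (base_idx - 1, base_idx, base_idx + 1):
        if idx == prev_idx or not 0 <= idx < len(palettes):
            continue
        # Total L1 error of mapping this line onto the candidate palette
        error = np.abs(line_pixels[:, None, :] - palettes[idx][None, :, :]).sum(axis=2).min(axis=1).sum()
        if error < best_error:
            best_idx, best_error = idx, error
    return best_idx
```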
direction. Otherwise, errors can accumulate in an RGB channel if
there are no palette colours with an extremal value, and when a new
palette is then introduced all of the accumulated error suddenly
discharges as a spurious horizontal line. This now gives quite good
results!
* Switch to using L1-norm for k-means, per suggestion of Lucas
Scharenbroich: "A k-medians effectively uses an L1 distance metric
instead of L2 for k-means. Using a squared distance metric causes
the fit to "fall off" too quickly and allows too many of the k
centroids to cluster around areas of high density, which results in
many similar colors being selected. A linear cost function forces
the centroids to spread out since the error influence has a broader
range."
This avoids the banding, but it's not clear whether the result is better overall.
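To make the difference concrete, the only change is in the assignment
cost the clustering minimises; roughly:

```python
import numpy as np

def clustering_cost(points, centroids, norm="l1"):
    # Cost of assigning each point to its nearest centroid.  "l2" is the
    # usual k-means objective (squared distances, which let centroids
    # crowd into dense colour regions); "l1" is the k-medians objective,
    # whose linear cost pushes the centroids to spread out.
    diff = np.abs(points[:, None, :] - centroids[None, :, :])
    per_pair = diff.sum(axis=2) if norm == "l1" else (diff ** 2).sum(axis=2)
    return per_pair.min(axis=1).sum()
```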
Also implement my own k-means clustering that can keep some centroids
fixed, e.g. to retain certain palette entries while swapping out
others. I hoped this would improve colour blending across neighbouring
palettes, but it's also not clear that it does.
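The fixed-centroid variant looks roughly like this (a sketch, not the
actual implementation):

```python
import numpy as np

def kmeans_with_fixed(points, fixed_centroids, n_free, n_iter=20, seed=0):
    # The first len(fixed_centroids) centroids never move, e.g. to retain
    # some palette entries while refitting the rest.
    rng = np.random.default_rng(seed)
    fixed = np.asarray(fixed_centroids, dtype=float)
    free = points[rng.choice(len(points), n_free, replace=False)].astype(float)
    for _ in range(n_iter):
        centroids = np.vstack([fixed, free])
        # Assign each point to its nearest centroid (squared Euclidean)
        dists = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update only the free centroids from their assigned points
        for j in range(n_free):
            members = points[labels == len(fixed) + j]
            if len(members):
                free[j] = members.mean(axis=0)
    return np.vstack([fixed, free])
```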
This gives the best of both worlds: dithering in a linear space, with
good (and fast) perceptual error differences.
TBD: would linear RGB work as well as XYZ?
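For reference, the conversion chain this refers to, assuming sRGB input
and the standard D65 sRGB-to-XYZ matrix (error diffusion happens on the
linear values; error distances are compared in XYZ):

```python
import numpy as np

# Standard D65 sRGB-to-XYZ matrix
SRGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])

def srgb_to_linear(rgb):
    # Undo the sRGB gamma curve; rgb values in [0, 1]
    rgb = np.asarray(rgb, dtype=float)
    return np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)

def linear_to_xyz(linear_rgb):
    # Linear RGB -> CIE XYZ; distances for the dither error are taken here
    return linear_rgb @ SRGB_TO_XYZ.T
```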
Use this to precompute a new NTSC palette with 256 entries (though
only 84 unique colours) that can be produced by appropriate pixel
sequences. Unfortunately, the precomputed distance matrix for this
palette is 4GB!
Optimize the precomputation to be less memory-hungry, while also
making efficient use of the mmapped output file.
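The 4GB figure is presumably one byte per (24-bit RGB value, palette
entry) pair, since 2^24 x 256 = 2^32 entries. A sketch of a chunked
precomputation into a memory-mapped file, so the whole matrix never
has to be held in RAM (the distance metric and uint8 quantisation here
are illustrative):

```python
import numpy as np

def precompute_distances(palette_rgb, out_path, chunk=1 << 14):
    # 2**24 RGB values x 256 palette entries, one uint8 per pair = 4 GiB,
    # written chunk by chunk straight into a memory-mapped file.
    palette = np.asarray(palette_rgb, dtype=float)
    n_rgb = 1 << 24
    out = np.memmap(out_path, dtype=np.uint8, mode="w+",
                    shape=(n_rgb, len(palette)))
    for start in range(0, n_rgb, chunk):
        idx = np.arange(start, min(start + chunk, n_rgb))
        # Unpack the 24-bit index into (N, 3) R, G, B components
        rgb = np.stack([(idx >> 16) & 0xFF, (idx >> 8) & 0xFF, idx & 0xFF],
                       axis=1).astype(float)
        dist = np.abs(rgb[:, None, :] - palette[None, :, :]).sum(axis=2)
        # Scale so the largest possible L1 distance (3 * 255) fits in uint8
        out[start:start + len(idx)] = (dist * (255.0 / 765.0)).astype(np.uint8)
    out.flush()
```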
Add support for dithering images using this 8-bit palette depth,
i.e. to optimize for NTSC rendering. This often gives better image
quality since more colours are available, especially when modulating
areas of similar colour.
Fix 140-pixel dithering, and render the output including NTSC fringing
instead of the unrealistic 140px output that omits it.
Add support for rendering the output image using any target palette,
which is useful e.g. for comparing how an 8-pixel NTSC rendered image
will be displayed on an emulator using 4-pixel NTSC emulation (there
is usually some colour bias, because the 8-pixel chroma blending tends
to average away colours).
Switch the output binary format to write AUX memory first, which
matches the image format of other utilities.
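A sketch of the writer, under the assumption that the image has already
been split into one AUX bank and one MAIN bank of bytes (bank sizes and
preparation are outside this snippet):

```python
def write_output(path, main_bank, aux_bank):
    # AUX memory is written first, then MAIN, matching the layout other
    # utilities expect.
    with open(path, "wb") as f:
        f.write(bytes(aux_bank))
        f.write(bytes(main_bank))
```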