Here's an example of the image color quantization I've been talking about.
This is an original subtitle bitmap with a palette of 4 or 5 colors:
fate-suite\apng\o_sample.png
All of the following images are zoomed to 400% and taken as screenshots from Photoshop.
palettegen/paletteuse with alpha:
ffmpeg -y -loglevel verbose -i "fate-suite\apng\o_sample.png" -filter_complex "split[split1][split2];[split1]palettegen=max_colors=255:use_alpha=1[pal1];[split2][pal1]paletteuse=use_alpha=1" -frames:v 1 out.png
palettegen/paletteuse without alpha:
ffmpeg -y -loglevel verbose -i "fate-suite\apng\o_sample.png" -filter_complex "split[split1][split2];[split1]palettegen=max_colors=255[pal1];[split2][pal1]paletteuse" -frames:v 1 out.png
elbg without alpha:
ffmpeg -y -loglevel verbose -i "..\fate-suite\apng\o_sample.png" -filter_complex "elbg=pal8=1" -frames:v 1 out.png
elbg with alpha:
ffmpeg -y -loglevel verbose -i "..\fate-suite\apng\o_sample.png" -filter_complex "elbg=pal8=1:use_alpha=1" -frames:v 1 out.png
Now we convert it to RGBA for scaling to 58%.
The result is this:
Magnified to 200%:
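The RGBA conversion and 58% downscale step can be sketched roughly like this (a minimal, self-contained illustration: the lavfi color source only stands in for the real subtitle bitmap, and the bicubic flag is an assumption, since the post doesn't say which scaler was used):

```shell
# Synthesize a small stand-in input frame (illustrative only)
ffmpeg -y -f lavfi -i "color=c=red:s=64x64:d=1" -frames:v 1 in.png
# Convert to RGBA and downscale to 58% of the original size
ffmpeg -y -i in.png -vf "format=rgba,scale=iw*0.58:ih*0.58:flags=bicubic" -frames:v 1 scaled.png
```

The scale expressions are evaluated per input, so the same filter chain works for any bitmap size.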
Now, the task is to reduce the number of discrete colors as much as possible while keeping the quality of appearance.
This is something that Photoshop can do very well:
The image is reduced to only 8 palette colors and it still looks fine.
The palettization can be controlled in several ways and the results are pretty good, even with a small number of colors.
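For comparison, the same palettegen/paletteuse chain as above can be forced down to 8 colors to approximate the Photoshop result (a hedged sketch: the testsrc input merely stands in for the real bitmap, and sierra2_4a is just paletteuse's default dither, not necessarily what Photoshop does):

```shell
# Synthesize a colorful stand-in input frame (illustrative only)
ffmpeg -y -f lavfi -i "testsrc=s=64x64:d=1" -frames:v 1 src.png
# Quantize to an 8-color palette with dithering
ffmpeg -y -i src.png -filter_complex "split[a][b];[a]palettegen=max_colors=8[pal];[b][pal]paletteuse=dither=sierra2_4a" -frames:v 1 out8.png
```

Whether this matches Photoshop's quality depends heavily on the dither mode and on how palettegen weights the colors, which is exactly the hard part discussed below.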
It looks easy when working with PS, but choosing the right set of colors and properly replacing them in the image - while retaining borders and blending smoothly - is not as easy as it seems. A basic implementation can be put together quickly, but it probably won't provide results at a similar level of quality. Though, I haven't tried yet.