>>112632589
>10-bit H.264, which is a different codec
Showing how clueless you are. H.264 is H.264, no matter what settings you use. You can choose 10-bit color with 8-bit DCT accuracy, 8-bit color with 16-bit DCT accuracy, or any other combination the spec allows. The internal accuracy has nothing to do with whether you choose 10-bit color or not.
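Don't take my word for it (a sketch, assuming an ffmpeg build whose libx264 was compiled with 10-bit support; file names are placeholders): the exact same encoder produces either depth depending on the pixel format you ask for, and both outputs are plain H.264 streams.

ffmpeg -i in.mkv -c:v libx264 -pix_fmt yuv420p out_8bit.mkv
ffmpeg -i in.mkv -c:v libx264 -pix_fmt yuv420p10le out_10bit.mkv

The 10-bit file just uses the High 10 profile of the same spec, not some other codec.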
>Under good lighting, with a good (true 8-bit or better) monitor and good eyes, you can see the banding in a smooth undithered 8-bit YUV gradient
How many videos consist of a smooth gradient? Unless you deliberately render one, 8-bit video will not have banded gradients. If your Chinese cartoons are banded, blame the production companies, not the color depth. Fucking retard.
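If you want to check the gradient claim yourself, here's a minimal Python sketch (assumes numpy and Pillow; the 16-48 range is an arbitrary dark ramp picked to make any banding easy to spot):

import numpy as np
from PIL import Image

w, h = 1024, 256
# smooth float ramp from 16 to 48, truncated to 8-bit steps, no dither
ramp = np.linspace(16, 48, w)
banded = np.tile(ramp.astype(np.uint8), (h, 1))
Image.fromarray(banded, mode="L").save("gradient_8bit_undithered.png")

On a decent monitor you can count the individual bands. On ordinary content there is no such ramp to band in the first place.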
>banding artefacts
You're a retard. If the banding is in the source, it's not an artefact, and adding dither is not "debanding"; it's adding noise to obscure that the source had banding.
>you should dither when converting from high-bit-depth
You're fucking retarded. Dithering when converting from 10-bit to 8-bit does not add noise; it retains data that a plain cutoff filter would throw away. Pic related: a 1-bit image with dithering, which retains data from the original (8-bit? 10-bit? 16-bit? 32-bit?) image.
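You can reproduce the pic in a few lines of Python (a sketch, assuming Pillow; "source.png" is a placeholder for whatever image you test with):

from PIL import Image

src = Image.open("source.png").convert("L")                     # 8-bit grayscale
cutoff = src.convert("1", dither=Image.Dither.NONE)             # hard threshold
dithered = src.convert("1", dither=Image.Dither.FLOYDSTEINBERG) # error diffusion
cutoff.save("cutoff_1bit.png")
dithered.save("dithered_1bit.png")

The cutoff version collapses to two flat regions; the dithered version still encodes the original shades in the dot density.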
DITHERING WHEN CONVERTING FROM 8-BIT TO 10-BIT DOES NOTHING BUT ADD NOISE. Let's give an example:
2-bit gradient (four colors):
00000000 11111111 22222222 33333333
Converted to 1-bit without dither:
00000000 00000000 11111111 11111111
Converted to 1-bit with dither:
00000000 01001001 01101101 11111111
You can clearly see the dithered gradient has more information: all four bands are still distinguishable, because the dither retained information from the higher-bit-depth gradient.
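Same thing in plain Python (no dependencies; simple 1-D error diffusion stands in for whatever dither you like):

# 2-bit ramp, 8 samples per band
src = [v for v in (0, 1, 2, 3) for _ in range(8)]

# cutoff: threshold at the midpoint
cutoff = [1 if v >= 2 else 0 for v in src]

# error diffusion: push each sample's quantization error into the next
dithered, err = [], 0.0
for v in src:
    x = v / 3 + err            # scale 0..3 to 0..1, add carried error
    q = 1 if x >= 0.5 else 0
    err = x - q
    dithered.append(q)

for name, sig in (("cutoff  ", cutoff), ("dithered", dithered)):
    print(name, [sum(sig[i:i+8]) / 8 for i in range(0, 32, 8)])
# cutoff   [0.0, 0.0, 1.0, 1.0]      -> bands 0/1 and 2/3 collapse
# dithered [0.0, 0.375, 0.625, 1.0]  -> all four levels survive

Averaging each band (which is what your eye does) recovers the 2-bit ramp from the dithered output but not from the cutoff output.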
THIS DOESN'T HAPPEN WHEN YOU ADD A DITHER WHILE INCREASING THE BIT DEPTH, BECAUSE THERE'S NO INFORMATION TO RETAIN. IT'S PURE NOISE.
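Again in plain Python (random dither is a stand-in for any dither you'd add on the way up):

import random

random.seed(0)
src8 = list(range(100, 108))                       # some 8-bit samples

clean10 = [v << 2 for v in src8]                   # plain 8->10 bit expansion
noisy10 = [(v << 2) + random.choice((-1, 0, 1)) for v in src8]

back_clean = [v >> 2 for v in clean10]
back_noisy = [(v + 2) >> 2 for v in noisy10]       # round to nearest
print(back_clean == src8, back_noisy == src8)      # True True

Round both back down and you get the identical 8-bit signal: the dither carried zero information, it was pure noise riding on top.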
>If you knew what you were talking about, you wouldn't have to pepper your response with insults
The internet is full of retards who don't know that they're retards. When I call you that, I'm trying to alert you to this fact.