I think just about every field in Computer Science has these kinds of problems though, where people learn just enough to be dangerous.
E.g.: the default sRGB colour space used by typical computers is non-linear, so you can't interpolate using naive algebra. You should convert to linear light, interpolate, then convert back to sRGB with the original gamma curve.
Practically nobody does this.
Even Adobe Photoshop has this feature off by default, which is just insane.
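A minimal sketch of the correct approach, using the IEC 61966-2-1 sRGB transfer function (the function names here are mine, not from any particular library):

```python
def srgb_to_linear(c):
    """Decode one sRGB channel (0..1) to linear light per IEC 61966-2-1."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode one linear-light channel (0..1) back to sRGB."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend(a, b, t):
    """Interpolate two sRGB channel values correctly: linearise, mix, re-encode."""
    la, lb = srgb_to_linear(a), srgb_to_linear(b)
    return linear_to_srgb(la + (lb - la) * t)

# A 50/50 mix of black and white: naive averaging of the encoded values
# gives 0.5, but the physically correct result is noticeably brighter.
print(blend(0.0, 1.0, 0.5))  # ~0.735, not 0.5
```

The gap between 0.5 and ~0.735 is exactly the kind of error that shows up as dark fringes when images are resized or blended in gamma-encoded space.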
Just as with geospatial mappings, there are further nuances to imaging colour spaces. E.g.: most colour spaces are designed to match some set of phosphor or LED primaries, and do not match the human eye's response curves. So almost any "maths" you perform in almost any colour space, even in linear light, will introduce colour shifts or other artefacts: changing intensity, upscaling, downscaling, blending, etc...
A much better colour space would be something like ICtCp, which is designed to match the response of the human eye, so its "coordinate system" maps better onto the actual responses of the rods and cones: https://en.wikipedia.org/wiki/ICtCp
Note how the map of the colours is smooth, because the space allocates its "coordinate vector bits" evenly across perceptually distinct colours.
This is one of the key features of Dolby Vision: https://professional.dolby.com/siteassets/pdfs/ictcp_dolbywh...
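To make the idea concrete, here's a sketch of the BT.2100 RGB-to-ICtCp path, assuming linear BT.2020 RGB input normalised so that 1.0 corresponds to the PQ peak of 10,000 cd/m². The matrix coefficients and PQ constants are the published BT.2100 values as I recall them; treat this as illustrative rather than a vetted implementation:

```python
# SMPTE ST 2084 (PQ) constants as specified in ITU-R BT.2100.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(y):
    """PQ-encode a linear value y in 0..1 (1.0 = 10,000 cd/m2 peak)."""
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

def rgb_to_ictcp(r, g, b):
    """Linear BT.2020 RGB (0..1) -> ICtCp, per the BT.2100 derivation."""
    # Crosstalk matrix into cone-like LMS primaries.
    l = (1688 * r + 2146 * g + 262 * b) / 4096
    m = (683 * r + 2951 * g + 462 * b) / 4096
    s = (99 * r + 309 * g + 3688 * b) / 4096
    # Non-linearity is applied to LMS, not RGB -- that's what keeps
    # the intensity axis (I) decorrelated from the chroma axes (Ct, Cp).
    lp, mp, sp = pq_encode(l), pq_encode(m), pq_encode(s)
    i = 0.5 * lp + 0.5 * mp
    ct = (6610 * lp - 13613 * mp + 7003 * sp) / 4096
    cp = (17933 * lp - 17390 * mp - 543 * sp) / 4096
    return i, ct, cp

# Neutral grey lands exactly on the intensity axis: Ct = Cp = 0.
print(rgb_to_ictcp(0.18, 0.18, 0.18))
```

The design choice worth noticing is the ordering: matrix to LMS first, then the perceptual non-linearity, then the opponent-colour matrix. That ordering is what makes operations along one axis (e.g. changing intensity) cause far less drift in the other two than they would in YCbCr.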
(Dolby is just about the only organisation that gets this stuff even vaguely right.)
All of the above issues can turn up in simple Long/Lat coordinates. There's always some idiot who ignores the fact that the coordinate density changes with latitude, or that the grid cells aren't even rectangles near the poles, or that interpolation is not trivial, etc...
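The interpolation trap is easy to demonstrate. A sketch of naive lat/long averaging versus the great-circle midpoint (via 3-D unit vectors on a sphere; ignores the degenerate antipodal case and ellipsoidal corrections):

```python
import math

def naive_midpoint(p, q):
    """Average lat/long directly -- wrong except near the equator over short spans."""
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

def spherical_midpoint(p, q):
    """Midpoint along the great circle: to 3-D unit vectors, average, renormalise."""
    def to_vec(lat, lon):
        la, lo = map(math.radians, (lat, lon))
        return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))
    a, b = to_vec(*p), to_vec(*q)
    m = [ai + bi for ai, bi in zip(a, b)]
    n = math.sqrt(sum(c * c for c in m))
    x, y, z = (c / n for c in m)
    return (math.degrees(math.asin(z)), math.degrees(math.atan2(y, x)))

# Two points at 60N, 90 degrees of longitude apart:
p, q = (60.0, 0.0), (60.0, 90.0)
print(naive_midpoint(p, q))      # (60.0, 45.0)
print(spherical_midpoint(p, q))  # latitude well above 60N (~67.8)
```

The naive midpoint sits on the 60th parallel, but the actual great-circle path between those two points bulges towards the pole, so the true midpoint is almost eight degrees further north. The same distortion poisons distance estimates, area calculations, and any grid-cell binning done in raw degrees.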