I’ve been experimenting with warping images so they look flat when physically projected from any angle. The idea is to project a regular grid of polygons but choose the texture UV coords so that the image appears straight to the viewer. I’ve used OpenCV’s camera calibration code first to calibrate my camera, then to locate grid intersections in 2D space on a chessboard pattern as seen from the chosen eye point.
Using the known screen coordinates of each internal chessboard square and the coordinates detected in the camera image, it’s possible to calculate UV texture coordinates to project a straight image over the chessboard polygons.
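The core of the UV calculation is simpler than it sounds: each grid vertex gets the texture coordinate of wherever the viewer's camera saw it, normalized into [0, 1]. A hypothetical helper (the name and signature are mine, not from any library) might look like:

```python
import numpy as np

def corners_to_uv(camera_corners, image_size):
    """Map detected corner positions from the eye-point camera image
    into [0, 1] UV texture space.

    camera_corners: (N, 2) array-like of pixel positions.
    image_size: (width, height) of the camera image.
    """
    w, h = image_size
    return np.asarray(camera_corners, dtype=float) / [w, h]

# Example: a corner seen at the centre of a 640x480 eye-point image
# maps to the centre of the texture.
uv = corners_to_uv([[320.0, 240.0]], (640, 480))
# uv == [[0.5, 0.5]]
```

Sampling the source image at these UVs over the projected grid polygons means the texture lands exactly where the viewer expects each piece of the straight image to be.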
It’s not necessary to know the position of the projector or anything about its optics, or to calibrate the projector in any way.
I’d like to try to extend the approach to a fulldome environment, as I can’t find a freely available projector alignment, calibration and blending solution for hemispherical domes (as we used for Black Shoals).