Made quite a lot of progress on calculating warps and interacting with my dome/projector simulator. The calibrator application requests fisheye “photos” from the simulator via the embedded HTTP server. It also generates and sends the chessboard calibration pattern to the simulator and calculates a warp for each simulated projector in turn. The approximate pose of the projector and an optimum scene FOV are also calculated based on the eye point.
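Generating the chessboard pattern itself is simple. A minimal sketch (the `chessboard` helper and its dimensions are illustrative, not the calibrator's actual code):

```python
def chessboard(cols, rows, square_px):
    """Return a chessboard pattern as a 2D list of 0/255 pixel values.

    cols/rows count squares; square_px is the side of each square in pixels.
    Squares alternate based on the parity of their (column + row) index.
    """
    w, h = cols * square_px, rows * square_px
    return [[255 if (x // square_px + y // square_px) % 2 == 0 else 0
             for x in range(w)]
            for y in range(h)]
```

In practice the pattern would be rendered at the projector's native resolution and sent to the simulator over the same HTTP interface used for the fisheye photos.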

To view the quality of the alignment and warp, the simulator can also render the dome structure using the modelview matrix and field of view passed back by the calibration process. This modelview matrix has the eye point fixed at the centre of the dome sphere and looks in a direction, with a FOV just large enough (but no larger), to render the part of the scene that will be warped into that projector’s region of the dome.
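One way to derive that view direction and minimal FOV, assuming the eye point is at the dome centre (the origin): take the dome-surface points covered by one projector, use their mean direction as the look vector, and set the FOV to twice the largest angular deviation from it. The `view_dir_and_fov` helper below is a hypothetical sketch of this idea, not the calibrator's actual implementation:

```python
import math

def view_dir_and_fov(points):
    """Given direction vectors from the dome centre to the dome-surface
    positions covered by one projector, return (look_dir, fov_degrees).

    The look direction is the normalised mean of the point directions;
    the FOV is twice the largest angle between any point and that
    direction, so the frustum is just big enough to cover them all.
    """
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    mean = norm(tuple(sum(p[i] for p in points) for i in range(3)))
    half = max(
        math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(norm(p), mean)))))
        for p in points
    )
    return mean, math.degrees(2 * half)
```

For a real frustum one would still need to account for the projector image's aspect ratio (separate horizontal and vertical FOVs), but the symmetric case illustrates the "large enough but no larger" constraint.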

After bugs were fixed and the fisheye resolution was increased (to a 4K image), the alignment seems pretty good, at least in simulation.

Four simulated projectors projecting content rendered (then warped) from the virtual eye point