Hi there, I added a new core to the distribution today called "MiSTer Laggy"; you'll find it in the _Utilities directory. MiSTer Laggy is a core for display latency testing, and it works with a small hardware peripheral that connects to the user port. It works much like a Time Sleuth, but it uses your MiSTer to run the firmware and generate the video signal (the general measurement idea is sketched after the links below). There are more details on its use in the readme: https://github.com/mister-devel/MiSTerL ... Ter#readme
A KiCAD project and 3D printer files are in the hardware directory: https://github.com/MiSTer-devel/MiSTerL ... n/hardware
If you are in the US I have a small number of premade devices for sale here: https://www.tindie.com/products/pbretro/mister-laggy/
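For anyone curious how this kind of tester works in principle: the core draws a bright patch on screen, the light sensor on the user port watches the display, and the firmware times the gap between the two. Here's a rough C sketch of that idea. To be clear, this is not the actual MiSTer Laggy firmware; the register addresses and bit layouts below are made up purely for illustration.

#include <stdint.h>

/* Hypothetical memory-mapped registers, for illustration only. */
#define FLASH_REG   (*(volatile uint16_t *)0xA00000) /* 1 = draw the white patch, 0 = black */
#define SENSOR_REG  (*(volatile uint16_t *)0xA00002) /* bit 0 set while the photodiode sees light */
#define TIMER_US    (*(volatile uint32_t *)0xA00004) /* free-running microsecond counter */

/* One sample: flash the patch, wait for the sensor, return the delay in microseconds. */
static uint32_t measure_once(void)
{
    FLASH_REG = 1;
    uint32_t start = TIMER_US;

    while ((SENSOR_REG & 1) == 0)
        ;   /* busy-wait until the sensor sees the flash */

    FLASH_REG = 0;
    return TIMER_US - start;
}

In practice you'd also sync the start of the measurement to the point in the frame where the sensor is positioned and average many samples, but that's the basic loop.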
The biggest thing I've discovered with this device so far is that you really should use vrr_mode if you are using an LG OLED. Their low-latency boost mode is fantastic (about 2.4ms of latency), but only if the refresh rate is exactly 50Hz or 60Hz. Lots of arcade cores have refresh rates like 58Hz or 62Hz, and when the LG OLED receives a refresh rate like that it drops into a buffering mode and adds 1-3 frames of latency. With VRR active it does not add this extra latency. Given how complex the settings and possible inputs are on modern TVs, I'm sure there are plenty of other "gotchas" and sub-optimal paths you can hit.
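For reference, vrr_mode is set in MiSTer.ini; something along these lines should do it, though check the MiSTer documentation for the full set of accepted values since they've changed over time:

[MiSTer]
; enable VRR output so off-spec refresh rates (58Hz, 62Hz, etc.)
; don't kick the TV into its buffering mode
vrr_mode=1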
Some details on the core development itself. I didn't want to have to write all of the UI and other logic in HDL, so I made a core with a 68000 CPU and a basic tilemap graphics system. The firmware is written in C and compiled for the 68000 with GCC (it's pretty cool that GCC still supports CPUs like the 68000). I already had a toolchain set up because I had been making test ROMs for 68000-based arcade hardware, so repurposing it for this project was simple enough. Since it is all in C, I was able to easily port over the HDMI video mode code from Main, so the core itself can change its resolution and refresh rate.
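To give a flavor of the firmware side, here's a tiny sketch of what driving a simple memory-mapped tilemap from C can look like. The base address, tilemap width, and cell layout are invented for the example and are not the core's actual register map:

#include <stdint.h>

/* Hypothetical tilemap: one 16-bit word per cell,
 * low byte = character code, high byte = palette/attribute. */
#define TILEMAP       ((volatile uint16_t *)0x900000)
#define TILEMAP_COLS  64

static void draw_text(unsigned x, unsigned y, uint8_t attr, const char *s)
{
    volatile uint16_t *cell = TILEMAP + y * TILEMAP_COLS + x;
    while (*s)
        *cell++ = ((uint16_t)attr << 8) | (uint8_t)*s++;
}

Code like this builds with a bare-metal m68k GCC cross-compiler (e.g. m68k-elf-gcc), which is the same kind of toolchain used for the arcade test ROMs mentioned above.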