I've got a lag testing result that I can't explain and have a hunch it might be down to a problem with the emulation, so I'm here looking for help from someone who understands the MAME code.
I did some lag testing a while back on Defender:
GroovyMAME frame_delay 8, Dell 1913 TN monitor.
120fps video with LEDs lit by the Fire button.
The result was that the input lag at 3/4 screen depth was about as expected, but the lag at 1/4 screen depth (which I would expect to be 8ms better) was actually about 8ms worse. [ edit: I've taken a larger sample of 100 shots to get a better value than I posted yesterday ]
I haven't been able to come up with any explanation other than something odd in the emulation.
Defender polls the inputs twice per video frame. The input state read at the start of the video frame is used to update the lower half of the screen while the top half is being drawn, and the input state read mid-frame is used to update the top half of the screen (which will be displayed next video frame). This results in low average lag and low variability of lag, with actions in the upper and lower halves of the screen being equally responsive.
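To make the timing concrete, here's a small model of that two-poll scheme. This is my own sketch of the hardware behaviour described above, not actual MAME code; the names `display_time` and `avg_lag` are mine, times are in milliseconds, and the beam is assumed to sweep top to bottom at a constant rate:

```python
import math

T = 1000.0 / 60.0  # 60 Hz video frame period, ~16.67 ms

def display_time(press, depth):
    """Time at which a pixel at `depth` (0.0 = top of screen, 1.0 =
    bottom) first reflects a button press at time `press`, under the
    two-poll-per-frame scheme described above (hypothetical model)."""
    if depth >= 0.5:
        # Lower half: latched by the frame-start poll, then drawn
        # later in that same frame.
        poll = math.ceil(press / T) * T
        return poll + depth * T
    else:
        # Top half: latched by the next mid-frame poll after the
        # press, then drawn at the matching depth of the NEXT frame.
        poll = (math.floor((press - T / 2) / T) + 1) * T + T / 2
        return poll + T / 2 + depth * T

def avg_lag(depth, n=10000):
    """Average press-to-photon lag over press times spread evenly
    across one frame period."""
    return sum(display_time(i * T / n, depth) - i * T / n
               for i in range(n)) / n
```

Running `avg_lag(0.25)` and `avg_lag(0.75)` gives roughly the same ~20.8 ms average for both halves, which is the "equally responsive" property: each half of the screen is fed by the most recent poll that can still affect it.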
I'm just guessing, but I can see that the person doing the translation to MAME might have looked at the video frame, seen that the top half of the screen uses an older input state, and therefore decided to use the input state previous to the current one. That even sounds reasonable, until you consider how much lag is already being introduced by 60Hz polling versus 120Hz: the emulated response is already lagging, so you want to pull back 8ms rather than give away another 16ms.
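If that guess is right, the arithmetic lines up with the measurement. Under the naive expectation behind the test (one input latch per frame, beam sweeping top to bottom) the 1/4-depth mark should be about 8ms better than the 3/4-depth mark; rendering the top half from an input state one full frame older flips that to about 8ms worse. A back-of-envelope sketch (my own model, not MAME's):

```python
T = 1000.0 / 60.0  # 60 Hz frame period, ~16.67 ms

# Naive expectation: one input latch per frame, so average lag
# (half a frame of poll wait plus beam travel) grows with depth.
lag_quarter  = T / 2 + 0.25 * T   # avg lag at 1/4 depth, ~12.5 ms
lag_3quarter = T / 2 + 0.75 * T   # avg lag at 3/4 depth, ~20.8 ms
print(lag_3quarter - lag_quarter)   # ~8.33 -> 1/4 depth expected 8ms BETTER

# Hypothesized bug: top half drawn from an input state one video
# frame older than intended, adding a full frame of lag up top.
lag_quarter_bug = lag_quarter + T   # ~29.2 ms
print(lag_quarter_bug - lag_3quarter)  # ~8.33 -> 1/4 depth measured 8ms WORSE
```

The ~16ms swing between "expected 8ms better" and "measured 8ms worse" is exactly one video frame, which is what you'd get from a one-poll off-by-one on the top half.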