I extracted a small project to show what I am measuring:
The code is copied together from several places, so apologies for the quality... I hope it compiles on other boxes, too...
It builds two binaries, egl-demo and gtk-demo: one creates an OpenGL context with Xlib and EGL, the other with GtkGLArea.
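For reference, this is roughly how the gtk-demo side drives continuous rendering (a sketch with my own names and a trivial clear, not the actual demo code; it assumes libepoxy for the GL headers): a tick callback queues a redraw on every frame-clock tick, and the "render" signal does the drawing.

#include <epoxy/gl.h>
#include <gtk/gtk.h>

static gboolean
on_render (GtkGLArea *area, GdkGLContext *context, gpointer user_data)
{
  /* draw one frame; the real demo renders the scrolling background here */
  glClearColor (0.0, 0.0, 0.0, 1.0);
  glClear (GL_COLOR_BUFFER_BIT);
  return TRUE;
}

static gboolean
on_tick (GtkWidget *widget, GdkFrameClock *clock, gpointer user_data)
{
  /* ask for a new frame on every frame-clock tick */
  gtk_gl_area_queue_render (GTK_GL_AREA (widget));
  return G_SOURCE_CONTINUE;
}

int
main (int argc, char **argv)
{
  gtk_init (&argc, &argv);
  GtkWidget *window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  GtkWidget *area = gtk_gl_area_new ();
  gtk_container_add (GTK_CONTAINER (window), area);
  g_signal_connect (area, "render", G_CALLBACK (on_render), NULL);
  gtk_widget_add_tick_callback (area, on_tick, NULL, NULL);
  g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);
  gtk_widget_show_all (window);
  gtk_main ();
  return 0;
}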
My monitor has a 60.06 Hz refresh rate:
$ xrandr --verbose
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
eDP1 connected primary 1920x1080+0+0 (0x6f) normal (normal left inverted right x axis y axis) 344mm x 193mm
[...]
1920x1080 (0x6f) 140.000MHz +HSync -VSync *current +preferred
h: width 1920 start 1968 end 2068 total 2100 skew 0 clock 66.67KHz
v: height 1080 start 1083 end 1084 total 1110 clock 60.06Hz
[...]
The program prints some timing info; all values are in microseconds. I measure the average, min and max rendering time, and, more importantly, the average, min and max frame time, i.e. the wall-clock time between two consecutive runs of my render callback function.
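The frame-time measurement works roughly like this (a sketch, not the exact demo code): the render callback takes a monotonic timestamp and compares it with the previous one.

#include <glib.h>

static gint64 prev_frame_us = 0;
static gint64 min_frame_us = G_MAXINT64, max_frame_us = 0;
static gint64 frame_sum_us = 0, frame_count = 0;

static void
note_frame (void)
{
  gint64 now = g_get_monotonic_time ();     /* monotonic time in microseconds */
  if (prev_frame_us != 0)
    {
      gint64 dt = now - prev_frame_us;      /* time since the previous render call */
      if (dt < min_frame_us) min_frame_us = dt;
      if (dt > max_frame_us) max_frame_us = dt;
      frame_sum_us += dt;                   /* for the running average */
      frame_count++;
    }
  prev_frame_us = now;
}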
On my machine, egl-demo runs at a steady 60.06 FPS:
Full avg compute time: 433
1 sec avg compute time: 451
Min compute time: 388
Max compute time: 496
Full avg FPS: 60.06
1 sec FPS: 60.06
Min frame time: 16460
Max frame time: 16846
With gtk-demo, the numbers are not so nice:
Full avg compute time: 428
1 sec avg compute time: 456
Min compute time: 258
Max compute time: 2107
Full avg FPS: 59.03
1 sec FPS: 58.90
Min frame time: 15116
Max frame time: 19014
Although the jitter is clearly larger with GTK, the max frame time suggests that there are no dropped frames in the classical sense (halved refresh, 30 FPS on average), because in that case I would expect much larger max frame times, at least above 24000 usec.
So my conclusion is that, on my machine, GTK does not sync to vblank but to some other timer.
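One way to check what interval GTK itself thinks it is pacing to (not part of the demo, just a sketch) is to ask the widget's GdkFrameClock for its refresh info from a tick callback:

#include <gtk/gtk.h>

static gboolean
report_refresh (GtkWidget *widget, GdkFrameClock *clock, gpointer user_data)
{
  gint64 refresh_interval = 0, presentation_time = 0;

  /* refresh_interval is GDK's idea of the frame interval, in microseconds */
  gdk_frame_clock_get_refresh_info (clock,
                                    gdk_frame_clock_get_frame_time (clock),
                                    &refresh_interval,
                                    &presentation_time);
  g_print ("frame %" G_GINT64_FORMAT ": refresh interval %" G_GINT64_FORMAT " usec\n",
           gdk_frame_clock_get_frame_counter (clock),
           refresh_interval);
  return G_SOURCE_CONTINUE;
}

Installed with gtk_widget_add_tick_callback() on the GtkGLArea, this should show what GDK considers the refresh interval to be, which would help confirm or rule out the "some other timer" theory.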
Again: Ubuntu 16.04, more-or-less out-of-the-box X config: libgtk 3.18.9-1ubuntu3.1, unity 7.4.0+16.04.20160906-0ubuntu1, compiz 0.9.12.2+16.04.20160823-0ubuntu1, X.org 1.18.4. I am not familiar with X, but maybe these lines from the log are relevant:
[ 118.087] (II) intel(0): Using Kernel Mode Setting driver: i915, version 1.6.0 20151010
[...]
[ 118.092] (--) intel(0): Integrated Graphics Chipset: Intel(R) HD Graphics 4600
I would be happy if this issue turned out to uncover a real bug, so that I have contributed something to the community. :)
Background info:
I am working on a project where I need fluid scrolling of a background, and the GTK version has clearly visible and annoying glitches. This is also why I need the displayed-frame counter: in special cases I can compute the scrolling (shift) offset from it instead of from the wall clock, to get a smoother result. Indeed, it seems GtkGLArea is not suitable for this, although I don't understand why; I don't see why it should be worse than X11/EGL...
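To illustrate what I mean by computing the shift offset from the frame counter (a sketch; pixels_per_frame and the 1920-pixel wrap are made-up parameters, not the project's real values):

#include <math.h>
#include <gtk/gtk.h>

static gboolean
render_background (GtkGLArea *area, GdkGLContext *context, gpointer user_data)
{
  GdkFrameClock *clock = gtk_widget_get_frame_clock (GTK_WIDGET (area));
  gint64 frame = gdk_frame_clock_get_frame_counter (clock);

  const double pixels_per_frame = 2.0;                               /* scroll speed */
  double offset = fmod ((double) frame * pixels_per_frame, 1920.0);  /* wrap at screen width */

  /* ... translate the background by 'offset' pixels and draw it ... */
  (void) offset;
  return TRUE;
}

As long as each frame-clock tick corresponds to exactly one displayed frame, this scrolls at a constant per-frame speed and small timer jitter does not matter.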