I'm slightly skeptical of their benchmarks -- the fact that they're using 300fps numbers is a red flag. No one runs games at that speed. What's the comparison for a game running at 30-60fps? My hunch is that D3D has a small fixed per-frame cost that becomes irrelevant when frames take 10x longer to render.
Sorry, my point was that they picked a benchmark that was too easy. I suspect that if the machine were straining to hit 60fps, their results could be vastly different. That's how most games run, so that'd be more interesting to see.
I think the point is that benchmarking the graphics subsystem's throughput is easiest with high-framerate games; otherwise CPU load or raw rendering time might be the dominant factor.
That would change the size of the effect, but OpenGL would still be faster. Also, 60fps x 5 = 300fps, so the overhead's relative impact shrinks by roughly 5x: 1/5th of 20% ~= 4% faster, which is still significant for them.
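To make the fixed-overhead argument concrete, here's a minimal sketch (all numbers hypothetical, not from their benchmark): model each frame as render time plus a constant per-frame API cost, and see how the same saving translates into a smaller relative fps advantage as render time grows.

```python
def fps(render_ms, overhead_ms):
    """Frames per second given per-frame render time and fixed API overhead."""
    return 1000.0 / (render_ms + overhead_ms)

# Hypothetical: the faster API saves a constant 0.6 ms per frame.
saving_ms = 0.6

for render_ms in (2.7, 16.1, 32.7):  # roughly 300fps, 60fps, 30fps workloads
    slow = fps(render_ms, saving_ms)  # API that pays the overhead
    fast = fps(render_ms, 0.0)        # API that avoids it
    gain = (fast - slow) / slow * 100
    print(f"render {render_ms:5.1f} ms: {slow:6.1f} -> {fast:6.1f} fps "
          f"(+{gain:.1f}%)")
```

The relative gain is just overhead divided by render time, so a ~20% lead at 300fps collapses to a few percent at 30-60fps -- consistent with the 1/5th-of-20% estimate above.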