This is a custom shader we are currently using to render flat-shaded geometry into a deferred render buffer. Please ignore the custom `#pragma` directives; they shouldn't be relevant here.
Now, this shader works. Most of the time.
On some particular systems, namely a Surface Pro 4 and a Surface Pro 3 with Intel Iris 540 / HD 5000 GPUs, both running Windows 10, the following "optimization" breaks rendering.
- vertex shader:
- line 13: remove unused out attribute,
out vec3 v_position;
- line 20: change into local var,
vec3 _position = a_position;
- line 23: change to use local var,
vec4 position = vec4(_position, 1.0);
- fragment shader:
- line 34: remove unused in attribute,
in vec3 v_position;
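To make the change above concrete, here is a minimal sketch of the vertex shader before and after the "optimization". Only `v_position`, `a_position`, and the edits listed above come from the original post; the surrounding structure and the `u_mvp` uniform are hypothetical:

```glsl
#version 150

// --- Before ---
in vec3 a_position;
out vec3 v_position;   // line 13: written here, but unused by the fragment shader

uniform mat4 u_mvp;    // hypothetical uniform name

void main() {
    v_position = a_position;                  // line 20
    vec4 position = vec4(v_position, 1.0);    // line 23
    gl_Position = u_mvp * position;
}

// --- After the "optimization" ---
// The unused out attribute is removed and replaced by a local variable:
//
// in vec3 a_position;
// uniform mat4 u_mvp;
//
// void main() {
//     vec3 _position = a_position;              // line 20: local var instead of out
//     vec4 position = vec4(_position, 1.0);     // line 23: uses the local var
//     gl_Position = u_mvp * position;
// }
```

With the matching `in vec3 v_position;` removed from the fragment shader (line 34), the two versions should be semantically identical; the only difference is the eliminated vertex-to-fragment interface variable.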
The result is that on these GPUs it now renders all black. Judging by the output, it looks like the alpha test fails every time, discarding all fragments.
On other GPUs (NVIDIA on Windows 10 / OS X, Intel on OS X) it works as intended.
It's been a horrible exercise tracking the problem down to this one innocent-looking change. It would be nice to get an idea of what's wrong with it...
After some shader instruction shuffling... I present The Solution!
not working:
working: