OpenGL ES 2.0 Lighting in the vertex shader or fragment shader


I have seen many different tutorials on lighting in OpenGL ES 2.0.

Some use the vertex shader to do all the lighting and transforms and then just pass the final colour through to the fragment shader.
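For example, the per-vertex (Gouraud) approach looks roughly like the sketch below; the attribute and uniform names are my own placeholders, not taken from any particular tutorial:

```glsl
// --- Vertex shader: per-vertex (Gouraud) diffuse lighting ---
attribute vec3 a_position;
attribute vec3 a_normal;

uniform mat4 u_mvpMatrix;     // model-view-projection matrix
uniform mat3 u_normalMatrix;  // inverse-transpose of the model-view matrix
uniform vec3 u_lightDir;      // direction toward the light, normalized, eye space
uniform vec3 u_lightColor;
uniform vec3 u_materialColor;

varying vec4 v_color;

void main() {
    vec3 n = normalize(u_normalMatrix * a_normal);
    float diffuse = max(dot(n, u_lightDir), 0.0);
    v_color = vec4(u_materialColor * u_lightColor * diffuse, 1.0);
    gl_Position = u_mvpMatrix * vec4(a_position, 1.0);
}

// --- Fragment shader: just outputs the interpolated colour ---
precision mediump float;
varying vec4 v_color;

void main() {
    gl_FragColor = v_color;
}
```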

Others pass the position and other variables from the vertex shader and then do all the lighting in the fragment shader.
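In the per-fragment variant, the vertex shader only forwards eye-space position and normal as varyings and the lighting equation moves into the fragment shader; again this is only a sketch with assumed names:

```glsl
// --- Vertex shader: transform only, pass eye-space data as varyings ---
attribute vec3 a_position;
attribute vec3 a_normal;

uniform mat4 u_mvpMatrix;
uniform mat4 u_modelViewMatrix;
uniform mat3 u_normalMatrix;

varying vec3 v_eyePos;
varying vec3 v_normal;

void main() {
    v_eyePos = vec3(u_modelViewMatrix * vec4(a_position, 1.0));
    v_normal = u_normalMatrix * a_normal;
    gl_Position = u_mvpMatrix * vec4(a_position, 1.0);
}

// --- Fragment shader: lighting evaluated per fragment ---
precision mediump float;

uniform vec3 u_lightPos;       // point light position in eye space
uniform vec3 u_lightColor;
uniform vec3 u_materialColor;

varying vec3 v_eyePos;
varying vec3 v_normal;

void main() {
    vec3 n = normalize(v_normal);             // re-normalize after interpolation
    vec3 l = normalize(u_lightPos - v_eyePos);
    float diffuse = max(dot(n, l), 0.0);
    gl_FragColor = vec4(u_materialColor * u_lightColor * diffuse, 1.0);
}
```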

In my experience I had always thought lighting should be done in the fragment shader. Can anyone tell me why you would do one over the other?


Traditional, fixed-pipeline OpenGL did lighting at the vertices and merely interpolated the result per fragment, so it tended to show visible seaming along polygon edges.

That was considered an acceptable compromise at the time, because lighting was too expensive to do per pixel. Hardware is better now, but per-pixel lighting is still the more expensive option, so there's a potential performance argument for keeping it in the vertex shader. And if you were trying to emulate the old fixed pipeline, you might deliberately do the lighting per vertex, inaccuracies and all.

However, I'm struggling to think of any particularly sophisticated algorithm that would be amenable to being evaluated entirely per vertex. Is it possible that the examples you've seen are just doing things like computing the tangent and bitangent vectors per vertex, or some other similarly expensive setup step, then interpolating those per pixel and doing the absolute final calculations in the fragment shader?
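If that's what those examples are doing, the split typically looks something like this: the vertex shader does the expensive work of building a tangent-space basis and transforming the light direction into it, and the fragment shader only does the final normal-map fetch and dot product. This is just a sketch of that idea, with illustrative names and a bitangent computed from a cross product:

```glsl
// --- Vertex shader: expensive setup, build tangent-space basis per vertex ---
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec3 a_tangent;
attribute vec2 a_texCoord;

uniform mat4 u_mvpMatrix;
uniform mat3 u_normalMatrix;
uniform vec3 u_lightDir;           // toward the light, eye space

varying vec3 v_lightDirTangent;    // interpolated per pixel by the hardware
varying vec2 v_texCoord;

void main() {
    vec3 n = normalize(u_normalMatrix * a_normal);
    vec3 t = normalize(u_normalMatrix * a_tangent);
    vec3 b = cross(n, t);                  // bitangent
    mat3 tbn = mat3(t, b, n);              // columns are the tangent-space basis
    v_lightDirTangent = u_lightDir * tbn;  // same as transpose(tbn) * u_lightDir
    v_texCoord = a_texCoord;
    gl_Position = u_mvpMatrix * vec4(a_position, 1.0);
}

// --- Fragment shader: only the cheap final step ---
precision mediump float;

uniform sampler2D u_normalMap;

varying vec3 v_lightDirTangent;
varying vec2 v_texCoord;

void main() {
    // Unpack the tangent-space normal from the map and do the final dot product.
    vec3 n = normalize(texture2D(u_normalMap, v_texCoord).xyz * 2.0 - 1.0);
    vec3 l = normalize(v_lightDirTangent);
    float diffuse = max(dot(n, l), 0.0);
    gl_FragColor = vec4(vec3(diffuse), 1.0);
}
```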