Shaders are an essential part of Clairvoyance and are responsible for much of what is rendered to the screen. Supporting them meant moving from fixed-function rendering to programmable passes, which required several additions to the engine: a material class, a technique class, and a rendering pass class, along with their related management classes.
The graphics pipeline exposes two primary programmable stages to the programmer: the vertex shader and the fragment shader (other stages, such as geometry and tessellation shaders, exist as well). These shaders replace the fixed-function operations that previously ran on the graphics card. By overriding those operations with our own programmable passes, we can achieve a wide variety of custom graphical effects; dynamic lighting, for example, is commonly implemented with shaders.
The vertex shader, as the name suggests, deals with the transformation of vertices. At its core, this means a vertex's position, its normal, and any applicable UV coordinates. It can also be made responsible for things like blend weights and fog.
A vertex program must store projection, view, and model matrices, and accept position and normal vectors as input. Output variables then carry the final computed values into the fragment shader.
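A minimal set of declarations for such a vertex program might look like the following sketch. The uniform and variable names are assumptions chosen to match the shader body shown below; Clairvoyance's actual naming may differ:

```glsl
#version 330 core

// Per-vertex inputs supplied by the engine
in vec3 position;
in vec3 normal;
in vec2 uv0;

// Transformation matrices uploaded by the rendering pass
uniform mat4 projection_matrix;
uniform mat4 view_matrix;
uniform mat4 model_matrix;

// Outputs interpolated and passed on to the fragment shader
out vec3 Normal;
out vec2 UV0;
out vec4 vertexPos;
```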
    Normal = normalize(vec3(view_matrix * model_matrix * vec4(normal, 0.0)));
    UV0 = vec2(uv0);
    gl_Position = projection_matrix * view_matrix * model_matrix * vec4(position, 1.0);
    vertexPos = vec4(position, 1.0);
The fragment shader is responsible for the final image that is output to the screen. Here the programmer can account for dynamic lighting, material properties, and a plethora of custom effects, such as blurring. Multiple fragment shaders can be combined and reused to achieve a variety of effects.
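As a sketch of one such effect, a simple box-blur fragment shader could sample the previously rendered scene from a texture and average neighbouring texels. The texture and uniform names here are illustrative assumptions, not part of Clairvoyance:

```glsl
#version 330 core

in vec2 UV0;                 // interpolated texture coordinate
out vec4 fragColor;

uniform sampler2D sceneTex;  // assumed: the previously rendered scene
uniform vec2 texelSize;      // assumed: 1.0 / texture resolution

void main()
{
    // Average a 3x3 neighbourhood of texels around this fragment
    vec4 sum = vec4(0.0);
    for (int x = -1; x <= 1; ++x)
        for (int y = -1; y <= 1; ++y)
            sum += texture(sceneTex, UV0 + vec2(x, y) * texelSize);
    fragColor = sum / 9.0;
}
```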
To achieve basic lighting in our engine, we set up basic material and light structures in the fragment shader itself. Instances of these structures are then passed to a set of functions that compute attenuation, diffuse lighting, specular lighting, and so forth.
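The struct layouts below are a sketch inferred from the fields used in the lighting code (`Ke`, `Ka`, `Kd`, `Ks`, `shininess`, and the light's `position` and `color`); the exact definitions in Clairvoyance may differ:

```glsl
// Phong-style material: emissive, ambient, diffuse, and specular
// coefficients plus a specular exponent
struct Material {
    vec4  Ke;         // emissive
    vec4  Ka;         // ambient
    vec4  Kd;         // diffuse
    vec4  Ks;         // specular
    float shininess;  // specular exponent
};

// A single light source
struct Light {
    vec4 position;
    vec4 color;
};

Material gMaterial;
Light    gLight;
```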
Output variables from the vertex shader are then declared as input variables in the fragment shader. The rasterizer interpolates these values across each triangle, so every fragment receives its own interpolated normal and position. For example, because we transformed the normal of each vertex previously, we now know how each fragment will behave under our lighting conditions.
    gLight.position = vec4(0.0, 0.0, 0.0, 1.0);
    gLight.color    = vec4(1.0, 1.0, 1.0, 1.0);
    eyePosition     = vec3(0.0, 5.0, 20.0);

    gMaterial.Ke        = vec4(0.0,  0.0,  0.0,  1.0);
    gMaterial.Ka        = vec4(0.33, 0.22, 0.03, 1.0);
    gMaterial.Kd        = vec4(0.78, 0.57, 0.11, 1.0);
    gMaterial.Ks        = vec4(0.99, 0.91, 0.81, 1.0);
    gMaterial.shininess = 27.8;

    // Calculate emissive and ambient terms
    vec4 emissive = gMaterial.Ke;
    vec4 ambient  = gMaterial.Ka * globalAmbient;

    // Accumulate diffuse and specular contributions for each light
    vec3 diffuseLight;
    vec3 specularLight;
    vec3 diffuseSum  = vec3(0.0);
    vec3 specularSum = vec3(0.0);
    computeLighting(gLight, vertexPos.xyz, Normal, eyePosition,
                    gMaterial.shininess, diffuseLight, specularLight);
    diffuseSum  += diffuseLight;
    specularSum += specularLight;

    // Scale the accumulated light by the material coefficients
    vec3 diffuse  = gMaterial.Kd.xyz * diffuseSum;
    vec3 specular = gMaterial.Ks.xyz * specularSum;

    // "output" is a reserved word in GLSL, so the result is written
    // to fragColor; note the material-scaled sums are combined here
    fragColor.rgb = emissive.rgb + ambient.rgb + diffuse + specular;
    fragColor.a   = 1.0;
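The `computeLighting` function itself is not shown above. A plausible implementation, assuming a Blinn-Phong model with simple distance attenuation and matching the signature of the call above (the attenuation coefficients are illustrative assumptions), might look like this:

```glsl
// Sketch of a Blinn-Phong lighting function matching the call above.
// P is the surface position, N the normalized surface normal.
void computeLighting(Light light, vec3 P, vec3 N, vec3 eyePosition,
                     float shininess,
                     out vec3 diffuseResult, out vec3 specularResult)
{
    // Distance attenuation (coefficients chosen for illustration)
    vec3 toLight = light.position.xyz - P;
    float d = length(toLight);
    float attenuation = 1.0 / (1.0 + 0.05 * d + 0.01 * d * d);

    // Diffuse term: N . L, clamped to zero for back-facing surfaces
    vec3 L = normalize(toLight);
    float diffuse = max(dot(N, L), 0.0);

    // Specular term: half-vector between light and view directions,
    // applied only where the diffuse term is non-zero
    vec3 V = normalize(eyePosition - P);
    vec3 H = normalize(L + V);
    float specular = (diffuse > 0.0)
                   ? pow(max(dot(N, H), 0.0), shininess)
                   : 0.0;

    diffuseResult  = light.color.rgb * diffuse * attenuation;
    specularResult = light.color.rgb * specular * attenuation;
}
```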