Developer Blog 4: OpenGL and shaders

17th May, 2016 | by Scratch, Lead developer

Hello again and welcome to yet another developer blog by Mischie-... nope, wait, I just got word that Angelo is feeling better, so I'm going to give the keyboard to him and let him ramble on!

Today, I want to talk a little bit about graphics. Specifically, OpenGL. In Serious Sam Classic you could switch the graphics API between OpenGL and Direct3D. For Serious Sam Revolution, we have decided to drop support for Direct3D entirely, for a couple of reasons. One is that the two programmers who work on this project (am I talking in third person?) are not engine programmers, so maintaining graphics code for both Direct3D and OpenGL would cost way too much time for way too little gain. The other reason is that OpenGL is backwards (and forwards) compatible, meaning we can use 4.4 API calls (for example) in the current implementation without majorly redoing the entire graphics code. Getting D3D9 into a D3D8 application is a lot more complex, and potentially not even worth the trouble, since most (if not all) graphics cards nowadays support a decent enough version of OpenGL.

Okay, moving on. Enough about Direct3D. Blegh, Microsoft. ;) (Just kidding, I'm still a fan of their products.) What about OpenGL? What have we done to improve the graphics? So far we've mostly toyed around with shaders, and with implementing them into the game in a way that lets level designers use them however they wish.

We currently support vertex and fragment shaders for two things: models and post processing. Now, what I can show you here is limited, mainly because I am not a shader programmer~ uh.. I mean.. mathemagician. But despite me not being capable of writing amazing shaders, this should still get you excited about what you'll potentially be able to use in your levels!

Since a lot of regular people who probably don't know much about GLSL are reading this, I'm going to explain a little bit of how it works, and what the potential for your levels would be. Let's go over a fairly simple and basic post processing shader first. We have a Post Processing Effect entity in our world, which we have given the following fragment shader:

#version 330 core

in vec2 UV;
out vec4 outColor;
uniform sampler2D screen;

void main()
{
  // main color
  vec4 col = texture(screen, UV);

  // switch red and blue values around
  float red = col.x;
  col.x = col.z;
  col.z = red;

  // use that color
  outColor = col;
}

What the above shader does is take a pixel from the screen at the given UV coordinates, swap the pixel's red and blue values, and then write it back in its place. Fairly simple, right? In our PP Effect entity, we've set the range to about 4 meters, so we can step in and out of the range of the entity. Here's the result:
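If you want to reason about what the shader does without running it on the GPU, the per-pixel math is easy to mirror on the CPU. This is just an illustrative C sketch of the same red/blue swap - the struct and function names are mine, not engine code:

```c
/* A pixel the way the fragment shader sees it: normalized RGBA components. */
typedef struct { float r, g, b, a; } Pixel;

/* Same operation as the fragment shader above: swap red and blue. */
Pixel swap_red_blue(Pixel p)
{
    float red = p.r;
    p.r = p.b;
    p.b = red;
    return p;
}
```

Feed it a mostly-red pixel and you get a mostly-blue one back, which is exactly the tint shift you see in-game.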

Now, this shader is just about the most basic shader you could think of. Most shaders are a lot more complex and confusing, as you can see on the amazing site Shadertoy (warning: this site might freeze your browser while it loads). Here's an example of a more complex shader that I wrote - I use the word "complex" as a relative term here ;) - that uses the "distance" uniform to get the player's distance to the entity:

#version 330 core

in vec2 UV;
out vec4 outColor;
uniform sampler2D screen;
uniform vec2 resolution;
uniform float distance;

const float pi = 3.1415926;
const float pi2 = 1.5707963; // pi / 2

// same as the GLSL built-in mix(), written out for clarity
vec4 lerp(vec4 a, vec4 b, float x)
{
  return a + (b - a) * x;
}

void main()
{
  // the original pixel, plus a 9x9 box blur of its neighbourhood
  vec4 col = texture(screen, UV);
  vec4 sum = vec4(0);
  for(int x = -4; x <= 4; x++) {
    for(int y = -4; y <= 4; y++) {
      sum += texture(screen, vec2(
        UV.x + x * (1.0 / resolution.x),
        UV.y + y * (1.0 / resolution.y))) / 81.0; // 9 * 9 = 81 samples
    }
  }

  // blend factor: strongest at the left and right edges of the screen,
  // and it grows as the player gets closer to the entity
  float delta = abs(sin(UV.x * pi + pi2)) / (distance / 21.4791);
  outColor = lerp(col, vec4(sum.xyz, 1), delta);
}

The result of this is as seen below:

Just for the heck of it, I decided to take a random shader (editor note: this shader has since been removed from Shadertoy, so the link was removed in the recent website update) from Shadertoy and use it as a Post Processing shader, just to show what you can do with this:

Pretty, isn't it? I recommend you stroll around Shadertoy if you're interested in this kind of thing. That website is full of great demos, and it even has a WebGL playground where you can write your own shaders right in your browser.

Oh, and you like gif animations, right? I'm sure you do. Here, have one.

And of course it also works on model holders, though I have yet to think of a decent example for that, so more on that later, I guess!

On a more technical note, I should list the currently available uniform variables.

  • (PP only) sampler2D screen - the screen texture
  • (PP only) vec3 velocityrel - relative player velocity
  • (PP only) vec3 velocityabs - absolute player velocity
  • (PP only) float distance - distance between player and PP entity
  • (PP only) vec2 resolution - screen resolution
  • float time - time in seconds since the shader started
  • float param0 through param4 - configurable parameters set in the entity

Well, that's that. I believe this will open up a lot of possibilities for new and interesting gameplay, considering we've also been working on scriptabi~ uhh, I mean.. that's for a later developer blog. ;)

What do you think? Does this inspire you? What would you like to see modders do with this? Post it in the comments below!

Until next week - Angelo the cat out.
