Fragment Shader always uses 1.0 for alpha channel


I have a 2d texture that I loaded with

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, gs.width(), gs.height(), 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, gs.buffer());

where gs is an object with methods that return the appropriate types.

In the fragment shader I sample from the texture and attempt to use that value as the alpha channel of the output color. If I use the sampled value for the other channels of the output it produces what I would expect. Any value that I write to the alpha channel appears to be ignored, because it always draws Color at full opacity.

I am clearing the screen using:

glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

Can anyone suggest what I might be doing wrong? I am getting an OpenGL 4.0 context with 8 red, 8 green, 8 blue, and 8 alpha bits.
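(To rule out a framebuffer without alpha storage, one thing worth checking — a sketch on my part, assuming a current GL 3.0+ context — is to query the back buffer's actual alpha depth rather than trusting the pixel format you requested:

GLint alphaBits = 0;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
    GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE, &alphaBits);
printf("back buffer alpha bits: %d\n", alphaBits); /* expect 8 */

If this reports 0, the alpha you write in the shader has nowhere to go.)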

Vertex Shader:

#version 150

in vec2 position;
in vec3 color;
in vec2 texcoord;

out vec3 Color;
out vec2 Texcoord;

void main()
{
    Texcoord = texcoord;
    Color = color;
    gl_Position = vec4(position, 0.0, 1.0);
}

Fragment Shader:

#version 150

in vec3 Color;
in vec2 Texcoord;

out vec4 outColor;

uniform sampler2D tex;

void main()
{
    float t = texture(tex, Texcoord);
    outColor = vec4(Color, t);
}

Frankly, I am surprised this compiles at all. texture (...) returns a vec4 (unless you are using a shadow/integer sampler, which you are not). You really ought to be swizzling that result down to a single component if you intend to store it in a float.

I am guessing you want the alpha component of your texture, but who honestly knows -- try this instead:

float t = texture (tex, Texcoord).a; // Get the alpha channel of your texture

A half-way decent GLSL compiler would warn or error on what you are doing right now. I suspect yours does too, but you are not checking the shader info log when you compile your shader.
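Checking the log looks roughly like this (a sketch; `shader` is assumed to be the handle you passed to glCompileShader, and error handling is minimal):

GLint status = GL_FALSE;
glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
if (status != GL_TRUE) {
    char    log[4096];
    GLsizei len = 0;
    glGetShaderInfoLog(shader, sizeof(log), &len, log);
    fprintf(stderr, "Shader compile failed:\n%.*s\n", (int)len, log);
}

The info log is also worth printing when compilation succeeds, since warnings (such as the implicit vec4-to-float conversion here) land there as well.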


The original answer did not even begin to address the madness you are doing with your GL_DEPTH_COMPONENT internal format texture. I completely missed that because the code did not fit on screen.

Why are you using gs.rgba() to pass data to a texture whose internal format and pixel transfer format are exactly one component? Also, if you intend to use a depth texture in your shader, then the reason it always returns a = 1.0 is actually very simple:

Beginning with GLSL 1.30, when sampled using texture (...), depth textures are automatically set up to return the following vec4:

vec4 (r, r, r, 1.0).

The RGB components are replaced with the value of R (the floating-point depth), and A is replaced with a constant value of 1.0.
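So the constant 1.0 in A is a property of depth texture sampling, not something your shader controls. If the data in gs is really single-channel image data rather than depth, one possible fix — an assumption on my part, since I do not know what gs holds — is to upload it with a one-component color format and read it from the red channel:

// Upload as a single-channel color texture instead of a depth texture
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, gs.width(), gs.height(), 0,
             GL_RED, GL_UNSIGNED_BYTE, gs.buffer());

and in the fragment shader:

float t = texture(tex, Texcoord).r; // single-channel data lives in .r
outColor = vec4(Color, t);

With GL_R8 the sampled green, blue and alpha channels are constants (0, 0 and 1), so .r is the only channel that carries your data.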