OpenGL depth test against cleared depth not as I expected (moderngl)


I am rendering a quad to a blank screen whose depth buffer has first been cleared to 0.25. The depth of each vertex is set equal to its y coordinate, and the quad is a rectangle ranging from (-.5, -.5) to (.5, .5), so I would expect the depth test (set to '>') to obscure the bottom quarter of the quad. Instead, I see the entire quad on the screen. Based on my vertex shader code, I would expect the vertex depths to be -.5 and .5, so the bottom quarter, where the depth is < .25, is what I would expect to fail the depth test, but this is not what I see. And if I switch the depth test to '<', instead of seeing only the bottom quarter, I see none of the quad. My program is below:

import moderngl as gl
import numpy as np
import pygame as pg

pg.init()
pg.display.set_mode((800, 600), pg.DOUBLEBUF | pg.OPENGL)
context = gl.create_context(require=430)


bright_program = context.program(vertex_shader='''
#version 330
in vec2 position;
in vec3 in_color;
out vec3 out_color;

void main() {
    gl_Position = vec4(position, position.y, 1.0);
    out_color = vec3(in_color);
}
''', fragment_shader='''
#version 330
in vec3 out_color;
layout(location=0) out vec4 fragColor;

void main() {
    fragColor = vec4(out_color, 1.0);
}
''')
buffer = context.buffer(np.array([-.5, -.5, 1, 1, 1,
    .5, -.5, 1, 0, 1,
    .5, .5, 1, 1, 0,
    -.5, -.5, 1, 1, 1,
    .5, .5, 1, 1, 0,
    -.5, .5, 0, 1, 1], dtype='float32'))
vao = context.vertex_array(bright_program, ((buffer, '2f4 3f4', 'position', 'in_color'),))
context.enable(gl.DEPTH_TEST)
context.depth_func = '>'

clock = pg.time.Clock()
def render():
    r, g, b = 0, 0, 0
    context.clear(r, g, b, 1, 0.25)
    vao.render()

running = True
while running:
    clock.tick(24)
    for event in pg.event.get():
        if event.type == pg.QUIT:
            running = False
    render()
    pg.display.flip()

pg.quit()

Then, additionally, if I change the depth in the .clear line to 0.5, the two depth functions show the top and bottom halves of the quad, instead of all or nothing as I would expect. It seems like the depth of the entire screen ranges from 0 to 1, rather than the depths of the top and bottom edges of the quad being +-.5. So why aren't the depths of the quad's vertices the +-.5 values I expect when they reach the depth test? Does it have something to do with normalized device coordinates, even though I'm setting the w component of gl_Position to 1?

1 Answer

If you do the math (basic interval mapping), you'll see that the whole quad is indeed visible, because all of its fragments' depths are greater than or equal to 0.25.

The default OpenGL depth range is [0, 1], and you are clearing the depth buffer to 0.25. The NDC z range that gets mapped linearly onto this depth range is [-1, +1], i.e. depth = (z_ndc + 1) / 2.
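In other words, window-space depth is a linear remap of NDC z. A quick sketch of that mapping in plain Python (assuming the default glDepthRange of near = 0, far = 1; the helper name is mine):

```python
def ndc_z_to_depth(z_ndc, near=0.0, far=1.0):
    """Map an NDC z in [-1, +1] linearly onto the [near, far] depth range."""
    return near + (z_ndc + 1.0) * 0.5 * (far - near)

print(ndc_z_to_depth(-0.5))  # bottom edge of the quad -> 0.25
print(ndc_z_to_depth(0.5))   # top edge of the quad    -> 0.75
print(ndc_z_to_depth(-1.0))  # NDC near plane          -> 0.0
print(ndc_z_to_depth(1.0))   # NDC far plane           -> 1.0
```

So the quad's z values of -0.5 and +0.5 become depths of 0.25 and 0.75, not -0.5 and +0.5.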

The following "ASCII art" code block should show the two linear intervals (depth range and your quad's z value):

depth range:   0 x . . 1
                 ^
                 |
quad z value: -1 y . z 1
  • x marks the depth value 0.25 (each tick on the upper interval is a step of 0.25).
  • y marks your quad's z value of -0.5.
  • z marks your quad's z value of +0.5.
  • each tick on the lower "quad z value" interval is a step of 0.5.

The ASCII "arrow" from y to x indicates that your quad's z value of -0.5 gets mapped to the depth value 0.25. Consequently, every quad z value greater than -0.5 gets mapped to a depth value greater than 0.25, so with the '>' depth function every fragment passes against the cleared 0.25, and with '<' none do.
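To put concrete numbers on both clear depths you tried, here is a plain-Python check that mirrors what the fixed-function depth test does per fragment (the function names are mine, for illustration):

```python
def depth_of(y):
    # gl_Position.z == y and w == 1, so NDC z == y; map [-1, +1] -> [0, 1]
    return (y + 1.0) * 0.5

def passes(y, clear_depth, func):
    # Emulate the depth test of one fragment against the cleared buffer
    d = depth_of(y)
    return d > clear_depth if func == '>' else d < clear_depth

# clear depth 0.25: every y in [-0.5, 0.5] maps to a depth >= 0.25,
# so '>' shows (essentially) the whole quad and '<' shows none of it
print(passes(0.5, 0.25, '>'))   # True
print(passes(-0.4, 0.25, '>'))  # True
print(passes(0.0, 0.25, '<'))   # False

# clear depth 0.5: y == 0 maps exactly to 0.5, splitting the quad in half
print(passes(0.25, 0.5, '>'))   # True  (top half visible with '>')
print(passes(-0.25, 0.5, '<'))  # True  (bottom half visible with '<')
```

This reproduces exactly what you observed: all-or-nothing at clear depth 0.25, and a half/half split at clear depth 0.5.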