How to set up a CAEAGLLayer subclass with an OpenGL context: "Current draw framebuffer is invalid"


I'm trying to set up a CAEAGLLayer subclass with a GL context. That is, instead of creating a UIView subclass that returns a CAEAGLLayer and binding a GL context to that layer from within the UIView subclass, I'm subclassing the layer directly and trying to set up the context in the layer's init, like so:

- (id)init
{
    self = [super init];
    if (self) {
        self.opaque = YES;

        _glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        NSAssert([EAGLContext setCurrentContext:_glContext], @"");

        glGenRenderbuffers(1, &_colorRenderBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderBuffer);
        [_glContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:self];

        glGenFramebuffers(1, &_framebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorRenderBuffer);

        /// . . .

Up to that point everything seems fine. However, I then try to create a shader program with a "pass-through" vertex/fragment shader pair. Linking the program reports no errors, but validation fails with: "Current draw framebuffer is invalid."
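
A quick way to confirm it's the FBO rather than the shaders is to query the framebuffer status right after the attachment, along these lines (just a diagnostic sketch, not part of the setup above):

GLenum fboStatus = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (fboStatus != GL_FRAMEBUFFER_COMPLETE) {
    // If the framebuffer is incomplete here, program validation will also
    // fail with an invalid draw framebuffer.
    NSLog(@"Framebuffer incomplete: 0x%x", fboStatus);
}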

The code that links and validates the shader program (after attaching the shaders) looks like so, just in case:

- (BOOL)linkAndValidateProgram
{
    GLint status;
    glLinkProgram(_shaderProgram);

#ifdef DEBUG
    GLint infoLogLength;
    GLchar *infoLog = NULL;
    glGetProgramiv(_shaderProgram, GL_INFO_LOG_LENGTH, &infoLogLength);
    if (infoLogLength > 0) {
        infoLog = (GLchar *)malloc(infoLogLength);
        glGetProgramInfoLog(_shaderProgram, infoLogLength, &infoLogLength, infoLog);
        NSLog(@"Program link log:\n%s", infoLog);
        free(infoLog);
    }
#endif

    glGetProgramiv(_shaderProgram, GL_LINK_STATUS, &status);
    if (!status) {
        return NO;
    }

    glValidateProgram(_shaderProgram);

#ifdef DEBUG
    glGetProgramiv(_shaderProgram, GL_INFO_LOG_LENGTH, &infoLogLength);
    if (infoLogLength > 0) {
        infoLog = (GLchar *)malloc(infoLogLength);
        glGetProgramInfoLog(_shaderProgram, infoLogLength, &infoLogLength, infoLog);
        NSLog(@"Program validation log:\n%s", infoLog);
        free(infoLog);
    }
#endif

    glGetProgramiv(_shaderProgram, GL_VALIDATE_STATUS, &status);
    if (!status) {
        return NO;
    }

    glUseProgram(_shaderProgram);
    return YES;
}

I'm wondering if there is some extra setup at some point in the lifecycle of CAEAGLLayer that I might be unaware of, and might be skipping, by trying to set up GL in init?

2 Answers

Accepted answer by SaldaVonSchwartz:

The problem was that the layer has no dimensions at that point in init, which in turn means that setting the renderbuffer storage from the layer allocates a zero-sized buffer.

UPDATE: My current best thinking is that, instead of imposing a size in init (which worked fine for testing purposes but is kind of hacky), I should just re-set the buffer storage whenever the layer changes size. So I'm overriding -setBounds: like so:

- (void)setBounds:(CGRect)bounds
{
    [super setBounds:bounds];
    [_glContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:self];
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &someVariableToHoldWidthIfYouNeedIt);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &someVariableToHoldHeightIfYouNeedIt);
}
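
Note that -renderbufferStorage:fromDrawable: applies to whichever renderbuffer is bound to GL_RENDERBUFFER at the time. If anything could have changed the binding since init, a slightly more defensive version of the override (a sketch assuming the _glContext and _colorRenderBuffer ivars from the question, plus hypothetical _backingWidth/_backingHeight GLint ivars) would make the context current and rebind first:

- (void)setBounds:(CGRect)bounds
{
    [super setBounds:bounds];

    // Re-allocate the color renderbuffer's storage for the new layer size.
    [EAGLContext setCurrentContext:_glContext];
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderBuffer);
    [_glContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:self];

    // Cache the new backing dimensions for glViewport and similar calls.
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &_backingWidth);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &_backingHeight);
}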

Answer by Dimi:

As far as I know, you have to override the layerClass method in the view, like this:

+ (Class)layerClass
{
    return [MYCEAGLLayer class];
}

You also have to set the drawableProperties on the MYCEAGLLayer.
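
For example, something along these lines in the layer's init (a sketch; the exact values depend on what you need):

self.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking : @NO,
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8
};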