I've been trying to apply the filters I use from the android-gpuimage library in a MediaCodec surface context. So far I've succeeded with the filters that only require one extra texture map. However, when I try to apply a filter that needs at least two, the result is either a blue-colored or a rainbow-colored mess.
This question deals with the filter that uses both a texture lookup map and a vignette map.
The vertex shader I used is as follows:
uniform mat4 uMVPMatrix;
uniform mat4 textureTransform;
attribute vec4 vPosition;
attribute vec4 vTexCoordinate;
varying vec2 v_TexCoordinate;
void main() {
    gl_Position = uMVPMatrix * vPosition;
    v_TexCoordinate = (textureTransform * vTexCoordinate).xy;
}
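For context, the two matrices come from the usual SurfaceTexture path on the decoder side. A minimal sketch of how they can be uploaded each frame, assuming surfaceTexture and program are fields of the renderer and the MVP matrix is left as identity (names are illustrative, not my exact code):

private final float[] mvpMatrix = new float[16];
private final float[] texMatrix = new float[16];

private void updateMatrices() {
    // Identity MVP; the real renderer may apply rotation/scaling here.
    Matrix.setIdentityM(mvpMatrix, 0);
    // Pull the latest decoded frame and its transform from the SurfaceTexture.
    surfaceTexture.updateTexImage();
    surfaceTexture.getTransformMatrix(texMatrix);
    GLES20.glUniformMatrix4fv(GLES20.glGetUniformLocation(program, "uMVPMatrix"), 1, false, mvpMatrix, 0);
    GLES20.glUniformMatrix4fv(GLES20.glGetUniformLocation(program, "textureTransform"), 1, false, texMatrix, 0);
}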
The fragment shader I used is as follows:
#extension GL_OES_EGL_image_external : require
precision lowp float;
varying highp vec2 v_TexCoordinate;
uniform samplerExternalOES u_Texture; //MediaCodec decoder provided data
uniform sampler2D inputImageTexture2; //Amaro filter map
uniform sampler2D inputImageTexture3; //Common vignette map
void main()
{
    // Decoded frame from the external OES texture provided by MediaCodec
    vec3 texel = texture2D(u_Texture, v_TexCoordinate).rgb;

    // Per-channel tone curve: each channel samples its own row of the lookup map
    vec2 red = vec2(texel.r, 0.16666);
    vec2 green = vec2(texel.g, 0.5);
    vec2 blue = vec2(texel.b, 0.83333);
    texel.rgb = vec3(
        texture2D(inputImageTexture2, red).r,
        texture2D(inputImageTexture2, green).g,
        texture2D(inputImageTexture2, blue).b);

    //After further research I found the problem is somewhere below
    // Vignette lookup: x = squared distance from the centre, y = channel value
    vec2 tc = (2.0 * v_TexCoordinate) - 1.0;
    float d = dot(tc, tc);
    vec2 lookup = vec2(d, texel.r);
    texel.r = texture2D(inputImageTexture3, lookup).r;
    lookup.y = texel.g;
    texel.g = texture2D(inputImageTexture3, lookup).g;
    lookup.y = texel.b;
    texel.b = texture2D(inputImageTexture3, lookup).b;
    //The problem is somewhere above

    gl_FragColor = vec4(texel, 1.0);
}
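For reference, the shader expects each sampler to sit on its own texture unit. A minimal sketch of that binding step before the draw call, assuming decoderTextureId, amaroMapId, vignetteMapId and program are renderer fields (the names are illustrative):

private void bindFilterTextures() {
    // Unit 0: the external OES texture fed by the MediaCodec decoder
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, decoderTextureId);
    GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "u_Texture"), 0);

    // Unit 1: the Amaro lookup map
    GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, amaroMapId);
    GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "inputImageTexture2"), 1);

    // Unit 2: the vignette map
    GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vignetteMapId);
    GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "inputImageTexture3"), 2);
}

If any unit/sampler pairing is off here, the channels come out scrambled in much the same way as in the screenshot.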
The end result of that program looked like this:

Is this the result of a bad vignette map, or is it something to do with the vignette application part of the fragment shader?
EDIT:
The texture used for inputImageTexture2:
The texture used for inputImageTexture3:


Turns out the way I load my textures matters.
My current code for loading textures generates an IntBuffer (and a fresh texture ID) for each texture.
The previous incarnation used the textures array, an array I use to load the data from MediaCodec and the watermark. For some reason, if I reuse that array instead of generating an IntBuffer for each texture, the textures used in the fragment shader get jumbled.
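A minimal sketch of that per-texture approach, assuming the filter maps are decoded into Bitmaps elsewhere (method and variable names are illustrative, not my exact code):

import java.nio.IntBuffer;
import android.graphics.Bitmap;
import android.opengl.GLES20;
import android.opengl.GLUtils;

public final class FilterTextureLoader {
    // Sketch: give each filter map its own texture ID instead of reusing the
    // shared "textures" array that also holds the MediaCodec/watermark textures.
    public static int loadFilterTexture(Bitmap bitmap) {
        IntBuffer textureHandle = IntBuffer.allocate(1);
        GLES20.glGenTextures(1, textureHandle);
        int textureId = textureHandle.get(0);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
        // Lookup maps should not wrap or bleed across edges, so clamp and use linear filtering.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
        return textureId;
    }
}

The returned IDs are then bound to their own texture units before drawing, as in the binding sketch above.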