At the moment I'm building a simple 3D renderer. The renderer reads object information from a Wavefront OBJ file. What I get from the file are vertex coordinates, normal coordinates and UV coordinates. With the OBJ file comes an MTL file which holds information about materials. What I'm doing right now is generating buffers with glGenBuffers:
- Vertex buffer
- Normal buffer
- UV coordinate buffer
- Diffuse color buffer
- Specular color buffer
- Ambient color buffer
- Transparency buffer
These buffers/arrays are ordered so that each vertex lines up with its correct normal, UV coordinates, material colors and so on.
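To make that concrete, this is roughly how the upload step looks in my code (simplified sketch; the function name and the GL header are just placeholders, not my exact code):

```
#include <vector>
#include <GL/glew.h>   // or whichever GL loader/header the project uses

// Every attribute array holds one entry set per vertex, so all of them
// stay in the same order as the vertex array parsed from the OBJ/MTL files.
GLuint uploadFloats(const std::vector<float>& data)
{
    GLuint buffer = 0;
    glGenBuffers(1, &buffer);
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glBufferData(GL_ARRAY_BUFFER, data.size() * sizeof(float),
                 data.data(), GL_STATIC_DRAW);
    return buffer;
}

// Called once per attribute after parsing:
// uploadFloats(vertices); uploadFloats(normals); uploadFloats(uvs);
// uploadFloats(diffuseColors); ...and so on for specular, ambient, transparency.
```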
From what I understand, this is how glMultiDrawArrays wants the data to be arranged, and it works: my scene is rendered and the objects have the correct material colors, but no textures yet. I should add that I'm using a vertex shader and a fragment shader which I found in a tutorial, and they seem to work. The fragment shader does the color calculation.
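The draw call itself is essentially this (names are placeholders; the attribute pointers are assumed to be set up already):

```
#include <vector>
#include <GL/glew.h>

// One entry per object: where its vertices start in the shared buffers
// and how many vertices it has (collected while loading the OBJ file).
void drawAllObjects(const std::vector<GLint>& firsts,
                    const std::vector<GLsizei>& counts)
{
    // Everything is drawn from the shared buffers in a single call.
    glMultiDrawArrays(GL_TRIANGLES, firsts.data(), counts.data(),
                      static_cast<GLsizei>(firsts.size()));
}
```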
Now I want to add textures to my objects (the vertices already have the right UV coordinates, as mentioned above). My first thought was to first render all objects that have no texture, to keep the speed advantage of glMultiDrawArrays, and then render each object that has a texture on its own.
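Something like this is what I had in mind (just a sketch; the TexturedObject struct and the names are only placeholders):

```
#include <vector>
#include <GL/glew.h>

struct TexturedObject {        // placeholder type, just for this sketch
    GLuint  textureId;         // texture to bind for this object
    GLint   first;             // start index in the shared buffers
    GLsizei count;             // number of vertices
};

void drawScene(const std::vector<GLint>& untexturedFirsts,
               const std::vector<GLsizei>& untexturedCounts,
               const std::vector<TexturedObject>& texturedObjects)
{
    // Pass 1: everything without a texture in one call.
    glMultiDrawArrays(GL_TRIANGLES, untexturedFirsts.data(),
                      untexturedCounts.data(),
                      static_cast<GLsizei>(untexturedFirsts.size()));

    // Pass 2: textured objects one by one, binding each texture first.
    for (const TexturedObject& obj : texturedObjects) {
        glBindTexture(GL_TEXTURE_2D, obj.textureId);
        glDrawArrays(GL_TRIANGLES, obj.first, obj.count);
    }
}
```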
But then I saw that the fragment shader samples the textures for colors using the UV coordinates. My thought now is to somehow sample the textures with the UV coordinates myself and write the colors into my color buffer (overwriting the vertex colors with the texture color for the vertices that have UV coordinates), so that I could still render the whole scene in one go.
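Just so the idea is clear, this is roughly what I mean, done on the CPU before the color buffer is uploaded (the Image struct stands in for whatever image loader I end up using, and I'm assuming UVs in [0, 1] with simple nearest-neighbour lookup, no filtering):

```
#include <algorithm>
#include <cstddef>
#include <vector>

struct Image {                          // placeholder for the decoded texture
    int width = 0, height = 0;
    std::vector<unsigned char> pixels;  // width * height * 3 bytes, RGB
};

// Overwrite each vertex's diffuse color with the texel at its UV coordinate.
void bakeTextureIntoColors(const Image& image,
                           const std::vector<float>& uvs,      // 2 floats per vertex
                           std::vector<float>& diffuseColors)  // 3 floats per vertex
{
    for (std::size_t v = 0; v < uvs.size() / 2; ++v) {
        int x = std::min(int(uvs[2 * v]     * image.width),  image.width  - 1);
        int y = std::min(int(uvs[2 * v + 1] * image.height), image.height - 1);
        std::size_t idx = (std::size_t(y) * image.width + x) * 3;
        diffuseColors[3 * v + 0] = image.pixels[idx + 0] / 255.0f;
        diffuseColors[3 * v + 1] = image.pixels[idx + 1] / 255.0f;
        diffuseColors[3 * v + 2] = image.pixels[idx + 2] / 255.0f;
    }
}
```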
I'm very new to this topic, so my questions are:
Can I somehow use the texture sampler outside the fragment shader to get the color information?
AND
My second question is whether this is completely the wrong way to do it in the first place, and whether there is a standard way I don't know about yet?
I would appreciate it if somebody could point me in the right direction. Thank you.