I'm working on a C++ project using DirectX 11 with HLSL shaders.
I have a texture which is mapped onto some geometry. Each vertex of the geometry has a position and a texture coordinate.
In the pixel shader, I can easily obtain the texture coordinate for exactly this one pixel. But how can I sample the color of the neighboring pixels?
For example, for the pixel at position 0.2 / 0 I get the texture coordinate 0.5 / 0, which is blue. But how do I get the texture coordinate for, let's say, position 0.8 / 0?
Edit:
What I'm actually implementing is a volume renderer using raycasting. The volume to be rendered is a set of 2D slices which are parallel and aligned, but not necessarily equidistant. For the volume I use DirectX's Texture3D class in order to easily get interpolation in the z direction.
Now I cast rays through the volume and sample the 3D texture value at equidistant steps along each ray. This is where my problem comes in: I cannot simply sample the Texture3D at my current ray position, because the slices are not necessarily equidistant. So I have to somehow look up the texture coordinate of that position in 3D space and then sample the texture using this texture coordinate.
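The equidistant stepping along the ray can be sketched on the CPU side like this (a minimal sketch; Float3, SamplePositions, and the parameter names are hypothetical and not part of my actual code):

```cpp
#include <vector>

struct Float3 { float x, y, z; };

// Generate the equidistant sample positions along a ray.
// rayStart: entry point of the ray into the volume.
// rayDir:   normalized ray direction.
// stepSize: distance between two consecutive samples.
std::vector<Float3> SamplePositions(Float3 rayStart, Float3 rayDir,
                                    float stepSize, int numSteps)
{
    std::vector<Float3> positions;
    positions.reserve(numSteps);
    for (int i = 0; i < numSteps; ++i) {
        float t = stepSize * static_cast<float>(i);
        positions.push_back({ rayStart.x + rayDir.x * t,
                              rayStart.y + rayDir.y * t,
                              rayStart.z + rayDir.z * t });
    }
    return positions;
}
```

At each of these positions I would then have to sample the volume, which is exactly where the non-equidistant slices get in the way.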
I already have an idea how to implement this: an additional Texture3D of the same size, in which the color of the texel at position xyz encodes the texture coordinate for position xyz.
This would solve my problem, but it may be overkill; there might be a simpler way to accomplish the same thing.
Edit 2:
Here is another illustration of the sampling problem I am trying to fix.
The root of the problem is that my Texture3D is distorted in the z direction.
From within one single pixel shader instance, I want to obtain the texture coordinate for any given position xyz in the volume, not only for the current fragment being rendered.
Edit 3:
Thanks for all the good comments and suggestions.
The distances between the slices along the z-axis can be completely random, so they cannot be described by a mathematical function. So what I basically have is a very simple struct, e.g.:
struct Vertex
{
    float4 Position; // position in space
    float4 TexCoord; // position in dataset
};
I pass those objects in a vertex buffer to my vertex shader. There, the values are simply passed through to the pixel shader.
My sampler's filter is set to D3D11_FILTER_MIN_MAG_MIP_LINEAR, so I get a nice interpolation for my data and the respective texture coordinates.
The signature of my pixel shader looks like this:
float4 PShader( float4 position : SV_POSITION
              , float4 texCoord : TEXCOORD
              ) : SV_TARGET
{
    ...
}
So for each fragment to be rendered on the screen, I get the position in space (position) and the corresponding position (texCoord) in the (interpolated) dataset. So far so good.
Now, from this PShader instance, I want to access not only the texCoord at position, but also the texCoords of other positions.
I want to do raycasting, so for each screen-space fragment, I want to cast a ray and sample the volume dataset at discrete steps.
The black plane symbolizes the screen. The other planes are my dataset where the slices are aligned and parallel, but not equidistant. The green line is the ray that I cast from the screen to the dataset. The red spheres are the locations where I want to sample the dataset.
DirectX knows how to interpolate this correctly, since it does so for every screen-space fragment. I thought I could simply access this interpolation function and query the interpolated texCoord for any position xyz. But it seems DirectX has no mechanism to do this.
So the only solution really might be to use a 1D texture for the z-lookup, interpolate between its values manually in the shader, and then use that information to look up the pixel value at this position.
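The manual interpolation step could look roughly like this; here is a CPU-side sketch of the logic I would port to the shader (ZToTexCoord and sliceZ are hypothetical names; sliceZ stands in for the values I would store in the 1D lookup texture):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Map a physical z position to a normalized Texture3D w coordinate,
// given the sorted (but possibly non-equidistant) z positions of the slices.
float ZToTexCoord(const std::vector<float>& sliceZ, float z)
{
    const std::size_t n = sliceZ.size();
    if (z <= sliceZ.front()) return 0.0f; // clamp below the first slice
    if (z >= sliceZ.back())  return 1.0f; // clamp above the last slice

    // Find the first slice at or beyond z.
    auto it = std::lower_bound(sliceZ.begin(), sliceZ.end(), z);
    std::size_t hi = static_cast<std::size_t>(it - sliceZ.begin());
    std::size_t lo = hi - 1;

    // Linear interpolation between the two neighboring slices,
    // then normalize the fractional slice index to [0, 1].
    float t = (z - sliceZ[lo]) / (sliceZ[hi] - sliceZ[lo]);
    return (static_cast<float>(lo) + t) / static_cast<float>(n - 1);
}
```

For example, with three slices at z = 0, 1, and 3, a ray position at z = 2 lies halfway between the second and third slice and so maps to w = 0.75, which is the coordinate I would then pass to the Texture3D sample.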