Shader saturate() creating gradient when it shouldn't


I'm writing a shader for Unity and I don't understand what is happening here. I have a sphere with a material attached, and the material has a shader attached. The shader is very simple: it generates some simplex noise, then uses that as the color for the sphere.

Shader code:

Shader "SimplexTest"
{
    Properties
    {
      _SimplexScale("Simplex Scale", Vector) = (4.0,4.0,4.0,1.0)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" "Queue"="Geometry" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            #include "SimplexNoise3D.hlsl"

            float3 _SimplexScale;

            struct appdata
            {
                float4 vertex : POSITION;
            };

            struct v2f
            {
                float4 position : SV_POSITION;
                float4 color : COLOR0;
            };

            v2f vert (appdata v)
            {
                v2f o;
                float4 worldPosition = mul(unity_ObjectToWorld, v.vertex);
                o.position = UnityObjectToClipPos(v.vertex);

                float small = snoise(_SimplexScale * worldPosition);
                // o.color = small * 10; // Non-Saturated (A) Version
                o.color = saturate(small * 10); // Saturated (B) Version

                return o;
            }


            fixed4 frag (v2f i) : SV_Target
            {
                return i.color;
            }
            ENDCG
        }
    }
}

The simplex noise function is from Keijiro's GitHub.

Non-Saturated (A) Version:
[screenshot: Version A]

Saturated (B) Version:
[screenshot: Version B]

The non-saturated version (Version A) is as expected: by multiplying the simplex noise by a factor of 10, the result is a series of white and black splotches. The saturated version (Version B) should theoretically just clamp all the white splotches at 1 and the black ones at 0. But it seems to be creating a whole new gradient and I don't know why.

I assume I'm missing some key assumption, but it isn't clear to me what that would be, since the math seems correct.

Best Answer

In the grey areas, the fragment shader is shading pixels that lie between a vertex whose color was saturated to 0 and one whose color was saturated to 1. Vertex outputs such as COLOR0 are interpolated across the triangle by the rasterizer, so those fragments receive an i.color somewhere between 0 and 1, which renders as grey.
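
As a rough sketch with made-up noise values (the numbers here are purely illustrative, not taken from your mesh), this is effectively what happens to COLOR0 in the saturated (B) version:

// Two vertices of one triangle, with hypothetical snoise results of -5.0 and 0.3:
float4 colorAtVertex0 = saturate(-5.0 * 10); // = 0.0 (black)
float4 colorAtVertex1 = saturate( 0.3 * 10); // = 1.0 (white)

// The rasterizer interpolates COLOR0 across the triangle, so a fragment
// halfway along that edge receives:
float4 colorAtFragment = lerp(colorAtVertex0, colorAtVertex1, 0.5); // = 0.5 (mid-grey)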


One solution is to first change your mesh to have separate vertices for each triangle, so that all of a triangle's vertices can share the same normal.

Then, once all of the vertices of each triangle share the same normal, you can base the color on the NORMAL input in the vertex shader instead of a transformed POSITION value. Because every vertex of a triangle then feeds the same value into the noise, there is nothing to interpolate between and no gradient appears.

Something along the lines of:

struct appdata
{
    float4 vertex : POSITION;
    float3 normal : NORMAL; // float3 matches what UnityObjectToWorldNormal expects
};

...

v2f vert (appdata v)
{
    v2f o;
    float3 worldNormal = UnityObjectToWorldNormal(v.normal);
    o.position = UnityObjectToClipPos(v.vertex);

    float small = snoise(_SimplexScale * worldNormal);
    o.color = saturate(small * 10);

    return o;
}

Then, it should work as expected.


Another approach, if you want to keep the slight interpolation you can see in Version A, is to let the interpolation happen as before and move the saturation into the fragment shader:

v2f vert (appdata v)
{
    v2f o;
    float4 worldPosition = mul(unity_ObjectToWorld, v.vertex);
    o.position = UnityObjectToClipPos(v.vertex);

    float small = snoise(_SimplexScale * worldPosition);
    o.color = small * 10;

    return o;
}


fixed4 frag (v2f i) : SV_Target
{
    return saturate(i.color);
}

The reason the problem is barely visible in Version A is that the interpolation runs between values far outside the 0-1 range: something at or below zero on one side and a large positive number on the other. Almost every interpolated value still falls below 0 or above 1 and is clamped to pure black or pure white when written out, so only a very thin band of fragments ends up grey and the splotches keep their hard edges.
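
To make the same point with the hypothetical numbers from the earlier sketch, but without saturate() in the vertex shader:

// Version A: same made-up vertices, no saturate in the vertex shader.
float4 colorAtVertex0 = -5.0 * 10; // = -50.0
float4 colorAtVertex1 =  0.3 * 10; // =   3.0

// Halfway along the edge the interpolated value is still deep in the negatives:
float4 colorAtFragment = lerp(colorAtVertex0, colorAtVertex1, 0.5); // = -23.5, displayed as black
// Only fragments very close to the positive vertex fall inside (0, 1),
// so the visible grey band is razor thin and the splotches keep hard edges.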