The texture builtin tests work by calling a texture builtin to sample a texture on the GPU, sampling the same texture with a software sampler, and expecting the two results to match within some tolerance.
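A rough sketch of that comparison step (not the actual CTS code): given one RGBA result from the texture builtin on the GPU and one from the software sampler, require every component to match within the tolerance.

```ts
// Hypothetical sketch of the per-component tolerance check the tests rely on.
function expectClose(gpu: number[], software: number[], tolerance: number): void {
  for (let c = 0; c < gpu.length; ++c) {
    const diff = Math.abs(gpu[c] - software[c]);
    if (diff > tolerance) {
      throw new Error(
        `component ${c}: gpu=${gpu[c]}, software=${software[c]}, |diff|=${diff} > ${tolerance}`
      );
    }
  }
}
```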
The bias tests, though, are particularly an issue because they want to test that the bias is clamped to the range [-16, 15.99]. To do that they make a 3 mip level texture and pick some target level in that texture, say 1.2. They then choose a bias, for example bias = -17.5, and compute what derivative they would have to pass in so that

`clamp(bias, -16, 15.99) + mipLevelByDerivative(derivative)`

comes out to 1.2 (the target level).
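A sketch of that back-computation, assuming the common mip-selection formula `level = log2(derivative * textureSize)`; the actual `mipLevelByDerivative` may differ in detail.

```ts
// Hypothetical sketch: invert clamp(bias, -16, 15.99) + mipLevelByDerivative(d)
// === targetLevel, assuming mip selection follows level = log2(d * textureSize).
const kBiasMin = -16;
const kBiasMax = 15.99;

function clamp(x: number, lo: number, hi: number): number {
  return Math.min(Math.max(x, lo), hi);
}

function derivativeForTargetLevel(
  targetLevel: number,
  bias: number,
  textureSize: number
): number {
  // The level the derivative alone must produce once the (clamped) bias is added.
  const levelFromDerivative = targetLevel - clamp(bias, kBiasMin, kBiasMax);
  // Invert level = log2(d * textureSize)  =>  d = 2^level / textureSize.
  return Math.pow(2, levelFromDerivative) / textureSize;
}

// e.g. testing the upper clamp: a bias of +17 clamps to 15.99, which forces
// the derivative to be tiny (on the order of the 0.00002 value seen below).
```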
The issue is, `derivative` in this case is something like 0.00002027413656763476, a very small number. Small enough that precision issues come up: the level the GPU computes ends up being something like 1.05 or 1.35 when we wanted 1.2. That is probably only a 1 bit difference somewhere, but choosing a level that's 15% off is not within the tolerance of the test.
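To see why such a tiny derivative is fragile: level selection is logarithmic in the derivative, so a modest relative error in the derivative shows up directly as a level error. An illustration with assumed numbers (the texture size is a placeholder, not from the issue):

```ts
// A relative error of ~11% in the derivative (plausibly a single bit somewhere
// in lower-precision hardware math) moves the computed level by ~0.15, which
// matches the 1.05 / 1.35 results described above.
const size = 8; // assumed texture width, not from the issue
const d = 0.00002027413656763476;

const level = (deriv: number): number => Math.log2(deriv * size);

console.log(level(d));        // baseline level
console.log(level(d * 1.11)); // baseline + log2(1.11) ≈ +0.15
console.log(level(d * 0.89)); // baseline + log2(0.89) ≈ -0.17
```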
One solution is just to increase the tolerance.
Another solution, which @shrekshao suggested: instead of comparing the sampled values, look up all of the texels that would be sampled and just check that the result is between those colors. If we chose a target level like 0.9 then we might get 0.7 or 1.1, but we could always choose a target level halfway between two levels, i.e. 0.5, 1.5, or 2.5. This would still let us test clamping but would be more likely to pass.
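A minimal sketch of that check, assuming we can enumerate the candidate texels (the helper below is hypothetical, not an existing CTS function):

```ts
// Rather than comparing against one software-sampled value, verify the GPU
// result lies within the per-component min/max of every texel the sample
// could have touched.
function resultWithinTexelRange(
  result: number[],           // RGBA result from the GPU
  candidateTexels: number[][] // RGBA of each texel that could be sampled
): boolean {
  for (let c = 0; c < result.length; ++c) {
    const values = candidateTexels.map(t => t[c]);
    const lo = Math.min(...values);
    const hi = Math.max(...values);
    if (result[c] < lo || result[c] > hi) {
      return false;
    }
  }
  return true;
}
```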