Microsoft DirectX 9.0
The depth values of fragments generated while rasterizing a primitive can be biased. This helps mitigate z-fighting when drawing co-planar primitives.
This capability was exposed in previous versions of Microsoft Direct3D but was neither specified nor implemented in the Reference Device.
D3DPRASTERCAPS_DEPTHBIAS and D3DPRASTERCAPS_SLOPESCALEDEPTHBIAS have been added to distinguish between legacy devices that exposed D3DPRASTERCAPS_ZBIAS and performed some unspecified behavior, and devices that can perform true slope-scale-based depth bias.

Applications now specify two floating-point values, D3DRS_DEPTHBIAS and D3DRS_SLOPESCALEDEPTHBIAS, which are used to compute an offset. The offset is added to the fragment's interpolated depth value to produce the final depth value, which is used for depth testing and, optionally, written into the depth buffer.
Offset = m * D3DRS_SLOPESCALEDEPTHBIAS + D3DRS_DEPTHBIAS
where m is the maximum depth slope of the triangle being rendered.
m = max(abs(delta z / delta x), abs(delta z / delta y))
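The computation can be sketched in C++ as follows. This is purely illustrative: the work is performed by the rasterizer per fragment, not by application code, and the function and parameter names are hypothetical.

#include <math.h>

// Illustrative only: the rasterizer performs this per-fragment computation.
float ComputeBiasedDepth(float interpolatedZ,          // fragment's interpolated depth
                         float dzdx, float dzdy,       // depth slopes of the triangle
                         float slopeScaleDepthBias,    // value of D3DRS_SLOPESCALEDEPTHBIAS
                         float depthBias)              // value of D3DRS_DEPTHBIAS
{
    // m = max(abs(delta z / delta x), abs(delta z / delta y))
    float m = fabsf(dzdx) > fabsf(dzdy) ? fabsf(dzdx) : fabsf(dzdy);

    // Offset = m * D3DRS_SLOPESCALEDEPTHBIAS + D3DRS_DEPTHBIAS
    float offset = m * slopeScaleDepthBias + depthBias;

    // The offset is added to the interpolated depth to give the value used for
    // depth testing and, optionally, written into the depth buffer.
    return interpolatedZ + offset;
}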
The units for the D3DRS_DEPTHBIAS and D3DRS_SLOPESCALEDEPTHBIAS render states depend on whether z-buffering or w-buffering is enabled. The application must provide suitable values.
The bias is not applied to line or point primitives. However, it is applied to triangles drawn in wireframe mode.
// Render states
D3DRS_SLOPESCALEDEPTHBIAS,   // New, defaults to zero
D3DRS_DEPTHBIAS,             // New, defaults to zero
// Caps
D3DPRASTERCAPS_DEPTHBIAS             // New
D3DPRASTERCAPS_SLOPESCALEDEPTHBIAS   // New
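A minimal usage sketch, assuming a valid IDirect3DDevice9 pointer: the application checks the new caps bits and, if both are supported, sets the two render states before drawing co-planar geometry such as decals. Because render-state values are DWORDs, the float bit patterns are passed through unchanged. The bias values shown are arbitrary examples; suitable values depend on the depth-buffer format and on whether z-buffering or w-buffering is in use.

#include <d3d9.h>

void ApplyDepthBias(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return;

    if ((caps.RasterCaps & D3DPRASTERCAPS_DEPTHBIAS) &&
        (caps.RasterCaps & D3DPRASTERCAPS_SLOPESCALEDEPTHBIAS))
    {
        float slopeScaleBias = 1.0f;      // example value only
        float depthBias      = -0.0005f;  // example value only

        // Render-state values are DWORDs; pass the float bit patterns unchanged.
        device->SetRenderState(D3DRS_SLOPESCALEDEPTHBIAS,
                               *reinterpret_cast<DWORD*>(&slopeScaleBias));
        device->SetRenderState(D3DRS_DEPTHBIAS,
                               *reinterpret_cast<DWORD*>(&depthBias));
    }
}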