WebGL and Playwright: different output when using the built-in browser

Summary

A production WebGL application rendered correctly in standard Chromium but produced intermittent visual artifacts only when run inside the Playwright automation browser. The issue manifested as incorrect ID-based coloring, “garbage” values, or leftover pixels in instanced geometry (specifically spheres), while other geometries (quads) remained stable. The root cause was floating-point precision loss in the fragment shader of Playwright’s bundled Chromium, compounded by driver-level differences in the headless environment.

Root Cause

The core issue lies in the mismatch of floating-point precision guarantees between the development environment (standard Chrome) and the automation environment (Playwright’s bundled Chromium).

  • Precision Mismatch: The vertex shader passes a float varying (vShaderIndex) to the fragment shader. Although the vertex shader computes the value exactly, rasterization and interpolation can introduce minute floating-point errors by the time the value reaches the fragment shader (e.g., 5.999999 instead of 6.0).
  • Truncating Cast: Per the GLSL specification, int(vShaderIndex) truncates toward zero rather than rounding. In the standard browser, the interpolated value stays close enough to the intended integer that truncation still yields the correct ID. In the Playwright environment, the Chromium build or the underlying OS graphics driver (especially in headless mode) carries less effective precision, so a value intended to be 6.0 can arrive as 5.999; truncation turns it into 5, and the if logic selects the wrong color branch or falls through to the default black.
  • State Leakage: The “garbage” values suggest UBO (Uniform Buffer Object) or attribute state leakage. When the sphere is drawn, the precision error prevents the shader from correctly isolating instances. A non-matching ID should fall through to black or be discarded; however, if the shader program is shared, or if the bad ID maps to an invalid index (e.g., an out-of-range lookup-table access, though the posted code uses if/else), the shader may read stale data from the pipeline state.

Why This Happens in Real Systems

  • Headless Driver Quirks: Playwright typically runs browsers headless, which often makes Chromium fall back to a software rasterizer (SwiftShader) or a different driver stack than the one used for interactive rendering. These paths can exhibit stricter or looser precision behavior than the hardware-accelerated path developers test against.
  • Shader Compilation Differences: Chromium’s WebGL implementation compiles shaders differently based on the OS and GPU context. Playwright’s build might disable specific optimizations or enable strict precision validation that standard Chrome does not.
  • Dynamic Shader Generation: The user mentioned generating shaders dynamically. In WebGL1, fragment shaders have no default precision for float at all: omitting precision mediump float; or precision highp float; is a compile error. Even when highp is requested, fragment-shader highp support is optional in OpenGL ES 2.0, so the driver in the Playwright environment may silently satisfy the request with mediump (often a 16-bit half float), which is exactly the kind of downgrade that produces these artifacts.
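One way to confirm this hypothesis is to probe the precision the environment actually grants to fragment shaders, using the standard getShaderPrecisionFormat API. A minimal sketch; the helper name fragmentHighpSupported is illustrative, not part of any library:

```javascript
// Illustrative helper: reports whether the fragment shader stage
// actually supports highp floats in the given WebGL context.
function fragmentHighpSupported(gl) {
  const fmt = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
  // Per the WebGL spec, a reported precision of 0 means highp is
  // unsupported and requests for it fall back to mediump.
  return fmt !== null && fmt.precision > 0;
}
```

Logging this from both the interactive browser and the Playwright page (e.g., inside page.evaluate) makes any precision divergence between the two environments directly observable.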

Real-World Impact

  • Automated Visual Regression Failures: Tests relying on WebGL snapshots (e.g., verifying graph layouts) will fail due to pixel mismatches, blocking CI/CD pipelines.
  • Misleading Debugging: The issue appears as logic bugs in the shader code (incorrect IDs) but is actually a hardware/driver constraint issue, leading developers to waste time debugging correct logic.
  • Platform Inconsistency: A build that works on developer machines fails on CI runners, creating the classic “works on my machine” divergence.

Example or Code

The user’s shader logic demonstrates the vulnerability:

// Vertex Shader
attribute float aShaderIndex;
varying float vShaderIndex;

void main(){
    vShaderIndex = aShaderIndex;
    // gl_Position assignment elided in the original post
}

// Fragment Shader
precision highp float; // required: WebGL1 fragment shaders have no default float precision
varying float vShaderIndex;

void main(){
    vec3 c = vec3(0.0);
    // CRITICAL LINE: int() truncates toward zero, so interpolation error
    // just below an integer boundary selects the wrong branch.
    // (Note: integer % requires WebGL2 / GLSL ES 3.00.)
    int i = int(vShaderIndex) % 6;

    // ... if/else blocks assigning c per index ...
    gl_FragColor = vec4(c, 1.0);
}

The problem is the cast int(vShaderIndex), which truncates toward zero. An intended ID of 5 that interpolates to 5.000001 still truncates to 5, but one that interpolates to 4.999999 truncates to 4 and selects the wrong branch.
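The effect can be reproduced outside the shader. In JavaScript, Math.trunc mirrors GLSL’s int() conversion, and Math.floor(x + 0.5) mirrors the rounding alternative:

```javascript
// Mirrors int(vShaderIndex) % 6: truncation toward zero.
const truncId = (x) => Math.trunc(x) % 6;
// Mirrors int(floor(vShaderIndex + 0.5)) % 6: round to nearest.
const roundId = (x) => Math.floor(x + 0.5) % 6;

// An intended ID of 5 that interpolates slightly low:
truncId(4.999999); // → 4, wrong color branch
roundId(4.999999); // → 5, correct
// A value that interpolates slightly high is unaffected either way:
truncId(5.000001); // → 5
roundId(5.000001); // → 5
```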

How Senior Engineers Fix It

  • Precision Specification: Explicitly set precision qualifiers in the fragment shader to ensure consistency across all implementations.
    precision highp float; // Force high precision
    precision highp int;
  • Rounding instead of Truncating: Instead of casting directly (int(x)), use int(floor(x + 0.5)) to round to the nearest integer, so that both 4.999 and 5.001 map to 5.
    int i = int(floor(vShaderIndex + 0.5)) % 6;
  • Robust Comparison: For ID matching, use an epsilon comparison or store IDs as integers as early as possible in the pipeline (using flat interpolation if available in WebGL2/ES3.0) to avoid interpolation altogether.
  • Color Buffer Clearing: Ensure the framebuffer is cleared between draws if the artifacts look like “leftovers” (Z-fighting or lack of clearing), though this looks like a precision issue.
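The epsilon-comparison idea from the list above can be sketched as follows; the tolerance of 0.001 is an assumption and should exceed the worst-case interpolation drift of the target precision:

```javascript
// Epsilon-based ID match, equivalent to a shader-side test of
// abs(vShaderIndex - float(id)) < EPS instead of an exact cast.
const EPS = 0.001; // assumed tolerance, larger than expected interpolation drift
const matchesId = (value, id) => Math.abs(value - id) < EPS;

matchesId(4.999999, 5); // → true: drifted value still matches ID 5
matchesId(4.9, 5);      // → false: genuinely different ID
```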

Why Juniors Miss It

  • “It Works on My Machine”: Juniors test in the interactive browser where drivers are optimized for visual fidelity, masking precision issues that occur in headless or generic environments.
  • Implicit Trust in Compilers: They assume float to int casting behaves exactly like integer math, overlooking that floating-point math is inherently imprecise.
  • Lack of Shader Hygiene: Dynamic shader generation often skips standard boilerplate (such as precision declarations), relying on browser defaults that are inconsistent (and, for float in WebGL1 fragment shaders, nonexistent).
  • Symptom Misdiagnosis: Seeing “garbage data” often leads to thinking about memory leaks or buffer overflows rather than simple arithmetic precision errors in the shader.