
Implementing wet effects in games—such as glossy surfaces, dripping water, and immersive simulations of wetness—presents significant technical challenges in terms of code complexity and mechanics. These effects require precise manipulation of shaders, textures, and physics to create realistic interactions, like water droplets sliding down a character's surface or rippling reflections on wet ground.

TL;DR: in a Tetra project it comes down to dragging a slider or adding your asset to the wetness pipeline. Keep reading if you actually want to understand why that slider exists, what it does, and how it manages to run faster than other 3D engines on the Web.

In the early days of gaming, water representations were rudimentary, often limited to static blue areas or simple animated textures in 2D titles. Techniques like palette cycling or scrolling overlays provided basic illusions of movement but lacked depth or interactivity due to hardware constraints. By the 1990s, advancements in 3D rendering, seen in titles like Wave Race 64 and Quake, introduced dynamic waves and basic refractions through vertex manipulations and texture distortions. These required real-time calculations that strained early consoles, marking the shift from mere visual tricks to interactive simulations.

The 2000s intensified this trend with games like BioShock and Uncharted, where wetness became integral to the environment and gameplay. Water not only altered visuals—dripping from surfaces and pooling dynamically—but also influenced mechanics, such as reduced grip on wet materials. This push toward realism paved the way for the widespread adoption of Physically Based Rendering (PBR), which demands shaders that accurately model light interactions with wet surfaces, including increased specularity and layered clearcoats for realistic sheen. Transitioning these wet effects to a browser environment exacerbates the difficulties due to inherent platform limitations.


The process involves balancing visual fidelity with performance, as inefficient implementations can lead to frame-rate drops or excessive resource consumption. Developers typically must apply physically based rendering principles, adjusting material properties like specular highlights, roughness, and clearcoat layers to mimic water's influence on surfaces. Moreover, simulating dynamic behaviors, such as dripping under gravity, demands robust physics engines that handle particle systems and surface interactions without overwhelming the system.
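
To make those PBR adjustments concrete, here is a minimal sketch of how a single wetness factor might drive the relevant material properties. It uses three.js's `MeshPhysicalMaterial` as a stand-in, since Tetra's node-based materials expose equivalent parameters; the function name, baseline values, and curves are illustrative assumptions, not the engine's actual API.

```ts
import * as THREE from 'three';

// Illustrative sketch: blend a physical material toward a "wet" look from a
// single wetness scalar in [0, 1]. Parameter names follow three.js; Tetra's
// own material nodes may differ.
function applyWetness(material: THREE.MeshPhysicalMaterial, wetness: number): void {
  const w = THREE.MathUtils.clamp(wetness, 0, 1);

  // Cache the dry baseline once so repeated calls stay non-destructive.
  material.userData.dryRoughness ??= material.roughness;

  // Water fills surface micro-facets, so roughness drops as wetness rises.
  material.roughness = THREE.MathUtils.lerp(material.userData.dryRoughness, 0.08, w);

  // A thin water film reads as a smooth clearcoat layer over the base material.
  material.clearcoat = w;
  material.clearcoatRoughness = THREE.MathUtils.lerp(0.5, 0.05, w);

  // Tighter, brighter highlights mimic the glossy sheen of a soaked surface.
  material.specularIntensity = THREE.MathUtils.lerp(1.0, 1.5, w);
}
```

Dripping under gravity is a separate concern, covered further down, because it needs per-particle physics rather than a per-material uniform.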

Unlike native applications, browsers operate in a sandboxed context, introducing overhead from JavaScript execution and security restrictions that hinder direct hardware access. Performance varies widely across devices, with WebGPU in Tetra (our Web3D engine) offering potential for advanced shaders, but browser throttling and memory caps often result in suboptimal rendering. Achieving real-time fluidity for effects like water flow requires optimized compute pipelines, yet the single-threaded nature of JavaScript can introduce delays in physics updates. Cross-browser compatibility adds another layer of complexity, as WebGPU support, while improving in 2025, remains inconsistent and is primarily robust in Chrome. This necessitates hybrid implementations, where advanced features such as ray-marched reflections are downgraded, compromising the realism of wet effects. Diverse input methods further complicate interactivity, demanding responsive simulations that adapt without exceeding computational budgets. Scalability issues are prominent, with high-resolution assets for droplets or normal maps risking memory overflows and performance hitches on lower-end hardware.

Fluid dynamics simulations, essential for ocean or rain effects, involve solving complex equations like Navier-Stokes, but browser thread limitations force approximations that trade accuracy for efficiency. Debugging in browsers is inherently more challenging, with development tools offering less granular insights than native profilers. Shader errors, such as flawed refraction logic, manifest variably across environments, extending iteration times. Asset loading is impeded by CORS policies, complicating the integration of necessary textures for convincing wetness. The evolving web standards introduce ongoing uncertainty; while WebGPU enables sophisticated effects, API fluctuations and bugs demand frequent adaptations. Mobile browser constraints on energy use further limit GPU-intensive simulations, requiring hybrid strategies that blend physics with perceptual shortcuts to maintain accessibility.

Custom ShaderPasses offer a viable path forward, integrating traditional particle systems and dripping physics with post-processing filters on WebGPU to approximate native-level effects. By combining CPU-managed instanced meshes for droplet positioning—adhering to surfaces via raycasting—with GPU-accelerated noise distortions for ripples, we can create efficient, layered wetness. This approach leverages WebGPU's compute shaders for parallel particle updates, reducing CPU load while enhancing realism through animated UV offsets and gloss enhancements. In practice, a ShaderPass can blend these elements by processing particle data in the fragment shader, applying screen-space refractions and highlights that respond to scene lighting. This fusion of legacy techniques (physics-based dripping) with modern optimizations (asynchronous compute) minimizes draw calls, achieving high-fidelity results comparable to desktop engines, all within browser constraints.
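
A stripped-down version of such a pass is sketched below using three.js's `ShaderPass` for clarity; in Tetra the same idea is expressed as a WebGPU post-processing node, and the hash-based noise here is a placeholder for a proper noise texture or compute-generated ripple field.

```ts
import * as THREE from 'three';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';

// Minimal screen-space "wetness" pass: an animated, noise-driven UV offset
// distorts the rendered image, and highlights are lifted slightly to fake
// a glossy sheen.
const WetnessShader = {
  uniforms: {
    tDiffuse: { value: null },   // filled in by the composer
    uTime: { value: 0 },
    uStrength: { value: 0.004 }, // distortion amplitude in UV space
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D tDiffuse;
    uniform float uTime;
    uniform float uStrength;
    varying vec2 vUv;

    // Cheap pseudo-noise; a real implementation would sample a noise texture.
    float noise(vec2 p) {
      return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
    }

    void main() {
      // Animated UV offset approximates rippling refraction on the wet layer.
      vec2 offset = uStrength * vec2(
        noise(vUv * 40.0 + uTime * 0.5) - 0.5,
        noise(vUv * 40.0 - uTime * 0.3) - 0.5
      );
      vec4 color = texture2D(tDiffuse, vUv + offset);
      // Slight highlight boost stands in for increased specularity.
      gl_FragColor = vec4(color.rgb * 1.05, color.a);
    }
  `,
};

const wetPass = new ShaderPass(WetnessShader);
// In the render loop: wetPass.uniforms.uTime.value = clock.getElapsedTime();
```

The same pass is where per-droplet data enters later: particle positions can be packed into a data texture and sampled here to add localized streaks on top of the global ripple.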


A critical component in achieving depth-aware wet effects in browser-based rendering is the depth mask, often implemented via a depth texture. This texture captures the per-pixel distance from the camera to the scene geometry, stored in the depth buffer during the initial render pass. In a custom ShaderPass, we can sample this depth texture within the fragment shader to modulate effects, ensuring that wetness distortions or overlays respect the 3D structure—preventing artifacts like effects bleeding through occluded surfaces. For instance, by unpacking depth values and comparing them against screen-space coordinates, we can apply refractions or highlights only to foreground elements, enhancing realism in post-processing. This technique is essential for effects like falling water, where droplets must appear to interact with model contours without flattening the scene.
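
As a rough sketch of that wiring, the snippet below renders the scene into a target with an attached depth texture and hands it to the wetness pass from the earlier sketch; the three.js APIs shown are an assumption standing in for Tetra's render graph, which performs this step automatically.

```ts
import * as THREE from 'three';

// Assumed to exist elsewhere in the application.
declare const renderer: THREE.WebGLRenderer;
declare const scene: THREE.Scene;
declare const camera: THREE.PerspectiveCamera;
declare const wetPass: { uniforms: Record<string, { value: unknown }> };

const size = renderer.getSize(new THREE.Vector2());
const target = new THREE.WebGLRenderTarget(size.x, size.y);
target.depthTexture = new THREE.DepthTexture(size.x, size.y);

// First pass: fill both the color buffer and the depth texture.
renderer.setRenderTarget(target);
renderer.render(scene, camera);
renderer.setRenderTarget(null);

// Expose depth and camera planes to the post-processing shader (assumed to
// declare these uniforms), so it can linearize the stored depth and attenuate
// distortion on background pixels instead of bleeding through occluders.
wetPass.uniforms.tDepth.value = target.depthTexture;
wetPass.uniforms.uCameraNear.value = camera.near;
wetPass.uniforms.uCameraFar.value = camera.far;
```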

Determining the size of avatars, or animated 3D models, is straightforward through bounding box computation, which provides dimensions for scaling and positioning effects. For a static model, a bounding box can encapsulate the mesh and yield a vector with its width, height, and depth. Animated models, however, require dynamic recalculation, as skeletal deformations alter geometry; this involves updating the bounding box per frame after applying animations, potentially using bone transformations for skinned meshes to ensure accuracy. This size data informs effect placement, such as normalizing particle distributions to the model's scale, avoiding disproportionate dripping on avatars of varying sizes.
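
A sketch of that measurement, again in three.js terms (assumed, not Tetra's actual API): the bounding box is rebuilt from bone world positions each frame, since a skinned mesh's own bounding geometry reflects the bind pose rather than the current animation.

```ts
import * as THREE from 'three';

declare const avatar: THREE.Object3D; // assumed avatar root with SkinnedMesh children

const box = new THREE.Box3();
const size = new THREE.Vector3();
const tmp = new THREE.Vector3();

// Call after mixer.update(delta) and scene.updateMatrixWorld() so the bones
// reflect the current pose. Bone positions slightly under-estimate the true
// surface extent, which is usually acceptable for scaling effects.
function measureAvatar(): THREE.Vector3 {
  box.makeEmpty();
  avatar.traverse((child) => {
    const skinned = child as THREE.SkinnedMesh;
    if (skinned.isSkinnedMesh) {
      for (const bone of skinned.skeleton.bones) {
        box.expandByPoint(bone.getWorldPosition(tmp));
      }
    } else if ((child as THREE.Mesh).isMesh) {
      box.expandByObject(child); // static parts keep the cheap path
    }
  });
  return box.getSize(size); // width (x), height (y), depth (z)
}
```

The returned extent can then normalize particle counts and spawn spread so a small prop and a tall character receive proportionate dripping.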

Placing particles on an animated mesh surface for dripping effects begins with surface sampling techniques to generate initial positions that conform to the model's topology. We facilitate this by randomly sampling points and normals from the mesh geometry, ensuring particles adhere realistically to contours like a character's limbs or torso. For animated models, sampling must occur post-animation update each frame to account for movement, preventing particles from detaching during motion. This setup creates a foundation for effects, with particles instantiated as instanced meshes for efficiency, their positions stored in arrays for subsequent physics application.
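
The sketch below scatters spawn points with three.js's `MeshSurfaceSampler`, used here as a stand-in for the engine's own sampler; for skinned meshes the sampled points live in bind-pose space, so they would be re-projected or re-sampled after each animation update, as described above.

```ts
import * as THREE from 'three';
import { MeshSurfaceSampler } from 'three/examples/jsm/math/MeshSurfaceSampler.js';

declare const bodyMesh: THREE.Mesh; // the avatar surface to wet

// Scatter droplet spawn points (positions + normals) across the mesh surface.
const COUNT = 500;
const sampler = new MeshSurfaceSampler(bodyMesh).build();

const positions: THREE.Vector3[] = [];
const normals: THREE.Vector3[] = [];
const tmpPos = new THREE.Vector3();
const tmpNormal = new THREE.Vector3();

for (let i = 0; i < COUNT; i++) {
  sampler.sample(tmpPos, tmpNormal); // local-space point and normal
  positions.push(bodyMesh.localToWorld(tmpPos.clone()));
  normals.push(tmpNormal.clone().transformDirection(bodyMesh.matrixWorld));
}

// One InstancedMesh keeps all droplets in a single draw call; instance
// matrices are written in the physics step shown further down.
const droplet = new THREE.SphereGeometry(0.01, 6, 6);
const drops = new THREE.InstancedMesh(droplet, new THREE.MeshPhysicalMaterial(), COUNT);
```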

Dripping effects are then simulated through physics integration, where particles respond to gravity and surface interactions. Initial velocities and accelerations are applied downward, with raycasting used to detect mesh intersections and constrain movement—keeping droplets sliding along normals until they reach edges and fall freely. As our engine is WebGPU-optimized, this can leverage compute shaders for large particle counts, updating positions in parallel while incorporating friction to slow slides on surfaces. The result is a believable cascade, with optional scaling for droplet merging or elongation, all computed in the animation loop for real-time responsiveness. These positioned and physics-simulated particles feed into the custom ShaderPass, where they enhance the falling water effect in post-processing. By passing particle data—such as updated positions and trails—as uniforms or textures to the shader, we can blend them with the depth texture for occlusion-aware rendering, creating streaks or splashes that distort the underlying image. On WebGPU, this achieves near-native fluidity by combining particle physics outputs with screen-space filters like animated offsets and refractions, resulting in cohesive, physics-driven wetness that scales efficiently across browser devices.
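
Below is a CPU-side sketch of that update loop for the positions sampled above; names carry over from the previous snippet, the surface-following rule is simplified to a short downward raycast, and in Tetra this per-particle work would move to a WebGPU compute shader once counts grow large.

```ts
import * as THREE from 'three';

// Per-frame dripping update: gravity pulls each droplet down, a short raycast
// toward the surface keeps it sliding along the mesh, and droplets with no
// surface beneath them fall freely.
declare const bodyMesh: THREE.Mesh;
declare const drops: THREE.InstancedMesh;
declare const positions: THREE.Vector3[]; // from the sampling step

const velocities = positions.map(() => new THREE.Vector3());
const GRAVITY = new THREE.Vector3(0, -9.8, 0);
const DOWN = new THREE.Vector3(0, -1, 0);
const FRICTION = 0.15; // damps slides so droplets don't skate off instantly
const raycaster = new THREE.Raycaster();
const dummy = new THREE.Object3D();

function updateDrops(delta: number): void {
  for (let i = 0; i < positions.length; i++) {
    velocities[i].addScaledVector(GRAVITY, delta);
    const next = positions[i].clone().addScaledVector(velocities[i], delta);

    // Probe a few centimetres around the candidate position for the surface.
    raycaster.set(next.clone().setY(next.y + 0.05), DOWN);
    raycaster.far = 0.1;
    const hit = raycaster.intersectObject(bodyMesh, false)[0];

    if (hit) {
      next.copy(hit.point);                   // stay glued to the surface
      velocities[i].multiplyScalar(1 - FRICTION);
    }
    positions[i].copy(next);

    dummy.position.copy(positions[i]);
    dummy.updateMatrix();
    drops.setMatrixAt(i, dummy.matrix);
  }
  drops.instanceMatrix.needsUpdate = true;
}
```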

In Tetra, the integration of depth masks and wet shader passes is streamlined for playable avatars, ensuring seamless application of wetness effects without extensive developer intervention. All pre-built avatars in our solutions are equipped with depth textures by default, generated during the initial render pass and accessible in subsequent post-processing stages. This setup allows the custom ShaderPass to sample depth values automatically, modulating effects like distortions or highlights to align with the avatar's geometry. As a result, water simulations—whether static sheen or dynamic dripping—respect occlusion and surface depth, preventing visual anomalies and enhancing immersion across animated models.

The wet shader pass itself is pre-included in the material pipeline for avatars and other scene objects, leveraging WebGPU's capabilities to apply PBR adjustments universally. This means parameters such as roughness reduction, specular amplification, and clearcoat layering are handled natively within the shader, triggered by the presence of a wetness factor. Developers benefit from this abstraction, as no custom code is required to adapt the pass for player characters, NPCs, or environmental assets; the engine's node-based materials ensure compatibility, with automatic propagation of wetness data through the render graph. This design philosophy minimizes setup overhead, allowing creators to focus on gameplay rather than technical plumbing.

For any object, including avatars, the process simplifies to selection within Infinity SDK and tweaking a single "wetness" slider. This scalar value, typically ranging from 0 (dry) to 1 (fully saturated), interpolates shader uniforms in real-time, blending effects like normal map perturbations for droplets or UV offsets for flow. The slider's adjustments are non-destructive and can be animated or tied to environmental triggers, such as rain exposure, without necessitating additional physics simulations or particle emitters.
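
In practice that workflow reduces to driving one scalar. The snippet below is purely illustrative (the property name and the asset handle are assumptions, since the same adjustment is normally made with the editor slider in Infinity SDK); it shows wetness eased toward an environmental trigger each frame.

```ts
// Hypothetical asset handle exposing the same scalar the editor slider drives.
declare const avatar: { wetness: number };

let rainExposure = 0; // 0 = sheltered, 1 = standing in the rain

// Per-frame: ease wetness toward the current exposure. The change is
// non-destructive, so drying out is just the same interpolation in reverse.
function updateWetness(delta: number): void {
  avatar.wetness += (rainExposure - avatar.wetness) * Math.min(1, delta * 0.5);
}
```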

By embedding these features at the asset level, the system democratizes advanced visual effects, making them accessible even to non-technical users. This approach not only accelerates iteration in browser environments, where performance is paramount, but also ensures consistency across devices, as WebGPU handles the underlying computations efficiently. Ultimately, it empowers developers to prototype and refine wet interactions rapidly, opening the door to more dynamic and responsive game worlds.