Extreme Views: 3DGS Filter for Novel View Synthesis from Out-of-Distribution Camera Poses

20th International Symposium on Visual Computing (ISVC) 2025

Concordia University
Paper · arXiv · GitHub (code coming soon)

Abstract

3D Gaussian Splatting (3DGS) has become a popular approach for real-time 3D reconstruction and novel view synthesis, representing scenes as clouds of anisotropic Gaussian primitives optimized from posed training images. While this works well for views close to the training cameras, users often navigate far beyond these trajectories, exposing regions of space that are under-constrained by the original data. Augmenting the training pipeline can make 3DGS models more robust to such out-of-distribution (OOD) viewpoints, but it cannot eliminate OOD artifacts: the model can only learn what is supported by the training cameras, so unseen regions of space remain under-constrained and free to hallucinate unstable, visually noisy artifacts. In this work, we treat these hallucinations as a render-time problem rather than a training-time deficiency. We introduce a view-conditioned, gradient-based filter that actively detects and suppresses unstable, high-sensitivity Gaussians in under-constrained regions before they manifest as visible artifacts. Our method derives per-intersection sensitivity scores from intermediate gradients; instead of relying on orientation-agnostic criteria (e.g., thresholds on overall variance or size), these scores are view-conditioned and sensitive to how anisotropic Gaussians are oriented relative to the camera. They then drive a lightweight rejection step that prunes hallucination-prone Gaussians on a per-view basis. For challenging OOD camera trajectories, our drop-in real-time module substantially improves visual quality and the clarity of principal objects without retraining or offline fine-tuning.


Figure: side-by-side comparisons of EV3DGS (Ours) and 3DGS renders.

Methodology

At render time, we apply a two-pass filtering pipeline that leverages intermediate gradients to identify unstable 3D Gaussians during ray-marched rendering.

In the first pass, we compute the intermediate gradient magnitude for every ray–Gaussian intersection. Whenever this value exceeds a gradient sensitivity threshold, \(\gamma\), we increment a rejection count for the corresponding Gaussian while also tracking its total number of intersections.
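In symbols (our notation, not the paper's): let \(g_{r,i}\) denote the intermediate gradient magnitude at the intersection of ray \(r\) with Gaussian \(i\), and let \(\mathcal{R}_i\) be the set of rays that intersect Gaussian \(i\). The first pass accumulates, per Gaussian,

\[
c_i = \sum_{r \in \mathcal{R}_i} \mathbf{1}\!\left[\, g_{r,i} > \gamma \,\right], \qquad n_i = |\mathcal{R}_i|,
\]

where \(c_i\) is the rejection count and \(n_i\) the total intersection count.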

In the second pass, we derive a per-Gaussian sensitivity score as the ratio of rejected to total intersections. If this score exceeds a ratio threshold, \(\tau\), the Gaussian is excluded from the render loop for the current viewpoint. As a result, only Gaussians with sufficiently low intermediate gradients (i.e., stable contributions) contribute to the final composited image.
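The sketch below illustrates the two-pass filter in NumPy. It assumes the renderer exposes, for one viewpoint, the per-intersection gradient magnitudes and the index of the Gaussian each intersection belongs to; the function name, array layout, and default thresholds are illustrative placeholders, not the authors' implementation (which would run inside the renderer itself).

import numpy as np

def ev3dgs_filter(grad_mags, gaussian_ids, num_gaussians, gamma=0.5, tau=0.5):
    """Two-pass render-time filter (sketch; gamma and tau values are placeholders).

    grad_mags    : (K,) intermediate gradient magnitude per ray-Gaussian intersection
    gaussian_ids : (K,) index of the Gaussian hit by each intersection
    Returns a boolean keep-mask over the num_gaussians Gaussians for this viewpoint.
    """
    # Pass 1: per Gaussian, count total intersections (n_i) and intersections
    # whose gradient magnitude exceeds the sensitivity threshold gamma (c_i).
    total = np.bincount(gaussian_ids, minlength=num_gaussians)
    rejected = np.bincount(gaussian_ids, weights=(grad_mags > gamma),
                           minlength=num_gaussians)

    # Pass 2: sensitivity score s_i = c_i / n_i; exclude Gaussians whose score
    # exceeds the ratio threshold tau (never-intersected Gaussians score 0).
    score = np.divide(rejected, total, out=np.zeros(num_gaussians),
                      where=total > 0)
    return score <= tau  # keep only stable Gaussians for this view

Gaussians whose sensitivity score exceeds \(\tau\) are simply skipped when compositing the current view, so the filter adds no training or fine-tuning cost.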


More Visual Results

Figure: additional side-by-side comparisons of EV3DGS (Ours) and 3DGS renders.

Citation

@article{EV3DGS25,
  title = {Extreme Views: 3DGS Filter for Novel View Synthesis from Out-of-Distribution Camera Poses},
  author = {Bowness, Damian and Poullis, Charalambos},
  journal = {arXiv},
  year = {2025},
}