𝒢‐Style: Stylized Gaussian Splatting

Detailed bibliography
Published in: Computer Graphics Forum, Volume 43, Issue 7
Main authors: Kovács, Áron Samuel; Hermosilla, Pedro; Raidou, Renata G.
Medium: Journal Article
Language: English
Published: 01.10.2024
ISSN: 0167-7055, 1467-8659
Description
Summary: We introduce 𝒢‐Style, a novel algorithm designed to transfer the style of an image onto a 3D scene represented using Gaussian Splatting. Gaussian Splatting is a powerful 3D representation for novel view synthesis, as—compared to other approaches based on Neural Radiance Fields—it provides fast scene renderings and user control over the scene. Recent pre‐prints have demonstrated that the style of Gaussian Splatting scenes can be modified using an image exemplar. However, since the scene geometry remains fixed during the stylization process, current solutions fall short of producing satisfactory results. Our algorithm aims to address these limitations by following a three‐step process: In a pre‐processing step, we remove undesirable Gaussians with large projection areas or highly elongated shapes. Subsequently, we combine several losses carefully designed to preserve different scales of the style in the image, while preserving the integrity of the original scene content as much as possible. During the stylization process and following the original design of Gaussian Splatting, we split Gaussians where additional detail is necessary within our scene by tracking the gradient of the stylized color. Our experiments demonstrate that 𝒢‐Style generates high‐quality stylizations within just a few minutes, outperforming existing methods both qualitatively and quantitatively.
DOI: 10.1111/cgf.15259
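
The summary above outlines a three-step pipeline whose first step prunes Gaussians with large projection areas or highly elongated shapes before stylization. The PyTorch fragment below is a minimal sketch of what such a filter could look like, not the authors' implementation: the (N, 3) scale layout, the use of world-space extents as a proxy for projected area, and the two thresholds are all assumptions made for illustration.

    import torch

    def prune_gaussians(scales: torch.Tensor,
                        area_threshold: float = 0.01,
                        elongation_threshold: float = 10.0) -> torch.Tensor:
        """Return a boolean mask selecting the Gaussians to keep.

        scales: (N, 3) per-axis extents of each 3D Gaussian (hypothetical layout).
        """
        # Sort each Gaussian's three axis extents in descending order.
        sorted_scales, _ = torch.sort(scales, dim=-1, descending=True)
        # Approximate the footprint by the product of the two largest extents.
        approx_area = sorted_scales[:, 0] * sorted_scales[:, 1]
        # Measure elongation as the ratio of the largest to the smallest extent.
        elongation = sorted_scales[:, 0] / sorted_scales[:, 2].clamp_min(1e-8)
        # Keep only compact, roughly isotropic Gaussians.
        keep = (approx_area < area_threshold) & (elongation < elongation_threshold)
        return keep

The splitting described in the third step could, by analogy, reuse the densification machinery of the original Gaussian Splatting training loop, driven by the gradient of the stylized color rather than the reconstruction loss.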