Photogrammetry vs Gaussian Splatting for Virtual Reality
- uploaded 18 Dec 2023
- What’s better for viewing real life VR environments?
I give my thoughts on this and compare how each feels in VR.
You can download my Gaussian Splats and view them in VR yourself!
Collection of splats: drive.google.com/drive/folder...
VR Splat viewer: www.gracia.ai/
Links discussed in video:
Luma labs: lumalabs.ai/
Unity plugin: github.com/aras-p/UnityGaussi...
Animated splats: shenhanqian.github.io/gaussia...
If you don't want to listen to me ramble and just want to look at the comparisons here are the timestamps:
02:22 - Back to back comparison
08:07 - Side by side comparison
13:15 - Quick splat vs lengthy photogrammetry
I can totally relate to the rambling…that was me trying to explain to a friend how next-level this tech is. I too was awestruck by the fidelity compared to photogrammetry. Especially the fine detailed objects, or shiny, reflective or transparent objects that would never work in photogrammetry. I actually really like the soft painterly hazy aesthetic…it feels like a memory.
I’ve been experimenting with PostShot ahead of a trip to visit my folks. Planning to scan my childhood home before my folks move, to preserve the space.
15:12 Never thought about it. Gaussian Splatting could "bring back" a loved one who passed away. Even if only for a short moment, where you are just watching a still frame of the person, the possibilities are immeasurable.
I'm sure this will be used for 'photo albums' of loved ones. A nice thought.
there are already papers on animated gaussian splatting (including people)
Gaussian Splatting 4D videos are possible. It's not just a 3D image
czcams.com/video/YsPPmf-E6Lg/video.html&ab_channel=ZhangChen - video gaussian splatting
Exactly why I am looking into splatting unfortunately.
Exactly what I was looking for, incredible for capturing spatial memories
i could see google doing this with street view
And they should
Yeah! Splat the world.
I believe Google did publish a paper on it.
There is already a paper for that
This looks absolutely amazing!😮
Great stuff, I think we will mix technologies to get the best of all worlds for the foreseeable future
beautiful thought
WELL DONE
This is so interesting, because I was interested in bringing 3D scans into one of my fave games as mods, but other modders told me "it's too many tris, it's impossible". But here you are running it in VR, which is usually even more demanding. I wish I could bring stuff like this in-game.
Yeah, you definitely can. There are Gaussian Splat add-ons for Unity and Unreal if you are using either of those game engines.
Sadly the game I am using isn't either D: I want to use 3D scans, which work, but modders tell me there's too many tris. Wondering if they can be reduced easily while keeping detail!@@Puffycheeks
@@AnnasGamingCorner Usually in photogrammetry software there is a way to simplify or decimate a model before export. If it is a premade 3D scan, Unreal Engine has a way to use automatic level of detail (LOD) models for high-poly stuff to keep it optimized. What game engine are you using?
It is called Transport Fever 2 (on my channel) and I am using scans from Polycam! :)@@Puffycheeks
@@AnnasGamingCorner Feel free to email me at Puffycheeks288@gmail.com; if the scans you are using are free assets online I could try to simplify them.
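For anyone following this thread: this is not what Polycam or Unreal do internally, but the simplest decimation scheme to illustrate the idea is vertex clustering: snap vertices to a coarse grid, merge everything in the same cell, and drop collapsed triangles. A rough NumPy sketch (function name and parameters are my own):

```python
import numpy as np

def decimate_by_clustering(vertices, faces, cell_size):
    """Simplify a triangle mesh by merging all vertices that fall into
    the same voxel of a grid with spacing `cell_size` (vertex clustering).
    vertices: (V, 3) float array; faces: (F, 3) int array.
    Returns (new_vertices, new_faces)."""
    # Assign each vertex to a grid cell.
    cells = np.floor(vertices / cell_size).astype(np.int64)
    # Map each unique occupied cell to one new vertex index.
    uniq, remap = np.unique(cells, axis=0, return_inverse=True)
    remap = remap.reshape(-1)
    # New vertex position = mean of the old vertices in that cell.
    counts = np.bincount(remap, minlength=len(uniq)).astype(float)
    new_vertices = np.empty((len(uniq), 3))
    for dim in range(3):
        new_vertices[:, dim] = np.bincount(
            remap, weights=vertices[:, dim], minlength=len(uniq)) / counts
    # Remap faces and drop triangles that collapsed to an edge or point.
    new_faces = remap[faces]
    keep = ((new_faces[:, 0] != new_faces[:, 1]) &
            (new_faces[:, 1] != new_faces[:, 2]) &
            (new_faces[:, 0] != new_faces[:, 2]))
    return new_vertices, new_faces[keep]
```

Bigger `cell_size` means fewer vertices but blobbier shapes; real tools use smarter error-driven methods (quadric decimation) that preserve detail much better.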
I was unaware that the splats change their properties depending on the direction we view them from, that explains a lot of how specular highlights can change, and how reflections can work. Google did this in their Lightfield demo by 3D reconstruction and from what I could tell texture swapping. Kind of disappointing as it kind of removes some of the magic, but made sense at the time. That technique also suffered badly from being limited to the 1m sphere of their camera array.
If I could take a video of my apartment and, as you said, port it into VR Chat, I would do that in a heartbeat to get my international friends to visit... on location, as it were. I very much look forward to this technology maturing, so we will have neat GUI applications to generate and export these environments. Also, I have completely forgotten to look at this myself, while I've watched videos and tutorials for a while now. Thanks for sharing!
Thanks for watching my video BOLL! The space shuttle in the Lightfield demo is still the most realistic thing I've ever seen in VR. I'm also excited to see this tech mature, feel free to download my splats and view them in the 'Gracia VR' viewer linked in the description. (I recommend the 'Fireplace' file)
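Since the view-dependent colors came up above: each splat stores spherical-harmonic (SH) coefficients, and its color is evaluated per view direction. A minimal sketch of the degree-1 evaluation (the basis constants and the +0.5 offset follow the reference 3D Gaussian Splatting renderer; the function name is mine):

```python
import numpy as np

# Real spherical-harmonic basis constants for degrees 0 and 1.
C0 = 0.28209479177387814  # 0.5 * sqrt(1/pi)
C1 = 0.4886025119029199   # sqrt(3 / (4*pi))

def eval_sh_deg1(sh, view_dir):
    """Evaluate view-dependent color from degree-1 SH coefficients.
    sh: (4, 3) array -- one DC term plus three linear terms, per RGB channel.
    view_dir: unit vector from the camera toward the splat center."""
    x, y, z = view_dir
    color = (C0 * sh[0]
             - C1 * y * sh[1]
             + C1 * z * sh[2]
             - C1 * x * sh[3])
    return np.clip(color + 0.5, 0.0, 1.0)
```

With only the DC term set, the color is the same from every direction; the linear (and, in the full model, higher-order) terms are what shift the color as you move, which is how specular highlights and rough reflections come through.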
Thank you
what a time to be alive
Very cool indeed
ty
Great!
How do you export this to a Quest 2? What engine do you use? And what SDK? Can you please explain? I am interested in trying this. This is awesome..😊
x2
Does Gracia work with the point cloud files that the Luma AI iPhone app generates?
I tried uploading my Luma point cloud files to Gracia, but it doesn't seem to like that format?
The biggest problem I see with NeRFs and Gaussian splats is their lack of editability. It's like capturing a hologram: it looks great, but for world construction, particularly for games, it seems really hard to use..
For the moment yes that is a good point. But the tech is moving fast, have a look at this example of editing splats: x.com/Andersonmancini/status/1737880610906009634?s=20
What would be nice is if this could be overlaid on simple geometry, kind of like using a bump map: having this high level of detail that 3D objects could interact with.
Gaussian splat data has the potential. Geometry-wise, it's a 3D point cloud, so adding/removing points is straightforward, and you can do LOD and streaming easily. The difficult part is adding them to a game lighting pipeline to support dynamic lighting, but I believe that won't be impossible in the future. Currently there is research on reversing the lighting of Gaussian splats, so I think we may see more progress soon.
Have also been thinking about this. Same as one would sculpt a highly dense 3D model and then bake the details into normal maps etc. on simplified geometry.@@dbellamy6694
That's not what it's made for.
Maybe it can be used as skyboxes?
It would work amazing for skyboxes!
Any way to run this in browser? Or directly from headset hard drive?
Check out PlayCanvas - it has WebVR capability - but you need to tether to the PC to view in VR. If you access it from the browser in your headset, it freaks out with the 3DGS.
what VR software do you use to place splat in?
Gracia AI probably, based on the interface
Interesting video. It would be great if, when capturing video from your point of view, you moved and looked around a bit more slowly.
I could have done some post stabilization of the VR video, yes. But I didn't think of that when editing. VR footage is inherently 'fast'. Nowadays most games will have inbuilt smoothing for the 2D view on the screen that helps this.
I wonder if apple’s Vision Pro spatial video uses something like this. They say the iPhone 15 pro camera can record the spatial video but the cameras aren’t human eye distance apart so it seems like they’d have to be doing some 3D math with the cameras. I wonder if they’ll end up adding a “capture your whole environment in 3D” feature
No they don't. It's really just light fields that they use. It's spatial video, not volumetric video.
Volumetric videos (point clouds, gaussian splats) are still waiting to become mainstream.
It's hard to get good splats without shooting 400 photos
I'm not too knowledgeable about gaussian splatting, but I think it would be nice to somehow convert these splats to roughness, translucency, bump, displacement, normal and opacity maps etc…then map them onto the photogrammetry polygon model
Look up 2 minute papers Gaussian splats. He does some stuff about that.
@@judgsmith thanks will do
Gaussian splatting does a much better job than a polygon model for rendering. Gaussian splatting replaces polygon rasterization, not the textures on polygons.
@@vickytao2010 but splats are not practical to work with. You can’t model them, can’t import them into a game engine and apply physics etc. They are not flexible and at best usable for vfx shots.
There is no point cloud in 3D Gaussian splatting; it is built from splats. It still looks OK from a distance, actually quite good, but come close and it all breaks. That is why you see them used the way they are. You can still import one into 3D modeling software, but only if you do not plan to come close to it in the 3D environment.
3D Gaussian splatting can use a point cloud data structure, and because of that it's easy to add LOD to it, so it can handle the "come close" case you are talking about. There is research on 3D Gaussian splatting LOD which shows decent results.
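To make the LOD idea concrete: because each splat has a position and a world-space size, the crudest possible scheme is to cull splats whose projected footprint falls below a pixel threshold. A toy NumPy sketch (not from any published LOD paper; names and threshold are my own):

```python
import numpy as np

def lod_select(centers, scales, cam_pos, focal_px, min_radius_px=0.5):
    """Crude per-splat LOD: keep only splats whose projected radius on
    screen is at least `min_radius_px` pixels.
    centers: (N, 3) splat positions; scales: (N,) world-space radii;
    cam_pos: (3,) camera position; focal_px: focal length in pixels.
    Returns a boolean mask over the N splats."""
    dist = np.linalg.norm(centers - cam_pos, axis=1)
    # Pinhole approximation: screen radius = focal * world radius / depth.
    radius_px = focal_px * scales / np.maximum(dist, 1e-6)
    return radius_px >= min_radius_px
```

Real LOD schemes go further (merging distant splats into coarser ones rather than just dropping them), but even this distance test shows why the point-based representation streams well.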
What lidar scanner do you recommend? Cool shit.
I used the Polycam app and the LiDAR sensor on an iPhone 12 Pro.