So I threw an obj file with 16m vectors/faces at PBR for fun, and ended up spending today's hack time getting more familiar with golang's memory profiling.
There's a large amount of GC (space) overhead as it builds things, but let's put that aside for now: fixing the GC usage still doesn't allow loading the file, and I can PR those fixes without discussion.
surface.Triangle takes up quite a bit of memory: 11 vectors + material + bounds. In my test file, normals and textures don't even exist for these particular triangles.
I considered turning Triangle.Normals/Textures into pointers (which would slow down processing), or maybe something like a SimpleTriangle and fiddling with interfaces where normals/textures actually exist... but I noticed that in every obj file I've looked at, all of the numbers are float32-safe.
Do you know if this is true more broadly? It makes sense for a lot of the internal numbers to be 64-bit, especially the samples, which can be large averaged values, but using 32 bits for the underlying data where possible could be a huge savings (and a potential performance boost, since it should reduce memory bandwidth).
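One quick way to sanity-check the "float32-safe" claim against real data is to measure the relative error of a float64-parsed obj value after a round trip through float32. Obj files typically print coordinates with ~6 significant digits, inside float32's ~7-digit precision, so the error should stay below about 6e-8. A minimal check:

```go
package main

import (
	"fmt"
	"math"
)

// roundTripErr reports the relative error introduced by storing v
// (parsed as float64 from an .obj file) in a float32.
func roundTripErr(v float64) float64 {
	if v == 0 {
		return 0
	}
	return math.Abs(v-float64(float32(v))) / math.Abs(v)
}

func main() {
	// Sample values shaped like typical .obj vertex coordinates.
	for _, v := range []float64{1.234567, -0.000123, 12345.6} {
		fmt.Printf("%g -> rel err %.2g\n", v, roundTripErr(v))
	}
}
```

Running this over a whole obj file's vertex data would tell you concretely whether float32 loses anything that matters for a given model.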
Sadly, it's not an easy change. For shiggles I started switching 64s out for 32s, but there's a lot of it, and preserving 64 where it should stay 64 might require adding a vec64 structure, since vec is nice and cleanly reused everywhere :)
Also, sadly, it looks pretty fiddly in Go to make 32 vs 64 a compile-time option :( I haven't read through all the possibilities yet.
At any rate, I'm wondering if you've had any thoughts along these lines? I can see via the googles that it's a big area of discussion, with some notes that GPU acceleration is better at 32-bit.
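For the compile-time option, one approach (a sketch, not a worked-out plan) is a type alias plus build tags: put `type Float = float64` in a file tagged `//go:build !f32` and `type Float = float32` in one tagged `//go:build f32` (older Go uses the `// +build` form), then write everything else against `Float`. Collapsed into one runnable file, with hypothetical `Float`/`Vector` names:

```go
package main

import "fmt"

// Float is the scene-data scalar. With build tags you'd split this
// alias into float64.go (`//go:build !f32`) and float32.go
// (`//go:build f32`); everything else depends only on Float.
type Float = float64

// Vector uses Float so its precision follows the build tag.
type Vector struct{ X, Y, Z Float }

// Dot is written once and works at either precision.
func (v Vector) Dot(w Vector) Float {
	return v.X*w.X + v.Y*w.Y + v.Z*w.Z
}

func main() {
	fmt.Println(Vector{1, 2, 3}.Dot(Vector{4, 5, 6})) // 32
}
```

The downside is exactly the vec64 problem above: anything that must stay 64-bit (samples, accumulators) needs its own concrete type rather than the alias.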
thanks!