Inspired by Neural Radiance Fields (NeRF)
- Move each object's geometry to its origin, centred on its bounds:
  bpy.ops.object.origin_set(type='GEOMETRY_ORIGIN', center='BOUNDS')
- Normalise the scale in Blender (ref).
- After normalising vertices to the -1 to +1 range, scale each component by 0.55 so the bounding cube fits inside the unit sphere with a margin (a cube corner lies √3 ≈ 1.732 from the centre, so the exact fit factor is 1/√3 ≈ 0.577).
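The normalisation above can be sketched in NumPy (a minimal sketch; the function name is illustrative and vertices are assumed to arrive as an (N, 3) array rather than via the Blender API):

```python
import numpy as np

# Minimal sketch of the normalisation step. The 0.55 factor is the margin
# from the notes; 1/sqrt(3) ~= 0.577 would be the exact factor at which a
# cube corner touches the unit sphere.
def normalise_to_unit_sphere(verts, margin=0.55):
    lo, hi = verts.min(axis=0), verts.max(axis=0)
    centred = verts - (lo + hi) / 2.0           # origin at bounds centre
    scaled = centred / np.abs(centred).max()    # components in [-1, +1]
    return scaled * margin                      # everything inside the sphere

verts = np.array([[0.0, 0.0, 0.0], [2.0, 4.0, 6.0], [2.0, 0.0, 3.0]])
out = normalise_to_unit_sphere(verts)
assert np.linalg.norm(out, axis=1).max() <= 1.0  # all inside the unit sphere
```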
- Convert to PLY with textures projected to Vertex Colors.
- Meshes where more than a threshold percentage of normals point inwards (not outwards from the unit sphere's centre) are bad.
- Meshes where more than a threshold percentage of vertices lie within a given fraction of the unit sphere's radius from the centre are bad.
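The two rejection tests above can be sketched as follows (the threshold values are placeholders, since the notes leave the exact percentages open, and the function name is illustrative):

```python
import numpy as np

def is_bad_mesh(verts, normals, inward_frac=0.1, centre_dist=0.05,
                near_frac=0.01):
    """Return True if the mesh fails either sanity check."""
    # Fraction of normals pointing inwards: a normal points outwards when
    # it agrees with the centre-to-vertex direction (positive dot product).
    inward = (np.sum(verts * normals, axis=1) < 0.0).mean()
    # Fraction of vertices suspiciously close to the sphere centre.
    near_centre = (np.linalg.norm(verts, axis=1) < centre_dist).mean()
    return bool(inward > inward_frac or near_centre > near_frac)

# Points on a sphere of radius 0.5 with outward normals pass both checks;
# flipping the normals makes every one point inwards, failing the mesh.
pts = np.array([[0.5, 0, 0], [0, 0.5, 0], [0, 0, 0.5], [-0.5, 0, 0]], float)
nrm = pts / np.linalg.norm(pts, axis=1, keepdims=True)
good = is_bad_mesh(pts, nrm)    # False
bad = is_bad_mesh(pts, -nrm)    # True
```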
- Train a network on an input unit vector and a position; the position is a single parameter representing a normalised 0-10,241 vertex index on an x6-subdivided unit icosphere (10,242 vertices). (4 components + a category feature vector.)
- Reverse-trace a ray from each vertex position along the vertex normal direction to the nearest aliased point on the bounding unit icosphere.
- Take in a category-based feature vector, e.g. (dog, cat, horse, pig, sheep, cow), with a 0-1 value for each category; such a network allows generating hybrids between the six four-legged animals.
- [optional] Add an input parameter encoding the index of the model currently being trained; it can also serve as a random seed.
- The network outputs a vector position and an rgb color (6 components).
- The network is executed once per ray to generate a point cloud; the ray starting position can be anywhere on the x6-subdivided unit icosphere (up to 10,242 uniformly spaced starting positions, with unlimited ray angles from each).
- The point cloud can be meshed using Ball Pivoting, Marching Cubes, or DMTet.
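The per-ray inference described above can be sketched with a plain NumPy MLP (the layer sizes, random weights, and six-animal category vector are illustrative assumptions, not values from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

N_ICO = 10_242  # vertices of a Blender icosphere at 6 subdivisions

def mlp(x, weights):
    """Feed-forward pass with ReLU on the hidden layers."""
    for i, (w, b) in enumerate(weights):
        x = x @ w + b
        if i < len(weights) - 1:
            x = np.maximum(x, 0.0)
    return x

d_in = 3 + 1 + 6           # ray direction + normalised start index + category
sizes = [d_in, 64, 64, 6]  # output: xyz position + rgb colour
weights = [(rng.normal(0, 0.1, (a, b)), np.zeros(b))
           for a, b in zip(sizes[:-1], sizes[1:])]

def sample_point(start_index, ray_dir, category):
    """One network execution per ray: returns (position, rgb)."""
    x = np.concatenate([ray_dir, [start_index / (N_ICO - 1)], category])
    out = mlp(x, weights)
    return out[:3], out[3:]

cat = np.array([1, 0, 0, 0, 0, 0], float)  # pure "dog" category
pos, rgb = sample_point(0, np.array([0.0, 0.0, -1.0]), cat)
```

Looping `sample_point` over many starting indices and ray angles yields the point cloud that the meshing step (Ball Pivoting, Marching Cubes, or DMTet) then reconstructs.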
Result: Fast pre-trained inference models for specialised 3D mesh generation.
- Fast training process.
- Fast inference process.
- Smaller training data file size and memory usage.
- CPU inference is fast because the feed-forward MLP maps well onto fused multiply-add (FMA) instructions.
- Trains on actual vertex data rather than rendered views (images) of a 3D mesh, as a NeRF does.