
Multi-view 3D reconstruction and mapping (showreel)

Dense mapping, nvblox-style fusion, and Isaac task environments for navigation and manipulation.


Capabilities

This compilation spans dense mapping and scene understanding from multiple cameras, volumetric and surfel-style mapping (including nvblox-style TSDF pipelines), and fusion of depth and color imagery for both navigation and manipulation. The work also ties into simulation arenas such as Isaac Lab and kitchen-style task environments, so reconstructed scenes and synthetic scenes can feed the same downstream stacks.
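To make the nvblox-style volumetric fusion concrete, here is a minimal TSDF (truncated signed distance field) integration sketch in NumPy. It is an illustration under assumptions, not the production pipeline: the grid layout, truncation distance, and the `integrate_depth` function name are hypothetical, and a real system would fuse many frames on the GPU with color and space carving.

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, K, T_wc, voxel_size, trunc):
    """Fuse one depth image (meters) into a TSDF voxel grid (nvblox-style).

    tsdf, weights : (nx, ny, nz) arrays, updated in place and returned.
    K             : 3x3 camera intrinsics; T_wc : 4x4 camera-to-world pose.
    """
    nx, ny, nz = tsdf.shape
    # World coordinates of every voxel center (grid origin at the world origin)
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    pts_w = np.stack([ii, jj, kk], axis=-1).reshape(-1, 3) * voxel_size
    # Transform voxel centers into the camera frame
    T_cw = np.linalg.inv(T_wc)
    pts_c = pts_w @ T_cw[:3, :3].T + T_cw[:3, 3]
    z = pts_c[:, 2]
    zs = np.where(z > 1e-9, z, np.inf)  # guard the projective division
    # Project into the image and look up the observed depth
    u = np.round(K[0, 0] * pts_c[:, 0] / zs + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts_c[:, 1] / zs + K[1, 2]).astype(int)
    h, w = depth.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]
    # Signed distance along the ray, truncated and normalized to [-1, 1]
    sdf = d - z
    keep = valid & (d > 0) & (sdf > -trunc)
    tsdf_new = np.clip(sdf / trunc, -1.0, 1.0)
    # Weighted running average per voxel (views into the original grids)
    flat, wflat = tsdf.reshape(-1), weights.reshape(-1)
    flat[keep] = (flat[keep] * wflat[keep] + tsdf_new[keep]) / (wflat[keep] + 1)
    wflat[keep] += 1
    return tsdf, weights
```

A mesh or occupancy layer for navigation would then be extracted from the zero crossing of the fused grid (e.g. via marching cubes).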

How we demonstrate it

We show the work through side-by-side clips and stills from several engagements: multi-camera reconstruction runs, mapping visualizations, Isaac and Omniverse environment shots, and exports such as point clouds or generated assets that feed perception and motion planning.
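The point-cloud exports mentioned above boil down to back-projecting each depth pixel through the camera intrinsics and attaching its color. Here is a minimal sketch, assuming metric depth and an N x 6 (x, y, z, r, g, b) layout; the function names and the ASCII PLY writer are illustrative choices, not the actual export path used in these projects.

```python
import numpy as np

def depth_to_pointcloud(depth, color, K):
    """Back-project a depth image (meters) + RGB image into an N x 6 cloud."""
    h, w = depth.shape
    v, u = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    z = depth.reshape(-1)
    u, v = u.reshape(-1), v.reshape(-1)
    valid = z > 0  # drop missing-depth pixels
    # Pinhole back-projection: x = (u - cx) * z / fx, y = (v - cy) * z / fy
    x = (u[valid] - K[0, 2]) * z[valid] / K[0, 0]
    y = (v[valid] - K[1, 2]) * z[valid] / K[1, 1]
    pts = np.stack([x, y, z[valid]], axis=-1)
    rgb = color.reshape(-1, 3)[valid]
    return np.hstack([pts, rgb])

def save_ply(path, cloud):
    """Write an N x 6 (x, y, z, r, g, b) array as an ASCII PLY file."""
    header = [
        "ply", "format ascii 1.0",
        f"element vertex {len(cloud)}",
        "property float x", "property float y", "property float z",
        "property uchar red", "property uchar green", "property uchar blue",
        "end_header",
    ]
    with open(path, "w") as f:
        f.write("\n".join(header) + "\n")
        for x, y, z, r, g, b in cloud:
            f.write(f"{x} {y} {z} {int(r)} {int(g)} {int(b)}\n")
```

Clouds in this form load directly into common viewers and downstream tools, which is what makes them a convenient hand-off between mapping and planning.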

Highlights

Together these artifacts document a repeatable path from sensors to maps to robot-ready scene representations for navigation and manipulation stacks.

Technologies

ROS 2 · NVIDIA Isaac · Omniverse · Depth cameras · Point clouds · 3D reconstruction · Navigation

Gallery

Project media: showreel clips and stills of multi-view 3D reconstruction and mapping.