I've spent two summers at Meta Reality Labs, where I worked on neural rendering for 3D reconstruction and next-generation computational AR/VR displays.
I've also served as a reviewer for SIGGRAPH, SIGGRAPH Asia, NeurIPS, ISMAR, TIP, TCI, ICIP, and ICASSP.
When state-of-the-art neural rendering meets next-generation holographic VR displays: converting
optimized Gaussian splats to holograms that support natural focus cues.
The novel combination of a multisource laser array and a dynamic Fourier amplitude modulator significantly expands
the étendue of holographic VR displays and improves light field hologram image quality.
A near-eye display design that pairs inverse-designed metasurface waveguides with AI-driven holographic displays
to enable full-colour 3D augmented reality from a compact glasses-like form factor.
The inclusion of parallax cues in CGH rendering plays a crucial role in enhancing perceptual realism,
and we show this through a live demonstration of 4D light field holograms.
A novel light-efficiency loss function, AI-driven
CGH techniques, and camera-in-the-loop calibration greatly improve holographic projector
brightness and image quality.
We propose an image-to-image translation algorithm based on generative adversarial networks
that rectifies fisheye images without the need for paired training data.