PhD Dissertation Defense by Ramesh Darshana Rathnayake KANATTA GAMAGE | Refining LiDAR-driven Depth Perception for Pervasive Spatial Computing


Refining LiDAR-driven Depth Perception for Pervasive Spatial Computing

Ramesh Darshana Rathnayake KANATTA GAMAGE

PhD Candidate
School of Computing and Information Systems
Singapore Management University
 

FULL PROFILE

Research Area

Dissertation Committee

Research Advisor
Committee Members
External Member
  • MA Dong, Associate Professor, Department of Computer Science and Technology, University of Cambridge
 

Date

12 March 2026 (Thursday)

Time

6:30pm - 7:30pm

Venue

Meeting Room 5.1, Level 5
School of Computing and Information Systems 1
Singapore Management University
80 Stamford Road
Singapore 178902

Please register by 10 March 2026.

We look forward to seeing you at this research seminar.

 

ABOUT THE TALK

Spatial computing has been gaining popularity due to advancements in sensing and reasoning capabilities over 3D spaces. Light Detection and Ranging (LiDAR) technology is a key enabler of this progress: thanks to increasing commercial availability and affordability, LiDAR systems are now widely integrated into devices ranging from mobile phones to domestic robots. Despite its highly accurate 3D reconstruction capabilities, LiDAR still faces major challenges, including high energy consumption, mutual interference, and hardware constraints that force trade-offs among range, frame rate, and resolution.

This thesis presents three systems that address these challenges: (a) D2SR, (b) NeuroLiDAR, and (c) MuLES. First, D2SR is a middleware layer for scanning LiDARs that mitigates mutual interference in environments where multiple LiDAR-equipped pervasive devices operate independently in close proximity. Second, NeuroLiDAR introduces a sensor-fusion framework that combines LiDAR with neuromorphic event cameras to overcome low frame-rate limitations, using event-driven keyframe activation and depth extrapolation to provide adaptive, high-frequency depth estimation. Finally, MuLES is a low-power multistatic LiDAR architecture that shifts energy-intensive emission to the infrastructure, while mobile devices use passive receivers to generate point clouds and estimate trajectories.

Together, these system-level innovations make LiDAR sensing for pervasive devices more efficient, robust, and scalable, enabling reliable LiDAR-driven spatial computing across diverse devices and environments.

 

SPEAKER BIOGRAPHY

Darshana Rathnayake is a Ph.D. candidate at Singapore Management University (SMU), under the supervision of Lee Kong Chian Professor Archan Misra. His research sits at the intersection of pervasive sensing and spatial computing, with a focus on improving the accuracy, efficiency, and robustness of LiDAR-based depth perception to enable reliable 3D understanding of physical environments. During his PhD, he spent time as a visiting research student with the Mens, Manus, Machina (M3S) program at the Singapore-MIT Alliance for Research and Technology (SMART), a research partnership between MIT and Singapore.