Advancing Embodied AI through Interactive Reasoning
Speaker:

Yixin Zhu
Assistant Professor, Peking University (PKU) Institute for AI
Date: 24 January 2024, Wednesday
Time: 10:30am – 11:30am
Venue: School of Computing & Information Systems 1 (SCIS 1), Level 5, Meeting Room 5-1, Singapore Management University, Singapore 178902
Please register by 23 January 2024.
We look forward to seeing you at this research seminar.

About the Talk
Embodied AI agents hold great promise for autonomously performing tasks in everyday human life. My research aims to integrate reasoning abilities into these agents. This talk will cover recent advances in embodied AI, focusing on interactive reasoning in scene understanding, motion sequence generation, and dexterous manipulation. In scene understanding, the approach centers on enabling autonomous, open-world interactions through multi-modal reasoning and affordance analysis. For motion sequence generation, novel representations have been developed to handle complex motion constraints effectively. In dexterous manipulation, integrating tactile feedback with manipulation policies has yielded significant improvements. Together, this body of work offers insights into the diverse aspects of embodied AI and their potential applications in real-world scenarios.
About the Speaker
Dr. Yixin Zhu received his Ph.D. (‘18) from UCLA, advised by Prof. Song-Chun Zhu. His research builds interactive AI by integrating high-level common sense (functionality, affordance, physics, causality, intent) with raw sensory inputs (pixels and haptic signals) to enable richer representations and cognitive reasoning about objects, scenes, shapes, numbers, and agents. Dr. Zhu directs the PKU CoRe Lab, which works on abstract reasoning, visually grounded reasoning, and interactive reasoning.