Tailoring Transformer-based Deep Learning for Code Generation and Translation

Imam Nur Bani YUSUF
PhD Candidate
School of Computing and Information Systems
Singapore Management University

Dissertation Committee
Research Advisor: Lingxiao JIANG
External Member: Julia LAWALL, Senior Research Scientist, Inria Paris Centre
Date: 22 May 2025 (Thursday)
Time: 2:00pm - 3:00pm
Venue: Meeting Room 5.1, Level 5, School of Computing and Information Systems 1, Singapore Management University, 80 Stamford Road, Singapore 178902

Please register by 20 May 2025. We look forward to seeing you at this research seminar.
ABOUT THE TALK

Software is increasingly pervasive in modern society, making the effective translation of human intent into code essential. Novice programmers often struggle with domain-specific code due to limited background knowledge, while experienced developers face challenges in maintaining evolving large-scale codebases. Traditional pattern-based approaches can address these issues, but they are task-specific and require significant adaptation for each new task. Transformer-based models offer a more flexible alternative, as the same architecture can be tailored for diverse programming tasks.
This dissertation investigates how Transformer-based models can be customized for various code generation and translation tasks. First, it introduces Transformer-based approaches that assist end-users with limited domain-specific knowledge in writing trigger-action and Arduino programs. Second, it addresses the automation of code evolution in large codebases, a process that is often time-consuming and error-prone when done manually. This includes an empirical study on deep learning models for generating Linux kernel semantic patches, followed by the development of a dual learning framework that improves how Transformer-based models learn code-to-code transformation patterns from change examples. Finally, this dissertation presents an efficient method that leverages graph modality to enhance the adaptability of Transformer-based models across different code generation and translation tasks.
These contributions demonstrate the versatility of Transformer-based models in code generation and translation, reducing barriers for novices while enhancing productivity for experienced developers. The findings open new opportunities for broader applications in software engineering.

SPEAKER BIOGRAPHY

Imam is a fifth-year PhD candidate in Computer Science at the School of Computing and Information Systems. He conducts research at the intersection of language models and software engineering under the supervision of Professor Lingxiao Jiang. Specifically, his research focuses on leveraging language models to improve developer efficiency in software engineering tasks. Before joining SMU, he earned his bachelor's degree in Telecommunication Engineering from the Institut Teknologi Bandung. In his leisure time, Imam enjoys expanding his horizons through audiobooks, staying active through running, and exploring new cultures through travel.