
Research Seminar by YING Jiahao and TIAN Zichen

DATE:

26 November 2024, Tuesday

TIME:

3:00pm to 4:00pm

VENUE:

Meeting room 5.1, Level 5
School of Computing and Information Systems 1,
Singapore Management University,
80 Stamford Road,
Singapore 178902

Please register by 25 November 2024

 

There are two talks in this session, each approximately 30 minutes long. Both are pre-conference talks for the Thirty-Eighth Annual Conference on Neural Information Processing Systems (NeurIPS 2024).

 

About the Talks

 

Talk #1: Automating Dataset Updates Towards Reliable and Timely Evaluation of Large Language Models
by YING Jiahao, PhD Candidate

Large language models (LLMs) have achieved impressive performance across various natural language benchmarks, prompting a continual need to curate more difficult datasets for larger LLMs, which is costly and time-consuming. In this paper, we propose to automate dataset updating and provide a systematic analysis of its effectiveness in dealing with the benchmark leakage issue, difficulty control, and stability. Thus, once the current benchmark has been mastered or leaked, we can update it for timely and reliable evaluation. There are two updating strategies: 1) a mimicking strategy that generates similar samples based on the original data, preserving their stylistic and contextual essence, and 2) an extending strategy that further expands existing samples at varying cognitive levels by adapting Bloom's taxonomy of educational objectives. Extensive experiments on updated MMLU and BIG-Bench demonstrate the stability of the proposed strategies and show that the mimicking strategy can effectively alleviate the overestimation caused by benchmark leakage. In cases where the efficient mimicking strategy fails, our extending strategy still shows promising results. Additionally, by controlling the difficulty, we can better discern the models' performance and enable fine-grained analysis; neither too difficult nor too easy an exam can fairly judge students' learning status. To the best of our knowledge, we are the first to automate updating benchmarks for reliable and timely evaluation. Our demo leaderboard can be found at https://yingjiahao14.github.io/Automating-DatasetUpdates/.
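As a rough illustration of the two updating strategies described in the abstract (this is not the speakers' implementation), the following Python sketch shows how mimicking and extending could be framed as prompts to a text generator. The llm callable and the exact prompt wording are assumptions made for illustration only.

# Illustrative sketch only: assumes a hypothetical llm(prompt) -> str callable.
from typing import Callable, List

BLOOM_LEVELS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

def mimic_sample(llm: Callable[[str], str], question: str) -> str:
    """Mimicking strategy: generate a new sample in the same style and
    context as the original question, but with different content."""
    prompt = ("Write a new benchmark question in the same style, topic, and "
              f"difficulty as the following, but with different content:\n{question}")
    return llm(prompt)

def extend_sample(llm: Callable[[str], str], question: str, level: str) -> str:
    """Extending strategy: rewrite the sample at a target cognitive level
    from Bloom's taxonomy to control difficulty."""
    prompt = ("Rewrite the following benchmark question so that answering it "
              f"requires the '{level}' level of Bloom's taxonomy:\n{question}")
    return llm(prompt)

def update_benchmark(llm, questions: List[str], strategy: str = "mimic") -> List[str]:
    """Produce an updated benchmark from the existing questions."""
    if strategy == "mimic":
        return [mimic_sample(llm, q) for q in questions]
    return [extend_sample(llm, q, level) for q in questions for level in BLOOM_LEVELS]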

Talk #2: Learning De-Biased Representations for Remote-Sensing Imagery
by TIAN Zichen, PhD Student

Remote sensing (RS) imagery, which requires specialised satellites to collect and is difficult to annotate, suffers from data scarcity and class imbalance. Due to data scarcity in certain spectra, training large-scale RS models from scratch is unrealistic, and the alternative is to transfer pre-trained models by fine-tuning or a more data-efficient method, LoRA. Due to class imbalance, transferred models show strong bias, where features of the major classes dominate over those of the minor classes. We propose debLoRA, a generic training approach that works with LoRA variants to yield debiased features. It is an unsupervised approach that diversifies minor-class features based on attributes shared with major classes, where the attributes are obtained by a simple clustering step. To evaluate it, we conduct extensive experiments in two transfer learning scenarios: from natural to optical RS images, and from optical RS to multi-spectrum RS images. We perform object classification and oriented object detection on the optical RS dataset and the SAR dataset. Results show that debLoRA consistently surpasses prior art across these transfer scenarios, yielding gains of up to 3.3 and 4.7 percentage points on tail classes for natural → optical RS and optical RS → multi-spectrum RS adaptation, respectively, while preserving head class performance.
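As a rough illustration of the clustering idea in the abstract (this is not the speakers' debLoRA code), the following Python sketch diversifies minor-class features by blending them toward centroids from an unsupervised clustering of all features, so that attributes shared with major classes are mixed in. The function name, the blending rule, and the parameter values are assumptions for illustration only.

# Illustrative sketch only, not the debLoRA implementation.
import numpy as np
from sklearn.cluster import KMeans

def debias_features(features: np.ndarray,
                    labels: np.ndarray,
                    minor_classes: set,
                    n_clusters: int = 16,
                    alpha: float = 0.5) -> np.ndarray:
    """Cluster all features without using labels, then pull each minor-class
    feature toward its cluster centroid, which mixes in attributes shared
    with the major classes that dominate the same cluster."""
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    assignments = kmeans.fit_predict(features)
    centroids = kmeans.cluster_centers_

    debiased = features.copy()
    for i, (label, cluster_id) in enumerate(zip(labels, assignments)):
        if label in minor_classes:
            # Blend the minor-class feature with its (class-agnostic) centroid.
            debiased[i] = (1 - alpha) * features[i] + alpha * centroids[cluster_id]
    return debiased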

 

 

About the Speakers

 

 

Jiahao YING is a PhD candidate in Computer Science at the School of Computing and Information Systems, SMU, under the supervision of Assistant Professor SUN Qianru and external co-supervisor Cao Yixin. His research interests are in LLM evaluation and improvement.

 
 

Zichen TIAN is a PhD student in Computer Science at the School of Computing and Information Systems, SMU, under the supervision of Assistant Professor SUN Qianru. His research interests are in parameter-efficient transfer learning and data-imbalanced tasks.