News

Two Papers Accepted in Machine Learning

Congratulations to Dr Pan Yuangang, Early Career Principal Investigator (PI), and Dr Shi Yaxin, Scientist, on the acceptance of their latest papers in Machine Learning, a prestigious journal dedicated to advancing computational approaches to learning.

Dr Pan’s paper was accepted through the journal track of the 17th Asian Conference on Machine Learning (ACML 2025), while Dr Shi’s paper was accepted via the journal track of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECMLPKDD) 2025.


Dr Pan Yuangang

Early Career PI
Senior Scientist
A*STAR CFAR

Auto-clustering with Continuous Distribution Estimation on Centroids
Yuangang Pan, Yinghua Yao, Atsushi Nitanda, Joey Tianyi Zhou, Ivor Tsang

Determining the true number of clusters is a long-standing challenge in unsupervised learning. Traditional probabilistic and deep clustering methods typically rely on discrete centroid distributions and complex inference schemes such as MCMC or variational inference, making them difficult to scale and ineffective for high-dimensional text and image data.

The Auto-ClusTering (ACT) framework addresses these limitations through a continuous centroid distribution that removes the need to predefine cluster numbers. Supported by rigorous theoretical insights, its fit-then-prune strategy efficiently eliminates redundant centroids and naturally recovers a cluster structure close to the ground truth.

With its simple yet powerful gradient-based inference, ACT integrates seamlessly with deep representation learning and scales effectively across diverse data modalities. Extensive experiments demonstrate that ACT not only infers cluster numbers accurately but also achieves superior clustering performance compared to both parametric and nonparametric baselines. This work showcases A*STAR CFAR’s leadership in developing scalable and principled nonparametric clustering methods for real-world AI applications.
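The fit-then-prune idea can be illustrated with a minimal toy sketch: over-provision the centroids, fit them, then drop under-used centroids and merge near-duplicates. This is a conceptual illustration only, not the ACT algorithm itself; the function name, Lloyd-style updates, and thresholds below are all assumptions made for the example.

```python
import numpy as np

def fit_then_prune(X, k_max=9, iters=30, min_frac=0.02, merge_tol=2.0):
    """Toy 'fit-then-prune' clustering: over-provision centroids, fit them
    with Lloyd-style updates, then drop under-used centroids and greedily
    merge near-duplicates. A conceptual sketch, not ACT itself."""
    # Deterministic over-provisioned init: evenly strided data points.
    C = X[:: max(1, len(X) // k_max)][:k_max].astype(float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        for j in range(len(C)):
            pts = X[assign == j]
            if len(pts):
                C[j] = pts.mean(axis=0)
    # Prune: drop centroids that attract too small a fraction of the data.
    d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
    counts = np.bincount(d.argmin(axis=1), minlength=len(C))
    C = C[counts >= min_frac * len(X)]
    # Merge: keep a centroid only if it is far from all centroids kept so far.
    kept = []
    for c in C:
        if all(np.linalg.norm(c - k) > merge_tol for k in kept):
            kept.append(c)
    return np.array(kept)
```

On well-separated data, starting from nine centroids the pruning and merging steps collapse redundant centroids, recovering the underlying cluster count without it being specified in advance.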

Dr Shi Yaxin
Scientist
A*STAR CFAR

Uncover and Unlearn Nuisances: Agnostic Fully Test-Time Adaptation
Ponhvoan Srey, Yaxin Shi, Hangwei Qian, Jing Li, and Ivor W. Tsang

Fully Test-Time Adaptation (FTTA) addresses domain shifts without access to source data or the original training protocols of pre-trained models. Traditional strategies that align source and target feature distributions are infeasible in FTTA due to the lack of training data and the unpredictability of target domains.

The Test-time Invariant Representation learning through Nuisance Unlearning (TIRNU) framework introduces a new paradigm that addresses these challenges, enabling Agnostic Fully Test-Time Adaptation (AFTTA). By systematically uncovering and unlearning nuisances, TIRNU allows AI models to adapt autonomously to unseen domains without access to source data or labels, achieving robust, invariant understanding under real-world uncertainty.
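TIRNU's own procedure is detailed in the paper. As context for what "adapting with only unlabeled test batches" means, a widely used FTTA baseline is Tent-style entropy minimization, which sharpens a pre-trained classifier's predictions at test time. The sketch below implements that baseline (not TIRNU) for a plain linear classifier, with the entropy gradient derived analytically; the function names and the choice to adapt only the bias are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    """Numerically stable row-wise softmax."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_minimization_step(W, b, X, lr=0.2):
    """One Tent-style FTTA step: lower mean prediction entropy on an
    unlabeled test batch by updating only the classifier bias b.
    Uses the analytic gradient dH/dz_k = -p_k (log p_k + H)."""
    p = softmax(X @ W + b)
    logp = np.log(p + 1e-12)
    H = -(p * logp).sum(axis=1, keepdims=True)  # per-sample entropy
    grad_logits = -p * (logp + H)               # gradient of H w.r.t. logits
    b_new = b - lr * grad_logits.mean(axis=0)   # descend on mean entropy
    return b_new, float(H.mean())
```

Repeating this step over incoming test batches drives predictions toward higher confidence without any source data or labels, which is the setting AFTTA methods such as TIRNU operate in.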

At its core, this research reflects a broader principle: just as human cognition advances through continual refinement, distilling signal from noise and shedding accumulated bias, intelligent systems can likewise evolve to meet unforeseen circumstances by uncovering noise and unlearning bias. This marks a step toward truly self-evolving artificial general intelligence.

  • Read the full paper here.

>> Learn more about ACML 2025 and ECMLPKDD 2025.