Explore and Establish Synergistic Effects Between Weight Pruning and Coreset Selection in Neural Network Training
Published in AAAI, 2026
Modern deep neural networks require massive computational resources. Weight pruning and coreset selection are two effective paradigms for reducing these costs, but they are typically studied in isolation. This paper explores their interplay, revealing that redundant samples complicate weight pruning while irrelevant weights undermine coreset selection.
To harness this interplay, the authors propose Simultaneous Weight and Sample Tailoring (SWaST), a framework that alternates between weight pruning and coreset selection during training. They identify a failure mode termed “critical double-loss,” in which important weights and the samples that support them are removed in the same step, causing irreversible degradation. SWaST counters this with a state preservation mechanism, enabling stable joint optimization. Experiments show that, compared with applying the two techniques independently, SWaST achieves accuracy gains of up to 17.83% alongside FLOPs reductions of 10%–90%.
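The alternating structure described above can be illustrated with a toy sketch. This is not the authors' implementation: the model is a plain linear regressor, the pruning criterion (weight magnitude), the coreset criterion (per-sample loss), the thresholds, and the rollback rule standing in for state preservation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task; a linear model stands in for a network.
n_samples, n_features = 200, 16
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
y = X @ true_w + 0.1 * rng.normal(size=n_samples)

w = np.zeros(n_features)
weight_mask = np.ones(n_features, dtype=bool)  # False = pruned weight
active = np.ones(n_samples, dtype=bool)        # False = dropped sample

def per_sample_loss(w, X, y):
    """Squared error of each sample under the given weights."""
    return (X @ w - y) ** 2

for step in range(20):
    # 1. Gradient step on the current coreset with pruned weights.
    Xa, ya = X[active], y[active]
    grad = 2.0 * Xa.T @ (Xa @ (w * weight_mask) - ya) / len(ya)
    w -= 0.01 * grad * weight_mask

    # Snapshot the state before any removal, for the rollback below.
    snapshot = (w.copy(), weight_mask.copy(), active.copy())
    prev_loss = per_sample_loss(w * weight_mask, Xa, ya).mean()

    # 2. Alternate removal: weights on even steps, samples on odd steps
    #    (floors of 4 weights / 50 samples are arbitrary stopping points).
    if step % 2 == 0 and weight_mask.sum() > 4:
        # Magnitude pruning: drop the smallest surviving weights.
        thresh = np.quantile(np.abs(w[weight_mask]), 0.1)
        weight_mask &= np.abs(w) > thresh
    elif step % 2 == 1 and active.sum() > 50:
        # Coreset selection: drop the easiest (lowest-loss) samples.
        loss = per_sample_loss(w * weight_mask, X, y)
        active &= loss > np.quantile(loss[active], 0.1)

    # 3. State preservation (sketch): if removal sharply raised the loss,
    #    a symptom of "critical double-loss", restore the prior state.
    new_loss = per_sample_loss(w * weight_mask, X[active], y[active]).mean()
    if new_loss > 1.5 * prev_loss:
        w, weight_mask, active = snapshot
```

The key design point mirrored from the paper is that neither removal is final: because important weights and supportive samples can disappear in the same round, the loop keeps a snapshot of the pre-removal state and restores it when the joint removal proves harmful, rather than committing to every pruning decision.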
