Efficient Structural Pruning of Pre-trained Language Models via Multi-Objective Neural Architecture Search
Neural architecture search can identify sub-networks of pre-trained language models that balance model efficiency against generalization performance.
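The multi-objective selection step this describes can be illustrated with a toy sketch: given candidate sub-networks scored on two competing objectives (parameter count to minimize, validation accuracy to maximize), keep only the Pareto-optimal ones. The candidate names and scores below are invented for illustration; a real search would obtain them by evaluating pruned sub-networks.

```python
def pareto_front(candidates):
    """Return the candidates not dominated on (params, accuracy).

    A candidate is dominated if some other candidate has no more
    parameters, no less accuracy, and is strictly better on at
    least one of the two objectives.
    """
    front = []
    for name, params, acc in candidates:
        dominated = any(
            p2 <= params and a2 >= acc and (p2 < params or a2 > acc)
            for _, p2, a2 in candidates
        )
        if not dominated:
            front.append((name, params, acc))
    return front

# Hypothetical (name, parameter count, validation accuracy) triples.
candidates = [
    ("full model", 110e6, 0.92),
    ("sub-net A",   60e6, 0.91),
    ("sub-net B",   60e6, 0.88),  # dominated by sub-net A
    ("sub-net C",   30e6, 0.86),
]

for name, params, acc in pareto_front(candidates):
    print(f"{name}: {params / 1e6:.0f}M params, acc={acc:.2f}")
```

Here sub-net B is discarded because sub-net A matches its size with higher accuracy; the remaining three points form the efficiency/accuracy trade-off curve a practitioner would choose from.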