Integrating Structural Information Enhances Protein Language Models for Diverse Downstream Tasks
Incorporating structural information through a simple, efficient, and scalable adapter architecture can significantly improve the performance and training efficiency of protein language models across diverse downstream tasks.
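The summary above describes an adapter-based design. As a rough illustration of the general pattern, the sketch below shows a bottleneck adapter that injects per-residue structural features into a language model's hidden states via a residual update. All names, dimensions, and design choices here are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a bottleneck "adapter" that injects structural
# features into a (frozen) protein language model's hidden states.
# Dimensions and the concatenation scheme are assumptions for illustration.
import math
import random

random.seed(0)

def linear(x, w, b):
    """Plain matrix-vector product: y = W x + b (pure Python, no deps)."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(w, b)]

def make_weights(n_out, n_in):
    """Small random weight matrix and zero bias."""
    scale = 1.0 / math.sqrt(n_in)
    w = [[random.uniform(-scale, scale) for _ in range(n_in)]
         for _ in range(n_out)]
    return w, [0.0] * n_out

class StructureAdapter:
    """Bottleneck adapter: down-project the concatenated PLM hidden state
    and structural features, apply a nonlinearity, up-project back to the
    model dimension, and add the result residually to the hidden state."""
    def __init__(self, d_model, d_struct, d_bottleneck):
        self.w_down, self.b_down = make_weights(d_bottleneck, d_model + d_struct)
        self.w_up, self.b_up = make_weights(d_model, d_bottleneck)

    def __call__(self, h, s):
        z = linear(h + s, self.w_down, self.b_down)   # down-projection
        z = [max(0.0, v) for v in z]                  # ReLU nonlinearity
        delta = linear(z, self.w_up, self.b_up)       # up-projection
        return [hi + di for hi, di in zip(h, delta)]  # residual update

# Per-residue example: an 8-d hidden state plus 4-d structural features.
adapter = StructureAdapter(d_model=8, d_struct=4, d_bottleneck=2)
h = [random.random() for _ in range(8)]   # PLM hidden state for one residue
s = [random.random() for _ in range(4)]   # structural features for that residue
out = adapter(h, s)
```

Because only the small adapter weights are trained while the base model stays frozen, this kind of design is cheap to train and scales to large backbones, which is consistent with the efficiency claim above.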