EquiformerV2 is an Equivariant Transformer that outperforms previous state-of-the-art methods on the OC20 dataset. By scaling to higher degrees, it achieves improvements of up to 9% on forces and 4% on energies, offers better speed-accuracy trade-offs, and reduces the number of DFT calculations needed for computing adsorption energies by 2×. EquiformerV2 also shows better data efficiency than GemNet-OC when trained on only the OC22 dataset. The proposed architectural improvements include attention re-normalization, separable S² activation, and separable layer normalization; these changes enable EquiformerV2 to incorporate higher-degree tensors efficiently and significantly improve performance.
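The summary does not spell these operations out, but as a rough, hypothetical illustration, the PyTorch sketch below shows one way separable layer normalization can be realized: degree-0 (invariant) features are normalized as usual, while higher-degree (equivariant) features are only rescaled, never mean-shifted, so equivariance is preserved. The class name, tensor layout, and exact statistics are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SeparableLayerNorm(nn.Module):
    """Hypothetical sketch of separable layer normalization (SLN).

    Assumes node features are spherical-harmonic coefficients of shape
    (num_nodes, (lmax + 1) ** 2, num_channels). Degree-0 features get a
    standard LayerNorm; degree > 0 features are only rescaled by their
    RMS (no mean subtraction), which keeps the operation equivariant.
    """

    def __init__(self, num_channels: int, eps: float = 1e-5):
        super().__init__()
        self.norm_l0 = nn.LayerNorm(num_channels)
        # Learnable per-channel scale for the higher-degree components.
        self.scale = nn.Parameter(torch.ones(num_channels))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x_l0, x_hi = x[:, :1, :], x[:, 1:, :]  # split degree 0 / degrees > 0
        out_l0 = self.norm_l0(x_l0)
        # Single RMS statistic over all (l, m) entries and channels.
        rms = x_hi.pow(2).mean(dim=(1, 2), keepdim=True).add(self.eps).sqrt()
        out_hi = x_hi / rms * self.scale
        return torch.cat([out_l0, out_hi], dim=1)


# Example: lmax = 2 gives (2 + 1) ** 2 = 9 spherical-harmonic entries.
x = torch.randn(4, 9, 16)                # 4 nodes, 9 (l, m) entries, 16 channels
print(SeparableLayerNorm(16)(x).shape)   # torch.Size([4, 9, 16])
```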
Key ideas extracted from the paper by Yi-Lun Liao, ... at arxiv.org (03-08-2024)
https://arxiv.org/pdf/2306.12059.pdf