Key Concepts
EquiformerV2 improves performance on large-scale datasets by incorporating higher-degree representations and architectural improvements.
Summary
EquiformerV2 is an Equivariant Transformer that outperforms previous methods on the large-scale OC20 dataset. By scaling to higher degrees, it improves force predictions by up to 9% and energy predictions by up to 4%. The model offers better speed-accuracy trade-offs and reduces the number of DFT calculations needed for computing adsorption energies by 2×. EquiformerV2 also shows better data efficiency than GemNet-OC when trained on only the OC22 dataset. The proposed architectural improvements include attention re-normalization, separable S² activation, and separable layer normalization. These enhancements enable EquiformerV2 to incorporate higher-degree tensors efficiently and to improve performance significantly.
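To illustrate the kind of degree-separated normalization the summary mentions, here is a minimal sketch of "separable" layer normalization for equivariant features. It assumes features are stored as a dict mapping degree l to a list of channels, each channel holding 2l+1 spherical components; the exact formulation in EquiformerV2 differs, so treat the statistics and the data layout here as illustrative assumptions.

```python
import math

def separable_layer_norm(feats, eps=1e-6):
    """Illustrative sketch, not the paper's exact formulation.
    Degree-0 (scalar) channels get standard layer-norm statistics
    computed over channels; higher-degree channels are only rescaled
    by the RMS of their per-channel norms, which preserves each
    vector's direction and hence equivariance."""
    out = {}
    for l, channels in feats.items():
        if l == 0:
            # Scalars: subtract mean and divide by std over channels.
            scalars = [c[0] for c in channels]
            mean = sum(scalars) / len(scalars)
            var = sum((s - mean) ** 2 for s in scalars) / len(scalars)
            out[l] = [[(s - mean) / math.sqrt(var + eps)] for s in scalars]
        else:
            # Higher degrees: rescale by the RMS of per-channel L2 norms,
            # taken over the 2l+1 components of each channel.
            norms = [math.sqrt(sum(x * x for x in c)) for c in channels]
            rms = math.sqrt(sum(n * n for n in norms) / len(norms) + eps)
            out[l] = [[x / rms for x in c] for c in channels]
    return out
```

The key design point is that mean subtraction is applied only to degree-0 features: shifting a higher-degree vector would change its direction and break rotational equivariance, so those features may only be rescaled.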
Statistics
EquiformerV2 outperforms previous methods on the large-scale OC20 dataset by up to 9% on forces and 4% on energies.
EquiformerV2 offers a 2× reduction in DFT calculations needed for computing adsorption energies.
EquiformerV2 trained on only OC22 dataset outperforms GemNet-OC trained on both OC20 and OC22 datasets.
Quotes
"EquiformerV2 outperforms previous state-of-the-art methods with improvements of up to 9% on forces and 4% on energies."
"Putting this all together, we propose EquiformerV2, which is developed on large and diverse OC20 dataset."
"Additionally, when used in the AdsorbML algorithm for performing adsorption energy calculations, EquiformerV2 achieves the highest success rate."