Analyzing Transformer Models for Natural Language Processing on Embedded Devices
The authors investigate how transformer language models perform on embedded devices, where tight resource budgets must be balanced against accuracy requirements, and explore the resulting trade-offs between model size, accuracy, and system resources.
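The size/latency side of that trade-off can be made concrete with a small benchmark. The sketch below is illustrative only and is not the paper's methodology: it assumes PyTorch is available on the target device, builds generic nn.TransformerEncoder stacks of hypothetical sizes, and reports fp32 parameter size alongside average CPU latency for a single forward pass.

```python
# Illustrative sketch only (not the authors' benchmark): compare model size
# against CPU inference latency for small transformer encoders, assuming
# PyTorch is available on the embedded target.
import time
import torch
import torch.nn as nn

def build_encoder(d_model: int, n_layers: int, n_heads: int) -> nn.Module:
    """Build a plain TransformerEncoder; configurations are hypothetical."""
    layer = nn.TransformerEncoderLayer(
        d_model=d_model, nhead=n_heads,
        dim_feedforward=4 * d_model, batch_first=True,
    )
    return nn.TransformerEncoder(layer, num_layers=n_layers)

def param_megabytes(model: nn.Module) -> float:
    """Approximate model size from the fp32 parameter count."""
    n_params = sum(p.numel() for p in model.parameters())
    return n_params * 4 / (1024 ** 2)

def mean_latency_ms(model: nn.Module, d_model: int,
                    seq_len: int = 128, runs: int = 20) -> float:
    """Average CPU latency of a single-sequence forward pass."""
    model.eval()
    x = torch.randn(1, seq_len, d_model)
    with torch.no_grad():
        model(x)  # warm-up pass
        start = time.perf_counter()
        for _ in range(runs):
            model(x)
    return (time.perf_counter() - start) / runs * 1000

if __name__ == "__main__":
    # Hypothetical configurations spanning "tiny" to "base"-like encoders.
    for d_model, n_layers in [(128, 2), (256, 4), (512, 6)]:
        m = build_encoder(d_model, n_layers, n_heads=4)
        print(f"d_model={d_model:4d} layers={n_layers} "
              f"size={param_megabytes(m):6.1f} MB "
              f"latency={mean_latency_ms(m, d_model):7.1f} ms")
```

Running such a sweep on the target hardware gives the size-versus-latency curve directly; the accuracy dimension of the trade-off would additionally require evaluating each configuration on a downstream task.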