Generative Pretrained Structured Transformers (GPST) is an unsupervised syntactic language model that overcomes the limitations of previous syntactic language models by pre-training on raw text with high parallelism. GPST outperforms GPT-2 on a variety of tasks, demonstrating its potential as a foundational architecture for large language models.