The content introduces AraPoemBERT, a BERT-based language model pretrained on Arabic poetry text. It outperforms other models on tasks such as poet's gender classification and poetry sub-meter classification. The dataset used contains over 2.09 million verses, each associated with attributes such as meter, sub-meter, poet, rhyme, and topic. AraPoemBERT proves effective at understanding and analyzing Arabic poetry.
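For readers who want to experiment with a model of this kind, the sketch below shows how a BERT-style checkpoint could be loaded for verse-level classification with the Hugging Face transformers library. The checkpoint name, label count, and example verse are placeholders chosen for illustration, not details taken from the paper.

```python
# Minimal sketch: classifying an Arabic verse with a BERT-style model.
# "author/arapoembert" is a hypothetical checkpoint name; substitute the
# actual released model identifier if one is available.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "author/arapoembert"  # placeholder checkpoint name
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# num_labels=2 assumes a binary task such as poet's gender classification
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

verse = "..."  # an Arabic verse to classify
inputs = tokenizer(verse, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()  # index of the predicted label
print(predicted_class)
```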
Source: Faisal Qarah, arXiv, 2024-03-20, https://arxiv.org/pdf/2403.12392.pdf