This paper surveys the state of quantum natural language processing, showing how established NLP techniques, including word embeddings, sequential models, attention, and grammatical parsing, have been adapted to quantum settings. It also introduces a new quantum design for the basic task of text encoding and discusses the challenge of distinguishing hypothetical from actual statements in language models.
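The survey's specific encoding design is not reproduced here; as a rough illustration of what "text encoding" means in this setting, below is a minimal sketch of amplitude encoding, a standard way to load a classical vector (here a bag-of-words count vector) into the amplitudes of a quantum state. The function name and toy vocabulary are assumptions for illustration only.

```python
import math

def amplitude_encode(counts):
    """Map a bag-of-words count vector to quantum-state amplitudes.

    Amplitude encoding stores a length-2^n vector in n qubits by
    normalizing it to unit L2 norm; each entry then becomes the
    amplitude of one computational basis state, so measurement
    probabilities are the squared normalized counts.
    """
    norm = math.sqrt(sum(c * c for c in counts))
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return [c / norm for c in counts]

# Toy vocabulary of 4 words -> a 2-qubit state.
counts = [3, 0, 4, 0]
amps = amplitude_encode(counts)
print(amps)  # [0.6, 0.0, 0.8, 0.0]
print(sum(a * a for a in amps))  # squared amplitudes sum to ~1
```

This exponential compression (2^n values in n qubits) is a common motivation for quantum encodings, though preparing such states efficiently on hardware is itself a nontrivial challenge.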
This paper describes experiments demonstrating that some natural language processing tasks can already be performed on quantum computers, though so far only with small datasets. The authors explore several tasks, including topic classification, bigram modeling, and ambiguity resolution, highlighting the potential of quantum computing for language processing while identifying the key challenges in scaling these techniques to larger, more realistic datasets.