Quantum Self-Attention Neural Networks for Text Classification

Abstract

An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in Quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a new, simple network architecture, called the quantum self-attention neural network (QSANN), which can overcome these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and then utilize a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, our QSANN outperforms the best existing QNLP model based on syntactic analysis, as well as a simple classical self-attention neural network, in numerical experiments on text classification tasks over public data sets. We further show that our method is robust to low-level quantum noise.
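To give a rough feel for the idea, below is a minimal classical sketch of a Gaussian-style self-attention layer, assuming the scalar query/key features stand in for expectation values one would measure from parameterized quantum circuits. The function name, the scalar projections, and the specific Gaussian form are illustrative assumptions for this sketch, not the authors' actual QSANN implementation.

```python
import numpy as np

def gaussian_self_attention(q, k, v):
    """Classical sketch of a Gaussian-projected self-attention layer.

    q, k : (n,) arrays of scalar query/key features (stand-ins for
           measured outputs of parameterized quantum circuits).
    v    : (n, d) array of value vectors.
    Returns the (n, d) array of attention-weighted outputs.
    """
    # Gaussian (RBF-style) attention scores: large when the query and key
    # projections are close, decaying with their squared distance.
    scores = np.exp(-(q[:, None] - k[None, :]) ** 2)       # shape (n, n)
    weights = scores / scores.sum(axis=1, keepdims=True)   # row-normalize
    return weights @ v                                      # weighted sum of values

# Toy usage: 4 tokens with scalar query/key projections and 3-dim values.
rng = np.random.default_rng(0)
q = rng.normal(size=4)
k = rng.normal(size=4)
v = rng.normal(size=(4, 3))
print(gaussian_self_attention(q, k, v))
```

Unlike the dot-product scores of standard self-attention, a Gaussian of the query-key difference stays meaningful when each token is summarized by measurement outcomes of a quantum circuit, which is the intuition behind the quantum version described in the abstract.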

Publication
arXiv preprint arXiv:2205.05625
Guangxi Li
Research Associate (incoming March 2024)

I obtained my BS and MS in Computer Science from the University of Electronic Science and Technology of China, and my PhD in Computer Science from the University of Technology Sydney. My research interests include quantum computing and quantum machine learning.

Xin Wang
Associate Professor

The main focus of my research is to better understand the limits of information processing with quantum systems and the power of quantum artificial intelligence.