Quantum-like Generalization of Complex Word Embedding: a lightweight approach for textual classification
In this paper, we present an extension and evaluation of existing quantum-like approaches to word embedding for IR tasks that (1) improves the detection of complex features of word use (e.g., syntax and semantics), (2) enhances how the method extends these uses across linguistic contexts (i.e., to model lexical ambiguity), specifically in Question Classification, and (3) reduces the computational resources needed to train and operate quantum-based neural networks compared with existing models. This approach could later be applied to significantly enhance the state of the art across Natural Language Processing (NLP) word-level tasks such as entity recognition and part-of-speech tagging, or sentence-level tasks such as textual relatedness and entailment, to name a few.