Show simple item record

dc.contributor.author: Jaiswal, Amit Kumar
dc.contributor.author: Holdack, Guilherme
dc.contributor.author: Frommholz, Ingo
dc.contributor.author: Liu, Haiming
dc.date.accessioned: 2020-01-21T13:07:10Z
dc.date.available: 2020-01-21T13:07:10Z
dc.date.issued: 2018-09-30
dc.identifier.citation: Jaiswal AK, Holdack G, Frommholz I, Liu H (2018) 'Quantum-like generalization of complex word embedding: a lightweight approach for textual classification.', Lernen, Wissen, Daten, Analysen 2018 - Mannheim, CEUR Workshop Proceedings.
dc.identifier.uri: http://hdl.handle.net/10547/623794
dc.description.abstract: In this paper, we present an extension of, and an evaluation of, existing quantum-like approaches to word embedding for IR tasks that (1) improve the detection of complex features of word use (e.g., syntax and semantics), (2) enhance how the method extends these uses across linguistic contexts (i.e., to model lexical ambiguity), specifically for Question Classification, and (3) reduce the computational resources needed to train and operate quantum-based neural networks compared with existing models. This approach could later be applied to significantly enhance the state of the art across word-level Natural Language Processing (NLP) tasks such as entity recognition and part-of-speech tagging, or sentence-level ones such as textual relatedness and entailment, to name a few.
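The abstract's core idea of a complex-valued ("quantum-like") word embedding can be illustrated with a minimal, hypothetical sketch: each word gets a complex vector (amplitude and phase parts), and a sentence is represented as the normalized superposition of its word vectors, analogous to a unit quantum state. The vocabulary, dimensions, and random parameters below are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; in the paper's setting, amplitude and
# phase parameters would be learned, not sampled at random.
vocab = ["what", "is", "quantum", "embedding"]
dim = 4

# Complex-valued word embeddings: magnitude (amplitude) times a phase factor.
amplitudes = rng.random((len(vocab), dim))
phases = rng.uniform(0.0, 2.0 * np.pi, (len(vocab), dim))
embeddings = amplitudes * np.exp(1j * phases)

def sentence_state(word_indices):
    """Superpose the word vectors and L2-normalize, giving a unit 'state'."""
    psi = embeddings[word_indices].sum(axis=0)
    return psi / np.linalg.norm(psi)

psi = sentence_state([0, 1, 2, 3])
# A unit state has squared norm 1 (up to floating-point error).
print(abs(np.vdot(psi, psi)))
```

Such unit states can then be scored against class representatives (e.g., via squared inner products) for tasks like question classification, which is one plausible reading of the lightweight classification setup the abstract describes.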
dc.language.iso: en
dc.publisher: CEUR Workshop Proceedings
dc.relation.url: http://ceur-ws.org/Vol-2191/paper19.pdf
dc.subject: word embedding
dc.subject: quantum theory
dc.subject: word-context
dc.subject: G730 Neural Computing
dc.title: Quantum-like generalization of complex word embedding: a lightweight approach for textual classification.
dc.type: Conference papers, meetings and proceedings
dc.contributor.department: University of Bedfordshire
dc.date.updated: 2020-01-21T13:04:59Z

