    Enhancing text comprehension via fusing pre-trained language model with knowledge graph

    Authors
    Qian, Jing
    Li, Gangmin
    Atkinson, Katie
    Yue, Yong
    Affiliation
    Xi'an Jiaotong-Liverpool University
    University of Bedfordshire
    University of Liverpool
    Issue Date
    2024-02-16
    Subjects
    knowledge graphs
    sentence representation learning
    natural language understanding
    Subject Categories::G710 Speech and Natural Language Processing
    
    Other Titles
    ACAI '23: Proceedings of the 2023 6th International Conference on Algorithms, Computing and Artificial Intelligence
    Abstract
    Pre-trained language models (PLMs) such as BERT and GPTs capture rich linguistic and syntactic knowledge from pre-training over large-scale text corpora and can be further fine-tuned for specific downstream tasks. However, these models still have limitations, as they rely on knowledge gained from plain text and ignore structured knowledge such as knowledge graphs (KGs). Recently, there has been a growing trend of explicitly integrating KGs into PLMs to improve their performance. For instance, K-BERT incorporates KG triples as domain-specific supplements into input sentences. Nevertheless, we have observed that such methods do not consider the semantic relevance between the introduced knowledge and the original input sentence, leading to the issue of knowledge impurities. To address this issue, we propose a semantic matching-based approach that enriches the input text with knowledge extracted from an external KG. The architecture of our model comprises three components: the knowledge retriever (KR), the knowledge injector (KI), and the knowledge aggregator (KA). The KR, built upon a sentence representation learning model (CoSENT), retrieves triples with high semantic relevance to the input sentence from an external KG to alleviate the issue of knowledge impurities. The KI then integrates the retrieved triples from the KR into the input text by converting the original sentence into a knowledge tree with multiple branches; this knowledge tree is then flattened into a sequence of text that can be fed into the KA. Finally, the KA takes the flattened knowledge tree and passes it through an embedding layer and a masked Transformer encoder. We conducted extensive evaluations on eight datasets covering five text comprehension tasks, and the experimental results demonstrate that our approach exhibits competitive advantages over popular knowledge-enhanced PLMs such as K-BERT and ERNIE.
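    To make the retrieve-and-inject pipeline concrete, the following is a minimal Python sketch of the KR and KI steps under simplifying assumptions: a toy hashed bag-of-words encoder stands in for the CoSENT retriever, the example KG and all function names are illustrative rather than taken from the paper, and the KA (embedding layer plus masked Transformer encoder) is omitted.

    import numpy as np

    def embed(text, dim=64):
        # Toy hashed bag-of-words sentence encoder; a stand-in for the
        # CoSENT sentence-representation model the paper's KR is built on.
        v = np.zeros(dim)
        for tok in text.lower().split():
            v[hash(tok) % dim] += 1.0
        n = np.linalg.norm(v)
        return v / n if n else v

    def retrieve(sentence, triples, k=2, threshold=0.1):
        # KR: score each (head, relation, tail) triple against the input
        # sentence by cosine similarity, keep the top-k above a threshold,
        # and discard the rest as "knowledge impurities".
        s = embed(sentence)
        scored = sorted(((float(embed(" ".join(t)) @ s), t) for t in triples),
                        reverse=True)
        return [t for score, t in scored[:k] if score >= threshold]

    def inject(sentence, triples):
        # KI: graft each retrieved triple onto its head token, turning the
        # sentence into a knowledge tree, then flatten that tree back into
        # a single token sequence the KA could embed.
        out = []
        for tok in sentence.split():
            out.append(tok)
            for h, r, t in triples:
                if h.lower() == tok.lower():
                    out.extend([r, t])  # one flattened branch
        return " ".join(out)

    kg = [
        ("Paris", "capital_of", "France"),
        ("Paris", "population", "2.1M"),
        ("Apple", "founded_by", "Steve Jobs"),
    ]
    sentence = "Paris hosts the Olympic games"
    print(inject(sentence, retrieve(sentence, kg)))
    # Prints "Paris" followed by its two grafted triples and the rest of
    # the sentence; the unrelated Apple triple is filtered out by the KR.

    In the full model, the flattened sequence would then pass through the KA's embedding layer and masked Transformer encoder, whose attention mask keeps grafted branches from distorting the structure of the original sentence, in the spirit of K-BERT's visibility matrix; that step is beyond this sketch.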
    Citation
    Qian J, Li G, Atkinson K, Yue Y (2023) 'Enhancing text comprehension via fusing pre-trained language model with knowledge graph', 6th International Conference on Algorithms, Computing and Artificial Intelligence (ACAI '23) - Sanya, Association for Computing Machinery.
    Publisher
    Association for Computing Machinery
    URI
    http://hdl.handle.net/10547/626198
    DOI
    10.1145/3639631.3639689
    Additional Links
    https://dl.acm.org/doi/10.1145/3639631.3639689
    Type
    Conference papers, meetings and proceedings
    Language
    en
    ISBN
    9798400709203
    Collections
    Computing
