Improving Question Answering over Knowledge Graphs using Graph Summarization

Question Answering (QA) systems over Knowledge Graphs (KGs), known as KGQA systems, automatically answer natural language questions using triples contained in a KG. The key idea is to represent both the questions and the entities of a KG as low-dimensional embeddings. Previous KGQA systems have attempted to represent entities using Knowledge Graph Embedding (KGE) and Deep Learning (DL) methods. However, KGE methods are too shallow to capture expressive features, and DL methods process each triple independently. Recently, the Graph Convolutional Network (GCN) has been shown to provide excellent entity embeddings. However, applying GCNs to KGQA is inefficient because GCNs treat all relations equally when aggregating neighbourhoods. A further problem arises with previous KGQA systems: in most cases, questions have an uncertain number of answers. To address these issues, we propose a graph summarization technique that combines a Recurrent Convolutional Neural Network (RCNN) with a GCN. The combination ensures that entity embeddings are propagated together with the relations relevant to the question, yielding better answers. The proposed graph summarization technique also tackles the inability of KGQA systems to answer questions with an uncertain number of answers. In this paper, we demonstrate the proposed technique on the most common question type, single-relation questions. Experiments show that the proposed technique using RCNN and GCN provides better results than the GCN alone, and significantly improves the recall of actual answers when questions have an uncertain number of answers.

Bibliographic Details
Main Authors: Li, Sirui, Wong, Kok Wai, Fung, Chun Che, Zhu, Dengya
Format: Conference Paper
Published: Springer 2021
Online Access:https://link.springer.com/chapter/10.1007/978-3-030-92273-3_40
http://hdl.handle.net/20.500.11937/87131
author Li, Sirui
Wong, Kok Wai
Fung, Chun Che
Zhu, Dengya
building Curtin Institutional Repository
collection Online Access
description Question Answering (QA) systems over Knowledge Graphs (KGs), known as KGQA systems, automatically answer natural language questions using triples contained in a KG. The key idea is to represent both the questions and the entities of a KG as low-dimensional embeddings. Previous KGQA systems have attempted to represent entities using Knowledge Graph Embedding (KGE) and Deep Learning (DL) methods. However, KGE methods are too shallow to capture expressive features, and DL methods process each triple independently. Recently, the Graph Convolutional Network (GCN) has been shown to provide excellent entity embeddings. However, applying GCNs to KGQA is inefficient because GCNs treat all relations equally when aggregating neighbourhoods. A further problem arises with previous KGQA systems: in most cases, questions have an uncertain number of answers. To address these issues, we propose a graph summarization technique that combines a Recurrent Convolutional Neural Network (RCNN) with a GCN. The combination ensures that entity embeddings are propagated together with the relations relevant to the question, yielding better answers. The proposed graph summarization technique also tackles the inability of KGQA systems to answer questions with an uncertain number of answers. In this paper, we demonstrate the proposed technique on the most common question type, single-relation questions. Experiments show that the proposed technique using RCNN and GCN provides better results than the GCN alone, and significantly improves the recall of actual answers when questions have an uncertain number of answers.
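The abstract's core mechanism, weighting GCN neighbourhood aggregation by each relation's relevance to the question rather than treating all relations equally, can be sketched as follows. This is a minimal illustration on random toy data with dot-product relevance scores; the paper's actual model uses an RCNN-based question encoder and learned parameters, and all names and shapes here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_entities, n_relations, dim = 5, 3, 4
entity_emb = rng.normal(size=(n_entities, dim))    # entity embeddings H
relation_emb = rng.normal(size=(n_relations, dim)) # relation embeddings
question_emb = rng.normal(size=(dim,))             # question encoding q
W = rng.normal(size=(dim, dim))                    # layer weight matrix

# Toy KG: triples as (head, relation, tail) index tuples.
triples = [(0, 0, 1), (0, 1, 2), (1, 2, 3), (2, 0, 4)]

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Relation relevance conditioned on the question (dot product here;
# the paper derives this from an RCNN question encoder instead).
rel_scores = softmax(relation_emb @ question_emb)

def gcn_layer(H):
    """One relation-weighted propagation step: each tail entity
    aggregates its neighbours' transformed embeddings, scaled by
    how relevant the connecting relation is to the question."""
    out = H.copy()
    for head, rel, tail in triples:
        out[tail] += rel_scores[rel] * (H[head] @ W)
    # ReLU nonlinearity, as in a standard GCN layer.
    return np.maximum(out, 0.0)

H1 = gcn_layer(entity_emb)
print(H1.shape)  # prints (5, 4)
```

In a plain GCN every triple would contribute equally; here a triple whose relation scores low for the question contributes little, which is the intuition behind propagating embeddings "together with the relations relevant to the question".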
first_indexed 2025-11-14T11:26:40Z
format Conference Paper
id curtin-20.500.11937-87131
institution Curtin University Malaysia
institution_category Local University
last_indexed 2025-11-14T11:26:40Z
publishDate 2021
publisher Springer
recordtype eprints
repository_type Digital Repository
rights restricted
title Improving Question Answering over Knowledge Graphs using Graph Summarization