
[1] Li Huiying, Zhao Man, Yu Wenqi. A multi-attention RNN-based relation linking approach for question answering over knowledge base [J]. Journal of Southeast University (English Edition), 2020, 36(4): 385-392. [doi:10.3969/j.issn.1003-7985.2020.04.003]

A multi-attention RNN-based relation linking approach for question answering over knowledge base

Journal of Southeast University (English Edition)[ISSN:1003-7985/CN:32-1325/N]

Volume:
36
Issue:
4
Page:
385-392
Research Field:
Computer Science and Engineering
Publishing date:
2020-12-20

Info

Title:
A multi-attention RNN-based relation linking approach for question answering over knowledge base
Author(s):
Li Huiying, Zhao Man, Yu Wenqi
School of Computer Science and Engineering, Southeast University, Nanjing 211189, China
Keywords:
question answering over knowledge base (KBQA); entity linking; relation linking; multi-attention; bidirectional long short-term memory (Bi-LSTM); large-scale complex question answering dataset (LC-QuAD)
PACS:
TP311
DOI:
10.3969/j.issn.1003-7985.2020.04.003
Abstract:
Aiming at the relation linking task for question answering over knowledge base, especially the multi-relation linking task for complex questions, a relation linking approach based on the multi-attention recurrent neural network (RNN) model is proposed, which works for both simple and complex questions. First, the vector representations of questions are learned by the bidirectional long short-term memory (Bi-LSTM) model at the word and character levels, and named entities in questions are labeled by the conditional random field (CRF) model. Candidate entities are generated based on a dictionary, disambiguated by predefined rules, and the named entities mentioned in questions are thereby linked to entities in the knowledge base. Next, questions are classified as simple or complex by a machine learning method. Starting from the identified entities, one-hop relations in the knowledge base are collected as candidate relations for simple questions, and two-hop relations are collected for complex questions. Finally, the multi-attention Bi-LSTM model is used to encode questions and candidate relations, compare their similarity, and return the candidate relation with the highest similarity as the relation linking result. Notably, the Bi-LSTM model with one attention is adopted for simple questions, and the Bi-LSTM model with two attentions is adopted for complex questions. The experimental results show that, based on the effective entity linking method, the Bi-LSTM model with the attention mechanism improves the relation linking effectiveness for both simple and complex questions, outperforming existing relation linking methods based on graph algorithms or linguistic understanding.
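The pipeline described in the abstract (entity identification, simple/complex classification, one-hop vs. two-hop candidate collection, similarity-based relation selection) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the neural components (Bi-LSTM encoders, CRF tagger, attention) are replaced by a toy bag-of-words cosine similarity and a length heuristic so the control flow is runnable; all function names and the toy knowledge base are assumptions.

```python
# Illustrative sketch of the relation-linking control flow; the learned
# components are replaced by simple stand-ins (see lead-in above).
from collections import Counter
from math import sqrt

def encode(text):
    # Stand-in for the Bi-LSTM (+ attention) encoder: bag-of-words counts.
    return Counter(text.lower().replace("_", " ").split())

def cosine(a, b):
    # Cosine similarity between two bag-of-words vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def is_complex(question):
    # Stand-in for the learned simple/complex classifier: a length heuristic.
    return len(question.split()) > 8

def candidate_relations(entity, kb, hops):
    # Collect one-hop (simple) or two-hop (complex) relation paths
    # starting from the identified entity.
    if hops == 1:
        return [(r,) for r in kb.get(entity, {})]
    paths = []
    for r1, targets in kb.get(entity, {}).items():
        for t in targets:
            for r2 in kb.get(t, {}):
                paths.append((r1, r2))
    return paths or [(r,) for r in kb.get(entity, {})]

def link_relation(question, entity, kb):
    # Classify the question, gather candidates, and return the candidate
    # relation path most similar to the question.
    hops = 2 if is_complex(question) else 1
    cands = candidate_relations(entity, kb, hops)
    q_vec = encode(question)
    return max(cands, key=lambda p: cosine(q_vec, encode(" ".join(p))))

# Toy knowledge base: entity -> {relation: [target entities]}
kb = {
    "Berlin": {"capital_of": ["Germany"], "population": ["3.6M"]},
    "Germany": {"head_of_state": ["President"]},
}
print(link_relation("What is the population of Berlin?", "Berlin", kb))
```

A simple question yields a one-hop path such as `("population",)`, while a longer question classified as complex is matched against two-hop paths such as `("capital_of", "head_of_state")`.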

References:

[1] Singh K, Radhakrishna A S, Both A, et al. Why reinvent the wheel:Let’s build question answering systems together[C]//Proceedings of the 2018 World Wide Web Conference. Lyon, France, 2018:1247-1256. DOI:10.1145/3178876.3186023.
[2] Mendes P N, Jakob M, García-Silva A, et al. DBpedia spotlight:Shedding light on the web of documents[C]//Proceedings of the 7th International Conference on Semantic Systems. Graz, Austria, 2011:1-8. DOI:10.1145/2063518.2063519.
[3] Hoffart J, Yosef M A, Bordino I, et al. Robust disambiguation of named entities in text[C]//Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing. Edinburgh, UK, 2011:782-792.
[4] Moro A, Raganato A, Navigli R. Entity linking meets word sense disambiguation:A unified approach[J]. Transactions of the Association for Computational Linguistics, 2014, 2:231-244. DOI:10.1162/tacl_a_00179.
[5] Speck R, Ngonga Ngomo A C. Ensemble learning for named entity recognition[C]//Proceedings of the 13th International Semantic Web Conference. Trentino, Italy, 2014:519-534. DOI:10.1007/978-3-319-11964-9.
[6] Ferragina P, Scaiella U. TAGME:On-the-fly annotation of short text fragments(by wikipedia entities)[C]//Proceedings of the 19th ACM Conference on Information and Knowledge Management. Toronto, Canada, 2010:1625-1628. DOI:10.1145/1871437.1871689.
[7] Waitelonis J, Sack H. Named entity linking in #Tweets with KEA[C]//Proceedings of the 6th Workshop on ‘Making Sense of Microposts’. Montreal, Canada, 2016:61-63.
[8] Singh K, Mulang I O, Lytra I, et al. Capturing knowledge in semantically-typed relational patterns to enhance relation linking[C]//Proceedings of the Knowledge Capture Conference. Austin, TX, USA, 2017:1-8. DOI:10.1145/3148011.3148031.
[9] Mulang I O, Singh K, Orlandi F. Matching natural language relations to knowledge graph properties for question answering[C]//Proceedings of the 13th International Conference on Semantic Systems. Amsterdam, the Netherlands, 2017:89-96. DOI:10.1145/3132218.3132229.
[10] Yu M, Yin W, Hasan K S, et al. Improved neural relation detection for knowledge base question answering[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, Canada, 2017:571-581. DOI:10.18653/v1/p17-1053.
[11] Dubey M, Banerjee D, Chaudhuri D, et al. EARL:Joint entity and relation linking for question answering over knowledge graphs[C]//Proceedings of the 17th International Semantic Web Conference. Monterey, CA, USA, 2018:108-126. DOI:10.1007/978-3-030-00671-6_7.
[12] Sakor A, Mulang I O, Singh K, et al. Old is gold:linguistic driven approach for entity and relation linking of short text[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics. Minneapolis, MN, USA, 2019:2336-2346. DOI:10.18653/v1/n19-1243.
[13] Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate[C]//Proceedings of the International Conference on Learning Representations. San Diego, USA, 2015.
[14] Trivedi P, Maheshwari G, Dubey M, et al. LC-QuAD:A corpus for complex question answering over knowledge graphs[C]//Proceedings of the 16th International Semantic Web Conference. Vienna, Austria, 2017:210-218. DOI:10.1007/978-3-319-68204-4_22.
[15] Usbeck R, Röder M, Ngomo A C N, et al. GERBIL:General entity annotator benchmarking framework[C]//Proceedings of the 24th International Conference on World Wide Web. Florence, Italy, 2015:1133-1143. DOI:10.1145/2736277.2741626.
[16] Pennington J, Socher R, Manning C D. GloVe:Global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, Qatar, 2014:1532-1543. DOI:10.3115/v1/d14-1162.
[17] Devlin J, Chang M, Lee K, et al. BERT:Pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Minneapolis, MN, USA, 2019:4171-4186.

Memo

Memo:
Biography: Li Huiying (1977—), female, Ph.D., associate professor, huiyingli@seu.edu.cn.
Foundation item: The National Natural Science Foundation of China(No.61502095).
Citation: Li Huiying, Zhao Man, Yu Wenqi. A multi-attention RNN-based relation linking approach for question answering over knowledge base[J].Journal of Southeast University(English Edition), 2020, 36(4):385-392.DOI:10.3969/j.issn.1003-7985.2020.04.003.
Last Update: 2020-12-20