How Entity Name Embeddings Affect the Quality of Entity Alignment


Daniil Ivanovic Gusev
Zinaida Vladimirovna Apanovich

Abstract

Cross-lingual entity alignment (EA) algorithms are designed to find identical real-world objects in multilingual knowledge graphs. This problem arises, for example, when searching for drugs manufactured in different countries under different names, or when searching for imported equipment. Several open-source libraries currently collect implementations of entity alignment algorithms as well as test datasets for various languages. This paper describes experiments with several popular entity alignment algorithms applied to a Russian-English dataset. In addition to translating entity names from Russian to English, experiments combining various generators of entity name embeddings with various generators of relational embeddings have been conducted. To obtain more detailed information about the behaviour of the EA approaches, the results have also been assessed by entity type and by the number of relations and attributes per entity. These experiments allowed us to significantly improve the accuracy of several EA algorithms on the Russian-English dataset.
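
To make the setup concrete, the sketch below illustrates the general idea of combining an entity name embedding with a relational embedding and aligning entities by nearest-neighbour search. It is a minimal illustration, not the implementation evaluated in the paper: the hashed character-trigram name encoder stands in for fastText- or BERT-style name embeddings, the random vectors stand in for relational embeddings produced by models such as TransE or GCN-based encoders, and all names, dimensions, and toy data are assumptions made for the example.

    # Minimal sketch (not the authors' code): combine name and relational
    # embeddings, then align entities by cosine nearest-neighbour search.
    import numpy as np

    def name_embedding(name: str, dim: int = 64) -> np.ndarray:
        """Toy stand-in for a name encoder (fastText/BERT in the paper's setting):
        hashed character trigrams averaged into a fixed-size vector."""
        vec = np.zeros(dim)
        text = f"#{name.lower()}#"
        for i in range(len(text) - 2):
            vec[hash(text[i:i + 3]) % dim] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm > 0 else vec

    def combine(name_vec: np.ndarray, rel_vec: np.ndarray) -> np.ndarray:
        """Concatenate the name and relational views and re-normalize,
        so both views contribute to cosine similarity."""
        v = np.concatenate([name_vec, rel_vec])
        return v / np.linalg.norm(v)

    # Toy knowledge graphs: entity name -> relational embedding
    # (in practice these vectors would come from TransE, a GCN encoder, etc.);
    # Russian names are assumed to be already translated into English.
    rng = np.random.default_rng(0)
    kg_en = {"aspirin": rng.normal(size=32), "moscow": rng.normal(size=32)}
    kg_ru = {"aspirin": rng.normal(size=32), "moscow": rng.normal(size=32)}

    emb_en = {e: combine(name_embedding(e), v) for e, v in kg_en.items()}
    emb_ru = {e: combine(name_embedding(e), v) for e, v in kg_ru.items()}

    # Greedy nearest-neighbour alignment by cosine similarity, scored with Hits@1.
    hits = 0
    for ru_name, ru_vec in emb_ru.items():
        best = max(emb_en, key=lambda en: float(emb_en[en] @ ru_vec))
        hits += int(best == ru_name)
    print(f"Hits@1 = {hits / len(emb_ru):.2f}")

In this toy setting the shared name view dominates the similarity, which mirrors the paper's observation that the choice of entity name embedding generator strongly affects alignment quality.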

