A Practical Guide to Hybrid Natural Language Processing: Building Hybrid Representations from Text Corpora, Knowledge Graphs, and Language Models


Publisher
Springer International Publishing
Copyright
© Springer Nature Switzerland AG 2020
ISBN
978-3-030-44829-5
Pages
57–89
DOI
10.1007/978-3-030-44830-1_6

Abstract

In the previous chapter we saw how knowledge graph embedding algorithms can capture the structured knowledge about concepts and relations in a graph as embeddings in a vector space, which can then be used in downstream tasks. However, such approaches can only capture the knowledge that is explicitly represented in the graph, and therefore lack recall and domain coverage. In this chapter, we focus on algorithms that address this limitation by combining information from both unstructured text corpora and structured knowledge graphs. The first approach is Vecsigrafo, which produces corpus-based word, lemma, and concept embeddings from large disambiguated corpora. Vecsigrafo jointly learns word, lemma, and concept embeddings, bringing together textual and symbolic knowledge representations in a single, unified formalism for use in neural natural language processing architectures. The second, more recent approach is Transigrafo, which adopts recent Transformer-based language models to derive concept-level contextual embeddings, providing state-of-the-art performance in word-sense disambiguation with reduced complexity.
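To make the idea of a single, joint embedding space more concrete, the sketch below trains a plain skip-gram model over a disambiguated corpus in which every content token contributes its surface form, its lemma, and the concept identifier it was linked to. This is a simplified stand-in for joint word-lemma-concept learning, not the chapter's actual Vecsigrafo pipeline; the "lem_" and "wn:" prefixes and the toy sentences are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the chapter's implementation): jointly
# embedding surface forms, lemmas, and concept identifiers by interleaving
# them in one token stream and training skip-gram over the disambiguated text.
from gensim.models import Word2Vec

# Each sentence comes from a word-sense-disambiguated corpus, where every
# content token contributes its surface form, its lemma ("lem_" prefix, an
# illustrative convention), and the concept it was disambiguated to
# ("wn:" prefix, also illustrative). Toy data for demonstration only.
disambiguated_sentences = [
    ["banks", "lem_bank", "wn:bank.n.01", "store", "lem_store",
     "wn:store.v.01", "money", "lem_money", "wn:money.n.01"],
    ["the", "river", "lem_river", "wn:river.n.01", "bank", "lem_bank",
     "wn:bank.n.09", "was", "flooded", "lem_flood", "wn:flood.v.01"],
]

# Skip-gram (sg=1) so that words, lemmas, and concepts co-occurring in the
# same context window end up close together in one shared vector space.
model = Word2Vec(
    sentences=disambiguated_sentences,
    vector_size=100,   # dimensionality of the shared embedding space
    window=5,
    min_count=1,
    sg=1,
    epochs=50,
)

# Words, lemmas, and concepts now live in the same space, so a concept can be
# compared directly against surface forms, lemmas, or other concepts.
print(model.wv.most_similar("wn:bank.n.01", topn=3))
```

Because all three token types share one space, downstream neural architectures can mix lexical and symbolic signals freely, which is the core property the chapter exploits.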

Published: Jun 17, 2020
