Neural Wikipedian: Generating Textual Summaries from Knowledge Base Triples
19 Pages · Posted: 13 Sep 2018 · First Look: Accepted
Most people need textual or visual interfaces in order to make sense of Semantic Web data. In this paper, we investigate the problem of generating natural language summaries for Semantic Web data using neural networks. Our end-to-end trainable architecture encodes the information from a set of triples into a vector of fixed dimensionality and generates a textual summary by conditioning the output on the encoded vector. We explore a set of different approaches that enable our models to verbalise entities from the input set of triples in the generated text. Our systems are trained and evaluated on two corpora of loosely aligned Wikipedia snippets with triples from DBpedia and Wikidata, with promising results.
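The encode-then-condition pipeline described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' implementation: the random embeddings, the small vocabularies, the mean-pooling triple encoder, and the greedy state-update decoder are all simplifying assumptions standing in for the paper's trained neural encoder and decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabularies and embeddings (illustrative assumptions, untrained).
ENTITY_VOCAB = ["Berlin", "Germany", "capital_of"]
WORD_VOCAB = ["<s>", "</s>", "Berlin", "is", "the", "capital", "of", "Germany"]
DIM = 16

entity_emb = {e: rng.standard_normal(DIM) for e in ENTITY_VOCAB}
word_emb = rng.standard_normal((len(WORD_VOCAB), DIM))

def encode_triples(triples):
    """Encode a set of (subject, predicate, object) triples into a single
    vector of fixed dimensionality by embedding each element and averaging."""
    vecs = [entity_emb[x] for t in triples for x in t]
    return np.mean(vecs, axis=0)

def decode(context, max_len=8):
    """Greedily generate a token sequence conditioned on the encoded vector.
    A dot-product argmax over word embeddings stands in for a trained
    recurrent decoder; with untrained embeddings the output is arbitrary."""
    state = context.copy()
    out = []
    for _ in range(max_len):
        scores = word_emb @ state          # score every word against the state
        idx = int(np.argmax(scores))
        token = WORD_VOCAB[idx]
        if token == "</s>":
            break
        out.append(token)
        state = 0.5 * state + 0.5 * word_emb[idx]  # fold the prediction back in
    return out

triples = [("Berlin", "capital_of", "Germany")]
vec = encode_triples(triples)     # fixed-dimensional summary of the triples
summary = decode(vec)             # text conditioned on the encoded vector
```

The key property the sketch preserves is that any number of input triples collapses into one fixed-size vector, on which the entire generated summary is conditioned.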
Keywords: Natural Language Generation, Neural Networks, Semantic Web Triples