Authors: Mohamed Morsey, Jens Lehmann, Sören Auer, Claus Stadler, Sebastian Hellmann
Publication date: 2012/4/20
Journal: Program
Volume: 46
Issue: 2
Pages: 157-181
Publisher: Emerald Group Publishing Limited
Description
Purpose
DBpedia extracts structured information from Wikipedia, interlinks it with other knowledge bases, and freely publishes the results on the web using Linked Data and SPARQL. However, the DBpedia release process is heavyweight, and releases are sometimes based on data that is several months old. DBpedia‐Live solves this problem by providing a live synchronization method based on the update stream of Wikipedia. This paper seeks to address these issues.
Design/methodology/approach
Wikipedia provides DBpedia with a continuous stream of updates, i.e. a stream of recently updated articles. DBpedia‐Live processes that stream on the fly to obtain RDF data and stores the extracted data back to DBpedia. DBpedia‐Live publishes the newly added/deleted triples in files, in order to enable synchronization between the DBpedia endpoint and other DBpedia mirrors.
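The changeset mechanism described above can be sketched as follows: a mirror replays each published pair of files (added triples, removed triples) against its local store. This is a minimal, illustrative sketch only; the file format here is a simplified N-Triples-like layout, and the in-memory set standing in for a triple store is an assumption, not the actual DBpedia-Live implementation.

```python
def parse_ntriples(text):
    """Parse a simplified N-Triples-like file: one triple per line,
    subject/predicate/object separated by whitespace, trailing '.'."""
    triples = set()
    for line in text.strip().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        parts = line.rstrip(" .").split(None, 2)
        if len(parts) == 3:
            triples.add(tuple(parts))
    return triples


def apply_changeset(store, added_text, removed_text):
    """Apply one changeset (an added-triples file and a removed-triples
    file) to a mirror's store, removals first, then additions."""
    store -= parse_ntriples(removed_text)
    store |= parse_ntriples(added_text)
    return store


if __name__ == "__main__":
    # Hypothetical mirror state and changeset contents:
    store = {("<s>", "<p>", '"old"')}
    added = '<s> <p> "new" .'
    removed = '<s> <p> "old" .'
    apply_changeset(store, added, removed)
    print(store)
```

In the real system the endpoint is kept in sync the same way: each changeset file is fetched in publication order and applied atomically, so a mirror that replays all changesets converges on the state of the live endpoint.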
Findings
During the …