Authors
Mohamed Morsey, Jens Lehmann, Sören Auer, Claus Stadler, Sebastian Hellmann
Description
Purpose – DBpedia extracts structured information from Wikipedia, interlinks it with other knowledge bases and freely publishes the results on the web using Linked Data and SPARQL. However, the DBpedia release process is heavyweight and releases are sometimes based on data that is several months old. This paper describes DBpedia-Live, which addresses this problem by providing a live synchronization method based on the update stream of Wikipedia.
Design/methodology/approach – Wikipedia provides DBpedia with a continuous stream of updates, i.e. a stream of recently updated articles. DBpedia-Live processes that stream on the fly to obtain RDF data and stores the extracted data back into DBpedia. DBpedia-Live also publishes the newly added and deleted triples in files, in order to enable synchronization between the DBpedia endpoint and other DBpedia mirrors.
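To illustrate the synchronization mechanism described above, the following is a minimal sketch of how a mirror could consume such changeset files and replay them against its own triple store via the standard SPARQL 1.1 Update protocol. The endpoint URL, graph URI, and file names are illustrative assumptions, not the actual DBpedia-Live layout or tooling.

```python
# Sketch: apply one published changeset (removed triples, then added triples)
# to a local SPARQL endpoint. Assumes gzipped N-Triples changeset files and a
# SPARQL 1.1 Update endpoint; all names below are hypothetical.
import gzip

import requests

SPARQL_UPDATE_ENDPOINT = "http://localhost:8890/sparql"  # assumed local mirror endpoint
GRAPH_URI = "http://dbpedia.org"                          # assumed target graph


def read_triples(path: str) -> str:
    """Read a gzipped N-Triples changeset file into one string."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return f.read()


def apply_changeset(added_path: str, removed_path: str) -> None:
    """Replay a changeset: delete the removed triples, then insert the added ones."""
    removed = read_triples(removed_path)
    added = read_triples(added_path)
    for keyword, triples in (("DELETE DATA", removed), ("INSERT DATA", added)):
        if not triples.strip():
            continue
        # N-Triples is valid inside a SPARQL DATA block, so the file content
        # can be embedded directly.
        update = f"{keyword} {{ GRAPH <{GRAPH_URI}> {{ {triples} }} }}"
        resp = requests.post(SPARQL_UPDATE_ENDPOINT, data={"update": update})
        resp.raise_for_status()


if __name__ == "__main__":
    # Hypothetical file names; a real mirror would iterate over the published
    # changesets in chronological order.
    apply_changeset("000001.added.nt.gz", "000001.removed.nt.gz")
```

Applying deletions before insertions keeps a mirror consistent when an article update both removes old triples and adds revised ones for the same resource.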
Findings – During the realization of …