Setting up a Linked Data mirror from RDF dumps (DBpedia 2015-04, Freebase, Wikidata, LinkedGeoData, …) with Virtuoso 7.2.1 and Docker (optional)

So you’re the person who gets to set up a local DBpedia mirror, or more generally a local Linked Data mirror, for your work group? OK, today is your lucky day and you’re in the right place. I hope you’ll be able to benefit from my many hours of trial and error. If anything goes wrong (or everything works fine), feel free to leave a comment below.

Versions of this guide

There are four older versions of this guide:

  • Oct. 2010: The first version focusing on DBpedia 3.5 – 3.6 and Virtuoso 6.1
  • May 2012: A bigger update to DBpedia 3.7 (new local language versions) and Virtuoso 6.1.5+ (with a lot of updates making pre-processing of the dumps easier)
  • Apr. 2014: Update to DBpedia 3.9 and Virtuoso 7
  • Nov. 2014: Update to DBpedia 2014 and other Datasets and Virtuoso 7.1.0

In this step by step guide I’ll tell you how to install a local Linked Data mirror of the DBpedia 2015-04, hosting a combination of the regular English dataset and, as an example, the i18n German datasets, adding up to nearly 850 M triples.

I’ll also mention how you can add the following datasets / vocabularies, adding up to nearly 6 G triples:

  • Freebase
  • Wikidata
  • LinkedGeoData
  • DBLP (L3S)
  • Yago
  • Umbel

As DBpedia is quite modular and has many internationalized (i18n) versions, it gets its own section in this guide. The other datasets don’t, as at most they need minor repacking and a single line to load, as explained below.

Used Versions

  • DBpedia 2015-04
  • Virtuoso OpenSource 7.2.1
  • Ubuntu 14.04 LTS or Debian 8


A strong machine with root access and enough RAM: we used a VM with 4 cores and 32 GB of RAM for DBpedia alone. If you intend to also load Freebase and other datasets I recommend at least 64 GB of RAM (we actually ended up using a 16 core, 256 GB RAM server in our research group). For installing I recommend more than 128 GB of free HD space for DBpedia alone, 512 GB if you want to load Freebase as well, especially for downloading and repacking the datasets, as well as for the growing database file when importing (mine grew to 64 GB for DBpedia and 320 GB with all the datasets mentioned above).

This guide applies to a clean install. Please check that there’s no older version of Virtuoso installed with dpkg -l | grep virtuoso ; which isql ; which isql-vt (no output is good). If there is, please know what you’re doing. Virtuoso 6 and 7 use different default locations for their DBs, but in general newer versions should be able to upgrade older DB files if correctly configured to use the same DB file. In general I’d suggest either uninstalling the older version and its config files and then installing the new one according to this guide, or isolating the newer one with the Docker approach mentioned below.

For the impatient and Docker-affine

As an alternative to the following sections, which will explain how to build everything from source yourself and go into detail about the DBpedia dump files, I also provide a docker image (source) that you can use to automate and simplify the process a lot:

# assumption: $dump_dir is where the dumps should go,
# $db_dir is where the Virtuoso DB file should live
mkdir -p "$dump_dir"
cd "$dump_dir"

# downloading
wget -r -nc -nH --cut-dirs=1 -np -l1 \
    -A '*.nt.bz2' -A '*.owl' -R '*unredirected*' \
    http://downloads.dbpedia.org/2015-04/{core/,core-i18n/en,core-i18n/de,dbpedia_2015-04.owl}
# repacking
apt-get install pigz pbzip2
for i in */*.nt.bz2 ; do echo $i ; pbzip2 -dc "$i" | pigz - > "${i%bz2}gz" && rm "$i"; done
mkdir classes
cd classes
ln -s ../dbpedia_2015-04.owl ./
cd ..

# install some VAD packages for DBpedia into our db which we'll keep in db_dir
docker run -d --name dbpedia-vadinst \
    -v "$db_dir":/var/lib/virtuoso-opensource-7 \
    joernhees/virtuoso run &&
docker exec dbpedia-vadinst wait_ready &&
docker exec dbpedia-vadinst isql-vt PROMPT=OFF VERBOSE=OFF BANNER=OFF \
    "EXEC=vad_install('/usr/share/virtuoso-opensource-7/vad/rdf_mappers_dav.vad');" &&
docker exec dbpedia-vadinst isql-vt PROMPT=OFF VERBOSE=OFF BANNER=OFF \
    "EXEC=vad_install('/usr/share/virtuoso-opensource-7/vad/dbpedia_dav.vad');" &&
docker stop dbpedia-vadinst &&
docker rm -v dbpedia-vadinst &&

# starting the import
docker run --rm \
    -v "$db_dir":/var/lib/virtuoso-opensource-7 \
    -v "$dump_dir"/classes:/import:ro \
    joernhees/virtuoso import 'http://dbpedia.org/resource/classes#' &&
# docker import of the actual data (will use 64 GB RAM and take about 1 hour)
docker run --rm \
    -v "$db_dir":/var/lib/virtuoso-opensource-7 \
    -v "$dump_dir"/core:/import:ro \
    -e "NumberOfBuffers=$((64*85000))" \
    joernhees/virtuoso import 'http://dbpedia.org' &&

# running the local endpoint on port 8891 with 32 GB RAM:
docker run --name dbpedia \
    -v "$db_dir":/var/lib/virtuoso-opensource-7 \
    -p 8891:8890 \
    -e "NumberOfBuffers=$((32*85000))" \
    joernhees/virtuoso run

# access one of the following for example:
# http://localhost:8891/sparql
# http://localhost:8891/resource/Bonn
# http://localhost:8891/conductor (user: dba, pw: dba)

The manual version

Download and build Virtuoso

We’ll download Virtuoso OpenSource: either from SourceForge or GitHub (make sure you get v7.2.1 as in this guide or a newer version).

Unlike in earlier versions of this guide we’ll now first build the .deb packages and then install them with apt-get.

As building will install a lot of extra packages that you only need for building, I prepared another docker image (source) that will do the whole building job inside a container for you and put the resulting .deb packages (and the DBpedia VAD) into your ~/virtuoso_deb folder:

docker run --rm -it -v ~/virtuoso_deb:/export/ joernhees/dpkg_build \
    http://sourceforge.net/projects/virtuoso/files/virtuoso/7.2.1/virtuoso-7.2.1.tar.gz
# this should run for about 15 minutes
# compilation by default sadly does not create the dbpedia VAD package, so
# to do that, the above command stops after compilation in interactive mode.
# in there just execute this:
cd /tmp/build/virtuoso*/ &&
./configure --with-layout=debian --enable-dbpedia-vad &&
cd binsrc &&
make &&
cp dbpedia/dbpedia_dav.vad /export &&
exit

If you used this, you can skip the following and continue below at installing the .deb packages.

If not, to do the building manually, run the following to download the file into your home dir on the server, extract it and switch into the directory:

mkdir ~/virtuoso_deb
cd ~/virtuoso_deb
wget http://sourceforge.net/projects/virtuoso/files/virtuoso/7.2.1/virtuoso-7.2.1.tar.gz
tar -xvzf virtuoso-7.2.1.tar.gz
cd virtuoso-opensource-7.2.1  # or newer, depending what you got

Afterwards you can use the following to install the build dependencies and actually build the .deb packages:

# install build tools
sudo apt-get install -y build-essential devscripts
# to install Virtuoso build dependencies
mk-build-deps -irt'apt-get --no-install-recommends -yV' && dpkg-checkbuilddeps
# to build Virtuoso with 5 processes in parallel
# choose something like your server's #CPUs + 1
dpkg-buildpackage -us -uc -j5

This will take about 15 min.
Afterwards if everything worked out, you should have the *.deb files in ~/virtuoso_deb.

We continue to also build the DBpedia VAD:

./configure --with-layout=debian --enable-dbpedia-vad && \
cd binsrc && make && \
cp dbpedia/dbpedia_dav.vad ~/virtuoso_deb/

Finally, let’s create a small local repository out of the .deb files you just built. The advantage of this is that you can simply install virtuoso-server with its dependencies with apt. In theory you could also resolve them manually and install everything with dpkg -i ..., but where’s the fun in that?

cd ~/virtuoso_deb
dpkg-scanpackages ./ | gzip > Packages.gz

Installing Virtuoso

No matter if you used the Docker or the manual building approach for the .deb packages of Virtuoso, you should now be able to install them with apt-get install ... after telling it where to look for the files, for example by doing this:

echo "deb file:$HOME/virtuoso_deb ./" | sudo tee /etc/apt/sources.list.d/virtuoso_local_packages.list
sudo apt-get update

After this just install Virtuoso with the following command (it should warn you about untrusted sources of the Virtuoso packages, which is because we just built them ourselves):

sudo apt-get install virtuoso-server \
    virtuoso-vad-bpel \
    virtuoso-vad-conductor \
    virtuoso-vad-demo \
    virtuoso-vad-doc \
    virtuoso-vad-isparql \
    virtuoso-vad-ods \
    virtuoso-vad-rdfmappers \
    virtuoso-vad-sparqldemo \
    virtuoso-vad-syncml \
    virtuoso-vad-tutorial

The above will ask you for a DBA password. Please pick one.

Installing the VAD packages here will not actually install them in the Virtuoso DB file, but just put them in the right place, so they can for example be installed as mentioned later.

To also move the DBpedia VAD in place for later you can just run this:

sudo cp ~/virtuoso_deb/dbpedia_dav.vad /usr/share/virtuoso-opensource-7/vad/

Configuring Virtuoso

Now change the following values in /etc/virtuoso-opensource-7/virtuoso.ini; the performance tuning settings follow the Virtuoso RDF performance tuning guide:

# note: Virtuoso ignores lines starting with whitespace and stuff after a ;
# you need to include the directory where your datasets will be downloaded
# to, in our case /usr/local/data/datasets:
DirsAllowed = ., /usr/share/virtuoso-opensource-7/vad, /usr/local/data/datasets
# IMPORTANT: for performance also do this
# the following two are as suggested by comments in the original .ini
# file in order to use the RAM on your server:
# each buffer caches an 8K page of data and occupies approx. 8700 bytes of
# memory. It's suggested to set this value to 65 % of RAM for a DB-only
# server, so if you have 32 GB of RAM: 32*1000^3*0.65/8700 = 2390804
# default is 2000, which will use 16 MB RAM ;)
# Make sure to remove whitespace if you uncomment existing lines!
NumberOfBuffers = 2720000
MaxDirtyBuffers = 2000000
# set this to 1/4th of NumberOfBuffers
MaxCheckpointRemap = 625000
# I like to increase the ResultSetMaxrows, MaxQueryCostEstimationTime
# and MaxQueryExecutionTime drastically as it's a local store where we
# do quite complex queries... up to you (don't do this if a lot of people
# use it).
# In any case for the importer to be more robust add the following setting
# to this section:
ShortenLongURIs = 1
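As a sanity check for these numbers, the buffer sizing rule quoted in the comments (65 % of RAM, ~8700 bytes per buffer) can be computed directly. The snippet below is my own sketch, not part of Virtuoso; MaxDirtyBuffers is then typically set to roughly 3/4 of NumberOfBuffers, as the stock .ini suggests:

```shell
# compute NumberOfBuffers for a DB-only server from its RAM size,
# using the 65 % of RAM / ~8700 bytes per buffer rule from the .ini comments
ram_gb=32
number_of_buffers=$(( ram_gb * 1000 * 1000 * 1000 * 65 / 100 / 8700 ))
echo "NumberOfBuffers = $number_of_buffers"   # for 32 GB: 2390804
```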

Afterwards restart Virtuoso:

sudo /etc/init.d/virtuoso-opensource-7 stop
sudo /etc/init.d/virtuoso-opensource-7 start

You should now have a running Virtuoso server.

DBpedia URIs (en) vs. DBpedia IRIs (i18n)

The DBpedia 2015-04 consists of several datasets: one “standard” English version and several localized versions for other languages (i18n). The standard version mints URIs by going through all English Wikipedia articles. For all of these the Wikipedia cross-language links are used to extract corresponding labels in other languages for the en URIs (e.g., core/labels-en-uris_de.nt.bz2). This is problematic, as for example articles which are only in the German Wikipedia won’t be extracted. To solve this problem the i18n versions exist and create IRIs in the form of http://de.dbpedia.org/resource/… for every article in the German Wikipedia (e.g., core-i18n/de/labels_de.nt.bz2).

This approach has several implications. For backwards-compatibility reasons the standard DBpedia makes statements about URIs such as http://dbpedia.org/resource/Gerhard_Schr%C3%B6der, while the local chapters, like the German one, make statements about IRIs such as http://de.dbpedia.org/resource/Gerhard_Schröder (note the ö). In other words and as written above: the standard DBpedia uses URIs to identify things, while the localized versions use IRIs. This also means that http://dbpedia.org/resource/Gerhard_Schröder shouldn’t work. That said, clicking the link will actually work, as there is magic going on in your browser to give you what you probably meant. Using curl (curl -i -L -H "Accept: application/rdf+xml" http://dbpedia.org/resource/Gerhard_Schröder) or SPARQLing the endpoint will nevertheless not be so nice/sloppy and can cause quite some headache. Observe how the following two SPARQL queries return different results: select * where { dbpedia:Gerhard_Schröder ?p ?o. } vs. select * where { <http://dbpedia.org/resource/Gerhard_Schr%C3%B6der> ?p ?o. }. In order to mitigate this historic problem a bit, DBpedia actually offers owl:sameAs links from IRIs to URIs: core/iri-same-as-uri_en.nt.bz2, which you should load, so you at least have a link to what you want if someone tries to get info about an IRI.
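To make the encoding difference tangible, here is a small helper (my own sketch, not part of DBpedia or Virtuoso) that converts an IRI into the percent-encoded URI form by escaping all non-ASCII bytes of its UTF-8 representation:

```shell
# sketch: percent-encode the non-ASCII bytes of an IRI, yielding the
# URI form that the standard (English) DBpedia uses
iri_to_uri() {
    printf '%s' "$1" | od -An -v -tx1 | tr -s ' \n' '\n' | {
        out=''
        while read -r hex; do
            [ -z "$hex" ] && continue
            dec=$((16#$hex))                      # hex byte -> decimal
            if [ "$dec" -ge 128 ]; then
                # non-ASCII byte: emit %XX with uppercase hex digits
                out="$out%$(printf '%s' "$hex" | tr 'a-f' 'A-F')"
            else
                # ASCII byte: emit the character itself (via octal escape)
                out="$out$(printf "\\$(printf '%03o' "$dec")")"
            fi
        done
        printf '%s\n' "$out"
    }
}

iri_to_uri 'http://de.dbpedia.org/resource/Gerhard_Schröder'
# prints http://de.dbpedia.org/resource/Gerhard_Schr%C3%B6der
```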

As the standard DBpedia provides labels, abstracts and a couple other things in several languages, there are two types of files in the localized DBpedia folders: There are triples directly associating the English URIs with for example the German labels ({core,core-i18n/de}/labels-en-uris_de.nt.bz2) and there are the localized triple files which associate for example the DE IRIs with the German labels (core-i18n/de/labels_de.nt.bz2).

Downloading the DBpedia dump files, de-duplication & Repacking

For our group we decided that we wanted a reasonably complete mirror of the standard DBpedia (EN) (have a look at the core directory, which contains all datasets loaded into the public DBpedia SPARQL endpoint), but also the i18n versions for the German DBpedia loaded in separate graphs, as well as each of their pagelink datasets in yet another separate graph each. For this we download the corresponding files in N-Triples (NT) format as follows. If you need something different, adjust accordingly (and maybe report back if there were problems and how you solved them).

# see comment above, you could also get another DBpedia version...
mkdir -p /usr/local/data/datasets/dbpedia/2015-04
cd /usr/local/data/datasets/dbpedia/2015-04
wget -r -nc -nH --cut-dirs=1 -np -l1 -A '*.nt.bz2' -A '*.owl' -R '*unredirected*' http://downloads.dbpedia.org/2015-04/{core/,core-i18n/en,core-i18n/de,dbpedia_2015-04.owl}

As already mentioned, the DBpedia 2015-04 introduced a core folder which contains all files loaded on the public DBpedia endpoint. Be aware that if you download other folders like above, you’ll be downloading some files twice (e.g., labels-en-uris_de.nt.bz2 can be found in both the core folder and the core-i18n/de folder). Quite obvious, but especially the core-i18n/en folder contains very many files duplicated from core. If you want to see which downloaded files are duplicates (independent of their name), and especially which core-i18n/en files were not loaded on the public endpoint (so are not in core), you can do the following:

# compute md5 hashes for all downloaded files
find . -mindepth 2 -type f -print0 | xargs -0 md5sum > md5sums

# first check if there are duplicates in other folders without core
LC_ALL=C sort md5sums | grep -v '/core/' | uniq -w32 -D
ba3fc042b14cb41e6c4282a6f7c45e02  ./core-i18n/en/instance-types-dbtax-dbo_en.nt.bz2
ba3fc042b14cb41e6c4282a6f7c45e02  ./core-i18n/en/instance_types_dbtax-dbo.nt.bz2

So it seems the ./core-i18n/en/instance-types-dbtax-dbo_en.nt.bz2 and ./core-i18n/en/instance_types_dbtax-dbo.nt.bz2 files are actually the same.

To list all the files in core-i18n/en which are duplicates do this:

# list all dup files in core-i18n/en
LC_ALL=C sort md5sums | uniq -w32 -D | grep '/core-i18n/en'
068975f6dd60f29d13c8442b0dbe403d  ./core-i18n/en/skos-categories_en.nt.bz2
14a770f293524a5713f741a1a448bcfa  ./core-i18n/en/short-abstracts_en.nt.bz2
1904ad5bc4579fd7efe7f40673c32f79  ./core-i18n/en/specific-mappingbased-properties_en.nt.bz2
1958649209bc90944c65eccd30d37c6c  ./core-i18n/en/infobox-property-definitions_en.nt.bz2
2774d36ce14e0143ca4fa25ed212a598  ./core-i18n/en/external-links_en.nt.bz2
314162db2acb516a1ef5fcb3a2c7df2b  ./core-i18n/en/geonames_links_en.nt.bz2
3b42f351fc30f6b6b97d3f2a16ef6db3  ./core-i18n/en/instance-types-transitive_en.nt.bz2
3b61b11bdcb50a0d44ca8f4bd68f4762  ./core-i18n/en/revision-ids_en.nt.bz2
43a8b17859c50d37f4cab83573c2992e  ./core-i18n/en/instance_types_sdtyped-dbo_en.nt.bz2
4c847b2754294c555236d09485200435  ./core-i18n/en/instance-types_en.nt.bz2
63e2cde88e7bdefb6739c62aa234fc1e  ./core-i18n/en/category-labels_en.nt.bz2
64cbbac14769aadf560496b4d948d5e1  ./core-i18n/en/interlanguage-links-chapters_en.nt.bz2
75f2d135459c824feee1d427e4165a4f  ./core-i18n/en/transitive-redirects_en.nt.bz2
82fe80c3868a89d54fec26c919a4fa50  ./core-i18n/en/revision-uris_en.nt.bz2
8407c84d262b573418326bdd8f591b95  ./core-i18n/en/mappingbased-properties_en.nt.bz2
87df057913a05dbb5666f360d20fa542  ./core-i18n/en/freebase-links_en.nt.bz2
8cc921fbab5d02ad83b1fda2f87c23f0  ./core-i18n/en/wikipedia-links_en.nt.bz2
9152e34db96df2dd4991e78b7e53ff3f  ./core-i18n/en/article-categories_en.nt.bz2
94b48e9df78f746e60a9d0c1aafa3241  ./core-i18n/en/infobox-properties_en.nt.bz2
a254ce4596d045cc047959831edd318a  ./core-i18n/en/disambiguations_en.nt.bz2
ab29899e43fab1c6f060cdb8955c5b19  ./core-i18n/en/images_en.nt.bz2
ae046e03be0cf29eac1e3b8a8b3d6b03  ./core-i18n/en/persondata_en.nt.bz2
b4710d36b8dc915f07f5cec2d9971a27  ./core-i18n/en/page-ids_en.nt.bz2
ba3fc042b14cb41e6c4282a6f7c45e02  ./core-i18n/en/instance-types-dbtax-dbo_en.nt.bz2
ba3fc042b14cb41e6c4282a6f7c45e02  ./core-i18n/en/instance_types_dbtax-dbo.nt.bz2
bd90ce4064a120794b5eb5a8d024a97d  ./core-i18n/en/long-abstracts_en.nt.bz2
e4c422d1d23c69eff3b9d7d7df3f2f80  ./core-i18n/en/homepages_en.nt.bz2
eafc557cde69fd1cd8f78565c385ee16  ./core-i18n/en/iri-same-as-uri_en.nt.bz2
ef48deae48c9c9c5e17585e3f0243663  ./core-i18n/en/labels_en.nt.bz2
fa8800165c7e80509a4ebddc5f0caf90  ./core-i18n/en/geo-coordinates_en.nt.bz2

# to delete the duplicates from /core-i18n/en, leaving just one of each:
LC_ALL=C sort md5sums | uniq -w32 -D | grep '/core-i18n/en' | uniq -w32 | cut -d' ' -f3 | xargs rm

# afterwards these should be left:
ls -1 core-i18n/en
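If you’d rather double-check before deleting anything, the same pipeline can be dry-run by replacing rm with echo rm. The fake three-line md5sums file below is only a demo fixture so the snippet is self-contained; on your server you’d run the pipeline against the real md5sums:

```shell
# cautious variant of the deletion above: echo instead of rm
# (demo fixture: a scratch dir with a tiny fake md5sums file)
cd "$(mktemp -d)"
printf '%s\n' \
  'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa  ./core/labels_en.nt.bz2' \
  'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa  ./core-i18n/en/labels_en.nt.bz2' \
  'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb  ./core-i18n/en/unique_en.nt.bz2' > md5sums

LC_ALL=C sort md5sums | uniq -w32 -D | grep '/core-i18n/en' | uniq -w32 \
    | cut -d' ' -f3 | xargs echo rm --
# prints: rm -- ./core-i18n/en/labels_en.nt.bz2
```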

As Virtuoso can only import plain (uncompressed) or gzipped files, but the DBpedia dumps are bzipped, you can either repack them into gzip format or extract them. On our server the importing procedure was noticeably slower from extracted files than from gzipped ones (ignoring the vast amount of wasted disk space for the extracted files): file access becomes a bottleneck if you have a couple of cores idling. This is why I decided on repacking all the files from bz2 to gz. As you can see, I do the repacking with the parallel versions of bzip2 and gzip (pbzip2, pigz). If that’s not suitable for you, feel free to change it. You might also want to change this if you want to repack in parallel to downloading. The repacking process below took about 30 minutes but was worth it in the end. The more CPUs you have, the more you can parallelize it.

# if you want to save space do this:
apt-get install pigz pbzip2
for i in core/*.nt.bz2 core-i18n/*/*.nt.bz2 ; do echo $i ; pbzip2 -dc "$i" | pigz - > "${i%bz2}gz" && rm "$i" ; done

# else do:
#pbzip2 -d core/*.nt.bz2 core-i18n/*/*.nt.bz2

# notice that the extraction (and repacking) of *.bz2 takes quite a while (about 30 minutes)
# gzipped data is reasonably packed, but still very fast to access (in contrast to bz2), so maybe this is the best choice.
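As a side note, the ${i%bz2}gz in the repacking loop above is plain shell parameter expansion: %bz2 strips the shortest trailing match of bz2 from the value, and gz is appended in its place:

```shell
# demo of the suffix rewrite used by the repacking loop above
i='core-i18n/de/labels_de.nt.bz2'
echo "${i%bz2}gz"   # prints core-i18n/de/labels_de.nt.gz
```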

Data Cleaning and The bulk loader scripts

In contrast to the previous versions of this article, the Virtuoso import will take care of shortening too long IRIs itself. Also, the bulk loader script is included in the more recent Virtuoso versions, so as a reference only: see the old version of this guide for the cleaning script, and VirtBulkRDFLoaderExampleDbpedia and VirtBulkRDFLoaderScript for info about the bulk loader scripts.

Importing DBpedia dumps into Virtuoso

Now AFTER the re-/unpacking of the DBpedia dumps we will register all files in the DBpedia dir (recursively ld_dir_all) to be added to the DBpedia graph. If you use this method make sure that only files reside in the given subtree that you really want to import.
Also don’t forget to import the dbpedia_2015-04.owl file!
If you only want one directory’s files to be added (non recursive) use ld_dir('dir', '*.*', 'graph');.
If you manually want to add some files, use ld_add('file', 'graph');.
See the VirtBulkRDFLoaderScript file for details.

Be warned that it might be a bad idea to import the normal and i18n dataset into the same graph if you didn’t select specific languages, as it might introduce a lot of duplicates that are hard to disentangle.

In order to keep track (and easily reproduce) what was selected and imported into which graph, I actually link (ln -s) the repacked files into a directory structure beneath /usr/local/data/datasets/dbpedia/2015-04/importedGraphs/ and import from there instead. To make sure you think about this, I use that path below, so it won’t work if you didn’t pay attention. If you really want to import all downloaded files, just import /usr/local/data/datasets/dbpedia/2015-04/.

Also be aware of the fact that if you load certain parts of the dumps into different graphs (as I did with the pagelinks, as well as the i18n versions of the DE datasets), only triples from the http://dbpedia.org graph will be shown when you visit the local pages with your browser (SPARQL is unaffected by this)!

So if you only want to load the same datasets as loaded on the official endpoint then importing the core folder (first section below) and dbpedia_2015-04.owl file should be enough.

The following will prepare the linking for the datasets we loaded:

cd /usr/local/data/datasets/dbpedia/2015-04/
mkdir importedGraphs
cd importedGraphs

mkdir dbpedia.org
cd dbpedia.org
# ln -s ../../dbpedia*.owl ./  # see below!
ln -s ../../core/*.nt.gz ./
cd ..

mkdir ext.dbpedia.org
cd ext.dbpedia.org
ln -s ../../core-i18n/en/anchor-text_en.nt.gz ./
ln -s ../../core-i18n/en/article-templates_en.nt.gz ./
ln -s ../../core-i18n/en/genders_en.nt.gz ./
ln -s ../../core-i18n/en/instance_types_dbtax-dbo.nt.gz ./
ln -s ../../core-i18n/en/instance_types_dbtax_ext.nt.gz ./
ln -s ../../core-i18n/en/instance_types_lhd_dbo_en.nt.gz ./
ln -s ../../core-i18n/en/instance_types_lhd_ext_en.nt.gz ./
ln -s ../../core-i18n/en/out-degree_en.nt.gz ./
ln -s ../../core-i18n/en/page-length_en.nt.gz ./
cd ..

mkdir pagelinks.dbpedia.org
cd pagelinks.dbpedia.org
ln -s ../../core-i18n/en/page-links_en.nt.gz ./
cd ..

mkdir topicalconcepts.dbpedia.org
cd topicalconcepts.dbpedia.org
ln -s ../../core-i18n/en/topical-concepts_en.nt.gz ./
cd ..

mkdir de.dbpedia.org
cd de.dbpedia.org
ln -s ../../core-i18n/de/article-categories_de.nt.gz ./
ln -s ../../core-i18n/de/article-templates_de.nt.gz ./
ln -s ../../core-i18n/de/category-labels_de.nt.gz ./
ln -s ../../core-i18n/de/disambiguations_de.nt.gz ./
ln -s ../../core-i18n/de/external-links_de.nt.gz ./
ln -s ../../core-i18n/de/freebase-links_de.nt.gz ./
ln -s ../../core-i18n/de/geo-coordinates_de.nt.gz ./
ln -s ../../core-i18n/de/geonames_links_de.nt.gz ./
ln -s ../../core-i18n/de/homepages_de.nt.gz ./
ln -s ../../core-i18n/de/images_de.nt.gz ./
ln -s ../../core-i18n/de/infobox-properties_de.nt.gz ./
ln -s ../../core-i18n/de/infobox-property-definitions_de.nt.gz ./
ln -s ../../core-i18n/de/instance-types_de.nt.gz ./
ln -s ../../core-i18n/de/instance_types_lhd_dbo_de.nt.gz ./
ln -s ../../core-i18n/de/instance_types_lhd_ext_de.nt.gz ./
ln -s ../../core-i18n/de/instance-types-transitive_de.nt.gz ./
ln -s ../../core-i18n/de/interlanguage-links-chapters_de.nt.gz ./
ln -s ../../core-i18n/de/interlanguage-links_de.nt.gz ./
ln -s ../../core-i18n/de/iri-same-as-uri_de.nt.gz ./
ln -s ../../core-i18n/de/labels_de.nt.gz ./
ln -s ../../core-i18n/de/long-abstracts_de.nt.gz ./
ln -s ../../core-i18n/de/mappingbased-properties_de.nt.gz ./
ln -s ../../core-i18n/de/out-degree_de.nt.gz ./
ln -s ../../core-i18n/de/page-ids_de.nt.gz ./
ln -s ../../core-i18n/de/page-length_de.nt.gz ./
ln -s ../../core-i18n/de/persondata_de.nt.gz ./
ln -s ../../core-i18n/de/pnd_de.nt.gz ./
ln -s ../../core-i18n/de/revision-ids_de.nt.gz ./
ln -s ../../core-i18n/de/revision-uris_de.nt.gz ./
ln -s ../../core-i18n/de/short-abstracts_de.nt.gz ./
ln -s ../../core-i18n/de/skos-categories_de.nt.gz ./
ln -s ../../core-i18n/de/specific-mappingbased-properties_de.nt.gz ./
ln -s ../../core-i18n/de/transitive-redirects_de.nt.gz ./
ln -s ../../core-i18n/de/wikipedia-links_de.nt.gz ./
cd ..

mkdir pagelinks.de.dbpedia.org
cd pagelinks.de.dbpedia.org
ln -s ../../core-i18n/de/page-links_de.nt.gz ./
cd ..

This should have prepared your importedGraphs directory. From this directory you can run the following command which prints out the necessary isql-vt commands to register your graphs for importing:

for g in * ; do echo "ld_dir_all('$(pwd)/$g', '*.*', 'http://$g');" ; done
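To see what this loop prints without touching your real data, you can dry-run it in a scratch directory; dbpedia.org and de.dbpedia.org here are just example graph-directory names:

```shell
# dry-run of the registration-command generator in a temp dir
cd "$(mktemp -d)"
mkdir dbpedia.org de.dbpedia.org
for g in * ; do echo "ld_dir_all('$(pwd)/$g', '*.*', 'http://$g');" ; done
# one ld_dir_all(...) line per directory / graph
```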

One more thing (thanks to Romain): in order for the DBpedia VAD package (which is installed at the end) to work correctly, the dbpedia_2015-04.owl file needs to be imported into the graph <http://dbpedia.org/resource/classes#>.

Note: In the following I will assume that your Virtuoso isql command is called isql-vt. If you’re in lack of such a command, it might be called isql or isql-v, but this usually means you installed it using some other method than described here.

isql-vt # enter Virtuoso isql mode
-- we are in sql mode now
ld_add('/usr/local/data/datasets/remote/dbpedia/2015-04/dbpedia_2015-04.owl', 'http://dbpedia.org/resource/classes#');
ld_dir_all('/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/dbpedia.org', '*.*', 'http://dbpedia.org');
ld_dir_all('/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/ext.dbpedia.org', '*.*', 'http://ext.dbpedia.org');
ld_dir_all('/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/pagelinks.dbpedia.org', '*.*', 'http://pagelinks.dbpedia.org');
ld_dir_all('/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/topicalconcepts.dbpedia.org', '*.*', 'http://topicalconcepts.dbpedia.org');
ld_dir_all('/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/de.dbpedia.org', '*.*', 'http://de.dbpedia.org');
ld_dir_all('/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/pagelinks.de.dbpedia.org', '*.*', 'http://pagelinks.de.dbpedia.org');

-- do the following to see which files were registered to be added:
select * from DB.DBA.LOAD_LIST;
-- if unsatisfied use:
-- delete from DB.DBA.LOAD_LIST;
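Registering files only fills DB.DBA.LOAD_LIST; nothing is imported yet. The actual import is then started from isql with the bulk loader (standard Virtuoso bulk loader usage, see the VirtBulkRDFLoaderScript docs; run several of these in parallel to use more cores, and checkpoint once all of them are done):

```
-- still in isql: start a bulk loader run (one per core you want to use)
rdf_loader_run();
-- after all loader runs have finished, persist everything to the DB file:
checkpoint;
```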

You can now also register other datasets that you want to be loaded, like Freebase, DBLP, Yago, Umbel, …, after downloading them to the appropriate directories, like this:

ld_add('/usr/local/data/datasets/remote/', '');
ld_dir_all('/usr/local/data/datasets/remote/umbel/External Ontologies', '*.n3', 'http://umbel.org');
ld_add('/usr/local/data/datasets/remote/umbel/Ontology/umbel.n3', 'http://umbel.org');
ld_add('/usr/local/data/datasets/remote/umbel/Reference Structure/umbel_reference_concepts.n3', 'http://umbel.org');
ld_add('/usr/local/data/datasets/remote/yago/yago3/2015-11-04/yagoLabels.ttl.gz', 'http://yago-knowledge.org/resource');

ld_add('/usr/local/data/datasets/remote/dblp/l3s/2015-11-04/dblp.nt.gz', 'http://dblp.l3s.de');

ld_dir_all('/usr/local/data/datasets/remote/wikidata/', '*.nt.gz', 'http://www.wikidata.org');
ld_dir_all('/usr/local/data/datasets/remote/freebase/2015-08-09', '*.nt.gz', 'http://rdf.freebase.com');
ld_dir_all('/usr/local/data/datasets/remote/linkedgeodata/2014-09-09', '*.*', 'http://linkedgeodata.org');

Our full DB.DBA.LOAD_LIST currently looks like this:

select ll_graph, ll_file from DB.DBA.LOAD_LIST;
ll_graph                               ll_file
VARCHAR                                VARCHAR NOT NULL
_______________________________________________________
[one row per registered file: the DBLP dump, the dbpedia_2015-04.owl file, the files beneath importedGraphs/, and the Freebase, LinkedGeoData, Umbel, Wikidata and Yago files, each listed with its target graph]
/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                  /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 
/usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                 /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/           /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/        /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/     /usr/local/data/datasets/remote/dbpedia/2015-04/importedGraphs/                /usr/local/data/datasets/remote/freebase/2015-08-09/fb2w.nt.gz                /usr/local/data/datasets/remote/freebase/2015-08-09/freebase-rdf-2015-08-09-00-01.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Abutters.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Abutters.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-AerialwayThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-AerialwayThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-AerowayThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-AerowayThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Amenity.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Amenity.way.sorted.nt.gz               
/usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-BarrierThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-BarrierThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Boundary.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Boundary.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Craft.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Craft.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-CyclewayThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-CyclewayThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-EmergencyThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-EmergencyThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-HistoricThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-HistoricThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Leisure.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Leisure.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-LockThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-LockThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-ManMadeThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-ManMadeThing.way.sorted.nt.gz               
/usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-MilitaryThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-MilitaryThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Office.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Office.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Place.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Place.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-PowerThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-PowerThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-PublicTransportThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-PublicTransportThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-RailwayThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-RailwayThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-RouteThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-RouteThing.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Shop.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-Shop.way.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-SportThing.node.sorted.nt.gz               /usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-SportThing.way.sorted.nt.gz               
/usr/local/data/datasets/remote/linkedgeodata/2014-09-09/2014-09-09-ontology.sorted.nt.gz                      /usr/local/data/datasets/remote/              /usr/local/data/datasets/remote/umbel/External Ontologies/dbpedia-ontology.n3              /usr/local/data/datasets/remote/umbel/External Ontologies/geonames.n3              /usr/local/data/datasets/remote/umbel/External Ontologies/opencyc.n3              /usr/local/data/datasets/remote/umbel/External Ontologies/same-as.n3              /usr/local/data/datasets/remote/umbel/External Ontologies/              /usr/local/data/datasets/remote/umbel/External Ontologies/wikipedia.n3                 /usr/local/data/datasets/remote/umbel/Ontology/umbel.n3              /usr/local/data/datasets/remote/umbel/Reference Structure/umbel_reference_concepts.n3                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/                /usr/local/data/datasets/remote/wikidata/     /usr/local/data/datasets/remote/yago/yago3/2015-11-04/yagoLabels.ttl.gz

219 Rows. -- 8 msec.
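If you ever need to re-register dump files (for example after moving directories), a small script can generate the corresponding ld_dir() statements for isql-vt. This is a sketch only: the directory-to-graph mapping below is illustrative, not this guide's exact setup, so substitute your own paths and graph IRIs.

```python
# Sketch: generate ld_dir() registration statements for isql-vt.
# The directory -> graph mapping here is illustrative, not this guide's exact setup.
DATASETS = {
    "/usr/local/data/datasets/remote/freebase/2015-08-09": "http://rdf.freebase.com",
    "/usr/local/data/datasets/remote/linkedgeodata/2014-09-09": "http://linkedgeodata.org",
}

def ld_dir_statements(datasets, mask="*.*"):
    """One ld_dir(path, mask, graph) call per dataset directory."""
    return [f"ld_dir('{path}', '{mask}', '{graph}');"
            for path, graph in datasets.items()]

for stmt in ld_dir_statements(DATASETS):
    print(stmt)
```

Paste the printed statements into an isql-vt session, then check the result with select * from DB.DBA.LOAD_LIST; as shown above.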

OK, now comes the fun (and long) part: about 1.5 hours for DBpedia alone (the new Virtuoso 7 is cool 😉), plus roughly 6 hours for Freebase… Now that we have registered the files to be added, let’s finally start the loading process. Fire up screen if you didn’t already. (For more detailed metering than below see VirtTipsAndTricksGuideLDMeterUtility.)

sudo apt-get install screen
screen isql-vt
-- depending on the number of CPUs and your IO performance you can run
-- more rdf_loader_run(); commands in other isql-vt sessions, which will
-- speed up the import process.
-- you can watch the progress from another isql-vt session with:
-- select * from DB.DBA.LOAD_LIST;
-- if you need to stop the loading for any reason: rdf_load_stop();
-- if you want to force stopping: rdf_load_stop(1);
rdf_loader_run();
commit work;
checkpoint;
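The LOAD_LIST table mentioned in the comments above is also handy for a quick progress summary once you fetch its rows (via isql-vt or ODBC). A minimal sketch, assuming the usual ll_state codes of Virtuoso's bulk loader (0 = to load, 1 = loading, 2 = done):

```python
# Sketch: summarize bulk-load progress from DB.DBA.LOAD_LIST rows.
# Assumed ll_state codes: 0 = to load, 1 = loading, 2 = done.
from collections import Counter

def load_progress(rows):
    """rows: iterable of (ll_file, ll_state) tuples fetched from LOAD_LIST."""
    states = Counter(state for _, state in rows)
    return {"queued": states[0], "loading": states[1], "done": states[2]}

example = [("a.nt.gz", 2), ("b.nt.gz", 2), ("c.nt.gz", 1), ("d.nt.gz", 0)]
print(load_progress(example))  # {'queued': 1, 'loading': 1, 'done': 2}
```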

After this:
Take a look into /var/lib/virtuoso-opensource-7/db/virtuoso.log:


Should you find any errors in there… FIX THEM! You might still be able to use the dump, but it will be incomplete in those cases: any error aborts the loading of the corresponding file and continues with the next one, so only the part of that file up to the place where the error occurred was imported. (Should you find errors you can’t fix, please leave a comment.)
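A quick way to sift a long log for problems is a simple filter. This is a minimal sketch; the error patterns are assumptions, so extend them for whatever your virtuoso.log actually contains:

```python
# Sketch: filter a Virtuoso log for lines that look like loader errors.
# The patterns are guesses; adapt them to your actual virtuoso.log contents.
import re

ERROR_RE = re.compile(r"\berror\b|cannot load|syntax", re.IGNORECASE)

def find_load_errors(log_text):
    return [line for line in log_text.splitlines() if ERROR_RE.search(line)]

# fabricated two-line example log:
sample = ("10:00:01 PL LOG: Loader started\n"
          "10:05:02 Error 37000: syntax error in file example.nt")
for line in find_load_errors(sample):
    print(line)
```

Run it over the real log with find_load_errors(open("/var/lib/virtuoso-opensource-7/db/virtuoso.log").read()).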

Final polishing

You can & should now install the DBpedia and RDF Mappers packages from the Virtuoso Conductor.

login: dba
pw: dba

Go to System Admin / Packages. Install the DBpedia (v. 1.4.30) and rdf_mappers (v. 1.34.74) packages (takes about 5 minutes).

Testing your local mirror

Go to the SPARQL endpoint of your server at http://your-server:8890/sparql (or in isql-vt prefix the query with SPARQL):

sparql SELECT count(*) WHERE { ?s ?p ?o } ;

This shouldn’t take long in Virtuoso 7 anymore and for me now returns 849,521,186 for DBpedia (en+de) or 5,959,006,725 with all the datasets mentioned above.
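If you prefer to test the endpoint programmatically instead of via the web form, a tiny standard-library client is enough. The endpoint URL and the JSON results format parameter assume a default local Virtuoso install:

```python
# Sketch: query the local SPARQL endpoint over HTTP (standard library only).
# Endpoint URL and result format assume a default local Virtuoso install.
import json
import urllib.parse
import urllib.request

ENDPOINT = "http://localhost:8890/sparql"

def build_query_url(query, endpoint=ENDPOINT):
    # standard SPARQL protocol GET parameters
    params = urllib.parse.urlencode(
        {"query": query, "format": "application/sparql-results+json"})
    return f"{endpoint}?{params}"

def run_query(query):
    # only works against a running Virtuoso instance
    with urllib.request.urlopen(build_query_url(query)) as resp:
        return json.load(resp)["results"]["bindings"]

print(build_query_url("SELECT (COUNT(*) AS ?c) WHERE { ?s ?p ?o }"))
```

run_query() will only succeed once the server is up; against the mirror above the count query should return the figure mentioned in the text.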

I also like this query showing all the graphs and how many triples are in them:

sparql SELECT ?g COUNT(*) as ?c { GRAPH ?g {?s ?p ?o.} } GROUP BY ?g ORDER BY DESC(?c);
g                                                            c
LONG VARCHAR                                                 LONG VARCHAR
…                                                            3126890738
…                                                            1013866920
…                                                             841008708
http://localhost:8890/DAV/                                         4806
virtrdf-label                                                       638
http://localhost:8890/sparql                                         14
dbprdf-label                                                          6
(… 62 rows in total; most graph IRIs were garbled in extraction …)

62 Rows. -- 58092 msec.

Congratulations, you just imported nearly 850 million triples (or nearly 6 G triples for all datasets).

Backing up this initial state

Now is a good moment to backup the whole db (takes about half an hour):

sudo -i
cd /
/etc/init.d/virtuoso-opensource stop &&
tar -cvf - /var/lib/virtuoso-opensource-7 | lzop > virtuoso-7.2.1-DBDUMP-$(date '+%F')-dbpedia-2015-04-en_de.tar.lzop &&
/etc/init.d/virtuoso-opensource start

Afterwards you might want to repack this with xz (lzma) like this:

# apt-get install xz-utils pxz lzop
for f in virtuoso-*-DBDUMP-*.tar.lzop ; do lzop -d -c "$f" | pxz > "${f%.lzop}.xz" ; done

Yay, done 😉
As always, feel free to leave comments if I made a mistake, or to tell us about your problems or how happy you are :D.


Many thanks to the DBpedia team for their endless efforts of providing us all with a great dataset. Also many thanks to the Virtuoso crew for releasing an OpenSource version of their DB.


  • 2015-12-07: added a check for older installed versions.

34 thoughts on “Setting up a Linked Data mirror from RDF dumps (DBpedia 2015-04, Freebase, Wikidata, LinkedGeoData, …) with Virtuoso 7.2.1 and Docker (optional)”

  1. Pingback: Setting up a local DBpedia 3.9 mirror with Virtuoso 7 | Jörn's Blog

  2. Pingback: Setting up a local DBpedia 2014 mirror with Virtuoso 7.1.0 | Jörn's Blog

  3. Pingback: Setting up a local DBpedia 3.7 mirror with Virtuoso 6.1.5+ | Jörn's Blog

  4. Pingback: Setting up a local DBpedia mirror with Virtuoso | Jörn's Blog

  5. Asha Subramanian

Hi, I need urgent help!! I have executed your instructions to the exact specifications. Finally, when I try to install the virtuoso server from the packages as per the instructions:
    sudo apt-get install virtuoso-server \
    virtuoso-vad-bpel \
    virtuoso-vad-conductor \
    virtuoso-vad-demo \
    virtuoso-vad-doc \
    virtuoso-vad-isparql \
    virtuoso-vad-ods \
    virtuoso-vad-rdfmappers \
    virtuoso-vad-sparqldemo \
    virtuoso-vad-syncml \

    I get the following error –
    itl@webobservatory:~/virtuoso_deb$ sudo apt-get install virtuoso-server \
    > virtuoso-vad-bpel \
    > virtuoso-vad-conductor \
    > virtuoso-vad-demo \
    > virtuoso-vad-doc \
    > virtuoso-vad-isparql \
    > virtuoso-vad-ods \
    > virtuoso-vad-rdfmappers \
    > virtuoso-vad-sparqldemo \
    > virtuoso-vad-syncml \
    > virtuoso-vad-tutorial
    Reading package lists… Done
    Building dependency tree
    Reading state information… Done
    Some packages could not be installed. This may mean that you have
    requested an impossible situation or if you are using the unstable
    distribution that some required packages have not yet been created
    or been moved out of Incoming.
    The following information may help to resolve the situation:

    The following packages have unmet dependencies:
    virtuoso-server : Depends: virtuoso-opensource-7 but it is not going to be installed
    E: Unable to correct problems, you have held broken packages.

    I am on
    Distributor ID: Ubuntu
    Description: Ubuntu 14.04.3 LTS
    Release: 14.04
    Codename: trusty
    and trying to install Virtuoso 7.2.1 along with DBPEDIA latest dumps
    Thanks in advance,

    1. joern Post author

what does `sudo apt-get install virtuoso-opensource-7` do? Maybe also try with aptitude; its error messages are sometimes more informative: `sudo aptitude install virtuoso-server …`

    2. Asha Subramanian

When I try any of the commands you mention, I am asked to run sudo apt-get -f install. I was unable to get aptitude installed. When I run sudo apt-get -f install, this is the output I get:

      itl@webobservatory:~/virtuoso_deb$ sudo apt-get -f install
      Reading package lists… Done
      Building dependency tree
      Reading state information… Done
      Correcting dependencies… Done
      The following package was automatically installed and is no longer required:
      Use ‘apt-get autoremove’ to remove it.
      The following extra packages will be installed:
      libiodbc2 virtuoso-opensource-7-bin virtuoso-opensource-7-common
      Suggested packages:
      The following packages will be REMOVED:
      odbcinst odbcinst1debian2 virtuoso-opensource-6.1
      The following NEW packages will be installed:
      libiodbc2 virtuoso-opensource-7-bin virtuoso-opensource-7-common
      0 upgraded, 3 newly installed, 3 to remove and 50 not upgraded.
      18 not fully installed or removed.
      Need to get 0 B/5,520 kB of archives.
      After this operation, 13.9 MB of additional disk space will be used.
      Do you want to continue? [Y/n] y
      WARNING: The following packages cannot be authenticated!
      virtuoso-opensource-7-common virtuoso-opensource-7-bin
      Install these packages without verification? [y/N] y
      (Reading database … 183285 files and directories currently installed.)
      Preparing to unpack …/virtuoso-opensource-7-common_7.2_amd64.deb …
      Unpacking virtuoso-opensource-7-common (7.2) …
      dpkg: error processing archive /var/cache/apt/archives/virtuoso-opensource-7-common_7.2_amd64.deb (–unpack):
      trying to overwrite ‘/usr/bin/inifile’, which is also in package virtuoso-opensource-6.1-common 6.1.6+repack-0ubuntu3
      Errors were encountered while processing:
      E: Sub-process /usr/bin/dpkg returned an error code (1)

      There are no hints on the web to resolve this error.. Can you pls help ?


      1. joern Post author

This looks very much like you’re trying to install Virtuoso 7 on a system where Virtuoso 6 is/was already installed. A lot has changed since Virtuoso 6, especially regarding the Debian packages. I’m not entirely sure they can co-exist or properly upgrade from earlier versions (and your problem seems to indicate they don’t), which is why I’d suggest uninstalling Virtuoso 6 and related packages before installing Virtuoso 7. If you didn’t install Virtuoso 6 yourself, proceed with care, as some system services might use it and Virtuoso 7 will use a different DB folder by default. If you want to keep your system untouched, I’d suggest using docker to isolate Virtuoso 7 from your system.

You can find the currently installed packages like this: `dpkg -l | grep virtuoso`, and uninstall with `apt-get purge …` to remove the packages and config files. Afterwards I’d also run `apt-get update ; apt-get upgrade` as you seem to have quite some packages on hold. If you don’t get any more errors, you should now be able to install Virtuoso 7.

        1. asha subramanian

That’s precisely what I did: removed all traces of Virtuoso 6 and its associated packages and then continued with your instructions to install Virtuoso 7. I am all set now and well on my way to installing the DBpedia files. Thanks for all the help.

  6. Leandro

    Hi, I’m trying to set up a dbpedia mirror. I did all the steps until rdf_loader_run(); but I got an “Out of disk space” error in virtuoso.log file. Do I need to change the virtuoso db dir?


    1. joern Post author

Seems so… by default your DB will end up in `/var/lib/virtuoso-opensource-7`; check with `df -h` and symlink as necessary.

      1. Leandro

Thanks for the response! There wasn’t enough available space in ‘/var/lib/virtuoso-opensource-7’, so I moved the ‘db’ folder to another partition and created the symlinks. My local mirror works OK (only dbpedia-EN), but unfortunately when I checked the data, I realized that it is not the same data as DBpedia. For example, if I run a local SPARQL query to get the types of the resource dbr:Argentina, I get additional data that is not in the original online DBpedia, e.g. that dbr:Argentina has the rdf types “dbo:List” or “dbo:Article”, which seems to be an error. I got similar results with other resources like persons. Is there any way to check the possible origin of these errors?


  7. Leandro

    Hi Joern,

If I need to set up a mirror of another system (like Freebase, DBLP, LinkedGeoData, etc.), I have to download it and register it as you mentioned above, but the question is: can this affect the DBpedia mirror? When I run a SPARQL query to get DBpedia classes, for example, does it query all the datasets? Is there any way to create different (independent) datasets, similar to the way we create different databases in a conventional DBMS like MySQL?

    1. joern Post author

Hmm, as far as I know there is no “simple” way to achieve something like this in Virtuoso… Stardog for example allows you to run several “databases” in one instance, but I never stumbled over this feature in Virtuoso. It’s probably possible to “fake” this with Virtuoso’s permission system by making a separate endpoint (e.g., localhost:8890/sparql_endpoint2), assigning another user to it and giving it other permissions on the loaded named graphs, but it’s definitely not “different databases” “like MySQL” or like Stardog.

      What i’d recommend is to simply use docker to run several Virtuoso instances as you need them… you can run them alongside each other on different ports as you like or just start them on demand…

  8. Vladimir Alexiev

    Why do you think IRIs shouldn’t work on English dbpedia? is an encoded URL that means exactly the same asöder. It’s not “some magic in your browser”, but standard URL encoding: if you load RDF files with the encoded URL, the statements will still end up at the unencoded URL. You can check eg with this:
    select * { ?p ?o}

    Of course, German DBpedia may have more statements about that resource (and as you wrote, indeed may have more resources that are not in English DBpedia). That’s why you have statements like

    Or for Bulgarian resource:

    You will agree the last URL is much better than its encoded variant That’s why recent DBpedia releases use IRIs only, and not encoded URLs.

  9. Hoa Ngo

    Hi Joern,

Thank you for the detailed instructions. I am following your steps to manually install Virtuoso 7 & DBpedia 2015-04 (only for English). It has taken 56 hours so far to load the dump data into the virtuoso database, and it has not finished yet. Checking the database size (/var/lib/virtuoso-opensource-7/db/virtuoso.db), it is now 47G and still increasing.

My Dell Optiplex 990 has 8G RAM and the CPU is an i5-2500 with 4 cores. The OS is Ubuntu 14.04. I am separately running 4 terminals with: screen isql-vt

I wonder if I correctly run 4 parallel isql-vt sessions like that? Do you have any experience with this situation? Do you have any idea why it runs so slowly?


    1. joern Post author

You need more than 8 GB of RAM to efficiently load that dataset. If that’s not an option, I at least wouldn’t run multiple importer processes, as your system will already be busy enough with all the IO seeks and cache invalidation.

  10. norm


How can we load only a subset of the data, like only geographical data?


  11. Anish Pradhan

I am trying to run queries at “http://localhost:8890/sparql”, but I get nothing as a response. Is that the right interface to run queries?

  12. David

    I’ve setup a dbpedia 3.9 mirror, but when I do your:
    SELECT ?g COUNT(*) as ?c { GRAPH ?g {?s ?p ?o.} } GROUP BY ?g ORDER BY DESC(?c)

    I get: 322401908 2037874

    …I’m guessing I went wrong somewhere? any ideas how to merge them?

    1. joern Post author

Yes, I guess you just registered certain things to be loaded into different graphs (Virtuoso doesn’t magically remove the trailing /, but takes whatever you pass in literally).

  13. Armin

    Hi Jörn,

thank you so much for writing this guide. I’m currently trying to get the docker version to run. I’ve downloaded the files and so on, but when I try to launch the first container with

docker run -d --name dbpedia-vadinst \
  -v "$db_dir":/var/lib/virtuoso-opensource-7 \
  joernhees/virtuoso run

I get “Starting Virtuoso Open Source Edition 7.2 : virtuoso-opensource-7” and after a few seconds it says “Starting Virtuoso Open Source Edition 7.2 : virtuoso-opensource-7 failed!” I don’t get any more error messages.

Tried it on our server (126 GB) and on my machine, but I get the same error. I’m running Ubuntu 16.04 and Docker version 17.03.1-ce, build c6d412e.

    Any help would be appreciated

    1. Armin

      Played around with it, it seems the problem is the db_dir directory.
      If I launch the container without the volume virtuoso launches, but as soon as I add it I get
      initializing db dir… done.
      [FAIL] Starting Virtuoso Open Source Edition 7.2 : virtuoso-opensource-7 failed!

      1. joern Post author

Hi, yepp sorry, it seems the init-db during the last build didn’t shut down completely. I’ll fix this as soon as I find a couple of minutes. For now the workaround is to init the db like docker run --rm -v "$db_dir":/var/lib/virtuoso-opensource-7 joernhees/virtuoso, then rm "$db_dir"/db/virtuoso.lck and then continue as you like.

  14. Renjith

I was trying to set up a mirror of DBpedia 2016-04 with Virtuoso. I tried to load it with the steps mentioned above, but it didn’t work. Is loading the .ttl files different from the instructions above? If so, it would be really helpful if you could guide me on how to load the 2016 data.

  15. Huong Nguyen

    Hi Joern,
Thank you for your guides. I ran into a problem when I did
/usr/local/data/datasets/dbpedia/2015-04$ find . -mindepth 2 -type f -print0 | xargs -0 md5sum > md5sums
bash: md5sums: Permission denied.
Please let me know how I can solve this to continue with the next steps?
Thanks in advance

  16. NasrO

Thank you very much, I’ve created a local mirror successfully using the docker instructions.
I have a question regarding opening up access to the server so that I can send SPARQL queries from anywhere on the internet. Is there a way to do that?

    Thanks in advance.

  17. Huong Nguyen

Please let me know where I can get the DBpedia (v. 1.4.30) and rdf_mappers (v. 1.34.74) packages? Thank you

