@prefix headers coming out-of-order


Hi guys!

It looks like the turtle output from the LDP server isn’t (always) correct.

curl -L -H "Accept: text/turtle" http://training.fairdata.solutions/DAV/home/LDP/gofair/ | less

You’ll see that there are a few @prefix lines at the beginning, then a line of data, then another @prefix line, followed by the rest of the data. This crashes my turtle parser. This just started happening this morning - everything was working fine on Friday.
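For anyone trying to reproduce: a quick way to confirm the interleaving in a saved response is to list the line numbers of the @prefix directives (the sample file below is a hand-made stand-in for the real output, not the server's actual response). Worth noting: Turtle 1.1 permits prefix directives between triples, so a strict older parser may reject output that newer parsers accept.

```shell
# Illustrative stand-in for the server response; mimics the interleaving.
cat > t.ttl <<'EOF'
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
rdfs:Resource rdf:type rdfs:Class .
@prefix ldp: <http://www.w3.org/ns/ldp#> .
EOF
# Line numbers reveal a directive appearing after the first triple
# (matches on lines 1, 2 and 4):
grep -n '^@prefix' t.ttl
```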

Any ideas?



Hi guys! Ummm… I’m a bit desperate on this issue. I have to give a workshop using LDP in a few days, and right now, I cannot even parse the messages coming back from the server.

Any advice appreciated… URGENTLY!



Ummmmm… hello? Anybody home?


I wiped the folder and started from scratch. I’ll let you know if I see this again.

Feel free to close this for now.



Hi guys,

So yes, I can confirm now that this is a real problem. It happens once the LDP container reaches ~800-1000 records. It happens consistently (I have created fresh "folders" 3 times, and each time this bug has appeared after I put about a thousand records into that container). It is a problem with the @prefix headers - if I take only the first 50 lines of Turtle and pass them to my parser (Ruby raptor), it segfaults.
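One possible way to narrow down the failing region, assuming the response has been saved as t.ttl and that rapper (raptor's command-line parser, presumably the same C library the Ruby bindings wrap) is installed; the file name and line counts here are arbitrary:

```shell
# Bisecting sketch: find the smallest prefix of the file that still
# breaks the parser. Skips gracefully if rapper is not installed.
if ! command -v rapper >/dev/null 2>&1; then
  echo "rapper not installed; skipping parse check" > bisect.log
else
  : > bisect.log
  for n in 10 20 30 40 50; do
    head -n "$n" t.ttl > part.ttl
    # -q: quiet warnings, -c: count triples only; base URI is arbitrary.
    if rapper -q -i turtle -c part.ttl http://example.org/ >/dev/null 2>&1; then
      echo "first $n lines: parse ok" >> bisect.log
    else
      echo "first $n lines: FAILED" >> bisect.log
    fi
  done
fi
cat bisect.log
```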

Any advice is greatly appreciated.


(by the way, if I request application/ld+json instead of text/turtle, it crashes virtuoso entirely and I have to restart)


@markwilkinson: Is the query in your first post, i.e.

curl -L -H "Accept: text/turtle" http://training.fairdata.solutions/DAV/home/LDP/gofair/

supposed to exhibit the problem currently? It does not appear to, i.e. I don't see the out-of-order @prefix headers.

Are you able to provide steps to reproduce, such that it can be recreated locally?


They come out-of-order when I make that call…??

$ curl -L -H 'Accept: text/turtle' http://training.fairdata.solutions/DAV/home/LDP/gofair/ > t.ttl
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 82520  100 82520    0     0   6078      0  0:00:13  0:00:13 --:--:-- 18922
markw@markw ~/Documents/CODE $ head -20 t.ttl
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
rdfs:Resource rdf:type rdfs:Class .
@prefix ns2: <http://training.fairdata.solutions/DAV/home/LDP/gofair/> .
@prefix ldp: <http://www.w3.org/ns/ldp#> .
ns2:obs_2147365908 rdf:type ldp:Resource .
@prefix ns4: <http://semanticscience.org/resource/> .
ns2:obs_2147365908 rdf:type ns4:measuring ,
rdfs:Resource .
ns2:species_290307346 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290308565 rdf:type ns4:pathogen ,
rdfs:Resource ,
ldp:Resource .
ns2:species_290307396 rdf:type ns4:pathogen ,
rdfs:Resource ,
ldp:Resource .
ns2:species_290310811 rdf:type ns4:pathogen ,
rdfs:Resource ,
ldp:Resource .
ns2:species_290307202 rdf:type ns4:pathogen ,
rdfs:Resource ,
ldp:Resource .
ns2:species_290310128 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290309376 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290310064 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290307373 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290309635 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290309803 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290307801 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290307721 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
ns2:species_290307422 rdf:type ldp:Resource ,
ns4:pathogen ,
rdfs:Resource .
@prefix ns5: <http://purl.org/dc/dcmitype/> .
ns2: rdf:type ns5:Dataset ,
ldp:Container ,
ldp:BasicContainer .
@prefix dc: <http://purl.org/dc/elements/1.1/> .


@markwilkinson: Can you please confirm the version of the Virtuoso binary (virtuoso-t -?) and the VAD applications (vad_list_packages()) you have installed, so that I can attempt to set up a test case locally?

Also, I presume the files in your http://training.fairdata.solutions/DAV/home/LDP/gofair/ folder are publicly available for download? And what method do you use for loading them into the Virtuoso WebDAV folder?


Virtuoso Open Source Edition (Column Store) (multi threaded)
Version 7.2.6-rc1.3230-pthreads as of Nov 2 2018 (4d226f4)
Compiled for Linux (x86_64-generic_glibc25-linux-gnu)
Copyright © 1998-2018 OpenLink Software

(using the latest openlink Docker image)

Connected to OpenLink Virtuoso
Driver: 07.20.3230 OpenLink Virtuoso ODBC Driver
name       title               version    build_date        install_date

Briefcase  ODS Briefcase       1.21.68    2018-08-16 12:08  2018-12-11 02:11
Framework  ODS Framework       1.89.47    2018-08-16 12:06  2018-12-11 02:10
conductor  Virtuoso Conductor  1.00.8785  2018-11-02 11:55  2018-12-11 02:09

Yes, the data is publicly available for download.

I use an LDP POST of an RDF resource, with a Slug header for the desired filename, and Accept: text/turtle and Content-Type: text/turtle headers, to the /DAV/home/LDP/gofair/ ldp:Container.
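For reference, that POST can be sketched with curl roughly as follows (the container URL is from this thread; the record file, its contents, and the slug value are placeholders, and the command is printed rather than sent):

```shell
# Dry-run sketch of an LDP POST with a Slug header.
# record.ttl and the slug are hypothetical; drop the echo to really send it.
CONTAINER='http://training.fairdata.solutions/DAV/home/LDP/gofair/'
SLUG='obs_0000000001'
cat > record.ttl <<'EOF'
@prefix ex: <http://example.org/> .
ex:obs_0000000001 a ex:Observation .
EOF
CMD="curl -X POST -H 'Content-Type: text/turtle' -H 'Accept: text/turtle' -H 'Slug: $SLUG' --data-binary @record.ttl $CONTAINER"
echo "$CMD"
```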

For the first few hundred records there is no problem (or at least, I have never seen one). It's only when I get into the higher numbers that it breaks - using exactly the same script, just breaking out of it at different times.

Hope that helps! Cheers!


@markwilkinson: Are you able to provide the script you run and a set of files to upload to a test LDP container, to recreate the problem locally?


Not easily, unfortunately. The script uses libraries that are not publicly available.


Re-confirming: same script, same dataset, same order of record loading (HTTP POST to an LDP container). It "breaks" at arbitrary times (today after just ~500 records!). Loading just 100 records was fine.

Very odd! (and a real blocker for me, unfortunately…)