SPARQL Insert Problems


We are trying to insert data into a triple store and have tried three methods. The first is very slow for realistic numbers of triples, the second only works for small batches, and the third fails with an error.

Could you help?

We have tried:

1. Endpoint + Triple file reader + SPARQL Insert

This was very slow for more than 1000 triples.

2. SPARQL Executor for something like

"INSERT DATA { GRAPH <TEST> {\n", $triple$, "\n} }"

where $triple$ is a single string concatenating n triples.

We only managed to run this for up to 80 triples at a time.

In comparison, we could upload 1000 triples with the same query via curl.
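Since a single INSERT DATA can carry many triples, the batching itself is cheap; a minimal Python sketch of the concatenation logic (the graph IRI and triple strings below are placeholders, not taken from the workflow):

```python
# Build batched SPARQL INSERT DATA queries from a list of N-Triples lines.
# Graph IRI, triples, and batch size are hypothetical placeholders.

def build_insert_batches(triples, graph_iri, batch_size=1000):
    """Group triples into INSERT DATA queries of at most batch_size each."""
    batches = []
    for i in range(0, len(triples), batch_size):
        chunk = "\n".join(triples[i:i + batch_size])
        batches.append(
            "INSERT DATA { GRAPH <%s> {\n%s\n} }" % (graph_iri, chunk)
        )
    return batches

triples = ['<urn:s%d> <urn:p> "o%d" .' % (n, n) for n in range(2500)]
queries = build_insert_batches(triples, "TEST")
# 2500 triples with batch_size=1000 -> 3 queries
```

Each resulting query string can then be sent as one update request, instead of one request per triple.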

Workflow without specific endpoint

3. SPARQL File Inserter

We get the following error for an UPDATE endpoint.

ERROR SPARQL File Inserter 12:130 Execute failed: HTTP 405 error making the query: HTTP method not allowed: SPARQL Update : use POST
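The error text itself hints at the cause: the endpoint rejects the HTTP method, and the SPARQL 1.1 Protocol requires updates to be sent via POST. As a sketch of the kind of request the endpoint expects, using Python's urllib (the endpoint URL is a placeholder, and the request is constructed but not actually sent):

```python
import urllib.request

# SPARQL 1.1 Update must be sent via POST with an update body.
update = "INSERT DATA { GRAPH <TEST> { <urn:s> <urn:p> <urn:o> . } }"
req = urllib.request.Request(
    "http://localhost:3030/dataset/update",  # hypothetical endpoint URL
    data=update.encode("utf-8"),
    headers={"Content-Type": "application/sparql-update"},
)
# urllib switches the method to POST automatically once a body is set.
# urllib.request.urlopen(req) would send it; omitted here.
print(req.get_method())  # POST
```

If the node is issuing the update as a GET (or the configured URL points at a query endpoint instead of the update endpoint), a 405 like the one above is the expected response.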

SPARQLInsertion.knwf (16.8 KB)

Hi @pgha ,
I took a quick look into the code of the SPARQL nodes.
The SPARQL Insert node runs an INSERT DATA query for each row individually, so this will take some time for a big table.
What is the error in this case?
That is pretty strange, as both the File Inserter and the Insert node use an update request. Did you use the same endpoint in both cases?
However, as the SPARQL File Inserter reads the file and triggers an INSERT DATA for every triple individually, I would not expect it to be any faster than the SPARQL Insert node anyway.