
greenfishk / starvers_eval


This project is a fork of rdfostrich/bear

2.0 2.0 0.0 111.98 MB

๐Ÿ‹ Fork of the BEAR benchmark with additional evaluations: https://github.com/webdata/BEAR

License: GNU Lesser General Public License v3.0

Shell 4.67% Python 7.93% Dockerfile 0.52% Jupyter Notebook 86.25% Java 0.63%


starvers_eval's Issues

ERROR in CBNG (Change-based approach with named graphs)

Querying SPARQL endpoint http://Starvers:7200/repositories/cbng_bearb_day with query lookup_queries_p_q32_v2.txt
Following row cannot be removed: ['http://dbpedia.org/resource/Islamic_State_of_Iraq_and_the_Levant', 'http://dbpedia.org/resource/Template:Infobox_war_faction']

Cause: an error in the string comparison of the FILTER clause in the SPARQL query:
filter (str(?graph) <= "http://starvers_eval/v2/added" || str(?graph) <= "http://starvers_eval/v2/deleted")
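In SPARQL, as in most languages, `<=` on strings is a lexicographic comparison, so the filter above admits any graph IRI that sorts at or before the two version-2 graphs, not just those two. A minimal Python sketch of the difference (the two IRIs are copied from the filter; the v1 graph is a hypothetical example):

```python
added = "http://starvers_eval/v2/added"
deleted = "http://starvers_eval/v2/deleted"

# A graph IRI from an earlier version also passes the `<=` filter,
# because "v1" sorts lexicographically before "v2":
graph = "http://starvers_eval/v1/added"
print(graph <= added or graph <= deleted)   # True -- unintended match

# An equality check admits only the two intended graphs:
print(graph == added or graph == deleted)   # False
```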

Cannot ingest alldata.TB_star_hierarchical.ttl

The triple

http://open-data.europa.eu/en/data/dataset/0019c864-1b5d-48da-ad41-d4bb2c18ce7b http://purl.org/dc/terms/description "Frontex Annual Risk Analysis (ARA) 2014 presents a European summary of trends and developments along the external borders of the Member States of the EU. It focuses on describing current challenges that are likely to impact on operations coordinated along the external borders. It presents the latest update\r\nregarding the situation before the border, at the border and after the border." .

in /starvers_eval/rawdata/bearc/alldata.IC.nt/26.nt turns into the following erroneous triple in /starvers_eval/rawdata/bearc/alldata.TB_star_hierarchical.ttl. The segment "external borders. It presents the latest update" (together with the trailing `\r`) is missing from the literal; it was marked in bold in the original report:

<< << http://open-data.europa.eu/en/data/dataset/0019c864-1b5d-48da-ad41-d4bb2c18ce7b http://purl.org/dc/terms/description "Frontex Annual Risk Analysis (ARA) 2014 presents a European summary of trends and developments along the external borders of the Member States of the EU. It focuses on describing current challenges that are likely to impact on operations coordinated along the\nregarding the situation before the border, at the border and after the border.">>https://github.com/GreenfishK/DataCitation/versioning/valid_from "2022-10-01T12:00:25.000+00:00"^^http://www.w3.org/2001/XMLSchema#dateTime >>https://github.com/GreenfishK/DataCitation/versioning/valid_until "2022-10-01T12:00:26.000+00:00"^^http://www.w3.org/2001/XMLSchema#dateTime .
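A likely culprit is the raw `\r\n` inside the literal: if the carriage return is not escaped during the conversion to the RDF-star dataset, the remainder of the physical line can be lost. The N-Triples grammar requires `\r` and `\n` inside double-quoted literals to be escaped; a minimal sketch of such an escaping step (a hypothetical helper, not the project's actual conversion code):

```python
def escape_literal(value):
    # Escape the characters that must not appear raw inside a
    # double-quoted N-Triples literal: backslash first, then the
    # quote, carriage return, and newline.
    return (value.replace("\\", "\\\\")
                 .replace('"', '\\"')
                 .replace("\r", "\\r")
                 .replace("\n", "\\n"))

# The control characters survive as two-character escape sequences:
print(escape_literal("latest update\r\nregarding"))
```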

Cannot build in-memory snapshot of graph for Jenatdb2 and cb_sr_ng

Stack trace

2023-02-08 Wednesday 13:48:01 root:INFO:Build snapshot version 0363 from endpoint http://Starvers:3030/cb_sr_ng_bearb_hour/sparql
2023-02-08 Wednesday 13:48:20 root:INFO:Shutting down fuseki server and finishing evaluation of cb_sr_ng_bearb_hour.

Docker stack trace

evaluate_single_1          | Traceback (most recent call last):
evaluate_single_1          |   File "/starvers_eval/scripts/6_evaluate/query.py", line 232, in <module>
evaluate_single_1          |     build_snapshot(snapshot_g)
evaluate_single_1          |   File "/starvers_eval/scripts/6_evaluate/query.py", line 221, in build_snapshot
evaluate_single_1          |     del_set = engine.query().convert()
evaluate_single_1          |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 960, in query
evaluate_single_1          |     return QueryResult(self._query())
evaluate_single_1          |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 926, in _query
evaluate_single_1          |     response = urlopener(request)
evaluate_single_1          |   File "/usr/local/lib/python3.8/urllib/request.py", line 222, in urlopen
evaluate_single_1          |     return opener.open(url, data, timeout)
evaluate_single_1          |   File "/usr/local/lib/python3.8/urllib/request.py", line 525, in open
evaluate_single_1          |     response = self._open(req, data)
evaluate_single_1          |   File "/usr/local/lib/python3.8/urllib/request.py", line 542, in _open
evaluate_single_1          |     result = self._call_chain(self.handle_open, protocol, protocol +
evaluate_single_1          |   File "/usr/local/lib/python3.8/urllib/request.py", line 502, in _call_chain
evaluate_single_1          |     result = func(*args)
evaluate_single_1          |   File "/usr/local/lib/python3.8/urllib/request.py", line 1383, in http_open
evaluate_single_1          |     return self.do_open(http.client.HTTPConnection, req)
evaluate_single_1          |   File "/usr/local/lib/python3.8/urllib/request.py", line 1358, in do_open
evaluate_single_1          |     r = h.getresponse()
evaluate_single_1          |   File "/usr/local/lib/python3.8/http/client.py", line 1348, in getresponse
evaluate_single_1          |     response.begin()
evaluate_single_1          |   File "/usr/local/lib/python3.8/http/client.py", line 316, in begin
evaluate_single_1          |     version, status, reason = self._read_status()
evaluate_single_1          |   File "/usr/local/lib/python3.8/http/client.py", line 285, in _read_status
evaluate_single_1          |     raise RemoteDisconnected("Remote end closed connection without"
evaluate_single_1          | http.client.RemoteDisconnected: Remote end closed connection without response

It does not appear to be caused by a memory overflow.
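Given that Fuseki drops the connection without a response, a pragmatic mitigation is to retry the failed request a few times before aborting the whole evaluation run. A hedged sketch (the wrapper and its parameters are assumptions, not part of query.py):

```python
import http.client
import time

def with_retry(run_query, retries=3, delay=1.0):
    # Re-issue the SPARQL request when the remote end closes the
    # connection without sending a response; give up after `retries`.
    for attempt in range(retries):
        try:
            return run_query()
        except http.client.RemoteDisconnected:
            if attempt == retries - 1:
                raise
            time.sleep(delay)
```

In build_snapshot this would wrap the failing call, e.g. `del_set = with_retry(lambda: engine.query().convert())`.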

Unexpected error for Jenatdb2, bearb_hour and tb_sr_rs

2023-03-19 Sunday 11:46:00 root:INFO:Querying SPARQL endpoint http://Starvers:3030/tb_sr_rs_bearb_hour/sparql with query lookup_queries_po_q11_v1298.txt
2023-03-19 Sunday 11:46:00 root:WARNING:The query execution lookup_queries_po_q11_v1298.txt reached the timeout of Nones. The execution_time will be set to -1. The results will not be serialized.

An exception is thrown, but unfortunately we get no error message other than the warning (whose "timeout of Nones" also suggests the timeout was never configured). I improved the script so that it also prints the actual error message. A new execution is pending.

GraphDB adding every "activated" repository to heap

GraphDB keeps adding every "activated" repository to the heap.
A repository becomes activated as soon as it is queried.

This is only an issue for multi-repository policies. Currently, I am only using single-repository policies, where every version is stored within a named graph.

Query gets stuck with the bearc dataset, cb_sr_ng policy

The following two queries get stuck when executed against rdflib's Graph().

query8_q5_v0.txt

PREFIX dcat: <http://www.w3.org/ns/dcat#>
PREFIX dc: <http://purl.org/dc/elements/1.1/>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>

# version 00
Select *
{
    ?dataset rdf:type dcat:Dataset .
    ?dataset dc:title ?title .
    ?dataset dcat:distribution ?distribution .
    ?distribution dcat:accessURL ?URL .
    ?distribution dcat:mediaType "text/csv" .
    ?distribution dc:title ?filetitle .
    ?distribution dc:description ?description .
}

query9_q0_v0.txt

PREFIX dcat: <http://www.w3.org/ns/dcat#>
PREFIX dc: <http://purl.org/dc/elements/1.1/>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>

# version 00
Select *
{
    ?dataset rdf:type dcat:Dataset .
    ?dataset dc:title ?title .
    ?distr1 dcat:distribution ?dataset .
    ?distr1 dcat:accessURL ?URL1 .
    ?distr1 dcat:mediaType "text/csv" .
    ?distr1 dc:title ?titleFile1 .
    ?distr1 dc:description ?description1 .
    ?distr2 dcat:distribution ?dataset .
    ?distr2 dcat:accessURL ?URL2 .
    ?distr2 dcat:mediaType "text/tab-separated-values" .
    ?distr2 dc:title ?titleFile2 .
    ?distr2 dc:description ?description2 .
}

Jena cannot load config file

Stacktrace

evaluate_1            | cp: cannot stat '/starvers_eval/configs/jenatdb2_ic_sr_ng_bearc/*.ttl': No such file or directory
evaluate_1            | [2023-01-26 10:23:32] Server     INFO  Apache Jena Fuseki 4.0.0
evaluate_1            | [2023-01-26 10:23:32] Config     INFO  FUSEKI_HOME=/jena-fuseki
evaluate_1            | [2023-01-26 10:23:32] Config     INFO  FUSEKI_BASE=/run
evaluate_1            | [2023-01-26 10:23:32] Config     INFO  Shiro file: file:///run/shiro.ini
evaluate_1            | [2023-01-26 10:23:33] Server     INFO  System
evaluate_1            | [2023-01-26 10:23:33] Server     INFO    Memory: 90.0 GiB
evaluate_1            | [2023-01-26 10:23:33] Server     INFO    Java:   11.0.11
evaluate_1            | [2023-01-26 10:23:33] Server     INFO    OS:     Linux 5.15.0-52-generic amd64
evaluate_1            | [2023-01-26 10:23:33] Server     INFO    PID:    158
evaluate_1            | [2023-01-26 10:23:33] Server     INFO  Started 2023/01/26 10:23:33 UTC on port 3030
evaluate_1            | Traceback (most recent call last):
evaluate_1            |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 926, in _query
evaluate_1            |     response = urlopener(request)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 222, in urlopen
evaluate_1            |     return opener.open(url, data, timeout)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 531, in open
evaluate_1            |     response = meth(req, response)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 640, in http_response
evaluate_1            |     response = self.parent.error(
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 569, in error
evaluate_1            |     return self._call_chain(*args)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 502, in _call_chain
evaluate_1            |     result = func(*args)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 649, in http_error_default
evaluate_1            |     raise HTTPError(req.full_url, code, msg, hdrs, fp)
evaluate_1            | urllib.error.HTTPError: HTTP Error 404: Not Found
evaluate_1            | 
evaluate_1            | During handling of the above exception, another exception occurred:
evaluate_1            | 
evaluate_1            | Traceback (most recent call last):
evaluate_1            |   File "/starvers_eval/scripts/6_evaluate/query.py", line 187, in <module>
evaluate_1            |     result = engine.query()
evaluate_1            |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 960, in query
evaluate_1            |     return QueryResult(self._query())
evaluate_1            |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 932, in _query
evaluate_1            |     raise EndPointNotFound(e.read())
evaluate_1            | SPARQLWrapper.SPARQLExceptions.EndPointNotFound: EndPointNotFound: It was not possible to connect to the given endpoint: check it is correct. 
evaluate_1            | 
evaluate_1            | Response:
evaluate_1            | b'Error 404: Not Found\n'
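The root cause chain is visible in the log: cp finds no *.ttl files for jenatdb2_ic_sr_ng_bearc, Fuseki then starts without the dataset, and the subsequent query gets HTTP 404. A fail-fast check before launching the server would surface the real error; a minimal sketch (the helper name is an assumption, not part of the scripts):

```python
import glob
import os

def require_configs(config_dir):
    # Hypothetical pre-flight check: abort before starting Fuseki when
    # the assembler (*.ttl) files are missing, instead of letting the
    # server start without the dataset and failing later with HTTP 404.
    paths = sorted(glob.glob(os.path.join(config_dir, "*.ttl")))
    if not paths:
        raise FileNotFoundError(f"no *.ttl config files in {config_dir}")
    return paths
```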

Java crashed during the construct_tb_star_ds function call in construct_datasets.py

Stack trace

construct_datasets_1 | # A fatal error has been detected by the Java Runtime Environment:
construct_datasets_1 | #
construct_datasets_1 | # SIGBUS (0x7) at pc=0x00007f74ff550c76, pid=48, tid=96
construct_datasets_1 | #
construct_datasets_1 | # JRE version: OpenJDK Runtime Environment 18.9 (11.0.11+9) (build 11.0.11+9)
construct_datasets_1 | # Java VM: OpenJDK 64-Bit Server VM 18.9 (11.0.11+9, mixed mode, tiered, g1 gc, linux-amd64)
construct_datasets_1 | # Problematic frame:
construct_datasets_1 | # v ~StubRoutines::jlong_disjoint_arraycopy
construct_datasets_1 | #
construct_datasets_1 | # Core dump will be written. Default location: Core dumps may be processed with "/lib/systemd/systemd-coredump %P %u %g %s %t 9223372036854775808 %h" (or dumping to //core.48)
construct_datasets_1 | #
construct_datasets_1 | # An error report file with more information is saved as:
construct_datasets_1 | # //hs_err_pid48.log
construct_datasets_1 | Compiled method (c2) 694701 5184 4 org.apache.jena.dboe.trans.bplustree.BPTreeRecords::promote (116 bytes)
construct_datasets_1 | total in heap [0x00007f7507373310,0x00007f7507376a88] = 14200
construct_datasets_1 | relocation [0x00007f7507373488,0x00007f75073735f0] = 360
construct_datasets_1 | main code [0x00007f7507373600,0x00007f7507374e80] = 6272
construct_datasets_1 | stub code [0x00007f7507374e80,0x00007f7507374ee0] = 96
construct_datasets_1 | oops [0x00007f7507374ee0,0x00007f7507374f00] = 32
construct_datasets_1 | metadata [0x00007f7507374f00,0x00007f7507375070] = 368
construct_datasets_1 | scopes data [0x00007f7507375070,0x00007f7507376070] = 4096
construct_datasets_1 | scopes pcs [0x00007f7507376070,0x00007f75073767f0] = 1920
construct_datasets_1 | dependencies [0x00007f75073767f0,0x00007f7507376810] = 32
construct_datasets_1 | handler table [0x00007f7507376810,0x00007f7507376918] = 264
construct_datasets_1 | nul chk table [0x00007f7507376918,0x00007f7507376a88] = 368
construct_datasets_1 | Could not load hsdis-amd64.so; library not loadable; PrintAssembly is disabled
construct_datasets_1 | #
construct_datasets_1 | # If you would like to submit a bug report, please visit:
construct_datasets_1 | # https://bugreport.java.com/bugreport/crash.jsp
construct_datasets_1 | #

SPARQL-star delete statements are stalled in Jena

Docker-compose output

eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO  Apache Jena Fuseki 4.0.0
eval_construct_datasets_1  | [2023-03-23 08:44:31] Config     INFO  FUSEKI_HOME=/jena-fuseki
eval_construct_datasets_1  | [2023-03-23 08:44:31] Config     INFO  FUSEKI_BASE=/run
eval_construct_datasets_1  | [2023-03-23 08:44:31] Config     INFO  Shiro file: file:///run/shiro.ini
eval_construct_datasets_1  | [2023-03-23 08:44:31] Config     INFO  Configuration file: /run/config.ttl
eval_construct_datasets_1  | [2023-03-23 08:44:31] Config     INFO  Load configuration: file:///run/configuration/tb_rs_sr_bearc.ttl
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO  Configuration file: /run/config.ttl
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO  Path = /tb_rs_sr_bearc
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO  System
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO    Memory: 90.0 GiB
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO    Java:   11.0.11
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO    OS:     Linux 5.15.0-52-generic amd64
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO    PID:    207
eval_construct_datasets_1  | [2023-03-23 08:44:31] Server     INFO  Started 2023/03/23 08:44:31 UTC on port 3030
eval_construct_datasets_1  | [2023-03-23 08:44:32] Fuseki     INFO  [1] POST http://Starvers:3030/tb_rs_sr_bearc/update

Query from logfile:

PREFIX vers: <https://github.com/GreenfishK/DataCitation/versioning/>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>


delete {
    << <<?s ?p ?o>> vers:valid_from ?valid_from >> vers:valid_until "9999-12-31T00:00:00.000+02:00"^^xsd:dateTime
}
insert {
    << <<?s ?p ?o>> vers:valid_from ?valid_from >> vers:valid_until ?newVersion. # outdate
}
where {
    values (?s ?p ?o) {
        ( <http://open-data.europa.eu/en/data/dataset/a140d18d-1b9c-46b7-bff9-064dfa89d816/resource/3a763560-dd38-43e1-b259-5e42bddbc910> <http://purl.org/dc/terms/license> <http://ec.europa.eu/geninfo/legal_notices_en.htm> )
        }
    # versioning
    << <<?s ?p ?o>> vers:valid_from ?valid_from >> vers:valid_until "9999-12-31T00:00:00.000+02:00"^^xsd:dateTime.
    BIND(xsd:dateTime("2022-10-01T12:00:01.000+00:00") AS ?newVersion).
}

Query with many joins gets stuck in the scenario tb_sr_rs, bearc and Jena

Querying SPARQL endpoint http://Starvers:3030/tb_sr_rs_bearc/sparql with query query9_q0_v0.txt

Query

PREFIX vers: <https://github.com/GreenfishK/DataCitation/versioning/> 
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>  
SELECT ?description2 ?title ?distr1 ?URL2 ?description1 ?distr2 ?titleFile2 ?dataset ?titleFile1 ?URL1 {
<< <<?distr1 <http://www.w3.org/ns/dcat#mediaType> "text/csv">> vers:valid_from ?valid_from_1 >> vers:valid_until ?valid_until_1. filter(?valid_from_1 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_1)  
<< <<?distr1 <http://www.w3.org/ns/dcat#distribution> ?dataset>> vers:valid_from ?valid_from_2 >> vers:valid_until ?valid_until_2. filter(?valid_from_2 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_2) 
<< <<?distr2 <http://www.w3.org/ns/dcat#distribution> ?dataset>> vers:valid_from ?valid_from_3 >> vers:valid_until ?valid_until_3. filter(?valid_from_3 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_3)  
<< <<?distr2 <http://www.w3.org/ns/dcat#mediaType> "text/tab-separated-values">> vers:valid_from ?valid_from_4 >> vers:valid_until ?valid_until_4. filter(?valid_from_4 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_4)  
<< <<?dataset <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/dcat#Dataset>>> vers:valid_from ?valid_from_5 >> vers:valid_until ?valid_until_5. filter(?valid_from_5 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_5)  
<< <<?distr1 <http://purl.org/dc/elements/1.1/description> ?description1>> vers:valid_from ?valid_from_6 >> vers:valid_until ?valid_until_6. filter(?valid_from_6 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_6)  
<< <<?distr1 <http://purl.org/dc/elements/1.1/title> ?titleFile1>> vers:valid_from ?valid_from_7 >> vers:valid_until ?valid_until_7. filter(?valid_from_7 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_7)  
<< <<?distr1 <http://www.w3.org/ns/dcat#accessURL> ?URL1>> vers:valid_from ?valid_from_8 >> vers:valid_until ?valid_until_8. filter(?valid_from_8 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_8)  
<< <<?distr2 <http://purl.org/dc/elements/1.1/description> ?description2>> vers:valid_from ?valid_from_9 >> vers:valid_until ?valid_until_9. filter(?valid_from_9 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_9)  
<< <<?distr2 <http://purl.org/dc/elements/1.1/title> ?titleFile2>> vers:valid_from ?valid_from_10 >> vers:valid_until ?valid_until_10. filter(?valid_from_10 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_10)  
<< <<?distr2 <http://www.w3.org/ns/dcat#accessURL> ?URL2>> vers:valid_from ?valid_from_11 >> vers:valid_until ?valid_until_11. filter(?valid_from_11 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_11)  
<< <<?dataset <http://purl.org/dc/elements/1.1/title> ?title>> vers:valid_from ?valid_from_12 >> vers:valid_until ?valid_until_12. 
filter(?valid_from_12 <= ?tsBGP_0 && ?tsBGP_0 < ?valid_until_12)  bind("2022-10-01T12:00:00.000+00:00"^^xsd:dateTime as ?tsBGP_0)}

Evaluation gets stuck with jenatdb2

After successfully executing jenatdb2, cb_sr_ng, bearc and jenatdb2, cb_sr_ng, bearb_hour, the evaluation process gets stuck with jenatdb2, cb_sr_ng, bearb_day.

Issues with curl in Dockerfile

RUN apt-get install curl=7.74.0-1.3+deb11u3 -y

Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
curl : Depends: libcurl4 (= 7.74.0-1.3+deb11u3) but 7.74.0-1.3+deb11u5 is to be installed
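The pinned `curl=7.74.0-1.3+deb11u3` depends on libcurl4 at exactly that revision, but apt now resolves libcurl4 to 7.74.0-1.3+deb11u5, hence the conflict. Pinning both packages to one revision (or dropping the pin) resolves it; a sketch of the first option, using the deb11u5 revision reported in the error above (it may have moved on since):

```dockerfile
RUN apt-get update && apt-get install -y \
    curl=7.74.0-1.3+deb11u5 \
    libcurl4=7.74.0-1.3+deb11u5
```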

Non-empty DB directories cannot be removed

During the execution of construct_tb_star_ds I get this error in the second iteration:

rm: cannot remove '/starvers_eval/databases/construct_datasets/graphdb/repositories/tb_rs_sr_bearb_day/storage/history': Directory not empty
rm: cannot remove '/starvers_eval/databases/construct_datasets/graphdb/repositories/tb_rs_sr_bearb_day/storage/literals-index': Directory not empty
17:16:59.185 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Input folders recursively: false
17:16:59.187 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Iterator cache set to: 'auto'
17:16:59.187 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Chunk size set to: 'auto'
17:16:59.188 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Parsing tasks: 2
17:16:59.188 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Temporary data folder: /.
17:16:59.190 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Restore point interval: 3,600s
17:16:59.190 [main] INFO  c.ontotext.graphdb.importrdf.Preload - CONFIG FILE: /starvers_eval/configs/construct_datasets/graphdb_tb_rs_sr_bearb_day/graphdb-config.ttl
17:16:59.205 [main] INFO  c.ontotext.graphdb.importrdf.Preload - Attaching to location: /starvers_eval/databases/construct_datasets/graphdb
org.eclipse.rdf4j.repository.RepositoryException: Failed to lock directory: /starvers_eval/databases/construct_datasets/graphdb/repositories. Is another GraphDB instance running?
        at com.ontotext.graphdb.GraphDBRepositoryManager.init(GraphDBRepositoryManager.java:219)
        at com.ontotext.graphdb.importrdf.BaseLoadTool.mainInternal(BaseLoadTool.java:161)
        at com.ontotext.graphdb.importrdf.Preload.call(Preload.java:250)
        at com.ontotext.graphdb.importrdf.Preload.call(Preload.java:53)
        at picocli.CommandLine.executeUserObject(CommandLine.java:1953)
        at picocli.CommandLine.access$1300(CommandLine.java:145)
        at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2358)
        at picocli.CommandLine$RunLast.handle(CommandLine.java:2352)
        at picocli.CommandLine$RunLast.handle(CommandLine.java:2314)
        at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2179)
        at picocli.CommandLine$RunLast.execute(CommandLine.java:2316)
        at picocli.CommandLine.execute(CommandLine.java:2078)
        at com.ontotext.graphdb.importrdf.ImportRDF.main(ImportRDF.java:25)
/opt/graphdb/dist/bin/graphdb: line 75: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8)
Exception in thread "main" org.apache.catalina.LifecycleException: Protocol handler initialization failed
        at org.apache.catalina.connector.Connector.initInternal(Connector.java:1051)
        at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:136)
        at org.apache.catalina.core.StandardService.initInternal(StandardService.java:556)
        at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:136)
        at org.apache.catalina.core.StandardServer.initInternal(StandardServer.java:1045)
        at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:136)
        at org.apache.catalina.startup.Tomcat.init(Tomcat.java:475)
        at com.ontotext.graphdb.server.GraphDB.start(GraphDB.java:171)
        at com.ontotext.graphdb.server.GraphDBServer.main(GraphDBServer.java:10)
Caused by: java.net.BindException: Address already in use
        at java.base/sun.nio.ch.Net.bind0(Native Method)
        at java.base/sun.nio.ch.Net.bind(Net.java:459)
        at java.base/sun.nio.ch.Net.bind(Net.java:448)
        at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
        at org.apache.tomcat.util.net.NioEndpoint.initServerSocket(NioEndpoint.java:275)
        at org.apache.tomcat.util.net.NioEndpoint.bind(NioEndpoint.java:230)
        at org.apache.tomcat.util.net.AbstractEndpoint.bindWithCleanup(AbstractEndpoint.java:1227)
        at org.apache.tomcat.util.net.AbstractEndpoint.init(AbstractEndpoint.java:1240)
        at org.apache.coyote.AbstractProtocol.init(AbstractProtocol.java:606)
        at org.apache.coyote.http11.AbstractHttp11Protocol.init(AbstractHttp11Protocol.java:77)
        at org.apache.catalina.connector.Connector.initInternal(Connector.java:1048)
        ... 8 more
/opt/graphdb/dist/bin/graphdb: line 75: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8)
Traceback (most recent call last):
  File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 926, in _query
    response = urlopener(request)
  File "/usr/local/lib/python3.8/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/local/lib/python3.8/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/usr/local/lib/python3.8/urllib/request.py", line 640, in http_response
    response = self.parent.error(
  File "/usr/local/lib/python3.8/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/usr/local/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/local/lib/python3.8/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: 

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/starvers_eval/scripts/3_construct_datasets/construct_datasets.py", line 411, in <module>
    construct_tb_star_ds(source_ic0=f"{data_dir}/{snapshot_dir}/" + "1".zfill(ic_basename_lengths[dataset])  + ".nt",
  File "/starvers_eval/scripts/3_construct_datasets/construct_datasets.py", line 145, in construct_tb_star_ds
    rdf_star_engine.insert(triples=added_triples_raw, timestamp=vers_ts, chunk_size=chunk_size)
  File "/starvers_eval/python_venv/lib/python3.8/site-packages/starvers/starvers.py", line 470, in insert
    self.sparql_post.query()
  File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 960, in query
    return QueryResult(self._query())
  File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 932, in _query
    raise EndPointNotFound(e.read())
SPARQLWrapper.SPARQLExceptions.EndPointNotFound: EndPointNotFound: It was not possible to connect to the given endpoint: check it is correct. 

Malformed BEARC query

Malformed query: query10_q7_v0.txt. The parser fails at an `ORDER` keyword at line 17, column 1 (see the response below) while expecting a graph pattern or a closing "}", which likely means the closing brace of the WHERE clause is missing before the ORDER BY clause.

Stacktrace

Creating network "starvers_eval_default" with the default driver
Creating starvers_eval_evaluate_1 ... done
Attaching to starvers_eval_evaluate_1
evaluate_1            | /opt/graphdb/dist/bin/graphdb: line 75: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8)
evaluate_1            | Traceback (most recent call last):
evaluate_1            |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 926, in _query
evaluate_1            |     response = urlopener(request)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 222, in urlopen
evaluate_1            |     return opener.open(url, data, timeout)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 531, in open
evaluate_1            |     response = meth(req, response)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 640, in http_response
evaluate_1            |     response = self.parent.error(
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 569, in error
evaluate_1            |     return self._call_chain(*args)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 502, in _call_chain
evaluate_1            |     result = func(*args)
evaluate_1            |   File "/usr/local/lib/python3.8/urllib/request.py", line 649, in http_error_default
evaluate_1            |     raise HTTPError(req.full_url, code, msg, hdrs, fp)
evaluate_1            | urllib.error.HTTPError: HTTP Error 400: 
evaluate_1            | 
evaluate_1            | During handling of the above exception, another exception occurred:
evaluate_1            | 
evaluate_1            | Traceback (most recent call last):
evaluate_1            |   File "/starvers_eval/scripts/6_evaluate/query.py", line 187, in <module>
evaluate_1            |     result = engine.query()
evaluate_1            |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 960, in query
evaluate_1            |     return QueryResult(self._query())
evaluate_1            |   File "/starvers_eval/python_venv/lib/python3.8/site-packages/SPARQLWrapper/Wrapper.py", line 930, in _query
evaluate_1            |     raise QueryBadFormed(e.read())
evaluate_1            | SPARQLWrapper.SPARQLExceptions.QueryBadFormed: QueryBadFormed: A bad request has been sent to the endpoint: probably the SPARQL query is badly formed. 
evaluate_1            | 
evaluate_1            | Response:
evaluate_1            | b'MALFORMED QUERY: Encountered " "order" "ORDER "" at line 17, column 1.\nWas expecting one of:\n    "(" ...\n    "{" ...\n    "}" ...\n    "[" ...\n    "." ...\n    <NIL> ...\n    <ANON> ...\n    "optional" ...\n    "graph" ...\n    "minus" ...\n    "filter" ...\n    "true" ...\n    "false" ...\n    "bind" ...\n    "service" ...\n    "values" ...\n    <Q_IRI_REF> ...\n    <PNAME_NS> ...\n    <PNAME_LN> ...\n    <BLANK_NODE_LABEL> ...\n    <VAR1> ...\n    <VAR2> ...\n    <INTEGER> ...\n    <INTEGER_POSITIVE> ...\n    <INTEGER_NEGATIVE> ...\n    <DECIMAL> ...\n    <DECIMAL_POSITIVE> ...\n    <DECIMAL_NEGATIVE> ...\n    <DOUBLE> ...\n    <DOUBLE_POSITIVE> ...\n    <DOUBLE_NEGATIVE> ...\n    <STRING_LITERAL1> ...\n    <STRING_LITERAL2> ...\n    <STRING_LITERAL_LONG1> ...\n    <STRING_LITERAL_LONG2> ...\n    "<<" ...\n    '

Evaluation of update statements terminates prematurely

This is the output from docker logs:

OpenJDK 64-Bit Server VM warning: Insufficient space for shared memory file:
   838651
Try using the -Djava.io.tmpdir= option to select an alternate temp location.
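The warning means the JVM cannot write its shared-memory/performance files, typically because /tmp or /dev/shm inside the container is full or too small. Following the JVM's own hint, one can point `java.io.tmpdir` at a larger volume and/or enlarge the container's shared memory; a docker-compose sketch (service name, size, and path are assumptions, and the environment variable the startup script honors, e.g. JVM_ARGS for Fuseki, depends on the image):

```yaml
services:
  evaluate:                  # hypothetical service name
    shm_size: "2gb"          # enlarge /dev/shm for the container
    environment:
      - JAVA_OPTS=-Djava.io.tmpdir=/starvers_eval/tmp  # larger temp dir
```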

Jena fails to start because it cannot bind to 0.0.0.0/0.0.0.0:3030

This happened after finishing the evaluation for tb_rs_sr, bearc and JenaTDB. At the start of the evaluation of tb_sr_rs, bearb_hour and JenaTDB I got this error:

Stacktrace

[2023-02-03 22:21:03] Server     INFO  Apache Jena Fuseki 4.0.0
[2023-02-03 22:21:03] Config     INFO  FUSEKI_HOME=/jena-fuseki
[2023-02-03 22:21:03] Config     INFO  FUSEKI_BASE=/run
[2023-02-03 22:21:03] Config     INFO  Shiro file: file:///run/shiro.ini
[2023-02-03 22:21:03] Config     INFO  Configuration file: /run/config.ttl
[2023-02-03 22:21:03] Config     INFO  Load configuration: file:///run/configuration/config.ttl
[2023-02-03 22:21:04] Server     INFO  Configuration file: /run/config.ttl
[2023-02-03 22:21:04] Server     INFO  Path = /tb_sr_rs_bearb_hour
[2023-02-03 22:21:04] Server     INFO  System
[2023-02-03 22:21:04] Server     INFO    Memory: 90.0 GiB
[2023-02-03 22:21:04] Server     INFO    Java:   11.0.11
[2023-02-03 22:21:04] Server     INFO    OS:     Linux 5.15.0-52-generic amd64
[2023-02-03 22:21:04] Server     INFO    PID:    610
[2023-02-03 22:21:04] Server     ERROR SPARQLServer: Failed to start server: Failed to bind to 0.0.0.0/0.0.0.0:3030
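"Failed to bind to 0.0.0.0:3030" indicates that the previous Fuseki instance (or another process) still holds port 3030 when the next evaluation starts, i.e. the earlier server was not fully shut down. A pre-flight check can make this explicit before launching the server; a minimal sketch (hypothetical helper, not part of the evaluation scripts):

```python
import socket

def port_is_free(port, host="0.0.0.0"):
    # Try to bind the port ourselves: if the bind fails, another
    # process (e.g. a lingering Fuseki) is still holding it.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False
```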
