atomgraph / linkeddatahub
The low-code Knowledge Graph application platform.
Home Page: https://atomgraph.github.io/LinkedDataHub/
License: Apache License 2.0
It looks like the X/Y axes somehow get switched for ScatterChart and LineChart:
https://linkeddatahub.com:4443/demo/northwind-traders/charts/6616b879-d50d-4091-9e3d-63b488c57822/
All tests pass even though WebID delegation was disabled :/
Create a test under http-tests.
This is a regression from the master branch, which used to show an attached blank node with metadata like start/end datetime, triple and subject counts.
Right now creating a Restriction instance such as TopicOfConceptItem results in a 500 Internal Server Error caused by an exception in the SkolemizingDatasetProvider:
org.glassfish.jersey.server.internal.process.MappableException: org.apache.jena.ontology.ConversionException: Cannot convert node http://www.w3.org/2002/07/owl#Restriction to OntClass: it does not have rdf:type owl:Class or equivalent
Can be reproduced using the SKOS demo app.
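For context, a minimal sketch of how Jena raises this exception (an assumed reproduction; the actual trigger inside the SkolemizingDatasetProvider may differ): in a non-inferencing OntModel, a node that lacks rdf:type owl:Class cannot be viewed as OntClass while strict mode is on.

```java
import org.apache.jena.ontology.ConversionException;
import org.apache.jena.ontology.OntClass;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.OWL;

// In strict mode (the default), viewing a node as OntClass requires rdf:type owl:Class
// or equivalent; a bare reference to owl:Restriction carries no such type in the data.
OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);
Resource restriction = model.getResource(OWL.Restriction.getURI());
try {
    OntClass c = restriction.as(OntClass.class);
} catch (ConversionException e) {
    // "Cannot convert node http://www.w3.org/2002/07/owl#Restriction to OntClass: ..."
}
model.setStrictMode(false); // in lenient mode the conversion succeeds instead of throwing
```

Whether relaxing strict mode is acceptable here, or whether the data should be patched to include the expected rdf:type, is a design decision for the provider.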
Replace cert:secretary with acl:delegates, as that seems to be the convention now.
During installation of https://linkeddatahub.com:4443/demo/city-graph/:
There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (mmap) failed to map 39321600 bytes for committing reserved memory.
An error report file with more information is saved as:
/usr/local/tomcat/hs_err_pid39.log
Right now non-safe ResourceBase and graph.Item methods are littered with ban() calls. Ideally these should be moved into a separate layer, most likely a Jersey response filter.
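Such a layer could be sketched as a JAX-RS ContainerResponseFilter (a sketch only: the Varnish address and the X-Ban-URL header are assumptions that must match the VCL ban handler, and Java 11's HttpClient is used for brevity):

```java
import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerResponseContext;
import javax.ws.rs.container.ContainerResponseFilter;
import javax.ws.rs.ext.Provider;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Centralize cache invalidation: after any successful non-safe request, send a BAN
// to Varnish, instead of littering resource methods with ban() calls.
@Provider
class BanResponseFilter implements ContainerResponseFilter
{
    private static final URI VARNISH_URI = URI.create("http://varnish:6081/"); // assumed proxy address

    private final HttpClient client = HttpClient.newHttpClient();

    @Override
    public void filter(ContainerRequestContext request, ContainerResponseContext response)
    {
        String method = request.getMethod();
        boolean nonSafe = !("GET".equals(method) || "HEAD".equals(method) || "OPTIONS".equals(method));
        if (nonSafe && response.getStatus() < 400)
        {
            // the X-Ban-URL header name is an assumption; it must match the vcl_recv ban() expression
            HttpRequest ban = HttpRequest.newBuilder(VARNISH_URI)
                .method("BAN", HttpRequest.BodyPublishers.noBody())
                .header("X-Ban-URL", request.getUriInfo().getAbsolutePath().toString())
                .build();
            client.sendAsync(ban, HttpResponse.BodyHandlers.discarding()); // fire and forget
        }
    }
}
```

Registering one filter keeps the invalidation policy in a single place and out of the resource classes.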
Using https://linkeddatahub.com/proxml/test/?filterRegex=label&uri=http://topbraid.org/examples/kennedys#AlfredTucker
I get the full kennedys dataset as the result.
Using https://linkeddatahub.com/demo/iswc-2017/?filterRegex=label&uri=https%3A%2F%2Fw3id.org%2Fscholarlydata%2Fperson%2Fmartin-voigt
I get only info on martin-voigt.
Is this different behaviour due to the difference between # and / URLs, or are different templates triggered? If so, how can I find this out most efficiently?
Setup is basically the same as in the previous issue. All Docker-based services seem to start normally (the only issue is the service startup time, "Server startup in 339410 ms"; not sure why it is so long and whether that's fine), but when one tries to access the launched app via a browser (Firefox) with the generated owner.p12 certificate, the following error occurs and the browser ends up displaying a blank page.
Docker logs:
Attaching to linkeddatahub_fuseki-admin_1, linkeddatahub_linkeddatahub_1, linkeddatahub_fuseki-end-user_1, linkeddatahub_nginx_1, linkeddatahub_email-server_1
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO Apache Jena Fuseki 3.13.0
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO Configuration file /var/fuseki/config.ttl
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO Path = /ds; Services = [""=>gsp-rw, ""=>query, ""=>update]
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO Memory: 483.4 MiB
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO Java: 1.8.0_111
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO OS: Linux 4.15.0-55-generic amd64
fuseki-admin_1 | [2020-01-28 16:00:28] Server INFO PID: 1
fuseki-admin_1 | [2020-01-28 16:00:29] Server INFO Start Fuseki (port=3030)
fuseki-admin_1 | [2020-01-28 16:00:36] Fuseki INFO [1] GET http://fuseki-admin:3030/ds/
linkeddatahub_1 | ### Generating server certificate
linkeddatahub_1 |
linkeddatahub_1 | ### Quad store URL of the root admin service: http://fuseki-admin:3030/ds/
linkeddatahub_1 |
linkeddatahub_1 | ### Loading default datasets into the end-user/admin triplestores...
linkeddatahub_1 | ### URL http://fuseki-end-user:3030/ds/ responded
linkeddatahub_1 | % Total % Received % Xferd Average Speed Time Time Time Current
linkeddatahub_1 | Dload Upload Total Spent Left Speed
linkeddatahub_1 |
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 46107 0 64 100 46043 90 64932 --:--:-- --:--:-- --:--:-- 64940
100 46108 0 65 100 46043 90 64348 --:--:-- --:--:-- --:--:-- 64305
linkeddatahub_1 | {
linkeddatahub_1 | "count" : 207 ,
linkeddatahub_1 | "tripleCount" : 0 ,
linkeddatahub_1 | "quadCount" : 207
linkeddatahub_1 | }
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO Apache Jena Fuseki 3.13.0
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO Configuration file /var/fuseki/config.ttl
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO Path = /ds; Services = [""=>gsp-rw, ""=>query, ""=>update]
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO Memory: 483.4 MiB
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO Java: 1.8.0_111
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO OS: Linux 4.15.0-55-generic amd64
fuseki-end-user_1 | [2020-01-28 16:00:28] Server INFO PID: 1
nginx_1 | ### Waiting for linkeddatahub...
nginx_1 | ### linkeddatahub responded
fuseki-end-user_1 | [2020-01-28 16:00:29] Server INFO Start Fuseki (port=3030)
fuseki-end-user_1 | [2020-01-28 16:00:34] Fuseki INFO [1] GET http://fuseki-end-user:3030/ds/
fuseki-end-user_1 | [2020-01-28 16:00:35] Fuseki INFO [1] 200 OK (474 ms)
fuseki-end-user_1 | [2020-01-28 16:00:35] Fuseki INFO [2] GET http://fuseki-end-user:3030/ds/
fuseki-end-user_1 | [2020-01-28 16:00:35] Fuseki INFO [2] 200 OK (186 ms)
fuseki-end-user_1 | [2020-01-28 16:00:35] Fuseki INFO [3] POST http://fuseki-end-user:3030/ds/
fuseki-end-user_1 | [2020-01-28 16:00:35] Fuseki INFO [3] Body: Content-Length=46043, Content-Type=application/n-quads, Charset=null => N-Quads : Count=207 Triples=0 Quads=207
fuseki-end-user_1 | [2020-01-28 16:00:36] Fuseki INFO [3] 200 OK (701 ms)
email-server_1 | + sed -ri '
email-server_1 | s/^#?(dc_local_interfaces)=.*/\1='\''[0.0.0.0]:25 ; [::0]:25'\''/;
email-server_1 | s/^#?(dc_other_hostnames)=.*/\1='\'''\''/;
email-server_1 | s/^#?(dc_relay_nets)=.*/\1='\''172.19.0.3\/16'\''/;
email-server_1 | s/^#?(dc_eximconfig_configtype)=.*/\1='\''internet'\''/;
email-server_1 | ' /etc/exim4/update-exim4.conf.conf
email-server_1 | + update-exim4.conf -v
email-server_1 | using non-split configuration scheme from /etc/exim4/exim4.conf.template
email-server_1 | 1 LOG: MAIN
email-server_1 | 1 exim 4.92 daemon started: pid=1, -q15m, listening for SMTP on port 25 (IPv4)
fuseki-admin_1 | [2020-01-28 16:00:40] Fuseki INFO [1] 200 OK (3.585 s)
fuseki-admin_1 | [2020-01-28 16:00:40] Fuseki INFO [2] GET http://fuseki-admin:3030/ds/
fuseki-admin_1 | [2020-01-28 16:00:42] Fuseki INFO [2] 200 OK (2.332 s)
linkeddatahub_1 | ### URL http://fuseki-admin:3030/ds/ responded
linkeddatahub_1 | % Total % Received % Xferd Average Speed Time Time Time Current
linkeddatahub_1 | Dload Upload Total Spent Left Speed
fuseki-admin_1 | [2020-01-28 16:00:42] Fuseki INFO [3] POST http://fuseki-admin:3030/ds/
fuseki-admin_1 | [2020-01-28 16:00:43] Fuseki INFO [3] Body: Content-Length=547638, Content-Type=application/n-quads, Charset=null => N-Quads : Count=2392 Triples=0 Quads=2392
fuseki-admin_1 | [2020-01-28 16:00:43] Fuseki INFO [3] 200 OK (1.077 s)
linkeddatahub_1 |
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 534k 0 0 100 534k 0 527k 0:00:01 0:00:01 --:--:-- 527k
100 534k 0 67 100 534k 61 487k 0:00:01 0:00:01 --:--:-- 487k
linkeddatahub_1 | {
linkeddatahub_1 | "count" : 2392 ,
linkeddatahub_1 | "tripleCount" : 0 ,
linkeddatahub_1 | "quadCount" : 2392
linkeddatahub_1 | }
linkeddatahub_1 | openjdk version "1.8.0_181"
linkeddatahub_1 | OpenJDK Runtime Environment (build 1.8.0_181-8u181-b13-1~deb9u1-b13)
linkeddatahub_1 | OpenJDK 64-Bit Server VM (build 25.181-b13, mixed mode)
linkeddatahub_1 | intx CompilerThreadStackSize = 0 {pd product}
linkeddatahub_1 | uintx ErgoHeapSizeLimit = 0 {product}
linkeddatahub_1 | uintx HeapSizePerGCThread = 87241520 {product}
linkeddatahub_1 | uintx InitialHeapSize := 33554432 {product}
linkeddatahub_1 | uintx LargePageHeapSizeThreshold = 134217728 {product}
linkeddatahub_1 | uintx MaxHeapSize := 524288000 {product}
linkeddatahub_1 | intx ThreadStackSize = 1024 {pd product}
linkeddatahub_1 | intx VMThreadStackSize = 1024 {pd product}
linkeddatahub_1 | ### Waiting for http://fuseki-end-user:3030/ds/...
fuseki-end-user_1 | [2020-01-28 16:00:43] Fuseki INFO [4] GET http://fuseki-end-user:3030/ds/
fuseki-end-user_1 | [2020-01-28 16:00:44] Fuseki INFO [4] 200 OK (173 ms)
fuseki-end-user_1 | [2020-01-28 16:00:44] Fuseki INFO [5] GET http://fuseki-end-user:3030/ds/
fuseki-end-user_1 | [2020-01-28 16:00:44] Fuseki INFO [5] 200 OK (112 ms)
linkeddatahub_1 | ### URL http://fuseki-end-user:3030/ds/ responded
linkeddatahub_1 | ### Waiting for http://fuseki-admin:3030/ds/...
fuseki-admin_1 | [2020-01-28 16:00:44] Fuseki INFO [4] GET http://fuseki-admin:3030/ds/
fuseki-admin_1 | [2020-01-28 16:00:46] Fuseki INFO [4] 200 OK (2.190 s)
fuseki-admin_1 | [2020-01-28 16:00:46] Fuseki INFO [5] GET http://fuseki-admin:3030/ds/
fuseki-admin_1 | [2020-01-28 16:00:47] Fuseki INFO [5] 200 OK (1.391 s)
linkeddatahub_1 | ### URL http://fuseki-admin:3030/ds/ responded
linkeddatahub_1 | ### Waiting for nginx...
linkeddatahub_1 | ### Host nginx responded
linkeddatahub_1 | Listening for transport dt_socket at address: 8000
linkeddatahub_1 | 28-Jan-2020 17:00:49.909 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version: Apache Tomcat/8.0.53
linkeddatahub_1 | 28-Jan-2020 17:00:49.920 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server built: Jun 29 2018 14:42:45 UTC
linkeddatahub_1 | 28-Jan-2020 17:00:49.921 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server number: 8.0.53.0
linkeddatahub_1 | 28-Jan-2020 17:00:49.922 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Name: Linux
linkeddatahub_1 | 28-Jan-2020 17:00:49.923 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Version: 4.15.0-55-generic
linkeddatahub_1 | 28-Jan-2020 17:00:49.924 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Architecture: amd64
linkeddatahub_1 | 28-Jan-2020 17:00:49.925 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Java Home: /usr/lib/jvm/java-8-openjdk-amd64/jre
linkeddatahub_1 | 28-Jan-2020 17:00:49.927 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Version: 1.8.0_181-8u181-b13-1~deb9u1-b13
linkeddatahub_1 | 28-Jan-2020 17:00:49.928 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Vendor: Oracle Corporation
linkeddatahub_1 | 28-Jan-2020 17:00:49.929 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_BASE: /usr/local/tomcat
linkeddatahub_1 | 28-Jan-2020 17:00:49.930 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_HOME: /usr/local/tomcat
linkeddatahub_1 | 28-Jan-2020 17:00:49.932 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.config.file=/usr/local/tomcat/conf/logging.properties
linkeddatahub_1 | 28-Jan-2020 17:00:49.935 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
linkeddatahub_1 | 28-Jan-2020 17:00:49.936 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djdk.tls.ephemeralDHKeySize=2048
linkeddatahub_1 | 28-Jan-2020 17:00:49.937 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.protocol.handler.pkgs=org.apache.catalina.webresources
linkeddatahub_1 | 28-Jan-2020 17:00:49.939 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -agentlib:jdwp=transport=dt_socket,address=8000,server=y,suspend=n
linkeddatahub_1 | 28-Jan-2020 17:00:49.940 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Duser.timezone=Europe/Copenhagen
linkeddatahub_1 | 28-Jan-2020 17:00:49.941 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dignore.endorsed.dirs=
linkeddatahub_1 | 28-Jan-2020 17:00:49.943 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.base=/usr/local/tomcat
linkeddatahub_1 | 28-Jan-2020 17:00:49.945 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.home=/usr/local/tomcat
linkeddatahub_1 | 28-Jan-2020 17:00:49.947 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.io.tmpdir=/usr/local/tomcat/temp
linkeddatahub_1 | 28-Jan-2020 17:00:49.948 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded APR based Apache Tomcat Native library 1.2.17 using APR version 1.5.2.
linkeddatahub_1 | 28-Jan-2020 17:00:49.950 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true].
linkeddatahub_1 | 28-Jan-2020 17:00:49.964 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized (OpenSSL 1.1.0f 25 May 2017)
linkeddatahub_1 | 28-Jan-2020 17:00:50.272 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-apr-8080"]
linkeddatahub_1 | 28-Jan-2020 17:00:50.313 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["ajp-apr-8009"]
linkeddatahub_1 | 28-Jan-2020 17:00:50.400 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio-8443"]
linkeddatahub_1 | 28-Jan-2020 17:00:51.698 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
linkeddatahub_1 | 28-Jan-2020 17:00:51.710 INFO [main] org.apache.catalina.startup.Catalina.load Initialization processed in 3387 ms
linkeddatahub_1 | 28-Jan-2020 17:00:51.803 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service Catalina
linkeddatahub_1 | 28-Jan-2020 17:00:51.807 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet Engine: Apache Tomcat/8.0.53
linkeddatahub_1 | 28-Jan-2020 17:00:51.855 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDescriptor Deploying configuration descriptor /usr/local/tomcat/conf/Catalina/localhost/ROOT.xml
linkeddatahub_1 | 28-Jan-2020 17:01:00.588 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
nginx_1 | 195.231.4.32 - - [28/Jan/2020:16:04:54 +0000] "GET login.cgi HTTP/1.1" 400 157 "-" "-"
linkeddatahub_1 | 28-Jan-2020 17:06:30.989 WARNING [localhost-startStop-1] org.apache.catalina.util.SessionIdGeneratorBase.createSecureRandom Creation of SecureRandom instance for session ID generation using [SHA1PRNG] took [330,040] milliseconds.
linkeddatahub_1 | 28-Jan-2020 17:06:31.054 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDescriptor Deployment of configuration descriptor /usr/local/tomcat/conf/Catalina/localhost/ROOT.xml has finished in 339,196 ms
linkeddatahub_1 | 28-Jan-2020 17:06:31.063 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-apr-8080"]
linkeddatahub_1 | 28-Jan-2020 17:06:31.107 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["ajp-apr-8009"]
linkeddatahub_1 | 28-Jan-2020 17:06:31.114 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-8443"]
linkeddatahub_1 | 28-Jan-2020 17:06:31.122 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 339410 ms
nginx_1 | 161.142.234.65 [28/Jan/2020:16:14:43 +0000] TCP [] [upstream_server_https] 200 1416 7 6.556
nginx_1 | 161.142.234.65 [28/Jan/2020:16:14:43 +0000] TCP [] [upstream_server_https] 200 1416 7 6.315
linkeddatahub_1 | 28-Jan-2020 17:14:50.068 INFO [http-nio-8443-exec-9] com.sun.jersey.server.impl.application.WebApplicationImpl._initiate Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'
nginx_1 | 161.142.234.65 [28/Jan/2020:16:14:55 +0000] TCP [] [upstream_server_https] 200 1467 1422 5.953
linkeddatahub_1 | 28-Jan-2020 17:15:01.694 INFO [http-nio-8443-exec-9] com.sun.jersey.server.impl.application.DeferredResourceConfig$ApplicationHolder.<init> Instantiated the Application class com.atomgraph.linkeddatahub.Application
linkeddatahub_1 | 28-Jan-2020 17:15:04.309 SEVERE [http-nio-8443-exec-9] com.sun.jersey.spi.container.ContainerResponse.mapException Exception mapper com.atomgraph.server.mapper.NotFoundExceptionMapper@45cf1f83 for Throwable com.atomgraph.core.exception.NotFoundException: Application not found threw a RuntimeException when attempting to obtain the response
linkeddatahub_1 | 28-Jan-2020 17:15:04.315 SEVERE [http-nio-8443-exec-9] com.sun.jersey.spi.container.ContainerResponse.logException Mapped exception to response: 500 (Internal Server Error)
linkeddatahub_1 | java.lang.IllegalArgumentException: Ontology cannot be null
linkeddatahub_1 | at com.atomgraph.processor.util.TemplateMatcher.match(TemplateMatcher.java:229)
linkeddatahub_1 | at com.atomgraph.processor.util.TemplateMatcher.match(TemplateMatcher.java:137)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateProvider.getTemplate(TemplateProvider.java:77)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateProvider.getTemplate(TemplateProvider.java:72)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateProvider.getContext(TemplateProvider.java:67)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateProvider.getContext(TemplateProvider.java:37)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateCallProvider.getTemplate(TemplateCallProvider.java:97)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateCallProvider.getTemplateCall(TemplateCallProvider.java:73)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateCallProvider.getContext(TemplateCallProvider.java:68)
linkeddatahub_1 | at com.atomgraph.server.provider.TemplateCallProvider.getContext(TemplateCallProvider.java:38)
linkeddatahub_1 | at com.atomgraph.server.mapper.ExceptionMapperBase.getTemplateCall(ExceptionMapperBase.java:164)
linkeddatahub_1 | at com.atomgraph.server.mapper.ExceptionMapperBase.getResponseBuilder(ExceptionMapperBase.java:102)
linkeddatahub_1 | at com.atomgraph.server.mapper.NotFoundExceptionMapper.toResponse(NotFoundExceptionMapper.java:35)
linkeddatahub_1 | at com.atomgraph.server.mapper.NotFoundExceptionMapper.toResponse(NotFoundExceptionMapper.java:29)
linkeddatahub_1 | at com.sun.jersey.spi.container.ContainerResponse.mapException(ContainerResponse.java:480)
linkeddatahub_1 | at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1479)
linkeddatahub_1 | at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
linkeddatahub_1 | at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
linkeddatahub_1 | at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
linkeddatahub_1 | at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
linkeddatahub_1 | at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
linkeddatahub_1 | at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
linkeddatahub_1 | at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
linkeddatahub_1 | at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
linkeddatahub_1 | at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
linkeddatahub_1 | at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
linkeddatahub_1 | at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
linkeddatahub_1 | at org.apache.catalina.filters.HttpHeaderSecurityFilter.doFilter(HttpHeaderSecurityFilter.java:126)
linkeddatahub_1 | at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
linkeddatahub_1 | at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
linkeddatahub_1 | at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
linkeddatahub_1 | at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:94)
linkeddatahub_1 | at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
linkeddatahub_1 | at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
linkeddatahub_1 | at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:80)
linkeddatahub_1 | at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
linkeddatahub_1 | at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
linkeddatahub_1 | at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:502)
linkeddatahub_1 | at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1152)
linkeddatahub_1 | at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:684)
linkeddatahub_1 | at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1539)
linkeddatahub_1 | at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1495)
linkeddatahub_1 | at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
linkeddatahub_1 | at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
linkeddatahub_1 | at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
linkeddatahub_1 | at java.lang.Thread.run(Thread.java:748)
linkeddatahub_1 |
nginx_1 | 161.142.234.65 [28/Jan/2020:16:15:04 +0000] TCP [] [upstream_server_https] 200 1730 1789 15.191
At least on items, the links in the mode dropdown are wrong, e.g.:
https://linkeddatahub.com:4443/demo/city-graph/copenhagen/electric-car-chargers/?mode=http%3A%2F%2Fatomgraph.com%2Fns%2Fclient%23MapMode
when it should be:
https://linkeddatahub.com:4443/demo/city-graph/copenhagen/electric-car-chargers/elbil_ladestander.14?mode=http%3A%2F%2Fatomgraph.com%2Fns%2Fclient%23MapMode
Reported by @timbl and @timathom:
Client certificate public key did not match public key of WebID 'https://www.w3.org/People/Berners-Lee/card#i' at https://linkeddatahub.com/docs/administration/acl
Same for https://timathom.databox.me/profile/card#me
Could be due to the lack of @base in the RDF files.
As explained in #4, agents should be able to edit their data (which resides in a named graph in the admin app) to add secretary URIs and new certificate keys.
Fails with a 500 Internal Server Error due to java.lang.NoClassDefFoundError: Could not initialize class com.github.jsonldjava.utils.JsonUtils
Web-Client parameters such as uri, endpoint, query, mode, accept are handled separately, before the Processor parameters (such as limit, offset etc.). That should be documented under https://linkeddatahub.com/docs/administration/sitemap/#parameters
Currently the non-self-signed service certificate configuration (e.g. a certificate from Let's Encrypt) is not documented anywhere.
After creating a child document, the container view is not updated: Varnish returns a stale cached response.
There used to be code that BANs (removes from the Varnish cache) all objects with the container URL after a child document is created?
Something broke during the bs2:PropertyControl refactoring.
After 3 edits, the dct:created value is shown as being later than dct:modified:
Date Created
21 August 2020 12:11
Date Modified
21 August 2020 10:13
21 August 2020 10:18
21 August 2020 10:27
However, in the RDF data the values are correct; only the timezones are different (GMT+2 and GMT (Z)):
<http://purl.org/dc/terms/created>
"2020-08-21T12:11:10.412+02:00"^^<http://www.w3.org/2001/XMLSchema#dateTime> ;
<http://purl.org/dc/terms/modified>
"2020-08-21T10:27:31.125Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> , "2020-08-21T10:18:24.954Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> , "2020-08-21T10:13:28.324Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> ;
We need a mechanism to specify children ordering in containers. Currently it is controlled using ORDER BY in the container's SELECT query, and while that works correctly, the ordering is generally lost after we wrap the query into DESCRIBE and retrieve a graph, which is unordered. Therefore a secondary sort needs to be done in XSLT, and to be able to sort by properties and their values, we need to know the URI of the property used as the sort key.
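The actual sort would happen in XSLT, but the logic amounts to something like the following Java/Jena sketch (the dct:title sort key and the example resources are assumptions for illustration):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.rdf.model.Statement;
import org.apache.jena.vocabulary.DCTerms;

// Secondary sort over the unordered DESCRIBE result: order child resources by the
// value of a designated sort-key property (dct:title is just an example key).
Model graph = ModelFactory.createDefaultModel();
List<Resource> children = new ArrayList<>();
children.add(graph.createResource("http://example/b").addProperty(DCTerms.title, "Beta"));
children.add(graph.createResource("http://example/a").addProperty(DCTerms.title, "Alpha"));

children.sort(Comparator.comparing((Resource r) -> {
    Statement stmt = r.getProperty(DCTerms.title); // the sort-key property
    return stmt != null && stmt.getObject().isLiteral() ? stmt.getString() : "";
}));
```

This is why the sort-key property URI has to travel with the container: without it, the client has no way to reproduce the ORDER BY of the original SELECT.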
Currently, if the agent is authenticated but not authorized, it can still see the "Create" and "Edit" buttons etc. and can see the forms.
When "Save" is clicked, the server returns 403 Not Authorized, but the UI silently swallows it.
Fix by showing an error popup.
Alternatively, don't show the form controls in the first place?
Next to each URI, add a small button that copies its value to the clipboard, as seen on AWS etc.
This kind of icon could be used: https://freeicons.io/free-mobile-app-icons/copy-icon-icon
Currently we only provide create CLI scripts for resources such as services, queries, charts etc. That results in duplicate resources with different URIs being created with each app install run, and means that resource metadata cannot be updated.
We do have an update-document script for documents, which updates them using a PUT request. We should use this approach for resources as well, and provide update-* scripts in addition to create-*.
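An update-* operation boils down to an idempotent PUT of the full resource description. A sketch of the request (the resource URI and Turtle payload are hypothetical, and the client-certificate authentication the real scripts use is omitted):

```java
import java.net.URI;
import java.net.http.HttpRequest;

// PUT replaces the existing description instead of POSTing a new resource, so
// repeated app installs do not create duplicates with different URIs.
HttpRequest put = HttpRequest.newBuilder(URI.create("https://localhost:4443/services/my-service/")) // assumed URI
    .header("Content-Type", "text/turtle")
    .PUT(HttpRequest.BodyPublishers.ofString("<> a <http://example/Service> .")) // assumed payload
    .build();
// an HttpClient configured with the WebID client certificate would then send the request
```

Running the same update twice leaves the store in the same state, which is exactly the property the create scripts lack.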
Facet and parallax navigation generates a new container state with each action; the user should be able to go back to a previous state. Right now the only recourse is to refresh the page and return to the initial container state.
It should also be possible to generate a URL for a state and to restore a state from a URL.
Right now layout mode controls are only shown in the Container view. There's no reason why some of them (at least ac:MapMode and ac:GraphMode) couldn't also be shown in the Item view.
XSLT processor hangs when parsing a StreamSource read from https://www.w3.org/1999/xhtml/vocab.
Looks like it's not a Saxon bug but a Xerces bug in XMLConfiguration, probably something like:
https://issues.apache.org/jira/browse/NUTCH-2223
I've created an LDH app that adds document data to a container. I can see the container and links to the resources in the container, but when I click on the links to the resources, I get a 404. Stopping and then restarting the containers (with docker-compose down followed by docker-compose up -d) fixes the issue. @namedgraph recommended this remedy after suspecting a caching issue.
The XSLT stylesheets could be augmented with RDFa attributes
CSV to RDF mapping queries in the City Graph app are pretty poor:
- schema:serviceOperator with string values
- schema:maximumAttendeeCapacity values
- schema:maximumAttendeeCapacity used both as venue capacity (correct) and electric vehicle charger capacity (incorrect)
- http://schema.org instead of https://schema.org
When accessing for example the DH ontology URL https://www.w3.org/ns/ldt/document-hierarchy/domain in browser mode, we get:
[line: 1, col: 1 ] Expected BNode or IRI: Got [DIRECTIVE: base]
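That error reads like Turtle being handed to a parser that does not allow directives (an assumed cause). A Jena RIOT sketch of the mismatch: @base is legal in Turtle but not in N-Triples, so parsing fails at line 1, col 1, as reported.

```java
import java.io.StringReader;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.riot.RiotException;

// Turtle with a @base directive, parsed as N-Triples (which forbids directives):
String turtle = "@base <https://www.w3.org/ns/ldt/document-hierarchy/> . <domain> a <Class> .";
Model model = ModelFactory.createDefaultModel();
try {
    RDFDataMgr.read(model, new StringReader(turtle), null, Lang.NTRIPLES);
} catch (RiotException e) {
    // fails at the @base token, e.g. "Expected BNode or IRI: Got [DIRECTIVE: base]"
}
```

If this is the cause, the fix is likely in content negotiation: the response's Content-Type (or the parser chosen for it) does not match the actual serialization.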
Failing to start docker-compose for the repository on an Ubuntu 18.04 host system.
Docker version 18.09.7, build 2d0083d
Steps to reproduce:
1. Check out the f6262e5 commit
2. Create .env with credentials (leave BASE_URI=https://localhost:4443/)
3. Run docker-compose up

sudo docker-compose up
Creating linkeddatahub_email-server_1 ...
Creating linkeddatahub_nginx_1 ...
Creating linkeddatahub_fuseki-end-user_1 ...
Creating linkeddatahub_linkeddatahub_1 ...
Creating linkeddatahub_email-server_1
Creating linkeddatahub_fuseki-admin_1 ...
Creating linkeddatahub_nginx_1
Creating linkeddatahub_fuseki-end-user_1
Creating linkeddatahub_linkeddatahub_1
Creating linkeddatahub_nginx_1 ... done
Attaching to linkeddatahub_email-server_1, linkeddatahub_fuseki-admin_1, linkeddatahub_linkeddatahub_1, linkeddatahub_fuseki-end-user_1, linkeddatahub_nginx_1
email-server_1 | + sed -ri '
email-server_1 | s/^#?(dc_local_interfaces)=.*/\1='\''[0.0.0.0]:25 ; [::0]:25'\''/;
email-server_1 | s/^#?(dc_other_hostnames)=.*/\1='\'''\''/;
email-server_1 | s/^#?(dc_relay_nets)=.*/\1='\''172.23.0.4\/16'\''/;
email-server_1 | s/^#?(dc_eximconfig_configtype)=.*/\1='\''internet'\''/;
email-server_1 | ' /etc/exim4/update-exim4.conf.conf
email-server_1 | + update-exim4.conf -v
email-server_1 | using non-split configuration scheme from /etc/exim4/exim4.conf.template
email-server_1 | 1 LOG: MAIN
email-server_1 | 1 exim 4.92 daemon started: pid=1, -q15m, listening for SMTP on port 25 (IPv4)
fuseki-admin_1 | Starting temporary server
fuseki-admin_1 | Temporary server started.
linkeddatahub_1 | ### Generating server certificate
fuseki-end-user_1 | Starting temporary server
fuseki-end-user_1 | Temporary server started.
nginx_1 | ### Waiting for linkeddatahub...
nginx_1 | ### linkeddatahub responded
fuseki-admin_1 | http://localhost:3333/ds/ not responding, exiting...
linkeddatahub_fuseki-admin_1 exited with code 1
fuseki-end-user_1 | http://localhost:3333/ds/ not responding, exiting...
linkeddatahub_fuseki-end-user_1 exited with code 1
linkeddatahub_1 |
linkeddatahub_1 | ### Quad store URL of the root admin service: http://fuseki-admin:3030/ds/
linkeddatahub_1 |
linkeddatahub_1 | ### Secretary's WebID URI: https://localhost:4443/admin/acl/agents/e413f97b-15ee-47ea-ba65-4479aa7f1f9e/#this
linkeddatahub_1 |
linkeddatahub_1 | ### Secretary WebID certificate's DName attributes: CN=LinkedDataHub,OU=LinkedDataHub,O=AtomGraph,L=Copenhagen,ST=Denmark,C=DK
linkeddatahub_1 |
linkeddatahub_1 | ### Secretary WebID certificate's modulus: c33d3ab2873ed78...
linkeddatahub_1 | ### Waiting for http://fuseki-admin:3030/ds/...
linkeddatahub_1 | ### URL http://fuseki-admin:3030/ds/ not responding after 20 seconds, exiting...
linkeddatahub_linkeddatahub_1 exited with code
I'm not sure where localhost:3333 comes from.
After subsequent launch attempts, the only change is one more error line in the logs:
linkeddatahub_1 | keytool error: java.lang.Exception: Key pair not generated, alias <ldh> already exists
LinkedDataHub has no problem sending concurrent streams of outbound RDF data, but it turns out the Graph Store Protocol fails on concurrent write (`POST`, `PUT`, `DELETE`) requests. At least Dydra fails; Fuseki needs to be tested further.
The solution would be to queue outbound requests in `ImportListener`. Possibly as easy as turning the number of `Executor` threads down to 1.
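The single-executor-thread idea could be sketched like this. This is a minimal stand-in, not the actual `ImportListener` API: `SerialImportQueue`, `submitWrite`, and the string log are hypothetical names used for illustration. The point is that `Executors.newSingleThreadExecutor()` serializes all submitted tasks, so the quad store only ever sees one write request at a time.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class SerialImportQueue {
    // A single-threaded executor guarantees the submitted write requests
    // execute one at a time, in submission order.
    private final ExecutorService executor = Executors.newSingleThreadExecutor();
    private final List<String> log = new CopyOnWriteArrayList<>();

    /** Queues a (mock) Graph Store Protocol write request. */
    public Future<?> submitWrite(String request) {
        return executor.submit(() -> log.add(request)); // runs serially
    }

    public List<String> log() { return log; }

    /** Drains the queue and stops the worker thread. */
    public void shutdown() {
        executor.shutdown();
        try { executor.awaitTermination(5, TimeUnit.SECONDS); }
        catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}
```

Callers still get a `Future` back per request, so the change stays invisible to the rest of the import pipeline.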
Apparently `keytool` (which we use during signup to generate the WebID certificate) does not accept non-ASCII characters:
stage.linkeddatahub-prod_1 | keytool error: java.security.KeyStoreException: Key protection algorithm not found: java.security.UnrecoverableKeyException: Encrypt Private Key failed: getSecretKey failed: Password is not ASCII
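A workaround on the signup side would be to validate the password before ever invoking `keytool`. This is a hypothetical helper (the class and method names are illustrative, not existing LinkedDataHub code):

```java
public class PasswordCheck {
    /** keytool's keystore password handling rejects non-ASCII passwords,
        so reject them up front with a friendly validation error instead
        of surfacing a KeyStoreException during signup. */
    public static boolean isAscii(String password) {
        return password.chars().allMatch(c -> c < 128);
    }
}
```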
A `PUT` request should only add (or update?) the `dct:modified` value, leaving `dct:created` intact. Right now the `dct:created` value is updated instead.
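The intended merge semantics can be sketched with plain maps standing in for the stored and incoming RDF descriptions (a simplification; the real code operates on Jena models). `PutMetadata.merge` is a hypothetical name: the stored `dct:created` wins over anything the client sends, and `dct:modified` is always refreshed.

```java
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

public class PutMetadata {
    /** Merges metadata on PUT: dct:created from the stored copy is kept
        (any client-supplied value is ignored), dct:modified is stamped
        with the time of this update. */
    public static Map<String, String> merge(Map<String, String> stored,
                                            Map<String, String> incoming,
                                            Instant now) {
        Map<String, String> result = new HashMap<>(incoming);
        // preserve the original creation timestamp
        if (stored.containsKey("dct:created"))
            result.put("dct:created", stored.get("dct:created"));
        // always record when this update happened
        result.put("dct:modified", now.toString());
        return result;
    }
}
```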
When trying to follow the instructions to start LinkedDataHub, I encounter an error that prevents the application from starting (see startup.log). As a note, I was running the Docker image on macOS 10.15.7.
Has this issue been experienced before?
`getSystem().getWebIDModelCache()` optimizes authentication by avoiding having to load `Agent` data on each authentication. However, it should have an expiration period. Similarly, we need to introduce `getSystem().getOIDCModelCache()`.
One possible solution is Guava's `CacheBuilder`.
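For illustration, here is a minimal stdlib sketch of a cache with an expiration period; in practice the fix would more likely use Guava's `CacheBuilder` with `expireAfterWrite`. The class name and `get` signature are hypothetical, not existing LinkedDataHub API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class ExpiringModelCache<K, V> {
    private record Cached<T>(T value, long loadedAt) { }

    private final Map<K, Cached<V>> entries = new ConcurrentHashMap<>();
    private final long ttlMillis;

    public ExpiringModelCache(long ttlMillis) { this.ttlMillis = ttlMillis; }

    /** Returns the cached value for key, invoking loader only when the
        entry is absent or older than the TTL — so stale Agent/OIDC data
        gets re-fetched instead of living forever. */
    public V get(K key, Function<K, V> loader) {
        Cached<V> cached = entries.get(key);
        if (cached == null || System.currentTimeMillis() - cached.loadedAt() > ttlMillis) {
            cached = new Cached<>(loader.apply(key), System.currentTimeMillis());
            entries.put(key, cached);
        }
        return cached.value();
    }
}
```

With Guava this collapses to `CacheBuilder.newBuilder().expireAfterWrite(ttl).build(loader)`, which also handles eviction and concurrency edge cases for free.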
Some comments from a test user: `@` was accepted.
Add a `[Save]` button in LD browser mode and the SPARQL query editor to save the current RDF graph.
We don't want to render hidden RDF/POST forms with a lot of data, so this needs some client-side logic. The prerequisite is probably that we load the data on the client side in the first place, which is currently not the case for the LD browser.
Does it make sense to provide a container choice? We would need to attach all documents to it, but some resources might remain disconnected.
Maybe we need to store it in a "user-space" named graph and provide access to it through a new Graphs container, while keeping system named graphs hidden(?).
E.g. when looking at Root in Graph mode, `ac:SVG` mode throws an error in Saxon-CE:
SEVERE: XPathException in mode: '{http://saxonica.com/ns/interactiveXSLT}onclick' event: '[object MouseEvent]: An empty sequence is not allowed as the value of variable $min-x
There's some new logic in `ResourceBase.put()` that retains the original `dct:created` value and adds a `dct:modified` value. This needs an HTTP test.
After signing up, the email says:
When you're ready, proceed to create a context for your Linked Data applications by following this link: https://linkeddatahub.com/docs/create?forClass=https%3A%2F%2Flinkeddatahub.com%2Fdocs%2Fns%23Context
Read more about LinkedDataHub contexts here: https://linkeddatahub.com/docs/manage-apps#contexts
But this fails with:
HTTP Status 500 - com.sun.jersey.api.container.ContainerException: Unable to create resource class com.atomgraph.platform.server.impl.ResourceBase
root cause
...
com.atomgraph.processor.exception.ParameterException: Parameter 'forClass' not supported by Template '[<http://atomgraph.com/ns/platform/templates#Item>: "(.*)", 0.0]'
com.atomgraph.processor.util.TemplateCall.applyArguments(TemplateCall.java:90)
The correct link should probably be https://linkeddatahub.com/create?forClass=https%3A%2F%2Flinkeddatahub.com%2Fns%23Context
When I launched with Docker, the linkeddatahub container crashed with the error Admin base URI and/or admin quad store could not be extracted from /WEB-INF/classes/com/atomgraph/linkeddatahub/system.trig for root app.
The remaining containers started properly.
Bad `Accept-Language` header value: en-GB,en;q=0.9,en-US;q=0.8,nl;q=0.7,es-419;q=0.6,es;q=0.5,de;q=0.4
Browser: Chrome on macOS
Reported by @warpr
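For reference, this header is valid per RFC 4647 (a language range may contain numeric subtags such as the UN M.49 region code in `es-419`), and the JDK's own parser accepts it. A thin wrapper to demonstrate (the class name is illustrative):

```java
import java.util.List;
import java.util.Locale;

public class AcceptLanguageParse {
    /** Parses an Accept-Language header into weighted language ranges.
        Locale.LanguageRange accepts alphanumeric subtags like "es-419",
        which a stricter custom parser might wrongly reject. */
    public static List<Locale.LanguageRange> parse(String header) {
        return Locale.LanguageRange.parse(header); // sorted by descending weight
    }
}
```

Delegating to `Locale.LanguageRange.parse` (or an equivalently RFC 4647-conformant parser) instead of a hand-rolled one would make headers like the one above parse cleanly.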
Returns `500 Internal Server Error` due to:
A message body reader for Java class org.apache.jena.rdf.model.Model, and Java type interface org.apache.jena.rdf.model.Model, and MIME media type text/xml was not found