
wikibase-sdk's Introduction

wikibase-sdk

JS utils functions to query a Wikibase instance and simplify its results

This package was primarily developed as wikidata-sdk but has since been generalized to support any Wikibase instance, wikidata.org among others, and was thus renamed wikibase-sdk.

This project received a Wikimedia Project Grant.


Changelog

See CHANGELOG.md for version info

Dependencies

Node.js >= v12.0.0, or a reasonably recent web browser (see the Object.fromEntries browser compatibility table). For older JS runtimes, you can use the ES5 bundles from wikibase-sdk <= v8.

Install

as an ES module

Install via npm to be able to import the module:

npm install wikibase-sdk

Then in your JavaScript:

import { WBK } from 'wikibase-sdk'
const wbk = WBK({
  instance: 'https://my-wikibase-instan.se',
  sparqlEndpoint: 'https://query.my-wikibase-instan.se/sparql' // Required to use `sparqlQuery` and `getReverseClaims` functions, optional otherwise
})

The wdk object found in previous versions of this documentation (from the time this module was bound to wikidata.org only) thus corresponds to the following:

import { WBK } from 'wikibase-sdk'

const wdk = WBK({
  instance: 'https://www.wikidata.org',
  sparqlEndpoint: 'https://query.wikidata.org/sparql'
})

For convenience, and for the sake of backward compatibility, that same wdk object can be obtained directly from the wikibase-sdk/wikidata.org package:

import wdk from 'wikibase-sdk/wikidata.org'

By default wikibase-sdk assumes that your Wikibase instance has $wgScriptPath set to /w, but if that's not the case, you can set it by passing a wgScriptPath parameter:

import { WBK } from 'wikibase-sdk'
const wbk = WBK({
  instance: 'https://my-wikibase-instan.se',
  wgScriptPath: '/some_custom_script_path'
})

as a CommonJS module

Importing with CommonJS require is no longer supported as of v9.0.0, but it can still be done by installing an older version:

npm install wikibase-sdk@v8

See the corresponding version documentation

download pre-bundled files

Pre-bundled files are no longer provided as of v9.0.0, but you can still download the bundles from older versions:

wget https://raw.githubusercontent.com/maxlath/wikibase-sdk/v8.1.1/dist/wikibase-sdk.js
wget https://raw.githubusercontent.com/maxlath/wikibase-sdk/v8.1.1/dist/wikidata-sdk.js
wget https://raw.githubusercontent.com/maxlath/wikibase-sdk/v8.1.1/dist/wikibase-sdk.min.js
wget https://raw.githubusercontent.com/maxlath/wikibase-sdk/v8.1.1/dist/wikidata-sdk.min.js

See the corresponding version documentation

Features

Wikibase API

A set of functions to make read queries to a Wikibase instance API (see Wikidata API documentation). For write operations, see wikibase-edit.
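As an illustration, the kind of URL those read-query functions generate can be sketched by hand. This is a simplified stand-in, not the library's code, and it assumes the instance uses the default /w script path:

```javascript
// Hand-rolled sketch of a read-query URL, similar in shape to what
// wbk.getEntities produces (assumption: default /w script path)
function getEntitiesUrl (instance, ids, languages = [ 'en' ]) {
  const params = new URLSearchParams({
    action: 'wbgetentities',
    ids: ids.join('|'),             // the Action API takes pipe-separated lists
    languages: languages.join('|'),
    format: 'json'
  })
  return `${instance}/w/api.php?${params}`
}

getEntitiesUrl('https://www.wikidata.org', [ 'Q1', 'Q5' ])
```

You would then fetch that URL with whatever HTTP client you prefer, which is exactly the design constraint discussed below: the lib only builds URLs.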

Wikibase Query

There are additional functions for Wikibase instances that have a SPARQL Query Service (such as Wikidata Query for wikidata.org). SPARQL can feel strange at first, but the Wikidata team and community have put a lot of effort into making it approachable, with a rich Wikidata Query Help page, an awesome tool to test your queries and visualize the results, and lots of examples!
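The query URLs handed to such an endpoint follow a simple shape; here is a hand-rolled sketch (a hypothetical standalone helper, not the library's actual implementation):

```javascript
// Minimal sketch of a SPARQL query URL against a query service endpoint
function sparqlQueryUrl (endpoint, sparql) {
  return `${endpoint}?format=json&query=${encodeURIComponent(sparql)}`
}

const url = sparqlQueryUrl(
  'https://query.wikidata.org/sparql',
  'SELECT ?item WHERE { ?item wdt:P31 wd:Q5 } LIMIT 1'
)
```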

General helpers

Contributing

Context

This library's primary purpose was to serve the needs of the inventaire project, but extending its capabilities to other needs is totally possible: feel welcome to post your suggestions as issues or pull requests!

Design constraints

  • wikibase-sdk should stay "small" and dependency-free, so that a web application can include it in its bundle without paying too high a cost for it. A consequence is that the lib generates URLs where other libs would also make the request and parse its response. But that actually feels like the right approach: simply generating the URLs leaves users free to handle requests as they like (with callbacks, promises, async/await, a custom request agent, whatever!)
  • Therefore, it should focus on providing the basic, general helper functions that most applications working with a Wikibase instance would need.
  • Write operations should go into wikibase-edit, as they involve working with Wikibase credentials/tokens.
  • General command-line interface tools should go into wikibase-cli; very specific ones (wikibase-dump-filter and the like) should get their own modules.

See Also

You may also like

inventaire banner

Do you know Inventaire? It's a web app to share books with your friends, built on top of Wikidata! And it's libre software too.

License

MIT

wikibase-sdk's People

Contributors

brunnock, bynaristar, daanvr, edjopato, ent8r, fuddl, josef-friedrich, larsgw, lsjroberts, maxlath, moshest, mshd, nichtich, noinkling, offirmo, simon04


wikibase-sdk's Issues

Feeding Wikidata

@maxlath since you seem to be an expert on Wikidata, I'll share with you an idea I have (happy to continue the discussion via email or in private if interested).

I would like to allow the community (aka the public) to feed Wikidata with information of public interest that is still, somehow, provided as a "premium" service.

I have been working in the knowledge graphs and contextual research domain, and my point of view is that while data is redundant, insights are not, and I would like to facilitate and enable diverse types of people (and companies) to compute their own analytics.

To do so, they need to access data.

I'd like to leverage crowd-sourced data, and use Wikidata as the repository.

Does your tool allow feeding and storing new entities and properties on Wikidata?

I tried to contact them, but it is not so clear to me how the process works, and I would like to simplify it.
The data I would like to focus on is data about companies around the world, a very central aspect in enabling "decentralised" governance over economic networks.


P.S. if you are interested, this is one of the applications I designed to explore and make sense of knowledge networks, here applied to the English Wikipedia: http://nifty.works. I would like to apply it to other economic networks like http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0025995

Entity/Item naming issue

In helpers such as isWikidataEntityId, the term entity is wrongly used to designate items: in the Wikidata lexicon, entities include both properties and items.
Also, we could drop the wikidata part in most helper names, given it's a lib about Wikidata.
Both changes would constitute breaking changes.

Add option to keep qualifiers in simplifyClaims()

After simplifyClaims(), all qualifiers are discarded. This makes it impossible to e.g. merge P50 (author) and P2093 (author string) after fetching labels, as the P1545 (series ordinal) qualifier is gone. It would be nice to have an option on simplifyClaims(), or a similar method, to keep these.
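To illustrate what such an option would enable, here is a hypothetical post-processing step. The claim shape `{ value, qualifiers }` and all names are illustrative assumptions, not the library's API:

```javascript
// Merge P50 (author) and P2093 (author string) claims, ordered by their
// P1545 (series ordinal) qualifier. The input shape is assumed.
function mergeAuthors (claims) {
  const all = [ ...(claims.P50 || []), ...(claims.P2093 || []) ]
  return all
    .sort((a, b) => Number(a.qualifiers.P1545[0]) - Number(b.qualifiers.P1545[0]))
    .map(claim => claim.value)
}

mergeAuthors({
  P50: [ { value: 'Q535', qualifiers: { P1545: [ '2' ] } } ],
  P2093: [ { value: 'Some Author', qualifiers: { P1545: [ '1' ] } } ]
})
// → [ 'Some Author', 'Q535' ]
```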

shortLang behavior when fetching zh-hans

Thanks for making this awesome tool!

I've run into this issue when trying to query zh-hans (Simplified Chinese) as a language option in getEntities. It seems to be force-converted to zh; what's the rationale behind this? In the case of zh we have many variants, so each language code returns a different result.

reference:

// Ex: keep only 'fr' in 'fr_FR'

Should the forced lang conversion become an option, or would it be better to leave the conversion to end users?

thanks!

Time helpers lose month info on month-precision values

Example: publication date (P577) claim on Q756866 (JSON) has "month" precision.

datavalue snippet:

{
  "value": {
    "time": "+1990-11-00T00:00:00Z",
    "timezone": 0,
    "before": 0,
    "after": 0,
    "precision": 10,
    "calendarmodel": "http://www.wikidata.org/entity/Q1985727"
  },
  "type": "time"
}

The month is being lost when using wikidataTimeToDateObject/EpochTime/ISOString, but the year stays correct:

> wdk.wikidataTimeToISOString(entity.claims.P577[0].mainsnak.datavalue.value)
'1990-01-01T00:00:00.000Z'

I'm aware that wikidataTimeToSimpleDay will handle it correctly, however sometimes it's just necessary to have a Date object/ISO string/Unix timestamp, even if they can't represent the precision.

The piece of code that seems to be responsible:
https://github.com/maxlath/wikidata-sdk/blob/96b9d15ae562e36458fca8e661840818a65e6d4b/lib/helpers/wikidata_time_to_date_object.js#L39-L45

I believe this issue only affects month-precision values, as day-precision values are already valid and will parse OK, and years are obviously already handled.
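A minimal sketch of a month-preserving conversion, for CE dates only and assuming the input shape shown above (this is not the library's implementation):

```javascript
// Keep the month on month-precision values: instead of resetting month and
// day, only pad the zeroed parts to '01' before handing off to Date.
// Sketch for positive (CE) years only.
function monthAwareToISOString (wikidataTime) {
  const [ date, time ] = wikidataTime.replace(/^\+/, '').split('T')
  let [ year, month, day ] = date.split('-')
  if (month === '00') month = '01' // year precision: month is zeroed
  if (day === '00') day = '01'     // month precision: day is zeroed
  return new Date(`${year}-${month}-${day}T${time}`).toISOString()
}

monthAwareToISOString('+1990-11-00T00:00:00Z')
// → '1990-11-01T00:00:00.000Z'
```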

Add feature: Simplify multiple entities at once

It may be useful in some situations to just simplify a bunch of entities all at the same time, so after using wdk.getEntities(['Q1', 'Q2', 'Q3', ..., 'Q123']) you can simplify them at once instead of going one by one with wdk.simplify.entity(entity).
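The requested batch behaviour is essentially a map over the entities object; here is a hypothetical helper (names are illustrative, not the library's API):

```javascript
// Hypothetical batch helper: apply a per-entity simplifier to every entity
// in a wbgetentities-style response ({ Q1: {...}, Q2: {...} })
function simplifyAllEntities (entities, simplifyEntity) {
  const simplified = {}
  for (const [ id, entity ] of Object.entries(entities)) {
    simplified[id] = simplifyEntity(entity)
  }
  return simplified
}

simplifyAllEntities({ Q1: { id: 'Q1' }, Q2: { id: 'Q2' } }, entity => entity.id)
// → { Q1: 'Q1', Q2: 'Q2' }
```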

wdk.parse does not work

I have no use case for wdk.parse but tried to understand its potential use case. It basically wants to do the same as wdk.simplify.entities?

But it does not work, as simplifyEntity is not a function anymore. It is now an object containing simplifyEntity and simplifyEntities.

According to git blame this was changed 9 months ago (in 58caecb).
As I don't see anyone complaining about it not working, should it be removed?

Or fixed and moved to the Legacy part of the index.js? Edit: Looks like it was meant to be in a legacy part of the index.js, just not obvious enough for me 😬. Maybe just one single legacy section at the end?

/node_modules/wikidata-sdk/lib/helpers/parse_responses.js:9
  TypeError {
    message: 'simplifyEntity is not a function',
  }

Simplification throws error when encountering somevalue/novalue qualifiers

Real examples:

var e = data.entities.Q19180293
wdk.simplify.qualifiers(e.claims.P1433[0].qualifiers.P156[0])
TypeError: Cannot destructure property `value` of 'undefined' or 'null'.
    at Object.simplifyClaim (/<...>/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:96:17)
    at parse (/<...>/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:84:31)
    at Object.entity (/<...>/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:9:40)
    at prefixedId (/<...>/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:18:21)

Which means of course that following functions all crash too:

wdk.simplify.propertyQualifiers(e.claims.P1433[0].qualifiers.P156)
wdk.simplify.qualifier(e.claims.P1433[0].qualifiers)
wdk.simplify.claim(e.claims.P1433[0], { keepQualifiers: true })
wdk.simplify.propertyClaims(e.claims.P1433, { keepQualifiers: true })
...
wdk.simplify.entities(data.entities, { keepQualifiers: true })

Setting somevalueValue and novalueValue doesn't help.

P.S. The naming of the simplify.qualifier and simplify.qualifiers functions is a little confusing, as if they were swapped. But they are documented correctly. Was this intentional?

how to use the keep qualifiers option

Hi, I don't quite understand how to use the "keep qualifiers" interface.

Let's say I've got this query from your docs:

const url = wbk.getEntities({
  ids: [ 'Q146', 'Q726' ], // these IDs contain qualifiers in Wikidata
})

fetch(url)
  .then(response => response.json())
  .then(wbk.parse.wd.entities)
  .then(entities => { /* do your thing with those entities data */ })

So now I've got some entities (which have qualifiers in wikidata). So how would I use the API code to show the qualifiers in this context?

wbk.simplify.claims( entity.claims, { keepQualifiers: true })

I've tried calling that code on each entity, but it just returns empty items for me. I would have expected to be able to set an option to show the qualifiers for all entities, using e.g. the wdk.getEntities options object. I am probably overlooking something simple here :-)

Thanks for a great library!

unused selected variable make wdk.simplifySparqlResults crash

importing the discussion started in a wikidata-taxonomy issue

example request:

SELECT ?item ?itemLabel ?broader ?parents
WHERE
{
    {
        SELECT ?item (count(distinct ?parent) as ?parents) {
            ?item wdt:P279* wd:Q1
            OPTIONAL { ?item wdt:P279 ?parent }
        } GROUP BY ?item
    }
    SERVICE wikibase:label {
        bd:serviceParam wikibase:language "en" .
    }
}

result:

{
  "head" : {
    "vars" : [ "item", "itemLabel", "broader", "parents" ]
  },
  "results" : {
    "bindings" : [ {
      "item" : {
        "type" : "uri",
        "value" : "http://www.wikidata.org/entity/Q1"
      },
      "parents" : {
        "datatype" : "http://www.w3.org/2001/XMLSchema#integer",
        "type" : "literal",
        "value" : "0"
      },
      "itemLabel" : {
        "xml:lang" : "en",
        "type" : "literal",
        "value" : "universe"
      }
    } ]
  }
}

The broader variable gets no result as it's not used in the WHERE block, thus making parseValue crash.
@nichtich: what do you think, should the simplified result still have a broader key with value null or should the key be totally ignored?

getSitelinkData throwing error because of missing language

> wdk.getSitelinkData('shnwiki')
Error: sitelink lang not found: shn

It's present in the list at https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities and you can find an example of usage here: https://www.wikidata.org/wiki/Special:EntityData/Q1860.json

{"site":"shnwiki","title":"လိၵ်ႈဢင်းၵိတ်ႉ","badges":[],"url":"https://shn.wikipedia.org/wiki/%E1%80%9C%E1%80%AD%E1%81%B5%E1%80%BA%E1%82%88%E1%80%A2%E1%80%84%E1%80%BA%E1%80%B8%E1%81%B5%E1%80%AD%E1%80%90%E1%80%BA%E1%82%89"}

I don't know if any other languages are missing.
@maxlath Do you generate the array in sitelinks_languages.js with a script or is it done by hand?
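One possible direction, sketched here as a hypothetical fallback rather than a fix in the lib: derive the language from the site code by pattern, so new wikis don't break, keeping an explicit map only for special sites:

```javascript
// Pattern-based fallback: split a site code like 'shnwiki' into language
// and project. Special sites (commonswiki, wikidatawiki, ...) would still
// need an explicit map; this is only an illustrative sketch.
function parseSitelinkSite (site) {
  const match = site.match(/^([a-z_]+?)(wiki(?:pedia|source|quote|books|news|versity|voyage)?|wiktionary)$/)
  if (!match) throw new Error(`invalid site: ${site}`)
  return { lang: match[1].replace(/_/g, '-'), project: match[2] }
}

parseSitelinkSite('shnwiki')
// → { lang: 'shn', project: 'wiki' }
```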

Time conversion error at the start of the universe

Error

During routine tests (using Q1 as a simple id), I got this error, a lot:

wikidata-sdk time conversion error: RangeError: Invalid time value
    at Date.toISOString (<anonymous>)
    at toISOString (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/helpers.js:28:35)
    at value (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/helpers.js:19:12)
    at Object.time (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:41:49)
    at module.exports (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:76:32)
    at simplifyClaim (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:85:17)
    at propClaims.map.claim (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:19)
    at Array.map (<anonymous>)
    at simplifyPropertyClaims (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:6)
    at Object.simplifyClaims (/home/larsw/public_html/Citation/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:14:24)

Input

To be fair, the time values being parsed are pretty long ago, but I don't think the values themselves are invalid:

{ time: '-13798000000-00-00T00:00:00Z',
  timezone: 0,
  before: 0,
  after: 0,
  precision: 3,
  calendarmodel: 'http://www.wikidata.org/entity/Q1985727' }

Code

Run on RunKit.

const wdk = require("wikidata-sdk")
require('isomorphic-fetch')

const data = await fetch(wdk.getEntities('Q1')).then(response => response.json())

wdk.simplify.entity(data.entities.Q1)

Solution

None, it seems to be a limitation of JavaScript Date:

const d = new Date()
d.setYear(13798000000) // NaN
d // Invalid date

Optionally reduce simplified data

Add an option to further reduce data on simplifiation by

  • removal of empty arrays and objects such as "references": [], "qualifiers": {}"
  • making single-element arrays simple values (e.g. "P31": "Q5" instead of "P31": ["Q5"])

This reduced format is better suitable for human inspection and reuse in edit-entity (see maxlath/wikibase-cli#63).
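A sketch of such a reduction, written as a standalone post-processor over already-simplified data (the name and exact behaviour are a proposal, not the library's API):

```javascript
// Recursively drop empty arrays/objects and unwrap single-element arrays
function reduceSimplified (value) {
  if (Array.isArray(value)) {
    const arr = value.map(reduceSimplified).filter(v => v !== undefined)
    if (arr.length === 0) return undefined        // "references": [] → dropped
    return arr.length === 1 ? arr[0] : arr        // [ "Q5" ] → "Q5"
  }
  if (value !== null && typeof value === 'object') {
    const obj = {}
    for (const [ key, val ] of Object.entries(value)) {
      const reduced = reduceSimplified(val)
      if (reduced !== undefined) obj[key] = reduced
    }
    return Object.keys(obj).length === 0 ? undefined : obj
  }
  return value
}

reduceSimplified({ P31: [ 'Q5' ], P50: [ 'Q535', 'Q1243' ], references: [], qualifiers: {} })
// → { P31: 'Q5', P50: [ 'Q535', 'Q1243' ] }
```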

simplifySparqlResults: respect numeric datatypes

    "birth" : {
        "datatype" : "http://www.w3.org/2001/XMLSchema#integer",
        "type" : "literal",
        "value" : "1903"
      }

should be simplified as

   "birth": 1903

instead of

   "birth": "1903"

I don't know whether there can be other numeric types such as xsd:float (?)
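A sketch of datatype-aware parsing for literal bindings. The datatype URIs are standard XSD; the helper name and the exact set of handled types are assumptions:

```javascript
// Map numeric XSD datatypes to JS numbers; leave other literals as strings
const numericDatatypes = new Set([
  'http://www.w3.org/2001/XMLSchema#integer',
  'http://www.w3.org/2001/XMLSchema#decimal',
  'http://www.w3.org/2001/XMLSchema#float',
  'http://www.w3.org/2001/XMLSchema#double'
])

function parseLiteral ({ datatype, value }) {
  return numericDatatypes.has(datatype) ? Number(value) : value
}

parseLiteral({ datatype: 'http://www.w3.org/2001/XMLSchema#integer', type: 'literal', value: '1903' })
// → 1903
```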

List of sitelink languages has to be updated manually

Currently the list of sitelink languages has to be updated manually. When using the sitelinks of an entity and putting them all into getSitelinkData, the ones with not-yet-known languages will fail.
Updating the list will fix this until the next language arrives.

Any ideas how to approach this better than having a fixed list that needs manual updates?

TypeError: wikibase-form claim parser isn't implemented

I'm encountering this error just a couple of minutes after the parser begins running. Strangely, I haven't changed anything with my set up. And I've tried to run the same parse in two different environments and both encounter the same issue.

Claim ID: Q275937$60A491E6-E5FF-4A2E-934C-9F0FD3072539

Stack Trace:

at parse (/usr/local/lib/node_modules/wikidata-filter/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:93:31)
    at simplifyClaim (/usr/local/lib/node_modules/wikidata-filter/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:96:13)
    at propClaims.map.claim (/usr/local/lib/node_modules/wikidata-filter/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:19)
    at Array.map (<anonymous>)
    at simplifyPropertyClaims (/usr/local/lib/node_modules/wikidata-filter/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:6)
    at simplifyClaims (/usr/local/lib/node_modules/wikidata-filter/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:14:24)
    at Object.simplifyEntity [as entity] (/usr/local/lib/node_modules/wikidata-filter/node_modules/wikidata-sdk/lib/helpers/simplify_entity.js:17:25)
    at module.exports (/usr/local/lib/node_modules/wikidata-filter/lib/format_entity.js:24:27)
    at wdFilter (/usr/local/lib/node_modules/wikidata-filter/lib/wikidata_filter.js:28:12)
    at Stream.<anonymous> (/usr/local/lib/node_modules/wikidata-filter/lib/filter.js:7:19)

And here is the command I'm running:

DUMP='https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.json.gz'
curl $DUMP | gzip -d | wikidata-filter --claim P31:Q5 --sitelink enwiki --keep id,labels,sitelinks,claims --languages en --simplify '{}' > humans_2.ndjson

simplify.sparqlResults has multiple possible result types

I would like something opposite to #40: do not simplify SPARQL results as much as is currently done, and generalize the output.

Simplifying the query (removing unneeded variables from the SELECT) results in a different output shape from simplify.sparqlResults.
Currently the result can be either an array of objects with the variables as keys, or just an array of the values of the single variable specified in the query.

(What's actually worse: undefined entries in the array get lost. But when the behaviour is changed this bug will be gone too.)

When the user wants the simple array of a single variable, it's easy to use something like .map(o => o.amountChildren). Another simplify method could also do that.

This would definitely be a breaking change.

Example query (query.wikidata.org):

SELECT DISTINCT ?amountChildren WHERE {
  ?item wdt:P31 wd:Q5.
  OPTIONAL { ?item wdt:P1971 ?amountChildren. }
}
LIMIT 5

The result of the simplify.sparqlResults will be (something like): [2, 6, 0, 1]

When the query is modified to return a second variable like ?item it will return (something like) this:

[
  {amountChildren: 6, item: 'Q1785'},
  {amountChildren: 2, item: 'Q2010'},
  {amountChildren: undefined, item: 'Q1747'},
  {amountChildren: undefined, item: 'Q1745'},
  {amountChildren: undefined, item: 'Q1750'}
]
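The generalized behaviour proposed here could be sketched as: always return one object per binding, keeping every selected variable and making unbound ones explicit (illustrative code, not the current implementation):

```javascript
// One object per result row, with every selected variable present
// (undefined when the binding is absent, e.g. from an OPTIONAL clause)
function simplifyBindings ({ head, results }) {
  return results.bindings.map(binding => {
    const row = {}
    for (const varName of head.vars) {
      row[varName] = binding[varName] ? binding[varName].value : undefined
    }
    return row
  })
}

simplifyBindings({
  head: { vars: [ 'item', 'amountChildren' ] },
  results: { bindings: [
    { item: { type: 'uri', value: 'Q1785' }, amountChildren: { type: 'literal', value: '6' } },
    { item: { type: 'uri', value: 'Q1747' } }
  ] }
})
// → one row per binding; rows[1].amountChildren is undefined, not dropped
```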

Missing field in simplifySparqlResults?

Just curious, why is personLabel removed in this simplifySparqlResults output?

  SELECT DISTINCT ?personLabel ?occupation ?twitterName ?instagramUsername
  WHERE {
    ?person wdt:P106 ?occupation.
    OPTIONAL {
      ?person wdt:P2002 ?twitterName.
      ?person wdt:P2003 ?instagramUsername.
    }
    ?occupation wdt:P279* wd:Q5.
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  }

Results are

[ { occupation: 'Q947873',
    twitterName: 'shannamoakler',
    instagramUsername: 'shannamoakler' },
  { occupation: 'Q970153',
    twitterName: 'yoshida_riko',
    instagramUsername: 'ai_yoshikawa_official' },
  { occupation: 'Q970153',
    twitterName: 'Maisie_Williams',
    instagramUsername: 'maisie_williams' },
  { occupation: 'Q947873',
    twitterName: 'VanessaLachey',
    instagramUsername: 'vanessalachey' },
  { occupation: 'Q17125263',
    twitterName: 'coollike',
 ...

what is promiseRequest?

Reading your documentation, I see this code:

// see the "SPARQL Query" section above
var url = wdk.sparqlQuery(SPARQL)
promiseRequest(url)
.then(wdk.simplifySparqlResults)
.then((simplifiedResults) => { /* do awesome stuffs here */ })

But I get an error (screenshot omitted).

Where does this promiseRequest come from?

too many claims break getEntities()

https://www.wikidata.org/wiki/Q83835 (Swiss Federal Railways - SBB) breaks getEntities() / simplify() and returns undefined result.

Seems that there are too many "owner of" (P1830) statements (too many railway stations).

==> it should be possible to sub-specify which claims should be returned (or at least which ones should not).
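Until something like that exists server-side, one client-side workaround is to drop the unwanted claims after fetching (illustrative helper, not part of the lib):

```javascript
// Keep only the listed properties from an entity's claims object,
// e.g. to discard a huge P1830 ("owner of") claim set after fetching
function pickClaims (claims, properties) {
  const picked = {}
  for (const property of properties) {
    if (claims[property]) picked[property] = claims[property]
  }
  return picked
}

pickClaims({ P31: [ 'Q5' ], P1830: [ /* thousands of values */ ] }, [ 'P31' ])
// → { P31: [ 'Q5' ] }
```

Note this only reduces memory after parsing; the full response still has to be downloaded.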

TypeError: wikibase-sense claim parser isn't implemented

I encountered this error:

TypeError: wikibase-sense claim parser isn't implemented
        Claim id: undefined
        Please report to https://github.com/maxlath/wikidata-sdk/issues
    at parse (~/node_modules/wikidata-sdk/lib/helpers/parse_claim.js:93:31)
    at simplifyClaim (~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:96:13)
    at ~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:19
    at Array.map (<anonymous>)
    at simplifyPropertyClaims (~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:6)
    at simplifyClaims (~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:14:24)
    at simplifyClaim (~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:142:28)
    at ~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:19
    at Array.map (<anonymous>)
    at simplifyPropertyClaims (~/node_modules/wikidata-sdk/lib/helpers/simplify_claims.js:33:6)

command I use:

curl https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.json.gz | gzip -d | ~/node_modules/.bin/wikidata-filter -s '{"keepRichValues":"true","keepQualifiers":"true"}' --claim org_claim > organizations.ndjson

where org_claim is a rather long list of ids, but it starts like this:

P31:Q43229,Q7275,Q11032,Q38723,Q41298,Q62447

add simplifyClaims support for all datatypes

currently supported: string, commonsMedia, url, monolingualtext, wikibase-item, time

the complete list: https://www.wikidata.org/wiki/Special:ListDatatypes

most will probably just need to be added to one of the existing chains of datatypes sharing a common parser, e.g. string, commonsMedia and url

Add ability to query entities by multiple property values

It would be very useful to be able to find all the entities that have one of several values for a property.

For example this query would allow you to find a list of books by their ISBN:

const url = wdk.getReverseClaims('P212', [ '978-0-465-06710-7', '978-2-267-02700-6' ])

Perhaps this is a roundabout approach to this problem. If there is a better way I'd be happy to know!
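In SPARQL itself, "one of several values" maps naturally onto a VALUES clause; a hypothetical query builder for the string-valued case could look like:

```javascript
// Build a reverse-claims query matching any of several string values
// (string literals only; item values would need wd: prefixes instead)
function reverseClaimsQuery (property, values) {
  const list = values.map(value => JSON.stringify(value)).join(' ')
  return `SELECT ?subject WHERE { VALUES ?value { ${list} } ?subject wdt:${property} ?value }`
}

reverseClaimsQuery('P212', [ '978-0-465-06710-7', '978-2-267-02700-6' ])
```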

Add support for very long BC dates on `wikidataTimeToISOString`

I'm not familiar with any time frame limitation in the ISO format.

I think the method wikidataTimeToISOString should always return a valid ISO string, even if the date is not valid for a native Date object. This behaviour is more consistent.

For example:

wdk.wikidataTimeToISOString('-13798000000-00-00T00:00:00Z')
'-13798000000-01-01T00:00:00Z'
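A string-level sketch of that behaviour, bypassing Date entirely so that years outside its representable range survive (not the library's implementation):

```javascript
// Normalize zeroed month/day parts in a Wikibase time string without
// going through Date, so arbitrarily large (BC) years are preserved
function bigDateToISOString (wikidataTime) {
  const sign = wikidataTime.startsWith('-') ? '-' : ''
  const [ date, time ] = wikidataTime.replace(/^[+-]/, '').split('T')
  const [ year, month, day ] = date.split('-')
  const m = month === '00' ? '01' : month
  const d = day === '00' ? '01' : day
  return `${sign}${year}-${m}-${d}T${time}`
}

bigDateToISOString('-13798000000-00-00T00:00:00Z')
// → '-13798000000-01-01T00:00:00Z'
```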

add wikidata-sdk to cdnjs

For managing client libraries I use LibMan

But when I tried to add it:

{
      "library": "wikidata-sdk",
      "destination": "lib/wikidata"
 }

I got the error:

The "wikidata-sdk" library could not be resolved by the "cdnjs" provider

Of course, I can add it manually, no problem.

I just thought you can be interested to add your library to cdnjs.

More examples/samples

We don't have a /examples dir in our project.

To increase usability and reach of this awesome sdk and toolkit we have for wikidata in JS, I was planning to create a set of examples/samples.
It would be awesome to get some feedback, views and ideas regarding the same.

Or do we have a repository for these that I am missing?

install via bower can't find version 5.16.0

Trying to install the latest version of wikidata-sdk via bower install wikidata-sdk results in the following additional error details:

Available versions in https://github.com/maxlath/wikidata-sdk.git: 5.1.4, 5.1.3, 5.1.2, 5.1.1, 5.1.0, 5.0.0, 4.3.3, 4.3.2, 4.3.1, 4.3.0, 4.2.5, 4.2.4, 4.2.3, 4.2.2, 4.2.1, 4.2.0, 4.1.0, 4.0.6, 4.0.5, 4.0.4, 4.0.3, 4.0.2, 4.0.1, 4.0.0, 3.2.4, 3.2.3, 3.2.2, 3.2.1, 3.2.0, 3.1.0, 3.0.1, 3.0.0, 2.6.0, 2.5.0, 2.4.1, 2.4.0, 2.3.0, 2.2.2, 2.2.1, 2.2.0, 2.1.1, 2.1.0, 2.0.0, 1.4.4, 1.4.3, 1.4.2, 1.4.1, 1.4.0, 1.3.7, 1.3.6, 1.3.5, 1.3.4, 1.3.3, 1.3.2, 1.3.1, 1.3.0, 1.2.1, 1.2.0, 1.1.0, 1.0.2, 1.0.1, 1.0.0, 0.1.1, 0.1.0, 0.0.6, 0.0.5, 0.0.4, 0.0.3, 0.0.2

Access-Control-Allow-Origin Error

First, thanks so much for this wonderful interface to wikibase. Also, this may be an issue only with my understanding rather than an actual issue with wikibase-sdk. Regardless, hopefully I'll be set straight here.

I'm getting what seems like some odd behavior. I'm making a series of sparql queries as follows, where rp is request-promise:

async function getItems (Qid) {
  const sparql = `... ${Qid} ...`; // I've omitted my actual query here
  const url = wbk.sparqlQuery(sparql);
  return rp(url).then(wbk.simplify.sparqlResults);
}

results = await getItems(...);

All of this seems to work just fine at the beginning. I get the expected results for my sparql queries, life is good. However, for some queries with certain Qids I get the following error:

Origin http://localhost:... is not allowed by Access-Control-Allow-Origin.
Fetch API cannot load https://query.wikidata.org/sparql?format=json&query=... due to access control checks.

The same sparql works for another Qid. Any idea what is going on? Why does CORS seem to be a problem sometimes and not others? Very confused. I saw somewhere else that I should make sure I am requesting JSON, but the url has format=json so this should be fine I assume. Apologies in advance if this is something more fundamental that has nothing to do with wikibase-sdk. Any advice or pointers in the right direction will be greatly appreciated.

Cheers,
Marcel

Support simplification of special snaktypes on request

The special values "no value" and "somevalue" are filtered out by simplification but they could also be mapped to special values such as - and ? on request. I think the value null is best but this collides with use of null to skip data. The special values might help to find out which statements are complete and where information is lacking.

Negative/BC dates are not parsed correctly when coming from the JSON API

Summary of the issue from https://www.mediawiki.org/wiki/Wikibase/DataModel/JSON#time:

[in JSON] Years BCE are represented as negative numbers, using the historical numbering, in which year 0 is undefined, and the year 1 BCE is represented as -0001, the year 44 BCE is represented as -0044, etc., like XSD 1.0 (ISO 8601:1988) does. In contrast, the RDF mapping relies on XSD 1.1 (ISO 8601:2004) dates that use the proleptic Gregorian calendar and astronomical year numbering, where the year 1 BCE is represented as +0000 and the year 44 BCE is represented as -0043.

This can be confirmed with the following examples:

Q25299

"Point in time" (P585) value is shown as 1 BCE.

JSON snippet:

  "property": "P585",
  "datavalue": {
    "value": {
      "time": "-0001-01-01T00:00:00Z",

RDF TTL snippet:

wdt:P585 "0000-01-01T00:00:00Z"^^xsd:dateTime ;

Q25298 (2 BC)

JSON:

"time": "-0002-01-01T00:00:00Z"

TTL:

wdt:P585 "-0001-01-01T00:00:00Z"^^xsd:dateTime ;

JavaScript behaviour

The ES spec (since ES5) gives these specific examples:

-283457-03-21T15:00:59.008Z 283458 B.C.
-000001-01-01T00:00:00Z 2 B.C.
+000000-01-01T00:00:00Z 1 B.C.
+000001-01-01T00:00:00Z 1 A.D.

This means that JS follows the 2004 version of the ISO spec, and therefore matches the RDF serialization, but not the JSON.

In order to fix the time converter functions when parsing JSON-originating time values, 1 year needs to be added/subtracted (depending on which way you look at it) to years that are negative, if we want them to be accurate.

However since the same functions can be useful for dates coming from SPARQL queries or another RDF source, there might need to be an extra option parameter to select between behaviours, or two sets of functions to make it clear.
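The core of the fix described above is a one-year shift on negative years when converting from the JSON's historical numbering to astronomical numbering (sketch, with a hypothetical function name):

```javascript
// Wikibase JSON uses historical numbering (-0001 = 1 BCE), while JS Date
// and ISO 8601:2004 use astronomical numbering (0 = 1 BCE), so negative
// JSON years must be shifted by one when parsing
function historicalToAstronomicalYear (jsonYear) {
  return jsonYear < 0 ? jsonYear + 1 : jsonYear
}

historicalToAstronomicalYear(-1)  // 1 BCE → 0
historicalToAstronomicalYear(-44) // 44 BCE → -43
```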
