[00:06:03] calnation just ask your questions :)
[00:06:40] wikidata admins are users with special rights on the wikidata wiki, they aren't inherently capable of answering questions regarding the wikibase software
[00:08:26] oh sorry then. Let's just say I didn't see any other bell to ring!
[00:09:06] first I just wanna make sure, is it usable on non-wikimedia wikis or is it mainly made for their internal use?
[00:10:09] it's mainly made for use on wikidata, but you can use it for your own wikibase repo
[00:10:35] much in the same way mediawiki is mainly made for use on wikimedia foundation wikis and wikia, but you could use it for your own wiki :)
[00:11:12] well i wanted something to use instead of the RelatedSites extension
[00:13:20] Maybe I misunderstood, but what I thought is that wikibase has the ability to do the same thing, which is to add links in the sidebar of other wikis when the article name happens to exist on those wikis
[00:14:11] that's a feature of mediawiki, with a touch of the wikibase extension, if memory serves
[00:14:18] it's not something wikibase-client can do by itself
[00:14:41] calnation: *automatically* adding those links, based on the article name, isn't something Wikibase/Wikidata does at all afaik
[00:14:46] that sounds more like Cognate to me
[00:14:56] but I have no idea how Wiktionary-specific Cognate is
[00:14:58] the other wikis will simply get the data from the wikibase repo (not the client) when a relevant page is requested
[00:15:05] and display it on said page
[00:16:09] so basically wikibase repo acts as the hub here, right?
[00:17:20] nope
[00:17:41] mediawiki on the external wiki acts as a client, fetches data from the repo which acts as a server
[00:17:47] it's nothing like a hub
[00:18:25] a hub would mean action on wikidata's part, that's simply not the case
[00:18:33] (or wikibase repo's part)
[00:18:41] yes
[00:19:03] there is no action from the wikibase repo that changes resources on other wikis
[00:19:11] the action comes from the other wikis
[00:19:19] the wikibase repo just "is there"
[00:19:25] so what kind of data does the repo provide
[00:19:42] pretty much the entire entity requested
[00:19:55] it's requested by searching for the entity with the relevant sitelink
[00:20:40] https://www.mediawiki.org/wiki/Extension:Wikibase_Client#Other_projects_sidebar
[00:20:56] that's the section i based my thinking on
[00:21:25] your thinking started off on the wrong foot
[00:21:38] you assumed data was sent by wikibase, it's not
[00:21:50] data is requested by the other wiki
[00:22:08] THEN wikibase returns it to the other wiki on request
[00:22:16] then the other wiki does whatever it needs to do with it
[00:22:28] so router
[00:22:29] ?
[00:22:30] including displaying whatever sidebar links it deems useful
[00:22:34] no ?
[00:22:53] i don't see what a router would have to do with anything here
[00:23:37] i thought it basically connects say wiki a with wiki b and wiki c, then those respond with whatever
[00:24:11] the user browses some other wiki, the other wiki tries to see if it can fetch additional data from the wikibase repo, the wikibase repo returns the requested additional data if present, the other wiki turns that returned data into something useful, and displays the resource to the user
[00:24:26] the other wikis don't connect with each other there
[00:24:41] all they do is ask the "central repo" if you will
[00:25:52] is that central repo, that wikimedia wikis use, open to other external wikis?
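[editor's note: to make "requested by searching for the entity with the relevant sitelink" concrete — in the RDF model exposed by the Wikidata query service, a sitelink is an article URL connected to its item via schema:about, so the lookup can be sketched as the following query. The Berlin article is just an illustrative example:]

```sparql
# Find the item that a given Wikipedia page is a sitelink of
# (sitelinks are modeled with schema:about in the RDF export):
SELECT ?item WHERE {
  <https://en.wikipedia.org/wiki/Berlin> schema:about ?item .
}
# on the live query service this should return wd:Q64 (Berlin)
```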
[00:27:39] well, they definitely can request the same info in the same way
[00:28:18] adding them to sitelinks (pointed to from the entity by schema:about) however, is unlikely
[00:28:29] only WMF wikis are in schema:about, so far
[00:28:51] cuz I did set up the entire client thing, manually, no vagrant
[00:28:51] (and iirc not even all of them, some private wikis or single-purpose wikis for instance)
[00:31:59] but my wiki stopped working cuz it keeps asking for access to a database 'repo', which doesn't exist since I only had the client, and I thought, following the setup, I'd connect directly to wikidata.org or something like that
[00:32:24] that's a bit beyond my knowledge, sorry
[00:32:55] and i know the documentation is a little lacking at times, but iirc there's a hackathon this weekend that aims in part to remedy that issue ^^
[00:34:03] where?
[00:35:21] vienna, austria
[00:35:57] not sure registrations are still open, what with it being so close and all
[00:36:21] lol, way out of my commute, but it's good to know
[00:37:45] either way, do you know what that section (that i linked to above) means?
[12:02:16] beginner here. i'm trying to retrieve the entity for English (as in the language) using SPARQL. it doesn't return anything -- why? it's only a few lines, so i'll post it here:
[12:02:21] SELECT ?lang WHERE
[12:02:21] {
[12:02:23] ?lang rdfs:label "English"@en .
[12:02:24] ?lang wdt:P31 wd:Q34770 .
# ?lang instance of language
[12:02:24] }
[12:11:47] HannesP: the problem is that your item is not an instance of the "language" item but of "natural language"
[12:12:03] "natural language" is a subclass of "language"
[12:12:24] so you need to change your query to accept items that are instances of any subclass of your target type
[12:13:04] to do so, replace "wdt:P31" with "wdt:P31/wdt:P279*", which means "instance of" followed by a sequence (possibly empty) of "subclass of"
[12:13:22] here is a diagram that explains the situation: http://tinyurl.com/l3snzq8
[12:14:38] @pintoch got it, thanks!
[12:15:39] yw
[12:17:13] does this mean that wikidata doesn't implement entailment? (just read about it on wikipedia)
[12:24:36] HannesP: I wouldn't say wikidata doesn't support entailment - the Wikidata query service is just one particular service which doesn't
[12:25:21] ah, got it
[12:25:36] but this use of P31 and P279 is quite standard, so you can reasonably expect that any end-user tool should respect it
[18:06:09] Hey, I was wondering whether there are any genealogy projects using or working with WikiData? A lot of them seem to require you to create your own objects for things like places, where entering a WikiData ID would be a whole lot easier
[21:54:51] anyone know how to run multiple "delete" statements?
[21:54:57] via sparql?
[21:55:13] e.g. would this work: https://gist.github.com/nyurik/81a50bcedbae2f8062b3acaf92e01c0a
[21:57:54] yurik: if I read https://www.w3.org/TR/sparql11-update/#updateLanguage correctly, that would work, except I think there’s a semicolon missing after the closing brace of the INSERT
[21:58:28] WikidataFacts, would it break if i use the same var names in multiple delete statements?
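[editor's note: applying the 12:13 fix to HannesP's original query gives the corrected version below, exactly as described in that exchange:]

```sparql
SELECT ?lang WHERE
{
  ?lang rdfs:label "English"@en .
  # "instance of", followed by any (possibly empty) chain of "subclass of":
  ?lang wdt:P31/wdt:P279* wd:Q34770 .
}
```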
[21:58:42] and thanks, good catch
[21:59:48] disclaimer: I literally didn’t know about this syntax until you posted your gist, so anything I say here is just based on a quick glance over the spec I just saw for the first time :)
[21:59:51] that said –
[21:59:53] > Implementations MUST ensure that the operations of a single request are executed in a fashion that guarantees the same effects as executing them sequentially in the order they appear in the request.
[22:00:17] I would read this to mean that reusing variables in different parts of the query MUST NOT break the query
[22:02:22] heh, i hope it does... thanks!
[22:03:01] by the way, I really love this combination of OSM and Wikidata, and I totally need to play some more with it! thanks for putting it up :)
[22:16:07] WikidataFacts, enjoy )) i'm adding more stuff to it, e.g. for relations -- osmm:has osmway:300617282 -- indicates that a relation contains a way
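[editor's note: the semicolon-separated request form discussed above, per the SPARQL 1.1 Update grammar, looks roughly like the following sketch. The ex: prefix and triples are made up for illustration and are not from yurik's gist:]

```sparql
PREFIX ex: <http://example.org/>

# Each operation ends with ";" before the next one begins
# (this is the semicolon WikidataFacts noted was missing after the INSERT):
INSERT DATA { ex:a ex:status ex:new } ;

# Operations execute sequentially, and variables are scoped to a single
# operation, so reusing ?s and ?o across DELETEs does not clash:
DELETE { ?s ex:old ?o }   WHERE { ?s ex:old ?o } ;
DELETE { ?s ex:stale ?o } WHERE { ?s ex:stale ?o }
```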