[09:38:34] Does Wikidata serve results to searchers based on their location? A friend and I happened to search for an entity on Wikidata and we both got different languages in the top 3 that are shown
[09:38:59] *based on their location / search history etc.?
[09:43:09] are you both using the same interface language?
[10:33:46] I don't understand the question. I opened a URL, and when I send him the same URL and he opens it, we get different languages in the panel just below the node name
[10:42:15] AdityaAS: the languages should be based on your user languages if you have them (for example I get the ones listed on my profile, https://www.wikidata.org/wiki/User:Reosarevok)
[10:42:41] I think otherwise it tries to guess what's most relevant from IP or something?
[10:43:16] If I'm not logged in I see English (the interface language), Spanish (my second setting for browser language), and Estonian and Russian (I'm in Estonia, where those are the two most common languages)
[11:10:09] are any data model experts online? I'm not *quite* sure about the meaning of the Julian calendar
[11:10:41] I always thought that "3 February 2018 (Julian calendar)" in Wikidata means that day of the Julian calendar, which is the same as 16 February 2018 in the Gregorian calendar
[11:10:49] but I remember reading that some people interpreted it instead as
[11:11:14] "3 February 2018 in the Gregorian calendar – but *on display* (e.g. in an article) it should be *converted* to the Julian calendar"
[11:11:23] was that ever cleared up?
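The correspondence asserted above (3 February 2018 in the Julian calendar = 16 February 2018 in the Gregorian calendar) can be checked with standard Julian-Day-Number integer arithmetic. This is a self-contained sketch, not Wikibase code:

```python
def julian_to_jdn(year, month, day):
    """Julian Day Number of a date in the *Julian* calendar
    (Fliegel/Van Flandern-style integer arithmetic)."""
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - 32083

def jdn_to_gregorian(jdn):
    """Convert a Julian Day Number to (year, month, day) in the Gregorian calendar."""
    a = jdn + 32044
    b = (4 * a + 3) // 146097
    c = a - 146097 * b // 4
    d = (4 * c + 3) // 1461
    e = c - 1461 * d // 4
    m = (5 * e + 2) // 153
    day = e - (153 * m + 2) // 5 + 1
    month = m + 3 - 12 * (m // 10)
    year = 100 * b + d - 4800 + m // 10
    return (year, month, day)

print(jdn_to_gregorian(julian_to_jdn(2018, 2, 3)))  # -> (2018, 2, 16)
```

The result agrees with the `"2018-02-16T00:00:00Z"^^xsd:dateTime` value that the RDF export produces for "3 February 2018 (Julian)".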
[11:14:08] well, it looks like the RDF export converts "3 February 2018 (Julian)" to "2018-02-16T00:00:00Z"^^xsd:dateTime, so I'm going to assume the first interpretation is the correct one :)
[11:16:41] Yeah, the calendar is the one the data was saved with / is in
[11:17:07] okay thanks
[11:17:21] (the second interpretation also doesn't support calendars that can't be converted to Gregorian, I guess)
[11:54:00] does anybody know if Wikibase allows for a particular kind of property datatype, the "datatype of datatypes"?
[11:54:28] i.e. a property could have a Wikibase datatype as a value
[11:56:18] I'm asking this because the Wikidata-Toolkit treats Wikibase datatypes as possible values (they implement the same interface "Value", which is the least common ancestor of, say, MonolingualTextValue and QuantityValue)
[11:57:11] well, extensions can define their own data types AFAIK (e.g. WikibaseLexeme)
[11:57:21] but I'm not aware of anything that would provide a "data type" data type
[11:57:33] AFAIK the closest thing on Wikidata would just be "string"
[11:58:00] okay, so it's more likely that this is just a mistake, right?
[11:58:17] or just a weird architecture
[11:59:09] well, I wouldn't want to pass judgment on a library that I know nothing about :D
[11:59:14] but it sounds odd to me, yes
[11:59:14] sure :-D
[11:59:33] but I heard Wikidata-Toolkit is based on an older version of the Wikibase data model, so it could be due to that
[12:00:16] oh yeah? we definitely lack some new datatypes but I am not aware of any other drift
[12:00:36] I think the RDF serialization of the data model is outdated though
[12:01:54] perhaps that was just the thing I heard about
[16:35:02] Hey friends! I'm trying to explore the WDQS a bit today, and I'm wondering how to accomplish something...
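The Wikidata-Toolkit design described above, where concrete value classes share a least-common-ancestor "Value" interface, can be sketched in Python. Class and field names here are illustrative only, not the toolkit's actual Java API:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

class Value(ABC):
    """Least common ancestor of all Wikibase value kinds."""
    @abstractmethod
    def kind(self) -> str: ...

@dataclass
class MonolingualTextValue(Value):
    text: str
    language: str
    def kind(self) -> str:
        return "monolingualtext"

@dataclass
class QuantityValue(Value):
    amount: float
    unit: str
    def kind(self) -> str:
        return "quantity"

label = MonolingualTextValue("Douglas Adams", "en")
print(label.kind())  # -> monolingualtext
```

In such a hierarchy, making a datatype itself implement `Value` (as the toolkit reportedly does) is a design choice rather than something the Wikibase data model requires, which is what the discussion above is questioning.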
[16:35:37] I want to pull the "properties for this type" property of a particular class, then query for members of that class that are missing some of those properties
[16:36:17] I'm not sure if I can do that in pure SPARQL or if I need to pull the properties first, then check each property individually
[16:36:23] * Lucas_WMDE starts writing a query
[16:36:39] I'm already about halfway there, it's just the syntax for multiple values that I'm concerned about
[16:36:59] Lucas_WMDE: For your reference, the class I'm using is Q82955
[16:38:24] marktraceur: should be something like this http://tinyurl.com/yag5t9vj
[16:38:38] though I'm not sure if that makes sense for Q82955, "politician" as a class feels odd…
[16:38:55] I was testing my query with "galaxy" (random example from https://www.wikidata.org/wiki/Special:WhatLinksHere/Property:P1963)
[16:39:22] Hmm, OK
[16:39:54] This is significantly different from the queries I've seen in the past, thanks :)
[16:40:03] hehe, you're welcome :)
[16:40:14] #learnings
[16:40:31] (don't let the subquery distract you too much :P that's just an optimization so we don't run the label service on too many items)
[16:43:23] I'm trying to run the query with "occupation" instead of "instance of" so it would make more sense for politicians (I think), but that just gives me timeouts :(
[16:44:29] Oh, right
[16:52:20] Lucas_WMDE: Looks like I'm getting the same here
[16:53:03] Lucas_WMDE: https://crates.io/crates/wkdr https://crates.io/crates/wikibase Finally had the guts to release them to the public :D
[16:53:33] \o/
[16:53:35] * Lucas_WMDE looks
[16:55:00] And I just realized crates.io didn't find my readme :) https://gitlab.com/tobias47n9e/wikibase_rs https://gitlab.com/tobias47n9e/wkdr
[16:55:20] ah, so there is one :)
[16:56:17] Still a lot to learn with Rust and the ecosystem around it, but it is a crazy good language and does run "blazingly" fast as advertised :D
[16:56:17] ♥♥♥ for the CoC and Dedication
[16:56:59] Oh yeah. CoC important.
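The "missing properties" query discussed above can be done in pure SPARQL; the linked tinyurl query is not reproduced in the log, so the template below is my own sketch of the general shape (P1963 "properties for this type", P31 "instance of", and `wikibase:directClaim` are real WDQS terms; the exact query Lucas wrote may differ):

```python
# Sketch of a WDQS query: instances of a class that lack one of the
# class's "properties for this type" (P1963) properties.
QUERY_TEMPLATE = """
SELECT ?item ?prop WHERE {
  wd:%(cls)s wdt:P1963 ?prop .          # properties for this type
  ?item wdt:P31 wd:%(cls)s .            # instances of the class
  ?prop wikibase:directClaim ?claim .   # map the property entity to its wdt: predicate
  FILTER NOT EXISTS { ?item ?claim [] } # keep items missing that property
}
LIMIT 100
"""

query = QUERY_TEMPLATE % {"cls": "Q82955"}  # "politician", as in the chat
print(query)
```

Swapping `wdt:P31` for `wdt:P106` (occupation) gives the variant that timed out in the discussion; occupation has far more statements, which is the likely cause.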
And of course, like anyone who knew Kris, I'm very sad at the moment :(
[16:58:54] Lucas_WMDE: I would like to work on the JSON-dump parsing at the hackathon. I bet Rust could really excel at that kind of big-data processing. The library does not use threads yet, but Rust should make that pretty easy
[16:59:21] sounds good
[16:59:44] I vaguely remember hearing that Rust managed to use the borrow checker to prove thread safety
[17:01:21] oh, I just realized what wkdr stands for (I think) :D
[17:01:24] that took me a while
[17:02:12] This video is highly recommended for Rust multithreading: https://youtu.be/RTCHFlGg5wQ
[17:05:23] Oh, we have 44 million items now.
[17:05:55] And someone expanded our biggest item. https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-statements?refresh=30m&panelId=6&fullscreen&orgId=1&from=now-30d&to=now
[17:06:30] sjoerddebruin: I think that was fnielsen https://twitter.com/fnielsen/status/963716784238088194
[17:07:45] That was the 16th largest item, I think.
[17:08:17] but the first page listed on Special:LongPages hasn't been edited since 2017…
[17:08:20] But it might be the item with the most statements.
[17:08:45] and the statement count on Q21558717 matches the graph
[17:08:51] :)
[18:47:48] hi!
[18:57:18] Hello
[23:53:54] Hello, I am Julio, I wrote a Spanish movie
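The JSON-dump parsing project mentioned above relies on a convenient property of the Wikidata JSON dumps: the file is one large JSON array, but each entity is serialized on its own line (ending in a comma), so the dump can be streamed line by line and the lines parsed independently, which is also what makes it easy to parallelize. A minimal sketch, not the wkdr/wikibase crates' actual API:

```python
import json

def parse_dump_line(line: str):
    """Parse one line of a Wikidata JSON dump.

    The dump is a single JSON array, but each entity sits on its own
    line with a trailing comma, so per-line parsing works without
    loading the whole file.
    """
    line = line.strip().rstrip(",")
    if line in ("[", "]", ""):
        return None  # array framing / blank lines carry no entity
    return json.loads(line)

# A tiny made-up record in the dump's general shape:
sample = ('{"id": "Q42", "type": "item", '
          '"labels": {"en": {"language": "en", "value": "Douglas Adams"}}},')
entity = parse_dump_line(sample)
print(entity["id"])  # -> Q42
```

Because each line is independent, handing lines to a pool of worker threads (the parallelism Rust's borrow checker helps keep safe, per the discussion above) is straightforward.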