[01:35:28] Annoying work... https://www.wikidata.org/wiki/Q28475043
[01:37:26] Yep.
[01:47:58] sjoerddebruin: All hail Liné1, who is doing a great job with the WikiBio module and making the VN-template warn about Wikidata issues https://commons.wikimedia.org/wiki/Category:Lepraria_harrisiana
[01:48:09] That is great! :D
[01:50:05] The next step I want is a warning when a Commons category is connected to an item that is not P31:Q4167836 (Wikimedia category)
[01:52:05] Yeah, good idea.
[01:53:14] Want to code? :p https://commons.wikimedia.org/wiki/Module:Wikidata4Bio
[01:57:34] nah, thanks
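A minimal SPARQL sketch of the warning discussed above, runnable on query.wikidata.org: it lists items that have a Commons category sitelink but are not declared instance of (P31) Q4167836, "Wikimedia category". This only illustrates the check; it is not the actual Module:Wikidata4Bio logic, and on the full dataset it may need a tighter filter or query hints to avoid timing out.

  SELECT ?item ?commonsPage WHERE {
    # Commons sitelinks are modelled as schema:about on the page URL
    ?commonsPage schema:about ?item ;
                 schema:isPartOf <https://commons.wikimedia.org/> .
    # keep only Category: pages on Commons
    FILTER(STRSTARTS(STR(?commonsPage), "https://commons.wikimedia.org/wiki/Category:"))
    # flag items not declared as a Wikimedia category
    FILTER NOT EXISTS { ?item wdt:P31 wd:Q4167836 . }
  }
  LIMIT 50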
[02:45:53] Say sjoerddebruin, shouldn't you be off to bed?
[02:45:56] :P
[02:46:03] RO|VoiceNL: what for?
[02:46:37] You're up way too late. Children's bedtime was 8 hours ago, you know ;)
[02:48:16] Just kidding
[02:48:34] This old geezer is turning in now. :D
[02:48:45] good night sjoerddebruin
[02:52:29] even more Dutch speakers :o
[02:53:55] Ugh. https://www.wikidata.org/w/index.php?title=User%3APino%7Eeowiki&type=revision&diff=434881594&oldid=239117969
[02:54:43] :/
[06:45:22] I notice that both Wikidata:Classification noticeboard and Help:Modelling are inactive and not really linked to from anywhere. Are there any similar forums actually in use?
[10:38:59] do we know whether a link in Wikidata is a redirect?
[10:40:44] GerardM: no, and they are forbidden anyway
[10:41:03] (if you mean a sitelink)
[10:41:08] yes
[10:41:18] It's a bug or unintentional feature that people use to link page sections etc.
[11:14:38] aude: Talking with Lydia right now...
[11:15:08] aude: Please review https://github.com/Wikidata-lib/PropertySuggester/pulls - especially #179 by Marius.
[11:18:50] ok
[11:22:53] aude: Also a perfect review for a native speaker ;-) https://gerrit.wikimedia.org/r/330630
[11:23:24] aude: Otherwise, Lydia suggests you continue working with Citoid.
[15:26:17] hello
[15:27:10] no JSON dump of Wikidata today?
[15:43:38] Don't they get finished later today?
[15:45:23] Lydia_WMDE: https://gerrit.wikimedia.org/r/#/c/327905/
[16:02:09] sjoerddebruin: the first step is to create a new directory in https://dumps.wikimedia.org/wikidatawiki/entities/ and this hasn't even been done
[18:37:29] edoderoo: Are you getting a ping from https://www.wikidata.org/wiki/Wikidata:Database_reports/removed_sitelinks/nlwiki ?
[18:39:06] Probably not, as a signature is required for that.
[18:41:31] I don't know the exact logic behind the magic ping ;-)
[18:42:25] A link to the user page and then a signature (user page link and timestamp)
[19:45:58] wikimedia projects are acting up a little, is some scheduled maintenance happening?
[19:49:25] What's the most convenient way to get the correct Wikidata IDs for a large number of strings, all corresponding to film titles?
[19:49:53] o/ Lydia_WMDE
[19:49:57] Still around?
[19:50:08] I'll obviously need to do some manual disambiguation, but it would be nice to make the process as painless as possible
[19:50:29] Stuck on https://phabricator.wikimedia.org/T155828 and not sure what to do to describe the different quality levels.
[19:55:23] johtso: Yes, probably. What do you want to do exactly? Did you already have a look at the query engine?
[19:55:38] And OpenRefine?
[19:57:27] multichill: I basically have a list of film titles and want to get the relevant Wikidata ID so I can then query properties. Does OpenRefine have some built-in functionality for fuzzy searching Wikidata?
[19:57:57] halfak: I would go for coverage and completeness for labels/descriptions/statements/references
[19:58:47] Are these film titles connected to something? I helped someone the other day with OpenRefine and Wikidata. Let's see if I can find the manual
[19:59:27] multichill, hoping for something like this: https://en.wikipedia.org/wiki/Template:Grading_scheme
[19:59:56] Essentially, I think we'll need criteria (even vague or subjective) about each quality level we want to predict.
[20:00:09] This will help volunteer quality labelers be more consistent.
[20:01:11] johtso: Ah, found it. Ehm, how is your Dutch? ;-)
[20:03:32] Basically you're linking a field with the Wikidata search in OpenRefine. That seems to be working OK
[20:04:26] halfak: Maybe you can get unblocked if you make it a two-step approach? 1. What do we want to hash in? 2. What levels do we want to set?
[20:05:11] multichill, not even my work. I'm waiting for someone in Wikidata who knows what they mean when they say "quality" to write it down and talk to me about it.
[20:05:11] And maybe set the levels based on percentage? Highest class is 0.1%, next one 1%, etc.
[20:05:15] ;)
[20:05:31] multichill, hard to formalize that
[20:06:13] It sure is. Wiki projects took many years to hammer it out. https://phabricator.wikimedia.org/T127470 I guess?
[20:06:36] halfak: i'll have a look at that tomorrow :)
[20:06:51] OK, no worries Lydia_WMDE. Have a good night :)
[20:07:16] multichill, yeah. That's what I'm trying to help facilitate
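For the film-title reconciliation discussed above: OpenRefine's reconciliation feature handles the fuzzy matching, but as a rough sketch of what an exact match looks like against the query service, something like the following runs on query.wikidata.org. "The Matrix" is only a stand-in title; a real list needs the fuzzier reconciliation plus the manual disambiguation mentioned above.

  SELECT ?film WHERE {
    # exact English-label match, restricted to instances of film (Q11424)
    ?film wdt:P31 wd:Q11424 ;
          rdfs:label "The Matrix"@en .
  }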
[20:47:10] Greetings, hoping someone can point me to the right person to ping about two instances running on WMF Labs -- wikidata-suggester -- and -- wikidata-wdq-mm --
[20:47:31] we are going to have to shut them down at the end of March via https://phabricator.wikimedia.org/T143349 and the list of project members is too long to cc everyone on phab :)
[20:48:39] Lydia_WMDE: ^
[20:49:32] chasemp: hey
[20:49:46] Good evening Lydia_WMDE
[20:49:49] i'll bring it up in tomorrow's team meeting
[20:50:03] ok thanks, I'll make a quick comment and ping you on the task just to keep the narrative going.
[20:50:16] cool
[20:50:51] chasemp: wikidata-wdq-mm can most probably be shut down. magnus is the person to ask. I helped migrate that to the wdq project, where they run trusty or jessie. I'll ping him to check
[20:51:02] thanks yuvipanda
[20:54:42] chasemp: i've emailed magnus
[21:13:12] chasemp: yuvipanda: Any task related to WDQ should be on https://phabricator.wikimedia.org/T153439
[21:21:28] thanks multichill. I think the idea is I'm not sure what the task would be, and if it's just "shut it down" we can handle it on our end. I'll wait for more input
[21:42:25] I've got a newbie's question about Wikidata. Why do we have reverse properties? (part of/has part, subsidiary/parent organization, ...) Wouldn't it be simpler to model all relations in the same direction?
[21:43:30] I find it a bit tedious to write { ?a wdt:P361 ?b } UNION { ?b wdt:P527 ?a } in a SPARQL query…
[21:52:40] multichill, no ... I do not get a ping there
[22:09:30] pintoch: you can write ?a wdt:P361|^wdt:P527 ?b instead, that's shorter at least ;)
[22:09:39] ^path inverts a path, and path1|path2 unions two paths
[22:09:49] (I suppose it's also less readable)
[22:09:58] that's indeed much better
[22:10:04] but you're not supposed to need that
[22:10:18] ah yeah, I was surprised that this was needed
[22:10:19] afaik all such properties are supposed to have both statements in all cases
[22:10:38] well, in my case it does make a big difference to the results I get
[22:10:56] yeah, then someone didn't add the reverse statements :/
[22:11:20] see the "inverse property" template on https://www.wikidata.org/wiki/Property_talk:P361, it gives you a list of constraint violations
[22:11:35] ouch, 230k of them
[22:11:48] shouldn't a bot take care of filling these in? as a user I really don't want to spend my time doing that
[22:12:21] I think someone else will have to answer that, I'm not too familiar with the policy around these properties
[22:13:09] if there was a bot, it should probably duplicate the references and qualifiers on the statements? what if they are conflicting?
[22:13:16] they're a bit annoying; sometimes people do try to fill in the gaps, but it usually ends up re-adding removed statements or adding more incorrect statements or whatever
[22:13:45] I see
[22:14:17] there are also some places where it's not quite symmetrical, like we have "member of" and "part of" but the inverse is "has part" for both
[22:14:33] -_-
[22:14:41] yeah :/
[22:15:36] I don't understand how the project could get into this state given that property creation is restricted to users with privileges
[22:17:34] I think some people like to have inverse properties because without them, the connections between things are hard to see and use
[22:18:22] like there's no way for a Wikipedia page to say "find all items which have a 'member of' statement linking back to this item"; they can only use the "has part" statements
[22:18:39] that's a good point
[22:19:24] but still, there should be some system to keep the two versions in sync
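To illustrate the property-path shorthand above, here is a minimal query that runs on query.wikidata.org; wd:Q544 (the Solar System) is just an arbitrary example item. It matches things that either declare themselves part of (P361) the Solar System or that the Solar System declares as its parts (P527):

  SELECT ?part ?partLabel WHERE {
    # wdt:P361 = "part of"; ^wdt:P527 is the inverted "has part"
    ?part wdt:P361|^wdt:P527 wd:Q544 .
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
  }

And a sketch of how the constraint violations mentioned above can be listed directly: P527 statements whose target has no P361 statement pointing back. On the full dataset this kind of query may hit the timeout without the LIMIT.

  SELECT ?whole ?part WHERE {
    ?whole wdt:P527 ?part .
    # no inverse "part of" statement back to the subject
    FILTER NOT EXISTS { ?part wdt:P361 ?whole . }
  }
  LIMIT 100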