[01:09:04] Cloaks request
[10:05:07] Lucas_WMDE: was the constraint database updated recently or not?
[10:19:11] sjoerddebruin: about a month ago, https://www.wikidata.org/wiki/Wikidata:Project_chat/Archive/2017/05#Constraint_table_updated
[10:19:35] A monthly update would be nice then.
[10:19:38] no update since then, and I don’t think we’ll have another one, since constraint statements are progressing quite nicely
[10:19:46] so we should be able to migrate to those soon
[10:20:40] you can already play with them on https://wikidata-constraints.wmflabs.org/ btw (I’ll probably announce that on PC at some point too)
[10:20:50] Ah.
[10:23:45] hmm... I wonder if that will cause any problems
[10:24:31] there are already occasionally people who go around changing data because it violates a constraint, when it's actually the constraint that needs improving
[10:27:57] like the place of birth one on https://wikidata-constraints.wmflabs.org/index.php/Item:Q35 – if there were two conflicting sources and we therefore had two statements, it would be an exception, but the popup makes it sound like there should only ever be one statement
[10:29:21] hm, “the constraint might be wrong” isn’t really a possibility I even included in the “possible actions” in the help documentation (e.g. https://www.wikidata.org/wiki/Help:Property_constraints_portal/Single_value#Possible_actions)
[10:29:30] not sure if it should be added though
[10:36:06] I'm not sure what the best thing to do is :/ but I do think it should be clearer that exceptions are possible
[10:36:10] nooo I broke it
[10:36:20] I clicked edit, clicked cancel, and now clicking the ! doesn't do anything
[10:38:54] nikki: that’s a bit weird – I would’ve expected the icon to disappear, or to remain functional, but not that
[10:39:06] I'm good at breaking things :P
[10:39:07] a page reload should fix it :/
[10:44:43] does it do anything with ranks? like, deprecated-rank statements are things we know are wrong, so it would be a bit silly to complain about a single-value violation when we've already marked one as wrong
[10:45:18] nikki: no, but that’s an interesting idea
[10:45:35] not sure if that would always be correct, but it sounds good :)
[10:45:56] do you want to open a Phabricator task or shall I do it?
[10:46:41] I'll let you do it, you presumably know which tickets to connect it to and stuff
[10:47:19] alright, sure
[10:55:43] https://phabricator.wikimedia.org/T167653
[10:57:31] thanks :)
[11:03:21] I love how items can fill up with tools and bots – this item hasn't even been touched by a "human" yet: https://www.wikidata.org/w/index.php?title=Q13137772&action=history
[11:30:51] how do I get wiki technical data?
[12:16:38] Aleksey_WMDE: Are you willing to merge https://gerrit.wikimedia.org/r/347571 as a separate patch, independent of any discussion about constructors/builders?
[12:32:00] DanielK_WMDE: Lucas_WMDE: https://xkcd.com/simplewriter/
[12:36:00] frimelle: People say things about other things, but sometimes those things are wrong.
[12:36:01] I try to find out when those things are wrong, to help people write things that are more right.
[12:36:01] Because I don’t have a lot of time, I try to make a computer find out when the things are wrong, instead.
[12:36:01] Sometimes the computer is right, and sometimes the computer is wrong, too.
[16:04:35] Lucas_WMDE: do you have any idea how I could select disambiguation pages which have the same text for the label and description? I tried using the same variable for both the label and description, tried separate variables plus a filter, and tried turning the optimiser off, but it's so inefficient that I can only get about 10 at a time, and the query seems so simple that I can't think of any other approach I could use
[16:05:01] it's probably another case where I need to parse a dump >_<
[16:05:53] nikki: http://tinyurl.com/y892mxca is basically my best idea as well
[16:06:04] not sure if there’s a better way
[16:06:26] bah
[16:06:38] was worth a try :)
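For context, a sketch of the approach nikki describes above – binding one variable to both the label and the description so the engine has to match them exactly. This is not necessarily the query behind the tinyurl link (its contents aren't shown here), and it assumes "disambiguation page" means instance of (P31) Wikimedia disambiguation page (Q4167410), comparing English text only:

```python
# Minimal sketch, run against the public WDQS endpoint. As noted in the
# conversation, without a tight LIMIT this form of the query tends to
# time out, which is why parsing a dump may be the more realistic route.
import requests

QUERY = """
SELECT ?item ?text WHERE {
  ?item wdt:P31 wd:Q4167410 ;        # Wikimedia disambiguation page
        rdfs:label ?text ;           # same variable for the label...
        schema:description ?text .   # ...and the description forces equality
  FILTER(LANG(?text) = "en")
}
LIMIT 10
"""

r = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "same-label-desc-check/0.1 (example script)"},
)
for row in r.json()["results"]["bindings"]:
    print(row["item"]["value"], row["text"]["value"])
```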
[16:09:23] I wouldn't mind using dumps so much, but the data's out of date before I even start downloading
[16:15:39] frimelle: https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/Data_Model
[16:21:44] See https://www.wikidata.org/wiki/User_talk:Joy_Agyepong#Adding_multiple_incorrect_links – do we have a tool to mass-revert a user without sending them 500 pings?
[16:22:48] No bad intentions, just clueless. It's a bit under 500 edits to revert at https://www.wikidata.org/w/index.php?title=Special:Contributions/Joy_Agyepong&offset=&limit=500&target=Joy+Agyepong
[16:23:53] multichill: use hoo's mass rollback script and mark them as bot edits
[16:24:43] https://meta.wikimedia.org/wiki/User:Hoo_man/smart_rollback.js
[16:25:46] The fun thing is, you also need to revert in Mix'n'match, otherwise someone will add them again.
[16:44:39] sjoerddebruin: if you can give a hand, that would be nice. Back later
[17:16:38] Hi. My RESTBase instance isn't working and I keep getting the error "404: not_found#route"; log sample here: https://pastebin.com/N6iUXTxg
[18:37:49] !admin https://www.wikidata.org/wiki/Wikidata:Administrators%27_noticeboard#Vandalism_report_of_181.93.33.39
[19:23:23] SMalyshev: You keep coming up with all these new fun things! You don't happen to have the category tree in Blazegraph too? :P
[19:24:10] multichill: soon, soon :)
[19:24:40] https://phabricator.wikimedia.org/T157676
[19:25:12] I showed you https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Possible_paintings , right? Would be nice to be able to graph that directly
[19:25:24] also https://phabricator.wikimedia.org/T165982
[19:25:27] multichill: yes
[19:27:04] SMalyshev: Do you know the concept of over-categorization on Commons? Still looking for a service that can filter that. I have a phab ticket somewhere, but can't find it right now
[19:27:31] Oh, there it is: https://phabricator.wikimedia.org/T110833
[19:28:36] not sure I know how to algorithmize over-categorization
[19:29:01] but getting categories into the graph db is something I'm definitely looking into
[19:29:14] The sweet spot seems to be at around 8 levels. Go deeper and query times explode and junk starts showing up, go lower and you miss things
[19:30:02] And I just filtered out hidden categories.
[19:41:43] DanielK_WMDE: replied to your comment on the patch ;)
[19:41:48] is there an easy way to add references to existing statements?
[19:57:09] depends on what kind of "easy" you want. but IIRC there is a gadget. aude knows more, probably
[19:57:19] nikki: --^
[19:59:04] glorian_wd: "By default, this API only suggests properties if the item contains classifying properties (P31 or P279)" <--- that should definitely not be true.
[19:59:47] if it is true, either it's a bug, or someone decided to change things significantly and I did not notice.
[19:59:49] DanielK_WMDE: it's true :). You can check the table wbs_propertypairs
[19:59:50] I have a list of items/statements and I know what reference needs adding, but I don't want to do it one by one
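The channel didn't offer more than "there is a gadget" for nikki's question, but for the "list of items/statements" case a short pywikibot script is one way to avoid adding references one by one. A minimal sketch, assuming pywikibot is configured for Wikidata; the (item, property) pairs and the reference used (imported from Wikimedia project (P143) → English Wikipedia (Q328)) are placeholders, not anything from the discussion:

```python
# Batch-add a known reference to all existing statements for given
# (item, property) pairs. Placeholder data throughout -- substitute the
# real list and the real reference before running.
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

todo = [("Q42", "P31"), ("Q64", "P17")]  # hypothetical (item, property) pairs

for qid, pid in todo:
    item = pywikibot.ItemPage(repo, qid)
    item.get()
    for claim in item.claims.get(pid, []):
        if claim.sources:  # skip statements that already have a reference
            continue
        source = pywikibot.Claim(repo, "P143")  # imported from Wikimedia project
        source.setTarget(pywikibot.ItemPage(repo, "Q328"))  # English Wikipedia
        claim.addSources([source], summary="add reference")
```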
[20:00:07] DanielK_WMDE: hoo told me about that.
[20:00:18] And I have checked the table myself
[20:00:22] hoo has been messing with that table manually
[20:00:32] that can of course have all kinds of effects
[20:00:56] if he made the data that way, this may be true at the moment.
[20:01:03] DanielK_WMDE / glorian_wd: *kuch* random empty item https://www.wikidata.org/wiki/Q18692145 – click add, guess what I get?
[20:01:12] but is it true by design? is it guaranteed to be true? if so, why?
[20:01:35] multichill: hm?
[20:01:42] multichill: instance of/subclass of, a.k.a. classifying properties
[20:02:19] "By default, this API only suggests properties if the item contains classifying properties (P31 or P279)" <- thus this statement is incorrect
[20:02:20] DanielK_WMDE: I've never known the true reason why it only works for classifying properties.
[20:02:38] The API suggested 2 properties to me on an empty item
[20:02:43] multichill: well, *no* properties at all is yet another special case.
[20:03:14] glorian_wd: it wasn't designed that way. It may be that hoo decided that without classifying properties, the quality of the suggestions was too bad.
[20:03:45] I'm getting the impression that this has been hand-tuned to a point where there is no rhyme or reason, only "seems to work OK with our current data, so let's do it like this for now"...
[20:04:05] I can certainly see how it could happen – it's hard to suggest something meaningful for an item that has no P31, except "please add P31" :)
[20:04:18] glorian_wd: so, basically: I no longer know what is guaranteed and what is not. Which means I can't help you.
[20:05:40] DanielK_WMDE: do you mean the PropertySuggester was able to suggest properties for items without classifying properties in the past?
[20:05:45] glorian_wd: however, given the issues with the different cut-off conditions in the code, I think you should be using the database directly. The meaning of the query isn't hard to understand – as I explained, it's feature vectors in an adjacency list
[20:05:57] glorian_wd: yes.
[20:06:17] huh... would that just be the most commonly used properties?
[20:06:36] nikki: that's what we used to do if there were *no* properties at all
[20:07:17] if there were properties, but no classifying properties, we'd just use the co-occurrence matrix to get the most likely ones
[20:07:33] but the results were often kind of random
[20:07:48] I guess that's why this has since been tuned
[20:07:56] * nikki nods
[20:07:58] but I don't know all the details of this, it seems
[20:08:34] glorian_wd: I'd be interested to see documentation of how this is now intended to work, and why :)
[20:08:43] it seems my knowledge has grown stale
[20:10:26] anyway, time to go home
[20:11:56] DanielK_WMDE: AFAIK, aude and hoo are knowledgeable about this specific code
[20:12:02] DanielK_WMDE: have a good evening!
[20:12:16] yes, I hope they are :)
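For the "use the database directly" suggestion, a rough sketch of reading wbs_propertypairs as a co-occurrence adjacency list. It assumes the PropertySuggester table layout (pid1, qid1, pid2, count, probability, context) and a context value of 'item' – both are assumptions about the extension's schema at the time, so check the version you actually run. Connection details and the input property IDs are placeholders:

```python
# Rank candidate properties for an item by summing the co-occurrence
# probabilities of the properties the item already has -- the "feature
# vectors in an adjacency list" reading of wbs_propertypairs.
import pymysql

existing_pids = [31, 21, 569]  # hypothetical: the item already has P31, P21, P569

conn = pymysql.connect(host="localhost", user="wikiuser",
                       password="secret", database="wikidatawiki")
try:
    with conn.cursor() as cur:
        placeholders = ", ".join(["%s"] * len(existing_pids))
        cur.execute(
            f"SELECT pid2, SUM(probability) AS score"
            f" FROM wbs_propertypairs"
            f" WHERE pid1 IN ({placeholders}) AND context = 'item'"
            f" GROUP BY pid2 ORDER BY score DESC LIMIT 10",
            existing_pids,
        )
        for pid2, score in cur.fetchall():
            print(f"P{pid2}: {score:.3f}")
finally:
    conn.close()
```

Note that with no existing properties the IN list would be empty (invalid SQL), which mirrors the special case discussed above: an item with no properties gets nothing from the pair table, so a separate "most common properties" fallback was used instead.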