[08:32:32] PROBLEM - puppet last run on wdqs2001 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[08:55:52] RECOVERY - puppet last run on wdqs2001 is OK: OK: Puppet is currently enabled, last run 20 seconds ago with 0 failures
[09:33:16] matej_suchanek: "warned" Muhammad Abul-Futooh again, 118 edits per minute. Dispatch is going down now; if he stops, it will go even faster.
[09:35:41] I was looking at it in the morning and said to myself: "faster is only a free fall"... maybe we can reach that :D
[09:40:35] Yesterday had half the edits of the two days before.
[09:41:26] http://wikidata.wikiscan.org/gimg.php?type=edits&date=201707&size=large
[09:41:59] interesting
[09:43:34] I actually dreamt about this :P
[09:43:54] Oh dear.
[09:43:55] I was dreaming that every time we convinced someone to stop, someone else would start
[09:44:03] rather stressful :P
[09:44:17] That's reality, I think.
[09:44:30] I'm giving Muhammad Abul-Futooh one hour to respond.
[09:47:03] inb4 “wow Wikidata barely has any edits anymore, is the project dying”
[09:47:37] It's more like people flushing toilets while there is something stuck in the pipe.
[09:47:58] nice analogy :-D
[09:48:03] :D
[09:48:47] Can't we get extra server capacity or something?
[09:57:46] sjoerddebruin: apparently, we have a war of bots here https://www.wikidata.org/w/index.php?title=Q20389306&action=history
[09:58:20] quickstatements just overwrites existing descriptions
[09:59:01] I hope Lucas_WMDE agrees this is not correct https://www.wikidata.org/w/index.php?title=Q20389306&diff=523397554&oldid=523321974
[09:59:18] yeah, Dorf should be capitalized
[09:59:24] not sure if Dorf or Siedlung is more appropriate
[09:59:35] Well, it has P31 https://www.wikidata.org/wiki/Q486972
[09:59:37] (but „im“ is correct, at least)
[09:59:47] sjoerddebruin: +1
[10:00:06] how do they know it's a village?
[10:00:16] en desc says village too…
[10:00:33] of course, added by them
[10:00:48] I think this should have been approved before it started
[10:01:11] Yeah, block please.
[10:03:11] will do
[10:09:03] done
[10:09:15] Will watch the dispatch graph carefully now...
[10:09:45] https://www.wikidata.org/wiki/User_talk:Muhammad_Abul-Futooh#Blocked
[10:11:48] "Dispatch pending" should decrease soon; "Dispatch lag" depends on what was happening on Sunday
[10:12:10] anyway, here you can see it https://grafana.wikimedia.org/dashboard/db/wikidata-edits?refresh=1m&orgId=1&from=now-1h&to=now
[10:21:44] I think we should update https://www.wikidata.org/wiki/Wikidata:Bots because the page only deals with "bots" as we know them from Wikipedia
[10:22:06] And Magnus needs to adjust his tools ASAP when he gets better
[10:23:08] I'm wondering what the bot flag is for when everyone can enter QS and go
[10:24:27] yeah, it has always struck me that QS v2 is basically a convenient way to bypass bot approval…
[10:24:47] even the legacy version...
[10:25:06] at least the legacy version does not perform edits with a bot flag
[10:25:09] so it is slower
[10:26:33] but there is an obvious need for these tools. I think they should just have some sort of staging area: once you upload your statements, it creates a publicly visible "candidate batch edit" that people can review before performing it
[10:26:54] HarvestTemplates nicely distinguishes whether you are a bot or not and throttles you to 5 seconds
[10:27:54] and then basically move the reviewing work that is being done at BRFA to that platform. The advantage is that you would directly be able to see what the bot is going to do
[10:28:49] it could also have some nice statistics about each edit batch (how many unreferenced statements? what ranks? how many constraints is it going to violate? and so on)
[10:29:11] HarvestTemplates also supports constraints, yeah
[10:29:41] but it hasn't been updated to the new version yet :/
[10:34:26] is it vandalism here?
https://www.wikidata.org/wiki/Q16297851
[10:35:47] Ainali: no longer
[10:36:28] I put a short block there because there were many IPs
[10:36:59] hi everybody, i don't understand the semantics of 'statements/claims' about properties? I checked the glossary, but still don't get it
[10:37:27] matej_suchanek: Ah, good
[10:37:54] I saw it thanks to https://twitter.com/WikiLiveMon/status/887616451145854976
[10:40:05] i resolve all properties (P) of a wikidata item (Q), and in the json response there is a huge branch 'claims' for each property?
[10:52:15] I do wish people had to get approval before mass adding labels/descriptions like that, because people keep doing it for languages they don't speak
[10:53:09] nikki: +1
[10:53:38] (at least, someone who speaks the language should say "yes, that's correct" before they add it millions of times)
[11:01:57] I also think it would make sense to have a central list of common descriptions somewhere; there are several bots plus autoedit and they don't always agree
[11:05:42] like we could have a json file which the bots and autoedit could all load, and then we would only have to update one list in one place
[11:13:07] ugh... 19,258 items with the description "dorf im Jemen"
[13:35:55] sjoerddebruin: we've got a response https://www.wikidata.org/wiki/User_talk:Muhammad_Abul-Futooh
[13:45:03] Greek and Russian look wrong there too...
[13:45:51] yeah, I'm just writing it
[15:13:11] matej_suchanek: Harej's speed isn't impacting dispatch, right, as the items don't have sitelinks. http://wikidata.wikiscan.org/?menu=live&filter=all&sort=weight&date=6&list=users
[15:18:14] hm
[15:25:06] I expect "Dispatch lag" to be decreasing more and more slowly because the tail is approaching the busiest part of the queue
[15:31:11] Dispatch pending is still going down though.
[15:31:52] it's possible it will stop as well
[15:37:54] We should just lock the site for three days?
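The central description list proposed above could be as simple as a shared JSON file keyed by item type and language. A minimal Python sketch of the idea (the file layout and the `description_for` helper are hypothetical, not any existing bot's format):

```python
import json

# Hypothetical shared file that the bots and autoedit could all load, so each
# common description is defined (and reviewed by a speaker) exactly once.
SHARED_DESCRIPTIONS = json.loads("""
{
  "village_in_yemen": {
    "en": "village in Yemen",
    "de": "Dorf im Jemen",
    "nl": "dorp in Jemen"
  }
}
""")

def description_for(kind, lang):
    """Return the agreed-on description, or None if none has been reviewed."""
    return SHARED_DESCRIPTIONS.get(kind, {}).get(lang)
```

A bot that refuses to write anything for a language missing from the file would address the "dorf im Jemen" capitalization problem at the source.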
[15:40:23] preferably
[15:40:36] It would be faster if https://www.wikidata.org/wiki/Special:Contributions/XXN-bot were quiet.
[15:41:46] Mr.Ibrahembot running again...
[15:42:51] wouldn't it be enough to just block the N most active bots instead of locking the site?
[16:21:11] No, because others will appear. Like https://www.wikidata.org/wiki/Special:Contributions/Laddo.
[16:21:14] (73 p/m)
[16:48:56] there should be some limit (in the tools or in Wikidata?) on how many edits a user can make in a minute
[16:49:50] That was proposed, but there still isn't one.
[17:14:24] I think having the limit in Wikidata itself would be tricky; you don't want saving a bunch of labels or sitelinks or whatever to take ages, and I don't really want my edits to start failing when I do a sudden burst of edits in a bunch of tabs
[17:15:11] Asking Magnus to really implement some good limiting will be the important action.
[17:15:20] so I think the tools should limit the speed, like QuickStatements shouldn't let people do thousands of edits as fast as possible
[17:16:25] Indeed.
[17:26:28] it's sad that https://phabricator.wikimedia.org/T48251 is still not implemented...
[18:27:43] Wow, 1 per minute... https://www.wikidata.org/wiki/Special:Contributions/Info-farmer
[19:03:17] Question about [[Template:Constraint:Type]]: we can use it with "relation=instance" xor "relation=subclass"; "relation=instance, subclass" is not possible. Is anyone aware why this is the case? It would be useful for some properties to allow both relations.
[19:03:17] [1] https://www.wikidata.org/wiki/Template:Constraint:Type
[19:48:02] DanielK_WMDE_: if you have some time, https://gerrit.wikimedia.org/r/#/c/362596/ (and the subsequent https://gerrit.wikimedia.org/r/#/c/362606/) are waiting for review
[19:53:44] Is this clear enough? I don't have dates, sadly. https://www.wikidata.org/wiki/Q28054860#P734
[21:01:00] SMalyshev: oh, right!
[21:01:03] ping me again tomorrow
[21:01:11] ok )
[21:01:12] rfc meeting coming up now...
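The client-side throttling argued for above (tools spacing out their own edits rather than Wikidata rejecting bursts) can be sketched in a few lines. This is an illustration only, with made-up names and a made-up 12-edits-per-minute figure; it is not QuickStatements' or HarvestTemplates' actual code:

```python
import time

class EditThrottle:
    """Space edit requests evenly so a client never exceeds a per-minute cap.

    The clock and sleep functions are injectable so the behavior is testable
    without real waiting.
    """

    def __init__(self, per_minute=12, clock=time.monotonic, sleep=time.sleep):
        self.interval = 60.0 / per_minute  # minimum seconds between edits
        self.clock = clock
        self.sleep = sleep
        self.last = None  # time of the previous edit, if any

    def wait(self):
        """Block until enough time has passed since the previous edit."""
        now = self.clock()
        if self.last is not None:
            remaining = self.interval - (now - self.last)
            if remaining > 0:
                self.sleep(remaining)
                now = self.clock()
        self.last = now
```

A tool would call `throttle.wait()` before each API write; the first edit goes through immediately, and subsequent ones are delayed to keep the average rate under the cap.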
[21:03:25] DanielK_WMDE_: is there one on IRC? I don't see anything happening there...
[21:05:58] SMalyshev: /join #wikimedia-office
[21:06:29] oh dammit, my window scroll was frozen :)
[21:47:20] Does anyone know how to identify Wikidata editor tools in revision comments? Would a "#" be used?
[21:57:14] hall1467: i believe revision tags would be used. they are stored separately.
[21:59:46] Okay, so there is no way of identifying all tools through revision comments? Do you know where revision tags are stored?
[22:05:04] hall1467: in the change_tag table
[22:05:22] Okay, thank you!
[22:05:34] but they may be attached only to recentchanges entries, not revisions. that table is a strange hybrid
[22:05:42] you can get them via the api, too
[22:13:26] Okay, looks like the change_tag table does not have all revisions in it. So I'll have to use the API
[22:23:28] Dispatch still going down...
[22:46:38] DanielK_WMDE_: https://phabricator.wikimedia.org/T171107 <- was the JsonUnitStorage class moved recently?
[22:57:55] SMalyshev: eek!
[22:58:04] not that i'm aware of...
[22:58:20] DanielK_WMDE_: indeed. Looks like the class was renamed, but the config wasn't updated
[22:58:46] I'm not sure what the right way to go is - should we just do both at the same time?
[22:58:59] shit...
[22:59:03] a class alias would work
[22:59:04] In general, we should be a bit more careful about moving such classes around
[22:59:10] but i wonder how best to make people aware
[22:59:32] we could put a warning into the class docs, but how does the class know?
[22:59:46] i suppose we know which classes are likely to be referenced from config
[22:59:56] putting a warning into the documentation would be good.
[23:00:18] can we grep the config for classes that are referenced there?
[23:00:34] yeah. So what do you think would be the best way to unbreak it now? class alias or config patch?
[23:01:20] whatever we can get out more quickly. config patch, i guess.
[23:01:45] plus a ticket for identifying classes that are likely to be referenced in config, and adding a warning
[23:01:47] well, a config patch will break wmf.9...
[23:01:54] it is either one or the other.
[23:02:25] ah, we need b/c
[23:02:28] alias, then
[23:02:37] ok, I'll make an alias patch then
[23:02:55] * DanielK_WMDE_ is on his way to bed and in no shape for emergency patching
[23:03:01] SMalyshev: thank you!
[23:03:33] aude: if you are around, maybe you can help get the patch rolled out
[23:03:52] SMalyshev: i'm unclear on how aliases currently work with autoloading in wikibase.
[23:04:02] may need a line in composer.json
[23:04:35] hmm, maybe; I don't know how that would work with the autoloader, indeed
[23:04:42] hi
[23:06:03] aude: hey, good to see you!
[23:06:30] DanielK_WMDE_: but if we change composer.json, it'll need a new build, no?
[23:06:44] not even sure how it works :(
[23:06:45] seems we moved a class and broke compat. the class name is referenced from config. https://phabricator.wikimedia.org/T171107
[23:07:10] aude: we need to add a class alias. do you remember how to make that work with autoloading?
[23:07:29] I am close to just putting class_exists into the config for now...
[23:08:04] hehe, nasty, but would work.
[23:08:12] we could list both classes in the config
[23:08:22] how?
[23:08:38] have both in the array?
[23:08:51] yeah, but ObjectFactory doesn't support that, does it?
[23:09:01] we need a quick fix, this broke the train
[23:09:11] oh, maybe it doesn't work
[23:09:47] seems wikibase doesn't use class aliases any more
[23:10:15] aude: if we add an alias and add a line to composer.json, can we just patch that into the build?
[23:10:20] or do we need a fresh one?
[23:11:04] class_exists maybe isn't terrible
[23:11:58] there is also $wmgVersionNumber
[23:12:21] check out https://gerrit.wikimedia.org/r/#/c/366480/3/wmf-config/Wikibase-production.php - would this work?
[23:13:01] not promising...
https://www.google.de/search?q=composer.json+class_alias
[23:13:05] hopefully the autoloader is configured by that point
[23:13:23] iirc class_alias + composer = trouble
[23:13:39] i think that works
[23:14:17] class_exists is ok as a temporary solution in a case like this
[23:17:40] config changes are scary because they are hard to test...
[23:18:50] SMalyshev: we'll learn all about it when Tim starts to make core PSR-4 compatible...
[23:20:41] DanielK_WMDE_: we can pull the change onto the canary servers to test
[23:21:09] https://wikitech.wikimedia.org/wiki/X-Wikimedia-Debug#Staging_changes
[23:24:46] aude: thanks!
[23:25:58] when will we know if it worked?
[23:27:08] we are deploying it now and checking on the canary
[23:27:11] seems ok
[23:30:37] looks like it doesn't break wmf.9, and test.wikidata seems to be ok too, so maybe it also doesn't break wmf.10
[23:31:46] thanks SMalyshev
[23:31:46] hmm, we aren't logging the wiki name when logging fatal errors, are we?
[23:32:01] no
[23:32:27] i just search for 'Wikidata' or a specific search term in kibana (e.g. JsonUnitStorage)
[23:34:07] oh wait, test.wikidata doesn't actually have conversions enabled, so we won't know if it works on wmf.10 until we try to upgrade
[23:36:30] living dangerously...
[23:37:14] i think we can check the config on terbium
[23:37:36] * aude needs to regain ssh access (https://gerrit.wikimedia.org/r/#/c/363180/) and then can look
[23:38:19] aude: ok, that'd be great.
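The two fixes debated above, a backward-compatibility class alias versus a class_exists() check in the config, can be illustrated in Python rather than PHP (PHP's class_alias() has no direct Python equivalent, but binding the old name to the new class object shows the same pattern; all names here are hypothetical stand-ins, not Wikibase's actual classes):

```python
class JsonUnitStorage:
    """Stand-in for a class that was renamed/moved while config still
    referenced the old name."""

    def __init__(self, path):
        self.path = path

# Fix 1, the "alias" approach: keep the old name resolvable so existing
# config that references it continues to work after the rename.
UnitStorage = JsonUnitStorage  # old, deprecated name

# Fix 2, the config-side fallback, analogous to the class_exists() hack
# discussed above: try each candidate name and use the first that resolves.
def resolve_class(*candidates):
    """Return the first class name that resolves in this module."""
    namespace = globals()
    for name in candidates:
        if name in namespace:
            return namespace[name]
    raise LookupError("none of %r found" % (candidates,))
```

The alias fixes the problem once, in the code; the fallback fixes it per call site, in the config, which is why it only made sense as a stopgap while both wmf.9 and wmf.10 had to keep working.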