[06:32:33] !gs ALVARO MOLINA VALE BASURAAAA.. LALA [06:32:35] !gs ALVARO MOLINA VALE BASURAAAA.. LALA [11:01:45] sjoerddebruin: Do you have op on this channel? Can you remove the cluster ban? See https://lists.wikimedia.org/pipermail/wikimedia-l/2016-October/085191.html [11:03:26] Just realized I don't have op here so I can't do the workaround myself [11:03:38] multichill: yes I am, but we've had bans like these before AFAIK. [11:04:05] Someone using a proxy spamming all over Wikimedia's channels. [11:04:52] I think "C" knows more about them, he seems to be online. [11:08:01] sjoerddebruin: We invite our users to join this channel on https://www.wikidata.org/wiki/Wikidata:Project_chat and then it doesn't work [11:08:39] Yeah, but we don't have the tools to combat this spamming. As you can see, we have two people currently opped. That indicates that there was a recent incident. [11:09:49] Looks like the spammer was klined within a minute [11:09:53] That's effective enough [11:10:33] And it's easy enough to make it a muted ban instead of a full one [11:11:11] http://wm-bot.wmflabs.org/browser/index.php?start=09%2F30%2F2016&end=10%2F01%2F2016&display=%23wikidata [11:17:51] Let's ban all IPs from editing the Wikimedia projects while we're at it. No, the cure is worse than the disease [11:19:03] I'm trying to look up the cause, one moment please. [11:25:57] No response either. Okay, I will remove it for now. When the ban is lifted I will add it back, as I don't know what the effects without the cluster would be. [11:26:46] * nikki grumbles at the sparql thing not supporting POST and therefore not being able to run queries that can't be expressed concisely [11:28:33] Hm, slow ChanServ :P [11:57:25] Why is MUL not in the language dropdown? I've tried searching for the text that shows up after the string, but it's hard to remember. [12:11:32] hoo: every time I add statements to properties, I get sad because there are no suggestions.
:( [12:12:48] I can guess [12:13:28] But I don't think it's very progressive to complain to you every day. :P [12:13:28] but given people decide to just store numbers for entities in the suggester, fixing this takes a bit of effort :/ [13:53:52] PROBLEM - Response time of WDQS on wdqs1001 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [13:54:11] meh [13:54:50] :P [13:58:12] someone is apparently sending garbage :/ [13:58:52] RECOVERY - Response time of WDQS on wdqs1001 is OK: OK: Less than 5.00% above the threshold [120000.0] [14:10:22] hoo: Garbage? It doesn't seem to be very responsive :-( [14:10:40] multichill: Well, I saw many queries there which weren't SPARQL [14:10:50] maybe someone made a mistake in a script [14:11:03] so that it sends a variable name instead of the value or some such [14:11:20] How did you see that? Can't you see the user agent? [14:11:51] I only looked at the journal on wdqs1001 very briefly [14:12:10] Normal queries are all giving 5xx errors now.... [14:12:12] no user agents there, just the Java exceptions we also show to the user [14:12:35] Maybe we should have a simple regex on the webserver that filters out most non-SPARQL junk? [14:13:24] Request string should always start with SELECT and maybe a couple of lines of # comment? [14:13:34] hm [14:13:44] well, it should at least contain SELECT, I guess [14:14:15] Just a simple junk filter could keep a lot of stuff from hitting the application [14:16:39] gehel and SMalyshev are the folks in charge, right? [14:18:10] yeah [14:24:02] hoo: Can you have a look at https://www.wikidata.org/wiki/Q21535072 ? Is the calendar correct there?
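The "simple regex on the webserver" idea discussed above could look roughly like this. This is a hypothetical pre-filter, not anything actually deployed in front of WDQS; it follows the suggestion that a request should start with SELECT (or another SPARQL query form), possibly after # comments and PREFIX declarations:

```python
import re

# Hypothetical junk filter for a SPARQL endpoint: reject request bodies
# that cannot possibly be SPARQL before they reach the application.
# Allows leading "# ..." comment lines and PREFIX declarations, then
# requires one of the SPARQL query forms.
QUERY_FORM = re.compile(
    r"^\s*(?:#[^\n]*\n|\s)*"          # optional comment lines
    r"(?:PREFIX\s+\S+\s+<[^>]*>\s*)*"  # optional PREFIX declarations
    r"(SELECT|ASK|CONSTRUCT|DESCRIBE)\b",
    re.IGNORECASE,
)

def looks_like_sparql(query: str) -> bool:
    """Cheap sanity check; real validation still happens in the engine."""
    return QUERY_FORM.match(query) is not None
```

A filter like this would have rejected the "variable name instead of the value" garbage mentioned above without costing the backend anything.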
[14:26:24] I see that pywikibot sets everything to https://www.wikidata.org/wiki/Q1985727 [14:27:24] multichill: Well, Nürnberg had the Gregorian calendar when he died [14:29:33] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [14:32:02] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [14:34:53] wdqs is slow as fuck yeah [14:35:12] nikki is probably having problems too, hence her Phab task [14:37:42] commented on the task [14:38:24] auxiliary_matcher is Magnus' mix-n-match [14:39:14] sjoerddebruin: Can you poke him to have a look at the queries his tool sends out? [14:39:35] but the volume of those is rather low… so I doubt that's the actual cause [14:39:42] maybe just a coincidence [14:41:01] * gehel is late to the party... [14:41:21] multichill: yep, gehel and SMalyshev are the ones who should be looking after wdqs... [14:43:27] hoo: Tweeted him, two days ago he was on holiday. [14:50:47] I wasn't having problems with 502s/504s earlier, I was trying to do queries with lists of ids, which very quickly hit the URL length limit [14:51:10] which reminded me of the problem with the button not getting re-enabled that I'd had before [14:51:38] Isn't it possible somehow to POST, nikki? [14:51:50] but I haven't run a query for a while so maybe it's not working for me now either :P [14:52:00] no, the documentation says that will return a 403 :/ [14:52:28] Grmbl [14:52:35] yep [14:52:47] yeah… POSTs are used for updates [14:53:07] so we don't allow them for users [14:54:12] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [14:56:16] sjoerddebruin: I'm going to add some missing country of citizenship claims. I'm checking that date of death > inception of the country. That should catch mistakes, right? [14:57:16] multichill: AFAIK yes, thanks for thinking about that.
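The sanity check multichill describes (only add a country of citizenship claim when the date of death falls after the inception of the country) can be sketched like this. A hypothetical helper, not BotMultichill's actual code:

```python
def country_claim_plausible(death_year, country_inception_year):
    """Only add a country of citizenship claim when the person died
    after the country came into existence. When either date is unknown,
    postpone the claim (as the bot did for Q22236302) rather than guess.
    Years are plain ints; None means the date is not known."""
    if death_year is None or country_inception_year is None:
        return False  # not enough information yet, skip for this run
    return death_year > country_inception_year
```

The None branch is why items without a known date of birth/death are simply picked up on a later run once the dates exist.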
[14:59:08] We really have to figure out how to deal with older painters and how to connect these to a country [14:59:11] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [14:59:32] But I'm doing that like with programming: just postpone it and fix it later ;-) [14:59:55] Maybe https://www.wikidata.org/wiki/Property:P2348, like I said before? [15:01:52] But that's for time? Do you want to create a new similar property? [15:03:16] Well I saw things like https://www.wikidata.org/w/index.php?title=Q120122&type=revision&diff=277628494&oldid=277354655 [15:05:04] ok. That covers the when, but not really the where [15:22:23] PROBLEM - Response time of WDQS on wdqs1001 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [15:27:14] RECOVERY - Response time of WDQS on wdqs1001 is OK: OK: Less than 5.00% above the threshold [120000.0] [15:51:42] PROBLEM - Response time of WDQS on wdqs1001 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [15:51:46] sjoerddebruin: Ok. Implemented it and it seems to work well, see https://www.wikidata.org/wiki/Special:Contributions/BotMultichill [16:01:43] RECOVERY - Response time of WDQS on wdqs1001 is OK: OK: Less than 5.00% above the threshold [120000.0] [16:01:50] multichill: great [16:10:51] sjoerddebruin: Take for example https://www.wikidata.org/wiki/Q22236302 . It didn't add the country because date of birth/death was not known. Next run it will be added [16:30:41] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [16:35:28] Thanks, Wikidata. Can't remove a statement while logged out, due to an edit filter, and can't log in due to 'hijacking'?
https://usercontent.irccloud-cdn.com/file/HQGokipD/IMG_4039.PNG [16:35:52] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [16:37:27] Josve05a: tried restarting your browser? [16:38:01] it's on my iPhone... can't really restart the browser... I tried force quitting it, and reloading the tab/page etc... nope... [16:38:07] had to walk to my computer... [16:38:51] Hm, weird. It's only prevention, no need to worry about hijacking. [16:39:10] And damn, don't we have a retina logo yet? [16:39:13] yeah, but it's still annoying... [16:40:34] Nope... sjoerddebruin https://usercontent.irccloud-cdn.com/file/pYIxl17Y/IMG_4040.PNG [16:40:41] Ewwwwww [16:41:32] Code doesn't seem hard: https://phabricator.wikimedia.org/rOMWCb2588062c2eba8c4f89a887e5a5d94f3aea7c3cc [17:03:43] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [17:08:42] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [17:12:29] hoo|away: interesting suggestions. https://www.wikidata.org/wiki/Q602300 [17:20:40] sjoerddebruin: Weird… "Use" and "topic's main category" could cause trouble [17:20:57] The last one, because of the same reason again. [17:21:05] *now* I got a 504 from the query service [17:21:13] 400k uses, on different subjects. [17:21:54] If Lydia is ok with it, we can remove it [17:22:44] But I think P17 (country) is still causing more trouble than this. [17:36:23] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [17:41:23] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [17:57:13] AlexZ: Oh, you're back. Care to respond in #wikipedia-nl to the message I sent you earlier?
[18:36:56] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [18:41:57] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [19:00:42] Jonas_WMDE: I saw your additions, but it's still showing some defects. The section header stays in edit mode, and when you paste a reference on a statement in edit mode it stays in edit mode too. [19:09:37] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [19:14:38] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [19:37:03] hi, how many flooder edits per minute are allowed on Wikidata? [19:38:33] !admin [19:42:04] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [19:42:24] doctaxon: see the last paragraph of this section: https://www.wikidata.org/wiki/Wikidata:Bots#Bot_accounts [19:47:04] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [20:07:04] sjoerddebruin: Want to help make a property proposal for me? *puppy eyes* [20:07:34] Josve05a: you know this channel is publicly logged? ;) Of course I can help with making proposals. :) [20:07:51] sjoerddebruin: okay, 60 per minute? [20:08:09] sjoerddebruin: See https://www.wikidata.org/wiki/User:Josve05a/aftonbladet :) [20:08:52] doctaxon: I would say 30 [20:09:06] "Use maxlag=5 (5 seconds). This is an appropriate non-aggressive value, set as default value on Pywikibot. Higher values mean more aggressive behaviour, lower values are nicer." [20:09:22] sorry, as a German native speaker I don't understand these cryptic pages [20:09:51] well I've been trying my best to find the relevant information too [20:09:57] Josve05a: what do you need?
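The maxlag rule quoted above works because the API refuses the request (instead of performing it) whenever replication lag exceeds the given value, so a well-behaved bot waits and retries. A minimal sketch of that retry loop for a hand-rolled client (pywikibot handles this automatically, with maxlag=5 as its default; `do_request` here is a hypothetical callable returning the decoded JSON API response):

```python
import time

def call_with_maxlag(do_request, max_retries=3, backoff=5):
    """Retry an API call while the server reports a maxlag error.
    `do_request` performs one request (with maxlag=5 in its parameters)
    and returns the parsed JSON response."""
    for attempt in range(max_retries):
        response = do_request()
        if response.get("error", {}).get("code") != "maxlag":
            return response  # success, or an unrelated error
        time.sleep(backoff)  # replication is lagged: back off, retry
    return response
```

Backing off like this is what makes "nicer" lower maxlag values nicer: the more lagged the servers are, the less work bots send them.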
[20:10:33] sjoerddebruin: Formatting the proposal to fit that damn template... it takes me about 30 min to add all the stuff in the correct parameters... [20:10:49] 5 to 10 here. :P [20:12:03] that's faster than me :) [20:12:14] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [20:12:31] what problem is that? [20:13:47] nothing relevant, I think [20:15:10] BotMultichill edits about 1 time per second [20:16:07] I read that in recent changes [20:16:52] Josve05a: https://www.wikidata.org/w/index.php?title=User%3AJosve05a%2Faftonbladet&type=revision&diff=382984940&oldid=382982814 [20:17:13] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [20:17:23] yay, thanks sjoerddebruin ! :D [20:17:45] <3 [20:25:50] sjoerddebruin: Hmmm...? https://www.wikidata.org/wiki/Q27063508 [20:26:31] do we have items for our properties? [20:27:13] Not the only one AFAIK [20:31:03] https://www.wikidata.org/wiki/Property:P1629 needs to be inverted imo, and it feels weird to connect it with the newspaper item [20:31:25] It's even a constraint, 825 violations... [20:32:42] https://www.wikidata.org/wiki/Wikidata:Property_proposal/Aftonbladet_topic_ID [20:36:24] I will send the bill later. [20:36:57] sjoerddebruin ok I will check [20:39:00] Lydia_WMDE: How about adding something like this link https://www.wikidata.org/wiki/User:Josve05a/dupes to the newsletter to work on this month? [20:39:25] Don't think you need approval for that. [20:39:40] Perhaps not... but I'd rather ask than not... [20:41:24] https://www.wikidata.org/w/index.php?title=Wikidata:Status_updates/Next&diff=382989060&oldid=382937636 :) [20:42:33] I've added "Help translate or proofread pages in your own language!"
months ago, not sure if it helped [20:44:48] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [20:46:29] is there a list of like the "top 10/top 100" items (like most views of all wikis combined/averaged based on number of site links...) [20:48:24] You can do that per page with https://tools.wmflabs.org/langviews/, but no top list :( [20:48:41] :/ ok [20:49:44] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [21:12:36] hm... I've selected some items with sparql and want to select all the other statements the items have, but I can't figure out how to do that, any ideas? [21:13:55] I tried "?item ?prop ?val" but then ?prop is all sorts of predicates and I can't figure out how to limit it to only p: or wdt: ones [21:14:16] nikki: [] wikibase:directClaim ?predicate: http://tinyurl.com/zv5gd8w [21:14:27] wikibase:claim for p:, see https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Properties [21:14:49] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [21:19:07] thanks :) [21:19:25] now if only it would give me some results and not a 504 error [21:19:38] :/ [21:19:50] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [21:19:58] WDQS having a rough day… [21:34:26] wooo! some results! 
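nikki's problem above, restricting `?item ?prop ?val` to direct statements, is solved by the `[] wikibase:directClaim ?prop` pattern from the RDF dump format page (`wikibase:claim` does the same for p: predicates). A sketch of building such a query in a script; the item IDs are placeholders:

```python
def direct_claims_query(item_ids):
    """Build a SPARQL query returning all wdt: (direct) statements of
    the given items. The extra triple binds ?prop only to predicates
    that some property entity declares via wikibase:directClaim,
    filtering out labels, sitelinks and other non-claim predicates."""
    values = " ".join(f"wd:{qid}" for qid in item_ids)
    return f"""SELECT ?item ?prop ?value WHERE {{
  VALUES ?item {{ {values} }}
  ?item ?prop ?value .
  [] wikibase:directClaim ?prop .
}}"""
```

With long item lists this query body is also why POST would help: GET URLs hit the length limit, as mentioned earlier in the log.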
:D [21:36:31] <3 [21:44:57] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [21:49:55] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [22:12:30] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [22:17:30] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [22:25:26] hmm... Is anyone here great with queries? Is it possible to generate a list of articles about people (Q5) with VIAF statements, which have an sv sitelink, where the svwp article does not have the template {{Auktoritetsdata}} on it [22:25:27] [3] https://www.wikidata.org/wiki/Template:Auktoritetsdata [22:39:10] Josve05a: I don’t think the template part is possible in WDQS, but here’s a query for the rest: http://tinyurl.com/zbq8y9z [22:39:50] for the last part, you’d probably have to retrieve the article and check the templates on it programmatically (pywikibot or whatever) [22:39:53] thanks... I'll use that somehow with AWB then :) [22:40:03] AWB? [22:40:18] aude: About? Do we have problems with wikidata creating error logs still? [22:41:00] WikidataFacts: enwp.org/WP:AWB [22:41:29] ah, okay, thanks [22:42:34] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [22:43:18] Reedy: you mean stuff not appearing in the fatal log? [22:44:12] o_O https://phabricator.wikimedia.org/T147122 [22:45:23] aude: Yeah.. [22:45:28] Can't see that in fatal or exception :( [22:45:32] on fluorine anyway [22:47:34] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [22:51:26] aude: Is Flow supposed to have created the archive 1 pages?
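For Josve05a's question a few lines up (people with VIAF statements whose svwp article lacks {{Auktoritetsdata}}), WikidataFacts suggested fetching each article and checking the template programmatically. A hypothetical check on already-fetched wikitext; it does not resolve template redirects:

```python
import re

def lacks_template(wikitext, template="Auktoritetsdata"):
    """True when the article wikitext does not transclude the template.
    Matches {{Auktoritetsdata}} as well as {{auktoritetsdata|...}}
    (template names are case-insensitive in their first letter, and
    we accept any casing here for simplicity)."""
    pattern = re.compile(
        r"\{\{\s*" + re.escape(template) + r"\s*[|}]",
        re.IGNORECASE,
    )
    return pattern.search(wikitext) is None
```

Running this over the articles returned by the WDQS query would yield the list; the WDQS half and the wikitext half have to stay separate because the query service cannot see template transclusions.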
[22:52:11] i can't find the fatal, but found something related [22:52:31] flow probably creates the archive but i'm not sure [22:53:14] I haven't ever poked around in flow enough to know [22:53:38] i don't have it installed locally and haven't enabled it yet for my talk page [22:53:58] heh [22:57:10] oh... I bet I know what this is about [22:58:15] I think this is related to T147098 [22:58:15] T147098: Exception when getting TTL export of a deleted entity - https://phabricator.wikimedia.org/T147098 [22:59:17] Which may be fixed by https://gerrit.wikimedia.org/r/#/c/313626/ ... but maybe not, because FlowException doesn't override report() [23:00:46] but this is not an MWException afaik [23:00:52] but could be related [23:01:05] class FlowException extends MWException [23:01:31] oh [23:01:56] Flow\Exception\InvalidInputException [23:02:02] but probably extends MWException [23:02:05] class InvalidInputException extends FlowException [23:02:09] ok [23:02:17] https://github.com/wikimedia/mediawiki-extensions-Flow/blob/458fd8907a72f684d84e3af4f3cf9ca9e4b207cb/includes/Exception/ExceptionHandling.php [23:02:25] and then the not-in-logs part is related [23:02:42] that was part of what Stas tracked down yesterday [23:03:01] ok [23:03:27] my patch isn't going to fix it though, because FlowException doesn't fully overload report() [23:03:47] I'll put some notes on the bug [23:05:16] what a shit exception [23:05:36] maybe the error is easy to reproduce locally, if i enable the Flow extension [23:05:36] hmm [23:05:36] thanks [23:07:21] $wgOut? [23:08:28] function isLoggable() { return false; } [23:08:40] ... why do we have support for that? [23:08:54] * bd808 tries to look away before getting mad [23:08:56] o_O [23:09:06] maybe some of the HttpErrors?
[23:09:13] yeah, related [23:09:29] that exception is supposed to change the http status code to 400 [23:09:33] and not get logged [23:09:38] https://github.com/wikimedia/mediawiki-extensions-Flow/blob/458fd8907a72f684d84e3af4f3cf9ca9e4b207cb/includes/Exception/ExceptionHandling.php#L163 [23:10:18] so I think basically it's for signalling something to an ajax call [23:10:58] oh [23:11:07] Aaron's change breaks all of that Flow exception stack :/ [23:11:18] we probably need to roll it back somehow [23:11:52] but we can't just blindly revert without reverting a bunch of the db refactoring he's done recently too [23:12:35] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0] [23:13:49] :/ [23:14:13] we're still on wmf18? [23:14:34] wmf20 [23:14:35] wmf20 -- https://tools.wmflabs.org/versions/ [23:15:14] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [23:18:41] aude: I'm pretty sure you can safely remove #wikidata from that bug. It's a Flow + core problem [23:20:26] probably [23:46:28] https://www.wikidata.org/wiki/User_talk:Aude :( [23:46:38] i enabled flow, then disabled it for my talk page [23:57:11] heh [23:57:16] At least it's reproducible [23:57:29] easily, though not sure how to correctly set up flow locally [23:57:44] vagrant role enable flow; vagrant provision [23:57:55] true [23:58:16] but i'm tethering and don't want to use all my data :/ [23:58:47] you can look at the puppet code to see all the config they setup [23:59:47] now it works
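The pattern the log is picking apart above, exceptions that set their own HTTP status and opt out of logging via isLoggable(), looks roughly like this in outline. This is a Python analogue of the linked Flow PHP classes, purely illustrative; the names are hypothetical:

```python
class SignallingError(Exception):
    """Analogue of Flow's FlowException: carries an HTTP status for the
    client and an is_loggable() flag the top-level handler consults."""
    http_status = 500

    def is_loggable(self):
        return True


class InvalidInputError(SignallingError):
    """Bad user input: tell the client (400), skip the error logs."""
    http_status = 400

    def is_loggable(self):
        return False


def handle(exc, log):
    """Top-level handler: report to the client, log only when asked to."""
    if isinstance(exc, SignallingError) and not exc.is_loggable():
        return exc.http_status  # signal the ajax caller, stay quiet
    log.append(repr(exc))
    return getattr(exc, "http_status", 500)
```

The trouble described in the log comes exactly from this split: once the generic handler stops calling the exception's own report() path, the status-code and don't-log behaviour silently disappears.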