[00:00:28] I moved the action==upload check first
[00:00:33] it's going to be the fastest
[03:11:32] Krinkle: Left a reply at https://gerrit.wikimedia.org/r/#/c/98869/
[07:56:37] is someone else receiving double email notifications from (it.)wikivoyage? one from Echo and one from core
[07:58:02] also, is Echo using oldid=prev instead of diff=next just for the love of inconsistency? sigh
[09:02:28] hoi ... the OpenDyslexic font is enabled on Wikidata but not on pl.wikipedia ... as I understand it the OpenDyslexic font should have been configured for use everywhere ... ????
[09:02:47] what am I missing?
[12:26:27] Reedy_: https://bugzilla.wikimedia.org/show_bug.cgi?id=57972
[12:51:31] ankry: Pasting URL without related question. ;)
[16:38:07] andre__: click me, click me!!
[16:38:55] jeremyb, where?
[16:42:49] 04 12:51:31 < andre__> ankry: Pasting URL without related question. ;)
[16:42:57] ah :)
[17:01:27] just got a timeout from wmflabs - anybody else?
[17:03:26] Trying to run https://tools.wmflabs.org/xtools/pcount/index.php?name=Robsinden&lang=en&wiki=wikipedia
[17:04:48] Oh, and http://status.wikimedia.org/ doesn't presently include wmflabs - will it ever?
[17:08:42] Wow - just got this unsightly mess from the pcount call above: Warning: Cannot unset offset in a non-array variable in /data/project/xtools/public_html/counter_commons/HTTP.php on line 126 Warning: Cannot unset offset in a non-array variable in /data/project/xtools/public_html/counter_commons/HTTP.php on line 127 Warning: Invalid argument supplied for foreach() in /data/project/xtools/public_html/
[17:08:44] counter_commons/HTTP.php on line 131 Notice: Undefined index: 1 in /data/project/xtools/public_html/pcount/counter.php on line 199 Notice: Undefined index: 1 in /data/project/xtools/public_html/pcount/counter.php on line 207 Notice: Undefined index: 2 in /data/project/xtools/public_html/pcount/counter.php on line 199 Notice: Undefined index: 2 in /data/project/xtools/public_html/pcount/counter.php
[17:08:45] on line 207 Notice: Undefined index: 10 in /data/project/xtools/public_html/pcount/counter.php on line 199 Notice: Undefined index: 10 in /data/project/xtools/public_html/pcount/counter.php on line 207 Notic
[17:08:49] and lots more.
[17:08:51] eek
[17:17:41] lexein: Maybe the long text should go into a bug report
[17:18:36] It was transitory, and I of course don't have any sort of other dump to verify. Shall I file anyways? (Sorry if I annoyed anyone)
[17:19:27] Possible
[17:19:33] It might just be a timeout issue
[17:24:13] filing
[17:28:38] thank you lexein
[17:35:49] filed https://bugzilla.wikimedia.org/show_bug.cgi?id=57988
[17:38:00] Thanks, lexein!
[17:38:12] Complete enough?
[17:39:11] Looks like it to me, but I may not be the best person to ask :) Coren, look good to you?
[17:39:13] I made an assumption that it was a tools problem, and not a pcount problem. Sorry if wrong.
[17:39:38] If it's not, Coren can help you push it somewhere else, I bet.
[17:39:40] It does; it's a DDOS in progress, looks like.
[17:39:52] Fun times
[17:40:32] ohcrap; it has to be a botnet. I see stuff coming from 181 different /8s
[17:42:10] ~1000 requests/m from some 90k IPs, most only once, but all related.
[17:42:31] All to heavy scripts, all walking down a long recursive list of links.
[17:49:24] added all the error msgs as an attachment - not needed now, I see. Happy hunting!
[17:53:08] oh shit
[18:48:57] Eenteresting. wmflabs is going reachable/unreachable like a crazy monkey
[18:50:16] lexein: DDOS ongoing, see -labs
[18:50:24] Coren: Right?
[18:50:27] damn
[18:50:33] thx for the tip
[18:51:05] marktraceur: Yep. Shut down http to stem the flood. https unimpacted.
[18:51:22] getting timeouts from https
[18:51:24] lexein: Also, "chaos monkey". :-)
[18:51:45] * lexein excuses self for ign'ance
[18:51:50] lexein: Works fine from here, snappy too. :-)
[18:52:02] lexein: What exact URL is giving you trouble?
[18:52:08] (Might just be the tool itself)
[18:52:33] on VPN from Toronto, CA - https://tools.wmflabs.org/xtools/pcount/index.php?name=Robsinden&lang=en&wiki=wikipedia times right out
[18:52:56] * Coren looks into it
[18:53:25] I'll switch my exit node to San Jose, CA
[18:53:29] (other CA)
[18:53:51] No, I see timeouts with that tool as well. Its lighttpd might have been impacted by the DDOS while in progress.
[18:53:51] brb
[18:53:55] * Coren restarts them.
[18:56:01] Hm. Times out while "connecting to tools.wmflabs.org"
[18:56:44] When you say "restarts them" - is that the whole farm?
[18:56:54] yeesh
[19:38:16] yurik: is it *really necessary* to have this here? it's quite inappropriate for mediawiki.org https://www.mediawiki.org/w/index.php?title=Extension:FlaggedRevs&diff=0&oldid=810809
[19:39:23] if you give me a tiny bit of context I can find a page on wikitech (like, is this to add a wiki, update code, set up FR for the first time on a wiki...?)
[19:39:24] Nemo_bis, well, I have spent a very long time looking for this info, as it was far from obvious. Where would be a good place to store Wikimedia-related setup?
[19:40:11] that info is how to add FlaggedRevs to a Wikimedia wiki - in that case I was adding it to meta.wikimedia.org
[19:40:47] orly, how come I lost this piece of news? :)
[19:41:38] Nemo_bis, it's only for the Zero namespace :)
[19:42:55] yurik: ah too bad, I proposed it also for the Grants namespace (they're using a completely flawed workflow right now)
[19:43:03] * Nemo_bis crosses fingers
[19:44:54] yurik: your case looks covered by https://wikitech.wikimedia.org/wiki/Heterogeneous_deployment#Install_a_new_extension_on_a_wiki -> https://wikitech.wikimedia.org/wiki/How_to_do_a_schema_change#sql.php
[19:44:55] Nemo_bis, now that it has been deployed to meta, we can expand it to another namespace much more easily :)
[19:45:15] as if anything was easy when speaking of grants :P
[19:45:48] hehehe :)
[19:46:05] i'm talking about the technical aspects of doing schema changes :)))
[19:46:12] thanks for that page btw!
[19:46:17] there are no FlaggedRevs-specific instructions on wikitech, probably it's always been handled by Aaron and/or the usual suspects; if there's something special about its schema, it could be added there (the other commands you posted look standard)
[19:48:20] anyway, it's not that important, it's nice that you add docs and I was just trying to get the most out of them :) but it's a wiki ;) someone can move them any time
[21:22:41] Coren: So if I have the autodesk.js script somewhere on enwiki instead of labs, can I re-enable the wikidata search script?
[21:23:06] autodesc*
[21:23:21] legoktm: That would almost certainly solve the immediate issue, yes.
[21:23:38] Although the /better/ solution would be to only load that script at all from the search page.
[21:24:00] it's funny because I pointed this out last night: https://en.wikipedia.org/wiki/MediaWiki_talk:Wdsearch.js#Don.27t_load_external_JS
[21:24:02] right
[21:24:05] I can do that too
[21:24:11] It's also used to complement noarticletext
[21:24:31] Still should be possible
[21:24:44] Nemo_bis: Ah, that I didn't know. But yeah, if it were a gadget the ResourceLoader would help a lot.
[21:25:16] * legoktm waits for labs to load so he can grab the source of the script...
[21:26:09] legoktm: isn't the file world-readable from tools-login?
[21:26:42] yes, that was my second option
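Coren's suggestion above, loading the script only where it is actually useful, can be sketched in a few lines. The snippet below is not the real MediaWiki:Wdsearch.js (which is only linked in this log), just a minimal, hypothetical example of gating an on-wiki user script to Special:Search results and to nonexistent pages (the noarticletext case Nemo_bis mentions), using standard mw.config variables.

```javascript
// Hypothetical sketch, not the actual Wdsearch.js code.
// Assumes an on-wiki user-script context where mw and jQuery ($) are defined.
$( function () {
    var onSearchPage = mw.config.get( 'wgCanonicalSpecialPageName' ) === 'Search';
    var onMissingArticle = mw.config.get( 'wgNamespaceNumber' ) === 0 &&
        mw.config.get( 'wgArticleId' ) === 0;

    // Bail out everywhere else, so ordinary page views trigger no extra
    // requests to Labs or anywhere else.
    if ( !onSearchPage && !onMissingArticle ) {
        return;
    }

    // ...the Wikidata lookup and result rendering would go here...
} );
```

Gating on wgArticleId === 0 is what would let the same script keep complementing noarticletext while staying silent on every existing article.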
[21:28:21] hrmmmm, no Ryan_Lane...
[21:29:01] i think he was looking for something about verifying tags (or result of a fetch?) for trebuchet?
[21:29:10] > Make foo^{tag} to peel a tag to itself, i.e. no-op., and fail if “foo” is not a tag. git rev-parse --verify v1.0^{tag} would be a more convenient way to say test $(git cat-file -t v1.0) = tag.
[21:29:14] from https://blogs.atlassian.com/2013/12/whats-new-git-1-8-5/
[21:33:59] Coren: is https://dpaste.de/aOaG/raw ok?
[21:34:11] it will only load when needed, and pulls from enwiki
[21:34:19] * apergos looks
[21:36:06] "Rebase saw some polish:" > hm, it's still all in English!
[21:36:28] if we have issues it will have to get turned off again
[21:36:44] yes
[21:36:45] looks ok to me though
[21:36:49] but it shouldn't hit labs at all anymore
[21:36:55] except for people clicking through to reasonator
[21:38:07] which should be a lot fewer people
[21:38:13] Coren? ^^
[21:38:44] i'm going to gadgetize it later tonight
[21:38:54] Should be okay, but probably better to wait a bit until tools restabilizes.
[21:39:19] And yes, people actually clicking through is okay by design. :-)
[21:39:45] 61 of the last 1000 requests are the bad one
[21:40:01] so we might be stable-ish
[21:43:10] ok, well just poke me when I can re-enable it
[21:43:42] and sorry for making this mess even worse :((
[21:48:22] legoktm: by switching to https? well, surely you helped identify the source of the problem :P
[21:48:33] heheh yeah >.>
[21:51:39] actually it threw us off tbh
[21:51:49] but it was pretty entertaining :-D
[21:55:18] so, should be stable enough now? most pages are loading for me in a few seconds
[22:01:37] legoktm: Yep. I've been keeping an eye on it for a while first, but things seem to be full of happies now.
[22:03:05] awesome
[22:03:29] https://en.wikipedia.org/w/index.php?title=MediaWiki%3AWdsearch.js&diff=584588998&oldid=584578225
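The dpaste snippet and the final diff to MediaWiki:Wdsearch.js are only linked above, not quoted, so their exact contents are not in this log. As a rough illustration of legoktm's "pulls from enwiki" description, loading a helper script from an on-wiki page over HTTPS instead of from tools.wmflabs.org could look like the sketch below; the page title MediaWiki:Autodesc.js is an assumption made for the example, not a confirmed location.

```javascript
// Hypothetical sketch of loading the description helper from enwiki over
// HTTPS rather than from tools.wmflabs.org; the page title is assumed.
mw.loader.load(
    'https://en.wikipedia.org/w/index.php?title=MediaWiki:Autodesc.js' +
    '&action=raw&ctype=text/javascript',
    'text/javascript'
);
```

Served this way, ordinary readers would not hit Labs at all; only people who click through to Reasonator would generate Labs traffic, which matches the "okay by design" comment above.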