[00:04:19] wmflabs.org is chucking security warnings at me now...
[00:07:16] i keep running into
[00:07:17] Not Found
[00:07:18] The requested URL /w/index.php was not found on this server.
[00:07:18] Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
[00:07:30] once i try a few times it works
[00:07:40] p858snake|l: what domain?
[00:07:46] jeremyb_: wmflabs.org
[00:07:48] he said
[00:07:56] Reedy: which never worked?
[00:08:00] afaik
[00:08:07] i thought he meant a subdomain
[00:08:15] well clearly yes
[00:08:26] it doesn't even resolve
[00:08:39] legoktm: what domain for you too
[00:08:50] enwiki
[00:08:54] using https
[00:09:35] The requested URL /w/index.php was not found on this server.
[00:10:17] Got that twice from https://en.wikipedia.org/w/index.php?title=RC4&curid=25831&diff=543711773&oldid=539790940
[00:10:54] legoktm: get the headers when it happens?
[00:11:09] sure
[00:11:36] umm
[00:11:40] how do i do that in firefox?
[00:11:56] Ctrl+Shift+K
[00:12:03] headers: https://gist.github.com/stwalkerster/2bdaf678b9e1f0eb5fa5
[00:12:13] Hi.
[00:12:21] I'm intermittently getting 404s on en.wikipedia.org via HTTPS.
[00:13:40] Susan: You're late to the party
[00:14:55] And yet.
[00:15:05] Also, surely I'm fashionably late.
[00:17:20] what's up with the intermittent 404s? Have they been reported here?
[00:17:40] Same here.
[00:17:45] Magog_the_Ogre: 3-4 people before you :)
[00:17:55] My bots just corrupted as a result of those.
[00:17:58] Susan was fashionably late. the rest of you are really late
[00:18:05] corrupted?
[00:18:17] yeah my bot is going haywire too
[00:18:22] same
[00:18:36] 'Missing dependency: `( < MediaWiki 1.15)`
[00:18:42] I'd recommend you suspend your bots until this is fixed
[00:18:50] MediaWiki 1.15?
[00:19:30] yeah it's probably because it's 404'ing so the bot can't grab the MW version
[00:19:54] Mine can't read the api
[00:20:03] Just wait?
[00:20:40] At first I thought I bugged up my framework while updating it.
[00:21:28] i've gotten intermittent 404's on two different pages in the last few minutes
[00:21:34] welcome to the club
[00:21:37] Yep.
[00:21:37] I'm getting a series of 404 errors on pages that I know exist
[00:21:39] You're late to the party, Emw :)
[00:21:51] i always am
[00:21:52] Really late.
[00:22:01] Cncmaster, welcome to the club. We're all here to complain.
[00:22:23] the first rule of complaining club is: no complaining
[00:22:27] :)
[00:22:38] thanks for the reports, btw
[00:23:13] If you guys are going to break everything, at least warn us first. :p
[00:23:26] Or keep the API running.
[00:23:52] Cyberpower678: That's not helpful.
[00:24:12] Susan, it was joke.
[00:24:29] Oh. In that case, that's not funny. ;-)
[00:25:00] Test wikipedia is completely 404ed
[00:25:07] Nothing is working there.
[00:25:53] Cyberpower678: we don't need more reports of more sites. *all* sites should be at least sometimes broken now.
[00:25:58] EVERYONE PANIC NOW
[00:26:08] * jdelanoy runs around screaming
[00:26:32] * Cyberpower678 is lighting the torches and waving it around in panic.
[00:28:26] en.wikipedia is working. test is still down. :D
[00:30:16] .....aaaand we're back up again. Yay.
[00:33:24] finally
[00:33:41] And my bots are functioning again.
[00:40:53] gn8 folks
[01:02:22] was mediawiki updated today?
[01:02:28] no
[01:02:33] there something wrong in uk.wikipedia
[01:02:49] what's wrong?
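For readers following the intermittent-404 thread above: capturing the status line and response headers (as legoktm was asked to do) can also be scripted instead of done by hand in the browser console. The sketch below is illustrative only; the probe helper, the retry count, and the target URL are assumptions for demonstration, not anything used in the log.

    # Hedged sketch: retry the failing URL a few times and print the status
    # and headers whenever a non-200 response comes back, so the output can
    # be pasted into a gist or bug report.
    import urllib.error
    import urllib.request

    def probe(url, attempts=5):
        for attempt in range(attempts):
            try:
                with urllib.request.urlopen(url) as resp:
                    status, headers = resp.status, resp.getheaders()
            except urllib.error.HTTPError as err:
                status, headers = err.code, err.headers.items()
            if status != 200:
                print(f"attempt {attempt}: HTTP {status}")
                for name, value in headers:
                    print(f"  {name}: {value}")

    probe("https://en.wikipedia.org/w/index.php?title=RC4&diff=543711773&oldid=539790940")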
[01:03:03] sorting in category: http://uk.wikipedia.org/wiki/Категорія:Станції_Південно-Західної_залізниці
[01:03:20] yesterday it worked fine
[01:03:29] after fixing bug 45444
[01:03:34] but now it is broken
[01:04:05] Well, see how it is in 24 hours or so
[01:04:10] there should be headers for each letter
[01:04:19] but I look only letter Y there
[01:04:20] there's another script going to be run on it to fix some other issues
[01:04:47] and other categories wrong too
[01:14:08] LeslieCarr: that discussion reminded me of https://rt.wikimedia.org/Ticket/Display.html?id=4143 :)
[01:22:52] in pl.wiki is similar issue: http://pl.wikipedia.org/wiki/Kategoria:Prezydenci_Wenezueli
[01:22:59] category sorting is broken
[01:24:09] Ahonc: i doubt the answer will change. ask again in 24 hours
[01:24:27] (won't change tonight that is)
[01:25:13] jeremyb_: :)
[01:30:08] and similar issue in pt-wiki
[01:30:08] http://pt.wikipedia.org/wiki/Categoria:Presidentes_por_país
[01:30:27] all these wikis had changed collation in categories
[01:31:33] Ahonc: i hope you don't want an answer?
[01:43:45] I don't want answer. I want hear that problem is fixed :)
[01:48:32] Ahonc: so, set an alarm for 86000 secs from now
[01:49:06] :)
[01:57:08] has been any mw update? on pt.wp, cats are having non latin letters, and the alphabetic order is a mess: http://pt.wikipedia.org/wiki/Categoria:!Predefini%C3%A7%C3%B5es_da_p%C3%A1gina_principal
[01:58:29] I asked similar question above
[01:58:44] ya, that's what i've just noticed
[01:58:53] such problems are at least in pl.wiki and uk.wiki too
[01:59:13] is there any open bug?
[02:43:46] Alchimista: yes, It's due to a update to the category sorting system (so we can actually do it properly in some languages) so there is a script running and updating causing the "appearance" of some mess whilst it does its magic
[03:20:52] I'm not convinced that it is just a temporary update problem
[03:22:46] no, I really don't think so
[03:44:36] TimStarling: what are you referring to?
[03:44:55] spose I should dig up the backlog
[03:45:15] the garbage in the h3's on https://pl.wikipedia.org/wiki/Kategoria:Prezydenci_Wenezueli
[03:45:25] I'm most of the way to fixing it now, I hope
[03:46:56] pretty unicode characters! :)
[04:08:04] Alchimista, robla, p858snake: should be fixed now
[04:22:39] fixed for me
[04:22:50] after a hard refresh
[04:24:38] ditto
[09:36:07] hashar: got a minute for a gerrit oddity? I can't merge https://gerrit.wikimedia.org/r/#/c/46686/
[09:36:11] i just don't get the button
[09:36:24] even though it sais "Can Merge: Yes"
[09:36:30] sure
[09:36:35] is jenkins supposed to auto-merge this?
[09:36:45] or is it because oif jeroen's (outdated) -2?
[09:37:51] yeah I guess Jenkins attempted to submit the change
[09:37:56] but it got blocked caused of CR-2
[09:38:06] hrm
[09:38:12] ok, i'll remove and re-add jeroen :)
[09:38:17] ;-)
[09:38:24] Zuul should gives out a message about it
[09:38:37] hashar: but if CR-2 blocks, why does it say "Can merge: yes"?
[09:38:41] that's just broken :/
[09:39:36] hashar: seems like if there's a CR+2 and a CR-2, it *sais* it can merge, but doesn't let you.
[09:39:53] na can-merge just mean the code could potentially be merged
[09:39:58] regardless of the code review status
[09:40:08] something like : " it is technically possible to merge the change"
[09:40:10] o_O
[09:40:11] really?
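An aside on the category-collation thread above (the uk/pl/pt category pages with missing or garbled letter headings): a category listing derives its per-letter headings from the stored sort keys, so while a migration script is rewriting those keys, a single page can mix old- and new-style entries and the headings come out wrong until every row has been updated. The toy sketch below only illustrates that mechanism; the key values and grouping logic are simplified assumptions, not MediaWiki's actual implementation.

    # Toy illustration (assumed behaviour, not MediaWiki code): members are
    # grouped under a heading taken from the first character of their sort key,
    # so an opaque binary collation key produces a garbage heading.
    from itertools import groupby

    def render_headings(members):
        # members: (sortkey, title) pairs, already sorted by sortkey
        for letter, group in groupby(members, key=lambda m: m[0][:1]):
            print(f"== {letter} ==")
            for _, title in group:
                print(f"   {title}")

    # A row already converted to a new-style (binary) key sits next to a row
    # still carrying an old-style human-readable key:
    render_headings(sorted([
        ("Pérez, Carlos", "Carlos Pérez"),   # old-style, readable sort key
        ("\x01\x87\x42", "Marcos Pérez"),    # stand-in for a binary collation key
    ]))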
[09:40:14] that's confusing
[09:40:41] anyway, thanks for looking
[09:40:50] here is Gerrit message:
[09:40:51] error: blocked by Code-Review
[09:40:52] one or more approvals failed; review output above
[09:40:57] and yes, it would be good if it would tell me that it was unable to merge for some reason
[09:40:58] that is only in the debug log though :(
[09:41:04] :/
[09:41:35] anyway, merged it now
[09:41:37] thanks!
[09:44:57] DanielK_WMDE: opened bug https://bugs.launchpad.net/zuul/+bug/1154507 :-]
[09:54:31] hashar: thank you
[11:18:09] !log reenabled mail delivery to wikibugs
[15:20:20] Can a techie give me some help?
[15:20:44] Cyberpower678: if you describe your problem I can try
[15:20:50] I may not know the answer though.
[15:21:03] createaccount on API seems to hangup/
[15:21:25] When I give it the parameters and the token, it creates the account and stops.
[15:21:48] It's definitely not an issue with the program.
[15:22:10] When did this start? Were you able to use it successfully before?
[15:22:10] sumanah, any ideas
[15:25:19] sumanah, since the beginning. I just didn't notice at first. When I use createaccount to get the token, the API returns results. When I then use the createaccount, it successfully creates the account and hangs up, causing my bot to hangup.
[15:28:44] Cyberpower678 define "hang up"
[15:29:25] api are being used over the http protocol, its connection is terminated after every request
[15:32:07] petan, all API queries return something. The createaccount function doesn't.
[15:34:17] petan, when the apiQuery function is running in my framework, it gets stuck there. All other queries work fine.
[15:38:20] petan, according to the documentation, a successful account creation should return something along the lines result=success.
[15:42:48] CP678|iPhone OK if it doesn't return anything it probably crashed :o (you should get 500)
[15:43:06] you should never get a blank page
[15:43:58] O_o https://bugzilla.wikimedia.org/show_bug.cgi?id=43998#c1
[15:44:29] lol
[15:44:42] haha
[15:44:55] Nemo_bis we needed moar bots :D
[15:45:06] petan, But it's giving me a blank. My bot is waiting for a return reply and the API is just not giving it anything.
[15:45:27] Cyberpower678 ok if it's giving blank page, then it's bug in mediawiki -> bugzilla is what you want :P
[15:45:30] !bugzilla
[15:45:30] https://bugzilla.wikimedia.org/$1
[15:45:32] meh
[15:45:36] without $1
[15:45:51] !bugzilla
[15:45:51] https://bugzilla.wikimedia.org/
[15:45:52] :P
[15:45:54] here we go
[15:46:07] 8 lines for a link
[15:46:10] :D
[15:46:17] lazyness is responsible
[15:46:22] not me
[15:47:05] petan, I did a thorough search for a bug and it always stops at the query command for createaccount.
[15:47:11] Thanks for the link.
[16:38:08] Wikimedia Language Engineering team office hours in #wikimedia-office in about 20 mins
[17:09:34] Wikimedia Language Engineering team office hours started in #wikimedia-office
[18:33:51] IRC borken too :/
[18:46:46] engineering - have any deploys happened in the last hour ?
[18:47:49] something is broken
[18:49:44] hexasoft: yes, we are investigating in #wikimedia-operations
[18:50:06] ok
[18:53:57] things are very slow
[19:20:23] anyone knows if the cause is an increase of viewers?
[19:21:29] addshore: can you stop addbot on plwiki please? we have some performance issues
[19:25:05] AaronSchulz: ping
[19:38:05] Habemus Luam :D Thanks !
[20:29:08] binasher: hm?
[20:30:09] AaronSchulz: hey
[20:30:26] AaronSchulz: did asher find you or should I do a preliminary recap?
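A note on the createaccount thread above (the API call that "creates the account and stops"): whatever the server-side bug turns out to be, a client can at least refuse to hang on a blank reply. The sketch below is a generic defensive wrapper under assumed names; it is not Cyberpower678's framework nor MediaWiki's own API client, and the endpoint URL and timeout are illustrative.

    # Hedged sketch: give every API call a timeout and treat an empty body as
    # an error, so a blank response from api.php fails fast instead of
    # blocking the bot indefinitely.
    import json
    import urllib.parse
    import urllib.request

    API_URL = "https://test.wikipedia.org/w/api.php"   # illustrative target

    def api_post(params, timeout=30):
        data = urllib.parse.urlencode({**params, "format": "json"}).encode()
        with urllib.request.urlopen(API_URL, data=data, timeout=timeout) as resp:
            body = resp.read()
        if not body:
            raise RuntimeError("api.php returned an empty response body")
        return json.loads(body)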
[20:30:46] I just got back from Chaat, and talked to rob a bit
[20:35:33] AaronSchulz: https://graphite.wikimedia.org/render?from=-2months&until=now&width=800&height=400&target=drawAsInfinite%28deploy.scap%29&target=*.-total.count&yMax=&logBase=10&uniq=0.5222163577696607
[20:36:16] that's the rate of backend php requests
[20:36:51] i think the first big increase starts with this deploy - February 16 01:11 logmsgbot: reedy synchronized wmf-config/filebackend.php
[20:38:19] Blame the ginger.
[20:38:48] though maybe not!
[20:39:25] but today we noticed that a pretty high percentage of requests hitting the apaches are action=render requests against commons sent by mediawiki from FileRepo
[20:41:01] a pope related traffic spike caused some site disruptions but the increase in apache requests over the last few weeks meant we had less capacity for it
[20:41:09] binasher: so wont that change result in internal https requests from apache to apache to render commons description pages on foreign wikis?
[20:41:26] maybe it did that before though
[20:41:38] * AaronSchulz looks at $urlprotocol
[20:43:23] * binasher steals AaronSchulz for a site performance team
[20:44:05] seems like that's not new, though still wonky nonetheless
[20:45:09] Wiki wonkiness.
[20:45:21] AaronSchulz: am i right in thinking mediawiki is sending some of those requests to https://commons?
[20:45:24] it used to be $urlprotocol = '' before that, so I can't see Reedy's change hurting much
[20:45:53] AaronSchulz: does that split the memcached keys?
[20:46:00] binasher: yes, if the main requests itself was https
[20:46:11] $key = $this->repo->getLocalCacheKey( 'RemoteFileDescription', 'url', $wgLang->getCode(), $this->getName() );
[20:46:24] greg-g: due to the site outage, the mobile team needs to reschedule today's deployment - can we reschedule for tomorrow 3-5pm PDT?
[20:46:50] memcached misses there result in render request, so splitting the key space would double the requests
[20:46:57] doesn't seem to split the cache
[20:47:04] the url is not in the key
[20:48:45] awjr: yep, tomorrow 3-5 is open, I'm editing the wiki right now for some other things, so I'll update it
[20:48:53] awesome thanks greg-g
[20:49:03] thank you :)
[20:49:25] Ryan_Lane: paravoid: ^^ the above https requests to commons made by mediawiki hit the ssl servers so would increase traffic there
[20:49:34] binasher: yep
[20:49:37] that was my thought
[20:49:42] binasher: though they also do ipv6
[20:49:48] so it's possible that was the increase, too
[20:49:50] mediawiki hitting the ssl servers and squids to talk to mediawiki.. heh
[20:49:59] heh
[20:50:02] wait. mediawiki?
[20:50:07] -_-
[20:50:08] yes, mediawiki
[20:50:20] why would mediawiki go back through the entire stack?
[20:50:46] https user request -> ssl -> (plain text) squid -> mediawiki -> ssl -> (plain text) -> squid -> mediawiki
[20:50:54] hahahaha
[20:50:57] that's rough
[20:51:12] yo dawg
[20:51:14] why on earth does that happen?
[20:52:18] er, what?
[20:52:22] why mediawiki went through SSL?
[20:52:38] paravoid: because mediawiki :D
[20:52:42] lol
[20:52:47] haha
[20:52:52] you read the discussion about the much larger SSL cert?
[20:52:58] yup
[20:52:59] with numbers?
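To make the FileRepo discussion above easier to follow: the rendered description for a Commons file is cached per language and file name, and only a cache miss triggers the action=render request that then travels back through the SSL terminators and Squids to another Apache. The helper below is a rough model based on the getLocalCacheKey() line quoted in the log; the key format, function names, and URL form are assumptions, not MediaWiki's actual code.

    # Rough model (assumed, simplified): a memcached miss is what produces the
    # internal request back to Commons, so anything that splits or flushes this
    # key space multiplies backend render traffic.
    def get_description_html(cache, http_get, lang, file_name):
        # Loosely mirrors getLocalCacheKey('RemoteFileDescription', 'url',
        # lang, name); the exact key shape here is illustrative.
        key = f"commonswiki:RemoteFileDescription:url:{lang}:{file_name}"
        html = cache.get(key)
        if html is None:
            # Cache miss: this fetch goes through the full public stack
            # (ssl -> squid -> apache) when the original request was https.
            url = (
                "https://commons.wikimedia.org/w/index.php"
                f"?title=File:{file_name}&action=render"
            )
            html = http_get(url)
            cache.set(key, html, 86400)
        return html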
[20:53:58] seems like confusion around getDescriptionUrl() (which seems to give public urls, which should ssl as needed) and getDescriptionText() just reusing that (which makes sense for instant commons, not for wiki farm)
[20:54:19] I guess that was written back when pretty much everything was non-https
[20:54:20] so, multiple issues found
[20:54:30] is anyone planning to summarize those in mails?
[20:54:33] I'm already lost :)
[20:55:01] <^demon> AaronSchulz: Yes, that stuff was written a long time ago.
[20:55:12] <^demon> And hasn't changed really since ForeignApiRepo.
[20:56:10] AaronSchulz: do you think we should look for other causes behind the increase in php requests?
[20:59:14] <^demon> There's a reason it doesn't fetch directly from the database though...it wants to be parsed in the context of the source wiki.
[20:59:36] <^demon> But yes, there's better ways than going all the way through a request like that, don't disagree.
[21:00:14] It wants?
[21:00:29] it hungers for parsing
[21:00:35] wtf does the Http of all things not have profiling calls
[21:00:48] *Http class
[21:02:58] binasher: https://gdash.wikimedia.org/dashboards/filebackend/ heh
[21:04:57] yay saturated networks!
[21:06:00] lots of Memcached error for key "commonswiki:file:87fb66e3a88146210c175706a2f0c807" on server "10.64.0.190:11211": SERVER HAS FAILED AND IS DISABLED UNTIL TIMED RETRY
[21:06:56] probably wasn't helping the stat cache and thus the HEAD req/sec
[21:07:32] actually that would increase master traffic to
[21:07:40] DB master
[21:08:48] so the stream/stat increase suggests that lots of new thumbs were viewed
[21:08:56] I wonder how much of that work was redundant
[21:09:07] * AaronSchulz recalls mentioning pool counter and thumbnails before
[21:14:50] binasher: the graphite data seems to be flaky around this time
[21:46:01] !log payments cluster updated to 61d3f1f
[21:46:06] Logged the message, Master
[21:53:47] Ryan_Lane: Noticed the certificate discussion. You know Nagios has a build in certificate check?
[21:54:36] for validity as well as expiration?
[21:54:41] and does it check the entire chain?
[21:55:42] AFAIK you should be able to check that yes
[21:56:04] I mainly use it to hunt down about to expire certificates, but it's much more extensive I think
[22:29:22] Hi, anyone familiar with error: https://pl.wiktionary.org/wiki/eigenvector?uselang=en from within function "MathRenderer::writeDBEntry". Database returned error "1048: Column 'math_outputhash' cannot be null (10.64.16.8)".
[22:30:12] Yes
[22:30:16] Known and logged bug
[22:30:26] With activity ongoing
[22:30:43] Ok, thanks. Can anyone save https://pl.wiktionary.org/w/index.php?title=gutta_cavat_lapidem_non_vi,_sed_saepe_cadendo_/_sic_homo_doctus_fit_non_vi_sed_saepe_studendo then?
[22:31:06] Error: ERR_ZERO_SIZE_OBJECT, errno [No Error] at Wed, 13 Mar 2013 22:30:56 GMT
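A footnote on the Nagios certificate question near the end of the log: the stock check_http plugin can warn when a certificate is close to expiry (its -C option takes a number of days), though how much of the chain it validates depends on the plugin version. For a rough idea of what such a check does, here is a small stand-alone sketch; the host and the use of Python's ssl module are illustrative, and this is not the Nagios plugin itself.

    # Hedged sketch of a certificate-expiry check: connect with a verifying TLS
    # context (which also validates the chain and hostname), then report how
    # many days remain before the leaf certificate expires.
    import socket
    import ssl
    import time

    def cert_days_left(host, port=443):
        ctx = ssl.create_default_context()   # verifies chain and hostname
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
        expires = ssl.cert_time_to_seconds(cert["notAfter"])
        return int((expires - time.time()) // 86400)

    days = cert_days_left("en.wikipedia.org")
    print(f"certificate expires in {days} days")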