[00:59:31] not that there's anything for me to see there in the first place, but why did I just get a huge scary SSL warning trying to go to https://arbcom.en.wikipedia.org/wiki/Main_Page ?
[00:59:48] Not just
[00:59:59] It's been like that since we've had the new SSL system
[01:00:03] ah
[01:00:10] why?
[01:00:11] The SSL certs don't match sub-subdomains
[01:00:25] *.wikipedia.org is fine
[01:00:30] *.*.wikipedia.org is not
[01:00:33] there's no *.*.wikipedia cert
[01:00:37] you can't have a *.*. cert
[01:00:41] so why not get one?
[01:00:56] or a .*.en.wikipedia.org
[01:00:57] Pink: does not exist
[01:01:05] vendor doesn't do it
[01:01:09] wildcards can only cover one level down
[01:01:14] * aude wants unicorns :)
[01:01:15] We'd have to get *.en.wikipedia.org
[01:01:18] mutante: Didn't we get some fancy SubjectAltName cert?
[01:01:19] ^
[01:01:21] it would have to be.. yea that
[01:01:34] And *.de.wikipedia.org
[01:01:35] etc
[01:01:37] RoanKattouw: yea, with lots of domains on it, but not *.*.
[01:01:45] and wait, why does en.m.wikipedia.org work on SSL then?
[01:01:47] It'd be far cheaper to get individual certs for that
[01:01:49] Right, but we somehow forgot to add these?
[01:01:57] Pink: probably has a wildcard on m.wikipedia.org
[01:01:58] PinkAmpersand: Because we have a *.m.wikipedia.org certificate
[01:02:18] not according to Chrome
[01:02:21] like, go to https://en.wikipedia.org, click cert symbol, go to details
[01:02:30] Certificate Subject Alt Name
[01:02:36] it says it's just the .wikipedia.org cert
[01:02:38] and see all the values in the field
[01:02:53] you may have to find that field in details
[01:03:05] DNS Name=*.wikipedia.org
[01:03:05] DNS Name=wikipedia.org
[01:03:05] DNS Name=m.wikipedia.org
[01:03:05] DNS Name=*.m.wikipedia.org
[01:03:09] Yada yada
[01:03:23] yep
[01:03:26] oh ok :P
[01:03:38] There's 48 of them
[01:03:47] Note, mutante and I did look at moving those wikis before
[01:03:50] But there was problems
[01:03:52] So we stopped
[01:03:56] *there were
[01:04:29] we could also get a trusted root cert
[01:04:34] ah. it is sorta weird to have a subdomain of en.wikipedia as it is
[01:04:36] luckily there are just very few of those using 2 levels
[01:04:37] which would allow us to issue unlimited domains of any kind
[01:05:10] certs*
[01:05:29] was there anything besides arbcom where this is used?
[01:05:47] *.planet.wm.org also has its own
[01:06:13] https://bugzilla.wikimedia.org/show_bug.cgi?id=31335
[01:06:50] yea, see, the remaining ones were just arbcom
[01:06:52] https://bugzilla.wikimedia.org/show_bug.cgi?id=31335#c9
[01:06:58] and .. wg.en.wp ?
[01:07:03] Yeah
[01:07:07] I'm confused now. Why couldn't we rename those wikis?
[01:07:07] noboard.chapters.wikimedia
[01:07:20] because of external storage
[01:07:22] and db names
[01:07:46] I thought that was for "proper" renames
[01:07:58] ie completely changing it
[01:08:06] not the kind where we can just replace . with -
[01:08:11] Which we did do for one wiki?
[01:08:21] pa-us.wikimedia.org
[01:08:26] Which is closed or whatever
[01:08:31] https://wg.en.wikipedia.org/
[01:08:37] yea, true.. ehmm ...
[01:08:50] More redirects!
[01:08:50] i don't think i've ever been on "wg.en"
[01:08:53] Just what we need
[01:09:16] mutante: i have. ha!
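The point made at 01:01:09, that a certificate wildcard covers exactly one DNS label, is why *.wikipedia.org matches en.wikipedia.org but not arbcom.en.wikipedia.org. A minimal, simplified Perl sketch of that matching rule, for illustration only; it is not the browser's or server's actual validation code and it ignores SubjectAltName handling:

    #!/usr/bin/perl
    # Simplified wildcard-name matching: '*' stands in for exactly one label.
    use strict;
    use warnings;

    sub wildcard_match {
        my ( $pattern, $host ) = @_;
        my @p = split /\./, lc $pattern;
        my @h = split /\./, lc $host;
        return 0 unless @p == @h;            # label counts must be identical
        for my $i ( 0 .. $#p ) {
            next if $p[$i] eq '*';           # wildcard covers this single label
            return 0 unless $p[$i] eq $h[$i];
        }
        return 1;
    }

    for my $host (qw( en.wikipedia.org en.m.wikipedia.org arbcom.en.wikipedia.org )) {
        for my $pattern (qw( *.wikipedia.org *.m.wikipedia.org )) {
            printf "%-26s vs %-18s : %s\n", $host, $pattern,
                wildcard_match( $pattern, $host ) ? 'match' : 'no match';
        }
    }

Run like this, en.wikipedia.org and en.m.wikipedia.org each find a matching entry, while arbcom.en.wikipedia.org matches neither; hence the warning, and the need for either a *.en.wikipedia.org SubjectAltName entry or individual certificates.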
[01:09:21] there was something else that stopped us and we can't think of it right now :P
[01:09:25] I wish they could have wildcards that span multiple subdomain levels
[01:09:25] i think
[01:09:35] Probably
[01:09:40] I might try and find out again
[01:09:44] And then document it
[01:09:50] "Why are private wikis even on the WikiSet manager anyway?"
[01:09:51] what?
[01:09:59] WikiSet manager?
[01:10:00] some centralauth feature
[01:10:03] ok
[01:10:06] :)
[01:10:17] Certainly for the language code renames, we can't do without hacks and moar hacks
[01:10:26] i wish they could be moved to wikimedia.org
[01:10:40] wikipedia.org only real encyclopedia wikis
[01:11:29] http://wikimedia.org/wp/en/view/Main_Page
[01:11:57] http://wikimediafoundation.org/wiki/Wp/en/view/Main_Page
[01:12:17] https://secure.wikimedia.org/...
[01:12:20] What are we doing?
[01:12:29] Reedy: proposing better url schemes
[01:12:43] en.wiki
[01:12:43] eh, mine was just pointing out where the first one takes me
[01:12:53] i argued for http://wikipedia.org/en/ once but no one else agreed
[01:12:58] heh
[01:13:03] http://en.wiki/wiki/Wikipedia
[01:13:09] yes, soon come
[01:13:27] seriously, i talked to the guy
[01:14:22] http://wi.ki/en/
[01:14:43] felicity: we'll actually get .wiki domains
[01:14:55] en.wiki.pedia
[01:14:57] um, that would be a huge waste of donors' money
[01:14:58] lol, ssl cert problem
[01:15:15] felicity: nah, it's a donation by the guy who owns it
[01:15:56] http://icannwiki.com/
[01:16:13] Reedy: eh, maybe :)
[01:16:13] http://icannwiki.com/index.php/Talk:Main_Page
[01:16:14] sadface
[01:16:36] ICANN's and ARIN's wikis are such huge disasters it isn't even funny
[01:16:43] hopelessly spam-clogged
[01:17:45] * Jasper_Deng wonders what would happen if Wikimedia got their own tld
[01:18:01] Jasper_Deng: we were too slow, kind of
[01:18:23] and would have been a couple thousand
[01:18:28] but possible
[01:20:08] https://gtldresult.icann.org/applicationstatus/viewstatus
[01:20:58] https://gtldresult.icann.org/applicationstatus/applicationdetails/1457
[12:11:10] Do the weird cached pages mentioned in topic include redirects to foundationwiki? https://meta.wikimedia.org/w/index.php?diff=6133509&oldid=6133271&rcid=4624996
[12:12:27] yes
[12:12:41] that's about the only target
[12:25:30] Reedy: https://commons.wikimedia.org/w/api.php?action=query&list=allcampaigns gives me a, uh, 404
[12:25:36] Not Found
[12:25:36] The requested URL /w/api.php was not found on this server.
[12:25:36] Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
[12:25:44] Reedy: no served by info either
[12:26:13] sounds like default apache ErrorHandler?
[12:27:29] mutante: ^
[12:27:44] also returned after a long long time
[12:32:03] WFM
[12:32:18] Reedy: well, wfm now too
[12:32:23] intermittent issue
[12:32:38] hmm
[12:32:52] I'm not sure if apache-fast-test hits the api apaches too
[12:33:15] possibly didn't?
[12:33:33] it'll also explain a bug someone filed y'day for the commons app
[12:34:10] Could do
[12:34:17] (https://bugzilla.wikimedia.org/show_bug.cgi?id=56016)
[12:34:19] I know mutante did fix stray normal apaches
[12:36:33] By the looks of it it doesn't...
[12:36:42] Anyone know perl?
[12:36:43] ;)
[12:37:05] $servers = get_server_list_from_pybal_config(qw(
[12:37:05] /home/w/conf/pybal/eqiad/apaches
[12:37:05] ));
[12:37:33] hm?
[12:38:05] I'm presuming in $servers->{$1} = 1; $servers is an object, not an array?
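For context on the snippet just quoted: "$servers->{$1} = 1" only makes sense if $servers holds a hash reference, used as a set of hostnames, and combining the /apaches and /api pools (which comes up just below) is then a hash-slice assignment. A minimal sketch meant to sit inside apache-fast-test, assuming get_server_list_from_pybal_config() returns such a hash reference keyed by hostname; this is not the change that was actually made to the script:

    use strict;
    use warnings;

    my %all;
    for my $pool (qw( /home/w/conf/pybal/eqiad/apaches /home/w/conf/pybal/eqiad/api )) {
        # assumed to return a hash ref like { 'mw1117.eqiad.wmnet' => 1, ... }
        my $servers = get_server_list_from_pybal_config($pool);
        @all{ keys %$servers } = values %$servers;    # hash-slice merge of the pools
    }
    # iterate over the combined set just as the script does for a single pool
    for my $host ( sort keys %all ) {
        # ... run the URL checks against $host ...
    }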
[12:39:18] $servers would be a reference to a hash
[12:39:27] Need a combined thing with the entries from both /apaches and also /api
[12:40:20] "$servers{$1} = 1" would mean $servers is a hash, the arrow means it's dereferencing
[12:41:07] In the most literal way...
[12:41:08] $servers = get_server_list_from_pybal_config(qw(/home/w/conf/pybal/eqiad/apaches)) . get_server_list_from_pybal_config(qw(/home/w/conf/pybal/eqiad/api));
[12:41:21] The question being how do we join the 2 things together into one?
[12:42:05] mawwiage
[12:42:05] so then we can cry and possibly restart some more apaches
[12:42:10] ?
[12:42:11] fyi: commons api broken "During the execution of AxUserMsg, the following error occured:
[12:42:11] API request returned code 404 errorError code is Not Found"
[12:42:18] I have no idea what you're trying to do :)
[12:42:45] paravoid: Combine the results of 2 calls of get_server_list_from_pybal_config into one thing that the script would then iterate over like it does from just one thing
[12:43:00] what is the return data type of get_server_list_from_pybal_config() ?
[12:43:01] Steinsplitter: Welcome to 17 minutes ago
[12:43:02] ;)
[12:43:44] oh \O/ i see ;-)
[12:44:11] wait, is this an outage?
[12:44:17] Partial
[12:44:21] Sort of
[12:44:24] API WFM
[12:44:27] Doesn't work for others
[12:44:35] in puppet repo files/misc/scripts/apache-fast-test
[12:44:58] bottom of the script
[12:46:33] paravoid: Can you restart apache on these boxes please?
[12:46:38] reedy@fenari:~$ ./apache-api-fast-test urls pybal | grep 404
[12:46:38] mw1117.eqiad.wmnet 404 Not Found
[12:46:38] mw1126.eqiad.wmnet 404 Not Found
[12:46:38] mw1145.eqiad.wmnet 404 Not Found
[12:46:38] mw1201.eqiad.wmnet 404 Not Found
[12:46:39] mw1206.eqiad.wmnet 404 Not Found
[12:46:52] I can, but why would they need restarting?
[12:47:17] Were you around yesterday for when mutante and I sent most of *.wikimedia.org traffic to foundationwiki?
[12:47:24] no
[12:47:56] After we fixed it there were some stray apaches that seemingly hadn't restarted/picked up the reverted config
[12:48:14] the eqiad api pool isn't included in apache-fast-test (used to find said stray apaches)
[12:48:28] Which is the reason for my perl questions above
[12:48:53] Copying to ~ and editing it to use the api pybal group shows that those 5 servers are 404ing for something they should not be 404ing for
[12:48:53] restarted
[12:49:02] Steinsplitter: YuviPanda ^^
[12:49:42] Thanks
[12:50:23] the answer to your question is so very easy
[12:50:28] the perl one
[12:51:41] so I should not just replace the pybal config files by json then
[12:54:24] mark: Preferably not
[12:54:25] ;)
[12:59:03] Reedy: thanks!
[13:20:16] Reedy: do you have the bug number that tracked the docroot issue?
[13:21:18] https://bugzilla.wikimedia.org/show_bug.cgi?id=56006
[13:23:21] Reedy: ty
[15:11:58] YuviPanda: https://commons.wikimedia.org/w/api.php?action=query&list=allcampaigns is 504 for me. is that better than 404?
[15:12:54] 2nd try is 504 too
[15:13:01] https://commons.wikimedia.org/w/api.php?action=query&list=allcampaigns&uclimit=10 works
[15:13:28] 3rd try works
[15:13:40] jeremyb: 4th try?
[18:06:51] greg-g: mobile would like to use the lightning deploy window today if possible. Nothing critical, but a fix needed for our eventlogging.
[18:10:12] kaldari: sure thing, plz add to calendar on wikitech
[18:11:33] kaldari: I would but this conf wifi is sooo crappy
[18:15:00] greg-g: will do
[18:49:42] ^d: cirrus seems to have skipped indexing of several pages on it.wikt; first [[vita]], now https://it.wiktionary.org/wiki/casa
[18:49:55] <^d> Hmm
[18:50:14] I fixed [[vita]] with a dummy edit; I didn't do that here, so as not to destroy the test case; https://it.wiktionary.org/w/index.php?title=Speciale%3ARicerca&profile=default&search=edificio+che+accoglie+temporaneamente&fulltext=Search
[18:53:10] ^d: actually, I've tested 4 more random entries and I couldn't find any of them in search
[18:53:41] I think this used to work, looks almost like the index was emptied at some point
[18:53:41] <^d> Do we have a bug filed for this already?
[18:53:52] no because till now I thought it was an occasional failure
[18:56:26] <^d> Nik and I are both at conferences today. Can you file a bug so one of us will be sure to have a look at it?
[18:56:32] <^d> Don't want it to slip between the cracks.
[18:57:48] sure
[19:02:36] https://bugzilla.wikimedia.org/show_bug.cgi?id=56061
[21:38:08] ori-l: do you know https://github.com/WPO-Foundation/webpagetest ? is there any value in it?
[21:44:09] Nemo_bis: only vaguely. isn't that what httparchive is using?
[21:48:32] yes
[21:50:56] hmm http://www.webpagetest.org/result/131023_G8_1181/2/performance_optimization/#first_byte_time
[22:05:10] (only now I saw the wikitech thread)
[22:13:34] andre__: nik and chad are both at conferences atm. at least one is aware of the itwikt issue
[22:13:53] ah... I see, thanks
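Back on the it.wiktionary indexing report from 18:49: one way to reproduce it from a script rather than via Speciale:Ricerca is to ask the standard search API (action=query&list=search) whether a title comes back at all. A rough sketch, checking only the two titles mentioned above; this is not a tool anyone in the channel was actually using:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Simple qw(get);
    use JSON qw(decode_json);
    use URI::Escape qw(uri_escape);

    my $api = 'https://it.wiktionary.org/w/api.php';
    for my $title (qw( vita casa )) {
        my $url  = "$api?action=query&list=search&format=json&srsearch="
                 . uri_escape($title);
        my $json = get($url) or die "request failed for $title\n";
        # count how many search hits have exactly this title
        my $hits  = decode_json($json)->{query}{search};
        my $found = grep { lc $_->{title} eq lc $title } @$hits;
        print "$title: ", $found ? "returned by search\n"
                                 : "NOT returned by search\n";
    }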