[15:50:08] RobH: today's agenda http://office.wikimedia.org/wiki/Technical/Tech_Day_Agenda
[15:51:17] RobH: I sent you an e-mail asking about the status of various things in Nagios, think you can manage a bare-bones answer some time before 2-ish?
[15:51:33] RoanKattouw: if i wasnt in the office, easy! in office, i try.
[15:51:49] i actually have it up now
[15:51:54] I know, meetings and such
[15:52:19] That's why I was being conservative with the 2pm :)
[15:55:19] i dont recall how to make a damned nagios login
[15:55:46] the memcached can be acked and RT'd because its not critical (since they were rotated out for the actual memcached pool)
[15:55:58] I would leave the disk space stuff to stare at us in there honestly or it wont get touched
[15:56:18] (i will reply to your email as well =)
[15:56:41] Yeah good point about the disk space
[15:57:02] But should probably get an RT anyway since disk space doesn't go to CRITICAL until it hits zero
[16:17:41] RoanKattouw_away: making RT is good yea, but i wouldnt ack them in nagios
[16:19:41] Ryan_Lane: is from new orleans, his default state is 'party'
[16:24:20] RobH: :D
[18:15:38] How do I find the bug ID of the bug where MW stores %26 as &, breaking URLs?
[18:15:41] http://en.wikipedia.org/w/api.php?action=query&prop=extlinks&titles=7th_%26_Capitol_and_8th_%26_Capitol
[18:16:44] Dispenser: what context of storage here?
[18:17:06] externallinks table
[18:17:12] ah i see
[18:17:59] I use s/&(?=[^&=]*(&|$))/%26/g to fix them in my coordinate dumping tool
[18:18:11] but I want a bug ID to link to
[18:18:59] Dispenser: do you happen to know there's a bug report in bugzilla about it, and you're asking how to find it?
[18:19:04] It may have been fixed about a year ago
[18:19:23] have you searched in bugzilla already?
[18:19:30] jorm, oh teh noes!
[18:19:31] trying to...
[18:19:46] first -- can you confirm that the correct URL is in fact being generated to begin with by whatever template is building it?
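[Editor's note: Dispenser's s/&(?=[^&=]*(&|$))/%26/g workaround ports directly to a standalone function. This is an illustrative sketch, not Dispenser's actual ghel.py code; the function name is made up.]

```javascript
// Re-escape '&' characters that MediaWiki stored literally instead of
// as %26. An '&' is treated as data rather than a query-parameter
// separator when no '=' appears before the next '&' or the end of the
// string -- that is exactly what the lookahead checks.
function fixAmpersands(url) {
  return url.replace(/&(?=[^&=]*(?:&|$))/g, '%26');
}
```

A real separator like the `&` in `?action=query&prop=extlinks` is left alone, because the lookahead hits the `=` of `prop=` before finding another `&`.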
[18:20:44] grrrr slow firefox today
[18:22:46] ok can confirm that: http://en.wikipedia.org/w/api.php?action=query&prop=extlinks&titles=User:Brion%20VIBBER/test
[18:22:57] http://en.wikipedia.org/w/index.php?title=User:Brion_VIBBER/test&action=edit <- i'm using %26 explicitly in the link
[18:23:07] the link on [[7th & Capitol and 8th & Capitol]] produces links using %26 while the result in the TS db and API result in &/&
[18:25:18] i don't see an obvious bugzilla entry in my searches
[18:27:41] bug seems current
[18:28:06] flipzagging: What do you think you're gonna get done UW-wise before the all-staff?
[18:28:48] code comment: Dec 27, 2009: Seems to have been fixed, but the fix doesn't hurt
[18:29:10] IIRC a few months later it reappeared
[18:29:37] Dispenser: where's this code comment exactly?
[18:29:50] a program I wrote, ghel.py
[18:30:22] Parser::replaceUnusualEscapes() looks like the culprit to me
[18:30:34] heh. howie referred to me as "jorm" in real life.
[18:30:54] jorm: i can never even remember your real name ;)
[18:32:01] no one does.
[18:32:11] though interestingly, he pronounced it "yorm".
[18:32:16] How does jorm not leave an impression?
[18:32:32] jorm: as i would, actually
[18:32:35] Me too
[18:32:40] But that's because English is not my native language, I think
[18:32:41] gotten so used to germanic/east-european names ;)
[18:32:57] that's the correct way, but most people use the hard 'j'
[18:33:33] You mean as in 'janitor'?
[18:33:55] Dispenser: it looks to me like the bad behavior there has probably been in for a few years; it might have gone away temporarily due to some other borkage and then gotten restored by accident :)
[18:34:20] yeah.
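[Editor's note: a purely illustrative sketch (in JavaScript, not MediaWiki's actual PHP) of the fix brion is pointing at: when normalizing "unusual" percent-escapes in an external link, characters that are significant to URL parsing must stay encoded, so %26 in a query string is never decoded to a literal '&'. The reserved set here is an assumption, not Parser::replaceUnusualEscapes' real list.]

```javascript
// Characters that must stay percent-encoded, because decoding them
// would change how the URL parses (illustrative subset only).
const RESERVED = new Set([';', ':', '@', '$', '!', '*', '(', ')', ',', '/', '&', '=', '?', '#']);

// Decode needlessly escaped characters (e.g. %41 -> 'A'), but keep
// reserved ones escaped, normalizing their hex digits to upper case.
function replaceUnusualEscapes(url) {
  return url.replace(/%([0-9A-Fa-f]{2})/g, function (match, hex) {
    const ch = String.fromCharCode(parseInt(hex, 16));
    return RESERVED.has(ch) ? match.toUpperCase() : ch;
  });
}
```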
[18:34:53] Dispenser: go ahead and file a bug in the parser component; cc tstarling@wikimedia.org and brion@pobox.com and toss in a mention that i said Parser::replaceUnusualEscapes needs fixing per the code comments about treating URL components
[18:35:39] yeah i can confirm if i stub out that function, i get the expected value back
[18:35:49] so fixing that to handle query strings correctly should fix it
[19:06:46] Hi, is there a Wiki / list with the translations of the English namespaces to the different local languages or do I need to dig through the PHP code?
[19:07:19] somewhere on translatewiki one would assume...
[19:08:09] but keep in mind individual wikis can have aliases for namespaces that are not part of default mediawiki
[19:09:28] TrevorParscal: Argh, evilness. You moved CSSMin remapping and URL rewriting to *before* the extraction of image dependencies :(
[19:09:32] Gotcha
[19:09:40] You can use the API as well
[19:09:56] http://de.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces|namespacealiases
[19:10:34] Pass &format=xml or something to get it in a programmatically useful format (json, yaml also supported)
[19:10:43] awesome!
[19:10:50] thanks RoanKattouw
[19:12:41] TrevorParscal: ...which will cause ?timestamp to be appended to them, which will cause getLocalFileReferences to ignore them
[19:15:35] TrevorParscal: Coding up a quick&dirty fix for now
[19:18:50] RoanKattouw: ah, oops
[19:19:08] Was wondering why my module_deps table was almost empty :P
[19:19:31] *TrevorParscal needs more sleep
[19:26:11] Alright, got it working locally
[19:36:26] do we have a DC hack-a-ton cat?
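[Editor's note: back on the namespaces question above, RoanKattouw's siteinfo query can be consumed like this. A hedged sketch: the fetch itself is omitted, and the response shape shown (localized name under the '*' key) is the 2010-era pre-formatversion=2 JSON format.]

```javascript
// Build the siteinfo request, asking for JSON instead of the default.
function buildSiteinfoUrl(host) {
  const params = new URLSearchParams({
    action: 'query',
    meta: 'siteinfo',
    siprop: 'namespaces|namespacealiases',
    format: 'json',
  });
  return 'http://' + host + '/w/api.php?' + params.toString();
}

// Map namespace IDs to their localized names from a decoded response.
function localNamespaces(response) {
  const out = {};
  for (const [id, ns] of Object.entries(response.query.namespaces)) {
    out[id] = ns['*'];
  }
  return out;
}
```

For de.wikipedia.org this yields entries like `2: "Benutzer"` for the User namespace; the separate `namespacealiases` list covers the per-wiki aliases mentioned above.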
[19:36:30] on Commons
[19:41:02] ok we do have one now
[19:41:04] [[Category:Wikimedia_Hack-A-Ton_DC_2010]]
[19:46:29] RoanKattouw: I'm hoping to bring the code to trunk today, just need to squash those last fixmes
[19:46:55] OK
[19:47:32] hmmm I wonder how appropriate the crowdsurfing pic is for commons :p
[19:49:16] haha
[19:54:04] mdale: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/75486#c10539
[19:54:33] mdale: nobody likes your "thou shalt not extend the array prototype" commandments :)
[19:55:27] anyway, if anybody has pics, upload them: http://commons.wikimedia.org/wiki/Category:Wikimedia_Hack-A-Ton_DC_2010
[19:55:35] TrevorParscal: I understand your choice, but ahm.. can you clarify what you mean by 'remote embedding'?
[19:55:42] TrevorParscal: I don't like that either
[19:56:54] RoanKattouw: I got all the namespaces as JSON objects, thanks!
[19:58:06] mdale: can you please defend your own point here?
[19:58:27] Krinkle: basically a site that's not powered by mediawiki loading mediawiki js
[19:58:28] Diederik: Note that there are non-Wikipedia wikis too (Wikibooks, Wiktionary, ...). Not sure your study covers them but you should know they exist
[19:58:54] in those environments, our code gets mixed and matched with code that we don't know of or have control over
[19:59:01] Well, some sites use the jQuery library and if you'd copy that without loading jQuery, it breaks.
[19:59:04] apparently some of this code is garbage
[19:59:15] That's their fault for writing garbage
[19:59:21] that's my position
[19:59:32] RoanKattouw: yes, I know, our first focus is on the Wikipedias but thanks for reminding
[19:59:35] Michael was stating an opposite point
[19:59:52] I said, hey, I can satisfy michael's needs without much work, why not?
[19:59:59] there are issues with jquery getJSON
[20:00:11] but I am in a meeting, will get back to it in the code review comments.
[20:00:20] (ie internal issues)
[20:00:54] I prefer extending array, string, object, etc.
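[Editor's note: the "remote embedding" concern above is concrete. When MediaWiki JS runs alongside third-party code, anything added to Array.prototype becomes visible everywhere; a minimal demonstration:]

```javascript
// The kind of extension under debate: a helper on the shared prototype.
Array.prototype.sum = function () {
  return this.reduce(function (a, b) { return a + b; }, 0);
};

// Third-party code on the embedding page iterating an array with for-in
// (a common pattern in exactly the "garbage" code being discussed):
const keys = [];
for (const k in [10, 20]) {
  keys.push(k);
}
// keys is now ['0', '1', 'sum'] -- the inherited method leaks into the
// loop, breaking any for-in consumer that expects only array indices.
```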
Michael will need to argue his points himself, I was just being nice and supporting more use cases
[20:13:20] mdale_away: did you forget to commit the apiserver/client for iframe?
[20:16:12] thedj: maybe ;)
[20:16:46] hehe
[20:20:25] thedj: the test file is presently kaltura specific... will have to think about a fix for the "loader" / "wrapper" to work for mediaWiki embeds...
[20:21:26] its nifty in that it lets you use the html5 video api on the iframe as if it was a video tag .. but keeps the player on its own domain so all the css and javascript space does not conflict
[20:23:54] w00t, works now
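[Editor's note: a purely illustrative sketch of the idea mdale describes: the player lives in an iframe on its own domain, so its CSS/JS cannot conflict with the host page, and the host drives it as if it were a video tag by proxying calls across the frame boundary with postMessage. The message names and function are invented for illustration, not Kaltura's or MediaWiki's actual API.]

```javascript
// Wrap an iframe so the embedding page can call play/pause/seek on it
// much like on an HTMLVideoElement, without sharing any script scope.
function makeIframePlayer(iframe, origin) {
  return {
    play:  function ()  { iframe.contentWindow.postMessage({ method: 'play' }, origin); },
    pause: function ()  { iframe.contentWindow.postMessage({ method: 'pause' }, origin); },
    seek:  function (t) { iframe.contentWindow.postMessage({ method: 'seek', time: t }, origin); },
  };
}
// Inside the iframe, the player would translate messages back onto a
// real <video> element, e.g.:
//   window.addEventListener('message', function (e) {
//     if (typeof video[e.data.method] === 'function') video[e.data.method]();
//   });
```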