[09:46:42] Hi! How are m.wikipedia.org pages generated? Are they from the same MediaWiki dump as the desktop website?
[10:00:44] Anyone?
[17:57:56] Krinkle: the browser swarm is going to be fun :)
[18:30:23] alolita: Hi
[18:35:34] RoanKattouw: I gotta go for dinner, you go first :)
[18:36:05] or not..
[18:36:30] Oh, missed my ping
[18:36:39] ...and Alolita's gone
[18:36:48] prolly gone to lunch
[18:39:45] RoanKattouw: do you know if the minutes are listed somewhere?
[18:39:59] For the meeting that just ended?
[18:40:10] the features team meeting, yup
[18:40:19] http://eiximenis.wikimedia.org/FeaturesTeam20110531
[18:40:28] !pad foo
[18:40:28] --elephant-- I don't know anything about "pad".
[18:40:39] I guess I should just bookmark it
[18:40:40] oh, right. no mw-bot here
[18:41:37] !pad is http://etherpad.wikimedia.org/`e1
[18:41:37] --elephant-- Successfully added keyword: pad
[18:44:45] !ep alias pad
[18:44:45] --elephant-- You don't have permission to do that.
[18:45:07] alright, dinner's ready now
[18:45:08] cya ltr
[19:47:52] TrevorParscal: You're no longer on AFT, are you?
[20:01:49] hexmode: build time for cruise control (including the Database and Parser groups) is down to 2 minutes 28 seconds! The parser tests were run twice :b
[20:06:24] RoanKattouw: not much, but I'm always going to be around to help
[20:06:28] what's going down?
[20:06:35] Just a trivial bug
[20:06:42] And you're still the auto-assignee
[20:06:46] I'll fix that
[20:09:29] thanks
[20:47:15] *TrevorParscal laughs about the "what does the community department do?" email thread
[20:47:44] RoanKattouw: how's it going?
[20:48:09] Only just getting back into work, had been taking off the same days as you
[20:48:30] Am now fried 'cause I slept something like 1-6 last night, it's 11 now so that's hitting me
[20:48:42] the code release process rash is flaring up again... that'll be refreshing to read through
[20:49:12] I haven't caught up on wikitech-l yet
[20:49:36] I haven't read any of it but I'm glad it's being raised
[20:49:36] the "what does the community department do" thread is pretty entertaining
[20:49:40] hint: WE DON'T KNOW!
[20:49:42] I tried in Berlin but failed
[20:49:46] ha ha
[20:49:51] yeah, it gets pretty deep, man
[20:50:04] my wife saw my email queue this morning and was like "whoa!"
[20:51:51] There are many mornings I start with right-clicking and selecting "Mark Folder Read"
[20:52:54] that's probably about 99% effective
[20:53:41] most of the time it's on internal-l
[20:53:45] I actually read most things
[20:53:49] i always read the thread subjects.
[20:55:03] RoanKattouw: it's not fair, you are a robot... I don't have a gigabit ethernet port in the back of my head like you...
[20:57:16] RoanKattouw doesn't have a bed, he has a charging station
[21:02:53] alolita: Got a minute to talk in PM?
[21:03:12] Speaking of said charging station, I need to hit that up soon
[21:04:38] RoanKattouw: just remember, a quick charge never lasts as long
[21:05:33] Yeah I need a full charge
[21:07:10] have a good night!
[21:40:09] TrevorParscal: Hey
[21:40:17] howdy timo!
[21:40:23] I did some work recently on documenting functions, constructors and variables.
[21:40:28] But came across an issue
[21:40:32] What format to use?
[21:40:41] doxygen doesn't appear to have JS support (at least not in a sane way)
[21:41:02] (there are workarounds that convert JS into pseudo-C++ or Java, which is then fed to doxygen)
[21:41:18] we could go that way, but I've been told you have another syntax/doc engine that works for JS?
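
For context on the formats being weighed here: the @-tag style that Doxygen and JSDoc both build on looks roughly like the sketch below. The function, its parameters, and the comment wording are invented for illustration; this is not code from MediaWiki.

    /**
     * Get a user's display name, falling back to the raw username.
     * (Hypothetical function, invented for illustration.)
     *
     * @param {Object} user User object with a name and optional realName
     * @param {Boolean} preferReal Whether to prefer the real name when set
     * @return {String} The name to display
     */
    function getDisplayName( user, preferReal ) {
        if ( preferReal && user.realName ) {
            return user.realName;
        }
        return user.name;
    }
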
[21:52:32] hm
[21:52:51] i've never actually bothered compiling doc comments from js myself, i just use them in my editor ;)
[21:54:45] there's.... this thingy http://code.google.com/p/jsdoc-toolkit/
[21:57:42] I am not sure if I like the idea of auto-generated docs for JS. JS is a lot more free-form than languages which assume a lot about structure, like Java.
[21:57:52] Even within WMF code we write JS in at least two styles.
[21:58:33] Krinkle: JSDoc uses a syntax that is different from and incompatible with Doxygen's
[21:58:59] we use Doxygen for PHP, and most people seem to document JS as if it's going to be read by Doxygen
[21:59:21] but it never actually is, because doxygen sucks, and even PHP is poorly supported by doxygen
[21:59:25] there's something about JSDoc that didn't attract me. Couldn't get it to work, for one, and it seems to have ugly output.
[21:59:53] Maybe I'm in the minority but I never find that I want to read documentation summaries for JS. It's concise enough that reading the code with comments is fine.
[21:59:57] I've been using JSDog (for Node.js) and Natural Docs
[22:00:13] http://metajack.wordpress.com/2008/07/01/the-state-of-javascript-documentation-tools/
[22:00:20] I started writing a documentation extraction system that used PHP and JavaScript tokenizers to gather actual code structure and collect comments associated with that structure - but I have not messed with it in a while
[22:01:35] we should probably switch to natural docs since it supports all the languages we use (php, javascript, c and makefile)
[22:01:49] i don't know what the output looks like
[22:02:14] I do, it's nice.
[22:02:32] plus it supports D - whoo hooo!
[22:02:36] but I'm addicted to @'s tho, which is the only thing I dislike about Natural Docs
[22:02:37] After a few minutes of looking at it I do like Natural Docs. Less magic syntax
[22:02:49] Krinkle: uh oh we mid-aired :)
[22:02:59] neilk_: I don't have bugzilla open.
[22:03:03] well, someone just needs to make a call on that and start converting docs
[22:03:18] Krinkle: no, I was joking, our comments just cancelled each other out here.
[22:03:19] TrevorParscal: I thought you were going to introduce JSDog to your workflow
[22:03:22] it's a pretty large project, but it's also something someone with a low level of technical expertise could do
[22:03:35] and we would likely find that there's lots of code that's incorrectly documented
[22:03:38] TrevorParscal: since I saw {string} {type} somewhere.
[22:04:08] Krinkle: I was using that style of docs for my documentation scanner, yes
[22:04:14] while JS is still fairly small and going to be structured, documented and warmed up with unit test suites, it's a good time to make this kind of call.
[22:04:38] since we're going to go through all JS code to do so.
[22:05:02] I'm currently working on an "Is every public method of this object tested for in QUnit" thingy
[22:06:07] https://github.com/trevorparscal/hoverboards/tree/master/mediawiki/junkpile/deferred/libs/coda
[22:06:13] that is the lib I was writing
[22:06:23] you can send it PHP code and it will give you a JSON response
[22:06:51] it can also parse multiple files and return a response for all of them
[22:07:41] k, Natural Docs it is.
[22:07:53] *Krinkle waves magic wand.
[22:08:22] TrevorParscal: looks interesting.
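
Since the thread settles on Natural Docs ("k, Natural Docs it is"), here is a rough sketch of the same hypothetical function in Natural Docs' keyword-based syntax, which avoids the @-tags TrevorParscal admits to being addicted to. The layout follows Natural Docs' documented conventions as I understand them; treat it as an illustrative sketch rather than verified build input.

    /*
       Function: getDisplayName

       Get a user's display name, falling back to the raw username.
       (Same hypothetical function as the @-tag sketch above.)

       Parameters:

          user       - User object with a name and optional realName.
          preferReal - Whether to prefer the real name when set.

       Returns:

          The name to display.
    */
    function getDisplayName( user, preferReal ) {
        return ( preferReal && user.realName ) ? user.realName : user.name;
    }
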
[22:08:37] Is it a doc generator (i.e. a replacement for doxygen / Natural Docs)?
[22:09:04] it's the code-to-structured-representation part
[22:09:19] it could be used for semantic diffs, or documentation generation
[22:09:34] it's code structure -> json
[22:10:23] who screwed up prototype.wikimedia.org?
[22:10:45] Priyanka. She said she'd fix it in like, 10 minutes.
[22:10:50] ok
[22:11:43] jorm: Do you know if anyone has been told to review WikiLove yet? (not extension review, but svn revision code review)
[22:11:57] i have no idea.
[22:12:05] http://www.mediawiki.org/w/index.php?title=Special:Code/MediaWiki/status/new&path=/trunk/extensions/WikiLove
[22:12:09] i don't think so, but i see you just marked one of my revs fixme.
[22:12:25] yeah, that one is actually fixed tho I think.
[22:12:30] and i have no idea what your comment means; i was just cloning it to fix a bug.
[22:12:32] yeah.
[22:12:49] it was part of the review roan and I did, but I haven't seen those revs yet.
[22:13:29] right now (well, last week) the images were broken on wikis where the extensions directory isn't at /extensions/ but at e.g. bits.wikimedia.org/extensions-17 or localhost/trunk/extensions
[22:13:55] therefore "extensions" should never be hardcoded; use wgExtensionAssets instead.
[22:14:17] (sorry about the typos, it's awful right now)
[22:14:22] right, i understand.
[22:15:00] i thought that stuff had been addressed in a later revision.
[22:36:17] this is fun: http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Creation_of_the_Sun_and_Moon_face_detail.jpg
[22:39:56] this is funner: https://wikimania2005.wikimedia.org/wiki/Main_Page
[22:41:08] almost. i get a mixed content warning.
[22:41:13] of course
[22:41:22] because bits and upload aren't https yet
[22:41:34] and mediawiki isn't configured to serve protocol-relative urls
[22:41:44] this is the first domain I'm testing
[22:45:29] Ryan_Lane: Awesome
[22:45:55] really nice
[22:46:06] I thought the same-domain https was off the table?
[22:46:25] btw, compare the security rating: https://www.ssllabs.com/ssldb/analyze.html?d=https%3A%2F%2Fwikimania2005.wikimedia.org https://www.ssllabs.com/ssldb/analyze.html?d=https://secure.wikimedia.org
[22:46:34] Krinkle: negative :)
[22:46:45] I started a sprint during the Berlin hackathon
[22:46:55] I figured, but I thought there was some stuff against it on wikitech-l
[22:47:17] that was likely me
[22:47:19] it's difficult
[22:47:22] k
[22:47:27] wildcard certificates?
[22:47:35] yeah, we need like 10 of them
[22:47:52] and we need ip addresses for each project
[22:48:09] and we need to add lvs config for every single one of those for 80 and 443
[22:48:15] and we need to change all the cnames
[22:48:28] and we need to have an ip address for each service in each datacenter
[22:48:30] So the ssllabs results - I take it the difference is more than just the domain name. You're using a different setup for the new https thing than secure.wm.?
[22:48:36] yes
[22:48:46] and make scripts for when all the cnames need changing in a hurry
[22:48:49] I actually put some effort into configuring ssl correctly :)
[22:49:04] k, so this is going to be IPv6 only?
[22:49:06] well, changing the cnames isn't an issue :)
[22:49:14] no. ipv4 and ipv6
[22:49:19] oh, nice.
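
To make the two fixes discussed above concrete: the sketch below builds an extension image URL from configuration instead of a hardcoded /extensions/ path, and relies on that path being protocol-relative so https pages avoid mixed-content warnings. It assumes MediaWiki's wgExtensionAssetsPath setting as exposed to client-side code via mw.config; the image file name is invented, and this is not WikiLove's actual code.

    // A sketch, not WikiLove's actual code. Hardcoding "/extensions/" breaks
    // on wikis that serve extension assets from elsewhere, e.g.
    // bits.wikimedia.org/extensions-17 or localhost/trunk/extensions.
    var assetsPath = mw.config.get( 'wgExtensionAssetsPath' );
    var heartUrl = assetsPath + '/WikiLove/images/heart.png'; // file name invented

    // If assetsPath is protocol-relative ("//bits.wikimedia.org/..."), the
    // image inherits the page's scheme and stays mixed-content-free over https.
    $( '<img>' ).attr( 'src', heartUrl ).appendTo( document.body );
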
[22:49:24] (since many ips are needed)
[22:49:33] Ryan_Lane, I was thinking of when you move all traffic to pmtpa because something is broken
[22:49:35] yeah, we are low on ips, which makes this a pain
[22:49:44] and then forget wikibooks.org or something like that
[22:49:57] Platonides: yeah, we change text.wikimedia.org to point from one place to another
[22:50:14] I just set up a geodns entry for wikimedia-lb.wikimedia.org
[22:50:18] but you are now going to have one ip per project...
[22:50:32] so, when we change dns scenarios, powerdns will flip the cnames for us
[22:50:44] cool
[22:50:49] yep :)
[22:51:15] good night
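
To illustrate the cname-flipping idea Ryan_Lane describes: per-project names can all alias one geo-routed record, so switching datacenters means changing what that one record resolves to, with no per-project edits and no forgotten wikibooks.org. The records below are a BIND-style sketch; the TTLs and the address are invented, and this is not Wikimedia's actual DNS data.

    ; Illustrative sketch only - not actual Wikimedia DNS data.
    ; Every project/service name aliases one geo-routed entry:
    text.wikimedia.org.          300  IN CNAME  wikimedia-lb.wikimedia.org.

    ; geodns (powerdns) answers for wikimedia-lb per scenario: normally a
    ; datacenter-local address, or pmtpa's address when all traffic moves
    ; there. Flipping scenarios changes this one answer, not every cname.
    wikimedia-lb.wikimedia.org.   60  IN A      203.0.113.10  ; address invented (documentation range)
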