[00:02:09] TimStarling: sorry, was afk. no platform meeting an hour ago [00:02:19] however, 1:1 now :) [00:06:56] Ryan_Lane: Okay so I'm going to start the first pass of installing etherpad lite. First question, should I install apache2 via apt? From puppet? [00:07:37] puppet would be best??? it's not the most straightforward thing to do [00:07:38] let me look for a sec [00:07:51] Sure, docs here https://github.com/Pita/etherpad-lite [00:08:12] Also could I get a wikitech account to document this with? [00:08:19] actually, it's fairly straightforward :D [00:08:24] johnduhart: sure [00:08:33] johnduhart: make it yourself, and I'll promote you [00:09:07] Ryan_Lane: Can't create an account [00:09:10] oh. lame [00:09:20] lemme make you one then :) [00:14:00] In the mean time I'll be writing it up on etherpad to post later http://etherpad.wikimedia.org/EtherpadLite [00:14:28] Ryan_Lane: Right so, where do I begin? [00:14:45] ah. I just added a new puppet class to use [00:14:59] so, if you click "configure" on your instance... [00:15:16] you can select misc::apache2 and click submit [00:15:35] Got a 500 error [00:15:41] then you can force a puppet run on your instance by doing "puppet -tv" as root [00:15:43] really? [00:15:46] Yes really [00:15:58] hmm. worked for me. [00:15:58] odd [00:16:05] one sec [00:16:24] can you try again? [00:16:31] I want to see the error in the apache log [00:17:00] ah ha [00:17:01] There [00:17:45] can you send me the URL on the "configure instance" page? [00:18:43] Ryan_Lane: https://labsconsole.wikimedia.org/w/index.php?title=Special:NovaInstance&action=configure&project=etherpad&instanceid=I-00000066 [00:19:58] This extension is in SVN right? Spotted another minor issue I might as well fix [00:21:02] Yup, checkingout now [00:22:26] much better. fixed in the interface now too [00:22:37] the extension is in svn yeah [00:22:47] the link "back to blah" is broken for configure :) [00:23:00] yup [00:23:02] :) [00:23:12] https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&component=OpenStackManager&product=MediaWiki%20extensions&list_id=45280 [00:23:25] oof [00:23:28] :) [00:24:13] https://bugzilla.wikimedia.org/show_bug.cgi?id=31873 [00:24:19] that's the bug for that issue [00:25:45] the hard part about fixing stuff in the extension is that it isn't easy to set up a development environment for it [00:25:57] it takes a shit-ton of architecture to test against [00:27:41] yeah same thing with CentralAuth [00:27:51] Ryan_Lane: We should create a lab for it!!! [00:28:03] I was actually planning on putting it in labs ;) [00:28:13] I had it in a virtualized environment before [00:30:06] Ryan_Lane: openstack, is that like pacemaker type stuff? [00:30:13] nope [00:30:14] crm, or whatever its called [00:30:17] it's like EC2 [00:30:33] ah. haven't ever played with EC2 instance yet actually [00:30:33] Amazon EC2, that is [00:30:47] it's like an entire EC2 stack on your own hardware [00:31:27] so basically its for managing a cluster setup? [00:31:38] one that can be managed via an API [00:31:44] but yep [00:32:11] so, mediawiki is managing the cluster via its API using the OpenStackManager extension [00:34:05] Ok ok so let's try this again [00:34:17] So we added the misc::apache class, and I run sudo puppet -tv [00:34:30] But it tells me -tv is an ambiguous option [00:34:37] puppetd -tv [00:34:38] sorry [00:34:40] AH! [00:35:21] so, document things like which apache modules are required, as you install them [00:35:36] A bunch of failed dependencies are normal right? 
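A minimal sketch of the sequence just described, run on the labs instance; the class name misc::apache2 comes from the conversation, the rest of the commands are illustrative:

```
# Force an immediate puppet run after ticking misc::apache2 on the
# instance's "configure" page (note puppetd, not puppet, on this release):
sudo puppetd -tv

# Then check what actually got installed/enabled, and note it down for
# the later puppetization pass:
service apache2 status
apache2ctl -M                 # loaded Apache modules
dpkg -l | grep -i apache      # packages pulled in via apt
```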
[00:35:36] and which packages you install via apt [00:35:39] umm [00:35:40] right [00:35:45] shouldn't be :) [00:35:52] curious, debian or ubuntu? [00:35:54] lemme go in and run it :) [00:35:55] ubuntu [00:36:04] *troubled boo's slightly ;) [00:36:16] although im one to talk with a ubuntu desktop :D [00:36:21] we use ubuntu for everything ;) [00:36:43] debian all over here cept workstation, but not like there is much difference I guess :) [00:36:55] johnduhart: I didn't see any dependencies fail [00:37:03] I see this: err: /File[/var/lib/puppet/lib]: Could not evaluate: Could not retrieve information from source(s) puppet://virt1.wikimedia.org/plugins [00:37:09] but thats normal, until I fix it :) [00:37:23] and apache is running on the box, thankfully [00:37:38] but anyway, document all things installed or configured [00:37:44] then we'll go through and puppetize it [00:37:57] we can walk through it together, if you'd like [00:38:01] Yup, doing that here: http://etherpad.wikimedia.org/EtherpadLite [00:38:02] then we'll kill your instance ;) [00:38:06] I would like that a lot, lol [00:38:20] here's why we'll kill it: http://ryandlane.com/blog/2011/11/02/a-process-for-puppetization-of-a-service-using-nova/ [00:38:43] Ryan_Lane: I know that's why I asked about your blog before :) [00:38:45] we'll puppetize it on the second instance, using the documentation [00:38:54] then we'll kill that one, and install it just using puppet [00:38:59] on a new instance [00:39:07] then we know it just works! ^_^ [00:39:32] and you'll be the first volunteer to puppetize something for us [00:40:29] so you're kind of a guinea pig here. heh [00:40:53] Ryan_Lane: Question, since this isn't in a package how will puppetize it [00:41:03] we'll need to make a package too [00:41:43] is it a binary? [00:41:59] It's a node server(?) [00:42:00] have you made a debian package before? [00:42:02] nope [00:42:15] *johnduhart would really like to know though [00:42:19] I've never dealt with node [00:42:27] making debs is a pain in the ass [00:43:16] I'd imagine this one will be fairly straightforward [00:44:53] ugh, nodejs isn't in apt with lucid [00:44:59] *johnduhart looks for an ppm [00:45:09] ah, so it outputs a node server, which is then proxied to via apache [00:45:15] yup [00:45:38] why is nodejs needed? [00:45:59] the installation demo doesn't show it installed [00:46:09] To run the server? [00:46:17] oohhhh [00:46:19] they build node [00:46:45] Don't we use node somewhere in WMF? someone should of made a puppet thing already [00:46:57] we don't now [00:46:59] we will be [00:47:15] oh [00:47:29] well, the good thing is that you can likely pull the natty package and backport it to lucid [00:47:50] Yay [00:47:59] oh wait [00:48:20] that's 0.2 not .4 like it asks on the README, might be an issue [00:48:38] pull from ocelot then [00:48:45] or pangolin [00:49:05] you'll need to rebuild the package [00:49:11] as hardy [00:49:14] Right [00:49:27] err [00:49:28] not hardy [00:49:30] lucid [00:50:21] Okay, so let's start backporting that. 
Where should I look to begin that [00:50:44] deb-src http://us.archive.ubuntu.com/ubuntu/ oneric main universe [00:50:44] deb-src http://us.archive.ubuntu.com/ubuntu/ oneric universe [00:50:44] deb-src http://us.archive.ubuntu.com/ubuntu/ oneric multiverse [00:50:54] that needs to be added to the apt sources list [00:51:05] then you can do apt-get update, then apt-get source nodejs [00:51:16] or not [00:51:55] Can I do this on pad1 or should I spin up a new instance for this so we keep pad1 clean? [00:52:18] can do it on pad1 [00:52:23] pad1 is the dirty one ;) [00:52:30] heh okay [00:52:37] deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric main universe [00:52:38] deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric universe [00:52:38] deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric multiverse [00:53:26] I was missing an i before ;) [00:53:33] ah [00:53:48] silly naming conventions, why you so complex [00:54:04] why does ubuntu have to use such weird names? :) [00:54:27] one thing I like about debian over ubuntu. can't ever remember the names of ubuntu :) [00:54:32] we have a build host for making this stuff. I'm not sure how easy it'll be on that host [00:54:40] probably easy enough [00:54:44] you want to use pbuilder, though [00:55:20] I really need to make a proper build host [00:55:33] that's on my todo list :) [00:55:44] Okay, apt-get source'd it. Now what :) [00:55:55] sec [00:56:08] you need dpkg-buildpackage [00:56:24] which is installed, it seems [00:56:24] Let me document this too, hold on [00:56:39] you need pbuilder installed too [00:57:06] this is good to document for building stuff, but you won't need it in the future for etherpad-lite [00:57:21] unless you need to upgrade it [00:57:30] I mean just document in general backporting and building these packages [00:57:37] hopefully we'll have a build host that does it for you, then :) [00:57:42] ah. yeah. good idea [00:57:44] For me and any other volunteer [00:57:54] and ops. we hate building packages [00:58:44] you guys host your own mirror for custom packages too I assume? or just local fs dpkg -i? [00:58:55] custom repo [00:59:22] <^demon> We should change the wmf logo to a unicorn too. [00:59:29] dpkg -i is evil ;) [00:59:34] ^demon: heh [00:59:46] I was wondering how long it would take till someone asked about the unicorn :) [01:00:03] heh [01:00:16] im guessing the 3 lines are symbolic of ubuntu? [01:00:28] <^demon> I'm still wondering til someone comes along and makes it change it :( [01:00:55] <^demon> us change it, even [01:01:01] <^demon> Damn, I can't typoe this evening. [01:01:02] <^demon> Type. [01:01:02] Ryan_Lane: Okay so we up to the apt-get source part, let's continue http://etherpad.wikimedia.org/BackportPackage [01:01:03] <^demon> Fuck. [01:01:16] troubled: nah, that's on our wikimedia logo [01:01:17] ^demon: ya, can't even typo right tonight eh? ;) [01:01:59] johnduhart: so, if we are lucky, we just need to change changelog in the debian directory [01:02:39] I put this there:nodejs (0.4.9-2wm1) lucid-wikimedia; urgency=low [01:02:39] * backporting to lucid [01:02:39] -- Ryan Lane Wed, 02 Nov 2011 01:02:18 +0200 [01:02:41] err [01:02:42] nodejs (0.4.9-2wm1) lucid-wikimedia; urgency=low [01:02:42] * backporting to lucid [01:02:42] -- Ryan Lane Wed, 02 Nov 2011 01:02:18 +0200 [01:04:06] unfortunately deb building stuff is very specific about the format of the changelog [01:04:18] because debian hates every packager on the planet [01:05:07] there is a tool to update the changelog iirc, no? 
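Pulling the loose commands above together, a hedged sketch of the backport prep (the lucid-wikimedia suite and the 0.4.9-2wm1 version string come from the conversation; paths and package choices are illustrative):

```
# Run as root on the build box.
cat >> /etc/apt/sources.list <<'EOF'
deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric main universe
deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric universe
deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric multiverse
EOF
apt-get update
apt-get install dpkg-dev devscripts pbuilder   # dpkg-buildpackage, dch, pbuilder
apt-get source nodejs

# Record the backport in debian/changelog; dch keeps the strict changelog
# format that dpkg-buildpackage expects:
cd nodejs-0.4.9
dch --newversion 0.4.9-2wm1 --distribution lucid-wikimedia "Backporting to lucid"
```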
[01:05:13] dch or something [01:05:18] maybe [01:05:26] debchange - Tool for maintenance of the debian/changelog file in a source package [01:05:30] of course, now there are build dependencies [01:05:35] that are unmet [01:06:04] dch is the alias for debchange I guess, but it should adhear to policy iirc (been a loooong time since i tried it) [01:06:09] I'm seeing that by running: dpkg-buildpackage -rfakeroot -S [01:06:42] Ryan_Lane: So now what [01:06:58] most of those dependencies are likely bullshit [01:07:15] at least the version numbers anyway [01:07:44] so, I'm going to pull the natty version, and see what's there :) [01:07:47] if you stole it from a newer dist of the package, most likely :) [01:07:48] have I mentioned this sucks? [01:08:01] lol [01:08:02] don't blame you, deb building is.... [01:08:38] my eyes gloss just reading the policy about version numbering rules :) [01:12:21] heh [01:12:27] it was complaining about this missing: /usr/share/cdbs/1/rules/upstream-tarball.mk [01:12:37] so I just commented it out in the rules file [01:12:59] that's just to get the dsc file [01:13:14] this command: dpkg-buildpackage -rfakeroot -S [01:13:27] that's run in the nodejs-0.4.9 directory [01:13:50] then, I run: pbuilder create --distribution lucid [01:14:04] and pbuilder build ../nodejs_0.4.9-2wm1.dsc [01:14:36] Still get unmet dependencies [01:14:54] yeah. couldn't satisfy build dependencies in pbuilder either :( [01:15:08] The following packages have unmet dependencies: [01:15:08] pbuilder-satisfydepends-dummy: Depends: cdbs (>= 0.4.85~) but it is not installable [01:15:08] Depends: libv8-dev (>= 3.1.8.22) but it is not installable [01:15:08] Depends: libc-ares-dev (>= 1.7.3) but it is not installable [01:17:01] `apt-cache policy` explain why those aren't installable? or simply too high version number for current dist? [01:17:59] too high version for the distro [01:18:02] forgive me for being noisy, but I like a good "geeking out" :) [01:18:04] so, I'm finding the ones we have [01:18:11] libv8-dev (>= 2.0.3), [01:18:16] cdbs (>= 0.4.62), [01:18:21] libc-ares-dev (>= 1.7.0), [01:18:27] and changing that in the control file [01:18:36] then we'll see if it'll build with those versions [01:18:54] Are you in my home Ryan_Lane [01:18:58] nope [01:19:03] k [01:19:11] I'm on my build system [01:20:56] worst case we can't use lucid [01:20:57] Ryan_Lane: What's your opinion on ppas? [01:21:02] https://launchpad.net/~chris-lea/+archive/node.js [01:21:16] he have lucid packages? [01:21:22] we usually don't use ppas [01:21:52] he does have lucid packages [01:22:02] we can always pull the source deb from there if this doesn't work [01:22:06] and it didn't :) [01:27:02] Ryan_Lane: Okay so I have his src now, so I'll add the changelog entry and hopefully it'll build right? [01:27:31] hopefully, yeah. 
I was gonna look inside of it first :) [01:29:27] looks fine [01:31:10] https://github.com/tobert/nodejs-ubuntu-lucid [01:31:22] that may be even better [01:31:27] :o [01:31:36] Well the ppa seems to be very well updated imo [01:32:03] the git-hub thing is the oneric package's debian directory correctly updated :) [01:32:43] ah [01:32:47] *johnduhart wouldn't know better [01:34:34] Ryan_Lane: Okay so now I have that cloned in my ~/src dir, what now [01:34:52] modify the changelog, and try to build it :) [01:37:13] the good thing about using this, rather than the ppa version, is that it's easier to backport fixes later [01:37:14] Getting depedency errors [01:37:19] and everytime you have to rebuild, take a shot ;) [01:37:23] probably doing something wrong [01:37:47] what kind of dependency errors? [01:37:52] are you using pbuilder? [01:38:07] dpkg-buildpackage [01:38:15] oh, you may need packages installed just to run that [01:38:25] you can add -d to ignore them [01:38:38] cause what you really want is the dsc [01:38:47] :o [01:39:00] debian/rules:3: /usr/share/cdbs/1/rules/utils.mk: No such file or directory [01:39:00] debian/rules:4: /usr/share/cdbs/1/rules/debhelper.mk: No such file or directory [01:39:00] debian/rules:5: /usr/share/cdbs/1/class/autotools.mk: No such file or directory [01:39:09] yeah, you need cdbs package installed [01:39:14] did you do a "apt-get buildep "? [01:39:20] err build-dep [01:39:48] or would it help rather, if you are using a already made package as a starting point [01:45:27] Ryan_Lane: Cool dpkg ran, so now pbuilder? [01:45:33] yep [01:47:39] *johnduhart waits paitent for pbuilder create [01:48:19] Now for build [01:49:03] uhoh [01:49:20] chmod: cannot access `/tmp/buildd/nodejs-0.4.9/./configure': No such file or directory [01:50:12] This is just so much fun! [01:50:16] heh [01:50:56] *johnduhart pokes Ryan_Lane [01:51:14] Could you look at /home/johnduhart/src/nodejs-github since I probably did something wrong [01:51:46] hm [01:52:06] Was I supposed to checkout the 4.9 there and then gitclone that other repo to debian... [01:53:48] oops. sorry :) [01:54:08] it build for me :) [01:54:27] I wonder if it's how pbuilder is configured [01:55:00] I don't see that directory [01:55:06] dpkg -L does list some /etc files here for pbuilder [01:55:20] buildd-config.sh and pbuilderrc, specifically [01:56:30] although if this goes well, you wont need to run pbuilder anymore, you can just invoke johnduhart with `apt-johnduhart build-pkg` ;) [01:56:42] hehehe [01:56:44] johnduhart: I don't see that directory in your home dir [01:56:49] well, I already built it :) [01:56:55] this was more so you can see how it's done [01:56:56] Ryan_Lane: I nuked it and I'm trying again [01:57:00] ah ok [01:57:17] I'm about to leave for the night [01:58:06] back tomorrow. later guys [01:58:12] \o [01:58:14] seeya [02:00:43] huh, root can't read my home directory [02:01:19] O_o [02:01:47] I guess it's intential so that people can't read others homedirectories, but it means I can't sudo scripts [02:01:48] unless [02:02:27] weird, I always thought root had capability specifically for bypassing fs acls [02:02:49] ok, I just need to move it out of the home dir and into a dubdirectory [02:02:54] AHA! [02:02:56] although with stuff like apparmor and selinux, one never knows wth works these days [02:02:58] It's building :D [02:03:03] woot! [02:03:17] lmao I never checked out the source last time [02:03:56] so you were compiling with just a ./debian dir and no source? [02:04:01] yeah. 
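Recapping the build steps from this stretch of the conversation as one hedged sketch (version and file names follow the nodejs_0.4.9-2wm1 example; the flags are the ones mentioned above):

```
cd nodejs-0.4.9

# If lucid ships older cdbs/libv8-dev/libc-ares-dev than debian/control
# demands, relax those Build-Depends versions first.  Then generate just
# the source package; -d skips build-dependency checks on this host:
dpkg-buildpackage -rfakeroot -S -d

# Build it in a clean lucid chroot:
sudo pbuilder create --distribution lucid
sudo pbuilder build ../nodejs_0.4.9-2wm1.dsc
```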
[02:04:05] heh, smooth [02:12:28] johnduhart: so what is all this packaging for anyways? is this how you deploy stuff on the cluster to test your extensions and stuff? [02:12:35] <^demon|away> Down to 26 unidentified committers and 24 with a name but no e-mail \o/ [02:12:59] troubled: This is for deploying etherpad-lite to the production [02:13:10] ah [02:13:16] Which includes puppetization http://ryandlane.com/blog/ [02:13:17] i take it you just set this up today? [02:13:33] We started today, yes [02:13:36] only my second time in this channel, so im a little green :) [02:13:42] ah :) [02:15:59] ah, that page mentions NFS for home dirs, could explain why you cant read ~/ as root if they are squashing root or something [02:16:47] only a guess of course. sadly my eyes can't see into remote machine syslogs ;) [02:17:29] seems to confirm it at a glance: http://ryandlane.com/blog/2011/11/01/sharing-home-directories-to-instances-within-a-project-using-puppet-ldap-autofs-and-nova/ [02:18:30] although I dont see the squash option. still possibly the cause. shame Ryan left or i'd ask [02:20:55] ah, man says its default, could explain why i dont see it. so maybe after all [03:31:17] *johnduhart assembles a npm package [11:40:05] YAY [11:40:08] I did it! [11:40:22] *johnduhart packaged node and npm [11:43:11] npm? [11:44:47] Nikerabbit: package manager for node, needed for etherpadlite [11:44:57] Ah! [11:45:01] Are you packaging etherpadlite? [11:45:08] yes please please please [11:45:13] Once you do that [11:45:16] the bugs in current etherpad are driving me nuts [11:45:19] PLEASE put it in /trunk/debs [11:45:31] Sure [11:45:42] RoanKattouw: are there any deployments going on now? I was going to do the webfonts test [11:45:59] Then I can get Ryan to put it in the repo and puppetize etherpadlite (and hopefully test it on the virt cluster) [11:46:03] Nikerabbit: No, all yours [11:47:38] RoanKattouw: I'm doing all of that. Packaging and puppetizing [11:47:43] On labs [11:47:47] Oh, cool [11:47:58] Do you have access to the puppet git repo? [11:48:39] I think so [11:49:05] You may want to create a topic branch for the etherpadlite puppetization and publish your topic branch through gerrit [11:49:14] In case you want to do that, I wrote a guide https://labsconsole.wikimedia.org/wiki/User:Catrope/Topic_branches [11:49:41] (Cause it's what I did for my logmsgbot and LocalisationUpdate puppetizations, and Ryan and I fixed some issues while going through that, so it should be working now) [11:50:57] For instructions on how to set up a checkout of the puppet repo locally, see the main page of that same wiki [11:51:08] RoanKattouw: wow lots of errors [11:51:16] RoanKattouw: Cool thanks [11:51:18] Nikerabbit: From scap? [11:51:39] RoanKattouw: yeah [11:51:50] well, remote host id changex *3 and few others, so not so much [11:52:08] Are you getting permissions errors for cache/l10n or .git directories? [11:52:24] Remote host ID changed is a known side effect of upgrading the API Apaches to lucid [11:53:03] (Peter thought he'd upgraded all Apaches, we discovered yesterday that he'd forgotten the entire 290-301 range) [11:55:18] RoanKattouw: fortunately no [11:55:36] Hm OK [11:55:59] RoanKattouw: but I only did sync-dir WebFonts so... [11:56:09] Oh, right [11:56:22] So you got remote host ID changed for ... srv290, 291 and 292? 
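For the gerrit topic-branch workflow linked a few messages up, a rough sketch inside an existing operations/puppet checkout (the branch and topic names here are made up; the labsconsole guide has the exact remote setup):

```
git checkout -b etherpadlite origin/test   # local topic branch off the test branch
# ...edit manifests, then commit...
git commit -a -m "Puppetize etherpad-lite"
# Publish the topic for review through gerrit:
git push origin HEAD:refs/for/test/etherpadlite
```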
[11:57:12] nope [11:57:20] 296,300 and 301 [11:57:30] 290 complained something about tty not present [11:58:26] 292 295 299 asked for password [12:01:39] RoanKattouw: hmm nothing is logging my syncs... that isn't normal right? [12:01:54] no, but it's known breakage [12:02:28] I see, since yesterday? [12:04:17] Since Monday I think [12:04:27] Or maybe Tuesday [12:04:42] I puppetized logmsgbot the same way Ryan puppetized nagios-wm [12:04:49] Using the ircecho package from our Debian repo [12:04:51] grrr fucking school proxies [12:04:56] git won't work [12:05:07] BUT... that package is only in the lucid repo (fenari runs hardy), and depends on a newer version of some Python library that isn't in hardy [12:06:16] Ryan pushed the package to the hardy repo, but then we discovered the Python lib issue, which isn't fixed yet [12:10:52] RoanKattouw: http 304 works, although browser still need to revalidate on each request... [12:11:15] Is it sending must-revaliate? [12:11:18] *must-revalidate [12:11:59] nope [12:12:14] but there is no Expires header either [12:12:47] So what is there? [12:13:00] Could you pastebin the response headers for both a 200 and a 304 response? [12:13:18] I'm testing with http://test.wikipedia.org/w/extensions-1.18/WebFonts/test.html [12:13:27] OK [12:13:33] I am going to make lunch now [12:13:39] I'll poke at it when I get back [12:14:11] hmm lol [12:14:14] This is what Firebug tells me [12:14:15] Last Modified Thu Nov 03 2011 13:13:25 GMT+0100 (CET) [12:14:17] Last Fetched Thu Nov 03 2011 13:13:25 GMT+0100 (CET) [12:14:18] Expires Thu Nov 03 2011 13:14:09 GMT+0100 (CET) [12:14:26] interesting [12:14:36] If your resources are cached for 44 seconds, you have a problem :D [12:14:51] I'll poke more after lunch [12:19:45] (note, those aren't request or response headers, but it's the cache metadata that Firebug exposes) [12:20:00] http://etherpad.wikimedia.org/qRtBb5GShr [12:28:41] yay, etherpad lite is working. sort of [12:57:48] Nikerabbit: Yeah so no caching headers are sent at all [12:57:56] No Expires, no Cache-Control, no nothing [13:00:59] RoanKattouw: only etag and date [13:01:18] that's better than nothing, but could be better [13:01:29] Yeah [13:01:37] We'll have to mess with the Apache config for this one [13:01:46] What's the MIME type for these fonts supposed to be? [13:02:12] I don't think there's any agreement on that yet [13:02:18] santhosh: do you have a proposal? [13:02:44] something like aplication/x-font perhaps? [13:04:07] webfonts supports delivering four formats: ttf, eot, woff and svg [13:04:23] webfonts are awesome [13:04:28] OK [13:04:40] Probably want separate MIME types for each [13:04:45] application/x-woff for WOFF [13:04:59] eot =application/vnd.ms-fontobject [13:05:10] Nikerabbit: your CR of my style fixup said "meh" ... Could you point out the issues when you get a chance? [13:05:12] OK, let's try WOFF then [13:05:24] Nikerabbit: Could you put a .woff test file up there? [13:05:32] nm [13:05:34] I'm dumb [13:05:39] The current test file IS woff [13:06:31] for TTF application/octet-stream [13:06:39] hexmode: what was the revision number? [13:06:57] 1s [13:08:07] santhosh: I would really like not to use octet-stream [13:08:24] Nikerabbit: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/101490 [13:08:59] RoanKattouw: hmm, but I dont see any MIME definition for truetype and opentype font so far. [13:09:38] We can make one up if we need to [13:09:49] hexmode: ahaa [13:10:38] hexmode: foo( array( .... \t\t .... ) ); right? 
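The Apache-side fix Roan is working toward can be sketched roughly like this (the 30-day max-age and the two MIME types are taken from the conversation; the file path and exact directives are assumptions, not the deployed config):

```
# Requires mod_headers: a2enmod headers
cat > /etc/apache2/conf.d/webfonts <<'EOF'
AddType application/x-woff .woff
AddType application/vnd.ms-fontobject .eot

<FilesMatch "\.(woff|eot|ttf|svg)$">
    Header set Cache-Control "max-age=2592000"
</FilesMatch>
EOF
```

Versioned URLs (the ?n suggestion above) would then take care of pushing font updates past the 30-day cache.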
[13:11:27] hmm, even the last line had one extra tab [13:12:20] extra tabs must die [13:12:29] spurious spaces, too [13:12:40] Grr [13:12:47] I'm trying to add a MIME type for .woff but failing so far [13:13:19] http://pastebin.com/zGRRceEx [13:14:19] hmm [13:14:46] that looks correct to me [13:17:05] Wrong file [13:17:08] It's working now [13:17:14] (Until puppet overwrites my changes) [13:17:40] https://test.wikipedia.org/w/extensions-1.18/WebFonts/test.html WFM now [13:17:48] Cache-Control max-age=2592000 [13:18:06] (I set the expiry to 30 days, not sure that's what you actually want. You may need versioned URLs for updates) [13:18:43] Hmm, wait a second [13:18:48] Ideally we pull this stuff over bits [13:19:16] It's still broken on bits [13:19:22] Probably due to Varnish config stuff [13:19:50] moin moin \o [13:20:12] are these going to be served through bits or what? [13:20:46] They should be [13:20:58] But we'd need to get that to work first [13:21:00] See https://bits.wikimedia.org/w/extensions-1.18/WebFonts/test.html [13:22:13] RoanKattouw: webfonts uses $wgExtensionAssetsPath, that automatically goes to bits? [13:22:54] Yes [13:23:09] In order to even try this to bits, my Apache config change needs to be deployed first [13:23:14] I am going to put my change in git [13:23:24] okay [13:23:27] Not sure how s-maxage will work yet but I'll ask Mark or Ryan [13:23:57] Hmm might not even be needed [13:23:59] Oh well [13:24:22] RoanKattouw: I guess we need some kind ?timestamp feature on these? [13:24:36] Yes [13:24:40] Or a version number, or something [13:25:00] Cause unless you tell me otherwise, the max-age will be 30 days [13:25:55] santhosh: ^^ [13:26:10] I imagine we will have font updates, but they will be somewhat rare [13:26:36] maybe just adding ?n to the definition list is enough [13:28:33] yes, I think that is enough. [13:28:43] https://test.wikipedia.org/w/extensions-1.18/WebFonts/test2.html with our biggest font [13:30:34] RoanKattouw: did you add similar configs for other formats too? [13:30:45] On it [13:30:49] okay [13:31:50] we've planned to deploy webfonts on 14th if no there is no blockers [13:34:31] gerrit-wm New patchset: Catrope; "Configure Apache to send caching headers for WebFonts font files" [operations/puppet] (test) - https://gerrit.wikimedia.org/r/657 [13:35:06] diff @ https://gerrit.wikimedia.org/r/#patch,unified,657,1,files/apache/apache2.conf.appserver [13:36:33] looks great [13:36:49] What the ... [13:36:50] The program 'patch' is currently not installed. You can install it by typing: [13:36:52] apt-get install patch [13:37:09] maybe it is considered dangerous :) [13:37:36] RoanKattouw: thanks a lot for your help [13:37:48] Sure [13:39:15] Alright [13:39:22] All four font types should now work on testwiki [13:39:56] On bits it won't work until that patch I just submitted in gerrit is reviewed, merged and deployed [13:41:46] hexmode: surrur? [13:42:00] ? [13:42:11] Nikerabbit: ... [13:42:18] where is reedy [13:42:36] I need him to make me a tarball [13:42:47] haven't seen [13:43:39] hexmode: but if it wasn't clear, I don't undestand why you added extra indentation level there [13:44:27] Nikerabbit: yeah, thanks. I got that. I've been thinking of how/if to fix it [13:44:58] while reading about how wikipedia isn't bliss eternal for everyone on earth [13:45:34] hexmode: It isn't? 
:O [13:45:44] http://www.salon.com/2011/11/01/does_culture_really_want_to_be_free/singleton/ [13:45:52] evidently not [13:49:52] hexmode: okay [14:01:27] hexmode: can you add me in default cc of WebFonts, Narayam extesions? [14:01:55] santhosh: np [14:01:59] thanks [14:06:44] munah munah! ;) [14:06:55] err, can't even spell right today! [14:07:25] sumanah: morning is what I meant :) Looks like they got me all setup last night, thanks [14:07:55] hi there troubled -- glad to hear it [14:08:45] btw are you on the developers' email list, troubled? [14:08:49] I cannot recall [14:09:14] sumanah: not that I know of yet, unless it was part of the registration I did with Ryan last night [14:09:25] santhosh: Santhosh Thottingal , right? [14:09:25] no I doubt it troubled [14:09:35] I've got 3 santhosh's [14:09:40] sumanah: got the url handy? [14:09:41] yes, correct [14:09:42] https://lists.wikimedia.org/mailman/listinfo/wikitech-l troubled [14:09:42] one w/o a surname [14:09:50] thx [14:10:03] yw [14:10:53] santhosh: done [14:11:09] sumanah: wondering about digest or not. is it high or low traffic? [14:11:22] troubled: I say, start with digest [14:11:32] troubled: and in a week decide whether to switch to regular mode [14:12:19] sounds good. im digest on the blend list too. seems much nice to not have email notifications every second coming in for sure, even if im slightly behind on the "news" :) [14:12:47] troubled: the thing that bothers me about digest mode is that when I reply it doesn't get properly threaded [14:13:01] but if I'm not talking much, that is an ok price to pay, for me, for some lists [14:13:17] (also you cannot killfile individual posters, which, well, that's what the spacebar is for) [14:13:26] I know what you mean. I find myself hilight erasing for 2 minutes while I scroll a 20 mile long page of replies :) [14:13:58] oh well, worst case, non digest and gmail filters ftw :) [14:14:12] true :-) [14:15:39] hey troubled, what are you better at? back-end database stuff rather than JavaScript/CSS/frontendy stuff, right? [14:16:17] sumanah: so whats the idea here with this stuff anyways. I spinup a machine in the testlabs, setup my desired extension and changes, then submit them to gerrit for testing/approval for inclusion in the trunk for that extension (sphinx in my case) answering Q in next line [14:16:31] im more a backend/db kinda guy [14:17:06] *johnduhart wavse [14:17:09] troubled: https://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/Revision_report is basically what's blocking us from releasing MediaWiki 1.18 in case you want to help fix any FIXMEs or inspect/test any of the revisions that are awaiting code review [14:17:11] er, waves [14:17:18] got a few years of mysql experience, although its not something I do daily. but if I were to classify my skillset, I would say im an admin type more than a programmer [14:17:25] Ryan's not in the office yet? [14:17:28] troubled: Depends, are we talking about a MediaWiki extension, or installing an application on the MW servers [14:17:30] johnduhart: \o [14:17:33] johnduhart: At 7am? [14:17:36] johnduhart: I doubt it; it's, what, 7:15am in San Francisco? [14:17:44] troubled: i.e. is it a dev thing or an ops thing? [14:18:20] lol Timezones are not my forte, plus you're talking to someone who starts his day at 6am [14:18:57] (morning, johnduhart!) [14:19:00] RoanKattouw: well, the sphinxsearch extension is what I was working on that got me here. 
Its a dev type project, although we are having some admin problems (im proxy admin access to project atm only) that are complicated by a slightly unorganized dual svn setup in our project that im hoping to pick up some useful tips from your setup to help with [14:19:08] (or AFTERNOON since you are such an early riser, hee) [14:19:16] Hehe [14:19:43] I'm imagining the johnduhart's morning montage: coffee with the sunrise, weightlifting while listening to classical music, fixing a bug while making breakfast for the household [14:19:52] oooh, coffee, I almost forget :) [14:20:01] *troubled darts off to put a pot on [14:20:03] troubled: remind me what timezone you're in? [14:20:08] EST [14:20:12] oh me too [14:20:13] NYC [14:20:14] lol sumanah, more like waking up at 6am and getting on the bus 10 minutes later [14:20:30] johnduhart: it must be hard to do your sunrise coffee weightlifting on the bus [14:20:44] hahaha [14:20:56] troubled: OK so does the extension already exist in our SVN repo? [14:21:26] Ha, I used to give myself more time than that [14:21:27] RoanKattouw: yes, I believe its called "sphinxsearch" [14:21:32] 6:15am alarm, 7:31am train [14:21:59] troubled: Right, then as far as we're concerned you can just check it out and start committing. But setting up a dev environment might be hard, I'm not familiar with Sphinx [14:22:07] I mean a testing environment [14:22:27] btw, troubled, you may have looked around http://wikitech.wikimedia.org/ for some of our setup docs. http://wikitech.wikimedia.org/view/Subversion and https://www.mediawiki.org/wiki/Subversion might be interesting to you [14:23:23] sumanah: I haven't looked at those pages yet I don't think. I was going to sit down today and read up on the testlabs and gerrit a bit and get familiarized with their use [14:23:51] troubled: there's a lot to read! :-) good luck. And of course if anything is confusing, ask us (and then help by updating the docs, if you have time?) :-) [14:24:17] RoanKattouw: well, sphinx isn't too hard really. it basically just has an sql_query that seperates the wiki pages into attributes (title, namespace etc) to be indexed [14:24:49] sumanah: the part that I am wondering about atm is who do I ask for permission to actually commit to the original project? [14:24:50] troubled: testlabs and gerrit are kind of only for ops-like things at this point [14:25:10] For now you should stick with testing locally and committing to our SVN repo [14:25:21] testlabs is still kind of under construction [14:25:28] troubled: if you do not have SVN commit access right now, apply via: [14:25:29] ah [14:25:34] https://www.mediawiki.org/wiki/Commit_access_requests#Requesting_commit_access [14:25:36] But it will allow you to set up a testing environment eventually [14:25:40] sumanah: thx [14:26:26] Okay so etherpadlite on pad1 is working, I need to talk to Ryan about if we're going to use MySQL [14:26:32] Then we gotta package it [14:26:35] the next step will be for me to pick apart our setup and decide what should go upstream. atm our wiki is mid reorg to help with indexing, so some changes wouldnt make sense to a typical user of the extension [14:27:59] troubled: I cannot guarantee you'll *get* commit access, but my guess is you probably will (the decision is up to Tim Starling, Chad Horohoe, & Aaron Schultz, and is basically based on whether your code sample is terrible code with lots of security vulnerabilities, or not. 
And we're pretty lenient about giving access to extensions & tools parts of our repo) [14:28:12] Hmmm http://etherpad.org/ [14:28:26] sumanah: understandable :) [14:28:57] johnduhart: eeek [14:32:20] RoanKattouw: Would you know if ops prefer sysvinit or upstart scripts? [14:32:44] I'm not sure [14:32:51] Suggest looking at /trunk/debs/ircecho [14:32:57] That package was created by Ryan not very long ago [14:33:42] Yeah looks like an sysvint script [14:35:38] I thought so too [14:47:23] grrr [14:47:48] Seems like I can't create users on pad1, something with homedirectories... [14:49:04] johnduhart: nfs problem perhaps? does it work if you create a user with a home dir not under /home or something? [14:49:36] RoanKattouw: can I leave the test files there for a while? [14:50:00] Sure [14:50:27] johnduhart: You can join the "I need Ryan" club then [14:50:45] People have been jumping on Ryan the second he gets on IRC for like two weeks now [14:51:00] hehe [14:51:17] lol [14:51:23] troubled: That's what I'm thinking [14:51:25] let me try that [14:51:40] im still only guessing NFS root squash being the problem, but it does seem to fit [14:52:56] It's set up in a bit of a weird way [14:52:58] troubled, it could be [14:53:00] *RoanKattouw looks up Ryan's blog post [14:53:22] http://ryandlane.com/blog/2011/11/01/sharing-home-directories-to-instances-within-a-project-using-puppet-ldap-autofs-and-nova/ [14:57:48] <^demon> johnduhart: You create users in the puppet manifest for a particular server. [14:58:07] hi ^demon -- hey, do you care how I license the how-to-test audio file when I upload it? PD ok? [14:58:40] <^demon> That's fine. [14:58:53] ^demon: We're not up to puppetization, we're just getting it working and then I'll do the right thing with puppet [15:00:16] what do you guys use to get the servers up to the initial state before you puppet them up to date? dd image? FAI? preseed? kickstart? other? [15:01:23] troubled: http://wikitech.wikimedia.org/view/Build_a_new_server [15:01:31] thanks [15:02:33] troubled: also, we're hiring, in case you know anyone http://jobs.wikimedia.org/ [15:03:08] troubled: including "Development and Operations Engineer" (in the contractor section) [15:03:28] neat. always thought you were all volunteers? [15:04:09] johnduhart: ah, pxe i see for the initial shell [15:04:46] troubled: all the content contributors, like people who write & edit & upload media, are volunteers, but there is a nonprofit that keeps the servers running [15:04:53] troubled: it's 93 paid people [15:05:13] sumanah: wow, not bad. congrats! [15:05:20] troubled: and we got a lot to do! https://strategy.wikimedia.org/wiki/Product_Whitepaper [15:06:38] troubled: the thing to skim there: https://strategy.wikimedia.org/wiki/Product_Whitepaper#Product_priority_recommendations which explains engineering department priorities for the near future in English sentences -- more geekspeak at https://www.mediawiki.org/wiki/Roadmap [15:06:59] sorry for spamming, all that is not urgent to read now of course [15:07:11] no worries, it's irc, im used to it ;) [15:07:23] thats why browsers have tabs! :D [15:07:59] :-) [15:09:51] troubled: our fresh new monthly engineering report, which guillom put together, is at http://blog.wikimedia.org/2011/11/03/wikimedia-engineering-october-2011-report/ and gives an overview of what the tech folks at Wikimedia Foundation have been up to in the past month. but of course we are just a portion of the MediaWiki technical community. Lots of folks, like you, are volunteers. 
One of our sysadmins, domas, is a volunteer. [15:09:51] And a bunch of the developers are volunteers [15:11:03] <3 volunteers! Without em, there wouldn't even be a freenode! [15:11:47] yes, or Wikipedia [15:12:02] or linux! :) [15:12:29] yep. or many other important parts of civil society, like the League of Women Voters [15:13:21] can't say i've ever heard of that one, but surely a good cause! :) [15:13:43] troubled: LWV puts on political debates, and issues informational pamphlets [15:13:44] yeah [15:14:46] sumanah: I am guessing that wiki folk tend to be well versed in such things due to the nature of the content on the main site itself [15:15:27] troubled: you mean in collective digital participation and production? [15:15:53] yeah, there's definitely crossover with other opensourcey ideologies, practices, temperaments, etc [15:16:13] <^demon> Bunch of hippies :) [15:16:15] well I noticed how you guys tend to organize in very well structured ways that seem to mirror polictics and society in certain ways [15:16:18] heh [15:16:43] troubled: we see the world not as it is, but as we are. right? [15:16:50] anyway, back to devops stuff [15:17:12] true :) sounds good. besides, I gotta bit of reading to do it seems :) [15:17:21] troubled: so, we're hiring, please spread the word if you can [15:17:30] *troubled nods [15:17:50] troubled: and if you are anywhere near DC, http://wikimediadc.org/wiki/Library_Lab#November is happening this month & next -- Wikimedia/MediaWiki development get-together stuff [15:18:07] <- Canada, Ontario btw [15:18:33] nod [15:18:49] *troubled s/(.*)/$1, eh!/g his past statements ;) [15:19:33] hahaha [15:20:50] troubled: Where in Ontario, specifically? (If you don't mind me asking) [15:21:18] RoanKattouw: Peterborough [15:21:28] *RoanKattouw has to look that up [15:21:47] My mother was born in Ontario and I know a decent bit of geography there, but I've never heard of Peterborough [15:21:47] about 1.5 hour from TO, or 45min from the shaw [15:22:04] about half way between TO and Ottawa [15:22:31] Yeah, just looked it up [15:22:45] although my geoip seems to think oshawa for some reason *shrug* (silly bell netblock admins!) [15:22:51] troubled: do you ever hit hacklab.to? I may have asked you this before [15:22:58] Now he's coming to get you troubled, watch out [15:23:12] sumanah: never heard of it tbh [15:23:19] johnduhart: eek! what I do now! heh [15:23:23] I don't think I have any family in Toronto right now. My grandparents lived in Ottawa in the 60s and I think my grandfather's sister's son lives in Kingston now [15:23:47] born in TO myself, although funny that I never really go there lately [15:24:18] troubled: prepare yourself for RoanKattouw's terrifyingly efficient travel towards you (he's realllllly into itinerary optimization, I think he solved traveling salesman) [15:24:23] (just, like, on the side) [15:24:23] hehe [15:24:40] hahaha [15:24:48] *troubled better put on an extra pot of coffee ;) [15:24:52] troubled: Just don't answer the door and he'll leave you alone :) [15:25:00] hehe [15:25:13] I do plan on exploring my Canadian roots at some point [15:25:45] But as it stands I've never really been to Canada, other than a short layover at Toronto Pearson [15:25:49] probably best to wait til summer, winters can suck here! Although more so up north like manitoba than ontario [15:25:59] Yeah, I've heard the stories [15:26:03] RoanKattouw: just listen to some Moxy Fruvous and read some Gordon Korman [15:26:09] I went to niagara falls once does that count? 
;p [15:26:10] and watch "My Winnipeg" [15:26:12] Shortly after my mom was born, my grandparents moved to Saskatoon [15:26:29] <^demon> RoanKattouw: You should explore your Southern roots too :) [15:26:37] ive been in -30 with wind chill to -40 or so. I froze my ear once in manitoba walking home! [15:26:56] but the northern lights are worth it all :) [15:26:57] Yeah that's pretty much what Saskatoon is supposed to be like [15:27:03] But then the summers are +30 [15:27:17] I was in Flin Flon, even further than Sakatoon by a few hundred miles iirc [15:27:21] ^demon: Are you trying to lure me to Tobacco Town? [15:27:53] Yeah that's seriously a lot farther north than Saskatoon [15:27:58] <^demon> Nah, Richmond's not very fun. [15:28:02] and much colder too :) [15:28:05] <^demon> But there's other neat places in the south. [15:28:06] I see how you'd get northern lights there [15:28:11] Saskatoon probably not so much [15:28:21] It's South of Edmonton and I live in Europe on the same latitude as Edmonton [15:28:35] RoanKattouw: they weren't always out, but some nights, if you are lucky, they would come out and are absolutely brilliant to watch [15:29:59] Hmm, come to think of it I've been farther North than Flin Flon I think [15:30:32] hopefully not in the winter! :) [15:30:35] 54??40'41"N [15:30:37] Oh, yes [15:30:39] But not in Canada [15:30:42] This was in Sweden [15:30:47] oh you poor thing hehe [15:30:58] Wasn't too bad [15:30:59] <^demon> I think we should do WM2013: McMurdo Station [15:31:02] flin flon was particularly bad cause of the rocks and being near the lake though [15:31:10] Coldest daytime temperature was -11 [15:31:19] the wind comes off the hills and whips across the lakes and hits you twice as hard [15:31:28] ^demon: Something like that, yeah. I'll settle for Tromso [15:32:02] ^demon: really screw with everyone's timezones! [15:32:19] <^demon> Since we'll all be on NZ time? [15:32:20] <^demon> :p [15:32:34] 60??05'18"N is where that was [15:32:47] So that's on the same latitude as the MB/NT border [15:32:59] I should say, mess with everyone's circadian cycles [15:33:17] Anchorage 2013 is threatening to do just that [15:35:13] <^demon> Hmmm, couple of 2013 bids already up. [15:35:16] troubled: if you are interested in putting on a MediaWiki-related get-together near you, or teaching about MediaWiki at a local PHP meetup, or anything like that, we have materials you can use http://lists.wikimedia.org/pipermail/wikitech-l/2011-October/056040.html [15:35:29] <^demon> I wouldn't mind Manila. [15:35:46] <^demon> Margarita Island looks pretty, if a bit impractical. [15:36:14] troubled: these folks are talking about Wikimania, the yearly Wikimedia movement conference. [15:36:37] sumanah: going for wold domination are we? ;) [15:37:04] ughh where's ryan :p [15:37:06] troubled: Imagine a world in which every single human being can freely share in the sum of all knowledge. That's our commitment. [15:37:16] :-) [15:37:21] sumanah: ah yes, the borg! :D [15:37:43] troubled: this is where I make a reference to an obscure DS9 episode, but I'm holding myself back :-) [15:37:46] I do agree with knowledge sharing though [15:37:51] heh [15:38:15] more of an TNG guy, so I may not get the DS9 ref, as I haven't seen em all [15:38:41] im actually tickled pink as of late, since TNG episodes started from scratch again on the DVR! [15:38:48] *Platonides looks up DS9 [15:39:00] ack! Never heard of DS9?! 
[15:39:10] nope [15:39:19] I've heard about Start Trek, though :) [15:39:19] <^demon> As long as we don't start making TOS references we'll be fine. [15:39:20] Star Trek: Deep Space 9 [15:39:24] I was a TNG gal, then my spouse and I watched all of Deep Space 9, and then Babylon 5, so now I am more into the latter. [15:39:26] *Star Trek, [15:39:51] sumanah: aha! I knew you must have been a gal :) [15:39:59] troubled: well, my name is easily googlable [15:40:12] no surprise :) [15:40:16] <^demon> sumanah: There's something about Patrick Stewart pretending to be a frenchmen that I still find amusing to this day. [15:40:21] I KNOW ^demon [15:40:23] sumanah: I try avoid googling people and prefer to just ask or be told ;) [15:40:27] <^demon> "I don't care what your character's name is. You don't sound French" [15:40:46] ^demon: it's even less believable than FTL travel [15:41:42] sumanah: im guessing you've already been sent here plenty of times then? https://www.youtube.com/watch?v=8N_tupPBtWQ [15:41:54] *troubled grins slyly ;) [15:42:04] troubled: haha! no, never in response to my name [15:42:09] hehe [15:42:33] I guess "su manah" would be the "play that video, but as root" command [15:43:06] ha, ya [15:43:11] troubled: a museum near me has a Henson exhibit happening right now and there are Snowths and the manah puppet! on display! rockx [15:43:13] <^demon> sumanah: I'm totally on board with the whole warp drive thing. It's totally possible, once we figure out what a "gravimetric field displacement manifold" is and how to use it :) [15:43:32] sumanah: it's a sign! :) [15:43:33] and the Heisenberg compensator, to account for all the meth on board [15:43:44] cf. Breaking Bad [15:43:46] <^demon> sumanah: Also, dilithium crystals. [15:44:04] meth crystals? [15:44:35] hexmode: (in "Breaking Bad" there is a meth manufacturer named Heisenberg.) [15:44:52] ok, back to [15:44:52] http://www.mediawiki.org/wiki/User:Sumanah/TODO [15:45:11] *^demon adds "figure out warp travel" to [[User:Sumanah/TODO]] [15:45:20] sumanah: more and more I begin to understand pop culture by my exposure to all you [15:45:23] *sumanah laughs aloud. think I'll delegate that [15:45:35] <^demon> You could put it under "Longer-term" [15:45:36] Heisenberg compensator? SOunds like the episode with Moriarty being tricked by Picard and crew as to the device he needed to get off the holodeck :) [15:45:52] LOLing, for real [16:20:19] Reedy! [16:20:39] uneventful trip back? [16:21:30] Pretty much, yeah [16:22:22] first round of root canal surgery today [16:22:38] got told i need one of my molars doing also.. which could be a complete removal, but at least a crown [16:23:09] ouch! [16:23:37] (I thought you decimalized your currency!) [16:23:52] local anesthetic is just about worn off, and it's not too bad [16:24:06] Dentist suggested the molar having an infection could be headache related [16:24:06] sumanah: too soon :) [16:24:08] *Reedy shrugs [16:24:21] *sumanah laughs aloud [16:25:14] poor Reedy [16:25:19] that could do quite a bit to explain headaches now couldn't it [16:25:54] teeth grinding/clenching could also be possible, no signs at all of grinding. nothing obvious for clenching [16:26:19] yeah [16:26:35] <^demon> Tooth infections can cause all kinds of weird (and seemingly unrelated) symptoms. [16:35:49] Reedy: are you feeling up to getting a beta tarball out this week, or should we maybe figure out a different plan? [17:52:24] Ryan_Lane there you are! 
[17:55:26] I'll assume you're away from you desk but I'll fill you in on what I've accomplished. [17:55:47] (since I'm on a bus ATM) [17:57:01] so I finally got the nodejs to build after you left, silly me was building with inly a /debian folder and no source [17:57:38] and this morning backported npm [17:58:29] so now etherpad lite is running on port 9001 [17:59:01] i need to configure the apache proxy [17:59:04] oh [17:59:14] and the init script is in [18:04:15] ah. sweet [18:04:25] johnduhart: that's great [18:13:07] oops, feel asleep [18:13:27] so ill continue in a bit when i get home [18:21:58] sounds good. that's great work [19:32:24] Okay so back to etherpadlite stuff [19:32:44] :-) [19:35:12] Ryan_Lane: Are we going to want to run EPL with MySQL? [19:46:29] johnduhart: what's EPL? [19:46:42] EtherPad Light [19:46:48] thank you [19:46:49] *johnduhart is making up acronyms [19:46:52] ;p [19:46:59] I was like, "embedded PL?" [19:47:12] 2- and 3-letter abbreviations are way harder to Google for [19:48:08] lol [19:48:42] johnduhart: I didn't realize EtherpadLite *needed* a db behind it [19:49:39] sumanah: Well of course, where else would it store everything :) [19:49:50] johnduhart: filesystem perhaps [19:50:00] johnduhart: depending on how many revisions it wants to store [19:50:03] Out of the box it uses sqlite, but since we have plenty of db servers we might as well use them [19:50:06] but yes, db is reasonable [19:50:07] sure [19:50:37] Ew filesystem storage for relational data ;) [19:51:01] yes, of course in retrospect it makes sense [20:23:57] sumanah: what are the issues with prorprietary extensions? I have someone who wants to write one. [20:29:19] Amgine: well, some proprietary extensions do exist, I know that -- of course we do not host them in our SVN repo [20:30:34] I'm talking against it, but there isn't a problem with their existence, right? [20:31:26] As far as I know, no, but I'm not the expert here -- you might try asking in #mediawiki where more knowledgable folks hang out & can tell you of pitfalls [20:32:13] johnduhart: yeah. db [20:33:05] Ryan_Lane: I assume there's a puppet class for MySQL? [20:35:03] heh [20:35:04] kind of [20:36:56] lemme seee... [20:37:22] seems the mysql class doesn't install mysql [20:37:48] no need to worry about that though [20:37:56] johnduhart: just install it manuall [20:37:59] *manually [20:38:04] sure [20:38:12] when we move this to production, we'll already have a mysql box installed for this [20:38:22] it just won't be on the same system as etherpad-lite [20:51:50] Ryan_Lane: Apache proxy done :) http://cl.ly/BWNU [20:53:37] sweet [20:53:43] how are you connecting to the instance? [20:53:46] ssh socks proxy? [20:56:45] Ryan_Lane: port forwarding(?) with ssh [20:57:03] ssh johnduhart@bastion.wmflabs.org -L 80:pad1:80 [20:59:12] ah. coll [20:59:13] *cool [20:59:16] that works too [21:06:34] johnduhart: so, now that it is running, we should puppetize it [21:06:54] Hold on [21:06:59] *Ryan_Lane nos [21:07:02] *nods [21:07:06] Still gotta do the db stuff :p [21:07:10] ahhh. ok [21:27:08] Ryan_Lane: pfff, etherpadlite uses one table in MySQL and then stores everything key-value [21:27:19] :D [21:27:26] that's *terrible* [21:27:32] lol yeah [21:27:39] Oh well, better than sqlite [21:27:56] yeah [21:28:18] Let me clean up the docs and we'll get started on puppetization [21:28:33] sounds good [21:32:45] Ryan_Lane: Could you just look this over and make sure it's mostly sane? 
http://wikitech.wikimedia.org/view/Etherpad_Lite [21:33:41] bin/installDeps.sh? [21:33:45] what does that install? [21:33:56] we'll need the list of packages it installs, since we'll add that in puppet [21:35:25] I'm not totally sure how etherpad-lite gets installed [21:36:42] does safeRun.sh launch etherpad lite using the nodejs binary? [21:37:03] it kind of seems like we should make a second package ;) [21:37:13] I know you're going to hate me for that. heh [21:37:41] installDeps installs stuff via npm [21:37:42] etherpad-lite should be a package that includes the etherpad-lite code, and depends on the dependencies in installsDeps.sh, and depends on nodejs [21:37:46] npm? [21:37:49] what npm? [21:37:53] node package manager [21:37:54] Node Package Manager [21:37:57] *Ryan_Lane groans [21:38:00] I thought so [21:38:05] we have another ruby on our hands [21:38:10] I knew you would hate that [21:38:38] I don't know why people think language-specific package managers are a good idea [21:38:43] I don't either [21:38:45] I hate them [21:38:54] where does npm install things? [21:38:58] uh [21:39:01] Well [21:39:04] Seriously, the APT/Debian people should work on APT and everyone else should just go home [21:39:04] acutally [21:39:33] Ryan_Lane: node_modules in the etherpad-lite folder :p [21:39:36] :o ** [21:39:39] oh. ok [21:39:43] well, that's not as bad [21:39:50] So...we could package them? [21:39:53] at least it's bundled with the app [21:39:56] yes [21:39:59] yay [21:40:07] we can package the modules with the binaries [21:40:17] then we only have to install one thing, plus the init scritp [21:40:19] *script [21:40:33] we [21:40:34] err [21:41:10] well, the package will install all the modules, the etherpad-lite binaries, the user account and group, and the init script [21:41:33] And run the stuff at the bottom of installDeps.sh? [21:54:15] hexmode: did Reedy ever get back to you? [21:55:21] Ryan_Lane: when you are running the tests against the various dbs how is it going to work? Will there be multiple MW installs running, will it change dbs on the fly, or will the test talk directly to the db object? [21:56:27] blobaugh: I'm probably not the right person to ask [21:56:39] ^demon is the guy you should ask [21:57:32] Ryan_Lane: ah, ^demon is not around right now :( [21:57:59] yeah [21:58:24] johnduhart: well, the package could run that while building [21:58:26] Who is ^demon? [21:58:40] johnduhart: then will include the modules created in the package [21:58:46] yeah [21:59:02] blobaugh: Chad Horohoe [21:59:09] he's working on the testing stuff [21:59:13] Wow, that is an allowed URL char [21:59:17] ^ that is [22:02:05] Ryan_Lane: Okay so first things first, let's get those npm and nodejs packages where they need to be. Is there any special signing that needs to be done to get them on apt.wikimedia.org ? [22:02:36] I built it as well [22:02:39] I'll push it to the repo [22:03:06] And the npm? [22:03:09] at some point we'll have a build server that'll do this stuff automatically [22:03:13] did you build npm too? [22:03:13] great [22:03:16] I did [22:03:18] oh [22:03:28] I'll get that from your home directory [22:03:36] k [22:06:01] *Ryan_Lane groans [22:06:16] well, I'll need to build it again at some point [22:06:40] node, that is [22:06:59] the dsc file will be missing the tar.gz file [22:07:05] so the source package won't be right [22:08:53] argh [22:09:00] no. 
I need to rebuild it completely now [22:09:06] I hate packaging stuff [22:23:57] robla: I've tried to reach Reedy a couple times... no luck [22:24:10] guess I'll try again tomorrow :P [22:26:51] hexmode: I've talked to him a little bit. [22:28:20] The new nsowiki has been imported using wrong character encoding. We are exporting new pages and are going to re-import it. Could someone please clear the database (or delete the wiki)? [22:28:24] hexmode: he's mulling whether to get r101449 r101450 r101454 (one batch of commits) and possibly r101451 in before rolling something [22:29:07] robla: good :) [22:29:15] *hexmode goes to look at those commits [22:31:41] ^demon: hey [22:32:33] robla: looks like he merged r101{449, 450, 454} [22:32:52] <^demon> blobaugh: Howdy [22:37:52] ^demon: have a question for you. let me dig it out of my history [22:38:04] ^demon: when you are running the tests against the various dbs how is it going to work? Will there be multiple MW installs running, will it change dbs on the fly, or will the test talk directly to the db object? [22:38:23] TimStarling: morning [22:38:27] morning [22:38:39] <^demon> I'll end up making some new projects in jenkins probably that will use --dbtype=foo instead of --dbtype=sqlite [22:39:00] ^demon: so multiple MW installs then? [22:39:23] <^demon> Yeah, but the workspace is cleared after each build and the installs aren't publicly accessible anyway [22:42:00] ^demon: ok, just wondering about it from a testing perspective. Sumana wants me to help design the over db testing scheme. i am betting that only one class is needed to test most of it because the methods are going to be the same in every class. [22:42:30] <^demon> Yeah, as long as we have decent "Database" related tests, we'll be fine I think. [22:42:43] <^demon> Since we'll be falling back on whatever Database subclass is configured for that run. [22:51:39] ^demon: cool. gotta take off now. getting dinner then going to the Seattle PHP Meetup :D. cheerio! [22:51:55] <^demon> Have a good one. [22:54:07] Ryan_Lane: How goes the packaging? [22:54:25] crap. I totally forgot about npm [22:54:32] :( [22:54:33] and about pbuilder running nodejs [22:54:36] lemme check [22:54:37] Ryan_Lane: curious about gerrit. I noticed a bug report somewhere talking about SSO stuff with ldap missing/not implemented yet. that stuff the case [22:54:37] It's okay [22:54:39] I was writing docs ;) [22:54:48] https://labsconsole.wikimedia.org/wiki/Access#Accessing_public_and_private_instances [22:54:49] Ah :) [22:54:53] heh, yer right johnduhart, everyone jumps on him at the same time :) [22:54:59] :D [22:55:05] troubled: lol [22:55:16] I smirked when I see you speak as I was typin heh [22:55:16] troubled: gerrit uses LDAP fine [22:55:45] Ryan_Lane: hmm, maybe it wasn't ldap im thinking then. thought I see some request for gerrit *scratcheshead* [22:55:48] ok. starting the nodejs pbuilder build [22:55:59] Ryan_Lane: there's another form of SSO [22:56:05] use SAML or something like it [22:56:12] or web server auth [22:56:16] but ldap is working [22:56:30] I wish it stored the ssh keys in ldap too. 
that's what's missing for me [22:56:40] ah [22:56:52] johnduhart: I tried connecting to the etherpad via socks proxy, or forwarded ports and couldn't [22:57:00] Hm, maybe it's off [22:57:02] hold on [22:57:03] ok [22:57:06] was just noticing that it supported apache auth, and was gonna suggest ldap mod, but clearly I misread the problem :) [22:57:23] heh [22:57:36] Ryan_Lane: Yup, try now (Test pad) [22:57:39] talking to yerself there eh :) [22:58:19] cool. it's working [22:58:39] are the ssh keys stored in a dedicated file by chance? [22:58:52] nope. in a database for gerrit [22:58:54] which is annoying [22:58:59] mysql? [22:59:02] otherwise I'd update the keys myself [22:59:02] yeah [22:59:13] I want the keys to be managed in one place [22:59:19] I'd prefer gerrit just read them from LDAP [22:59:27] im guessing you would rather avoid custom mysql module to redirect it [22:59:32] yes :) [22:59:34] no doubt [22:59:40] I'd rather just add the support to gerrit [22:59:47] socks proxy via ssh ftw [23:00:00] it's such an easy way to make things just work [23:00:24] man, havent used socks in ages [23:00:32] guessing ssh port redirect to it? [23:00:41] well, many of the instances are on a private network [23:00:52] so to test things like this it's necessary to forward ports [23:00:55] or use a SOCKS proxy [23:01:00] SOCKS proxy is slightly easier [23:01:17] since then you can use normal domain names without messing with your hosts file [23:01:24] and you can use normal reserved ports without being root [23:01:43] could always grant the process the CAPABILITY for that right [23:01:54] forget the cli tool name for it off hand though [23:02:05] yeah. it's still a pain then, because the hostname won't be the same without adding it to hosts [23:02:29] guessing you dont like to depend on dns around there [23:02:38] with a socks proxy, you just do ssh @bastion.wmflabs.org -D 8080, and configure your web browser to use localhost:8080 as a proxy [23:03:00] we use DNS, but the entries are like: pad1.pmtpa.wmflabs -> 10.4.0.17 [23:03:21] dig @virt1.wikimedia.org pad1.pmtpa.wmflabs [23:03:35] and, rfc1918 [23:03:39] ? [23:03:44] 10.x.x.x [23:04:00] yeah. we use a range that is only privately routable [23:04:00] s/and/ah/ [23:04:21] wording, means so much, and I always mess it up hehe [23:04:45] johnduhart: good job on the npm package [23:04:53] Thanks [23:05:07] I don't even have to re-compile it :) [23:05:10] well, maybe [23:05:32] troubled, I think you were tinking on setcap(8) [23:05:44] Platonides: i think so ya, thx for refreshing me :) [23:07:44] hmm. no I'll need to recompile I think [23:08:40] it needs to include the .tar.gz with the .dsc file, which requires the -sa flag on dpkg-buildpackage command [23:09:16] ugh. and nodejs failed to build this time? wtf [23:09:18] :) [23:09:20] *Ryan_Lane stabs [23:09:51] you guys gonna be full dd's by time this is done :D [23:09:59] oh. maybe I was in the wrong build directory [23:10:00] dd's? [23:10:20] debian dev's. although I suppose that might not be right term for a packager [23:10:27] heh [23:10:38] I've built probably 20 or so packages [23:13:39] doubt ive done that many myself. can't stand it either. make install ftw :D [23:13:59] *Ryan_Lane pukes [23:14:06] mind you, im not dealing with a cluster of machines that need repeatable and clean installs of stuff like you [23:14:06] :) [23:14:23] I always package things. 
I feel dirty with unpackaged things on systems [23:14:31] unless it's small and puppet can install it [23:14:36] i used to use a program called encap too for a while (basicaly a sym link farm, similar to debian alternatives i suppose) [23:15:27] about the only thing I "package" these days is the kernel. but make-kpkg really takes care of the nitty gritty stuff [23:18:37] you change your kernel? :) [23:21:09] not usually, but ya. preemptive kernel atm, 300hz, few other tweaks. only cause its my workstation though :) [23:22:46] used to do it all the time, back when kernel updates were tarball patches :) times have changed though [23:24:03] heh [23:24:17] used to be a firm believer in no modules kernel with grsec. but then selinux...."won". at least in a defacto way. that, and the debian team refused to maintain their patch, so you needed a vanilla kernel [23:25:32] sllooooooowwwwww internets [23:25:39] you guys do any security stuff like selinux with any of the setups? seems like a nightmare to maintain. tried getting into it, but audit2allow leaves alot to be desires (read: it's rules are way too broad) [23:26:36] im guessing with puppet, its probably just easier to blow away machines and use the newest security updates [23:40:07] well, selinux in fedora is good [23:40:11] selinux in ubuntu sucks [23:40:15] ubuntu uses apparmor [23:40:21] and apparmor in ubuntu sucks too [23:40:48] http://toolserver.org/~reedy/mediawiki/ <- 1.18.0beta1 tarball [23:40:48] ubuntu basically fails at this task while fedora/redhat do a fantastic job [23:42:08] hexmode, ^ [23:42:38] no doubt. I tried in debian instead of ubuntu. but one thing I noticed was a bunch of the useful apps I needed weren't covered by the stock selinux modules. the paths were there, but they were geared for fedora systems or something other than debian. so not bad in that its supported, but shame that they only seemed to copy/paste the upstream rules instead of actually tweaking it properly [23:42:48] Reedy: \o/ !!! [23:42:52] Ryan_Lane: Progress? [23:42:58] Reedy: aw yeah [23:42:59] hexmode: don't run off and announce it yet [23:43:07] nodejs is pissing me off [23:43:19] Ryan_Lane: What's wrong? [23:43:28] something with the tar file [23:43:39] robla: I was gonna try to get TimStarling or someone to move it download [23:43:56] let's all do a quick test pass before putting it up [23:44:06] (just a basic "yay, it installs!") [23:44:12] robla: sure [23:45:12] johnduhart: got it to work :) [23:45:15] it's in the repo now [23:45:18] yay [23:45:20] working on npm [23:47:18] hmm. dpkg-buildpackage is failing [23:48:22] it installs! ship it! [23:48:23] :) [23:48:26] $wgVersion = '1.180beta1'; [23:48:27] Bah [23:48:29] I typoed [23:48:35] I saw that :) [23:48:59] I was afraid to mention it because I was worried you'd be compelled to fix it [23:49:02] :) [23:49:06] Reedy: where is your key? [23:49:13] not found on keyserver [23:49:21] pgp.mit.edu [23:49:32] Didn't realise there was anywhere to submit it to... [23:49:33] releasing tarballs is not very forgiving [23:49:51] I can though... 
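To make the packaging plan from earlier in the session concrete: the idea is that the etherpad-lite .deb depends on the backported nodejs, runs bin/installDeps.sh at build time so the npm modules ship inside the package, and carries the init script and a service user. A rough sketch of what that could look like; the file contents, version constraints, and paths below are illustrative, not the package that was actually built:

    # debian/control (sketch)
    Source: etherpad-lite
    Section: web
    Priority: optional
    Build-Depends: debhelper (>= 7.0.50), nodejs (>= 0.4), npm

    Package: etherpad-lite
    Architecture: all
    Depends: nodejs (>= 0.4), adduser
    Description: real-time collaborative editor running on node.js

    # debian/rules (sketch): bundle the npm modules at build time so
    # nothing has to be fetched from the network when the package installs
    %:
            dh $@
    override_dh_auto_build:
            ./bin/installDeps.sh

    # debian/postinst (sketch): create the service user the init script
    # (a wrapper around bin/safeRun.sh) will run etherpad-lite as
    adduser --system --group --home /usr/share/etherpad-lite etherpad-lite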
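And a condensed view of the backport-and-build loop used here for nodejs and npm, assuming devscripts and a pbuilder lucid chroot are already set up; the source URL and nodejs version strings are placeholders, while the npm .dsc name is the one used above:

    # download the newer nodejs source package from a later Ubuntu release
    dget -d http://archive.ubuntu.com/ubuntu/pool/universe/n/nodejs/nodejs_0.4.12-1.dsc   # placeholder URL/version
    dpkg-source -x nodejs_0.4.12-1.dsc
    cd nodejs-0.4.12
    # add a backport changelog entry targeting lucid
    dch -v 0.4.12-1~lucid1 -D lucid "Rebuild for lucid (Wikimedia labs)"   # placeholder version
    # -sa forces the orig tarball to be included next to the .dsc, which is
    # exactly what was missing from the first source package
    dpkg-buildpackage -S -sa -us -uc
    # build the binaries in a clean lucid chroot
    sudo pbuilder build ../nodejs_0.4.12-1~lucid1.dsc
    # npm goes through the same steps, e.g.
    sudo pbuilder build ../npm_1.0.93-0wm1.dsc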
[23:49:51] any small mistake and you're sure to have people complaining to you by email and IRC [23:49:59] heh [23:50:02] best to check everything twice [23:51:14] johnduhart: I made a change [23:51:14] I don't think brion ever made a release error, despite having really bad tools [23:51:18] he's a machine [23:51:19] johnduhart: can you run: sudo pbuilder build ../npm_1.0.93-0wm1.dsc [23:51:25] Sure [23:51:29] hexmode, try again now :P [23:51:39] Key block added to key server database. New public keys added: 1 keys added succesfully. [23:51:43] there's a reason that one of the first changes I made when I took over releasing was to remove the checksums from the emails [23:52:12] it's not really fun to have checksums of your mistakes sent to thousands of people [23:52:15] why is our stupid build server running something so old? [23:52:20] guessing you sign packages only now? [23:52:28] yes, just sign [23:52:39] Reedy: verified fingerprint: 1D98 867E 8298 2C8F E0AB C25F 9B69 B310 9D3B B7B0 [23:52:55] that way you can fix bad tarballs and pretend like there was never a problem [23:53:08] hexmode: tisk, not using an ssl irc connection? ;) [23:53:34] how you know wasn't mitm sed script changing the fingerprint we see ;) [23:53:50] Ryan_Lane: done [23:53:51] oh i pushed out a few buggy releases at least :) [23:53:52] troubled: No, I ssh tunnel to my server and its clear from there [23:53:56] thanks [23:54:01] but i think most of them had the right files in em ;) [23:54:09] hexmode: just teasing :) [23:54:10] but yes I was thinking of fixing that today [23:54:43] troubled: sokay... I only *pretend* to be paranoid [23:54:49] heh [23:56:28] there was a https page with gpg fingerprints for mediawiki releases
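On the release-signing side, the workflow being described comes down to publishing the signing key and shipping a detached signature with each tarball instead of mailing checksums. A minimal sketch; the key ID is the tail of the fingerprint quoted above and the tarball name is illustrative:

    # publish the public key so downloaders can fetch it
    gpg --keyserver pgp.mit.edu --send-keys 9D3BB7B0
    # anyone verifying should compare this against a fingerprint obtained out of band
    gpg --fingerprint 9D3BB7B0
    # sign the release tarball with a detached, ASCII-armoured signature
    gpg --armor --detach-sign mediawiki-1.18.0beta1.tar.gz
    # verification needs only the .asc and the tarball
    gpg --verify mediawiki-1.18.0beta1.tar.gz.asc mediawiki-1.18.0beta1.tar.gz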
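And, circling back to the SOCKS proxy setup mentioned earlier for reaching instances on the private labs network, a minimal recipe; the shell username is a placeholder, and the pad hostname is the one used above:

    # open a dynamic (SOCKS) forward through the bastion host
    ssh -D 8080 <shell-user>@bastion.wmflabs.org
    # point the browser at localhost:8080 as a SOCKS5 proxy; with DNS resolved
    # through the proxy, private names like pad1.pmtpa.wmflabs just work
    # the labs resolver can also be queried directly to check an entry:
    dig @virt1.wikimedia.org pad1.pmtpa.wmflabs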