[13:40:00] Does http://npmjs.org/install.sh recognize the PREFIX env variable (I want to install in /usr/local)?
[13:41:06] * grondilu just tried
[13:41:19] no it doesn't.
[13:41:57] the latest node.js tarball should come with npm if I'm not mistaken
[13:42:18] so building node.js with --prefix might obviate the need to install npm separately
[13:43:28] ok, it's just that I had just installed node via the normal debian install. I'll remove it and install from the tarball.
[13:44:43] *nod*
[13:47:58] * grondilu compiles nodejs
[13:55:49] Nikerabbit: ping ? :)
[14:05:33] hashar: oink oink
[14:06:01] will you be available this afternoon to talk about migrating your twn.net scripts from subversion to git ?
[14:07:03] hashar: not necessarily, I have to pack for tomorrow's trip
[14:10:10] Nikerabbit: do you at least know which scripts use svn besides Translate/scripts/sync-group.php and the update-mediawiki-ext shell script?
[14:10:17] Nikerabbit: I haven't found any others
[14:16:57] twn:/home/betawiki/bin$ grep svn * -l | xargs
[14:16:58] bdiff bupdate checkmsg export-freecol generate-translation-documentation grr lastcommit sandupdate stats-mediawiki svndiff update-branches update-europeana update-freecol update-fudforum update-kiwix update-mediawiki-ext update-mwext update-nocc update-okawix update-openimages update-pywikipedia update-toolserver update-wikia update-wikiblame warnings-mediawiki wikiupdate xupdate
[14:18:36] even sync-group doesn't use it directly, does it?
[14:18:56] and what is not listed is a script that goes through every repo and commits as the l10n-user
[14:22:44] dohh
[14:23:10] looks like most of them are updating scripts for non-WMF wikis anyway
[14:23:39] sync-group does use svn to get the last time a file was changed in the repo
[14:37:43] au: ok I have compiled everything and installed the modules, but I don't see any "extensions/VisualEditor/test/parser" directory anywhere :-/
[14:37:58] did you check out VisualEditor from svn?
[14:38:10] ah, my bad
[14:38:14] as in "svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/VisualEditor/"
[14:38:41] yeah, silly of me.
[14:38:45] np at all :)
[14:40:18] isn't there a git repo instead? I'm not used to svn
[14:41:26] hang on. Will I need to have mediawiki installed??
[14:41:50] no
[14:41:58] * grondilu is relieved
[14:45:29] grondilu: we are in the process of migrating to git
[14:45:40] good
[14:45:41] grondilu: should be for March 21st if all goes well
[14:45:53] i.e. next wednesday
[14:51:13] jeez this svn checkout is endless
[14:53:43] hmm, did you check out only VisualEditor? the entire phase3/extensions tree would take considerably more time
[14:54:44] damn it
[14:55:29] I don't know, I just followed instructions: svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3
[14:56:05] ohh no
[14:56:13] you are downloading the whole english wikipedia!
[14:56:15] yeah. might want to Ctrl-C it and check out the partial tree above
[14:56:31] http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3 is the whole of MediaWiki
[14:56:40] (as in "svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/VisualEditor/")
[14:57:06] it is roughly 150-200 MB, which includes some funny PHPUnit tests and various language translations
[14:57:28] ok!
[14:57:37] * grondilu starts again
[14:57:49] -> afk a bit, bbl
[15:04:29] grondilu: let me know if I can help
[15:04:58] do I have to run the update --set-depth infinity line?
[15:05:36] which instructions are you following ?
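[Editor's note: a minimal sketch of the two fixes arrived at above -- building node.js from the tarball with a custom prefix (since install.sh ignores PREFIX), and checking out only the VisualEditor extension instead of the whole phase3 tree. The tarball version is a placeholder, and these 2012-era svn URLs no longer resolve.]

    # Build node.js (which bundles npm) into /usr/local, instead of using
    # install.sh; "node-v0.6.x" is a placeholder for the actual version.
    tar xzf node-v0.6.x.tar.gz && cd node-v0.6.x
    ./configure --prefix=/usr/local
    make && sudo make install

    # Check out only the VisualEditor extension; the full phase3 tree is
    # roughly 150-200 MB and takes far longer.
    svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/VisualEditor/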
[15:05:49] by default svn does use an infinite depth
[15:06:07] https://www.mediawiki.org/wiki/Parsoid/HTML5_DOM_with_microdata
[15:06:50] because when I run 'node parserTests.js' it does not see the modules I had installed (jquery and so on)
[15:07:06] * grondilu has no idea what this depth thing means
[15:07:08] have you installed them globally ?
[15:07:18] I don't know. I guess not.
[15:07:33] to install node.js modules, you will use 'npm'
[15:07:40] such as: npm install jquery
[15:07:47] Yeah. I did that.
[15:07:55] by default that will install 'jquery' in whatever current directory you are in
[15:08:02] for example, your homedir
[15:08:07] It did.
[15:08:08] that is used to easily bootstrap an application
[15:08:30] I personally install the node modules globally with npm install -g jquery
[15:09:02] do I have to reinstall or is there a command to set global?
[15:09:22] I think you have to reinstall
[15:09:51] anyway, if you have installed the module locally in the working copy of VisualEditor/tests/parser/ that should work
[15:10:55] I tried a symlink and it didn't work. Anyway, I'll reinstall
[15:11:00] grondilu: to get the path where npm installs modules globally, run: npm root -g
[15:11:33] then in your bashrc or something: NODE_PATH=/usr/local/lib/node_modules :-]
[15:12:08] ok, it gives a correct directory (the one in /usr/local)
[15:14:39] hum, I still get Error: Cannot find module 'diff'
[15:14:42] I don't get it
[15:14:57] have you installed that npm module ?
[15:15:01] npm install -g diff
[15:15:04] probably
[15:15:05] I did.
[15:15:36] it should be in the directory output by: npm root -g
[15:15:42] ll `npm root -g`
[15:15:44] grr
[15:15:48] It is there!
[15:15:51] good
[15:16:02] I don't understand
[15:16:19] so that must be because node can not find it
[15:16:29] is NODE_PATH set ?
[15:16:43] no
[15:16:47] cause you might be running the command in a term that does not have it set yet
[15:16:50] since you only changed the bashrc
[15:16:52] :)
[15:16:54] makes sense
[15:17:22] I have literally spent 2 days figuring out all of this :-))))))
[15:17:53] yeah, it happened to me too. I always forgot to use a new shell
[15:18:30] it's kind of obscure though
[15:19:43] still, doesn't work.
[15:20:13] I guess I'll set NODE_PATH manually
[15:20:59] NODE_PATH="/some/path/to/node_modules" node parserTests.js
[15:21:39] jeez, now: Error: Cannot find module 'jshashes'
[15:21:48] * grondilu installs jshashes
[15:21:49] good !
[15:22:21] I have updated the README a few sec ago to mention 'request' and 'jshashes' being needed https://www.mediawiki.org/wiki/Special:Code/MediaWiki/113723
[15:23:02] ok now it complains about not finding '../../../../tests/parser/parserTests.txt', but it tells me SUMMARY:
[15:23:05] Passed 0 of 0 tests... ALL TESTS PASSED!
[15:23:20] congratulations!
[15:23:36] the parserTests.txt file comes from MediaWiki
[15:23:38] aren't there any default tests?
[15:24:19] the aim is to build a js parser that passes all those tests
[15:24:25] they are basically defining the language
[15:24:35] * grondilu runs make test
[15:24:50] (well, the real definition is the PHP implementation :-D )
[15:25:25] ok ok but where is this parserTests.txt?
[15:25:36] in MediaWiki :-]
[15:25:43] the trunk/phase3 stuff
[15:26:03] http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3/tests/parser/
[15:29:15] ok I've put the file where it is expected. Tests are running.
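[Editor's note: the troubleshooting above condenses to a few commands; a sketch assuming npm's global root is /usr/local/lib/node_modules, as in the log. The key gotcha is that a NODE_PATH added to ~/.bashrc only takes effect in newly opened shells.]

    # Install the parser test dependencies globally (the modules named in
    # the conversation and the updated README):
    npm install -g jquery diff request jshashes

    # Find the global module root and point node at it; export it in the
    # current shell too, not just in ~/.bashrc:
    npm root -g                    # e.g. /usr/local/lib/node_modules
    export NODE_PATH=/usr/local/lib/node_modules

    # Fetch MediaWiki's canonical test cases and copy the file to the
    # relative path parserTests.js expects:
    wget http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3/tests/parser/parserTests.txt
    mkdir -p ../../../../tests/parser && cp parserTests.txt ../../../../tests/parser/
    node parserTests.js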
[15:31:26] => 268 total passed tests, 409 total failures
[15:31:29] \o/
[15:31:42] thanks
[15:32:00] happy hacking 8-)
[15:34:53] well, I wonder if it won't be this parserTests.txt that ends up being most useful to me
[15:35:34] anyway, gotta go. thanks again.
[17:11:29] For the meeting in 49 minutes: http://etherpad.wikimedia.org/FeaturesTeam2012-W011
[17:11:30] :-)
[17:17:47] 20% checkin now-ish
[17:20:20] http://toolserver.org/~robla/crstats/ <- ouch
[17:21:17] http://www.mediawiki.org/wiki/20_percent
[17:22:46] (sorry, got disconnected just before :15 w/o noticing)
[17:23:29] rmoen: looks like you may be the only 20% person who is actually online right now
[17:23:48] robla: I'm here
[17:23:55] rmoen: are you in a good position to help out with the code review backlog?
[17:24:23] robla: Yes, thanks for the reminder. I've been waiting for this guy to show for an interview
[17:24:41] hrm
[17:24:46] (that doesn't bode well for the interviewee)
[17:25:15] robla: yeah, this is his second time being late / canceling… I'll gladly do some code review
[17:25:21] rmoen: if you get a chance for patch review, this regression has a patch: https://bugzilla.wikimedia.org/show_bug.cgi?id=34972
[17:26:03] hexmode: looking
[17:26:15] herrrrow!
[17:26:24] TrevorParscal: !!
[17:26:25] TrevorParscal: heya
[17:26:36] repeating from before: http://toolserver.org/~robla/crstats/ <- ouch
[17:27:10] the linear regression from Feb 4 or so looks pretty awful
[17:27:48] looks good to me, what's wrong?
[17:28:21] I guess it's not as bad as, say, Sept or Oct of last year, but...
[17:28:45] TrevorParscal: I suppose you're right. The colors are nice. I pulled them from the tango palette :)
[17:28:57] :)
[17:29:10] anyhoo
[17:29:23] so there's like 336 revs to be tackled
[17:29:32] TrevorParscal: have time to dive in on that?
[17:30:07] * TrevorParscal notices nothing has been tagged for him to review
[17:30:42] TrevorParscal: if you could get through krinkle's stuff, that'd be a huge win
[17:31:00] so, what's the deal with teams doing cross-review
[17:31:07] hey hexmode could you add the new bug triage to https://www.mediawiki.org/wiki/Bug_management/Triage ? (the textual list)
[17:31:21] TrevorParscal: que?
[17:31:23] RoanKattouw: I guess as that script is still on abwiki, I should spawn some more instances
[17:32:01] robla: well, seems like there would be far fewer revisions left "new" if teams would just cross-review more often
[17:32:14] sumanah: hrm... had it all written up but didn't click save.
[17:33:42] true... need to do that, too
[17:36:00] Ah yes, CR backlo
[17:36:03] g
[18:04:28] I messed up the URL for our meeting: http://etherpad.wikimedia.org/FeaturesTeam2012-W011
[18:06:28] Let's start in 5 minutes
[18:06:39] VE is meeting with Howie
[18:11:15] Yeah, sorry about that
[18:13:19] OK we're done
[18:14:58] Okay, are these guys online? :-)
[18:15:08] http://etherpad.wikimedia.org/FeaturesTeam2012-W011
[18:16:06] Roan and Rob, I also marked your time off at the bottom of the pad, btw.
[18:16:15] I mean Roan and Trevor. :-)
[18:16:24] awesome
[18:16:25] see that
[18:17:01] Maybe next week I'll clean up the "List of Project Teams"
[18:17:05] Also, general announcement: there is a DEADLINE for submitting Wikimania proposals THIS SUNDAY
[18:19:26] Yes, also: we have to complete http://www.mediawiki.org/wiki/Wikimedia_Engineering/2012-13_Goals next week. This open document provides guidance for the engineering budget next year.
[18:20:11] So if you have ideas of what we should do next year and why we're awesome and important, it'd be nice to add to that so they budget enough money to pay us. :-)
[18:20:24] Okay, let's start.
[18:20:42] Visual editor.
[18:20:52] tewwy: Aren't Rachel and Dana gonna schedule a meeting for that goals/budget thing for Visual Editor (and other projects)?
[18:20:52] Any updates other than what you added to the etherpad? :-)
[18:21:18] We are interviewing candidates for the VE Product Analyst position
[18:21:22] Yes, they should, but I'm anticipating time is going to be tight, so just in case, it doesn't hurt to have as much of it filled out as possible.
[18:21:23] http://www.mediawiki.org/wiki/Wikimedia_Engineering/2012-13_Goals#Visual_Editor
[18:21:44] I won't say too much about that in this public channel, just that it's moving forward
[18:23:26] Okay, that was an interesting blip.
[18:24:12] Yeah, wtf
[18:24:22] Are there any updates on the parser, or should we just go on to ACW?
[18:24:31] TrevorParscal: You have a half sick day and a half WFH day this Thursday, right?
[18:24:42] TrevorParscal_: RoanKattouw_ TrevorParscal: You have a half sick day and a half WFH day this Thursday, right?
[18:25:16] yeah, that's the idea I guess
[18:25:31] but I may or may not be useful the 2nd half of thursday
[18:26:11] because I will have just finished a 3-hour dental appointment and will still be on valium
[18:26:11] Wow.
[18:26:12] I've just changed the Etherpad so it now says you're gone starting Thursday
[18:26:20] Yeah, long live the office networ
[18:26:22] k
[18:26:40] Yes, Trevor has retreat this week, and next week he will be in massive pain.
[18:26:49] RobLa needs my help with code review, so if I'm available on thursday afternoon, that's what I will be doing
[18:30:48] Okay, since VE is moving right along and gwicke isn't here for the parser stuff, I'll get an update later; let's do Editor Engagement
[18:31:07] Any updates on ACW or NPT? http://etherpad.wikimedia.org/FeaturesTeam2012-W011
[18:31:18] BTW, you guys are all in there twice, once for ACW and once for NPT
[18:32:01] ACW is still under review
[18:32:07] Also feel free to add stuff to http://www.mediawiki.org/wiki/Wikimedia_Engineering/2012-13_Goals#Editor_engagement_features when you have time before the meeting that has TBD scheduled
[18:32:53] How about the project formerly known as NPT?
[18:34:10] Hmm, I'll ping Ian and Ryan to update that stuff. You can update your fields, bsitu. :-)
[18:34:27] Ryan is out this week
[18:34:45] this is what happens when I don't look at the calendar
[18:35:19] Well, it's not on there
[18:35:36] No it isn't. Grrrr. :-)
[18:35:49] The wrath of HR shall descend on kaldari
[18:36:38] Luckily Howie and co are on them, so they don't need much love/hate from me. :-)
[18:36:42] AFTv5? http://etherpad.wikimedia.org/FeaturesTeam2012-W011
[18:36:59] Ah, yeah
[18:37:10] So we did another deployment last week
[18:37:38] It was the first one in like a month or more; fortunately there wasn't too much code, cause they only had one person working on it
[18:37:50] On my end, Fabrice wants to move to labs sometime because he feels testing is slow, so I'm looking into it. But since labs is unstable, I'm not bugging RyanL very hard.
[18:37:58] There was a mini-deploy (two minor changes) this week, and we'll have another deployment next week
[18:38:05] Yes, I should move stuff over to labs
[18:38:10] Hopefully I'll get to that this week
[18:38:19] By which I mean maybe on Thursday
[18:38:44] RyanL wants this to happen too so he can shut down prototype
[18:38:44] Yeah, don't rush it, but if you can, that'll make Fabrice happy.
[18:39:05] I'll horse-trade this with the proxy thingy he was gonna give me two weeks ago
[18:39:08] Yes, prototype is unsupported… which really means you're doing all the support. :-D
[18:39:29] Fortunately I'm not doing actual support for prototype; they haven't run into any problems for months
[18:39:36] But yeah, labs will be better
[18:40:13] Yeah, I gather the only issue is the config isn't the same?
[18:40:28] That may or may not be an issue
[18:40:50] The deployment-prep project has solved this problem much better than I did on prototype
[18:41:00] I should be able to piggyback on that
[18:42:15] Okay, sounds good.
[18:42:23] BTW, which team is andrew garret on?
[18:42:48] EE
[18:43:03] good, I put him in the right field.
[18:43:03] grmbl
[18:43:05] :-)
[18:43:31] any updates from Multimedia, Education, or Fundraising? http://etherpad.wikimedia.org/FeaturesTeam2012-W011
[18:43:55] My only update is Multimedia is blocked on QA, so I'm trying to get them unstuck.
[18:44:41] ping JeroenDeDauw
[18:44:53] http://education.wmflabs.org/wiki/Main_Page seems to be getting lots of testing from the WP edu folks
[18:45:02] at some point it'll need a bulk code review
[18:45:10] Not it
[18:45:20] hee
[18:46:15] Eloquence: yeah... AFAIK no major new features still need to be done, so now would be a good time to start code review
[18:47:05] JeroenDeDauw, was there a UI pass at some point? I know frank wanted to do one
[18:47:35] Eloquence: not yet - he indeed wants that. I wrote him about that and CR yesterday, but he has not gotten back to me yet
[18:47:53] ok, can you cc me & terry into that thread
[18:50:21] Well, I don't want to keep you. Remember to update http://etherpad.wikimedia.org/FeaturesTeam2012-W011 if you haven't already by the end of day today, and I'm going to ping some of you by email who weren't here / got kicked off due to network issues. :-)
[18:51:18] Eloquence: sure, done
[18:57:56] thanks jeroen
[18:57:57] ttfn
[20:39:13] Ryan_Lane: is now a good time to get the Gerrit project owner groups settled? I believe I just need to get certain privs from you so I can set up these groups -- basically analogous to the commit access process, so I will run it
[20:39:27] https://bugzilla.wikimedia.org/show_bug.cgi?id=35148
[20:55:50] Ryan_Lane: ping ^^
[21:51:30] sumanah: guillom I mentioned to robla last week about trying to standardise our mediawiki release email notifications somehow... e.g. format/layout, what we use as separators etc
[21:51:41] this sounds like a good idea
[21:51:55] Not something that needs fixing right now (obviously), but sorting something out in the near term would be nice
[21:52:05] nod.
[21:52:31] guillom: can I ask you to put that on some sort of "we should eventually get round to this" list for engineering project documentation?
[21:53:17] I'm not sure there's much to do besides pasting the template on a subpage of the release checklist, honestly.
[21:53:23] Or is there?
[21:53:35] Reedy: do you think that would suit?
[21:53:52] guillom: I think it would be good enough for a first pass, let's say
[21:54:05] and maybe that'll be all that's needed
[21:54:23] It might be something like that, with the headers/footers to sort
[21:54:33] I do wonder if we could get away with sending wiki markup in a plain text email
[21:55:12] http://lists.wikimedia.org/pipermail/mediawiki-announce/2012-January/000107.html
[21:55:14] vs http://lists.wikimedia.org/pipermail/mediawiki-announce/2011-November/000105.html
[21:55:25] Roan or Trevor in the house?
[21:55:46] There is a mix of ways of creating headers and stuff
[21:57:12] Krinkle: they're at Wikia I think
[21:57:29] I have an awesome plan for gadgets 3.0 (three)
[21:58:02] but I'll save it for later :)
[22:21:15] Ryan_Lane: what does it take to create a new group in LDAP ?
[22:21:16] I would need a group for the MediaWiki core repository to allow some volunteers to merge submitted commits, in addition to regular staff and contractors :-)
[22:21:30] we don't necessarily need to do so in ldap
[22:21:48] should this be "mediawiki core reviewers"?
[22:22:00] yup
[22:22:26] so that would be people from Platform Engineering + ops + some trusted volunteers
[22:22:36] hm, you can probably make the group
[22:22:48] I have a MediaWiki group which includes wmf + ops
[22:22:49] go to Admin->Groups
[22:22:55] create a new group
[22:23:11] then I can manually add volunteers there. So that technically works
[22:23:27] I was more wondering if we should have all of that in LDAP or if having them listed only in Gerrit is fine
[22:23:28] an ldap group isn't any better
[22:23:33] Ryan_Lane: hashar - I have just sent you a very unfinished email about this topic
[22:23:42] Ryan_Lane: hashar just FYI. Will finish & send it today
[22:23:51] I say that because the group won't be used outside of gerrit
[22:23:55] sumanah: have a look at https://bugzilla.wikimedia.org/show_bug.cgi?id=35148#c3 too
[22:24:05] and the ldap group would likely be more difficult to modify
[22:24:07] sumanah: an access rights matrix would be great :)
[22:24:33] Ryan_Lane: you got me at "won't be used outside of gerrit"
[22:24:41] * Ryan_Lane nods
[22:24:44] so let's just manually add volunteers to the Gerrit group
[22:24:49] yep
[22:24:53] I really hope you like beers
[22:24:57] heh
[22:24:58] I do!
[22:25:01] are you going to be in berlin?
[22:25:06] cause you are not going to drink any water next time we meet :-]
[22:25:09] beers are cheap there, so it works out even better
[22:25:15] hashar: read my bit where I talk about how we are only going to start with people who have cluster access, as Gerrit Project Owners for that core master branch
[22:25:41] that's what Tim, Aaron, Chad, Rob decided
[22:25:48] Ryan_Lane: will most probably be in berlin :)
[22:25:53] so, we already have an ops and a wmf group
[22:25:58] we can then add people based on certain criteria, applied consistently and transparently
[22:26:02] you should include those groups
[22:26:05] rather than the people
[22:26:10] it's less to manage
[22:26:39] hey hashar
[22:26:42] join our call again?
[22:26:47] Ryan_Lane: yeah, we have ops and wmf already
[22:26:51] sumanah: you still owe me a csv ;)
[22:27:20] Ryan_Lane: either I use my terrible Python to scrape the people who have email in their svn USERINFO, or you do.
[22:27:21] sumanah: was listening to music in x2004 :/
[22:27:29] hashar: got the G+ invite now?
[22:27:37] sumanah: rob did. I am in
[22:27:44] can someone else give me a csv?
[22:27:48] let me look at userinfo
[22:27:58] Ryan_Lane: it's faster if you do it, but if you absolutely don't want to do it, I will try to get someone to, or I will sweat it out
[22:28:36] hm
[22:28:48] if they are in a consistent format I can probably do it
[22:29:25] * Ryan_Lane sighs
[22:29:32] Ryan_Lane: they should all be like https://www.mediawiki.org/wiki/USERINFO_file
[22:29:35] can someone else do it? this is more than a one-minute script
[22:29:55] oh goddammit some are obfuscated
[22:30:12] of course they are :D
[22:30:24] sumanah: we have a script to parse the userinfo files
[22:31:43] hashar: oh! could you pull out realnames + svn usernames + emails for records where they have each of those?
[22:32:01] ok if that ends up only capturing like 100-200 of the 400 records
[22:32:06] (approx)
[22:32:43] hashar: this is so we can autocreate those Gerrit accts to avoid having to make as many one-offs manually.
[22:33:27] a lot of accounts in USERINFO are inactive
[22:33:44] aren't people able to create an account and submit right away?
[22:33:59] or do we need to explicitly allow them?
[22:35:23] we should create them all
[22:35:36] people can't self-register right now
[22:38:39] MediaWiki::USERINFO on CPAN by Avar
[22:39:05] still needs some magic foo
[22:39:09] though
[22:40:04] hashar: use your best judgment
[22:40:22] hashar: either use MediaWiki::USERINFO or do it with a more hacky custom script or whatever
[22:40:44] hashar: I'm with Ryan, and Chad agrees - just autocreate as many as possible. Break down the barriers
[22:41:07] not going to happen tonight though
[22:41:26] someone needs to make a csv, then unobfuscate the email addresses
[22:41:32] then I'll script their creation
[22:43:52] ok, hashar, if you get all those records that have some kind of "realname" or whatever & have an email, and turn it into a CSV, then I will do the gruntwork of unobfuscating the emails and giving that improved data to Ryan.
[22:43:57] hashar: tomorrow :-)
[22:44:36] I must be missing something
[22:45:06] haven't svn committers been given accounts in labs already?
[22:45:09] hashar: No.
[22:45:31] they have LDAP accounts, but not labs accounts
[22:45:39] they also don't have passwords
[22:45:53] they need to be "linked"
[22:46:05] and to do that you will need the email, right?
[22:46:08] after we link all of the users we can use the web form from then on! :D
[22:46:13] yep
[22:46:50] I'll be glad to not have to use the scripts anymore
[22:46:57] the web form automatically emails them too
[22:47:02] one less step
[22:47:17] Some of the LDAP accounts don't have email addresses associated with them, though
[22:47:22] It would be nice to have some "new user" welcome form that tells them what they need to do next
[22:47:35] sumanah: then they don't get accounts
[22:47:43] the web form will error out if the user already exists
[22:48:00] then we'll have to use the scripts
[22:48:03] Ryan_Lane: go ahead & file a quick bug re the text of the welcome form & assign it to me?
[22:48:09] sure
[22:48:27] I think it's possible in mediawiki to do this, right?
[22:48:31] probably
[22:48:35] what component?
[22:48:39] I don't know
[22:48:41] heh
[22:48:52] I'll put it in the labs component, then
[22:49:13] Ryan_Lane: I missed the "after we link all of the users we can use the web form from then on!" bit of the conversation, so, I was still stuck on what hashar has to do - sorry for talking in circles.
[22:49:29] it's ok
[22:49:43] you can create new users via the mediawiki interface
[22:49:49] rather than needing access on formey
[22:49:58] and it'll email them a password
[22:50:22] I've been avoiding that for now because we have people that are unlinked
[22:52:53] sumanah: ok. added a bug for you
[22:53:01] damn it
[22:53:02] maybe not
[22:53:11] what's your assignee address in bugzilla?
[22:53:28] Ryan_Lane: sumanah at panix
[22:53:33] ah ok
[22:54:00] I get that, I use my personal email address for bugzilla too
[22:54:43] Ryan_Lane: it is because we hire so many people from the commmuuuuuunity
[22:56:42] heh
[22:58:38] Ryan_Lane: I have a space-separated file :-]
[22:58:45] heh
[22:58:47] that's harder
[22:58:54] Chad already did the conversion job a few months ago
[22:58:57] for git migration purposes
[22:58:58] since some usernames may have spaces
[22:59:12] so we have: account, first name last name, email
[22:59:16] same with obfuscated email addresses
[22:59:31] I can't parse that
[23:01:56] yeah, we need *comma*-separated.
[23:02:05] conf call ended
[23:02:13] bahh
[23:02:25] * hashar reviews account names
[23:03:46] * hashar launches vim, :%s/ /,/ %s/ here we have
[23:04:11] do we want quotes around names? :-)
[23:04:31] hashar,Antoine Musso,
[23:08:44] heh: http://www.linuxjedi.co.uk/2012/03/changes-coming-to-gerrits-style.html
[23:09:05] openstack's gerrit being skinned some
[23:21:51] Mar 13 23:17:03 i-0000019e automount[4935]: syntax error in nsswitch config near [ syntax error ]
[23:21:53] hmm
[23:22:43] I suppose there is a cached dns entry someplace that needs to time out
[23:23:57] bed time for good. see you later :)
[23:24:02] night
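[Editor's note: a sketch of the USERINFO-to-CSV conversion being worked out at the end of the log. It assumes one record per line in the form "account firstname lastname email" and the common "foo at example dot org" obfuscation style; the filenames are placeholders, and records whose names or addresses don't fit the pattern would still need the manual review the participants expected.]

    # Unobfuscate the common " at " / " dot " pattern first, so each
    # address collapses back into a single space-free field:
    sed -e 's/ at /@/g' -e 's/ dot /./g' userinfo.txt > userinfo.clean

    # Then turn the space-separated records into CSV, quoting the real
    # name as suggested at 23:04 ("do we want quotes around names?"):
    awk '{ printf "%s,\"%s %s\",%s\n", $1, $2, $3, $4 }' userinfo.clean > userinfo.csv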