[00:15:12] TimStarling: and during that 5-min break, you can deploy something! :-) [00:15:20] * sumanah may or may not be kidding, she is not sure [00:15:25] ok, dinnertime [08:32:31] New patchset: Hashar; "timestamp console output for several jobs" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4320 [08:32:50] New review: Hashar; "(no comment)" [integration/jenkins] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/4320 [08:32:52] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4320 [11:23:17] morning ^demon :) [11:23:23] <^demon> Morning. [11:29:09] Alright, I'm about to publish the March engineering report, so if you have any last-minute changes, now is the time: https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/March [11:36:20] my bit looks fine, you hardly touched it [11:42:37] apergos: that's because it was already good :) [11:42:47] heh [11:43:02] flattery will get you, if not everywhere, pretty darn far... [11:45:56] apergos: because you're so knowledgeable... [11:46:09] uh oh [11:46:17] I need a developer to process a code review for me of a patch that werdna made for the abuse filter [11:46:22] who do you recommend? :) [11:46:29] heh it's not /that/ bad ;) [11:47:21] the code review is at https://gerrit.wikimedia.org/r/#change,3435 [11:47:44] the current status is "Review in Progress" but not sure by whom [11:48:21] if anyone clicks through and looks at a diff I think it indicates that [11:49:26] ah [11:50:53] I guess roan is the person [11:54:10] <^demon> vvv: So Scribunto is the new name for Scripting? [11:55:28] ^demon: yes [11:55:40] That was Tim's idea [11:56:03] <^demon> Yeah that's cool. It would be a bit easier for me if you do that before we migrate on Friday. Could you rename it today? [11:56:18] ^demon: you mean, SVN rename? [11:56:26] <^demon> yep. [12:03:39] ^demon: done [12:04:26] <^demon> Awesome, thanks. 
[12:33:09] <^demon> JeroenDeDauw: I'm setting up permissions for the extensions being migrated tomorrow. Do you know if Yaron has a gerrit account yet? I can't find him. [12:34:02] ^demon: then I guess he does not have one [12:34:38] <^demon> That's cool. Extension groups can manage their own users, so you can add him whenever he's registered. [12:40:13] :o [12:40:25] ^demon: is it long list? [12:40:49] <^demon> 36 extensions getting moved tomorrow :) [12:40:51] it'd be nice, JeroenDeDauw, if you could leave a note at https://www.mediawiki.org/wiki/Git/Gerrit_project_ownership/Archive for the record when you add someone [12:41:09] JeroenDeDauw: because it's nearly impossible to find the audit trail in Gerrit itself [12:45:32] <^demon> Actually, 2 less. Andrew must not've noticed that I moved AbuseFilter and LQT already. [12:46:17] where's the list? [12:46:36] https://www.mediawiki.org/wiki/Git/Conversion/Extensions_queue [12:48:05] my wish is to get l10n commits running regularly by end of this week, only gets more important when more extensions move to git [12:49:07] My understanding was that Ryan & Chad had gotten that started; was I wrong? [12:50:59] ^demon: sumanah: right ok [12:51:41] Trying to push a commit to core/master on my new machine and am getting this very helpful error when doing "git review" [12:51:47] Errors running git rebase -i remotes/gerrit/master [12:51:47] /usr/lib/git-core/git-rebase--interactive: 1: arithmetic expression: expecting primary: "+" [12:53:10] <^demon> That's bizarre. [12:53:22] WHAT [12:53:38] <^demon> Nikerabbit: I've been working on it, but I'm not sure of the cause just yet. [12:53:44] (Gerrit finally rebels against addition, after the debasement and abominations of +1 and +2) [12:54:05] <^demon> Huh? 
[12:54:05] Sigh, this takes all the fun out of committing to MediaWiki :/ [12:54:46] ^demon: I was joking re "arithmetic expression: expecting primary: "+"" [12:55:41] <^demon> JeroenDeDauw: Try doing `git rebase --abort` and then `git review -v` [12:55:49] <^demon> And pastebin the output. [12:58:35] ^demon: http://dpaste.org/PxQcn/ [12:59:20] <^demon> That's really weird--I'm not sure what's going on there. Is this a followup to an existing change or a new commit? [13:01:02] ^demon: fresh checkout, made new branch, made change, committed, tried git review, got this stuff [13:01:12] ??? [13:01:21] that is definitely a bug in your git install [13:01:29] I already reinstalled it [13:01:36] Running Ubuntu 11.10 [13:01:40] No special stuff [13:01:43] <^demon> Yeah, that's what I'm thinking. Try submitting it with `git review -R` and let someone else rebase if need be. [13:01:45] just apt-get install git [13:01:46] have you tried simply: git rebase -i remotes/gerrit/master [13:01:58] make sure git is what you want: which git should give /usr/bin/git [13:02:01] not some alias [13:02:32] <^demon> sumanah: I'm going over the 34 extensions for migration tomorrow. I've already created all the groups/repos, now I'm assigning rights. [13:02:39] Oh wait, I did install the legit thing on recommendation of some wikidata person, might be messing stuff up [13:02:58] hashar: how do I see what "git" points to? [13:03:09] ^demon: got it. 
I'll wait till you finish that before taking care of Victor (CentralAuth) and the other stuff in the Gerrit project ownership queue [13:03:10] oh nvm [13:03:21] JeroenDeDauw: anyway the command is the shell script at : /usr/lib/git-core/git-rebase--interactive [13:03:23] Yeah it's /usr/bin/git [13:03:58] and something in that shell sc [13:04:11] and something in that shell script is incorrect for you :( [13:04:16] hashar: if I do "git rebase -i remotes/gerrit/master", it exits with the same error [13:04:37] Well, I got git 1.7.5.4 [13:04:42] JeroenDeDauw: yup the issue is in /usr/lib/git-core/git-rebase--interactive [13:05:43] you could try editing that shell script and at the very top add -x to the shell command. Something like: #!/bin/sh -x [13:05:50] that might print out every commands as they are run [13:08:03] hashar: http://dpaste.org/eodmn/ [13:08:29] I googled for the error, and GREP_OPTIONS also showed up there [13:11:00] total=$(($new_count+$(sane_grep -c '^[^#]' < "$todo"))) [13:11:04] that is the line that cause the issue [13:11:10] I think [13:11:17] in the mark_action_done() {} shell method [13:11:19] function [13:12:32] maybe because the grep above did not get any value :D [13:13:48] JeroenDeDauw: maybe because /bin/sh is bound to dash instead of bash [13:14:08] you could try replacing the first line in /usr/lib/git-core/git-rebase--interactive to make sure you use bash: #!/bin/bash -x [13:14:16] also /bin/sh -version [13:14:22] should show you are using dash [13:15:58] That gives me: /bin/sh: Illegal option -r [13:17:07] I get the same when doing "dash -version" but not when doing "bash -version", so I guess it's dash :p [13:17:11] <^demon> vvv: Scribunto is going to be deployed to wmf sites eventually, right? [13:17:24] ^demon: I hope so [13:17:50] <^demon> Then open push isn't an option. [13:17:54] <^demon> Has to be push-for-review [13:18:13] Can't we change that later? 
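The two variants of the error JeroenDeDauw hits (dash's "expecting primary" here, and bash's "operand expected" once he switches interpreters) are both what arithmetic expansion reports when an operand is missing, which fits hashar's reading of the sane_grep line: the command substitution produced no output at all. A minimal reproduction sketch, run in a child sh so the expansion error cannot kill the calling shell:

```shell
# The failing line in git-rebase--interactive is roughly:
#   total=$(($new_count+$(sane_grep -c '^[^#]' < "$todo")))
# If the command substitution yields nothing (not even "0"), the shell
# is left with a dangling "+" and the arithmetic expansion fails.
out=$(sh -c 'new_count=0; echo $(( $new_count + ))' 2>&1) || true
# dash phrases the failure "expecting primary", bash "operand expected":
echo "$out" | grep -Eq 'expecting primary|operand expected' && echo "reproduced"
```

Either message therefore points less at dash-vs-bash and more at whatever made the embedded grep print nothing.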
[13:18:26] I thought we could [13:19:08] Well, now I get a different version of the same error (after switching the thing to bash): /usr/lib/git-core/git-rebase--interactive: line 175: +: syntax error: operand expected (error token is "+") [13:21:55] <^demon> vvv: It can be, yes. [13:22:32] JeroenDeDauw: probably still at the same line [13:24:10] hashar: yeah... [13:25:37] I can rebase without any problems when doing stuff with some of my github repos... [13:26:50] sure [13:27:05] JeroenDeDauw: maybe try out with a new fresh user to discard the environnement [13:28:21] <^demon> sumanah: I'm done in gerrit for now, feel free to do whatever. [13:38:46] ^demon: I shall, in mere moments! I will set up groups, As You Taught Me, and add members to that, and then have the projects inherit from the groups [13:39:46] <^demon> I don't know if any groups need creating, I believe I already setup CentralAuth as such (so just need to add the new member). [13:39:57] <^demon> All of Jeroen's I took care of en masse when I was creating new repos just now. [13:41:38] oh great [13:41:39] :) [13:42:24] <^demon> If we're adding gwicke to 'mediawiki', we don't need a PFuncs group I don't think. Mediawiki has ownership rights on pfuncs by default. [13:42:43] <^demon> So the only 2 things you should need to do are adding vvv to CentralAuth group and adding gwicke to mediawiki. [13:42:48] ok then [13:43:02] hi auroraeosrose [13:43:09] hi [13:43:27] hey au and auroraeosrose - are either of you planning on coming to https://www.mediawiki.org/wiki/Berlin_Hackathon_2012 ? [13:43:32] it's June 1-3 in Berlin [13:43:51] I'll be there, some of your colleagues will be there.... [13:44:10] we'll be hacking, learning, designing [13:44:24] nearly no presentations/talks except very hands-on specific tutorials [13:44:36] it'd be great to have you there [13:47:37] heh, I'm actually flying to Europe on the 4th so I'll just miss it [13:48:49] auroraeosrose: blah! 
maybe you can come to Wikimania, in DC in July? [13:48:51] !wikimania [13:48:52] Wikimania, the worldwide Wikimedia conference, is 10-15 July 2012 in Washington, DC, USA, including developers' days 10-11 July. See http://wikimania2012.wikimedia.org/ and https://wikimania2012.wikimedia.org/wiki/Hackathon [13:51:48] ok, ^demon, ok to move https://www.mediawiki.org/wiki/Git/Gerrit_project_ownership#Jeroen_De_Dauw.2C_19_March_2012 to the archive page, since it's done? [13:53:59] <^demon> Done. [13:54:10] rock. [13:54:31] andrewbogott: can I ask you to take care of the 4 or so requests at https://www.mediawiki.org/wiki/Developer_access ? [13:55:59] sumanah: Sure. If someone applies on that page but already has SVN access, does that mean that they're asking for a labs account? (I'm never quite clear on which kinds of access people are seeking.) [13:57:00] andrewbogott: By default, they're asking for Labs and Gerrit access, but you don't give them bastion access [13:57:22] andrewbogott: so you can follow https://labsconsole.wikimedia.org/wiki/Help:Access#Giving_users_Labs_access.2C_if_they_already_have_an_SVN_account [13:57:26] Ah, I see, so they might have SVN already but not gerrit. [13:57:33] Right. [13:57:44] sumanah: Yep, I know that part, just never know if/when to give bastion access. [13:57:57] Which, I guess the answer is 'never, unless they show up on IRC and explain why they want it.' [13:58:05] I say: just don't give it out unless they come into IRC or email and explain. [13:58:06] yes. [13:58:31] Thanks! [14:24:41] <^demon> Attempting a test dump of all the extensions we're doing tomorrow. 
[14:35:49] hi people [14:36:01] I got a couple of questions [14:36:15] I have a special page [14:36:27] I want to make links that go there [14:36:46] would a parser function/magic word be the way to do it? [14:39:01] <^demon> Is there something special about the links or can you just use normal wikilinks? [14:44:28] Git docs needed: [14:44:29] * Tagging [14:44:29] * write up Amend/rebase/multiple commits [14:44:29] * archive & rewrite code review guide [14:44:30] sigh [14:47:00] ^demon: they are supposed to trigger search [14:47:20] so they contain search queries [14:49:13] I was thinking use an alias or a name space to do the trick [14:49:19] or both [14:50:03] [[search:one OR two]] [14:53:23] <^demon> You could do [[Special:Search/One or two]] and use $par. [15:02:26] JeroenDeDauw: do you still have trouble with git-review? [15:03:03] I got a similar message [15:03:04] (missing Change-Id in [15:03:06] commit message) [15:03:21] when I had not run git-review -s before the first commit [15:05:52] ^demon: thanks that seems to work fine I messed up the slashes before [15:19:12] gwicke: yes, still having the problem [15:19:15] gwicke: I ran that [15:19:30] Oh wait, but not before the first commit [15:19:33] Meh [15:19:38] gwicke: how did you fix? [15:19:55] JeroenDeDauw: moment, am on the phone [15:20:00] sure [15:21:33] <^demon> You can use the change-id it suggested in your error message. Do a `git commit --amend` to adjust it then `git review -R` to push. [15:26:09] JeroenDeDauw: I used git reset --soft HEAD^ to revert the last commit (there was only one in my case) [15:26:22] and then re-did the commit after git-review -s [15:27:13] ^demon: how? I did that, and added the change id on the first line of the commit message, still same error though [15:27:29] <^demon> gwicke's plan will work too. [15:27:49] <^demon> Change-Id has to be at the end. 
[15:27:58] Huh [15:28:35] <^demon> http://code.google.com/p/gerrit/issues/detail?id=606 [15:28:41] ^demon: it's still whining it's missing the change id [15:28:49] <^demon> Change-Id: has to be at the end of your commit message or it won't find it [15:30:53] ^demon: it is at the end [15:31:04] gwicke: tried your approach, still getting the same error [15:31:21] ^demon: you sure it's supposed to work with -R ? [15:31:28] <^demon> Yes, I'm sure. [15:31:33] <^demon> -R means "skip the implicit rebase" [15:32:40] ^demon: well, I'm still unable to push anything with git review [15:32:51] JeroenDeDauw: maybe the quickest would be to apply the diff on top of a new checkout [15:33:01] ^demon: https://bugzilla.wikimedia.org/35709 -- Can you handle this or does it need Ops? [15:33:14] <^demon> JeroenDeDauw: Pastebin your git log. [15:33:51] ^demon: http://dpaste.org/5nTPO/ [15:34:01] <^demon> hexmode: I can't do anything for that, that's ops. [15:34:15] k [15:34:24] <^demon> JeroenDeDauw: That commit has no Change-Id. [15:37:23] wtf, ... [15:37:36] ^demon: don't know how it went away, but I did have it [15:37:39] And now I have it again [15:37:40] http://dpaste.org/qR4Vy/ [15:37:46] And exact same result still [15:37:51] <^demon> You just have the ID. [15:37:59] <^demon> you need Change-Id: Iad.... [15:38:19] <^demon> https://gerrit.wikimedia.org/r/#change,3797 [15:38:43] Well [15:38:44] remote: ERROR: missing Change-Id in commit message [15:38:44] remote: Suggestion for commit message: [15:38:44] remote: style tweaks [15:38:44] remote: Iad73bd5e7d01c867b6d72799d531478d47f2a886 [15:38:49] Guess what I will copy then? [15:39:05] If I need that one, then this is a VERY bad message [15:39:18] <^demon> Bah, that needs improvement again. [15:39:19] Then again, gerrit has the worst UX I have ever seen, so would not be surprised [15:39:22] <^demon> They made it better than it was. [15:40:14] So where do I get the change id I need to put in now? I don't get it? 
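^demon's rule — gerrit only honours a Change-Id trailer at the end of the commit message — can be checked mechanically. A standalone sketch of such a check, using the ID suggested in the error paste and mirroring the "Change-Id: I" plus 40 hex digits shape of the trailer (this is an illustrative check, not gerrit's actual validation code):

```shell
# Gerrit only recognizes a "Change-Id: I<40 hex>" trailer in the last
# paragraph of the commit message; an ID pasted on the first line, or a
# bare ID without the "Change-Id:" label, is ignored.
msg='style tweaks

Change-Id: Iad73bd5e7d01c867b6d72799d531478d47f2a886'
if printf '%s\n' "$msg" | tail -n 1 | grep -q '^Change-Id: I[0-9a-f]\{40\}$'; then
    echo "Change-Id present"
else
    echo "Change-Id missing"
fi
```

This is why copying only the bare `Iad...` line from the suggestion fails: without the `Change-Id:` label on the last line, the server still reports the trailer as missing.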
[15:40:25] And why the fuck do I need to put one in? This is fucking ridiculous [15:40:56] <^demon> That's why the guides all say to run `git review -s` before ever trying to commit anything so this happens automatically. [15:41:07] <^demon> Gerrit uses unique change-id's to group changesets. [15:41:30] Ok, I will get a fresh clone [15:48:23] ^demon: fresh clone, ran git review -s, made a commit, ran git review -R, got the exact same error again [15:48:43] <^demon> What repo is this? [15:48:59] core [15:49:04] <^demon> Ok, now that's impossible. [15:50:47] .gitreview has [15:50:47] host=gerrit.wikimedia.org [15:50:47] port=29418 [15:50:47] project=mediawiki/core.git [15:50:47] defaultbranch=master [15:51:02] ^demon: clearly it is not [15:51:13] <^demon> What does `git review -v -s` give you? [15:51:35] does anyone here speak any thai at all or know someone who does? [15:51:47] ^demon: http://dpaste.org/i99xs/ [15:52:56] <^demon> On a fresh clone it should look like http://p.defau.lt/?k96PXSJcGZXhN0jVaJngUw [15:53:34] <^demon> Does your .git/hooks/commit-msg hook have stuff about making a change-id? [16:02:41] ^demon: http://dpaste.org/t9VHS/ [16:03:21] <^demon> Permissions on the hook? [16:03:29] ^demon: what? [16:03:42] <^demon> What are the permissions on .git/hooks/commit-msg? [16:05:43] ^demon: -rwxr-xr-x [16:05:49] <^demon> Ok so the hook's there, it's executable, it's otherwise identical to mine but for some reason it's not doing its job. [16:06:14] <^demon> I'm at a loss. Seriously. [16:06:32] <^demon> Let me think about it. We'll ask hashar too next when he's back. [16:08:40] <^demon> JeroenDeDauw: One other thing, does the hook work properly for other cloned things? It's just core that's borked? [16:30:16] ^demon: looks like my grep is broken somehow, which is probably what's causing the git review error, and maybe the changeid one as well [16:30:39] JeroenDeDauw: I am intrigued. How did grep break? [16:30:40] <^demon> Ouch, yeah that might do it. 
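A broken grep would explain both symptoms at once: git-rebase--interactive counts todo entries with sane_grep, and the commit-msg hook is a shell script that relies on the same basic tools to generate the Change-Id trailer. A generic environment check in the spirit of hashar's earlier suggestions (these are illustrative commands, not a transcript of what was actually run):

```shell
# Sanity-check the tools git's shell scripts and the commit-msg hook rely on.
command -v git || echo "git not found"   # expect /usr/bin/git, not an alias
ls -l /bin/sh                            # dash on stock Ubuntu 11.10
echo "GREP_OPTIONS=${GREP_OPTIONS:-<unset>}"  # odd values here break grep callers

# grep must actually count matches; a grep that prints nothing at all
# here is broken and will take git's shell tooling down with it.
count=$(printf 'abc\n#comment\nxyz\n' | grep -c '^[^#]')
echo "$count"   # a working grep prints 2 (lines not starting with '#')
```

If the last command prints anything other than 2, git's symptoms are downstream damage and grep (or its environment) is the thing to fix, which is exactly how this thread resolves.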
[16:30:52] I have no idea how it broke [16:31:05] It might have never worked, as this is a new install [16:46:10] ^demon: looks like it indeed was the broken grep [16:47:01] ^demon: so looks like I was raging over the wrong tool :D [16:47:19] <^demon> :) [17:15:10] AaronSchulz: hi! how is 20% day going? :) [17:15:27] it goes [17:17:54] hi preilly - I was wondering whether you or AaronSchulz could take a look at https://gerrit.wikimedia.org/r/4039 [17:18:01] a fix to (bug 24322) Add image links from Page namespace to corresponding images. [17:18:04] from a community member [17:18:25] Zaran: would be great if you could give this ProofreadPage fix a look https://gerrit.wikimedia.org/r/4039 [17:19:38] another candidate, AaronSchulz & preilly - https://gerrit.wikimedia.org/r/4028 (bug 26909) follow up r102947: fix the navigation with 'dir' and 'continue' for [17:20:24] yes sumanah, I've been trying to check all ProofreadPage changes recently, I'll do this one soon [17:20:36] Zaran: great! and there's another from that same author, did you see it? [17:20:43] yes [17:21:09] actually this author submitted 5 changes, 3 are already merged [17:21:15] oh wow [17:21:20] that's great that ProofreadPage found a new active contributor :) [17:21:23] :D [17:21:29] !berlin [17:21:29] Registration is now open for the Berlin hackathon, 1-3 June 2012. See https://www.mediawiki.org/wiki/Berlin_Hackathon_2012 [17:21:29] Registration is now open for the Berlin hackathon, 1-3 June 2012. 
See https://www.mediawiki.org/wiki/Berlin_Hackathon_2012 [17:21:32] Zaran: ^^^ [17:21:33] this user is "Beau" in bugzilla [17:21:38] Zaran: oh, right [17:21:47] a polish wikisource contributor [17:22:17] AaronSchulz & preilly: and iAlex's https://gerrit.wikimedia.org/r/4335 is another one that it would be nice to review during 20% time: "(bug 35728) Git revisions are now linked on Special:Version" [17:24:49] one more thing, AaronSchulz & preilly - this community-written extension is ready to deploy https://www.mediawiki.org/wiki/Extension:RandomRootPage https://bugzilla.wikimedia.org/show_bug.cgi?id=16655 in case you want to shepherd that forward today [17:29:39] ^demon: Hey how do you import branches from SVN into git? I want to import /branches/salvatoreingala as a branch into extensions/Gadgets.ggit [17:30:03] <^demon> I wish you had told me awhile ago, it's a painful process. [17:30:12] I'm sorry [17:30:24] I didn't realize until today [17:30:33] <^demon> Although Platonides did it for us yesterday for Wikidata, so he might be willing again if it's not a lot of branches. [17:30:34] It's just adding new commits, right, so it's not a repo rewrite, right? [17:30:40] <^demon> s/branches/commits/ [17:31:51] RoanKattouw: still having trouble with omnitiwork/* — http://pastebin.com/raw.php?i=1CaZ6RPx [17:32:24] wtf [17:32:29] Hmm, maybe you don't have the create reference right? [17:32:41] dunno [17:32:45] how do i check? [17:32:56] By trying again now [17:33:06] I just granted the Create Reference right on omnitiwork/* [17:33:43] that did it, thanks! [17:33:46] yay [17:42:47] ^demon: Maybe you can tell me what tools you guys use so I can do this myself? I know the branch point and I know this codebase, so I figure it shouldn't be too hard to do [17:43:50] my condolences, vvv-sick [17:44:08] <^demon> RoanKattouw: Well when he did it yesterday he manually picked the commits in. Don't know the exact process. 
[17:44:43] <^demon> When I do stuff it's with my dump tool, operations/software.git svn2git [17:54:46] <^demon> RoanKattouw: So when I said rewriting make-wmf-branch wouldn't take long I meant it. I did better than half the work already. [17:55:23] yay [17:55:58] OK svn2git might be overkill for my purposes [17:56:18] <^demon> Yeah :) [17:56:21] I could git-svn init the SVN checkout, then pull the revisions across into my Gadgets clone, then rebase them onto the branch point [17:56:47] <^demon> I didn't finish "copy old branch's submodule status for X extension", "copy StartProfiler" or "apply these custom patches" [17:56:56] <^demon> But those shouldn't be needed for the first iteration. [17:57:02] <^demon> StartProfiler we'll have to copy in manually. [18:07:51] evening ^demon and RoanKattouw :) [18:08:27] I'm curious to see whether my scripts will work tomorrow when new extensions migrate to git [18:08:36] ^demon: should I do last updates for svn now? [18:09:55] <^demon> Whenever's fine. I'm not going to make them read-only until tomorrow. [18:13:52] okay [18:16:54] ^demon: umm wtf [18:16:56] Checking AddThis [18:16:56] D AddThis [18:16:56] Initialized empty Git repository in /resources/nike/mediawiki-extensions/extensions/AddThis/.git/ [18:16:59] fatal: You are on a branch yet to be born [18:17:01] Unable to checkout submodule 'extensions/AddThis' [18:17:33] <^demon> You can't clone those yet, they don't have any history. [18:17:40] <^demon> I haven't pushed to them. [18:17:51] ^demon: how many are those? [18:17:57] <^demon> 34. [18:18:09] <^demon> The ones getting migrated tomorrow. [18:18:12] hmph [18:18:18] so why are they in the list already? [18:18:47] <^demon> Because I created the repositories and set up the permissions today so they're ready to push to tomorrow. [18:19:13] and all scripts relying on that list broke [18:21:05] <^demon> I'll hack that script and remove the extensions. [18:24:35] ^demon: no need for me [18:25:01] <^demon> It's no big deal. 
[18:30:41] Nikerabbit: problem with AddThis? :) [18:31:22] ahhh - I see - git test - sorry - Colloquy flagged it for me :) [18:32:21] <^demon> varnent: Yeah I already created the repositories today in preparation of migrating the svn repos tomorrow :) [18:32:30] <^demon> And that confused stuff :) [18:32:45] ^demon: makes sense :) [18:33:32] ^demon: I'm waiting until extensions are migrated before I dive back into Git - although I do need to touch base with you and others about when would be good to do a training for extension developers and sysadmins - also if they should be done separately or at once.. [18:34:05] <^demon> Training for sysadmins? [18:35:11] ^demon: yeah - I've been contemplating doing one to help answer their git questions - aid with transferring folks installs from SVN to Git - we haven't done a sysadmin workshop yet..so..no predictions on how well attended or anything it will be - but I am getting Git questions from sysadmins now [18:35:44] ^demon: obviously more for third-party sysadmins - I doubt any WM projects would have a need to attend [18:36:37] <^demon> I see. I don't know when I'll have a chance to do training for a couple of weeks at least. [18:36:45] <^demon> Definitely will be doing training at wikimania. [18:37:26] ^demon: that's fine - I'm leaning towards waiting until dust settles before doing an online workshop anyway - so probably not until later in April anyway [18:45:19] um... https://bugzilla.wikimedia.org/35731 this is a weird one [18:47:47] <^demon> Nikerabbit: Ok, list is back to normal for now. [18:48:05] ^demon: thanks but I already committed for svn [18:48:22] <^demon> Ah ok :) [18:49:37] hexmode: that is weird. fwiw, I installed trunk a couple times last week, saw nothing like that [18:50:00] <^demon> hashar: So I'm trying to figure out why my hook isn't working. The log is suspicious (http://p.defau.lt/?xXiXyBGTglIMtUmNjuC2Zg) since I only see entries for operations/puppet. 
[18:50:07] <^demon> I would expect to see it fired for all repos. [18:50:29] chrismcmahon: yes, I grepped the source, but wanted to install just to verify. Most likely a local issue. [18:51:17] ^demon: which hook? [18:51:32] <^demon> patchset-created [18:51:33] chrismcmahon: hexmode: hello there :-] [18:51:56] * hashar switch from PHP CodeSniffer to python [18:53:16] ^demon: and so what is "your hook" supposed to do ? :D [18:53:30] <^demon> I mean the auto-approve thing that Ryan merged in for us. [18:53:33] <^demon> For l10n-bot. [18:54:08] :O [18:54:32] ^demon: as I said to nike, I would like to have them auto merged if and only if jenkins has marked them verified [18:54:42] ^demon: though that is not going to explain why it does not work :-) [18:55:15] hashar: can you make it so today? [18:55:47] hashar: hey! [18:55:56] Nikerabbit: make what ? [18:56:04] Nikerabbit: to approve only if jenkins approved ? [18:56:12] hashar: yep [18:56:21] unlikely :D [18:56:35] but I am going to try Chad hook [19:00:52] ^demon: could it be because l10n-bot start with an upper case L ? [19:01:10] aren't first letters case insensitive [19:01:25] ^demon: I have no idea about what options.uploader can be [19:03:04] <^demon> Nikerabbit: In MediaWiki, not in python. 
[19:03:27] ^demon: I think gerrit allowes me to use nikerabbit as well as Nikerabbit [19:04:33] ^demon: I would try adding a log of options.uploader to see what it contains [19:04:56] cause you check a uploader != 'l10n-bot' [19:05:16] where uploader could be the email address [19:05:35] or something like "John Doe " [19:07:00] ^demon: so yes my recommendation is to log options.uploader [19:07:27] once you have confirmed the format, make sur to have it documented in all hooks using that field [19:07:49] meanwhile I am happy not having my git log spammed with l10n :D [19:09:17] USD/EUR up to 0.765 \o/ [19:09:41] yay [19:09:51] * RoanKattouw wants it to rise a bit more still [19:10:02] <^demon> USD/USD is still 1:1 :( [19:10:17] RoanKattouw: are you in USD land anyway? :D [19:10:29] ^demon: come in Europe !!! [19:10:38] we have "social security [19:10:53] and wine :-D [19:10:57] and bicycles [19:11:07] Wait, hmm [19:11:11] Which direction do I need this to go again [19:11:16] No, it needs to go down [19:11:22] NOOOOOO [19:11:36] I need my euros to be worth a lot of USD [19:11:40] (And hashar needs the reverse) [19:11:49] I need my USD to be worth a lot more EUR [19:12:10] It was at 1.335 last week, now it's gone back below 1.32 [19:12:24] Maybe if it goes in the direction of 1.40 I'll move some of my EUR savings to USD [19:12:26] RoanKattouw: or!! I can give you my USD and you give me your EUR :-] I will give you 1 $ for any 1€ you give me. [19:12:34] hehe [19:12:36] 1 == 1 [19:12:38] \o/ [19:14:03] RoanKattouw: you are upper-middle-class in the US, upper-class in world-historical terms, no matter what the exchange rate is [19:14:59] True [19:15:04] (lunch) [19:16:53] more exactly, the upper class trick you in believing you are upper-middle-class when you are actually just middle class :D (ping sumanah) [19:17:13] hashar: I'm not going to dive into that particular pit [19:17:16] of argument [19:19:22] sumanah: c:-) [19:26:50] <^demon> hashar: L10n-bot != l10n-bot. 
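The root cause ^demon names is a case-sensitive string comparison: the hook tested the uploader against 'l10n-bot' while gerrit reported the account as 'L10n-bot'. The hook itself is python, but the pitfall is language-independent; a shell sketch (variable names are illustrative), with case normalization as one possible hardening:

```shell
# The auto-approve hook compared uploader != 'l10n-bot', but gerrit
# reported the username as 'L10n-bot'; string comparison is case-sensitive.
uploader='L10n-bot'
[ "$uploader" = "l10n-bot" ] && echo "exact match" || echo "no exact match"

# Normalizing case first makes the check robust to how gerrit reports it:
lower=$(printf '%s' "$uploader" | tr '[:upper:]' '[:lower:]')
[ "$lower" = "l10n-bot" ] && echo "match after lowercasing"
```

Hashar's advice stands regardless: log the raw value of options.uploader first, then document the observed format next to every hook that consumes it.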
[19:27:15] insightful [19:27:52] <^demon> Yeah, who knew capitalization mattered for string comparison ;-) [19:29:58] ^demon: anyone not using PHP ? [19:31:29] <^demon> I did check the logs though, the uploader is just the username. [19:31:37] <^demon> So L10n-bot should fix it [19:35:22] we will really have to rethink that workflow [19:35:42] cause that sounds way too hacky to me :) [19:36:54] on another topic. I have a basic set of CodeSniffer rules for MediaWiki style to be checked [19:37:00] I have made it on github https://github.com/hashar/MediaWiki-CodeSniffer [19:37:26] I would like to have jenkins/integration to have that code. Two possible ways: [19:37:35] 1) add a git submodule to that github repository [19:37:44] 2) migrate the code to jenkins/integration [19:37:58] ^demon: Enter a choice to continue [12]: [19:38:51] <^demon> 2 [19:39:07] Processing. ETA 4 days. [19:39:35] <^demon> If you don't care about the history: ETA 5 minutes :p [19:39:46] I care ? :D [19:40:03] na you are right, I will just dump everything in some dir [19:42:37] hmm [19:43:12] thinking about it, maybe people would want to use the MediaWiki CodeSniffer set of rules without having to fetch the integration/jenkins repo [19:50:43] New patchset: Hashar; "MediaWiki code standard for use with PHP_CodeSniffer" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4360 [19:51:09] though I should probably migrate that to mediawiki/CodeSniffer [19:51:13] or something similar [19:55:24] New patchset: Hashar; "MediaWiki code standard for use with PHP_CodeSniffer" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4360 [19:57:24] hashar: woo, running a github repo directly on a production machine? [19:57:27] sounds heavy :P [19:57:47] yeah maybe too [19:58:08] * Krinkle has seen BackToTheFuture recently and has an overflow of expressions from the 80s/90s in his head now [19:58:58] Love the reaction "Doc Brown" gives when Marty expresses "heavy" to his 1950s self. 
"The gravity must have changed in the future, is that why it feels so heavy in 1958?" [19:59:14] xD [19:59:27] I need to watch that movie again [20:02:08] hashar: Could we perhaps re-arrange the inexistent schedule a little bit? [20:02:16] I know getting the specification implemented is important [20:02:25] but the fact is, bare down the phpunit is working "fine" right now [20:02:28] and the testswarm is not working at all [20:02:31] not even standalone [20:02:40] svn/git [20:02:59] sure rearrange all the null bits around :-D [20:03:17] I totally agree PHPUnit is mostly working [20:03:34] Don't worry about Jenkins for now, that's not worth the trouble with testswarm 0.2.0 since it has no usable API [20:03:36] there is a few bugs around, but they are not too bad [20:03:50] so no aggregation into a jenkins build just yet [20:04:00] we can do that as soon as testswarm 0.3 is out (mid-April) [20:04:35] hashar: What's easier, refactor MWFetcher or snapshot of the working dir of one of the jenkins builds and curl POST to testswarm addjob? [20:04:50] second [20:05:05] snapshot is going to be pretty trivial [20:05:08] I thought about it today [20:05:43] just have to capture the sqlite file, copy the MediaWiki-GIT-Fetching checkout and copy the LocalSettings file [20:06:48] then do the directory detection of ./tests/qunit (see Fetcher for details) and construct list of module names and test suite urls (SpecialJavaScrptTest/qunit?filter=modulename) and submit it to state=addjob [20:06:58] then we can copy that in /var/lib/testswarm/mediawiki-trunk/checkouts/ using the GIT Fetching build number [20:07:26] exactly :) [20:07:44] hashar: is the fetcher still running? [20:07:51] most probably [20:07:54] k [20:08:09] maybe kill it and clean up the big stack of old installs [20:08:11] it is in a cronjob defined in operations/puppet [20:08:33] have you ever edited puppet configurations ? 
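The submission step hashar describes — one Special:JavaScriptTest/qunit URL per detected module, POSTed to TestSwarm's addjob state — can be sketched as follows. The wiki URL, module names, swarm host, and POST field names are all assumptions here, not the exact TestSwarm 0.2/0.3 API (and the page title is spelled out as Special:JavaScriptTest, per MediaWiki, rather than the log's shorthand):

```shell
# Build one test-suite URL per qunit module (module names and wiki URL
# are placeholders), then submit the set to TestSwarm's addjob endpoint.
base='http://localhost/w/index.php?title=Special:JavaScriptTest/qunit'
modules='mediawiki.util mediawiki.Title'

for m in $modules; do
    echo "${base}&filter=${m}"
done

# The actual submission would be a POST along these lines (commented out;
# the swarm host and field names are assumptions):
# for m in $modules; do
#     curl -s -d "state=addjob" -d "jobName=MediaWiki trunk" \
#          -d "runNames[]=${m}" -d "runUrls[]=${base}&filter=${m}" \
#          http://swarm.example.org/
# done
```

Since these suite URLs are plain index pages served by the wiki, this is also why the db-install support in the fetcher was never needed for submission itself.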
[20:08:33] hashar: it's too bad though, we went through all the trouble in the fetcher to support db-install and we never actually used it [20:08:46] since so far we only submitted testswarm jobs with urls to index.html [20:08:54] which doesn't need the db or php at all [20:09:02] that is all because I had over priorities :-/ [20:09:13] it is not like we are dropping a million dollars project anyway ;) [20:09:18] nah [20:09:33] though sometime, it makes sense to throw away a million dollars project [20:09:48] (to avoid it costing you way more millions over the next X years :D) [20:12:21] I agree [20:12:26] although it rarely happens [20:12:29] it should happen more oftten [20:12:50] (throwing away an expensive project if it isn't worth it, even if it costed a lot already) [20:13:03] can we get ride of the testswarm-mw-fetcher.php files ?:D [20:13:23] Sure, don't nuke the repo though [20:13:23] history :) [20:13:29] sure [20:22:13] Krinkle https://gerrit.wikimedia.org/r/4364 [20:22:18] it is dead [20:36:34] New patchset: Hashar; "integration/testswarm as a toolbox for Jenkins" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4366 [20:40:33] Krinkle: I am heading bed [20:40:43] Krinkle: will do the jenkins copying job tomorrow [20:40:56] Okay, looking forward to that :) [20:41:06] and try to have some checkouts published and ready to get tested [20:41:47] Krinkle: and if I am any good, I will try the addJob() :D [20:43:17] have a good night! [21:03:39] git question: Should the command in: http://www.mediawiki.org/wiki/Git/Workflow#Update_master [21:03:52] be: [21:03:53] git pull origin master [21:04:40] the one given probably works [21:05:08] I get: fatal: 'master' does not appear to be a git repository [21:05:40] if I type: [21:05:44] git pull master [21:06:39] jpostlethwaite: Yes, you probably need git pull origin master then. Or just git pull [21:06:51] i will update the docs [21:06:54] thanks [22:05:26] more Berlin hackathon outreach, yay! 
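jpostlethwaite's conclusion above is right: with no upstream configured, `git pull master` makes git parse "master" as a remote name, hence the "does not appear to be a git repository" error, while `git pull origin master` names the remote and the branch explicitly. A self-contained demonstration in a throwaway directory (all paths are temporary placeholders):

```shell
# Demo: "git pull master" fails because git treats the first argument
# as a remote; "git pull origin master" spells out remote and branch.
tmp=$(mktemp -d)
git init -q --bare "$tmp/remote.git"
git -C "$tmp/remote.git" symbolic-ref HEAD refs/heads/master
git clone -q "$tmp/remote.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git symbolic-ref HEAD refs/heads/master   # pin the branch name for the demo
git config user.email dev@example.org
git config user.name dev
echo demo > file.txt
git add file.txt
git -c commit.gpgsign=false commit -qm 'initial commit'
git push -q origin master
git pull master        >/dev/null 2>&1 || echo "git pull master: failed"
git pull origin master >/dev/null 2>&1 && echo "git pull origin master: ok"
```

Setting an upstream once (`git branch --set-upstream-to=origin/master master` on modern git) lets a bare `git pull` work afterwards.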
[22:06:21] hexmode: your message about the 1.19 sprint landed in my spam folder :( [22:53:23] chrismcmahon: please, for the love of all that is holy, please use python and not ruby [22:53:44] most of us *despise* ruby [22:53:53] we have a number of devs that already know python [22:54:07] we have a number of other toolkits written in python as well [22:54:18] we've actively been trying to purge everything ruby from our environments [22:55:24] Ryan_Lane: from what I heard from Chris, it seems like Ruby is just a better fit for this particular job, given the deep support for it ... we'd have to do more work in-house to provide that level of support with the Python version [22:55:43] ruby is never a good choice in ubuntyu [22:55:46] *ubuntu [22:56:07] also, very few people in our community know or use ruby [22:56:12] many of us know python [22:56:13] chrismcmahon: go ahead and chime in anytime :-) you know this stuff better than I do [22:56:43] Ryan_Lane: Understood. For this particular niche I think Ruby is the right choice. I at least want to make the case before getting shot down. [22:56:46] so, bus factor, yes, but hashar and chris are fine with it, and they're the ones who'll be doing this work, right? [22:57:09] right. it always starts with only so-and-so will be doing the work [22:57:21] mobile gateway! [22:57:21] then somehow ops gets wrangled into supporting it [22:57:34] and we hate it, and it gets piss-poor support [22:57:46] ruby in ubuntu is terrible [22:58:29] also, we despise gems (and pip, and pecl, and …) Ryan_Lane: mobile gateway vs. selenium regression testing is apples and oranges. automated browser testing code has no place on production servers. [22:58:54] where else would it live?
[22:58:57] so as I read Chris's email, it sounds like the Python version doesn't have the API we need [22:59:11] the stuff that runs the tests is on a production system [22:59:26] we *must* have working testing infrastructure for things to go live [22:59:35] hence the testing system itself is a production system [22:59:54] which means we need to puppetize this [22:59:55] Chris mentions that basically with Ruby we're good to go, but if we use Python then we have to spend a lot of time building custom infrastructure [23:00:32] Ryan_Lane: we can totally not puppetize this, aside from optionally making a VM snapshot of a single labs instance host [23:00:40] no.... [23:00:49] jenkins is going to be running this, right? [23:01:02] Chris, do you have any kind of estimate of the time tradeoff here? like, how long might it take to build this infrastructure? I do not know who would do it [23:01:04] everything we depend on needs to be puppetized [23:01:24] Jenkins is going to be kicking off a job on some host and collecting the results [23:01:27] also, the testing infrastructure lives in production, not in labs [23:01:54] * sumanah goes away from screen for a bit [23:01:55] anyway, snapshots are not a reasonable method of maintaining a system [23:02:18] what happens when we need to move it from one instance to another? [23:02:23] or from labs into production? [23:02:32] we puppetize services we depend on [23:02:38] Ryan_Lane: I should send you a description of the architecture involved. 
[23:03:12] I understand the basics of how testing is done with selenium [23:03:46] our initial selenium support infrastructure was built by priyanka, markus glaser and myself [23:05:05] selenium code runs tests on a set of selenium servers [23:05:13] those servers run the tests and report the results [23:05:26] our original tests were written in php [23:05:41] I don't mind if they aren't in php, but I'm pretty against them being ruby [23:06:56] I think the last ruby thing we have in the infrastructure is puppet, and we hate *it* too [23:08:15] Ryan_Lane: understood. so two things: one is that selenium-webdriver/selenium 2.0 is a whole different approach to the issue. the other is that there is more to a Selenium automation project than just Selenium. part of my mandate is to make the whole package as attractive as possible to the global testing community. [23:08:17] with the fire of 1000 suns. [23:08:52] I'd rather it be more attractive to us, than the global community, if I have to choose [23:10:04] our community knows PHP mostly, some python, and a tiny amount of java [23:10:14] we just introduced a java app that is already causing us issues [23:10:23] introducing ruby back in is going to be a PITA [23:10:45] especially for ops, who really, really hates working with it [23:11:40] Ryan_Lane: Ruby would be in two places: git, where it is static text; and on a labs instance maintained by QA staff. nowhere else. 
[23:11:46] no [23:11:51] *not* in labs [23:12:07] labs is not a production environment [23:12:25] testing infrastructure goes into production [23:12:36] testing environment goes in labs [23:12:50] so, mediawiki, running like a production clone = labs [23:13:05] a testing server, hitting the production clone = production [23:13:50] if we depend on it to push code to production, we have to treat it like a production service [23:14:07] at some point we'll be requiring the tests to pass for code to go into production [23:16:19] you can go with whatever you want to go with, just realize that we're going to bitch constantly about it if you use ruby [23:17:54] Ryan_Lane: as I see it, ops should never come within smelling distance of the Selenium test framework itself. I might have a misunderstanding, but that's how it looks from here. [23:18:20] but we have to support the infrastructure that runs it [23:18:33] which means we need to deal with ruby, and gems [23:18:49] will this require any ruby web service? [23:18:55] or just ruby cli? [23:19:26] Ryan_Lane: totally headless, all in the shell [23:19:42] ok, I'll relent, as long as you don't use gems [23:19:56] Ryan_Lane: there are a handful of gems involved. [23:19:59] if it'll install via ubuntu packages I'm fine [23:20:27] Ryan_Lane: I understand the problem with gems vs apt [23:20:37] chrismcmahon: https://labsconsole.wikimedia.org/wiki/Help:Development_recommendations_for_easily_moving_to_production [23:20:52] very first section :) [23:22:05] Ryan_Lane: which is why I want a dedicated host as a central client for selenium-server hosts on other VMs. if the gems get wiped out for one reason or another it is a matter of minutes to get them back in place. [23:22:12] again, I'm opposed to using ruby… I think it splits our testing infrastructure up poorly, language-wise. [23:22:30] selenium server hosts? [23:23:17] aren't we just going to use an outside service for running selenium server? [23:24:29] yes.
we have VMs running IE7/IE8/IE9. selenium-server is a Java app that accepts Se commands from a client in any language. right now I'm not aware of plans to use any outside service, although Sauce Labs would be nice to have eventually. [23:24:52] where do we have vms running ie? [23:25:26] in the office [23:25:33] hahahahahahahahahahahaha [23:25:36] err [23:25:37] sorry [23:25:39] lol [23:25:47] I know, you meant in the cluster huh? [23:26:05] so, where do we have vms running ie somewhere we can actually depend on? [23:26:22] Ryan_Lane: I'm working with what I got here [23:26:26] * Ryan_Lane nods [23:26:32] we should use an outside service :) [23:26:47] either run windows on ec2/rackspacecloud or pay for sauce labs [23:27:02] * robla votes Sauce Labs at this point [23:27:07] me too [23:27:14] then we don't have to deal with selenium server either [23:27:21] which is a big win :) [23:27:26] they have os x too, right? [23:27:30] e.g. sfo-virtual1-vista-ie7 [23:27:48] chrismcmahon: yeah, we can't depend on the office for anything we really need :( [23:28:05] it's simply not set up for that [23:28:21] no redundancy for anything, no backups, no puppetization, no automated installation, etc [23:28:32] no backup power, right? [23:29:10] beta labs cluster wikis have dependency issues of their own. there is a lot to be done. [23:29:18] yep [23:29:25] better to have fewer dependencies, overall [23:30:17] we don't really have plans for windows in labs [23:30:29] and we're getting rid of the OS X stuff we have [23:30:57] (windows in labs opens us to a lot of vulnerabilities we'd like to avoid, and we have no way of automating any of it) [23:31:12] yeah, and Windows is a big priority both for the tests themselves and also for potential community collaboration [23:31:21] yep [23:31:39] a cloud service is likely better for us for windows [23:31:56] is sauce labs super expensive?
It's sounding like a pretty good option [23:32:22] I remember the selenium cluster I built needing constant attention to keep it running. it was a PITA [23:33:30] mostly because it was a mix of OSes [23:35:10] Ryan_Lane: regardless of the WMF architecture, I still want to attract a high class of contributor. I know some people doing similar work at Mozilla, and they've had some struggles building a quality community around their homegrown Python test framework. I think the answer is to make the framework architecture shiny and attractive and community-supported from the start, and to me that suggests Ruby. Now the challenge is to manage tha [23:35:29] chrismcmahon: manage th.....? [23:37:37] sumanah: not sure what "th....?" is... [23:37:54] chrismcmahon: your message cut off at "Now the challenge is to manage th" [23:37:57] (sorry, unclear) [23:38:20] ah "... Now the challenge is to manage that with a minimum of pain and heartbreak." [23:40:25] fwiw, there might very well be a reason to veto using Ruby for Selenium tests. but my starting position is that it can be managed and managed well, and attract a huge untapped community of contributors. [23:43:09] chrismcmahon: if you expect mediawiki developers to write selenium tests for their code, it *must* be in php [23:43:32] they won't do it otherwise. it's hard enough to get them to write tests to begin with [23:44:28] third party contributors hardly upstream to us, because for the most part we ignore them [23:45:28] what community are you aiming at with ruby? [23:46:12] wikihow, one of the larger third party users, is running mediawiki 1.12, for instance [23:46:31] Ryan_Lane: I'm far more interested in having community contributors for Selenium tests. That said, the Selenium use I've seen at WMF so far doesn't show a very sophisticated understanding of how the tool is properly used. I'd far rather see developers writing great unit tests.
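The selenium-server arrangement described above, a headless client driving browsers on remote IE VMs or at a hosted service like Sauce Labs, boils down to a hub URL plus a browser-capabilities description, whichever client language wins the Ruby-vs-Python debate. A minimal sketch in Python follows; the host name, browser version, and the helper function are invented for illustration, not taken from any WMF setup.

```python
# Sketch of the remote-client model: selenium-server (a Java app) listens
# on a hub URL; the client sends it a capabilities description saying
# which browser/OS to drive. Host names and versions here are made up.

def remote_config(hub_host, browser, version, platform):
    """Build the command-executor URL and capability dict for one
    remote browser session on a selenium-server host."""
    return {
        "command_executor": "http://%s:4444/wd/hub" % hub_host,
        "capabilities": {
            "browserName": browser,
            "version": version,
            "platform": platform,
        },
    }

# With the selenium 2 (WebDriver) Python bindings of that era, this would
# be consumed roughly like so (needs a live selenium-server, so commented):
#
#   from selenium import webdriver
#   cfg = remote_config("ie7-vm.example", "internet explorer", "7", "WINDOWS")
#   driver = webdriver.Remote(command_executor=cfg["command_executor"],
#                             desired_capabilities=cfg["capabilities"])
#   driver.get("http://localhost/wiki/Main_Page")
#   driver.quit()
```

Pointing the same client at a service like Sauce Labs is just a different hub URL and credentials, which is why outsourcing the browser VMs removes the selenium-server maintenance burden entirely.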
[23:46:32] wikia, until fairly recently, didn't upstream, but kept forked revisions in their own repo [23:46:49] chrismcmahon: who's going to write these tests, though? [23:47:41] the only people I can think of that would consider writing tests are people using mediawiki already, they have a vested interest [23:48:05] * Ryan_Lane shrugs [23:48:30] from the ops perspective, go with whatever you want as long as everything comes as an ubuntu package [23:48:44] or is packaged and stuck in our repo [23:48:56] Ryan_Lane: I'll be writing the first installment of Se tests, aiming at both a clean MW install and also Labs Beta wikis. Beyond me and whatever QA staff WMF ends up with, I've already had an offer for a weekend GiveCamp project to get a basic MW regression test suite in place [23:50:41] Ryan_Lane: deal. an Ubuntu package of either Ruby 1.8 or 1.9, and the QA "department" maintains everything beyond that. [23:51:01] assume you don't have root :) [23:51:30] and that everything will need to be puppetized [23:51:40] which should be easy enough if all it is is package installs [23:52:01] Ryan_Lane: should be OK with either RVM or Bundler, but I don't have details this minute [23:57:24] Ryan_Lane: this whole thing is going to be in prototype mode for some time, the wheels may yet fall off. But I think it has a lot of promise, avoiding the mistakes WMF made in the past and the kinds of things Moz is negotiating right now. [23:58:01] * Ryan_Lane nods
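The deal struck at the end (Ruby installed only from Ubuntu packages, everything puppetized, QA maintaining the rest without root) really is "easy enough if all it is is package installs". A hypothetical Puppet fragment showing the shape of it; the class name is invented and this is not taken from operations/puppet:

```puppet
# Hypothetical sketch: the Ruby selenium client host puppetized as plain
# Ubuntu package installs, per the "apt packages, no loose gems" condition.
# Class and package selection are illustrative only.
class qa::selenium_client {
    package { 'ruby1.8':
        ensure => present,
    }

    # Any gems the tests need would likewise come in as apt packages
    # where available; the test code itself lives in git, and the QA
    # "department" maintains everything beyond the packages.
}
```

Keeping the manifest down to package resources is what makes the service portable between a labs instance and production, which was Ryan's underlying concern.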