[08:13:08] thedj: can we discuss the "chrome crash bug"? I want to ask you to report it if possible in the right forum.
[15:02:35] ^demon: Do you have any objections against me adding WikimediaMaintenance to 1.18wmf1 so I can use the script I added in r106656 for Nagios?
[15:02:58] <^demon> Like I can stop you? :p
[15:03:12] <^demon> Go ahead :)
[15:03:19] heh
[15:03:21] Thanks
[16:43:42] hi brion
[16:46:23] yo
[16:46:55] brion: yo! I've got some git problems again
[16:47:07] ok
[16:47:13] we moved the repo so that might break somethin :)
[16:48:53] brion: yeah - I was supposed to have push access there
[16:50:23] ok so a) did you re-check out the repo from git@github.com:wikimedia/WikipediaMobile.git ?
[16:50:31] b) lemme double-check permissions
[16:51:28] i find github's admin interface slightly baffling
[16:52:40] ok i may or may not have fixed your permission
[16:52:59] okay
[16:53:30] brion: seems to work now
[16:54:23] yay!
[16:55:12] now I can fix the usage of the word article in the messages
[16:56:18] :D
[16:57:47] oh, nice
[17:02:28] <^demon> Hi everyone :)
[17:02:31] github?
[17:02:42] *Danny_B|backup thought we'll have our own git server
[17:03:02] This is about the mobile app
[17:03:10] The MW git meeting hasn't started yet
[17:03:29] meetings collide
[17:03:33] hola
[17:03:37] um ok should we start it?
[17:03:39] <^demon> This is about the MW git meeting.
[17:03:44] yay
[17:03:49] git! :)
[17:03:54] let's git to it!
[17:04:02] *jeremyb stabs apergos
[17:04:03] I'm not totally sure what I can contribute, although I've been evangelizing git-svn
[17:04:03] *^demon whacks apergos for the pun
[17:04:16] *apergos snickers and bleeds all over everyone else's keyboards
[17:04:34] neilk_: blood
[17:04:36] <^demon> According to Tim git-svn destroys mergeinfo and is partially to blame for our fun revision graph :)
[17:04:54] *Nikerabbit pings siebrand
[17:04:59] git-svn is a hacky perl script... you look at it ever?
[17:05:05] hmm?
[17:05:09] <^demon> The etherpad (also in the /topic) is http://etherpad.wikimedia.org/Git
[17:05:38] ^demon: mergeinfo seems like more of an obstacle than anything helpful for the average commit. Sorry if it makes life hard when switching the entire repo.
[17:05:39] <^demon> First item of business: let's make sure we're pretty much settled on the layout we'd like to use. I've been proposing something like this:
[17:05:54] <^demon> It makes life hard when you want a sane history.
[17:06:03] <^demon> So yes, the layout...
[17:06:07] <^demon> mediawiki/core.git
[17:06:11] <^demon> mediawiki/extensions/foo.git
[17:06:14] <^demon> Etc.
[17:06:21] <^demon> Does that sound sane to everyone?
[17:06:28] There was a master extension repo somewhere, right?
[17:06:35] <^demon> Yes.
[17:06:36] ** Should probably make mediawiki/extensions-meta.git (or something) that has *all* extensions as submodules
[17:06:46] <^demon> mediawiki/extensions-all.git or something
[17:06:46] heh. Git developers about our situation: "why don't you just put all the extensions into the same module?"
[17:06:49] <^demon> I'm open to the name.
[17:07:07] Well, so, yeah
[17:07:09] (I met 'em @GSOC summit)
[17:07:17] heh
[17:07:26] Maybe this has been covered a zillion times before, but just what are the reasons for not putting all extensions in one repo?
[17:07:35] I've heard size, but I don't find that very convincing
[17:07:37] 1) permissions
[17:07:46] 2) import/export / self-determination
[17:07:52] 3) cleanliness :)
[17:07:53] Hmm, fair
[17:07:58] *RoanKattouw is convinced and shuts up
[17:07:59] <^demon> Permissions is a big one.
[17:08:20] Roan: It's very slow, too. Consider that git sucks down and stores way more info than svn
[17:08:25] <^demon> Exts deployed to WMF will probably have a gated trunk model like we will do with core and currently do with puppet.
[17:08:26] why do we want to restrict access to specific extensions?
[17:08:34] i *really* want people to be able to start their extensions on their own, then move them into our infrastructure, and also be able to move back out again if they wish
[17:08:48] brion: yes!
[17:08:51] <^demon> It's not about preventing access, it's about having the development model for each extension that works best.
[17:08:57] perms is a good one yes
[17:09:02] Because of the gated trunk model
[17:09:07] <^demon> Permissions in gerrit inherit, so mediawiki/extensions/ would be very open by default.
[17:09:13] hmm
[17:09:17] i still don't like gerrit myself though
[17:09:18] <^demon> And then extensions we want to add review+etc to, we can do explicitly.
[17:09:23] i find it ... awful
[17:09:27] there's only one thing that sucks if you divide exts -- refactoring across exts. But that seems very rare.
[17:09:30] Then I guess for WMF-deployed extensions, the master or deployment branch would be restricted?
[17:09:34] brion: Gerrit UI sucks
[17:09:34] I'm not really enthusiastic about gerrit either
[17:09:37] *siebrand mumbles something about LU and git.
[17:09:40] The core of gerrit is awesome
[17:09:47] I have found moving stuff between exts and core to be annoying
[17:09:49] Its usability is crap
[17:09:51] wasn't gerrit a Google product?
[17:09:51] it's not the ui I have a problem with
[17:09:54] <^demon> siebrand: People pushing commits can be given permission to push without review.
[17:09:54] RoanKattouw, did you get any further with hacking it?
[17:09:57] No
[17:09:58] RoanKattouw: so it's like git then? :)
[17:10:03] it's the way it restricts one's use of git
[17:10:05] It's worse than git
[17:10:07] <^demon> siebrand: Pushing commits for i18n
[17:10:09] siebrand: in principle there shouldn't be much difference between svn and git for LocalisationUpdate
[17:10:17] apergos: No I don't actually mind /that/ . I hate the UI
[17:10:20] have we tested anything?
[17:10:24] ^demon: LU as in LocalisationUpdate, not the push. we'll work that out somehow.
[17:10:29] We haven't tested LU with git
[17:10:38] But that's fine
[17:10:52] The way we have deployed LU on WMF is VCS-independent
[17:10:59] oh good
[17:11:02] i think we are going quite wild now, i would prefer if this could be moderated/facilitated, but if you like this unstructured discussion, i'll try to follow it
[17:11:05] RoanKattouw: is it fine for WMF or fine for all users of LU?
[17:11:09] Fine for WMF
[17:11:16] Not fine for LU's default modus operandi
[17:11:21] OTOH
[17:11:22] I would prefer if someone has an agenda
[17:11:22] <^demon> Danny_B|backup: Yeah we're fracturing quite quickly.
[17:11:31] is there a wiki / etherpad page
[17:11:36] neilk_: /topic
[17:11:41] ok, stop question. Are we talking WMF needs to be fine here, or MediaWiki users need to be fine? Also: ExtensionDistributor
[17:11:45] If anyone else runs LU and moves from SVN to git, they have to change config *anyway*, so they should just migrate to the WMF modus operandi and they'll be fine
[17:11:50] stop = scope
[17:11:54] damn autocorrect.
[17:12:04] rats, temporarily unavailable
[17:12:18] anybody is logging as well?
[17:12:20] <^demon> Ok everyone halt. Let's cover one question at a time.
[17:12:21] neilk_: yup, http://etherpad.wikimedia.org/Git
[17:12:24] <^demon> Danny_B|backup: Channel is logged.
[17:12:35] siebrand: With a bit of documentation, 3rd party installs can migrate easily
[17:12:35] ok, thanks
[17:12:49] Yeah, let's get organized here
[17:12:56] Danny_B|backup: http://ur1.ca/1e8l0
[17:12:59] So, Siebrand asked what the situation is with LocalisationUpdate
[17:13:31] The WMF side of the answer is that we'll have to change a script (that lives in puppet, not in the extension) to pull from git instead of svn and that's it
[17:13:56] and commit back to git not svn?
[17:13:57] For 3rd party installs, LU doesn't support git directly the way it supports SVN, but 3rd parties wishing to migrate from SVN to git can simply adopt the WMF setup
[17:14:10] jeremyb: LocalisationUpdate doesn't commit anything back to anywhere
[17:14:22] err, huh
[17:14:30] The WMF setup would be documented on the extension page so it can easily be duplicated. But it's pretty simple anyway
[17:14:47] jeremyb: LU is an extension that's pull-only. The push side of the equation is on TranslateWiki's side
[17:14:49] So
[17:14:51] Next question?
[17:15:07] <^demon> Are there any more questions relating to the proposed layout?
[17:15:23] <^demon> If there aren't any more objections to it, I'd like to set that down in stone.
[17:15:42] I am fine with the layout
[17:15:47] layout+1
[17:15:49] siebrand: Any other concerns from the TWN side re layout?
[17:15:50] *robla gets chisel
[17:15:59] so one repo per extension?
[17:16:10] Yes
[17:16:11] robla: i prefer laser cutter
[17:16:52] ^demon: looks settled to me. what's next?
[17:17:10] <^demon> One minor bit of bikeshedding on a similar topic.
[17:17:18] teal
[17:17:20] <^demon> mediawiki/extensions-all.git for the meta-repo?
[17:17:28] <^demon> Open to name suggestions there.
[17:17:29] me like
[17:17:35] extensions-all sounds better than extensions-meta
[17:17:39] all-extension.git?
[17:17:43] +s
[17:17:46] I like "extensions-all"
[17:17:50] al--extensions.git ?
[17:18:00] *all-extensions
[17:18:03] extensions-all + extensions-wmf
[17:18:04] sounds yoda like
[17:18:13] <^demon> extensions-all and extensions-wmf would be good.
[17:18:25] it's all about sort order
[17:18:30] <^demon> These packages would just bit submodules pointing to the mediawiki/extensions/foo repos
[17:18:41] <^demon> s/bit/be/
[17:18:43] what there is to sort with those?
[17:18:50] is extensions-wmf (considered as) subset of extensions-all?
[17:18:51] ^demon: Can we talk about branch naming / conventions later?
[17:19:08] RoanKattouw: no concerns.
[17:19:20] Danny_B|backup: Extensions that are in -wmf would always be in -all , although I suppose -wmf could refer to older versions
[17:19:38] <^demon> I think that's the last of my naming bikeshedding.
[17:19:48] ^demon: what's next?
[17:19:48] Nikerabbit: You don't want the list of repos to look like "all-extensions , core, extensions, wmf-extensions"
[17:20:09] <^demon> Next item someone put on etherpad is "* Some SVN properties such as svn:externals to a 3rd-party project"
[17:20:13] <^demon> Does anyone want to clarify?
[17:20:43] i assume this involves geshi, fckedit etc.
[17:20:48] I guess that's a concern re porting stuff like geshi?
[17:21:14] <^demon> Presumably. Does someone know how git can do externals like that?
[17:21:22] <^demon> If it even can?
[17:21:24] ^demon: submodules are similar but are git-to-git only
[17:21:31] i think we'd just need to bite the bullet and import the files
[17:21:36] liangent added that question
[17:22:14] I guess we'd want to look for existing git repos for those projects first
[17:22:15] ?
[17:22:16] have a cron somewhere importing svn to git and then do git submodule?
[17:22:42] ^demon: submodules work differently with git than svn externals. they're pegged to a specific version. there's no way to say "latest"
[17:22:45] I met a GeSHi dev and I got the impression those folks like religiously love git :)
[17:22:54] <^demon> robla: You can do latest or pegged with SVN
[17:23:15] robla: that's the reason for the commit hook i thought?
[17:23:45] *robla looks up git feature request for floating revs
[17:23:47] <^demon> Well the post-commit hook was for updating the extensions-all meta repo.
[17:24:01] Yeah but obviously that doesn't work for foreign-hosted stuff
[17:24:08] <^demon> *nod*
[17:24:27] all our svn externals are also pegged
[17:24:29] or they should be :)
[17:24:43] <^demon> How many externals do we have across all extensions?
[17:25:05] ^demon: answering that question in a few mins.
[17:25:13] <^demon> Thank you.
[17:25:38] floating submodules rfc/patch for git: http://comments.gmane.org/gmane.comp.version-control.git/185164
[17:25:50] They're not technically all pegged, but the non-pegged ones are probably tags
[17:26:06] <^demon> I think we're going to have to do it on a case-by-case basis with externals. Some will be easier to just import like brion suggested.
[17:26:11] There are 9 externals
[17:26:21] <^demon> Larger libraries or ones that are updated more often might be annoying :)
[17:26:30] ^demon: 14 across trunk/
[17:26:51] I believe we have postcommit or precommit hooks in SVN that need to be ported -- add these to the doc?
[17:27:04] what's the current fallback for externals where the foreign svn server is down? or disappeared for that matter?
[17:27:21] jeremyb: error message and it doesn't check out
[17:27:24] <^demon> neilk_: The current post-commit hooks are about pinging CR and the mailing list.
[17:27:33] brion: someone has a backup?
[17:27:43] one hopes ;)
[17:27:50] so that's another benefit to actually importing the files: safety
[17:27:57] also more git-friendly in general
[17:28:07] <^demon> This is true.
[17:28:23] ^demon: list of paths with externals: http://p.defau.lt/?mdH9cRwLTJt6rnplnd9ePw
[17:28:36] you get the same safety from cronjob import (assuming updates do fast forward only) i think
[17:29:21] brion: does that effectively make extensions-all read only? How does a patch make it from the meta repo back into the subrepo?
[17:29:24] imported parts of extensions should be always in the same named folder - that should become a part of extension development guidelines
[17:29:27] <^demon> siebrand: Thank you. That's not that bad.
[17:29:28] or since you have to change the pegging anyway to get any new stuff you could just make it a manual update
[17:29:46] robla: extensions-all would exist as a handy way to check everything out
[17:29:52] so it's easily distinguishable at first sight
[17:29:53] ^demon: oh, I see. So the real thing to do is to make Gerrit do all the things CR used to do. I don't know how things end up in IRC, I assume a similar process, RSS and some bot?
[17:29:53] you'd make commits against the individual repos
[17:30:06] which i think you could do by just cd'ing in and doing git stuff there
[17:30:13] robla: extensions-all is just submodules and no actual code i thought?
[17:31:02] neilk_: Have you hung out in #wikimedia-operations or #wikimedia-labs recently? Gerrit already reports puppet changes there
[17:31:03] has anybody compared the features of the CR tool and gerrit? Can it filter by path, author and status?
[17:31:10] jeremyb: I *think* brion is proposing something different
[17:31:16] RoanKattouw: not only not recently but not ever.
[17:31:17] heh, Nikerabbit's question is almost comical
[17:31:22] Gerrit sucks
[17:31:28] <^demon> neilk_: Right now codurr's a bot watching from TS.
[17:31:36] I mean, its core is great, but the UI and everything around it sucks
[17:31:42] jeremyb: robla no, extensions-all is git submodules only as we've discussed so far
[17:31:47] so if gerrit sucks, are we going to force it on everyone?
[17:32:01] *^demon disagrees that gerrit sucks.
[17:32:11] <^demon> The UI isn't intuitive, but it's not a bad tool.
[17:32:14] OK maybe that was too harsh
[17:32:26] Gerrit's UI is not nice to say the least
[17:32:43] it looks very messy to me
[17:32:45] Well, if I can't filter by path and author, then something else about review workflow needs to change.
[17:32:47] <^demon> And really, gerrit will only apply for gated repos (trunk, wmf-deployed extensions)
[17:32:56] Gerrit can filter by author
[17:33:01] I don't believe it can filter by path
[17:33:06] ok
[17:33:07] <^demon> If the code isn't in a gated model, you just push and can skip gerrit entirely.
[17:33:08] But it can filter by branch
[17:33:21] And, remember, most of the things we use path filtering for now will be separate repos by then
[17:33:24] can it at least show a list of all authors?
[17:33:36] ^demon: were there any unresolved issues that you're trying to settle with Gerrit, or is this discussion just about whether we should use Gerrit?
[17:33:41] RoanKattouw: ok, so that probably solves things for extensions, but for core it will be a huge mess...
[17:33:46] RoanKattouw: for good and bad I assume?
[17:34:03] <^demon> robla: Some backend issues, but nothing show-stopping from where I sit.
[17:34:04] Nikerabbit: ?
[17:34:19] There are a lot of things we could do to make Gerrit nicer to use
[17:34:36] is it worth it?
[17:34:47] Nikerabbit: what's the alternative?
[17:35:36] Yeah that's a good point
[17:35:40] hack CR to support Git?
[17:35:45] heh
[17:35:56] It's probably easier to make Gerrit nice
[17:36:18] <^demon> I could go list a bunch of reasons CR sucks too.
[17:36:25] does gerrit have an api? we could replace its entire frontend
[17:36:44] let's replace ours first :)
[17:36:44] heh
[17:36:49] is there, like, a spec with a list of the functionality that we need in a CR tool?
[17:36:56] wait who here has actually used gerrit apart from roan and demon
[17:37:00] CORS, oauth :P
[17:37:01] There is a wiki page where we wanted to track this
[17:37:04] i seriously find gerrit completely unusable, every time i touch it i'm baffled
[17:37:18] But it only lists one thing
[17:37:21] "replicate Extension:CodeReview" is kind of vague and I predict a bunch of things would fall through the cracks
[17:37:28] neilk_: apergos
[17:37:31] <^demon> brion: Not like you're used to, but the whole thing is ajaxy using JSON-RPC so it could be possible, in theory.
[17:37:44] brion: A little bit of documentation would go a long way there I guess
[17:37:45] *shudder* oh this is going to be fun :)
[17:38:09] I haven't actively used it, just browsed, and my impression is that its interface will be even more frustrating with our commit rate, no?
[17:38:19] How does commit rate matter here?
[17:38:43] currently, the number of commits under review is relatively low
[17:38:50] "unusable" needs to be clarified. Like, is it seriously conceptually flawed or does it just not explain things well in its UI?
[17:38:55] dashboard is useful
[17:39:00] neilk_: IMO it's the latter
[17:39:11] I don't believe Gerrit is conceptually flawed. Actually, I think it's conceptually great
[17:39:17] neilk_: at a minimum it doesn't explain things well in its UI. as such, i haven't fully divined its conceptual structure
[17:39:26] will it continue being useful with 1000 commits to review?
[17:39:29] It's one of those things where the concept is brilliant, the backend implementation is OK, and the frontend is badly done
[17:39:32] <^demon> MaxSem: That dashboard is supposed to be your primary view. It'll show you things you're signed up for reviewing.
[17:39:46] ok so it might fundamentally suck or be fundamentally okay... but its superficial design flaws are hiding that for now :)
[17:40:13] neilk_: I actually understand Gerrit, as do ^demon and Ryan, and I think we all agree it's fundamentally solid
[17:40:21] MaxSem: Google is using it for some pretty large projects. whether or not it will work for ours is another story, but there is at least some evidence it can be coerced to work
[17:40:28] another thing is that, in principle, with better use of git and personal work branches there'll be fewer commits actually going through final gerrit review -- because we'll be able to do more work collaboratively on a branch before pushing it together as a chunk
[17:40:32] this is imo good :D
[17:40:51] right now we have to commit to trunk just to share our code -- that's bad
[17:40:53] heh, internally Google used Mondrian last time I heard
[17:40:56] <^demon> Also, remember like I said earlier that gerrit only matters for gated extensions.
[17:40:58] Also, the ops people have been using it for puppet for a while
[17:41:00] it clutters the code review system with crap that's not actually ready yet
[17:41:06] <^demon> If you're writing your extension and you don't care if anyone ever reviews it.
[17:41:10] <^demon> You can push without review
[17:41:12] <^demon> And never touch gerrit
[17:41:24] (sorry, I am unavailable right now)
[17:42:17] (are there two git workflow irc meetings scheduled today? or did i just have the wrong one on my calendar for this afternoon pst?)
[17:42:24] there are 2, brion
[17:42:27] confusing!
[17:42:29] <^demon> brion: Yes, so we can grab more timezones.
[17:42:30] the second one is in about 6 hours
[17:42:34] covering the same material
[17:42:37] ok that explains that anyway :D
[17:42:41] <^demon> So I could grab Tim + anyone who missed it in SF.
[17:42:46] perfect
[17:42:54] ok i gotta run into the office, catch y'all later
[17:43:06] Hmm, 1am, no I'm not going to that one
[17:43:28] Sooo
[17:43:32] Any other items / questions
[17:43:38] I had a thing about branch naming
[17:43:41] *sumanah looks at http://etherpad.wikimedia.org/Git
[17:43:52] But that's kind of bikeshed-y
[17:43:55] random tip I just heard: "it's basically a terminal gui interface w/ color, so it defaults to showing info similar to git-log, but you can drill down. similar to gitweb in browser. very useful, very fast. http://jonas.nitro.dk/tig/screenshots/ "
[17:44:10] the software's called "tig"
[17:44:15] Mondrian: http://video.google.com/videoplay?docid=-8502904076440714866 (Google Tech Talk)
[17:44:37] were we supposed to assign any tasks now? I see we've gathered a few more...
[17:44:43] *RoanKattouw sees ASCII art revision graph and groans
[17:44:57] <^demon> neilk_: Most of those tasks will fall to me :)
[17:45:05] <^demon> I may delegate if someone's volunteering.
[17:45:15] I've used tig. keystrokes are kinda unintuitive... it's nice but classic unixy ui
[17:45:15] neilk_: https://www.mediawiki.org/wiki/Git_migration_issues and https://www.mediawiki.org/wiki/Git_conversion
[17:46:11] neilk_: one of the TODOs from a platform engineering meeting last week: Chad will write the "how we are doing this" primer before the switchover.
[17:46:36] ok, I'm just far out of the loop here I guess
[17:46:41] sumanah: thanks
[17:46:43] neilk_: but fortunately instead of writing our own "how to use git at all" primer we can use, like, http://openhatch.org/missions/git and similar tutorials
[17:47:19] sumanah: pls put it in etherpad under the todo/documentation, thx
[17:47:28] neilk_: well, IIRC ^demon has been saying that he needs to update those pages on mediawiki.org
[17:47:50] Danny_B|backup: just a moment....
[17:49:13] Danny_B|backup: done.
[17:49:20] thank you
[17:50:26] *sumanah looks at http://etherpad.wikimedia.org/Git
[17:51:27] so RoanKattouw & ^demon -- where should I look to see the spec of exactly how we need to mod/improve Gerrit?
[17:51:36] <^demon> Any other questions/things we want to cover? We're almost at an hour now (and I'm wanting some lunch)
[17:51:44] sumanah: In our minds
[17:51:55] https://labsconsole.wikimedia.org/wiki/Gerrit_bugs_that_matter
[17:51:59] Incomplete ---^^
[17:52:06] <^demon> All the improvements I want are backend things. I've gotten used to the UI already
[17:52:12] <^demon> But yeah, that link.
[17:52:28] What backend improvements do you want then?
[17:52:34] If we want to improve the number of people who feel comfortable doing code review, maybe we should reach out to a few specific people & ask them to beta test
[17:52:36] one thing I hate about Gerrit is that upon clicking on "diff all" it opens a bunch of windows
[17:52:37] (Apart from the LDAP issue)
[17:52:52] Yes
[17:52:55] MaxSem: I have filed a bug for that
[17:53:02] I want the full diff displayed inline
[17:53:07] I was *gonna* code that up in NOLA
[17:53:17] But spent most of the weekend trying and failing to set up Gerrit on my machine in the first place
[17:53:26] GrafZahl_: have you taken a look at the test git repository? the sandbox?
[17:53:47] and it doesn't support Opera!
[17:54:09] sumanah: Nope, my repo is on GitHub
[17:54:20] sumanah: Up to now I didn't know of a test repo
[17:54:41] are we going to keep old svn in read only mode?
[17:54:50] <^demon> Danny_B|backup: Yes
[17:55:01] GrafZahl_: let me dig up the link -- I know ^demon sent it out to wikitech-l, the developers' list
[17:55:02] <^demon> I plan to keep the old svn repo up in read only mode for practically forever.
[17:55:08] Platonides, thedj, have you played with the test repo at all?
[17:55:19] <^demon> Platonides has, and he pushed changes to it :)
[17:55:23] <^demon> He's been super helpful
[17:55:25] ^demon: so all [[rev:1234]] as well as r1234 will work, right?
[17:55:51] <^demon> Well [[rev:]] links point to CR, not viewvc/svn.
[17:55:59] <^demon> I imagine we could just set CR as read-only too.
[17:56:07] GrafZahl_: `git clone https://gerrit.wikimedia.org/r/p/test/mediawiki/core.git`
[17:57:01] ^demon: [[rev:1234]] *used to* point to viewvc
[17:57:35] sumanah: thanks
[17:57:54] so we can switch it back if necessary
[17:58:19] Can anyone remind me why literal protocol-relative URLs don't result in links where any other protocol would ?
[17:58:22] <^demon> Perhaps. But leaving the links stable as they are now wouldn't be bad. We don't want to trash our current CR history.
[17:58:44] like //foo.org/bar
[17:58:55] <^demon> I think there's an open bug for that.
[17:59:11] <^demon> Anyway, we've come up on about an hour. Any last questions?
[17:59:13] Krinkle: // can be a comment
[17:59:17] ok, thanks ^demon
[17:59:22] Krinkle: Because it's got way too much potential for accidental links
[17:59:25] Yeah, I think we're done
[17:59:33] so i assume it's not resolving now by design
[17:59:43] Danny_B|backup: In wiki markup ?
[17:59:51] <^demon> Alrighty. Thanks everyone for your input.
[18:00:10] Krinkle: you can have a demo chunk of code on site
[18:00:15] <^demon> Also one last thing on gerrit: it *is* under active development, so if you find things about it that particularly bug you, I encourage you to file bugs about it.
[18:00:17] s/site/page/
[18:01:04] Danny_B|backup: So if someone doesn't use <pre> or <code> or <nowiki> or <source> and pastes code with only a space indentation, there can be random links
[18:01:15] 	Danny_B|backup: Well, only if there is no space between // and the text, which usually isn't the case.
[18:01:55] <^demon|away>	Lunchtime!
[18:01:57] 	sure, but as RoanKattouw said - it would trigger lots of false positives atm
[18:02:25] 	I added a few items to http://etherpad.wikimedia.org/Git around recruiting testers for the test repo, documentation of the CR tool/Gerrit todos, and so on
[18:02:42] 	meeh
[18:02:51] 	Krinkle: The standard regex requires a \b (word break) before the start of a protocol, but / is a word-breaking character, so that fails
[18:02:51] 	what exactly do "mine" and "their" mean when doing svn merge?
[18:03:00] 	Nikerabbit: "mine" is what you have locally
[18:03:07] 	"their" is what's coming in from the repository
[18:03:40] 	RoanKattouw: okay, so currently it is not deliberately excluded from anything in the Parser, it just happens to be the way it is.
[18:03:51] 	as usual :P
[18:03:58] 	No
[18:04:07] 	It is deliberately excluded from the autolink protocols
[18:04:21] 	Because otherwise foo//bar would be rendered as foo//bar
[18:04:43] 	The fact that / is a word-breaking character makes the regex overly wide rather than overly narrow
[18:06:19] 	Hm... I don't know regex very well, but if the regex states that it must be word-broken (which makes perfect sense here) in order to become a link, shouldn't // only become a link if text is like foo///bar (three slashes) ?
[18:06:31] 	the \b is inside the protocol-match ?
[18:07:14] 	foo/http://bar does become a link
[18:07:19] 	which makes sense
[18:07:23] 	later
[18:07:23] 	The \b matches the space between a word char and a non-word char
[18:07:35] 	So foo//bar matches \b\/\/[a-z]+
[18:07:36] 	RoanKattouw: not the _space_
[18:07:40] 	No, not the space
[18:07:42] 	but the word boundary
[18:07:43] 	The .... void?
[18:07:47] 	Yeah
[18:07:56] 	There is an imaginary "place" between those two characters
[18:07:57] 	RoanKattouw: \b matches a change from \w to \s iirc
[18:08:01] 	And that's what \b matches
[18:08:06] 	No, not to \s
[18:08:10] 	From \w to \W
[18:08:19] 	Because, as I said, \b matches at the position between '/' and 'b'
[18:08:53] 	understood. so why would \b + (protocols) match // ? It's not a word break + protocol, it's either a word break with a slash or just two slashes.
[18:10:42] 	The thing is, the regex is essentially this
[18:10:52] 	'\b' . preg_quote( $protocol )
[18:11:02] 	So for $protocol == 'http://' that's fine
[18:11:18] 	'h' is a word char, so it forces the preceding char to be a non-word char
[18:11:25] 	But for $protocol == '//' it's the other way around
[18:12:38] 	unfortunate timing.... skype call that I could not put off
[18:12:43] 	now let me do the backread
[18:13:10] 	Okay, so in case of foo/http://bar \b is between '/' and 'h', and for foo//bar, it's between 'o' and '/' ?
[18:13:20] 	nasty
[18:14:22] 	I believe that's how it works, yes
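A minimal JavaScript sketch of the boundary behaviour just described (the regexes are illustrative stand-ins, not MediaWiki's actual autolink pattern):

    // \b before 'http://' needs a non-word character ahead of the 'h',
    // but \b before '//' matches right after any word character,
    // so a bare protocol-relative pattern over-matches.
    var httpRe = /\bhttp:\/\/\w+/;
    var protoRelRe = /\b\/\/\w+/;
    console.log( httpRe.test( 'foohttp://bar' ) );  // false: 'o'-'h' is not a boundary
    console.log( httpRe.test( 'foo/http://bar' ) ); // true: boundary between '/' and 'h'
    console.log( protoRelRe.test( 'foo//bar' ) );   // true: boundary between 'o' and '/'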
[18:17:24] 	Krinkle: not that nasty :)
[18:17:31] 	back
[18:17:43] 	Krinkle: the nasty thing is with a word having a dash
[18:18:41] 	for example human-computer
[18:18:46] 	you will only match "human"
[18:18:52] 	cause \b is between n and -
[18:19:21] 	the result is that on http://www.mediawiki.org/wiki/Special:VisualEditorSandbox you can not select "human-computer" with a double click
[18:19:35] 	(cause it uses \b \B to find word boundaries)
[18:20:03] 	and I am pretty sure Javascript does not let us redefine \w \W \b \B etc..
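A quick sketch of the dash case hashar describes, in plain JavaScript (nothing MediaWiki-specific):

    // '-' is a non-word character, so a boundary sits between 'n' and '-':
    // a \b-based word matcher sees "human" and "computer" as two words.
    var match = /\b\w+\b/.exec( 'human-computer' );
    console.log( match[0] ); // "human", not "human-computer"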
[18:26:10] 	seems I'm a bit late with this topic, but... what about [[commit:]]?
[18:29:32] 	hashar: thx
[18:30:09] 	Krinkle: http://www.regular-expressions.info/ is a good source of information
[18:30:31] 	Krinkle: they have some links to Windows tools that will let you visually play with regex
[18:30:55] 	http://www.regular-expressions.info/regexbuddy.html  http://www.regular-expressions.info/regexmagic.html
[18:31:02] 	there are similar tools under kde / gnome IIRC
[18:31:15] 	well, if we would really want, one could reproduce \b with a more advanced regex and keep that part out.
[18:31:29] 	but I think not matching foo-bar as one word is perfectly fine
[18:31:33] 	most text editors do it too
[18:31:38] 	when double-clicking words
[18:31:46] 	so not just javascript surfaces
[18:32:13] 	I'd say it's the expected behavior, or at least, has become it over the years.
[18:32:24] 	from a UX perspective that is.
[18:36:34] 	Krinkle: most probably. I don't have an opinion on dashed words, just took that as an example of \b usage :)
[18:36:44] 	sure, cool :)
[18:37:00] 	hashar: Arr, so while you're here. What's the status on the testy testy ?
[18:37:20] 	I fixed some bugs friday and today
[18:37:38] 	the VM is running fine. Aka the mediawiki installation is actually usable
[18:37:59] 	so now I need an op to build the package, upload it to apt.wm.org and then unleash that on gallium
[18:38:05] 	will most probably do that tomorrow :)
[18:38:25] 	apt.wm.org ?
[18:38:31] 	never heard of that
[18:38:35] 	but I can guess what it is
[18:38:36] 	:)
[18:39:11] 	that one http://wikitech.wikimedia.org/view/Apt.wikimedia.org  :-)))))
[18:39:41] 	currently figuring out why X forwarding does not work :D
[18:40:00] 	will have to choose between:  50/50, calling a friend or a joker
[18:40:11] 	dinner time, baby bath. See you later this evening
[19:05:20] 	hi, I added some of my "cents" to the pad
[19:58:34] 	rsterbin: So, what's the api.php URL for that clicktracking-won't-take-absolute-URLs bug?
[19:59:47] 	RoanKattouw / DarTar: when i click on the "Learn More" button, I go to a page with this: "     " at the top
[20:00:08] 	OK, thanks
[20:00:50] 	i'm doing this locally -- url is api.php?action=clicktracking&eventid=ext.articleFeedbackv5%400-option1-cta_learn_more-button_click-bottom&namespacenumber=0&token=b67YHdNap0Tr6OFu2d16u5AzgOOB4smGU&redirectto=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FWikipedia%3AArticle_Feedback_Tool%2FTutorial
[20:04:04] 	I've found it
[20:04:10] 	It's some validation code that I put in that we don't really need
[20:10:54] 	rsterbin: 	catrope *  /trunk/extensions/ClickTracking/ApiClickTracking.php: Remove server-side "validation" of redirect URLs, not needed and getting in the way for AFTv5
[20:11:21] 	awesome
[20:11:41] 	Also just deployed that to the site
[20:12:07] 	fabriceflorin: option 4 being broken on prototype was due to a bug i fixed on friday -- i've updated prototype, so you should be able to see it there.
[20:13:52] 	RoanKattouw - works great now (locally); shall i test on labs?
[20:13:59] 	Please do
[20:14:05] 	I am in the process of updating labs to SVN
[20:14:11] 	*from SVN
[20:14:12] 	to HEAD
[20:14:35] 	ok
[20:16:50] 	RoanKattouw: just ran into a bug on prototype on edit -- "Fatal error: Call to undefined method  EditPage::getTitle() in  /srv/org/wikimedia/prototype/wikis/rc/extensions/ArticleFeedbackv5/ArticleFeedbackv5.hooks.php  on line 310"
[20:17:00] 	Rawr
[20:17:07] 	looking into that now
[20:17:10] 	Thanks for saying that right before I press Enter :)
[20:17:14] 	Oh, I know what that is
[20:18:03] 	oh good
[20:18:31] 	i'm looking at docs, but perhaps it's quicker if you fix it
[20:20:13] 	I'm committing it now
[20:20:31] 		catrope *  /trunk/extensions/ArticleFeedbackv5/ArticleFeedbackv5.hooks.php: For 1.18wmf1 compat, don't use EditPage::getTitle()
[20:21:08] 	word
[20:25:27] 	Thank you guys! I have started a new Etherpad here: http://etherpad.wikimedia.org/AFT5
[20:25:49] 	Please add your notes in the appropriate section.
[20:26:05] 	OK I've updated AFTv5 on labs
[20:26:13] 	Thanks to Reha for catching a last-minute issue there
[20:26:50] 	Roan, is MediaWiki r106685 something we need to be concerned about?
[20:27:09] 	That's my fix for the issue Reha ran into
[20:27:37] 	Thanks, Reha and Roan! Do you need us to test this on labs? Is this related to 'Edit this page' CTA?
[20:28:04] 	It was for the thing where it passes something through to the edit page
[20:28:25] 	The other issue Reha was talking about on Skype, I don't know if that's in the deployed code or not (the clicktracking absolute URL thing)
[20:28:46] 	Got it, I will test the edit pass-through on labs now.
[20:28:58] 	Reha, thanks for updating Option 4 on prototype. Will check it later on, once we launch.
[20:31:50] 	ok
[20:32:29] 	fabriceflorin: just in case you're working hard and don't see my email - can you send me links to the most up-to-date AFT5 form screenshots/wireframes so I can shove them in the tutorial?
[20:32:47] 	Let me know when I'm good to go to production
[20:32:52] 	Seems to be working on en.labs, I am able to make an edit without any issues. However, I no longer get all the info about my set-up in the URL, which I used to get.
[20:32:59] 	RoanKattouw: the fix you just did for click tracking solves the issue i was talking about on skype
[20:33:00] 	Will do, Ironholds.
[20:33:08] 	fabriceflorin: thankee :). We all still on for 2pm?
[20:33:09] 	Yay
[20:33:14] 	good to go as far as i can see here
[20:34:09] 	Oliver, let's connect at 2pm PT. Please wait for my go-ahead before announcing, though. We need to test on production for about an hour before we can pronounce this ready to announce.
[20:34:58] 	Thanks, Roan. Are there any final tests we should do before you can deploy to production?
[20:35:38] 	fabriceflorin: of course :)
[20:35:42] 	Nah we'll be fine
[20:35:54] 	In the meantime I'll just be prepping; final help-page checks, drafting up announcements, clearing them, that sort of thing.
[20:37:38] 	Alright, deploying now
[20:37:46] 	*crosses fingers*
[20:38:06] 	Sounds great, Ironholds.
[20:38:09] 	oh crap
[20:38:10] 	103 Warning:  Missing argument 12 for ArticleFeedbackv5Hooks::trackEditSuccess() in /usr/local/apache/common-local/php-1.18/extensions/ArticleFeedbackv5/ArticleFeedbackv5.hooks.php on line 321
[20:38:12] 	hahaha
[20:38:16] 	ack
[20:38:18] 	Fixing
[20:38:19] 	 Alright, deploying now
[20:38:19] 	 oh crap
[20:38:23] 	WE'RE PROFESSIONALS, HONEST
[20:38:50] 	Oooh
[20:38:52] 	this is the dirty secret about professionals
[20:38:53] 	Hook incompat I guess
[20:39:18] 	Reha, should I be concerned that the URL on en.labs no longer includes the coordinates we used to have after editing? Right now, it only says: '&stable=0' ... as so:  http://en.labs.wikimedia.org/w/index.php?title=Golden-crowned_Sparrow&stable=0
[20:39:30] 	RoanKattouw: why would that not show up on a local checkout?
[20:39:49] 	Probably an incompatibility between trunk and deployment
[20:39:54] 	Hook signature probably changed
[20:39:58] 	fabriceflorin: not sure what you're talking about
[20:41:38] 	Reha, after I clicked 'Edit this page', then 'Save Edit', there used to be a string of parameters in the URL last week, identifying which bucket you used, which CTA you used, etc. Don't know if that was a bug or not, but I am reporting a change in behavior.
[20:42:03] 	fabriceflorin: That's because of that error message probably
[20:42:33] 	OK, as long as that data is being saved through some other means, I'm fine.
[20:43:31] 	hey Ironholds
[20:48:49] 	Alright, we are live on enwiki
[20:48:58] 	And this time it's not spewing PHP warnings :)
[20:49:20] 	it's always an accomplishment to have code that doesn't spew errors
[20:50:56] 	Cool! We're starting to test now!
[20:51:00] 	fabriceflorin: Alright so you and whoever else, please test AFTv5 on enwiki now :)
[20:51:03] 	jinx
[20:51:36] 	Thanks Aaron, if you are referring to AFT5, we appreciate the compliment ;o)
[20:51:54] 	the re-re-re-re rewrite of AFT :)
[20:52:18] 	OK, please post any immediate issues on this Etherpad page: http://etherpad.wikimedia.org/AFT5
[20:54:06] 	Roan: so on 	https://en.wikipedia.org/wiki/1935_Ohio_State_Buckeyes_football_team, what am I looking for?
[20:54:10] 	I can't see anything.
[20:54:30] 	A widget right below the red table, I think
[20:54:34] 	Or that's what I'm seeing
[20:54:38] 	You may be in a different bucket I guess
[20:54:42] 	Ah, got it now.
[20:55:05] 	I see '<articlefeedbackv5-bucket2-function arrayPrototypeUniq() { var result = []; for (var i = 0; i < ...'
[20:55:21] 	wtf
[20:55:29] 	clickable JS code ftw
[20:55:35] 	That ain't supposed to happen
[20:55:39] 	I don't get that, not even with ?bucket=2
[20:55:44] 	Oh, I don't think it shows on https *investigates by himself so as to lessen the workload on devs*
[20:55:53] 	Jarry1250: It shows on https for me
[20:56:02] 	oooh
[20:56:07] 	AaronSchulz: I think I might know why that is
[20:56:16] 	Oh, got it now. *ponders more*
[20:56:17] 	Someone's been naughty and used a for .. in loop on an array
[20:57:46] 	And I''m supposed to see it whilst logged in or not?
[20:57:49] 	I've found one instance, going through more code
[20:57:53] 	I'm seeing it while logged in
[20:59:53] 	RoanKattouw: i take it you're still working on the AFT deployment?
[20:59:58] 	Yes
[21:00:16] 	do you have any idea how much longer you'll need?
[21:00:20] 	Logged out and immediately saw it...
[21:00:40] 	Logged in and gone again (same article, same browser).
[21:01:24] 	awjr: Not long
[21:01:43] 	Aha, got a relevant JS error.
[21:01:54] 	Oh!
[21:02:19] 	Jarry1250: Let's hear it :)
[21:02:28] 	categories.current[cat].replace is not a function
[21:02:34] 	Aha
[21:02:36] 	OK
[21:02:46] 	Your issue and Aaron's issue are probably the same issue
[21:02:49] 	This next commit will fix it
[21:03:00] 	RoanKattouw: cool - can you let me know when yer done?
[21:03:37] 	Sure
[21:04:06] 	RoanKattouw: That's due to Twinkle probably, or a fix wasn't deployed yet..
[21:04:12] 	(clickable js)
[21:04:23] 	Krinkle: No, it's a classic for-in-loop-on-an-array thing
[21:04:27] 	I know
[21:04:31] 	Working fine on my tests so far. Tried bucketing and protected pages, no problems yet.
[21:04:37] 	That caused a fatal error last week in AFT
[21:04:44] 	?
[21:04:47] 	which I fixed by adding a hasOwn check
[21:04:49] 	Also working on uploading the latest images for Oliver.
[21:04:51] 	Oh
[21:04:52] 	OK
[21:05:08] 	(it was looping over an object and calling something on it which ended up calling undefined of undefined)
[21:05:10] 	Well ideally if you're looping over an *array*, you just use i=0; i < a.length; i++
[21:05:14] 	that was fixed but this is very alike
[21:05:15] 	That's how I fixed it
[21:05:18] 	Yeah well
[21:05:23] 	Object.prototype isn't extended very often
[21:05:29] 	Not as often as Array.prototype
[21:05:49] 	Still getting errors (purged, cache reloaded)
[21:06:07] 	They should be going away within the next 5 minutes
[21:06:17] 	RoanKattouw: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/106398
[21:06:20] 	ah, that's WebFonts
[21:06:23] 	see that bug
[21:06:29] 	(the bugzilla link)
[21:06:37] 	Right
[21:06:48] 	that shows a report by me about how twinkle happily adds 2 methods to array.proto
[21:06:50] 	We generally don't protect against Object prototype additions in MW code
[21:06:57] 	indeed
[21:07:00] 	Twinkle should be fixed
[21:07:06] 	Or is config an array?
[21:07:14] 	It's an object, right?
[21:07:31] 	this one fixes it in WebFonts, confirmed
[21:07:44] 	does AFT have a var by the same name ?
[21:07:54] 	No
[21:08:01] 	I was just wondering
[21:08:06] 	Anyway
[21:08:07] 	config in Webfonts is indeed an array now that you mention it
[21:08:08] 	I traced it
[21:08:11] 	Right
[21:08:16] 	https://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/WebFonts/resources/ext.webfonts.js?view=markup&pathrev=106398#l182
[21:08:20] 	So then the appropriate fix is to use an i=0 loop
[21:08:24] 	yep
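A minimal sketch of the for-in-on-array failure mode discussed above, assuming some other script (Twinkle, in this case) has extended Array.prototype; the names here are invented for illustration:

    // Another script adds a method to Array.prototype...
    Array.prototype.uniq = function () { return this; };
    var cats = [ 'Birds', 'Sparrows' ];
    // ...and a for..in loop now sees 'uniq' as if it were an index:
    for ( var key in cats ) {
        console.log( key ); // logs 0, 1, and 'uniq'
        // cats[key].replace( ... ) throws "replace is not a function" for 'uniq'
    }
    // The fix: index the array explicitly (or guard with hasOwnProperty):
    for ( var i = 0; i < cats.length; i++ ) {
        console.log( cats[i].replace( 's', 'z' ) ); // real elements only
    }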
[21:08:27] 	Anyway
[21:08:43] 	congrats on getting AFT5 out!
[21:08:43] 	Jarry1250: You should see the fix if you Ctrl+F5 at any point 5 minutes after I deployed the fix
[21:09:04] 	for folks that want to help with post-deployment testing, here's an ether pad we're using:
[21:09:05] 	http://etherpad.wikimedia.org/AFT5LaunchTests
[21:09:07] 	I think it's been 5 mins already
[21:09:24] 	Cool, will let you know.
[21:09:33] 	Yup, working :)
[21:09:35] 	yay
[21:09:37] 	awjr: I'm done
[21:09:50] 	RoanKattouw: by the way, I watched a Google Tech Talk yesterday on JS performance. Not a typical one, I actually didn't know most of these. Among them was an interesting way to optimize a for loop or a while loop: use the second statement of the for loop (or the condition of the while) as something boolean-returning. So instead of i < len; i++, use i--
[21:09:53] 	RoanKattouw: awesome thanks
[21:10:04] 	congrats on AFT v5 :)
[21:10:09] 	had especially nice results in javascript
[21:10:19] 	Ah, right
[21:10:26] 	although it's not huge, and one could argue readability wins over performance here
[21:10:31] 	for ( i = n; i--; )
[21:10:33] 	yep :)
[21:10:39] 	Yeah I tend to go for readability > performance
[21:10:46] 	Now if n is like a million, then maybe
[21:11:04] 	But generally for loops with ten iterations are not gonna dominate your running time
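For reference, a sketch of the reverse-loop idiom Krinkle mentions (the array is just example data):

    var arr = [ 'a', 'b', 'c' ];
    // The loop condition doubles as the decrement: i-- stays truthy until
    // it reaches 0, so .length is read once and no comparison is needed.
    for ( var i = arr.length; i--; ) {
        console.log( arr[i] ); // note: iterates in reverse ('c', 'b', 'a')
    }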
[21:11:12] 	RoanKattouw: Anyway, if you're interested, I added a couple of links here: http://www.mediawiki.org/wiki/JSPERF#See_also
[21:11:21] 	New to this list are "Nicholas C. Zakas - Speed Up Your JavaScript" and "Douglas Crockford - JavaScript: The Good Parts (Google Tech Talk)".
[21:11:32] 	The latter you probably have already seen in various forms.
[21:11:56] 	recommend it :)
[21:14:04] 	I have never actually read The Good Parts
[21:14:06] 	which was the CR tag for 'need improvement'?
[21:14:07] 	I should read it some time
[21:14:11] 	Platonides: todo
[21:14:21] 	You know, that mythical time where I have time to actually read stuff
[21:14:29] 	howief / Roan / anyone who knows: Will this 11k deployment be used to analyse the usefulness of the extension or just its technical integrity ahead of a wider rollout?
[21:14:41] 	Jarry1250: the former, to a certain extent
[21:14:48] 	RoanKattouw: you read train routes instead?
[21:14:49] *RoanKattouw 	defers to Howie or Ironholds
[21:15:07] 	Jarry1250:
[21:15:08] 	both
[21:15:17] 	things like "testing if it makes a positive impact on editing activity" will wait a bit - the initial testing is just to see which version works best - but this phase is as much for research as it is for making sure nothing melts :)
[21:15:28] 	AaronSchulz: Haha. Nah, web sites about taxes and insurance and stuff actually
[21:15:39] 	(International moves are hairy)
[21:15:39] 	oh, my mistake then :)
[21:15:39] 	this initial 11k deployment is more for technical integrity, but Ironholds will be working with the community in selecting a few hundred high trafficked articles to test the tool on
[21:15:45] 	Oh, sorry about forgetting you Ironholds :) nothing personal.
[21:15:50] 	Jarry1250: that's fine!
[21:15:56] 	the data from this expanded set will be used to determine the usefulness of the extension
[21:16:21] 	the 11k deployment will be used for things like a/b testing data so we can work out a final version, but more complex tests on usefulness have to wait until we are sure we have the "Best" AFT version
[21:18:45] 	Jarry1250: would you like a link to the full research and testing plan?
[21:18:54] 	it's a wee bit opaque, to me at least, but it might help :)
[21:19:04] 	Hi Oliver, I just sent you the links to the new AFTv5 screenshots via email. The links are also posted on our Etherpad: http://etherpad.wikimedia.org/AFT5
[21:19:21] 	RoanKattouw, do msg_resource, msg_resource_links, module_deps tables contain useful data for the future? Or do they get automatically regenerated?
[21:19:26] 	fabriceflorin: yup, saw them, replied :)
[21:19:33] 	Jarry1250: http://meta.wikimedia.org/wiki/Research:Article_feedback/Data_and_metrics if you're interested
[21:19:37] 	Ironholds: Sure, hit me :) (See! I do research my Technology reports, honest.)
[21:19:42] 	Thanks, Ironholds!
[21:19:43] 	Thaanks.
[21:19:58] 	Platonides: They're caches, they can be safely purged and will be regenerated
[21:19:58] 	Jarry1250: that's more than I do! I mean, uh, I'm a professional, honest! (please don't quote me, I like my job)
[21:20:20] 	thanks
[21:20:24] 	I thought so, but wanted to confirm
[21:20:40] 	Jarry1250, here is a general link about this AFTv5 project, FYI: http://www.mediawiki.org/wiki/Article_feedback/Version_5
[21:21:04] 	I think if you purge one of msg_resource and msg_resource_links you need to purge the other too, but that's it
[21:21:09] 	fabrice: I have indeed seen that page, probably around a week  ago.
[21:22:26] 	it was just for skipping them altogether from backups
[21:22:57] 	Sounds great, thanks Jarry. Please let Ironholds or I know if you have any more questions.
[21:24:14] 	I'm looking at how to rewrite the table comments to imply that
[21:24:36] 	"Table for administering which message is contained in which resource" is a bit ugly for "do not include it in the backup"
[21:25:16] 	fabrice: Sure. I'm just doing a simple In brief this week but I do like to get them right :) I also appreciate all updates I can get on these kinds of things.
[21:25:20] 	fabriceflorin: in other news, we still on for the 2pm PST checkin?
[21:26:51] 	btw if any of the VE team want to check I've got my facts right, they are more than welcome to ( https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-12-19/Technology_report ) -- we haven't published yet.
[21:29:47] 	Hi Ironholds, yes we are still on for 2pm PT check in. Want me to call you over Skype?
[21:30:12] 	fabriceflorin: that'll work :)
[21:30:38] 	Reha, Greg, Sean or other OmniTI team members, are you finding any issues with the AFTv5 code on production?
[21:30:50] 	nothing so far
[21:31:07] 	Thanks, Reha! Roan, any issues on your end?
[21:31:19] 	Nope, all is quiet on the Eastern front
[21:34:09] 	Cool. That's the way we like it ;o)
[21:39:29] 	can you specify multiple wikis at the same time with mwscript? eg mwscript sql.php --wiki=metawiki,foundationwiki?
[21:40:48] 	Mo
[21:40:50] 	*No
[21:40:59] 	You'll have to do for wiki in foo bar; do mwscript $wiki blah; doe
[21:41:00] 	*done
[21:41:10] 	Or foreachwiki sql.php blah
[21:41:15] 	k, yeah didnt look like it from the code but wanted to be sure
[21:41:16] 	(but that does all wikis)
[21:41:17] 	thanks
[21:54:52] 	RoanKattouw: thanks for rolling out aftv5
[21:55:41] 	amen!
[21:55:50] 	now, lets pray that the community hates it
[21:55:55] 	I get paid by the hour and want to stay employed
[21:56:02] 	I mean, uh, I'm sure the software will be fine!
[22:03:16] 	fabriceflorin: call still on? :)
[22:04:28] 	Yes, Oliver, will call you in about 5 mins., OK?
[22:05:05] 	yeah, that rocks :)
[22:10:34] 	Jarry1250: Reading your VE coverage now
[22:10:36] 	"Nonetheless, it seems likely that hand-constructed pages will be subject to a one-off simplification program for overly complex wikitext structures, with their reintroduction prevented using a series of technical restrictions."
[22:10:42] 	That statement sounds slightly misleading to me
[22:10:57] 	Probably. What would you prefer?
[22:11:53] 	I'm sorry
[22:11:54] 	I was thinking of those {{medal table begin}} templates, incidentally.
[22:12:07] 	I was gonna explain but I choked on a glass of water and had to run off to the kitchen :(
[22:12:08] 	Yeah
[22:12:19] 	So most likely (and I am only half plugged into this), two things will happen
[22:12:36] 	First, we will forbid stuff like {{medal table begin}}, and pages using that will have to be fixed before they can start using VE
[22:12:40] 	But that's a manual step
[22:12:41] 	Hi guys. Dario is reporting that no comments are being stored right now, according to his first check.
[22:12:51] 	Second, we will probably enforce a "normalized" version of wikitext
[22:12:57] 	I am checking the data coming in on the slave DB and I only see feedback metadata, not feedback contents
[22:13:19] 	aft_article_feedback is getting correctly populated
[22:13:30] 	13 rows so far
[22:13:36] 	For instance, we'll always use [[Foo]] and never [[Foo|Foo]], we'll use == Foo == instead of ==Foo== or vice versa, etc etc etc. Essentially we'll do this for every syntactical ambiguity
[22:13:53] 	DarTar: NICE!
[22:14:03] 	wait, that means it's broken. crap.
[22:14:04] 	but aft_article_answer is empty
[22:14:12] *Ironholds 	needs to learn not to read from the bottom up
[22:14:19] 	Ironholds: indeed
[22:15:17] 	Jarry1250: So we set rules for resolving each of those ambiguities by determining what the WikiDOM->wikitext renderer will output. Then you can normalize wikitext by running it through the parser (wikitext->WikiDOM) then through the renderer (WikiDOM->wikitext)
[22:15:35] 	Jarry1250: So before enabling VE on a page, we'll make a normalization edit that will be clearly marked as such
[22:15:38] 	Will manual edits be silently normalised then?
[22:15:45] 	I assume so.
[22:15:47] 	Jarry1250: And then after VE is enabled, we will silently normalize your edit pre-save
[22:15:56] 	Just like ~~~~ is silently expanded to a signature
[22:16:01] 	It's in the subst: phase, essentially
[22:16:15] 	I have to leave, will be back in an hour or so
[22:17:19] 	Jarry1250: Note, all this normalization business is speculation at this point. It's an approach that's been discussed and I think it's the one the rest of the VE team is behind but I'm not sure. I know I support it :) but I don't know about the others
[22:17:41] 	Oh, sure. :)
[22:18:33] 	Anyway hopefully that clears things up
[22:18:45] 	Ironholds, I think we need to make sure this is not broken before we can promote it. So that means we need to hold off our call until we have addressed these issues.
[22:18:48] 	You seemed to imply the conversion of disallowed syntax to allowed syntax would be automatic
[22:19:05] 	Yes.
[22:19:09] 	So that's not the case; there is disallowed syntax and there is automatic conversion but they're not directly related
[22:19:39] 	fabriceflorin: the problem with the tables?
[22:19:44] 	"It also seems likely that once the editor is live, edits made in manual mode may be automatically tweaked for compliance before saving."
[22:19:45] 	cool :)
[22:19:49] 	That one is spot on
[22:19:58] 	Including the "seems likely" part :)
[22:20:12] 	RoanKattouw: and is that just the edit that gets tweaked, or would it normalise the rest of the page too?
[22:20:42] 	Ironholds: AFAIK it's a one off run to normalise the full page followed by silent normalisations of the page post-edit.
[22:21:27] 	ahh, awesome
[22:21:37] *Ironholds 	is not working on the VE at all, so knows nothing about it
[22:21:49] 	Roan, can you reproduce the problem reported by Dario? He cannot see any comments in the AFTv5 data he just checked on production.
[22:22:49] 	Ironholds: Yeah, essentially when a page is put into VE-supporting mode, we make an automatic normalization edit, then after that we silently normalize each edit so the page never becomes denormalized
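A toy JavaScript illustration of one such normalization rule, using the [[Foo|Foo]] example from above (this is not actual VisualEditor or parser code, just the idea):

    // Collapse piped links whose target and label are identical.
    function normalizeLinks( wikitext ) {
        return wikitext.replace( /\[\[([^|\]]+)\|\1\]\]/g, '[[$1]]' );
    }
    console.log( normalizeLinks( 'See [[Foo|Foo]] and [[Bar|baz]].' ) );
    // "See [[Foo]] and [[Bar|baz]]." -- only the redundant pipe is rewritten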
[22:24:41] 	RoanKattouw: thank you for making me feel like I do a worthless job :P
[22:25:03] *RoanKattouw 	doesn't get it
[22:25:43] 	Do you frequently do syntax normalization gnoming then?
[22:26:46] 	no, I more meant "I am surrounded by awesome people doing stuff more useful and incredible than my work"
[22:26:50] 	Oh, heh
[22:26:58] 	My involvement with VE has been very tangential
[22:27:04] 	you normalise syntax. I, uh. tell people you're normalising syntax. Big whoop.
[22:27:25] 	But I was involved in the "Trevor dumps his brain in front of three other people to eliminate bus factor" meeting a while back
[22:27:33] 	heh
[22:27:38] 	It took an hour and a half to go through the VE backend design
[22:27:43] 	That shit is complex
[22:27:49] *Ironholds 	nods
[22:27:56] 	I'd say "I can imagine" but I really can't
[22:28:12] 	I imagine I'll be more involved once VE picks up steam again
[22:28:37] 	Right now it's kind of dormant because Trevor is the driving force behind the whole thing and his Christmas vacation started today
[22:29:39] 	it still looks incredible
[22:30:25] 	Ironholds: heh, or you can tell me and I can tell people that the WMF is normalising syntax :P
[22:30:32] 	I would tell you to buy the VE team a beer, but I'm not the only one on that team that doesn't drink
[22:30:43] 	Jarry1250: subcontracting out my job? Iiiinteresting.
[22:30:47] 	Jarry1250: 2 degrees of separation, yay
[22:30:54] 	RoanKattouw: plus, I am kinda 13,000 miles away from most of them
[22:31:01] 	Well aren't I?
[22:31:10] 	(OK not for much longer but still)
[22:31:13] 	yes, but you'll probably be nearer sooner than I will be ;p
[22:31:39] 	Just organize a London hackathon already ;)
[22:31:47] 	Or, you know, get WM13
[22:32:39] 	RoanKattouw: Regarding RL2, got a minute ?
[22:32:50] 	Anyone got any last minute tips for the Signpost btw?
[22:32:52] 	Sure
[22:32:57] 	Or I'll sign off.
[22:32:59] 	RoanKattouw: So you added support for skins in the backend, right ?
[22:33:03] 	No?
[22:33:06] 	I'm planning to
[22:33:09] 	Probably this week
[22:33:15] 	Hm..
[22:33:26] 	I would've bet I saw you port that to RL2 ?
[22:33:33] 	I commented about it
[22:33:35] 	ok
[22:33:39] 	But I don't believe I actually did it
[22:33:54] 	anyway, regarding that. We need to think about optional things in validation.
[22:34:17] 	because, contrary to the old syntax, the JSON string doesn't only contain non-empty properties
[22:34:29] 	Hmm, right
[22:34:32] 	but it should stay functional and need no migration script when we add a property
[22:34:40] 	Are you saying you want stuff to be optional
[22:34:41] 	Ah, right
[22:34:43] 	Good point
[22:34:45] 	so I was thinking of either normalizing them in, or normalizing them out.
[22:34:47] 	What do you prefer?
[22:35:02] 	I don't think either is necessary
[22:35:13] 	What if we just tolerate missing properties and assume a default value for them
[22:35:24] 	Or third, make most optional
[22:35:25] 	And ignore unknown properties silently
[22:35:29] 	yeah
[22:35:36] 	okay, great :)
[22:35:47] 	I'll put that in as a task
[22:36:01] 	This will also simplify my unit test code
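A minimal sketch of the "tolerate missing, ignore unknown" approach just agreed on; the property names and defaults are invented for illustration, not the real gadget schema:

    var defaults = { skins: [], rights: [], hidden: false };
    function cleanProperties( props ) {
        var clean = {};
        for ( var key in defaults ) {
            // Missing properties fall back to their default value...
            clean[key] = props.hasOwnProperty( key ) ? props[key] : defaults[key];
        }
        // ...and unknown properties are silently dropped, so adding a new
        // property later needs no migration script for existing entries.
        return clean;
    }
    cleanProperties( { hidden: true, bogus: 1 } ); // { skins: [], rights: [], hidden: true }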
[22:36:29] 	RoanKattouw: yeah, that's what reminded me of this point. Going through CR backlog currently
[22:36:42] 	skipping some revs, but just making sure I know what's been happening
[22:37:14] 	RoanKattouw: Could you also look at applying normalization ? (order of properties, and stripping whitespace for db, and adding it for wikipage)
[22:37:22] 	Yes, that's also on my list
[22:37:35] 	ah, I see
[22:39:37] 	RoanKattouw: btw, ack-grep'ing for todos in our branch, I come across a todo that I solved partly but not sure how to properly do the other half.
[22:39:43] 	Exporting of mw-message pages
[22:39:56] 	how to get the translations as well?
[22:39:59] 	?
[22:40:28] 	http://svn.wikimedia.org/viewvc/mediawiki/branches/RL2/extensions/Gadgets/SpecialGadgets.php?revision=100001&view=markup#l340
[22:40:57] 	// Translation subpages of title message ... // @todo
[22:40:59] 	aha
[22:41:09] 	There's Title::getSubpages(), it's not too hard
[22:41:18] 	I did something similar in the migration script already
[22:41:24] 	filter out non-language codes or ignore that ?
[22:41:36] 	Just export all subpages? Can't hurt
[22:41:43] 	k
[22:42:21] 	interesting, I didn't know Title had that.
[22:42:30] 	Anyway, I'm off to bed
[22:42:43] 	k, thx
[22:42:58] 	Are you still doing RL2 frontend stuff?
[22:43:03] 	yep
[22:43:06] 	among other things
[22:43:17] 	What and how much are you doing these days anyway? I've seen you get involved with the Harvard thing
[22:43:54] 	yeah, Dario needed that fairly quick and got priority approval from alolita to get it done fast, so had to pause RL2 for a few weeks.
[22:44:19] 	That's done now, although it's going to get a second phase, I don't have workable details on that yet though.
[22:44:25] 	actionable*
[22:44:59] 	and there are two new projects alolita wants me on, I don't remember their names right now but that's also going to happen.
[22:45:20] 	alolita: on that, you wanted to e-mail me about those ?
[22:45:50] 	and October code challenge, code review for that, needs to be done this week.
[22:46:00] 	that's it currently.
[22:46:10] 	OK well
[22:46:23] 	I'm trying to put in some hours the next few days, I'm working on RL2 unit tests
[22:46:29] 	(Krinkle, btw, if you get a sec, is it possible to resolve de->Deutsch within JavaScript?)
[22:46:34] 	But I will soon be reassigned to painting walls probably
[22:46:40] 	My parents decided to move weeks before I do
[22:47:07] 	Jarry1250: Hm.. not that I know of, no.
[22:47:12] 	Where do you need that ?
[22:47:50] 	Jarry1250: "within JavaScript" as in, native in the programming language or native browser API, no. But it's certainly possible in JavaScript using something from a MediaWiki server.
[22:48:23] 	Or even validate "de", "fr", etc? It's my TranslateSVG extension. I just wondered if there's a function for it within MediaWiki.
[22:49:45] 	(In the PHP equivalent I validate them by trying to resolve them into native-language names.)
[22:50:10] 	There's one in the language class i think
[22:50:39] 	oh sorry, you said js above
[22:51:44] 	Yup. I mean, I could just go and dump the current PHP language names into a massive JavaScript switch()-based function, but that hardly seems a nice way of doing it.
[22:52:56] 	You'd have a map like var languages = { 'de': 'Deutsch', 'en': 'English', ... };
[22:53:07] 	Supposedly, MW would export that map to JS
[22:55:13] 	RoanKattouw: It can do that already
[22:55:21] 	we use that on Commons for the AnonymousI18N drop down menu
[22:55:31] 	which is inspired by the language preference
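For illustration, the kind of lookup Platonides describes, given a server-exported code-to-name map (the map contents here are just examples):

    var languages = { de: 'Deutsch', en: 'English', fr: 'français' };
    function isValidLang( code ) {
        // A code is valid exactly when it resolves to a native name.
        return languages.hasOwnProperty( code );
    }
    console.log( languages.de );        // "Deutsch"
    console.log( isValidLang( 'xx' ) ); // false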