[16:08:17] hi
[16:08:36] waiting a few minutes for fabrice
[16:10:11] Good morning
[16:10:18] Or... afternoon on the East Coast I guess
[16:11:37] just barely
[16:13:18] hi fabrice
[16:14:01] Hi there, yoni_omniti. Sorry to be late, IRC wouldn't let me in when I tried to log in with my Verizon MiFi from the bus.
[16:14:28] fabriceflorin: If that happens in the future: http://webchat.freenode.net
[16:14:52] so, we're here to discuss the new workflow and how to execute it on the omniti side
[16:14:53] Thanks, Roan_Kattouw, great tip. What did I miss?
[16:15:05] Just people waiting for you
[16:15:23] currently - we have 2 issues: 1) sync between the devs and 2) push to prototype
[16:17:33] i understand that there is a proposed solution (by yeah) which is approved by roan, and the temporary solution implemented by roan (thanks for that!)
[16:17:37] am i correct?
[16:17:55] yeah = reha (autocorrect…… grrrr)
[16:17:59] I pretty much implemented the proposed solution, didn't I?
[16:18:16] if this is the case, i'm happy :)
[16:18:21] reha?
[16:18:43] I created a branch called 'omniti' and allowed all registered users to push directly to it without review
[16:18:58] is the branch only for aftv5?
[16:19:06] I should probably clamp down the permissions a little, make a separate group with OmniTI people and only allow them to bypass review
[16:19:11] Yes, this is only for AFTv5
[16:20:00] thanks! and yeah, please adjust permissions - i would like us to be limited to aftv5 on one hand, and i would like to keep the bypass open only for omniti and you - on the other hand
[16:20:17] Thanks for the overview, Yoni_omniti, RoanKattouw. Reha (rsterbin), have you been able to push code to this new configuration? Is it working for you?
[16:20:34] yes, it's working great
[16:20:51] i'm not able to switch to the branch on prototype, though
[16:20:53] not sure why
[16:21:04] That's strange
[16:21:07] i'm still setting up my local, will need some help from reha
[16:21:11] word
[16:22:07] Glad to hear this, rsterbin. So when do you think we will be able to test your bug fixes on prototype? Does yoni_omniti need to push them? Or can you push directly, so we don't have to wait for him?
[16:22:09] RoanKattouw - http://pastebin.com/raw.php?i=K45N5zhp
[16:22:52] once we get to where prototype can switch to the omniti branch, we should be good to go on that
[16:22:54] What does this mean? [omniti@prototype /srv/org/wikimedia/prototype/wikis/rc/extensions/ArticleFeedbackv5]$ git checkout omniti
[16:22:54] error: pathspec 'omniti' did not match any file(s) known to git.
[16:22:55] Did you forget to 'git add'?
[16:23:04] I got it
[16:23:06] catrope@prototype:/srv/org/wikimedia/prototype/wikis/rc-en/extensions/ArticleFeedbackv5$ git checkout -b omniti origin/omniti
[16:23:08] Branch omniti set up to track remote branch refs/remotes/origin/omniti.
[16:23:09] Switched to a new branch "omniti"
[16:23:13] lol
[16:23:19] i guess it only likes you
[16:23:22] I tried the other variations that you tried too
[16:23:27] Well, I had to run git pull first
[16:23:38] ok, let me see...
[16:23:42] sorry about that - colloquy quit on me
[16:23:42] hi everyone
[16:23:45] See if you can git pull
[16:23:54] hi DarTar
[16:23:59] This shared-clone-with-ssh-fetching thing is annoying
[16:24:06] I wish https wasn't broken on this machine
[16:24:12] RoanKattouw: http://pastebin.com/raw.php?i=9S7GUwGL
[16:24:37] rsterbin: Did you forward your key?
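A minimal sketch of the fix above: the local clone had no remote-tracking ref for the new branch yet, so it had to be fetched before the checkout could succeed (path taken from the log).

    cd /srv/org/wikimedia/prototype/wikis/rc-en/extensions/ArticleFeedbackv5
    git pull                                # fetches refs/remotes/origin/omniti
    git checkout -b omniti origin/omniti    # local 'omniti' branch tracking the remote one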
[16:24:51] the omniti account is shared
[16:25:02] Oh, right
[16:25:26] Unfortunately the way it's set up you need an SSH key to pull
[16:25:28] does it need to be your key, since you checked it out?
[16:25:42] No, it can be someone else's key, but.... oh rawr
[16:25:48] It would have to match their username too
[16:25:50] anything I can help with, please ping me – I'll be lurking on IRC
[16:27:12] Oh, wait, maybe I need to update the git client on prototype
[16:27:17] ok
[16:27:22] and maybe that'll make cloning over https work
[16:27:40] Ewww
[16:27:43] prototype is on hardy
[16:28:26] that is... very old
[16:28:38] Yeah
[16:28:46] Even our cluster doesn't run hardy anymore
[16:29:47] prototype is still up?
[16:30:05] it is!
[16:30:12] * Krinkle-away crawls back into his cage
[16:30:30] we use it so that fabrice can see what we're working on without having to take up roan's time for every commit
[16:31:05] Krinkle-away: I need to move that to labs soon
[16:34:19] OK I've got it
[16:34:26] Hang on while I swap out the SSH checkout with an HTTPS checkout
[16:34:33] And you should be able to pull without problems
[16:34:45] "7.9.1 Rules of Automatic Semicolon Insertion" now the fun part begins !
[16:35:31] OK, all done
[16:35:35] rsterbin: Try git pull now?
[16:35:42] Thanks, RoanKattouw, much appreciated.
[16:36:29] "error: cannot open .git/FETCH_HEAD: Permission denied"
[16:37:08] ah, it's not group-writeable again
[16:37:11] one sec...
[16:37:27] You can also sudo git pull if you want
[16:37:33] Probably easiest
[16:37:36] (long live sudo)
[16:37:56] ha, there it is!
[16:38:13] "Already up-to-date." -- woohoo
[16:38:25] Whee
[16:39:50] fabriceflorin: we're good to go. prototype has my fixes (mostly clicktracking-related), but not yoni's yet
[16:40:28] OmniTI can now just use git push origin omniti to push to their branch
[16:40:53] yoni_omniti: If you give me a list of the Gerrit usernames of all the omniti people, I can lock this down so that only they, not any random person, can push there
[16:40:54] RoanKattouw: has that fix that you merged to master last week been deployed?
[16:40:59] Excellent. We will start testing new bug fixes now. rsterbin, were you able to implement RoanKattouw's proposal to write click tracking data to a file, so that DarTar can review it on prototype?
[16:41:08] rsterbin: You mean "Bug fix: setLinkId() still had a reference to the old linkId property; closeAsModal() updated to use 'X' rather than '0'"?
[16:41:12] yep
[16:41:16] in a second
[16:41:17] Ah... maybe?
[16:41:25] yonishostak is mine
[16:41:28] I was on a different continent then, I don't remember. Let's see
[16:42:20] Nope not deployed
[16:42:24] Cool. What is the URL for the click tracking data file? Can you and DarTar work on this together, to make sure he's getting the data he needs? We want to deploy this as soon as possible, as we are not collecting useful data on production now.
[16:42:41] I'll deploy that fix of Reha's
[16:43:28] RoanKattouw, which fix is that? If it's click tracking related, let's make sure that DarTar reviews the file first.
[16:43:43] !g 3501
[16:43:44] Google says: http://www.google.com/search?btnI=745&q=3501
[16:43:56] What!
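A hedged sketch of the SSH-to-HTTPS swap described above; the exact repository URL is an assumption, built from the anonymous HTTPS prefix (https://gerrit.wikimedia.org/r/p/...) that appears in a fetch command later in this log.

    # point origin at the HTTPS mirror instead of the SSH remote (URL is an assumption)
    git remote set-url origin https://gerrit.wikimedia.org/r/p/mediawiki/extensions/ArticleFeedbackv5.git
    # on the shared account, sudo sidesteps .git files that aren't group-writable
    sudo git pull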
Who overwrote my !g shortcut
[16:43:58] !gerrit 3501
[16:43:58] https://gerrit.wikimedia.org/r/
[16:44:01] :D
[16:44:02] !gerrit del
[16:44:03] Successfully removed keyword: gerrit
[16:44:08] !gerrit is https://gerrit.wikimedia.org/r/$1
[16:44:09] Successfully added keyword: gerrit
[16:44:11] !gerrit 3501
[16:44:12] https://gerrit.wikimedia.org/r/$1
[16:44:13] brb
[16:44:15] ....
[16:44:23] https://gerrit.wikimedia.org/r/3501 <--- THIS
[16:46:14] fabriceflorin & RoanKattouw: that fix is only for aft_link_id being zero in the db
[16:46:52] here, let me try to go from omniti to master with the others
[16:47:51] rsterbin: You mean cherry-pick your other commits from omniti over to master?
[16:47:59] yep
[16:48:05] OK
[16:48:16] hm
[16:48:55] AFAICT they've already been submitted into master anyway
[16:49:07] ah
[16:49:18] Just haven't been approved and merged
[16:49:36] https://gerrit.wikimedia.org/r/#q,project:mediawiki/extensions/ArticleFeedbackv5,n,z
[16:49:37] alas. i thought this would be a good test for making sure our plan works :)
[16:49:50] huh, some of those claim to be merged
[16:50:05] that's weird
[16:50:22] they're definitely not showing up when i switch to the master branch
[16:50:50] huh
[16:50:56] I have a meeting starting in 10 mins but I can help look into the prototype logs when I am done. RoanKattouw: where is the data being captured? rsterbin: do you have access to the logs too?
[16:50:59] Oh, it noticed you pushed them into the omniti branch
[16:50:59] Thanks, rsterbin. Are we confident that this bug fix 3501 will solve the problem? Have we tested it on our end?
[16:51:02] https://gerrit.wikimedia.org/r/#change,3420
[16:51:02] yeah
[16:51:57] RoanKattouw: I think I found a bug and 2 inefficiencies while looking through CSSMin for the first time
[16:52:00] > return CSSMin::minify(' #foo { prop : foo ; content: "Hello: World" }' );
[16:52:00] #foo{prop :foo ;content:"Hello:World" }
[16:52:02] whoops, DarTar: i don't know yet -- RoanKattouw?
[16:52:31] fabriceflorin: i think all the testing has been mine, locally
[16:52:40] space before colon and space before } isn't trimmed and strings are touched
[16:52:42] let me pop in the db on prototype and do a quick test
[16:52:49] Krinkle-away: Re inefficiencies, we don't care. There are edge cases that aren't caught but they're not worth the effort. Re bug, I guess we missed that one
[16:52:54] but content: is evil :D
[16:53:19] it's just textNodes basically, not too evil. And often used in decoration and font-icons
[16:53:47] Krinkle-away: There's this thing called i18n .... :)
[16:53:55] true
[16:54:15] Also, I thought it was very strange CSS3 allowed this. CSS is for visual appearance, HTML is for content. This seems to fundamentally violate that separation
[16:54:29] but since it's an open libs class and we do run it over third party modules (like jQuery UI that already do i18n built-in)
[16:54:35] aye
[16:54:39] content isn't new in CSS3
[16:54:41] It should be fixed
[16:55:02] fabriceflorin: test successful on prototype
[16:55:19] !bs
[16:55:19] https://bugzilla.wikimedia.org/buglist.cgi?quicksearch=`e1
[16:55:24] :)
[16:57:44] Thanks, rsterbin. I am glad the test was successful on prototype. Which bug were you testing? Is it the aft_link_id being zero in the db? (https://gerrit.wikimedia.org/r/3501) Is there a link to the click tracking file which DarTar could review? Also, how can we check the other click tracking changes that needed to be fixed?
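For completeness, a sketch of the cherry-pick route rsterbin was about to try above (not needed in the end, since the commits were already in Gerrit); the commit hash is hypothetical.

    git checkout master
    git pull
    git cherry-pick abc1234                  # hypothetical hash, taken from 'git log origin/omniti'
    git push origin HEAD:refs/for/master     # upload to Gerrit for review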
[16:57:46] 	 RoanKattouw: I thought about it when I was looking through a jQuery syntax highlighter that uses pre.source-css::after { content: "CSS{;}" } or something like that to position a funny label in the corner of a <pre> block, and thought.. Hm.. wonder if CSSMin trims that away
[16:57:48] 	 RoanKattouw: omniti usernames are -- yonishostak, rsterbin, emsmith, gregchiasson, and seanheavey
[16:58:09] 	 Thanks
[16:58:12] 	 fabriceflorin: i tested af_link_id
[16:58:16] 	 np
[16:59:13] 	 fabriceflorin: clicktracking still seems to be stored in the db
[16:59:56] 	 DarTar: do you have access to the db on prototype?
[17:00:17] 	 (if so, i can give you the query i use to test clicktracking)
[17:01:12] 	 http://pastebin.com/rb4t08FA -- just grabs relevant information for the last ten events
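The pastebin contents aren't preserved here; purely as an illustration, a query of roughly this shape would show the last ten events (table and column names are hypothetical, not confirmed against the ClickTracking extension's schema).

    # hypothetical schema: check the ClickTracking extension's SQL definitions first
    mysql -e 'SELECT * FROM click_tracking ORDER BY action_time DESC LIMIT 10;' wikidb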
[17:01:34] 	 rsterbin: Added Yoni and you; Elizabeth, Greg and Sean don't seem to have Gerrit accounts
[17:01:49] 	 i was sure elizabeth had set hers up
[17:01:51] 	 hang on...
[17:03:00] 	 If she hasn't logged into Gerrit yet, it won't exist there
[17:05:10] 	 ah, looks like she had trouble with the forgot password loop
[17:07:09] 	 what do we do in case of a path conflict in someone else's commit when we try to review and accept it in gerrit?
[17:07:34] 	 I don't even know how to find out what the conflict is...
[17:07:54] 	 apergos: Basically you download the change and try to rebase it
[17:08:13] 	 git review -d 1234   (where 1234 is the 4-digit change number in the Gerrit URL)
[17:08:19] 	 git rebase origin/master
[17:08:20] 	 I don't use git review
[17:08:24] 	 Oh, meh
[17:08:33] 	 OK, if you go to the change page
[17:08:35] 	 I'm another of those die-hards :-P
[17:09:00] 	 RoanKattouw: re: r35491 - escaping the output is fine for production but that will allow goodfaith user tools to actually serve malicious content, I am not sure we should allow this
[17:09:04] 	 There is a command that you can copypaste that looks like    git fetch https://gerrit.wikimedia.org/r/p/mediawiki/core refs/changes/45/3745/2 && git checkout FETCH_HEAD
[17:09:10] 	 yup
[17:09:18] 	 I've used that for me in the past
[17:09:41] 	 DarTar: Anyone who serves content from a database (let alone a 3rd party database!) and assumes it's safe not to escape it deserves the exploits coming their way, IMO
[17:09:58] 	 So you fetch it with that, and then you run git rebase origin/master
[17:10:02] <^demon|away>	 RoanKattouw: Upstream bug we should file: `git review --help` should produce actual output.
[17:10:03] 	 That'll barf with a conflict probably
[17:10:08] 	 ^demon|away: +1
[17:10:17] 	 ^demon|away: Could probably alias that to git help review
[17:10:17] 	 ok hopefully it will be a bit more verbose about the problem
[17:10:46] <^demon|away>	 RoanKattouw: There is no git help review either, git-review lacks a man page.
[17:10:51] 	 apergos: Use git status to find out which files are conflicted. Use git add FILENAME to mark a file as resolved. When all files are resolved, run git rebase --continue
[17:10:55] 	 ^demon|away: It has a man page for me
[17:11:09] 	 and --help works for me
[17:11:18] <^demon|away>	 git help review
[17:11:19] <^demon|away>	 No manual entry for git-review
[17:11:19] 	 I am talking about the toolserver, which everybody in the community trusts by default; I am trying to minimize the risks to people who may be unaware of what we collect in the aft5 tables
[17:11:26] 	 You wouldn't be using one of them Macs, would you? ;)
[17:11:32] <^demon|away>	 Well duh ;-)
[17:12:20] 	 apergos: ... then after the rebase is done, check the diff with git diff HEAD^1..HEAD , then submit normally
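Pulling the steps from this exchange together into one sketch (3745 is the change number from the example fetch command above; the conflicted filename is hypothetical):

    git review -d 3745               # or copy-paste the 'git fetch ... && git checkout FETCH_HEAD' line from Gerrit
    git rebase origin/master         # replay the change onto current master; stops on conflicts
    git status                       # lists which files are conflicted
    # edit each conflicted file to resolve it, then:
    git add includes/Example.php     # hypothetical filename; marks the file as resolved
    git rebase --continue
    git diff HEAD^1..HEAD            # sanity-check the rebased change before resubmitting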
[17:12:48] 	 if we do strip this stuff in wikitext, should we not sanitize user feedback at all?
[17:13:13] <^demon|away>	 RoanKattouw: git review --version is reporting 1.15
[17:13:19] 	 I think we're creating a major opportunity for abuse
[17:13:28] 	 gotta go on a meeting, bbl
[17:13:59] 	 DarTar: there are several facets to this problem - e.g. if we do sanitize before storing in the db, what do we save? do we want to retain the malicious content (for legal reasons perhaps)?
[17:14:45] 	 Same here
[17:14:57] 	 agreed, it's a complex problem but we need to have clear specs in place, we cannot just say it's the end user's responsibility
[17:15:02] 	 well that was about useless
[17:15:09] * apergos  hard resets and tries it all again
[17:15:13] 	 piece of crap
[17:16:04] 	 DarTar: We can document this
[17:16:19] 	 But really, as a developer you should NEVER EVER EVER blindly trust stuff that's in a database
[17:16:42] 	 as a developer you should NEVER EVER EVER trust user input
[17:17:14] 	 I really need to start my meeting but we can resume in an hour
[17:17:20] 	 ok
[17:17:32] 	 let me know about db + prototype?
[17:17:37] 	 Storing user input in a DB and escaping it on the way out is a valid way of not trusting user input
[17:17:54] 	 In fact, it's the *recommended* way in our security manual
[17:18:00] 	 escape as close to the output as possible
[17:18:00] <^demon|away>	 apergos: `git config --global alias.ohcrap "reset --hard origin/master"`
[17:18:23] 	 RoanKattouw: elizabeth is set up on gerrit now; i'll harass greg and sean to get theirs up, too
[17:18:33] 	 yay
[17:18:42] 	 I'm gonna grab an overdue pizza
[17:18:43] 	 yeah, only it's usually gerrit that's making me swear
[17:18:45] 	 git is fine
[17:18:49] 	 agree there ;)
[17:18:54] * RoanKattouw  has been up since 6am and has teh hungahz now
[17:19:11] 	 Thanks, RoanKattouw, rsterbin and yoni_omniti. Have we solved all the main Gerrit/Git issues, so OmniTI can push code to prototype? (assuming we get Elizabeth's account validated)
[17:19:38] 	 apergos: Fetch change, rebase against master, fix conflicts, resubmit. I don't see what's the hard part other than the conflicts, and that's all git
[17:19:47] <^demon|away>	 Hahaha, this is an awesome git alias
[17:19:55] <^demon|away>	 mav = !afplay -s 0 12 ~/Music/iTunes/iTunes\\ Music/Kenny\\ Loggins/Top\\ Gun/Danger\\ Zone.mp3 &\ngit rebase -i
[17:20:03] 	 fabriceflorin: yep, i've successfully updated prototype using the joint omniti account
[17:20:05] 	 lol
[17:20:32] <^demon|away>	 `git mav HEAD~1` to rebase while listening to Danger Zone?
[17:20:33] <^demon|away>	 Oh yes
[17:21:08] 	 Good. If all is well on the AFT git/gerrit front, let's reconvene over Skype at 11am PT for our weekly AFT meeting. Roan, would you like us to call you on Skype to discuss our next deployment?
[17:22:12] 	 it rolled back a whole bunch of commits (no idea why), barfed in the middle, left me with a bunch of crap not actually in the downloaded change
[17:22:17] <^demon|away>	 brion: You missed it. Best git aliases :)
[17:22:18] 	 and said I was somehow supposed to resolve it
[17:22:19] <^demon|away>	 mav = !afplay -s 0 12 ~/Music/iTunes/iTunes\\ Music/Kenny\\ Loggins/Top\\ Gun/Danger\\ Zone.mp3 &\ngit rebase -i
[17:22:20] 	 fabriceflorin: Sure
[17:22:38] 	 apergos: What's the change# ?
[17:23:32] 	 RoanKattouw, sounds good. I will call you on Skype at 11am PT. Ideally, we would like to deploy the bug fixes at your earliest convenience, either later today (for those which have been tested), or on Wednesday. Will that work for you?
[17:23:32] 	 ah I see
[17:23:35] 	 heh
[17:23:46] 	 of course you think I'm on branch master like most folks
[17:23:46] 	 Sure
[17:23:48] 	 but I'm not
[17:23:51] 	 Oh, lol
[17:23:53] 	 Sorry
[17:24:02] 	 git rebase origin/$BRANCH
[17:24:15] 	 ^demon|away, hey that reminds me, how do we get new extensions into gerrit?
[17:24:18] 	 OK folks, thanks for the good work. Will continue this discussion via email and Skype. Over and out.
[17:25:00] <^demon|away>	 brion: I wrote to wikitech-l about that. See thread "migrating a non-WMF extension to Git" on March 23rd.
[17:25:02] 	 that is *much* better
[17:25:03] <^demon|away>	 I've gotta run
[17:25:06] 	 yeah obviously
[17:25:19] 	 it was my fault for not paying attention
[17:25:25] 	 ^demon|away, found it thx
[17:25:31] 	 otoh it is now 8:30 here and I started at 8:30 am so...
[17:26:06] 	 i'll stick with github for ExtensionFetcher for now then :D
[17:26:36] 	 this is someone else's change though, do I just resubmit it as their new patchset or something?
[17:26:43] 	 If your ext lives in github it should be trivial
[17:26:51] 	 yep
[17:26:58] 	 brion: I can create repos for you as long as I don't have to import anything from SVN
[17:27:06] 	 apergos: Yes, just resubmit normally
[17:27:11] 	 ok
[17:27:18] 	 It'll show you as the committer and them as the author
[17:27:28] 	 ok, I guess that's all right
[17:27:29] 	 whee
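A sketch of that resubmission, assuming the change targets master: pushing to Gerrit's refs/for/ namespace uploads a new patchset, and the Change-Id footer in the commit message ties it to the existing change, which is why the original author is preserved.

    git push origin HEAD:refs/for/master   # new patchset: author stays theirs, committer becomes you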
[17:27:58] 	 RoanKattouw, can you make me a repo for mediawiki/extensions/ExtensionFetcher ? i'll try and feed it with data on my own :)
[17:28:14] 	 Sure
[17:28:18] 	 Han on
[17:28:18] 	 thanks!
[17:28:34] 	 we're doing this one han-style...
[17:28:38] 	 solo!
[17:33:34] 	 brion: Pizza interruption, you'll get your repo later
[17:33:44] 	 :D
[17:37:03] 	 frack, is github misbehaving for others or does it just hate me today?
[17:38:55] 	 well the git part works anyway
[18:03:09] 	 i am having some trouble running the campaigns feature in the UploadWizard extension, can someone help me with this?
[18:04:51] 	 not too familiar with it myself sorry
[18:08:28] 	 that's ok brion, let me try asking on #mediawiki
[18:26:35] 	 RoanKattouw, rsterbin: https://bugzilla.wikimedia.org/show_bug.cgi?id=35496
[19:22:55] 	 DarTar: And you're certain you're getting save-complete events from AFT4? What do the keys look like?
[19:23:17] 	 hang on, let me look them up
[19:25:11] 	 pitch-edit-save-save-complete
[19:27:13] 	 Aha, this code lives in the ClickTracking ext
[19:27:15] 	 AFT4: *-pitch-edit-save-save-complete = AFT5: *-cta_edit-edit_success-*
[19:28:09] 	 AFT4: *-pitch-edit-save-save-attempt = AFT5: *-cta_edit-edit_attempt-*
[19:28:43] 	 in AFT5 we also count attempts/saves for edit tab/SEL edits, we're not doing this in AFT4
[19:35:36] 	 brion: hi! got a minute for a look at my latest Evil Plans (tm)? I want to cut MediaWiki loose from wikitext...
[19:35:46] 	 here's the writeup http://meta.wikimedia.org/wiki/Wikidata/Notes/ContentHandler
[19:36:17] 	 whee
[19:36:25] 	 i mostly want to know if you think it's feasible to get this in. I already know that it's a good idea :)
[19:36:32] 	 hehe...
[19:41:02] 	 anyone else want to pick at the thing and point out flaws?
[19:41:35] 	 Daniel_WMDE, the main alternative i might propose is tying content models to namespaces, which would map better to how we've done different content types in the past
[19:41:44] 	 however clearly we do some mixing
[19:41:50] 	 such as CSS and JS pages in MediaWiki: and User:
[19:42:11] 	 could i have, say, an image in the main namespace in this system? would that be an issue?
[19:42:12] <^demon>	 Just because we did that before doesn't mean we should repeat the same mistakes :)
[19:42:17] 	 aye :)
[19:42:30] 	 brion: well, i do set the default content model by namespace.
[19:42:32] 	 just sayin', inertia has its benefits and shouldn't be fully ignored ;)
[19:42:35] 	 excellent
[19:42:44] 	 and if, when and how that can be overridden is up to the UI
[19:43:05] 	 though i have been thinking about implementing an "allowed models per namespace" restriction at a lower level
[19:43:16] 	 migrating File: pages to structured data, for instance, might make a LOT of sense
[19:43:26] 	 indeed
[19:43:43] 	 one of my favorites (only mentioned as a side note on the page) is multi-part pages
[19:43:50] 	 which allows for "attachments"
[19:44:03] 	 so you'd have normal wikitext, but could attach any kind of structured data
[19:44:24] 	 like geo-coordinates, or license info, or citations, or categories, sister links, whatever
[19:44:36] 	 (this is out of scope of wikidata, but would become straightforward)
[19:44:39] 	 eek :)
[19:45:40] 	 brion: my main concern is that the assumption that pages contain wikitext is so ingrained in the current codebase. this means two things:
[19:46:14] 	 a) some important, old and complex beasts have to be messed with heavily (namely: EditPage, Article).
[19:46:21] 	 whee
[19:46:24] <^demon>	 EditPage needs love anyway
[19:46:28] 	 ^^]
[19:46:31] <^demon>	 Even if we keep wikitext for the next 80 years.
[19:46:37] 	 b) lots and lots and lots of small changes in places that access revision text directly
[19:46:59] 	 ^demon: i planned on a rough slapping... but anyway
[19:47:35] 	 brion: both a) and b) will make review and testing pretty heavy. do you think it's realistic to get this into 1.20?
[19:47:51] 	 hmm
[19:48:06] 	 i'd prefer a 'quick' 1.20 cycle that's basically a first-turnaround on git
[19:48:15] 	 might be better to push it for 1.21 if we do that
[19:48:31] 	 ok, let's say "deployed in august".
[19:48:35] 	 doable?
[19:48:55] * brion  looks at roblaBUSY :)
[19:49:07] 	 i'd say in theory
[19:49:12] * Daniel_WMDE  makes robla busy
[19:49:16] 	 hehehe
[19:49:22] 	 well...
[19:49:57] 	 in theory, adding in the apis should be relatively stable while everything remains wikitext
[19:50:05] 	 (we just started back on the budget meeting, but I think we're going to stop incrementing minor versions for deployments)
[19:50:16] 	 we should be going pretty fast
[19:50:26] 	 \o/
[19:50:32] <^demon>	 So we're gonna increment the major version instead?
[19:50:38] 	 roblaBUSY: i don't care about version numbers. i care about august
[19:50:45] <^demon>	 Time to finally undo the mistake made by Brion at 1.9 -> 1.10 ;-)
[19:50:52] 	 hehe :)
[19:50:56] 	 :P
[19:50:57] 	 can we just use timestamps?
[19:51:00] 	 anyway
[19:51:01] 	 they're not decimalssssss
[19:51:11] 	 shoulda started with 1.000000001
[19:51:25] <^demon>	 Tell that to every person who's shown up since then wondering why 1.9 < 1.11.
[19:51:27] <^demon>	 :)
[19:51:54] 	 tell that to every script that wonders why firefox 11 < 9
[19:52:33] 	 brion: the thing is: this functionality is a blocker for wikidata. if it's not feasible to get this rolled out in the summer, i need to scrap it and start over with a different approach. so... i'm not looking for a promise, but an educated guess at how long it takes to get a change like this rolled out.
[19:52:48] 	 e.g. how long did it take for the MediaHandler code to go live?
[19:53:00] 	 Daniel_WMDE, my educated guess is that getting this infrastructure in an august timeline is reasonable
[19:53:00] 	 i think that's comparable
[19:53:10] 	 brion: yay, thanks
[19:53:21] 	 you may wish to check with tim, who may tell you it's impossible or feasible in time X :)
[19:53:24] 	 now i only need Tim-away to agree and i'm happy :)
[19:53:30] 	 hehe
[19:53:39] 	 hehe, my thought exactly
[19:53:52] <^demon>	 What's super cool is now we have non-crazy branching and the ability to push-for-review without cluttering the history if it's totally bunk.
[19:53:55] <^demon>	 Git++
[19:54:04] 	 What would be some hooks I could use to check and see if a user selected a license for a file or not
[19:54:20] 	 yeah
[19:54:27] 	 brion: please nudge tim towards my writeup (or my mail on wikitech-l) if you see him around. my communication with him appears to be time zone encumbered.
[19:54:44] 	 or maybe he's on a walkabout or something
[19:55:05] <^demon>	 I'd put my money on timezones.
[19:55:15] <^demon>	 I'm having a hard time imagining Tim on a walkabout.
[19:55:20] 	 JRWR: don't you just wish the license for a file was stored in some structured machine-readable way in the database?...
[19:55:46] 	 ^demon: hehe, but it's a fun thought ;D
[19:55:57] 	 All I want to do is reject a file upload if a user didn't select a license. :P
[19:56:20] <^demon>	 Daniel_WMDE: WM2013 - Australian Outback
[19:56:41] 	 JRWR: last i checked all the license magic was JS, and the server knows nothing about it. but then, it has been years since i touched that code...
[19:57:18] 	 ^demon: that would be awesome! connectivity could be a problem, though
[19:57:21] 	 I'll look into the FileUpload hook, see if I can find anything in there
[19:57:29] <^demon>	 HughesNet?
[19:57:30] <^demon>	 :)
[19:58:23] 	 JRWR: well, you can always fetch the list of available licenses that is used to build the selection box from whatever system message it is stored in (wouldn't it be nice to have that in JSON) and grep through the image description...
[19:59:07] 	 Daniel_WMDE: a static list and some preg would work out well
[19:59:42] 	 maybe there is a api call I can use
[19:59:52] 	 or some function or something to pull the list
[20:02:35] 	 JRWR: it's a system message, as i said. don't ask me which one though
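For what it's worth, the system message in question appears to be MediaWiki:Licenses (an assumption; Daniel doesn't name it), and it can be fetched through the API rather than hardcoding a static list:

    # 'licenses' as the message name is an assumption here
    curl 'https://commons.wikimedia.org/w/api.php?action=query&meta=allmessages&ammessages=licenses&format=json'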
[20:09:05] 	 hi Daniel_WMDE
[20:09:58] 	 hi Nikerabbit
[20:10:17] 	 I like your idea of pluggable page rendering modules, if it can be done in a not-too-intrusive way
[20:11:15] 	 Nikerabbit: the "intrusive" bit is the problem. only a few classes need heavy changes. but there are about 100 places that would need small changes
[20:11:38] 	 well, they are not totally needed right away. they are covered by the B/C interface. but still.
[20:11:51] 	 the assumption that all pages are wikitext is deeply ingrained
[20:12:40] 	 yeah we have little exceptions here and there for js/css pages, it's kinda dirty
[20:12:43] 	 and those aren't even complete
[20:13:16] 	 indeed
[20:13:30] 	 they will be covered by special handlers
[20:13:48] 	 i'd actually prefer to handle them *less* like wikitext.
[20:14:00] 	 i.e. apply no parsing. no links, no categories, no redirects.
[20:14:12] 	 I would love to replace the rendering of translation pages if I could, so I definitely see huge gains to be had here
[20:14:20] 	 hm, i bet people transclude js code across pages
[20:14:22] * Daniel_WMDE  shudders
[20:14:36] 	 but it would be possible to run the preprocessor without running the parser
[20:14:49] 	 Nikerabbit: cool, another use case :)
[20:15:37] 	 brion: actually, the most tricky "special" bit about JS/CSS pages is the special system messages shown on the page (and the edit page).
[20:15:59] 	 i'll have to put something into the generalized interface for this.
[20:16:05] 	 or subclass Article and EditPage
[20:17:43] 	 hmm so there are at least two sides to this: rendering the page and providing an editor for the page
[20:19:34] 	 hmm I might be able to attend the wikidata thingie
[20:21:27] 	 why is the windows java installer offering to install the Ask browser add-on toolbar? ewwwww
[20:26:01] * AaronSchulz  snickers
[20:28:16] 	 Nikerabbit: it would be good to have someone from the i18n team there!
[20:29:02] 	 Daniel_WMDE: depends on how well/quick I will arrive back home from berlin
[20:29:05] 	 Nikerabbit: internationalizing data records and complex ajax ui is something Wikidata needs...
[20:29:47] 	 brion: i recently installed an HP print driver on windows. it came bundled with 360 MB of crud, including the yahoo toolbar.
[20:29:55] 	 o_O
[20:29:56] 	 Fuck yeah
[20:30:27] 	 Nikerabbit: rendering and editing - and diffing! structural diff and merge for JSON... awesome :)
[20:34:41] 	 ug
[20:36:43] <^demon>	 brion: Have you played with gerrit yet?
[20:37:33] 	 ^demon, a little here and there
[20:37:43] 	 not enough yet to feel comfy with it
[20:37:51] <^demon>	 There's 2 mathjax related commits ;-)
[20:37:53] <^demon>	 https://gerrit.wikimedia.org/r/#q,status:open+project:mediawiki/extensions/Math,n,z
[20:39:46] 	 excellent i'll review em after i've tested em
[20:45:24] 	 does anyone know why going to https://mediawiki.org redirects you to www.mw ?
[20:46:11] <^demon>	 I think there's an open bug about it
[20:46:26] 	 ^demon: bug # please
[20:47:01] <^demon>	 Now you're gonna make me search :p
[20:47:08] <^demon>	 https://bugzilla.wikimedia.org/show_bug.cgi?id=31369
[20:47:11] 	 thanks!
[21:04:30] 	 Reedy: did you get my message about one-month-old shell request?
[21:41:41] * brion  grumbles and mumbles at gerrit
[21:41:57] 	 how do i check out a commit before it's been reviewed?
[21:43:07] 	 ok git-review should let me do this if i understand
[21:46:37] <^demon|away>	 brion: git review -d <change number>
[21:47:09] <^demon>	 for example, `git review -d 3363` for https://gerrit.wikimedia.org/r/#change,3363
[21:47:31] <^demon>	 checks it out into a local branch for you to work in.
[21:47:41] 	 you can go to gerrit and it will give you the fetch command to get it
[21:47:46] <^demon>	 That too
[21:48:04] 	 there's a little icon by the long bit of text (beware, there's also a reset fetch head in there I think)
[21:48:09] 	 that lets you copy to clipboard
[21:50:30] 	 apergos, oh i see it now, carefully hidden between buttons and other things
[21:50:39] 	 ^demon, found it thx :D that is handy
[21:50:51] 	 yes they hide these things very well
[21:51:25] 	 I suspect if I just clicked around on every square inch of the gerrit ui I could find all sorts of things
[21:51:32] 	 the question is what I would destroy in the process...
[21:52:41] 	 other than your mind of course
[21:52:52] 	 a soul gone mad from staring into the depths of gerrit
[21:53:12] 	 sad but true
[21:53:23] 	 or is that... mad but true? :-P
[23:04:45] 	 is there a way to update the language cache on test.wikipedia.org without doing it everywhere?
[23:06:41] 	 Reedy ^?
[23:07:01] 	 language cache? of messages?
[23:07:30] 	 Reedy: ah yeah
[23:10:11] 	 awjr: I suppose you could use the part of the scap script that does it - "Regenerate the extension message file list for all active MediaWiki versions" and not sync them out, but then that's polluted the copy on nfs, which if pushed (not using scap, as they'd be overwritten) could cause issues
[23:10:20] 	 There's certainly no simpler way of doing it
[23:10:30] 	 blargh
[23:10:35] 	 ok thanks Reedy
[23:19:14] 	 preilly ^
[23:19:41] 	 awjr: thanks!