[01:00:46] RoanKattouw: when you have a moment, can you check if we have any overrides in the local settings on enwiki for the following variables: $wgMoodBarCutoffTime (MoodBar) and $wgEditPageTrackingRegistrationCutoff (edit page tracking)
[01:01:28] bsitu and I cannot make sense of what we are seeing in production unless there are overrides set outside of the configuration of the two extensions
[01:01:53] > echo $wgMoodBarCutoffTime
[01:01:55] 20110725221004
[01:01:56] > echo $wgEditPageTrackingRegistrationCutoff
[01:01:58] 20110725221004
[01:02:01] aha!!!!
[01:02:08] ok that's what I needed
[01:02:16] thx
[01:02:25] Those are what it thinks the values are, I don't know where they come from
[01:03:42] they certainly come from LocalSettings, as they are both set to NULL in the extension configs
[01:04:32] unless there's anywhere else where an override can be set
[01:05:05] (forwarding this to bsitu)
[01:07:02] You're right, in InitialiseSettings.php:
[01:07:04] 'wmgMoodBarCutoffTime' => array(
[01:07:06]     'default' => '20110725221004',
[01:07:07]     'frwikisource' => '20110304202000',
[01:07:12] ),
[01:09:00] frwikisource - w00t
[01:09:28] following up by mail
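The override found above lives in the wiki farm's deployment configuration rather than in either extension, which is why the extensions' null defaults never took effect. Below is a minimal PHP sketch of that per-wiki override pattern; the array mirrors the InitialiseSettings.php fragment quoted above, but the surrounding plumbing is simplified (the real setup resolves wmg* values through its site-configuration machinery), so treat everything except the quoted values as illustrative.

```php
<?php
// Simplified sketch, not the actual WMF configuration code, of how a per-wiki
// wmg* setting ends up overriding an extension default that ships as null.

// Extension default (e.g. MoodBar): no cutoff unless the site configures one.
$wgMoodBarCutoffTime = null;

// Deployment config: per-wiki values keyed by database name, with a fallback.
$wmgSettings = array(
    'wmgMoodBarCutoffTime' => array(
        'default'      => '20110725221004',
        'frwikisource' => '20110304202000',
    ),
);

$wikiId = 'enwiki'; // whichever wiki is serving the current request
$values = $wmgSettings['wmgMoodBarCutoffTime'];
$wmgMoodBarCutoffTime = isset( $values[$wikiId] ) ? $values[$wikiId] : $values['default'];

// Hand-off from the wmg* intermediate to the real global the extension reads.
$wgMoodBarCutoffTime = $wmgMoodBarCutoffTime;

echo $wgMoodBarCutoffTime; // 20110725221004 on enwiki, 20110304202000 on frwikisource
```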
[01:18:06] I'm trying to build out release notes, and I'd like to figure out how to get a gerrit link (not a gitweb link) to a revision based on its sha-1 rather than its gerrit change ID
[01:21:15] Oh you can
[01:21:25] Timo figured out how
[01:21:25] Lemme see
[01:26:56] robla: https://gerrit.wikimedia.org/r/#q,6540260001a0ec3b506a9ad4916712b4fe07b6bb,n,z
[01:27:05] That'll redirect to a URL with a change number
[01:27:07] ah, ossm, thanks!
[01:40:00] intuitive much?
[02:57:04] what's the value for $wgMaxShellMemory on the MediaWiki setup behind Commons? the default seems to be 102400 (~100 MB). a post-upload processing task in an extension i'm working on uses around 100-200 MB of memory for a brief period of time, and i'm wondering if that task would take up more memory than the maximum available for shell processes on a Wikimedia wiki down the road.
[02:57:25] > echo $wgMaxShellMemory
[02:57:27] 102400
[02:57:51] ok, so Wikimedia wikis use the default
[02:58:53] The image scalers have more
[02:59:13] if ( file_exists( '/etc/wikimedia-image-scaler' ) ) { $wgMaxShellMemory = 300000; // temp was 200M }
[02:59:22] Oh ha
[02:59:29] Yay, hacks
[03:01:17] Reedy: ah, 300 MB would give a lot more breathing room. would it be feasible for this extension's post-upload processing -- generating a ray-traced image from a plain-text file -- to be done on an image scaler?
[03:01:48] Possibly
[03:02:05] As it stands currently, job runners are just apaches
[03:03:00] so image scalers are just job runners?
[03:03:22] no
[03:03:31] image scaling isn't done as a batch process
[03:03:37] video transcoding will be
[03:06:29] Krinkle: When will you be online tomorrow?
[03:06:41] * RoanKattouw is thinking about how early to get up
[03:06:45] do job runners have different limits on the maximum amount of memory available? if this ray-traced image generation took more than 300 MB of RAM, would it need to be added to a job queue?
[03:07:02] RoanKattouw: I'm not sure yet. I don't have college on Thursday/Friday this week
[03:07:14] Oh OK
[03:07:22] Oooh right
[03:07:25] Thursday is Ascension Day
[03:07:34] I didn't realize because it's not celebrated here
[03:07:36] No, job runners get 150M
[03:07:46] I'd say let's start a little later. That will be best for both of us, considering 8AM is early for you and it is already 5AM here
[03:07:50] Yeah
[03:07:57] I'll just come in at 10 as usual
[03:08:01] sounds good
[03:08:19] Emw: enabling something like this on Commons is going to need some infrastructure in place, so I wouldn't worry about the specifics too much
[03:09:13] It wouldn't be hard to get the job that was run to attempt to increase its own memory at runtime
[03:09:19] wouldn't be/isn't
[03:10:55] Reedy: ok, so do you think it makes sense to keep this post-upload processing relatively simple for now, e.g. just bump up $wgMaxShellMemory to whatever it needs to be in my local deployment and, instead of adding a job for the processing, just do it directly in, say, the onUploadComplete hook?
[03:11:05] No
[03:11:32] That's going to block, and the process will probably time out
[03:11:52] Well..
[03:11:57] You could put that behind a flag
[03:12:01] the process usually takes about 10-20 seconds
[03:12:28] Like $wgEnotifUseJobQ
[03:12:48] $wgRayTraceUseJobQ: if false, just do it there and then; if true, put it on the job runner
[03:16:22] i looked over the job queue management code a few weeks ago but am a bit foggy. is the job processing done on a different machine than the machine that enqueued the job?
[03:17:49] if so, is there a way to determine $wgMaxShellMemory for the (other) job runner machine?
[03:19:07] Most likely, yes
[03:19:10] Evaluate it at runtime
[03:19:26] for example, if the job runners currently have only 150 MB of memory available for shell processes, that would also need to be checked in whatever chunk of code is deciding what to do with $wgRayTraceUseJobQ
[03:20:03] since this ray-tracing process can take > 150 MB
[03:21:42] All you need to do in the job queue execution function is add ini_set('memory_limit', '512M'); at the start
[03:23:49] oh, that'd make sense -- it didn't occur to me that that would be ok to do
[03:28:40] $wgMaxShellMemory, even
[03:29:05] Then use wfShellExec
[03:31:12] is it kosher to modify that global variable in the job queue execution function?
[03:31:46] Yeah
[03:31:55] and/or in the constructor
[03:35:24] ok. in that case increasing $wgMaxShellMemory (and presumably resetting it to its initial value after the job completes) seems like a convenient solution.
[03:37:37] Yeah
[03:38:32] (i appreciate the help)
[03:42:52] Makes sense to give you useful information now, rather than let you write it and then tell you it's ALL wrong ;)
[03:44:30] that's what i'm trying to avoid as much as possible, though i imagine a decent amount of rewriting is probably inevitable
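A rough sketch of the approach Reedy describes above: a job whose execution function raises the memory limits just for the duration of the shell call, then puts them back. RayTraceJob, the 'rayTrace' job type, the povray command line, and $wgRayTraceUseJobQ are hypothetical names taken from this conversation; the Job base class, wfShellExec(), and wfEscapeShellArg() are real MediaWiki APIs of that era.

```php
<?php
// Hypothetical job sketch; assumes $wgJobClasses['rayTrace'] = 'RayTraceJob';

class RayTraceJob extends Job {
	public function __construct( $title, $params ) {
		parent::__construct( 'rayTrace', $title, $params );
	}

	public function run() {
		global $wgMaxShellMemory;

		// Job runners may have modest defaults (150M was mentioned above),
		// so raise both the PHP and the shell limits for this job only.
		ini_set( 'memory_limit', '512M' );
		$oldShellMemory = $wgMaxShellMemory;
		$wgMaxShellMemory = 300000; // in KB, same units as the 102400 default

		$retval = 0;
		$output = wfShellExec(
			'povray ' . wfEscapeShellArg( $this->params['source'] ), // hypothetical command
			$retval
		);

		// Restore the original limit so later work in this process is unaffected.
		$wgMaxShellMemory = $oldShellMemory;

		return $retval === 0;
	}
}
```

In the upload hook the extension could then mirror the $wgEnotifUseJobQ pattern: if $wgRayTraceUseJobQ is false, run the same rendering code inline; if true, construct a RayTraceJob and insert it into the job queue instead.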
[04:05:10] Reedy: Found it, it's https://www.mediawiki.org/wiki/MediaWiki_1.20/Roadmap#Schedule_for_the_deployments
[15:05:16] siebrand -- just realized i responded to your code review comment and forgot to give you a link to the comps i was supposed to be replicating (although this is a newer version with one angle bracket instead of two) -- http://dl.dropbox.com/u/30377416/prototypes/feedback-tool/article-feedback/permalink-page.png
[15:06:45] rsterbin: Yeah, I figured that.
[15:07:22] rsterbin: I'm working for another client at the moment. About 60 e-mails behind. I'll get to it in the coming days.
[15:07:37] this is supposed to launch on Thursday
[15:07:47] er, tomorrow
[15:08:19] afaik, this is English-only right now
[15:09:10] and the whole thing's going to have to be reviewed when it's translated, because a lot of the CSS is going to break down entirely
[17:06:28] Chris and I are working here: http://etherpad.wikimedia.org/ReleaseNotes on release notes for 1.20wmf3. Anyone want to pitch in?
[17:16:42] hi 20% check-in people.
[17:17:31] Which is basically Amir
[17:17:54] actually, Krinkle-away, IIRC you're at full time now? https://www.mediawiki.org/wiki/Wikimedia_engineering_20%25_policy have you picked a day?
[17:32:45] plop
[17:45:21] alolita: regarding the code review for SignupAPI - I know that there was some discussion of who would do this (Ryan, Sam, Roan, Trevor, or other). Do you know whether there has been a decision on this?
[17:46:49] sumanah: thanks for the ping; I checked with Roan and he suggested that Sam, Aaron, or Roan would be the only folks who can review SignupAPI. So I need your help in getting some of Sam's or Aaron's time
[17:47:02] sumanah: can you help or should I ping robla?
[17:56:51] alolita: sorry, will respond in 5 min
[18:03:55] Hi alolita - I think we should talk with robla together!
[18:04:28] Well isn't robla popular today
[18:04:42] alolita: My understanding is that Aaron's and Sam's 20% time are pretty fully allocated to code review and to shell requests, respectively.
[18:08:02] alolita: But if the SignupAPI work is blocking urgent E3 stuff, then that would be good to know!
[18:08:32] sumanah: yes, it's blocking us!
[18:08:32] alolita: Actually I misspoke just now -- Sam's time for code review currently includes a lot of Wikidata code review, because Reedy is our liaison to the Wikidata team
[18:08:46] I'll try to review it this Friday
[18:09:04] kaldari -- thank you!
[18:09:31] alolita: Actually I misspoke just now -- Sam's time for code review currently includes a lot of Wikidata code review, because Reedy is our liaison to the Wikidata team
[18:10:03] RoanKattouw: You had earlier suggested that, instead of Ryan, one of you, Sam, and Aaron should be the ones to review SignupAPI?
[18:11:09] Thanks for your offer, kaldari!
[18:11:26] alolita: http://etherpad.wikimedia.org/E3Analytics
[18:11:29] sumanah: That would be best yeah
[18:11:47] it would be good to have someone with security expertise review it as well
[18:12:11] kaldari: so, csteipp maybe should be involved in the SignupAPI review? from a security perspective?
[18:13:09] sumanah: ok; so who would do the review :-)
[18:13:13] well, assuming it's actually involved in creating and/or logging into accounts - I haven't actually looked at the extension yet
[18:13:25] alolita: This is a good question and I think I'm watching Roan & Ryan figure it out.
[18:13:25] sumanah: that makes sense - Chris would be ideal for the security review
[18:13:50] sumanah: :-)
[18:13:52] :)
[18:13:59] For something like this it would probably make sense to have a front-end person and a back-end person review it
[18:14:16] People are so multi-dimensional
[18:14:19] alolita: I can speak a little from a person's-time-allocation perspective, but I completely defer to the developers involved when it comes to expertise assessment
[18:14:27] yup, agreed
[18:15:20] I'll be happy to help review it, but I might miss things that a back-end/security person would notice
[18:15:46] although hopefully not anything obvious :)
[18:16:58] kaldari: great; so you will team up with Chris to complete the review
[18:17:15] sounds good to me
[18:17:15] Sounds good
[18:17:31] thekaryn: ok, it looks like kaldari + csteipp are doing this review
[18:17:50] kaldari, csteipp: thanks! any timeline for this?
[18:18:01] end of this week or Monday next week :-)
[18:18:07] kaldari, do you have a link to the code? This is the first I've heard of the feature...
[18:18:21] yeah, I put the extension in gerrit
[18:18:31] let's see....
[18:18:43] https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/SignupAPI,n,z
[18:19:10] csteipp: for background: it was begun by a summer intern last year
[18:19:40] Ah, cool. So the entire extension hasn't been reviewed yet, right?
[18:19:54] not a single line of it has been reviewed as far as I know
[18:20:28] csteipp: https://www.mediawiki.org/wiki/Extension:SignupAPI
[18:21:48] Where are the latest channel logs?
[18:22:47] 50% chance I can completely review it on Friday
[18:23:26] hi TrevorParscal, hope your daughter feels better soon.
[18:23:34] depends mostly on how fast my other reviews go
[18:23:48] sumanah: thanks!
[18:23:49] kaldari: (btw, how is Ankur (drecodeam) doing?)
[18:24:43] thekaryn: you may want to add yourself as CC on this bug: https://bugzilla.wikimedia.org/36225 which tracks the code review for the SignupAPI extension.
[18:24:54] haven't had a chance to check on him in a while. He was doing regular commits, but I haven't seen anything in a couple of weeks. All of his code so far has been reviewed though
[18:25:41] kaldari: Do you know how far he is on the checklist of stuff to be done by May 21st?
[18:26:18] I don't remember off-hand everything that's on the list, but last time I looked at it, he was done with most of it
[18:26:44] kaldari: the ones that are less likely to have been completed include: triaging a bug, playing with release notes, merging code in Gerrit
[18:27:32] "Playing with Release Notes" sounds like a form of vandalism :)
[18:28:06] anyway, I'll try to review his status soon
[18:28:56] thank you kaldari_away
[18:29:04] :)
[18:58:41] that extension is going to require some clean-up work, regardless of whether or not it passes code review.
[18:59:37] <^demon> +1, based on a 20-second skim of it.
[19:02:53] sumanah: no, I am not full-time now
[19:03:13] Krinkle: Aha, ok, sorry for misunderstanding.
[19:04:03] still between 0 and 20 hours a week (official max: 30, but that's more for when I feel like it during school breaks)
[19:04:20] college etc.
[19:17:23] RoanKattouw: https://meta.wikimedia.org/wiki/User:Krinkle/Tour
[19:17:33] https://meta.wikimedia.org/wiki/User:Krinkle/Le_Tour_de_Wik%C3%AD/2011_Resource_Walker
[19:18:09] scroll down to Progress, and load the site matrix
[19:19:02] pick one you like :)
[19:19:07] (wiki)
[19:19:09] then I will too
[19:19:15] and then we can go through the checklist
[19:19:27] * RoanKattouw picks fy.wikipedia.org :)
[19:19:53] ok, I'll take nds.wikipedia.org
[19:20:11] back to the Tour page, under Progress you'll see Log
[19:20:28] be sure to list it there and sign it as in progress so no-one else will do it at the same time (just in case)
[19:23:11] RoanKattouw: you may wanna put jsUpdater into your fywiki common.js
[19:23:23] Oooh yeah, just reading through that now
[19:23:29] I haven't used jsUpdater in a while (developed by Helder.wiki), not sure about its current state
[19:25:49] Krinkle: is it Tour Day?
[19:25:50] :D
[19:26:19] yeah, good way to gather some data about current gadgets for the workshop in Berlin
[19:26:32] Or on-wiki JS in general
[19:27:11] so there is "1.21alpha" as an existing version (now)?
[19:27:25] No
[19:27:30] not supposed to be
[19:27:45] ok, just saw 18alpha, 19alpha, 20alpha
[19:27:48] ty
[19:28:38] YAY
[19:29:48] RoanKattouw: After the console and recent changes check. Just copied Common.js into my editor and going through the steps
[19:30:16] "recent changes check"?
[19:30:24] oh
[19:30:25] I see
[19:30:45] Not really a task, just in case something is up
[19:33:17] So I ran jsUpdater first, then copied the result after the diff into the editor (not saving right away), and then fixed the rest
[19:38:02] Hmm
[19:38:06] It doesn't always do the right thing
[19:38:19] − jQuery( document ).ready( function ( $ ) {
[19:38:20] + $( function ( $ ) {
[19:43:45] RoanKattouw_away: yeah, I fixed that
[19:43:51] document ready is fine
[19:43:55] no need to shorten it by force
[19:44:09] https://meta.wikimedia.org/w/index.php?title=User%3AKrinkle%2FLe_Tour_de_Wik%C3%AD%2F2011_Resource_Walker%2FjsUpdater.js&diff=3758429&oldid=3519442
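For what it's worth, the two forms in that diff behave identically: jQuery runs a function passed directly to jQuery()/$() as a document-ready handler, just like .ready(), so the updater's rewrite is purely cosmetic and, as the reply above notes, leaving the long form in place loses nothing.

```js
// All three register the same document-ready handler; jQuery is passed in
// as the first argument either way.
jQuery( document ).ready( function ( $ ) { /* ... */ } );
jQuery( function ( $ ) { /* ... */ } );
$( function ( $ ) { /* ... */ } );
```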
[20:11:35] Hey Roan / Robla, can you see this draft? https://gerrit.wikimedia.org/r/#/c/7824/
[20:11:57] * robla looks
[20:12:00] csteipp: I can't see it
[20:12:06] That's a good thing I guess
[20:12:06] nope
[20:12:37] Cool. Thanks!
[20:12:52] and now I can... I see you added me as a reviewer
[20:12:53] Krinkle: Hah, isn't ta deprecated/removed?
[20:12:57] Krinkle: https://fy.wikipedia.org/w/index.php?title=MediaWiki:Monobook.js&action=edit
[20:12:59] Yep
[20:13:04] no-op
[20:13:13] as of 1.20 it might even throw an exception for being undefined
[20:13:26] but in this case they define it themselves
[20:13:26] ta = new Object();
[20:13:33] so it doesn't throw "ta is undefined"
[20:13:36] but it's still no-op code
[20:13:40] Ooooh wait
[20:13:42] https://fy.wikipedia.org/w/index.php?title=MediaWiki:Monobook.js&oldid=9065
[20:13:43] Look at the author
[20:13:52] really?
[20:14:03] how... did...
[20:14:04] They've edited a bit but the diff is empty
[20:14:10] I'll just delete that page
[20:14:13] yes
[20:14:58] How did that end up as the default? Afaik it was a community thing, then at some point the function was put into wikibits and then integrated as PHP
[20:15:17] maybe someone requested ops to run it on all WMF wikis
[20:15:22] ha, never seen this
[20:15:51] https://simple.wiktionary.org/w/index.php?title=MediaWiki:Monobook.js&diff=next&oldid=4974
[20:19:30] Krinkle: Dude I'm like done with fywiki, it's a boring wiki
[20:19:36] No Gadgets, hardly any local JS
[20:19:41] wow
[20:20:08] They have like two things in Common.js, nothing in Vector.js, and that ta cruft in Monobook.js (I deleted that page)
[20:20:09] RoanKattouw: https://fy.wikipedia.org/w/index.php?title=Wiki:Alle_siden_neffens_foarheaksel&from=Loginprompt&prefix=&namespace=8
[20:20:15] interesting... they have all local stuff
[20:20:20] pre-translatewiki.net?
[20:20:28] Holy crap
[20:20:29] why isn't that cleaned out
[20:20:35] Isn't fy a Dutch regional language?
[20:20:40] Yep
[20:20:47] Frisian
[20:21:16] vvv: That's why I chose it, it's spoken in the province where I grew up
[20:27:33] So RoanKattouw... master has changed a bit since I finished my work in my branch. git review -D failed, I'm guessing because I'm working off of an old master. Do I do a git merge master? Or git rebase?
[20:27:44] How did it fail?
[20:27:47] What was the error message?
[20:28:03] remote: Change-Id: I4f5a2ac18a016820428c5a35e7ca8580b676f391
[20:28:04] To ssh://csteipp@gerrit.wikimedia.org:29418/mediawiki/core.git
[20:28:04] ! [remote rejected] HEAD -> refs/drafts/master/bug/29296 (missing Change-Id in commit message)
[20:28:04] error: failed to push some refs to 'ssh://csteipp@gerrit.wikimedia.org:29418/mediawiki/core.git'
[20:28:13] Oh
[20:28:53] RoanKattouw: btw, for a large CSS document http://procssor.com/process can help to clean it up (selector per line, space between property/value, semicolons, indentation, etc.)
[20:29:07] some local sheets are terrible, just to squeeze out whitespace
[20:31:58] Hmm
[20:32:07] Didn't notice that either
[20:32:17] I'm gonna pick a different wiki just so I'll run into stuff
[20:32:27] hehe
[20:32:51] RoanKattouw: added a note https://meta.wikimedia.org/w/index.php?title=User%3AKrinkle%2FLe_Tour_de_Wik%C3%AD%2F2011_Resource_Walker&diff=3758542&oldid=3758418
[20:35:08] TrevorParscal: Testing a new beta function on a small wiki? I love this new Vector 2.0 edit button: https://nds.wikipedia.org/wiki/MediaWiki:Onlyifediting.js
[20:35:13] ;-)
[20:35:25] https://nds.wikipedia.org/wiki/Wikipedia:H%C3%B6%C3%B6ftsiet
[20:35:55] It was fairly popular in Monobook on a lot of wikis, but doesn't fit Vector very well.
[20:37:14] Krinkle: wtf?
[20:37:26] dude, that dotted line around the edit tab, ouch
[20:37:30] what the hell is going on?
[20:37:32] I know :D
[20:37:32] ewww
[20:37:56] Domas could do better CSS than that
[20:38:09] TrevorParscal: it's custom site CSS that many wikis used to use on Monobook. fr.wikipedia got famous with it.
[20:38:10] it's all over the place
[20:38:21] but I hadn't seen it in Vector yet
[20:38:24] that's so horrible
[20:38:35] I suppose some people copied all the Monobook stuff to Common after the Vector switchover
[20:38:43] which I know many people did, unconditionally
[20:39:10] TrevorParscal: It'd be interesting to see how this affects editor contributions, though
[20:39:16] it's ugly as hell, but it does stand out
[20:39:36] we all know people find it hard to find the overall edit button on the page (in my experience they usually see the [edit] section links)
[20:40:03] meh, these days it would only lead users to the infobox code though :P
[20:41:29] Hmm, not much luck with iuwiki either
[20:41:36] https://iu.wikipedia.org/wiki/MediaWiki:Common.css is all I found and it looks reasonable
[20:43:28] indeed
[20:43:59] Aha
[20:44:04] I think I have struck gold here
[20:44:06] https://ia.wikipedia.org/wiki/MediaWiki:Common.js
[20:44:24] yeah, that looks "good" :D
[20:44:29] bahhh
[20:44:33] hard-coded http://
[20:44:47] Reedy: That's not the worst thing in there :)
[20:45:00] RoanKattouw: I'm almost done at nds.wikipedia :D
[20:45:03] Wouldn't surprise me
[20:45:03] done
[20:45:48] Krinkle: I have an improvement on that CSS, I think it will draw even more people into editing ( http://trevorparscal.com/stuff/wikipedia/new-edit-button.gif )
[20:46:09] TrevorParscal: hah
[20:46:18] awesome
[20:46:22] Propose it!
[20:46:23] xD
[20:46:33] Propose it?
[20:46:37] Just push it live on enwiki
[20:47:12] TrevorParscal: CSS3 animations, sweet :P - at least our fellow IE6 users won't be bothered
[20:47:31] yeah, we don't want their edits anyway
[20:51:53] RoanKattouw: This looks scary
[20:51:54] https://nds.wikipedia.org/w/index.php?title=Spezial%3ASieden+de+anfangt+mit&prefix=MediaWiki%3A&namespace=0
[20:51:58] "Sitesettings-wgDefaultBlockExpiry"
[20:52:12] Sitesettings-wgReadOnly
[20:52:13] etc.
[20:52:20] Some dark past?
[20:52:51] Krinkle: oh wow
[20:52:56] That's... terrifying
[21:01:17] Wat
[21:01:19] That's scary
[21:01:22] function getURLParamValue( paramName, url )
[21:01:25] That can die in favor of mw.util I guess
[21:05:45] Krinkle: Are you done with your wiki yet? I think I need you to help me with (or just do) iawiki because it's just baffling me
[21:06:37] RoanKattouw: I'm on nl.wikipedia now
[21:06:40] RoanKattouw: link?
[21:07:03] https://ia.wikipedia.org/wiki/MediaWiki:Common.js is full of cruft that I don't even begin to know how to fix
[21:07:09] be careful with deleting functions. it is not uncommon for a Common.js to have 100 functions of which none are used outside the file. but sometimes they are
[21:07:22] Although, I guess
[21:07:28] I can fix the secure stuff
[21:07:43] And hasClass()
[21:07:53] and getURLParamValue()
[21:08:40] hey RoanKattouw, so apparently amending a draft isn't done like amending a normal patch. I'm guessing there's no way to un-submit something to gerrit?
[21:08:51] Huh, you can't amend drafts?
[21:08:55] You should totally be able to
[21:08:56] RoanKattouw: https://www.mediawiki.org/wiki/Snippets/Load_JS_and_CSS_by_URL#Load_withJS
[21:08:58] What have you been trying?
[21:09:12] RoanKattouw: also fix that one, the old one floating around on wikis has a security bug in the regex that allows other pages to be loaded
[21:09:14] Well, I did a "git review -R" at the end, and that made a new commit.
[21:09:21] That's public
[21:09:22] Krinkle: Thanks
[21:09:29] csteipp: Yes, you have to set -D every time too
[21:09:37] crap
[21:09:41] Although, kind of a Gerrit fail
[21:09:49] Or.... not?
[21:09:51] I don't know
[21:09:53] But yeah, -D
[21:09:54] RoanKattouw: MediaWiki:Sysop.css kind of stuff can be done with group-
[21:10:00] user.groups module
[21:10:09] csteipp: As for the "oh crap, my security fix is now public" part, poke ^demon
[21:10:10] move the page and rm the common.js code etc.
[21:10:27] <^demon> Hrm?
[21:10:29] <^demon> What'd I do?
[21:10:34] RoanKattouw: and for WikiMiniAtlas: https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_(users)#Keep_gadgets_central
[21:10:41] that's all I see at first sight
[21:10:44] Hey ^demon: So I submitted a patch to gerrit I didn't mean to make public
[21:10:54] <^demon> That's not good :(
[21:11:04] https://gerrit.wikimedia.org/r/7826
[21:11:14] RoanKattouw: Oh, and mark's /** IPv6 AAAA connectivity testing **/ got spread here as well :P
[21:11:21] It's not critical, but I didn't mean to release it yet, and forgot to add -D...
[21:11:29] Is it possible to remove it?
[21:11:37] If it's a ton of work, it's really not a big deal this time.
[21:11:52] mar fail, mark isn't online?
[21:11:56] <^demon> Hmm, possibly.
[21:12:00] that's a first, for me
[21:12:14] <^demon> csteipp: I know it's possible in git, but I'm afraid gerrit might 'splode.
[21:12:21] csteipp: Holy shit that's an enormous patch
[21:12:38] Krinkle: He's not in this channel but he is online
[21:12:45] ah ok
[21:12:53] yeah, I see him in op
[21:13:04] RoanKattouw: Yeah, I'm trying to rework it so it cuts out the library... but haven't done it yet.
[21:14:14] <^demon> RoanKattouw: `git push -f origin :refs/changes/26/7826/2`?
[21:14:37] <^demon> Should work from git's perspective. I'm really nervous about gerrit going *boom* over it.
[21:14:44] Yeah
[21:14:50] Gerrit will probably prevent it, right?
[21:14:57] ^demon: Why don't you test this in the labs instance?
[21:15:08] <^demon> Not when you're magic like me, then you can do anything ;-)
[21:15:12] haha
[21:15:16] <^demon> Worth testing, yes.
[21:15:37] <^demon> Before I make gerrit die on my "friday" :)
[21:16:17] <^demon> csteipp: I'll give this a test and let you know.
[21:16:23] Thank you!
[21:16:50] And really, if there's any question, it's not a big deal if it stays public.
[21:20:59] <^demon> Well crap, our 2.4rc0 is broken and I can't test :\
[21:21:02] * ^demon headdesks
[21:21:36] <^demon> Oh, typo.
[21:21:38] <^demon> Ignore me.
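For the hasClass()/getURLParamValue() cruft mentioned in the cleanup above, core already provides equivalents, so the local copies can usually be deleted outright. A small sketch of the substitutions, assuming the mediawiki.util module (and jQuery) is loaded; the local helper names are the ones quoted from that Common.js, and the selector/parameter values are illustrative only.

```js
var someOtherUrl = '/wiki/Main_Page?action=history'; // illustrative URL

// Instead of a local getURLParamValue( paramName, url ):
var action = mw.util.getParamValue( 'action' );               // reads the current URL
var other  = mw.util.getParamValue( 'action', someOtherUrl ); // or an explicit one

// Instead of a hand-rolled hasClass( element, className ):
var isStub = $( '#content' ).hasClass( 'stub' );
```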
[21:22:31] * csteipp ignores demon
[21:31:38] <^demon> Ah-ha! The changeMerge.test setting is awesome. It doesn't just hide "Submit"
[21:31:48] <^demon> It also adds a new "Can merge" field to the box at the top
[21:32:00] <^demon> So you're not left wondering why it doesn't appear :)
[21:32:19] Does it tell you if something is unmergeable?
[21:32:35] (and why)
[21:32:53] Krinkle: Hmm OK so, thinking about what else we should do to prepare for the Berlin tutorial
[21:32:54] <^demon> Doesn't tell you why, but just says yes|no.
[21:33:07] <^demon> It does a dry-merge attempt and the results are based on pass/fail of that.
[21:33:22] ^demon: OK so this covers both path conflicts and bad dependencies?
[21:33:28] <^demon> Afaik, yes.
[21:33:35] <^demon> csteipp: Answer is no, we can't :(
[21:33:51] ^demon: bummer, but thanks for trying!
[21:33:56] <^demon> http://p.defau.lt/?_okrGF2K9RK25sdayCqPkQ
[21:34:17] RoanKattouw: Yeah, docs are pretty much up to date for RL/MGU
[21:34:26] Looks like it
[21:34:47] Krinkle: Although, Ctrl+F 'secure' comes up blank
[21:34:57] It has a section on protocol-relative URLs but it doesn't mention secure
[21:35:45] RoanKattouw: elaborate?
[21:36:18] Krinkle: The WMA thing I fixed on iawiki had the familiar if ( foo ) { use secure } else { don't } pattern
[21:36:41] We should document that that's bad and how to fix it
[21:44:29] Yay for jQuery: http://cl.ly/1L0w3U3x431n0y2Y3P10
[21:46:34] nice
[22:23:02] what an ungodly URL
[22:24:54] oops: https://en.wikipedia.org/w/index.php?title=Special:Search&search=ajax.googleapis.com&fulltext=Search&profile=all&redirs=1
[22:25:09] AaronSchulz: Cloud App is awesome :P
[22:25:25] snapshot, Cmd+R, and it's up in the air with the URL in the clipboard
[22:25:39] AaronSchulz: you should add a hook and add a "Powered by Cloud" logo to MW
[22:34:19] Reedy: are you not cloud enough to CR 7688?
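On the if ( foo ) { use secure } else { don't } pattern Roan mentions wanting to document: the shape of the problem and the fix look roughly like the sketch below. The "before" half is a generic reconstruction (the exact condition and script URLs varied from wiki to wiki, so treat them as placeholders, not the actual iawiki code); the fix is simply a protocol-relative URL, which inherits whatever protocol the page itself was loaded over. importScriptURI() is the legacy wikibits loader and mw.loader.load() its modern equivalent.

```js
// Before (reconstructed placeholder): branch on the old secure gateway.
if ( window.location.host === 'secure.wikimedia.org' ) {
	importScriptURI( 'https://secure.wikimedia.org/example/path/SomeGadget.js' );
} else {
	importScriptURI( 'http://example.toolserver.org/SomeGadget.js' );
}

// After: a single protocol-relative URL works over both HTTP and HTTPS.
mw.loader.load( '//example.toolserver.org/SomeGadget.js' );
```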