[03:20:21] 	 RoanKattouw: still there?
[03:20:32] 	 Trying to investigate a bug but no luck so far
[03:20:53] 	 there appears to be a significant difference in parser behavior between 1.19wmf1/1.20wmf1 and my localhost
[03:20:57] 	 (which runs master)
[03:21:48] 	 About 1.5 months back I made a change in SyntaxHighlight_GeSHi to use <pre> instead of <div> to wrap the stuff so that normal skin styles for the <pre> containers apply.
[03:21:54] 	 Looking at http://www.mediawiki.org/wiki/Extension:Proofread_Page
[03:22:28] 	 it appears they're closed early due to being inside a list item
[03:23:35] 	 https://www.mediawiki.org/w/index.php?title=Project:Sandbox&oldid=523532
[03:25:32] 	 You're looking at case 12?
[03:25:44] 	 8 and 12
[03:25:44] 	 yes
[03:25:48] 	 It's <pre> stuff
[03:25:50] 	 That's weird
[03:25:51] 	 as well as the first three above the list
[03:26:14] 	 Does this happen in 1.19wmf1 AND 1.20wmf1?
[03:26:21] 	 Cause maybe Tidy is doing this
[03:26:26] 	 it was backed out of 1.19wmf1
[03:26:28] 	 I committed it to svn trunk
[03:26:31] 	 requested merge
[03:26:34] 	 No I mean the bug
[03:26:35] 	 backed out due to this bug
[03:26:37] 	 Oh, right
[03:26:47] 	 I figured it was because it didn't apply cleanly and was dependent on a small parser change
[03:26:51] 	 but now it's happening again
[03:27:02] 	 If you're saying it's broken in 1.20wmf1 but not in master, that would be very very weird
[03:27:05] 	 so it must be related to some wmf config stuff
[03:27:10] 	 Cause 1.20wmf1 was branched /today/
[03:27:12] 	 Yeah
[03:27:13] 	 I know
[03:27:16] 	 try locally please :)
[03:27:20] 	 Hence me suggesting Tidy
[03:27:27] 	 see if it happens on your localhost
[03:27:40] 	 (copy sandbox contents into a localhost running master with SyntaxHighlight_GeSHi)
[03:27:49] 	 writing bug report now
[03:28:54] 	 Eek, my mergeRL2 commit is broken
[03:32:17] 	 You're right, it works on master
[03:33:17] 	 Hmm, installing Tidy didn't break it
[03:35:45] 	 !b 35875
[03:35:46] 	 https://bugzilla.wikimedia.org/show_bug.cgi?id=35875
[03:35:53] 	 Nope, Tidy does break it
[03:35:54] 	 It's Tidy
[03:36:01] 	 Steps to reproduce:
[03:36:04] 	 sudo apt-get install php5-tidy
[03:36:13] 	 Set $wgUseTidy = true;
[03:36:15] 	 I don't have that stuff, you know that :P
[03:36:19] 	 Make an edit (an actual edit!) to the page
[03:36:23] 	 Oh, you're on your mac, right
[03:36:29] 	 don't rub it in :P
[03:36:47] 	 I know, install brew or ports, but I don't like that
[03:38:22] 	 RoanKattouw: added rev ids to bug
[03:40:46] 	 I don't get why it's doing this
[03:40:54] 	 it seems to disallow those pre tags there
[03:41:00] 	 it's not limited to list item context
[03:41:07] 	 the first three above the list in the test case are also failing
[03:41:27] 	 RoanKattouw: does your local with tidy match the first screenshot in the bug report? (including the cases not in the list)
[03:42:47] 	 It matches what you showed me on mw.o
[03:42:51] 	 ok
[03:44:22] 	 I'm starting to think this bug has been in geshi for a while, but previously the inner wrapper outputted by geshi internals matched the tag name that the mw extension was using (<div> inside <div>, maybe tidy merged them)
[03:44:28] 	 but still, why would it not allow that
[03:47:45] 	 tidy(1) - validate, correct, and pretty-print HTML files
[03:47:48] 	 whatis tidy
[03:47:50] 	 I have it already?
[03:47:53] 	 * Krinkle is surprised
[03:48:38] 	 RoanKattouw: Do you have $wgAlwaysUseTidy on as well?
[03:48:53] 	 I have tried with and without
[03:49:11] 	 I used the PHP extension, not the CLU
[03:49:14] 	 *CLI
[03:50:31] 	 yeah, enabling $wgUseTidy breaks it
[03:51:04] 	 OK, it's almost 9, I'm gonna go home
[03:51:15] 	 The pseudoephedrine has been wearing off for an hour now so I'm getting tired
[03:51:46] 	 I may or may not sleep well tonight depending on what my sinuses and mucus glands do, and what sucks is that the drug I use to shut them up also keeps me awake
[03:51:56] 	 So I have absolutely no idea when I will be back online
[03:52:11] 	 Well it'll be tomorrow but I don't know what time
[03:52:27] 	 RoanKattouw: Do you know, give or take, where tidy applies its stuff?
[03:52:36] 	 I could search for hours
[03:52:41] 	 You mean to work around it?
[03:52:45] 	 Look for the ParserAfterTidy hook
[03:52:48] 	 You might find it useful
[03:52:53] 	 well not really, more to debug it
[03:52:59] 	 ok
[03:53:00] 	 thx
[03:53:21] 	 geshi is executed from parserHook
[03:53:51] 	 <pre><div>stuff</div></pre>
[03:53:53] 	 line 3 column 1 - Warning: missing </pre> before <div>
[03:53:55] 	 line 3 column 16 - Warning: inserting implicit <pre>
[03:53:56] 	 although it comes back to 'why', since <pre> and <div> are nearly the same, both are exceptions in wikitext in that they are allowed
[03:53:58] 	 You can run tidy on the CLI and just feed it stuff
[03:54:12] 	 Or pipe a file into it
[03:54:15] 	 why is there a div inside
[03:54:16] 	 hm,
[03:54:17] 	 interseting
[03:54:20] 	 interesting*
[03:54:22] 	 Oh, sorry
[03:54:24] 	 That's my input
[03:54:30] 	 Lemme try them in reverse
[03:55:03] 	 It took <div><pre>stuff</pre></div> no problem
[03:55:09] 	 That's with an HTML5 doctype though
[03:55:23] 	 yeah, div>pre isn't an issue for us
[03:55:25] 	 sure
[03:55:31] 	 I can imagine pre>div,
[03:55:36] 	 but it doesn't make sense
[03:55:40] 	 I don't know
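As a rough mental model of the fix-up the warnings above describe (a toy sketch, not Tidy's actual algorithm): under the HTML4 rules Tidy enforces, a block element such as <div> may not sit inside <pre>, so Tidy closes the <pre> early and inserts an implicit one afterwards. The helper below is a naive string rewrite invented for illustration; it assumes exactly one <div>…</div> nested directly inside <pre>…</pre>.

```python
def tidy_like_fixup(html: str) -> str:
    # Find the nested <div>...</div> and hoist it out of the <pre>,
    # mimicking "missing </pre> before <div>" + "inserting implicit <pre>".
    inner_start = html.index("<div>")
    inner_end = html.index("</div>") + len("</div>")
    inner = html[inner_start:inner_end]
    # Close the pre early, emit the div, then reopen an (empty) pre.
    return "<pre></pre>" + inner + "<pre></pre>"
```

Feeding it the pasted input `<pre><div>stuff</div></pre>` illustrates why the extension's `pre > div` output gets mangled while the reversed `div > pre` nesting passes through untouched.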
[03:55:47] 	 Well I'm going home and gonna get some sleep
[03:56:04] 	 Good luck (and good morning, it's 6am)
[03:56:05] 	 the whole reason this^H^H^, never mind, I'll see what I can figure out
[03:56:06] 	 you go :)
[06:33:15] 	 morning
[06:53:46] 	 TimStarling: ping
[06:53:55] 	 hello
[06:54:00] 	 hi Tim!
[06:54:26] 	 I was wondering about how the Lua stuff is going
[06:55:44] 	 we have to get a lot of information and functionality from the wiki, which is not that easy from node
[06:55:59] 	 it's going nicely, I'm currently writing the standalone interpreter
[06:56:00] 	 so we were considering a C extension
[06:56:40] 	 ah- standalone as in static binary with stdin/stdout communication?
[06:57:06] 	 yes, except that it will probably be dynamically linked rather than static
[06:57:18] 	 k
[06:57:35] 	 so the plan for shared hosting is to distribute the binary
[06:57:36] 	 because if I statically link against glibc, the binary size is 1MB, whereas if it's dynamically linked it's 200KB
[06:57:42] 	 yes
[06:57:52] 	 nice ;)
[06:58:07] 	 the only dynamic dependency is libc
[06:58:21] 	 200KB is small enough that I can check them in to git
[06:58:33] 	 maybe 3 binaries: two for linux and one for windows
[06:58:58] 	 using "make generic" it doesn't try to link against any system dependencies, it just uses libc alone
[06:59:00] 	 piggy-backing on that solution would be very attractive for a possible C parser port too..
[07:00:42] 	 I am using a message format with a length field followed by a PHP serialized message body
[07:01:12] 	 with a kind of checksum of the length field to prevent most kinds of hanging that you would expect from strange things appearing on stdout
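The framing described above can be sketched as follows. This is a toy model, not Scribunto's actual wire format: the 4-byte field widths, the check function (`length * 2 + 1`), and the use of JSON in place of PHP's serialize() are all assumptions for illustration.

```python
import json
import struct

def encode_message(msg: dict) -> bytes:
    # Serialize the body, then prefix it with a length field and a
    # check value derived from the length (both hypothetical formats).
    body = json.dumps(msg).encode("utf-8")
    length = len(body)
    check = length * 2 + 1  # toy "checksum of the length field"
    return struct.pack(">II", length, check) + body

def decode_message(data: bytes) -> dict:
    length, check = struct.unpack(">II", data[:8])
    if check != length * 2 + 1:
        # Stray output on stdout fails fast here instead of making the
        # reader hang waiting for `length` bytes that never arrive.
        raise ValueError("corrupt length field")
    return json.loads(data[8:8 + length].decode("utf-8"))
```

The point of the check value is exactly what the chat says: garbage appearing on stdout is detected immediately rather than being misread as a huge length.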
[07:01:34] 	 I've basically finished the PHP side, that was easy
[07:01:57] 	 I must have written 10 lines of code before in Lua so the Lua side will be a bit slow
[07:02:07] 	 very nice
[07:02:57] 	 how do you plan to handle function calls?
[07:03:14] 	 which direction?
[07:03:22] 	 I noticed that you support Lua -> PHP -> Lua
[07:03:27] 	 yes
[07:03:54] 	 so for PHP -> Lua calls, there is a "loadString" message which will return a chunk ID
[07:04:11] 	 then PHP will send a "call" message with that chunk ID and the arguments
[07:04:39] 	 then Lua will respond with either a "return" message giving the return value, or an "error" message if there was an error raised
[07:04:54] 	 the other direction is basically the same
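The loadString/call/return/error exchange just described could be modeled with a toy dispatcher like this. The message names follow the chat; the exact message shapes, and using Python callables (via `eval` of a lambda) as stand-ins for compiled Lua chunks, are assumptions for illustration.

```python
class ToyInterpreter:
    """Toy model of the interpreter side of the message protocol."""

    def __init__(self):
        self._chunks = {}
        self._next_id = 1

    def handle(self, msg: dict) -> dict:
        if msg["op"] == "loadString":
            # "loadString" compiles a chunk and returns a chunk ID.
            chunk_id = self._next_id
            self._next_id += 1
            self._chunks[chunk_id] = eval(msg["text"])  # stand-in for Lua compile
            return {"op": "return", "value": chunk_id}
        if msg["op"] == "call":
            # "call" invokes a chunk; any raised exception is converted
            # into an "error" message instead of crashing the pipe,
            # mirroring the exception-to-error conversion described above.
            try:
                result = self._chunks[msg["id"]](*msg["args"])
                return {"op": "return", "value": result}
            except Exception as exc:
                return {"op": "error", "value": str(exc)}
        return {"op": "error", "value": "unknown op"}
```

For example, loading `lambda a, b: a + b` yields a chunk ID, and a subsequent "call" with arguments `[2, 3]` produces a "return" message; calling an unknown chunk ID produces an "error" message.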
[07:05:15] 	 so somewhat similar to dlopen, but at a higher level and message-based
[07:05:32] 	 the PHP client code will catch exceptions of a certain class and convert them to error messages to Lua
[07:06:05] 	 which allows PHP code to simulate the way the base library raises errors
[07:06:33] 	 that mechanism is also implemented in the C extension
[07:07:04] 	 i.e. catching exceptions and converting them to lua errors
[07:07:29] 	 the basic mechanism seems to be quite independent from Lua
[07:07:44] 	 yes
[07:07:56] 	 so I guess it could also be used to access parser stuff
[07:07:59] 	 Victor wanted the Scripting extension (now called Scribunto) to support multiple languages
[07:08:33] 	 there are now three levels of object composition abstraction in my working copy
[07:09:05] 	 the first level is the wikitext parser interface, and contains some functionality common to all languages
[07:09:46] 	 the second level is the "engine"
[07:10:16] 	 there is a Lua engine base class which has the API which is exposed to Lua code
[07:10:30] 	 and two small child classes for the two implementations
[07:10:39] 	 and then the third abstraction level is the lua interpreter
[07:10:50] 	 so the engine provides the mediawiki-specific environment
[07:10:51] 	 Is this the correct place for asking questions about mediawiki core?
[07:11:01] 	 which will provide a generic module registration interface, a calling interface, etc.
[07:12:07] 	 I should check those interfaces out, especially the wikitext parser interface
[07:12:15] 	 yes, the engine provides the MediaWiki-specific environment
[07:13:42] 	 the node parser is currently solidifying on the parser side, and I hope to have an idea about remaining template-related limitations soonish
[07:13:56] 	 excellent
[07:14:04] 	 but there is a long tail of functionality to implement / duplicate
[07:14:22] 	 thumbs need the image size and would need to trigger scaling etc
[07:14:24] 	 of course
[07:14:45] 	 if it was easy it would have been done by now, right? ;)
[07:15:02] 	 I guess so ;)
[07:16:49] 	 I guess once it is clear how compatible the parser portion really is in particular for complex template / parser function / table interactions, we should consider if it is a good time to port to something that can directly reuse the existing PHP code for all the functionality
[07:17:28] 	 in the mean time, we can do a slow fall-back using action=parse and other API methods
[07:19:23] 	 right
[07:20:55] 	 doing the token-based parser in PHP would likely be too slow and memory-intensive, so C is quite attractive
[07:22:54] 	 a big issue with C or C++ is of course shared hosting, but you seem to have a very good solution for that
[07:29:31] 	 but for now I need to get taxoboxes to render properly ;)
[07:29:39] 	 thanks for your explanation!
[07:31:19] 	 no problem, good luck
[07:32:01] 	 I have to work out how to write beautiful Lua code so that everyone will look at it and say "wow, Lua is not so bad, let's all write templates in it"
[07:33:32] 	 hehe- anything will be nicer than templates and parser functions though ;)
[08:02:48] 	 New patchset: Hashar; "MediaWiki code standard for use with PHP_CodeSniffer" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4360
[08:04:09] 	 New review: Hashar; "Patchset3 remove the github submodule and make us use the mediawiki/tools/codesniffer one hosted by ..." [integration/jenkins] (master); V: 0 C: 0;  - https://gerrit.wikimedia.org/r/4360
[10:00:34] 	 New review: Hashar; "(no comment)" [integration/jenkins] (master); V: 1 C: 2;  - https://gerrit.wikimedia.org/r/4491
[10:00:44] 	 New review: Hashar; "(no comment)" [integration/jenkins] (master); V: 1 C: 2;  - https://gerrit.wikimedia.org/r/4487
[10:02:00] 	 New review: Hashar; "The list of files was far from perfect but follow up changes 4487 & 4491 by Platonides makes the lin..." [integration/jenkins] (master); V: 1 C: 2;  - https://gerrit.wikimedia.org/r/4485
[10:02:03] 	 Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4491
[10:02:04] 	 Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4487
[10:02:05] 	 Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4485
[10:17:36] 	 New patchset: Hashar; "MediaWiki code standard for use with PHP_CodeSniffer" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4360
[10:18:19] 	 New review: Hashar; "patchset 4 is just a rebase." [integration/jenkins] (master); V: 0 C: 0;  - https://gerrit.wikimedia.org/r/4360
[11:08:34] 	 Any gerrit admins around?
[12:12:05] 	 I'm getting lots of "lost session" when editing on mediawiki.org
[13:04:00] 	 vvv: do you still need Gerrit assistance ?
[13:04:18] 	 Nikerabbit: if you get session lost that might be due to 1.20wmf1 :-(
[13:04:37] 	 hashar: do you know if there is a way to get a RO-access to gerrit's SSH interface?
[13:05:17] 	 what do you want to do ?
[13:05:39] 	 we could surely try to craft a group which has no rights
[13:05:39] 	 hashar: that's why I mentioned it
[13:05:46] 	 hashar: well, I need to make update-extensions.sh work from an anonymous gerrit setup
[13:06:12] 	 you can use https?
[13:06:47] 	 like https://gerrit.wikimedia.org/r/p/mediawiki/extensions/Translate.git
[13:06:50] 	 I don't know what that script is nor what it does
[13:07:06] 	 hashar: well, what's the preferred way to check out all extensions then?
[13:07:26] 	 I did a script for Nike, there are a few around too
[13:07:34] 	 the list of extensions is at:  https://gerrit.wikimedia.org/mediawiki-extensions.txt
[13:08:22] 	 you could then do something like:   for i in `curl $URL_ABOVE`; do git submodule add https://gerrit.wikimedia.org/r/p/mediawiki/extensions/$i extensions/$i; done
[13:09:06] 	 And what would be the command for updating all them?
[13:09:24] 	 to do something on every submodules uses:  git submodule foreach
[13:09:31] 	 so that would be:   git submodule foreach 'git pull'
[13:09:34] 	 Reedy: how do I run an extension maintenance script for a git-branch wiki?
[13:09:58] 	 Nikerabbit: same way
[13:10:10] 	 mwscript /path/to/maintenance/script.php test2wiki
[13:10:22] 	 the path being from php-1.20wmf1
[13:11:02] 	 hmm
[13:11:13] 	 so I have to be inside that branch?
[13:11:24] 	 no
[13:11:35] 	 I just mean it's relative from php-1.19/php-1.20wmf1
[13:12:05] 	 and it will figure it out automatically?
[13:12:16] 	 yeah
[13:19:14] 	 Reedy: how do you merge changes from master to REL1_19  ?
[13:19:21] 	 do you use git cherry-pick ?
[13:19:52] 	 That's what I've done before, yeah
[13:20:35] 	 The output on gerrit looks sane, so it seems to know where it's come from etc
[13:20:40] 	 there is a -x option to append a line saying  "(cherry picked from commit …)"
[14:10:01] <^demon>	 JeroenDeDauw: I almost want to quip "I fully agree with what Chad said" in BZ. See, we do have something in common ;-)
[14:11:11] 	 ^demon: I don't always disagree with you, I just keep quiet when I do
[14:11:43] <^demon>	 It's easier to comment when you disagree.
[14:12:15] <^demon>	 Actually, we have that in common too--we both speak our minds when we think "that's stupid" ;-)
[14:32:13] 	 ^demon: seems to be prevalent here :)
[14:42:24] 	 ping everyone: when are you guys arriving at the Berlin hackathon?
[14:44:07] 	 ^demon: Nikerabbit Reedy ^^^^^
[14:44:49] 	 Dunno
[14:44:53] 	 When I get there :p
[14:44:55] 	 hashar: I probably won't
[14:45:07] 	 I've got to be there for the Wikidata event the day before
[14:45:24] 	 on thursday?
[14:45:49] 	 I'll go there either Thursday morning or Wednesday evening
[14:45:50] 	 yeah, so most likely on the wednesday
[14:46:03] <^demon>	 hashar: Not coming.
[14:46:05] 	 also for Wikidata
[14:46:57] 	 ^demon: AGAIN ???????????
[14:46:59] 	 ^demon: :-D
[14:47:15] <^demon>	 Yeah, won't be there this year.
[14:47:50] 	 you need to set up a platform engineering hackathon in your city :-)
[14:50:14] <^demon>	 Ha, we lack public transportation.
[14:50:21] <^demon>	 I'd be stuck shuttling everyone from the airport.
[14:54:08] 	 guillom: are you coming to the Berlin hackathon, and if so, when is your flight?
[14:54:18] 	 guillom: I have a cheap one from Nantes to Berlin via your city :-]
[14:56:00] 	 what days is it?
[14:56:19] 	 Friday June 1st  till   Sunday June 3rd
[14:56:34] 	 june
[14:56:35] 	 ok
[15:11:23] 	 hashar: to clone a tag, do i need to clone the whole repo, then do git checkout tags/FOOBAR ?
[15:26:10] 	 Reedy: sorry back
[15:26:34] 	 Reedy: a tag is just a name pointing to a sha1
[15:26:39] 	 err a commit object sha1
[15:26:42] 	 yeah
[15:26:49] 	 need to clone then checkout
[15:27:09] 	 regardless of tag / branch, if you want a copy you need to do a clone
[15:27:13] 	 could be a local clone though
[15:27:49] <^demon>	 Yeah, you don't have to always clone remotely. If you've got core.git already cloned locally you can clone from it.
[15:27:53] 	 for release management, there is probably a command like:   git tarball 
[15:27:55] <^demon>	 And just add a new remote if you need to push to gerrit.
[15:28:20] <^demon>	 hashar: `git archive`, but we have to do some manual steps between clone & tarring anyway so we can't one-line it :\
[15:28:59] 	 which steps?
[15:29:12] <^demon>	 Copying in bundled extensions.
[15:29:31] 	 aren't they submodules in ./extensions/ ?
[15:29:48] <^demon>	 You'd have to commit the submodules to the branch in order to tag it.
[15:30:07] <^demon>	 And so we'd end up with a commit with submodules for core which I kinda didn't want :\
[15:30:26] <^demon>	 Extensions aren't submodules for core except on wmf branch.
[15:30:26] 	 you could use a release branch
[15:30:41] 	 just like we have the wmf branch to release master on production servers
[15:31:11] <^demon>	 We have release branches. I'm just not sure how I feel about adding those extensions as submodules though.
[15:31:23] 	 ^demon: still no luck with automerge?
[15:31:23] <^demon>	 Reedy: You're release manager. Thoughts?
[15:31:29] <^demon>	 Nikerabbit: No :\
[15:32:07] <^demon>	 hashar: We also have to generate the diff between the release and the previous release since we always publish those.
[15:32:31] 	 oh yeah the patches
[15:32:34] 	 those are trivial though
[15:32:36] <^demon>	 It's easier to just clone & checkout and do the work then tag.
[15:32:44] <^demon>	 No need to use git archive.
[15:32:46] 	 should be something like   git diff 1.19.1..1.19.2
[15:33:03] 	 indeed, clone / checkout :-D
[15:37:49] 	 ^demon: can you write an update to https://bugzilla.wikimedia.org/show_bug.cgi?id=35537
[15:38:00] <^demon>	 I will today, yes.
[15:48:15] <^demon>	 Nikerabbit: Do we want to move /trunk/translatewiki to git at some point?
[15:48:59] 	 dunno
[15:49:16] <^demon>	 There's no rush, just something to think about.
[17:15:34] 	 hi 20% folks!
[17:15:42] * sumanah  looks at 
[17:15:42] 	 https://www.mediawiki.org/wiki/Wikimedia_engineering_20%25_policy
[17:16:06] 	 Wednesday: raindrift (who isn't in yet, I think) and a little of Niklas's time
[17:16:34] 	 jorm: I'm sorry I didn't catch you yesterday, do you have a moment to talk?
[17:19:29] 	 hrm crickets
[17:19:56] 	 guillom: while I've got a few min....are you around?
[17:20:10] 	 robla: sort of; what's up?
[17:21:01] 	 any thoughts on 1.20 communication for next week?
[17:21:11] 	 I'm assuming you've seen the schedule and know what's up there
[17:21:16] 	 hello
[17:21:24] 	 sorry I was in sauna
[17:22:00] 	 Nikerabbit: sounds fun! I am merely on a couch. :-)
[17:22:00] 	 robla: no, I've seen the 1.20 page but as sumanah and I were looking at it it appeared that its content wasn't yet stable/reliable
[17:22:20] 	 robla: so, without knowing what 1.20 actually is, it's difficult to know what to communicate about
[17:22:30] 	 guillom: ok...so, here's the scoop:
[17:22:31] 	 Nikerabbit: is your code review & other 20% time stuff for today/tomorrow all absorbed already, or would you like a suggestion of something to look at?
[17:22:38] 	 hmm robla: have you thought about how we communicate about changes and new features to the users before every new branch being deployed?
[17:23:08] 	 sumanah: I do a little bit during the week, trying to do most of it on Fridays
[17:23:15] 	 Nikerabbit: that's what I'm going to ask guillom to advise on  :)
[17:23:24] 	 Nikerabbit: ok, I'll leave you alone for a little while, then :-)
[17:23:28] 	 I still have quite a few gerrit commits I've promised to look at
[17:23:31] 	 Nikerabbit: ok
[17:23:51] 	 guillom: the plan is to deploy little mini branches every two weeks
[17:24:14] 	 branchlets, so cute!
[17:24:27] 	 eeeeee
[17:24:30] 	 I had a play with the idea and came up with: https://www.mediawiki.org/wiki/I18n_deployments
[17:24:34] 	 each with its own HEAD
[17:24:36] 	 but I'm not really happy about that
[17:24:36] 	 so, instead of just having 1.20wmf1 for the duration of the 1.20 cycle, we'll have 1.20wmf1 and 1.20wmf2 in April, 1.20wmf3 and 1.20wmf4 in May, etc
[17:24:59] 	 mostly because it probably doesn't scale beyond just i18n stuff
[17:25:24] 	 in six months, we'll have a 1.20 release that corresponds to whatever the latest 1.20wmfXX branch is
[17:25:59] 	 so....if we communicate in Grand Style about these changes, we'll drive people nuts
[17:26:01] 	 but in general I think we need some kind of wiki page that lists the changes and deployment days in addition to sending notifications by email and perhaps by blogging
[17:26:24] 	 robla: and are all these mini deployments going to be done in batches, like the previous ones?
[17:26:27] 	 Nikerabbit: something more human-readable than https://wikitech.wikimedia.org/view/Software_deployments ?
[17:26:32] 	 sumanah: yes
[17:26:37] 	 guillom: yup.
[17:26:41] 	 Nikerabbit: ok, just making sure I understood you :)
[17:26:52] 	 sumanah: and more detailed
[17:26:55] * robla  looks for wikitech-l mail describing this
[17:26:56] 	 sumanah: and more well known
[17:27:19] 	 robla: I saw that e-mail but it looked very meta and technical so I only skimmed through it
[17:28:09] 	 sumanah: we have been quite cautious with any i18n stuff we do and always send email to mediawiki-i18n at least a few days before it is going to happen
[17:28:47] 	 just to give people a chance to raise any potential issues if they are interested
[17:29:15] 	 robla: I can propose a communications process that would apply to all these mini-deployments, but I can't do it by Monday unless I stop everything else I'm working on right now.
[17:29:19] 	 http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/60511
[17:29:33] 	 (and perhaps work over the week-end)
[17:29:34] 	 and we can keep on doing that, since the deployment cycle is predictable... biggest problem is keeping up to date what actually changes, since gerrit doesn't have tagging yet
[17:29:51] 	 guillom: I think maybe you're overdoing it if you're talking about upending everything
[17:30:09] 	 robla: upending?
[17:30:23] 	 suspending
[17:31:12] 	 guillom: let's do something relatively lightweight to get us through this first deployment, and think about this more broadly later
[17:31:15] 	 robla: well, my week is full and the deployment is scheduled for Monday, so...
[17:31:33] 	 gotta run to the next thing
[17:31:47] 	 ok
[17:31:47] 	 actually the first real deployment is already today, isn't it?
[17:32:08] 	 robla: I'll sketch out a plan for guillom, something perhaps a little more immediate and small
[17:33:22] 	 guillom: I've found a few things that readers and editors might notice in 1.20
[17:33:22] 	 https://www.mediawiki.org/wiki/MediaWiki_1.20#Readers_and_editors_will_notice
[17:33:39] 	 guillom: most noticeably: the new diff style
[17:34:51] <^demon>	 Krinkle: Who added useskin=default?
[17:34:58] 	 ?
[17:35:02] 	 I have no idea
[17:35:17] <^demon>	 Hmm :\
[17:35:21] 	 I didn't know it existed until now
[17:35:23] <^demon>	 From the release notes for 1.20
[17:35:28] <^demon>	 Neither did I.
[17:35:29] 	 wow
[17:35:41] <^demon>	 Wouldn't that break if someone ever made a skin called SkinDefault?
[17:36:34] 	 sumanah: but if I understand robla correctly, that page is about 1.20wmf01, not 1.20, right?
[17:36:57] <^demon>	 They're branched from the same point.
[17:37:09] <^demon>	 Other than a couple of wmf-only hacks we did on the 1.20wmf1 branch
[17:37:39] 	 ^demon: but robla said that 1.20 would be 1.20wmfXX in like 6 months
[17:37:39] 	 wmf-only hacks? noooooooo
[17:37:57] 	 I'm confused
[17:38:24] <^demon>	 Are we releasing in 6 months?
[17:38:33] <^demon>	 I thought we were trying to shrink release cycles.
[17:38:57] <^demon>	 brion: Less than before. We managed to get most of them worked into master to varying degrees.
[17:39:03] 	 ^demon:  in six months, we'll have a 1.20 release that corresponds to whatever the latest 1.20wmfXX branch is
[17:39:22] 	 most is good. all is better
[17:39:23] <^demon>	 News to me, I didn't know we were shooting for so far out.
[17:40:29] <^demon>	 brion: https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=shortlog;h=refs/heads/wmf/1.20wmf1 - the branch begins with "Creating new..."
[17:41:50] 	 ok, guillom, here's my thinking: since the deployment on Monday is for a relatively small code change (just about a month's worth or two), it's ok to be less thorough in how we communicate it out to the communities. So, sometime tomorrow or Friday, spend a few hours -- and time-limit it to only what you can do in less than half a day -- posting to the major village pumps of the non-Wikipedia wikis
[17:41:55] 	 ^demon: i need some quick help with git to make sure i don't break things
[17:41:57] 	 Alright, I'm off to dinner; sumanah: I'll take a look at 1.20 comms tomorrow, but I may defer the 1.19 in favor of 1.20
[17:42:26] 	 guillom: understood, and that makes sense.  1.19 tarball release comms are less urgent than 1.20 deploy comms
[17:42:37] 	 https://www.mediawiki.org/wiki/MediaWiki_1.20#Readers_and_editors_will_notice now has a paragraph with the top three things to communicate out, I think, guillom
[17:56:22] 	 sumanah: sup?
[17:57:24] 	 hey jorm - just as background, did you happen to see the recent wikitech-l thread about bottlenecks in extensions getting reviewed before deployment?
[18:07:06] 	 brion: thanks for your email to wikitech-l just now; used it to clarify the start of https://www.mediawiki.org/wiki/Parser_tests
[18:09:03] 	 i'm certain i saw it; not certain i read it.
[18:09:39] 	 jorm: ok. So here's the bit that might affect you :-) --
[18:09:40] 	 \o/
[18:09:57] 	 http://lists.wikimedia.org/pipermail/wikitech-l/2012-April/059962.html
[18:10:21] 	 jorm: I talked with Howie about this -- many of these extensions need user experience design review before they need any technical or code review
[18:10:36] 	 because after all we're asking "do we need this feature at all" before we dive into how it's implemented
[18:10:46] 	 are these things that are going to get deployed on wmf wikis?
[18:11:00] 	 jorm: Yes, but only if they pass all the review steps
[18:11:23] 	 jorm: I've updated https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment to clarify the process that these extensions have to go through before deployment
[18:11:52] 	 i haven't been able to get up and working on git yet.
[18:12:14] 	 so that's a bit of a problem.
[18:12:40] 	 also, can someone re-close that difftext bug?
[18:12:48] 	 i'm seriously done bikeshedding about it.
[18:12:49] 	 jorm: Most of these extensions are in SVN
[18:13:33] 	 jorm: ConfirmAccount, for example
[18:14:19] 	 jorm: there are a few extensions in the review queue that are awaiting a user experience design review -- https://www.mediawiki.org/wiki/Extension:ConfirmAccount is the one where I think it's most ripe for your thoughts
[18:15:08] 	 (SpecialInterwiki, I want to check with Siebrand & with the Wikidata people.... SubpageSortKey I think is less interesting frankly)
[18:15:33] 	 well.
[18:15:41] 	 we are never going to deploy confirmaccount to any wmf wiki.
[18:16:05] 	 i can say that based on what it's *for*, not how it works.
[18:16:17] 	 sumanah: I once added that extension to core, but that was reverted because the special page could not be disabled, or editing could not be disabled (forgot what exactly).
[18:16:18] 	 subpagesortkey needs ui review?
[18:16:22] 	 it doesn't have a ui
[18:16:37] 	 bawolff: does it affect the user experience at all?
[18:16:48] 	 sumanah: I think it makes sense to have a read-only interwiki list available for users.
[18:16:56] 	 not in a way that users have any say in ;)
[18:17:12] 	 sumanah: now it's a code function without a UI, so guesswork.
[18:17:20] 	 code = core
[18:17:35] 	 bawolff: is this the one that Tim said "just put it in core" for? Subpage Sortkey?
[18:17:50] 	 yeah i think so
[18:17:59] 	 I'm just catching up on all my mediawiki related email
[18:18:04] 	 Ah yes it is https://bugzilla.wikimedia.org/show_bug.cgi?id=22911#c13
[18:18:09] 	 from the last 2 weeks or so
[18:18:19] 	 bawolff: understood, ok, I'll change that, then
[18:18:20] 	 sorry
[18:18:59] 	 siebrand: mind if I copy and paste what you just said into https://bugzilla.wikimedia.org/show_bug.cgi?id=22043 for posterity?
[18:19:12] 	 sumanah: no problem
[18:19:26] 	 jorm: and may I copy and paste what you said into the https://bugzilla.wikimedia.org/show_bug.cgi?id=13782 bug regarding ConfirmAccount?
[18:19:59] 	 There's not much to say.
[18:20:05] 	 I mean, do you know what it does?
[18:20:28] 	 Sets it up so that only 'crats can create accounts.  That's it.
[18:20:29] 	 I think enwikipedia folk want confirmaccount to replace the toolserver account creation process
[18:20:29] 	 "No" is a substantial thing to say :)
[18:20:41] 	 they don't want it enabled in general
[18:20:47] 	 We won't be deploying it.  I can't see a situation where the WMF would want it.
[18:21:00] 	 ok, siebrand, updated the bug, thanks
[18:21:06] 	 yw
[18:21:09] 	 toolserver?
[18:21:32] 	 hrm.
[18:21:36] 	 yeah, they have some sort of request an account hack on toolserver
[18:21:47] 	 hrm.
[18:21:55] 	 that's not wmf, but close enough.
[18:21:58] 	 https://toolserver.org/~acc/
[18:22:24] 	 (we only have like, five restricted wikis:  office, internal, board, collab, foundation)
[18:22:35] 	 and those have account creation disabled entirely on them.
[18:23:32] 	 https://en.wikipedia.org/wiki/Wikipedia:ACC
[18:24:28] 	 siebrand: just to be super-clear -- is there any reason I should ask the Wikidata people to comment on the request to install Extension:SpecialInterwiki?
[18:24:56] 	 and bawolff when I said "less interesting" I simply meant, potentially interesting to Brandon
[18:25:13] 	 as functionality he might be interested in reviewing
[18:25:20] 	 no offense to you or your work intended!
[18:25:26] 	 sumanah: lol, I wouldn't be offended if you said my 5 line extension was boring
[18:25:32] 	 It ain't exactly sliced bread
[18:25:56] 	 hygiene is more important than heroics
[18:26:01] 	 the boring is more important than the exciting
[18:26:06] 	 :D
[18:27:41] 	 ok, so jorm, now that you know why they're interested in having it installed (basically as a workaround for people to HELP them get accounts), is it less of an automatic no?
[18:28:25] 	 sumanah: btw, for the Memento extension on the review queue list - that has the secondary issue of no one actually wanting it as well ;)
[18:28:31] 	 bawolff: ahaaaa
[18:28:52] 	 okay.  i'm trying to figure out how to say this delicately.
[18:28:56] 	 or at least, no one other than the author caring (people don't not want it, just no one really cares)
[18:29:13] 	 but there really isn't, so here we go:  I don't have much time to spend on non-WMF requests.
[18:29:14] 	 bawolff: hm, I shall attempt to figure out how it got on the review queue then
[18:29:26] 	 This really isn't one.  It's arguable.
[18:29:57] 	 bawolff: where would ConfirmAccount be installed if we approved it? WMF sites?
[18:30:05] 	 one wonders why it's needed.  if the point is to prevent spam accounts (on any wiki), then really having an interface where you have to go approve/reject people is actually *more* work.
[18:30:32] 	 sumanah: en wikipedia are the folks who want it
[18:30:51] 	 is there a discussion about this? i'm just curious, now.
[18:31:02] 	 jorm: so, it's a request for a change to a WMF site.  Do you mean you don't have much time to spend on requests from outside WMF? just clarifying
[18:31:02] 	 It should be checked if they still want it. They've invested quite a bit in doing their workaround, so they might not even care anymore
[18:31:28] 	 bawolff: I did ask in the bug comments. jorm  https://bugzilla.wikimedia.org/show_bug.cgi?id=13782
[18:31:45] 	 They say enwikipedia in comment 0
[18:31:55] 	 but that is years ago.
[18:32:54] 	 So I figure I'll ask Ironholds whether he has time to assess things by asking around about this and figuring out whether en. still wants it.
[18:33:04] 	 (Since the relevant people might not have read their bugmail yet.)
[18:34:44] 	 yeah. i'm not going to spend time on this if the only person who wants it reviewed is MzMcbride.
[18:34:59] 	 it was closed for a whole year before he reopened it.
[18:35:39] 	 jorm: Got it. OK, I'll summarize our discussion today in another comment on the bug and ask Ironholds to reassure me that this isn't something they actually want
[18:35:58] 	 jorm: and on a related thing -- to talk about your 20% time allocation, should I talk with you directly or with Terry?
[18:36:17] 	 i don't work for terry. i work for howie.
[18:36:31] * sumanah  misread an org chart
[18:37:20] 	 and the last time i did any 20% work, it got turned into a bikeshed conversation and a major waste of my time.
[18:37:24] 	 hey hexmode - when you have a second, I'm interested in the background of the Memento extension and whether a community member asked for it to be added to the review queue https://www.mediawiki.org/w/index.php?title=Review_queue&diff=505247&oldid=495208
[18:37:31] 	 and it's *still going on*
[18:37:35] 	 jorm: diff?
[18:37:39] 	 yes
[18:37:44] 	 my sympathies
[18:38:11] 	 I have also been pulled into bikeshedding and ... things I had not previously prioritized .... as part of community engagement
[18:39:03] 	 jorm: so perhaps you would prefer to have a windshield or something, to help make your 20% time more productive?
[18:39:26] 	 jorm: a liaison who ... hmm, that might not work
[18:40:02] 	 hey it's wikipedia - if there's one thing we're good at it's arguing over pointless things
[18:40:22] 	 i'm not opposed to doing the 20% work but i am already operating at near full capacity so the entire rigamarole of fighting over shit just to have someone redo all of it later is just not high on my list of priorities.
[18:40:32] 	 bawolff: no need to let that particular natural resource turn into an oil spill that paralyzes birds (I guess Brandon here is a bird?)
[18:40:40] 	 sumanah: vaguely remember that... let me go look around
[18:41:14] 	 jorm: ok, so if I want to ask for a fifth of your time, it sounds like I would need to nag Howie (to try to reduce your current workload)
[18:41:21] 	 I also know you do a lot of community engagement as you go
[18:41:30] 	 working on mediawiki.org and uploading things publicly
[18:41:32] 	 you need to get howie's sign off.
[18:41:38] 	 yeah
[18:41:47] 	 to be honest, i'd prefer to spend my 20% doing design work for the sister projects.
[18:42:58] 	 jorm: I know there are a few volunteers who have contacted you & me asking "how can I help?" and I would love for you to be able to spend a little time giving them stuff to do
[18:43:06] 	 from a design perspective
[18:43:14] 	 sure.
[18:43:20] 	 all I can tell them is to look at bugzilla bugs with the "design" keyword, and comment with their perspective
[18:43:30] 	 ok, I'll talk to Howie
[18:43:30] 	 thanks
[18:49:55] 	 sumanah: I don't remember putting that in in Feb, but here is the discussion from back in Nov: http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/46202
[18:50:05] 	 aha, thanks hexmode
[18:50:19] 	 oh wow, 2009
[18:52:24] 	 heh, I missed the date
[18:53:51] 	 hexmode: so how's ShortURL?
[18:54:18] 	 hexmode: more urgently, I presume you know what the big bugs are in 1.20 now that we've deployed to mediawiki.org
[18:54:44] 	 hexmode: anything worrisome?
[18:55:27] * chrismcmahon  listens in.... 
[18:57:25] 	 sumanah: haven't seen anything
[18:57:42] 	 but I haven't gone through all the bugs yet
[18:57:52] 	 so I may have something yet.
[18:58:29] 	 hexmode: got it. chrismcmahon you might also want to have a skim of recently filed bugs in bugzilla
[18:58:34] 	 just to have 2 sets of eyes on it
[18:58:46] 	 :)
[19:10:00] 	 robla: sumanah: https://bugzilla.wikimedia.org/35849 -- highest since this might be on enwiki
[19:10:26] 	 thanks for the heads-up, hexmode
[19:10:50] 	 A tiff screenshot!?
[19:11:15] 	 I haven't opened a tiff file since I left high school
[20:28:54] 	 DarTar: can you take a look at https://bugzilla.wikimedia.org/show_bug.cgi?id=35590
[20:41:25] 	 rsterbin: sure
[20:41:31] 	 thanks
[20:43:18] 	 rsterbin: I like that
[20:43:28] 	 great
[20:43:34] 	 has fabrice signed off on this?
[20:43:47] 	 nope; i'm not sure he's seen it
[20:43:53] 	 the plan looks good to me but I don't know how this is affecting the timeline
[20:44:15] 	 and also, can you clarify what would be values for {LOCATION} ?
[20:44:15] 	 probably not much
[20:44:21] 	 ok cool
[20:44:27] 	 bottom or overlay, as usual
[20:44:31] 	 right
[20:44:46] 	 I'll post a reply saying that the plan makes perfect sense and ask fabrice to review it
[20:44:51] 	 thanks
[20:44:55] 	 np
[20:58:33] 	 ^demon: don't forget https://bugzilla.wikimedia.org/show_bug.cgi?id=35537 :)
[20:58:50] <^demon>	 I won't.
[21:03:45] 	 New patchset: Hashar; "target to take MediaWiki screenshots" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4752
[21:04:05] 	 New review: Hashar; "work in progress." [integration/jenkins] (master); V: 0 C: -2;  - https://gerrit.wikimedia.org/r/4752
[21:05:47] 	 New patchset: Hashar; "target to take MediaWiki screenshots" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4752
[21:05:54] 	 have a good night
[21:19:39] 	 New review: Krinkle; "Where is `${builddir}` defined? Is that a ./tmp directory created by ant? Or is that the directory w..." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4752
[21:22:34] <^demon>	 Krinkle: It's defined in default.properties.
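[Editor's note: for context on the `${builddir}` exchange above — in Ant, properties referenced as `${name}` in a build file are typically loaded from a properties file via the `<property file="..."/>` task before any target runs. A minimal sketch, assuming a layout like the integration/jenkins repo's; the property name `builddir` comes from the review comment above, but the value and file contents here are illustrative assumptions:]

```xml
<!-- build.xml: loads default.properties so ${builddir} resolves in targets -->
<project name="sketch" default="show">
  <!-- Each "key = value" line in default.properties becomes an Ant property;
       e.g. a line such as "builddir = ./build" would define ${builddir} -->
  <property file="default.properties"/>

  <target name="show">
    <echo message="builddir is ${builddir}"/>
  </target>
</project>
```

[Note that Ant properties are immutable once set, so a `<property file=.../>` load near the top of the build file wins over later redefinitions — which is why checking default.properties answers the "where is it defined?" question.]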
[21:49:36] 	 Where's raindrift?
[21:55:26] 	 New review: Platonides; "To anyone trying to understand the commit message, it should have been "Use gitphpfileschanged, not ..." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4491
[22:03:50] 	 [14:34]  yeah. i'm not going to spend time on this if the only person who wants it reviewed is MzMcbride.
[22:03:53] 	 [14:34]  it was closed for a whole year before he reopened it.
[22:04:02] 	 Don't be stupid, please.
[22:05:35] 	 if you can come up with a compelling reason why that extension needs review, i'm all for it.
[22:19:58] 	 jorm: There's community consensus for it to be enabled and it hasn't yet been reviewed?
[22:20:11] 	 FWIW, I'm not sure how you got dragged into it, though.
[22:20:15] 	 It doesn't seem like a design thing at all.
[22:20:29] 	 Well, I guess it depends how it's deployed.
[22:20:53] 	 But if it's just going to serve a limited number of account creation requests and not face the mass public, I don't think you'd care much about it.
[22:22:14] 	 well, maybe we should examine what's really going on.  it was never reviewed or deployed.  the toolserver peeps came up with their own solution, outside of the box.  if they no longer need this, then there's no rush whatsoever.
[22:22:29] 	 that bug is from 2008.
[22:22:39] 	 it's clearly not in a hurry to get anywhere.
[22:23:28] 	 Nobody indicated there was any rush.
[22:23:36] 	 I'm not sure why or how that's relevant.
[22:23:45] 	 There are much older and much newer bugs, yes. But the bug is still valid.
[22:23:50] 	 The Toolserver tool is a bad approach.
[22:25:26] 	 It looks like I wasn't involved in the discussion on-wiki that led to the bug being filed.
[22:25:36] 	 But I agree with comments there.