[00:04:30] has anyone seen our copy of JavaScript: The Good Parts lately? [00:04:33] our = WMF's [07:18:14] I still see no discussions at all about WikiLove on en.wiki and there's a huge village pump discussion on it.wiki instead. crazy. [07:18:45] Users on zh.wiki are also getting interested. [07:23:11] Most it.wiki users are whining quite a bit (with no reason, to date). [07:23:13] lol http://lists.wikimedia.org/pipermail/wikien-l/2011-June/109170.html [11:25:52] I am getting a 'call to undefined method Sanitizer::validateEmail()' although that method is defined in includes/Sanitizer.php [11:26:07] That's very strange [11:26:11] Are you using trunk now instead of 1.16? [11:27:05] yes [11:27:15] The code is here http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/SignupAPI/includes/SpecialUserSignup.php?view=log [11:27:17] line 337 [11:32:16] Are there any other errors or warnings in the Apache log before that error? [11:33:06] Also, could you var_dump($GLOBALS['wgAutoloadClasses']['Sanitizer']); right before the failing line? [11:39:06] I tried the var_dump() and now it says NULL & then the same error [11:55:40] That's strange [11:56:02] akshayagarwal: Could you var_dump($GLOBALS['wgAutoloadClasses']) then? [12:05:33] ok, I'll try that and be back in 20 mins [12:07:38] RoanKattouw, akshayagarwal: that method was moved to Sanitizer only recently, before that it was in User:: [12:08:08] He says he's running trunk [13:11:22] could anyone help me? [13:16:25] Just ask your question [13:16:36] If it's a MediaWiki support question, #mediawiki is a better channel for that [13:17:24] I asked in #mediawiki but they told me to come here [13:17:34] OK -- then what's your question?
[13:18:17] _jem_ and I have created a message: http://es.wikipedia.org/wiki/MediaWiki:Newpages-summary [13:18:58] it is going to be used in http://es.wikipedia.org/wiki/Especial:PáginasNuevas to explain to the users how to review new pages [13:19:54] we wanted to add the interlanguage links but we haven't been able to [13:20:00] Interlanguage links? [13:20:15] Oooh, I see [13:20:22] Yeah what you're trying to do won't work [13:20:54] You're adding [[en:Special:Newpages]] to the message and expecting "English" to show up in the sidebar on Especial:PáginasNuevas [13:20:58] But that won't work [13:21:09] Special pages can't have interlanguage links [13:21:17] http://es.wikipedia.org/wiki/Wikipedia:Interwikis in Spanish [13:22:05] <^demon> https://bugzilla.wikimedia.org/13489 [13:22:15] yes, that's what we thought, but we have seen this: http://es.wikipedia.org/wiki/Especial:CambiosRecientes [13:22:15] <^demon> ^ Request for just that. [13:23:12] I don't know why, but it has interlanguage links [13:24:11] Oh wow [13:24:16] I had no idea that was possible [13:27:10] I'll ask Sanbec (http://es.wikipedia.org/wiki/Usuario:Sanbec) how he did it if you want [13:28:11] like you can see here (http://es.wikipedia.org/w/index.php?title=MediaWiki:Recentchangestext&diff=prev&oldid=404248), he just added the interlanguage links and they appeared [13:28:27] I don't know how he did it [13:28:39] or what he did [13:29:09] Strange [13:30:01] there's a full list of worning/on-wroking at https://bugzilla.wikimedia.org/show_bug.cgi?id=13489#c2 [13:30:41] s/w.*?ing/working/g [13:31:17] on-working? [13:31:56] non-working [13:32:19] what failure of phrase [13:32:45] what do you think RoanKattouw? [13:33:11] I have no idea, I'm not familiar with this part of MW [13:38:13] ok RoanKattouw I'll investigate it [13:39:18] thanks [13:47:59] I figured out that my extension will work in MW 1.19 but not in 1.17 [13:48:21] should I develop against trunk, i.e. 1.19, or 1.17?
[13:49:34] trunk [13:51:19] ok :) [14:21:13] hexmode: guillom: call? [14:21:26] sumanah, mumble? x2003? [14:21:29] sumanah: which telephony tech are we going to try? [14:21:45] robla: guillom - I was hoping SIP [14:21:49] since that works for me [14:21:51] x2003 [14:22:07] alright, we'll give it a shot. is Claudia calling in? [14:22:17] Oh, and http://etherpad.wikimedia.org/TldrWeekly [14:22:48] oops...one sec [14:55:24] RoanKattouw: There're some problems on Prototype with WikiLove: when sending a message you get "Fatal error: Allowed memory size of 100000000 bytes exhausted" [14:55:53] File+line number? [14:56:32] Somewhere in the profiler, doesn't really tell you anything [14:56:36] You can try it out yourself: [14:56:39] http://prototype.wikimedia.org/release-en/User_talk:Georgewiki [14:57:01] Hit up Firebug when doing this and look at the network tab! [14:58:18] Perhaps the page size has something to do with it, trying to replicate it on localhost now. [14:58:31] Hmm this is not the latest code [14:58:35] {{PAGENAME}} still expands to API [14:58:47] It isn't the latest code, no. [14:59:05] But nothing in that logic should have changed [14:59:11] You can try it with the new code if you like, though. [14:59:25] Ah, and you also have the classic infinite spinner bug, resulting from a lack of error handling [14:59:46] Yes, will add proper error handling there, too. [15:00:49] On my installation it is fine, although the request takes ~30sec. [15:00:56] Long page is loooong.
[15:01:03] Ewww [15:01:06] data:text/html;base64,eyJyZWRpcmVjdCI6… [15:01:08] to [15:01:24] [several messages of raw base64 omitted; it decodes to the {"redirect":…} JSON response followed by an HTML-formatted "Fatal error: Allowed memory size of 100000000 bytes exhausted" backtrace from Profiler.php] [15:01:18] Uh, lol? [15:01:26] Whooooops [15:01:28] Sorry folks [15:01:29] Long data URL is long [15:01:32] :P [15:02:45] http://www.youtube.com/watch?v=POBqwC-9AvU [15:02:46] So my guess is that it has something to do with the page length. [15:02:51] data:text/html?
Is that live somewhere, RoanKattouw? [15:02:59] hexmode: Not in MW no [15:03:08] looks funky [15:03:09] This is what I meant: http://twitpic.com/5i5oaq [15:03:13] I'll add proper error handling code to everything, can you try to debug this stuff further? [15:03:21] Or rather http://twitpic.com/5i5oaq/full [15:03:43] In this particular case it would seem you're getting invalid JSON [15:04:00] Oh wait, the redirect data is also echoed! [15:04:06] Yes! [15:04:08] So it does arrive there [15:04:16] The OOM only happens after the fact, when writing profiling data [15:04:22] I'll hotfix this by disabling profiling [15:04:26] Ah. [15:04:30] We don't really need that on prototype [15:04:33] Right [15:04:41] Except when debugging crazy 1.17 parser issues last January [15:04:54] ^_^ [15:05:52] Profiling disabled [15:06:07] profiling seems to be quite a memory hog btw [15:06:19] Yes [15:06:26] I noticed the same on twn [15:07:04] catrope@prototype:/srv/org/wikimedia/prototype/wikis/rc-en$ date [15:07:05] Tue Jun 28 15:06:57 UTC 2011 [15:07:07] Well shit [15:07:12] NTP obviously isn't running on that box [15:07:59] Hah: {"servedby":"prototype.wikimedia.org","error":{"code":"unknownerror","info":"Unknown error: ``231''"}} [15:08:14] And again Fatal error: Allowed memory size of 100000000 bytes exhausted [15:08:17] In Profiler.php [15:08:35] Now, why an unknown error this time.. :\ [15:09:12] At least it gave the correct output before the stack trace last time. [15:10:07] Could you pastebin the contents of the response?
[15:23:17] RoanKattouw: twn is spewing: PHP Warning: filemtime(): stat failed for /www/w/resources/mediawiki.special/mediawiki.special.js in /www/w/includes/resourceloader/ResourceLoaderFileModule.php on line 369 [15:23:28] Oh ffs [15:23:46] Ah, I mistyped the path [15:23:50] *RoanKattouw moves the file instead [15:24:36] http://pastebin.com/GVb14evb [15:24:52] Basically the same as before [15:24:56] Oh, error 231 [15:25:03] I forget when that happens exactly, there's a bug report about it [15:25:54] It happens when "Article::doEdit() fails" [15:25:59] Probably due to a race condition or something [15:26:16] hah, Google thinks it is System Error: 231 - All pipe instances are busy [15:26:29] * @return Status object. Possible errors: [15:26:30] * edit-hook-aborted: The ArticleSave hook aborted the edit but didn't set the fatal flag of $status [15:26:32] * edit-gone-missing: In update mode, but the article didn't exist [15:26:34] * edit-conflict: In update mode, the article changed unexpectedly [15:26:35] * edit-no-change: Warning that the text was the same as before [15:26:36] * edit-already-exists: In creation mode, but the article already exists [15:26:39] Any of those cases may trigger error 231 [15:26:45] So it's most likely intermittent [15:26:55] If it's persistent, it would have to be edit-hook-aborted [15:27:32] I'll try a few times to make sure it's not a conflict. [15:27:46] Right [15:27:54] This time it is correct: {"redirect":{"pageName":"User_talk:Georgewiki","fragment":"A_kitten_for_you.21"}}
[15:28:03] But still with the same stack dump. [15:28:15] So the same as in the TwitPic that RoanKattouw just sent. [15:28:21] So it was probably a conflict. [15:28:51] Wait, what [15:28:57] You're still getting the profiler-related stack trace? [15:29:36] Yes. [15:29:40] Bah [15:31:30] Oh try *now* [15:31:43] I made StartProfiler.php unconditionally load ProfilerStub [15:32:22] *janpaul123 trying [15:32:44] Yes! [15:32:46] Works now. [15:33:12] Btw, can you do an svn up on WikiLove? I fixed the infinite spinner problem by adding some error messages. [15:33:37] Alright, so profiler seems to be a memory hog, but this is not really a solution [15:35:17] svn up done [15:35:21] It's also not a WikiLove problem [15:35:33] True. [15:35:41] And we don't really need profiling on prototype as far as I'm concerned [15:35:49] Alright :) [15:36:05] The cluster uses UDP profiling so it's unaffected [15:40:20] Sweet. [15:40:21] Thanks for helping out! :D [15:40:59] Sure [16:08:17] who should I poke to get https://bugzilla.wikimedia.org/show_bug.cgi?id=16036 resolved? [16:11:19] Delta: hexmode is our bug database overseer, or "bugmeister", and could help [16:11:43] Delta: looking [16:11:50] AryehGregor: Is there a maintenance script for fixing category counts? [16:12:18] RoanKattouw, IIRC, populateCategory.php should do it. [16:13:27] Delta: could you update the bug with why you need it? And, if Ryan (last comment) is right that it should be renamed, please rename it [16:13:58] RoanKattouw: tyvm for your comments on that bug in the etherpad. [16:15:05] AryehGregor: OK so if I run that with --force it'll recompute all those counts [16:15:07] ? [16:15:30] RoanKattouw, IIRC, yes. You might want to glance over the code first to make sure you know what it's doing. [16:15:41] I don't remember the details of how it works, like how it checks for slave lag or what batch sizes it uses or anything. [16:15:52] But I did design it to work for both initial population and rebuilding.
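For reference, the invocation being discussed, populateCategory.php with --force, would be run roughly like this. This is an editor's sketch: the install path is a placeholder, and, as AryehGregor says above, the batching and slave-lag details aren't recalled in the chat, so glance over the script first.

```shell
# Recompute category membership counts for every category row.
# --force makes the script redo counts that already exist (per the
# discussion above); check the script/--help before trusting this
# on a large wiki.
cd /srv/mediawiki          # hypothetical install path
php maintenance/populateCategory.php --force
```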
[16:16:12] Yeah it just calls refreshCounts() on a Category object constructed from each category row [16:24:47] hexmode: done [16:25:33] :) [16:58:50] hello [17:19:13] alolita: AFT scrum? No one in x2003 or x2004 [17:19:40] hi Roan - let's chat right here since Howie is not in [17:19:48] are you available to chat now [17:19:54] (Well there is someone in x2004, actually, with a huge echo, but they're not responding) [17:20:02] Of course, it's a scheduled meeting [17:20:10] ok cool - [17:20:21] is krinkle around to join [17:20:27] He's around [17:20:31] Krinkle: Ping, AFT scrum [17:20:34] (Here on IRC) [17:21:16] :) [17:21:47] hi krinkle [17:22:03] krinkle: did you get the final list of tooltips from howie [17:22:12] he should have sent it to you yesterday evening [17:22:30] tooltips for aft [17:22:59] I know he was supposed to send them but I.. there's so much e-mail traffic I can barely read it, respond and work within the 2 hours per day. [17:23:12] I haven't seen it yet, but he.. yeah I see it. [17:23:35] roan: once krinkle is done with the aft tooltips by tomorrow, then you are on the hook to cr / deploy to prototype tomorrow and production on thursday [17:23:46] krinkle: i can send you the tooltips right now [17:24:04] Yeah I was gonna start CRing in a bit [17:24:18] roan: thanks; [17:24:51] krinkle: is your laptop working ok now [17:24:55] Yes [17:25:02] krinkle: awesome [17:25:26] okay, I thought I did but it's a different mail from howie [17:25:31] I don't have any tooltips, I think [17:25:46] After AFT and WikiLove CR I'll start working on the UDP logger thing [17:26:02] krinkle: let me send these to you right now :-) On wikilove - are you syncing up with jan paul; is he doing most of the functionality? or are you helping too? [17:26:19] Roan: yes that's a great plan :-) we're waiting for your time on the udp logger [17:26:20] there was a lot of wikilove traffic overnight. [17:27:05] jorm: Off topic: I read some of your recent blog posts.
Loved the uncle-with-the-gun story. I also would not have guessed you had Scandinavian blood [17:27:21] i don't. [17:27:28] Well, I have re-reviewed wikilove last weekend to check for code quality and verify the previous review's suggestions were implemented. [17:27:29] there's another side to that story. [17:27:30] Looks good [17:27:31] Your grandfather's name was Ny-something [17:27:39] both of my grandparents there were orphans. [17:27:43] A clearly Scandinavian name and he's from Minnesota too [17:27:45] Ooooh.. [17:27:49] so they were adopted by norwegians, but they're likely french. [17:27:54] Right [17:28:07] i approved your comment, btw. [17:28:13] now you can comment all you want! [17:28:17] "I don't mean to be a dick, but -- " :P [17:28:23] krinkle: i am sending the tooltips to you now [17:28:33] i'm totally unsure as to how you could ever be a dick, dude. [17:28:45] Krinkle: Cool. Any changes you didn't or couldn't review, anything outstanding, ... ? [17:28:57] jorm: You haven't met the 9yo me, he was a dick [17:29:08] Or could be when he got angry [17:29:09] aww [17:29:48] RoanKattouw: I don't think so, lemme verify real quick [17:30:24] Cause if the WL changes relative to what I deployed to officewiki earlier are all reviewed already, that'd be great and save me work [17:30:38] timo: still waiting for your passport info [17:31:17] s/info// I think [17:31:48] ok going to epm meeting now [17:31:55] tty in 30 minutes [17:32:08] alolita: I am receiving my passport on Thursday according to the city town hall thingy [17:32:52] Looks like nearly every WikiLove bug I filed got fixed. [17:33:02] Does that surprise you? [17:33:23] RoanKattouw: I haven't reviewed php/api code tho. I could for a part, but just didn't yet. Can you look at that ? [17:33:28] Sure [17:33:35] (there's a few mixed revs too) [17:33:38] You've OKed all the revs you did cover?
[17:34:02] The mixed ones I'll just review the PHP/API parts and assume you've done the JS [17:34:07] jorm: It was a pleasant surprise. :-) [17:37:08] hey akshay [17:40:45] Hi jorm! [17:40:58] Wassup? [17:41:27] not much; trying to catch up with my email. [17:41:43] i saw that ApiSignup is in the svn repository now. [17:41:53] did you get your commit access stuff straightened out? [17:44:43] yes, I secured commit access and did my first commit too :) [17:46:42] the extension works with the trunk version, i.e. 1.19, but not 1.17 [17:47:07] excellent. [17:47:21] i'm going afk for a sec. be back in a few. [17:47:33] ok jorm, I'll be here [17:58:33] okay. back. [18:02:34] alolita: Ping, features meeting [18:02:43] roan: walking over now [18:06:00] Krinkle: Features meeting, ping [18:07:32] Krinkle: So there's some JS revs that you don't seem to have reviewed yet, I'm tagging those with the 'krinkle' tag [18:07:41] good [18:09:46] jorm: I have tested the extension, it's working smoothly; I'm now testing the API [18:10:20] jorm: it will be done in a day [18:10:30] jorm: what should we focus on next? [18:13:50] Krinkle: https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Special:Code/MediaWiki/tag/krinkle [18:14:05] brion: alrighty, green again :) http://toolserver.org/~krinkle/testswarm/user/MediaWiki/ [18:14:10] Krinkle: One of them has an API change. I looked at that API change and it's fine [18:14:36] whee [18:20:37] !e ArticleFeedback [18:20:37] --elephant-- http://www.mediawiki.org/wiki/Extension:ArticleFeedback [18:21:12] !e WikiLove [18:21:12] --elephant-- http://www.mediawiki.org/wiki/Extension:WikiLove [18:21:14] !e MoodBar [18:21:14] --elephant-- http://www.mediawiki.org/wiki/Extension:MoodBar [18:38:14] JeroenDeDauw: So has the WMDE survey thingy changed since the last time I reviewed it?
[18:39:29] RoanKattouw: no [18:39:50] Ok, cool [18:39:57] If it does change, let me know [18:40:37] The day that thing will be enabled (Jul 6) is a day I'm mostly gonna spend on airplanes so I wanted to make sure everything is in order [18:40:41] RoanKattouw: no reason why it would change, it does what it should do. And I'll be on holiday starting tomorrow, so won't even be able to make any changes to it :) [18:40:49] OK cool :P [18:41:01] Right, no one is planning to make changes. [18:41:08] OK cool [18:42:17] janpaul123: just checked out a fresh trunk + wikilove and globally enabled it. When I open up the dialog box the section below "Select type" is empty. [18:42:22] known ? [18:42:27] (Safari) [18:42:39] Hm, shouldn't be the case! [18:42:41] Let's see. [18:43:06] robla: ?? [18:43:23] http://i.imgur.com/faryS.png [18:43:48] JS errors? [18:43:52] Nope, [18:44:00] debug=true, false or trsue, none work [18:44:07] debug=false gives $.wikiLove undefined [18:44:22] was wondering what trsue was [18:44:30] what function can I use to post the request parameters to api.php [18:45:06] in js? [18:45:11] Krinkle: Really, is that the latest version? [18:45:12] in php [18:45:15] janpaul123: jep [18:45:20] *robla catches up [18:45:35] akshayagarwal: You want to do a PHP-to-PHP API post? Same wiki or cross-wiki? And why? [18:45:44] akshayagarwal, whereabouts in PHP? In MW / your extension? Or another script? [18:45:52] MW/your [18:45:54] Krinkle: Do you have a MediaWiki:WikiLove.js page? [18:45:56] hexmode: what did you want me for? [18:46:00] janpaul123: not anymore [18:46:09] Actually deleted it? [18:46:12] in that, it's blank [18:46:16] Ah [18:46:17] Yeah [18:46:19] That's a problem [18:46:20] robla: just checking that you got my emails ;) [18:46:23] You should actually delete it. [18:46:30] RoanKattouw, Reedy: i have to test my APISignup [18:46:44] Hmm, maybe you can look at the existing structure for API tests?
[18:46:48] Krinkle: it's a workaround for https://bugzilla.wikimedia.org/29608 [18:46:53] janpaul123: Ah, yeah. [18:46:53] I think Reedy has messed with those before [18:47:03] probably wanted something else, but emails work... [18:47:06] I've written like one MW test in my life *shame* so I'm not much use there [18:47:09] janpaul123: right [18:47:17] Barely [18:48:38] I saw that the existing APIs could be tested by doing api.php?action=foo&param1=bar [18:48:49] janpaul123: looks good now [18:48:53] but that doesn't work for POST-only APIs [18:48:54] yeah, for GET only requests [18:48:54] Sweet [18:49:09] Perhaps we should post a note on mw:Extension:WikiLove.. ;) [18:49:34] akshayagarwal: If you just want to do empirical testing, hack out the POST requirement [18:49:38] Or use $.post() in a Firebug console [18:50:26] janpaul123: actually, it could be worked around by calling mw.loader.load [18:50:41] from within the module [18:50:49] Remember that that's async [18:50:58] mw.loader.using [18:51:00] :) [18:51:25] does take 1 or 2 more HTTP requests, but perhaps better for now. [18:51:33] I actually built a form, populated it with the parameters and posted the form to APISignup but seems that it needs an entry point [18:51:44] api.php is the entry point [18:51:59] ya but how can I use it? [18:52:11] sumanah: I have no idea if this makes any sense: http://etherpad.wikimedia.org/aIXzrM3YhA [18:52:12] do a POST against that url [18:52:15] You can post the form to api.php and include a hidden 'action' field that you set appropriately (to 'usersignup' I guess, or whatever you used as a key in $wgAPIModules) [18:53:11] p [18:53:28] RoanKattouw: sounds good, so it can't be tested independently as an extension?
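The $.post()-from-a-Firebug-console approach Roan suggests above might look like this. This is a sketch: the 'usersignup' module key follows Roan's guess, and the field names are hypothetical placeholders, not the extension's actual parameters.

```javascript
// Poking a POST-only API module from a browser console.
// 'usersignup' is the key assumed to be registered in $wgAPIModules;
// the name/email fields are invented for illustration.
var params = {
    action: 'usersignup',
    format: 'json',
    name: 'SomeUser',
    email: 'user@example.com'
};

// With jQuery loaded (any MediaWiki page), this would send the POST:
//   $.post( '/w/api.php', params, function ( data ) { console.log( data ); } );

// The same parameters as an application/x-www-form-urlencoded body,
// e.g. for curl --data:
var body = Object.keys( params ).map( function ( k ) {
    return encodeURIComponent( k ) + '=' + encodeURIComponent( params[ k ] );
} ).join( '&' );
console.log( body );
```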
[18:53:48] *sumanah looks - thanks Krinkle [18:53:52] If you mean automated tests, I'm not familiar enough with our testing framework to answer that [18:54:15] If you mean you personally sitting there and pushing buttons, that's totally trivial [18:54:55] And like I said for quick at-home testing or debugging you can just drop the POST restriction temporarily and you'll be able to do GET requests to api.php?action=usersignup&parameter=should&go=here [18:55:00] *parameters [18:55:29] RoanKattouw: ok, I'll temporarily drop the strict POST requirement [18:55:59] Don't commit that or use it for automated tests, it's strictly for I-just-wrote-this-and-I-just-wanna-quickly-see-if-it-works testing [18:56:27] RoanKattouw: I won't :) [18:56:31] OK cool [18:57:11] could you give me steps to integrate my API in the existing API structure [18:57:21] I mean how do I make MW know that it exists [18:57:37] do I need to add it to some $wg var? [18:57:40] $wgAPIModules['signup'] = 'NameOfYourAPIClass'; [18:58:04] in the extension setup file [19:00:30] Reedy already did that in the first commit :) [19:01:08] yeah, it should just work [19:01:13] Note, the api help page is cached [19:01:32] hmm.. I'm using private browsing ;) [19:01:42] it's cached server side [19:02:20] oh, so how can I purge it? [19:02:21] Add "$wgAPICacheHelpTimeout = 0;" to your LocalSettings [20:08:12] Reedy, RoanKattouw: it worked! [20:09:26] if there is more than one error then my API outputs only the first error, is it ok or should it output all the errors? [20:10:15] All API modules work that way [20:13:04] ok, one more thing, there are a few $wgOut statements before a success state is returned to the API, this is needed when the signup request is handled only via SpecialUserSignup, is this ok or should I try to tweak the flow?
[20:13:42] There should not be any $wgOut calls in the backend [20:14:08] There should be a backend that just does backend-y stuff, then one frontend that uses $wgOut to communicate stuff through the UI and an API frontend [20:15:18] could you tell me in terms of file names? [20:15:43] OK so *ideally* (and this is not what we have right now) [20:16:15] You would have one file that only has functions that do logins and return status codes or Status objects but don't actually do anything with the presentation of the error/success, they just tell you it occurred [20:16:39] Then you would have SpecialUserSignup.php which calls those functions, gets a success/error back, and presents it to the user [20:16:59] And ApiSignup.php which ... basically does the same, but presents its results differently (as an API response) [20:17:15] Currently the first two (backend and SpecialUserSignup) are in the same class, which is not ideal but it'll do [20:17:22] As long as their code is still somewhat separated [20:17:56] As long as there's something that just does logins and doesn't contain anything that's specific to either the UI ($wgOut) or the API [20:20:10] akshayagarwal: Does this make sense to you? [20:22:00] RoanKattouw: I'm reading it again and again [20:22:06] trying to grasp [20:22:19] Have you ever heard of MVC (Model View Controller)? [20:22:37] ya, I sensed some similarity [20:23:04] We don't follow that strictly but we try to separate the view from the rest [20:23:12] I'll write up a quick pastebin to illustrate [20:24:53] so in my file I have a processSignup() which does the processing and returns error codes and then mainSignupForm() handles the display part [20:26:15] is it somewhat like what you said? [20:27:27] Yes [20:27:44] http://pastebin.com/U40pqikP was my illustration.
Sadly the syntax highlighting broke because I used an apostrophe [20:28:08] But yeah in your case, processSignup() should not contain *any* display logic, that should all be in mainSignupForm() [20:28:18] So using $wgOut in processSignup() is a red flag [20:29:38] hmm... I remember you mentioned a similar point when I started, so I took care of it :) [20:30:21] OK [20:32:28] the problem is with the 'success' case, before the success case is returned by processSignup() a function addNewAccount() is called which does some $wgOut [20:32:50] That's bad [20:32:56] That's a smell in addNewAccount() [20:33:21] You'll have to clean that up; I had to do that a lot when I wrote the API move/delete/protect/... modules [20:33:40] Using status codes or, preferably, a Status object [20:34:54] a possible alternative I thought of was to remove addNewAccount() and move its code to the SUCCESS case in mainSignupForm() [20:36:25] actually addNewAccount() doesn't do much besides setting a language preference option, sending a confirmation mail & displaying its result, & running the hooks for post account creation [20:36:31] Right [20:36:38] Well the preference should be set for API account creations too [20:36:40] Same for the hook [20:37:07] hmm... true [20:38:07] so for all the conditional stuff displayed from addNewAccount() we can set a Status object and then test it in mainSignupForm() and output suitably [20:39:27] Well the beauty of the Status object is [20:39:36] You've got the message key and parameters for the error in there [20:39:57] So you can pass that straight to some wgOut function (I don't remember which) and that function will take it from there, IIRC [20:41:36] gr8! I'll check it out [20:52:50] RoanKattouw: any links for more info on Status objects or any examples?
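Roan's pastebin illustration is mangled, but the split he describes above might look roughly like this. This is an editor's sketch, not the SignupAPI code: SignupBackend is an invented name (the real backend currently lives inside SpecialUserSignup), and the message key and API result shape are guesses; only the three-part layout and the Status pattern come from the discussion.

```php
<?php
// Sketch: a backend that reports results via Status, with no $wgOut
// and nothing API-specific, plus two thin frontends.

class SignupBackend { // hypothetical name
	/**
	 * Do the signup and report what happened; no presentation here.
	 * @return Status
	 */
	public static function processSignup( $name, $email ) {
		if ( !Sanitizer::validateEmail( $email ) ) {
			return Status::newFatal( 'invalidemailaddress' ); // message key guessed
		}
		// ... create the account, set the language preference, run the
		// post-account-creation hooks (needed for API signups too) ...
		return Status::newGood();
	}
}

class SpecialUserSignup extends SpecialPage {
	public function execute( $par ) {
		global $wgOut;
		$status = SignupBackend::processSignup( /* form fields */ );
		// UI frontend: the only place that talks to $wgOut. The Status
		// carries the message key and parameters, so it can be handed
		// straight to output code.
		$wgOut->addWikiText( $status->getWikiText() );
	}
}

class ApiSignup extends ApiBase {
	public function execute() {
		$status = SignupBackend::processSignup( /* request params */ );
		if ( !$status->isOK() ) {
			$this->dieUsage( 'Signup failed', 'signupfailed' ); // error code guessed
		}
		$this->getResult()->addValue( null, 'signup', array( 'result' => 'Success' ) );
	}
}
```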
[20:53:11] I'm not too familiar with it myself [20:53:25] grep the code for 'Status::' and 'new Status' [20:59:11] <^demon> Mostly Status::newFatal(), Status::newWarning() and Status::newGood() [21:00:26] ^demon: yes, & in some places I see $result=foo() & then if($result->isGood()) [21:01:18] <^demon> Right. Statuses can be in one of three states. [21:02:08] <^demon> That's what isGood() [did execution complete, and warning-free] and isOK() [completed, but may have warnings] tell you. [21:02:56] <^demon> I use them all over the place in phase3/includes/installer/ if you're looking for examples. [21:04:47] ^demon: they live up to their names! [21:05:21] <^demon> :) I like them [21:26:33] woof [22:53:52] 7 minutes till IRC triage time [22:53:55] w00! [22:54:49] woof [22:54:58] hexmode: turns out I can actually make it to the triage meeting! [22:54:58] hexmode: I can cross channel spam a bit too if you like ;) [22:55:20] link to the Etherpad again? [22:55:23] hexmode: you might like to re-post to foundation-l saying meeting starting in 5 minutes as a reminder [22:55:51] sumanah: http://etherpad.wikimedia.org/BugTriage-2011-06 [22:55:59] Thehelpfulone: will do [22:56:41] RoanKattouw: could you please link my svn account to my wiki account? [22:57:11] akshayagarwal: Sure. What are the account names? [22:57:45] SVN-> akshay@svn.wikimedia.org Wiki-> User:Akshay.agarwal [23:00:15] Let's get it started! [23:00:17] Done [23:00:54] first thing: We need people to review the items from line 65 down [23:01:03] Oh damn [23:01:12] to confirm or not that they should be blockers [23:01:13] hexmode: the 1.18 deployment blockers?
[23:01:13] I said I wouldn't stay up for this meeting but look at the time [23:01:17] for 1.18 [23:01:22] RoanKattouw: hah [23:01:26] sumanah: yes [23:02:04] next, and I kind of don't know why this isn't what we do every week [23:02:05] RoanKattouw: ty [23:02:22] *sumanah looks at http://bugzilla.wikimedia.org/14890 Make image views statistics available through wikistats [23:02:24] let's focus on the things that are current issues [23:02:38] RoanKattouw: hah [23:02:45] https://bugzilla.wikimedia.org/29552 - Edited page is not showing the most recent edits to anyone not logged into wikipedia [23:03:04] hold on, hexmode - I thought you said that first we were looking at 1.18 blockers. [23:03:09] so can we address those first? [23:03:16] Pff Nikerabbit is even guiltier, he's ahead of me :) [23:03:30] To me, this does not look like a blocker: http://bugzilla.wikimedia.org/14890 Make image views statistics available through wikistats [23:03:52] sumanah: I was just saying that people should be looking at them and making comments in line, but let's start there, sure [23:04:22] <^demon> I'm not sure how 14890 is a blocker to 1.18 at all. [23:04:57] indeed [23:05:02] it's completely irrelevant [23:05:02] That sounds completely frivolous [23:05:16] ^demon: it was the fault of my clone, the one that decided all "high" bugs are 1.18 blockers [23:05:31] <^demon> Clones are dangerous :D [23:05:42] any others besides those that are noted [23:05:43] ?
[23:05:50] I'm also thinking that https://bugzilla.wikimedia.org/show_bug.cgi?id=28898 "Set up notification for when/if Google's safe browsing spots something on wiki" is not a blocker [23:06:06] and it is done already now anyway [23:06:11] (just today) [23:06:18] indeed [23:06:26] although we need mail aliases set up [23:06:50] 14890 also looks like one we set to "low" in a triage, then someone else set to "high" [23:06:56] http://bugzilla.wikimedia.org/28223 iPhone Native Crash with UTF-8 is completely separate [23:07:31] 14890 is also not a shell bug [23:07:49] Not in the traditional sense no [23:07:50] It's an ops issue [23:08:06] I'm also wondering whether https://bugzilla.wikimedia.org/show_bug.cgi?id=28857 is really "high"/a deployment blocker [23:08:17] Bug 28857 - Sometimes there's "undefined" in a Resource loader CSS request [23:08:30] I'm not sure that one can be reproduced any more [23:08:37] RoanKattouw: someone needs to modify the filters running on locke, I think. That requires C coding [23:08:47] And I am *extremely* skeptical that a server using PHP code could return 'undefined' [23:09:06] it does seem ... odd [23:09:23] <^demon> If https://bugzilla.wikimedia.org/show_bug.cgi?id=29277 just needs a shell fix like the etherpad says, it should be trivial to fix. [23:09:39] RoanKattouw: fyi: no load.php + undefined in twn logs :o [23:10:16] ^demon: don't you have shell ;) [23:10:18] ? [23:10:18] <^demon> Actually, scriptDirUrl is already set on wmf wikis. [23:10:22] <^demon> Hmm. [23:11:03] <^demon> Ah, I see what's going on. A wiki doesn't have any way of knowing that it's acting as a foreign repo. [23:11:09] <^demon> Thus it doesn't know to load Filepage.css [23:11:23] ok, so, hexmode, it sounds like you'll mark a few as "investigate whether this is actually a blocker" and we will move on? 
[23:11:45] <^demon> (basically: Commons doesn't know it's Commons) [23:12:04] sumanah: yes, lets move on to the bug ^demon is talking about ;) [23:12:10] $wgAmICommons [23:12:39] ^demon: how does it not know? [23:12:52] <^demon> Well Commons doesn't know anyone else uses it as a foreign repo. [23:12:58] <^demon> There's no configuration required on the parent repo side. [23:13:18] <^demon> All the other wikis know they use commons, but as far as commons is concerned it's just a single wiki with a bazillion uploads. [23:13:38] so the parent doesn't understand what it should do? could you update the ticket ^demon ?? [23:13:41] Why don't we make all wikis load their own Filepage.css ? [23:13:50] <^demon> I was just about to suggest that. [23:13:53] Or at least for local files [23:13:56] <^demon> All wikis load the local FilePage.css [23:13:58] That would be even better [23:14:05] Load the Filepage.css of the source wiki of the image [23:14:11] <^demon> Well it does that. [23:14:20] I was gonna suggest loading with local and remote but the source wiki makes more sense [23:14:22] <^demon> It loads the filepage of the remote file, just not the local one [23:14:46] is thedj around? [23:14:48] For remote files it should load the remote CSS and that only [23:14:56] ^demon: do you understand this well enough to take a shot at fixing it? [23:15:01] For local files it should load the local file page CSS -- is this currently lacking? [23:15:02] robla: he couldn't make it [23:15:09] he'll be here next week [23:15:14] <^demon> Go ahead and assign it to me. [23:15:18] k [23:15:20] how important is this? [23:15:28] I can't see a major consequence of this problem [23:15:30] <^demon> Annoying for Commons. [23:15:34] <^demon> Not a blocker for anything else. [23:15:42] low, then? 
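[Editor's note: for context on the "Commons doesn't know it's Commons" point above, the foreign-repo relationship is configured entirely on the client wiki's side, e.g. in LocalSettings.php. A minimal sketch; the URL shown is the standard Commons API endpoint, but treat the exact array keys as illustrative.]

```php
// Client-side only: the wiki consuming Commons declares the relationship.
// Commons itself carries no configuration saying "others use me as a repo".
$wgForeignFileRepos[] = array(
	'class'            => 'ForeignAPIRepo',
	'name'             => 'commonswiki',
	'apibase'          => 'https://commons.wikimedia.org/w/api.php',
	'fetchDescription' => true, // pull the remote file description page
);
// Or, for the common case, simply: $wgUseInstantCommons = true;
```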
[23:15:48] <^demon> Low to normal, I'd say [23:15:52] k [23:16:00] next: https://bugzilla.wikimedia.org/29552 - Edited page is not showing the most recent edits to anyone not logged into wikipedia [23:17:16] cache issue w/ redirect pages? [23:17:25] bawolff is on it ? [23:18:01] I think it was an issue with squids not being purged properly [23:18:02] priority is ok or is the cache issue more important? [23:19:01] beuller? is everyone reading or should we move on? [23:19:11] still grokking [23:19:24] k... [23:20:00] that seems like a high priority issue to me [23:20:34] and who will take this on? bawolff found it but I don't think he can test it, can you? [23:20:46] Not really [23:20:59] I suppose I could apt-get install squid [23:21:39] I can assign you and you can say "oops!" if it turns out you can't manage it [23:21:40] I would patch the code to log the sent purges [23:22:44] yeah, that makes sense [23:23:00] k, anyone else? I'm gonna assign to bawolff until he can't do it. [23:23:05] :) [23:23:43] next: https://bugzilla.wikimedia.org/29518 - Links are not marked existing when importing [23:24:46] tats a dupe [23:24:49] *thats [23:24:53] dupe of? [23:24:56] another bug [23:25:06] :P [23:25:06] let me find it [23:25:10] o rly :) [23:25:17] Its a dupe of another bug filed by the same guy [23:25:31] haha [23:25:31] Win [23:25:46] !bug 29585 [23:25:46] --elephant-- https://bugzilla.wikimedia.org/show_bug.cgi?id=29585 [23:26:04] did he forget? [23:26:06] so he re-reported it against 1.17 [23:26:16] which is fair enough when hexmode just closed it for being 1.15 [23:26:22] maybe he thought they were different issues [23:27:01] maybe [23:27:29] I wasn't able to reproduce it on 1.17. Edit.php works fine for me [23:27:43] mark it "worksforme" and move on? 
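[Editor's note: the "patch the code to log the sent purges" idea from the squid discussion above could look roughly like this. A hypothetical debugging patch, not an actual commit; it assumes MediaWiki's wfDebugLog() and a purge method that receives the URL list, with the log group name made up.]

```php
// Hypothetical instrumentation for the squid purge path: log every URL
// we claim to purge, so missing purges can be spotted in the log group.
public static function purge( array $urlArr ) {
	foreach ( $urlArr as $url ) {
		wfDebugLog( 'squidpurge', 'Sending purge for ' . $url );
	}
	// ... existing HTCP/HTTP purge logic continues here ...
}
```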
[23:27:52] sounds good [23:27:57] moving on [23:28:12] He did provide a link though, so I can tell its happening to him [23:28:43] bawolff: I'll let you follow up if you like [23:28:51] seems mostly worksforme, though [23:28:51] perhaps we can keep it open at low priority then [23:29:01] I have no idea why its happening to him though [23:29:08] we still need a repro if we're going to do anything with it [23:29:27] yep... [23:29:29] moving on [23:29:30] next: https://bugzilla.wikimedia.org/29021 - Anonymous users can edit page protected with [edit=autoconfirmed] [23:29:30] for all we know, there's some wacky mod to that wiki that's causing the problem [23:29:49] have to check special:version on his wiki then [23:30:41] bawolff: you were following up so if he comes back w/ something or you want to test the extensions he has, go for it [23:31:10] Hmm, he is using maintenianceshell extension, thats kind of funky [23:31:36] yeah, maybe i'll try to see if that can cause it to reproduce. I'm kind of curious as to what the cause is [23:32:00] :) [23:32:05] maintenance 's hell? [23:32:18] sounds like it [23:32:26] fwiw, I can't repro on the page referenced [23:32:57] anyone looking at 29021 yet? [23:33:02] He null edited that page, so its not showing it anymore [23:33:41] hexmode: that's the one I'm talking about. I can't repro [23:33:46] ah [23:33:59] hexmode, maybe the page was deleted and undeleted? [23:34:18] that removes the protection [23:34:32] hrm.... does it have it now? [23:34:45] it appears to from the history [23:35:12] and, is this hewiki-specific? [23:35:12] I'm reading a Hebrew->English translation, though, so I don't know [23:35:17] heh [23:36:16] Platonides: wouldn't delete/undelete be in the history? 
[23:36:26] No [23:36:29] But it would be in Special:Log [23:37:55] so, note to check Special:Log [23:38:04] moving on [23:38:16] https://bugzilla.wikimedia.org/27478 - Enable $wgHtml5 on Wikimedia wikis [23:38:20] hi folks [23:38:40] anyone know why http://nyc.wikimedia.org/wiki/Special:RecentChanges redirects to special:recentchanges on metawiki? [23:38:41] just pick a list of small wikis for HTML5 and then someone with shell? [23:39:16] aude: ask in #wikimedia-operations? [23:39:24] Someone did ask earlier [23:40:16] hexmode: "This is blocking some work for a GSOC project to improve the article assessment system on enwiki." whose GSoC project? [23:40:29] o_0 [23:40:32] sumanah: the commenter, I think? [23:40:38] link? [23:40:39] seems weird, but ok [23:40:43] there's no point enabling $wgHtml5 on small wikis [23:40:48] Reedy: https://bugzilla.wikimedia.org/show_bug.cgi?id=27478#c15 [23:41:01] it's the default so it's already been tested on small wikis, just not on wikimedia [23:41:11] and it shouldn't affect performance [23:41:11] Wasn't it some of the RTL wikis that had the issue? [23:41:29] TimStarling: so it is already on *.wikipedia.org? [23:41:44] Other wikis [23:41:45] !WMF [23:41:45] --elephant-- I don't know anything about "wmf". You might try: !1.17wmf1 [23:41:56] Aryeh says it's the default for non-wikimedia wikis, but not for wikimedia wikis [23:42:07] That's right [23:42:15] Anyone that's gonna roll that out should talk to him [23:42:25] sumanah, Yuvipanda's project is article assessment (for wikiprojects), but I've not seen him say it's an issue... [23:42:28] I might do it in mid-July if no one else beats me to it, and if I have time [23:42:41] heh [23:42:47] We know we'll have some screenscraping scripts complaining again [23:42:49] awjr: do you know anything about https://bugzilla.wikimedia.org/show_bug.cgi?id=27478#c15 [23:42:50] ? 
[23:42:53] so, RoanKattouw, my concern is that someone will complain [23:42:58] and you won't have time [23:43:05] People always will complain [23:43:07] and we'll just back it back out [23:43:12] This happened before [23:43:21] Probably makes sense to dedicate a geneng person for this for that reason [23:43:30] *robla ducks [23:43:32] robla: ?? [23:43:36] heh [23:43:47] Someone for whom this stuff is their FT job rather than 20% time [23:43:52] We just need to look at fixing what breaks [23:44:09] sumanah: yes [23:44:14] do we need an ops person or just someone with shell? [23:44:19] shell [23:44:32] and who has shell and time? [23:44:43] Shell is enough [23:44:44] It's easily enabled and disabled, we've just got to collect and followup on feedback [23:44:50] And we need to communicate [23:44:54] Reedy: can you be the followup person? [23:44:59] should we announce? [23:45:00] But they need to have time to deal with complaints, be in touch with Aryeh, and have some understanding of the issues [23:45:04] So we'd need to ask Guillom nicely probably for a sitenotice [23:45:19] The latter is of lower importance because Aryeh and Tim are the only ones that I believe really understand this [23:46:05] you know Aryeh is leaving soon [23:46:06] I suspect there are others that can chip in...we just need to power though [23:46:14] "leaving"? [23:46:15] but if, say, Reedy could take point on it he could consult Tim+Aryeh [23:46:38] leaving the internet [23:46:42] robla: and a decision to do it [23:46:53] the email I have here says "this summer", not sure when that is exactly [23:47:01] Reedy: starting in a few weeks, AryehGregor won't have time for MediaWiki development anymore, for a few years [23:47:10] Leaving the internet? 
That sounds extreme [23:47:56] Although he's helpful when questioned on IRC/mailing lists, he's not really done any development for a while since the category collation stuff [23:47:59] he's moving to Israel and joining a yeshiva [23:48:07] which I gather is the jewish equivalent of a monastery [23:48:16] so, do I need to do anything here besides give it to Reedy to handle? [23:48:27] a hermitage ;) [23:48:36] 27478 is one of those somewhat annoying code correctness issues. we know there's lots of little frustration points, and there might even be some big problems that are created by it [23:48:49] it's hard to quantify the impact, though [23:48:52] hello, could you consider to fix bug https://bugzilla.wikimedia.org/show_bug.cgi?id=29170 in mw 1.18, since enotif function is enabled for all Wikimedia projects, this function is very popular and it would be nice to receive e-mails in proper gender [23:48:53] Right, so we should probably get this HTML5 thing down real soon now so we'll still have Aryeh to guide us through it [23:49:17] mmm [23:49:25] Twinkle etc haven't been fully rewritten yet have they? [23:49:33] ok... make "highest" priority for Reedy ? [23:49:46] think they said twinkle was , though [23:49:55] I know work was underway [23:49:56] dbl checking [23:50:01] Don't know the state of play though [23:50:06] Reedy: TimStarling: do you agree with hexmode? [23:50:25] yes [23:50:55] k...let's do it, then [23:51:00] moving on [23:51:12] https://bugzilla.wikimedia.org/29495 India-style commas [23:51:31] is this a TimStarling/Brion level task [23:51:32] ? [23:51:42] or is it something a dullard like me could do? [23:51:47] ;) [23:52:01] hexmode: i bet you could do it :) [23:52:02] Do we have number formatting per language now [23:52:04] ? 
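[Editor's note: the HTML5 switch under discussion is a one-line configuration change, already the default for non-Wikimedia installs as noted above; flipping it for Wikimedia would go in the site configuration. The file placement shown is illustrative.]

```php
// LocalSettings.php (or the WMF CommonSettings equivalent):
// emit an HTML5 doctype and markup instead of XHTML 1.0 Transitional.
$wgHtml5 = true;
```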
[23:52:12] my comment 5 explains a way of setting it up that should work [23:52:28] h [23:52:29] so you'd basically need to modify Language::commafy() [23:52:29] commafy [23:52:45] throw in a switch() that checks the available settings [23:52:47] any takers? [23:52:50] and then have the four implementations [23:52:59] hexmode, see languages/classes/LanguageBe_tarask.php for one example of an override [23:53:02] sounds like a great newcomer task [23:53:07] (none is easy, 1k and 10k are the existing variants) [23:53:07] the hindi wiki has been getting a bit more attention ;) [23:53:19] (indic would be the new mode; probably not super hard regex work) [23:53:26] yay india [23:53:33] yeah, better to do it in the base class [23:53:35] hexmode: add it to Annoying Little Bugs? [23:53:38] sumanah: want to find a newbie dev? [23:53:40] yes [23:53:43] and then some basic cleanup on the bits that have a hardcoded commafy() now [23:54:00] hexmode: ok, go ahead & add it then [23:54:03] moving on [23:54:04] https://bugzilla.wikimedia.org/28980 - RTL versions of HTML tag names and xmlish tag hooks [23:54:31] how hard is that one (as we increase our RTL support)? [23:54:36] that's more of a feature request [23:54:45] (we have about 6 minutes left in this bug triage meeting) [23:54:49] sure, granted [23:54:50] iirc we think that one's not super hard to do, though we're a bit undecided on whether it should be done [23:55:19] brion: yes, but I think the RTL issue is pushing people over the edge, no? [23:55:45] maybe maybe not. it probably won't fully resolve the issue (as there'll be other things that don't have an alias still) [23:56:05] I'd want to hear from Siebrand on this one [23:56:07] what things? [23:56:11] it can also be worked around by using templates [23:56:30] ok. 
I'll get input from siebrand and we can move on [23:56:31] eg {{mycoolrtltext|foo}} -> {{#tag:ref|{{{1}}}}} etc [23:56:32] https://bugzilla.wikimedia.org/29564 LQT putting crap in dumps [23:56:34] at least to an extent [23:56:45] they do use templates already [23:56:52] I'd say it seems to make sense, because the rtl people themselves asked it [23:56:54] but if it's decided to do it i won't super object [23:56:57] *robla shudders at the title of 29564 [23:57:03] !b 29564 [23:57:03] --elephant-- https://bugzilla.wikimedia.org/show_bug.cgi?id=29564 [23:57:04] heh [23:57:29] oh, thanks, robla, I just noticed :P [23:57:37] but I did lol [23:57:45] so on that one -- i have a fix on trunk that'll make the dumps & exports work again [23:57:53] needs a merge to 1.18/1.17/deployment [23:58:06] revision #? [23:58:07] db cleanup for the bad entry would be wise (but can be skipped) [23:58:11] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/90723 [23:58:19] it's tagged already [23:58:22] k [23:58:44] which reminds me -- we should probably have a _slightly_ more regular process for reviewing backports. it's pretty ad-hoc so far [23:58:55] usually something super-important gets pushed quickly, but some things get forgotten [23:58:55] missed your comments, but good [23:59:03] ^demon: any chance you can follow through on that one? [23:59:07] brion, isn't that why we try and tag them? [23:59:12] Reedy: yep :D [23:59:15] or is that what you're meaning [23:59:15] <^demon> Sorry, I've been looking at the filepage issue [23:59:16] :P [23:59:20] *^demon catches up [23:59:22] that way they'll at least get seen next time someone does a big sweep [23:59:24] Yeah [23:59:52] if the exports are screwed up, that's pretty high priority to fix
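[Editor's note: coming back to bug 29495 from earlier in the meeting, the "indic" grouping mode sketched there (groups of two above the last three digits, e.g. 12,34,567) could be prototyped standalone before being folded into Language::commafy(). The function name is hypothetical.]

```php
<?php
// Hypothetical standalone version of the "indic" commafy mode:
// the last three digits form one group, every pair above that gets a comma.
function commafyIndic( $num ) {
	$s = (string)$num;
	if ( strlen( $s ) <= 3 ) {
		return $s;
	}
	$head = substr( $s, 0, -3 );
	// Insert a comma after any digit followed by a whole number of digit pairs
	$head = preg_replace( '/(\d)(?=(\d{2})+$)/', '$1,', $head );
	return $head . ',' . substr( $s, -3 );
}

echo commafyIndic( 1234567 );   // 12,34,567
echo "\n";
echo commafyIndic( 123456789 ); // 12,34,56,789
```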