[08:50:03] <[Bergi]> Hi! Where can I find the HTML Tidy config file(s) used for mediawiki.org? They seem to be different from the ones for en.wikipedia.org
[13:29:57] hi there csteipp
[13:30:05] Hey Sumanah!
[13:30:19] csteipp: best wishes with your upcoming and somewhat delayed release :D
[13:30:39] Let's just say, schedule slip is getting to be painful... :)
[15:01:54] chrismcmahon: https://www.mediawiki.org/wiki/QA/Article_Feedback_Test_Plan has a somewhat different look now, take a look.
[15:08:24] sumanah: nice! tyvm!
[15:09:55] chrismcmahon: some general thoughts for next time:
[15:10:06] * use actual equals-sign headers
[16:05:11] <^demon> JeroenDeDauw: FYI, in case you didn't notice, I made all of your Wikibase repos on Tuesday.
[16:06:07] is there a way to get MediaWiki to not cache the output from doTransform() in a specific extension? if I've got "$wgParserCacheType = CACHE_NONE;" in LocalSettings.php then the output is never cached, but I'd like to keep the default caching for everything except the return value of the doTransform() method in this media handling extension I'm working on. (does this make sense?)
[16:07:19] <^demon> Do you have a parser function that is doing the transformation?
[16:11:32] ^demon: I'm using outputHook( $outputHook, $parserOutput, $data ) and parserTransformHook( $parser, $file ), so I think so
[16:12:04] <^demon> Should be able to call $parser->disableCache()
[16:12:39] oh, very neat, thanks
[16:12:59] would http://www.mediawiki.org/wiki/Manual:$wgObjectCaches be applicable too, at all?
[16:13:54] <^demon> No, not for this.
[16:14:43] ok
[17:16:12] hi there Friday folks!
[17:16:13] hi bsitu
[17:17:21] hi kaldari
[17:17:55] hello
[17:18:02] Happy Friday sumanah
[17:18:10] Happy Friday to both of you as well!
[17:18:11] :-D
[17:18:17] thanks!
[17:18:23] Looking at https://www.mediawiki.org/wiki/Code_review_management -
[17:18:24] Current statistics on all MediaWiki (core and extensions):
[17:18:26] 30 that have received a positive tentative review (+1) but have not been merged (+2)
[17:18:26] 216 that have received no -2, -1, +1, or +2 reviews (but might have textual comments)
[17:18:26] 38 that have received a negative tentative review (-1) with issues to be addressed by the original contributor
[17:18:26] 6 that have been rejected (-2) but not yet abandoned by their original authors
[17:19:21] thekaryn: kaldari: am I right in understanding that the next step for SignupAPI is that Ori is working on it?
[17:19:37] kaldari: and what are you up to today? maybe you already know, in which case I shan't press you
[17:19:42] bsitu: what are you working on today?
[17:19:55] sumanah: yes, Ori will take on Signup API as one of our upcoming experiments
[17:20:13] thekaryn: ok! are you waiting on anyone/anything else for anything re SignupAPI? just want to check in :)
[17:20:31] waiting on more hours in the day? no, I'm all set, thank you
[17:20:32] sumanah: GSoC and PageTriage code review
[17:20:39] I have some PageTriage code to review and test unless you want me to do something else :)
[17:21:01] kaldari: sounds good to me! do you feel like we now have enough bench strength re UploadWizard review?
[17:21:23] hopefully
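For reference, the 16:06–16:14 exchange above amounts to something like the following. This is only a rough sketch, not code from the extension being discussed: the class name is made up, and the other methods a real MediaHandler subclass must implement are omitted. It shows $parser->disableCache() being called from parserTransformHook(), so only pages that embed this handler's files skip the parser cache while everything else keeps the default caching, with no need for $wgParserCacheType = CACHE_NONE.

```php
// Rough sketch of the approach discussed above (hypothetical class name,
// other required MediaHandler methods omitted).
class MyVideoHandler extends MediaHandler {

	/**
	 * Called by the parser for each page that renders one of this
	 * handler's files. Disabling the cache here affects only those
	 * pages; the rest of the wiki keeps normal parser caching.
	 */
	function parserTransformHook( $parser, $file ) {
		$parser->disableCache();
	}

	// ... doTransform(), getParamMap(), etc. go here as usual ...
}
```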
[17:21:50] kaldari: and Ankur's progress -- ok? I am glad I got to meet him in Berlin
[17:22:02] and he seemed to be really happy with his geocoding progress/momentum
[17:22:13] seems to be
[17:22:40] bsitu: well, if that's what is most pressing, then I should not try to give you other stuff
[17:24:35] ok, and Arthur is basically out for today
[17:24:43] and I shall email Gabriel, Amir, Antoine etc. separately
[17:24:44] thanks all!
[17:25:30] alolita, ori-livneh - where are we meeting?
[20:07:17] DarTar: just got out of a meeting with Fabrice -- looks like we have a plan for clicktracking. I believe he's updating the ticket.
[20:22:16] hey rsterbin
[20:22:46] great – I have been in back-to-back meetings and didn't get a chance to reply to your question
[20:24:14] the plan is for me to add a new variable to control feedback clicktracking (separate from front-end clicktracking), so that we can turn it off on the front end only
[20:24:17] mail for you
[20:25:03] I'll also make form impressions tracked at 1%, so if we want to turn it back on at some point after we've all forgotten about this issue, we won't break the clicktracking logs again.
[20:25:30] ok
[20:25:41] you should probably coordinate with alolita and ori-livneh as we're taking similar measures for E3 projects
[20:25:46] yep
[20:25:54] let me ping them
[20:26:27] ok
[20:43:26] DarTar: I'm not entirely sure what E3 is.
[20:43:58] Editor Engagement Experiments
[20:45:24] EEEEEEEEEEEEEEEEEEEEEE
[21:35:59] bsitu: 20120612000000 - 20120626235959 UTC (exactly 2 weeks, assuming we can deploy before the start date)
[22:22:16] Reedy: . 'k';
[22:25:01] Amgine, you have a moment?
[22:25:24] No, but go ahead and ask.
[22:25:33] Technically speaking, how difficult would it be to have a specific set of namespaces
[22:25:39] Very.
[22:25:53] for instance, if the Wikimania wikis were merged into one wiki, could an entire namespace be protected from edits?
[22:26:11] or could only select users edit a specific namespace?
[22:26:31] MW isn't really designed for that sort of restriction
[22:26:47] Yes, assuming you could get one of the namespace extensions enabled on that wiki. But exactly what Reedy said.
[22:31:30] Reedy: I think it would be simpler on many levels if Wikimanias were all in one wiki
[22:35:35] ToAruShiroiNeko: Perhaps this is not on topic for this channel.
[22:40:15] It being simpler doesn't make it easy
[22:41:09] Reedy: it would be easy after the initial setup
[22:41:21] Sure
[22:41:28] But it's not that hard to create new wikis
[22:41:29] Amgine: where should I discuss it? since I am worried more about potential technical problems
[22:41:38] And the difficult part is still making MW do that
[22:42:00] Reedy: well, the act of creating isn't, but the act of creating content could be more difficult
[22:42:19] essentially everything on the wiki, from templates to MediaWiki pages, needs to be recreated every year instead of being reusable
[22:42:57] it's just that creating a new wiki and locking the old one each year feels a bit stupid to me, no offense
[22:43:03] And the difficult part is still making MW do that
[22:44:11] can the sidebar be replaced with a template?
[22:44:39] or can it be made completely different per namespace it appears in?
[22:47:49] ToAruShiroiNeko: Akshay Chugh will probably want to hear about your concern
[22:48:07] ToAruShiroiNeko: Akshay is developing a convention/conference extension to make it easier to run a conference site from a MW install
[22:48:28] That side of things is usually run off-cluster...
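On the 22:25–22:26 question about restricting who can edit a specific namespace: stock MediaWiki configuration can do this with $wgNamespaceProtection, which makes editing in a namespace require a particular user right. A minimal LocalSettings.php sketch follows; the namespace name, IDs, right, and group are hypothetical examples, not anything agreed in the discussion.

```php
// LocalSettings.php sketch -- hypothetical names and IDs.

// Register a custom namespace pair (any unused even/odd IDs >= 100 work).
define( 'NS_WIKIMANIA_2012', 3000 );
define( 'NS_WIKIMANIA_2012_TALK', 3001 );
$wgExtraNamespaces[NS_WIKIMANIA_2012] = 'Wikimania_2012';
$wgExtraNamespaces[NS_WIKIMANIA_2012_TALK] = 'Wikimania_2012_talk';

// Editing pages in that namespace requires the 'edit-wikimania-2012' right...
$wgNamespaceProtection[NS_WIKIMANIA_2012] = array( 'edit-wikimania-2012' );

// ...which is granted only to the groups listed here.
$wgGroupPermissions['sysop']['edit-wikimania-2012'] = true;
```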
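And on the 22:44 sidebar question: the sidebar normally comes from the MediaWiki:Sidebar page rather than a template, but it can be varied per namespace with the core SkinBuildSidebar hook. A sketch, reusing the hypothetical namespace constant from the previous example; the section name and link target are also made up.

```php
// Sketch: per-namespace sidebar via the SkinBuildSidebar hook.
$wgHooks['SkinBuildSidebar'][] = function ( $skin, &$bar ) {
	// Only touch the sidebar in the hypothetical Wikimania 2012 namespace.
	if ( defined( 'NS_WIKIMANIA_2012' )
		&& $skin->getTitle()->getNamespace() === NS_WIKIMANIA_2012
	) {
		// Add a year-specific section; existing entries in $bar can also
		// be removed or reordered here.
		$bar['Wikimania 2012'] = array(
			array(
				'text' => 'Programme',
				'href' => '/wiki/Wikimania_2012:Programme', // hypothetical page
			),
		);
	}
	return true;
};
```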
[22:55:28] ToAruShiroiNeko: It might be proposed in #Wikimania that each year a bot be run to move the previous year's Wikimania pages to a yearly namespace, so after this year's Wikimania all pages marked with Category:2012 could be moved to 2012:*.
[22:55:55] chrismcmahon: I like the Linphone SIP client
[22:58:13] thanks sumanah, I found Linphone and my friend Rick Scott mentioned Ekiga. I have them both now. I so rarely use one, but I really need a reliable one.
[23:02:16] Amgine: indeed
[23:02:34] or create a namespace each year while not utilizing the main namespace
[23:03:05] There are issues with that, as many elements of the MW software are set up to default to ns 0.
[23:03:42] right, like the main page
[23:03:49] which can be redirected each year
[23:04:04] sumanah: is he or she on IRC?
[23:04:47] ToAruShiroiNeko: I believe Akshay is not on IRC right now -- https://www.mediawiki.org/wiki/User:Chughakshay16 has the contact info
[23:05:03] ToAruShiroiNeko: ("he", and thank you for being inclusive, I appreciate it)
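The bot Amgine describes at 22:55:28 could be a small maintenance script along these lines. This is only a sketch against the MediaWiki 1.19-era API: Category:2012 and the 2012:* prefix are the values from the discussion, while the file name, class name, and move summary are made up, and the target namespace would have to be registered (e.g. via $wgExtraNamespaces, as in the earlier sketch) before the moves would actually leave the main namespace.

```php
// maintenance/moveYearPages.php -- sketch only, not an existing script.
require_once dirname( __FILE__ ) . '/Maintenance.php';

class MoveYearPages extends Maintenance {
	public function execute() {
		$category = Category::newFromName( '2012' );
		if ( !$category ) {
			$this->error( 'Bad category name', true );
		}
		// getMembers() yields a Title for every page in Category:2012.
		foreach ( $category->getMembers() as $oldTitle ) {
			$newTitle = Title::newFromText( '2012:' . $oldTitle->getText() );
			if ( !$newTitle || $newTitle->exists() ) {
				continue; // skip anything unmovable or already moved
			}
			// moveTo() returns true on success or an array of errors.
			$ok = $oldTitle->moveTo( $newTitle, false, 'Archiving 2012 pages', false );
			$this->output( $oldTitle->getPrefixedText() . ' -> ' .
				$newTitle->getPrefixedText() .
				( $ok === true ? "\n" : " FAILED\n" ) );
		}
	}
}

$maintClass = 'MoveYearPages';
require_once RUN_MAINTENANCE_IF_MAIN;
```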