[00:30:39] Yay, we got a mediawiki gadget on github now. Hurray for evil cross-site scripting
[00:30:43] https://meta.wikimedia.org/wiki/User_talk:Krinkle/Le_Tour_de_Wik%C3%AD/2011_Resource_Walker/jsUpdater.js#Documentation
[00:30:53] made a little "Fork me on GitHub" badge
[00:30:57] meh, they used to look better
[00:42:22] o hai Krinkle
[01:48:21] Reedy: JavaScript is possessed by demons
[01:48:27] http://cl.ly/071H0d0Q0l3O083p3B0T
[01:48:29] ^demons?
[01:48:35] http://cl.ly/0F1N3u0a0b1N1t331m3P
[01:48:44] Why the F is it alternating
[01:48:49] !!!
[01:48:49] Stop using so many exclamation marks !
[01:49:25] You!
[01:49:32] I test the same regex against the same string, but the result alternates, consistently
[01:49:39] starting with true, and on from there
[01:49:57] but a simple "hi" stays the same
[01:53:03] Reedy: Any clue?
[01:53:15] You're much more familiar with JS than I am
[01:53:31] Yeah, but maybe this is some kind of magic rule of regexes
[01:53:42] that everybody just knows and I'm supposed to make sense of
[01:54:23] I found https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/RegExp/lastIndex and feel close, but I still don't quite get it, as the example used there consistently evaluates to true (the "hi" example), see first screenshot
[02:28:13] Reedy: OMG! Found the sekrit, with help of #javascript
[02:28:42] So it turns out global regexes in JavaScript, when executed or tested against a string, use an advancing lastIndex
[02:28:48] and since the regex literal is re-used...
[02:29:25] it means calling it again will be like the next iteration
[02:29:26] terrible interface for a regex advancer (non-intuitive) but that's what it is
[14:22:43] New patchset: Hashar; "sync MWDumper configuration" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/7956
[14:22:43] New patchset: Hashar; "job to lint files in operations/mediawiki-config" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/7957
[14:23:01] New review: Hashar; "(no comment)" [integration/jenkins] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/7956
[14:23:02] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/7956
[14:23:10] New review: Hashar; "(no comment)" [integration/jenkins] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/7957
[14:23:11] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/7957
[17:04:01] Hey Reedy, I'm doing a code review of a special page, and the person is using the $wgOut and $wgUser globals. Are they supposed to use $this->getUser() and $this->getOutput() now? http://www.mediawiki.org/wiki/Manual:Special_pages still says to use globals...
[17:05:41] csteipp: yeah, essentially they should be. Unless they're writing for less than 1.18, they should be using the context source based methods
[17:06:06] I guess that page needs massive updating
[17:06:12] Ah. Cool, I'll check which version.
[17:08:28] 1.17 isn't a supported version now, so if it's in our repos, most likely they should not be using globals
[18:04:18] hey rsterbin
[18:04:25] hey
[18:04:29] howdy
[18:04:51] i'm in the middle of a call with fabrice, and i had questions about clicktracking
[18:04:52] I don't remember if we ever pushed to production the error logging we discussed
[18:04:57] ah good :)
[18:04:58] yes, we did
[18:05:17] ok cool, do you have a link with the naming scheme handy?
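The alternating test() results Krinkle ran into above can be reproduced in a few lines. This is a minimal sketch; the /demons/ pattern is just an illustration, not the actual regex from the screenshots:

```javascript
// A regex created with the g flag keeps match state in its lastIndex
// property, so re-testing the same reused regex object starts each
// search where the previous match ended -- and the result alternates.
var re = /demons/g;
var s = "JavaScript demons";

re.test(s);  // true  (match found; lastIndex advances past the match)
re.test(s);  // false (search starts at lastIndex, finds nothing, lastIndex resets to 0)
re.test(s);  // true again, and so on

// Either reset lastIndex manually before re-testing...
re.lastIndex = 0;
re.test(s);  // true
// ...or drop the g flag when you only need a boolean match:
var stateless = /demons/;
stateless.test(s);  // true, every time
```

This is also why the MDN "hi" example (a regex without the g flag) stayed consistent while the global regex alternated.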
It's not on meta
[18:05:58] (no rush, let's catch up after your meeting)
[18:32:08] robla: it's not collection at fault
[18:32:20] pushing the 1.20wmf3 collection version back to what 1.20wmf2 has made no difference
[18:32:30] and pushing 1.20wmf2's version to HEAD still works fine
[18:41:48] Reedy: uh oh, you guys are breaking everything again? ;-)
[18:42:41] multichill: I've not found out how to upset enwiki this week
[18:42:44] I'm running out of time
[18:43:19] Reedy: I'm sure you could think of something if you really tried.
[18:43:36] It's gotta beat adding boldness to their watchlists
[18:50:46] hi bsitu kaldari awjr and other Friday 20% checkin people
[18:51:16] hi sumanah
[18:51:18] hello
[18:52:05] I'm sorry to be late.
[18:52:18] Just arrived in San Francisco last night and my calendar screwups are my own
[18:52:21] hello
[18:52:31] kaldari: I presume you're on the SignupAPI & UploadWizard stuff
[18:52:39] bsitu: Are you working on that patch that you were working on last week?
[18:52:44] sumanah: is this mtg supposed to happen on thur at 4pm pacific?
[18:52:47] and awjr you're on the Forms stuff?
[18:52:50] yeah
[18:52:51] awjr: yes!
[18:52:54] that's what my calendar says at least
[18:52:55] ok :)
[18:52:56] should be done soon
[18:52:58] I'm on SignupAPI and DebianISOCodes and UploadWizard
[18:52:59] awjr: so I am like a full day late
[18:53:08] kaldari: my goodness! that's a lot
[18:53:10] sumanah: nah, only 20 hours or so
[18:53:12] do I need to do any more?
[18:53:13] :)
[18:53:18] kaldari: eeeek
[18:53:21] oh, sumanah, i've got some 20% stuff so you know what the design team is on about.
[18:53:22] kaldari: and how's Ankur?
[18:53:28] jorm: oh sure, please tell!
[18:53:28] sumanah: yeah im working on htmlform stuff
[18:53:35] he's on my to-do list for today
[18:53:39] i'll wait until everyone else is done.
[18:53:45] awjr: I shall not call it Platonic Forms, in deference to sanity
[18:53:54] bsitu: you need any help, or are you on your way ok?
[18:53:55] what are you doing with htmlform?
[18:53:58] if i may ask?
[18:54:07] I am ok
[18:54:11] :)
[18:54:11] yeah, awjr, you should, like, communicate about that a bit widely
[18:54:13] i'm intensely curious because full implementation of it is essential for the style guide.
[18:54:19] jorm: making it so it will output forms wrapped in things other than table elements
[18:54:28] k.
[18:55:10] jorm: once that's done i imagine adapting it to the styleguide will be a lot easier
[18:55:12] I also should have roped in Sam, Gabriel, Ian, & Amir Aharoni to talk about their weekend plans - I plan on instead emailing them, hope I get to that before the end of the day!
[18:55:19] excellent.
[18:55:32] Anyway, sounds like the other people I needed to talk with are all set & on their way. Thanks all.
[18:55:42] hurrr
[18:55:45] jorm: I am eager to hear of the design team's work!
[18:55:53] Nikerabbit: sorry, please elaborate? I don't understand
[18:56:08] kaldari: what's DebianISOCodes?
[18:56:18] a fundraising extension
[18:56:37] it's sort of similar to CLDR, so you might want to take a look at it as well
[18:56:42] As far as the design team - those who fall into the 20% umbrella (myself, vibha, pau, lindsey, and munaf [eventually]) - we're trying to sync our days to be on the same day (tuesdays)
[18:56:48] kaldari: what does it do?
[18:57:20] jorm: oh neat! ok. And do you feel like you already have a handle on what that work will entail, or would you like suggestions?
[18:57:23] and we're going to be building out various design standards things for foundation stuff. we're starting with building a standard icon set.
[18:57:28] there you go then
[18:57:30] oh, i know what we're going to work on.
[18:57:39] already ran this by erik and howie; they're very excited.
[18:57:57] NikeRabbit: I believe it's localized location information or something similar
[18:57:57] this is similar to the mediawiki style guide but is more focused on wmf projects.
[18:58:01] jorm: rock!
[18:58:14] I haven't actually looked at it yet
[18:58:31] nikerabbit: it might even be something you could merge into CLDR
[18:58:32] this is the first document we've been working on: http://www.mediawiki.org/wiki/Wikimedia_Foundation_Design/icon_set
[18:59:04] jorm: so, that sounds very useful. I'm also wondering about where we can help take care of design review requests
[18:59:20] so, that's a problem of a different horse.
[18:59:23] jorm: like, the design reviews of community-developed extensions, or the issues in BZ that are awaiting design input
[18:59:42] the issue is that, with the exception of me, none of the team are coders.
[18:59:53] and getting them up into the gerrit workflow is pretty much a waste of time.
[18:59:54] kaldari: mostly interested because SPQRobin wants to bring more language names into the core
[18:59:57] What about design review or UX testing of community-developed projects/improvements?
[19:00:30] I need to skedaddle upstairs to listen to the India team's brownbag talk right now, jorm - want to chat a bit after that?
[19:00:48] that's a possibility *if* we can have a place where these things are deployed for us, and we can test them there.
[19:01:03] but expecting anyone to check out, build, and deploy locally is probably a non-starter.
[19:01:06] sure, go go.
[19:01:09] Nod, nod. OK, so this is partly a Labs dependency
[19:01:10] OK!
[19:01:13] l8r.
[19:01:58] jorm: what's the status of http://www.mediawiki.org/wiki/Style_guide/Forms ?
[19:02:20] i have no idea.
[19:02:33] i think certain parties decided that they didn't like it, and worked to kill the project.
[19:03:10] I saw for a second an error message on enwiki saying "cannot load twinkle-options.js" or something but I cannot reproduce it, anyone else?
[19:03:33] there is/was an extension in SVN that was trying to implement it there before pushing it into core (krinkle was working on it) but that's largely been abandoned.
[19:04:07] jorm: that's too sad...
I mean there are things I could bikeshed too, but in general the forms in mediawiki look ugly, and making forms built with HTMLForm look like that could be a way to force it in
[19:04:59] yeah. i'd like to make it easy for developers to build forms that looked good and had all sorts of whiz-bang features.
[19:05:07] that was the idea, anyway.
[19:05:19] someone just needs to decide to devote time to it, really.
[19:07:03] and we know the chances for that
[19:07:31] yup.
[19:23:11] DarTar: ready to talk about stage 4 clicktracking whenever you are
[20:02:44] What is the correct way to review and merge the first instance of an extension in Gerrit?
[20:03:29] since initially there is nothing to review?
[20:07:31] kaldari: the initial checkin can be reviewed
[20:07:58] I haven't been able to find it in Gerrit
[20:08:04] did you push it in?
[20:08:16] either for SignupAPI or DebianISOCodes
[20:08:22] does the repo exist?
[20:08:26] yes
[20:08:30] did you push it in? :)
[20:08:35] they are both projects set up in Gerrit
[20:09:08] nevermind, I found the DebianISOCodes one
[20:09:10] hey rsterbin, back at my desk
[20:09:27] hey
[20:09:34] you need to clone the project, add your files in, and then do a git review
[20:09:38] it'll show up as a change
[20:09:49] add my files?
[20:09:59] it's an empty repo right now, right?
[20:10:05] no
[20:10:05] so you said you had some questions about that proposal
[20:10:14] was it moved from svn?
[20:10:22] did you push without review for the initial files?
[20:10:30] I believe so
[20:10:32] ah
[20:10:35] well, check out the repo
[20:10:39] I have no idea, I wasn't involved
[20:10:39] delete all the files
[20:10:42] with without review
[20:10:42] ah
[20:10:45] then add the files back in
[20:10:48] OK
[20:10:48] and do a git review
[20:10:58] then you'll get an initial review :)
[20:10:59] yeah, I think that's what I need to do
[20:11:48] DarTar: fabrice answered most of them, and i think most of what you've asked for is doable -- except the "source"
[20:12:22] ha!
[20:12:34] DarTar: when a user performs an action on a post, it's stored in the regular wikipedia logging table
[20:12:34] meaning that the referral info cannot be tracked?
[20:12:49] which doesn't give us much leeway on adding extra info
[20:12:56] true
[20:12:59] (at least not in a way you can easily query for)
[20:13:46] we can look at tracking central vs article source in the clicktracking, certainly
[20:13:59] (much like we do with overlay vs bottom)
[20:14:26] do people actually land on FP by submitting a form? Or is there any way of passing a GET parameter?
[20:14:39] not POSTing, I mean
[20:14:47] no, i just meant it's the same type of mechanism
[20:15:03] i.e., "overlay" is part of the clicktracking id
[20:15:29] you can only get to the feedback page by typing in the url now
[20:15:59] although when we launch stage 4 there will also be cta5 and talk page links
[20:16:59] so when we enable CTA5 and the talk page link, is there any way in which we could pass a parameter and capture it via clicktracking when the FP is loaded?
[20:17:04] yeah
[20:17:17] i can pull that from the url on load and pass it to the js
[20:17:18] then we're set, aren't we?
[20:17:24] exactly
[20:17:40] yep. all we need now is the full ids.
[20:18:49] right, is there any other part of the "source" section that is potentially problematic?
[20:19:18] the answer to the "source" section is no.
[20:19:28] since that data's only stored in the logging table
[20:19:49] i can give you source in the clicktracking data, but not in the DB
[20:20:41] DarTar ^^
[20:20:44] oh that's actually what I meant :)
[20:20:53] hang on let me open my mail
[20:21:00] oh ok I see
[20:21:22] source in the DB (central vs per-article FP)
[20:21:23] let me check real quick if there's a character limit on tracking ids...
[20:21:47] right, that's what we don't have
[20:21:55] I got confused coz I thought you were referring to the "source" request in clicktracking
[20:22:11] oh. i've been calling that referral
[20:22:37] yeah that makes sense, I would have loved to have this piece of data to show that the centralized page is not of much use, but it's not critical :)
[20:23:31] i can add it to the clicktracking ids, or to the data, if you want to compare
[20:23:52] I expect the central log will be flooded with posts from high traffic articles, so the only moderation actions we will capture from that page will be for Justin Bieber and Barack Obama
[20:24:29] the fact is I was not expecting to capture moderation data via clicktracking
[20:24:37] to try and keep it simple
[20:25:12] you know what, let's drop this requirement for the time being, it's not necessary for the kind of analysis I want to start with
[20:25:20] ok
[20:25:26] we'll figure out later whether we have bandwidth or not to do it
[20:25:58] hm
[20:26:41] what would you like in the data section? because if you want the page id, that would actually give you central vs page with no extra effort
[20:27:27] btw, the character limit for tracking ids is 255
[20:28:14] ok wait a sec, I was actually expecting to store the id of the corresponding article
[20:28:17] from ns0
[20:29:17] so the id for Golden-crowned_Sparrow (0), not the one for ArticleFeedbackv5/Golden-crowned_Sparrow (-1)
[20:29:36] yeah
[20:29:42] I actually have to go, I got a meeting starting in 1 min, shall we follow up by mail on this?
[20:29:44] that's something we can do
[20:29:48] ok cool
[20:30:00] sure. i'll pull the chat log into a response email and we can go from there?
[20:30:05] please do
[20:30:16] have a great weekend!
[20:30:18] you too
[20:43:46] Anyone around who works on/frequently uses the commons upload wizard?
[20:49:49] the only people who worked on it were neil and ian (raindrift)
[20:50:02] neil's gone; ian is out for the next couple of days.
[20:50:30] ok. I dropped a comment on the feedback page. It's certainly nothing urgent; it just seems polite to verbally confirm a bug before reporting it.
[20:50:53] andrewbogott_: I've been using UW a lot in a sketchy env, it has some issues for sure.
[20:51:10] * chrismcmahon looks at the comment
[20:52:54] The autocompletion in the category selection box is awesome! Makes it that much more disappointing when the selection is discarded :(
[21:51:23] kaldari: did you get any further with it?
[21:52:09] No one's been able to reproduce the problem, and since the global var is set to false, I guess someone must have fixed it last night
[21:52:27] although no one has mentioned it
[21:53:18] when did wmf3 get deployed to en.wiki exactly?
[21:53:43] it hasn't been..
[21:53:46] andrewbogott_: We have an open bug for the category thing.
[21:53:54] You need to accept it, otherwise it's discarded
[21:54:20] Should be done on monday
[21:54:29] Reedy: in that case, I have no idea what happened
[21:55:17] but whatever was wrong seems to be gone now
[21:56:25] It seemed to be related to caching, since people reported that the bug disappeared if they did a null save or action=purge
[21:57:48] maybe a bad version of a JS file was cached on some of the squids
[22:00:21] or someone temporarily switched the config flag to true
[22:13:16] Computers suck
[22:13:23] I should probably get food