[00:15:21] apergos: So I finally fixed https://bugzilla.wikimedia.org/show_bug.cgi?id=29137#c7 :)
[00:15:29] I saw
[00:15:49] I was saying in one of these channels (I am basically sleep typing)
[00:15:56] that it would be best I guess for it to be read only
[00:16:14] so people can review what's there instead of digging through the dumps to read the discussions
[00:16:21] (which is I guess what they wanted the dumps for)
[00:16:50] RoanKattouw:
[00:19:08] Yes, I'm putting it in closed.dblist right now
[00:19:18] oh yay
[00:19:19] Also, it might want to have, I don't know, LiquidThreads installed? :D
[00:19:24] naaahhhh
[00:19:25] :-D
[00:21:02] Whee https://liquidthreads.labs.wikimedia.org/wiki/Feedback/closed works now
[00:21:31] great
[00:21:37] all this for one page :-D
[00:21:41] thanks
[00:22:06] The wiki supposedly has some 4,000 ervs
[00:22:08] *revs
[00:22:17] So I guess you should be able to run dumps for it now
[00:22:21] but it's the feedback page they were interested in
[00:22:28] Oh Ok
[00:22:32] I don't remember if we run em for closed wikis
[00:22:36] I just got that when I clicked Random page
[00:22:41] but I could manually do a run just cause.
[00:23:28] we don't have a good run from just before it was closed, because there was that bug that prevented good dumps on all the lqt wikis
[00:27:01] marktraceur: looks like new git-review isn't appending change-id...
[00:27:01] ffs
[00:28:33] Or not.. only for syntaxhighlight_geshi..
[00:43:38] Yo, Marktraceur
[00:43:43] Are you on?
[00:44:19] Someone already implemented my idea
[00:44:35] and released it 13 days ago.....
[00:45:15] brb
[00:53:29] LukeDev: Hi, I'm here
[00:53:33] Which idea?
[00:57:54] spagewmf: Seems the timestamp bug isn't fixed
[00:57:54] srv282 commonswiki: [5cd63138] /wiki/File:2012-04_Re%C5%84ska_Wie%C5%9B_26.jpg Exception from line 130 of /usr/local/apache/common-local/php-1.20wmf11/includes/Timestamp.php: MWTimestamp::setTimestamp : Invalid timestamp - 1971:01:01 11:30:5500
[01:12:55] Reedy, 1971:01:01 11:30:5500 looks odd, I guess it's more EXIF inconsistency.
[01:13:05] Mmm
[01:13:28] we seem to be seeing up to 5 a minute
[01:19:23] All these 4 digit seconds..
[03:50:16] @marktraceur BACK
[03:50:30] are you around?
[03:51:08] This was the idea I had: http://wikitimelines.net/
[03:57:41] Oh right
[03:58:07] LukeDev: Better not to ask if someone is around... if they're in the channel, talk at them, they'll be more likely to get the message quickly
[04:00:22] Alternatively titled "You did the right thing this time, but no need to ask if someone is around; if they're in the channel then that's around enough to send a message"
[04:47:17] Hello, I'd like to get some opinions on something I'm trying to do.
[04:47:51] I want to improve timeline articles using sentences that I've pulled from their parent articles.
[04:49:01] For instance, the timeline article on the Great Depression isn't so good: http://en.wikipedia.org/wiki/Timeline_of_the_Great_Depression
[04:49:59] but the article has plentiful dates and references: http://en.wikipedia.org/wiki/The_Great_Depression
[04:51:28] I'm already able to pull sentences that contain dates from the article; how do I automate the process of adding the sentences to the timeline articles?
[04:56:51] I want to be able to rephrase the sentences that were gathered from the article and add them to the timeline article. Does anyone have an opinion on this idea or have any advice on how this has been done in the past?
[04:57:12] Anything would be appreciated :)
[05:24:23] marktraceur: I suspected you might be here :)
[05:25:23] or perhaps not
[05:29:18] kaldari: Always!
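The "4 digit seconds" value quoted above (1971:01:01 11:30:5500) fails any strict parse of the EXIF "YYYY:MM:DD HH:MM:SS" DateTime format. A minimal sketch in Python of such a guard follows; this is an illustration only, not MediaWiki's actual Timestamp.php logic, and the function name is hypothetical:

```python
import re
from datetime import datetime

# EXIF DateTime is "YYYY:MM:DD HH:MM:SS" with exactly two-digit fields.
EXIF_TS = re.compile(r"^\d{4}:\d{2}:\d{2} \d{2}:\d{2}:\d{2}$")

def parse_exif_timestamp(raw: str):
    """Return a datetime for a well-formed EXIF DateTime string,
    or None for malformed values like '1971:01:01 11:30:5500'."""
    raw = raw.strip()
    if not EXIF_TS.match(raw):
        return None
    try:
        return datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")
    except ValueError:  # e.g. month 13, out-of-range day
        return None
```

Returning None instead of raising mirrors the "reject and fall back" behaviour you would want when bad camera firmware writes four-digit seconds.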
[05:29:23] kaldari: Don't ask to ask, etc.
[05:30:02] Lemme know if you have any thoughts on https://bugzilla.wikimedia.org/show_bug.cgi?id=39852 ...
[05:30:30] I might just go ahead and try and implement something to get it working
[05:31:58] Noted
[05:32:04] I'll try to chime in tomorrow morning
[05:33:32] thanks!
[15:41:07] New review: Ottomata; "this is working, going to +2 to merge in a debian/ dependency change." [analytics/udp-filters] (master); V: 0 C: 2; - https://gerrit.wikimedia.org/r/22614
[15:41:13] New review: Ottomata; "this is working, going to +2 to merge in a debian/ dependency change." [analytics/udp-filters] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/22614
[15:59:33] New patchset: Ottomata; "Adding prefix-preserving IP anonymization using libanon." [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/22614
[16:18:23] siebrand, are you here?
[16:18:34] MaxSem: yep
[16:29:09] siebrand, we the mobile team would like to display a localised banner on the mobile site. Since CentralNotice isn't supported there, can we add that message temporarily to WikimediaMessages for people to translate?
[16:30:02] MaxSem: technically yes, but I don't really like it being used for that.
[16:30:23] MaxSem: use MobileFrontend?
[16:30:33] Change merged: Ottomata; [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/22614
[16:30:33] it's WMF-specific
[16:30:51] MaxSem: Or hook a configurable i18n file in MobileFrontend?
[16:31:22] MaxSem: At least create a feature request in bugzilla for CN on mobile...
[16:31:44] siebrand, where to put the separate file for TWN?
[16:32:07] MaxSem: Very good question...
[16:32:18] I'm open to suggestions.
[16:33:12] MaxSem: you could make it yet another file in WikimediaMessages
[16:35:06] MaxSem: btw, can you review https://gerrit.wikimedia.org/r/#/c/18443/ ? It's been in there for a while already, and there do not appear to be any (active) maintainers at the moment.
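The "prefix-preserving IP anonymization" patchset above refers to the property (popularized by Crypto-PAn, which libanon implements) that two addresses sharing their first k bits map to pseudonyms sharing their first k bits. A minimal Python sketch of that property follows; HMAC-SHA256 here is a stand-in for the AES-based PRF the real library uses, and the function and key names are illustrative, not libanon's API:

```python
import hashlib
import hmac
import ipaddress

def pp_anonymize(ip: str, key: bytes) -> str:
    """Prefix-preserving IPv4 pseudonymization (Crypto-PAn-style sketch).

    Bit i of the output depends only on the key and on bits 0..i-1 of the
    input, so any two addresses sharing their first k bits produce outputs
    that also share their first k bits.
    """
    bits = format(int(ipaddress.IPv4Address(ip)), "032b")
    out = []
    for i in range(32):
        # A keyed PRF over the input prefix decides whether to flip bit i.
        mac = hmac.new(key, bits[:i].encode(), hashlib.sha256).digest()
        out.append(str(int(bits[i]) ^ (mac[0] & 1)))
    return str(ipaddress.IPv4Address(int("".join(out), 2)))
```

Because the mapping is deterministic for a fixed key, the same client always receives the same pseudonym, which keeps the anonymized logs usable for per-client and per-subnet traffic analysis.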
[16:35:07] siebrand, in principle WikimediaMessages is the most logically related place for it. However, this message will be needed only until the end of this year's Wiki Loves Monuments. Probably WikimediaTemporaryMessages.i18n.php?
[16:35:31] MaxSem: *nod* I was thinking along the same lines...
[16:35:58] deal then
[16:36:06] * MaxSem looks at the change
[16:36:16] MaxSem: Once merged, don't forget to backport and rebuild LocalisationCache...
[16:36:30] okie
[16:48:05] who can help with media storage issues?
[16:48:19] hey there matanya
[16:48:25] hi sumanah :)
[16:48:30] er?
[16:48:39] for very loose definitions of "help"
[16:48:42] hi apergos
[16:48:43] https://bugzilla.wikimedia.org/show_bug.cgi?id=40043
[16:48:52] if it's a problem I happen to know about, yes
[16:48:57] otherwise, prolly not. lemme look
[16:49:15] oh. reedy was looking at this yesterday
[16:49:21] happened to more than one editor today, very annoying.
[16:49:22] I expect they will pick up this issue again today
[16:49:38] and happens more and more lately.
[16:49:39] there was a bug fix that went in yesterday that wasn't a complete fix, i don't know more than that
[16:49:40] thanks
[16:49:43] uh huh
[16:50:20] * matanya got poked by some other editor to bug the devs...
[16:50:35] please pass on to them what I just told you
[16:50:43] matanya: AaronSchulz might be able to comment
[16:50:53] ok, while do. thank you both
[16:50:57] *will
[16:51:02] "Fatal exception of type TimestampException"
[16:51:14] It's due to weird timestamps in EXIF data it seems
[16:51:30] 5 digits or something?
[16:51:32] 1971:01:01 11:30:5500
[16:51:39] 4 digit seconds seemingly
[16:51:41] (I speed-read the backscroll from yesterday)
[16:51:42] oh, that
[16:51:48] bleargh
[16:52:04] not sure why it's unix epoch + 1 year..
[16:52:05] it is very common in some camera models
[16:52:17] matanya: have you seen our timestamp code? ;)
[16:52:27] when not set.
[16:52:47] I have it on several images in my images dir
[16:52:53] on my laptop
[16:52:55] hmm
[16:53:00] What is the actual value?
[16:53:05] 2008
[16:53:15] that it? :/
[16:53:30] 2012:08:18 00:23:03
[16:53:47] all taken by iPhones (not owned by me)
[16:53:56] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob;f=includes/Timestamp.php;h=16be775addb6110f57fa6d435de9c245fbe512a6;hb=HEAD
[16:54:10] Might be worth poking Brion about it again, as he's an iPhone person
[16:54:31] Is Brion back from holidays yet?
[16:54:54] he's been committing either way..
[16:55:21] to be precise, I see it only on iPhone 4S so far
[16:55:34] but needs validation
[16:56:41] Reedy: can you check the manufacturer of those linked from the bug?
[16:57:18] Have to grab the actual file first
[16:58:01] https://upload.wikimedia.org/wikipedia/commons/9/98/2012-04_Hrozov%C3%A1_16.jpg
[16:58:20] Canon EOS 600D
[16:59:41] Windows seems to have interpreted the date stamps fine..
[16:59:52] hi ialex - do you have a moment?
[16:59:57] 18/04/2012 10:47
[17:00:20] sumanah: yes
[17:01:12] ialex: sometimes I add you as a reviewer to changesets, because it seems like something you could review -- if I've been over-adding please let me know
[17:01:45] I am out, see you tomorrow
[17:01:56] ialex: of course you can always remove yourself as a reviewer, but if I'm adding you wrongly, please let me know so I can avoid doing it in the future
[17:01:57] bye hashar
[17:02:46] sumanah: for me it's good, but just don't add me as a reviewer for CSS/JS stuff :)
[17:03:02] ialex: ok! I shall try to remember that.
[17:12:04] Just curious. Would there be any use for a script to convert a particular database table (eg: posts) to wikitext?
[17:12:18] I've seen plugins for converting WordPress to MediaWiki
[17:12:38] But nothing for just generic HTML to MediaWiki.
I'm working right now to convert from Habari to MediaWiki
[17:14:59] <^demon> nullspoon: MediaWiki allows a lot of HTML tags that you'd use (divs, spans, other content tags). There's also the "raw html" setting, but this can be kinda dangerous on a public wiki that people can edit.
[17:16:31] ^demon: Sure - but that's just no fun. :)
[18:01:12] gwicke: I'm working on alienation of unknown mw:* things, but I'm running into a few issues with how Parsoid outputs newlines in my test page. I'm getting lots of newlines inside <p> tags, and lots of <br/>s
[18:01:21] gwicke: I'll send you the test page
[18:25:56] RoanKattouw: for the /* @embed */ magic to work, does it have to be directly before the line that includes the image or just within the definition block?
[18:26:19] kaldari: Either directly before the line, or directly before the selector
[18:26:27] So /* @embed */ .foo { ..... } should work
[18:26:42] As well as .foo { color: red; /* @embed */ background: url(....); font-weight: bold; }
[18:27:00] ah, didn't know about the selector version
[18:29:06] I believe that works but I am not sure
[18:29:11] @noflip definitely does work that way
[18:35:34] For some reason I can't review Rob's 22164 rev: https://gerrit.wikimedia.org/r/#/c/22164
[18:35:46] every time I try I get 'fatal: Couldn't find remote ref refs/changes/64/22164/6'
[18:35:55] rmoen: ^
[18:36:18] oh nevermind
[18:36:28] I thought it was a core change :)
[18:36:52] looks like it's for the Vector extension
[18:44:11] rmoen: are you on 3?
[18:44:53] kaldari: I do not see him at his desk
[18:44:58] hmm
[18:44:59] kaldari, at Wikia. Thanks for feedback
[18:45:03] ah
[18:45:08] kaldari: I think the VE team might be at Wheeeekia
[18:45:12] I will amend
[18:45:15] (as in: Wheeee!)
[18:45:52] Yeah, we are discussing how to handle hidden parts of an article
[18:46:15] I have the changeset loaded, but don't actually see any changes. What should I be looking for?
[18:46:31] it's supposed to change the editing interface right?
[18:52:06] rmoen: do I need to do any special configuration of the Vector extension?
[18:53:37] found it: $wgVectorFeatures['footercleanup']['global'] = true;
[19:02:12] RoanKattouw: we are still protecting you from brs right now, as you used to butcher them at some point
[19:02:45] Right
[19:02:50] RoanKattouw: we can remove the mw:Placeholder if you are far enough with br support
[19:02:58] But I'm not even expecting brs in that document at all
[19:03:03] and tweak things in our handling of that area
[19:03:41] Also, magic words like __FOO__ are grouped into the previous paragraph, even if that means there are two (!!) newlines in the <p> tag
[19:04:51] ok, will check that later
[20:18:37] kaldari: sorry was out for foods. Yes, there is a configuration that needs to be enabled
[20:18:52] no problem, I got it working
[20:19:12] left a few additional comments
[20:19:16] kaldari: $wgVectorFeatures['footercleanup']['global'] = true;
[20:19:42] kaldari: Thanks :)
[20:52:53] spagewmf: hi, is the design that you suggested in your mail for the new signup flow available somewhere in html/css/png? I would want to incorporate the same in my extension
[20:57:26] akshayagarwal: the E3 team is developing it in the feature/acux branch of Extension:E3Experiments; experiments/acux/accountCreationUX.js has some HTML chunks. Disclaimer: it's in flux.
[20:59:03] spagewmf: ok thanks! is there any way I could help there?
[21:01:30] akshayagarwal: thanks for the offer. I hope to hook it up to validation, looking at using a combination of your api:validateitem and the client-side e-mail validation like Special:ChangeEmail. If you want to teach me how... ;-)
[21:03:22] spagewmf: you were wanting to use HTML5 for client side email validation?
[21:07:10] akshayagarwal, good question. I made that comment about HTML5 form features early on. For compatibility with browsers, we'll probably stick with jQuery and mw.util.validateEmail(), again like Special:ChangeEmail.
[21:07:30] yeah, i was about to say that :)
[21:08:34] ApiValidateSignup.php should provide you with pretty much everything you need to validate the signup form; it was made as an ajax handler
[21:09:24] Has anyone's MW code started using HTML5 form features, like <input type="email">?
[22:01:12] RoanKattouw: Any idea how I can test a poolcounter server to see if it's working?
[22:01:22] I am about to push in a second tampa poolcounter to eliminate spof.
[22:01:22] I know nearly nothing about PoolCounter
[22:01:30] RoanKattouw: YOU HAVE FAILED ME
[22:01:33] perhaps for the first time.
[22:01:35] ;_;
[22:01:36] heh
[22:01:46] Anyone know?
[22:01:59] TimStarling: ---^^ ?
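The client-side e-mail check discussed above (mw.util.validateEmail-style, as opposed to HTML5 form validation) boils down to a cheap plausibility test before the server does the real validation. A simplified sketch in Python; this pattern is deliberately loose and is my own illustration, not the rules mw.util.validateEmail actually applies:

```python
import re

# Simplified: one "@", no whitespace, and a dot somewhere in the domain.
# Real validators (and mw.util.validateEmail) use different, looser rules.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(addr: str) -> bool:
    """Cheap client-side-style plausibility check for an e-mail address."""
    return EMAIL_RE.match(addr) is not None
```

A check like this only filters obvious typos early; the authoritative validation still has to happen server-side, which is what the ApiValidateSignup.php handler mentioned above is for.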
[22:02:07] heh, yea, all the wikitech stuff is TimStarling
[22:02:34] TimStarling: So I need to know how to test against my newly installed poolcounter server to ensure it's set up properly before I go tossing it into PoolCounter.php
[22:02:57] (as once ersch is successfully handling it, tarin will be migrated into internal ip address and redeployed as poolcounter again)
[22:04:13] heh, asher pointed out i can telnet into its port and pull stats once it's in the pool
[22:32:31] I have an ajax handler implemented as an API extending ApiBase. I receive the parameters sent by the user by doing extractRequestParams(). From the documentation it seems that this method does not return the sanitised input; what would be the best way to sanitise this input data?
[22:34:38] What do you mean by sanitize exactly?
[22:34:58] ApiBase validates your parameters if you tell it what the validation rules are in the parameter definition
[22:36:50] RoanKattouw: I mean, I wish to do something like strip_tags() on the input data since it's coming from a form filled up by a user
[22:37:12] I doubt you'd want or need strip_tags()
[22:37:22] Really what you want to do is validate for a specific purpose, or escape for output
[22:37:29] It depends on what you're doing with the data
[22:38:39] inside my api, I pass those input params to functions which validate that data & accordingly generate a response
[22:39:01] Then you probably don't need to do anything
[22:39:31] In the context of user signup, there's probably no reason to strip tags or whatever
[22:40:36] RoanKattouw: ok, i was worried that some malicious user might want to send some code as the username or email & it might get executed/cause damage
[22:41:55] We protect against that, but in a different way, by escaping things before outputting them
[22:42:24] ah ok, so then I don't need to worry, thanks :)
[22:56:41] there is a line in User.php isValidUsername($name) which says $name != $wgContLang->ucfirst( $name ). This evaluates to akshay != Akshay, and then this function declares the username 'akshay' as invalid. I am unsure what is the purpose of the above line
[23:03:18] Yeah there are various validation functions in User, and they all mean different things
[23:04:48] so I checked the documentation for isValidUsername and it seems that this line is because we don't want usernames to start without a capital letter
[23:04:52] but why?
[23:05:30] http://svn.wikimedia.org/doc/classUser.html#a75957f369ce60cdce57011e13a4250c7
[23:07:00] morning
[23:07:20] sorry, forgot to change my nick last night
[23:08:09] RobH: you can connect to it with telnet
[23:09:52] use the command "STATS FULL"
[23:10:17] has to be all caps, I lost the argument with Platonides about that one
[23:10:28] got it, so I checked the username guideline on Wikipedia and it says that the first letter of the username is automatically capitalized
[23:10:41] so, it makes sense to implement that check
[23:12:44] it's automatically capitalised, independently of $wgCapitalLinks
[23:13:23] TimStarling: right, seems that this was implemented after MW 1.8?
[23:14:20] it would have been enforced on the UI side before that
[23:15:17] TimStarling: enforced on the UI as in?
[23:16:04] when you type in a name with a lower-case initial letter, it automatically converts it to uppercase
[23:16:38] Which is evil, that should be handled in User.php
[23:17:23] actually it was
[23:17:29] function newFromName( $name, $validate = true ) {
[23:17:30] # Force usernames to capital
[23:17:30] global $wgContLang;
[23:17:30] $name = $wgContLang->ucfirst( $name );
[23:17:55] still is
[23:18:00] public static function getCanonicalName( $name, $validate = 'valid' ) {
[23:18:00] # Force usernames to capital
[23:18:00] global $wgContLang;
[23:18:00] $name = $wgContLang->ucfirst( $name );
[23:18:52] the check was also in isValidUserName() in 1.7
[23:19:27] TimStarling: would it be nice, if in the new signup form, I make that capitalisation happen on the form using JS?
[23:20:07] yes, I suppose that would work
[23:20:26] in MW 1.1 it was
[23:20:28] $t = Title::newFromText( $name );
[23:20:28] $u->setName( $t->getText() );
[23:21:07] no isValidUsername(), but the effect was the same
[23:21:46] correct, so I must have somehow not looked deeply into that while implementing the ajax handler
[23:21:53] I'm always amazed looking at really old mediawiki files, User.php is so small
[23:22:12] hehe
[23:22:39] in MW 1.1 it is only 630 lines, and that's including about 100 lines that I wrote
[23:23:46] it's 4200 now
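The canonicalisation rule discussed above can be sketched in a few lines. This is an illustration of the rule in Python, not MediaWiki code; Python's str.upper() on the first character is only an ASCII stand-in for $wgContLang->ucfirst(), which is content-language aware:

```python
def canonical_name(name: str) -> str:
    # Uppercase only the first character, leaving the rest untouched,
    # like $wgContLang->ucfirst() does (modulo language-specific casing).
    return name[:1].upper() + name[1:]

def is_valid_username(name: str) -> bool:
    # The quoted check from isValidUsername(): a name that changes under
    # ucfirst (i.e. one starting with a lowercase letter) is invalid.
    return name != "" and name == canonical_name(name)
```

So a signup form can either reject "akshay" outright or, as suggested in the JS discussion above, pre-apply the capitalisation before submitting.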