[00:01:09] * Nemo_bis added all repos to https://www.ohloh.net/p/mediawiki
[00:01:11] preilly: I got a reply from Nuno Lopes
[00:02:15] he said it's probably very difficult to get phpllvm to compile with a recent version of LLVM and that I may as well start from scratch
[00:02:17] werdna: also another comment on https://gerrit.wikimedia.org/r/#/c/16580/2/xmpp/XMPPHP/XMLObj.php line 118
[00:02:27] Nemo_bis: I'm not sure if that was the best idea.. :/
[00:02:30] so he was pleasantly surprised when I told him that actually I already had it compiled and working
[00:02:44] TimStarling: oh that's cool
[00:02:59] I know we had them before, but it skews the results somewhat..
[00:03:19] he says he's an LLVM developer now so he can give advice on the LLVM side of things if I need it
[00:03:32] TimStarling: oh that's wonderful
[00:03:51] TimStarling: glad to hear that he responded
[00:04:44] Reedy: why?
[00:05:23] Lumping it all under "MediaWiki" seems wrong
[00:05:49] Reedy: not literally all repos
[00:05:56] only extensions (core was already there)
[00:06:15] yeah
[00:06:17] even still
[00:06:48] well, it makes no sense to create a new project there for each extension
[00:07:38] Certainly for most of them
[00:09:59] Reedy: sent wikitech email (was a follow-up to an oldish thread), you can reply to express disagreement :)
[00:10:03] and slap me publicly
[00:10:36] heh
[00:11:38] werdna: two new comments on https://gerrit.wikimedia.org/r/#/c/16580/2/xmpp/XMPPHP/XMPP.php
[00:12:08] preilly: seen them.
[00:12:50] werdna: okay
[00:14:47] werdna: you should patch your copy with these changes too: https://github.com/ivan1986/xmpphp/commit/2133b5c0c519b7ab496fadff237ea52d35b7e9ef
[00:23:30] which one of you lovely sirs is going to build HTML email into mediawiki?
[00:23:38] Me, I assume
[00:23:43] or, perhaps, which one of you gets bribed?
[00:23:45] First I need to figure out how to do it :)
[00:23:54] jorm has good bribes
[00:23:57] I've already done it a few times
[00:24:14] In extensions that is, without modifying core
[00:24:23] we need this in core, desperately.
[00:24:41] I'm trying to remember how to make MediaWiki actually fast :)
[00:24:44] Roan, don't know if you're interested, but I'm frying chicken and making chili on saturday.
[00:24:55] werdna: use creolewiki.
[00:25:34] I heard
[00:25:47] Gotta confer with a few other people re bike ride plans
[00:25:54] tried to invite timo, too, but couldn't find him on facebook.
[01:05:17] Krinkle: Parry, counter
[01:05:21] binasher: following up...here's the rev: https://gerrit.wikimedia.org/r/#/c/14084/
[01:05:37] thanks
[01:06:13] TimStarling: binasher tells me he's deferring to you on this, so it's up to you to make a design suggestion for Jeroen
[01:06:58] jeroen is in Berlin, right?
[01:07:07] * robla stabs office wifi
[01:07:16] TimStarling: yeah, he's in Berlin, I think
[01:07:16] yes
[01:08:55] oh, neat, 8 hours difference (*sigh*)
[01:09:33] this is probably the "good" time of year for Europe/Australia, huh?
[01:09:51] Yes
[01:09:58] In NH winter it's 10
[01:10:31] You put "good" in scare quotes but it's actually better than Europe/SF right now, that's 9
[01:11:40] well....the difference is that Tim generally works more of a regular business day (to catch us here in SF), and the Europeans generally work later days to catch us here in SF
[01:11:45] Right
[01:11:52] Yeah that does kind of make it suck
[04:08:33] Change merged: Tim Starling; [analytics/udplog] (master) - https://gerrit.wikimedia.org/r/16414
[12:18:55] <^demon> hashar: So I tried out https://gerrit-review.googlesource.com/#/c/36932/ yesterday on my local install.
[12:19:06] <^demon> It's awesome. Can't wait for it to land in master (probably for 2.6)
[12:19:08] hi :)
[12:19:24] ohhhh the suggestion yeah that will be great
[12:19:58] it is good to see that gerrit is actively maintained
[12:23:35] <^demon> And not just bugfix-maintained, but actively making improvements :)
[12:31:46] ^demon: oh while I am around, I will be on vacation in 2 weeks for most of August
[12:31:58] ^demon: need to brain dump some Jenkins-related stuff to you, I think
[12:32:03] just in case :-]
[12:32:20] <^demon> Ohnos. People are gonna ask me so many questions :p
[12:32:37] unlikely ;-]
[12:32:39] <^demon> (People already ask me lots of jenkins questions, and I go "ask hashar")
[12:32:45] most of the time it is related to unit test results
[12:46:46] damn I hate make
[12:47:38] %.1 : %.txt
[12:47:45] does not match subdir/somefile.1 :-(
[12:54:39] <^demon> Try cmake?
[12:54:45] <^demon> cmake usually works pretty well for me
[13:01:04] maybe I should learn that one
[13:01:11] I wanted to use rake
[13:01:18] then figured out nobody would want it :-]
[13:04:24] <^demon> I've never used rake in my life.
[13:11:54] ^demon: here is my side project : https://gerrit.wikimedia.org/r/#/c/16606/
[13:12:00] started that in a rush yesterday night :-]
[13:12:18] the idea is to provide manpages for the cluster script
[13:12:23] which we definitely need on labs hehe
[13:12:47] <^demon> Heh, man pages
[13:12:53] <^demon> Does --help not suffice? :)
[13:14:26] ^demon: well you can be as verbose as you want in a manpage
[13:14:28] giving examples
[13:14:31] notes about usage
[13:14:34] todo list and so on
[13:14:38] * ^demon shrugs
[13:14:42] I guess that is a complement
[13:14:43] <^demon> I guess that's cool :)
[13:15:26] plus you could then use apropos / whatis :-D
[15:17:36] Reedy: I have added a new MediaWiki-Database-Lag header with https://gerrit.wikimedia.org/r/#/c/16631/
[15:17:48] I saw
[15:17:50] I got an email ;)
[15:17:50] prefixing a header with X- is deprecated by RFC 6648
[15:17:54] ohooo
[15:18:00] ;D
[15:18:15] not sure what to do with that patch though, I have sent it to Gerrit to discuss it
[15:30:42] how does our $wgConf work, what's wrong with $wgConf->get( 'wgSitename', 'enwiki' ) ?
[15:32:10] <^demon> Well I don't like to encourage direct $wgConf usage, since it's very likely to change when I overhaul it.
[15:32:16] <^demon> Less usages -- makes it easier for me to break
[15:32:41] well, it's for WMF usage
[15:33:15] and I need to peek at other wikis' settings
[15:39:24] MaxSem: I assume you've skimmed http://svn.wikimedia.org/doc/classSiteConfiguration.html ?
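[editor's note: for context on the $wgConf exchange above, a minimal sketch of peeking at another wiki's settings through SiteConfiguration, assuming a WMF-style setup where $wgConf is already populated from the settings files; the second setting name is just an illustrative example:]

    <?php
    // $wgConf is a SiteConfiguration instance on WMF-style installs.
    // Depending on setup, other wikis' settings may need loading first:
    $wgConf->loadFullData();

    // get( $settingName, $wiki ) then returns that wiki's value:
    $sitename = $wgConf->get( 'wgSitename', 'enwiki' );
    $language = $wgConf->get( 'wgLanguageCode', 'enwiki' );

[The loadFullData() call is the expensive part, which is what MaxSem grumbles about just below.]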
[15:42:53] bleh, loadFullData()
[15:48:34] Load all of the datas
[16:03:30] * marktraceur asked sumanah for things to code-review
[16:03:36] I sure asked for it
[16:03:55] I've gotten roughly 15 emails in the past half-hour
[16:12:07] congrats :-]
[16:14:52] hashar: I'm seriously considering whether this is a good thing now, but at least it will help out. I know there's a serious backlog, so if this is a good way to help relieve the backlog, I'm very happy to do it.
[16:15:31] marktraceur: plus you will learn a lot of our code
[16:26:43] Indeed.
[17:41:52] plop
[17:45:00] oh what the hell, i've got a spare mac. i'll install mountain lion on it and see what happens. BWAHAHAHA
[17:45:39] <^demon> #geekproblems
[17:56:07] Anyone up for a flame war over for loop style?
[17:56:08] http://www.mediawiki.org/wiki/Manual_talk:Coding_conventions/JavaScript#For_loop_style.3F_17737
[17:57:04] Krinkle: ----^^
[17:59:06] mwahaha
[18:02:43] Let me fix that :D
[18:02:53] Oh, it's on the talk page
[18:03:17] brion: Your comment about hasOwnProperty is not true afaik.
[18:03:39] jQuery, and by extension MediaWiki, explicitly does not support environments that extend the object prototype
[18:03:48] simple jquery operations will fail miserably
[18:03:57] including $.each
[18:04:27] * Krinkle writes reply
[18:04:28] oh yay
[18:04:35] that simplifies life a little
[18:04:48] And one must never ever use for-in for an array
[18:05:01] because keys are strings
[18:05:11] wonderful isn't it :P
[18:05:23] Krinkle: Yeah, that's a very strange "feature"
[18:05:30] eek
[18:07:40] for ( var i in arr ) { console.log( i + 1 ) } will output 01 11 21 31 41 51 and so on
[18:08:07] JavaScript Typing: Just don't try.
[18:08:07] oh javascript you silly bean
[18:09:59] mmm
[18:10:17] brion is there a simple way I can see the special page on multiple wikis?
[18:10:33] I want to see the Special:DoubleRedirects entries in bulk
[18:12:09] ToAruShiroiNeko, write a bot to go fetch them all :)
[18:15:37] brion I lack the skills to do so
[18:15:49] sumanah: Thanks for the awesome pile of things to review
[18:15:50] neither pywikipedia nor toolserver developers want to help
[18:16:00] marktraceur: you asked for it! literally! :-)
[18:16:08] I did! I was just saying that.
[18:16:17] (-:
[18:16:24] I'm glad to help
[18:16:59] sumanah: The clearly-not-an-actual-pull-request....hopefully I did that right? I figured the person might want to see what happens when a review is sent their way
[18:17:28] marktraceur: which one?
[18:17:37] https://gerrit.wikimedia.org/r/#/c/14921/ ?
[18:18:03] ah yes
[18:18:18] That's the one
[18:18:23] makes sense to me
[18:18:27] Mmmkay
[18:18:39] you actually did have a useful bit of feedback about coding standards :)
[18:19:09] sumanah: It's what I do best!
[18:19:56] "Mark, here's a 400-line file that's not compiling" "You missed a semicolon at line 293" <-- 3 years of my college career
[18:20:46] wow
[18:21:32] *nod* C++ tutoring isn't all it's cracked up to be
[18:21:51] you are reminding me of a fake help support request I once saw, marktraceur
[18:22:24] basically, we had just ginned up a new ticket tracking system at Salon, so help tickets were actually in a database instead of my personal email
[18:23:07] and to test it out, a colleague of mine created a ticket from the fake name "Clayton Peacock" with subject line "I am a turtle, help!"
[18:23:23] and the email basically said that this peacock had fallen over onto his back and needed help to get upright again
[18:23:35] and ended something like "I am a subscriber and I expect support!"
[18:23:42] s/peacock/turtle/
[18:24:27] Ha, awesome
[18:26:13] hey matthiasmullie_ : what table is FeedbackPage pulling per-post activity log data from?
[18:27:31] logging
[18:28:26] oh right, makes sense
[18:30:19] sumanah: thx for following up on Memento, my understanding is that the extension has no front-end component (it's simply serving a specific oldid in response to an HTTP header), but I might be wrong – good that you flagged this explicitly in the ticket
[18:31:30] DarTar: hmm, the next step depends on whether this extension would be user visible at all to Wikimedia users (readers and editors). Would it?
[18:31:39] yes
[18:31:44] incl readers
[18:31:51] so, I saw your note that the extension should allow any user (reader or registered editor) to syndicate content based on HTTP headers specifying a time range.
[18:33:38] DarTar, so, even if there's no *front-end* component, it would still affect Wikimedia user experience, right?
[18:33:44] so, that means it would need a design review
[18:34:18] jorm: got a moment to talk about Memento? DarTar has been helping the author and I want to help straighten out any possible confusion re design reviews
[18:34:46] correct, it will affect UX for users with a browser capable of posting these requests
[18:35:21] no impact on users with no memento support in their browser
[18:35:36] so there's only a browser plug-in for this?
[18:35:50] so we'll have, like, 10 people using it.
[18:36:23] http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/46202 is where the conversation around Memento started, I think.
[18:36:38] DarTar: what browsers are capable of doing this?
[18:38:55] jorm: there's a chicken-and-egg problem around design reviews & getting community consensus, right? We don't want to spend a bunch of WMF time reviewing stuff with 0 community consensus, but we don't want to tell the community to go build consensus & momentum about something if we would take 1 look and say "no way" just from looking at the general featureset
[18:39:20] i'm actually asking if there's been any conversation with the community at all about it.
[18:39:22] So Howie approved this as step 1 of the process: design review https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment#Design_review
[18:39:37] jorm: http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/46202 I think is the most recent public conversation about it
[18:39:48] DarTar: do you know of any other, more recent conversation about this?
[18:42:32] Well this is useful
[18:42:33] E(<0.1597.0>:ejabberd_http_bind:1236) : You are trying to use BOSH (HTTP Bind) in host "undefined", but the module mod_http_bind is not started in that host. Configure your BOSH client to connect to the correct host, or add your desired host to the configuration, or check your 'modules' section in your ejabberd configuration file.
[18:42:36] undefined?
[18:42:38] wtf
[18:42:55] preilly: have you seen this before?
[18:44:05] werdna: not sure
[18:45:31] oh never mind
[18:45:35] I was sending a NULL domain
[18:45:39] misconfiguration on my end
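[editor's note: for anyone hitting the same ejabberd "host undefined" complaint — a hypothetical sketch of the fix werdna describes. werdna is using the BOSH flavor, but the domain parameter is the same idea; the parameter list below follows XMPPHP's bundled examples as best the editor recalls, and every value is invented, so treat this as an assumption rather than werdna's actual code:]

    <?php
    require 'XMPPHP/XMPP.php';

    $conn = new XMPPHP_XMPP(
        'xmpp.example.org', // host to connect to
        5222,               // port
        'bot',              // username
        'secret',           // password
        'xmpphp',           // resource
        'example.org'       // server/domain: passing NULL here is the
                            // kind of misconfiguration that makes
                            // ejabberd report host "undefined"
    );
    $conn->connect();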
[18:47:48] sumanah: there's a plugin for Firefox http://www.mementoweb.org/tools/ but we can safely assume that until any major website supports it we probably won't see a massive adoption
[18:48:04] I think W3 and the Internet Archive have it enabled
[18:56:09] we could provide that adoption rate, but i think i want to see community reaction.
[18:56:27] it sounds like a neat idea.
[18:58:15] I'd even say that this thing will make even remote sense when at least 1% of our visitors have support for this protocol AND care about history
[18:59:18] because 99.9% (and how many more nines?) of people who read Wikipedia don't care what was written here before
[19:00:26] * sumanah has to go, interview meeting. DarTar can I ask you to summarize any discussion here onto the bug?
[19:00:34] sumanah: will do
[19:00:36] thanks
[19:01:59] matthiasmullie_: any reason why we are not storing the af_id in logging.log_params? That would make it very easy to aggregate event counts per day per unique feedback item or count how many items at all have some activity
[19:02:30] currently it has to be parsed from the page title, which is not very performant
[19:08:41] DarTar: I can't find any good reason
[19:08:53] I'll add it in
[19:08:58] good, I'll bring this up with Fabrice
[19:09:05] ah nice :)
[19:16:46] MaxSem, if you haven't noticed we're on a mission to change that figure
[19:17:31] we have to be a bit picky about the future we make, otherwise it won't be that awesome ;)
[19:19:07] the main priority for at least the next 3 years is to engage more people to become editors and have them realize that Wikipedia can be edited and articles have a revision history
[19:19:46] that's not a nice-to-have, it's our #1 priority
[19:21:59] * MaxSem suggests a blinking edit button, half a screen large
[19:23:28] <^demon> Or just default action=view to action=edit :)
[19:29:22] MaxSem: we've tested this already: a gigantic invitation to edit an article http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Option_4
[19:29:45] summary of the results is here: http://blog.wikimedia.org/2012/06/25/converting-readers-into-editors-new-results-from-article-feedback-v5/
[19:30:56] w00t
[19:32:12] but your button didn't blink! :P
[19:32:56] <^demon> We could bring it back. Worked for the fundraiser that one year ;-)
[19:42:27] hehheeh
[19:45:01] brion: can you please review a bug and a test?
[19:49:03] matanya, i can take a quick peek but no guarantees, got a meeting in a few mins
[19:49:48] bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=93 test: https://he.wikipedia.org/wiki/%D7%AA%D7%91%D7%A0%D7%99%D7%AA:Test
[19:50:05] I think this solves bug 93. can you please verify?
[19:51:44] i'll have to look it over later, i'm not quite sure what's going on in there :)
[19:52:35] thanks, I'd like to set it fixed worksforme :)
[20:17:43] is there a way to redefine the location of the Extensions directory?
[20:25:43] <^demon> OrenBo: Not really, no. In theory, it only matters when you're doing your require()s in LocalSettings. In practice, however, extensions may expect to be in ./extensions/
[20:27:22] ^demon: Answered in #mediawiki: http://www.mediawiki.org/wiki/Manual:$wgExtensionsDirectory
[20:27:50] <^demon> Oh yeah, I forgot about that.
[20:28:03] <^demon> Since HipHop has been pretty much backburnered indefinitely :)
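[editor's note: a minimal sketch of the point ^demon makes above — the extensions path only matters where LocalSettings.php require()s things; the directory and extension names below are invented:]

    <?php
    // LocalSettings.php
    $myExtDir = '/srv/shared/mediawiki-extensions';
    require_once "$myExtDir/SomeExtension/SomeExtension.php";
    // In theory this works from anywhere; in practice some extensions
    // hardcode assumptions about living in $IP/extensions/, which is
    // why "not really, no" is the honest answer.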
[20:47:50] ^demon: You around? Quick MW user rights question if you don't mind
[20:48:01] <^demon> I'm in a meeting right now, sorry.
[20:48:09] okay
[20:48:35] maybe i can help? what's your question?
[20:49:36] Turkish Wikipedia wants a load of user rights changes done
[20:49:50] Two of which are removal of the editor and reviewer groups
[20:50:10] Is it a good idea to remove groups from config while users still have those groups in the database?
[20:52:07] ^demon: do you know why hiphop was backburnered indefinitely?
[20:52:32] <^demon> Well we never had too much time to work on it, and then upstream also got kind of stagnant on github.
[20:52:46] It seems that ListUsers still shows them as having the group, but it does not appear as a filter for the dropdown box (you have to filter manually)
[20:53:03] It's also impossible to remove the missing group from people
[20:53:08] aren't facebook using it?
[20:53:08] OrenBo: waiting for VM
[20:53:23] OrenBo: (robla talked about this at Wikimania)
[20:53:37] OrenBo: https://www.mediawiki.org/wiki/HipHop_deployment
[20:53:37] jorm: ^
[20:53:41] missed that
[20:53:42] "This work is on hold until the Hip Hop developers make more progress on the Hip Hop Virtual Machine (hhvm)"
[20:54:04] that is from months ago
[20:54:06] OrenBo: https://www.mediawiki.org/wiki/HipHop_deployment/status - "After Facebook announced that they were developing virtual machines for HipHop, Tim Starling indicated that Wikimedia would put their current efforts on HipHop on hold, until the virtual machines can be evaluated. Other performance efforts like Wikitext scripting will take priority instead."
[20:54:13] Krenair: I would suggest that removing groups is a bad idea in general (though i hate group cruft)
[20:54:22] so, the question is, why is the HipHop VM work stalling? answer: ask Facebook
[20:54:39] however, i don't see why we couldn't do a mass removal of userrights in the database and then remove the config options.
[20:56:17] hm - not sure about the importance of the VM in the age of Puppet, but I better get back to getting the local setup to work in Eclipse + with bugzilla+git+gerrit support
[20:57:44] hmm looking on GitHub - it does seem there are a number of HHVM repos out there now
[20:59:26] jorm, well they also want to move all current editors to the patrollers group.
[20:59:33] Some are already in it
[20:59:45] that may be just as easy as flipping the bits off.
[21:00:10] update blah set foo='1' where bar='1';
[21:00:28] reedy may be able to give a better answer; he does things like that often enough.
[21:01:24] Okay, after discussing it with one of their admins I'll just do the other changes and the rest can be thought over later
[21:01:46] sumanah: that and they seem to have stopped committing to their github repo, or answering bugs etc
[21:02:42] Well.
[21:02:57] Did Facebook suddenly collapse into a gamification wasteland and I didn't notice?
[21:04:31] ok, only 5 bugs under 100 left.
[21:05:18] thank you mata
[21:05:22] matanya: thanks
[21:05:25] what's the difference between git fetch and pull?
[21:05:35] hm, what for, sumanah?
[21:05:52] matanya: thank you for validating old bugs
[21:06:07] oh, that. np :)
[21:06:33] When we did that in Berlin last year, we closed a good amount of them
[21:06:35] this was actually directed to bri.on, but whatever
[21:07:11] OrenBo: git pull = git fetch && git merge
[21:07:36] OrenBo: So git fetch contacts the server and downloads the latest changes. git pull does a fetch AND merges the latest changes into your branch
[21:07:56] RoanKattouw: thanks
[21:08:11] * OrenBo pulls the latest thinkup
[21:11:29] * OrenBo killed his thinkup :-(
[21:11:34] grr
[21:11:41] gedit killed all of my tabs and made them into 4 spaces
[21:13:31] don't use it :)
[21:20:04] AaronSchulz: here maybe?
[21:20:13] this seems a bit more appropriate.
[21:20:15] 'tis ok
[21:20:41] ok, so on srv224 I've got a rewrite rule in for wikipedia only that says:
[21:20:55] RewriteRule ^/thumb/.*$ /w/thumb_handler.php [L]
[21:21:19] I think I want to try without the [L]
[21:22:09] both ways, I get:
[21:22:09] The source file for the specified thumbnail does not exist.
[21:22:34] however, I also see
[21:22:35] ls -al /mnt/upload6/wikipedia/en/d/d8/Wikiwsy.jpg
[21:22:35] -rw-r--r--+ 1 apache apache 56703 Aug 11 2006 /mnt/upload6/wikipedia/en/d/d8/Wikiwsy.jpg
[21:22:56] yeah, so the source is there
[21:23:48] maplebed: Not important that 1px-Wikiwsy.jpg != Wikiwsy.jpg ?
[21:24:04] 1px is supposed to be the thumbnail size.
[21:24:15] *nod* so not relevant, makes sense
[21:24:26] maplebed: what tool are you using to make requests? curl?
[21:24:27] the anatomy of the thumbnail URL is thumb/orig/thumbsizepx-orig
[21:24:30] AaronSchulz: telnet.
[21:24:32] :D
[21:25:11] I ran tcpdump on an active image scaler to get the incoming pattern and modified it.
[21:25:13] this is my call:
[21:25:18] GET http://en.wikipedia.org/thumb/d/d8/Wikiwsy.jpg/1px-Wikiwsy.jpg HTTP/1.1
[21:25:18] Accept: */*
[21:25:18] Host: en.wikipedia.org
[21:25:18] User-Agent: bentest
[21:25:18] X-Original-URI: /wikipedia/en/thumb/d/d8/Wikiwsy.jpg/1px-Wikiwsy.jpg
[21:25:21] Connection: close
[21:25:56] here's an example of one that works:
[21:26:04] GET http://commons.wikimedia.org/w/thumb.php?f=Little_kitten_.jpg&width=1 HTTP/1.1
[21:26:04] Host: commons.wikimedia.org
[21:26:04] Accept: */*
[21:26:04] Proxy-Connection: Keep-Alive
[21:26:04] X-Forwarded-For: 31.135.82.77, 208.80.152.60, 10.0.6.210
[21:26:06] X-Original-URI: /wikimedia/commons/thumb/a/a2/Little_kitten_.jpg/120px-Little_kitten_.jpg
[21:26:14] User-Agent: Mozilla/5.0
[21:26:51] so the only headers I left out are the X-Forwarded-For and the Proxy-Connection
[21:26:56] both of which should be fine.
[21:28:57] I'm tempted to enable local apache logging for a bit
[21:30:41] ok, I can hit thumb_handler.php directly (just checking for sanity)
[21:32:52] * AaronSchulz tries to find the apache conf
[21:33:17] /usr/local/apache/conf/main.conf
[21:33:21] (is the part that I modified)
[21:34:16] not apache2?
[21:34:26] * AaronSchulz must have been looking in the wrong place ;)
[21:34:54] /etc/apache2/ has a symlink.
[21:34:59] our apache config is a bit of a maze.
[21:35:30] ahh, I was in /etc
[21:36:18] actually I followed that into /wmf, which symlinks to where you were ;)
[21:36:27] so I guess I was in the right spot
[21:39:24] the lines I added were 251-252.
[21:48:13] maplebed: and apache was restarted? :)
[21:48:18] yes.
[21:50:02] 'The source file for the specified thumbnail does not exist.'
[21:50:04] hrm
[21:50:47] can you put something in thumb_handler.php so we can see how it was called?
[21:52:33] yeah
[21:59:08] maplebed: it might be some bad protocol-relative handling...I'll try some stuff in eval.php
[21:59:21] not sure what you mean...
[21:59:26] from the logging, it's getting the right original request url
[21:59:45] ok, so the rewrite part is working correctly?
[21:59:51] I think so
[22:00:09] cool. I'll start prepping the change across all the projects then.
[22:00:11] though it should be restricted to the scalers
[22:01:00] hrm, well, then again I guess people can manually hit thumb.php already
[22:01:13] I'd prefer that these things always happen on the scalers though
[22:03:46] maplebed: I think I see the bug...it's with code not expecting getZoneUrl() to be protocol-relative
[22:04:31] Aaah
[22:04:36] Protocol-relative URLs
[22:04:41] it sees the first char as '/' and assumes it's already a path (no host), but actually it's '//upload.wikimedia.org...'
[22:04:42] A nostalgic smile appears on my face
[22:06:03] protocol relative urls
[22:06:05] ewww ;)
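[editor's note: a hypothetical illustration of the bug pattern AaronSchulz describes above — not the actual MediaWiki code:]

    <?php
    // Buggy check: treats anything starting with '/' as a local path.
    if ( $url[0] === '/' ) {
        // ...but '//upload.wikimedia.org/...' starts with '/' too,
        // even though it's a protocol-relative URL with a host.
    }

    // Safer: test for the '//' prefix first.
    if ( substr( $url, 0, 2 ) === '//' ) {
        // protocol-relative URL (host present, no scheme)
    } elseif ( $url[0] === '/' ) {
        // genuinely a host-relative path
    }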
[22:07:09] whee, osx upgrade. rebooting.
[22:19:08] Is http://svn.mediawiki.org/doc/ still up-to-date now we're on git? It was auto-gen'd at midnight today.
[22:20:37] Very good question, spagewmf. I say we ping Reedy about it. Who may not know, but deserves to be randomly pinged.
[22:23:36] Amgine thx. uhh, !bot /msg wm-bot randomly ping Reedy is svn.mediawiki.org/doc up to date !logger do
[22:23:56] Yeah, it is
[22:24:01] it's using the git repo
[22:24:05] but domains weren't changed etc
[22:24:08] you're soooo good.
[22:24:14] at most it should be 24 hours old
[22:24:40] heh...
[22:24:42] AaronSchulz: you're good to keep poking for a while?
[22:24:50] I gotta go help cook dinner.
[22:25:01] well I fixed the obvious bugs, but I'll have to think on this one
[22:25:20] it's a matter of getting the path conventions to match up better
[22:33:21] Reedy, excellent. I wonder if there's a way to indicate svn.mediawiki.org/doc/ is up-to-date while other features there are obsolete
[22:33:43] Maybe
[22:33:51] I'm not sure how customisable the templates etc are
[22:34:18] It should just be moved to doc.mediawiki.org :p
[22:56:10] maplebed: I added a hack that makes this seem to work
[22:56:55] I'll need to do something proper in master, probably add a bit of config
[23:09:29] party in the mario room
[23:09:33] so, dreaded question
[23:09:36] who knows shit about regex?
[23:09:55] I'm trying to match the *last* link to a user page through to the end of the line
[23:10:24] so I'm using this excessively modified regular expression: /\[\[User:([^\|\]]+).{1,250}?\ \d\d:\d\d,\ \d{1,2}\ \w+\ \d{4}\ \(UTC\)$/msgix
[23:10:38] That's not just a user page link
[23:10:45] It seems to try to match a timestamp too?
[23:10:48] correct
[23:10:54] it's trying to match signatures
[23:11:13] but given this input, it matches the whole thing:
[23:11:13] [[User:Shrike]] Per Andy and Marek. [[User:Saedon|Sædon]][[User talk:Saedon|talk]] 21:02, 23 July 2012 (UTC)
[23:11:27] however, if I chop off the first link, it works correctly
[23:11:36] so obviously the non-greediness isn't doing the trick
[23:11:48] or, more likely, doesn't work as I expect it to
[23:12:28] Well of course
[23:12:53] It thinks the user name is "Shrike]] Per Andy and Marek. [[User:Saedon|blahblahblah"
[23:13:01] that's not correct
[23:13:04] look at this part
[23:13:20] \[\[User:([^\|\]]+)
[23:13:24] Oh wait you're right, yeah
[23:13:26] But
[23:13:32] .{1,250}?
[23:13:41] Optionally match 1 to 250 arbitrary characters
[23:13:52] No, match 1 to 250 characters non-greedily
[23:13:58] Right
[23:14:20] my assumption is that "non-greedily" affects the number of characters but not the start position
[23:14:34] so it would work if I reversed the input and reversed the regular expression
[23:15:19] Ahm, maybe?
[23:15:40] I understand it as "try to match until the next part of the regex matches"
[23:16:25] yeah
[23:16:29] I might need a two-stage thing
[23:16:47] find all possible start points, then get the run from the last one to the end of the line
[23:18:17] Right, use something like lastIndexOf or strrstr to find the last occurrence of [[User:
[23:18:36] And then regex on the end of the string
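[editor's note: a sketch of the two-stage approach Roan suggests, written in PHP for illustration (the asker's language isn't stated); strrpos() stands in for the lastIndexOf/strrstr he gestures at, and the simplified pattern is the editor's, not the original bot code:]

    <?php
    $line = '[[User:Shrike]] Per Andy and Marek. [[User:Saedon|Sædon]]'
          . '[[User talk:Saedon|talk]] 21:02, 23 July 2012 (UTC)';

    // Stage 1: find the start of the *last* user page link.
    $start = strrpos( $line, '[[User:' );

    // Stage 2: anchor the signature regex at that offset only.
    if ( $start !== false && preg_match(
        '/\[\[User:([^|\]]+).{0,250}?\d\d:\d\d, \d{1,2} \w+ \d{4} \(UTC\)$/u',
        substr( $line, $start ),
        $m
    ) ) {
        echo $m[1]; // "Saedon" — the last signer, not the whole line
    }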
[23:43:48] In bugzilla, what's the project for a random web site bug like svn.mediawiki.org/ ?
[23:44:15] Wikimedia -> SVN?
[23:44:21] Is that still there..
[23:44:44] -> Subversion
[23:44:45] Reedy yes (Subversion), thanks.
[23:52:59] D'oh, already filed, bug 35663 (against svn.wikimedia.org, whereas I was looking for svn.mediawiki.org -- they seem to be the same 8-/ )
[23:53:30] yeah, they were made aliases at some point