[00:59:12] I think I've asked this before, but: where's the unified list/matrix of what extensions are deployed on what wikis?
[00:59:16] (at WMF)
[00:59:21] Reedy: do you know?
[00:59:27] is it in Puppet?
[00:59:59] The list we tend to use is http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/make-wmf-branch/default.conf?view=markup
[01:00:44] There is also https://www.mediawiki.org/wiki/Category:Extensions_used_on_Wikimedia, but that might not be such a useful data source
[01:01:10] Reedy: I knew about the latter, not the former. but is there a headache-inducing table of what extensions are deployed where?
[01:02:38] Not that I know of
[01:02:47] it wouldn't be too difficult to make
[01:02:50] Right
[01:02:55] and probably actually would be somewhat useful
[01:02:59] The interesting thing would be for the testers, I think
[01:03:12] "ok, if you want to test, here's how you get bang for the buck.... test on these labs instances"
[01:04:25] Yeah.. There are ones that are deployed everywhere, some that are deployed to certain projects (wikisource being the main exception), then a few optionally enabled
[01:05:04] Summer of Code project: a tool to generate the deployed extensions matrix ;-)
[01:06:37] i wonder what would be the best way of listing/splitting the actual wikis/projects
[01:06:55] I suppose it could be done programmatically with InitialiseSettings and some hacking.. :/
[01:14:35] hello, I'm trying to write a PHPUnit test for an SMW class, but I'm having trouble. I'm getting an error message: "Fatal error: Class 'SMWDataItem' not found in C:\xampp\htdocs\mediawiki\extensions\SemanticMediaWiki\includes\dataitems\SMW_DI_Bool.php on line 15" can anyone help me with this issue?
[01:15:30] Is the class loaded by the autoloader?
[01:16:20] chrismcmahon: Krinkle ^^^ we've got a unit tester here!
[01:17:55] I'm not sure it's related to PHPUnit
[01:17:58] SMWDataItem is an SMW class
[01:19:44] seems to be
[01:19:46] here's the thing: I've started writing a test for the SMW_DI_Bool class, and I haven't run into problems there. then I created a file test1.php so I could try the test; there I included the test class I have written, and also 'PHPUnit.php', and I got the error message
[01:19:49] (autoloaded)
[01:20:02] included?
[01:20:23] require_once 'SMW_DI_Bool_TestCase.php'; require_once 'PHPUnit.php';
[01:20:32] Why are you doing that?
[01:20:46] you should use something like php phpunit.php /path/to/extension/test/file.php to run just the test file
[01:21:46] yeah, but I'm not running it on the command line, I've echoed the result in a browser
[01:21:48] here's the file
[01:21:56] require_once 'SMW_DI_Bool_TestCase.php';
[01:21:56] require_once 'PHPUnit.php';
[01:21:56]
[01:21:56] $suite = new PHPUnit_TestSuite("SampleTest");
[01:21:56] $result = PHPUnit::run($suite);
[01:21:57]
[01:21:57] echo $result->toHTML();
[01:22:07] Don't do that
[01:22:10] ^
[01:22:26] Because then 99% of MediaWiki is missing
[01:22:45] so, I should run it on the command line?
[01:22:49] Yep
[01:22:54] ok, let me try
[01:22:55] and register any classes you create in the autoloader
[01:22:56] thanks
[01:23:06] how do I do that? :)
[01:23:24] sorry for the stupid questions, I'm a newbie :)
[01:23:25] I'm not sure how that works for PHPUnit, maybe Reedy knows
[01:25:31] $wgHooks['UnitTestsList']
[01:25:42] $wgHooks['UnitTestsList'][] = 'efCodeReviewUnitTests';
[01:25:54] function efCodeReviewUnitTests( &$files ) {
[01:25:54] $files[] = dirname( __FILE__ ) . '/tests/CodeReviewApiTest.php';
[01:25:58] return true;
[01:25:58] }
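For the SMW case above, the same registration pattern might look like the sketch below; the hook function name and the test file path are invented for illustration, not actual SMW code:

    $wgHooks['UnitTestsList'][] = 'efSMWUnitTests';
    function efSMWUnitTests( &$files ) {
        // Point MediaWiki's test runner at the extension's test file, so it
        // is loaded with the rest of the environment (autoloader included)
        // already bootstrapped.
        $files[] = dirname( __FILE__ ) . '/tests/SMW_DI_Bool_TestCase.php';
        return true;
    }

With that in place, the test runs from the core checkout with something like php tests/phpunit/phpunit.php extensions/SemanticMediaWiki/tests/SMW_DI_Bool_TestCase.php, per Reedy's suggestion above; the exact path depends on where the test actually lives.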
[01:40:26] same thing happens when I run it on the command line
[02:14:33] filipt: you might want to post your question to the mailing list, if you aren't getting help in IRC right now
[02:14:35] !lists
[02:14:35] mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details.
[02:19:24] ok, I'm working this out with Reedy, if we can't figure it out, I'll post there
[02:19:26] thanks
[02:21:16] ok, good luck filipt
[02:21:27] I've got it to WFM now ;)
[02:26:31] nice
[02:52:27] by the way filipt -- there are MediaWiki-related events coming up
[02:52:30] !events | filipt
[02:52:31] filipt: We run face-to-face events for MediaWiki developers and people who want to learn: https://www.mediawiki.org/wiki/MediaWiki_developer_meetings We also have online meetings in IRC to triage bugs: https://www.mediawiki.org/wiki/Bug_management/Triage
[03:04:10] sumanah: srikanth and I are planning on doing more small events. Next city tbd
[03:04:19] YuviPanda: :D
[03:04:29] sumanah: will keep you updated. Need to respond to you on that email thread as well
[03:04:42] YuviPanda: here's hoping that our mentoring community has the time to keep up. That's what I'm worried about
[03:05:04] we still have at least 5 extensions, a dozen patches in Gerrit and 170+ in Bugzilla awaiting review
[03:05:14] well, one thing i've been doing is to make people concentrate on 'wikimedia' as such
[03:05:16] rather than mediawiki
[03:05:18] so if you can get people writing gadgets and API-using tools that's better
[03:05:21] exactly
[03:05:23] there is a *large* community for userscripts, gadgets, tools
[03:05:26] huge
[03:05:31] that is much easier and lower-hanging fruit
[03:05:36] + more people to mentor
[03:06:17] but then what's the lifespan of those tools?
[03:06:49] do they stay alive? will the authors put them into the megarepo when Gadgets 2.0 comes along?
[03:06:58] * sumanah is not discouraging these things, just wants to not waste work
[03:08:06] sumanah: they do
[03:08:15] sumanah: people pick them up and maintain them.
[03:08:19] 'fork' is copy-pasting
[03:08:25] it could definitely use a lot of help though
[03:09:05] sumanah: i've been spending my off time in the last month or so writing some gadgets/userscripts for the enwiki India community. Should sit down and write a few blog posts when it is done.
[03:09:39] WHOOO
[03:09:40] yes
[03:09:47] you could maybe put one up on the wmf blog
[03:10:06] btw, YuviPanda, this is worth a skim for the feel of it: https://meta.wikimedia.org/wiki/User:Ziko/Berlin_diary
[03:10:13] and for some nice aphorisms under the photos
[03:10:36] like "Only he who feels secure, is tolerant."
[03:10:38] he's from the future?
[03:10:48] oh wait
[03:10:48] and "When you think that you are exhausted, you still have two thirds of your power."
[03:10:51] wow
[03:10:55] i am living in april 2012
[03:10:58] I am from the future
[03:10:58] okay
[03:10:59] YES
[03:11:01] you are!
[03:11:18] :D
[03:11:33] sumanah: sent to kindle. Will read later today.
[07:16:31] hmph
[07:16:57] all of SF sleeping already?
[07:17:48] * exbeinfo guesses so
[07:39:58] after midnight
[07:40:04] if they aren't sleeping, at least they are not online
[13:50:27] qchris: ping pong ? :-D
[13:50:38] I read your long mail this morning when waking up
[13:50:39] hashar: Hi
[13:50:48] I don't remember anything from it
[13:50:49] Sorry for having to leave yesterday :(
[13:50:50] but
[13:50:57] it surely got me confused :-D
[13:51:04] oh ... sorry
[13:51:08] leaving is fine :-D
[13:51:21] I wrote it late in the night. ... maybe I should rephrase some parts
[13:51:33] * qchris rereads his email
[13:51:44] * hashar rereads after several coffees
[13:51:52] *gg*
[13:51:53] k
[13:52:06] I wanted to come back to the issue on Monday (I saw this is your 20% time)
[13:52:20] sure if you want
[13:52:24] probably a better time for both of us
[13:52:25] Anyways thanks a lot for getting PHPUnit updated :D
[13:52:33] :-)
[13:52:52] Any time is fine for me. But I do not want to disturb you more than necessary
[13:53:19] just one thing qchris , at the end of the mail, you asked me to add an assert in tests/phpunit/maintenance/backupTextPass.php
[13:53:28] you could do it in a new commit and send that for review
[13:53:35] oh ... ok
[13:53:41] Jenkins will happily fetch that commit, apply it to master and trigger the tests :-]
[13:53:48] But this additional line would not be kept in the final version of the patch
[13:54:15] well you could just abandon the change later :-D
[13:54:16] Ok ... but debugging via Jenkins. Should I really go for this?
[13:54:25] sometimes you have to
[13:54:36] I will try. Thanks. Hopefully ppl will not kill me for flooding IRC with this ;)
[13:54:38] we did with reedy this week cause we had a nasty bug on the jenkins host
[13:54:51] turns out we cannot reproduce the bug anymore since PHPUnit has been upgraded to 3.6.x
[13:55:02] Ok. So I'll give it a try
[13:55:14] Thanks.
[13:55:27] the rest we can talk about on monday :-)
[13:55:35] can even set up a Skype chat if you want to say hi
[13:55:36] Ok. That's fine
[13:55:46] I am a fan of free software :D
[13:55:53] But if you prefer Skype, it's ok.
[13:56:29] if you've got any recommendation for conferencing software which is FOSS and works on Mac .. I am all for it :D
[13:56:42] *gg*
[13:56:50] IRC is a great conferencing protocol
[13:57:00] But without talking ... yes.
[13:57:10] <^demon> Except when freenode netsplits :)
[13:57:25] *gg*
[13:57:35] <^demon> hashar: Just use G+
[13:57:42] unfree.
[13:57:54] just mentioning it.
[13:57:55] + I don't like big brother listening to my conversations
[13:57:57] <^demon> Free as in beer :)
[13:58:07] ROFL
[13:58:11] gnome-meeting and its descendants?
[13:58:12] she quits to get a beer :-]
[13:58:25] <^demon> JeroenDeDauw: Did you see my 1-hour notice on migration of extensions? Thought I'd ping you since you're the biggest single affected user today.
[13:59:15] fwiw, mozilla uses vidyo
[13:59:27] which I don't think is free
[13:59:55] <^demon> We've also got the mumble server, but it's kind of crap
[14:00:36] qchris: about mocks, I am not sure we ever used any
[14:00:44] qchris: I am not even sure how it works, will have to study that
[14:00:50] mumble could be made better
[14:00:59] qchris: PHPUnit has some support and someone told me about mockery https://github.com/padraic/mockery#readme
[14:01:09] qchris: something we might want to talk about monday :D
[14:01:12] hashar: Mocks are great! Like stubs, only ... useful :D
[14:01:21] <^demon> We tried introducing mocks once or twice, but we don't really use them, no.
[14:01:21] hashar: Ok. I'd love to.
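For reference, PHPUnit's built-in mock support (one of the two options mentioned above, next to mockery) looks roughly like this on the 3.6 series; the Clock interface and the test are a self-contained, hypothetical example rather than MediaWiki code:

    <?php
    // Hypothetical collaborator to be mocked.
    interface Clock {
        public function now();
    }

    class ClockConsumerTest extends PHPUnit_Framework_TestCase {
        public function testMockStubsAndVerifies() {
            // Build a mock of the interface; every method starts out stubbed.
            $clock = $this->getMock( 'Clock' );
            // Unlike a plain stub, a mock also verifies expectations:
            // here, that now() is called exactly once.
            $clock->expects( $this->once() )
                ->method( 'now' )
                ->will( $this->returnValue( 1333238400 ) );
            $this->assertSame( 1333238400, $clock->now() );
        }
    }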
[14:01:29] <^demon> I think writing mocks will be easier once we're on 5.4
[14:01:45] and one day I will have to investigate PHPUnit's ability to build database fixtures
[14:01:49] <^demon> I think traits will make that a bit easier so you can skip the code duplication.
[14:01:51] * ^demon shrugs
[14:01:53] <^demon> Maybe not
[14:02:13] hashar: It's focused around PDO, which MW does not use too much (as far as I know)
[14:02:23] * hashar googles PDO
[14:02:24] <^demon> We use PDO for Sqlite, but that's it.
[14:02:26] traits? Coool!
[14:02:39] <^demon> Yeah, 5.4 introduces traits :)
[14:02:41] I am pretty sure we will not use traits
[14:03:16] (cause the live site will certainly stress it so much that we will end up hitting a corner-case bug that produces blank pages) :D
[14:03:20] <^demon> I'll use traits before LSB :)
[14:03:31] I don't even understand the use case for LSB :-/
[14:03:58] <^demon> Neither do I. And from what I understand the PHP devs are kind of regretting adding it.
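The duplication-skipping idea could look like this on PHP 5.4, tying together the fixture and PDO points above; the trait and the test case are hypothetical sketches, not existing MediaWiki code:

    <?php
    // Hypothetical shared fixture code, written once instead of being
    // copy-pasted into every test case that needs a database.
    // Requires the pdo_sqlite driver.
    trait DatabaseFixture {
        protected $db;

        protected function setUpTestDatabase() {
            $this->db = new PDO( 'sqlite::memory:' );
            $this->db->exec( 'CREATE TABLE page ( page_id INTEGER, page_title TEXT )' );
        }
    }

    class PageStoreTest extends PHPUnit_Framework_TestCase {
        use DatabaseFixture;

        protected function setUp() {
            $this->setUpTestDatabase();
        }

        public function testInsertedRowIsCounted() {
            $this->db->exec( "INSERT INTO page VALUES ( 1, 'Main_Page' )" );
            $this->assertEquals( 1,
                $this->db->query( 'SELECT COUNT(*) FROM page' )->fetchColumn() );
        }
    }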
[14:05:10] ^demon: nope, but thanks for pinging me now :)
[14:05:33] ^demon: regarding 5.4, I am pretty sure ops will push to have it on the cluster
[14:05:33] <^demon> You're welcome.
[14:05:44] <^demon> What reason?
[14:05:54] ^demon: I am pretty sure Asher will show nice graphs showing a X * 10% improvement over 5.3
[14:05:57] ;-D
[14:06:02] <^demon> hehe :)
[14:06:11] then it will become a top-priority project
[14:06:18] <^demon> I was like "we already have register_globals disabled, so the outright removal doesn't really affect us."
[14:06:45] <^demon> I imagine we'll probably upgrade around when we move the apaches to the next LTS that has 5.4 shipped.
[14:07:36] or just migrate to lighttpd
[14:07:45] or some embedded PHP server
[14:09:37] <^demon> Pfft, the embedded server is never going to be production-grade.
[14:09:42] <^demon> It's designed for testing :p
[14:16:18] :)
[14:42:14] whii
[15:01:08] <^demon> Nikerabbit: Can we try core again?
[15:03:48] <^demon> If this doesn't work I might just have to give up computers.
[15:04:03] <^demon> Become a hermit and live on a mountain top.
[15:04:55] ^demon: as someone once said, quitting a microchip project with nanosecond-level timing issues:
[15:05:12] "I am leaving for a commune in Vermont where I will deal with no unit of time shorter than a season."
[15:07:03] back in a few min
[15:24:44] ^demon: sure
[15:31:46] sumanah - ah, i remember that — it was in Soul of a New Machine, right?
[15:31:57] hi rsterbin yup!
[15:32:29] the weird thing is i remember almost nothing else about that book
[15:32:37] ha!
[15:32:41] how long ago did you read it?
[15:32:56] college, so … gah, almost ten years.
[15:33:28] ^demon: can you poke me when you are ready to test?
[15:33:52] <^demon> Actually I totally lost track of time.
[15:33:57] hashar: simple as that, eh? ;)
[15:34:06] <^demon> Let's do this really quick so I can either be happy or yell some more.
[15:34:23] ^demon: hah, so I can commit?
[15:34:24] Reedy: simple what ? :D
[15:34:28] <^demon> Yes
[15:34:57] hashar: using old software sucks!
[15:35:11] Reedy: which software are you talking about ?
[15:35:19] phpunit on gallium
[15:35:28] <^demon> Ok, I don't care what Ryan says, patchset-created is *not* getting run on non-puppet repos.
[15:35:33] * ^demon kicks and fumes.
[15:35:37] <^demon> Nikerabbit: Thanks.
[15:36:56] Reedy: thanks to mutante, we are finally running PHPUnit 3.6.10 :-]
[15:37:02] Yaah :D
[15:38:10] ^demon: np
[15:38:40] oh my
[15:38:51] I will probably have to rethink the whole jenkins system :-D
[16:14:33] <^demon> Migration went smoothly for 33/34 extensions :)
[16:14:40] yay
[16:14:48] Which is the odd one out?
[16:14:58] ^demon: yay for the 33
[16:15:11] <^demon> WikiShare. I didn't notice when I was dumping, but it turns out someone asked for an empty extension to be migrated.
[16:15:19] lol
[16:15:27] <^demon> There was no history so git balked and was like "I can't push that homey."
[16:15:28] Well that's easy isn't it
[16:15:57] <^demon> Yeah I just gotta ping varnent about it.
[16:30:45] ^demon: so will l10n updates get delayed to next monday again?
[16:31:21] <^demon> I'm still looking into it today
[16:32:30] okay
[16:32:51] I can of course do manual test commits without actual i18n updates if needed :)
[16:57:53] hi, i noticed a small inconsistency in the API response for list=allimages: aiprop=dimensions and aiprop=size do exactly the same thing: both return both the dimensions and the size
[16:58:06] do you think something like that is worth fixing?
[16:58:51] <^demon> varnent: Hello!
[16:58:55] Could it be deliberate?
[16:59:00] Svick: could it be sort of an alias?
[16:59:33] it doesn't look like an alias to me; it's not documented that way
[16:59:47] I think it is
[16:59:55] ^demon: greetings :)
[17:00:29] Probably a b/c alias
[17:00:46] Yeah
[17:00:59] RoanKattouw: "b/c"?
[17:01:04] Backwards compatibility
[17:01:06] Yup
[17:01:07] 'size' => ' size - Adds the size of the image in bytes and the height, width and page count (if applicable)',
[17:01:07] 'dimensions' => ' dimensions - Alias for size', // For backwards compatibility with Allimages
[17:01:14] ^ code comments say as much
[17:01:18] Svick: but thanks for noticing :D
[17:01:22] <^demon> varnent: I tried migrating WikiShare this morning with the other extensions, but you don't seem to have ever checked any code into SVN, so there was no history. Do you just need an empty repo and you'll start checking stuff in?
[17:01:37] ^demon: yeah - that's fine
[17:01:47] <^demon> Ok I'll do that then.
[17:02:02] Reedy: oh, right, i didn't notice that; i looked at allimages only
[17:02:08] ty - I thought I had put a couple more files in there - but apparently not :) most of that extension remains in scattered code and notes
[17:02:19] varnent: I hereby scold you and tell you to commit things.
[17:02:44] sumanah: sorry :) I need to give that extension's development some love now that Wikimania reviewing is done
[17:02:45] But as a carrot I give you a cool thing: http://outreach.wikimedia.org/wiki/Best_practices_in_training_adults
[17:02:56] varnent: it is only the mildest of scolds
[17:03:04] lol :)
[17:03:13] it's like a frown and a "Gregory, I am a little disappointed in you, now go forth and sin no more"
[17:03:33] you've been taking notes from my priest :)
[17:03:47] Svick:
[17:03:48] 'prop' => array(
[17:03:48] ApiBase::PARAM_TYPE => ApiQueryImageInfo::getPropertyNames( $this->propertyFilter ),
[17:04:03] sumanah: i need to explore the outreach wiki more
[17:04:07] So line 494 onwards of ApiQueryImageInfo.php
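Concretely, the alias means that these two requests return the same height, width and size fields for each image (paths relative to any wiki's api.php; ailimit=5 is just an example value):

    api.php?action=query&list=allimages&ailimit=5&aiprop=size
    api.php?action=query&list=allimages&ailimit=5&aiprop=dimensions

Removing aiprop=dimensions outright would break callers that still send it, which is why it is kept as a backwards-compatibility alias rather than fixed.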
[17:04:25] <^demon> varnent: Ok, you've got an empty initial commit so you can clone it now.
[17:04:42] ^demon: excellent - ty - how are the migrations going?
[17:04:59] <^demon> Well now that you're settled, all 34/34 migrated today.
[17:05:03] <^demon> No complaints from anyone yet.
[17:05:04] YAY
[17:05:09] yay! :)
[17:05:18] ^demon: are you going to sneak away while no one has had a chance to complain yet? :D
[17:05:31] I was gonna say - that sounds more like a lack of time in knowing it's happened :)
[17:05:33] <^demon> I might sneak away for some lunch though
[17:05:38] JeroenDeDauw: have you tried committing to the Semantic extensions that are now in Git within the last few minutes?
[17:05:55] <^demon> Well, last ~hour :)
[17:06:30] Sorry, yeah
[17:06:51] sumanah: I wonder if over time we should put some notes in there about online trainings and other specialized items not yet included
[17:07:10] varnent: seems reasonable, I'm not going to prioritize it now - as you said, over time.
[17:07:22] Reedy: thanks
[17:07:33] Svick: how are you liking the API in general?
[17:07:35] sumanah: right - I imagine this will be helpful for wikimania prep in the meantime
[17:08:20] sumanah: unrelated.. you noticed a few new articles on Meta that are basically "here's my latest vent about something WM/WMF-related"? - glad I don't have to monitor that site :)
[17:08:37] varnent: I may have missed those too......
[17:09:02] sumanah: you're not missing much :)
[17:09:25] sumanah: in general, I like it, that's why I'm using it in my bachelor's thesis :-)
[17:09:33] Fantastic, Svick
[17:09:40] are you coming to the Berlin hackathon or one of our other events?
[17:09:42] !events
[17:09:42] We run face-to-face events for MediaWiki developers and people who want to learn: https://www.mediawiki.org/wiki/MediaWiki_developer_meetings We also have online meetings in IRC to triage bugs: https://www.mediawiki.org/wiki/Bug_management/Triage
[17:09:42] We run face-to-face events for MediaWiki developers and people who want to learn: https://www.mediawiki.org/wiki/MediaWiki_developer_meetings We also have online meetings in IRC to triage bugs: https://www.mediawiki.org/wiki/Bug_management/Triage
[17:10:11] hmmm
[17:10:22] mw-bot has mini-me's..
[17:14:01] sumanah: but there are some small issues that I noticed now that I'm looking at it more closely
[17:14:17] sumanah: for example, every module follows the same structure: the data is in a tag with the name of the module, that's good
[17:14:18] Svick: got it. patches welcome of course. thanks for looking at it
[17:14:51] sumanah: except expandtemplates, which adds a second tag in some cases
[17:37:29] Ryan_Lane: ^demon - do you have an estimated week or day for the gerrit upgrade? just scheduling around other deployments as necessary
[17:38:04] <^demon> No, nothing scheduled yet.
[17:40:33] binasher: hope you feel better soon.
[17:40:40] thanks
[17:47:35] sumanah: until it's fully tested it won't get scheduled
[17:49:00] <^demon> And we're still waiting for the final 2.3 release, I don't want to upgrade to a release candidate.
[17:51:29] hi guys! I'm applying to GSoC; here is my proposal http://www.mediawiki.org/wiki/User:Davidpardz/GSoC_2012_application I think this could be a suitable idea for getting a new member into the open source community, what do you think?
[17:55:17] psst davidpardz, you should mention the subject of your proposal to get more people interested
[17:56:20] ^demon: got it. so the flow will go: 2.3 gets released, we put it on a test server, we ask the MW development community to try it out and test it, we make some test cases up and try them, and then we upgrade gerrit.wikimedia.org ?
[17:56:38] <^demon> Something like that.
[18:08:56] hi guys, a brief explanation of my proposal: the main idea is to add a feature to the UploadWizard extension capable of handling copyright issues. When media is uploaded to Commons, a review is sometimes necessary. A feature like this could focus on two things: the first is to detect copyright issues before an image from Flickr is uploaded, and the second is to give the necessary info to the reviewers
[18:08:58] and the FlickerevieweR (bot), which will help them make their contribution easily. http://www.mediawiki.org/wiki/User:Davidpardz/GSoC_2012_application
[18:25:21] so somehow I got a linting job triggering but can't remember how I set it up in jenkins
[18:32:06] wrong job!! :D
[18:32:08] fixed
[19:39:34] i'm working on adding information about the values the API returns to action=paraminfo, but I can't decide how to represent properties that can be added by several different “props”
[19:39:47] have a look at http://en.wikipedia.org/wiki/User:Svick/Sandbox for the options i'm thinking about
[19:40:51] for example, both iiprop=comment and iiprop=parsedcomment can produce the commenthidden property
[20:25:01] New patchset: Hashar; "Universal linter job, `php -l` for now" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4485
[20:25:24] Platonides: here is the linting job :)
[20:26:06] the interesting file is https://gerrit.wikimedia.org/r/#patch,unified,4485,1,jobs/_shared/build.xml
[20:59:17] Platonides: https://gerrit.wikimedia.org/r/#patch,unified,4485,1,jobs/_shared/build.xml
[20:59:23] that is an ant build script :/
[20:59:24] I hadn't looked at this channel :P
[20:59:39] you probably want to look at the full file
[21:00:35] Platonides: https://gerrit.wikimedia.org/r/gitweb?p=integration/jenkins.git;a=tree;f=jobs/_shared;h=fed9c9d6f8f7f28203e5e6a1492b62ac35c3bc9c;hb=HEAD
[21:00:49] was trying to do it with git-review
[21:00:55] ohh or that
[21:01:05] if you already have a clone of integration/jenkins :-D
[21:01:53] anyway
[21:01:54] jobs/_shared/build.xml
[21:02:14] the build.xml is repeated in different folders ?
[21:02:17] I got a target named git-files-changed which finds out the various files that got modified
[21:02:31] and php-lint applies php -l to that list of files
[21:02:45] yup, the _shared/build.xml is shared among all jobs
[21:02:52] so you're just doing a diff --name-only HEAD^ | xargs php -l
[21:02:55] I find it is easier to handle that way
[21:03:00] git diff --name-only HEAD^ | xargs php -l
[21:03:26] no, there's a tr below
[21:03:31] git diff --name-only HEAD^ | tr \\n " " | xargs php -l
[21:03:49] or: git diff --name-only HEAD^ | xargs -n1
[21:04:09] hmm you are right, maybe it would be easier that way
[21:04:26] well, you can use ant buildscripts if you want
[21:04:27] with a |egrep "\.php$"
[21:04:34] I find it a bit confusing
[21:04:51] hey! I had in my console: git diff --name-only HEAD^ | grep \\.php$
[21:05:04] the original idea was to have git-files-changed initialize a list of files
[21:05:28] then have each linting target (python, javascript, css, php) filter out the files they do not want to lint
[21:05:50] it's probably easier to filter those you want to lint
[21:06:14] but you'd want the grep between git diff and tr
[21:06:20] so I can just copy-paste git diff --name-only HEAD^ into every linting target
[21:07:00] you could have a git-php-files-changed, git-python-files-changed...
[21:07:10] each of them depending on git-files-changed and adding a filter
[21:07:20] http://stackoverflow.com/questions/3931608/how-do-i-use-the-ant-exec-task-to-run-piped-commands
[21:07:22] ant is ugly :-D
[21:07:29] so I will write shell scripts for each case
[21:07:36] and that suddenly will become easier
[21:07:38] and nicer
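Putting the pieces of that pipeline together, the per-language filtering might look like the following PHP driver (a sketch only; lint.php is a hypothetical name, and hashar planned plain shell scripts instead):

    <?php
    // Collect the files touched by the commit under test, one per line
    // (the git-files-changed step from the build.xml discussion).
    exec( 'git diff --name-only HEAD^', $files );
    foreach ( $files as $file ) {
        // Each linting target filters out the files it does not handle;
        // this is the grep \.php$ step for the PHP linter.
        if ( substr( $file, -4 ) !== '.php' ) {
            continue;
        }
        // php -l accepts only one file per invocation, hence the loop
        // (the xargs -n1 equivalent).
        passthru( 'php -l ' . escapeshellarg( $file ), $status );
        if ( $status !== 0 ) {
            exit( $status );
        }
    }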
[21:08:20] or makefiles
[21:08:30] why did you begin with ant ?
[21:08:54] that is what was used before
[21:08:57] so I stuck with it
[21:09:04] hashar, were you looking at how to do a pipe in ant?
[21:09:06] if it was only me, I would have used rake (and ruby)
[21:09:10] yup
[21:09:21] look at build.xml line 193
[21:09:28] you have an instance there, with tr
[21:09:40] that would work I guess
[21:09:55] that's even uglier
[21:10:08] ok, let me try
[21:10:43] Platonides: well don't spend too much time on that
[21:10:56] I will be back on tuesday
[21:11:03] and going to head to bed like right now
[21:12:07] I'm trying to send it to gerrit
[21:12:19] remote rejected :(
[21:12:23] :D
[21:12:29] you might not be allowed
[21:12:30] ok, let's send it as a different commit
[21:12:43] New patchset: Platonides; "Filter php files only" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4487
[21:12:49] it depends on yours
[21:13:21] but that _should_ do the work
[21:13:33] OH!!!!
[21:13:35] that is a wonderful idea
[21:13:47] just create another variable :D
[21:14:16] I'm no Wizard of the ant
[21:14:26] just copying the example given :)
[21:14:41] why does it use gitfileschangedinline ? Wouldn't gitfileschanged work as well?
[21:15:47] New review: Hashar; "Excellent! Will resume my work on that on tuesday." [integration/jenkins] (master); V: 0 C: 1; - https://gerrit.wikimedia.org/r/4487
[21:16:03] Platonides: php -l only accepts one argument
[21:16:28] oh sorry
[21:16:40] for , I guess either variable would work
[21:16:51] anyway I am out to bed and the weekend. See you tuesday!
[21:25:31] kaldari: hey, got a moment?
[21:25:40] sure
[21:26:31] kaldari: please register at google-melange.com and tell me your "link ID"
[21:26:39] (I may have missed where you did this already)
[21:27:20] ok
[21:27:22] ...
[21:27:40] kaldari: is this like the point in a videogame where a character is talking and I should hit A to continue?
[21:31:53] no :)
[21:32:18] ok done
[21:32:24] my link ID is... wait for it...
[21:32:27] kaldari
[21:32:28] xD
[21:33:39] Dude
[21:33:59] I just realized that the following /actually works/ :
[21:34:12] "kal ... wait for it... dari"
dari" [21:35:28] kaldari: thank you [21:35:47] kaldari: in your inbox, you will find an invitation [21:35:55] kaldari: this is an invitation to a party called Melange [21:36:00] Melange is not the best party [21:36:38] Melange is in fact more like a mandatory training than a party [21:36:51] nevertheless I must ask you to accept this invitation [21:37:14] so that someday you too will be able to say of a web app, "well it's better than Gerrit I guesssssss" [21:39:00] <^demon> Pfft, I'd take gerrit over melange any day. [21:47:41] is it so bad?? [21:56:17] Ryan_Lane: you're hoping to mentor that OpenStackManager student this summer - right? [21:56:18] please register at google-melange.com and tell me your "link ID" [22:04:30] sumanah: how do I register? [22:04:43] ah [22:04:48] I didn't see a mentor link before [22:05:15] New patchset: Platonides; "Filter php files only" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4487 [22:05:25] sumanah: it's ryan_lane [22:06:06] Thanks Ryan_Lane [22:06:14] yw [22:06:18] Ryan_Lane: you should see an invitation to mentor for Wikimedia Foundation [22:07:24] RoanKattouw: is Trevor around? could you get him to sign up at google-melange.com? [22:07:47] He's WFH today [22:07:55] Or maybe he took the day off, I forget [22:08:00] He's moving tomorrow [22:08:04] RoanKattouw: got it [22:08:06] I don't see the proposal here [22:08:36] ah. now I do [22:08:42] Ryan_Lane: Suhas HS? [22:09:21] yea [22:09:23] I see it [22:17:19] New patchset: Platonides; "Use gitphpfileschanged, not so this should work as well as gitfileschangedinline, but only with php files." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/4491 [23:05:51] kaldari: are there any extensions that use pagination to display results? [23:06:07] I wanted to see if there was nay code that can be reused [23:06:07] CentralNotice [23:06:18] that is where I am looking [23:06:32] does SpecialBannerListLoader [23:06:34] use it? [23:06:39] no [23:06:46] but most of the other pages do :) [23:06:58] ok, that is where i was looking to add it :) [23:07:13] bug 30056 [23:08:19] looks like you mean SpecialBannerAllocation [23:08:25] yes [23:08:28] probably the easiest examples are in the log pages [23:08:49] i was looking at getJsonList [23:08:52] We've got a TablePager class (CodeReview uses this among others) [23:08:58] for SpecialBannerAllocation [23:09:09] thanks Reedy [23:10:12] do you have a link for documentation on TablePager? Reedy [23:10:18] there's a whole collection of Paging classes [23:10:37] includes\Pager.php [23:10:44] I'm not sure how well it's documented... [23:10:52] most of the ones already in CentralNotice are ReverseChronilogicalPager, which isn't what you want for SpecialBannerAllocation [23:11:22] thanks [23:11:29] but if you want a basic example of how the class works, look at CentralNoticeCampaignLogPager [23:11:41] right on [23:11:45] will look [23:11:47] they all inherit from the base Pager class [23:11:59] eventually [23:12:13] but there's a big family tree of them to choose from [23:12:56] much appreciated :) [23:16:51] uga chaga