[02:05:31] xyzram: Hi.
[02:58:32] New patchset: Diederik; "Add replace_space function" [analytics/webstatscollector] (time_travel) - https://gerrit.wikimedia.org/r/47827
[04:19:36] New review: Stefan.petrea; "The str pointer is actually url." [analytics/webstatscollector] (time_travel); V: 0 C: -1; - https://gerrit.wikimedia.org/r/47827
[04:24:08] New patchset: Diederik; "Add replace_space function" [analytics/webstatscollector] (time_travel) - https://gerrit.wikimedia.org/r/47827
[04:27:03] New patchset: Krinkle; "Enable voting for mwext-Math-jslint." [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/47833
[04:27:38] Change merged: Krinkle; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/47833
[08:23:18] hello
[08:32:49] heya
[08:49:25] anyone know how to set a proper copyright for a file on mw.org ? Susan maybe ? :-D
[08:49:40] some bot is threatening me to drop my screenshot at https://www.mediawiki.org/wiki/File:Git_fetchall_alias.png :-/
[08:50:59] ahh https://www.mediawiki.org/wiki/Project:File_copyright_tags
[08:51:01] that is it
[08:57:41] Just block the bot. :-)
[08:58:08] We really ought to stop uploading screenshots like that.
[08:58:11] Text should be in text form.
[09:04:23] Susan: well I like the screenshots :-]
[09:04:33] anyway
[09:04:36] time to write an RFC
[09:07:12] The OS X interface elements are copyrighted there, I think.
[09:07:21] Patrice R. got in trouble for this.
[09:07:39] Err, Patrick.
[09:09:07] Yeah.
[09:09:27] https://www.mediawiki.org/wiki/File:Vim_example.body_file.png etc. all had to be cropped.
[09:11:00] pfff
[09:11:09] can't I fair use them Susan ?
[09:11:33] I don't think mediawiki.org acknowledges fair use.
[09:11:35] Dunno.
[09:11:43] or we can wait for Apple legals to complain :-]
[09:11:53] The whole problem seems a bit ridiculous to me, as all of these screenshots just shouldn't be screenshots.
[09:11:58] They're blocks of text.
[09:12:26] yup
[09:12:42] but it is a pain in the ass to write colorized text in wikitext :D
[09:17:52] hashar: I think it's a subtle blackmailing tacting to get devs work on https://www.mediawiki.org/wiki/Files_and_licenses_concept
[09:18:36] hashar: I knew about Paris squids ;) but other deleted pages on wikitech will go... where?
[09:19:13] also, is labsconsole on a different host like wikitech to ensure it's up when cluster is down?
[09:25:34] Nemo_bis: I guess devs don't care that much about copyirght
[09:25:44] at least not as much as the commons.wm.o folks :-]
[09:26:04] labsconsole is hosted on labs which is more or less a different cluster
[09:26:11] though it shares the network accesses
[09:26:34] I have no idea where the current wikitech is hosted, most probably on an entirely different network
[09:26:39] linode
[09:26:48] tactic *
[09:26:56] omg hosting wikitech on labs
[09:27:04] hashar: You know there's a syntax highlighting extension installed on mediawiki.org. :-)
[09:27:11] You can just specify or whatever.
[09:27:15] Susan: yes that's what my hands wanted to type
[09:28:24] wikitech on labs sounds like evacuation manuals library in the middle of the local volcano
[09:30:11] the current wikitech is on linode, third party hosting
[09:31:26] the idea is that there would be a read only copy off site, synced continuously
[09:38:00] Susan: yeah I don't think it properly highlight git --log --oneline --decorate --color --graph :-D
[09:50:39] hashar: hi
[09:51:01] are you the one to bug about testing harnesses?
[09:51:44] yurik: kind of
[09:52:05] my tests fail on the server :(
[09:52:08] but work locally
[09:52:30] although the bigger issue is that the DB is not cleaned up before tests run
[09:52:51] change ? :-)
[09:52:54] hashar: is that the "kind of" ? or someone else?
[09:52:58] I mean what is the change number?
[09:53:00] https://gerrit.wikimedia.org/r/#/c/47839/
[09:53:03] sorry :)
[09:53:07] note that Jenkins uses SQLite as a database backend
[09:53:27] yeah, i suspect that might be another reason for this nice behaviour :)
[09:54:40] ohh
[09:54:45] hashar: if you look at the full log, it shows what it expects and what it gets - there are three tests failing, and all of them have identical issue
[09:54:57] look at the first one - its shorter
[09:55:03] yeah some format has changed apparently
[09:55:14] format?
[09:55:28] it returns two category entries instead of just one
[09:55:52] and I don't think you should put the ApiQuery.* files in yet another subdir
[09:55:54] :-D
[09:56:01] but that is not that much important
[09:56:12] there will (should) be tons of them
[09:56:30] but yeah, one issue at a time :)
[09:56:44] at 07:35:14
[09:57:09] the first result - the 'allcategories' call returns two identical entries :(
[09:58:12] well the expected and result are different
[09:58:24] the question is why :)
[09:58:31] the result has two entries in [query][allcategories]
[09:58:42] works for me! (tm)
[09:59:12] have you added all files in your change ?
[09:59:15] the point is that if I run it locally, and i suspect - anyone else as well, they will get a pass
[09:59:23] maybe it is missing some changes made in includes/api/ ?
[09:59:55] how would that be possible?
[10:00:05] you forgot to use git add ? :-]
[10:00:17] hehe, no such luck :)
[10:00:50] besides, this looks too much like a very deep issue
[10:01:08] of the duplicate database entries
[10:01:14] in a different db backend
[10:02:27] i guess i should bug whomever is doing sqllite db backend really
[10:02:54] hashar: as for pure integration question - why does it tell me to add a space after casting, and than - remove that space
[10:03:07] it ?
[10:03:09] compare the warnings in patch 1 & 2
[10:03:17] jenkins php checker
[10:03:38] forget about that one, that is still a bit experimental and that is mostly styling
[10:03:46] gotcha :)
[10:03:55] but maybe there are two conflicting checks, one wanting a space the other one not wanting it
[10:04:07] though I think I fixed that a few weeks ago
[10:04:36] seems that way -- https://integration.mediawiki.org/ci/job/mediawiki-core-phpcs-HEAD/1375/console wants it added
[10:04:46] https://integration.mediawiki.org/ci/job/mediawiki-core-phpcs-HEAD/1373/console -- wants it removed
[10:05:22] ahh I see
[10:07:36] New patchset: Hashar; "phpcs: Show sniff codes in all reports" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47847
[10:07:58] New review: Hashar; "regenerating jobs already" [integration/jenkins-job-builder-config] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/47847
[10:07:58] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47847
[10:09:24] i'm looking at DatabaseSqlite.php maintainers - it seems daniel (??), christian aistleitner, and max semenik worked on it last
[10:09:25] yurik: so I got the same errors on my comp :-]
[10:09:33] which DB ?
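(Aside on the conflicting phpcs warnings mentioned above: the two Jenkins runs flag opposite cast spacings. A minimal PHP illustration of the two forms, with a hypothetical variable name; which one is the intended convention is only stated later in the log.)

    <?php
    // Form one phpcs run asks for (space after the cast):
    $count = (int) $foobar;
    // Form the other run asks for (no space after the cast):
    $count = (int)$foobar;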
[10:09:50] sqlite too
[10:09:56] i guess that's the cause of it
[10:09:58] I am not sure why it does not show a diff of the arrays though
[10:10:14] because it can't :)
[10:10:20] i do my own array comparison
[10:10:27] ahh
[10:10:28] because i have to ignore pageids
[10:10:41] because the harness is broken :((
[10:10:49] i want clean state before each test!!!
[10:11:22] * yurik runs around the room o_O
[10:13:00] my recommendation would be to try to isolate the issue with a very simple check
[10:13:10] and you might want to setup a local wiki using the sqlite backend :]
[10:13:20] how difficult is that?
[10:13:29] i'm sure there is a guide somewhere :)
[10:13:34] copy code, use the cliinstaller with sqlite ? :-]
[10:13:54] i have been using a nice VM
[10:14:08] that has been pre-setup with mysql
[10:14:13] guess its not that hard :)
[10:14:43] hashar: could you do me a favor - create two pages on your test install, both with a [[Category:Cat]] (don't need the category page itself)
[10:15:07] and then do this : api.php?action=query&list=allcategories
[10:16:56] actually add one more param just in case &acprefix=Ca
[10:19:14] doing yurik
[10:19:36] thx
[10:20:34] yurik: http://pastebin.com/KJLM4ttF
[10:21:01] running master at 1c4587aabfc7e6d00171062214f5887f0e8d7ede
[10:21:08] hmm, strange :(
[10:21:28] how did you get the debuginfo?
[10:21:45] no idea
[10:21:50] I got debug toolbar enabled
[10:21:54] as well as some other debug statement
[10:22:05] MWDebug::appendDebugInfoToApiResult
[10:22:06] :-D
[10:22:18] hmm, need to start using that thing
[10:22:43] i was trying to figure out how to enable profiling to show when running phpunit tests, and that failed badly
[10:22:53] meanwhile I filled https://bugzilla.wikimedia.org/show_bug.cgi?id=44742 about the space after cast statements
[10:23:50] thx
[10:26:00] i will have to setup the sqllite locally after i wake up to figure out why its failing. I suspect different data rows are being created in the db, although i'm not sure how that's possible (unique key).
[10:27:09] yurik: the code sniffer was not up to date on the continuous integration server :/
[10:27:40] one of these days i hope the testing harness will work and clean up the state
[10:27:51] so now it dosen't complain anymore?
[10:28:04] only complains for one of the cases
[10:28:14] so what's the "proper" way?
[10:28:35] A cast statement must not be followed by a space
[10:28:39] so: (int)$foobar
[10:30:13] cool, thx. and apparently setting $wgDebugToolbar=true adds all that wonderful debug info!
[10:43:07] yurik: yeah that kind of make sense
[10:43:20] that is mimicking for the API what is done in the usual interface
[10:43:25] aka dump stuff at the bottom
[11:53:09] marktraceur: Hey
[12:24:35] New review: Hashar; "deployed :-)" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/47833
[15:16:26] Nischay|Away: Hi!
[15:16:31] Nischay|Away: What's up?
[16:34:38] The January engineering report will go out in the next 2 hours: https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2013/January ; if you have last-minute edits, please make them now, while I write the summary :)
[17:31:03] <^demon> aude: Here's the profile data I was talking about: https://noc.wikimedia.org/cgi-bin/report.py
[17:32:27] ^demon: thanks
[17:34:31] ^demon: is it possible to see profile data for purging or parsing a page?
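(Aside on the "isolate the issue with a very simple check" suggestion above: one way to do it is to run the same allcategories query through the API internally, outside PHPUnit, and inspect the raw result for the duplicate entry. A rough sketch only; it assumes a configured MediaWiki entry point of that era, and the exact result-accessor names may differ between versions. Note also the remark at 10:30:13: with $wgDebugToolbar = true in LocalSettings.php, MWDebug appends the same kind of debug info to API output.)

    <?php
    // Run the query that returns duplicate 'allcategories' entries on SQLite.
    $request = new FauxRequest( array(
        'action' => 'query',
        'list' => 'allcategories',
        'acprefix' => 'Ca',
    ) );
    $api = new ApiMain( $request );
    $api->execute();
    // Dump the raw result array so [query][allcategories] can be inspected.
    var_dump( $api->getResult()->getData() );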
[17:35:15] * aude doesn't see any calls to the sites stuff, although it is in memcached
[17:35:27] <^demon> It might be too far down on the list :)
[17:35:31] might not be using the sites table at all
[17:35:35] <^demon> Click "show more"
[17:35:53] oh, ok
[17:38:34] <^demon> SiteSQLStore::getSites 732843 0.00859 1.47 0.00719 1.57
[17:39:20] <^demon> But that's not specific enough, and I don't know when the stats were cleared last.
[17:39:33] * aude looking at test2
[17:40:14] the getSites is coming from memcached
[18:08:35] aude: You can also use prefix=Parser if you just want Parser data...
[18:12:27] csteipp: cool, thanks
[20:43:50] why isn't https://gerrit.wikimedia.org/r/#/c/47967/ auto-merged?
[20:47:46] <^demon> MaxSem: Yeah it is. IIRC, the twn script does all the commits, then goes back and does all the review+submits.
[23:45:38] if i've got an abstract file path like "mwstore://local-backend/local-thumb/", is there a way i could get its concrete value (e.g. /var/www/mediawiki/images/)?
[23:46:26] i'm getting that abstract path from $file->getThumbPath(), where $file is an instance of LocalFile
[23:52:39] Emw: try $file->getLocalRefPath() see https://svn.wikimedia.org/doc/classFile.html#a4352b5de6e9795a357315375bd2d69c8
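(A minimal PHP sketch of the getLocalRefPath() suggestion at 23:52:39, assuming a stock LocalRepo backed by the filesystem; the wfLocalFile() lookup and the file title are illustrative, not from the log.)

    <?php
    // Hypothetical example file; in practice $file comes from wherever
    // the LocalFile instance was already obtained.
    $file = wfLocalFile( Title::newFromText( 'File:Example.png' ) );
    // getThumbPath() returns an abstract storage path such as
    // "mwstore://local-backend/local-thumb/...".
    $abstract = $file->getThumbPath();
    // getLocalRefPath() returns a concrete filesystem path to a local copy
    // of the file (the real on-disk path for a filesystem backend, or a
    // temporary copy for other backends).
    $concrete = $file->getLocalRefPath();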