[14:27:33] mutante: on? http://pt.planet.wikimedia.org/ isn't being updated. i'm gonna publish a wikivoyage post on wm pt and it would be nice for it to appear there
[14:34:56] Alchimista: 4am for him, so he is sleeping :-]
[14:35:04] Alchimista: feel free to mail him I guess
[14:35:30] hashar: 4 am? good hour to wake up XD
[14:36:09] i'll send a mail when finished, but hope that he sees the message when he gets back on irc
[14:36:34] if he is like me, he'll probably only check the mail in one or two weeks
[14:38:27] andre__: Here's what it looks like in another language: https://www.mediawiki.org/wiki/How_to_report_a_bug/fr ; I think it will be useful for people who are unsure about their proficiency in English, but who can use machine translation.
[15:09:55] hashar: jenkins-bot is confusing me by telling me mwext-Babel-merge : LOST without a link. Any idea what I can do to proceed? https://gerrit.wikimedia.org/r/#/c/33505/
[15:10:22] Denny_WMDE1: what Niklas asked in -dev :-]
[15:10:28] Denny_WMDE1: I have restarted Jenkins. Sorry
[15:10:38] ah, didn't see that
[15:10:38] thanks
[15:11:20] ;)
[15:17:52] hashar: you could change the error message to say "wake hashar"...
[15:18:03] :)
[15:18:08] or I can write more documentation!
[15:19:24] na, nobody reads that anyway
[16:10:01] where is the process documented for a wiki to request installation/activation of an extension or feature?
[16:10:24] (i.e., get consensus on your wiki, then file a bug request on bugzilla)
[16:11:07] <^demon> The consensus + BZ is just convention.
[16:11:13] <^demon> I don't think it's ever been written down.
[16:11:26] <^demon> http://www.mediawiki.org/wiki/Writing_an_extension_for_deployment - this is important, though, when talking about new extensions we've not deployed before.
[16:12:05] I get occasional questions about enabling the education extension on other wikis.
[16:12:49] I tell them consensus + BZ, but it's happened enough that I wanted to a) see what the actual formal procedure was, and b) have a page I could point them to, to save me explaining again.
[16:14:34] ragesoss: https://meta.wikimedia.org/wiki/Requesting_wiki_configuration_changes
[16:15:46] <^demon> Ah, so there is a page on meta.
[16:15:50] <^demon> Did not know that.
[16:16:41] It doesn't actually deal with extensions specifically.
[16:16:49] I've added a couple links now.
[16:21:32] thanks, Nemo_bis!
[18:23:29] Alchimista: i got it to update to October 09, 2012, but not more recent.. is it possible those feeds really did not update since then?
[18:24:33] !log updating pt.planet
[18:24:43] Logged the message, Master
[18:26:49] Alchimista: i think there are really no news since Oct 09.. because on the other software it's the same http://zirconium.wikimedia.org/planet/pt/
[18:27:09] and i don't see errors, besides one of the feeds returning 500.. but that doesn't keep it from updating
[18:55:26] mutante: it never got updated, i talked to you at that time, you made a manual update or something like that, and it never got new feeds
[18:56:45] mutante: nevermind, seems it got updated today, someone fixed it
[18:57:24] Alchimista: i did
[18:57:34] 10:31 < mutante> !log updating pt.planet
[18:57:46] but it's still not newer than October..
[18:57:55] and that part is actually true i believe
[18:58:45] well yes, it's updated right now, as far as i see. but those last posts weren't appearing on planet.
[21:50:56] evening guys - I wonder if you would be kind enough to tell me what the maximum picture size is that the software can create a preview of, please?
[21:51:38] We have one on Wikipedia at 16,134 × 20,762 pixels, 80MB in size, flatly refusing to show up on its file page, and it's rather high quality.
[21:52:00] What's the absolute max we could get it to, where it would preview correctly, and wouldn't lose too much quality please?
[21:52:10] Depends on file type, I believe.
[21:52:19] JPGs have some hard limit (or maybe it's PNGs...).
[21:52:21] I always forget.
[21:52:21] There is a 25MP limit where MediaWiki won't even bother
[21:52:22] Link to the file?
[21:52:41] sorry, it's not on en, it's on commons - http://commons.wikimedia.org/wiki/File:Jacques-Louis_David_-_Marat_assassinated_-_Google_Art_Project_2.jpg
[21:52:48] Figured.
[21:53:06] https://upload.wikimedia.org/wikipedia/commons/thumb/f/f8/Jacques-Louis_David_-_Marat_assassinated_-_Google_Art_Project_2.jpg/186px-Jacques-Louis_David_-_Marat_assassinated_-_Google_Art_Project_2.jpg
[21:53:09] <^demon> svgs and tiffs have a separate limit, fwiw
[21:53:10] Error generating thumbnail
[21:53:11] Error creating thumbnail: convert: Insufficient memory (case 4) `/tmp/localcopy_35fae30268b2-1.jpg' @ error/jpeg.c/EmitMessage/236.
[21:53:14] >
[21:53:17] convert: missing an image filename `/tmp/transform_1ffed05270b5-1.jpg' @ error/convert.c/ConvertImageCommand/3011.
[21:53:19] >
[21:53:27] I'm not sure that error should be exposed like that.
[21:53:37] Though I guess it's better than a white screen.
[21:53:43] Susan, so it's buggered? :)
[21:53:53] BarkingFish: It's a known limitation.
[21:54:05] Images can't just keep getting bigger and bigger without consequence.
[21:54:09] 400MB of shell memory to try to do it in too
[21:54:17] ok, so we gotta shrink it down to what in order to get it to show up?
[21:54:18] Not without a new/much better scaler
[21:55:07] 334974108
[21:55:28] come again?
[21:55:43] sorry, i can't figure that out.
[21:56:01] That's a 335 megapixel image
[21:56:13] kaldari: good job with the extension! finally we'll get rid of it
[21:56:49] Danny_B: if it's ever reviewed ;)
[21:57:17] would it work, for example, if it was downloaded and, say, halved in size, so we got it to 8067 × 10381?
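[Editor's note: a quick back-of-the-envelope check of the numbers discussed above, in plain Python. The 25-megapixel figure is the one quoted in the channel (the actual cutoff is site configuration, $wgMaxImageArea, so treat the threshold as an assumption); the arithmetic shows where the "334974108" answer comes from, and why halving the image would not be enough.]

```python
import math

# Dimensions of the Marat painting scan quoted at 21:51:38.
width, height = 16134, 20762
limit = 25_000_000  # 25 megapixels, the figure quoted at 21:52:21 (assumed)

# Total pixel area -- this is the "334974108" answer at 21:55:07.
area = width * height
print(area)  # 334974108, i.e. a ~335 megapixel image

# Halving each side (the 8067 x 10381 suggestion) quarters the area,
# but that is still more than three times over the limit.
print((width // 2) * (height // 2))  # 83743527

# The largest uniform scale factor that fits under the limit:
scale = math.sqrt(limit / area)
print(int(width * scale), int(height * scale))  # the biggest size that previews
```

So to get under a 25 MP cap the image would need to be scaled to roughly a quarter of its linear dimensions, not half.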
[21:57:31] reviewed by danny_b with the comment: "trustful user, will not take servers down"
[21:57:55] <^demon> kaldari: Well, as long as the initial commit is just forking it out of core, then I don't see any reason we couldn't approve that as-is :)
[21:58:28] ^demon: no, it's a reimplementation
[21:58:35] <^demon> Ah.
[21:58:52] to remove all the hacks :)
[21:59:42] I originally reimplemented it in core, but then moved it to an extension while I was at it
[21:59:59] <^demon> I'm reviewing it now.
[22:00:06] yippie :)
[22:00:08] What are we talking about?
[22:00:26] https://gerrit.wikimedia.org/r/#/c/41043
[22:01:04] ^demon: the main thing I need double-checked is the SQL in the special page, since it's a bit complicated
[22:01:20] <^demon> Can you pastebin what that full query looks like?
[22:01:43] <^demon> Other than that, this lgtm.
[22:01:49] I think Anomie put it in one of the comments
[22:02:26] https://gerrit.wikimedia.org/r/#/c/41043/7/specials/SpecialDisambiguationPageLinks.php
[22:03:10] <^demon> There's a filesort, ouch.
[22:03:23] <^demon> Possibly ok, given it's an expensive querypage.
[22:03:27] just remove the order by
[22:03:41] meh, no BF
[22:03:52] bf?
[22:04:29] boyfriend
[22:04:32] :)
[22:04:41] no, BarkingFish
[22:04:46] kaldari: Yeah, I was going to comment the same as Danny_B.
[22:04:47] ah
[22:04:53] Your ORDER BY pp_page is probably not necessary.
[22:04:55] Susan: it's just a progressive image
[22:05:07] Nemo_bis: Progressive in what sense of the word? :-)
[22:05:17] kaldari: i'm lazier when I can't tab-complete
[22:05:25] Susan: in the jpg sense
[22:05:27] so how do I get rid of it?
[22:05:39] Susan: when is your planned next switch-to-the-new-name reincarnation?
[22:05:57] kaldari: I'm fixing it
[22:06:07] there are almost a million files to convert btw https://commons.wikimedia.org/wiki/Commons:Bots/Work_requests#Convert_all_interlaced_JPGs
[22:06:18] Danny_B: It's a six-month cycle, I believe.
[22:06:33] too long, be more scrumish
[22:06:48] wow, convert taking 2 GB of my RAM :)
[22:07:17] kaldari: Sorry to ask a stupid question, but is it necessary?
[22:07:39] I hope not
[22:08:01] I think you're saying that it's coming from some SQL abstraction function.
[22:08:06] yes
[22:08:08] Got it.
[22:08:38] getQueryInfo may have a parameter?
[22:08:44] <^demon> Ordering by any of those fields in getOrderFields() produces a filesort. Again, it might not be the end of the world.
[22:08:59] that makes sense
[22:09:24] Big picture, it shouldn't be expensive to pull pages out of page_properties by page_prop.
[22:09:33] I don't mind just axing getOrderFields if it makes a big difference on the query time
[22:09:37] I think there are sufficient indices that were implemented for this.
[22:10:33] <^demon> Ouch, or not.
[22:10:48] <^demon> We've only got the primary key on (`pp_page`,`pp_propname`)
[22:10:51] I could probably just use sortDescending instead
[22:11:21] Although I'm not sure what that sorts by
[22:11:24] ^demon: Right, but disambig is a propname, isn't it?
[22:11:37] It's not a prop value, is it?
[22:11:42] <^demon> Oh, yeah. nvm.
[22:11:46] Right.
[22:11:49] yeah
[22:11:56] The value is a blob and is unindexable, basically.
[22:12:04] in this case an empty blob
[22:12:11] Empty or NULL?
[22:12:24] Just curious.
[22:12:30] I believe it's still a blob, but with nothing in it
[22:12:44] Okay.
[22:12:59] Anyway, the index should make any report output really cheap.
[22:13:05] Unless you're doing namespace restrictions.
[22:13:13] In which case you'll hit the same issue that every other table hits.
[22:13:39] This will almost exclusively be ns:0, though.
[22:13:42] So that shouldn't be an issue.
[22:14:06] yeah, 0 byte blob, just checked
[22:14:37] <^demon> Anyway, the filesort is on page :p
[22:14:46] Pastebin?
[22:15:31] oh fantastic, the resulting JPEG is 100.2 MiB
[22:15:33] <^demon> Yeah
[22:15:49] <^demon> Susan: http://p.defau.lt/?LfDPc_JBiCDs1rZiokU5Zw
[22:15:55] kaldari: ^
[22:16:36] <^demon> Ordering by any of the getOrderFields() fields gives you the same filesort, fwiw.
[22:17:04] good to know
[22:20:43] sortDescending gives me alphabetical order at least, so I'll use that and kill the getOrderFields
[22:21:00] awjr: fyi, I'm sending out an email soon, but the gist is that we're going back to scap for a while
[22:21:31] robla: ok, thanks for the heads-up - still doing the lunch tomorrow?
[22:21:52] as soon as Ryan_Lane gets back from lunch, I'll confirm with him
[22:22:05] k, i'll keep an eye on my inbox. get better soon!
[22:22:08] thanks!
[22:22:22] RobH: I'm back
[22:22:37] we're going back to scap till after the eqiad migration
[22:22:38] Is scap communicable or something?
[22:22:51] Susan: eh?
[22:22:53] you mean like a disease?
[22:22:56] Susan: yes
[22:23:06] I saw scap + get better soon. ;-)
[22:23:07] Susan: done btw, JPGs never fail if they're baseline, that's why there are no pixel limits https://commons.wikimedia.org/wiki/File:Jacques-Louis_David_-_Marat_assassinated_-_Google_Art_Project_2.jpg
[22:23:09] awjr: ^^
[22:23:12] :p
[22:23:29] well played :)
[22:23:34] OS X's dictionary doesn't know "scap", "changeset", etc.
[22:23:42] I've been teaching it, but it's a slow process.
[22:23:49] anyhoo, Ryan_Lane, the bit I wanted to confirm was the brownbag tomorrow. see email
[22:24:01] robla: we can still do the brownbag
[22:24:19] we're just fixing the fetch method. the rest of the system will work the same
[22:24:19] k....I'll note that. thanks!
[22:24:31] robla: Thank you very much for posting about the deployment freeze + git-deploy (with docs!) to wikitech-l, by the way. Appreciated both.
[22:24:50] Susan: no prob!
[22:27:51] robla: btw, we have some redesign docs here: http://etherpad.wmflabs.org/pad/p/git-deploy-bittorrent
[22:28:11] I'll target stage 1 for the next major MW deployment
[22:29:00] Ryan_Lane: is beta labs under the "List Projects" view?
[22:29:08] on the console
[22:29:08] AaronSchulz: deployment-prep
[22:29:13] ahh
[22:29:31] yeah, the names don't match up well ;)
[22:30:05] heh, I'm not even on the member list
[22:30:11] :D
[22:30:15] let me add you
[22:30:52] shit. I added you to bots
[22:30:57] let's retry that
[22:31:39] I like the intro text in ssh
[22:31:42] AaronSchulz: ok. you're added
[22:31:54] like some sort of dos game
[22:31:57] :D
[22:35:59] ^demon, Susan: How come when I run EXPLAIN on that query locally it doesn't show me 'using filesort'? Do I need some config var turned on or some MySQL feature?
[22:36:39] Smaller dataset?
[22:36:47] oh, maybe so
[22:36:48] <^demon> I ran it on my localhost, same result.
[22:36:56] kaldari: sometimes it's better to just do it against enwiki etc
[22:36:57] <^demon> Dunno why the filesort wouldn't show.
[22:36:58] cheap enough to do
[22:37:19] Sometimes version variants can make a difference
[22:37:28] MySQL is evil. EXPLAIN is evil.
[22:37:35] That's my final answer.
[22:37:40] <^demon> Susan is evil!
[22:37:49] Preach.
[22:39:14] kaldari: what query?
[22:39:30] I ran it against en.wiki and it still didn't show 'using filesort'
[22:39:34] explain SELECT p1.page_namespace AS namespace,p1.page_title AS title,pl_from AS value FROM `page` `p1`,`page` `p2`,`pagelinks`,`page_props` WHERE (p1.page_id = pp_page) AND pp_propname = 'disambiguation' AND (pl_namespace = p1.page_namespace) AND (pl_title = p1.page_title) AND (p2.page_id = pl_from) AND p2.page_namespace = '0' ORDER BY pp_page LIMIT 51;
[22:40:57] ^demon: maybe different indexing?
[22:41:05] <^demon> That would do it.
[22:41:16] <^demon> Different indexes on page, perhaps.
[22:41:50] depends on join order and equality propagation too
[22:42:01] that's greek to me
[22:42:12] well, the 'equality propagation' part at least
[22:42:40] newer mysql versions assume things about columns if they know they equal other columns
[22:42:49] like for an equality join
[22:43:31] ^demon: what version of MySQL are you running?
[22:43:36] also, you didn't say STRAIGHT_JOIN, so the actual order that the tables are joined in is up to the server
[22:43:59] if different installs have different numbers of rows in each, different join orders may be picked
[22:44:08] ack
[22:44:24] that is normally a good thing, since it tries to pick the most efficient one
[22:44:45] though it can cause problems when the planner does not account for quicksorts that certain join orders require
[22:44:56] that is why the special:log code forces a lot of indexes
[22:45:06] <^demon> kaldari: 5.5.29
[22:45:13] or I should say, forces the index for lots of code paths
[22:45:48] that's newer than mine
[22:46:52] and your tables most certainly have different sizes
[22:47:04] -- general inquiry -- does anyone have any idea why phpunit is complaining (and dying) on my local box with "Class 'Symfony\Component\Yaml\Inline' not found in /usr/share/php/Symfony/Component/Yaml/Dumper.php on line 62" -- I've made sure I'm running the most recent phpunit as installed from pear, and I'm running the mediawiki build scripts
[22:47:38] ^demon: can you tell me if this uses a filesort for you: SELECT p1.page_namespace AS namespace,p1.page_title AS title,pl_from AS value FROM `page` `p1`,`page` `p2`,`pagelinks`,`page_props` WHERE (p1.page_id = pp_page) AND pp_propname = 'disambiguation' AND (pl_namespace = p1.page_namespace) AND (pl_title = p1.page_title) AND (p2.page_id = pl_from) AND p2.page_namespace = '0' ORDER BY value DESC LIMIT 51;
[22:48:09] <^demon> Yep, it does.
[22:48:43] thanks
[22:49:44] <^demon> mwalker: Try installing it with --alldeps. Sometimes the dependencies don't all get caught.
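[Editor's note: the filesort behaviour debated above is easy to reproduce at toy scale. This sketch uses Python's built-in sqlite3 rather than MySQL (an assumption for portability — MySQL's EXPLAIN reports the symptom as "Using filesort", SQLite's EXPLAIN QUERY PLAN calls it a temp B-tree), and orders by pp_value, a column no index covers, so the sort pass shows up deterministically; the channel's query ordered by pp_page, where index choice can hide it, which is one reason different installs disagreed.]

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Toy stand-in for page_props, mirroring the index layout mentioned at
# 22:10:48: only a primary key on (pp_page, pp_propname).
db.execute("""CREATE TABLE page_props (
    pp_page INTEGER, pp_propname TEXT, pp_value BLOB,
    PRIMARY KEY (pp_page, pp_propname))""")

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return [row[3] for row in db.execute("EXPLAIN QUERY PLAN " + sql)]

# Ordering by a column the chosen index cannot deliver in order forces an
# explicit sort pass (MySQL: "Using filesort"; SQLite: a temp B-tree).
sorted_plan = plan(
    "SELECT pp_page FROM page_props WHERE pp_propname = 'disambiguation' "
    "ORDER BY pp_value")
print(sorted_plan)

# Dropping the ORDER BY -- the fix kaldari settles on -- removes the sort.
unsorted_plan = plan(
    "SELECT pp_page FROM page_props WHERE pp_propname = 'disambiguation'")
print(unsorted_plan)
```

The same principle explains why the plan differs across installs: the optimizer picks a join order and index from table statistics, so the sort step can appear or vanish depending on data size and server version.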
[22:50:20] <^demon> (pear isn't the greatest package manager ever)
[22:51:15] ... I've noticed :p
[22:51:16] ^demon: would you suggest that I not sort the list at all, or leave it how it is, since it doesn't seem to use a filesort on en.wiki at least?
[22:52:22] or is there some other trick I'm overlooking
[22:53:39] kaldari: can you help me with a little js?
[22:53:56] aude: sure, that's a subject I can handle :)
[22:54:15] on wikidata, we are seeing an error for logged-out users
[22:54:19] if ( mw.user.isAnon() ) {
[22:54:28] is the code, and maybe something is not loaded yet
[22:54:42] Uncaught TypeError: Object # has no method 'isAnon'
[22:54:48] <^demon> kaldari: I'm not entirely sure. There are other people who know more than I :\
[22:55:06] <^demon> My inclination would say "It's fine, it's marked expensive anyway," but I could be Very Wrong.
[22:55:10] not sure what i can do for a quick fix
[22:55:12] ^demon: I'll ask around, thanks for your input!
[22:55:27] <^demon> yw.
[22:55:37] and let the frontend folks on our team look at it more tomorrow
[22:56:36] hmm, that's strange. Wonder if mw.user is getting overwritten somehow
[22:57:00] if mw.user wasn't loaded it would say it wasn't defined
[22:57:06] right
[22:57:17] i tried typeof !undefined stuff
[22:57:30] tried mw.user && mw.user.isAnon()
[22:57:51] you could try mediawiki.user to see what happens
[22:57:58] the quick fix is to just comment out that block of code, as it's a new feature
[22:58:01] ok
[22:58:20] they are like aliases?
[22:58:40] usually it's aliased explicitly with a closure
[22:58:49] but I think it's aliased elsewhere as well
[22:59:00] * aude nods
[22:59:27] if that works, the aliasing must have broken or mw.user got overwritten
[22:59:36] doesn't work
[22:59:58] do any other mediawiki.user functions work?
[23:00:14] * aude tries
[23:00:42] like mediawiki.getName() or something
[23:00:46] oops
[23:00:57] mediawiki.user.getName()
[23:01:00] no method
[23:01:28] what about other mediawiki methods outside user?
[23:02:13] hmmm
[23:02:36] like mediawiki.log()
[23:03:17] ReferenceError: mediawiki is not defined
[23:03:40] i have a breakpoint before where the error occurs and am doing this in the console
[23:03:48] or mw.log()
[23:04:16] seems to work
[23:04:30] how about mw.user.getName() ?
[23:05:02] mw.user.tokens works
[23:05:26] try mw.user.anonymous()
[23:06:05] ok
[23:06:29] nope
[23:06:35] very strange
[23:06:47] and getName fails as well?
[23:06:48] the feature worked fine on our test setups
[23:06:56] it fails
[23:07:21] what about mw.user.name() ?
[23:07:45] no
[23:07:49] geez
[23:08:00] i'll just disable it, but don't know how we go about debugging it
[23:08:15] especially as it works fine elsewhere
[23:08:20] did you guys modify mediawiki.user.js at all?
[23:08:25] no idea
[23:08:43] Nothing weird in the console?
[23:09:08] since some of the methods say they are defined and some say they aren't defined, maybe there's a stray comma or semicolon in your mediawiki.user.js file
[23:09:17] try http://www.wikidata.org/wiki/Q1929456?debug=true
[23:09:19] logged out
[23:10:07] trying...
[23:10:37] works for me
[23:10:40] it's supposed to show a "warning" that their ip address will be recorded
[23:10:44] huh
[23:10:55] in the console, but maybe not at the point it's needed
[23:11:12] when does it show the warning?
[23:11:38] What browser are you using, aude?
[23:12:34] line 148 in repo/resources/wikibase.ui.entityViewInit.js
[23:12:39] chrome
[23:12:47] and i'm not the only person seeing it
[23:13:23] I'm seeing it as well
[23:13:28] hmmm
[23:13:36] who knows, in production we have a bazillion extensions
[23:13:43] oh yeah, I see the error on the data page...
[23:13:56] we try to test with various extensions, but maybe something is interacting in a bad way
[23:14:07] I don't have a way to test wikidata stuff
[23:14:17] just in the console
[23:14:21] wikidata.org
[23:14:34] * aude setting breakpoints
[23:14:41] strangely, if I run mw.user.isAnon() after the page finishes loading, it works
[23:14:46] right
[23:15:34] This code is run from $( document ).ready( function() {
[23:15:56] Which makes it sound to me like it's getting run before mw.user is set up
[23:16:19] could be
[23:18:13] you could try adding mw.loader.using( 'mediawiki.user', function () { ... around it
[23:18:27] but I bet there's a simpler problem to fix somewhere
[23:18:31] can try
[23:18:33] true
[23:19:01] i think i should just disable it for now
[23:19:26] but try a few more things first
[23:22:10] aude: it looks like the only two parts of mw.user available at that point are tokens and options for some reason, no methods
[23:22:26] that's odd
[23:25:46] aude: as a quick hack you could try changing $( document ).ready( function() { to window.onload = function() { and see if it makes any difference
[23:26:16] since window.onload happens later
[23:27:13] isn't that == $( document ).load, to at least keep it jQuery-ish
[23:28:18] you might be right, I usually just use the first one, but there's probably a jQuery equivalent
[23:28:56] no idea
[23:29:13] maybe $(window).load?
[23:30:03] You could try $(window).load(function() {
[23:30:08] or $(document).load(function() {
[23:30:40] if the mw.loader.using doesn't work
[23:31:43] too much for me to try...
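[Editor's note: the race being debugged above — document-ready code calling mw.user.isAnon() before the mediawiki.user module has finished defining its methods — is exactly what wrapping the code in mw.loader.using() guards against. Here is a minimal, language-agnostic model of that guard in plain Python (standing in for the JS loader; all names are illustrative, not MediaWiki's actual internals).]

```python
# Toy module loader: using() runs the callback immediately if the module is
# ready, and queues it otherwise -- the pattern kaldari suggests with
# mw.loader.using('mediawiki.user', ...). Names are illustrative only.
class Loader:
    def __init__(self):
        self.ready = {}    # module name -> module object
        self.waiting = {}  # module name -> callbacks queued until it loads

    def using(self, name, callback):
        if name in self.ready:
            callback(self.ready[name])
        else:
            self.waiting.setdefault(name, []).append(callback)

    def register(self, name, module):
        self.ready[name] = module
        for callback in self.waiting.pop(name, []):
            callback(module)

loader = Loader()
calls = []

# "Document ready" fires first: without the guard this is the
# "Object has no method 'isAnon'" failure; with it, the call is deferred.
loader.using("mediawiki.user", lambda user: calls.append(user["isAnon"]()))
assert calls == []  # nothing ran yet -- the module isn't loaded

# The module finishes loading later; the queued callback now runs safely.
loader.register("mediawiki.user", {"isAnon": lambda: True})
print(calls)  # [True]
```

The window.onload hack discussed above works for the same reason: it just moves the callback past the point where the module happens to have loaded, whereas the using() wrapper makes the dependency explicit.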
[23:32:16] i'd want to debug more on my test wiki, set to the wmf8 branch of core and the extension
[23:33:06] I'm having issues contacting https://en.wikipedia.org/w/api.php
[23:33:11] Request: GET http://en.wikipedia.org/w/api.php, from 10.64.0.123 via cp1005.eqiad.wmnet (squid/2.7.STABLE9) to ()
[23:33:11] Error: ERR_CANNOT_FORWARD, errno (11) Resource temporarily unavailable at Wed, 16 Jan 2013 23:32:50 GMT
[23:35:16] Request: GET http://en.wikipedia.org/w/api.php, from 10.64.0.137 via cp1005.eqiad.wmnet (squid/2.7.STABLE9) to 10.2.1.22 (10.2.1.22)
[23:35:16] Error: ERR_CONNECT_FAIL, errno (110) Connection timed out at Wed, 16 Jan 2013 23:34:39 GMT
[23:36:18] now it's just a plain old nginx error message, "504 Gateway timeout"
[23:36:27] well, I seem to be locked out of ganglia now, so I can't help
[23:36:32] :(
[23:37:05] API service is definitely disrupted though
[23:37:34] Reedy: ^
[23:38:22] kaldari: Apaches are having a fit, that's why...
[23:40:19] that doesn't sound too good...
[23:42:46] kaldari: thanks for your help
[23:42:57] we will look at it more tomorrow
[23:43:04] aude: hope one of those suggestions works
[23:43:15] i hope so
[23:43:18] I didn't see anything else obviously wrong
[23:43:27] me either
[23:46:18] why is the api down?
[23:46:38] [05:38:21 PM] kaldari: Apaches are having a fit, that's why...
[23:46:52] we are looking at this and trying to fix it
[23:47:08] ok
[23:47:14] thanks Leslie
[23:47:40] thanks :)
[23:52:23] Aranda56: http://en.wikipedia.org/w/api.php
[23:52:38] Aranda56: ok for you again? it is for me
[23:52:56] works on https for me (but broken on http)
[23:53:02] The API just started working again for TW and Popups on my end (https)
[23:53:21] Works both now...
probably just timing
[23:53:23] That's why I came in here originally, but I got distracted protecting a page
[23:53:45] works for me :)
[23:54:04] ok, good
[23:54:14] fixed thanks to Ryan_Lane
[23:54:18] it was related to a change in search
[23:54:31] really fixed thanks to notpeter
[23:54:42] working again as well, thanks
[23:54:59] Mhm, thank you!
[23:55:10] * Ks0stm returns to his admin duties