[00:00:08] depends on how much is a bunch
[00:00:10] I can always move the sqlite db to the SSD from the HDD if it slows down :)
[00:04:17] 667 hours of CPU time
[00:04:23] by Template:Country_data_Belgium
[00:04:26] yesterday only
[00:04:32] and the template was last edited in July
[00:04:45] so maybe somewhere on that order per day since July
[00:05:22] jesus
[00:12:14] [0011][tstarling@fluorine:/a/mw-log/archive]$ zgrep 'zhwiki.*Template:Country_data_Belgium' runJobs.log-201210*.gz | perl -e '$t = 0; while (<>) { /t=(\d+)/; $t += $1;} print(($t/1000./3600) . "\n");'
[00:12:14] 9337.56747833333
[00:12:41] 933 CPU hours per day
[00:14:00] so, almost 40 cores do nothing but process this template
[00:14:59] TimStarling: still caused by my bot edit in July, or by a touch/purge by somebody else later?
[00:16:52] 3843 inclusions is not that much
[00:20:04] binasher: I think the delay from editing a template until my bot recognizes the high job queue could be that the job is first scheduled and, while executing, creates further jobs
[00:20:59] mmm, we could disable interwikis in the template namespace completely :evilgrin:
[00:21:23] that could be an interim solution
[00:22:40] TimStarling: have you looked at aaron's jobqueue refactor yet? it should fix duplicate job detection, which would be nice here
[00:25:56] binasher: I excluded the country data templates on zhwiki if the last edit is less than 6 months ago.
[00:26:33] Merlissimo: thank you
[00:26:41] binasher: not quite, for refreshLinks
[00:26:47] some more work is needed for that
[00:26:50] damn
[00:26:58] :)
[00:27:11] maybe after it gets merged I'll do that
[00:27:46] ok, net will fail soon
[00:27:49] * AaronSchulz goes home
[00:27:56] that still doesn't help with the cost of these templates on zh
[00:33:20] HardDisk_WP: about?
[00:35:12] ok, maybe there was a small error in that perl one-liner
[00:36:23] TimStarling: only 900 cpu hours?
[00:36:31] I made an adjustment and now the total amount of time spent by job runners on all tasks is comfortably less than the age of the universe
[00:39:13] is there any other wiki project having so many job queue problems?
[00:39:54] maybe it is still wrong though, it still gives a number which is higher than the number of processes we have, working full time
[00:40:17] Reedy: was it you who posted a link to an enwiki job queue statistics graph?
[00:40:24] upload.wikimedia says "help me! I've fallen asleep and can't provide images to end users!"
[00:40:29] (in case you didn't already know)
[00:40:40] Possibly, we have things logged in a few places
[00:40:41] https://gdash.wikimedia.org/dashboards/jobq/
[00:40:43] Magog_the_Ogre: can you look again ?
[00:40:55] nope, still not working LeslieCarr
[00:41:00] link?
[00:41:02] grrr
[00:41:13] http://ganglia.wikimedia.org/latest/graph_all_periods.php?c=Miscellaneous%20pmtpa&h=spence.wikimedia.org&v=506&m=enwiki_JobQueue_length&r=hour&z=small&jr=&js=&st=1349916059&z=large
[00:41:17] Magog_the_Ogre: where are you located ?
[00:41:22] Europe?
[00:41:26] Pennsylvania
[00:41:26] Reedy: could you create a graph for zhwiki, too?
[00:41:37] :( can you do a host upload.wikimedia.org for me ?
[00:42:13] wtf
[00:42:22] $ ping upload.wikimedia.org
[00:42:22] ping: unknown host upload.wikimedia.org
[00:42:33] :-/
[00:42:39] lemme see if one of the servers died
[00:42:59] hrm, what's your dns resolver ?
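The debugging that follows boils down to two checks: find out which recursive resolver you are actually using (the whoami.akamai.net trick binasher suggests just below), then ask it for the failing name. A minimal sketch of both, using nothing beyond PHP's standard library; the hostnames are the ones from the log:

    <?php
    // Which recursive resolver is this machine actually using?
    // whoami.akamai.net is a special zone that answers with the address
    // of whichever resolver sent the query.
    $who = dns_get_record( 'whoami.akamai.net', DNS_A );
    echo 'resolver: ' . ( $who ? $who[0]['ip'] : 'lookup failed' ) . "\n";

    // Does that resolver return an address for the target host?
    // gethostbyname() returns its argument unchanged on failure, which
    // is the scripted equivalent of "ping: unknown host".
    $ip = gethostbyname( 'upload.wikimedia.org' );
    if ( $ip === 'upload.wikimedia.org' ) {
        echo "no A record (or a cached negative answer)\n";
    } else {
        echo "resolves to $ip\n";
    }

Against a healthy resolver both checks succeed; during the incident below, the second one would presumably have failed for anyone behind the broken ns2.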
[00:43:23] heck if I know; I'm at a friend's house
[00:43:46] heh
[00:43:49] ns2 is broken
[00:44:09] thanks binasher … weird, it worked for me
[00:44:16] !log restarting dns on ns2
[00:44:27] Logged the message, Mistress of the network gear.
[00:44:29] Merlissimo: I'm about to go offline. Would probably be easier to ask ops to do it... Shouldn't be much work, mostly copy-paste...
[00:44:29] !log restarted pdns on ns2
[00:44:30] Magog_the_Ogre: you can use "host whoami.akamai.net" to find your resolver
[00:44:40] Logged the message, Master
[00:45:05] $ host whoami.akamai.net
[00:45:05] whoami.akamai.net has address 66.59.110.83
[00:45:05] $ host whoami.akamai.net
[00:45:05] whoami.akamai.net has address 66.59.110.83
[00:45:11] thx
[00:45:32] weird
[00:45:37] yeah that dns server isn't seeing it
[00:45:40] Merlissimo: at least one job claimed to have been running for 15 years, that skews the stats a bit
[00:45:52] oh nevermind, i'm just getting it refused
[00:45:55] hrm :-/
[00:46:14] TimStarling: could you add a job queue graph for zhwiki (^^Reedy)
[00:46:34] zhwiki is older than enwiki ;-)
[00:48:19] I'm up and running LeslieCarr
[00:48:30] cool
[00:48:32] whew
[00:48:42] did you do something or did it fix itself?
[00:49:44] fixed itself
[00:49:58] OK; must have been my DNS then. Thanks!
[00:50:26] well it was timed with a dns change...
[00:50:35] so maybe there was a blip and it happened to cache a negative result during it
[00:51:13] you've run up against the wall of what I know about DNS servers
[00:51:23] but I will trust you that this explanation sounds right ;)
[00:52:02] :)
[00:53:31] binasher: Location map could be a problem, too: http://zh.wikipedia.org/w/index.php?limit=50&tagfilter=&title=Special%3A%E7%94%A8%E6%88%B7%E8%B4%A1%E7%8C%AE&contribs=user&target=Obersachsebot&namespace=10&tagfilter=&year=&month=-1
[00:54:01] Merlissimo: i think you're right
[00:54:20] it's less complex, but much more widely included
[00:55:43] friend saying upload borked as well
[00:55:57] LeslieCarr: ^
[00:56:10] notpeter: thanks, can you have them give me the output of host upload.wikimedia.org ?
[00:58:10] it's going to pmtpa still
[00:58:22] 208.80.152.211
[00:58:42] dns probably cached by some kinda intermediary
[00:58:58] pmtpa is right for right now
[00:59:04] stupid caching
[00:59:04] oh
[00:59:12] oh ?
[00:59:31] upload.pmtpa.wikimedia.org
[00:59:44] and getting an err
[00:59:59] weird, ns0-2 are working right now ...
[01:00:20] just checked that
[01:00:51] well, i'm going to fix up the switches and switch everything back
[01:01:11] ok, cool
[01:01:24] I'm getting conflicting reports from people at that IP
[01:01:29] so it's probably a local issue
[01:01:34] so not much we can do
[01:02:21] !log upgrading csw2-esams member 3
[01:02:32] Logged the message, Mistress of the network gear.
[01:04:01] !log upgrading csw2-esams member 2
[01:04:13] Logged the message, Mistress of the network gear.
[01:06:14] Merlissimo + binasher: http://paste.tstarling.com/p/KlkziI.html
[01:06:22] job queue execution times by wiki
[01:06:54] so it's mostly zhwiki
[01:07:43] 92% zhwiki
[01:09:13] zhwiki wins the gold!
[01:12:03] binasher: compared to its number of templates, plwiktionary wins
[01:22:38] !log upgrading csw2-esams member 0
[01:22:39] !log upgrading csw2-esams member 1
[01:22:49] Logged the message, Mistress of the network gear.
[01:23:01] Logged the message, Mistress of the network gear.
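For reference, the per-wiki breakdown Tim pastes above can be reproduced with a small script rather than a one-liner. This is a sketch, not the code actually used; the "<dbname>: ... t=<milliseconds>" layout is inferred from the one-liner, and the one-day cap is an arbitrary guard against bogus durations like the job that "claimed to have been running for 15 years". It also sidesteps the likely "small error" in the original Perl: an unguarded /t=(\d+)/; $t += $1; re-adds the previous capture whenever a line fails to match, because $1 survives a failed match; writing $t += $1 if /t=(\d+)/; avoids that.

    <?php
    // Sum runJobs execution times per wiki across gzipped log files.
    $perWiki = array();
    foreach ( array_slice( $argv, 1 ) as $file ) {
        $fh = gzopen( $file, 'r' );
        while ( ( $line = gzgets( $fh, 65536 ) ) !== false ) {
            // Only count lines that really carry both fields.
            if ( preg_match( '/ (\w+): .*\bt=(\d+)/', $line, $m )
                && (int)$m[2] < 86400000 // skip jobs "running" > 1 day
            ) {
                if ( !isset( $perWiki[$m[1]] ) ) {
                    $perWiki[$m[1]] = 0;
                }
                $perWiki[$m[1]] += (int)$m[2];
            }
        }
        gzclose( $fh );
    }
    arsort( $perWiki );
    foreach ( $perWiki as $wiki => $ms ) {
        printf( "%-20s %10.2f CPU hours\n", $wiki, $ms / 1000 / 3600 );
    }

Run as, e.g., php sumjobs.php runJobs.log-201210*.gz; the output would be roughly the per-wiki table pasted above.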
[02:30:45] Image on commons won't load https://upload.wikimedia.org/wikipedia/commons/thumb/0/0f/Mt_KAMUEKU_2.JPG/1280px-Mt_KAMUEKU_2.JPG Traceback (most recent call last):
[02:30:47] File "/usr/lib/pymodules/python2.6/eventlet/wsgi.py", line 336, in handle_one_response
[02:30:49] result = self.application(self.environ, start_response)
[02:30:51] File "/usr/local/lib/python2.6/dist-packages/wmf/rewrite.py", line 368, in __call__
[02:30:53] resp = self.handle404(reqorig, url, container, obj)
[02:30:54] File "/usr/local/lib/python2.6/dist-packages/wmf/rewrite.py", line 197, in handle404
[02:30:56] upcopy = opener.open(encodedurl)
[02:30:57] File "/usr/lib/python2.6/urllib2.py", line 391, in open
[02:30:59] response = self._open(req, data)
[02:31:00] File "/usr/lib/python2.6/urllib2.py", line 409, in _open
[02:31:02] '_open', req)
[02:31:03] File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
[02:31:05] result = func(*args)
[02:31:06] File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
[02:31:08] return self.do_open(httplib.HTTPConnection, req)
[02:31:09] File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
[02:31:11] raise URLError(err)
[02:31:12] URLError:
[02:31:19] morgankevinj: pastebin next time
[02:32:04] sorry
[02:36:45] http://upload.wikimedia.org/wikipedia/commons/thumb/4/4c/Steve_Huffman.jpg/800px-Steve_Huffman.jpg is giving me a python backtrace
[02:43:27] seems to have resolved itself
[09:11:00] hi hashar
[09:11:15] at http://dumps.wikimedia.org/hewiki/latest/ there are dumps of SQL tables.
[09:11:24] are all tables supposed to be there?
[09:12:00] not all
[09:12:00] no, not all of them. some are private, for example, and the data from some is contained in the xml files
[09:12:06] some are private, such as `user`
[09:12:24] Ariel is the boss when it comes to dumps :-]
[09:13:02] * apergos lets hashar continue to answer, he was doing a fine job :-)
[09:13:18] is the revision table supposed to be there?
[09:13:28] no
[09:13:47] the useful information from the revision table is primarily found in the stubs xml files
[09:14:08] I guess that's because the revision table contains hidden revisions we don't want everyone to see
[09:14:29] yep, we filter those out for the xml of course
[09:16:16] somebody just asked me about this... he was using that dump for his academic projects and now it's gone.
[09:16:23] btw if you are a user of the dumps and not already on xmldatadumps-l you might join (or they might)
[09:16:55] well, we haven't removed any tables from the dumps for years and years
[09:18:18] Hmm, who's aklapper? Is he the new bug dude?
[09:19:49] yeah
[09:20:19] :)
[09:39:57] Damianz: here is andre: andre__
[09:40:07] * Damianz waves at andre__
[09:40:11] hi there. yes, that's me :)
[09:40:15] ahh :-D
[09:40:28] what did I screw up this time? :D
[09:40:30] hello andre, I am Antoine Musso, working for platform engineering on continuous integration tools :-D
[09:40:42] yay, hi. seen your name, yeah
[09:40:43] I don't think we ever talked to each other hehe
[09:41:20] hehe. I think our ways will cross a lot, yes
[09:41:33] hashar: Did you not meet him in Berlin this year?
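Returning to the dumps thread above: "we filter those out for the xml" refers to MediaWiki's rev_deleted bitfield, which marks suppressed text, edit summaries, and usernames. A hedged illustration of that kind of check, not the dump code itself; the bit values mirror the Revision::DELETED_* constants in MediaWiki of this era:

    <?php
    // Bit values mirroring MediaWiki's Revision::DELETED_* constants.
    const DELETED_TEXT       = 1; // revision text is hidden
    const DELETED_COMMENT    = 2; // edit summary is hidden
    const DELETED_USER       = 4; // author's name is hidden
    const DELETED_RESTRICTED = 8; // hidden even from sysops

    // A public dump must withhold any field whose bit is set; the raw
    // revision table carries no such filter, which is why it is not
    // published as-is.
    function textIsPublic( $revDeleted ) {
        return ( $revDeleted & DELETED_TEXT ) === 0;
    }

    var_dump( textIsPublic( 0 ) );                                 // true
    var_dump( textIsPublic( DELETED_TEXT | DELETED_RESTRICTED ) ); // false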
[09:41:43] I don't think so
[09:41:52] I have a terrible memory when it comes to people's faces
[09:41:55] Just wondered wtf bugs I was replying to got cc'd to you when I hadn't seen your name around labs :)
[09:41:55] cannot remember either, too many faces
[09:42:14] heh
[09:42:16] I'm currently watching a lot of tickets just to get some impressions
[09:42:41] here I am : http://profile.ak.fbcdn.net/hprofile-ak-ash4/369403_718876276_1388790613_n.jpg http://www.mediawiki.org/wiki/File:Hashar-wikipediaglobe.JPG
[09:43:29] Pictures are no fun, they ruin what I imagine people to look like
[09:44:38] Hmm, no, cannot remember, sorry, probably we didn't talk with each other.
[09:44:44] http://home.arcor.de/ak-47/img/me-001s.jpg is an acceptable image of me
[09:45:42] Yay, acceptable length of hair
[09:45:57] just missing a pint of beer and that would be a validated pic
[09:46:09] ehehe
[09:46:19] where's my photoshop when I need it...
[09:46:30] photoshop!?
[09:46:31] ;)
[09:47:16] but isn't that how it's done nowadays on the interweb?
[09:49:53] be glad marktraceur isn't in here..
[09:50:52] I think we need to photoshop Reedy in some grey hair and a huge beard
[09:50:54] andre__: http://www.gimp.org/
[09:51:07] Damianz: I'm surprised the Wikidata people haven't done that already..
[09:51:22] lol
[09:51:53] Reedy: I admit that I run GNOME on this machine so GIMP is my friend already. you got me :)
[09:52:25] You know photoshop /will/ run under wine
[09:52:25] It's laggy as hell but just about usable
[10:18:58] a/clear
[11:59:14] Hello all, I'm having a serious issue with a mediawiki install. I realize this probably isn't the right place to ask but I'm desperate. Squid seems to go on holiday after 24 hours of operation and it's crippling my website and blood pressure. Can anyone help me figure this out?
[12:13:23] Ulfr, try asking in #mediawiki ?
[12:13:36] andre__: Did, sumanah directed me here
[12:14:16] oh, ok
[16:49:53] !log updated the payments cluster to 4d0223ebb48f9b
[16:49:53] I'm trying to figure out who is in the wmf LDAP group. Is there a way for me to look that up?
[16:50:04] Logged the message, Master
[16:50:59] I looked on wikitech.wikimedia.org and couldn't figure it out, and it doesn't seem to be mentioned when I look up someone using ldaplist -l passwd [username]
[17:03:56] sumanah: I think the Labs help pages explicitly say that there's no (easy) way
[17:04:30] Ah, I hadn't looked on Labs pages.
[17:07:15] Nemo_bis: I'm trying to figure out whom I can add as a reviewer on https://gerrit.wikimedia.org/r/#/c/27421/
[17:10:56] sumanah: I suppose it's unrelated?
[17:11:11] Nemo_bis: sorry, I don't understand - unrelated to what?
[17:11:19] to the previous question
[17:11:25] * Nemo_bis confused
[17:11:30] anyway guillom should know?
[17:12:08] * guillom reads the log.
[17:12:46] oh, now I understand, sorry
[17:12:50] Nemo_bis: it's actually related.
[17:12:52] np
[17:13:33] <^demon> guillom: Who would approve a change to WP-Victor, if not you?
[17:13:53] ^demon: I don't know. But all I could do was +1.
[17:14:09] <^demon> That's not right. Let me fix that.
[17:14:13] (as a sidenote, this was the first time I ever used gerrit to review a commit)
[17:15:13] <^demon> Merged. You're not in the LDAP group (which I can't fix at the moment).
[17:15:22] <^demon> We need to batch-add everyone who's missing from that.
[17:15:23] np, and thanks
[17:15:48] ^demon: I'm trying to figure out who is in the wmf LDAP group. Is there a way for me to look that up?
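The ldaplist invocation ^demon digs up just below is essentially a front-end for a plain LDAP search, so the same lookup can be scripted. A sketch with PHP's ldap extension; the server URI, base DN, and the choice of "member" as the attribute are illustrative assumptions, not the real directory layout:

    <?php
    // List the members of the "wmf" LDAP group (illustrative details).
    $conn = ldap_connect( 'ldap://ldap.example.org' );
    ldap_set_option( $conn, LDAP_OPT_PROTOCOL_VERSION, 3 );
    ldap_bind( $conn ); // anonymous bind, assuming a world-readable tree

    $result  = ldap_search(
        $conn, 'ou=groups,dc=example,dc=org', '(cn=wmf)', array( 'member' )
    );
    $entries = ldap_get_entries( $conn, $result );

    if ( $entries['count'] > 0 && isset( $entries[0]['member'] ) ) {
        for ( $i = 0; $i < $entries[0]['member']['count']; $i++ ) {
            echo $entries[0]['member'][$i] . "\n"; // one DN per member
        }
    }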
[17:16:36] <^demon> There's an option with ldaplist, but I can never remember it.
[17:17:29] <^demon> Ah, `ldaplist -l group`
[17:17:49] * sumanah looks at http://wikitech.wikimedia.org/view/Ldaplist
[17:18:19] <^demon> `ldaplist -l group wmf` is missing lots of ppl :\
[17:22:05] ^demon: the wmf group is supposed to have everyone at WMF who has an account somewhere in LDAP?
[17:22:24] guillom: turns out I am in that group and could have approved that changeset and didn't even know it, silly me
[17:22:28] <^demon> At least FT. Not sure about contractors yet (see other discussion)
[17:22:33] right.
[17:40:29] ok, so https://wikimediafoundation.org/wiki/Special:Version does not mention the CommunityApplications app. I know it's deployed somewhere on wmf sites because it is listed at https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/tools/release.git;a=blob;f=make-wmf-branch/default.conf . but where?
[17:40:46] [18:35:42] Ah, it's on officewiki
[17:41:10] oh I see what you mean by "it" now
[17:41:20] I thought you meant the link to the extensions template
[17:42:21] Looking in CommonSettings/InitialiseSettings can usually answer those questions easily
[17:42:24] Ignoring FlaggedRevs..
[17:46:10] Reedy: this is in the mediawiki repo?
[17:46:14] * sumanah is looking around
[17:47:19] <^demon> operations/mediawiki-config
[17:47:34] https://noc.wikimedia.org/conf/
[17:47:59] * sumanah headslaps
[17:48:01] duh
[17:48:16] andre__: ^ you'll be wanting to know about these as well
[17:49:39] sumanah, I'm adding those to the new tech hire page :)
[17:50:20] thank you Eloquence :)
[17:52:00] Eloquence: yeah, as I give useful stuff to andre__ I am, when possible, also trying to improve the relevant pages
[18:24:32] ok, now I have another mystery - I am looking in InitialiseSettings.php and CommonSettings.php for the extension known as "ConditionalShow" or "ConditionalShowSection" and cannot find it, and because it would be abbreviated as "css" I am having a hard time searching for it too
[18:31:05] sumanah: it's not in our deployed extension-list either
[18:31:48] line 33 https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/tools/release.git;a=blob;f=make-wmf-branch/default.conf Reedy
[18:31:55] * sumanah is lunching
[18:32:01] Yup
[18:32:05] I'm just removing it from there ;)
[18:32:08] https://gerrit.wikimedia.org/r/27556
[18:32:30] ah ok!
[18:34:51] Reedy: "our deployed extension-list" - where would I look to double-check against that?
[18:35:37] it's in the mediawiki-config repo..
[18:35:49] I thought we had it on noc conf, but seemingly not
[18:37:08] * sumanah looks at https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=tree;h=refs/heads/master;hb=refs/heads/master
[18:38:23] got it, https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/extension-list;h=6a5948a2ebe00e30106c8370e0fe9d00eff5ec87;hb=refs/heads/master
[18:39:20] Reedy: is operations/mediawiki-config/wmf-config/extension-list more reliable than mediawiki/tools/release/make-wmf-branch/default.conf ?
[18:39:29] Not really
[18:40:11] We use that list for building the localisation cache.. so if it wasn't in there, it could be deployed (unlikely), but would have broken messages etc
[19:18:32] Hi all. Is it just me, or is the book extension currently very broken? Whenever I go to download a PDF of a book (on meta and enwp), I get presented with raw HTML, and the resulting PDF doesn't actually contain any content. (e.g.
try downloading the PDF of https://en.wikipedia.org/wiki/Book:1200_AD )
[19:20:14] sure looks broken. individual PDF download seems to work
[19:21:53] I suspect it's probably more the collection backend
[19:22:44] yup, individual PDF download works fine for me - just collections that don't.
[19:23:21] emailed our pediapress contact
[19:36:54] csteipp: is there a bug already about users being redirected to https *after* login?
[19:37:12] Nemo_bis: Yes...
[19:37:19] looking it up
[19:37:36] I don't mean https://bugzilla.wikimedia.org/show_bug.cgi?id=40679
[19:39:12] Ah, Nemo_bis: https://bugzilla.wikimedia.org/show_bug.cgi?id=40789
[19:39:31] Is this still an issue? I wasn't able to reproduce it on enwiki
[19:40:04] still happens on translatewiki.net
[19:41:07] Gotcha. So yeah, Nikerabbit will have to apply https://gerrit.wikimedia.org/r/27044
[19:43:16] hm, right, last update was -rakkaus:#mediawiki-i18n- [siebrand] updated translatewiki.net to 2a714eb 2012-10-09 09:34:47 +0000 (18 seconds ago)
[19:44:06] csteipp: I can test it
[19:46:05] Nikerabbit: it's ok
[19:46:14] Nemo_bis: what is ok?
[19:46:26] Nikerabbit: the fix
[19:46:34] prod has not been updated since the contenthandler merge
[19:47:49] the redirect loop is still not fixed
[19:48:16] Nikerabbit: ah ok, that's the other bug
[19:48:30] but the redirect to https is fixed
[19:51:39] Nemo_bis: did you clone the bug?
[19:51:49] or is there another bug for the loop?
[19:54:48] Nikerabbit: I linked it above
[19:54:54] https://bugzilla.wikimedia.org/show_bug.cgi?id=40679
[19:55:00] unless it's yet another one
[19:55:19] yeah, the infinite loop if you have $wgServer starting with 'http://' is not fixed yet
[19:57:16] I don't get why that bug was repurposed for that
[19:57:30] the original comment seems to be about a broader issue
[20:00:26] Hi.
[20:00:31] https://commons.wikimedia.org/wiki/File:Seneka_Starszy_Kordoba.JPG
[20:00:38] Why does that deleted redirect no disappear?
[20:00:42] *not
[20:01:39] Nikerabbit: broad bugs with dozens of comments tend to get useless, don't they? maybe another one should be filed? I didn't read it all
[20:02:45] Nikerabbit: we tried to narrow the scope to the current problem vs. changes in the design decisions. But if the wfExpandUrl issue is causing other bugs, we should open another bug
[20:03:06] csteipp: I'll leave it up to you
[20:03:45] I only know that $wgSecureLogin exposes it
[20:12:51] ori-l: We had a big deployment (3 extensions) and are still waiting to push to en.wiki.
[20:21:40] AVRS: that sounds like a known bug but I can't remember exactly, hmm
[20:25:24] maybe the sysop should be asked if an error was shown after deletion
[20:26:55] I'm here
[20:26:57] there was no error message
[20:27:55] ok
[20:29:03] so it's https://bugzilla.wikimedia.org/show_bug.cgi?id=18402 aka https://bugzilla.wikimedia.org/show_bug.cgi?id=15655 ?
[20:39:42] Reedy: you were helpful about Nuke, maybe you know who could take care of this? https://translatewiki.net/wiki/Thread:Support/About_MediaWiki:Nuke-defaultreason/en
[20:43:18] A developer? :p
[20:43:25] * Nemo_bis slaps Reedy
[20:46:46] Nemo_bis: it should be trivial, just needs someone who is familiar with the gender code
[20:46:56] sigh
[20:47:23] currently everyone is supposed to take care of their own code's gender etc.
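An aside on the redirect loop csteipp and Nikerabbit discuss above: the trap with $wgSecureLogin is that MediaWiki's wfExpandUrl() only supplies a protocol for relative or protocol-relative URLs; a scheme that is already spelled out survives. A sketch of the behaviour with illustrative values (wfExpandUrl() and PROTO_HTTPS are real MediaWiki symbols, so this would run inside a MediaWiki context):

    <?php
    // With a hard-coded scheme in the config, forcing https does nothing:
    $wgServer = 'http://example.org';
    echo wfExpandUrl( $wgServer . '/wiki/Special:UserLogin', PROTO_HTTPS );
    // => "http://example.org/wiki/Special:UserLogin"
    //    still http, so the secure-login redirect fires again: a loop.

    // With a protocol-relative server, the expansion works as intended:
    $wgServer = '//example.org';
    echo wfExpandUrl( $wgServer . '/wiki/Special:UserLogin', PROTO_HTTPS );
    // => "https://example.org/wiki/Special:UserLogin"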
[20:47:41] we don't have catch-all gender/plural/whatever fixers
[20:48:02] 'nuke-defaultreason' => "Mass deletion of pages added by $1",
[20:48:12] I've no idea if that'd include User: or just be the username
[20:49:12] $this->msg( 'nuke-defaultreason', "[[Special:Contributions/$target|$target]]" )->
[20:49:12] inContentLanguage()->text();
[20:49:45] so there is no User:
[21:11:04] bsitu: 3 patches got merged just a few minutes ago and need to get to fenari
[21:11:06] bsitu: everything for matthias is merged. he was offering to push it to fenari, but I told him you were probably already doing it.
[21:11:18] and kaldari was the fastest typer ;)
[21:16:37] mlitn: it's on test
[21:18:15] ok sweet, thanks
[21:22:47] bsitu: fabrice around?
[21:22:57] everything looks good to me & dario now
[21:23:08] he will be back in 30 minutes
[21:23:46] ok
[21:25:05] do you need more time for testing?
[21:25:09] he asked me to make a call whether to go ahead or not with CTA1, but since we've fixed all these issues we're good to go
[21:25:52] ok, go
[21:27:46] mlitn, ori-l, spagewmf, bsitu: thanks for the fixes/debugging, I gotta run ttyl
[21:28:59] cya
[21:41:16] What is the current total size of the media files we store?
[21:44:15] about 33T including thumbnails
[21:46:35] Reedy: back to that Nuke GENDER request, I suppose that if you can't quickly work out how to fix it, that means there should be more docs on the feature?
[21:46:57] Maybe, but I didn't go looking for docs
[21:47:34] Reedy: and where would you look for them?
[21:47:42] aka where should they be
[21:48:13] Normally I'd try and do it by example
[21:48:39] Reedy: so not https://www.mediawiki.org/wiki/Localisation#Users_have_grammatical_genders ? :)
[21:49:08] 'Please review {{GENDER:$1|his|her|their}} edits.'
[21:49:15] isn't an example of the same..
[21:50:37] uh?
[21:51:24] Mass deletion of pages added by $1
[21:51:36] We're not wanting that to swap to his/her/their
[21:51:44] we're explicitly giving their username
[21:54:59] Reedy: I don't understand; the point is that some languages may need gender-dependent structures to translate that
[21:55:25] don't we have any manual page on how to create/use system messages?
[21:55:43] https://www.mediawiki.org/wiki/Talk:WfMsg_statistics "A first step is to reactivate that is to do a list of these functions"
[22:52:20] gn8 folks
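For the record, one plausible shape for the Nuke fix debated above — a sketch, not the committed change: keep $1 as the linked username Reedy points out, and pass the plain username as a second parameter so translators have something to hang {{GENDER:}} on. Message::rawParams() and Message::params() are real MediaWiki methods and append parameters in call order.

    <?php
    // In SpecialNuke.php (sketch): supply the plain username alongside
    // the wikilink, so the message has a value GENDER can evaluate.
    $reason = $this->msg( 'nuke-defaultreason' )
        ->rawParams( "[[Special:Contributions/$target|$target]]" ) // $1
        ->params( $target )                  // $2: plain name for GENDER
        ->inContentLanguage()->text();

    // English stays exactly as quoted above:
    //   'nuke-defaultreason' => 'Mass deletion of pages added by $1',
    // while a language with gendered verb forms could now write, e.g.:
    //   'nuke-defaultreason' => '... {{GENDER:$2|...|...|...}} ... $1',

This answers the "User: or just the username" question directly: $1 stays the rendered link, and $2 is always the bare name, which is what GENDER needs.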