[12:26:42] Hello! Could you please tell/show me how to change the "Wikipedia" namespace to "Википедия"? At ba.wikipedia.org the Cyrillic alphabet is used, but the Wikipedia namespace is still in Latin. Here you can see: http://ba.wikipedia.org/w/index.php?title=%D0%AF%D1%80%D2%99%D0%B0%D0%BC%D1%81%D1%8B:%D0%9F%D0%BE%D0%B8%D1%81%D0%BA&search=&fulltext=%D0%AD%D2%99%D0%BB%D3%99%D2%AF&profile=advanced&redirs=1 [12:27:02] !bugzilla [12:27:02] All bugs in MediaWiki should be reported at https://bugzilla.wikimedia.org. Requests for site configuration changes, new features or enhancements to existing features all go here. Bear in mind before making a feature/enhancement request: 1) If the request is specific to a Wikimedia wiki, please discuss it there first. 2) Consider whether a custom extension would be more appropriate [12:29:56] ok. thanks [12:46:55] hello [15:06:14] Hmm [15:06:18] I think now is deployment time [15:22:51] fyi....1.21wmf2 deployment happening. most discussion happening on #wikimedia-operations [16:53:37] !log synchronized payments cluster to efcecc41c6f3c26 [16:53:48] Logged the message, Master [18:07:02] is this normal? http://ganglia.wikimedia.org/latest/?r=month&cs=&ce=&m=mem_report&s=descending&c=Glusterfs+cluster+pmtpa&h=&host_regex=&max_graphs=0&tab=m&vn=&sh=1&z=small&hc=4 [18:07:31] Nemo_bis: probably not [18:07:46] poooor gluster [18:08:57] average CPU wait 52910853673.8 % [18:08:57] Nemo_bis: need to poke someone from ops about it [18:09:12] the root cause is probably that it went to swap [18:09:16] for like 2 weeks [18:09:27] that would explain a lot of the issues I have encountered with my instances [18:09:30] would you mind filing a bug about it ? [18:09:42] Ryan_Lane: ? 
[18:09:44] http://ganglia.wikimedia.org/latest/?r=month&cs=&ce=&m=mem_report&s=descending&c=Glusterfs+cluster+pmtpa&h=&host_regex=&max_graphs=0&tab=m&vn=&sh=1&z=small&hc=4 [18:09:51] yeah [18:09:57] oh no [18:09:57] now dinner [18:09:57] there's a memory leak [18:10:04] it's already fixed [18:10:05] Nemo_bis: seems like the issue has been resolved:-) [18:10:21] look at the hour or day view [18:10:26] * hashar notes how ryan proactively fixes issues before they get reported \O/ [18:10:31] actually, no ;) [18:10:41] someone complained about performance of gluster [18:10:58] Ryan_Lane: while you are around, I noticed GlusterFS is spamming a log file in /var/log/glusterfs/data-project.log (on all deployment-prep instances). [18:11:03] yeah [18:11:11] maybe the volume is corrupted [18:11:12] it needs log rotation [18:11:20] could be that too [18:11:40] are there any fsck tools for gluster ? [18:12:02] rotating the gluster logs is bug 41104 : https://bugzilla.wikimedia.org/show_bug.cgi?id=41104 ;) [18:12:26] I have been looking for a 'logrotate' puppet class but there is none yet :/ [18:12:32] no. If there's a split brain on the files, I need to fix it on the server side [18:14:52] split brain ? [18:16:32] anyway, you are the gluster boss; I probably don't need to know the details, and besides there is nothing I can do :-] [18:19:40] off for dinner, will be back after [19:35:16] Hi, this bug affects some pages of the French Wikisource after the 1.21wmf2 release. Can a sysadmin look at it ? https://bugzilla.wikimedia.org/show_bug.cgi?id=41188 [19:36:02] Tpt: I'm looking into it :) [19:36:16] hashar: Thank you. [19:36:19] testwiki back, sorry for the outage [19:36:31] Reedy: seems one of the apaches can't find the texvc executable :-] [19:36:42] Yeah.. [19:36:56] It's all French to me! 
[19:36:57] :D [19:37:39] here is the english version : https://test.wikipedia.org/wiki/Bug_41188 [19:38:14] * hashar blames srv193 [19:39:22] 193 doesn't serve normal stuff [19:39:42] I'll have a look in a few [19:40:13] it does have the texvc file though /usr/local/apache/uncommon/1.20wmf12/bin/texvc [19:41:45] 1.20wmf12? :p [19:42:00] yup [19:42:02] on fenari echo '\frac{1}{2}'| mwscript parse.php --wiki=testwiki [19:42:18] might be a typo in wikiversions.dat, can it be? [19:42:42] grmbmbl [19:42:44] took the wrong link [19:42:51] I am getting old [19:43:13] damn scap [19:44:38] I don't even know how texvc is generated on the cluster [19:44:39] :-( [19:45:06] scap? [19:46:45] which calls scap-1 [19:46:52] which calls scap-2 [19:46:52] bahhh [19:53:53] I have no idea how it is deployed :-( [19:54:52] hashar: Arf. Isn't there a log where you can find that? [19:54:59] Reedy: If you have time, could you look at this patch https://gerrit.wikimedia.org/r/#/c/17643/ ? I've fixed the issues you listed. [19:55:23] Tpt: I know the root cause: the math rendering engine (texvc) is not compiled/available on 1.21wmf2 :-( [19:55:45] and I have no idea how it is compiled / deployed nowadays, which really pisses me off [19:57:18] Didn't Aaron move it? [19:57:41] hashar: [19:57:42] reedy@fenari:/home/wikipedia/common/php-1.21wmf1/bin$ texvc [19:57:42] Segmentation fault [19:58:23] ... [19:58:46] works for me [19:58:51] lol. [19:59:01] with /usr/local/apache/uncommon/1.21wmf1/bin/texvc [19:59:01] ;) [19:59:44] and note that your above command probably executes /usr/local/bin/texvc [19:59:47] (forgot ./ ) [19:59:53] Yeah [19:59:58] hashar: There isn't a 1.21wmf2 folder in uncommon... 
[19:59:58] anyway [20:00:04] that does not explain how the hell we are building texvc [20:00:12] and I have no idea what uncommon is for [20:00:40] texvc [20:00:44] yeah figured that out [20:00:54] :D [20:01:02] but neither puppet nor tools nor anything references 'uncommon' or 'texvc' [20:01:08] so that is a whole mystery [20:01:24] that used to be compiled by scap [20:01:30] or at least VIA scap [20:01:31] Yeah, Aaron made it optional, somewhere somehow [20:01:33] AaronSchulz: PING [20:01:58] optional? It takes like 8 seconds to compile compared to the 45 minutes to do scap [20:01:59] ,) [20:02:08] micro optimizations are going to kill us hehe [20:02:09] yeah, nfi :p [20:02:23] I am not even bothering to look for documentation [20:02:37] we will end up with some how-to written by Jeluf back in 2004 or so ;-] [20:02:54] or maybe some wikitech-l post by brion from 2007 explaining how to migrate to javascript rendering ;-] [20:03:13] 8 seconds? I waited ages for the "compiling texvc" thing to finish [20:03:17] got it [20:03:23] /usr/bin/scap-recompile [20:03:24] na not possible [20:03:25] I mean dozens of minutes [20:03:32] it is french quality software, can't take that long ! [20:03:36] (I know, I am french) [20:03:46] reedy@fenari:~$ sudo -u mwdeploy scap-recompile [20:03:46] MediaWiki 1.21wmf2: Compiling texvc...ok [20:03:46] MediaWiki 1.21wmf1: Compiling texvc...ok [20:03:46] seriously, did it take that long to compile ? [20:03:56] let's find out! [20:04:02] Reedy: there needs to be a puppetized script that does the ddsh to run scap-recompile [20:04:15] .. 
[20:04:25] though it's only one line, so one can do that manually for now I guess [20:04:32] dsh -F5 -cM -g mediawiki-installation -o -oSetupTimeout=10 'sudo -u mwdeploy /usr/bin/scap-recompile' [20:04:34] but it should be there for convenience [20:04:34] You're welcome [20:04:44] ahh thanks [20:04:44] yeah, that ;) [20:04:44] so that is a hack [20:04:50] !log Running scap-recompile on all mediawiki-installation [20:04:52] what removed texvc recompilation from scap ? [20:04:58] Aaron :p [20:05:03] Logged the message, Master [20:05:06] Tpt: ^^ [20:05:12] hashar: I'll add it to my deploy list [20:05:13] * hashar takes next plane to SF ;-] [20:05:32] Reedy: well we also want to recompile texvc whenever the math ext is changed :-] [20:05:41] well, yeah [20:05:45] well on branch that should be fine though [20:05:47] at least running it at the time of deploying a branch would help ;) [20:06:00] scap starting [20:06:04] since the caml script is unlikely to receive hot fixes we don't notice [20:06:04] God knows why it didn't fatal [20:06:12] hashar: done! [20:06:25] AaronSchulz: I could not find the change that removed texvc compilation. Any hint ? [20:07:10] we also need that scap-recompile to be in puppet :-] [20:07:15] hashar: I guess it coincided with getting tim to rebuild wikimedia-task-appserver [20:07:22] ^ it's in there [20:07:41] hashar: [20:07:42] https://test.wikipedia.org/wiki/Bug_41188 [20:07:48] MATHS BITCHES [20:07:49] yeahhh [20:08:02] one less deployment bug :-] [20:08:33] -F25 is REALLY fast :D [20:08:45] Reedy: have you added "run scap-recompile" to your branch task list ? [20:08:45] (just double checking) [20:08:59] http://wikitech.wikimedia.org/index.php?title=Heterogeneous_deployment_v2&diff=prev&oldid=52504 [20:09:18] Reedy: yeah it helps when it's not F5 :) [20:09:27] Tpt: issue fixed :-] [20:09:30] that and it's all local [20:09:35] real 0m42.846s [20:09:42] ^ is how long it takes [20:09:46] hashar, Reedy: Thanks a lot ! 
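For convenience, the one-liner quoted above could live in a small script on the bastion, as the conversation suggests. The sketch below is hypothetical (the script name and structure are mine, not an actual WMF file), and it only prints the dsh invocation as a dry run rather than executing anything:

```shell
#!/bin/sh
# Hypothetical wrapper around the one-liner from the log (dry run: prints, does not run).
DSH_GROUP="mediawiki-installation"                    # host group named in the log
REMOTE_CMD="sudo -u mwdeploy /usr/bin/scap-recompile" # recompiles texvc per deployed branch
echo "dsh -F25 -cM -g $DSH_GROUP -o -oSetupTimeout=10 '$REMOTE_CMD'"
```

Dropping the `echo` would make it actually fan the command out to every host in the group, which is exactly what was run by hand here.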
[20:10:02] Reedy: we could start the compilation in the background [20:10:10] and wait() for it [20:10:33] AaronSchulz: could you put scap-recompile in puppet ? [20:10:59] or is that provided by some magic debian package ? [20:11:35] I thought that was managed by packaging, so it would be on the apaches [20:11:49] seems to be provided by wikimedia-task-appserver [20:11:49] what we do need is that 1-line ddsh script :) [20:12:03] that belongs on the bastion [20:12:06] [21:04:32] dsh -F25 -cM -g mediawiki-installation -o -oSetupTimeout=10 'sudo -u mwdeploy /usr/bin/scap-recompile' [20:12:06] [21:04:34] but it should be there for convenience [20:12:17] Puppet should provide that I guess [20:12:17] sometimes I feel that wikimedia-task-appserver should be deprecated in favor of puppet [20:12:25] so Reedy volunteers to create that file then [20:12:43] Nah, I fixed the site [20:12:43] Reedy: do you have any post-branching test script ? [20:12:47] [20:12:47] * Reedy grins [20:13:32] ;) [20:13:41] oh man [20:13:51] wikimedia-task-appserver is evil [20:13:54] PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525/parsekit.so' - /usr/lib/php5/20100525/parsekit.so: cannot open shared object file: No such file or directory in Unknown on line 0 [20:13:54] PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525/redis.so' - /usr/lib/php5/20100525/redis.so: cannot open shared object file: No such file or directory in Unknown on line 0 [20:13:54] PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525/uuid.so' - /usr/lib/php5/20100525/uuid.so: cannot open shared object file: No such file or directory in Unknown on line 0 [20:13:56] it provides scap-2 scap-1 [20:14:01] Damn you Ubuntu 12.10 [20:14:06] parsekit should be in [20:14:22] I am pretty sure faidon has made it available in both Lucid and Precise [20:14:29] and that the puppet manifests ship it [20:15:01] I'm using whatever Q is ;) [20:15:11] hmm 
[20:15:11] :q [20:15:19] where have you found that error ? [20:15:25] locally [20:15:25] # deb http://apt.wikimedia.org/wikimedia precise-wikimedia main universe # disabled on upgrade to quantal [20:15:32] damn you ubuntu! [20:15:39] * Reedy installs stuff [20:16:07] php5-parsekit is provided via misc::deployment::scripts [20:16:22] The following packages have unmet dependencies. [20:16:22] php5-parsekit : Depends: phpapi-20090626 [20:16:22] E: Unable to correct problems, you have held broken packages. [20:16:22] :( [20:16:27] oh no [20:16:36] wrong php version :-] [20:16:36] I'll fix it later [20:16:40] you could pecl it [20:16:43] <^demon> Try with pecl? [20:16:52] something like sudo pecl install parsekit [20:17:15] <^demon> pecl will download it, phpize it, and make it. [20:17:15] (or use a real programming language like perl and its package system cpan) [20:17:46] <^demon> All programming-language-specific code repos suck. [20:17:51] <^demon> Pecl. Pear. Cpan. Gems. [20:17:59] <^demon> Every last one of the lot. [20:19:04] npm is nice [20:19:05] ); [20:19:27] composer (for PHP) is young but promising [20:19:27] but yeah [20:19:33] they all have some issues eventually [20:20:54] gaerh [20:21:02] I hate planes [20:21:13] either 6:55 am (too early) or 10:00 am (too late) [20:23:02] travel tip: "You will soon notice that almost everyone in the Netherlands speaks English fairly well" [20:23:02] what a great country [20:23:06] I need to relocate there [20:24:56] Looking for the hackathon? [20:26:10] Error: 1146 Table 'wikidb.parsertest_pagetriage_page' doesn't exist (192.168.0.212) [20:26:10] RAGE [20:27:12] Reedy: yeah in nov [20:27:15] Reedy: will you be there ? [20:27:21] Possibly [20:29:14] Reedy: at which one? [20:29:22] NL [20:38:54] hmm [20:39:05] Warning: Invalid argument supplied for foreach() in /usr/local/apache/common-local/php-1.21wmf2/includes/User.php on line 4192 [20:39:09] That's starting to annoy me now [20:39:33] again ? 
[20:39:38] oh no [20:39:44] sorry [20:40:01] I am not sure how we could properly catch such issues [20:40:20] maybe warnings should be made to throw exceptions when in developmentWarnings mode [20:41:02] https://gerrit.wikimedia.org/r/#/c/28550 [20:41:23] I'm confused how that fails as it did, with https://gerrit.wikimedia.org/r/#/c/7376/14/includes/User.php,unified working... [20:45:29] epic fails from flying blue [20:45:40] the browser window just disappeared!!! [20:58:01] hashar: I'm writing some tests for InitializeSettings.php (after having twice dropped a space instead of an underscore in a namespace, that sounded like a good idea). I wanted to reuse the db list from the Provide class. [20:58:08] [wgCapitalLinks] tlhwiki isn't a valid project nor a generic configuration key. [20:58:38] what is, or what was, tlh.? [20:59:32] oh, it's not the klingon one? [20:59:40] yep [21:00:14] I think I'm learning tests are useful for the present and not only for the future. [21:05:34] Dereckson that is the idea of unit tests [21:05:43] you describe past failures to prevent them from happening again [21:05:54] and describe what you think a function should do :-] [21:05:54] and implement it [21:06:00] that is a nice regression tool [21:06:18] one nice thing would be to load the default settings file from the wmf branch [21:06:27] and load CommonSettings [21:06:30] then run the test suite [21:06:54] which will be able to assert that enwiki bureaucrat members are allowed to move pages [21:06:56] (yes, but I didn't know it would also incidentally find current bugs) [21:06:56] and so on [21:07:14] IIRC Jenkins triggers the phpunit test suite and reports back to Gerrit [21:19:20] off to bed for now [21:19:25] good night Dereckson :-) [21:26:30] grmbl [21:26:37] When reusing the Hashar list [21:26:40] I missed a comment [21:26:42] # Skip files such as s1 [21:56:15] kaldari, https://gerrit.wikimedia.org/r/#/c/28613/ [22:04:45] E3 deployment running beyond our window but nothing else is scheduled. 
This is my first deployment... KABOOM [22:10:20] Tim-away: when can i run my script to move all country data langlinks to subpages? [22:12:19] I think you can run it whenever you're ready, did I say before that you had to wait for something? [22:13:39] no, but i think the job queue will be crying afterwards, so i waited for your ok. script is already tested and accepted by a zhwiki admin [22:14:19] TimStarling: then i'll start in maybe 10 minutes. i'll be online here while it is running [22:15:11] we'll have to fix the job queue after you run it, it doesn't matter what state it is in beforehand [22:17:00] InitialiseSettings.php contains in the 'wgAccountCreationThrottle' setting an entry for 'he' [22:17:30] Is it a generic way to designate hewiki, hewikisource, etc. or a legacy way to write 'hewiki'? [22:18:12] it should work for all hebrew language wikis [22:18:20] that's how it was designed anyway, I don't know if anyone has tested it recently [22:21:54] TimStarling: and where should one look to find out? [22:22:12] includes/SiteConfiguration.php [22:22:22] ok [22:22:31] TimStarling: do you know if https://gerrit.wikimedia.org/r/#/c/25737/ would work? [22:23:48] don't know [22:23:52] I would have to test it [22:31:12] ok, zhwiki job queue before start: 1.853.070 ;-) [22:40:36] Reedy: per bug 31600, you added a 'thwikt' => true to install an extension. I checked their [[Special:Version]]; that doesn't work, 'thwiktionary' in full is required. [22:41:21] (please don't fix it, I'm submitting in some minutes a change taking care of that and other issues) [22:55:23] - 3 private projects aren't in all.dblist: langcomwiki, strategyappswiki and comcomwiki [22:56:01] - 4 closed projects: ru_sibwiki, wikimaniawiki, iswiktionary, nomcom [22:56:11] (other private and closed projects are in all.dblist) [22:56:25] What's exactly the purpose of all.dblist? [22:58:27] it's everything! 
[23:00:17] langcom seems non existent [23:00:32] same for comcom [23:01:06] strategyapps also goes to meta's missing list [23:01:20] the all.dblist is useful for when you want to run a script against every wiki [23:01:51] so I should add the closed ones, ru_sibwiki, wikimaniawiki, iswiktionary and nomcom? [23:02:45] wikimania.wikimedia.org redirects to seemingly the current [23:03:02] iswiktionary exists [23:03:04] in all.dblist [23:03:28] oh yes sorry to have included it, it was the 'iswiktionary ' entry [23:03:56] hmm [23:04:10] fdcwiki missed the fa_sha1 patch... [23:04:41] just saw that [23:04:55] !log Added fa_sha1 to fdcwiki [23:05:09] Logged the message, Master [23:06:03] oh, fdcwiki.filearchive is empty [23:06:16] it's a relatively new wiki [23:06:24] does all.dblist still list non-wmf-wikis? [23:06:42] binasher: created on the 10th of October [23:06:43] pt-online-schema-change actually doesn't support empty tables [23:06:54] haha, sort of makes sense [23:06:54] i need to make my wrapper script around it support that case [23:07:08] Reedy: sounds annoying since scripts need to compensate [23:07:12] or patch pt-osc to run a regular alter [23:09:48] Speaking of dbs: The toolserver still waits for the enwiki dump I requested in August – is there any ETA for that? [23:10:20] DaBPunkt: ask binasher ;) [23:10:20] !log Ran populateFilearchiveSha1.php on private, wiktionary, wikiversity and wikinews wikis [23:10:32] Logged the message, Master [23:11:14] !log Added fa_sha1 to liquidthreads_labswikimedia [23:11:27] Logged the message, Master [23:12:10] DaBPunkt: Sebastian needs to reply to Erik Moeller regarding privacy concerns [23:13:48] * spagewmf shakes the cluster up a bit, according to https://wikitech.wikimedia.org/view/How_to_deploy_code#More_complex_changes:_sync_everything [23:14:33] eta is as soon as the ops team isn't in fear of crossing the legal team.. sorry it's been so long [23:15:02] binasher: *sigh* – I hate this wmf/wmde-stuff. 
I will message him and ask where the problem is [23:15:38] gn8 folks [23:15:51] Good night DaBPunkt [23:16:47] my scap complains about wikivoyage versions missing DB rows, and asked for spage@spence's password. [23:17:47] Reedy: how are things going? [23:18:04] All the contenthandler related fatals are gone now [23:18:12] \o/ [23:18:15] spagewmf: I fixed those [23:19:00] Reedy thanks. Do I need to rerun scap? (two extension changes in 1.21wmf1 & 2 ) [23:19:15] did it die out? [23:19:36] The scap is hung in something, won't continue in background [23:19:57] spage@spence's password. I could Ctrl-D past that? [23:20:15] errm [23:20:19] Past the spence prompt, it's working. [23:20:19] Are you forwarding your ssh key? [23:20:23] /agent [23:21:02] Reedy, ssh -A fenari, yes. Maybe I needed to accept spence's key? I've never ssh'd to it before. [23:21:23] it's asking for your password, meaning it doesn't know you have a key [23:23:59] Reedy, I pressed Ctrl-D and went past spence's password prompt, scap now printing srvxx: ok... snapshot1004: ok. Should I Ctrl-C this scap and restart? [23:24:22] Reedy, isn't it midnight in the UK? [23:24:52] half past midnight [23:24:55] It's early [23:25:15] I was all "kaldari++, \o/ ", but now he's abandoned me :-) [23:25:29] if the rest are continuing, they should be fine [23:25:43] i wonder why it complained for spence [23:26:27] !log populateFilearchiveSha1.php ran for all wikis on 1.21wmf2 [23:26:38] Logged the message, Master [23:27:14] Uh oh, it was stuck again for a while "spence: Connection closed by 208.80.152.161 \ Copying code to apaches... \ spage@spence's password: fenari: Copying to fenari...ok \ fenari: Done". [23:29:54] So after my two cleaning operations, we'll have configuration instructions for the following wikis not in all.dblist: ru_sibwiki, wikimaniawiki, nomcom. We also have stuff for two "ghost" projects not in .dblist files: zh-min-nanwikisource (which exists) and yiwikinews (which doesn't exist). 
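As noted earlier in the log, all.dblist exists so you can run a maintenance script against every wiki. A minimal sketch of that loop follows; the three-entry sample list is made up for illustration (the real file lives in the site config and holds one database name per line), the skip rule for shard entries like s1 follows the "# Skip files such as s1" comment quoted above, and the mwscript commands are only printed, not executed:

```shell
# Made-up sample of all.dblist: one database name per line.
cat > /tmp/sample.dblist <<'EOF'
enwiki
frwikisource
s1
EOF

# Loop over every wiki in the list; skip shard entries such as "s1" (dry run: echo only).
while read -r db; do
  case "$db" in s[0-9]*) continue ;; esac
  echo "mwscript parse.php --wiki=$db"
done < /tmp/sample.dblist
# prints a mwscript line for enwiki and frwikisource, skipping s1
```

In real use you would drop the `echo` and point the loop at the actual all.dblist, which is how one-off maintenance runs like the populateFilearchiveSha1.php pass above are typically fanned out.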
[23:31:24] Reedy, I dunno. I can ssh to e.g. srv193. spence just doesn't like any of my public keys, whereas srv193 "accepts key" for one of them. [23:33:45] The last lines are "snapshot1004: Copying to snapshot1004...ok \ snapshot1004: Done \ spence: Connection closed by 208.80.152.161 \ Finished" [23:34:31] Does scap usually end with a whimper? It should end with "CLUSTER ALL SHOOK UP" [23:36:00] ha [23:37:47] binasher: sumanah suggested i ask you about this. I am fixing bug 39675 by adding fields ar_id (primary key) and ar_logid (logging.log_id of the page deletion action) to the archive table. Both for this situation and for future situations, I was wondering how to decide whether to add an index for a given field. If it's a large number of rows and I anticipate it's going to be used as a WHERE condition, should I go ahead and add one? [23:37:57] Or are there costs to indexing that I should weigh against that? [23:39:17] what i mean is that there are a large number of rows in the archive table on any wiki that has had a lot of deletion activity, e.g. Wikipedia [23:42:05] is archive actually read from based on searches for anything other than rev_id or usertext? [23:43:47] i guess the performance tradeoff depends on how many insertions and deletions there will be to the table compared to how many queries, hmm. Well, I think there was talk that it might be desirable to group lists of deleted edits by deletion action. Perhaps I should write the rest of the patch and, when it's clearer what the implications of it are, come back to the index issue. [23:45:18] that sounds good - i generally like to see the query patterns before deciding [23:45:29] very glad archive is getting a primary key! [23:46:23] applying this migration to production will be quite a pain though, it'll take us back to needing to do master swaps on each shard. worth it though. [23:47:22] binasher: yes, I almost just want to add the primary key for the principle of the thing! 
but my concern really is that the tuple currently used might not always be unique [23:56:50] archive is crap [23:56:51] it really should be removed
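The schema change being discussed (an ar_id primary key plus ar_logid on the archive table) might look roughly like the following. This is only a sketch of the idea, not the actual patch from bug 39675: the column types, nullability, and the `/*_*/` table-prefix placeholder are my assumptions, and the statement is just printed here, not applied to any database.

```shell
# Emit the DDL for the archive-table change discussed above (sketch; types/defaults assumed).
DDL='ALTER TABLE /*_*/archive
  ADD COLUMN ar_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST,
  ADD COLUMN ar_logid INT UNSIGNED NULL DEFAULT NULL;'
printf '%s\n' "$DDL"
```

Per the conversation above, any secondary index on ar_logid would be deferred until the query patterns are known, and rolling this out in production would go through pt-online-schema-change (with a plain ALTER fallback for empty tables like fdcwiki.filearchive).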