[00:14:01] gn8 folks
[00:15:22] spagewmf: looks like scap has completed at least on the stuff fundraising cares about -- and! we've not yet managed to take down the site :)
[00:19:22] mwalker Yes, scap has done a whole bunch of machines, mwNN and now srvNNN, up to srv266. I wonder what a users' experience of the site is like during a scap? Maybe someone can give a scap/dsh talk.
[00:19:44] it can be interesting I know that
[00:30:04] mwalker, finished! 58 minutes.
[00:32:52] /j #hpavc
[00:32:56] bah
[00:57:51] all; fundraising has just discovered it needs to also update wmf2; so we're going to syncdir centralnotice
[05:26:36] guys
[05:27:04] wikitech.wikimedia is down ?
[05:27:12] wikitech.wikimedia.org is down ?
[05:27:15] i can't access it.
[05:27:42] It's been having some issues today
[05:27:44] hum, ditto.
[05:28:11] (access, not issues)
[05:28:53] ori-l: Been receiving a few random database errors during some periods while trying to view recent changes or my watchlist there
[05:30:11] net::ERR_TIMED_OUT
[05:30:25] with chrome.
[05:30:56] TimStarling: any idea what's going wrong with wikitech wiki?
[05:32:42] because i am here http://meta.wikimedia.org/wiki/Wikimedia_servers
[05:33:02] the root partition is full
[05:33:17] not surprising since it is only 12GB
[05:33:27] and the page of server roles are on wikitech.wikimedia.org.
[05:34:21] *is
[05:34:24] sorry
[05:38:08] I tossed a couple files from /root, it will gain us a little time but not much
[05:38:16] isn't there a static mirror somewhere?
[05:38:21] seems like a useful thing to have
[05:40:23] I'm shutting it down to resize its root partition
[05:40:42] I would do !log but I don't think it would work right now
[05:41:34] ok
[05:42:27] there's anotehr older backup that I can pull off that's over 1 gb but resizing is much better, the wiki is only going to get larger
[05:43:40] editor engagement!
[05:44:21] yeah, I saw that too, I figured it wasn't worth mucking around with
[05:45:03] it still had 8GB left in its account allocation, and if we use that up, we can always buy more
[05:45:45] sweet
[05:45:56] I'll do a bit of moving old files off afterwaords, can't hurt
[05:46:22] !log wikitech.wikimedia.org had a full root partition. I resized it to fill up the current account allocation
[05:46:25] Logged the message, Master
[05:46:37] we can also run compressOld
[05:46:44] presumably those server admin logs will compress pretty well
[05:47:47] yep
[05:48:30] the debug.log in /tmp was also half a gig
[05:48:49] I'm making a new partition for /var/log
[05:49:15] I think it's pretty stupid to have /var/log on the same partition as everything else
[05:49:31] otherwise it's trivial to DoS the server by filling it up
[05:49:46] while you're there maybe you can see why debug.log was written to /tmp instead of somewhere useful
[05:55:08] guys, it's possible to buy a dedicated server and help to host wikipedia ?
[05:55:31] crap...
[05:55:53] uhoh
[05:56:07] what ?
[05:56:07] forgot to put it in /etc/fstab before I rebooted
[05:56:20] oh ok
[05:56:30] that's alright, it's not like it's an important service or anything, right?
[05:57:52] cortexA9: what's wrong with linode?
[05:58:15] linode hosts esams?
[05:58:27] linode hosts wikitech
[05:58:37] oh, makes sense
[05:58:41] evoswitch hosts esams, hence the "es" in the name
[05:59:19] ok, working now
[05:59:22] i guess i always half-imagined esams was one of the hobbits from lotr
[05:59:53] what was knams I wonder
[06:00:00] kennisnet
[06:00:14] no no, I mean as a mythincal creature which one was it
[06:00:16] *mythical
[06:00:28] the hard consonant identifies it as evil
[06:00:33] maybe the "k" is silent
[06:00:44] hmmmmmmmmm.
[06:01:02] TimStarling, linode ? i don't know. what's wrong ?
:-)
[06:01:25] cortexA9: i think that was in reference to your offer to buy a dedicated server
[06:01:40] cortexA9: I thought you meant to fix wikitech.wikimedia.org
[06:01:50] (i'm accepting all offers, by the way)
[06:01:53] nono. another thing.
[06:01:55] but as a private individual, alas.
[06:02:31] because i want to help wikipedia. but how to do that ?
[06:02:48] *But
[06:03:04] if you have money you would probably be better off giving us the money than buying us things with it and giving us the things
[06:03:39] oh ok.
[06:03:45] since at least when we choose things to buy ourselves, we can be pretty sure we need them
[06:04:29] evoswitch is 100 mbit ?
[06:05:29] no...
[06:08:06] I think we have about 4 10Gbps links out of it
[06:08:23] cool.
[06:09:10] cortexA9: Wikimedia is the 5th-most-visited website on the web. You can expect them to have really high-end connectivity and hardware setups.
[06:09:38] it would be nice if you donated a whole datacenter completely, but of course you probably can't afford that
[06:09:59] we use 100Mbps for serial consoles and things like that, I think
[06:10:09] 1 Gbps is really too small for servers these days
[06:10:40] we have several servers with 2x1 Gbps bonded links, because they can fill up a 1 Gbps pipe too easily
[06:11:00] yes i know
[06:11:00] :)
[06:11:31] it's a pity on-board 10 Gbps ethernet is still expensive
[06:11:51] TimStarling, do you know OVH ?
[06:12:16] no
[06:12:23] yep, we have some toolserver collocation there
[06:13:19] Jasper_Deng, i see they provide 300 mbit outside.
[06:13:53] !log on wikitech: disabled debug log
[06:13:59] Logged the message, Master
[06:14:22] it's not enough for wikipedia i think. right ?
[06:14:45] cortexA9: Wikipedia won't take a webhost for production wikis
[06:15:16] except those that would give a full, dedicated, rack to us
[06:17:14] Jasper_Deng, oh many money i think.
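[Editor's note: earlier in the log, wikitech went down because its 12GB root partition filled up, and Tim split /var/log onto its own partition precisely because a runaway log can otherwise DoS the whole box. A minimal early-warning check along those lines can be written with Python's stdlib; the 90% threshold and the path list here are illustrative assumptions, not anyone's actual monitoring config.]

```python
import shutil

def percent_used(path="/"):
    """Return the percentage of the filesystem at `path` that is in use."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

def over_threshold(paths=("/", "/var/log"), threshold=90.0):
    """Yield (path, pct) for each filesystem above the threshold.

    Separate mount points are checked independently, which is the point of
    giving /var/log its own partition: its pct can hit 100 without / moving.
    """
    for p in paths:
        pct = percent_used(p)
        if pct >= threshold:
            yield p, pct
```

A cron job printing `list(over_threshold())` would have flagged the 12GB root partition long before MediaWiki started failing.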
[06:17:24] http://en.wikipedia.org/wiki/Coral_Content_Distribution_Network
[06:17:36] cortexA9: there's a reason why we need about $10 mil/year in donations
[06:17:58] is this still a Thing? wondering if there's some trivially easy to have a mirror of wikitech
[06:18:22] coral still exists if that's what you're asking
[06:18:25] *way
[06:18:36] what about CloudFlare ?
[06:18:58] i mean, are there enough nodes for it to be responsive
[06:19:22] the easy way to have a mirror is to use whatever script was already written for it
[06:19:30] maybe the script is still working, maybe it needs a one-line fix or something
[06:20:34] !log upgrading wikitech to MW 1.17 branch head
[06:20:39] Logged the message, Master
[06:24:00] 1.17? man that is old
[06:24:32] better than 1.17wmf1
[06:25:09] need mediawiki-1.18.1.tar.gz for anything? otherwise I will toss it over there
[06:25:31] no, I don't need it
[06:25:39] I remember when the upgrade attempt to 1.19 failed
[06:25:53] Jasper_Deng, how to decrease the costs ? with cloud ?
[06:26:05] ?
[06:26:59] yes i mean less costs.
[06:27:16] it's not possible ?
[06:27:59] costs of what?
[06:28:17] you say 10 mil/year.
[06:28:36] not all is IT
[06:28:40] wikitech-static.wikimedia.org
[06:28:41] ηθη
[06:28:43] huh
[06:28:50] it's 2009 so completely useless
[06:28:54] but it exists
[06:29:32] it's on the same linode
[06:29:38] presumably the idea is to rsync it to somewhere else
[06:29:59] TimStarling: you hope that that was the idea
[06:30:01] yes, I found it by digging through the apache configs over there
[06:31:51] apergos hey
[06:31:51] I pulled a copy of your wikitech-2012-11-02.tar.gz to tridge
[06:31:54] you have a moment?
[06:32:03] what's up?
[06:32:12] I was going to ask if you are still interested in wiki renames
[06:32:21] I follow them once in a while
[06:32:30] alright
[06:32:34] if there are bugs filed I do a sweep but not very often
[06:32:40] the thing is there is a proposed procedure in tech wiki
[06:32:50] it needs to be tested
[06:33:01] I am trying to find someone who would do just that
[06:33:21] what procedure?
[06:33:30] https://bugzilla.wikimedia.org/show_bug.cgi?id=19986
[06:33:39] https://wikitech.wikimedia.org/view/Rename_a_wiki
[06:33:53] renaming wikis? oh
[06:34:21] sorry, thought you were taling about user renames
[06:34:30] nop :)
[06:34:30] fair assumption
[06:34:38] some requests are 6 year old
[06:34:39] wiki renames I've never meesed with
[06:34:48] and the backlog is increasing
[06:34:55] the user formerly known as various different sorts of cat comes in asking about renaming something
[06:35:02] apergos to be fair no body has messed with it :/
[06:35:11] must be users!
[06:35:30] * ToAruShiroiNeko fus-roh-dah's TimStarling
[06:35:57] I don't think I'm going to be helpful on that bug
[06:36:38] also the wikitech page is blank right now :-P
[06:37:01] yeah sorry, I'm fixing it
[06:37:15] I'm going to do my upgrade offline, the svn working copy is a bit messed up
[06:37:37] should work now
[06:37:56] yes, works
[06:38:00] ok, thanks
[06:41:59] oh, wikitech is running hardy
[06:42:03] that's special
[06:42:13] oh joy
[06:42:22] no wonder my svn foo wasn't quite working as anticipated
[06:42:26] ouch
[06:43:40] TimStarling you anticipate svn to work?
[06:43:46] murphy!
[06:43:46] root@wikitech:/srv/org/wikimedia/wikitech# svn switch --force http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_17/phase3
[06:43:50] Subcommand 'switch' doesn't accept option '--force'
[06:43:59] booooo
[06:44:13] try --fusrohdah
[06:44:16] :p
[06:44:17] I thought I must have been misremembering, but no, there is actually a --force option to svn switch in recent subversions
[06:44:54] TimStarling I am guessing you arent a skyrim fan :/
[06:45:17] I am not
[06:45:22] TimStarling / apergos: https://rt.wikimedia.org/Ticket/Display.html?id=3334 (#3334: Update Wikitech mediawiki)
[06:45:29] i'd update but it sounds like you guys aren't done and more fun may be in store
[06:45:36] don't want to jinx it
[06:45:54] you are missing out :/
[06:45:55] it's after next week
[06:46:21] apergos how can I find a dev willing to carry out that proposal?
[06:46:38] I have been seeking people for weeks :/
[06:46:48] ask on wikitech-l?
[06:46:51] it looks like someone has tried to extract MW 1.19 over the top of it
[06:46:59] and then somehow reverted by copying the old files back in
[06:47:00] I have tried everything short of fishing rod & bait
[06:47:03] so now every new file already exists
[06:47:17] "Ugh, i just tried this but it was a FAIL...so i rolled back for now and will try again... "
[06:47:27] ok, that would be the failed upgrade attempt mentioned earlier
[06:47:57] I've upgraded wikis in a worse state than this
[06:48:00] apergos I already posted on wikitech-l
[06:48:09] I even did an upgrade for wikia back in 2006
[06:48:16] talk to the current bugmeister next I guess
[06:48:24] it already has 48 topics before it
[06:48:38] also https://rt.wikimedia.org/Ticket/Display.html?id=3028 (#3028: create a regular backup of Wikitech (Linode) for recovery purposes) makes me suspect there may not be a script to fix. couldn't find it by grepping through my various cloned repos but i don't think i have all the possibly relevant repos handy.
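[Editor's note: the periodic off-host mirror job that RT #3028 asks for might look something like the sketch below, which assembles an rsync invocation via Python's stdlib. The host names and paths are invented placeholders, not the real Linode or tridge layout; a real job would presumably also rotate dated tarballs like the wikitech-2012-11-02.tar.gz mentioned above.]

```python
import subprocess

def build_rsync_cmd(src, dest, dry_run=False):
    """Assemble an rsync invocation for mirroring a backup directory off-host."""
    cmd = ["rsync", "-az", "--delete"]
    if dry_run:
        cmd.append("--dry-run")
    cmd += [src, dest]
    return cmd

def mirror(src="/srv/backups/wikitech/", dest="backup-host:/srv/mirrors/wikitech/"):
    # Placeholder paths -- run from cron so recovery doesn't depend on the
    # same machine (or the same Linode account) staying healthy.
    subprocess.run(build_rsync_cmd(src, dest), check=True)
```

The command-building and the execution are separated so the invocation can be inspected (or dry-run) before anything touches the remote side.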
[06:48:54] and I sent it to wikitech-l on 23 October
[06:49:02] svn diff is empty, it's basically a walk in the park ;)
[06:49:09] heh
[06:49:31] I don't think there are any backup scripts or any of that for wikitech
[06:50:39] apergos, any other recommendations?
[06:51:01] besides seeing if the bugmeister has some suggestions?
[06:51:04] not so much
[06:51:55] got to got for dinner soon, so I'd better not start anything too serious
[06:51:57] if you have brought it up in wikimedia-dev during waking hours for sf folks and people aren't willing to look at it right now
[06:52:31] TimStarling: that's fine, the immediate issue is fixed for a good while anyways
[06:53:35] I gotta get dressed and look for breakfast
[06:53:55] and break myself of the bad habit of sitting down at the computer first thing after waking up
[06:54:01] still in pjs...
[06:55:02] apergos: Don't worry, I do it too :)
[06:55:25] if http://wikitech.wikimedia.org.nyud.net/view/Main_Page looks up-to-date, is that an indication it gets cached regularly, or did i just trigger it by navigating there, and it wouldn't have worked if the site was down?
[06:56:24] ori-l: you triggered it by navigating there
[06:56:40] Coral is just a caching proxy, not a mirror of the whole internet
[07:08:47] @add #wikimedia-office
[08:37:37] @add #wikimedia-e3
[08:37:37] Permission denied
[08:37:47] bah
[08:37:53] you need sudo ;)
[08:38:00] I can add it I think if you want
[08:38:05] I think wikimedia/ works
[08:38:13] @add #wikimedia-e3
[08:38:13] Permission denied
[08:38:16] guess not
[08:38:17] Type @commands for list of commands. This bot is running http://meta.wikimedia.org/wiki/WM-Bot version wikimedia bot v. 1.9.0.0 source code licensed under GPL and located at https://github.com/benapetr/wikimedia-bot
[08:38:33] i think i need to add it from its channel
[08:38:45] ahhh
[08:38:47] yeah you do
[08:38:54] #wm-bot or whatever. i did it before once but the bot became buggy and flooded the channel with error reports.
[08:39:01] i don't think its rss parser is too robust.
[08:39:16] I knew I had done it on my own for the shop channel
[08:39:24] what were you using it for? git?
[08:39:26] but i think i'll give it another chance
[08:39:36] hey
[08:39:40] you can do that from #wm-bot :P
[08:39:45] um, i tried with bugzilla rss and atom gitweb
[08:39:54] ok I can help you
[08:39:57] which channel?
[08:40:06] #wikimedia-e3
[08:40:34] if you want to insert this bot to any channel, you need to have wikimedia cloak (wikimedia/wikipedia/mediawiki/wikidata etc)
[08:40:47] i'm wikipedia/ori-livneh
[08:40:59] + you need to be admin (which you are in #wm-bot if you have any wm cloak)
[08:41:19] sneaky
[08:41:44] petan: did you add the wikivoyage/wikidata cloaks? :)
[08:41:44] it's kind of described in @help
[08:41:53] I don't really know... but I can do it right now
[08:42:32] done
[08:42:34] lies! It says you can do it in the #wm-bot channel but it doesn't say you need cloak + op ;) It just does't matter since cloak = bot in that channel :P
[08:42:48] hence sneak ;)
[08:42:57] eh, you don't need op
[08:43:04] you just need to have a cloak
[08:43:12] if you have cloak, bot automatically recognize you as admin
[08:43:13] ah nvm then, I retract
[09:01:55] I'm just going to upgrade it to 1.19, it's no big deal
[09:07:52] oh yay
[09:08:03] can't sweettalk you into 1.20? :-)
[09:31:55] there's no 1.20
[09:36:14] wll there's 1.20wmfx
[09:36:19] *well
[09:47:00] 1.19 was the last version with extension branches
[09:47:44] and nobody has adjusted their attitude towards backwards compatibility, so probably there will be no way to get working extensions for release tarballs in future
[09:57:15] !log upgraded wikitech to MW 1.19.2
[09:57:25] Logged the message, Master
[10:02:33] awesome
[10:06:05] apergos: did you back up the database recently?
[10:07:07] no, not me
[10:07:40] ok
[10:13:18] it's using a latin1 charset on text fields
[10:13:43] now I have to remember how to fix that...
[10:15:31] how did that happen?
[10:17:40] blame brion
[10:21:55] !log converting wikitech database from latin1 to binary
[10:25:42] Logged the message, Master
[10:34:30] !log on wikitech, running compressOld.php
[10:34:47] Logged the message, Master
[12:28:51] Dereckson: can you submit a patch for https://bugzilla.wikimedia.org/show_bug.cgi?id=32411 ?
[13:21:33] i wanted to do mining on controversial articles in wikipedia ,http://en.wikipedia.org/wiki/Wikipedia:List_of_controversial_issues , how do i go about collected the dataset , is there a special sql column in the schema which identifies the article as controversial , any special export function?
[13:22:04] is there any better way than manual sql query for each article name
[13:22:06] <^demon> BlankVerse: No, that's just a list the community maintains.
[13:22:33] BlankVerse: I did tell you there's no sql column for it already :(
[13:22:33] <^demon> There's no sort of field to denote controversial or other subjective criteria like that.
[13:23:08] QueenOfFrance: oops , sorry , i just copied my query here
[13:23:56] Most of the mining papers on controversial articles use those articles as training data set
[13:24:34] I am running out of ideas on how to get the articles on that link efficiently..
[13:25:03] other than manually querying each article?
[13:25:41] category links
[13:26:40] Nemo_bis: means/
[13:29:49] oh, no templates
[13:29:53] then outgoing links
[13:31:08] which templates?
[13:32:51] Nemo_bis: I did not understand your solution , could you please explain a little more?
[13:38:59] BlankVerse: https://www.mediawiki.org/wiki/API:Properties#links_.2F_pl
[13:39:02] I guess
[14:08:52] andre__: could you add me in the default cc list (dereckson@espace-win.org) for two components? Product: Wikimedia. Relevant components: Extension setup and Site configuration.
[14:08:56] Thank you.
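[Editor's note: Nemo_bis's advice to BlankVerse above — collect the outgoing links of the list page through the API's `prop=links` (pl) module rather than one SQL query per article — can be sketched like this with Python's stdlib. It targets the present-day JSON API with its `continue` mechanism, which differs from what the 2012-era API returned by default, so treat it as an illustration of the approach rather than exactly what was available then.]

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"

def links_query_url(title, plcontinue=None):
    """URL for one page of prop=links results (the API:Properties links/pl module)."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "links",
        "plnamespace": 0,   # article namespace only
        "pllimit": "max",
        "format": "json",
    }
    if plcontinue:
        params["plcontinue"] = plcontinue
    return API + "?" + urlencode(params)

def outgoing_links(title):
    """Collect every article-namespace link from `title`, following continuations."""
    cont, out = None, []
    while True:
        with urlopen(links_query_url(title, cont)) as resp:
            data = json.load(resp)
        for page in data["query"]["pages"].values():
            out += [link["title"] for link in page.get("links", [])]
        cont = data.get("continue", {}).get("plcontinue")
        if not cont:
            return out
```

`outgoing_links("Wikipedia:List_of_controversial_issues")` would then yield the article titles as a training set in one pass, instead of a manual query per name.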
[14:09:09] Nemo_bis: noted, I'll look at this bug around 7 pm CET
[14:10:03] Dereckson, done
[14:11:30] Dereckson: thanks
[14:12:08] Thanks andre__.
[14:28:57] Tim-away: Thank you for the wikitech wiki upgrade! I marked https://bugzilla.wikimedia.org/show_bug.cgi?id=37317 as resolved.
[14:29:05] andre__: Do you have RT access?
[14:30:07] Brooke, yes
[14:30:07] RT 3334 can probably be marked as resolved.
[15:43:36] what would be the api call to download a wikipedia article with title "foo" in xml format?
[15:43:42] only the text of the article
[15:47:58] BlankVerse: text or wikitext?
[15:48:14] (all this should be on #mediawiki btw)
[15:48:30] text
[15:48:35] not possible
[15:48:50] http://en.wikipedia.org/w/api.php?action=query&titles=San_Francisco&format=xml&export
[15:50:58] is this the best ?
[15:51:32] or am i missing some api params?
[15:51:32] "best" is subjective
[15:52:06] "best" here means as close to the text of the wikipedia page
[15:52:24] I suspect so
[16:54:50] * ToAruShiroiNeko fishes for a dev using a fishing pole with haloween candy tied to the end
[16:55:25] maybe I should go for a yarn ball :/
[16:55:54] Reedy: can you enable transcoding on test2 for me (https://gerrit.wikimedia.org/r/#/c/30954/) wanted to test the transcoding infrastructure
[16:58:26] Sure
[17:00:55] j^: done
[17:03:46] Reedy: ups that should have been test2wiki not test2
[17:03:54] should i push a new changeset?
[17:04:50] haha
[17:04:52] please
[17:05:04] Reedy: https://gerrit.wikimedia.org/r/#/c/31403/
[17:36:49] thanks andre__ , +1 page merge https://www.mediawiki.org/w/index.php?title=Bug_management/Bugzilla_usage&diff=0&oldid=566024
[17:37:38] Nemo_bis: heh, sure. :) I need to start with some bits, as I don't like the current Bug management docs
[19:13:16] does the current mw version on pt.wp allows the preload?
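[Editor's note: the export query pasted at 15:48 (`api.php?action=query&titles=San_Francisco&format=xml&export`) can be used from a script roughly as below, again with Python's stdlib. The sketch adds the real `exportnowrap` parameter so the response is the bare XML dump instead of a dump embedded in API XML, and it deliberately ignores the export schema's namespace URI, since the schema version varies across MediaWiki releases.]

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"

def export_url(title):
    """Same kind of query as pasted in the channel, plus exportnowrap
    so the body is the raw <mediawiki> export document."""
    return API + "?" + urlencode({
        "action": "query",
        "titles": title,
        "export": 1,
        "exportnowrap": 1,
    })

def page_wikitext(export_xml):
    """Pull the revision wikitext out of an export dump, namespace-agnostically."""
    root = ET.fromstring(export_xml)
    for elem in root.iter():
        if elem.tag == "text" or elem.tag.endswith("}text"):
            return elem.text
    return None

def fetch_wikitext(title):
    with urlopen(export_url(title)) as resp:
        return page_wikitext(resp.read())
```

As noted in the channel, this gets you wikitext, not rendered text; nothing in the export gives you the "text of the page as displayed".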
I'm getting an "[441391ef] 2012-11-02 19:11:51: Fatal exception of type MWException"
[19:21:44] https://pt.wikipedia.org/wiki/User:Aleth_Bot/common.css?action=edit&section=new&preload=foo
[19:21:49] causes "[1a7d9242] 2012-11-02 19:20:40: Fatal exception of type MWException"
[19:22:03] as reported on https://bugzilla.wikimedia.org/show_bug.cgi?id=41706
[19:22:19] can anyone get a backtrace for that?
[19:22:38] stacktrace I mean
[19:27:04] helderwiki: I got one
[19:27:36] great!
[19:28:08] I've got a feeling that might be a dupe
[19:28:39] I'm guessing so-- contentHandler complaining about text/css
[19:28:55] Though, possibly not an open one..
[19:29:28] Format text/css is not supported for content model wikitext
[19:29:30] BTW: cant those error messages be more descriptive/useful?
[19:29:41] There's a bug about that ;)
[19:29:41] (they all look the same )
[20:56:13] Is it normal to get the following error frequently?
[20:56:14] Failed to load resource: the server responded with a status of 403 (Forbidden) http://bits.wikimedia.org/en.wikipedia.org/load.php?debug=false&lang=en&modules=mediawiki.action.history.diff&only=styles&skin=vector&*
[21:12:39] Reedy: you're deploying a new ProofreadPage?
[21:12:54] namespace updates
[21:13:05] ah, thanks
[21:14:08] gerrit down?
[21:14:25] (you can ignore that, I just had to say that to get it back up)
[21:14:49] Nikerabbit: It's good that you understand the arcane IRC command interface that ^demon set up
[21:15:11] * ^demon wishes he knew why gerrit kept restarting itself
[22:20:36] anyone here?
[22:20:49] what does http://wikimediafoundation.org/wiki/Special:Contact point to?
[22:21:29] yukpn: There are a few people here, yes
[22:22:05] would anyone know the config parameters and whether that even works?
[22:22:21] yukpn: It's probably in a config repo somewhere
[22:23:50] marktraceur: would you know which one?
[22:24:13] yukpn: I'm searching now, but there seem to be a large number of config files (as you might imagine)
[22:24:43] oh yes, it's a maze but I was wondering if a WMF person might have put it there and would know
[22:25:23] is there a Daniel Kinzler anywhere?
[22:25:27] it probably goes to OTRS
[22:25:43] yukpn, he's DanielK_WMDE
[22:26:22] yuvipanda: It goes to a user on the wiki named WikiAdmin
[22:26:37] ?
[22:26:43] Sorry, misping
[22:26:46] marktraceur: I just fixed my nick
[22:26:52] ah, figured
[22:26:53] UserMono: Yes, now I see :)
[22:27:22] UserMono: However, that user may not exist? I don't know, there must be some magic happening somewhere.
[22:27:30] DanielK_WMDE, looking for info on http://wikimediafoundation.org/wiki/Special:Contact and you wrote the extension - could you help?
[22:27:41] I don't think it works, but it would be nice if it did
[22:28:11] UserMono: i wrote the original version, but havn't touched it in years.
[22:28:17] i doubt i'd recognize the code ;)
[22:28:38] DanielK_WMDE: The bad news is that I think you would actually
[22:28:39] others have expanded it quite a bit
[22:28:53] hmm.
[22:28:53] DanielK_WMDE: Do you happen to know who deployed it and/or configured it? :)
[22:29:02] nope
[22:29:12] i wrote it for my own website back then :)
[22:29:38] UserMono: how does it not work?
[22:29:39] Hmm no you're right, it has been modified quite a bit
[22:29:47] UserMono, it's probably in the Server admin log... somewhere...
[22:30:09] I'm checking bugzilla - but it's probably ancient
[22:30:24] it's probably not stored there
[22:31:19] you may have more luck in the server admin log, although not knowing the deployment date, you may need to search several pages
[22:34:26] !log Created TMH table on all wikis
[22:34:29] Logged the message, Master
[22:35:11] onoudidnt
[22:35:41] AaronSchulz: make sure you add that to addWiki.php too
[22:36:04] * AaronSchulz volunteers Reedy
[22:36:10] pfft
[22:36:49] bug 15624 is relevant
[22:38:18] Reedy: https://gerrit.wikimedia.org/r/#/c/31572/1
[22:38:42] Reedy: can you take a look at https://bugzilla.wikimedia.org/show_activity.cgi?id=24682 and tell me if you modified that bug?
[22:38:58] Yes
[22:40:13] Reedy: it seems like that bug documents the same problem we have right now
[22:40:44] And?
[22:41:29] well, you closed it so I was wondering if you had any idea what happened
[22:50:01] is there someone available with access to labs, who would please restart a bot for me? COIBot, the instructions are at root of the account
[22:51:48] sDrewth: Which labs instance?
[22:54:18] marktraceur: :-( I don't know
[22:54:45] wmflabs
[22:55:03] * sDrewth shrugs and checks Beetstra's notes
[22:55:46] sDrewth: I may be able to help decipher if they're public
[22:56:16] https://meta.wikimedia.org/wiki/User_talk:Beetstra#COIBot_is_AWOL
[22:57:17] Damianz: ^^^
[22:57:19] bots-2 I think
[22:57:32] thx
[22:57:39] I can do it in a few if needed, just eating
[22:58:18] someone, sometime is my wish, the screaming urgency it isn't