[00:26:41] !log awjrichards synchronizing Wikimedia installation... :
[00:26:45] Logged the message, Master
[00:43:06] sync done.
[01:59:08] sync-file is run relative to the common directory, right? So to sync the InitializeSettings file you would do...
[01:59:18] sync-file wmf-config/InitializeSettings.php
[01:59:20] ?
[02:07:39] Could not open input file: /home/wikipedia/common/wmf-config/InitializeSettings.php
[02:07:39] Aborted due to syntax errors
[02:07:41] hmm
[02:10:43] !log LocalisationUpdate completed (1.20wmf4) at Thu May 31 02:10:43 UTC 2012
[02:10:49] Logged the message, Master
[02:17:50] !log kaldari synchronized wmf-config/InitialiseSettings.php 'syncing InitialiseSettings to disable PageTriage on test2'
[02:17:55] Logged the message, Master
[02:18:23] hehe, I spelled InitialiseSettings wrong :)
[02:32:48] !log LocalisationUpdate completed (1.20wmf3) at Thu May 31 02:32:48 UTC 2012
[02:32:52] Logged the message, Master
[02:48:40] [[Tech]]; Helder.wiki; /* Adding a section to the "Special characters" portion of WikiEditor toolbar */ See also this topic: [[mw:Extension_talk:WikiEditor/Toolbar_customization#Adding_characters]].; https://meta.wikimedia.org/w/index.php?diff=3800587&oldid=3800285&rcid=3326519
[02:56:31] [[Tech]]; MZMcBride; /* Adding a section to the "Special characters" portion of WikiEditor toolbar */ +reply; https://meta.wikimedia.org/w/index.php?diff=3800596&oldid=3800587&rcid=3326525
[03:44:17] [[Tech]]; Helder.wiki; /* Adding a section to the "Special characters" portion of WikiEditor toolbar */ You could try this alternative "customizeToolbar" (based on Danmichaelo's code): ...; https://meta.wikimedia.org/w/index.php?diff=3800657&oldid=3800596&rcid=3326563
[04:16:12] Wikipedia down?
[04:17:01] oops
[04:17:07] can someone revert it?
[04:24:13] that one might be stale, but it's the last one I had
[04:51:29] @info db38
[04:51:29] jeremyb: [db38: s1] 10.0.6.48
[04:52:18] no kaldari
[04:52:25] tab is your friend, kaldari!
[05:11:38] are null edits supposed to be happening again?
[05:11:40] http://en.wikipedia.org/w/index.php?title=Park_Ji-Sung&diff=prev&oldid=495245545
[05:11:49] http://en.wikipedia.org/w/index.php?title=Park_Ji-Sung&diff=next&oldid=495245545
[05:11:56] back to back
[05:12:04] it's not the first I've seen tonight
[05:12:16] http://en.wikipedia.org/w/index.php?title=Tesla_Motors&diff=495244797&oldid=495244759
[05:12:32] http://en.wikipedia.org/w/index.php?title=List_of_Dallas_episodes&diff=prev&oldid=495244969
[05:13:53] * slakr looks around
[05:18:34] https://bugzilla.wikimedia.org/show_bug.cgi?id=37225
[05:18:39] looks like that's related
[07:35:11] !log nikerabbit synchronized php-1.20wmf4/extensions/TranslationNotifications/SpecialTranslatorSignup.php 'Tempfix for fatal - bug 37235'
[08:08:12] Hi, can someone tell me who I should contact about the Amsterdam cluster?
[08:08:30] I'm here at the Wikimedia Netherlands office, and a package from FedEx has arrived here. We don't know anything about it, but I thought it might be for the Amsterdam cluster.
[08:10:57] huh
[08:11:01] well mark would be the person
[08:11:24] in #wikimedia-operations
[08:11:38] is there any indication of the contents (shipper or anything)?
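The sync-file exchange above (01:59-02:07) illustrates the two rules that tripped the deployer: paths are given relative to /home/wikipedia/common, and a file that fails PHP lint aborts the sync. A minimal sketch of the working invocation follows; the explicit php -l pre-check is my own illustration rather than part of the actual tooling, and note the real file is InitialiseSettings.php, with an "s".

```bash
# Hedged sketch: sync-file paths are relative to the common directory, and
# a syntax error aborts the sync ("Aborted due to syntax errors" above).
cd /home/wikipedia/common
php -l wmf-config/InitialiseSettings.php && \
    sync-file wmf-config/InitialiseSettings.php 'disable PageTriage on test2'
```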
[08:12:36] FedEx, sent from Sunnyvale, California with destination Haarlem, Netherlands
[08:12:58] meh, not enough info
[08:14:24] FedEx has mailed WikimediaNL, asking for the data necessary for the customs check
[08:14:34] ugh
[10:39:17] !log reedy synchronized php-1.20wmf3/extensions/UploadWizard/ 'Push trunk UW to cluster'
[10:40:41] !log reedy synchronized php-1.20wmf4/extensions/UploadWizard/ 'Push trunk UW to cluster'
[10:43:54] !log reedy synchronized php-1.20wmf3/extensions/UploadWizard/ 'Push trunk UW to cluster'
[10:43:59] Logged the message, Master
[10:43:59] !log reedy synchronized php-1.20wmf4/extensions/UploadWizard/ 'Push trunk UW to cluster'
[10:44:03] Logged the message, Master
[10:52:06] apergos: Free to answer questions?
[10:52:45] let's see if I know the answers!
[10:53:05] it's about the media tarball samples
[10:53:17] already?
[10:53:23] remote-media seems redundant, if we are going to dump Commons?
[10:53:30] I do not plan to do so
[10:53:35] that's 14 T by itself
[10:53:38] damn...
[10:53:41] someone wants it, they can rsync it
[10:54:10] people who want media per project have to do a little futzing to get the remotely hosted media, so the tarballs are reasonable to provide
[10:54:16] for commons, you just rsync the whole thing
[10:54:26] hmm...
[10:54:47] someone at the archive is already using the media lists to build media bundles for the archive there
[10:54:48] okay then
[10:55:01] for all wikis?
[10:55:08] not sure about that
[10:55:22] oh great, then it's not my job to worry about :
[10:55:25] :P
[10:55:29] I have to check in with them once I have the "incremental" tarballs going
[10:55:39] no need to worry now, that's right
[10:55:43] next month, that's the answer I got
[10:56:04] I am worried because I don't have any server that can split the file up and upload to the Archie
[10:56:06] *Archive
[10:56:18] don't worry
[10:56:20] be happy
[10:56:23] :)
[10:56:42] so it's just HTML and the other normal dumps that I am doing now
[10:56:58] are you using the gluster share to get those?
[10:57:02] no
[10:57:02] that's what it's there for
[10:57:11] it's screwed
[10:57:18] how so?
[10:57:30] uploading is giving trouble with I/O
[10:57:47] but I have my tactics haha
[10:58:15] please work with whoever you need to on the labs project to get the I/O issue settled
[10:58:30] I can't do anything
[10:58:39] I am only running one instance of upload
[10:58:45] *uploading
[10:58:58] and it's very slow
[10:59:17] you can describe it to folks in the labs channel and see what their plans are
[10:59:19] adding one more breaks everything AFAIK
[10:59:48] there seems to be some discussion about fixing gluster itself
[10:59:54] since we are using the dev version
[10:59:58] uh huh
[11:00:49] so I just need to wait...
[11:00:54] I guess so
[11:00:57] it's summer
[11:01:02] a nice time to take breaks :-D
[11:01:10] lol
[11:01:29] I shall see what the Archive is going to do about the media
[11:01:43] before I decide if I've got to intervene in this
[11:02:45] so you download from dumps.wm.org to the vm's gluster filesystem and then upload? is that the deal right now?
[11:03:34] I am not doing anything on labs now, other than the one instance
[11:03:50] and what's running on that one instance?
[11:03:50] which is rsync from your.org and upload - real slowly
[11:03:54] I see
[11:04:08] so you haven't tried the public gluster share with the last 5 dumps?
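On the "someone wants it, they can rsync it" answer about the roughly 14 TB of Commons media: a sketch of what such a mirror pull could look like. your.org is named in the log as an rsync source, but the module path and local destination below are assumptions for illustration only.

```bash
# Hypothetical mirror pull; the "wikimedia-dumps" module name is assumed.
rsync -av --partial --timeout=600 \
    rsync://ftpmirror.your.org/wikimedia-dumps/ \
    /data/dumps/
```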
[11:04:18] I can't access it suddenly
[11:04:19] in theory that share should be accessible to all lab instances
[11:04:28] theory
[11:04:40] practically not though :P
[11:04:43] can you ask the labs folks to look at it with you?
[11:05:01] because you could upload directly from it, rather than downloading elsewhere etc
[11:05:07] * Hydriz figures out...
[11:05:08] *from elsewhere
[11:05:31] oh, I got back access to the share
[11:05:55] yep, I will use that share soon
[11:06:06] great.
[11:06:13] especially with the longrunner script
[11:06:31] which is going to scan the directory for new dumps not in the Archive, and upload them
[11:06:47] and I was hoping I could check it into operations/dumps.git
[11:06:50] * Damianz drop kicks Hydriz for killing his bots
[11:07:04] :(
[11:07:16] We could blame gluster but drop kicking people is more fun :D
[11:07:31] why did we use gluster zzz
[11:07:38] be like the Toolserver :)
[11:07:42] because it's distributed, redundant and POSIX
[11:09:00] okay, do you do an rsync to the gluster share from dataset1001 at a fixed time every day?
[11:09:22] it runs from cron
[11:09:35] so it's daily?
[11:09:58] yes
[11:10:18] okay
[11:10:23] good to know :)
[11:10:37] people with real-time data needs aren't going to be getting the dumps.. they'd already be out of date by the time of the download
[11:11:09] what time does the cron run?
[11:11:33] hmm, nevermind
[11:11:47] maybe I can just run my script from cron with a day's delay
[11:13:37] better to run with delay, because times could be moved around
[11:14:04] you want your stuff to just Do The Right Thing (tm) regardless
[11:18:09] hehe
[11:18:23] thanks for the info anyway, though it's nothing much to you
[11:18:50] eh?
[11:18:52] at least I can have one less thing to worry about, that's the main point
[11:18:56] yes
[11:19:22] but didn't you see the request for a new mirror?
[11:19:28] on Wikitech-l
[11:19:34] or is it happening
[11:19:43] I saw it
[11:20:08] not interested?
[11:20:09] it is in my queue
[11:20:25] the queue again...
[11:20:44] the dvd links are not even up yet lol
[11:20:52] they are committed actually
[11:20:54] though I think it's in a git repo or something
[11:20:56] yeah
[11:21:09] and they will be up after the upgrade and reboot of dataset2
[11:21:45] to be precise, they are up and deployed, but the one job running now is not using the new files
[11:22:08] but how long will the reboot be, and when is it done?
[11:22:28] (just realised I am pulling files from dumps.wm.o now)
[11:22:29] I don't know how long it will take, hopefully 10 minutes
[11:22:41] well tomorrow you'll get interrupted
[11:22:52] hmm...
[11:22:55] I'll be sending out an email to the list a little before I do it
[11:23:08] might affect me, I guess
[11:23:17] most likely
[11:23:49] 21 * 24 = 504 files more
[11:24:17] 34GB at 2MB/s?
[11:25:03] oh, maybe not
[11:25:20] if you don't reboot it in 9-10 hours or so
[11:25:41] my expectation is that it will be tomorrow morning my time
[11:25:54] but that's subject to change if some emergency pops up
[11:25:59] that's in how many hours?
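A rough sketch of the "longrunner" idea described above, shaped by apergos's advice: run from cron with a day's delay and poll for completed dumps rather than assuming a fixed rsync time, so the job Does The Right Thing even if the schedule moves. Every path, the completion marker, and the upload-to-archive command below are hypothetical names, not real tooling.

```bash
# Hypothetical longrunner: scan the public dump share for finished dumps
# not yet on the Archive and upload them, recording what has been done.
DUMP_DIR=/public/datasets          # assumed mount point of the gluster share
DONE_LIST="$HOME/uploaded.list"    # wiki-date pairs already pushed

for dir in "$DUMP_DIR"/*/*/; do
    [ -d "$dir" ] || continue                                # no matches at all
    key=${dir%/}; key=${key#"$DUMP_DIR"/}; key=${key//\//-}  # e.g. enwiki-20120530
    grep -qxF "$key" "$DONE_LIST" 2>/dev/null && continue    # already uploaded
    [ -e "${dir}status.done" ] || continue                   # assumed completion marker
    upload-to-archive "$dir" && echo "$key" >> "$DONE_LIST"  # hypothetical uploader
done
```

Polling for completed state instead of hard-coding the rsync hour is what makes the cron delay safe: if the upstream schedule moves, the worst case is a later pickup, never a partial upload.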
[11:26:45] well it's 2:30 pm here now
[11:26:53] I see
[11:26:59] then it won't affect me haha
[12:45:40] !log reedy synchronized php-1.20wmf3/languages/Language.php '(bug 36839) Use mb_check_encoding() if available'
[12:45:44] Logged the message, Master
[12:46:11] !log reedy synchronized php-1.20wmf4/languages/Language.php '(bug 36839) Use mb_check_encoding() if available'
[12:46:15] Logged the message, Master
[13:04:18] !log reedy synchronized wmf-config/InitialiseSettings.php 'Bring in a few shell requests'
[13:04:22] Logged the message, Master
[13:13:26] Reedy: yeah, thanks, you deployed my first commit to the wmf-config files :))))
[14:44:31] have old oversighted revisions been migrated to revdeleted revs?
[14:45:53] i don't think so
[15:48:38] hello. Is there a way to find the path of the API on a wiki?
[15:58:07] DaBPunkt: aren't they all just /w/api.php ?
[15:58:13] DaBPunkt: tell me more?
[15:58:53] jeremyb: http://www.wikimedia.pt (which is a WMF wiki AFAICS) has the API under http://www.wikimedia.pt/api.php for an unknown reason
[15:58:58] liangent: there's a bug on it
[15:59:46] DaBPunkt: it's definitely not a WMF wiki
[16:00:11] jeremyb: but http://noc.wikimedia.org/conf/all.dblist has it
[16:00:16] that's fine
[16:00:22] it's still not a WMF wiki ;)
[16:02:30] jeremyb: but there is a database on s3 for it
[16:02:37] that's also fine
[16:02:55] ok, I will add it to the ignore-list…
[16:03:08] i'm sure it's not the only one that meets all of the above criteria
[16:03:36] is there a list of non-WMF wikis which are in the WMF all-list? ;)
[16:04:32] (it is not even in the closed.dblist…)
[16:04:55] hehe
[16:05:44] jeremyb: Why do you say it's not a WMF wiki?
[16:06:33] Joan: resolve its IP and then reverse it back to a domain? or whois the IP?
[16:06:37] it's quite certain
[16:08:06] Hmmm.
[16:08:15] I wonder what ptwikimedia_p is, then?
[16:08:27] It should probably be dropped if it's just going to confuse people.
[16:08:46] Joan: I will do that now on the TS side
[16:08:57] then it is no longer my problem ;)
[16:09:00] :D
[16:09:11] Until the next S3 dump/reimport, I suppose.
[16:10:32] !b 29675
[16:10:33] https://bugzilla.wikimedia.org/show_bug.cgi?id=29675
[16:11:42] thanks wm-bot
[17:53:00] [[Tech]]; MZMcBride; /* Adding a section to the "Special characters" portion of WikiEditor toolbar */ +reply; https://meta.wikimedia.org/w/index.php?diff=3802594&oldid=3800657&rcid=3328080
[20:38:14] !log lcarr synchronized wmf-config/mc.php
[20:38:18] Logged the message, Master
[21:07:04] When deploying a new extension to testwiki, how do you update the i18n without scapping?
[21:08:38] you essentially need to scap
[21:08:40] but not push it to site
[21:08:41] errm
[21:09:15] I thought scap pushes it out for you
[21:09:28] yeah
[21:10:04] Have you added the extension to wmf-config/extension-list ?
[21:10:19] not yet
[21:12:50] mwscript mergeMessageFileList.php --wiki=testwiki -file=/home/wikipedia/common/wmf-config/extension-list --output=/home/wikipedia/common/wmf-config/ExtensionMessages-1.20wmf4.php
[21:13:27] might need 2 x - before file
[21:13:41] bah
[21:14:15] mwscript mergeMessageFileList.php --wiki=testwiki --list-file=/home/wikipedia/common/wmf-config/extension-list --output=/home/wikipedia/common/wmf-config/ExtensionMessages-1.20wmf4.php
[21:14:15] mwscript rebuildLocalisationCache.php --wiki=testwiki --quiet --outdir=/home/wikipedia/common/php-1.20wmf4/cache/l10n
[21:14:54] cool, I'll try that
[21:15:16] kaldari: in berlin?
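Circling back to the wikimedia.pt question (15:58-16:06): jeremyb's test for whether a domain is actually WMF-hosted, resolve its IP and then reverse it back to a domain or whois it, looks roughly like this. The grep fields are just one way to surface the owning organisation; whois output formats vary by registry.

```bash
# Resolve, reverse-resolve, and whois, per the exchange above. A WMF-hosted
# site would reverse to a wikimedia.org name / WMF netblock.
IP=$(dig +short www.wikimedia.pt | head -1)
dig +short -x "$IP"                        # reverse DNS
whois "$IP" | grep -iE 'orgname|netname'   # owner of the netblock
```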
[21:16:04] noooo
[21:18:36] nope :(
[21:20:58] Reedy: pm
[21:49:21] RoanKattouw: yt?
[21:49:35] awjr: ?
[21:49:47] hey - do you know much about MW session handling?
[21:49:55] A bit, why?
[21:50:13] Now you did it
[21:50:13] !log reedy synchronized php-1.20wmf4/includes/logging/LogEventsList.php
[21:50:16] Logged the message, Master
[21:50:27] awjr: Whoops sorry
[21:50:31] we're seeing an issue in MF that seems to exist only in prod and i can't replicate it locally
[21:50:35] RoanKattouw no worries
[21:50:48] !log reedy synchronized php-1.20wmf4/includes/specials/SpecialLog.php
[21:50:49] RoanKattouw the issue manifests here: http://test.m.wikipedia.org/wiki/Special:MobileOptions
[21:50:52] Logged the message, Master
[21:51:15] basically, every time you load that page, or really any page in the mobile view, MW seems to generate a new session for the user
[21:51:26] or at least, i get a session cookie set in my browser
[21:51:51] locally, every page request handled by MF gives a consistent cookie value for the session cookie
[21:51:58] That's because of Varnish breakage
[21:52:02] oh
[21:52:05] Talk to Patrick, he discovered this earlier
[21:52:07] Well, "breakage"
[21:52:10] lol
[21:52:22] ok i'll do that - it's making it so we can't reliably set a token
[21:52:32] I mean, mobile Varnish is stripping the Cookie header apparently
[21:52:38] or rather, we can set a token, but it will always result in a mismatch on form submit
[21:52:41] o
[21:52:56] fun.
[21:53:18] ok thanks, that helps a lot
[22:01:30] the varnish itself is the same code as all the other varnish, right? just a mobile config file makes it mobile
[22:01:33] i hope
[22:01:48] jeremyb i believe so
[22:02:20] but this happens in the mobile config: http://pastie.org/4004402
[22:04:07] !log aaron synchronized wmf-config/swift.php 'Purge from squid all thumbs in Swift on purge.'
[22:04:11] Logged the message, Master
[22:04:49] jeremyb: yes, just the configurations are different
[22:04:53] same packages and stuff
[22:05:33] 10.0.11.64 apache2[10190]: PHP Warning: require(/usr/local/apache/common-local/php-1.20wmf3/../wmf-config/ExtensionMessages-1.20wmf3.php) [function.require]: failed to open stream: No such file or directory in /usr/local/apache/common-local/wmf-config/CommonSettings.php on line 2485
[22:07:20] AaronSchulz: for me 2485 is just a curly brace and nothing else on the line
[22:08:45] !log reedy synchronized wmf-config/InitialiseSettings.php 'Re-enable randomrootpage again'
[22:08:48] Logged the message, Master
[22:09:13] that box is out of disk space
[22:09:20] Reedy: COOL! ^^
[22:09:51] oh yay
[22:10:26] a special plague on whoever decided that /a should be huge and / should be tiny
[22:10:45] /dev/sda1 7688360 7298888 0 100% /
[22:11:45] there's a couple complaining
[22:12:01] mw64?
[22:12:16] yes, that one
[22:13:03] Wait, a box without disk space even now that MediaWiki is on /a not on / ?
[22:13:33] /a has like nothing on it
[22:13:42] maybe the box was missed in the symlink rounds
[22:13:43] tmp- or log-files maybe?
[22:14:43] Reedy: rebuildLocalisationCache didn't seem to work, should I just go ahead and run scap?
[22:15:07] log is fairly small
[22:15:11] might be easiest
[22:15:20] tmp has mw-cache-1.20wmf3 in it
[22:15:25] oh but tmp is its own partition
[22:15:32] :)
[22:15:38] Hmm I guess notpeter must've missed this box then
[22:15:45] /a is supposed to have 2 or 3 gigs on it
[22:16:57] RoanKattouw: you're not doing the AFT deployment today right?
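The cookie-stripping symptom awjr describes is easy to demonstrate from the command line: present a session cookie and see whether the backend still starts a fresh session. The cookie name below is a placeholder, not the real MediaWiki session cookie name.

```bash
# First hit: note the session cookie we are issued.
curl -sI 'http://test.m.wikipedia.org/wiki/Special:MobileOptions' \
    | grep -i '^Set-Cookie'
# Second hit, presenting a session cookie. If the cache layer strips the
# Cookie request header, we get issued yet another new session -- which is
# why any form token always mismatches on submit.
curl -sI -b 'session=PLACEHOLDER' \
    'http://test.m.wikipedia.org/wiki/Special:MobileOptions' \
    | grep -i '^Set-Cookie'
```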
[22:17:07] noooo
[22:17:15] cool, just checking
[22:17:25] want to make sure no one else is scapping :)
[22:17:29] * AaronSchulz interpreted that as an almost-Vader scream
[22:25:19] !log kaldari Started syncing Wikimedia installation... :
[22:25:23] Logged the message, Master
[22:26:36] gn8 folks
[22:26:54] is it normal to get a hundred or so "srv276: Unable to read wikiversions.dat or it is empty
[22:26:54] srv290: Unable to read wikiversions.dat or it is empty" during a scap?
[22:28:54] No that's not normal
[22:29:09] should I stop it?
[22:29:43] Yes
[22:29:49] stopped
[22:29:54] And ask Aaron wtf is going on
[22:29:57] you know, i was seeing that yesterday as well
[22:30:14] AaronSchulz: Any idea? ^
[22:30:27] i think i've just been desensitized to scap warnings/errors
[22:31:29] heh
[22:33:43] ;-(
[22:58:00] Fatal error: Uncaught exception 'Exception' with message 'Invalid version dir '/home/wikipedia/common/php-1.20wmf4' on line 0 ('aawikibooks php-1.20wmf4 *').
[22:58:01] ' in /usr/local/apache/common-local/multiversion/MWWikiversions.php:62
[22:58:03] Stack trace:
[22:58:04] #0 /usr/local/apache/common-local/multiversion/MWWikiversions.php(24): MWWikiversions::rowFromLine('aawikibooks php...', 0)
[22:58:42] if ( !is_dir( MULTIVER_COMMON_HOME . '/' . $version ) ) {
[22:58:51] this is, of course, broken on apaches
[23:06:02] AaronSchulze: Is that what caused the scap errors?
[23:06:28] I mean AaronSchulz :)
[23:08:40] I think so
[23:08:59] !log aaron synchronized multiversion/ 'Updating multiversion code to head.'
[23:09:02] Logged the message, Master
[23:09:25] kaldari: should work now
[23:09:35] trying...
[23:09:40] I tested on srv290
[23:11:21] it doesn't matter that the extension only exists in the wmf4 branch, right? I assume it'll just skip it since it won't be able to load the files in wmf3.
[23:12:20] or can you give scap an argument to only do a certain version?
[23:12:30] it should work
[23:12:40] no, there is no "only scap this version" option
[23:12:56] after the git deploy stuff lands it might be nice to do that
[23:13:09] and also make the texvc updates skippable
[23:13:21] since 95% of the time it's just CPU burn
[23:13:38] what is the texvc update about anyway?
[23:13:57] recompiling some OCaml into binaries
[23:14:24] those binaries generate math images from syntax for the Math extension
[23:14:26] !log kaldari Started syncing Wikimedia installation... : scapping for new LastModified and E3Experiments extensions
[23:14:29] Logged the message, Master
[23:15:21] ah, so we probably don't actually need to recompile them for every deploy
[23:15:58] No, we don't
[23:16:08] But for some reason we've been recompiling them since the dawn of time
[23:21:18] dawn of time == 1970-01-01 00:00:00 ?
[23:21:27] i hope so
[23:40:38] !log kaldari Finished syncing Wikimedia installation... : scapping for new LastModified and E3Experiments extensions
[23:40:42] Logged the message, Master
[23:44:21] RoanKattouw: The i18n messages are still broken on test.wiki after the scap: http://test.wikipedia.org/wiki/Thesaurus . Is there anything else I need to do?
[23:44:40] That should have been fixed
[23:44:46] Did the l10n cache rebuild fail?
[23:44:50] More permissions errors perhaps?
[23:44:57] lemme look
[23:46:16] "Updating LocalisationCache for 1.20wmf4... done"
[23:46:21] Well you do own all of those files
[23:46:40] But they're group-writable
[23:50:27] should I try doing a sync-common on srv193?
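The fatal above is the quoted is_dir() check rejecting a wikiversions.dat row whose version directory is missing on the apache. A shell rendering of that validation as a sketch: the row layout ("dbname version extra") is inferred from the 'aawikibooks php-1.20wmf4 *' sample in the error text, and the file location is an assumption.

```bash
# Sketch of the failing check: every row's version directory must exist
# under the multiversion common home.
COMMON=/home/wikipedia/common        # MULTIVER_COMMON_HOME in the PHP above
while read -r dbname version _; do
    [ -d "$COMMON/$version" ] \
        || echo "Invalid version dir '$COMMON/$version' ($dbname)" >&2
done < "$COMMON/wikiversions.dat"
```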
[23:50:57] although I don't imagine that would affect the i18n
[23:53:37] how do you ssh to srv193 anyway?
[23:54:02] srv193.wikimedia.org doesn't work and test.wikipedia.org gives me permission denied
[23:54:17] test.wp.o is surely LVS
[23:54:27] kaldari: srv193
[23:54:29] Just srv193
[23:54:56] ah
[23:55:13] I think technically its full name is srv193.pmtpa.wmnet or something
[23:55:26] it's srv193.pmtpa.wmnet, not .wikimedia.org
[23:55:27] yeah
[23:55:36] aka 10.0.2.193
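The last exchange in shell form: the apache has no public DNS entry, so only the internal pmtpa.wmnet name or the IP works, and the bare short name resolves only from inside the cluster, presumably via the resolver's search domain.

```bash
ssh srv193                  # from a cluster host; relies on the search domain
ssh srv193.pmtpa.wmnet      # fully qualified internal name
ssh 10.0.2.193              # same box by IP
```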