[00:32:13] Danny_B|backup:
[00:32:32] 'wgRC2UDPAddress' => array( 'default' => '208.80.152.178', 'private' => false, ),
[00:32:36] only private don't
[00:33:00] but it can be overridden, right?
[00:33:43] if there was a reason to, I guess
[00:34:46] oki, i'll ask folks what they think about it
[02:20:00] Sorry, the servers are overloaded at the moment. → Too many users are trying to view this page. Please wait a while before you try to access this page again. → Timeout waiting for the lock
[02:20:12] have never seen that error before
[02:22:00] http://i.imgur.com/6VRyh.png ← I wonder if I should click on "No" ツ
[02:22:36] That's PoolCounter
[02:22:48] It's a defense mechanism against Michael Jackson-like catastrophes
[02:23:03] heh
[02:23:45] RoanKattouw: but... why would it need a lock for reading?
[02:23:49] ain't it cached?
[02:24:04] So roughly the way it's supposed to work is this
[02:24:15] 1. Notice there is no fresh cache copy for the Olympics page
[02:24:20] 2. Grab the lock on the page
[02:24:23] 3. Start parsing the page
[02:24:37] 4. 2nd visitor comes along, notices there's no fresh cache, wants to parse
[02:24:44] 5. Tries to grab the lock, fails because it's already in use
[02:24:50] This prevents parallel parses of the same page
[02:25:04] it took over 2 minutes before it returned
[02:25:09] I believe that in practice, up to 2 or 3 locks can be held on the same page, and they expire after some time, to prevent the system from locking itself up
[02:25:11] Huh, that's weird
[02:26:04] why not have a "2. Check if there is a lock, if true, wait for it"?
[02:26:19] Because waiting for the lock would take a long time
[02:26:27] Maybe it is waiting for the lock and timing out
[02:26:32] I don't know, I didn't design this system
[02:26:34] probably
[02:26:37] TimStarling and Platonides did
[02:26:39] you didn't?
[02:26:48] redesign it then!
[02:26:49] Nope, wasn't me
[02:26:52] !log LocalisationUpdate completed (1.20wmf7) at Tue Jul 24 02:26:52 UTC 2012
[02:27:04] Logged the message, Master
[02:28:15] * AzaToth still can't understand why logmsgbot needs to go via IRC to inform morebots...
[02:28:31] It's kind of a ridiculous hack
[02:28:45] But at the same time, it makes the message visible to people
[02:28:47] AzaToth: it's also to inform humans, by the way.
[02:28:52] Krinkle: ツ
[02:29:01] it does wait, but there's a limit to the number of waiters IIRC
[02:29:09] Aaah OK
[02:29:13] if that's exceeded then people will start seeing errors immediately
[02:29:31] Doesn't it try to serve an older version of the article?
[02:29:32] Wouldn't it be reasonable to have faster-returning behavior if there is a stale version in cache? Or is that already implemented?
[02:29:48] ^ What he says.
[02:31:13] the error message is a bit bad imo, "Too many users are trying to view this page" gives the impression that wmf can't serve the world
[02:32:52] I actually thought it always only returned from cache
[02:33:09] I thought it would do that too
[02:33:34] Of course if it's rerendering because it fell out of cache ....
[02:34:01] anyway, I felt the ArticleFeedback thingy was funny ツ
[02:34:13] Yeah that was funny
[02:47:17] that seems like an AFT bug, I wonder how "article decorations" can tell that the page didn't really render well.
[02:48:21] !log LocalisationUpdate completed (1.20wmf8) at Tue Jul 24 02:48:21 UTC 2012
[02:48:30] Logged the message, Master
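A minimal sketch of the pool-counter flow Roan describes above, for the record. This is not the real PoolCounter (which is a separate lock daemon with its own protocol); the APCu-based cache and lock stand-ins, the key names, and the timeouts are all illustrative assumptions.

    <?php
    // Sketch of the render-lock flow described above, using APCu stand-ins.
    // apcu_add() is atomic, so at most one worker per page wins the lock here
    // (the real system reportedly allows 2-3, with expiry to avoid wedging).
    function viewPage( string $title ): string {
        $html = apcu_fetch( "cache:$title" );
        if ( $html !== false ) {
            return $html; // fresh cache copy, no lock needed
        }
        // Steps 1-3: no fresh copy, so try to become the renderer.
        if ( apcu_add( "lock:$title", 1, 120 ) ) { // lock expires after 120s
            $html = "<rendered $title>";           // stands in for the slow parse
            apcu_store( "cache:$title", $html, 3600 );
            apcu_store( "stale:$title", $html );   // long-lived fallback copy
            apcu_delete( "lock:$title" );
            return $html;
        }
        // Steps 4-5: lock already held by another request. Serving a stale
        // copy here, as suggested above, beats both the two-minute wait and
        // the "Too many users" error page.
        $stale = apcu_fetch( "stale:$title" );
        return $stale !== false ? $stale : "Error: timeout waiting for the lock";
    }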
[03:47:36] TimStarling: what's the use of $wgUseDumbLinkUpdate?
[03:48:58] it wasn't clear when that code was written which way of doing it would be faster
[03:49:39] actually dumb updates are probably faster
[03:49:47] but they cause too much slave lag
[03:49:58] incremental updates were a bit slower but didn't cause so much slave lag
[03:50:01] also, ALF_PRELOAD_* isn't used anywhere
[03:50:12] TimStarling: http://www.mediawiki.org/wiki/Special:AbuseLog/14073 is strange.
[03:50:42] maybe the code that used to use them went away
[03:50:43] TimStarling: don't dumb updates cause a lot of delete load?
[03:50:59] yes, more writes but less reads
[03:51:24] and less mucking around in PHP working out set differences
[03:51:56] I tend to find that the DELETE/INSERT pattern causes deadlocks easily too
[03:52:53] huh, ALF_NO_BLOCK_LOCK isn't used either
[03:53:13] yeah, optimisation bit rot
[03:53:32] you see why domas complains so much about declining site performance
[03:53:46] but we cared a lot about performance in those days, more than anything else
[03:57:34] so if $wgUseDumbLinkUpdate causes lag on large sites and probably doesn't matter on smaller ones... I wonder why we need it there
[03:58:20] * Aaron|home hands domas some SSDs
[03:59:30] look ma, secondary lookup with no hands!
[04:00:35] you mean you don't spend all day trying to get data so you can read it in a few seeks?
[04:00:43] * Aaron|home would rather do other stuff ;)
[04:01:59] domas: I'm saying you don't care about locality as much
[04:02:10] yes, optimizing for space ;)
[04:06:55] * Aaron|home wonders what domas is up to ;)
[04:12:11] TimStarling: would it be a crime to nuke wgUseDumbLinkUpdate?
[04:12:36] is it offending you?
[04:13:01] only mildly, as extra code we don't use
[04:13:23] if you're trying to refactor LinksUpdate and it's slowing you down, then sure
[04:13:37] otherwise I would be inclined to leave it in there
[04:13:56] since whether or not it makes things faster depends on the details of the DB server
[04:14:17] so it might be useful for someone
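For the record, a sketch of the two strategies Tim contrasts above, against a simplified pagelinks table (the pl_namespace column is omitted, and $dbw is a plain mysqli handle); this is not the actual LinksUpdate code.

    <?php
    // "Dumb" update: delete everything, rewrite everything. Few reads and no
    // set arithmetic in PHP, but every row is a replicated write (slave lag),
    // and the wholesale DELETE/INSERT pattern deadlocks more easily.
    function dumbLinkUpdate( mysqli $dbw, int $pageId, array $newLinks ): void {
        $dbw->query( "DELETE FROM pagelinks WHERE pl_from = $pageId" );
        foreach ( $newLinks as $title ) {
            $t = $dbw->real_escape_string( $title );
            $dbw->query( "INSERT INTO pagelinks (pl_from, pl_title) VALUES ($pageId, '$t')" );
        }
    }

    // Incremental update: read the current set, then touch only the rows that
    // actually changed. Slower (an extra read plus diffing), far fewer writes.
    function incrementalLinkUpdate( mysqli $dbw, int $pageId, array $newLinks ): void {
        $existing = [];
        $res = $dbw->query( "SELECT pl_title FROM pagelinks WHERE pl_from = $pageId" );
        foreach ( $res as $row ) {
            $existing[] = $row['pl_title'];
        }
        foreach ( array_diff( $existing, $newLinks ) as $title ) { // removed links
            $t = $dbw->real_escape_string( $title );
            $dbw->query( "DELETE FROM pagelinks WHERE pl_from = $pageId AND pl_title = '$t'" );
        }
        foreach ( array_diff( $newLinks, $existing ) as $title ) { // added links
            $t = $dbw->real_escape_string( $title );
            $dbw->query( "INSERT INTO pagelinks (pl_from, pl_title) VALUES ($pageId, '$t')" );
        }
    }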
[09:10:01] !log nikerabbit synchronized wmf-config/InitialiseSettings.php 'Bug 36104 - Narayam on bnwikisource'
[09:10:14] Logged the message, Master
[10:02:41] srv281: rsync: write failed on "/apache/common-local/wmf-config/InitialiseSettings.php": No space left on device (28)
[10:04:24] should that be logged?
[10:05:20] it doesn't have a /apache partition
[10:05:31] just /a
[10:05:58] as long as it's not doing anything it's ok for now I guess
[10:05:59] right, I see from the logs that it has been in "reinstall"
[10:09:49] Nikerabbit: you could add to the list of "things we should maybe fix some day" on wikitech I suppose
[10:10:03] I wasn't aware of such a list
[10:10:46] Nikerabbit: I mean things like "When running sync-file or sync-dir, you'll usually see about half a dozen errors from broken servers" https://wikitech.wikimedia.org/view/How_to_deploy_code#Small_changes:_sync_individual_files
[10:26:20] Nikerabbit: is that the correct place to add a note about srv281?
[10:27:41] no?
[10:33:57] oki
[11:52:16] Reedy: maybe you can help with this: Nemo_bis> should the i18n keyword be added also to bugs about interface message problems not strictly related to i18n (i.e. which apply in all languages, or in English only)?
[11:52:26] maybe we need a "message" or "ui" keyword or something, also for things like https://bugzilla.wikimedia.org/show_bug.cgi?id=10573
[12:08:07] Nemo_bis: yeah
[12:13:14] Reedy: so we will let virii in :-]
[12:13:19] no mooaar clamav hehe
[12:13:23] We already are ;)
[12:13:28] I have reviewed a few of your changes
[12:13:49] Just adding it to app servers is way too simplistic
[12:14:38] you could have made a puppet class to ship the configuration as well
[12:14:47] though that still needs some config in mediawiki-config as well
[12:16:03] Reedy: yeah for the first or the second?
[12:16:11] I'm using a tracking bug for the second
[12:33:23] i.e. https://bugzilla.wikimedia.org/show_bug.cgi?id=38638
[12:35:35] matanya: can't you fix the description too?
[12:35:48] they all seem to have |description={{en|1=Wikipedia Servers}}
[12:35:55] (Commonist upload I suppose)
[12:36:02] no, UploadWizard, sigh
[12:39:44] I guess I can, is it worth it?
[12:52:43] apergos: thanks for the pointer to #wikimedia-operations yesterday, even though they just made me file a bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=38604
[12:53:34] ok well at least it's on someone's radar now
[13:25:42] apergos: your comments would be most appreciated: https://bugzilla.wikimedia.org/38640
[13:27:01] what would prevent a spammer from auto-adding crap in the description field?
[13:27:18] a few random words, say
[13:28:25] it wouldn't get approved per the description in commons
[13:28:34] it needs to match
[13:29:32] where would the description live that has to match?
[13:29:48] somewhere the user can find it, right?
[13:30:05] yes
[13:30:16] why can't a script or a bot or a spammer find it automatically?
[13:31:11] find the right image page --> parse the right language description from that page and paste it?
[13:31:23] brb
[13:32:18] yes: as long as the image is shown to the user, the user has the link to it and can derive the File: page from that.
[13:32:54] assuming the descriptions in various languages will be arranged in some sort of table, not randomly on the page, then the retrieval of that can also be scripted
[13:52:09] apergos: can we host it in a private database?
[13:52:38] how does that help?
[13:55:27] it is not open to the world
[13:57:12] the user has to be able to see it in order to find the description and type it in, right?
[14:00:32] no, I think you missed the point
[14:01:13] so we want the user to type in a description and be lucky enough to match exactly the description that we have on file?
[14:02:18] the thing about captchas is that we know (and the user knows) exactly what he or she has to type in
[14:02:29] he gets two images, and is requested to describe one with a CAPTCHA and one without
[14:02:46] the one with the CAPTCHA is known to us
[14:02:51] the other isn't
[14:02:53] the captcha will have the description of the image in it that we like?
[14:02:54] I see
[14:03:40] this would be ok for very short descriptions (two short words), longer than that I would find annoying if I were that user
[14:03:49] that's imesho though
[14:03:56] yes, say you have a pic of the statue of liberty
[14:04:53] so the captcha would be "statue of liberty" and the description for the other pic would need to be added
[14:04:57] if the user doesn't know the translation, is that ok?
[14:05:02] yes
[14:05:18] we check only the first - the known image
[14:05:27] isn't that what recaptcha does?
[14:05:31] One is a known word, one is unknown
[14:05:34] don't know
[14:05:55] I seem to recall that's how it worked, with you adding the translation for the latter word (probably done by multiple attempts consensus)
[14:06:14] Reedy, matanya: That's my recollection too, if it helps
[14:06:59] I think it is doable, and we might get many good translations and slow down spammers a bit
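A sketch of the two-image scheme being proposed here, in the reCAPTCHA style recalled above: grade only the known image, and collect answers for the unknown one until consensus promotes them. Every function name and the flat-file storage below are invented for illustration; this is not any real ConfirmEdit API.

    <?php
    // Hypothetical check for the proposed two-image description captcha.
    // Only the known image gates the action; the unknown image's answer is
    // merely recorded, and once enough independent users agree it could
    // graduate into the known pool as a new description/translation.
    function checkDescriptionCaptcha(
        string $knownAnswer, string $unknownAnswer,
        string $knownImage, string $unknownImage, array $knownDescriptions
    ): bool {
        $expected = $knownDescriptions[$knownImage] ?? '';
        if ( strcasecmp( trim( $knownAnswer ), $expected ) !== 0 ) {
            return false; // failed on the image we can verify
        }
        // Just log the candidate; a separate job would tally agreement.
        file_put_contents(
            'candidate-descriptions.log',
            $unknownImage . "\t" . trim( $unknownAnswer ) . "\n",
            FILE_APPEND
        );
        return true;
    }

    // e.g. checkDescriptionCaptcha( 'Statue of Liberty', 'Statue de la Liberté',
    //     'liberty.jpg', 'unlabeled-123.jpg',
    //     [ 'liberty.jpg' => 'Statue of Liberty' ] );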
[14:07:38] I like the cat asirra one ;)
[14:07:53] Or we should do identification of wmf staff
[14:07:59] we've loads of brion pictures ;)
[14:08:14] spammers will just put "brion vibber" by default
[14:08:15] the extension cat asirra?
[14:08:16] :-P
[14:08:21] lol
[14:08:50] so, how can I take this idea forward?
[14:09:19] https://www.mediawiki.org/wiki/Extension:Asirra
[14:09:26] https://upload.wikimedia.org/wikipedia/mediawiki/1/18/Extension_confirmedit_asirra.png
[14:09:30] Select ALL the cat photos!
[14:10:11] the user should be able to choose dog pictures if they don't like cats :-P
[14:10:17] :D
[14:10:21] (I love cats. But some of my friends, sadly, do not.)
[14:10:34] which reminds me, where are my two...
[14:10:38] Could be done for anything
[14:10:46] X and Y
[14:10:59] apergos: http://i.imgur.com/ZQLhc.jpg
[14:12:24] great!
[14:12:52] asirra is horrible btw, we tried it on translatewiki.net
[14:13:14] it is beta. beta = still doesn't work
[14:14:18] matanya: your proposal is terrible! :D
[14:15:27] one more terrible idea
[14:15:35] add it to the counter
[14:15:42] usually they only loan me the little animals they bring in
[14:15:59] after I praise them for being mighty hunters, they take their prey off and play with it a bit more before they eat it
[14:16:17] (lizards, insects and such. no birds)
[14:16:23] I have many more in my head, waiting until I find the time to write them at wikitech-l
[14:16:31] matanya: ;) Brooke to the rescue: https://www.mediawiki.org/wiki/CAPTCHA
[14:17:20] aha! great minds think alike
[14:17:48] also Brooke
[14:19:56] matanya: close the bug or link it from the page please
[14:20:03] Reedy: what can we do for https://bugzilla.wikimedia.org/show_bug.cgi?id=36316 ?
[14:21:15] matanya: beware that despite many emails Cristian didn't get any attention on his idea, dunno if "Wikisource" is a magic word which makes people deaf and blind
[14:22:14] it is just harder to implement this for wikisource
[14:24:54] matanya: how so?
[14:25:13] copying recaptcha is surely easier than inventing something new
[14:25:22] (completely new)
[14:25:41] recaptcha is copyrighted, AFAIK
[14:26:05] patented?
[14:26:10] yes
[14:26:15] Actually as I recall
[14:26:27] we don't use it for performance reasons.
[14:26:34] (rather than anything legal)
[14:26:39] better so, we'll see if Google sues the WMF and if yes it will be a good case against software patents
[14:26:52] I wish!
[14:26:55] Jarry1250: it's both
[14:26:56] https://www.mediawiki.org/wiki/Extension:ReCAPTCHA
[14:27:12] MIT?
[14:27:18] Maybe, if we believe it.
[14:27:20] that's just the extension
[14:29:42] "reCAPTCHA hold no specific patents for the technology behind their text CAPTCHA algorithms"
[14:29:44] Nemo_bis I'd be rather happy it's just an extension
[14:29:59] says random blog.
[14:30:02] https://www.google.com/recaptcha/terms
[14:31:30] sDrewth: what address did you write to for https://bugzilla.wikimedia.org/show_bug.cgi?id=29855#c5 , dear unofficial bunny?
[14:31:52] uhm
[14:35:22] don't remember, and the mail is on my old PC; so I cannot grab it easily
[14:35:47] aww
[14:35:59] I am reasonably certain that I logged in, and then just followed down the contact us string
[14:36:08] ok
[14:36:41] try http://help.linkedin.com/app/home/h/c
[14:36:55] and do a false try, and you get a web contact page
[14:37:00] Jarry1250: you may want to include the random blog unreliable reference in the mw.o page
[14:38:26] Or ConfirmEdit?
[14:39:00] Jarry1250: ConfirmEdit doesn't need it, it uses reCAPTCHA so it surely doesn't go against it
[14:39:31] Which page, I mean?
[14:40:29] Jarry1250: CAPTCHA
[14:41:00] sDrewth: do you think I have better chances as logged in? Firefox is too disgusted by that help centre to let me log in
[14:41:17] Nemo: https://www.mediawiki.org/wiki/Extension:ReCAPTCHA?
[14:41:43] Jarry1250: [[CAPTCHA]]
[14:41:50] https://www.mediawiki.org/wiki/CAPTCHA
[14:43:42] Done :)
[14:50:01] don't know Nemo_bis
[14:50:08] sDrewth: well, did so
[15:15:25] ping ... https://en.wikisource.org/wiki/Wikisource:Administrators'_noticeboard#Is there such a thing as...
[15:18:15] working link:
[15:18:15] sDrewth: so what?
[15:18:26] hesperian is right, probably tanvi created the pages because they're required to get the global flag
[15:20:14] meh
[15:37:51] can anybody tell me if https://gerrit.wikimedia.org/r/#/c/8407/ is already deployed to WMF wikis?
[15:38:00] or when it's supposed to be
[15:39:51] malafaya: looks like it should be in
[15:40:19] Nikerabbit, cause it seems to be happening sometimes when I run my bot
[15:40:33] suddenly, it goes "back" in page name to capital letters
[15:41:22] nope, not fixed: https://es.wiktionary.org/w/api.php?action=query&list=allcategories&format=xml&acfrom=EN&aclimit=10
[15:41:41] referring to bug 38165
[15:41:45] bug:38165
[15:42:15] malafaya: It was deployed to its first wikis yesterday
[15:42:33] es.wiktionary is included?
[15:42:38] Not yet.
[15:42:41] or en.wiktionary?
[15:42:45] ok
[15:42:50] do you know when it will be?
[15:42:54] Tomorrow
[15:43:00] If the timetable is stuck to.
[15:43:02] great, thanks for the info
[15:43:02] :)
[15:43:05] :)
[15:43:08] uh, is that the same as my bug
[15:43:28] Nemo_bis, if you run a bot in Wiktionaries, probably
[15:44:12] malafaya: no, it was on another wiki, it's mentioned in yours
[15:44:45] sigh, one month to get a fix to my bug (initially closed as invalid), then 2 months to get the change merged, several other bugs filed in the meanwhile
[15:44:56] the funny thing is that the problem has existed for years
[15:45:24] this same bug I reported existed for a long time until I realized why and under what conditions it was happening
[15:45:51] I could never bot-process all pages in a wiki cause at some point it would go back
[15:46:45] yep
[15:46:56] or just stop
[16:28:11] !log reedy synchronized php-1.20wmf8/extensions/AbuseFilter/
[16:28:19] Logged the message, Master
[16:57:05] are there some tips and tricks on how to make the import of parts of the dumps into mysql fly? i'm currently trying to import the page.sql.gz and pagelinks.sql.gz of the enwiki into a mysql db, but the process seems to get slower and slower...
[16:57:42] Probably will - IIRC there are a bunch of indexes on those tables, not the best person to comment though
[16:57:46] i did some tweaks to the config already, but maybe one of the pros has some further ideas....
[16:59:27] as far as i understand it, the dumps already disable the keys before importing:
[16:59:45] /*!40000 ALTER TABLE `pagelinks` DISABLE KEYS */;
[17:01:03] i'm doing a server-side "source" to import and i'm the only one on that db
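For reference, the usual levers beyond the DISABLE KEYS statements already in the dumps are session flags on the importing connection, plus a large innodb_buffer_pool_size on the server side. A hedged sketch with placeholder credentials; the dump statements would still be replayed through the mysql client or this same connection, and sql_log_bin needs SUPER privileges:

    <?php
    // Session-level tweaks commonly used for bulk-loading large SQL dumps.
    // They only take effect on the SAME session that performs the import.
    $db = new mysqli( 'localhost', 'import', 'secret', 'enwiki' );
    foreach ( [
        'SET autocommit = 0',          // one big transaction, not per-row commits
        'SET unique_checks = 0',       // defer unique-index verification
        'SET foreign_key_checks = 0',  // skip FK checks (the dumps define none anyway)
        'SET sql_log_bin = 0',         // skip the binlog if nothing replicates from here
    ] as $stmt ) {
        $db->query( $stmt );
    }
    // ... replay page.sql / pagelinks.sql statements through $db here ...
    $db->query( 'COMMIT' );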
[17:06:11] jorn: /*!40000 DROP TABLE `pagelinks` */ will help ;)
[17:06:50] LeslieCarr: wfh today?
[17:06:51] was my question that annoying ;)
[17:07:14] werdna: i think so, i started working from home and figured "eh, why bother coming in"
[17:07:22] heh
[17:07:28] how was portlandia!
[17:07:41] <^demon> That's how I feel every day. The 8h commute just isn't worth it.
[17:07:42] awesome, of course !
[17:07:56] biked a ton, ate even more, drank even more
[17:08:03] i think i gained about 10 pounds over the weekend
[17:08:06] "I'll just check the tickets before heading to the office", "screw it, getting lunch and staying here" :)
[17:08:22] LeslieCarr: winning!
[17:09:23] Damianz: pretty much :)
[17:09:48] plus, the peaches i got before i left are now in that perfect stage
[17:09:59] argh I am so jealous
[17:10:05] brb, checking the kitchen for peaches
[17:10:08] of leslie's peaches?
[17:10:08] jorn: naw, just joking with you ;) don't have any advice
[17:10:32] none
[17:10:43] jorn: domas has some stuff around
[17:11:31] Thehelpfulone, hi again. Do you remember the ticket about the Wikimedia RU mailing list? It's been two weeks but nothing has happened =(
[17:11:37] awwww, there's the big civic center farmer's market tomorrow werdna, you might be able to find some good peaches
[17:11:51] http://dom.as/2009/02/03/mydumper/
[17:11:53] okay that's for dumping
[17:12:52] * Damianz doubts werdna will find Leslie's peaches in their kitchen
[17:14:22] yay, BBQ again
[17:15:00] werdna: well, seems it can be used for restore as well. thanks for the pointer
[17:16:24] oo bbq ?
[17:16:43] <^demon> Now I want bbq...and I just had lunch :(
[17:16:51] ^^ me too
[17:16:52] it's finally good weather here
[17:16:55] it's rained for weeks
[17:17:08] <^demon> Please send your rain here, we need it.
[17:17:26] suckiest summer ever
[17:17:31] (until now)
[17:19:34] mark .. wanna come over to enjoy the summer here ;-)
[17:21:16] no it's fine now
[17:25:52] mark: what's the temperature etc?
[17:26:05] mark: since I'm going to be over in a few weeks
[17:28:42] werdna: right now, 31C or so
[17:28:47] more importantly, what's the frequency?
[17:28:48] ooo
[17:28:49] in a few weeks, no idea ;)
[17:29:02] Kenneth
[17:29:04] they say next week is gonna suck again
[17:29:22] temperature won't be horrid (still 20C or so) but it'll rain
[17:31:34] ooo pretty :)
[18:04:06] !log reedy synchronized php-1.20wmf8/extensions/AbuseFilter/
[18:04:15] Logged the message, Master
[18:04:31] werdna: ^^
[18:04:38] \o/
[18:05:43] Reedy: https://www.mediawiki.org/wiki/Special:AbuseLog/14103
[18:05:46] seems to work.
[18:06:16] Apparently you're a spambot now
[18:06:36] aww i don't have permissions to see the details
[18:06:46] DB cleaned up
[18:06:47] mysql> delete from abuse_filter_log where afl_var_dump='';
[18:06:48] Query OK, 19 rows affected (0.04 sec)
[18:07:02] I see it did a lot of damage then
[18:07:10] the rows I deleted: http://p.defau.lt/?UN92Wkx8NaaotnCv_42tkQ
[18:07:20] well, the query I used to prove that they were all deletable.
[18:07:51] LeslieCarr: you're not cool enough to be an ABUSE FILTER ADMIN
[18:08:00] i tell ya, those people rock.
[18:08:37] haha
[18:08:59] that's a really simple regex
[18:09:10] yeah
[18:09:41] https://www.mediawiki.org/w/index.php?title=Special:AbuseLog&wpSearchFilter=1
[18:09:47] killed a few spambots back in march
[18:09:51] that makes it worth it right?
[18:17:53] on which cluster does commons sit?
[18:17:55] s7?
[18:19:14] or s4?
[18:26:22] matanya, s4
[18:26:44] although commons is replicated on all toolserver servers
[18:30:47] !log root@fenari:~# chown l10nupdate:wikidev /home/wikipedia/common/php-1.20wmf8/cache/l10n
[18:30:54] !log root@fenari:~# chmod g+w /home/wikipedia/common/php-1.20wmf8/cache/l10n/
[18:30:55] Logged the message, Mr. Obvious
[18:30:55] Reedy: ----^^
[18:31:03] Logged the message, Mr. Obvious
[18:31:21] Tim moved stuff around..
[18:31:24] Or so I thought
[18:31:40] But the changes to checkoutMediaWiki (which he integrated this stuff into) did fail with a sudo-related error
[18:31:46] Isn't there some script you guys use for creating new 1.20wmfN dirs?
[18:31:51] yeah
[18:31:57] checkoutMediaWiki, as Tim finished it off
[18:32:01] Hmm
[18:32:06] But he also moved some of the localisation stuff around
[18:32:25] Also, with the sudo creation/mod/own I wasn't able to do it all anyway, and had to get a root to do it for me
[18:32:26] Where does that script live?
[18:32:33] multiversion/checkoutMediaWiki
[18:32:58] Which repo?
[18:33:05] Pass
[18:33:08] Might still be in svn
[18:33:10] nm got it
[18:33:16] It's in mediawiki-config I think
[18:33:19] yeah
[18:33:19] URL: svn+ssh://svn.wikimedia.org/svnroot/mediawiki/trunk/tools/mwmultiversion/multiversion
[18:33:20] it's not
[18:33:25] ....or not
[18:33:29] But it's in /h/w/c
[18:33:43] php-* isn't in mediawiki-config either ;)
[18:33:52] exec( "sudo -u l10nupdate -- mkdir $destIP/cache/l10n", $output, $status );
[18:33:57] That looks correct to me
[18:34:04] Or, well, hmm, maybe not
[18:34:18] mkdir needs more params
[18:34:19] forwarded the email I sent to Tim about that
[18:34:40] -m 0775 for one
[18:34:50] Boo, no params for group ownership
[18:35:16] wtf...
[18:35:21] getcwd failed?!?
[18:37:41] lol
[18:38:17] Anyway, if you can get sudo to work there, the following things would need to be changed:
[18:38:26] * pass -m 0775 to mkdir so the dir is created with g+w
[18:38:38] * run sudo -u l10nupdate -- chgrp wikidev $dir
[18:39:35] Did it run anyway, though? I mean there is a php-1.20wmf8/cache/l10n dir with the ownership and perms that I would expect given the current code, did the script create it or did a root do it?
[18:40:24] I just left it as is
[18:40:46] Right, so it did run
[18:41:05] yeah, with that spew of errors
[18:42:29] Of course that sudo stuff isn't in version control, see 'svn di' in /h/w/c/multiversion *grumble*
[18:45:07] not my fault :p
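Roan's two suggested changes, sketched against the exec() call quoted above. Untested: it assumes the l10nupdate sudo rules permit both commands and that $destIP is set by the surrounding script, as in checkoutMediaWiki.

    <?php
    // Sketch of the fix: create the l10n cache dir group-writable, then give
    // it to the wikidev group, per the two bullets above.
    exec( "sudo -u l10nupdate -- mkdir -m 0775 $destIP/cache/l10n", $output, $status );
    if ( $status === 0 ) {
        exec( "sudo -u l10nupdate -- chgrp wikidev $destIP/cache/l10n", $output, $status );
    }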
[19:10:40] !log mlitn Started syncing Wikimedia installation... :
[19:10:48] Logged the message, Master
[19:17:29] ooo
[19:45:23] !log mlitn Finished syncing Wikimedia installation... :
[19:45:31] Logged the message, Master
[19:52:16] binasher: yay, just one sha1 script still running
[20:02:35] miracle
[20:08:17] AaronSchulz: let me guess, enwiki? ;)
[20:09:31] Reedy: enwiki had 4, now it's down to 1
[20:09:51] all non-enwiki wikis were done like a month ago
[20:15:31] Reedy: do you have a scheduled time of day when you do the 1.20wmf* deployments?
[20:16:15] It'll be 11am your time
[20:16:38] https://www.mediawiki.org/wiki/MediaWiki_1.20/Roadmap http://wikitech.wikimedia.org/view/Software_deployments
[20:16:44] okay, thanks. we didn't know if there was a time or just sometime in the day
[20:16:48] Monday, 18:00-20:00 UTC (11am-1pm PDT):
[20:16:58] Wednesday 18:00-20:00 UTC (11am-1pm PDT)
[20:17:09] hah, i checked the roadmap but not the deployment calendar, my bad
[22:26:11] !log awjrichards synchronized wmf-config/InitialiseSettings.php 'Fixing footer logo for enwiki on mobile view'
[22:26:20] Logged the message, Master
[22:52:49] !log awjrichards Started syncing Wikimedia installation... : Synchronizing updates to MobileFrontend per http://www.mediawiki.org/wiki/Extension:MobileFrontend/Deployments/2012-06-24
[22:52:57] Logged the message, Master
[22:53:35] ruhroh
[22:53:36] Copying wikiversions dat and cdb files to apaches...
[22:53:36] srv281: rsync: write failed on "/usr/local/apache/common-local/wikiversions.cdb": No space left on device (28)
[22:53:45] srv281 appears to be out of space
[22:53:49] Oh now it's out of space?
[22:53:54] It gave a different error before
[22:53:58] :(
[22:54:01] notpeter: Weren't you messing with srv281?
[22:54:15] srv281: rsync: recv_generator: mkdir "/usr/local/apache/common-local/php-1.20wmf8/skins/chick" failed: No space left on device (28)
[22:54:15] srv281: *** Skipping any contents from this failed directory ***
[22:54:16] etc/
[23:01:55] what is the extension or config variable to enable search suggestions? it seems it does not work on wm2013
[23:02:19] but i can't verify it's not just my browser since i can't check the config
[23:02:51] I think it needs TitleKey
[23:03:03] But that should just be installed
[23:03:30] Looks like search simply isn't set up for that wiki
[23:03:35] Searching for "hong" yields zero results
[23:03:56] There was some problem with having to add new wikis to Lucene manually I think
[23:04:45] RoanKattouw: would you mind either setting it up or filing a bug, since idk what exactly to request there? thx
[23:05:02] Let me see if I can find one of the previous bugs
[23:05:43] thx
[23:05:56] good morning, TimStarling
[23:06:33] Danny_B|backup: Seems to be known, see https://bugzilla.wikimedia.org/show_bug.cgi?id=25682#c23
[23:07:07] Although.... that's old. I posted https://bugzilla.wikimedia.org/show_bug.cgi?id=25682#c26
[23:07:31] i see
[23:07:34] thx
[23:07:44] Reedy: ^^
[23:08:57] Danny_B|backup: Poke notpeter
[23:08:59] AFAIK I can't do it
[23:09:22] http://wikitech.wikimedia.org/view/Lucene#Adding_new_wikis
[23:09:35] Ok, anyone with root should be able to
[23:09:38] Reedy: oki, i was referring to your comment in that bug assuming you can
[23:10:01] No, I got it sorted and reported back ;)
[23:10:31] i see
[23:10:38] i seem to recall there is another recent bug about some other wiki with search problems
[23:10:56] dupe it to this one if you find it
[23:11:16] I can't see it in the last week's bugs
[23:16:28] I guess those other wikis need testing and verifying
[23:21:17] wow, somebody turned it on, awesome
[23:22:00] only special pages work now, but good beginning
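For the original question, the 1.20-era configuration would have looked roughly like this. A sketch from memory, not checked against wm2013's actual settings, and as noted above the Lucene backend still needs the wiki added manually per the wikitech page linked earlier.

    <?php
    # LocalSettings.php sketch: the suggestion dropdown needs the TitleKey
    # extension for prefix lookups plus the MWSuggest frontend switch.
    require_once "$IP/extensions/TitleKey/TitleKey.php";
    $wgEnableMWSuggest = true;
    # Full-text results are a separate concern: the wiki also has to be added
    # to the Lucene search backend's configuration, a manual step at the time.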
[23:23:35] !log awjrichards Finished syncing Wikimedia installation... : Synchronizing updates to MobileFrontend per http://www.mediawiki.org/wiki/Extension:MobileFrontend/Deployments/2012-06-24
[23:23:43] Logged the message, Master
[23:25:11] awjr: when do you get in tomorrow?
[23:25:47] werdna: around midnight
[23:25:49] oh wait
[23:25:51] that's when i get in tonight
[23:25:55] aha
[23:25:59] so you're in all day tomorrow
[23:26:00] i'll be in the office ~10am tomorrow :)
[23:26:02] what are you doing working :)
[23:26:06] yep!
[23:26:13] my flight's not til 8
[23:26:19] it's a quick trip from tucson
[23:26:25] probably getting about time to head to the airport :p
[23:26:35] it's only 20 minutes away, and it's tiny
[23:26:41] although… i do need to pack still
[23:28:10] packing is easy :p
[23:29:37] oh yeah :) i've got about 30 mins of work left in me before i start pulling it all together
[23:30:33] you are a trooper
[23:57:14] Ryan_Lane: you still think https://www.mediawiki.org/wiki/Wikimedia_Labs/Terms_of_use should be "draft"?
[23:57:32] I'll need to talk to legal to take it out of draft
[23:57:51] ok