[00:29:07] Hm, wait, maybe it was a fi.wiki gadget?
[00:29:24] (Today fi.wiki confused me quite a bit)
[08:26:15] I can't clear my watchlist :( The error I get is: "Our servers are currently experiencing a technical problem. This is probably temporary and should be fixed soon. Please try again in a few minutes."
[10:39:05] hi
[10:39:45] what is the ratelimit for creating pages from the same IP? too lazy to try to find it in the config
[11:51:55] ja.wikipedia has kept overriding Typography Refresh. is there any problem with that going forward?
[11:57:18] 'edit' => array(
[11:57:19] // 8 ed./min per each non-autoconfirmed, or group thereof from same IP
[11:57:19] 'ip' => array( 8, 60 ),
[11:57:21] Cladis: ^
[11:58:38] whym: you'd better ask on the bug report you had filed
[11:59:30] whym: also, it's about time to file a bug report on how to handle the thing in the 1.24 tarball (font changes were removed from the 1.23 branch but IIRC they're still in master; they didn't get worked on or stabilised though, AFAIK)
[12:04:57] hmm, on https://wikimania2014.wikimedia.org/wiki/Main_Page#Democratic_Media - (scroll down) to the Democratic Media section, is there any way to get the images to squeeze onto one row?
[12:07:01] Thehelpfulone: smaller images?
[12:07:43] Glaisher: I mean, is there a gallery option to automatically resize them?
[12:08:23] oh sorry, found it; that heights=px works
[12:08:25] the heights parameter
[12:12:14] er, Thehelpfulone it seems that it depends on screen resolution or size.. with your last change it's still on two different rows on my screen
[12:12:40] hmm, to be safe should I bring it down to something like 100px?
[12:14:50] Nemo_bis: thanks for the suggestion, I have added an update on it.
[12:15:10] Thehelpfulone: Too small. The caption on Heather's image is protruding upwards with 100px
[12:16:13] Glaisher: can you play around with it on your screen to see what works?
[12:21:28] Thehelpfulone: Same result when it's one row. I guess there's no easy way to achieve consistency across various screen sizes.
[12:21:50] Also, the first and the last sections are also appearing as two rows here. Better to keep it as it is, I think.
[12:22:16] ok, thanks
[15:16:53] "Nyfilmcritique thanked you for your edit on [No page]. 1 minute ago" - [No page]?
[15:18:16] Josve05a: What wiki?
[15:18:46] enwp
[15:20:06] Reedy - http://i.snag.gy/X0Pq5.jpg
[15:21:37] 10:14, July 24, 2014 Nyfilmcritique (talk | contribs | block) thanked Josve05a (talk | contribs | block)
[15:21:52] Log doesn't say which page
[15:22:36] What if the page has been deleted? Would it still say which page it was?
[15:23:17] It's used when the Title is null
[15:23:25] Not sure when you might get that...
[15:24:04] Well, I did get that.
[15:24:12] Might be this?: https://en.wikipedia.org/wiki/Special:Undelete/Zest.md
[15:24:29] It's when you interacted with him
[15:24:40] So it looks like the page was deleted
[15:25:15] (del/undel) 10:14, July 24, 2014 NawlinWiki (talk | contribs | block) deleted page Zest.md
[15:25:18] Is the page named [[Zest.md]]? (I do not have "permission" to check that link... :P)
[15:25:25] But I do.
[15:27:12] hmmm, I saw a bz report about something like this...
[15:27:54] 50829
[15:29:05] Thank you, I just wanted to know which page it was, so that seems to be the most likely.
[16:13:03] short question about API:Search and generators...
[16:13:10] do generators lose the order of a search?
[16:13:17] search: http://en.wikipedia.org/w/api.php?format=json&action=query&redirects&list=search&srlimit=10&srsearch=cain%20abel
[16:13:41] generator for those pages: http://en.wikipedia.org/w/api.php?format=json&action=query&redirects=&indexpageids=&prop=info|categories&clcategories=Category:Disambiguation%20pages&generator=search&gsrlimit=10&gsrsearch=cain%20abel
[16:16:22] (if you do the same with gsrlimit=1 you'll see the first search result on top)
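
[editor's note: the output of a generator is indeed unordered, but pages produced by generator=search carry an "index" field recording each page's search rank, so a client can restore the order itself. A minimal sketch in Python, assuming the requests library; parameters follow the URLs above:]

    import requests

    # Fetch search results through a generator, then re-sort by search rank.
    # The API returns generator pages keyed by page id, in arbitrary order,
    # but each page object includes an "index" field with its rank.
    resp = requests.get('https://en.wikipedia.org/w/api.php', params={
        'format': 'json',
        'action': 'query',
        'prop': 'info',
        'generator': 'search',
        'gsrlimit': 10,
        'gsrsearch': 'cain abel',
    }).json()

    pages = resp['query']['pages'].values()
    for page in sorted(pages, key=lambda p: p['index']):
        print(page['index'], page['title'])
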
[16:39:51] Reedy: do you remember why we added db slaves in beta labs? https://github.com/wikimedia/operations-mediawiki-config/commit/38990c671fd3b8d15f31a7c819e7bdd52ecef3ef I think it's causing https://bugzilla.wikimedia.org/show_bug.cgi?id=65486
[16:40:11] chrismcmahon: yup
[16:40:17] the simple answer is to match production
[16:40:37] There was a discussion to have a permanently lagged slave to test race conditions
[16:40:44] Reedy: I think the problem is that we don't run update.php in prod.
[16:40:48] let me find the bug for it
[16:41:12] Reedy: having the db go read-only in beta several times per day is a pita.
[16:41:36] is that still happening after those schema changes?
[16:41:54] Reedy: it happens pretty much daily, I can find you an example from overnight if you want
[16:42:20] https://bugzilla.wikimedia.org/show_bug.cgi?id=38945
[16:42:36] That's not the bug I was actually thinking of though
[16:42:46] Reedy: I happen to have it up, this was overnight yesterday with basically zero load on beta https://integration.wikimedia.org/ci/view/BrowserTests/job/browsertests-VisualEditor-en.wikipedia.beta.wmflabs.org-linux-firefox-sauce/120/console
[16:44:10] Reedy: yeah, that one from Niklas shouldn't happen when we do other work.
[16:44:50] There's a similar one logged by someone else
[16:46:45] chrismcmahon: found it
[16:46:45] https://bugzilla.wikimedia.org/show_bug.cgi?id=60058
[16:46:58] Which then is needed for https://bugzilla.wikimedia.org/show_bug.cgi?id=57583
[16:48:06] oh hell
[16:48:15] which is what was filed for https://bugzilla.wikimedia.org/show_bug.cgi?id=38945
[16:48:32] I know why 60058 was filed
[16:48:50] which is needed to debug https://bugzilla.wikimedia.org/show_bug.cgi?id=46716 and its dozens of siblings
[16:49:32] Reedy, Nemo_bis: I think the point was to have replag but not have the beta db become readonly several times per day.
[16:49:42] haha, right :)
[16:49:48] But that's the reason for adding the slaves
[16:50:03] well, slave, singular
[16:50:30] Reedy: but having a slave db and running update.php are incompatible with a working db, it seems
[16:50:48] incompatible with having a working db all the time
[16:51:08] those schema changes before were an issue as it was trying to run 30 or whatever sets simultaneously
[16:51:26] Most of the time there aren't any schema changes made
[16:51:31] I'm not sure why it's getting so lagged out apparently so frequently
[16:54:13] Reedy: several times per day usually. it is the source of many flaky test failures
[16:54:21] :/
[16:54:36] I can't believe we're generating that much db traffic
[16:54:44] and update.php should mostly be a noop
[16:55:05] if it's becoming an issue, we can comment it out of the MW config
[16:55:19] it'll still replicate, but mw won't use it
[16:56:48] Reedy: I'm not entirely sure I understood that last bit, but go for it
[16:57:10] heh
[16:57:17] So, mysql replication is a separate thing entirely from mediawiki
[16:57:24] sure
[16:58:01] if we comment out -db2 from the labs db config, mediawiki will stop using it (so then wikis won't go readonly due to replag), but we'll still have the "working" slave
[16:58:40] I wonder what the threshold value is for a wiki to become readonly
[16:59:12] wfm
[16:59:55] Reedy: yeah, the readonly is the actual problem
[17:00:20] "If all slaves are lagged by more than 30 seconds, MediaWiki will stop writing to the database"
[17:00:38] I can't remember if this is adjustable
[17:00:44] if it is, I'd like to try increasing that first
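
[editor's note: the replication lag MediaWiki sees can be watched from the outside via the API's dbrepllag site info, which is one way to verify how often beta actually crosses that threshold. A minimal sketch in Python, assuming the requests library; the beta URL is illustrative:]

    import requests

    # Query MediaWiki's reported database replication lag (siprop=dbrepllag).
    # sishowalldb=1 lists every database server, not just the most lagged one.
    resp = requests.get('https://en.wikipedia.beta.wmflabs.org/w/api.php', params={
        'format': 'json',
        'action': 'query',
        'meta': 'siteinfo',
        'siprop': 'dbrepllag',
        'sishowalldb': 1,
    }).json()

    for db in resp['query']['dbrepllag']:
        print(db['host'], 'lag:', db['lag'], 'seconds')
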
[19:08:30] how to resolve "Enable irc feed for wikitech.wikimedia.org site"? :)
[19:10:25] mutante: UDP packets from the host need to make it to argon
[19:11:10] Then something like
[19:11:10] $wgRCFeeds['default'] = array(
[19:11:10] 'formatter' => 'IRCColourfulRCFeedFormatter',
[19:11:10] 'uri' => "udp://$208.80.154.160:9390/#wikitec",
[19:11:11] 'add_interwiki_prefix' => false,
[19:11:11] 'omit_bots' => false,
[19:11:11] );
[19:11:56] without the $ in the IP
[19:12:10] and #wikitech spelt properly
[19:13:20] Reedy: thanks!
[19:15:19] should probably be #wikitech.wikimedia
[19:15:43] heh
[19:23:55] Reedy: confirmed the network part works, i can netcat -u from virt1000 (wikitech) to argon
[19:24:04] 9391 or whatnot
[19:24:18] 9390
[19:26:54] Reedy: that was using the new RCStream module, right
[19:27:24] "new"
[19:27:31] 'uri' => "redis://rcstream.eqiad.wmnet:6379/rc.$wgDBname",
[19:27:34] that
[19:27:42] Uh..
[19:27:43] That's not IRC
[19:27:43] puppet/modules/rcstream
[19:28:01] but but it "contains a simple and self-contained software stack for streaming recent changes from a MediaWiki instance."
[19:29:01] "simple and self-contained software stack" smells like it's not our good old hacky IRC feed
[19:30:12] ok, trying a change in mw-config as you originally said, then -> gerrit
[19:32:57] // RCStream / stream.wikimedia.org
[19:33:36] "An error occurred during a connection to stream.wikimedia.org. SSL received a record that exceeded the maximum permissible length. (Error code: ssl_error_rx_record_too_long)"
[19:34:05] No SSL support?
[19:34:07] 404 on http
[19:34:14] not sure yet what we'd expect there
[19:34:16] Bsadowski1, did you try to connect via a browser or something?
[19:34:22] lol yeah
[19:34:28] ...
[19:35:03] we might still want to stop httpd from answering there
[19:35:19] we could redirect it to the docs :p
[19:35:23] https://wikitech.wikimedia.org/wiki/RCStream
[19:35:29] good one
[19:38:16] I don't like this change. What will happen to snitchbot?
[19:38:43] I use https://github.com/mzmcbride/irc-bots/blob/master/snitch.py (well a modified version)
[19:39:03] It gets it from irc.wikimedia.org
[19:39:33] i made a bug for that redirect, you have mail
[19:40:35] Bsadowski1, what, you don't like the idea of moving away from our IRC+human-formatted system?
[19:41:53] I do like the idea, but how will the bots we currently use get the feed?
[19:42:37] They'll be modified to use the new one
[19:42:41] By their developer(s)
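
[editor's note: for bot authors facing that migration, a minimal sketch of an RCStream consumer, based on the client examples documented at https://wikitech.wikimedia.org/wiki/RCStream; assumes the Python socketIO-client library:]

    from socketIO_client import SocketIO, BaseNamespace

    class RCNamespace(BaseNamespace):
        def on_connect(self):
            # Ask the server for the recent changes of one wiki.
            self.emit('subscribe', 'en.wikipedia.org')

        def on_change(self, change):
            # Each event is a JSON object describing one recent change.
            print('%s edited %s' % (change['user'], change['title']))

    socketIO = SocketIO('stream.wikimedia.org', 80)
    socketIO.define(RCNamespace, '/rc')
    socketIO.wait()
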
[21:45:33] what is this for?
[21:45:36] http://en.wikipedia.org/wiki/Special:OAIRepository
[21:45:39] it's search related
[21:45:42] and asks for a login
[21:46:07] brion: ^
[21:46:35] mutante: recent changes, essentially
[21:46:38] mutante: https://www.mediawiki.org/wiki/Extension:OAIRepository
[21:46:56] thanks!
[21:46:58] i believe ‘testy’ ‘mctest’ works as a username/pass ;)
[21:47:27] brion: nope :P
[21:47:48] darn, someone must have removed that :D
[21:48:12] what access data is that? A wiki one? Or some secret stuff that I can find on tin/fenari?
[21:48:21] * could (don't care that much)
[21:48:28] username testing?
[21:48:43] docs: "it will fail" :)
[21:48:53] Reedy: yes, that’s it :D
[21:48:54] brion: testing/mctest it seems ;)
[21:49:22] but yeah it’s not super exciting, it’s just an export feed wrapped in OAI-PMH xml requests
[21:49:27] :)
[21:49:30] Reedy: Works... but doesn't seem useful to have that restricted
[21:49:30] i think we want to deprecate it though
[21:49:37] in favor of more modern rcstream whatever
[21:49:53] I should probably re-enable auditing and see who is still using it
[21:50:10] ^demon|away: does cirrus search still use OAI for updates or is it on something else now?
[21:50:21] i would love to kill it and close any outstanding bugs ;)
[21:50:22] Why is it behind http auth?
[21:50:31] <^demon|away> We hook into MW all over the place like an octopus.
[21:50:33] <^demon|away> No OAI
[21:50:45] Reedy: originally we used it to offer real-time data feeds for a fee to some mirrors
[21:50:47] <^demon|away> Main source of updates is LinksUpdateCompleted but there are others.
[21:50:50] now we’ve phased them out
[21:50:56] ah
[21:50:57] MONIES
[21:50:58] we just don’t want anyone using OAI cause it’s old and unmaintained
[21:51:16] undeploy, then?!
[21:51:21] so once old search is off, i think it may be safe to undeploy, yeah
[21:51:24] but not just yet i suspect
[21:51:24] people are/were using it
[21:51:30] i'm going to enable auditing
[21:51:34] then we can start poking people
[21:51:36] <^demon|away> brion: So once we've decom'd lsearchd we can start killing OAI imho.
[21:51:39] <^demon|away> Yep.
[21:51:40] Reedy: the auditing had poor performance iirc…
[21:51:41] <^demon|away> Agreed.
[21:51:44] I think dbpedia (among others) is still using it
[21:51:49] brion: It was enabled for ages
[21:51:50] https://gerrit.wikimedia.org/r/#/c/148992/1/templates/lucene/lsearch.conf.erb
[21:51:57] I disabled it not too long ago
[21:52:03] ah
[21:52:31] Sometime around the start of the year I think :)
[21:53:01] :D
[21:53:18] <^demon|away> brion: So yeah, I think we could be ready to stop using it internally inside the next quarter (and a half, tops)
[21:53:26] \o/
[21:53:29] <^demon|away> So if you wanna start trying to get our third parties off it, be my guest.
[21:53:32] <^demon|away> :)
[21:53:36] Actually
[21:53:40] Didn't I break it a few months ago
[21:53:43] And we had people complaining? :P
[21:54:00] best way to see who's using it?
[21:54:04] heh
[21:54:13] there are already 14 rows!
[21:54:17] definitely double-check if dbpedia’s still using it
[21:54:17] it's always amazing how one thing leads to another :)
[21:54:24] just mentioned it because of some lint change, hehe
[21:55:06] <^demon|away> brion: For dbpedia and any other cool people consuming it we don't just want to pull the rug out from under them though.
[21:55:06] Yup
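
[editor's note: context for the protocol being retired here. Special:OAIRepository speaks standard OAI-PMH, i.e. plain HTTP GET requests with a "verb" parameter, wrapped in the HTTP auth discussed above. A hedged sketch in Python of the kind of harvest request a consumer like dbpedia would poll with; the 'mediawiki' metadataPrefix and the credentials are assumptions, not the real values:]

    import requests

    # One OAI-PMH ListRecords harvest request against the repository endpoint.
    # 'verb' and 'from' are standard OAI-PMH; the metadataPrefix value and the
    # HTTP basic-auth credentials below are placeholders.
    resp = requests.get(
        'https://en.wikipedia.org/wiki/Special:OAIRepository',
        params={
            'verb': 'ListRecords',
            'metadataPrefix': 'mediawiki',   # assumed prefix
            'from': '2014-07-23T00:00:00Z',  # only changes since this time
        },
        auth=('username', 'password'),       # placeholder credentials
    )
    print(resp.status_code)
    print(resp.text[:500])  # OAI-PMH responses are XML
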
[21:55:13] dbpedia is DEFINITELY still using it
[21:55:21] 44 audit rows so far
[21:55:21] all of them
[21:55:26] <^demon|away> So I'd hope we'd be able to service them from a combination of rcstream/pubsub/api stuffs.
[21:55:32] ok, let’s plan to plan to migrate them to something more modern
[21:55:59] Do we know who to contact?
[21:56:03] rcstream is just metadata, right? they might need the text as well
[21:56:08] no idea
[21:56:12] <^demon|away> admin@dbpedia.org?
[21:56:14] <^demon|away> :p
[21:56:31] root@, postmaster@, attention@ ;)
[21:56:59] help@
[21:57:06] stopusingfuckingoai@
[21:57:08] <^demon|away> See if we can get an apache error page with ContactAdmin set :p
[21:57:12] http://wiki.dbpedia.org/Support
[21:57:28] <^demon|away> Oh, they have a bug tracker.
[21:57:35] <^demon|away> File "Stop using oai"
[21:57:41] http://sourceforge.net/p/dbpedia/tracker/
[21:57:50] No activity for well over a year
[21:58:19] To Facebook! https://www.facebook.com/groups/4340232249/
[21:58:32] <^demon|away> "Hey friends i want some data of the banking for my big data experiments please provide me source ..."
[21:59:28] try him http://dws.informatik.uni-mannheim.de/en/people/professors/prof-dr-christian-bizer/
[22:02:39] Ah
[22:02:45] I've got some old emails about this...
[22:03:25] Diederik emailed them.. Reply from Sebastian Hellmann
[22:04:00] Of course, we are willing to change to the MediaWiki API, if necessary (and we also have the man power to achieve this within several months).
[22:04:00] There were two major reasons why we didn't switch yet:
[22:04:00] 1. we have a running system, there is no real incentive to switch unless you tell us to.
[22:04:00] 2. we didn't have a contact from Wikimedia. I wrote one or two emails in the past, but didn't get a response.
[22:04:00] 3. We did not find any good documentation on how to get *all* updates from Wikipedia. Query RC and then do Special:Export requests?
[22:04:01] 4. We were afraid to get blocked, since we would be over the 1 request per second limit.
[22:04:24] I forwarded this to Engineering back in Feb 2013
[22:04:25] "[Engineering] FW: Fwd: Use of OAI by Dbpedia"
[22:08:06] guess we should pull that one out again sometime
[22:08:32] and make sure we have a newer solution for them
[22:09:09] https://bugzilla.wikimedia.org/show_bug.cgi?id=68538
[22:09:25] Just opened that.. Seems Sebastian has a bz account on a gmail email
[22:11:23] * Reedy emails him too
[22:11:46] thanks Reedy :D
[22:15:18] Hopefully gets the ball rolling at least
[23:14:26] Seems we have some OAI traffic from fresheye.com
[23:15:17] wants "https://wikitech.wikimedia.org/wiki/Projects#Gerrit_repo_creation_through_wikitech
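
[editor's note: on Sebastian's question 3 above, "Query RC and then do Special:Export requests?" is roughly the right shape even without OAI. A hedged sketch in Python, assuming the requests library, that polls recentchanges and pulls the full wikitext of changed pages through the same API instead of Special:Export:]

    import requests

    API = 'https://en.wikipedia.org/w/api.php'

    # 1. Poll the recent changes feed for the latest edits and page creations.
    rc = requests.get(API, params={
        'format': 'json',
        'action': 'query',
        'list': 'recentchanges',
        'rctype': 'edit|new',
        'rcprop': 'title|ids|timestamp',
        'rclimit': 50,
    }).json()['query']['recentchanges']

    # 2. Fetch the current wikitext of the changed pages in one batched request.
    titles = '|'.join(change['title'] for change in rc[:10])
    pages = requests.get(API, params={
        'format': 'json',
        'action': 'query',
        'prop': 'revisions',
        'rvprop': 'content|timestamp',
        'titles': titles,
    }).json()['query']['pages']

    for page in pages.values():
        if 'revisions' not in page:  # the page may have been deleted meanwhile
            continue
        rev = page['revisions'][0]
        print(page['title'], rev['timestamp'], len(rev['*']), 'bytes')
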