[04:19:39] what could it be? For several days: "Our servers are currently experiencing a technical problem." https://ru.wikipedia.org/wiki/%D0%92%D0%B8%D0%BA%D0%B8%D0%BF%D0%B5%D0%B4%D0%B8%D1%8F:%D0%A4%D0%BE%D1%80%D1%83%D0%BC/%D0%A2%D0%B5%D1%85%D0%BD%D0%B8%D1%87%D0%B5%D1%81%D0%BA%D0%B8%D0%B9#.D0.9F.D1.80.D0.BE.D0.B1.D0.BB.D0.B5.D0.BC.D0.B0_.D0.BF.D1.80.D0.B8_.D1.81.D0.BC.D0.B5.D0.BD.D0.B5_.D0.BF.D0.B0.D1.80.D0.BE.D0.BB.D1.8F (translate.google
[04:22:47] ssl_: I think that's https://phabricator.wikimedia.org/T75462
[04:25:44] thank you
[04:31:11] hello, I was wondering if this would be an appropriate place to ask about Varnish configs for MediaWiki? If so, is an example of the Varnish config used by Wikipedia available anywhere? Specifically, I am wondering how to have Varnish cache the resources loaded by ResourceLoader.
[04:37:09] that'd probably be somewhere in https://github.com/wikimedia/operations-puppet
[04:40:02] thank you, I'll take a look!
[05:12:08] do users receive talk page notifications via Echo when a message is placed on user talk subpages?
[05:13:43] Glaisher: only if mentioned
[09:49:21] Wikipedia has been really slow to respond lately. Am I the only one seeing this?
[11:14:31] (Maybe, yes.)
[13:41:03] wikitravel.org looks down
[13:56:19] Nemo_bis, ... k
[17:54:42] If this works, it might be useful for some of our mailing list digest users: https://chrome.google.com/webstore/detail/mailto-for-gmail/dgkkmcknielgdhebimdnfahpipajcpjn?hl=it
[18:46:22] Is there something on eo.wiki which converts "XX" to "Xx"? :o This file link shouldn't be red: https://eo.wikipedia.org/wiki/Francesco_Filelfo
[19:47:24] hi folks, I recall seeing a webapp where you can upload a patch file via a web form and a Gerrit change gets created
[19:47:44] and I cannot find the URL anymore, but I'm pretty sure that you are actually using it -- got one?
[19:50:30] jkt: https://tools.wmflabs.org/gerrit-patch-uploader/
[19:54:37] Nemo_bis: thanks, you're my hero for today
[20:01:07] jkt: great! But you swap heroes daily? :) I still have to find one
[20:01:42] (just kidding)
[20:54:18] How do I create an interwiki link from en:wikt to simple:w?
[20:55:12] I've tried w:simple, wikipedia:simple, simple:w and simple:wikipedia, and none seems to work quite right.
[20:57:10] wikipedia:simple is the closest, except it goes to simple:special:diff/ on enwp, which redirects to simple.wp
[21:01:11] w:simple: works
[22:22:29] jackmcbarn: why did you change Template:JS_migration to use templates instead of regular brackets? I null-edited through the cat and the way I had it was working perfectly.
[22:26:06] I can understand adding the / to each find (sure is a pain to have to do it that way instead of with one regex invocation). The typo in cologneblue I often make was a good fix too. I just don't understand the point of using {{!((}} and {{))!}} instead of [[ and ]] respectively.
[23:08:23] Hi all! I would like some advice on how I can obtain images from Wikipedia (or any other MediaWiki install) in the most community-friendly manner...
[23:09:01] Community friendly?
[23:09:08] As I see it, there have been quite a few dispersed approaches to this problem; furthermore, the public backup mirrors on yours.org seem to be inactive
[23:10:02] Sorry, weird wording... community friendly = in a way that's useful to some and doesn't annoy others, i.e. those that admin the servers
[23:10:37] Well, for WMF, that'd generally be by requesting them in a way that won't bypass the caches
[23:12:34] Thanks @Reedy! Is requesting a cached version like requesting something from http://upload.wikimedia.org ?
[23:12:53] Reedy: best would be ones that go through caches but don't cause cache misses
[23:13:01] Well, yeah
[23:13:31] My idea would be to make a project that obeys "best practices" and put it here, which is something that I think I can wrap up quite soonish: https://github.com/benjaoming/python-mwdump-tools
[23:13:33] but as we both know, we've had people doing stupid shit with parameters, which causes new images to be rendered and served and such
[23:13:37] T13|supper: it didn't work perfectly. It didn't work at all for the MediaWiki talk namespace
[23:15:19] Then, secondly, I would like to serve a mirror of downsampled media... maybe not totally in public... but one that anyone can access by request or by sending a hard drive over... would you think the interest is fairly big in those regards?
[23:16:07] According to the listing in the category, it worked, jackmcbarn.
[23:16:16] Sounds like a lot of hassle for not a lot of gain
[23:16:44] I'm not going to change it because it doesn't matter; I was just curious.
[23:16:48] the sortkey didn't work, since the space is stripped as a trailing space
[23:18:52] But empty sortkeys still show up first whether there's a space there or not.
[23:19:50] So currently, 800px, 1024px and 1280px are well-known thumbnail sizes that won't generate cache misses? Is that universal for all images, or is there a way of verifying it?
[23:20:19] T13|supper: empty sortkeys don't even render as a categorylink
[23:20:50] benjaoming: No
[23:20:52] (though note that that's hard to test, since the PST will attempt to make it non-empty if it's not done through a template or something)
[23:21:32] Look at Category:JavaScripts using deprecated elements - all non-user talk are up top. That's how they have been since my edit and null edits.
[23:21:43] T13|supper: they weren't that way before I made the edit
[23:22:57] benjaoming: Generating cache misses is only a problem if there are a lot of them
[23:27:38] Reedy: I serve offline versions of the English Wikipedia for development projects in areas without internet connectivity... with downsampled media... so I really intend to avoid doing things the wrong way, i.e. generating a million cache misses :) So I'd be most comfortable with a method that gives zero cache misses for thumbnails, if possible.
[23:27:41] benjaoming: you know of https://github.com/WikiTeam/wikiteam, don't you
[23:27:56] a million cache misses is irrelevant
[23:28:00] benjaoming: I'm not sure if you can do it without
[23:31:39] benjaoming: I'm not sure what your requirements are, but https://sourceforge.net/p/kiwix/other/ci/master/tree/mwoffliner/mwoffliner.js already takes care of image download and compression
[23:45:26] Nemo_bis: Maybe you're right... it's a long time since I last checked the market. Kiwix seems to offer quite a lot. I'll take an extra look.
[23:48:00] Nemo_bis: Hmm, unfortunately Kiwix still hasn't released a version of the English Wikipedia with images, afaik: http://sourceforge.net/p/kiwix/discussion/604121/thread/67a3195e/
[23:48:23] So what I'll do is talk to those guys and ask how far they are.
[23:53:18] Nemo_bis and Reedy: thanks for your help. I'll coordinate with the Kiwix folks!
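
The thumbnail discussion above (requesting widely used sizes such as 800px/1024px/1280px so the thumbnail is likely already cached) can be sketched as follows. This is a hedged illustration, not an official client: it assumes the MD5-sharded path layout used by upload.wikimedia.org for Commons files, which should be verified against the live site before any bulk use.

```python
import hashlib

def commons_thumb_url(filename, width=800):
    """Build a thumbnail URL for a file hosted on Wikimedia Commons.

    Commons shards files into directories named after the first hex
    digits of the MD5 of the underscore-normalized file name.
    Requesting commonly used widths (e.g. 320/640/800/1024/1280px)
    makes it more likely the thumbnail is already in the cache, so
    the request does not trigger a fresh render on the image scalers.
    Sketch only: the path scheme is an assumption of this example.
    """
    name = filename.replace(" ", "_")
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    shard = "%s/%s" % (digest[0], digest[:2])
    return ("https://upload.wikimedia.org/wikipedia/commons/thumb/"
            "%s/%s/%dpx-%s" % (shard, name, width, name))

print(commons_thumb_url("Example.jpg", 800))
```

Fetching URLs of this shape goes through the front-end caches rather than bypassing them, which is the "community friendly" behaviour Reedy describes; a per-image width parameter chosen arbitrarily is exactly the kind of cache-miss generator to avoid.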
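
On the earlier Varnish/ResourceLoader question: ResourceLoader serves module bundles from a single endpoint (`/w/load.php`), so a cache in front of MediaWiki can key responses on the full URL, provided clients request modules in a deterministic form. The sketch below shows that idea by canonicalising the module list; the exact parameters and the compressed module-list notation the real ResourceLoader client uses are not reproduced here, so treat the URL shape as an assumption.

```python
from urllib.parse import urlencode

def load_php_url(host, modules, lang="en", skin="vector", only=None):
    """Build a canonical ResourceLoader request URL (illustrative).

    Sorting the module names makes the URL deterministic, so two
    clients asking for the same bundle produce the same cache key.
    (The real ResourceLoader additionally compresses the module list;
    this sketch deliberately skips that.)
    """
    params = {
        "modules": "|".join(sorted(modules)),
        "lang": lang,
        "skin": skin,
    }
    if only:  # e.g. "styles" or "scripts"
        params["only"] = only
    return "https://%s/w/load.php?%s" % (host, urlencode(params))

url = load_php_url("en.wikipedia.org",
                   ["mediawiki.util", "jquery.cookie"],
                   only="scripts")
```

Because the URL is a pure function of the module set and parameters, a Varnish VCL that simply caches GETs to `/w/load.php` by URL will serve repeat requests from cache; the Wikimedia production VCL in operations-puppet is the authoritative reference for the details.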