[00:34:43] Hey I am developing an extension and I have to use AJAX for this.
[00:34:59] The ajax is not working at all ..
[00:35:27] I mean there's no record of it in apache .. the .done(function (){ is also not run
[00:48:13] How to enable the use of AJAX for mediawiki extensions .. I am using 1.19.5 mediawiki !1
[01:57:48] AdityaSastry___: #mediawiki not #wikimedia-tech
[01:58:55] Got it ..
[13:20:17] heyas
[13:36:56] how do I get the code of miszabot?
[13:48:18] guillom: did B.R.I.O.N. lack this week by chance or for the new tech/news?
[13:48:51] Nemo_bis: I think this week (and a few times before) nobody had time to write it.
[13:49:08] only once iirc
[13:49:13] but ok
[13:56:00] eptalon: email misza
[13:58:15] closedmouth: he has the email set in his profile?
[13:58:43] well there's only one way to find out, and it isn't by asking me
[13:59:44] ok, will try.
[15:46:50] miszabot's code is in pywikipedia's repo
[16:22:22] hi there folks
[16:22:32] i'm having problems with $wgUseFileCache on my IIS installation
[16:22:33] yes, hello
[16:22:34] mediawiki is not writing any cache in the specified directory
[16:23:08] the directory exists and has permissions
[16:24:13] #mediawiki
[16:24:22] he's already there
[16:24:31] came here after asking there
[16:24:55] #mediawiki!
[16:25:20] edsonmedina, doesn't look like the staff are around. You'll have to ask some other time.
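[Editor's note: the AJAX question above usually comes down to making the extension's client-side code call the wiki's api.php endpoint and checking that requests actually reach the server (no apache log entry means the request was never sent, so the browser console is the first place to look). As an illustration only, here is a minimal Python sketch of constructing the kind of api.php request a .done() handler would be waiting on; the wiki URL is a placeholder, not from the log.]

```python
from urllib.parse import urlencode

def api_request_url(base: str, params: dict) -> str:
    """Build the api.php URL an extension's AJAX call would request.

    The MediaWiki API returns machine-readable output only when an
    explicit format is requested, so default to JSON. Parameters are
    sorted purely to make the result deterministic.
    """
    params = dict(params)
    params.setdefault("format", "json")
    return base + "?" + urlencode(sorted(params.items()))

# Hypothetical wiki location; substitute your own installation's path.
url = api_request_url("http://localhost/w/api.php",
                      {"action": "query", "meta": "siteinfo"})
print(url)
```

Requesting a URL like this directly (with curl or a browser) and watching the apache access log is a quick way to separate a server-side problem from client-side JavaScript that never fires.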
[16:25:44] yeah, #mediawiki seems dead
[16:25:46] thanks anyway
[16:26:00] np
[16:26:01] sorry
[16:27:30] YuviPanda, ^
[16:27:37] Can you help that guy
[16:28:21] sorry, edsonmedina, Theo10011, haven't done anything with IIS
[16:29:01] neither have most staff :)
[16:29:09] might want to ask the mailing list
[16:29:34] edsonmedina: yeah, emailing wikitech-l is probably the best way to go on
[16:40:43] thanks
[17:17:45] [[Tech]]; Guillom; fix test; https://meta.wikimedia.org/w/index.php?diff=5502983&oldid=5500953&rcid=4201267
[17:18:52] [[Tech]]; Guillom; another test to fix this; https://meta.wikimedia.org/w/index.php?diff=5502985&oldid=5502983&rcid=4201270
[17:19:16] [[Tech]]; Guillom; giving up; https://meta.wikimedia.org/w/index.php?diff=5502987&oldid=5502985&rcid=4201272
[17:20:52] do servers have an edit limit for bots?
[17:21:07] can a bot make a million edits per minute?
[17:21:34] no
[17:21:43] bots have an edit limit
[17:22:56] Theo10011 what would happen if a bot hits that
[17:23:02] and also what is that limit?
[17:23:19] ToAruShiroiNeko, Im not sure. Susan would def. know.
[17:23:34] its not an absurd number I'd imagine
[17:23:39] No
[17:23:47] it's actually low
[17:23:49] I am trying to convince people that there is no reason to regulate bot edit speeds.
[17:23:58] heh
[17:24:00] there isn't?
[17:24:42] if a bot makes an error, you want it to keep going till it does the maximum damage?
[17:25:01] It really depends what they're editing too
[17:25:16] Respecting maxlag is one way to throttle if necessary
[17:25:27] Reedy, would know this stuff.
[17:25:47] fail
[17:25:55] wat
[17:26:19] [[Tech]]; PiRSquared17; /* Identifying wikis with local titlebacklists and issues within */ +; https://meta.wikimedia.org/w/index.php?diff=5502990&oldid=5502987&rcid=4201277
[17:26:31] I was going to whois
[17:26:32] But failed
[17:27:08] ffs
[17:27:11] I'm giving up
[17:27:13] for the day
[17:27:34] Ya, you don't seem on your best.
[17:27:35] Theo10011 no
[17:27:41] take my advice though Reedy.
[17:27:49] get those 2 things I suggested
[17:27:59] if its a routine task a timer only wastes time
[17:28:08] latency alone would prevent the bot to edit too fast anyways
[17:28:28] my point is wikis shouldnt have policies to fix performance issues of the wiki
[17:28:37] ToAruShiroiNeko, there's a bunch of issues there. There is the server where the bot runs from, the local wiki that can limit the edit, the type of edit.
[17:28:40] thats something devs can handle just fine
[17:29:10] sure but speed limit is only there so the servers dont break
[17:29:13] there is no automated task that is lagging that a bot can change
[17:29:17] It's very unlikely a single bot can actually cause issues to the server.
[17:29:18] you would have to be specific
[17:29:33] Depends what they're requesting/doing
[17:29:33] unless you try really really hard ;)
[17:29:41] ^^
[17:29:44] no speed limits aren't because of the breaking server issues
[17:29:50] what legoktm said
[17:29:59] sure, but in this case that is the rationale
[17:30:21] bots can break the wiki even with the speed limit
[17:30:44] I am hoping people would block the bot despite it editing slowly in such a case :p
[18:40:33] going to run scap in a second
[18:44:53] aude: but why would people at the hackathon need to be creating accounts?
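[Editor's note: the maxlag mechanism mentioned at 17:25:16 works by attaching a maxlag=N parameter to each API request; when database replication lag exceeds N seconds, the server rejects the request with an error whose code is "maxlag", and a well-behaved bot waits and retries. A hedged Python sketch of that convention follows; the parameter and error code come from the MediaWiki API, while the retry/backoff policy and the `do_post` callable are illustrative assumptions.]

```python
import time

MAXLAG_SECONDS = 5  # a commonly used value; tune for your wiki

def with_maxlag(params: dict, maxlag: int = MAXLAG_SECONDS) -> dict:
    """Return a copy of API request params with the maxlag throttle attached."""
    out = dict(params)
    out["maxlag"] = maxlag
    return out

def post_with_retry(do_post, params: dict, retries: int = 3, backoff: float = 5.0):
    """Send an API write, backing off whenever the server reports lag.

    `do_post` is any callable that performs the HTTP request and returns
    the decoded JSON response; a lagged server answers with
    {"error": {"code": "maxlag", ...}} instead of performing the edit.
    """
    for attempt in range(retries):
        resp = do_post(with_maxlag(params))
        if resp.get("error", {}).get("code") != "maxlag":
            return resp          # edit went through (or failed for another reason)
        time.sleep(backoff)      # server is lagged: wait, then retry
    raise RuntimeError("server still lagged after %d attempts" % retries)
```

The point made in the discussion holds here: the throttling comes from the server's own lag signal rather than a fixed edits-per-minute cap on the bot side.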
i'm more worried about a freenode iline
[20:47:36] Hi,
[20:47:51] There seems to be an issue with https://commons.wikimedia.org/wiki/File:Harvey_Naarendorp.jpg
[20:48:10] It displays the wrong picture and has an incomplete logbook
[20:49:07] It displays the picture that was first present under this name, but which I renamed to File:Ivan_Graanoogst.jpg
[21:00:05] Looks fine to me
[21:05:47] just to double check, you cant generate a proper internal link in wikitext that includes a query string and a fragment?, [{{localurl:page_name|query_string}} link text] appears after cursory review to have no support for fragments, and additionally the single [ ] link reference generates external links which need to be wrapped with a special span to undo the external link icon
[21:06:00] wow thats worded horribly, hope it makes sense
[21:06:39] i mean i would need to externally generate the url, and then plug it into [url_goes_here link text] ?
[21:07:14] although the doc for plain links says its for making an external link look like a plain one, but what i want is a literal internal link
[21:14:50] greg-g: Hi
[21:31:31] ebernhardson: er, you could do #fragment after the }} ?
[21:31:48] ebernhardson: but anyway, sounds like you want either #wikimedia-dev or #mediawiki
[21:33:34] jeremyb: yea, i wasnt quite sure, i was thinking since its wiki text as opposed to actual internal code that -tech would be appropriate, but i will try in -dev
[21:34:02] jeremyb: i think it can attach the fragment that way, not sure why i didn't think of it though, so thanks again :)
[21:34:14] sure :-)
[21:34:41] ebernhardson: no, that doesn't make -tech any more appropriate. if you have wikitext then you could try #mediawiki or one of the project channels (#wikimedia, #wikipedia-en, etc.)
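[Editor's note: jeremyb's suggestion at 21:31:31 amounts to appending the fragment outside the parser function, since {{localurl:}} itself has no fragment parameter. A sketch in wikitext, where page_name, query_string, and the fragment are the placeholders from the question itself, and the plainlinks span is the "special span" mentioned for suppressing the external-link icon:]

```
<span class="plainlinks">[{{localurl:page_name|query_string}}#fragment link text]</span>
```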
[21:35:39] jeremyb: someday i will figure out what all these channels are for :)
[21:35:44] haha
[21:38:10] ebernhardson: most likely not
[21:41:27] why do i always get doffsdingo as captcha
[22:32:10] qgil: are you in the office or in AMS?
[22:32:31] DarTar, none. :) At home in Mountain View
[22:32:39] ah ok,
[22:32:44] why?
[22:32:55] I have two visitors from Italy who are doing research on code review systems
[22:33:14] and I thought I should have introduced you to each other
[22:33:34] they are still around for 2 days, I'll send a line of intro by mail
[22:42:18] DarTar, I might come to the office tomorrow
[22:42:31] DarTar, I was just typing the background: https://plus.google.com/100920064738447422284/posts/54ZHxQfrXcC