[00:20:33] heya, guys... i'm having a rather odd issue with mediawiki software
[00:20:35] i added $wgGroupPermissions['sysop']['upload_by_url'] = true; $wgAllowCopyUploads = true; to LocalSettings.php
[00:20:43] however, i still cannot upload via url to my wiki (starfoxwiki.info)
[00:20:54] i see no options to do so, even as a sysop
[00:41:27] Ringtailed-Fox: from http://www.mediawiki.org/wiki/Manual:$wgAllowCopyUploads : you need to set $wgCopyUploadsFromSpecialUpload to true as well
[00:41:58] then you can use Special:Upload
[00:49:25] thanks!
[01:46:51] hrmmm, no dschwen
[01:47:20] He's not on often, I don't think.
[02:32:22] just now I logged in and got redirected to a nonexistent page on foundationwiki
[02:32:29] can't reproduce the issue though..
[02:50:23] https://bugzilla.wikimedia.org/show_bug.cgi?id=52206
[05:01:34] uh, where are http://dumps.wikimedia.org/other/pagecounts-ez/monthly/
[05:02:08] not http://dumps.wikimedia.org/other/pagecounts-ez/wikistats/ ?
[05:02:44] nope
[05:02:51] first link in http://dumps.wikimedia.org/other/pagecounts-ez/
[05:03:00] aka the meat
[05:03:28] (or the proteins? let's not discriminate veggies)
[05:03:43] did you read the note in big red letters on the page?
[05:04:02] I would guess that's why those files aren't there right now
[05:04:09] but you can always email and ask if you want
[05:05:26] apergos: it's in big letters for me, but I admit I didn't read it because I have already read it so many times
[05:06:07] IIRC the files were still there and Erik recently pointed someone to them
[05:06:38] * Nemo_bis shrugs
[05:06:54] anyways you'll have to ask him about them
[05:24:14] https://commons.wikimedia.org/wiki/Template:TextDiff
[05:32:05] https://www.mediawiki.org/wiki/Extension:Scribunto/Example_modules
[05:47:32] preetty :)
[08:47:07] #wikipedia
[09:49:28] Has this been addressed https://bugzilla.wikimedia.org/show_bug.cgi?id=52808 ?
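Putting the upload-by-URL settings from the exchange above together, the LocalSettings.php fragment would look roughly like this (a sketch based on the settings named in the log and on Manual:$wgAllowCopyUploads; exact behavior can vary by MediaWiki version):

```php
// Allow fetching uploads from a URL ("copy uploads")
$wgAllowCopyUploads = true;

// Also expose the URL field on Special:Upload (the missing piece above)
$wgCopyUploadsFromSpecialUpload = true;

// Grant the upload_by_url right to sysops
$wgGroupPermissions['sysop']['upload_by_url'] = true;
```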
[09:58:09] Pavanaja: since there is a lack of comments saying it has been done, then it hasn't been done
[09:58:23] great
[09:58:45] Now what should I do?
[09:59:03] The event starts in 30 mins
[09:59:05] cc yourself to the bug or fix it, I guess
[10:00:11] Reedy: around?
[10:01:02] Ryan_Lane: *poke*?
[10:03:00] Pavanaja: you didn't think to check more than 30 mins before the event?
[10:03:21] No
[10:03:31] I thought it had been done
[13:58:42] How many simultaneous HTTP connections are allowed for downloads the dumps of Wikimedia?
[13:58:50] s/downloads/downloading
[13:59:02] 3 max?
[14:05:02] 2
[14:05:12] try a mirror site though
[14:05:26] http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors
[14:05:31] better bandwidth too
[16:17:42] apergos: hello
[16:17:55] hello
[16:18:00] no parent today, something came up
[16:18:04] ok
[16:18:19] i'm still working on the diff dumps
[16:19:54] how is that coming along? anything interesting that came up?
[16:21:43] hmm, nothing interesting; i think i'll have it in committable state on monday
[16:21:51] cool
[16:22:18] any questions or ideas you want to talk about?
[16:24:25] i can't think of anything
[16:25:05] ok, I'm good too
[16:25:16] have a great weekend
[16:25:33] you too, see you monday
[16:37:30] was that a meeting?
[16:39:49] I think so
[16:42:24] GSOC?
[16:44:24] yes
[16:51:53] sumanah: cool :)
[16:56:46] Elsie: heya! Thanks for that list of US holidays on Deployments!
[16:56:53] much appreciated.
[17:35:10] andre__: hey there, check out https://wikitech.wikimedia.org/wiki/Deployments#Week_of_August_19th
[17:35:25] Ceph, OAuth, Notifications, VE
[17:37:26] ^d: around? Nik is out today, I have a brief question about CirrusSearch behavior...
[17:41:37] <^d> chrismcmahon: Shoot
[17:43:28] ^d: assume I have a page 'foo'; 'foo' has no talk page. If I do a search for 'talk:foo', should the results indicate in some way that the 'foo' page exists, even though it has no talk page?
[17:44:21] <^d> Example url maybe?
[17:44:22] robla: https://blog.wikimedia.org/2013/08/01/future-https-wikimedia-projects/
[17:45:13] greg-g: https://meta.wikimedia.org/wiki/Tech/News/2013/33
[17:45:46] ^d: http://test2.wikipedia.org/wiki/Catapult has no talk page. if I search "talk:catapult" the result is empty.
[17:46:27] ^d: and I wonder if the result should truly be empty or somehow indicate that "Catapult" exists
[17:46:46] <^d> Ah, it's actually looking for "Talk:Catapult" in NS_MAIN, which is wrong. If you click over to advanced you see we're still searching mainspace.
[17:46:54] <^d> File a bug, this is a bit wonky.
[17:47:04] ^d: awesome, thanks, will do.
[17:47:52] ^d: btw, Nik's browser tests turned that behavior up when I aimed them at test2wiki, I am SO GLAD he did those :)
[18:02:48] * sumanah sort of wants to get into a bug-filing contest over search
[18:02:51] (with chrismcmahon)
[18:09:00] sumanah: go for it!
[18:09:33] ^d put CirrusSearch on test2 yesterday, so a new arena to find bugs in
[18:09:42] chrismcmahon: :-) I believe I filed 4 bugs yesterday against CirrusSearch, but I am pretty sure your browser tests shall find a lot of bugs as well
[18:09:54] <^d> File all the bugs!
[18:17:01] https://bugzilla.wikimedia.org/buglist.cgi?email1=sumanah%40wikimedia.org&emailreporter1=1&emailtype1=substring&list_id=226493&product=MediaWiki%20extensions&product=Wikimedia&query_format=advanced&order=changeddate%2Cassigned_to%2Ctarget_milestone%20DESC%2Cbug_id&query_based_on= huh, I've only ever filed 53 bugs? this is way too few
[18:17:27] Bug 28339: Sumana has filed too few bugs
[18:18:46] aha, that was only in certain products
[18:18:52] total: 70. Still not enough IMO
[18:19:05] Apparently I'm responsible for 1.7% of all bugs logged to our bugzilla instance
[18:19:21] Reedy: you mean you wrote the buggy code that led to 1.7% of them? :-)
[18:19:30] I filed the bugs!
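Reedy's "1.7%" figure above is simply his reported-bug count over the total bug count in the Bugzilla instance; the raw numbers (903 of 52941) appear a little further down in the log. A quick sanity check of the arithmetic:

```python
# Share of all Wikimedia Bugzilla bugs reported by one user
# (numbers taken from the log: 903 bugs out of 52941 total)
reported = 903
total = 52941
share = reported / total * 100
print(f"{share:.1f}%")  # → 1.7%
```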
[18:19:34] suuuuuure
[18:19:57] https://bugzilla.wikimedia.org/buglist.cgi?cmdtype=runnamed&namedcmd=All%20%22my%22%20bugs&list_id=226502
[18:20:01] I think that should just work for anyone
[18:20:05] (for themselves)
[18:20:13] "The search named All "my" bugs does not exist."
[18:21:04] https://bugzilla.wikimedia.org/buglist.cgi?cmdtype=dorem&remaction=run&namedcmd=All%20%22my%22%20bugs&sharer_id=6045
[18:21:11] * Reedy kicks bugzilla
[18:26:24] hahah
[18:27:12] You might need to share your search first with others under https://bugzilla.wikimedia.org/userprefs.cgi?tab=saved-searches
[18:28:03] uhm, nice list of deployments for next week. I'll wear a helmet.
[18:32:18] He left
[18:32:26] Said search is shared with 10 users
[18:33:06] I tried it but I didn't see a percentage
[18:35:18] I did it manually - 903/52941*100
[18:36:37] oh
[18:36:50] I have a lot less, not even on the map
[19:04:50] greg-g: thanks for https://meta.wikimedia.org/wiki/Https
[19:22:33] greg-g, "(ie: someone in Iceland will be able to view the Chinese Wikipedia over HTTPS while someone in China would view an HTTP version of the English Wikipedia)."
[19:22:42] someone in China would view an HTTP version of the English Wikipedia?
[19:23:03] Wouldn't people in China just get HTTP on all wikimedia sites?
[19:24:46] Krenair: I may be wrong, and csteipp can perhaps correct me, but I believe at first we are going language-by-language because we can implement that quickly, and then moving to a GeoIP model
[19:26:08] oh, I think I understand
[20:34:41] Krenair: what sumanah said
[20:34:43] :)
[20:37:17] Elsie: oh, one-liners changing to multiline diffs in mediawiki: https://wikitech.wikimedia.org/w/index.php?title=Deployments&diff=next&oldid=80708
[21:01:10] jeremyb, so, just to close the issue we talked about during the week... the problem was in my squid, some rules that allow access only to some sites
[21:01:57] jeremyb, and the problem was for the Norwegian locale, it tried to access nb.wikipedia.com, which was open, but that had a 301 to no.wikipedia.com, which wasn't
[21:02:10] jeremyb, thank you very much for all the help you gave me!
[21:03:44] facundobatista: woot :)
[21:04:13] facundobatista: can't chat right now but see you around the interwebs :)
[22:18:34] good night, folks
[23:40:00] greg-g: The diff engine gets it right most of the time. :-)
[23:40:03] Edge cases are cruel.
[23:40:53] greg-g: No problem re: holidays. :-)
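The squid pitfall facundobatista describes (allowing nb.wikipedia.com but not the no.wikipedia.com host it 301s to) generalizes: an allow-list must cover every host in a redirect chain, not just the one the client asks for. A toy sketch of that idea, with a hypothetical redirect map rather than real squid configuration:

```python
# Toy model of the squid ACL problem from the log: a request only works
# end-to-end if every host in its redirect chain is on the allow-list.
# REDIRECTS is illustrative, based on the 301 mentioned in the log.
REDIRECTS = {"nb.wikipedia.com": "no.wikipedia.com"}

def chain(host):
    """Follow known redirects and return every host touched."""
    hosts = [host]
    while hosts[-1] in REDIRECTS:
        hosts.append(REDIRECTS[hosts[-1]])
    return hosts

def fully_allowed(host, allowed):
    """True only if the whole redirect chain is on the allow-list."""
    return all(h in allowed for h in chain(host))

# Allowing only the first host is not enough once it redirects:
print(fully_allowed("nb.wikipedia.com", {"nb.wikipedia.com"}))                      # False
print(fully_allowed("nb.wikipedia.com", {"nb.wikipedia.com", "no.wikipedia.com"}))  # True
```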