[00:14:16] hrm. i am having trouble getting to wikipedia. very specifically, the IP that google's public dns servers (8.8...) return for me for en.wikipedia.org always returns an empty response.
[00:15:01] other wikipedia IPs respond, and from a different client IP (a shell box not located on my home network) that particular wikipedia server does respond
[00:15:06] so, um, help?
[00:18:35] laurence: What is the IP you get back from Google DNS?
[00:18:55] 198.35.26.96
[00:19:55] That's text-lb.ulsfo.wikimedia.org
[00:20:39] Hmm, now I need to find someone who knows things about networks
[00:20:50] laurence: What kinds of empty responses are you getting, exactly?
[00:21:01] Maybe you could show me a shell session or something that displays the problem?
[00:22:43] compare the two results in http://pastebin.com/xdDtj5CV
[00:22:52] that's probably the cleanest illustration of the problem
[00:23:21] whereabouts are you accessing from? people in the SF bay area on comcast are seeing issues today
[00:23:25] myself included
[00:23:33] comcast business, SF :)
[00:23:47] there ya go
[00:24:03] seriously, at least 3 WMF employees who were working from home today all had issues
[00:24:27] oh, i don't doubt it, but it would be interesting to know if they're all getting that same server for wikipedia
[00:24:37] In SF, almost certainly
[00:24:47] We use geographic load balancing
[00:24:57] That IP you're talking about is our datacenter in San Francisco
[00:25:00] because when i do a dns query from elsewhere and use the IP I get to talk to wikipedia via the comcast cable modem, it works
[00:25:03] The nearest other one is in Virginia
[00:25:39] So I would hope that if you are anywhere on the west coast, DNS would send you to the SF location; at least that's how it's supposed to work :)
[00:25:42] laurence: I talked with comcast customer support ~2h ago and they said they'd call me back in 24 hours :/
[00:27:01] hm.
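(Editor's note) The symptom described above, a server whose IP resolves and accepts connections but returns an empty response, can be narrowed down with a small probe. This is a sketch, not anything the participants ran; the default host, the port, and the timeout are placeholder values.

```python
import socket

def probe(ip, port=80, host="en.wikipedia.org", timeout=5.0):
    """Connect to a specific IP and send a minimal HTTP request there.

    Returns a (handshake_ok, bytes_received) pair. This separates
    "TCP handshake completes but nothing ever comes back" (the symptom
    reported in the log) from an outright connection failure.
    """
    try:
        sock = socket.create_connection((ip, port), timeout=timeout)
    except OSError:
        return (False, b"")
    try:
        request = "HEAD / HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n" % host
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            try:
                data = sock.recv(4096)
            except socket.timeout:
                break  # peer accepted the connection but never answered
            if not data:
                break  # peer closed the connection normally
            chunks.append(data)
        return (True, b"".join(chunks))
    finally:
        sock.close()
```

Running the same call against the suspect address (198.35.26.96) and a known-good one (e.g. 208.80.154.224) with identical requests points the finger at the network path rather than at DNS or the client's TLS libraries.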
[00:28:26] if you have sufficiently fine-grained control over your dns, you might want to switch it to return a different wikipedia server for comcast users for now?
[00:28:44] because eg 208.80.154.224 works fine
[00:29:55] That sounds like it would be good
[00:30:05] Let's see if I can find someone who knows things about DNS that's actually in this timezone
[00:30:08] Or one where it's daytime
[00:43:36] so who here has the Comcast issue and can debug it?
[00:43:58] laurence, legoktm, greg-g
[00:44:16] bblack is in WMF Operations
[00:44:29] can anyone confirm if it's network connectivity to us or DNS lookups that are failing? I can walk you through some steps
[00:44:32] i am happy to provide whatever might help
[00:44:36] dns is fine
[00:45:42] ok I'm still getting caught up on various backscroll
[00:45:58] no problem
[00:45:59] "says they get empty responses from text-lb.ulsfo, but not from our other locations"
[00:46:06] meaning text-lb.eqiad works?
[00:47:11] bblack: I was debugging it earlier with paravoid in -operations, but as of a few minutes ago all WMF sites I've tried are working fine
[00:47:11] see: http://pastebin.com/xdDtj5CV for a comparison of text-lb.ulsfo and what appears to be text-lb.eqiad
[00:47:57] lol and look at that, now text-lb.ulsfo is working for me again
[00:48:14] i was getting the empty response for probably 2 hours though
[00:49:43] I don't see any major dropouts in our overall traffic stats for the past few hours, I wonder if it's not even all of Comcast
[00:50:16] the particularly weird thing is that there was a host that would complete the tcp handshake on that IP
[00:50:23] and then just not respond to anything
[00:50:46] probably a transparent proxy inside comcast. does it work if you go via https?
[00:50:57] does comcast account for a significant portion of our traffic?
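(Editor's note) The fix suggested above is server-side, at the DNS layer. The client-side equivalent of "use a known-good IP" is pinning the hostname locally, either with an /etc/hosts entry or, in code, by shimming the resolver. A minimal Python sketch of the shim; the hostname/IP pair is just the example from the log, and `pin_host` is a hypothetical helper, not part of any library.

```python
import socket

_real_getaddrinfo = socket.getaddrinfo

def pin_host(hostname, ip):
    """Make lookups of `hostname` resolve to `ip` for this process.

    Equivalent in spirit to an /etc/hosts entry: anything that goes
    through socket.getaddrinfo (urllib, http.client, ...) will get the
    pinned address instead of whatever DNS returns.
    """
    def patched(host, *args, **kwargs):
        if host == hostname:
            host = ip
        return _real_getaddrinfo(host, *args, **kwargs)
    socket.getaddrinfo = patched
```

For example, `pin_host("en.wikipedia.org", "208.80.154.224")` would route this process around the ulsfo address while debugging. Note this does not change certificate validation: HTTPS still verifies against the hostname, which is exactly what you want when testing an alternate server for the same site.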
[00:51:04] well, everything works now
[00:51:09] but during the outage https did not work
[00:51:21] well they're a major ISP in the US, I'd expect to at least see a visible change in the graphs if it was all of Comcast gone
[00:51:31] I couldn't say what fraction
[00:51:51] i was originally coming to wikipedia via ssl and disabled https everywhere to disconfirm the theory that something was wrong with my ssl libraries
[00:52:04] because i'd happened to have just updated them
[00:52:18] ok, so at least initially it affected both protocols
[00:52:49] ping me if it acts up again, or if anyone in here can still reproduce the issue
[00:54:30] if it happens again, i'll ping you :) thanks for taking a look
[02:57:43] * Base has created his first ever patch on gerrit *party*
[05:45:00] Base: Yay, congrats! :-)
[05:45:51] thanks ^^
[06:09:52] which extension or whatever are wmgUseRSSExtension and wmgRSSUrlWhitelist for?
[06:10:28] ah it seems that it's RSS
[06:14:26] are variable names with wmg instead of wg some WMF-specific thing?
[06:15:54] yes
[06:16:06] usually seen in InitialiseSettings and CommonSettings
[07:00:14] it seems https://www.mediawiki.org/wiki/Extension:RSS#Example does not work
[07:00:25] I've got the same on wmuawiki with the wmua blog
[07:00:56] could it be because the link it is trying to fetch is http but the blogs are on https now?
[07:06:16] https://git.wikimedia.org/feed/mediawiki/extensions/Translate.git seems to work
[07:06:35] perhaps it's because of http instead of https ineed
[07:06:38] *indeed
[07:16:27] hm nope, it isn't that it seems
[07:16:37] http://blog.wikimedia.org/feed/ seems to work fine on wmfwiki
[07:16:49] but it doesn't work on mwwwiki
[07:16:57] have you whitelisted it?
[07:17:08] http://noc.wikimedia.org/conf/highlight.php?file=InitialiseSettings.php
[07:17:14] is it a valid rss url?
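(Editor's note) On the whitelist question: Extension:RSS only renders feeds whose URLs are listed in its whitelist setting ($wgRSSUrlWhitelist; wmgRSSUrlWhitelist in Wikimedia's per-wiki config). A simplified sketch of such a check follows; the assumption here is plain exact-string matching, and the real extension's matching rules may differ.

```python
def feed_allowed(url, whitelist):
    """Simplified feed-URL whitelist check.

    Assumption: exact string match against the whitelist entries, with
    "*" meaning "allow everything". A sketch of how such a whitelist
    can behave, not the extension's actual implementation.
    """
    if "*" in whitelist:
        return True
    return url in whitelist

# Per-wiki whitelist, mirroring the config line quoted in the log:
uawikimedia_whitelist = ["http://wikimediaukraine.wordpress.com/feed/"]
```

Under exact matching, `https://wikimediaukraine.wordpress.com/feed/` is rejected even though the `http://` form is listed, which is one way an http-to-https move on the blog side can silently break embeds, as speculated in the log.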
[07:17:15] wmgRSSUrlWhitelist
[07:17:43] p858snake: since it works on wmfwiki than it's probably is
[07:17:48] *then
[07:18:12] https://wikimediafoundation.org/w/index.php?title=Template:Blogbox&action=edit (dk if you have an acc in there though)
[07:18:36] ah anyway it isn't private, i forgot
[07:19:33] "I've got the same on wmuawiki with wmua blog" - each wiki has its own whitelist
[07:20:09] yep
[07:20:10] 'uawikimedia' => array( 'http://wikimediaukraine.wordpress.com/feed/' ),
[07:21:24] as for foundationwiki and mediawikiwiki I see absolutely no difference in the lines about http://blog.wikimedia.org/feed/ but on the former it works and on the latter it does not
[07:26:18] https://phabricator.wikimedia.org/T71783 seems to be something like that, or exactly that
[07:30:20] but it seems to be about some third party's wiki
[07:30:35] while it's wikimedia wikis i'm talking about
[09:15:30] "We migrated all of the LibreOffice bugs from freedesktop.org to our own TDF Bugzilla on Saturday, January 24th, 2015." https://bugs.documentfoundation.org
[09:15:45] More issue tracker migrations. But I'm still receiving notifications ;)
[10:44:59] Just wondering, if mw:Vector.css and User:-revi/common.css conflict, which applies?
[10:45:41] it _should_ be the most specific one ... so the user common one?
[11:32:31] wait, libreoffice is only just migrating to bugzilla? what were they using before?
[11:36:25] MC8, Bugzilla on freedesktop.org
[11:36:37] they have their own instance now
[11:36:41] oh, okay
[17:07:14] hello, is there a way to make a wiki-wide "pagenotice" for a certain namespace? or at least one that administrators would have access to?
[17:07:47] I want to propose creating a pagenotice for user pages that explains that they are not for drafting articles, which is something that happens all the time on enwiki
[17:10:45] MusikAnimal: they aren't? As far as I know, they are
[17:11:42] Anyway, if you achieve consensus a sysop only has to edit a system message, see https://en.wikipedia.org/wiki/Special:PrefixIndex/MediaWiki:Editnotice
[17:11:44] what is? we can create page notices, but not one that will automatically show up when editing any root User-namespace page
[17:18:18] MusikAnimal: inspect those examples more closely
[17:18:32] https://www.mediawiki.org/wiki/Help:Extension:ParserFunctions is your friend :)
[17:20:05] ahh yes of course
[17:20:35] should I be concerned about performance? having those parse functions be evaluated when editing any and every page?
[17:20:40] *parser
[17:21:45] That's a sailed ship
[17:21:55] Edits to the User namespace are not that many anyway
[17:23:18] so is that a yes? I don't want to noticeably slow things down to address this minor ongoing issue
[17:23:33] but I really like the parser functions approach as I could target it toward new users
[17:24:33] https://en.wikipedia.org/wiki/Wikipedia:Don%27t_worry_about_performance applies
[17:25:20] good
[18:30:38] Hi, I'm looking at [[mw:API:Block]] and it's unclear what the valid values are for the boolean parameters like "anononly", "autoblock" etc
[18:30:57] I figured out how to make them false, just supply the param with no value, but I can't figure out how to make it true
[18:31:07] I tried 'true', '1' etc
[18:34:23] MusikAnimal: providing any value should make it true. and to make it false, you have to *not* provide the parameter at all (an empty value is also true)
[18:36:40] MatmaRex: that doesn't seem to do it. First off, 'autoblock' apparently defaults to true, so I need to somehow make that false, but also I tried '1' for anononly and that flag was still not set
[18:36:56] I'm using api.postWithToken via the JavaScript API
[18:37:04] :o
[18:39:28] when I use the form (not the API) and inspect the POST request, I see the value '1' being supplied to indicate true
[18:39:37] but I tried that
[18:39:38] ugh
[18:44:18] anomie, ^
[18:46:07] "Boolean" parameters in the API work like HTML checkboxes: supplying the parameter is "true", not supplying the parameter is "false".
[18:47:21] "autoblock" cannot be defaulting to true from the API side of things.
[18:47:52] no it's not, I was mistaken; it says "autoblock disabled"
[18:47:57] let me give you my code:
[18:48:28] api.postWithToken("block", {action: 'block', reason: 'test', user: 'MusikPuppet6', anononly: '1', allowusertalk: '1'})
[18:48:52] that actually does allow user talk I guess, but it is not setting the anononly flag
[18:49:43] ^ anomie
[18:50:12] ah wait
[18:50:33] Specifying the 'anononly' flag to the API is equivalent to unchecking the "Prevent logged-in users from editing from this IP address" checkbox on Special:Block, FYI.
[18:51:17] Which in turn sets the ipb_anon_only field in the database...
[18:51:19] yeah, anononly would apply when blocking an IP
[18:51:34] I'm blocking an account, whoops
[18:53:38] Got it working. I was just using the wrong parameter altogether =P
[18:53:43] thanks for the help!
[21:32:05] Hello dearest wikitechnicians, I have a mailing list question (for lists.wikimedia.org)
[21:33:20] if I have a list that has a truly ridiculous amount of spam requests held in moderation [such that db requests through the admin interface keep timing out], is there an easy backend way for someone to take care of that?
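(Editor's note) Returning to the API:Block boolean discussion above: the rule stated in the log is that Action API "boolean" parameters behave like HTML checkboxes, so presence means true (even with an empty value) and the only way to send false is to omit the parameter entirely. A sketch of a parameter builder that encodes this; `build_api_params` is a hypothetical convenience function, not part of any MediaWiki client library.

```python
def build_api_params(action, bool_flags=None, **params):
    """Build a MediaWiki Action API parameter dict.

    Boolean flags are checkbox-style: a flag whose value is truthy is
    included (any value counts as true on the server side), and a flag
    whose value is falsy is dropped entirely, because sending it with
    an empty value would still be read as true.
    """
    out = {"action": action}
    out.update(params)
    for name, value in (bool_flags or {}).items():
        if value:
            out[name] = "1"  # any value works; omission is the only "false"
    return out
```

For example, `build_api_params("block", bool_flags={"nocreate": True, "autoblock": False}, user="Example", reason="test")` includes `nocreate` but omits `autoblock` altogether, which is what passing `autoblock: ''` in a plain JS object does not do.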
[21:33:50] If you're using Chrom(e|ium), try Firefox
[21:34:26] (https://old-bugzilla.wikimedia.org/show_bug.cgi?id=56842 )
[21:35:12] thanks, I tried both, but I'll try opening a fresh firefox and see if that helps
[21:35:35] Oh, ok. Too bad the easy thing didn't work. :)
[21:37:34] thanks though, good to know that switching to chrome won't help!
[21:39:21] brassratgirl, does it make the moderation interface impossible to use?
[21:41:13] hi krenair, no, the moderation interface comes up fine, but when I submit a request (to discard messages) it is super slow -- and there's so many of them that I can't delete them all at once without the request timing out
[21:50:54] I'll keep trying!
[21:53:07] brassratgirl: can the task be split in parts? If so, maybe you can use an extension like CheckFox to select, say, half of the messages in the queue and reject them
[21:53:14] If that fails, 1/4, etc.
[21:53:44] Of course if you reach 1/256 before success then it's not helpful ^^
[22:02:16] Thanks! It worked this time
[22:02:26] I tried some batches and it went through. success!
[22:02:33] now I just have to be a better moderator :)
[22:02:35] thanks all
[22:14:05] \o/
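(Editor's note) The halving strategy suggested for the moderation queue generalizes: when a bulk operation times out, retry with smaller batches until one succeeds. A sketch under stated assumptions; `discard_batch` is a caller-supplied stand-in for whatever actually submits the Mailman moderation form, and is assumed to raise on failure.

```python
def discard_all(message_ids, discard_batch, min_batch=1):
    """Discard messages in batches, halving the batch size whenever a
    batch fails (e.g. the request times out), per the log's suggestion.

    `discard_batch` processes one batch of ids and raises on failure
    (hypothetical callable standing in for the admin-form submission).
    """
    pending = list(message_ids)
    batch = len(pending) or 1
    while pending:
        try:
            discard_batch(pending[:batch])
            pending = pending[batch:]
        except Exception:
            if batch <= min_batch:
                raise  # even the smallest batch fails; give up
            batch = max(min_batch, batch // 2)
```

This keeps working at whatever batch size first succeeds rather than growing again, which matches the conservative "1/2, then 1/4, etc." approach from the log.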