[02:27:21] If I am getting a "Gateway timeout" message from Commons when trying to display a Special:WhatLinksHere filtered by a namespace, is that something that should be retried at a later time, or something that should be bugzilla'd?
[02:27:49] the template at which I am looking will probably have lots of transclusions
[02:43:02] sDrewth: what's the specific url?
[03:00:31] greg-g: https://commons.wikimedia.org/w/index.php?title=Special%3AWhatLinksHere&target=Template%3ADescription+missing&namespace=10
[03:00:48] apologies, was distracted with a talk page
[03:01:28] hmm, now it comes up
[03:01:30] wfm?
[03:01:31] yeah
[03:01:55] see! always go to the tech guys to get something to start working
[03:02:09] never sit on it and retry for 15 min
[03:02:59] exactly :)
[03:03:24] * sDrewth mumbles "bloody computers"
[06:29:03] hmm
[06:29:08] are valid logins now throttled?
[06:29:52] https://travis-ci.org/wikimedia/pywikibot-core/jobs/28473474 <-- I've never seen that failure before, and the password is correct given the earlier successful logins
[06:34:13] [[Tech]]; Verdy p; /* Urgent: IP cap lifted for edit-a-thon */; https://meta.wikimedia.org/w/index.php?diff=9008932&oldid=8875705&rcid=5385196
[06:34:52] [[Tech]]; Verdy p; /* Urgent: IP cap lifted for edit-a-thon */; https://meta.wikimedia.org/w/index.php?diff=9008938&oldid=9008932&rcid=5385197
[06:35:55] such urgent
[06:52:17] legoktm: were the stats updated for real? first {{done}} at https://www.mediawiki.org/wiki/SUL_finalisation#Backlog
[06:53:12] If we're lucky, many accounts have been merged in the meantime upon login by the ad hoc login check
[06:53:49] I don't know if it's been run recently
[06:53:55] Let me see what that script does
[06:55:01] it joins the local and centralauth tables and does a bunch of selects :) https://www.mediawiki.org/wiki/Admin_tools_development/SUL_Audit#HW_Needs
[06:56:07] hmm
[06:56:13] "Option 1 - Replicate all clusters to a single slave" <-- that actually happened
[06:56:36] Sure, but did it survive? :)
[06:56:44] the pass0 script I'm looking at does writes too
[06:56:49] yeah, I used it 2 weeks ago :P
[06:56:58] it's an analytics slave that has every db on one host
[06:57:19] https://github.com/wikimedia/mediawiki-extensions-CentralAuth/blob/master/maintenance/migratePass0.php
[06:57:43] Ah, that one. Sure.
[06:58:35] Isn't the stats script supposed to be https://github.com/wikimedia/mediawiki-extensions-CentralAuth/blob/master/maintenance/checkLocalNames.php
[06:59:15] no, that just deletes invalid account rows... which I actually need for a different bug
[06:59:37] I don't see anything in the AccountAudit extension either
[07:04:07] wat
[07:04:10] this script doesn't even work
[07:04:25] * legoktm grumbles
[07:06:00] IRC logs were fun, but I only found that global accounts were created at a rate of 150 per second
[07:06:29] Heh, at the time this looked urgent https://gerrit.wikimedia.org/r/#/c/63731/
[07:06:57] (Good thing we got it done, one year to translate. ;) )
[07:07:30] https://gerrit.wikimedia.org/r/#/c/142201/ -.-
[07:07:53] legoktm: did you check officewiki for notes? some were there
[07:08:02] nope, let me do that
[07:08:26] heh, assignments
[07:11:11] first,
[07:11:18] the search index on officewiki seems to be out of date
[07:11:29] years out of date?
[07:11:47] * Nemo_bis blames ^demon|away
[07:12:14] Just use Special:Contributions/Pgehres or whatever the username is there :)
[07:12:20] well, I found a page that was touched on May 6, but search claims it was '07
[07:12:24] dunno
[07:12:38] secondly, searching "SUL" on officewiki brings up some really old stuff :P
[07:12:58] maybe private wikis are deliberately not indexed?
[07:13:08] legoktm: the old wikitech.wikimedia.org wasn't ^
[07:13:16] no, it is
[07:14:26] yeah, I don't see anything useful on officewiki
[07:16:13] the old wikitech was on Linode and didn't use Lucene IIRC
[07:17:00] SAL knows nothing about any of those stats queries; at the time, terbium was terra nullius / wild west
[07:17:56] maybe James_F|Away knows?
[09:01:14] https://wikimediafoundation.org/wiki/Home is taking ages to load
[09:01:51] Nemo_bis: loaded pretty quickly here
[09:03:16] hmm, 20 ms ping from Milan to bits? liar network
[09:05:20] esams bits?
[09:30:17] is it possible that https://en.wikipedia.org/ is unreachable over v6?
[09:34:15] yes; traceroute welcome
[09:38:16] http://paste.org/73312
[09:47:22] now it's also HTTP, and the servers de.wikipedia.org and meta.wikimedia.org
[09:53:55] satanist: yes, from Milan it's the same, problems with init7
[09:54:09] they've been very unreliable for months now
[09:55:17] I have problems only today
[09:57:12] now it's slightly better than 30 min ago for me http://paste.debian.net/106815/
[09:57:18] usually it lasts a few hours
[09:57:37] no idea if WMF / godog can do anything about it
[09:59:21] Nemo_bis: could be, I'm not overly into WMF network(ing) yet
[10:02:59] now it works again, they changed something in the routing
[10:10:08] lucky you, I'm still going through the faulty one
[10:12:04] it changed again, so sometimes it works, sometimes not
[10:40:15] [[Tech]]; 217.179.198.213; /* Mythology */ new section; https://meta.wikimedia.org/w/index.php?diff=9010575&oldid=9008938&rcid=5385354
[11:00:08] [[Tech]]; Jianhui67; Undo revision 9010575 by [[Special:Contributions/217.179.198.213|217.179.198.213]] ([[User talk:217.179.198.213|talk]]); https://meta.wikimedia.org/w/index.php?diff=9010727&oldid=9010575&rcid=5385383
[11:33:31] andre__: Can you help me with a bug on sh wikipedia? My watchlist is not working?
[11:34:15] Kolega2357: Please do ping me. I do not want to talk to you. Thanks.
[11:35:17] andre__: maybe you mean "Do not ping me" :)
[11:35:35] Jurgen: heh, you are right. Thanks :)
[11:41:25] andre__: And?
[13:13:11] <^demon|away> Nemo_bis, legoktm: private wikis are indexed; if one isn't, that's a bug.
[13:13:23] <^demon|away> wikitechwiki is routinely running old code though, different problem.
[13:13:30] * ^demon|away runs away, finds breakfast.
[13:13:41] yep
[13:30:10] twkozlowski: was going to ask for help with the Polish translation, but MatmaRex took care of it :) thanks!
[13:30:36] Sure :)
[13:33:53] twkozlowski: I was debugging Kiwix this morning, we should have told people on Twitter to use it :)
[13:35:54] I wonder if Swisscom/Fastweb gets CH federal incentives to use such a stupid provider http://www.init7.net/en/about/organigramm
[14:29:19] downtime for download.wikimedia.org = dumps.wikimedia.org is continuing past the window, more updates when we have them
[14:29:26] apologies for the inconvenience
[15:31:14] thanks for the info
[15:33:59] for folks still following along, we're still trying to beat the dump server into submission
[15:34:12] telling it that a 10Gb NIC is a good thing....
[15:40:16] the server wants to serve less
[15:47:38] it already serves less
[17:02:49] gerrit is dog-slow today
[17:02:56] git, rather
[17:11:20] when is it not?
[17:28:18] we're still kicking the dumps web server around...
[17:28:30] it seems to be kicking back a bit, which is progress
[17:28:53] most likely network issues are at the root of the problem, to be expected since it was moved to a new rack with a new NIC
[18:10:14] dumps.wikimedia.org and download.wikimedia.org are back in service, thanks for your patience
[18:17:56] thanks apergos
[18:18:49] in about 1 more minute dumps and monitoring will be back to normal
[19:07:44] I just managed to truly botch something
[19:07:47] not entirely sure what I did lol
[19:09:21] nevermind, fixed
[19:51:02] csteipp: ping
[19:52:21] aude: Yeah, what's up?
[19:53:12] for our property suggester, you recommended we run the scripts on labs
[19:53:38] what is the recommended way to get the CSV into production after that?
[19:55:46] aude: Anyone with shell can upload it and run the import
[19:56:51] I'm assuming it's small enough that someone can download/upload it?
[19:57:31] tis
[19:57:34] it is
[19:57:40] I put it in https://github.com/filbertkm/wbs_propertypairs
[20:02:42] aude: How do we upload it? Do we have a maint. script?
[20:02:50] So that it works like shell file uploads
[20:03:41] where do we upload it to?
[20:04:01] you can sftp/scp it to bast1001
[20:04:03] we have a maintenance script that will import it
[20:04:06] aude: How do we push it
[20:04:07] ah
[20:04:08] ok
[20:04:10] then sftp/scp it to tin
[20:04:20] I usually just upload stuff to bast1001 and then rsync it to terbium or tin
[20:04:36] usually terbium, as that's beefier
[20:05:07] doing
[20:05:58] leaving for food, guess you'll get that done
[20:06:36] yeah
[22:42:48] * legoktm eyes hoo|away
[22:47:05] hey legoktm :)
[22:47:15] \o/
[22:47:21] how's the refactoring coming? :D
[22:48:45] Hope I can take it to Gerrit tomorrow or on Saturday at least
[22:49:11] ok, sounds good
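
For anyone following the property suggester exchange above (19:53-20:06), here is a minimal shell sketch of the upload path it describes: local machine -> bast1001 -> terbium/tin, then a maintenance import. The host names bast1001, terbium, and tin come from the log itself; the fully qualified domain names, the CSV file name, the UpdateTable.php script name, and its options are illustrative assumptions rather than details confirmed in the conversation.

# On your own machine: copy the generated CSV to the bastion host
# (file name assumed from the wbs_propertypairs repository).
scp wbs_propertypairs.csv you@bast1001.wikimedia.org:~/

# On bast1001: rsync the file onward to a maintenance host (terbium or tin).
rsync -av ~/wbs_propertypairs.csv terbium.eqiad.wmnet:~/

# On terbium: run the extension's import script against the target wiki.
# "UpdateTable.php" and "--file" are assumed names; check the PropertySuggester
# extension for the actual maintenance script.
mwscript extensions/PropertySuggester/maintenance/UpdateTable.php \
    --wiki=wikidatawiki --file "$HOME/wbs_propertypairs.csv"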