[00:20:11] gn8 folks
[00:23:49] !log updated payments cluster to 4ccd348df7661
[00:23:55] Logged the message, Master
[00:26:47] https://en.wikipedia.org/wiki/Talk:MediaWiki#Quotes_about_MediaWiki
[00:30:01] Point them to https://bugzilla.wikimedia.org/quips.cgi?action=show, Nemo_bis :P
[00:30:44] Krenair: yes, what I first thought
[00:30:51] but Wikiquote needs good sources
[00:46:29] !log updated payments cluster to ab3d3615bf1ddd0c
[00:46:35] Logged the message, Master
[00:47:59] Brooke, managed to get it working on my localhost, B actually breaks things completely
[00:48:09] Fun.
[00:49:31] I get sent to https://en.wikipedia.org/w%252findex%252ephp?title=AT%2526T
[00:49:42] for localhost/wikipedia/en/w/index.php?title=AT%26T
[00:49:51] Right. The B flag is mostly about ampersand support, as I recall.
[01:13:57] TimStarling: any chance you could sanity-check https://gerrit.wikimedia.org/r/#/c/33157/5/EventLogging.module.php?
[01:18:33] I'm in a meeting
[01:40:08] Krenair: NE flag, apparently. :-)
[01:40:19] yeah
[01:40:33] Apparently it's broken for the commons.wikipedia -> commons.wikimedia redirect as well - https://bugzilla.wikimedia.org/show_bug.cgi?id=20409
[01:41:51] I saw that too
[01:41:55] but it's a much larger change
[01:42:12] we basically should change dozens of regexps
[01:42:38] redirects.conf in apache-config
[01:43:12] yes
[01:43:23] the whole file could use some cleaning up I think
[01:43:36] but I haven't looked it up that closely
[01:43:49] alex@alex:~/Git/Wikimedia/Operations/apache-config$ grep "RewriteRule" redirects.conf | wc -l
[01:43:49] 103
[01:43:51] ugh
[01:43:53] yep
[01:44:09] guess how many of them have NE?
[01:44:12] :)
[01:44:16] not all of them need it though.
[01:44:39] 1.
[01:57:09] Krenair: grep -c will give you a count, BTW.
[01:58:12] Brooke, oh yeah, ty
[02:20:33] !log updated payments cluster to 46bcee539b93f7b
[02:20:41] Logged the message, Master
[04:20:40] !log updated payments cluster to a24d39a120f61cb
[04:20:47] Logged the message, Master
[07:25:56] !log Job queue still suspect since the wmf4 deploy, job runners activity halved; see bug 41656
[07:26:03] Logged the message, Master
[09:34:52] hi, Autocomplete on wikipedia search box seems to do a very good job on suggesting "more popular" items first. But I didn't find anywhere how it does it. I mean there is no "rank" or similar thing for an article
[09:57:37] heh, 8 minutes of waiting for a lucene question
[10:00:15] Nemo_bis: as lou reed sang, "first thing you learn is that you always gotta wait."
[10:00:28] :D
[12:40:54] search suggest index is broken / not updated?
[12:49:39] ohughh not peter looked at this.. yesterday maybe?
[12:49:48] and had indexing working he thought
[12:50:16] I believe him, he's the only one who knows how that stuff works any more
[12:50:26] but could be that something broke in the meantime
[12:50:29] what project?
[14:35:46] Hi! Why that: http://uk.wikipedia.org/w/api.php?maxlag=40&format=xml&action=query&list=allpages&aplimit=max&apfrom=&apnamespace=0&apcontinue=Dow_Jones_&_Company cause an error?
[14:36:45] Request: GET http://uk.wikipedia.org/w/api.php?maxlag=40&format=xml&action=query&list=allpages&aplimit=max&apfrom=&apnamespace=0&apcontinue=Dow_Jones_&_Company, from 90.183.23.27 via cp1006.eqiad.wmnet (squid/2.7.STABLE9) to ()
[14:36:49] hm
[14:37:10] happens here as well
[14:37:29] mutante can you check it?
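
Aside: the AT%2526T address and the NE flag mentioned above are both about escaping. "AT&T" percent-encodes once to "AT%26T"; if a rewrite then escapes its own output, the "%" itself becomes "%25", giving "AT%2526T". That re-escaping is roughly what the NE ("noescape") flag suppresses. A minimal Python sketch of the second round of escaping, as an illustration only (this is not what mod_rewrite literally runs):

    from urllib.parse import quote

    once = quote("AT&T", safe="")   # 'AT%26T'   : the title encoded correctly
    twice = quote(once, safe="")    # 'AT%2526T' : the '%' is re-escaped to '%25'
    print(once, twice)
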
[14:39:00] mutante sleeps
[14:40:06] it works for me as a logged in user
[14:42:25] I'm logged in and my bot too
[14:43:11] I get results from Dow to Ecbal
[14:43:51] How about from Dow_Jones_&_Company ?
[14:44:10]
[14:44:13] that's the first entry
[14:44:20] it seems that problem is & char
[14:44:27] in request
[14:45:11] what browser?
[14:45:17] I have some ff version or other on linux
[14:45:24] and as I said it does return results for me
[14:46:04] ff, opera, Wiki.java
[14:46:22] ff 17.0
[14:46:45] opera 12.10
[14:47:14] I have ff 16.0.1
[14:47:21] dunno
[14:47:38] I mean I am sure it's the & but I don't know why you get problems and I get results
[14:48:10] http://uk.wikipedia.org/w/api.php?maxlag=40&format=xml&action=query&list=allpages&aplimit=max&apfrom=&apnamespace=0&apcontinue=Dow_Jones_&_Company that's the url in the address bar
[14:50:48] http://uk.wikipedia.org/w/api.php?maxlag=40&format=xml&action=query&list=allpages&aplimit=max&apfrom=&apnamespace=0&apcontinue=Dow_Jones_%26_Company
[14:51:14] it seems solving
[14:52:13] but question why mediawiki dont like & char is still problem
[14:52:46] well it's a special character in urls, right...
[15:07:07] it's a bug
[15:07:19] I guess !bz is a best choice here Base
[15:07:26] problem of #mediawiki
[15:07:42] it should be able to handle special chars gracefuly
[15:09:11] Well, passing & directly is going to change how it appears to mw/apache
[15:09:15] so it's going to be there
[15:09:20] encoding it is the right solution
[15:11:30] sb.Append(HttpUtility.UrlEncode(s));
[15:11:36] AWB does that for the = foo part
[15:55:15] Reedy I think mediawiki should be able to produce some meaningfull error in case you pass it invalid URL, instead of crashing
[16:02:23] petan: there's some history you've missed apparently
[16:02:45] huh?
[16:03:37] ?
[16:03:42] no idea what you mean
[16:04:20] well i'm digging...
[16:07:34] hmm
[16:07:44] I sense a dev in search of answers
[16:08:20] 42
[16:10:14] lol now I am completely lost
[16:13:45] hrmmmm, i can't really find anything relevant on wikitech-l
[16:14:01] doesn't help that I don't have any idea what year this was
[16:14:07] anyway...
[16:15:05] a raw & in a URL sent to a server (in the first header of a request) is a very very strong indicator that the client is doing something wrong
[16:15:27] in fact some clients are so wrong that they get in an infinite loop
[16:16:06] they request XYZ& and then XYZ&amp; and then XYZ&amp;amp;
[16:16:56] iirc there was once a bug in google toolbar for some web browser where the toolbar itself was doing that infinite loop
[16:17:06] and the toolbar was installed in *lots* of places
[16:17:26] so it was effectively a DDoS
[16:17:43] Reedy actually its http://wikimedia.7.n6.nabble.com/wikitech-l-Renaming-wikis-td4987899.html
[16:18:01] anyway, so now & is blocked both as a precaution and to stop people that are misusing tools earlier than later
[16:18:25] can we improve the message? maybe. but it's not a bug that you get an error there
[16:19:04] if Base wants to provide sufficient (complete!) STR for how to get the error then we can further judge who's at fault in this case
[16:19:50] https://developer.mozilla.org/en-US/docs/Bug_writing_guidelines#Writing_precise_steps_to_reproduce
[17:38:09] can please someone make sure the special pages are back updated at least twice a week: https://bugzilla.wikimedia.org/show_bug.cgi?id=42152
[18:05:10] Needs someone from ops to dig up the cron logs and find out why they're apparently failing
[18:13:07] mutante: About?
[18:13:39] what logs now?
[18:14:38] we have some cronjobs running (on hume?) to do maintenance report updatesw
[18:14:51] they seem to be failing, so having an idea would help to trying to fix them..
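
Back to the api.php error above: the URL that worked simply percent-encodes the ampersand inside apcontinue, which is what "encoding it is the right solution" and the AWB HttpUtility.UrlEncode line are getting at. A minimal sketch of building the query string with a standard encoder instead of string concatenation (Python purely as an illustration; the bot in question was Java):

    from urllib.parse import urlencode

    params = {
        "action": "query",
        "format": "xml",
        "list": "allpages",
        "aplimit": "max",
        "apnamespace": 0,
        "apcontinue": "Dow_Jones_&_Company",  # raw value; urlencode escapes '&' to %26
    }
    url = "http://uk.wikipedia.org/w/api.php?" + urlencode(params)
    # ...&apcontinue=Dow_Jones_%26_Company, the form that worked above
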
[18:15:29] I'll be back in IRC in 10-15 mins
[20:00:55] http://wikitech.wikimedia.org/ tells me "You do not have permission to create this user account, for the following reason: The action you have requested is limited to users in the group: Administrators."
[20:01:21] Would somebody provide me access? I'd like to edit https://wikitech.wikimedia.org/view/Bugzilla.wikimedia.org
[20:03:08] andre__: you want to be an account creator? or you don't even have your own account yet?
[20:03:19] jeremyb, I don't even have my own account yet
[20:05:47] hah, http://wikitech.wikimedia.org/history/User:Tim
[20:05:50] hashar
[20:06:16] hmm ?
[20:06:23] hashar: click! ;)
[20:06:41] hashar: also, maybe you want to make an account for andre__ ;-)
[20:06:42] I am not giving access on wikitech sorry
[20:06:47] no idea what the policy is there
[20:06:57] it's open enough, i have an account
[20:06:59] though I would +1 both jeremyb or andre :-D
[20:07:15] hashar: but did you click the history link?
[20:07:28] oh
[20:07:29] old stuff :D
[20:08:14] I want to drop stuff like instructions how to merge Bugzilla user accounts on http://wikitech.wikimedia.org/view/Bugzilla.wikimedia.org
[20:08:28] hashar: you know andre__ is an employee, right?
[20:08:32] if it's too much hassle I can also provide you the content, you can edit, and you receive a beer next time I'm around
[20:08:44] (which is probably the all staff meeting)
[20:08:53] k k
[20:08:56] when's that, anyway?
[20:08:57] give me the usernames so :-D
[20:09:32] ori-l, isn't that biannual? Last was in September so I guess March.
[20:10:02] andre__: ah.
[20:10:10] well, as I cannot register not sure which username. I'd like to be "Malyacko" though, as everywhere else...
[20:10:40] andre__: that's available
[20:10:48] I guessed so. :P
[20:10:48] http://wikitech.wikimedia.org/index.php?title=Special%3AListUsers&username=Malyacko&group=&limit=1
[20:11:05] andre got a cluster access ?
[20:11:23] ahh
[20:11:30] i will fill in your mail @wikimedia
[20:11:35] what the hell is that? ;)
[20:11:40] hashar, aklapper@wikimedia.org
[20:12:09] andre__: is what?
[20:12:20] "cluster access"
[20:12:22] andre__: A randomly generated password for Malyacko has been sent to aklapper@wikimedia.org.
[20:12:24] in this context.
[20:12:28] hashar, yay, thanks so much!
[20:13:13] andre__: i suppose it means the ability to log in to the interactive shell of any WMF machine besides labs
[20:13:16] andre__: and you are an editor
[20:13:24] now I am back to some python / Jenkins hacking
[20:13:35] hashar, thanks again
[20:13:38] hashar: ooh, fun ;)
[20:13:46] jeremyb, no I don't have shell access. and I don't really need it right now
[20:13:46] hashar: what sort of python stuff? don't hoard it all for yourself! :P
[20:13:57] andre__: you have labs i assume
[20:14:15] ori-l: how about porting MW to python?
[20:14:22] jeremyb: sure, give me an hour?
[20:14:25] jeremyb, yes, sure
[20:14:38] if labs == cluster, okay :)
[20:14:53] ori-l: ok, talk to you then
[20:14:58] 15 20:13:12 < jeremyb> andre__: i suppose it means the ability to log in to the interactive shell of any WMF machine besides labs
[20:15:07] andre__: so !=
[20:15:18] sigh. so far for my reading skills.
[20:15:21] okay :)
[20:25:20] AaronSchulz: do you think you might have time to review a patchset involving memcached use?
[20:25:47] (in Extension:EventLogging. totally out of the blue.)
[20:26:02] * AaronSchulz can't guarantee anything
[20:26:17] hmm, I and others seem to have lost most css styling on dawiki
[20:26:33] AaronSchulz: i'll add you just in case. :)
[20:26:43] The following data is cached, and was last updated 14:25, 1 November 2012.
[20:26:43] when cached spacial pages @ ukwiki are going to upgrade?
[20:27:04] *special
[20:28:28] Base: see above (maybe an hour ago)
[20:28:56] Kaare: are you getting weird status codes for resource loader or something?
[20:29:22] Kaare: works for me as anon
[20:30:50] just tried in a different browser; same problem when logged in
[20:31:38] too many bugs last time…
[20:32:43] Kaare: what does your network inspector say about resources being fetched?
[20:33:01] e.g. ctrl-shift-k in Firefox/Iceweasel
[20:33:10] i think
[20:33:59] Bug with undownloaded css I can see often after wikimedia-errors
[20:34:28] now there is a problem with not downloaded completely js
[20:34:34] some times
[20:34:41] jeremyb: the css just returned in iceweasel, just as I activated firebug...
[20:51:34] heh, did somebody break wikipedia? i'm having CSS loading issues on pl.wiki
[20:52:00] as in, it doesn't load.
[20:52:18] e.g. on my watchlist, https://pl.wikipedia.org/wiki/Specjalna:Obserwowane
[20:52:24] looks broke
[20:52:41] exception 'DBConnectionError' with message 'DB connection error: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2) (localhost)' in /usr/local/apache/common-local/php-1.21wmf3/includes/db/Database.php:797
[20:54:17] mw1018 looks unhappy
[20:54:30] binasher: About?
[20:54:41] Reedy: hmm?
[20:54:48] Error connecting to 10.0.6.70: Access denied for user 'wikiuser'@'10.64.0.48' (using password: YES)
[20:55:02] Seeing quite a lot from that apache
[20:55:09] wait, what?
[20:55:13] Thu Nov 15 20:54:33 UTC 2012 mw1018 enwiki Error connecting to 10.0.6.42: Access denied for user 'wikiuser'@'10.64.0.48' (using password: YES)
[20:55:13] Thu Nov 15 20:54:33 UTC 2012 mw1018 enwiki Error connecting to 10.0.6.69: Access denied for user 'wikiuser'@'10.64.0.48' (using password: YES)
[20:55:13] Thu Nov 15 20:54:33 UTC 2012 mw1018 enwiki Error connecting to 10.0.6.70: Access denied for user 'wikiuser'@'10.64.0.48' (using password: YES)
[20:55:21] dberror log on fluorine
[20:55:21] wtf is an eqiad apache doing
[20:55:27] heh
[20:55:33] notpeter: ^^
[20:56:04] your guess is as good as mine
[20:56:22] I spun up a couple
[20:56:24] ran puppet
[20:56:31] they're probably pretty broken
[20:56:35] feel free to turn them off
[20:56:52] I onimaged the so that you/whoever could test on them
[20:57:21] they're probably just getting requests via nagios
[20:57:37] anyway, not a problem
[20:57:48] It's only that one apache
[20:58:35] its being monitored by nagios which gets http://en.wikipedia.org/
[20:58:48] and I'm sure that that's failing hard
[20:58:56] as there are sitll things not set up in eqiad, I'm sure
[20:59:43] like, say, grants on the dbs ;)
[20:59:51] those grants will never be granted
[21:02:19] if only we had a dba....
[21:02:46] meh, you only have a 'few' mysql servers :P
[21:03:24] so no dba needed?
[21:20:18] is it all really good right now?
[21:21:22] got this instead of modules=site&only=styles: http://p.defau.lt/?76exe7M47jOlQkX1TrN4CQ
[21:21:48] wizardist: when?
[21:21:52] just now
[21:22:50] and this one for scripts: http://p.defau.lt/?VFEObR335pjF0fJzPr0yPw
[21:22:58] binasher: ^
[21:23:43] why would anything try connecting to localhost though
[21:24:01] getting this in be_x_oldwiki, but I can see the same symptoms at least in enwiki and plwiki, so it doesn't seem a project-related problem to me
[21:24:03] probably it has nothing configured for the cluster
[21:24:12] wizardist: we're looking at the same issue in #wikimedia-operations
[21:25:13] Lol, who broke the stylesheets? ;-)
[21:26:27] *flood* it's Icelandic Language Day today, so maybe they've rewritten all variable names into Icelandic? :)
[21:44:36] preilly: you're right, it's a bit silly to set up memcached and not use it -- good catch
[21:44:44] i'll fix (unless you beat me to it)
[21:44:53] ori-l: I didn't beat you to it
[21:45:28] should be solved now.
[22:27:08] Anyone who can cherry pick this and deploy it to the live sites? https://gerrit.wikimedia.org/r/33659
[22:27:19] (Exception on Incubator Special:Preferences )
[22:30:01] mm
[22:31:11] Reedy: ?
[22:51:34] preilly: https://github.com/wikimedia/wmf-vagrant/pull/10/
[22:53:15] ori-l: merged
[22:53:22] preilly: sweet, thanks :)
[22:53:29] ori-l: thank-you
[22:56:36] how to export data wikitravel? why some languages have dumps and others not?
[22:56:37] hi guys
[22:57:26] What do you mean?
[22:58:16] Do some of the wikivoyage projects not have xml dumps yet?
[23:00:14] Reedy: yes, pt for exemple dont have dump from wikitravel
[23:00:49] That's not really anything to do with us
[23:00:59] We can only import the dumps we get given
[23:01:02] Try #wikivoyage
[23:01:14] oh, that exists?
[23:01:17] (the channel)
[23:03:13] Reedy: What do you mean with "dumps we get given"?
[23:03:20] From the original sites
[23:04:03] raylton: you are aware that there are 2 separate orgs running these sites?
[23:04:11] wikivoyage != wikitravel?
[23:04:22] wikivoyage is a fork of wikitravel
[23:04:58] as far as I know, all wikitravel content was copied over
[23:05:12] but I could be wrong. Its apparentl fairly controversial
[23:06:00] gn8 folks
[23:07:13] jeremyb , Soapy: so ... wikitravel dumps are not readily available?
[23:07:34] I dont know what that even means
[23:07:39] "dumps"? Sounds messy
[23:07:44] Im not a tech person
[23:08:03] but yes, wikitravel is not a Wikimedia project, and never was
[23:08:13] Wikivoyage is a fork of Wikitravel that is a Wikimedia project
[23:08:19] wikitravel does not have any easy process of retrieving data from their site and does not provide database dumps
[23:08:41] Soapy: dumps are just computer readable copies of databases that are in some centralized location
[23:11:37] LeslieCarr: as was done in en.wikivoyaje? seems to be a better way than in pt.wikivoyage ... because we(in pt) have to create everything from scratch ... or copy_paste
[23:12:14] pt doesn't have a dump from wikitravel or from wikivoyage ?
[23:12:35] if there's a wikivoyage dump missing, we can search and try and get their database
[23:12:44] if it's wikitravel, we do not have access to their databases
[23:14:22] LeslieCarr: both
[23:15:03] LeslieCarr: how you got the dumps in en.wikivoyaje from wikitravel?
[23:15:29] i do not know how the wikivoyage team did that, that was before they migrated to wikimedia
[23:15:39] [18:12] siteinfo field 'phpversion' changed value from '5.3.2-2wm1' to '5.3.10-1ubuntu3.4+wmf1'
[23:15:42] [18:12] siteinfo field 'git-hash' changed value from '16b8f19951330a6c433c0bd2a9bea50b66fc1833' to 'ff84fa1a0dd8a907956ac8dcf4f1fc5c2977d5b7'
[23:15:46] (For en.wiktionary.org.)
[23:16:26] where's hippiebot live?
[23:16:30] raylton: i don't see a portugese version in the old wikivoyage - http://wikivoyage-old.org/ (is a temporary mirror of the old project)
[23:17:00] jeremyb: In #wiktionary at lesat.
[23:17:19] raylton: so i am assuming it's only wikitravel, which as i said before, does not provide us with any access to their databases and are a separate commercial company from the wikivoyage team
[23:20:17] raylton: if you can convince Internet Brands (the company who owns wikitravel) to give us database dumps, we could copy the information
[23:22:01] LeslieCarr: so.. we can use the content wikitravel manually, but not readily ?
[23:22:51] raylton: i'm not a lawyer, but I believe it is licensed under cc-by-sa, so i believe we can legally use it (disclaimer - really not a lawyer, not legal advice)
[23:25:20] LeslieCarr: oh god, this will give me a lot of work
[23:25:32] raylton: sorry :-/
[23:27:39] repeat after me: IANAL + TINLA
[23:28:36] LeslieCarr: I found an old dump and unofficially here:http://code.google.com/p/wikiteam/downloads/detail?name=wikitravelorg_wiki_pt-20110722-current.xml.7z&can=2&q=wikitravel. can it be exported?
[23:29:48] raylton: LeslieCarr's really not the right person to discuss this with
[23:30:46] jeremyb: who is?
[23:35:57] raylton: you could start with one of these people: https://meta.wikimedia.org/wiki/Wikivoyage/Technical_coordination
[23:38:18] jeremyb: ok, soo thanks!
[23:41:32] so thanks... Reedy, Soapy, jeremyb and LeslieCarr :)
[23:41:59] glad we could help
[23:50:26] raylton: don't use that dump
[23:51:08] besides, MF-Warburg already replied to the question on incubator and wikivoyage-l
[23:52:06] Nemo_bis: this dump: http://code.google.com/p/wikiteam/downloads/detail?name=wikitravelorg_wiki_pt-20110722-current.xml.7z&can=2&q=wikitravel ?
[23:53:14] raylton: yes, it has some problems iirc
[23:53:27] hmm we should note it somewhere on https://archive.org/details/wiki-wikitravel.org
[23:56:48] Nemo_bis: there is a dump that I can use, or any simple way of doing?
[23:58:15] raylton: no
[23:58:29] we're waiting for the user who has the dumps to share them
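
For anyone following the dump discussion: a MediaWiki XML export ("dump") is just a large XML file of <page> elements. A minimal Python sketch of peeking at one, assuming the wikiteam archive linked above has already been unpacked from .7z to plain XML (the file name below is taken from that link and may differ):

    import xml.etree.ElementTree as ET

    def page_titles(path):
        # Stream the export and yield each page <title>, whatever the
        # export schema version (the XML namespace prefix is stripped).
        for _, elem in ET.iterparse(path):
            tag = elem.tag.rsplit("}", 1)[-1]
            if tag == "title":
                yield elem.text
            elif tag == "page":
                elem.clear()  # free each finished <page> subtree to keep memory bounded

    for i, title in enumerate(page_titles("wikitravelorg_wiki_pt-20110722-current.xml")):
        print(title)
        if i >= 9:  # just show the first ten titles
            break
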