[00:58:39] [[Tech]]; MZMcBride; split this out; https://meta.wikimedia.org/w/index.php?diff=5363319&oldid=5362881&rcid=4038018
[00:59:24] [[Tech]]; MZMcBride; /* Move sidebar links to bottom (similar to MySkin) */ better; https://meta.wikimedia.org/w/index.php?diff=5363320&oldid=5363319&rcid=4038019
[01:10:23] [[Tech]]; MZMcBride; /* Links underlined */ +reply; https://meta.wikimedia.org/w/index.php?diff=5363327&oldid=5363320&rcid=4038027
[01:11:19] [[Tech]]; MZMcBride; /* Background color (differentiate between article namespace and other namespaces) */ +reply; https://meta.wikimedia.org/w/index.php?diff=5363328&oldid=5363327&rcid=4038028
[01:21:13] [[Tech]]; PiRSquared17; /* Background color (differentiate between article namespace and other namespaces) */ +; https://meta.wikimedia.org/w/index.php?diff=5363341&oldid=5363328&rcid=4038041
[01:26:56] [[Tech]]; MZMcBride; /* Background color (differentiate between article namespace and other namespaces) */ +reply; https://meta.wikimedia.org/w/index.php?diff=5363345&oldid=5363341&rcid=4038045
[01:28:55] [[Tech]]; PiRSquared17; /* Background color (differentiate between article namespace and other namespaces) */ +; https://meta.wikimedia.org/w/index.php?diff=5363352&oldid=5363345&rcid=4038052
[01:34:24] [[Tech]]; MZMcBride; /* serif font */ +reply; https://meta.wikimedia.org/w/index.php?diff=5363361&oldid=5363352&rcid=4038061
[02:03:09] [[Tech]]; Jeshyr; /* Move sidebar links to bottom (similar to MySkin) */; https://meta.wikimedia.org/w/index.php?diff=5363396&oldid=5363361&rcid=4038096
[02:35:55] [[Tech]]; MZMcBride; /* Move sidebar links to bottom (similar to MySkin) */ +reply; https://meta.wikimedia.org/w/index.php?diff=5363472&oldid=5363396&rcid=4038176
[02:52:07] [[Tech]]; MZMcBride; /* Move sidebar links to bottom (similar to MySkin) */ better; https://meta.wikimedia.org/w/index.php?diff=5363484&oldid=5363472&rcid=4038189
[03:06:25] [[Tech]]; MZMcBride; /* Remove gradient on tabs in Vector */ +reply; https://meta.wikimedia.org/w/index.php?diff=5363492&oldid=5363484&rcid=4038197
[03:27:36] is there also a configuration value that says what type of project a project is, like Wikipedia, Wikisource, Wikibooks, etc?
[03:27:52] one which is language independent
[03:28:48] wgSiteName gives Wikipedia in the particular language
[03:29:39] Romaine: for what purpose?
[03:29:54] in my own vector.js
[03:30:08] I want to use a code for Wikipedias only
[03:32:04] similar to wgSiteName
[03:36:28] Betacommand: any idea?
[03:37:22] can wgNoticeProject help with that maybe?
[03:39:16] Romaine: i think yair rand had a way to do it, let me see
[03:41:28] Romaine: something in https://www.wikidata.org/wiki/User:Yair_rand/WikidataInfo.js makes it only show up on 'pedia's
[03:41:37] i can't figure out what it is though
[03:44:09] I think I found a way that works, with wgNoticeProject
[03:44:44] if(mw.config.get( 'wgNoticeProject' ) === 'wikipedia'){ } else { }
[03:45:31] oh great :)
[03:46:00] but Yair rand doesn't use that
[03:53:43] thanks for helping
[04:06:07] [[Tech]]; Jeshyr; /* Move sidebar links to bottom (similar to MySkin) */; https://meta.wikimedia.org/w/index.php?diff=5363545&oldid=5363492&rcid=4038258
[05:14:14] 05:13:44 up 3 days, 8:45, 7 users, load average: 93.20, 433.49, 241.01
[05:14:19] that's my laptop :(
[05:14:36] apparently i gave myself a forkbomb just by trying to connect to labs
[05:14:45] * jeremyb_ will try again when actually awake
[05:15:16] jeremyb_: what in lord's name did you do? :)
[05:15:59] Ryan_Lane: well i had a process list full of ssh
[05:16:04] :D
[05:16:14] so, i guess it was an infinite loop
[05:16:46] (ProxyCommand i assume. will find out in the morning)
[06:03:33] [[Tech]]; Geni; /* Background color (differentiate between article namespace and other namespaces) */; https://meta.wikimedia.org/w/index.php?diff=5363627&oldid=5363545&rcid=4038386
[06:09:17] [[Tech]]; Geni; /* Background color (differentiate between article namespace and other namespaces) */; https://meta.wikimedia.org/w/index.php?diff=5363631&oldid=5363627&rcid=4038391
[07:48:37] [[Tech]]; Patrick; /* Move sidebar links to bottom (similar to MySkin) */; https://meta.wikimedia.org/w/index.php?diff=5363726&oldid=5363631&rcid=4038500
[07:52:48] lol One minor thing: the watch/unwatch symbol is repeated three and a half times. - [[User:Patrick|Patrick]] ([[User talk:Patrick|talk]]) 07:48, 4 April 2013 (UTC)
[08:23:28] Krinkle|detached: what happened to https://doc.wikimedia.org/mediawiki-core/master/php/html/ ?
[08:35:02] ori-l: I overwrote master/ with docs/js/ dir
[08:35:05] Working on it
[08:35:06] will be fixed by next merge
[08:35:51] oh, as long as you're aware. no panic, take it easy.
[08:36:02] Trying to add jsduck into it
[08:36:10] It appeared to be there already
[08:36:28] https://doc.wikimedia.org/mediawiki-core/master/js/ existed up until a minute ago and was linked from the main index
[08:36:38] but it was a one-time run I did manually, not being updated postmerge
[08:36:59] working on that now + validation on patchset creation (like linting without publishing to doc.wikimedia.org)
[08:37:34] yeah, you mentioned that in a comment; that would be useful
[08:37:57] Manual run: https://gerrit.wikimedia.org/r/#/c/57446/
[08:37:59] jsduck failure
[08:38:12] ("type mixed is invalid, should be Mixed")
[08:43:23] ori-l: To check jsduck locally, run "$ ./maintenance/mwjsduck-gen" (assumes you have gem jsduck installed)
[08:43:34] (was merged a while ago)
[08:48:21] ori-l: I've just approved that change, so hopefully it will be marked success after the syntax error is fixed and merged. And then postmerge should kick in, publish the php and js docs automatically, and sync them to doc.wikimedia.org
[08:48:33] * Krinkle hands free
[08:56:51] Apparently hashar disabled the php-doxygen postmerge job
[08:56:58] but the jsduck one worked: https://doc.wikimedia.org/mediawiki-core/master/php/html/
[08:57:07] https://doc.wikimedia.org/mediawiki-core/master/js/ *
[13:17:30] [[Tech]]; Anomie; /* serif font */ re; https://meta.wikimedia.org/w/index.php?diff=5364202&oldid=5363726&rcid=4039012
[13:20:12] [[Tech]]; Anomie; /* Background color (differentiate between article namespace and other namespaces) */ re; https://meta.wikimedia.org/w/index.php?diff=5364205&oldid=5364202&rcid=4039015
[13:37:00] ^demon: since I see you are around, any idea why users on a wiki might see the popup on red links in an inappropriate language?
[13:37:03] http://el.wikipedia.org/wiki/%CE%9F%CE%93%CE%9A_%CE%9D%CE%B9%CF%82
[13:37:28] example, try hovering over a few of the redlinks, they are giving the 'page doesn't exist' message in, I guess, Italian
[13:37:53] <^demon> Weird.
[13:38:09] the users seem to tie it to edits made by a bot http://el.wikipedia.org/wiki/%CE%A7%CF%81%CE%AE%CF%83%CF%84%CE%B7%CF%82:SamoaBot, that is, all the redlinks on those pages do this, but I dunno if that's really a factor
[13:38:09] <^demon> It all shows as Greek to me, no Italian.
[13:38:27] wait.
[13:38:32] you mouse over a redlink
[13:38:42] wait for the little tooltip
[13:38:54] and it gives the name of the link and then a string in Greek next to it instead of
[13:39:07] La pagina non esiste (Italian: "The page does not exist")
[13:39:12] which is what I see and other users see?
[13:39:34] <^demon> Yeah, it's in Greek.
[13:39:44] hmm I wonder if it's related to the language choice of the user
[13:39:57] I have el of course, I expect the users do too
[13:40:11] <^demon> eg: Σταντ Μυνισπάλ ντυ Ραι (δεν έχει γραφτεί ακόμα) (Greek: "hasn't been written yet")
[13:40:44] yeah I so don't
[13:41:03] I get the Italian or Spanish or whatever it is caption
[13:43:06] <^demon> Hmm, toss it in BZ. I haven't the foggiest. I could see that text being in user language, or site language, but a totally unrelated language?
[13:43:32] <^demon> Hmm, does purging the page fix it?
[13:45:40] the bot is an Italian bot we think
[13:46:24] <^demon> Hmm. So here's what could happen. Bot saves the page, puts it in parser cache.
[13:46:30] <^demon> Then you're viewing the pcache'd page.
[13:46:37] <^demon> But that's weird, since we vary on language code :\
[13:46:42] <^demon> (or we should...)
[13:46:49] and yes the purge works, but gee, that's not awesome
[13:46:54] I am a logged in user of course
[13:47:03] <^demon> As am I.
[13:47:17] so if you were to change your lang prefs
[13:47:17] apergos: unless the user has logged into the bot and changed it, shouldn't it be the local community language?
[13:47:19] to el
[13:47:35] I bet the user has set their lang prefs to 'it' (that of the bot)
[13:47:56] so that when it requests pages it gets the it messages...
[13:47:59] and then bam
[13:48:01] <^demon> So here's what happened. Either the edit saved it to the pcache with a langcode of 'el' (wrong), or for some reason you're getting the it-cached page (also wrong).
[13:48:09] <^demon> Since I got el, that's doubly weird.
[13:48:54] uh huh
[13:48:59] apergos: query the bot's data in the user table and have a look?
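The cache theory ^demon lays out can be seen in a toy model: if the parser-cache key fails to vary on language code, whichever language renders the page first "wins" for every later reader, which is exactly the it-poisons-el symptom described above. A hedged sketch only, not MediaWiki's actual parser-cache code:

```python
# Toy model of the parser-cache bug theory: a shared cache whose key
# omits the language code serves the first renderer's language to everyone.
# Names and cache shape are illustrative, not MediaWiki internals.
cache = {}

def render(page, lang):
    """Stand-in for parsing a page in a given interface language."""
    return "%s rendered in %s" % (page, lang)

def get_cached_buggy(page, lang):
    # BUG: key ignores lang, so an 'it'-language bot edit poisons
    # the cached entry that 'el' readers then receive.
    if page not in cache:
        cache[page] = render(page, lang)
    return cache[page]

def get_cached_fixed(page, lang):
    key = (page, lang)  # vary the cache key on language code
    if key not in cache:
        cache[key] = render(page, lang)
    return cache[key]
```

With the buggy lookup, a first fetch in 'it' followed by a fetch in 'el' returns the Italian rendering both times; keying on (page, lang) gives each language its own entry, which also explains why a purge (dropping the poisoned entry) temporarily fixes the symptom.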
[13:49:12] well I am asking the reporter of the issue to change his lang pref temporarily to en
[13:49:17] and see what he retrieves
[13:49:39] the bot user is from another home wiki, see
[13:49:52] http://el.wikipedia.org/wiki/%CE%A7%CF%81%CE%AE%CF%83%CF%84%CE%B7%CF%82:Ricordisamoa
[13:50:12] presumably the bot user logs the bot into it and then off it goes to do its thing
[13:50:46] so when my guinea pig friend switches to en he sees en
[13:50:48] not el like you
[13:50:52] ^demon:
[13:52:24] should I ask him to drop in here so I can stop being the middleperson?
[13:52:42] <^demon> Well, I don't think it's the bot... MediaWiki shouldn't show you the wrong cached version :p
[13:53:25] true!
[13:53:45] he's not the bot owner, he's an admin over on el (I think still an admin)
[13:57:02] thank you for looking at it, the reporter may show up here later, you can always refer him to bugzilla or whatever
[13:59:18] btw, that was Italian, not Spanish
[13:59:31] yeah I figured that out
[13:59:54] sorry to be so ignorant of the Romance languages group
[14:01:20] you're not ignorant
[14:03:46] I'm looking where it is called from, but Linker::link() isn't passed the language to use...
[14:05:46] how does it guess? :-D
[14:06:02] (I know I should look at the source but my head is in this other code right now, for days actually)
[14:06:36] Can anyone help with using Python to change my watchlist using the API (for Commons)? I get /Action 'watch' is not allowed for the current user/ in the returned xml.
[14:11:09] I'm logged in okay, but I'm wondering if there is some security issue I have not addressed.
[14:12:01] (In particular, I can query my watchlist, so that sort of thing works)
[14:41:41] Fae, what request are you sending?
[14:49:05] In this case I just used Python's urllib2.urlopen() for http://commons.wikimedia.org/w/api.php?action=query&prop=info&intoken=watch&titles=Main%20Page&format=xml and then read() it into a string I could then extract the token from.
[14:49:33] I have also tried urllib2.Request but with the same end result.
[14:50:40] If I follow the link in my browser, it gives me the watchtoken I'm looking for, so I'm puzzled at the difference.
[14:51:06] Fae: Have you made sure your script is sending the required tokens?
[14:51:25] * cookies
[14:51:26] damn
[14:51:58] Ugh, do I have to get Python to log in every time I do this, as opposed to using my login?
[14:52:47] The guide about (un)watch for the API is a bit cryptic - in fact it does not suggest doing anything else.
[14:53:23] http://www.mediawiki.org/wiki/API:Watch
[14:53:23] Fae: What do you get if you let Python fetch api.php?action=query&format=json&meta=userinfo
[14:53:39] (or as xml, doesn't matter)
[14:54:23] {"query":{"userinfo":{"id":1086557,"name":"F\u00e6"}}}
[14:54:40] that looks good, so you seem to be logged in properly
[14:55:26] Yep, I can query the items on my watchlist (so search for matching strings), I just can't seem to watch/unwatch as I can't get hold of the watch token.
[14:56:19] I could probably wangle it for the session by taking the one from my browser and plugging it into the Python code. It's a daft hack though.
[15:02:14] Nope, just tried it, got 'badtoken'
[15:03:46] Probably the data you send isn't 100% sane... back when I wrote my Python wiki framework thingy it took me quite some time to figure all that out
[15:03:54] and I wrote my own cookie handler :P
[15:06:46] I'm getting closer I think. I frigged the token (by finding it via the browser). It seems to work once using quote() around it. I now get the error code "mustbeposted", which I can handle... This might be a back-burner thing to fiddle with later though.
[15:12:23] Fae: why don't you just use pywikipedia?
[15:19:51] I had a look at the watchlist module, is there a better guide to it somewhere? I could do with an example use to look at.
[15:20:48] I assume you've looked at the somewhat sparse https://www.mediawiki.org/wiki/API:Watchlist ?
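The token dance Fae is debugging above has two halves: extract the watch token from the prop=info&intoken=watch response, then POST it to action=watch (sending it via GET is what produces "mustbeposted"). A minimal offline sketch, assuming the 2013-era response shape quoted in the chat; the sample XML and token value are made up for illustration, and a real script would also have to share login cookies between the two requests:

```python
# Sketch of the watch-token flow discussed above. Assumes the old-style API
# where action=query&prop=info&intoken=watch returns a "watchtoken" attribute
# on each <page> element. Sample data below is illustrative, not real output.
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

def extract_watch_token(xml_text):
    """Pull the watchtoken attribute out of an intoken=watch query response."""
    page = ET.fromstring(xml_text).find('.//page')
    if page is None or page.get('watchtoken') is None:
        raise ValueError('no watch token in response (not logged in?)')
    return page.get('watchtoken')

def build_watch_post(title, token, unwatch=False):
    """Build the urlencoded body for a POST to api.php (action=watch)."""
    params = {'action': 'watch', 'title': title, 'token': token, 'format': 'xml'}
    if unwatch:
        params['unwatch'] = ''  # presence of the key requests an unwatch
    return urlencode(params)

sample = ('<api><query><pages>'
          '<page pageid="1" ns="0" title="Main Page" watchtoken="d41d8cd9+\\"/>'
          '</pages></query></api>')
token = extract_watch_token(sample)
body = build_watch_post('Main Page', token, unwatch=True)
```

Note that urlencode() handles the '+' and '\' in the token (the step Fae was approximating with quote()), and that keeping both requests in one cookie-carrying session (e.g. the 'requests' library jeremyb_ recommends) avoids the 'badtoken' that a token copied from a different browser session produces.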
[15:32:25] apergos: Yes, done what it recommends there.
[15:34:01] As hoo points out, this all seems a trial by fire, with the MediaWiki guides never quite being straightforward to implement.
[15:37:56] !log adjusted anti-fraud rules on the payments cluster
[15:38:03] Logged the message, Master
[15:40:42] Betacommand: Apart from watchlist.py, what should I look at using within pywikipedia?
[15:51:03] Fae: it has support for page.watch()
[15:51:10] see wikipedia.py
[15:52:02] Fae: what is your goal?
[16:53:54] [[Tech]]; MZMcBride; /* Background color (differentiate between article namespace and other namespaces) */ +reply; https://meta.wikimedia.org/w/index.php?diff=5364386&oldid=5364205&rcid=4039293
[16:57:14] Fae: yeah it's mostly trial by swearing
[16:57:56] if you're at all used to the api these tiny guides are a little less opaque
[16:57:58] [[Tech]]; MZMcBride; /* Move sidebar links to bottom (similar to MySkin) */ +reply; https://meta.wikimedia.org/w/index.php?diff=5364391&oldid=5364386&rcid=4039301
[17:50:10] Fae: did you try .watch() ?
[17:50:36] Betacommand: his watchlist is so big that it can't be edited in the browser. or something. 90k entries. he's just trying to unwatch stuff
[17:51:19] jeremyb_: easy fix, I've got code for that :P
[17:52:09] Fae: and when doing stuff by hand you should be using 'requests' (a Python lib), not urllib2 (for your sanity). and there's some api wrapper glue in pywikipediabot too to make arbitrary api queries. for something in between .watch() and urllib2
[17:53:41] jeremyb_: pywiki has an unwatch()
[17:54:45] i figured
[17:54:46] :)
[17:55:45] which is why writing your own code is evil
[18:19:00] jeremyb_: (returns to window) Ah good suggestions - I'm going to snap this and fiddle around later in the week. Betacommand my objective was to be able to sort through my watchlist and unwatch pages based on certain info from the API, such as title or upload date, against a regex filter.
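The filtering Fae describes reduces to a small pure function once the watched titles are in hand (fetched elsewhere via the API, paged through in a real script). A hypothetical sketch; the pattern and titles are made-up examples:

```python
# Hypothetical sketch of the regex filter Fae describes above: given a list
# of watched titles, select the ones to unwatch in a follow-up pass.
import re

def titles_to_unwatch(titles, pattern):
    """Return the subset of titles matching the regex (unwatch candidates)."""
    rx = re.compile(pattern)
    return [t for t in titles if rx.search(t)]

# Illustrative data, not real watchlist contents.
watchlist = [
    'File:Scan 0001.jpg',
    'File:Scan 0002.jpg',
    'Commons:Village pump',
]
doomed = titles_to_unwatch(watchlist, r'^File:Scan \d+\.jpg$')
```

Each selected title would then be unwatched one POST at a time, or via pywikipedia's page.watch()/unwatch() as Betacommand suggests, which keeps a near-100k-entry list manageable without the browser UI.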
At the moment my watchlist is too big to edit (nearly 100k pages), so I need a better way of maintaining...
[18:19:02] ...it. If someone has a bit of Python that does this, I'd really appreciate a copy. :-)
[18:20:12] Fae: if you want to give me criteria I can create a program for you
[18:22:01] Fae: feel free to write up what you want filtered on and email it to me
[18:22:10] @toolserver.org
[18:22:18] Cool, will do.
[18:23:17] It would be great to have the basics that I might fiddle with later. :-)
[18:24:20] Fae: if you can give me the items you want to be able to filter on, I can create rules for it :P
[18:24:52] Fae: I am evil, I have a ton of code that can do almost anything
[21:25:49] Nemo_bis, "We don't merge accounts on Wikimedia projects, at all." umm...
[21:27:17] Nemo_bis, you do realise that bureaucrats on Wikivoyage wikis can merge users, right?
[21:35:14] Krenair: just an exception, and not the wikis where white cat was
[21:35:23] he was renamed years ago :) it's a long saga
[21:35:31] hardly an exception
[21:35:36] what you said is just false
[21:36:04] wikivoyage is an exception
[21:36:39] I guess any other wiki could become an exception as well if they wanted
[21:37:22] I doubt it
[21:37:35] and it's not a single wiki, white cat was on hundreds
[21:37:48] none of which agrees with the merge :)
[22:00:18] ori-l: is E3 done with their deployment window?
[22:02:42] kaldari: superm401 is doing the deployment, and he's not done just yet
[22:03:47] superm401: what do you guys have left to do?
[22:03:56] scap? config changes?
[22:05:25] (i pinged him to ask.)
[22:11:04] kaldari: finished
[22:13:06] kaldari, sorry for the delay.
[22:13:45] kaldari, do you guys know when you're planning to enable Echo on enwiki?
[22:14:04] before the end of the month
[22:14:47] kaldari, okay. There will need to be a scap, but if it's going to be a couple of weeks or so, someone else will be bound to do one.
[22:16:12] we're going to scap later tonite
[22:16:27] after we fight off the zombies
[22:18:21] bsitu: https://gerrit.wikimedia.org/r/#/c/57669/
[22:27:01] kaldari: wait, zombies? :)
[22:27:11] no one told you?
[22:27:20] better buy some batteries
[22:28:08] actually, we're just waiting for Jenkins to merge
[22:29:10] ...and Jenkins stumbles across the finish line!
[22:29:21] with a fistful of brains
[22:29:29] er, code
[22:30:39] kaldari: I'm sorry, I couldn't hear you, I was in my basement putting gas in my chainsaw
[22:31:11] :)
[22:32:25] kaldari: just so I'm up to speed, where do we meet with the chainsaws, I mean, you're going to deploy and then scap later? When?
[22:33:07] we'll probably be scapping right before or around 4
[22:33:20] cool
[22:33:35] it's fairly important that no one scap in the meantime though
[22:33:54] as we're deploying a schema change for Echo
[22:34:02] eek, yes
[22:34:04] although worst case scenario is MediaWiki.org crashes
[22:34:09] not en.wiki
[22:34:13] or others
[22:34:21] bah, who needs it
[22:34:30] kaldari: btw, pm?
[22:34:33] we might just crash it for fun anyway
[22:34:45] yes 4pm
[23:20:48] I will run scap soon
[23:34:54] getting errors on test.wiki re E3Experiments
[23:35:14] superm401, ori-l: ^
[23:35:55] kaldari, looking.
[23:37:57] superm401: did you guys change the name of the primary extension file recently?
[23:38:22] kaldari, we did, but we also removed it.
[23:38:32] Maybe the CommonSettings is out of date on 1.22wmf1.
[23:39:03] yeah, that seems like a likely candidate
[23:39:11] if the file name changed
[23:39:28] csteipp: ^
[23:39:44] kaldari, it's not even included from CommonSettings, though.
[23:39:52] I'm going to sync it.
[23:40:29] cool
[23:50:16] superm401: Looks like E3Experiments extension doesn't exist in the 1.22wmf1 dir on fenari
[23:50:49] kaldari, correct. And it was already removed from CommonSettings.
[23:50:59] I tried re-syncing it, but I think I'm missing something.
[23:51:11] oh, we're killing the extension?
[23:56:41] kaldari, yes, both E3Experiments and LastModified are removed.
[23:56:41] Yeah
[23:56:45] Doing the commit now.
[23:56:59] ori-l had a murderous spree
[23:56:59] I misunderstood the issue (thought it was with CommonSettings).
[23:57:52] <^demon> Reedy: I thought http://p.defau.lt/?Ylq_QB2QUzfWOrhQFizsRw would fix it.
[23:58:45] Um.. hi?
[23:59:03] There's some message like "Warning: include_once(/home/wikipedia/common/php-1.22wmf1/extensions/E3Experiments/E3Experiments.php): failed to open stream" at the top of every page...
[23:59:18] now no CSS :/
[23:59:29] Yeah, I'm working on it.