[02:03:23] Whom would be an "appropriately authorized developer" for wikimedia? [02:05:57] vague question is extremely vague [02:10:16] ToAruShiroiNeko: What are you trying to do? [02:23:53] I am trying to get an on the record statement saying there is an untold edit limit for bots [02:24:05] so that bots cant make a million edits per second [02:24:33] and sites dont need to be concerned with performance in edit count per sec sense for bots [02:24:47] or something along those lines [02:25:17] An untold edit limit? [02:25:23] I think it's explicitly defined. [02:25:41] its never revealed to the lowly users tho :p [02:25:46] we automatically edit limit if you use the api properly afaik [02:25:51] And depending on the nature of the edits, sites do need to be concerned with edits/minute. [02:26:04] bot edits I mean [02:26:11] Certain edits are a lot more expensive than others. [02:26:38] sure concern is that if artificial edit limits isnt defined wiki would break due to millions of edits [02:26:53] I think that distracts discussions from more resonable and sensable technical issues [02:27:08] https://en.wikipedia.org/wiki/Wikipedia:Don%27t_worry_about_performance [02:27:15] p858snake|l I know that [02:27:20] but its been asked on site [02:27:35] Where? [02:27:44] http://commons.wikimedia.org/wiki/Commons:Bots/Requests/%E3%82%BF%E3%83%81%E3%82%B3%E3%83%9E_robot_2#Discussion [02:27:49] bottom of the discussion [02:28:02] ToAruShiroiNeko: MySQL and Maria have their own protections as well afaik [02:28:17] p858snake|l again I know all this :) [02:29:02] if you can state the api limit for instance it would be helpful [05:03:23] p858snake|l well? :/ [06:02:27] Krenair: so did you file a bug about the silly unicode characters? [07:33:11] Does Wikipedia redirect an article with an english title to the appropriate language page if I only change the language code? i.e. Given https://en.wikipedia.org/wiki/Sacramento,_CA I change en to ja and I should be redirected to article written in japanese even though the title is different for that article? [07:33:26] it does not at this time [07:33:26] I see that it only works for some languages but not all [07:34:02] (although this may become possible in the future, given that we're currently working on Wikidata, which centralizes the Wikipedia links in different languages of a subject) [07:42:45] Cool; If the data is available this would be achievable, i.e. the page titles an article is available in, the redirect could be then be done in both ways, i.e. given an article with the title in english and language code in chinese, one could determine the chinese title for that given article and redirect the user to the chinese version of the article, and then backwards given the chinese article name and english language code, one could find [07:42:45] the given article title in english and redirect the user to the english version. This should work for any two given languages. [08:32:02] BadDesign: "works for some languages"? which? 
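One concrete, documented mechanism behind the automatic limiting mentioned above ("we automatically edit limit if you use the api properly") is the API's maxlag parameter combined with client-side back-off, rather than any published hard cap on edits per second. A minimal sketch of a bot that defers to server lag follows; the endpoint, retry count, threshold and User-Agent string are illustrative choices, not an official configuration or an on-record limit.

```python
import time
import requests

API = "https://commons.wikimedia.org/w/api.php"   # illustrative endpoint

def api_get(params, max_retries=5, lag_threshold=5):
    """GET from the API, backing off whenever the servers report lag."""
    params = dict(params, format="json", maxlag=lag_threshold)
    for _ in range(max_retries):
        r = requests.get(API, params=params,
                         headers={"User-Agent": "example-bot/0.1 (contact page)"})
        data = r.json()
        if data.get("error", {}).get("code") == "maxlag":
            # Replication lag exceeded the threshold: wait and retry instead
            # of hammering the site at full speed.
            time.sleep(int(r.headers.get("Retry-After", 5)))
            continue
        return data
    raise RuntimeError("replication lag did not drop; giving up")

# A harmless read request, throttled the same way edit requests would be.
print(api_get({"action": "query", "meta": "siteinfo"}))
```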
O_O [08:52:44] Nemo_bis: ah, it seems to only tell me that the page doesn't exist, but I remember remember being redirected from the english wikipedia article to a language specific article even though the title of the article was different in that language [08:52:52] s/remember/ [08:53:36] can't find an example right now, i'll try to check if this is the case, I might be rong [08:53:38] *wrong [08:56:26] BadDesign: sorry, it's impossible [08:56:31] you dreamt it [08:56:41] ok [08:56:57] probably "language specific article" was just an en.wiki article with title in original langyage [08:57:04] naming policies are tricky [09:03:53] BadDesign: the closest things we currently have to a feature like that are [09:04:05] 1) https://www.wikipedia.org/ "Find Wikipedia in a language" recently added [09:04:29] 2) the interwiki search which was briefly enabled for some months in 2008-2009 https://bugzilla.wikimedia.org/show_bug.cgi?id=44420 [09:04:47] so we're quite far :P [09:09:18] BadDesign: Something that would not be too hard to do is some JavaScript which searches the title on Wikidata, possibly https://www.wikidata.org/wiki/Special:ItemByTitle , and redirects the user to the most likely result among all titles in all languages [09:09:43] Which however would require guessing the language, or an additional step where the user clicks the link [09:10:17] Such a search would be easy enough to get on the www.wikipedia.org portal [09:19:09] Nemo_bis: I wondered if such a feature exists because I wouldn't have to parse all the Wikipedia dumps and grab the title of the articles that have coordinates in them. I would parse only the english dump and Wikipedia would redirect to the appropriate article if I only changed the language code in the URL (currently as you have said this is not the case if the article title in english is not the same as the one in the language I'm trying to [09:19:09] redirect the user to), but even if the redirection worked I wouldn't be able to tell if there exists an article in that particular language [09:30:53] is there are Wikimedia server LR-384? [09:32:01] Is there a server called LR-384? [09:33:22] why do you ask ? [09:33:43] there is something that I would like to know [09:34:26] LeslieCarr: whether a user is trolling or he is either telling the truth [09:34:31] any help is appreciated [09:34:40] no servers with that name [09:35:09] sure? [09:36:10] LeslieCarr: ping [09:36:13] i am sure [09:36:19] i know our server naming schema [09:36:31] also you can check our site.pp or dns if you disbelieve [09:36:33] heh, I was just about to raise this here :P [09:36:45] LeslieCarr: http://en.wikipedia.org/wiki/User:PT-Kevin-Makowski [09:36:46] I can confirm that LeslieCarr knows everything about our Data Center, etc. [09:36:46] BadDesign: afaik that's what GeoData was created for... [09:36:49] she is also our resident santa [09:36:51] http://en.wikipedia.org/wiki/Wikipedia:AIN#Wikicommons_Server [09:36:54] that's why [09:37:02] addshore: a block, maybe? [09:37:12] meh, not really blockable :P [09:37:32] teehee [09:37:35] oh i'll respond [09:37:46] :) [09:37:47] [= [09:37:59] hmm, we have used external contractors in the past [09:38:03] omg is this really happening [09:38:06] but, no, not him. [09:38:10] but none of the servers are nammed in that way? 
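One way to prototype the Special:ItemByTitle idea without any JavaScript is the Wikidata API's wbgetentities module, which maps a (site, title) pair to an item and its sitelinks in every language. A minimal sketch, assuming the exact article title is used (redirects such as "Sacramento, CA" will not match) and that the item actually has a sitelink for the target wiki:

```python
import requests

def cross_language_title(title, from_wiki="enwiki", to_wiki="jawiki"):
    """Map an article title on one wiki to its title on another via Wikidata."""
    r = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={"action": "wbgetentities", "sites": from_wiki, "titles": title,
                "props": "sitelinks", "format": "json"},
        headers={"User-Agent": "example-script/0.1"},
    )
    for entity in r.json().get("entities", {}).values():
        sitelink = entity.get("sitelinks", {}).get(to_wiki)
        if sitelink:
            return sitelink["title"]
    return None   # no item found, or no article in the target language

# e.g. find the Japanese title for the article known on en.wikipedia.org
print(cross_language_title("Sacramento, California"))
```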
:P [09:38:28] nor are 'patches' applied in such a way [09:38:32] H900-SDR is not how patches are named [09:38:48] and for some reason H900 reminds me of HAL 9000 [09:38:52] missing a few chars of course [09:39:44] there we go [09:39:45] http://en.wikipedia.org/wiki/Wikipedia:Administrators%27_noticeboard/Incidents#Wikicommons_Server [09:39:46] :) [09:39:47] responded [09:39:57] ;) [09:40:19] i was considering instead of a random string to make it "123FuckOff!" [09:40:25] xD [09:40:27] brilliant [09:40:37] :D [09:40:37] but why give up the chance to continue the troll a little longer [09:41:09] also, it's super early in seattle, right ? should we call him, since his phone number is online for everyone to see.... [09:41:33] LeslieCarr: no it isnt ;p [09:43:36] oh, i got my timezone math wrong [09:44:01] * addshore was talking about the phone number not the math ;p [09:44:25] oh, http://en.wikipedia.org/wiki/User:PT-Kevin-Makowski [09:44:28] it's right there :) [09:44:36] refresh ;p [09:44:42] oh [09:45:07] that was the responsible thing to do addshore .. though not the trolly thing :) [09:45:15] addshore: awww [09:45:19] meh, I trlled also ;p [09:47:45] I would join the party but it would require me editing en wp and I'm trying to keep my edit count down :-D [09:47:51] :D [09:48:15] apergos: only on en.wiki or all wikis? [09:48:28] mmm firs tedit to AN/I? [09:48:30] woo [09:48:32] well technically just on en wp [09:48:54] ah [09:49:01] * addshore goes to migrate a little script from toolserver to labs to see how amazingly fast it will run! [09:49:12] my favourite script :) [09:49:29] then you'll overload labs so much that it will no longer be fast for anyone :P [09:49:32] as with wikidata [09:50:02] Nemo_bis: I dont see overloading labs to be a problem [09:50:33] thursday night till friday morning I spawned 250 processes reviewing articles on WP [09:50:42] nothing broke or got slow :P [09:50:49] if addshore overloads labs then labs needs new things :) [09:50:56] exactly! ;p [09:50:58] they answered! [09:51:17] YuviPanda: ammusingly after I ran my procs core-n added 2 more exec nodes ;p [09:51:34] i thought it was peta-n [09:51:34] :P [09:51:35] i was there [09:51:36] ! [09:51:37] ok, you just typed an exclamation mark with no meaning in the channel, good job. If you want to see a list of all keys, check !botbrain [09:51:50] mhhhm, I cant remember :P [09:51:50] awww wm-bot [09:51:54] * Nemo_bis considers a petition to shut up wm-bot  [09:52:05] Nemo_bis: no! [09:52:11] wm-bot: is the best thing since sliced bread [09:52:23] is sliced bread all that exciting, honestly? [09:52:29] its so good my scripts can use it to talk to my irc channel instad of having yet another irc bot connected.. [09:52:34] i love unsliced bread actually [09:52:40] can tear into it like a savage much easier [09:52:43] also it is usually softer [09:52:43] * saper does not slice bread [09:52:50] I dont know, the sliced bread at the hackathon was quite something [09:52:53] plus you get to munch on the softer insides in heavier concentrations [09:53:05] * bread slices saper [09:53:08] * YuviPanda slices bread [09:53:11] apergos: I'm not sure I even know what slied bread is [09:53:18] *sciced [09:53:19] argh [09:53:29] :? 
[09:53:36] the word mixes up in my head [09:54:19] we call it "pan carré" with a fake French loanword [09:54:41] and yes it's a very bad thing, no idea why English consider it good :p [09:57:39] i didn't think the bread at the hackathon was too bad, though i only had the brown bread [09:57:58] urgh, wikidata needs more admins :L [10:29:24] LeslieCarr: you caused a sitr ! https://en.wikipedia.org/w/index.php?diff=556996378&oldid=556990247 [10:29:50] problem is I cant tell if this is a serious message or not xD [10:30:08] hahaha [10:30:12] really want to answer "no, we are all about open content here at the WMF" [10:30:14] no that is not really the root password [10:30:20] apergos ++ [10:30:21] but it would mean an edit on wp so can't do... [10:30:24] apergos: yes! [10:30:25] :-P :-D [10:30:28] :< [10:30:40] if you are going to edit ep, that is definatly the sort of edit you should be making :D [10:30:44] ohhhh [10:30:46] errrrgghh [10:30:48] meeehhh [10:31:09] I cant tell is shirt is being sarcastic :P [10:31:44] apergos: do it :) [10:31:54] * apergos groans and saves it [10:32:13] :D [10:33:26] someone else's turn next though [10:34:11] meh, we have to be nice to them :P [13:58:02] Hi, I would like to know if there is a way to know if a file is upload on Commons or on a local wiki without doing an API call [13:58:27] but using a query from the replica databases on WMFLabs [14:02:54] CristianCantoro: I don't have particular knowledge or experience, but wouldn't it be possible by simply looking up in commonswiki_p.image and enwiki_p.image (that is, two queries to the two tables)? [14:03:31] Yes, I think so... I wanted to double check [14:04:31] the point is if I find the image name in commonswiki_p.image then that image is on Commons otherwise I need to check the wiki's db (enwiki_p, frwiki_p, etc/) [14:05:19] right? [14:05:33] yes that's my understanding [14:08:02] the other question is ... from what I understand I can create a DB for my toold with pg___p but there is some way to export it on my local machine? [14:09:00] pg___p whaaaaat? [14:09:46] https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Help#Creating_databases [14:35:41] Krenair: usernames are always in the form pg [14:35:56] Krenair: where are 6 digits ;-) [14:36:32] Krenair: usernames are not the shell account usernames [14:36:36] p\d+\g+ [14:36:47] meh, -typos [14:38:55] Nemo_bis: it should be p\d{6}g\d{6}__ :P [14:39:44] redundant precision [14:40:13] omg typical physicists vs. mathematician discussion, reversed *runs* [17:24:33] anyone knwo what 10.64.0.127 is? :) [17:25:26] Likely a proxy [17:25:38] Yup [17:25:39] cp1005.eqiad.wmnet [17:25:41] :< [17:25:45] Is it editing by any chance? [17:25:46] edits keep slipping through :P [17:26:06] what causes it? [17:26:16] Usually bad XFF data [17:26:17] API proxy [17:26:21] https://noc.wikimedia.org/conf/highlight.php?file=squid.php [17:26:24] It's listed though? [17:26:35] Which wiki? Link to user contribs? [17:26:47] http://www.wikidata.org/wiki/Special:Contributions/10.64.0.127 [17:26:54] wikidata, guessing a labs bot [17:29:17] mhhm, I'm guessing it is safe to rangeblock all squids? :/ [17:30:23] Not sure how MW would cope with that [17:30:30] ipblock exempt? [17:30:48] Hmm [17:31:13] Hmm indeed :D [17:31:28] I've just re-pushed the config file [17:31:33] okay :0 [17:31:34] :) [17:31:35] Might be worth looking up the useragent via checkuser and see who to poke [17:33:24] ill get onto it :) [17:33:26] * Reedy tries [17:33:32] slooooooow [17:33:37] sloooooow? 
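The two-query approach to CristianCantoro's question (look in commonswiki_p.image first, then in the local wiki's image table) can be wrapped up roughly as below. This is only a sketch for the Labs replicas of the time: the <dbname>.labsdb host naming and the ~/replica.my.cnf credentials file are assumptions about that environment, and file names are looked up with underscores, as stored in img_name.

```python
import os
import pymysql

def _has_image_row(dbname, img_name):
    """Check the image table of one replica database for a file name."""
    conn = pymysql.connect(
        host=dbname + ".labsdb",                        # assumed host naming
        db=dbname + "_p",
        read_default_file=os.path.expanduser("~/replica.my.cnf"),
        charset="utf8",
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT 1 FROM image WHERE img_name = %s LIMIT 1",
                        (img_name,))
            return cur.fetchone() is not None
    finally:
        conn.close()

def where_is_file(img_name, local_db="enwiki"):
    """Return 'commons', 'local' or None for a name like 'Foo_bar.jpg'."""
    if _has_image_row("commonswiki", img_name):
        return "commons"
    if _has_image_row(local_db, img_name):
        return "local"
    return None
```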
:P [17:33:43] it's taking, like, forever [17:33:50] wheee [17:33:51] and it timed out [17:33:52] oh. [17:33:53] it would :D [17:34:05] CU a proxy must some back with everything that has passed through it? O_o [17:35:33] ahh well :) [17:38:56] 37/r-1 (unknown) Pywikipediabot/2.0 [17:40:13] that's an interesting revision. [17:40:46] Though those edits were in March [17:41:15] pywikipedia-zzinter.py/r11593 Pywikipediabot/1.0 [17:41:29] (the format is script_name/framework_revision pywikipediabot/1.0 (or 2.0 for rewrite) [17:44:06] How does anyone get anything done with the checkuser extension? [17:44:22] reedy, I dont because I dont have it ;p [17:44:30] I installed and ran it once upon a tim [17:44:31] e [17:44:44] so I could figure out some translation related issues. but boy was that a while back [17:44:51] It was quicker to just do select * from cu_changes where cuc_user = 0 and cuc_user_text = '10.64.0.127' order by cuc_timestamp DESC limit 15 [17:45:09] Reedy: then the CU extension is broken? :? [17:45:23] Why does that mean it's broken? [17:45:38] well, it shoud be effectivly doing the same? no? [17:45:40] Nearly every request I've tried hasn't returned in my web browser [17:46:47] Presumably if it was a problem for stewards/checkusers we'd have heard noise by now [20:05:38] Reedy: we just gave up [20:05:45] Who did? [20:05:46] With what [20:05:54] stewards regarding CU [20:06:03] lol [20:06:14] it is annoying and broken, if you really want to know [20:06:23] I sorta got that impression myself [20:06:25] i can rant about it for like ever [20:06:27] I would be interested to see what % of requests succeed in returning data [20:06:37] -1% [20:07:12] lol [20:07:17] really, you just opened the pandora box [20:07:23] Can't say I've ever really looked at the code [20:07:33] good for you :) [20:07:35] But my bounded query directly against the mysql hosts was pretty quick [20:08:30] I would like to get admin tools team to actually look at it and fix it [20:08:46] they did a great job with central auth [20:09:10] which remind me i'd like to thank them, whoever is responsible for this [20:10:09] what's been done with CA? [20:10:18] Bar improvements for the auth stuff that's coming [20:10:20] multilock for example [20:10:28] You probably want James_F, csteipp, pgehres, and TimStarling [20:10:30] Chris Steipp etc? [20:10:34] yes [20:10:56] I can actually lock zillions of accounts at once [20:10:58] * hoo can't login on https://ishmael.wikimedia.org/ [20:11:17] instead of annoying error-prone manual work [20:11:20] Not just you it seems hoo [20:11:36] The checkuser stuff shouldn't be hard to improve [20:11:49] it's extremely like it's running stupid queries that take a long time to run [20:12:03] * hoo votes for rewrite [20:12:09] * matanya goes on ranting about the CU [20:12:13] gj hoo [20:12:18] Let us know when you've done it [20:13:19] Reedy: I'm going to do it after Wikibase is 100% done ;) [20:13:20] hoo: how much do you charge for such a precious service? [20:14:10] Oh, I can login to ishamel [20:14:12] different password [20:14:25] ldap passwords? [20:14:39] yup [20:14:43] wikitech/gerrit password [20:15:06] Still doesn't seem to work [20:15:13] Wonder if it's staff only or somethinjg [20:15:29] I used to be able to access it... [20:15:48] I seem to remember disabling https made it work [20:15:55] Gathering performance metrics is hard enough already... it's more like guessing them right now [20:16:25] (my memory is wrong.) 
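Reedy's bounded query can also return the stored user agent directly, which is what eventually pointed at the Pywikipediabot scripts. A sketch assuming direct database access and the stock CheckUser schema (cuc_user, cuc_user_text, cuc_timestamp, cuc_agent); the host and database names are placeholders.

```python
import os
import pymysql

conn = pymysql.connect(host="db-host.example", db="wikidatawiki",
                       read_default_file=os.path.expanduser("~/.my.cnf"))
try:
    with conn.cursor() as cur:
        cur.execute(
            """SELECT cuc_timestamp, cuc_agent
                 FROM cu_changes
                WHERE cuc_user = 0 AND cuc_user_text = %s
             ORDER BY cuc_timestamp DESC
                LIMIT 15""",
            ("10.64.0.127",),
        )
        for timestamp, agent in cur.fetchall():
            # e.g. a "scriptname/rNNNN Pywikipediabot/2.0" style agent string
            print(timestamp, agent)
finally:
    conn.close()
```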
[20:17:46] Noting ishmael is only sql query stuff [20:18:28] ah morebots is back [20:18:48] Reedy: Yeah... I've talked to Chris and we thought it would be good to dump the database schemes of the live DBs to noc.wikimedia or so [20:19:09] Why? [20:19:11] So that non-shell users can have a look at the table and index mess we have [20:19:34] Like I already broke stuff due to inconsistent data structures a few times [20:19:49] Heh, there are definitely wikis with differing schemas [20:20:20] It made some of the stuff I was doing with CentralAuth more difficult [20:20:55] pgehres: I thank you for your work on CA [20:21:06] hoo: It'd be more use to write a schema diffing tool [20:21:18] And finding out versus a "good schema" what differences others have [20:21:27] matanya: I was just doing my job … and it is still not over [20:21:33] Go ahead ... just plain files would already be rather ok [20:21:45] Right now I just have to hope it works (and mostly does) [20:21:47] pgehres: you did a fine job. [20:22:07] and I appreciate it. [20:22:30] Thanks. I guess it was not announced outside of WMF, but I am no longer with the WMF as staff. Just a volunteer as of last Monday [20:22:48] But … I will see the CA project through to the end … in August [20:24:38] matanya: Has a global group ever really been renamed? I spent quite some time on this feature (as it's a /bit/ of a mess) [20:25:08] yes, hoo it did [20:25:55] pgehres: don't know if it is a good thing or a bad thing, but good luck with ever you do anyway [20:26:42] pgehres, so you're no longer staff, but your shell account got reenabled just after it was disabled? [20:27:02] Yeah, shell is not restricted to staff [20:27:18] And I need it to keep working on the CentralAuth stuff [20:27:52] * hoo only needs iced tea for that... [20:29:24] different stuff, maintenance scripts and gobal account audit database :-) [20:30:00] apparently they don't create non-staff shell accounts anymore [20:30:05] though I guess that didn't apply to your situation [20:31:54] There has been informal talk about that changing, but no one could say what exactly it would take to get one. Seems to be something along the lines of: "We know where you sleep if you screw things up" [20:32:07] lol [20:32:31] Oh [20:32:31] :P [20:32:36] I still live with WMF staff, so he can get to me [20:32:38] I've that damn wikiset bug to look at [20:32:51] Someone still... on sql-s3 there's probably on old and crappy CentralAuth Db [20:32:57] We should drop that [20:33:25] It's the version from 2010 or so [20:33:29] I think we did, ages ago [20:33:34] I seem to remember logging an RT ticket and asher did [20:33:36] it [20:33:50] | cdowiki | [20:33:50] | cebwiki | [20:33:50] | cewiki | [20:33:50] | chairwiki | [20:34:04] Reedy: Mh... I spent some time debugging due to that only a few months back [20:34:25] well, whatever [20:34:26] really? [20:34:52] Maybe my mind's playing tricks on me and it was longer ago [20:35:04] If SHOW DATABASES says it's not there, it probably isn't [20:35:15] If RT search didn't suck... [20:35:17] until MariaDB got a troll mode [20:35:59] hoo: Yup [20:36:00] "Kill old centralauth database from s3" [20:36:07] Wed Feb 08 23:53:41 2012 [20:36:12] 15 months ago [20:36:37] yum -y update brain; init 6 [20:37:13] ssh hoo reboot [20:38:50] brain package conflict error [20:39:19] matanya: Maybe I should uninstall the old version first :D [20:39:58] The following packages will be removed as well: body [20:40:10] For example I've lost my shower gel in Amsterdam... 
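A very rough sketch of the schema diffing tool Reedy describes, comparing one wiki's column definitions against a reference schema via information_schema. The host and database names are placeholders, and a real tool would also need to compare indexes, defaults and table options.

```python
import os
import pymysql

def column_defs(host, dbname):
    """Map (table, column) -> (type, nullability) for one database."""
    conn = pymysql.connect(host=host, db="information_schema",
                           read_default_file=os.path.expanduser("~/.my.cnf"))
    try:
        with conn.cursor() as cur:
            cur.execute(
                """SELECT TABLE_NAME, COLUMN_NAME, COLUMN_TYPE, IS_NULLABLE
                     FROM COLUMNS WHERE TABLE_SCHEMA = %s""",
                (dbname,),
            )
            rows = cur.fetchall()
    finally:
        conn.close()
    return {(t, c): (ctype, nullable) for t, c, ctype, nullable in rows}

def report_drift(reference, candidate):
    """Print every column that differs between, or is missing from, either side."""
    for key in sorted(set(reference) | set(candidate)):
        if reference.get(key) != candidate.get(key):
            print(key, reference.get(key), "!=", candidate.get(key))

report_drift(column_defs("db-ref.example", "enwiki"),
             column_defs("db-s3.example", "cebwiki"))
```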
second time this year that I've forgotten a shower gel :P [20:40:25] I should definitely use a new configuration [20:41:03] brain.conf.old [20:41:37] flush_memory = 0 [20:41:38] I think you should log the action [20:42:21] echo 1 > /hoo/sys/kernel/brain/mem/ [20:42:59] boy i can carry on with this a long time [20:51:24] hoo: Just store everything in /dev/null [20:51:33] Then you don't have to worry about forgetting something [20:51:39] As you'll always just get a new one when necessary [21:40:57] hmmm [21:40:57] https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#Gadgets [21:41:02] feels dupey [21:41:07] Reedy: ^ [21:42:03] MatmaRex: Known caching weirdness [21:42:29] https://bugzilla.wikimedia.org/show_bug.cgi?id=37228 [21:42:50] Probably the extension doesn't properly (in)validate however it caches data [21:43:01] * hoo has never looked at the code... just guessing into the wild [21:43:51] hoo: thanks [23:27:33] any devs around? [23:27:51] * addshore wonders what this breaking change he has just bee told about it] [23:27:53] *is [23:28:30] hoo, any idea? :D [23:29:00] addshore: Maybe the rcid thing? [23:29:12] addshore: we're talking in #wikimedia-dev
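For reference, the kind of "(in)validate" step being guessed at above is the usual get-or-compute caching pattern with an explicit purge whenever the underlying page changes. This is only an illustration of the pattern, not the Gadgets extension's actual code.

```python
import time

class DefinitionCache:
    """Get-or-compute cache with an explicit purge hook."""

    def __init__(self, ttl=300):
        self.ttl = ttl
        self._store = {}                      # key -> (expires_at, value)

    def get(self, key, compute):
        hit = self._store.get(key)
        if hit and hit[0] > time.time():
            return hit[1]                     # still fresh
        value = compute()                     # e.g. re-read the definition page
        self._store[key] = (time.time() + self.ttl, value)
        return value

    def invalidate(self, key):
        # The step that has to run whenever the underlying page is edited;
        # if it is skipped, readers see the stale entry until the TTL expires.
        self._store.pop(key, None)
```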