[01:29:30] Reedy: around?
[01:29:38] Was about to go
[01:30:25] do you have any idea why our page namespace still doesn't work on cswikisource?
[01:30:27] Wassup?
[01:30:47] https://gerrit.wikimedia.org/r/#/c/42365/
[01:30:54] Doesn't work?
[01:31:16] https://cs.wikisource.org/wiki/Speci%C3%A1ln%C3%AD:PrefixIndex/str%C3%A1nka:
[01:31:19] Is that even live?
[01:31:28] i thought it was
[01:31:34] how can i find out?
[01:31:41] It's not
[01:31:57] ah...
[01:32:07] commit e20edfdc0713b9f60c7b45da2929d6baf6589b1a
[01:32:08] Author: Translation updater bot
[01:32:08] Date: Tue Jan 1 20:25:09 2013 +0000
[01:32:28] hmm
[01:32:55] i guess it at least needs a run of namespaceDupes, or whatever that maintenance script is called
[01:33:29] otoh, the pulldown menu still shows Page instead of Stránka anyway
[01:35:25] Code deployed, script run
[01:38:12] \o/ awesome, thank you
[03:09:20] hey, do people know about this? http://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)/Archive_105#Link_in_history_says_user_does_not_exist
[03:19:27] Prodego: You should file a bug, if there isn't one already.
[03:19:49] You can have a specific bug (about just that user) or a generic bug (about the general issue).
[03:19:53] I imagine there are other cases.
[03:19:58] There's probably already a bug...
[03:21:47] http://p.defau.lt/?2l8YGz9IKr26fRBUsrnSnA
[03:21:49] Prodego: ^
[03:22:07] Susan: super awesome, dragonfly says thank you
[03:25:41] He adores me.
[03:27:34] Susan: I'll ask him what he wants to get renamed to, then have someone do it, so that should resolve it
[03:27:47] I imagine there are other cases.
[03:27:48] But okay.
[03:27:52] Susan: is there an easy way to search the user database for all users whose names have lowercase letters?
[03:27:54] You may need a sysadmin to do the rename.
[03:28:00] No.
[03:28:01] yep, I'm assuming that'll be the case
[03:29:12] Susan: could you grab the user log for that user as well for Dragonfly?
[04:19:48] TimStarling: http://wikitech.wikimedia.org/view/Swift/Open_Issues_Aug_-_Sept_2012/Cruft_on_ms7 is looking good now
[04:20:41] good
[04:21:29] * Aaron|home wonders about the last bits
[04:23:18] * Aaron|home wonders what https://upload.wikimedia.org/robots.txt is about
[04:24:22] TimStarling: so is all the NFS stuff gone? It looks like it.
[04:26:06] yes, I guess so
[04:26:46] \o/
[04:27:33] could one of you possibly help me by doing a rename of a user with a lowercased username?
[04:27:36] anyway, is timelimit an ok solution for the scaler problems?
[04:28:15] It's the interesting case of User:ɱ
[04:28:15] on enwiki
[04:28:20] did you see preilly's comment about the "timeout" command?
[04:28:55] you can just do "timeout 30 convert src dest"
[04:29:00] timeout is in coreutils
[04:29:23] so shell script wins again
[04:30:36] Prodego: that sounds complicated
[04:31:10] TimStarling: yeah, apparently there was an issue where some servers considered ɱ to have no uppercase letter, and some did
[04:31:12] ah, I thought it was "timelimit", ok
[04:31:24] there is some info here: http://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)/Archive_105#Link_in_history_says_user_does_not_exist
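
[Editor's note: a minimal sketch of the capitalization inconsistency described above, assuming a Python 3 interpreter. The Unicode facts are from the standard itself, not from the channel: ɱ (U+0271) only gained an uppercase counterpart, Ɱ (U+2C6E), in Unicode 5.1, so software built against older case tables uppercases it to itself — which is how different servers could disagree about the same username.]

    # ɱ (U+0271) maps to Ɱ (U+2C6E) only under Unicode 5.1+ case tables;
    # older tables leave it unchanged, so first-letter capitalization of
    # the username can differ between servers with different libraries.
    lower = "\u0271"          # ɱ
    upper = lower.upper()
    print(upper, hex(ord(upper)))
    # With Unicode >= 5.1 data: Ɱ 0x2c6e
    # With pre-5.1 data the character maps to itself: ɱ 0x271
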
[04:31:26] TimStarling: is someone working on that?
[04:31:40] no
[04:32:14] I have no idea if there is still inconsistency between servers, I never saw that myself
[04:32:23] I can probably hack something up if you're busy having dinner or a life or something
[04:32:45] * Aaron|home is on a Windows system atm
[04:34:21] TimStarling: ok
[04:34:32] * Aaron|home wonders what to set $wgMaxBacklinksInvalidate to
[04:34:50] maybe 200,000
[04:35:00] I'll do this wall-clock limit
[04:45:53] https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=dbrepllag&sishowalldb=
[04:45:57]
[04:46:03] That seems a bit excessive. Is this known?
[04:46:05] I got the user's permission to rename them from ɱ to Ɱ (the capital form), which will at least solve the problem of them having no contributions page and such
[04:51:46] Susan: do you think I can trust the comment at https://bugzilla.wikimedia.org/show_bug.cgi?id=3507 that renameuser will work?
[04:52:06] I was expecting it wouldn't, but if it does then that's easy
[04:52:20] Hmmm.
[04:52:29] I remember some very old magic in Special:UserRights for funky usernames.
[04:52:29] it's also from 2007
[04:52:38] I don't remember any such magic in Special:RenameUser.
[04:52:48] But it may work. There's only one way to find out. :-)
[04:53:00] Well, I'd worry it would break everything :)
[04:53:12] but I guess once it looks up the user ID then there is nothing else to break, right?
[04:53:14] Does he have an account on test.wikipedia.org?
[04:53:53] Susan: you'd have to check with the Toolserver
[04:54:07] The lag seems to be decreasing, but if it's going to be so excessive, I'm just going to direct the script to ignore it.
[04:54:14]
[04:54:30] That's over a day of lag.
[04:54:40] Was db59 on vacation or something?
[04:55:04] Hmm, I see some logs from December 11: http://wikitech.wikimedia.org/view/SAL
[04:55:41] TimStarling: alright with you if I have a crat give it a shot?
[04:55:59] sure
[05:03:24] TimStarling: looks like it all worked (thanks MBisanz for the rename). If you have the time, it might be worthwhile to check if there is that inconsistency about the capitalization of ɱ across different servers, since it may affect other chars as well.
[05:03:43] thanks for your help
[05:11:21] wiki = wikitools.Wiki(settings.apiurl); wiki.setMaxlag(-1)
[05:11:31] So I tried that. And the framework I'm using is still hung up.
[05:11:38] > Server lag, sleeping for 120 seconds
[05:11:43] So I guess I'm just waiting for the lag to decrease.
[05:11:49] It's down to 108000 seconds now.
[06:01:27] having an issue editing via the API
[06:01:31] specifically using pywikipedia
[06:01:48] for the last 5-6 hours, it has been delaying all edits by exactly 300 seconds because of "database server lag"
[06:01:55] any idea why this might be happening?
[06:02:11] my bot tasks haven't made an edit for nearly 6 hours
[06:04:26] on enwiki?
[06:04:33] yup
[06:09:38] Snottywong: One of the API servers is horribly lagged. It's decreasing, though.
[06:09:47] Snottywong: https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=dbrepllag&sishowalldb=
[06:10:11] I depooled it
[06:10:36] ok thanks, think that did it
[06:11:01] Thanks, Tim.
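
[Editor's note: a minimal sketch of checking the per-database replication lag that kept tripping the bots above, using the same siprop=dbrepllag query pasted into the channel. Only the standard library is used; the JSON field names follow the MediaWiki API's documented output. The maxlag setting that wikitools/pywikipedia send with each request is compared against this same lag data on the server side.]

    # Query replication lag for every database server behind enwiki,
    # mirroring the dbrepllag URL pasted above.
    import json
    from urllib.request import urlopen

    URL = ("https://en.wikipedia.org/w/api.php"
           "?action=query&meta=siteinfo&siprop=dbrepllag"
           "&sishowalldb=&format=json")

    with urlopen(URL) as resp:
        data = json.load(resp)

    # Each entry gives a server name and its lag in seconds; a bot that
    # sends maxlag=5 is told to wait whenever the lag exceeds 5.
    for db in data["query"]["dbrepllag"]:
        print(db["host"], db["lag"])
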
[06:34:15] TimStarling: i just heard there was recently somewhere in Australia that was so warm that they had to add an extra (never been used before) color to the weather map. it was 129F according to the story (53.8C)
[06:34:59] yeah, I saw that on Facebook
[06:35:23] but I couldn't actually find that map when I looked for it on bom.gov.au
[06:35:34] probably not looking hard enough, it is a big site
[06:39:56] https://www.facebook.com/photo.php?fbid=472388816158357&set=a.171427712921137.44816.170992086298033&type=1&relevant_count=1
[06:40:50] > Weather maps image symbol, click to link to weather maps page
[06:40:56] * jeremyb stabs their title text...
[06:41:09] the Facebook map is a forecast for January 14
[06:41:26] and it was posted on January 9
[06:42:46] so maybe we should wait to see if it actually gets that hot before we get excited
[06:45:18] TimStarling: http://www.bom.gov.au/australia/charts/viewer/index.shtml?type=T&level=2m&tz=AEDT&area=Au&model=CG&chartSubmit=Refresh+View
[06:46:25] http://www.bom.gov.au/australia/charts/viewer/index.shtml?type=T&level=2m&tz=AEDT&area=Au&model=CG&chartSubmit=Refresh+View
[06:46:54] grrr, i guess that link is no different
[06:47:15] I found it
[06:47:24] ahhh, this is better: http://www.bom.gov.au/australia/charts/viewer/index.shtml?unit=p25&type=T&level=2m&area=Au&model=CG
[06:47:48] yeah, very different from the one posted on Facebook
[06:47:58] well, 2 days' difference
[06:48:08] maximum 46-48
[06:48:09] anyway, they left in the 54 purple
[06:48:18] in the legend
[06:48:46] stark new evidence of global warming etc. etc.
[07:17:39] As long as Death Valley in CA holds the world record, it's nothing new to me
[07:20:28] 129F is new to me!
[07:20:44] Death Valley's record was 134F, I believe
[07:20:51] but desert is unknown
[07:20:59] summer 2012 saw 129F there too
[07:21:07] i guess the australian place must be desert
[07:21:18] * jeremyb sleeps on it
[07:21:51] yes, that part is the Great Australian Desert
[07:51:12] the what desert?
[07:57:26] AaronSchulz: so how hard would it be to use a different dictionary for different languages? https://wikitech.wikimedia.org/view/Generating_CAPTCHAs
[09:37:57] Is it possible to give "file mover" permissions to admins on fa.wikip?
[09:40:43] Farsi?
[09:43:22] jubo2, yes
[09:55:43] :-؟
[09:55:46] :-?
[09:57:56] Persian speak
[10:00:53] چی میخوای بگم؟ ("What do you want me to say?")
[10:02:29] all I know of Farsi is that "-stan" means "land"
[10:03:04] jubo, Yep, that's right.
[10:03:25] jubo, Do you know anyone who can help me with this?
[10:03:35] Meisam: no, sry, cannot help
[10:04:19] jubo, all these users online and no one is checking the channel?
[10:04:21] :|
[10:04:26] the sysop(presseur)ators around these parts are usually not friends of jubo-jubo
[10:05:38] ooops.. *realizes not in #wikipedia-en*
[10:05:56] Meisam: this channel is for Wikimedia tech people and bots
[10:06:23] the sysop(presseur)ators are friendly here
[10:06:49] :))
[10:07:41] I usually tfsu 99.8% of the time in here for obvious reasons
[10:07:56] Meisam: Perhaps try at #wikipedia instead of #wikimedia-tech?
[10:08:20] jubo2, I will.
[10:22:30] ping RD
[11:45:25] hello
[11:45:41] would it be possible to update FlaggedRevs to master on the wmf7 branch?
[11:46:07] https://gerrit.wikimedia.org/r/#/c/42300/ fixes a major bug, and it'd be nice to get it on the wikis using FR sooner rather than later
[11:56:38] okay, i submitted the FR update for review - https://gerrit.wikimedia.org/r/#/c/43430/ - i hope i did it right
[12:48:12] apergos: Can you join #wiktionary? There is someone asking about the most recent enwiki dump, thanks :)
[12:48:23] ok
[15:00:27] "There was an error collecting ganglia data (127.0.0.1:8654): fsockopen error: Connection refused" on http://ganglia.wmflabs.org/latest/ -- andre__ is there already a bug/RT ticket open re this? :)
[15:00:30] * sumanah Delegates
[15:01:27] sumanah, urgh, we have a Ganglia on wmflabs.org?
[15:01:42] I would have expected it to be as restricted as https://ganglia.wikimedia.org/
[15:01:43] oh, I missed the WMFLabs section of that quote, never min
[15:01:47] mind*
[15:04:26] andre__: usually it's down, so it doesn't matter much
[15:04:40] heh.
[15:30:42] as valeriej can reproduce search on mediawiki.org not always returning results, any ideas how she can help debug this further? Any ideas welcome.
[15:30:44] that's https://bugzilla.wikimedia.org/show_bug.cgi?id=42423 , by the way
[15:32:21] andre__: I posted a screenshot to a similar bug, lemme find it
[15:32:53] she can reproduce it, I cannot, and I'd love to see her help track down and fix such an annoying bug...
[15:33:44] sumanah, https://bugzilla.wikimedia.org/show_bug.cgi?id=16236
[15:33:46] andre__: https://bugzilla.wikimedia.org/show_bug.cgi?id=16236 -- it was intermittent! I think it
[15:33:48] ha
[15:33:49] yeah
[15:33:50] hah!
[15:33:53] I winz! ;)
[15:34:34] okay, I'll read that one and see if there are some debug instructions
[15:34:39] so for me, as you can see, FIRST I searched and got 0 results, and then when I hit search once more, results came up.
[15:35:03] so I suspect that thinking of uncommon search terms for mediawiki.org and trying them might work to reproduce, but only the first time? maybe there's an indexing failure?
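
[Editor's note: a rough, hypothetical repro harness along the lines sumanah suggests. The list=search parameters are the standard MediaWiki search API; the search term, attempt count, and delay are arbitrary guesses, not anything agreed in the channel.]

    # Hit the mediawiki.org search API repeatedly with the same term and
    # flag any response whose hit list is unexpectedly empty.
    import json
    import time
    from urllib.parse import urlencode
    from urllib.request import urlopen

    API = "https://www.mediawiki.org/w/api.php"
    TERM = "extension"  # any term known to have matches

    for attempt in range(20):
        query = urlencode({
            "action": "query",
            "list": "search",
            "srsearch": TERM,
            "format": "json",
        })
        with urlopen(API + "?" + query) as resp:
            hits = json.load(resp)["query"]["search"]
        if not hits:
            print("attempt %d: empty result set!" % attempt)
        time.sleep(5)  # be gentle; the bug seems timing-dependent
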
[15:37:37] oh true, I remember that one now. but unfortunately no comments on how to track it down further
[16:47:03] andre__: you can't reproduce it at all?
[16:47:15] jeremyb, so far not
[16:47:38] andre__: i got empty results *after* first getting a good result set. so it's not about warming the cache. i think
[16:47:46] andre__: did you see my attachment?
[16:48:15] jeremyb, yes, I saw it
[16:48:18] and my search term wasn't obscure either. and it was easy to repro
[16:48:58] jeremyb, as a number of people can reproduce it, the more important question is how (and who) to track down the reasons. Help on that part is highly welcome.
[16:49:32] andre__: well, first maybe we get it to include the comment on where it tried to fetch from and what the error was, in place of "fetched via"
[16:49:52] ("fetched via" is missing from my empty one)
[16:51:26] i tried a search that should legitimately have an empty result set, and that was also missing the debug comment
[16:52:40] jeremyb, that would be https://bugzilla.wikimedia.org/show_bug.cgi?id=43544 I guess
[16:53:03] no
[16:53:14] maybe that could fix both, but this is distinct
[16:53:18] Have we got a "Lucene sucks" bug?
[16:53:24] Reedy, many :)
[16:53:39] Perfect tracking bug
[16:53:45] <^demon> I remember when search *really* sucked.
[16:53:52] <^demon> And so we told people to use Google instead.
[16:54:14] ^demon: i was just talking about that a day or two ago! with notpeter
[16:54:47] this being missing is not necessarily the same bug as needing to show an error to users:
[16:55:11] also, i said it's missing even when there are legitimately no results (and hence there should be no error to users)
[16:55:40] I see. Would that be worth another separate enhancement request, if I get it right?
[16:56:01] i guess...
[16:56:16] anyway, the point is maybe the comment would help debug further
[17:00:09] jeremyb: could you file that request in Bugzilla? It's not that I'm lazy (or a bit, maybe ;-), but more that you understand much better what is needed...
[17:14:03] !b 43869 | andre__
[17:14:03] andre__: https://bugzilla.wikimedia.org/43869
[17:14:07] you want to set deps for me?
[17:15:10] jeremyb, thanks a lot, appreciated! If you feel like it, sure, otherwise I can try :)
[17:15:26] andre__: i feel like not ;)
[17:15:42] :D okay, sure
[18:38:32] moinsen
[18:39:01] first question: do i have to speak English here?
[18:39:52] hi foXen_
[18:40:12] foXen_: you don't *have* to, but i'm afraid you won't be understood if you don't :)
[18:41:20] foXen_, it's preferred :)
[18:41:51] and if you miss words or such, somebody might be able to help (depending on the language, of course)
[18:42:32] hehe, k...
[18:42:53] well, we got the parser extension installed today
[18:43:11] i tried it and it seems to work, but...
[18:43:56] as i copied an example from Wikipedia... to show/hide a table row, it doesn't work as expected
[18:44:20] maybe you want to take a look first at: http://wiki.goetterheimat.de/index.php?title=Vorlage:Verteidigung
[18:45:00] please note, that template isn't finished yet - that means all styling is inline, etc., but i think that doesn't matter here
[18:45:15] it's mainly one big table with some more included
[18:45:45] check out the inner table following "Versuch via Kopie von [1] :" (German for "attempt via copy of [1]")
[18:46:29] which is an exact copy of http://en.wikipedia.org/wiki/Help:Table#Conditional_table_row
[18:50:06] the exact same template is rendered on http://de.wikipedia.org/w/index.php?title=Vorlage:Spielwiese as expected... so is there a problem with our wiki setup, or where do we have to look?
[18:50:22] Nemo_bis: you're making all the bugs hard :/
[18:52:09] foXen_: your {{!}} was broken
[18:52:26] was it?... will check your changes...
[18:53:07] foXen_: argh, i can't save it
[18:53:10] i don't speak German
[18:53:11] but
[18:53:24] you need to remove the newlines between and
[18:53:42] aahh.. i'll try
[18:53:48] (i can't answer those security questions :P)
[18:54:36] **
[18:54:40] grins
[18:54:55] you're right, thanks!
[18:54:58] i searched half the day
[18:55:37] haha
[18:55:43] yeah, wikitext is badly whitespace-sensitive
[18:55:51] glad i could help :)
[18:56:01] sometimes it's only a linefeed - as with IE and HTML :D
[18:56:58] so I'll tune our template now... again: many thanks! :)
[19:14:53] legoktm: I have 5 more tabs open
[19:15:02] :(
[19:15:18] But it makes my life easier trying to find actual easy bugs :P
[19:19:32] binasher: chris is telling me that 10 R720xds are arriving on Monday for databases
[19:19:36] binasher: and that these come with an H710
[19:20:21] paravoid: that is correct
[19:20:32] binasher: when will you need those? can I borrow an H710 for tests? if it comes to that, could we perhaps get more immediately while waiting for more from Dell, or are these boxes needed for the eqiad migration?
[19:20:37] legoktm: you mean that by excluding bugs tagged easy you're more likely to find easy bugs?
[19:20:57] (and I just realized I'm on #wikimedia-tech, duh)
[19:21:12] Er, I meant that by removing "easy" from bugs that aren't easy, it makes it easier to find real easy bugs
[19:21:24] paravoid: you can use one for testing. you might want to swap out the 15k rpm SAS drives first
[19:21:32] no, we'll just swap the controllers
[19:21:38] easy enough
[19:22:39] just don't swap an H310 into the db server.. leave it controllerless :)
[19:23:27] haha
[19:23:32] legoktm: ah, ok ;)
[19:23:39] a db server with an H310, that'd be... fun
[19:24:51] hmm, we do need to build a garbage shard for aftv5
[19:26:00] There are some requests related to Planet that have been waiting for three months here: https://meta.wikimedia.org/wiki/Planet_Wikimedia
[19:26:03] Do you know someone who is allowed to edit the Planet configuration?
[19:30:06] Tpt: I just contacted the last editor (James) to find out if he still has time
[19:31:19] andre__: Thanks. :-)
[20:24:18] brion: Hi, do we have a tool to figure out which Commons files are being used by a certain wiki?
[20:25:07] tarawneh_: https://commons.wikimedia.org/wiki/File:Castillo_Trausnitz,_Landshut,_Alemania,_2012-05-27,_DD_18.JPG#globalusage
[20:26:08] brion: :) not exactly what I needed. I need a list of files used by ar.wiki
[20:26:41] well
[20:26:59] take everything referenced in arwiki's 'imagelinks' table that's not in arwiki's 'image' table
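
[Editor's note: a sketch of brion's suggestion for a Toolserver-style database account. The host name is a placeholder and the MySQLdb/credentials setup is an assumption about that environment; the imagelinks/image table and column names are the standard MediaWiki schema. Files referenced locally with no local image row are, in practice, the ones served from Commons.]

    import os
    import MySQLdb  # MySQL-python; assumed available on the Toolserver

    conn = MySQLdb.connect(
        host="replica.example.org",  # placeholder replica host, not real
        db="arwiki_p",
        read_default_file=os.path.expanduser("~/.my.cnf"),
    )
    cur = conn.cursor()

    # Every file name referenced on arwiki that has no row in the local
    # `image` table must come from a shared repo -- i.e. Commons.
    cur.execute("""
        SELECT DISTINCT il_to
        FROM imagelinks
        LEFT JOIN image ON img_name = il_to
        WHERE img_name IS NULL
    """)
    for (name,) in cur.fetchall():
        print(name)
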
[20:27:32] hmm
[20:28:42] so I will need access to the db somehow
[20:29:10] if not through the Toolserver, you can download the table dumps from download.wikimedia.org
[20:29:38] I recall I had an account a few years ago. I need to check it.
[20:29:52] Thanks, brion
[20:35:31] good luck :D
[23:29:56] on pt.wikivoyage, is it possible to change the interwiki of "wikipedia", which goes to en.wp, to pt.wp?
[23:33:15] Alchimista: [[wikipedia:]] goes to en.wiki on all WMF wikis as far as i know
[23:33:30] Alchimista: [[w:]] leads to Wikipedia in the same language
[23:34:14] MatmaRex: http://en.wikipedia.org/wiki/Special:Search?go=Go&search=wikipedia: is an *interwiki*, from Wikivoyage to Wikipedia, so it's strange that it has to pass through en.wp
[23:34:30] dammit.. http://en.wikipedia.org/wiki/Special:Search?go=Go&search=wikipedia:
[23:38:03] Alchimista: the same prefix is enabled, for example, on the metawiki
[23:38:31] no problem then ;)
[23:38:49] Alchimista: and it's sort of an "external" link, like [[google:]]
[23:39:05] (this one links to Google in English as well)
[23:39:25] and [[w:]], [[wikt:]], [[s:]] - these are "internal" and link to "the same" language
[23:39:30] (assuming the project exists)
[23:40:22] yes, i've never used a link like http://en.wikipedia.org/wiki/Special:Search?go=Go&search=wikipedia:, that's why i thought it strange
[23:42:34] They can also be re-assigned; if it's going to a different language, you should submit a Bugzilla request to get it changed
[23:42:49] oh, I misread
[23:42:58] sorry