[02:13:59] How do I handle this error: Error undeleting file: A non-identical file already exists at "mwstore://local-swift-eqiad/local-public/b/bd/Aurora_&_sunset.jpg".
[02:16:52] morgankevinj_a: Move the file that exists?
[02:17:52] It does not appear to exist, see https://commons.wikimedia.org/w/index.php?title=File:Aurora_%26_sunset.jpg
[02:18:42] Weird.
[02:19:03] AaronSchulz: Thoughts?
[02:34:20] TimStarling, I get the error "Error undeleting file: A non-identical file already exists at "mwstore://local-swift-eqiad/local-public/b/bd/Aurora_&_sunset.jpg"." but no file is currently there.
[02:36:56] RobH
[02:38:33] Yes?
[02:41:45] Error undeleting file: A non-identical file already exists at "mwstore://local-swift-eqiad/local-public/b/bd/Aurora_&_sunset.jpg". but no file is currently there
[02:44:25] hrmm, I'm not exactly comfortable pushing deletes to Swift without anyone else about. Unless this is a serious copyright issue or something?
[02:44:46] (not saying it's not a problem, but it may be better suited for a ticket than just IRC, because it's not exactly normal hours for most folks)
[02:45:03] oh, undeleting
[02:45:11] hrmm
[02:45:21] (I still don't think I'm going to have an answer for you, don't get your hopes up)
[02:47:57] morgankevinj_a: So yeah, I have no clue. If it isn't a vital emergency, you may want to just file a Bugzilla ticket.
[02:49:58] if it's an issue that's blocking some major work or is a big problem, I can try to dig someone up to work on it immediately.
[02:50:25] (or if it's a case where nothing can be undeleted)
[02:51:26] Filed as bug 62359
[03:08:37] is someone perchance working on getting the X! edit counter back up and working?
[05:29:46] Nemo_bis: If people actually ask questions in here, they usually get answered.
[05:29:55] But it's pretty quiet.
[05:30:00] Perhaps people are afraid to ask.
[05:30:10] Also, #wikimedia-dev
[06:03:24] Gloria: I'm not saying the contrary, but for instance the Wikimedia error page didn't point here as was expected back then
[06:03:52] And I wasn't being serious :)
[06:12:44] * Gloria holds court.
[06:14:55] * ori curries favor with exotic gifts.
[09:01:26] Oh, the git-deploy component was removed, nice. Now I can type "Git" and reach the right component directly.
[09:01:38] Andre has a good surprise for us in Bugzilla every day, it seems.
[09:10:53] Nemo_bis: my favorite: type owner:sel [enter] into Gerrit's search field
[09:11:19] at 'owner' it starts autocompleting to owner:self, and continues to do so with each additional keystroke
[09:11:47] but on 'enter' you end up with owner:"Selvatamilancse "
[09:11:48] autocompletion is completely broken
[09:13:13] not if you're Selvatamilancse :)
[09:13:50] I think even for them. Every time I try to add a reviewer to a patch, I'm redirected to a random patch of mine.
[09:14:40] You always have to click one of the options, or click in the field and then tab-enter or click the button. It works with the new change screen, though.
[10:22:27] Heh, I had forgotten http://stats.wikimedia.org/EN/TablesUsagePageRequest.htm
[15:20:43] Still so many bugs to file http://etherpad.wikimedia.org/p/new-gerrit-change-view-comments
[15:23:04] "** "Reply" isn't even an obvious label. What are you replying to?" thinks it's obvious
[15:23:12] reply to the comment of somebody else
[16:05:49] mutante: yeah, I agree
[16:08:27] 6 Gerrit bugs filed, enough for today
[16:08:35] MatmaRex: hey, do your share too!
[16:08:44] I cannot reproduce the +2 etc. ones
[16:10:29] * MatmaRex checks bugs filed today
[16:10:38] Nemo_bis: I filed one!
[16:11:02] ah, unless you mean upstream
[16:11:10] then I filed two in total, I think, and both were ignored
[16:12:03] <^d> I read your bug.
[16:12:07] <^d> (One of them, at least)
[16:13:29] My bugs upstream were not (completely) ignored.
[16:13:48] But if they don't receive any reports, how can they make stuff better? :)
[17:30:36] wow https://twitter.com/UKMirrorService/status/441502459807416320
[17:46:13] I want my MediaWiki-Vagrant to run with DHCP disabled on eth0. Is that possible?
[18:05:29] apergos: do you know any command one can use to reliably validate XML dumps? With xmllint, multi-GB files always fail on memory limits. Best I found so far is xmllint --noout --loaddtd --valid --timing --maxmem 1500000000
[18:06:40] Nemo_bis: don't have any good thoughts on it, though it's a nice idea
[18:11:30] ok
[18:28:29] Nemo_bis: validating against the XML schema? should be able to just run it through a validating SAX parser in Java
[18:28:41] I used to have a script for it but... I don't still have it
[18:32:48] brion: ideally, yes; but even just validating that it's well-formed XML would help
[18:33:35] Currently we (WikiTeam) use something as embarrassing as: grep "" *.xml -c; grep "<page>" *.xml -c; grep "</page>" *.xml -c; grep "<revision>" *.xml -c; grep "</revision>" *.xml -c
[18:33:45] <brion> heh
[18:42:57] <brion> Nemo_bis: you might be able to rig something up easily in PHP using XMLReader; it seems to have an XSD schema validation option as well as inherently checking for well-formedness in the parser
[18:43:03] <brion> http://us2.php.net/manual/en/xmlreader.setschema.php
[18:43:23] <brion> just stream the file in through the parser until it ends and see if it errors out
[18:44:27] <Nemo_bis> Right. Will try. In the end I'd need something I can integrate in a Python script, though.
[18:44:46] <brion> well, you can shell out ;) but similar tools should exist in Python as well
[18:44:58] <brion> I'm just less familiar with Python XML handling
[18:47:37] <^d> brion: You cared about interwiki searches... wanna take a stab at a pretty ugly bug?
[18:47:46] <^d> (If not, say so hehe)
[18:52:45] * Nemo_bis reiterates kudos to everyone working on interwiki search
[18:53:06] <Nemo_bis> brion: yes, indeed using a shell command is my lazy solution to every problem in Python
[18:53:12] <brion> lol
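A minimal Python sketch of the streaming approach brion describes above, using lxml's iterparse so memory stays flat on multi-GB dumps. The optional XSD path is a placeholder; passing a schema to iterparse is lxml-specific and not something confirmed in the discussion:

    #!/usr/bin/env python
    # Usage: validate.py dump.xml [schema.xsd]
    # Assumes lxml is installed (pip install lxml).
    import sys
    from lxml import etree

    def validate(path, xsd_path=None):
        # Optionally validate against an XSD while parsing.
        schema = etree.XMLSchema(etree.parse(xsd_path)) if xsd_path else None
        try:
            # iterparse streams the file; clearing each finished element
            # (and deleting its already-seen siblings) keeps memory constant.
            for _event, elem in etree.iterparse(path, events=("end",), schema=schema):
                elem.clear()
                parent = elem.getparent()
                if parent is not None:
                    while elem.getprevious() is not None:
                        del parent[0]
            return True
        except (etree.XMLSyntaxError, etree.DocumentInvalid) as err:
            print("Invalid XML: %s" % err, file=sys.stderr)
            return False

    if __name__ == "__main__":
        sys.exit(0 if validate(*sys.argv[1:3]) else 1)

Unlike the xmllint invocation above, this never builds the whole tree, so it has no --maxmem equivalent to hit.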
[20:22:52] <Newyorkadam> hi
[20:22:55] <Newyorkadam> Gloria: hi
[20:23:06] <Newyorkadam> Betacommand: hi
[20:43:19] <Betacommand> Newyorkadam: hey
[20:43:43] <Newyorkadam> hi Betacommand
[20:43:48] <Newyorkadam> can you help me with my site?
[20:52:28] <Betacommand> Newyorkadam: what do you mean?
[20:52:40] <Newyorkadam> Betacommand: I'm making a website and I'd like your help :D
[20:53:06] <Betacommand> details......
[20:56:05] <Newyorkadam> Betacommand: nothing specific right now
[20:56:10] <Newyorkadam> someone is helping me in #css with a problem
[20:58:06] <mutante> Reedy: hrm. actually I need the dsh group now to shut them down :P:)
[21:05:47] <Tpt_> Nemo_bis: Hi! Do you know who I should ask for help with parser test problems? I can't manage to make https://gerrit.wikimedia.org/r/#/c/102475/4 work.
[21:07:30] <Nemo_bis> Hm, you already have cscott and gwicke there, try emailing them?
[21:23:54] <Tpt_> Nemo_bis: Thanks :-)
[21:24:16] <gwicke> Tpt_, did you check https://integration.wikimedia.org/ci/job/mediawiki-core-phpunit-parser/20798/console ?
[21:26:01] <Tpt_> gwicke: Yes, and I can't manage to reproduce the error locally. cscott is helping me in #mediawiki-parsoid
[21:31:29] <gwicke> Tpt_, ok
[21:46:19] <enhydra> https://en.wikipedia.org/wiki/Special:Contributions/10.4.1.65
[21:46:24] <enhydra> how is that possible?
[21:52:56] <MatmaRex> enhydra: that looks like a bot accidentally logged out.
[21:53:19] <enhydra> a bot editing from a 10.*.*.* address?
[21:55:20] <MatmaRex> enhydra: oh. that looks like some proxy misconfiguration then, this has happened before
[21:56:43] <MatmaRex> enhydra: I forwarded this to #wikimedia-operations, feel free to join that channel too :)
[21:56:54] <MatmaRex> ---------------------------------------------------------------------------------------
[21:56:54] <MatmaRex> [22:56] <MaxSem> MatmaRex, some host not configured as a proxy/whatever
[21:56:54] <MatmaRex> [22:56] <Damianz> Labs/tools are 10.x.x.x IPs
[21:57:08] <bd808> 10.4.0.0/21 is pmtpa labs
[21:57:41] <bd808> The logged-out bit was a Varnish cookie-mangling issue on 3/5 that was corrected early on 3/6
[22:13:36] <MaxSem> solution: https://en.wikipedia.org/w/index.php?title=Special:Log&type=block&user=&page=User:10.4.1.0%2F24 :P
[22:32:32] <rillke> Hi there, what does MediaWiki do to determine whether a page exists? I.e. which SQL query is fired first to determine that?
[22:34:27] <rillke> I am asking because we have a page at Commons that is in the database (page table) but is shown as if it did not exist.
[22:34:57] <rillke> The page is this one: https://commons.wikimedia.org/w/index.php?title=File:A_new_and_accurate_plan_of_Blenheim_Palace_-_L--39-Art_de_Cr%C3%A9er_les_Jardins_-1835--_pl.1_-_BL.jpg&action=edit&redlink=1
[22:35:26] <rillke> And you can find the relevant Village Pump discussion at https://commons.wikimedia.org/wiki/Commons:Village_pump#Disappearing_image
[22:37:36] <rillke> You can even view the old revision: https://commons.wikimedia.org/wiki/Special:PermanentLink/118131591
[22:38:06] <rillke> but when switching to the current revision, everything is gone
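For rillke's question: MediaWiki decides existence by looking the title up in the page table (Title::getArticleID(), which goes through the LinkCache); a page exists when a row for that (namespace, title) pair is found. A hedged sketch of the equivalent query in Python, where pymysql, the connection parameters, and the example title are all illustrative assumptions:

    # Rough equivalent of MediaWiki's page-existence lookup.
    # Assumes pymysql is installed; credentials are placeholders.
    import pymysql

    def page_exists(conn, namespace, title):
        # MediaWiki stores titles with underscores and without the namespace prefix.
        dbkey = title.replace(" ", "_")
        with conn.cursor() as cur:
            cur.execute(
                "SELECT page_id FROM page"
                " WHERE page_namespace = %s AND page_title = %s",
                (namespace, dbkey),
            )
            # A matching row (any page_id) means the page exists.
            return cur.fetchone() is not None

    conn = pymysql.connect(host="localhost", user="wiki",
                           password="secret", database="commonswiki")
    print(page_exists(conn, 6, "Example.jpg"))  # 6 = File: namespace

Note that existence results are cached in several layers (the in-process LinkCache, plus parser and object caches in front of it), so a stale cache entry is one possible, though unconfirmed, explanation for a row that is present in the page table while the wiki renders the page as missing.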