[15:19:05] Could someone please tell me why I was banned from #wikimedia-operations ?
[15:53:30] ToAruShiroiNeko: I can try to find out, do you have any clues?
[15:54:03] ToAruShiroiNeko: Might have something to do with you being in the channel?
[15:59:49] hmm
[15:59:53] oh it's resolved
[16:00:09] I am still curious as to why but I dont want to waste peoples times
[16:00:24] I honestly have no idea as to why
[16:00:37] Hm, interesting.
[16:00:43] I wonder if there are logs of these things
[16:00:46] I hardly talk there
[16:00:54] the online log doesnt indicate bans
[16:01:04] and on my irc window I dont see an actual ban
[16:01:11] I realised it today
[16:01:21] and my buffer doesnt show the ban
[16:02:37] Funky. Does that channel have public logs? Or maybe you're logging locally?
[16:04:14] http://bots.wmflabs.org/~petrb/logs/%23wikimedia-operations/
[16:04:21] forgot about them because I never look at them
[16:06:39] marktraceur sure but those do not seem to keep bans :/
[16:07:14] Hm
[16:07:27] I keep logging on my end too
[16:08:18] but the ban was either carried out days ago or when I wasnt in the channel
[16:08:30] my actual suspect is chanserv screw-up
[16:08:49] since services went wild yesterday
[16:09:26] *nod* makes sense
[16:12:16] See, this is why I love wikitech-l: 70 messages about how end users should have been notified about the really obscure backend change, and only 6 in the appreciation thread. Very entertaining.
[16:12:48] (obscure to end-users, I should say, but that's what the thread is about)
[16:19:22] hmm
[16:20:14] I just went through my logs, dont see a banning so yeah I blame chanserv
[16:20:29] marktraceur about that
[16:20:40] I am guessing you develop mediawiki?
[16:20:51] K, as long as you're happy with that answer
[16:21:05] ToAruShiroiNeko: Mostly extensions, but I've dived into core before as well
[16:21:51] thats better
[16:21:52] https://bugzilla.wikimedia.org/show_bug.cgi?id=38416
[16:22:00] I had this proposal a while ago
[16:22:15] I dont expect anyone to pick it up in its current form
[16:22:35] I am curious if there are any open source tools you might know
[16:23:13] I think we've talked about this before
[16:23:23] oh?
[16:23:26] You sent me to a NVidia site that required Flash
[16:23:29] my memory is hazy
[16:23:31] oh yes
[16:24:15] html5 method seems more interesting probably
[16:25:52] *nod* but the HTML5 method wasn't as well supported
[16:33:52] sure, nvidia might not support it well
[16:34:10] we need open sourced development anyways :p
[16:38:00] ToAruShiroiNeko: In any case, I'm really not the person to ask, I'd contact mdale or someone similarly interested in multimedia bits
[16:38:16] ToAruShiroiNeko: I think j^ does some of that, too
[16:44:25] alright
[18:28:53] AaronSchulz: preilly - is now a good time to talk about 20% time stuff?
[18:50:28] Change merged: Demon; [analytics/webstatscollector] (master) - https://gerrit.wikimedia.org/r/18653
[18:50:39] Change merged: Demon; [analytics/DeviceMapLogCapture] (master) - https://gerrit.wikimedia.org/r/18636
[18:50:48] Change merged: Demon; [analytics/global-dev/dashboard-data] (master) - https://gerrit.wikimedia.org/r/18644
[18:51:28] * AaronSchulz is teh busy
[18:51:46] Change abandoned: Demon; "Mistake." [analytics/E3Analysis] (master) - https://gerrit.wikimedia.org/r/18637
[18:51:56] Change merged: Demon; [analytics/asana-stats] (master) - https://gerrit.wikimedia.org/r/18638
[18:52:03] Change merged: Demon; [analytics/check-stats] (master) - https://gerrit.wikimedia.org/r/18639
[18:52:08] Change merged: Demon; [analytics/editor-geocoding] (master) - https://gerrit.wikimedia.org/r/18640
[18:52:14] Change merged: Demon; [analytics/gerrit-stats] (master) - https://gerrit.wikimedia.org/r/18641
[18:52:21] Change merged: Demon; [analytics/gerrit-stats/data] (master) - https://gerrit.wikimedia.org/r/18642
[18:52:29] Change merged: Demon; [analytics/global-dev/dashboard] (master) - https://gerrit.wikimedia.org/r/18643
[18:52:37] Change merged: Demon; [analytics/global-dev/reportcard] (master) - https://gerrit.wikimedia.org/r/18645
[18:52:43] Change merged: Demon; [analytics/global-dev/sqproc] (master) - https://gerrit.wikimedia.org/r/18646
[18:52:51] Change merged: Demon; [analytics/packages/thrift] (master) - https://gerrit.wikimedia.org/r/18648
[18:52:58] Change merged: Demon; [analytics/reportcard] (master) - https://gerrit.wikimedia.org/r/18649
[18:53:05] Change merged: Demon; [analytics/tools/kripke] (master) - https://gerrit.wikimedia.org/r/18651
[18:53:11] Change merged: Demon; [analytics/udplog] (master) - https://gerrit.wikimedia.org/r/18652
[19:07:33] preilly: ping
[19:15:49] tfinc: hey, is preilly around today?
[19:17:05] I don't see him, but it's lunch time so who knows
[19:33:21] <^demon> JeroenDeDauw: When do you think you could get that existing code into Gitweb and DataValues?
[19:33:48] <^demon> I already created Gitweb, but not DataValues just yet. Right now HEAD is pointing to nothing since master does not exist.
[19:47:12] ^demon: I can do both now
[19:47:20] ^demon: and will do that if you create the DV repo
[19:47:59] <^demon> Done.
[19:59:33] hi kaldari - thank you for mentoring Ankur
[19:59:56] no problem, when do I need to do the final evaluation for him?
[20:00:24] He's done some great work
[20:00:24] now
[20:00:43] kaldari: you should have an email from Melange in your inbox now, asking for that final eval basically ASAP
[20:00:50] because sometime early tomorrow is the deadline
[20:01:11] kaldari: he also sent his final roundup email to wikitech-l before any of the other students, which endears him to me :)
[20:01:14] ah yes, I see it
[20:04:26] sumanah: he is
[20:04:58] sumanah: he's at lunch right now. do you want me to pass anything on ?
[20:05:32] tfinc: ah, thanks. I was hoping to chat and get an idea of what he's up to in terms of 20% time -- the thing I was thinking of asking was Lua-related, I know he's into that
[20:28:10] sumanah: I'm not doing 20% time right now
[20:28:18] sumanah: we are on 20% break
[20:28:22] preilly: Just talked to tfinc about that.
[20:28:35] Am updating the 20% wiki page, which is what I go off of
[20:31:47] kaldari, mwreview is now running 3039c0417950793ed8da6e302815bedf40f676b2 ; still getting the extension error when importing a photoset though
[20:32:32] what's the URL for that wiki again? I'll see about getting an account set up on it.
[20:32:44] Does the error say anything specific?
[20:33:01] http://mwreview.wmflabs.org/wiki/index.php/Special:UploadWizard
[20:33:12] preilly: separately from 20% stuff -- I know you've been doing some Scribunto-related work, and there's a wikitech-l thread right now where your expertise might come in handy
[20:33:51] importing any pics from http://www.flickr.com/photos/fabola/sets/72157630578652390/ results in an error "Could not understand the file name '.jpg'" and a blank page
[20:34:57] Eloquence: It's been noted on the Etherpad and the Gerrit review, presumably drecodeam and/or kaldari are on it
[20:35:02] k
[20:35:03] photoset importing is working for me locally, but there are lots of variables involved :)
[20:35:19] kaldari: I think Eloquence means single photos (confirm?)
[20:35:26] no, batch from the photoset
[20:35:30] Oh
[20:35:34] Well, single photos also don't work
[20:36:00] btw kaldari https://www.mediawiki.org/wiki/Git/Gerrit_project_ownership has an UploadWizard request you might want to mark approval/disapproval on
[20:36:06] ori-l: sync just finished
[20:36:16] 2 actually
[20:36:16] * ori-l tips his hat to kaldari.
[20:36:19] thanks!
[20:39:05] kaldari: d'oh, i forgot to update fenari, so i'll still need to sync two files. are you done w/your deployment? (i don't mind waiting if not.)
[20:39:23] yes, all done
[20:39:28] cool, thanks
[20:43:38] Eloquence: strangely, I only get that error from that photoset. I tried with a different photoset and it sort-of worked (although I got a JS error on one of the files).
[20:44:05] Shouldn't be too hard to debug it though
[20:44:20] which photoset did you try that worked (in case it's shareable ;-)
[20:44:25] * ori-l is done.
[20:44:57] well... http://www.flickr.com/photos/opoterser/sets/72157629202125460/ ... but it's not a great one to test with since most of those images already exist on Commons
[20:45:46] mwreview doesn't have instantcommons enabled .. so there shouldn't be any potential for conflict when doing a local import
[20:46:06] I still get a "File already exists" message when attempting an import, and only one file from the batch is loaded
[20:46:21] One of the tricky aspects is that Flickr handles things differently for Pro accounts - you can actually get the original, non-jpg formats for Pro accounts, but for everyone else you only get the Flickr jpeg versions.
[20:46:34] actually the error is "There are some other files already on the site with the same content.", and the "some files" link goes to http://mwreview.wmflabs.org/wiki/index.php/$2
[20:54:26] hey marktraceur
[20:54:48] commons is now running the latest master, right? (according to special:version at least). I'm still able to get the duplicate-archive error on the last step
[20:56:09] Eloquence: Commons is not (to my knowledge)
[20:56:15] kaldari: Confirm?
[20:57:31] Commons should be up to date now. I'm not sure if that fix made it in or not. Which bug or change was that one?
[20:58:32] Looks like https://gerrit.wikimedia.org/r/#/c/20814/ which is not yet merged
[20:59:08] Rightio.
[21:00:15] here are the ones that have been merged: https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/extensions/UploadWizard,n,z
[21:10:50] Eloquence: I see the problem. All of Fabrice's images are untitled in Flickr, thus ".jpg" is the file name that is generated.
[21:11:27] I guess in those cases we would just want to generate something based on the Flickr user's name or something
[21:11:38] or the set name
[21:13:19] marktraceur, kaldari: major production issue - looks like at some point the language default selection got screwed up; it's set to "Авар" regardless of user language.
[21:13:33] so all UW uploads will get this as the description language unless the user changes it.
[21:13:49] Looking...
[21:14:10] Eloquence: I think I saw that it defaulted to the first in the list, but I didn't see "????"
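The untitled-photo fix kaldari floats above (21:11) — derive a name from the Flickr owner or set instead of emitting a bare ".jpg" — could be sketched like this. The function name and parameters are my invention for illustration, not UploadWizard's actual code:

```shell
# Hypothetical sketch of the fallback for untitled Flickr photos: never
# produce an empty base name; fall back to owner + photo id instead.
flickr_filename() {
    title=$1; owner=$2; photo_id=$3
    if [ -z "$title" ]; then
        # untitled photo: synthesize a base name from owner and photo id
        title="${owner}_${photo_id}"
    fi
    printf '%s.jpg\n' "$title"
}
```

With this, an untitled photo from the set above would import as something like `fabola_7571234567.jpg` rather than failing with "Could not understand the file name '.jpg'".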
[21:14:18] that code was rewritten, but the default selection was working correctly in local tests
[21:14:58] here's an example upload with the incorrect default: https://commons.wikimedia.org/w/index.php?title=File:Fleur_violette_du_jardin.JPG&action=edit
[21:15:07] also note that the template code for selecting the language seems screwed up
[21:15:08] we'll either patch it up in a bit or revert it
[21:15:29] even when selecting a different language, it uses the language name as the template name, instead of the language prefix
[21:15:33] Yeah, that's the first in the list all right
[21:16:16] example: https://commons.wikimedia.org/w/index.php?title=File:F%C3%BCrstenbersk%C3%BD_pal%C3%A1c_(Vald%C5%A1tejnsk%C3%A1)_01.JPG&action=edit
[21:16:22] those languages are supposed to be in the user's language as well
[21:16:47] it should use {{cs|1=Fürstenberský palác (Valdštejnská)}} but instead uses {{Česky|1=Fürstenberský palác (Valdštejnská)}} which means the description won't be shown at all
[21:17:04] kaldari: Only if that class and the method both exist and are callable. I never had that working, even locally
[21:17:09] yes
[21:17:23] I think that must be part of the issue
[21:17:46] I'll disable the UW gadget for now
[21:17:49] let me get cldr running on Commons and see if that fixes it
[21:20:11] hmm, cldr should be running on Commons
[21:20:35] I'll check on fenari directly
[21:21:22] it shows up at https://commons.wikimedia.org/wiki/Special:Version
[21:21:39] this change recently got merged: https://gerrit.wikimedia.org/r/#/c/9528/
[21:21:43] let me try locally if this causes it
[21:25:33] it works great locally. I wonder if there were changes made to the cldr extension recently
[21:27:19] Ah well, I'm going to go ahead and back that change out
[21:27:25] Aw
[21:28:20] files uploaded a couple of hours ago are fine, so it looks like this was definitely introduced with the deploy
[21:28:36] Eloquence: It's definitely that patchset, I agree
[21:29:01] I just didn't set the default correctly, so I'll amend that commit or make a new one that does it right
[21:29:21] it works correctly for me locally, even with cldr off
[21:31:17] must be something about the environment that's different
[21:32:46] it's changeset 9528 in combination with MediaWiki:LanguageHandler.js
[21:33:07] UW retrieves the list from MediaWiki:LanguageHandler.js if it exists
[21:33:24] with mark's commit and LanguageHandler in place I can reproduce the issue locally.
[21:34:06] we should be OK as long as we back that one out ASAP
[21:35:58] do we need to temporarily disable UW entirely?
[21:36:09] since it's generating corrupted descriptions right now
[21:37:25] Eloquence: Backing out the commit should be fine until we can fix it
[21:37:43] kaldari's on it, my theory right now is that patchset 10130 will fix the problem
[21:37:53] We'll experiment
[21:38:37] ok .. if you need to disable it, fyi, there's a $wgUploadWizardConfig['fallbackToAltUploadForm'] which, when set to true, will gracefully redirect users to the old form. that's better than hiding links and such. I'll be back in a bit.
[21:39:04] *nod*
[21:43:08] syncing test now with backout
[21:47:26] Yeha, kaldari, I think if we squash 9528 and 10130, maybe make them a new commit, and test that, it should be working properly
[21:47:40] s/Yeha/Yeah/
[21:48:59] syncing Commons/etc with backout
[21:49:41] Righto
[21:55:17] marktraceur: We probably want to get rid of UploadWizard calling MediaWiki:LanguageHandler.js as well, since it's now superfluous.
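The template bug Eloquence describes at 21:16 — descriptions emitted as {{Česky|…}} instead of {{cs|…}} — comes down to using the menu's display name (the autonym) where the language code belongs. A minimal sketch of the correct mapping; the lookup table here is a tiny illustrative stand-in, not the real cldr data or UploadWizard's code:

```shell
# Sketch of the intended behaviour: wikitext description templates must be
# keyed by language code ({{cs|...}}), never by the autonym shown in the
# language menu ({{Česky|...}}). Mapping table is illustrative only.
lang_code_for() {
    case $1 in
        "Česky")   echo cs ;;
        "Deutsch") echo de ;;
        "English") echo en ;;
        *)         echo "$1" ;;   # assume it is already a language code
    esac
}

description_template() {
    # $1 = selected language (autonym or code), $2 = description text
    printf '{{%s|1=%s}}' "$(lang_code_for "$1")" "$2"
}
```

So `description_template "Česky" "Fürstenberský palác (Valdštejnská)"` yields `{{cs|1=Fürstenberský palác (Valdštejnská)}}`, a template Commons actually renders.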
[21:55:58] *nod* agreed
[22:02:28] kaldari, AaronSchulz et al - is there a faster way to sync for a single extension change like this? It looks like a full sync takes upwards of 20 minutes, which is really prohibitively slow for production-level fixes
[22:03:02] Eloquence: It depends. There is a shortcut but only if there are no i18n changes involved
[22:03:10] yeah, sync-dir
[22:03:13] the change involves both PHP and JS, so I ran scap to be safe
[22:03:27] scap is safe? :D
[22:03:34] sync-dir php-1.20wmf10/extensions/UploadWizard will work as long as i18n didn't change
[22:03:45] ok
[22:07:09] scap is finished
[22:07:22] bug seems fixed :P
[22:09:19] kaldari: New patchset with your name on it (literally!) that should fix two bugs and not introduce any new ones
[22:09:26] cool
[22:09:49] the rest of the deployed code is still on commons, just backed out that one feature
[22:10:11] Which is great, because this new patch is rebased on master, so everything should be hunky and a little bit dory
[22:10:18] just added some details regarding the protect page issue, which continues to persist
[22:10:39] i.e. protected titles now trigger the "checking for uniqueness" pop-up which won't go away
[22:11:02] Thaaat's not good
[22:11:09] Eloquence: Where are these details?
[22:11:12] (that alert() needs to burn in hell soon :)
[22:11:17] Oh, got it
[22:12:07] Eloquence: Are you sure? I get the error properly. Maybe it's browser-specific?
[22:12:36] make sure you follow the steps to repro exactly
[22:12:45] Righto
[22:12:46] and yeah, only tested in chrome
[22:13:37] Just tested in Chrome....wait, Eloquence, did you include "File:" in the filename as well?
[22:14:00] No, makes no difference to me
[22:14:42] New plan
[22:14:57] Ah.
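The deploy rule of thumb AaronSchulz gives at 22:03 — sync-dir for a single-extension change unless i18n changed, otherwise a full scap — can be sketched as a small dispatcher. The two commands are quoted from the log; the way the changed-file list is checked here is my assumption:

```shell
# Sketch of the sync shortcut: a single-extension change can go out with
# sync-dir, but any i18n change forces a full scap (l10n cache rebuild).
# The i18n-detection heuristic is an assumption, not the real tooling.
choose_sync_command() {
    changed_files=$1   # space-separated list of changed paths
    case $changed_files in
        *i18n*) echo "scap" ;;   # message files changed: full sync required
        *)      echo "sync-dir php-1.20wmf10/extensions/UploadWizard" ;;
    esac
}
```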
[22:15:01] That's no good
[22:18:13] Eloquence: https://gerrit.wikimedia.org/r/#/c/7608/ still isn't merged, it may fix the problem, I will test
[22:19:14] marktraceur, looks like l10n-bot ate your homework
[22:19:22] https://gerrit.wikimedia.org/r/#/c/21252/ removed some of the messages added for idfield2 etc.
[22:19:31] so it's now borked in production
[22:19:45] Bollocks
[22:20:06] And the protection methods, too
[22:20:14] Why would it do that? Grrrr.
[22:20:48] siebrand, awake?
[22:21:29] I'll send him a note. in the meantime best just do a f-up commit
[22:22:06] Eloquence: barely.
[22:22:50] l10n-bot reverted? This can happen if i18n changes are merged while Raimond is exporting. It's usually a 30 minute window.
[22:23:17] Can happen occasionally. We update sources, export and commit. Whatever is merged between update and commit may be eaten.
[22:24:32] should the bot ever remove English source messages or could that reasonably trigger an automatic deferral of a commit?
[22:25:26] It was merged 23.08.2012 21:22 and the i18n update came 23.08.2012 22:10
[22:25:48] siebrand: If l10n-bot bases its changes properly, this will be avoided
[22:25:51] So Raimond should probably check for i18n merges between his last update and his export.
[22:26:01] OTOH it means that it may not be able to merge its own changes in due to conflicts
[22:26:14] yep, that happens.
[22:26:29] If I see that, I just abandon, because there's always the next day.
[22:26:39] happens very rarely.
[22:26:47] Eloquence: assert( "f-up commit" === "fix the crap l10n-bot broke" )
[22:27:40] even friendly bots sometimes want to destroy all humans. it's in their nature.
[22:27:55] the only way to avoid it that I can see is to further reduce the time between update, export, push and merge, which would mean that we need more manual work for processing the input sequentially.
[22:28:05] that's not going to happen.
[22:28:56] I'm sorry it happened. I'm glad it was caught; I'll see if I can get Raimond to shorten the fuckup window.
[22:29:28] siebrand: I'm asserting there need not be a fuckup window if this is implemented correctly
[22:29:48] feel free to inspect our process and implement your assertion.
[22:29:51] i.e., you check out a revision, run your stuff, then create a commit on top of /that/ revision and submit it
[22:30:23] So you should *not* update your checkout between running the scripts and committing, because that introduces this problem
[22:30:32] you're forgetting a few things there, but I'll forgive you for that. The complexity of the process is more often misunderstood.
[22:30:40] Even better, you could create a temporary branch off master at the point where you start running the script
[22:30:43] What am I forgetting here?
[22:30:55] * RoanKattouw realizes there is more complexity but would like to understand :)
[22:30:58] that the update may contain changes that need to be processed twn side.
[22:31:07] So here's the process:
[22:31:17] we run "repoupdate mediawiki-extensions"
[22:31:42] this updates 350 or so git extension repos, and the svn extensions tree.
[22:32:08] Then the content is synchronized with the wiki for en and qqq (sync-extmaintained ext-*)
[22:32:20] Then the fuzzyBot user changes are inspected.
[22:32:46] If any English messages changed, an assessment is made if those messages need to be fuzzied.
[22:32:54] If so, they're fuzzied: fuzzyr key
[22:33:14] then the export repo is updated:
[22:33:36] repoupdate mediawiki-extensions /resources/username/mediawiki-extensions
[22:33:36] "the export repo"?
[22:33:50] We have source files that are clean, and remain clean.
[22:33:57] Ah, OK
[22:33:58] We commit from separate checkouts.
[22:34:19] So you update your live repo at the start of the process, but you update the export repo near the end
[22:34:23] I think that's the main cause of the problem
[22:34:27] This procedure is heavily optimized, by running background tasks; one git pull per 0.4 seconds.
[22:34:40] This takes a few minutes for all repos.
[22:34:54] I know what the problem is, but it cannot be avoided.
[22:35:01] There is a lag.
[22:35:11] It can, for git repos at least
[22:35:34] When you update the export repo, it will potentially update to a revision beyond the one in the live repo
[22:35:36] Once everything is processed properly, and all repos are up to date, what *I* do, is check if the recent merges contain i18n.
[22:35:44] But it will at least *have* that revision in its history
[22:35:46] I *assume* that's what Raimond doesn't do.
[22:35:54] So you can create a branch off of that revision
[22:35:59] Then we run the export command.
[22:36:10] repoexport mediawiki-extensions targetfolder
[22:36:31] This starts 26 threads, for message groups ext-[a-z]
[22:36:54] Depending on the number of changes, this takes anywhere from 30 seconds to a few minutes.
[22:37:16] Immediately after that, all repos are checked for changes, local commit and push is done
[22:37:24] Last, an svn commit is made.
[22:37:38] Then there's auto merge of all open l10n-bot commits.
[22:37:50] Then the source repo is updated again, and it's done.
[22:37:56] Right
[22:38:09] So you see… It's a bit of work.
[22:38:17] So what I would suggest, is that when you update the export repo, you then roll back its history so it's in the same state as the source repo
[22:38:31] You can do this in git with git checkout -b mytempbranchname
[22:38:35] thats always done.
[22:38:38] And in SVN with svn update --revision=12345
[22:38:56] If that's actually done then the problem we're discussing is impossible
[22:38:56] siebrand, why not have the commit repo pull from the original repo ?
[22:39:04] git fetch -q --all && git reset -q --hard origin/master && git clean -q -f -d &
[22:39:09] Yeah that would be even better
[22:39:14] ;)
[22:39:21] siebrand: But the meaning of origin/master changes over time
[22:39:24] it's not like we haven't thought about it.
[22:40:07] the 'critical window' is always there.
[22:40:10] I always forget about how git clones are also repos and you can pull from them
[22:40:24] No, it doesn't need to be there
[22:40:36] Say when you initially update the source repo, you're at revision A
[22:40:44] at worst, you would get a merge conflict
[22:40:45] Then you do all your processing based on the i18n in revision A
[22:40:57] which would hopefully auto-resolve on next export
[22:41:20] Half an hour later, you're done processing and you update the export repo, which updates it to revision B (because master has changed from A to B in the meantime)
[22:41:23] This is the commit sequence:
[22:41:40] but you can have it update to revision A
[22:41:45] Most likely, this issue is occurring because you're creating a commit based on B, but it would be avoided if you created a commit based on A
[22:41:51] git commit -a -m "$COMMITMSG" && git fetch gerrit && git review -t l10n || :
[22:42:12] Hmm, I think the solution is simple
[22:42:21] do not use git-review ?
[22:42:33] Well, maybe, maybe not
[22:42:39] You probably want to pass -R there
[22:42:51] I don't mind commits failing. There's always another day.
[22:43:13] But also, I think this would get a lot better if you changed the export repo's "origin" remote to point to the source repo (via filesystem) rather than to gerrit (via HTTPS/SSH)
[22:43:31] ehr?
[22:43:35] noting that push should still go to ssh
[22:44:03] siebrand, your "work" repository can pull from the "untouched" repository
[22:44:04] Platonides: Yes, git-review takes care of that, it pushes to gerrit, not origin
[22:44:06] These scripts are not yet in the translatewiki repo, but I'll get them in there this week.
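RoanKattouw's fix — record the source checkout's revision A before processing, then base the l10n commit on exactly that revision, fetching over the filesystem rather than from gerrit — might look roughly like this. This is a reconstruction from the discussion, not the actual translatewiki.net scripts; the function and branch names are my own:

```shell
# Sketch of the race-free export flow: pin the export checkout to the exact
# revision the source checkout was processed against, so anything merged on
# master during the export window cannot be silently reverted.
pin_export_to_source() {
    src=$1; export_dir=$2
    rev_a=$(git -C "$src" rev-parse HEAD)    # revision A: what was processed
    git -C "$export_dir" fetch -q "$src"     # pull from the local clone, not the network
    # base the l10n commit on revision A, not on whatever master is now
    git -C "$export_dir" checkout -q -B l10n-export "$rev_a"
}
```

Gerrit then merges the l10n commit like any other change based on an older parent; at worst it needs a rebase, but it can no longer delete messages merged during the window.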
[22:44:22] I'll gladly accept patches that improve things.
[22:44:38] make sure they're properly commented :P
[22:45:01] well, not so much the scripts but the procedure you follow
[22:45:03] mind you that we're attempting 300 or so commits; it shouldn't take hours...
[22:45:18] (it's already taking a frigging long time with all the damn repos)
[22:45:24] it will actually be faster not to do a second fetch from the net
[22:45:37] Yeah
[22:45:53] So specifically, we'd have to patch the script that *creates* the export repo
[22:46:09] lemme commit the scripts to translatewiki repo in bin/.
[22:46:09] Assuming such a script exists
[22:46:24] Then I go get some sleep and I'll see what you come up with tomorrow morning :)
[22:46:35] actually, I see no reason it couldn't reuse the same repository
[22:47:08] Platonides: They don't want the export script's changes to go live on TWN until they get merged into master and pulled down, I guess
[22:47:22] the "untouched" files are those executed by TWN?
[22:47:44] I believe so
[22:47:57] then yes, it would be bad to be playing with them
[22:48:08] * siebrand nods.
[22:48:44] you can still benefit in disk usage from having the working repos be hardlinked copies
[22:49:01] although I suppose you will notice the reduced time much more
[22:49:27] Yeah
[22:59:49] RoanKattouw, Platonides : https://gerrit.wikimedia.org/r/#/c/21295/ added all the scripts (except for one containing a host and port for IRC shouts to #mediawiki-i18n).
[23:00:08] RoanKattouw, Platonides : You want to take a look at repoupdate and repocommit.
[23:00:14] zzz...
[23:00:19] good night, siebrand
[23:01:10] Hmm, why is that ps "merge pending"?
[23:03:27] svn propset -q svn:eol-style native *..php <- too many dots
[23:05:34] is repocommit run interactively?
[23:08:49] I miss the link between update, fuzzy and commit-push
[23:09:26] siebrand: That's not sleep, that's Gerrit.
[23:09:29] :)
[23:09:45] well, good night
[23:09:50] heh
[23:11:31] Took some time… Merged now.
[23:43:56] gwicke: ^ is subbu in Minnesota? :/
[23:44:08] Minneapolis
[23:44:11] Huh
[23:44:19] I probably knew that, but even so, huh.
[23:44:30] (this is interesting because I'm from there)
[23:44:43] ah- he'll be here next month ;)
[23:44:53] likely 4thish
[23:45:29] Oh good, I can complain to him about the abysmal Twins season and how people from Wisconsin drive like psychopaths, it'll be a fun time
[23:45:51] common enemies and all that
[23:45:56] ;)
[23:46:01] Right right
[23:51:41] Eloquence: Deployed fixed code to test.wiki