[00:06:16] hi varnent_
[00:06:31] sumanah: hey there
[00:07:47] sumanah: I never heard back from the main WM folks on what they wanted to do about hackathon
[00:08:31] varnent_: you mean the main Wikimania folks?
[00:08:41] please do poke 'em again
[00:08:50] oh yeah - lol - can't really use that acronym
[00:09:04] will do - ty for the show of support on that :)
[00:11:46] I'm still trying to navigate what this year's host committee wants to be hands on about - they want to do a rewrite of the friendly space policy - and I think they may have been irked I posted last year's version - which I did more out of habit of updating event wikis than an intentional declaration :)
[00:17:45] Change merged: Reedy; [mediawiki/tools/release] (master) - https://gerrit.wikimedia.org/r/47221
[00:19:53] a rewrite of the friendly space policy?
[00:22:01] dinnertime
[02:06:37] it's really user unfriendly that gerrit does not have any easy navigation to the web tree browsing
[02:07:15] one then has to hack the url to get there, which doesn't work every time if you don't know the structure properly
[02:07:29] Yeah, we want to replace gitweb with gitblit
[02:07:34] It's a nicer web frontend
[02:07:41] But we can't do that until we upgrade Gerrit, AFAIK
[02:08:15] well, isn't that an issue of gerrit actually?
[02:08:30] And of course the same version that we need for gitblit also broke LDAP support
[02:08:36] i am on some change page and i can't simply navigate to the history of the file
[02:08:43] Yes
[02:08:51] But I think gitblit fixes most of those issues
[02:09:38] if we want to encourage more people to participate in development, we can't create such weird obstacles for them
[02:10:46] I told you we're fixing it
[02:11:00] It's just been slow because we had to skip a Gerrit release due to LDAP breakage
[02:12:38] anybody considered gitlab?
[02:13:11] +1: "it's really user unfriendly that gerrit does not have any easy navigation to the web tree browsing"
[02:14:56] i guess since gitlab is pretty much the same approach as github uses, many more people would be familiar with that
[02:16:34] Danny_B: Right; but a source code management platform for the WMF has to be secure and reliable first; visual appeal comes second.
[02:17:42] and IIRC there are no public-facing GitLab instances of a comparable profile
[02:21:27] i was just asking. i assume there was some selection process
[02:22:40] Reedy, re. https://bugzilla.wikimedia.org/show_bug.cgi?id=44660#c1, would you prefer that i squash all the commits after Idd3a05f1 in https://gerrit.wikimedia.org/r/#/q/PDBHandler,n,z into that commit you reviewed?
[02:24:30] It makes sense, usually yeah
[02:24:45] The code hasn't been submitted, so they're "fixes" to the original commit
[02:24:57] so having the changes in later commits doesn't help with reviewing the first one
[11:59:13] New patchset: Hashar; "debian packaging job" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47727
[11:59:30] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47727
[12:36:53] New patchset: Hashar; "wrapper to build mediawiki documentation" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/39212
[12:39:14] New review: Hashar; "PS7:" [integration/jenkins] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/39212
[12:56:08] New patchset: Hashar; "sort list of extensions ignoring case" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47728
[12:57:08] New patchset: Hashar; "sort list of extensions ignoring case" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47728
[12:57:21] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/47728
[12:58:36] New patchset: Hashar; "Add extensions used by translatewiki.net to job builder" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/38555
[12:58:58] New review: Hashar; "Rebased to find out what is left to do" [integration/jenkins-job-builder-config] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/38555
[13:04:18] <^demon> qchris: Sooo, I haven't picked exactly which build we're going to use yet.
[13:04:29] <^demon> Master's pretty stable right now, so I'm not worried.
[13:05:24] <^demon> If stable-2.5 was merged to master again, that'd probably be a sane point to use.
[13:14:29] <^demon> Eh, nothing super amazing in stable-2.5. There's the cache-automerge improvement, but that's not required.
[13:17:28] master for the masses
[13:17:59] <^demon> Yeah. I see qchris got an improvement for delete-project into core (I had been meaning to do that :p)
[13:18:26] <^demon> Maybe whatever build we get from jenkins today could be our baseline, and we'll assume that unless anything last-minute (needs to) go(es) in.
[13:30:37] ^demon: Sorry... was out for lunch :-)
[13:30:58] <^demon> Ah, no problem.
[13:31:03] I guess using recent master would be ok.
[13:31:14] My fix to gerrit isn't strictly needed
[13:31:22] However, I use it in move-project.
[13:31:33] And I hope to get it into delete-project as well :-)
[13:31:47] <^demon> We'll include it.
[13:32:12] The remaining LDAP issue is not too much of a problem for us, so I haven't touched that yet.
[13:32:26] I guess we can ignore that for now, can't we?
[13:32:43] <^demon> Remaining ldap issue?
[13:33:23] Removing ldap groups that the user is not part of does not work
[13:33:36] It's a permissions problem.
[13:34:16] <^demon> Ah, yeah that's not as urgent.
[13:34:48] ok. Great.
[13:43:52] <^demon> I'm building master on jenkins now. Assuming we don't notice any last minute regressions (or see anything last minute we need to pull), we'll assume that's what we're installing.
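A side note on the squash question from earlier this morning (02:22): with Gerrit, follow-up "fix" commits are usually folded into the original change rather than stacked on top of it. A throwaway-repo sketch of doing that with git's autosquash machinery follows; the file and commit names are illustrative, not the actual PDBHandler series.

```shell
#!/bin/sh
# Sketch: fold a follow-up fix into the commit under review. All names here
# are made up for illustration; this is not the real PDBHandler change.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.org
git config user.name Demo

echo 'v1' > handler.txt
git add handler.txt
git commit -qm 'Add handler'             # the change under review

echo 'v2' > handler.txt
git commit -qam 'fixup! Add handler'     # or: git commit --fixup=<sha>

# Non-interactive autosquash: the fixup is melded into "Add handler",
# leaving a single commit to re-push to Gerrit as a new patchset.
GIT_SEQUENCE_EDITOR=true git rebase -i --autosquash --root
git log --oneline
```

`GIT_SEQUENCE_EDITOR=true` accepts the autosquash-generated todo list unchanged, so the rebase runs without opening an editor.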
[13:46:23] saper: wanna review some shell scripting? :) https://gerrit.wikimedia.org/r/#/c/39212/
[13:46:58] yummy yummy
[13:47:10] Korn Shell 88 or 93? :)
[13:47:18] nnoooo
[13:47:30] I learned shell using ksh on an AIX
[13:47:30] ^demon: including modified etc/mail/ChangeSubject.vm for me? :)
[13:47:39] which was 88 probably
[13:47:43] 93 is great
[13:48:20] saper: back in that time I thought unix / shell was very lame when I could do all of that using a mouse in windows.
[13:48:39] saper: and we used some awkward hardware terminals to connect to the central unix. bah
[13:48:42] <^demon> saper: That was a puppet change. Did that go into master?
[13:48:59] <^demon> (master upstream, I mean0
[13:49:01] <^demon> )
[13:49:02] We call it 'production' here. And, no.
[13:49:25] <^demon> I know puppet is production ;-)
[13:49:34] if we wait a bit I might actually propose a fix to gerrit master :) since it will be a Java equivalent of a one-liner (~15 LOC)
[13:49:58] the branch is called "production", for some reason that's what I said to git push :)
[13:49:58] <^demon> If you get it in before noon, it'll make it into today's nightly.
[13:50:04] <^demon> Noon. Timezones.
[13:50:07] <^demon> Stupid me, 3 hours.
[13:50:11] it won't be merged so fast
[13:50:17] <^demon> True :\
[13:50:29] <^demon> Well, we don't have to use today's build if we can get it in though.
[13:50:35] <^demon> Let's get that going on master.
[13:50:42] * ^demon will CR+1 and V+2.
[13:50:45] <^demon> If it's good.
[13:54:19] <^demon> https://integration.mediawiki.org/ci/view/Java/ is all sunshines :)
[13:58:23] \O/
[13:58:55] unline lucene :)
[13:58:59] unlike
[14:00:12] <^demon> Well, the lucene pom is awful.
[15:45:19] New patchset: Demon; "Fix this so it'll actually compile" [mediawiki/tools/mwdumper] (master) - https://gerrit.wikimedia.org/r/47740
[15:46:02] Change merged: Demon; [mediawiki/tools/mwdumper] (master) - https://gerrit.wikimedia.org/r/47740
[17:02:15] Tim-away: Was all the data in the cur tables migrated? Do we still need them?
[17:35:37] Reedy: thank you for your work on adding sexual orientation to WikiData
[17:36:00] varnent: They wouldn't let me have an over 9000 property though :(
[17:36:17] over 9000 property?
[17:36:41] Reedy: I'd like to add a transgender-esque property as well - but tom suggested I wait until there's more content to argue over
[17:36:50] https://www.youtube.com/watch?v=SiMHTK15Pik
[17:36:55] It became somewhat of a meme ;)
[17:36:55] hahaha :P
[17:37:25] Reedy: lol - awesome
[17:37:43] no easter eggs? *sigh* what is the wiki world coming to?
[17:44:35] mwalker: eee!
[17:44:42] aaahhhhH!
[17:44:56] /nick mwalkeraaaahhhhh
[17:45:40] or something
[17:46:24] mwalker|AGGGHHH: marktraceur https://www.youtube.com/watch?v=LJfowXTXOfU
[17:46:47] that is exactly where I was going
[17:46:55] :P
[17:47:15] RagePanda: For some reason your nick started playing in my head to the tune of "Space Cowboy"
[17:47:22] ....something may be wrong with my brain
[17:47:33] * RagePanda googles that
[17:47:40] Steve Miller Band
[17:47:49] (a good addition to your list if it's not there already)
[17:48:12] Reedy: hi :-] I got style fixes for you to review! https://gerrit.wikimedia.org/r/#/c/46753/ https://gerrit.wikimedia.org/r/#/c/46760/ :D
[17:48:59] marktraceur: I'm still exploring The Doors :)
[17:49:01] (slowly!)
[17:49:12] Indeed
[17:49:37] A good option might be to set up a classic rock radio station somehow
[17:49:49] * marktraceur learned about this stuff via good ol' FM radio
[17:49:57] hmmm
[17:50:02] pandora, etc don't really work here.
[17:50:14] the only radio I do listen to is ah.fm (trance)
[17:50:18] good programming music :)
[17:50:23] RagePanda: A lot of US stations will also have an Internet stream
[17:50:36] * marktraceur looks for the radio station that's good here
[17:50:46] marktraceur: radio currently here is 15 minutes of advertisements, 10 minutes of someone trying to be funny, and then 5 minutes of actual songs
[17:51:51] RagePanda: Yeah, don't listen to US radio in the morning, it's the same. But most of the rest of the day is OK.
[17:52:17] heh :D
[17:52:18] RagePanda: https://en.wikipedia.org/wiki/KSAN_%28FM%29 is one I enjoy.
[17:52:20] I suppose they'll perhaps succeed at being funny more often though?
[17:52:23] * RagePanda clicks
[17:52:36] RagePanda: I wouldn't count on it
[17:52:40] they've got free internet streams that haven't been shut down by the police yet?
[17:52:55] * RagePanda is reminded of 'Right to Read' by that sentence somehow
[17:53:32] Also, TIL "https://en.wikipedia.org/wiki/Template:Classic_Rock_Radio_Stations_in_California" is a thing
[17:53:44] :D
[17:53:47] RagePanda: Don't worry, they're working on it
[17:54:10] soon enough, eh
[17:55:07] And it's not like the streams don't have commercials. Plus there's undoubtedly DRM on it all.
[18:44:29] <^demon> Restarting Gerrit to pick up a fix. Please don't panic.
[18:47:37] <^demon> saper: Your fix to ChangeSubject.vm is live.
[18:47:44] <^demon> Workaround, rather.
[18:57:57] New review: saper; "Seems to me that it could be reduced to three lines of /bin/sh code (truncate target name with "sed"..." [integration/jenkins] (master) C: 0; - https://gerrit.wikimedia.org/r/39212
[19:04:47] ^demon: thanks
[19:05:05] that's why gerrit briefly gave me a 503? :)
[19:06:08] <^demon> Yup :)
[19:06:47] etc/mail does not need a restart :) but I guess you had more reasons :)
[19:16:01] <^demon> Oh, I thought it did.
[19:16:02] <^demon> Whoops.
[19:19:56] I discovered this when playing with this change :)
[19:22:34] <^demon> Everything else does :p
[19:37:10] saper: thanks for your shell review :-]
[19:37:17] ( context: https://gerrit.wikimedia.org/r/#/c/39212/ )
[19:38:27] if you have access to .git I can make it a two or three liner
[19:39:22] and my lifelong career in troubleshooting says that wrapping original error messages into "nice" error messages is not helpful
[19:41:26] saper: so basically there is a ZUUL_REF which contains refs/tags/12345 OR a branch like master / REL1_20
[19:41:33] which is then used to craft the output dir
[19:41:51] I probably have left a ton of code in to help me debug what ZUUL_REF contains
[19:45:16] isn't that just the current '.git/HEAD'?
[19:45:56] the only thing to check is whether it is safe as a filename :)
[19:49:05] I am not sure HEAD will contain the ref to a tag that has just been changed
[19:49:51] New patchset: Hashar; "wrapper to build mediawiki documentation" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/39212
[19:50:12] the filtering could probably be made simpler
[19:50:20] but I wanted to play with bash regex :-]
[19:50:37] hi, any php unit test experts here? I'm looking at resetDB(), and noticed it doesn't reset the auto row id counter. Is that possible to add?
[19:51:19] ^demon: \o/ it works :)
[19:51:36] <^demon> The subject thing?
[19:51:36] no ^J anymore
[19:51:39] uhm
[19:51:40] <^demon> Ah, yay!
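The ZUUL_REF handling discussed above (a ref like refs/tags/12345 or a plain branch name being turned into an output directory) can be sketched in plain /bin/sh along the lines of saper's "three lines with sed" suggestion; the actual wrapper in change 39212 uses bash regexes and may filter differently.

```shell
#!/bin/sh
# Sketch (not the real wrapper): strip the refs/tags/ or refs/heads/ prefix
# from ZUUL_REF and replace anything not filename-safe with an underscore,
# so the result can be used as an output directory name.
ref_to_dir() {
    printf '%s\n' "$1" \
        | sed -e 's!^refs/tags/!!' -e 's!^refs/heads/!!' \
              -e 's![^A-Za-z0-9_.-]!_!g'
}
ref_to_dir 'refs/tags/12345'          # -> 12345
ref_to_dir 'REL1_20'                  # -> REL1_20 (branch names pass through)
ref_to_dir 'refs/heads/wmf/1.21wmf8'  # -> wmf_1.21wmf8 (slash is not safe)
```

Using `!` as the sed delimiter avoids escaping the slashes in the ref prefixes.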
[19:51:54] got hashar's responses to my nitpicking
[19:52:13] hashar: still don't get what you mean by "filter ZUUL_REF values"
[19:52:19] but gotta go :(
[19:53:15] ah :-)
[19:53:20] saper: have a good night :-]
[19:56:05] New review: Hashar; "to clarify:" [integration/jenkins] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/39212
[20:27:47] Hey AaronSchulz, Krenair and I were talking about global renames, and trying to figure out how to recover if we hit an error while renaming across all of the local wikis' `user` tables.
[20:28:43] The case is if localnames isn't up to date
[20:28:45] Would it be insane to open (possibly hundreds of) db connections, do all the updates, and then loop through them all and commit each?
[20:29:15] I think in this case it'd just silently affect no rows
[20:29:28] and we'd have to check $db->affectedRows()
[20:31:33] The alternative would be to update each, and track it in yet another db table; then if something fails we can manually re-rename any rows back to the original.
[21:02:58] csteipp: You'd only really need one connection per cluster
[21:03:10] Reusing stuff in the loadbalancer would do that for you..
[21:04:58] Hmm... yes, that is true.
[21:10:53] csteipp: Doesn't CA do that now?
[21:12:14] foreach ( $wgConf->getLocalDatabases() as $wiki ) {
[21:12:14]     $lb = wfGetLB( $wiki );
[21:12:14]     $db = $lb->getConnection( DB_MASTER, array(), $wiki );
[21:12:14]     // Query
[21:12:14]     $lb->reuseConnection( $db );
[21:12:15] }
[21:12:46] Reedy: cite?
[21:13:03] centralauth?
[21:13:16] Oh, sorry. filename?
[21:13:39] I can't think of where it does that...
[21:14:03] Ah, found it..
[21:21:00] devs: fyi - test.w.o is coming down for 5 minutes or so starting now.
[21:33:50] test.w.o back
[21:55:33] bleh, php unit tests keep too much state from one test to the next
[21:55:42] who knows how to reset them completely?
[21:56:13] i need to add a few pages, test things, and reset
[21:56:28] works fine when running just those tests, fails with others
[21:57:23] is test wiki down?
[21:57:23] oh
[21:57:24] doh
[21:57:29] hmm
[21:58:19] nevermind
[22:07:37] yurik: README.md: https://github.com/sebastianbergmann/phpunit/
[22:08:46] Amgine: yes, but it's not clear why DB resets get messed up :(
[22:09:15] http://www.phpunit.de/manual/current/en/phpunit-book.html#clean-up-database
[22:09:53] <^demon> Wait, phpunit supposedly does that on its own now?
[22:10:01] <^demon> Weird, we probably don't make use of it since we DIY.
[22:10:02] Yessssss...
[22:10:29] yes, our MediaWikiTestCase supposedly handles it
[22:10:36] but it breaks for some reason :(
[22:11:01] Oh, well, then, not a phpunit issue?
[22:11:22] it's an issue of me using phpunit to test the api :D
[22:11:30]
[22:12:06] problem is, to have a simple compare with the expected result, i need to make sure all tables are clean when i create test pages
[22:12:29] first of all, it can't use "delete *" because that doesn't reset autonumbers
[22:12:40] which it currently does
[22:12:52] second, i need to list all the tables to clean up
[22:15:16] http://www.phpunit.de/manual/current/en/phpunit-book.html#tip:-use-your-own-abstract-database-testcase
[22:16:21] the problem with our db cleanup is that when i do editPage(), i have to also know all the possible tables that might be affected, which is not very clean
[22:18:23] ohh, and on top of that, it seems that some other tests leave bad state after themselves, breaking my tests :(
[22:28:29] setUp()?
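The autonumber complaint above (a plain "delete *" emptying the tables but not resetting the row id counter) can be demonstrated with a throwaway SQLite database. This is an illustration of the general auto-increment behaviour, not MediaWiki's actual test schema or cleanup code, and it assumes the sqlite3 CLI is available.

```shell
#!/bin/sh
# Illustration only: DELETE empties the table but the auto-increment counter
# keeps counting, which is why page ids drift between test runs.
set -e
db=$(mktemp)
sqlite3 "$db" <<'SQL'
CREATE TABLE page (page_id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT);
INSERT INTO page (title) VALUES ('A'), ('B'), ('C');
DELETE FROM page;                 -- "delete *": rows are gone...
INSERT INTO page (title) VALUES ('D');
SQL
sqlite3 "$db" 'SELECT page_id, title FROM page;'   # -> 4|D, not 1|D
```

With SQLite's AUTOINCREMENT the high-water mark lives in the sqlite_sequence table, which DELETE does not touch; MySQL's InnoDB AUTO_INCREMENT behaves similarly until a TRUNCATE.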
[22:28:45] bleh, now i have to implement my own recursive result comparison - because all pageIDs are mismatching
[22:28:50] New patchset: Platonides; "GlobalFunctions.php conditionally uses bcmath functions in wfBaseConvert since 9b9daad" [mediawiki/tools/code-utils] (master) - https://gerrit.wikimedia.org/r/42962
[22:29:20] Amgine: i use the constructor to add all tables to $this->tablesUsed
[22:29:34] and i use addDBData() for all $this->editPage calls
[22:30:04] unfortunately the testing infrastructure does not handle all this properly
[22:31:38] Well, I frankly don't understand the testing harness being used, so you'll likely have to ask Hashar or someone who does.
[22:32:07] the api results have different pageids depending on how many tests have run before
[22:32:16] hence it breaks the assertEquals
[22:32:44] add an assertEqualsExceptForPageid() method?
[22:32:48] seems cleaner
[22:37:07] New patchset: Platonides; "Do not complain of the use of User::decodeOptions in User.php:1089" [mediawiki/tools/code-utils] (master) - https://gerrit.wikimedia.org/r/47800
[22:49:40] yeah, and much less work, except that comparing stuff will be a pain
[22:49:54] especially if i start checking various api calls that sort based on pageid