[00:00:16] (And I'm moving to Mountain View on the 1st.) [00:01:39] well. that'll kill any enjoyment you have. [00:02:46] the peninsula is populated by those who grind paycheck-to-paycheck inside little cubicles, waiting to die. [00:03:05] And a few of my pals, who are a bit cheerier. [00:03:37] * marktraceur raises hand [00:04:02] Er, different part of the peninsula? Mountain View? [00:04:09] not having to wake up early to catch the train makes up for many shortcomings. [00:05:10] mindspillage: that bookstore on Castro in MV - BookBuyers - may still have a few of the books my partner and I sold when we moved from SF to NYC :) [00:06:04] sumanah: nice! I have had to restrain myself from going there on previous trips because I didn't need to take more books back with me just to have to move them. [00:07:24] MV is in the peninsula. [00:07:49] SF is on the peninsula although people typically exclude it when they talk about "the peninsula" :) [00:08:32] And then there's the South Bay which partially overlaps with the Peninsula and the East Bay, depending on your definition. Yay confusing Bay Area geography! [00:09:01] RoanKattouw: and don't get me started on the political vs the geographic constructs that are "Long Island" or "Asia" [00:09:06] true [00:14:04] okay. rewritten statement: [00:14:05] The Bay Area peninsula is a joyless region, peopled by those who drudge day-to-day in colorless cubicles, grinding forward towards the next paycheck, waiting only for their inevitable graves. [00:14:30] patient zero! [00:17:54] lol [00:19:58] Jorm: you are completely missing the point of the experiment^Wcity! [00:20:26] i love sf and i love oakland. [00:20:41] but i don't know if i can ever go to teh peninsula again. [00:20:48] too much time in mountain view. [00:22:56] [00:32:17] kaldari: quick question for you - are there any biology-related open source projects that you find interesting, that have active communities? [00:33:21] asking because I just ran into someone saying they wanted to volunteer to help with such a project [01:18:49] jdlrobson: Thanks for the code review you've been doing on ProofreadPage! btw, the next time you have a little code review time, https://gerrit.wikimedia.org/r/#/c/10741/ is a ProofreadPage item you might have missed on your dashboard. [01:21:31] Reedy: ping re https://bugzilla.wikimedia.org/show_bug.cgi?id=22911 (Install extension:SubpageSortkey on wikibooks) [01:22:31] Sumanah: there's lots for bio OSS: http://www.microscopy-analysis.com/news/open-source-software-eases-bio-imaging-data-analysis [01:23:21] <9 out of 10 Amgines say> [01:23:28] Thanks Amgine! Looking up BioImageXD now [01:24:20] http://www.r-project.org/ [01:25:19] R sure is popular, yeah [01:25:34] you know our old CTO used to work at an R-related company? [01:26:47] Yep. It's not just popular, it's world-changing. [01:28:02] [08:09:00] TimStarling: pinging you again :) Have you seen my question? [10:43:14] hi srikanthlogic [10:43:36] hello Nemo_bis [15:03:53] Nikerabbit, around? [15:04:30] MaxSem: what is it [15:05:47] Nikerabbit, are Extension:Cldr's definitions up to date? 
seems strange that it doesn't have a Portuguese name for Portugal, for example [15:07:31] MaxSem: pt is the sad kid, because or pt means pt-pt while their pt means pt-br [15:07:37] because of* [15:07:43] because our* [15:08:01] it should work though, but feel free to dig what is going on there [15:08:38] CldrNames/CldrNamesPt_br.php [15:08:38] 91:'cop' => 'copta', [15:08:38] 374:'pt' => 'português', [15:08:45] CldrNames/CldrNamesPt.php [15:08:45] 54:'pt-pt' => 'português europeu', [15:10:42] :facepalm: [15:11:48] Nikerabbit, thanks! [17:23:12] rmoen: TrevorParscal aharoni -- do you already have 20%-related stuff on your plate today? [17:23:19] I can potentially suggest stuff [17:24:06] sumanah: lots of it, but you can suggest [17:24:42] aharoni: seems like you've been doing a bunch of related stuff already, in helping the Wikidata team understand how to do i18n [17:24:44] and related things [17:24:53] yes, actually. [17:25:02] i do [17:25:28] mostly gong to be working on wikieditor and vector things [17:25:32] and documentation [17:26:42] TrevorParscal: cool, thanks [17:26:52] TrevorParscal: is Ashish awaiting any code review? [17:27:18] * sumanah looks at https://gerrit.wikimedia.org/r/#/dashboard/86 [17:28:24] not that I know of, but I will check in with him on that [17:28:37] https://github.com/dash1291/VE-collaboration can't be right, can it? it was last updated 6 months ago [17:29:13] he is working in a branch of the ve extension on gerrit [17:29:47] ahhh! ok. Last update to https://www.mediawiki.org/wiki/User:Dash1291/GSoC_status was July 1 [17:30:31] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/VisualEditor.git;a=shortlog;h=refs/heads/realtimeve [17:30:50] I wonder why these don't show up under https://gerrit.wikimedia.org/r/#/q/owner:Dash1291,n,z [17:51:13] kaldari: https://gerrit.wikimedia.org/r/#/c/9528/2/UploadWizard.config.php <-- Nikerabbit was kind enough to tell me to use LinkBatch [17:51:40] Would the Title implementation be an acceptable method as well? [17:55:08] me kind? citation needed [17:55:42] Nikerabbit: It's right there, in front of your face! :) [17:56:29] "You did something wrong, here's how to do it right" is much kinder than many free software contributors might have been in that situation.... [18:05:51] How long does a LinkCache last? [18:09:13] It looks like LinkCache just sticks the list in memcache, why don't we just stick $uwLanguages in memcache and forget about all the LinkBatch hoops? [18:09:26] kaldari: It looks like titles are never removed from it but it has no persistence, so it lasts for the duration of the request [18:09:33] I don't see memc code in LinkCache [18:14:17] I was looking at the old version, where does LinkCache live now anyway? [18:15:44] ah it's in /cache now, duh [18:16:22] wow, that very different than how LinkCache used to work [18:22:22] marktraceur: I would suggest just putting $uwLanguages in memcache for 24 hours. 
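A minimal PHP sketch of the caching kaldari suggests here, assuming a hypothetical uwComputeLanguageList() helper for the expensive template-existence work; the function name and cache key are illustrative, not the actual UploadWizard code:

    function uwGetLanguageList() {
        global $wgMemc;

        $key = wfMemcKey( 'uploadwizard', 'language-templates' ); // illustrative key
        $list = $wgMemc->get( $key );
        if ( $list === false ) {
            // Cache miss: do the expensive work (e.g. the LinkBatch and
            // existence checks discussed below), then store the result.
            $list = uwComputeLanguageList(); // hypothetical helper
            $wgMemc->set( $key, $list, 24 * 3600 ); // keep for 24 hours
        }
        return $list;
    }

If the wiki is not configured with memcached, $wgMemc is simply whatever main cache backend is configured (possibly a no-op cache), so the calls above still work, just with less benefit — the point made later in the log ("it'll still work, but will be less kind").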
[18:22:53] RoanKattouw: any opinions on this: https://gerrit.wikimedia.org/r/#/c/9528/7/UploadWizard.config.php [18:23:38] kaldari: You're partially right [18:24:08] The LinkBatch loop is a good idae, but the LinkCache loop can be replaced with $title->exists() calls, I believe those hook into LinkCache [18:24:10] * RoanKattouw checks [18:26:04] Yes, exists() calls getArticleID() which calls LinkCache::addLinkObj() [18:31:41] RoanKattouw: thanks for the info [18:44:18] robla: can we have some kind of note (date, or what needs to be done) about 1.20 release in http://www.mediawiki.org/wiki/MediaWiki_1.20/Roadmap (or is there better place?) [18:44:34] ^demon: are you around? [18:45:13] Reedy: you around? [18:46:07] Nikerabbit: Reedy and I spoke about this. He has it on his list to publish an beta tarball, which is the first step in the process [18:47:11] last time I asked I think you mentioned you wanted to have some new features included in the release [18:48:35] it'd be nice, but I'd rather establish a regular release rhythm [18:51:07] well, that's a rather large list of 1.20 blockers: https://bugzilla.wikimedia.org/buglist.cgi?list_id=140362&resolution=---&query_based_on=1.20%20release%20blockers&query_format=advanced&target_milestone=1.20.0%20release&known_name=1.20%20release%20blockers [18:51:35] robla: +1 for regular release*s* per year [18:54:08] robla: Just glancing at that I spot two that I'm pretty sure are fixed and a few that aren't really release blockers [18:54:44] yeah, need to finish the hiring process on the bug wrangler spot [18:55:10] That's me, and I am on it [18:55:36] don't forget the 1.19.x bugs also probably block a 1.20 release as well [18:55:45] Yeah that person can then triage that list [18:55:46] * robla didn't mean to sound like he was nagging :) [18:56:14] er...also, didn't mean to nag, either :) [18:56:56] so...y'all could help accelerate the process by actually making the true list of blockers [18:57:19] robla: I know you didn't. :) [18:59:05] robla: How about I go and fix some of them :) it's my 20% day and three of those are on my list to work on anyway :) [18:59:09] hrm...any idea why the labels disappeared from the project status helper on mediawiki.org? [18:59:34] RoanKattouw: ah, there you are, and I should have checked in with you re 20% -- that sounds good [19:00:03] sumanah: I was in late today (11am-ish) [19:02:10] kaldari: All right, home is thankfully OK. Power outage. Are you still working on UW things? Should I press on? [19:02:51] yeah, lemme send you my latest thoughts on the language thing... [19:03:30] Mmmkay [19:10:34] * robla engages in reckless speculation that the jQuery 1.8 upgrade is to blame for the project status helper bug, and that that's the tip of the iceberg [19:10:47] https://gerrit.wikimedia.org/r/#/c/17455/ [19:11:30] marktraceur: sent an email. You can still use BatchLink if you prefer, but probably a good idea to cache the list in memcache anyway. [19:11:55] kaldari: Will the wfMemc method still work if whatever server isn't running memcached? [19:12:15] sure, it'll still work, but will be less kind [19:13:15] I would even be tempted to cache it for a week, but maybe someone would complain :) [19:14:34] Considering the list probably only changes a few times a year [19:15:40] booooooo robla is against change!!!!!! [19:16:26] get off my lawn! 
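A minimal PHP sketch of the pattern Roan describes above — a LinkBatch to prime the cache in one query, then Title::exists() instead of touching LinkCache directly — assuming a $languageCodes array and an illustrative "Lang-<code>" template naming scheme rather than the real UploadWizard template names:

    $batch = new LinkBatch();
    $titles = array();
    foreach ( $languageCodes as $code ) {
        $title = Title::makeTitleSafe( NS_TEMPLATE, 'Lang-' . $code ); // hypothetical naming
        if ( $title ) {
            $batch->addObj( $title );
            $titles[$code] = $title;
        }
    }
    $batch->execute(); // one batched query; the results land in LinkCache

    $uwLanguages = array();
    foreach ( $titles as $code => $title ) {
        // exists() calls getArticleID(), which goes through LinkCache,
        // so these checks are answered from the batch above, not the DB.
        if ( $title->exists() ) {
            $uwLanguages[] = $code; // keep only codes whose template exists (illustrative)
        }
    }

As Roan verifies a few lines up, exists() ends at LinkCache::addLinkObj(), so the second loop adds no extra queries.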
[19:16:33] Reedy: He's also right :) jQuery 1.8.0 broke JUI dialog which broke ProjectStatusHelper [19:16:36] * RoanKattouw reverts jQuery upgrade [19:17:12] There is a JUI 1.8.25 (we are using 1.8.22) which may be compatible with jQuery 1.8.0, but we can test that outside of production :) [19:17:53] jQuery 1.7.2 worked for mah daddy, and mah daddy's daddy, and his daddy too, and dang nabbit, it'll work fer you too! [19:18:03] Stable is 1.8.23 currently.. [19:18:51] Yeah, 23, not 25 [19:19:19] usually doesn't take long to update it.. [19:23:26] lol, mostly header updates [19:23:27] https://gerrit.wikimedia.org/r/#/c/20927/ [19:27:15] Ha [19:27:20] Yeah I do see the bug fix in .23 [19:27:58] Reviewing [19:31:15] Reedy: Merged [19:31:33] Guess that should be cherry-picked into wmf10 as it breaks RobLa's pet project [19:31:42] yeah! [19:31:45] Ahm, fixes breakage in, I mean [19:31:57] I wasn't suggesting that we cherry-pick something specifically so it would break Rob's code :) [19:32:32] no breaking my stuff! [19:34:35] heh [19:34:38] I'll do it [19:34:44] And if it doesn't, we can blame the author ;) [19:49:06] kaldari: All right, round 8 and go [19:49:32] Ah no, apparently not [19:50:08] looks like the project status helper is fixed. thanks Reedy and RoanKattouw [19:51:02] :) [19:51:43] marktraceur: Roan had some insights on LinkCache while you were in transit, don't know if you saw the backscroll [19:52:46] I wish there was some documentation on LinkBatch and LinkCache [19:53:02] I'd never even heard of them [19:53:08] kaldari: That takes the surprise right out of MediaWiki development [19:53:20] kaldari: I didn't, my IRC had crashed 'cause it was on the server at home [19:53:35] ah... [19:54:03] Anyway, it appears to work now [19:54:06] Quoting Roan: The LinkBatch loop is a good idae, but the LinkCache loop can be replaced with $title->exists() calls, I believe those hook into LinkCache [19:54:18] Ah, sou. [19:54:45] Anyone know if/where we have documentation on when developers should use title::userCan / user::isAllowed? [19:54:48] s/I believe/I have verified/ [19:56:39] OK, so make the link batch, execute it, then rely on the title method to use the data? Sounds like fun, I'll do it [19:57:48] kaldari: But only do that if there's not a memcached list already, n'est-ce pas? [19:58:13] yeah [20:00:16] Although I can take or leave the LinkBatch loop, it's up to you. Personally I prefer readability over performance if the performance difference isn't huge. [20:01:56] Ha, true [20:02:37] But database nuts will prefer LinkBatch, methinks. And I'm sympathetic to people who are nuts of one form or t'other. So I'll do it. [20:03:44] Huh. [20:03:57] I thought the Title code normalized the title. Guess not. [20:04:19] makeTitle doesn't [20:04:29] newFromText does I think [20:05:07] makeTitle is preferred if you know your title names are correctly formed [20:05:28] Yeah, and one call to ucfirst does not a performance problem make [20:05:55] kaldari: I very nearly called my ucfirst'd version of $code "$Code". I think I'm clever. [20:06:09] :) [20:06:16] But of course that would be awful, for so many reasons [20:06:55] Augh, doesn't matter anyway, have to call it every time [20:08:22] Try that on for size [21:40:50] kaldari: Woo! Are you thinking of deploying again today? 
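A small PHP illustration of the makeTitle()/newFromText() point above: newFromText() runs the full title normalization (including uppercasing the first letter on a default-configured wiki), while makeTitle() trusts its input, which is why the ucfirst() call comes up; the example name is illustrative only:

    $raw = 'lang-foo'; // hypothetical lower-case template name

    $a = Title::newFromText( 'Template:' . $raw );        // normalized, e.g. "Template:Lang-foo"
    $b = Title::makeTitle( NS_TEMPLATE, $raw );            // taken as-is: "Template:lang-foo"
    $c = Title::makeTitle( NS_TEMPLATE, ucfirst( $raw ) ); // caller normalizes first, as discussed

makeTitle() skips the parsing and normalization overhead, so it is preferred when the caller already knows the names are well-formed — the trade-off settled on in the exchange above.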
[21:51:16] marktraceur, ironic indeed [21:52:34] I did some tests of print [ random.randint(0, 10) for x in rangelist ] to check it indeed called randint() each time [21:52:50] some of us are not so used to python, I guess [21:55:51] Platonides: That's fine! And I certainly won't -1 for that, since that's not uncommon (from briefly looking at other bits of the file) [21:56:44] I know what it does, now [21:57:01] but some other people touching this code in the future may not :/ [21:57:35] it has been a nice-to-have for some time if this file was in php instead [21:57:50] happy to look at python code, if there's a need [21:58:07] (not really for readability, but seems it'd help, too) [21:58:42] marktraceur: not today, but Thursday probably [21:58:52] ori-l: I think I have it well in hand, but if you're bored, https://gerrit.wikimedia.org/r/#/c/16872/ https://gerrit.wikimedia.org/r/#/c/16869 https://gerrit.wikimedia.org/r/#/c/16873/ [22:00:56] marktraceur: tend to agree with platonides. list comprehensions are idiomatic python, but readability is even more idiomatic, and guido himself is often chiding people to just write the damn loop instead of trying to fit everything into a gigantic list/set/dict comprehension or generator expression [22:02:00] ori-l: If it were complicated, I'd agree--but one expression isn't gigantic :) [22:02:20] (or two, I suppose, in the case of the nested comprehension) [22:02:33] marktraceur: it isn't so much that -- it's that you're using 'x' as a loop index rather than transforming a value [22:02:51] * marktraceur looks [22:03:25] [line.upper() for line in buffer] is idiomatic, because you're transforming a list [22:03:44] Right [22:03:50] [do_something() for x in xrange(10)] should be unrolled into a loop [22:03:54] Oh I see [22:05:37] hi rmoen [22:06:28] ori-l: I don't know, I've seen exactly that syntax in for loops. I can't confidently say the same about comprehensions, but I'm fairly sure I've seen that, too. [22:06:49] also (just in case you find this useful) regarding this being faster -- i don't think this is the place for optimizations, but it is a good instinct to be curious about the implementation of various language constructs. python makes it easy to peek behind the curtains using the 'dis' module, which shows you the bytecode the cpython interpreter generates for a given bit of python code [22:06:55] Not "once, in some file, in some obscure project", but in the official docs [22:08:22] wait, which syntax? [22:08:32] do you mean xrange vs range? [22:08:52] sumanah: hi [22:09:29] sumanah: how's it going ? [22:10:32] rmoen: not bad! just had my last meeting of the day, so that's rockin' [22:10:38] (yes it is 6pm) [22:10:54] ori-l: No, I mean using a variable in a for statement even when you don't necessarily use its value in the loop body [22:10:56] sumanah: i'm ooo today sick but am doing some work [22:11:18] Like for x in range(whatever): # never use x, it's just there to count through [22:11:18] rmoen: I hope it is not the chicken pox! also sorry for disturbing you when you are sick, and hope you get well soon [22:11:20] sumanah: cold medicine clouds the brain [22:11:26] it really does [22:11:46] Also, 'dis' module make me happy [22:11:47] lol, yeah. no need to be sorry ;) [22:12:18] rmoen, AaronSchulz once was doing some kind of MW work while ill with some kind of Martian Death Flu and he was so weak he couldn't make it around the corner to Walgreens [22:13:40] sumanah, that sounds terrible. 
Lucky for me I am not that ill [22:14:11] videos you might enjoy watching: https://commons.wikimedia.org/wiki/Category:Hackathon_Berlin_2012 [22:14:59] lets hope my project goes well [22:15:01] https://www.youtube.com/watch?v=ByY3CgR7-4E [22:15:40] marktraceur: in a loop statement, yes, of course! [22:15:46] in a list comprehension, it's less idiomatic [22:15:52] matanya: nice idea! [22:16:34] sumanah: thanks. please use the link below, to give a few points :) [22:17:11] marktraceur: http://dpaste.com/789760/ [22:17:18] I'll try, matanya - what is the deadline? [22:20:02] ori-l: SOOO MUCH FASTER :P [22:20:13] marktraceur: yes, three millionths of a second on my machine :) [22:20:29] OK, maybe not super much faster. [22:20:37] sumanah: a week or so [22:20:47] But get into nesting, then it shines....maybe [22:20:52] matanya: hmm, I should try to get my interesting vacation photos up, then [22:20:57] I'm trying with a bigger sample [22:20:59] marktraceur: repeat after me: premature optimization is the root of evil [22:21:45] anyway i poked just because i thought i'd show you timeit and dis which are both useful [22:21:48] make it work, then make it beautiful, then make it fast, right? [22:22:08] sumanah: where does "behave like an elephant" fit in? :( [22:22:09] They are indeed [22:22:28] ori-l: beauty [22:22:33] :) [22:22:38] ori-l: elephants are pretty [22:22:42] ori-l: to make you feel better: https://commons.wikimedia.org/wiki/File:Pink_Elephant.png [22:22:52] marktraceur: dis is an awesome way to level up as a python coder, but use timeit too much and you become like those people who fit spoilers on honda civics [22:23:29] * ori-l brbs. [22:23:43] matanya: thanks! [22:25:41] np :) cheer up my friend! [22:38:11] marktraceur: This one needs some more rebasing/merging... https://gerrit.wikimedia.org/r/#/c/20814/ [22:40:41] Oh, no doubt [23:11:08] kaldari: That was 20 minutes I'll never get back [23:11:31] ha, 20 minutes! You got off lucky ;) [23:12:02] Eugh, even so [23:12:04] rebasing Flickr uploading took me about an hour [23:12:11] Ha [23:12:30] I suppose the best thing is to make small changes, and rebase them one at a time [23:13:02] My advice is to use as few changes as possible and lots of amends [23:13:10] other than that, I don't know what would help more [23:13:50] it makes rebasing harder, but you have to do fewer of them :) [23:14:07] I guess you're screwed either way really [23:15:16] Yeah, pretty much [23:15:32] At least with smaller ones (I think), you're more likely to slip between other patches [23:40:58] marktraceur: any idea where the heck fileNameOk() and fileNameErr() are defined? [23:41:17] kaldari: IIRC they're passed into the function where they're called [23:41:29] ah, that would explain it [23:46:00] Ill-fated idea of the day: Download and try to work with ConfirmEdit! :P [23:47:29] Ah, confirmedit isn't that bad... [23:48:44] csteipp: I guess we'll see [23:55:19] marktraceur: Nice work on https://gerrit.wikimedia.org/r/#/c/7149/, only 1 more minor change I promise :) [23:55:59] Noted!