[01:16:40] hi multichill ... how goes GLAMCamp prep?
[01:35:03] hi tfinc -- just saw https://meta.wikimedia.org/wiki/Mobile_Projects/features#India_Hackathon
[01:40:05] sumanah: uhuh
[01:40:20] pchang was working on all of those
[01:41:06] tfinc: so the H next to an item means that there was specific significant progress on it at the Mumbai hackathon?
[01:41:26] correct!
[01:42:32] whooooo
[02:08:15] sumanah: do you mind if i tweet the dates of the hackathon ?
[02:08:27] tfinc: the January one? no of course not go ahead
[02:08:34] it's public & on the developer meetings page
[02:08:48] k, i'll include the link then too
[02:08:56] to https://www.mediawiki.org/wiki/January_2012_San_Francisco_Hackathon
[02:09:44] cool, tfinc
[02:13:32] done http://twitter.com/#!/WikimediaMobile/status/141701259387346945
[09:47:05] morning
[10:22:37] Hello Bryan
[10:22:46] hi
[10:23:11] Bryan, no pressure, but are you planning to create [[:mw:Extension:VipsScaler]] soon?
[10:25:06] good idea
[10:25:15] if I do not forget, tonight
[10:25:25] awesome, thanks :)
[13:11:57] gregdek, hi. Was there any work done on the Coding challenge in November? And if so, could you please add a few sentences to summarize it at https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2011/November#Recent_events ?
[13:58:07] RoanKattouw: Reedy: can one of you please deploy https://www.mediawiki.org/wiki/Special:Code/MediaWiki/104663 when you have time ? :)
[13:58:31] an svn up of extensions/VipsScaler should do it
[13:58:34] hashar: don't you have shell access yourself?
[13:58:38] Can't you deploy it? :p
[13:58:38] ^
[13:58:53] for some reason I don't have full rights :)
[13:58:58] Tim granted me access to wmf branches
[13:59:08] yeah, and that works
[13:59:10] but now I cannot svn up because I don't have my ssh key in svn :-)
[13:59:14] Nooo
[13:59:17] agent forwarding
[13:59:22] ssh -A fenari.wikimedia.org
[13:59:23] ssh -A fenari.wikimedia.org
[13:59:28] Should just be the same as you did for site config stuff
[13:59:59] no way I am going to use ssh forwarding
[14:00:05] that is too evil
[14:00:16] ...
[14:00:23] It's the way you should have always had to do it
[14:00:23] :/
[14:01:30] the way I do it is:
[14:01:43] I have several ssh keys on my computers
[14:01:51] one of them lets me access the live cluster
[14:02:00] then I have another key there that lets me access the other servers
[14:03:02] so I just need to poke ^demon about adding that key on svn :)
[14:08:10] lol
[14:08:22] i'll do it when i'm off the phone
[15:58:15] hashar, it's possible to connect to a second host without agent forwarding
[15:58:15] ssh -oProxyCommand='ssh -W %h:%p monkey.example' banana.example
[15:58:49] (monkey is the bastion and banana the target host)
[15:59:22] looks funny :)
[15:59:45] the issue I have is that I cannot ssh from fenari to the svn repository using svn co svn+ssh://foo
[15:59:53] just need my key to be pushed there :p
[16:00:09] gwicke: welcome back. I have updated the js parser test to use colors :)
[16:00:21] you did the changes directly on fenari?
[16:00:31] hashar: yay!
[16:00:35] checking..
[16:00:47] you need to add a node module
[16:00:51] (I copied that line from a mail of dkg to the openssh ml)
[16:00:53] colors IIRC, look at the README
[16:01:29] btw, the filename MWTerm.php looked a bit ugly
[16:02:25] hashar: veeerrrry pretty!
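A note on the ProxyCommand one-liner above: the same hop can be made permanent in ~/.ssh/config, so a plain "ssh banana.example" tunnels through the bastion without any agent forwarding. A minimal sketch, reusing the hypothetical monkey.example/banana.example hosts from the example (ssh -W needs OpenSSH 5.4 or later):

    # ~/.ssh/config
    # Reach banana.example by opening a forwarded channel through the
    # monkey.example bastion; no keys or agent ever land on the bastion.
    Host banana.example
        ProxyCommand ssh -W %h:%p monkey.example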
[16:02:49] thanks :)
[16:02:54] would be good to disable the colors if the output is not to a term
[16:03:04] Platonides: yeah I could not come up with a better directory or better name at that time :(
[16:03:18] Platonides: please be bold and rename it to something else (don't forget to update Autoloader.php)
[16:03:18] why not just Term.php ?
[16:03:37] I thought Term.php would conflict with some PEAR package so I added the MW prefix
[16:04:05] gwicke: not a term? isn't that script supposed to be run from the cli ?
[16:04:36] hashar: yes, but normally I write the output to a file or less, so I can jump and search around in it
[16:05:17] but that still works, so not really important
[16:05:49] hashar, why not copy the changes from fenari to localhost, and commit from there?
[16:06:01] seems the easiest way
[16:06:06] *gwicke is all sweaty after running for 40 minutes, off to the shower..
[16:06:46] you could even do: ssh fenari 'cd path/repository && svn diff' | patch -p0
[16:07:14] that is not the issue Platonides :)
[16:07:22] I did commit from my computer to the wmf branch
[16:07:36] then I log on fenari and try to 'svn update' the wmf branch
[16:07:57] the working copy on fenari uses svn+ssh:// so it prompts me for my ssh key :)
[16:08:06] ouch
[16:08:23] well… to be more correct, fenari proposes a local ssh key but it is unknown to the svn server
[16:08:32] so I ended up having to enter a password that does not exist :)
[16:08:40] xD
[16:08:44] svn switch http:// + svn up
[16:08:50] so the solution is to take my ssh key from fenari and put it on the svn server :-))
[16:08:53] hehe
[16:09:01] although that will mess up the next guy trying to commit a change on that file from fenari
[16:09:15] the working copy is shared among various users. http:// will fix it but I guess we have svn+ssh:// for some reason
[16:09:26] you are right
[16:09:34] the scheme lets us easily commit live hacks. That is the reason
[16:09:35] \o/
[16:10:04] the svn server should allow that fenari key as read-only
[16:13:50] Node.js totally pwns
[16:14:04] I really prefer JS over PHP
[16:14:49] I was able to manage with js
[16:14:50] back in the days when there was no jQuery or ResourceLoader everywhere
[16:16:55] hashar: you could try ssh -A
[16:17:05] that forwards the agent
[16:17:13] gwicke: yeah that would be the solution. But Tim said ssh forwarding is evil :-)
[16:17:24] ok
[16:17:42] somewhat understandable, but still more secure than copying keys around..
[16:17:44] SSH forwarding isn't evil under all circumstances
[16:17:49] You just have to know what you're doing
[16:17:55] And be careful about what other keys are stored in that agent
[16:21:18] hashar: I did some profiling this morning, about 30% seems to go into substr
[16:21:34] the parser chops up everything into single-char strings
[16:21:48] and then joins many of them back together ;)
[16:21:49] Roan, it's convenient, but not necessarily good
[16:22:55] hashar: that is in the PEG tokenizer, to be more precise
[16:24:07] gwicke: is that something we wrote? or an upstream issue?
[16:24:20] upstream
[16:24:37] but probably fixable without too much trouble
[16:25:37] hashar: the colors are especially nice in the diff!
[16:27:04] gwicke: Brion, 2005 or so
[16:27:19] gwicke: I reused the colors he used for the parserTests.php script :p
[16:27:30] :)
[16:27:39] there might be a bug somewhere
[16:27:43] they are very familiar ;)
[16:27:46] cause the - lines are below the + lines
[16:27:55] + stuff
[16:27:55] - "hello" stuff
[16:27:57] that is in the diff library
[16:28:11] also wondered about that
[16:29:41] have to head to town for some shopping, will be back later
[16:33:43] parserTests.js now comes with usage and --help :-)
[16:33:50] yay
[16:33:57] npm -g install optimist
[16:33:59] I love that module :)
[16:34:02] hashar: is that worth posting to wikitext-l about?
[16:34:15] not sure
[16:34:22] sweeeeeeet
[16:34:26] i say yes ;)
[16:34:29] it'll get people to try it ;)
[16:34:35] haha
[16:34:40] I am going to add a filter option
[16:35:41] --quiet will require rewriting everything to use events instead of while(true) { switch() case case case }
[16:35:50] that will be easier to read
[16:38:10] RoanKattouw: just fyi, I smiled at "Winter is coming"
[16:38:18] heh
[16:38:34] I'm glad someone appreciated my edit summary easter egg
[16:38:47] :)
[16:39:23] You know, with a power socket and half a season of GoT, that red-eye from Detroit to Amsterdam (back from NOLA) was really short
[16:40:50] I found my 16-hr flight back from Mumbai was pretty short, too, with various entertainments
[16:41:21] RoanKattouw: do you have anything for the HTTPS portion of https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2011/November ?
[16:42:06] Reedy: anything for the Shell Requests section of https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2011/November#MediaWiki_Core ?
[16:42:48] *guillom thinks sumanah is trying to get French sweets the next time we meet.
[16:43:04] *sumanah laughs
[16:43:10] (But I'll take help with the report for French sweets any time)
[16:43:31] FYI I just pinged Ryan for Labs, and Mani & Parul for Mobile research.
[16:43:44] On this note, dinner time.
[16:44:08] sumanah: the HTTPSEverywhere release happened. Otherwise, not much
[16:58:42] RoanKattouw: ok, did I summarize that ok with https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2011/November#Site_infrastructure ?
[16:59:28] Hmm, almost
[16:59:34] The cases they missed are obscure edge cases
[16:59:47] I mistakenly thought they had pulled an old version of my ruleset, but they hadn't
[16:59:57] So then 24h after the HE release, Ryan gave me his list of domains
[17:00:09] and with that I was able to add a number of obscure domains
[17:00:10] HA
[17:00:16] so you did finally get that list!
[17:00:28] aye
[17:00:49] https://github.com/catrope/https-everywhere/commit/e421caa18684a350ec8c7a7b93a8b9308ecbc031
[17:00:52] *RoanKattouw notices a bug in those
[17:01:08] you're better placed than I am to actually improve & fix the eng report, then, RoanKattouw
[17:01:20] *sumanah goes to count up the number of new committers from Nov
[17:01:57] gotta escape the dots
[17:04:32] I wish we could use some kind of wildcard proxy in front of our http-only servers
[17:05:06] it would also be great to avoid duplicating apache configuration between :80 and :443 virtual hosts :)
[17:18:45] well I ported --quick --quiet :)
[17:36:13] hashar: wow ;)
[17:41:06] next commit will have an easter egg :)
[17:44:13] gwicke: you might want to talk about substr in the PEG parser with Krinkle
[17:44:22] I know he did a bunch of js optimization in MediaWiki core
[17:44:43] ran across his nick on jsperf a few times ;)
[17:45:09] he is in the Netherlands, but usually connects in the evening
[17:45:29] ok
[17:46:02] http://www.mediawiki.org/wiki/JavaScript_performance might help also
[17:46:33] in the long term, I believe that passing offsets will always be better than passing strings
[17:46:55] but that is already the second step
[17:47:20] right now I believe the largest gain can be made by speeding up the bulk text match
[17:47:51] will ask him about it
[17:48:11] how do you profile a js script under node ?
[17:48:19] node --prof
[17:48:23] ahaha
[17:48:27] awesome
[17:48:32] and then an analyzer that comes with v8
[17:48:40] but it is quite primitive
[17:48:53] no kcachegrind conversion for example
[17:49:04] i love kcachegrind..
[17:50:21] I have used it a bit a lonnng time ago
[17:50:52] also, a trace of the full test run crashes the analyzer script
[17:51:05] you have --filter now :)
[17:51:06] so you have to abort the test to keep the size down
[17:51:15] exactly ;)
[17:51:23] thanks!
[17:51:34] I love github and node
[17:51:35] that really helps while working on individual cases
[17:51:58] but there is always a ton of different modules to do one task
[17:52:02] I used to like that style too, played a lot with twisted in python
[17:52:16] but now I am spoilt by Haskell..
[17:52:27] never looked at it
[17:52:29] does all the event stuff under the hood
[17:52:47] at one time I wanted to look at go (google language that looks like an interpreted script but that you can compile)
[17:52:47] not quite as light-weight as erlang, but close
[17:53:32] I have no experience with go so far
[17:53:52] the implementation has a few nice tricks, split stacks for example
[17:54:07] but the language itself looks quite conservative
[17:54:57] I like Haskell mainly for concurrency and its type system
[17:55:14] and semi-automatic parallelization enabled by purity
[17:55:17] well I only played with PHP, Perl, Ruby, Python and JS
[17:55:22] Haskellians here, Tim going on about Lua in email, what's next?
[17:55:30] C is only for hacking minor changes :-/
[17:55:53] sumanah: anything but php I guess..
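A quick sketch of the node --prof workflow discussed above. The v8.log filename is standard; the tick-processor location is an assumption (it ships with the V8 sources, e.g. under deps/v8/tools in a node checkout, with a mac-tick-processor variant for OS X):

    # profile a run; V8 writes a v8.log tick file into the working directory
    node --prof parserTests.js --filter 'some test'

    # post-process the ticks with V8's bundled analyzer (path is an assumption)
    deps/v8/tools/linux-tick-processor v8.log > profile.txt

Using --filter to keep the run short also keeps v8.log small enough for the analyzer, per the crash mentioned above.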
[17:56:05] but LuaJIT is awesome too
[17:56:06] "burn" as kids used to say in the US in the 90s
[17:57:38] gwicke you might want to look at https://github.com/dannycoates/node-inspector
[17:57:47] seems it even allows you to change the code of a running app
[17:58:45] you end up with an editor in your browser
[17:58:45] hmm- the parser is built as a string and then evaled
[17:58:55] that messes up the normal debugging unfortunately
[17:58:59] with an inspector that looks like firebug :)
[17:59:31] trying..
[18:00:38] failed to install with npm
[18:00:45] oh nooo
[18:01:13] TypeError: Bad argument
[18:01:15] npm ERR! at Object._open (fs.js:224:11)
[18:01:20] and so on..
[18:02:44] but I have installed node from the last git release tag, could be a problem for a debugger
[18:03:04] node --debug (IIRC) also offers a simple debugger
[18:03:16] but that does not help for evaled code
[18:03:46] maybe an issue with the node version? I got 0.6.2
[18:03:48] under mac os x
[18:04:14] 0.6.3, Debian
[18:05:02] I tried to include code into the grammar using require, but that failed with some obscure scoping issue
[18:05:34] so I had to paste it all into the top of the grammar file for now
[18:05:47] as long as it works, I guess you are fine
[18:06:04] but you have to debug it without help from traces
[18:06:17] more like c programming before gdb
[18:07:10] won't scale like this, but I gave up on that for now
[18:08:01] gwicke: sorry, need to get some bread :/
[18:08:03] I am out
[18:08:21] hashar: np- I'll also leave soon
[18:08:24] *hashar notices that unfortunately he gets out when robla gets in :-D
[18:08:26] see you tomorrow!
[18:08:28] thanks for your work!
[18:08:31] and enjoy the colors
[18:08:34] :-D
[18:08:35] bye hashar
[18:08:38] and the filters ;)
[18:08:40] bye!
[18:08:51] hello / goodbye robla :) might connect later in the evening though
[18:11:45] I'm leaving too, see you tomorrow!
[18:14:04] anyone know how to run update.php on prototype ? I'm unable to properly run it
[18:14:19] tried update.php --wiki enwiki, rc-en, rc_enwiki ...
[18:14:35] plus --iknowwhatimdoing since it's a wmf branch
[18:14:58] $ php update.php --wiki rc-en --iknowwhatimdoing
[18:15:05] Command line mode for: --wiki
[18:15:06] Notice: Undefined index: --wiki in /srv/org/wikimedia/prototype/wikis/rc/LocalSettings.php on line 56
[18:15:11] Commandline: NULL
[18:15:17] Database returned error "1046: No database selected (localhost)"
[18:17:18] update.php --help says to use --wiki
[18:19:14] okay, so LocalSettings is doing $wikidata=$wikis[$argv[0]]
[18:19:16] ..
[18:19:20] update.php rc-en worked :)
[18:19:35] ARG! Fatal error: Class 'LoggedUpdateMaintenance' not found in /srv/org/wikimedia/prototype/wikis/rc/maintenance/fixExtLinksProtocolRelative.php on line 27
[18:39:19] @replag
[18:39:21] Krinkle: No replag currently. See also "replag all".
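For context on why "update.php rc-en" worked above where "--wiki" did not: a hedged reconstruction of the prototype LocalSettings.php pattern quoted at 18:19:14 (the $wikis map is invented for illustration):

    <?php
    // The wiki is chosen from the first positional CLI argument, not from a
    // --wiki option, so "php update.php rc-en" selects a database while
    // "php update.php --wiki rc-en" looks up the key "--wiki" instead,
    // producing the "Undefined index: --wiki" notice and error 1046 above.
    $wikis = array(
        'rc-en' => 'rc_enwiki', // hypothetical mapping
    );
    $wikidata = $wikis[ $argv[0] ];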
[18:39:25] @replag all
[18:39:26] Krinkle: [s1] db32: 0s, db36: 0s, db12: 0s, db26: 0s, db38: 0s; [s2] db30: 0s, db13: 0s, db24: 0s; [s3] db34: 0s, db39: 0s, db25: 0s, db11: 1s
[18:39:27] Krinkle: [s4] db22: 0s, db31: 0s, db33: 0s; [s5] db45: 0s, db35: 0s, db44: 0s; [s6] db47: 0s, db43: 0s, db46: 0s; [s7] db16: 0s, db37: 0s, db18: 0s
[18:46:18] @replag all
[18:46:19] Krinkle: [s1] db32: 0s, db36: 0s, db12: 0s, db26: 0s, db38: 0s; [s2] db30: 0s, db13: 0s, db24: 0s; [s3] db34: 0s, db39: 0s, db25: 0s, db11: 0s
[18:46:20] Krinkle: [s4] db22: 0s, db31: 0s, db33: 0s; [s5] db45: 0s, db35: 0s, db44: 0s; [s6] db47: 0s, db43: 0s, db46: 0s; [s7] db16: 0s, db37: 0s, db18: 0s
[19:39:21] hrm
[19:39:28] *RoanKattouw wants a way to gracefully restart job runners
[19:40:12] are you fantasizing about some job runner that doesn't suck?
[19:40:44] Wait, I think I know what's going on here
[19:40:50] OK so the job runners are run with --max_time=300
[19:41:09] That means that the main process will stop serving jobs to its children 5 minutes after it's started
[19:41:36] Of course on srv267 I've got a job runner with 2 children (even though procs is set to 5), both with a running time of 98 minutes
[19:41:55] So 2 children were served very long-running jobs that run for much longer than 5 mins
[19:42:27] The problems are 1) such jobs exist, they shouldn't, and 2) this reduces the job runner capacity because no new procs are spun up; the parent is trying to wind down and move to the next wiki
[19:42:42] I think the cause of #1 is the way RefreshLinksJob2 is written
[19:43:39] It fetches a single title, grabs all templatelinks (or pagelinks or whatever the job parameter says, but I'm working on templatelinks right now) entries for that title, then parses *ALL* linked titles, then ends
[19:44:07] Now we've got this code
[19:44:10] # Not suitable for page load triggered job running!
[19:44:11] # Gracefully switch to refreshLinks jobs if this happens.
[19:44:13] if( php_sapi_name() != 'cli' ) {
[19:44:22] Then in the if block, it inserts a RefreshLinksJob for each title
[19:44:45] I'm wondering if we shouldn't do that instead
[19:45:01] For cli too, I mean
[19:45:10] Because these RefreshLinksJob2 jobs run forever
[19:48:09] RoanKattouw: so if it doesn't convert to sub-jobs (refreshLinks2 => refreshLinks) it does $wgUpdateRowsPerJob parses
[19:48:23] *AaronSchulz wonders how slow that is
[19:48:42] orly?
[19:48:45] Where is that limit?
[19:48:55] Oh BTW, I've found a way to do a graceful restart, somewhat
[19:48:56] LinksUpdate::queueRecursiveJobs
[19:49:09] I sent SIGSTOP to the jobs-loop.sh process
[19:49:47] hmm
[19:50:09] Oh, aha, it's got a start and an end
[19:51:07] it could either use a lower number than $wgUpdateRowsPerJob (usually 500) or just always convert to the single-parse jobs
[19:51:40] hmm
[19:51:43] I guess it's fine this way
[19:51:47] 500 is just a really large value
[19:51:55] 500 "Barack Obama"s zOmgz
[19:52:05] Exactly
[19:52:10] That's what srv267 was doing to itwiki
[19:52:15] Only one runner left there now
[19:52:32] So the thing is, if you have job procs that run for 100 minutes
[19:52:40] code updates take a long time to take effect
[19:52:45] yep
[19:53:40] sheesh
[19:53:42] 10780 apache 39 19 256m 97m 3912 R 100 0.8 149:39.17 php MWScript.php runJobs.php --wiki=itwiki --procs=5 --maxtime=300
[20:07:56] See //www.mediawiki.org/wiki/Special:ApiSandbox
[20:09:14] $desc[] = 'See ' . SpecialPage::getTitleFor( 'ApiSandbox' )->getFullUrl();
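The fix Roan suggests a few lines below swaps the call for getCanonicalUrl(): with a protocol-relative $wgServer, getFullUrl() produces the //-prefixed link shown above, while getCanonicalUrl() always expands to the canonical protocol and host. A hedged sketch of the corrected line:

    // getCanonicalUrl() expands against the canonical server, so the API help
    // text gets an absolute link instead of a protocol-relative one
    $desc[] = 'See ' . SpecialPage::getTitleFor( 'ApiSandbox' )->getCanonicalUrl();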
[20:21:49] > var_dump( $a->getInternalUrl() );
[20:21:49] string(48) "http://www.mediawiki.org/wiki/Special:ApiSandbox"
[20:21:56] RoanKattouw, ^ Is that enough to fix it?
[20:22:11] Use getCanonicalUrl() instead
[20:22:21] They're the same on WMF but that's a coincidence
[20:23:12] Cheers
[21:52:51] bsitu, rmoen: OK, the code is all reviewed, you guys ready to push it to testwiki?
[21:53:15] RoanKattouw: Yes :)
[21:53:58] RoanKattouw: Yes, thank you
[21:56:29] Code should be live on testwiki
[21:56:33] Running the index creation on all wikis now
[21:56:53] rmoen, bsitu: Please test on testwiki, and ping me when it's good to go site-wide
[21:57:27] RoanKattouw: yes, we are testing it on testwiki now
[22:11:25] DB updates all done
[22:22:52] RoanKattouw: All tested, we are ready.
[22:23:01] OK
[22:23:28] Holy fuck
[22:23:30] 225 Warning: require(/usr/local/apache/common-local//index.php) [function.require]: failed to open stream: No such file or directory in /usr/local/apache/common-local/live-1.5/index.php on line 3
[22:23:34] 224 Fatal error: require() [function.require]: Failed opening required '/usr/local/apache/common-local//index.php' (include_path='.:/usr/share/php:/usr/local/apache/common/php') in /usr/local/apache/common-local/live-1.5/index.php on line 3
[22:23:37] 8 Warning: require(/usr/local/apache/common-local//api.php) [function.require]: failed to open stream: No such file or directory in /usr/local/apache/common-local/live-1.5/api.php on line 3
[22:23:40] 8 Fatal error: require() [function.require]: Failed opening required '/usr/local/apache/common-local//api.php' (include_path='.:/usr/share/php:/usr/local/apache/common/php') in /usr/local/apache/common-local/live-1.5/api.php on line 3
[22:23:43] Sorry guys I gotta fix that first
[22:24:07] Woah
[22:26:20] OK they're subsiding now
[22:26:22] WTF was that
[22:26:46] I don't have a clue
[22:27:36] It looks like $IP was set to "" ?
[22:27:56] Oh, wait
[22:28:01] $version was set to ''
[22:29:20] hmm, looks like a bug in MWMultiVersion
[22:29:24] AaronSchulz: Did you see this?
[22:29:58] AaronSchulz: It looks like it's possible for MWMultiVersion::loadVersionInfo() to populate $this->version with 'false', at which point you get errors accessing /usr/local/apache/common-local//index.php
[22:30:26] It seems to require a corrupted .cdb file and running sync-common on the affected boxes made it go away, but still, it's scary
[22:31:15] rmoen: OK, now that things are quiet, MB deploy
[22:31:16] Here goes
[22:31:51] Hooray!
[22:32:01] RoanKattouw: isMissing() checks version == false
[22:32:12] Oh, huh
[22:32:17] Well somehow those paths were produced
[22:32:18] I think some 404 code uses that
[22:32:23] Maybe it was APC corruption, I don't know
[22:34:58] I wonder what was making those paths
[22:35:16] It came from the live-1.5 wrappers
[22:35:22] So it must've been live-1.5/MWVersion.php
[22:35:22] calling getVersion() or so would fail out due to the NotMissing assertion
[22:35:40] Which does stuff like require "/usr/local/apache/common-local/$version/$file";
[22:35:50] (Well not directly, it sets $IP first, but you get the idea)
[22:36:04] $version = $multiVersion->getVersion();
[22:36:12] aye
[22:36:17] Well maybe it was '' or something
[22:36:20] or null or whatever
[22:36:41] yeah, definitely requires corruption
[22:36:48] rmoen: All done, please test on enwiki. I'm not seeing errors at least
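To make the failure mode above concrete, a hedged sketch of what the live-1.5 wrapper effectively does, pieced together from the snippets quoted in the discussion (the '1.18wmf1' value is an assumed example):

    // If the corrupted .cdb cache yields '' instead of e.g. '1.18wmf1',
    // the path gains a double slash and the require() fails as logged above.
    $version = $multiVersion->getVersion();          // '' from the corrupt cache
    $IP = "/usr/local/apache/common-local/$version"; // ".../common-local/"
    require "$IP/index.php";                         // ".../common-local//index.php"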
[22:36:49] there was a key but it was empty
[22:36:57] as opposed to no key to begin with
[22:37:01] Right, a key with a value of ''
[22:37:07] weird
[22:37:19] But only on two boxes, where it just magically appeared
[22:48:29] RoanKattouw, Thanks for deploying. All is good
[22:48:43] Good
[22:55:55] congrats guys!
[22:56:45] this is really, really cool!
[23:09:14] RoanKattouw, can I request one small config change ? see https://www.mediawiki.org/wiki/Special:Code/MediaWiki/104775
[23:10:38] Config changes to trunk/extensions/MoodBar/MoodBar.php
[23:10:53] On it
[23:11:00] Thanks!
[23:14:39] rmoen: Deployed
[23:16:33] RoanKattouw: On test ?
[23:16:41] On the live site
[23:17:20] oh OK, maybe I'm seeing cache
[23:17:37] Oh, right
[23:17:39] config changed
[23:18:06] Touching startup.js to push the config change through to JS
[23:18:13] This, and the JS change, may take 5 mins to propagate
[23:19:01] Oh you got the $.trim in there ?
[23:51:06] hey jorm
[23:51:20] do you have a sec to check out the blog draft?
[23:53:51] been looking at it.
[23:53:53] seems good.
[23:53:57] sec.
[23:54:05] lemme give it another once over.
[23:55:16] reading google doc; don't have perms to read in-situ drafts.
[23:55:43] i can't read it, dude. the google doc is weird.
[23:55:55] i trust your opinions, though, so just click "publish"
[23:55:57] DO IT