[14:14:00] Reedy: mark was saying that we could do stats collection as a separate daemon if we wanted to
[14:14:10] Does any other configuration need doing for collector, just running it seems to make it listen on 3811
[14:14:11] Oh?
[14:14:37] you can just have the daemon log to an rrd file, and then tell torrus the location of it
[14:14:43] yeah
[14:15:03] no, collector doesn't need configuration
[14:15:04] like everything, it's ever so well documented ;)
[14:15:34] torrus is very well documented
[14:15:42] I was meaning collector
[14:15:48] Hmm, Ryan seems to have debianised udpprofile..
[14:16:08] torrus is written in perl, right?
[14:16:12] ah, torrus has a process called collector as well
[14:16:13] right
[14:16:36] so if we make a torrus plugin, it has to be perl
[14:16:46] whereas if it is a separate daemon, you get to choose the language
[14:17:14] as you said, torrus can pull numbers over http
[14:17:27] so then you don't need to write anything on the torrus side
[14:18:12] domas's collector gives stats via a plain TCP socket in XML form
[14:18:26] if we wanted to use HTTP we'd have to write some sort of TCP to HTTP thing
[14:18:41] I don't think that would be any easier than a torrus plugin
[14:20:19] the HTTP plugin is only ~150 lines of perl, so I thought it would be pretty easy to do the required TCP/XML one
[14:20:54] as long as Reedy doesn't mind perl too much
[14:21:35] adding an HTTP header to the XML served by TCP should be even easier
[14:21:46] It's probably a somewhat useful task to get familiar with perl
[14:22:12] yeah, perl is not too bad
[14:22:28] it's a bit jarring to use a language that doesn't have proper function parameters
[14:22:38] kind of like a cross between shell script and a proper programming language
[14:23:19] heh
[14:23:26] Platonides: I think what you just said was wrong in three different ways
[14:23:37] I was just trying to see if I could steal Ryan's package for it...
[14:24:03] unfortunately ganglia's protocol is not great either
[14:24:06] but it's late so I don't really want to have to explain it
[14:24:15] it's easy to write python plugins for it, but it doesn't offer that much control
[14:24:36] given that I don't know which language domas's collector is written in or how it works, it may be
[14:26:57] Reedy: when it comes to actually writing the plugin, it would be cool if we could define data sources in the torrus configuration file based on any stats key
[14:27:53] To try and keep it somewhat generic/reusable?
[14:28:04] yeah, then we could instrument anything in MW with wfIncrStats(), and easily add a graph for it
[14:28:15] just by changing the torrus config
[14:28:44] Shouldn't be too much of an issue
[14:29:06] Might have a power cut coming. Thunder is starting
[14:29:33] I would like to have something similar per host
[14:29:41] that's the advantage of using a laptop for your work machine
[14:29:46] like, mediawiki (or any other application) puts numbers in a simple format in a file
[14:29:57] which is picked up by gmond and graphed automatically by ganglia
[14:30:02] on each server
[14:30:16] The server is on a UPS so that'll be fine
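
A rough sketch of the TCP/XML exchange described above, shown as a small Node.js script purely for illustration; an actual torrus plugin would be written in Perl, as discussed. Port 3811 is taken from the conversation, and the assumption that the collector dumps its XML to any client that connects and then closes the socket is just that, an assumption.

    // Poll the profiling collector's plain-TCP interface and hand back the raw XML.
    var net = require( 'net' );

    function fetchCollectorStats( host, port, callback ) {
        var xml = '';
        var sock = net.connect( port, host );
        sock.setEncoding( 'utf8' );
        sock.on( 'data', function ( chunk ) {
            xml += chunk;
        } );
        sock.on( 'end', function () {
            callback( null, xml );
        } );
        sock.on( 'error', function ( err ) {
            callback( err );
        } );
    }

    fetchCollectorStats( 'localhost', 3811, function ( err, xml ) {
        if ( err ) {
            throw err;
        }
        // A torrus plugin, or a thin TCP-to-HTTP shim, would parse this XML and
        // expose the counters it cares about; here we just print it.
        console.log( xml );
    } );
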
[14:31:26] Where do the debs for the wm packaged stuff end up?
[14:31:44] mark: I guess that would be useful
[14:32:10] some basic per-host apache stats, req/s etc., would be useful too
[14:32:23] Reedy: apt.wikimedia.org
[14:32:33] http://wikitech.wikimedia.org/view/Apt.wikimedia.org
[14:33:30] ok, bed time for me, good night
[14:34:10] night Tim
[14:34:12] good night, Tim
[14:37:42] mark, has the wm apt repository got a gpg key anywhere? (just to shut the errors up more than anything)
[14:38:21] RTFM seemingly..
[14:39:42] wikimedia-keyring doesn't exist
[14:48:31] yeah, it does
[14:48:53] http://apt.wikimedia.org/wikimedia/pool/main/w/wikimedia-keyring/
[14:49:31] Why's it give E: Unable to locate package wikimedia-keyring
[14:49:31] while udpprofile will work...
[14:49:50] maybe it's just for hardy?
[14:50:00] that ~hardy1 looks suspicious
[14:50:08] oh
[14:50:54] Yup, seemingly
[14:51:19] Cheers
[17:18:41] RoanKattouw: ping
[17:18:57] Krinkle: Pong
[17:19:19] RoanKattouw: Remember we discussed making an api call to Meta-Wiki in the startup module for RL2 ?
[17:19:27] i.e. to get the global gadgets registry, would be 5 min cached etc.
[17:19:52] Yeah... didn't we say we were gonna do it with DB access instead?
[17:19:53] (in Berlin)
[17:19:57] right
[17:20:02] So, here's
[17:20:27] So, I'm currently writing out some rough fake code of how it would work, just to discover any obstacles
[17:20:40] One thing I'm facing is getting a message from the remote wiki.
[17:21:00] For modules it's no problem, they're delivered through resourceloader load.php to the client
[17:21:09] but for special:preferences we need the messages server-side as well
[17:21:36] i.e. title and description of the gadget
[17:21:39] Yeah getting messages from a remote wiki is pretty much a nightmare AFAIK
[17:21:44] I don't think it's been done before
[17:21:58] So just a 1-time api call for all messages and cache them.
[17:22:04] Could do
[17:22:58] But parsing, hmm.. initially I thought we could parse the descriptions locally
[17:23:17] but.. people should be able to use templates in the descriptions (they already do so)
[17:23:25] Yes, that's where the nightmare starts
[17:23:34] You could get away with assuming it's an MW: page
[17:23:43] Although we could forbid that, it's only used right now for workarounds that will be redundant with RL2
[17:24:27] Oh, you mean using templates
[17:24:32] Yeah we could make them noparse messages
[17:24:40] hi
[17:24:42] Or get meta=allmessages to parse or preprocess stuff
[17:24:49] Shouldn't be too hard to do with wfMessage()
[17:25:21] aha amenableparser=1
[17:25:24] nice
[17:25:32] is that new ?
[17:25:37] Does that already exist?
[17:25:42] yep
[17:25:42] I parsed that as 'amenable parser'
[17:25:50] action=allmessages&amenableparser
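
For reference, a rough sketch of what such an allmessages call could look like from the client side of a startup module. This is not the RL2 implementation; jQuery with JSONP is assumed, the message keys are made up, and Special:Preferences would still need the same data server-side via wfMessage().

    // Fetch a couple of (hypothetical) gadget messages from Meta-Wiki, asking the
    // remote wiki to parse them with amenableparser so templates are expanded there.
    $.ajax( {
        url: '//meta.wikimedia.org/w/api.php',
        dataType: 'jsonp', // cross-domain, so JSONP
        data: {
            format: 'json',
            action: 'query',
            meta: 'allmessages',
            ammessages: 'gadget-foo|gadget-foo-desc', // hypothetical message keys
            amenableparser: 1
        },
        success: function ( data ) {
            $.each( data.query.allmessages, function ( i, msg ) {
                // In the default JSON format the parsed text lives under the '*' key
                console.log( msg.name + ': ' + msg['*'] );
            } );
        }
    } );
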
[17:26:43] RoanKattouw: howdy
[17:26:43] s
[17:26:46] RoanKattouw, you did a tidyup on that revision
[17:26:48] sorry I wasn't on
[17:26:50] !r 62532
[17:26:50] --elephant-- http://www.mediawiki.org/wiki/Special:Code/MediaWiki/62532
[17:26:51] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/62532
[17:26:57] http://translatewiki.net/w/api.php?action=help&querymodules=allmessages
[17:27:32] RoanKattouw: I think Alolita was wanting me to show you this: r88009
[17:27:42] !r 88009
[17:27:42] --elephant-- http://www.mediawiki.org/wiki/Special:Code/MediaWiki/88009
[17:27:42] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/88009
[17:27:51] Whoa
[17:28:38] Reedy: Tidy up how exactly?
[17:28:44] Whitespace and shizz
[17:28:55] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/62535
[17:29:08] Oh
[17:29:15] It was a statement, not a question, right
[17:29:21] yup
[17:29:23] hence no ?
[17:29:24] ;)
[17:30:51] TrevorParscal, alolita: OK, so you guys wanted r88009 deployed?
[17:32:19] TrevorParscal, alolita: OK, so you guys wanted r88009 deployed?
[17:34:49] yeah, it will help reduce API load when we ramp up
[17:40:00] Alright
[17:40:26] I'll get set up in a proper work environment, then do that deployment
[17:40:32] cool
[18:22:14] RoanKattouw: thanks!
[18:23:54] TrevorParscal, alolita: Rev is live on test now
[18:24:07] RoanKattouw: Awesome - thanks!
[18:24:21] RoanKattouw: Is clicktracking ramped up too
[18:24:26] No not yet
[18:24:32] I'm taking this one at a time
[18:24:47] Please test the change on test and tell me when it's good to go to production
[18:27:24] hey RoanKattouw, looking at test right now
[18:29:20] RoanKattouw: it's on test?
[18:29:35] so the expertise buckets have been removed, right?
[18:29:39] I can't seem to get it to wait until I click View page ratings for it to load
[18:30:34] DarTar: removed is sort of wrong, they are being adjusted so everyone gets them, rather than 50/50
[18:30:45] right
[18:30:45] http://test.wikipedia.org/wiki/Article_Feedback_Test
[18:30:52] got it thx
[18:31:33] The bucket change hasn't been made yet
[18:31:40] All I did it push Trevor's change to test
[18:32:04] *did is
[18:33:08] TrevorParscal: Are you saying there's a bug?
[18:33:38] RoanKattouw: let me check if it's a caching issue
[18:33:54] we were also discussing the possibility of adding a link to the AFT FAQ (e.g. what's this), as many people are asking for more transparency - we can probably do it as of the next upgrade?
[18:34:38] Sure
[18:34:46] DarTar: Why don't you file these tasks as bugs in Bugzilla
[18:34:52] There was the print mode thing to
[18:34:54] *too
[18:34:55] sure
[18:35:27] RoanKattouw: nevermind, it's working correctly
[18:35:27] shall I post that too or has it been patched already?
[18:35:41] I was seeing load.php and thinking api.php...
[18:36:00] It hasn't been fixed yet
[18:36:05] But it's a one-liner
[18:40:14] https://bugzilla.wikimedia.org/show_bug.cgi?id=29155
[18:40:24] Alright, so are we good to push Trevor's change to production?
[18:40:44] DarTar: Good, thanks
[18:40:55] looks good to me
[18:44:32] Alright, if no one objects I'll just go ahead
[18:45:04] i say yes!
[18:46:00] OK that's done
[18:46:06] Now let's do the bucketing thing
[18:46:37] And the click tracking rate ramp-up
[18:46:57] DarTar: You said we were getting 5x less clicktracking volume after the 100k deployment, correct?
[18:47:10] So that was the reasoning for going from 2% to 10%, to get our old volume back?
[18:47:34] yes, we should at least have the same volume to be able to do some analysis
[18:47:40] Alright
[18:47:49] at the moment we are getting a ridiculously small number of events or conversions
[18:48:17] TrevorParscal: I can haz sanity check? http://dpaste.org/qhOT/
[18:48:18] from my perspective the more data we get the better, but 10% should already be a good start if it's sustainable
[18:48:41] Well, if it brings us back to the original volume in terms of events/sec, it should be fine
[18:48:47] If we sustained it before, we should be able to sustain it again
[18:49:36] right
[18:49:47] just posted https://bugzilla.wikimedia.org/show_bug.cgi?id=29156
[18:49:48] *TrevorParscal looks
[18:50:11] RoanKattouw: looks good
[18:50:19] Alright
[18:51:14] let me know when it's live
[18:51:47] !log catrope synchronized php-1.17/wmf-config/CommonSettings.php 'Raise AFT tracking percentage from 2% to 10% (restores pre-ramp-up volume) and put all users in the show-expertise bucket. Also bump tracking version to 7'
[18:51:56] Will take between 5 and 10 mins to propagate, as you know
[18:52:01] ok
[18:52:16] there's a staff lunch right now
[18:52:21] but I will be bringing my laptop
[18:52:37] hmm, or in a few minutes anyways
[18:53:22] Alright, I guess I'm done here
[18:53:29] I'll be around in case the world ends
[18:53:47] But I'll be doing some thesis work
[18:55:09] k
[19:04:06] RoanKattouw: Do you know of a case where the input mw.loader.register receives is not an object?
[19:04:31] Not offhand no
[19:04:32] this.register = function( module, version, dependencies, group ) { if ( typeof module === 'object' ) { mw.loader.register( modules[i]; }
[19:04:41] Oh
[19:04:48] I'd like to minimize that to just one argument, and move the rest to a private function
[19:04:50] It's a flexible recursive implementation
[19:05:08] That pattern is used all over the place in MW/RL
[19:05:49] But aside from register itself, it's never passed all arguments, only with 1 argument (string or object)
[19:06:13] register with just a string shouldn't work
[19:06:18] register with just an object will work
[19:06:31] Because it iteratively calls itself for each element
[19:06:56] reason being that with RL2 loader.register will need to take an additional option for 'origin' (local or gadgetswiki)
[19:07:08] so?
[19:07:15] this.register = function( modules, origin ) would be nicer.
[19:07:27] Oh, non-repetitive origin?
[19:07:34] Meh, who cares
[19:07:46] You should take this up with TrevorParscal, who is at a lunch thing right now
[19:08:06] The flexible recursive function thing is very much his style
[19:08:12] k
[19:08:23] anyway, we'll deal with that when it comes to implementation.
[19:08:42] I'll assume for now that calling with all arguments is not used except by .register() itself.
[19:10:19] You can just add a param in keeping with the existing style, that should be easy, right?
[19:10:36] If you want to refactor it so the origin isn't repeated later, you can discuss that with Trevor
[19:11:21] No, that would require each of the 100s of modules to add 'origin': foobar to its register thing.
[19:11:59] Yeah so like I said that's duplication, but should be easy to implement
[19:12:14] I agree it'll be nicer to do it that way eventually
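
A sketch of the single-argument registration being proposed here, with a non-repetitive origin. It is illustrative only, not the actual mw.loader.register code; the registry layout, module name and origin value are all made up.

    // Assumed shape: a map of module name -> metadata, plus one shared origin.
    var registry = {};

    function register( modules, origin ) {
        origin = origin || 'local'; // e.g. 'local' or a remote gadgets wiki
        for ( var name in modules ) {
            var m = modules[ name ] || {};
            registry[ name ] = {
                version: m.version,
                dependencies: m.dependencies || [],
                group: m.group || null,
                origin: origin
            };
        }
    }

    // One call per origin instead of repeating 'origin' on every module:
    register( {
        'ext.gadget.foo': { version: 20110530, dependencies: [ 'jquery' ] }
    }, 'gadgetswiki' );
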
[19:29:57] bah my pretty URL patch got reverted again :-(
[21:51:32] best practices question: if i have a function that i want to be useable in a SpecialPage and a maintenance script, where is the best place for that function to live? currently, i am relying on a static method from a SpecialPage class in a maintenance script, but have been told that's icky and bad practice
[21:53:57] which kind of function is it?
[21:55:02] Platonides: it formats results from a db query into a common data object
[21:57:49] maybe it should go somewhere in the db folder
[22:19:37] thanks Platonides
[22:24:12] it was probably not too useful :)
[22:24:46] it got me thinking about actually establishing some kind of model accessible within the scope of the extension :)
[22:25:12] i think it put me on the right track