[00:55:12] gwicke_: Hey, busy tonight?
[00:56:24] marktraceur: am busy with not being busy for a change ;)
[00:56:50] gwicke_: Care to join me in my regular Sunday insanity? http://rose.makesad.us/pipermail/beowulf-cluster/2012-December/000009.html
[00:57:34] beowulf? are you involved in that project?
[00:57:49] gwicke_: It's not a project, just a meeting of free software people :)
[00:58:04] It's a spin-off of the meeting in Boston at a restaurant called Grendel's
[00:58:14] (we're *that* nerdy)
[00:58:43] interesting- beowulf used to be a low-cost HPC clustering project
[00:58:59] linux-based normally
[00:59:02] Yes, that's one of the wordplays involved
[00:59:30] All of us being free software people, most of us remembering that project (myself not among them, but I've had it explained to me)
[01:00:10] k, its heydays are probably over by now
[01:00:30] I'm more in clean-up and reading mood tonight
[01:00:35] AOK!
[01:00:52] but sounds like fun!
[01:01:10] gwicke_: beowulf's heyday was over in the 11th century or thereabouts :P
[01:01:20] hehe- yes
[01:01:36] read up on that when I came across the HPC beowulf for the first time
[01:01:49] ori-l: When he killed Grendel (wordplay #2)
[01:02:28] this weekend has been the usual recovery shutdown experience..
[01:03:15] how are things with parsoid?
[01:03:37] pretty good- we fared better than expected
[01:04:02] clean diffs for the most part make happy early adopters
[01:05:23] i've noticed a fever of puppet activity around parsoid -- is it exposed to the mediawiki boxen as a network RPC?
[01:05:48] yes, a stateless HTTP service
[01:06:16] the load balancing now actually works, so we don't run on a single machine any more ;)
[01:06:31] how did you implement load balancing?
[01:06:40] node's cluster module?
[01:06:42] Roan did all that- he used LVS
[01:06:59] and each box uses cluster to manage workers
[01:07:17] we start number of cores + 3 workers
[01:07:32] but each worker can only handle a single request
[01:07:37] at a time
[01:08:24] are you planning to keep this basic architecture, or to bundle everything into core?
[01:09:10] ori-l: the latest idea is to eliminate wikitext parsing from MW in the long run ;)
[01:09:11] it'd be nice if our architecture was more message-oriented rather than monolithic. that would allow for greater diversity of languages and tools. whatever's right for the job.
[01:09:36] following the idea of no parser being the fastest parser
[01:10:12] NoParsing, à la NoSQL
[01:10:18] lot of things to research before making a plan on that though
[01:10:45] and discussion obviously
[01:10:50] yes, otherwise we'd end up with a NoWiki :)
[01:11:27] we can still have Parsoid as a text UI
[01:11:36] have you used erlang and OTP, btw?
[01:11:59] ori-l: played a little bit with it, but not much
[01:12:03] mostly in college
[01:12:44] spent more time with Haskell
[01:13:13] haskell is ultimately too WTF for me
[01:13:42] ori-l: there is a pretty nice book called real world haskell
[01:13:51] http://book.realworldhaskell.org/read/
[01:14:10] but I agree re components and message passing
[01:14:28] I'm trying to give everything an URL and make it all pure / RESTful
[01:15:44] oh, that's neat. i'll have to toy around with it one of these days
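A minimal sketch of the worker setup gwicke_ describes above (a stateless HTTP service per box, forking number-of-cores + 3 workers with node's cluster module, each worker handling one request at a time, with LVS balancing across boxes). This is an illustration under those stated assumptions, not Parsoid's actual code; the port and the request handler are placeholders.

```js
// Illustrative sketch only -- not the actual Parsoid service code.
var cluster = require('cluster');
var http = require('http');
var os = require('os');

if (cluster.isMaster) {
    // Fork number of cores + 3 workers, as described in the log above.
    var numWorkers = os.cpus().length + 3;
    for (var i = 0; i < numWorkers; i++) {
        cluster.fork();
    }
    // Replace any worker that dies so the pool stays at full strength.
    cluster.on('exit', function () {
        cluster.fork();
    });
} else {
    // Each worker serves one request at a time; LVS spreads traffic
    // across boxes, cluster spreads it across workers on each box.
    http.createServer(function (req, res) {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.end('parsed output would go here\n'); // placeholder body
    }).listen(8000); // hypothetical port
}
```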
[01:15:46] if we can avoid spending too much time on writing a super-fast parser we might instead be able to make progress on fragment caching and incremental updates
[01:16:18] which has more potential in the longer term
[01:16:54] i think so too
[01:18:11] the question marks are mainly in the interaction with existing extensions
[01:18:30] hardware is cheap and with an architecture oriented around message-passing between discrete components you can scale by adding hardware
[01:18:44] how many depend on the frame object for example
[01:19:36] i'm reading http://www.mediawiki.org/wiki/Manual:Tag_extensions now, didn't know about any of this stuff
[01:21:28] ok, back to sorting out this rat's nest of python and php :-/
[01:21:32] ttyl
[01:21:59] to each his own rat's nest to sort out ;)
[01:22:36] Indeed. What on earth should I have for dinner?
[01:22:42] rats!
[01:22:48] * marktraceur is happy with the complexity of that problem, thanks very much
[01:23:37] * gwicke_ solved that problem by cutting some bread and pecorino
[01:24:34] gwicke_: Not sure I particularly feel like staying in tonight
[01:25:11] discovered a really nice Thai place near Japan town on Friday
[01:25:39] Hm, bit far. But possible, with BART.
[01:25:59] http://www.yelp.com/biz/jitlada-thai-cuisine-san-francisco
[01:26:34] Farther than I thought....maybe not
[01:27:14] The old standby seems OK, cheap 'n' tasty Chinese food right around the corner
[01:27:24] Maybe followed by some cafe or something
[01:27:28] hehe
[01:27:50] guess which Indian restaurant is right around my corner?
[01:27:59] I clearly need to have either more friends in the Mission or a more reliable way of reaching people for dinner plans
[01:28:09] gwicke_: I happen to know exactly, and I'm super-jealous
[01:28:23] ;)
[01:28:23] Though there is the Indian pizza place just up Mission....maybe I could hit the 14....
[01:28:42] Well, maybe. I'll go figure that out now, y'all have a good evening with your rats
[01:29:02] ya, have a good one & see you tomorrow!
[09:55:55] hello
[10:08:40] * siebrand greets hashar
[10:08:47] :)
[10:12:36] hashar: are https://gerrit.wikimedia.org/r/#/c/38553/ and https://gerrit.wikimedia.org/r/#/c/38555/ any good?
[10:12:44] siebrand: yup
[10:12:57] siebrand: I am checking the first one for extension dependencies
[10:13:14] hashar: afaik non of them have any formal tests yet.
[10:13:18] siebrand: I will have to do some evil copy paste in Zuul configuration next
[10:13:32] might end up writing a templating system in Zuul to avoid all the mess :)
[10:14:00] hashar: Just want to have linters run, so that when people do merge crap that was pointed out to them, I can take the clue bat.
[10:14:29] !g I67f92915ebca7c469116571d90b1a76a68fbf97d
[10:14:29] https://gerrit.wikimedia.org/r/#q,I67f92915ebca7c469116571d90b1a76a68fbf97d,n,z
[10:14:35] siebrand: will do that :)
[10:15:38] need to setup the Wikibase extension first
[10:25:05] New patchset: Hashar; "ext dependencies + Wikibase jobs" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/38528
[10:25:57] New review: Hashar; "PS2:" [integration/jenkins-job-builder-config] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/38528
[10:26:05] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/38528
[10:38:33] hashar: on https://gerrit.wikimedia.org/r/#/c/38528/, validator is not required but otherwise looks good
[10:38:48] aude: maybe I should remove it so :)
[10:38:52] sure
[10:39:11] don't think it'd break anything by being there but
[10:39:21] not needed
[10:39:54] New patchset: Hashar; "get rid of -install jobs for MW Extensions" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39030
[10:40:08] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39030
[10:42:40] New patchset: Hashar; "get rid of -install jobs for MW extensions" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39031
[10:43:16] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39031
[10:50:22] New patchset: Hashar; "Add Babel, CleanChanges and LocalisationUpdate to job builder" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/38553
[10:50:28] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/38553
[10:56:45] New patchset: Hashar; "Triggers for Babel, CleanChanges and LocalisationUpdate" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39033
[10:56:58] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39033
[10:57:57] New review: Hashar; "Zuul configuration deployed with https://gerrit.wikimedia.org/r/39033" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/38553
[13:04:05] New patchset: Hashar; "job template for Python pep8 runs" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39044
[13:04:05] New patchset: Hashar; "pep8 job for ops/puppet" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39045
[13:09:27] New patchset: Hashar; "pep8 jobs are non voting (for now)" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39046
[13:09:27] New patchset: Hashar; "lint ops/puppet python files (non voting)" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39047
[13:09:49] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39044
[13:10:02] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39045
[13:10:13] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39046
[13:10:37] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39047
[13:17:23] New patchset: Hashar; "typo declaring the pep8 publishing macro" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39048
[13:17:40] Change merged: Hashar; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39048
[14:12:00] Nemo_bis, Nikerabbit - Why in I7fcbde6a did L10n-bot basically revert I50f775fd?
[14:22:04] anomie: let's ask Raymond
[15:10:23] anomie: Raymond_> Nemo_bis: it can happen when a i18n related patch set is merged in the middle of the export
[15:10:37] he'll resubmit for you this evening before re-exporting
[15:12:39] Nemo_bis- Thanks.
[17:40:52] Im guessing the wikimedia tech talk 12.13.12 is no longer "NOW"
[17:41:24] <^demon> Well, unless they've been running a talk for 4 days straight ;-)
[17:41:48] that's a long talk
[17:44:06] <^demon> brion: I've attended more than a few lectures that *felt* that long ;-)
[17:44:11] hehe
[17:46:13] New patchset: Milimetric; "test" [analytics/reportcard/data] (master) - https://gerrit.wikimedia.org/r/39067
[18:37:08] Krenair: Good luck with bug 14862 :-)
[18:37:53] James_F, thanks. csteipp's notes make this sound easier than I would've expected...
[18:38:22] Krenair: Chris and I were talking about it quite a bit, and he got it all worked out, which I guess helps. :-)
[19:01:37] ori-l: ping
[19:31:30] hello
[19:32:43] hashar: Hi!
[19:33:01] <^demon> hashar: Howdy. Did you see my e-mail?
[19:33:18] ^demon: not yet
[19:33:40] <^demon> Subject: "Misconfigured Zuul?"
[19:33:48] oops
[19:34:34] so one job disappeared
[19:34:50] <^demon> That sounds...not good
[19:35:45] unemployment can raise after elections
[19:35:46] INFO:jenkins_jobs.builder:Creating jenkins job mediawiki-core-install-sqlite
[19:36:29] preilly: hey; in a meeting, but it's ending in 20. should i swing by yr desk?
[19:39:29] ^demon: fixed
[19:40:28] <^demon> Cool, thanks!
[19:40:32] <^demon> Glad it was an easy fix :)
[19:41:20] ^demon: also mailed wikitech about it as a post mortem
[19:44:34] ^demon: can we get the Verified label extended tomorrow ? :)
[19:47:33] <^demon> Yeah, that should be fine.
[19:53:31] Do oldids change if you move a page?
[19:54:51] Can't see any reason why they would..
[20:03:46] Change abandoned: Milimetric; "(no reason)" [analytics/reportcard/data] (master) - https://gerrit.wikimedia.org/r/39067
[20:57:09] New patchset: Hashar; "support Verified+2" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39082
[20:57:50] New review: Hashar; "Pending Gerrit change to extend Verified to -1..+2" [integration/zuul-config] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/39082
[21:10:05] New patchset: Hashar; "patch merged is now pink" [integration/doc] (master) - https://gerrit.wikimedia.org/r/39083
[21:10:06] New patchset: Hashar; "mention pipeline is from Zuul" [integration/doc] (master) - https://gerrit.wikimedia.org/r/39084
[21:10:06] New patchset: Hashar; "update workflow with Verified+2 enhancement" [integration/doc] (master) - https://gerrit.wikimedia.org/r/39085
[21:10:34] New review: Hashar; "Workflow flowchart is updated with https://gerrit.wikimedia.org/r/#/c/39085/" [integration/zuul-config] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/39082
[21:10:57] New review: Hashar; "Wait for https://gerrit.wikimedia.org/r/#/c/39082/ Zuul change to be applied" [integration/doc] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/39085
[21:11:18] Change merged: Hashar; [integration/doc] (master) - https://gerrit.wikimedia.org/r/39083
[21:11:27] Change merged: Hashar; [integration/doc] (master) - https://gerrit.wikimedia.org/r/39084
[21:15:04] ori-l: I'm remote
[21:23:52] preilly: and how
[21:24:02] ori-l: ?
[22:05:27] hashar: So there's 2 things currently bothering me in ci that I'd like to know what your agenda is for.
[22:05:30] * qunit
[22:05:57] * build steps instead of job stacks
[22:05:57] Krinkle: I am in conf call right now sorry
[22:44:16] fenari going down
[23:00:57] <^demon> notpeter: dive! dive!
[23:01:07] heh
[23:02:04] anomie: so I will rebase the EQIAD related changes and ping you so we agree on deployment window
[23:02:37] hashar- ok
[23:25:57] James_F|Away: "Technical Debt" ?
[23:28:33] Krinkle: You know, there's this encyclopedia on the internet...
[23:28:33] http://en.wikipedia.org/wiki/Technical_debt
[23:30:09] It is the name of a new component in VisualEditor bug tracker.
[23:32:11] SauceLabs is awesome!
[23:32:33] Reedy: Thx..
[23:39:07] running slightly over my window due to ops meeting/lunch combo
[23:47:33] New patchset: Ori.livneh; "Subscribe to ZMQ publisher and log to time-rotated file" [analytics/glass] (master) - https://gerrit.wikimedia.org/r/39157
[23:48:47] fenari back up
[23:57:37] Change merged: Ori.livneh; [analytics/glass] (master) - https://gerrit.wikimedia.org/r/39157