[00:40:09] Lue meetup streaming URL: http://www.youtube.com/watch?v=PrhzAtC8fCc - we will start around 6pm-ish (in about 20 mins) [00:40:12] Lua [00:40:18] anomie: ^^ [00:40:39] marktraceur- I'm helping present, so I need the hangout link ;) [00:40:47] Thanks though [00:41:45] TimStarling: [00:42:06] (qgil wants you) [00:42:57] both anomie and TimStarling have the hangout url now [00:43:06] all the rest can live with streaming & irc [00:43:40] * marktraceur will be your resident James_F today, yelling out questions from IRC. [01:14:50] Our speakers have arrived and they have a presentation ready to go! [01:18:56] And we're live! [01:22:29] The talk is "A brief introduction to Lua" by Drew Ditthardt. [01:23:07] Go ahead and ping here with questions, I will be handling the real-life asking for IRCizens. [01:32:21] Next up, robla + anomie + TimStarling giving a presentation on Lua modules in MediaWiki. [01:55:01] how open are the lua developers to bug reports, patches, etc. ? [01:55:17] To the core lua VM I mean. [01:57:41] xyzram: I'll ask just after they're done with this question [02:01:55] xyzram: Hopefully that was helpful, if not I can ask follow-ups too :) [02:02:21] is this where questions can be asked into the live stream going on right now? [02:02:37] That was adequate for this sort of forum I think :-) [02:02:41] Yes! Hi, thingles :) [02:02:49] Thanks. [02:02:55] marktraceur: very cool! Hi! [02:03:09] I'll be your figurehead question-asker today, how may I help you? [02:03:48] I would love to hear how Lua modules can be extended by other extensions? Can an extension like Semantic MediaWiki extend the Lua capability to hook into page properties? [02:04:20] Great question. Just a moment while they finish this question. [02:04:24] thanks! [02:04:57] (thank you to all for streaming this btw) [02:05:16] ohh... that error display is gorgeous [02:06:56] marktraceur: thank you! [02:07:03] thingles: My pleasure! [02:07:31] More questions? [02:08:14] that's great to hear... I sure hope SMW extends Scribunto... I'm sure it's in the plans. [02:08:23] * marktraceur thanks thingles and xyzram for validating his existence :) [02:08:31] lol [02:13:14] if there is time for one more question: What is the target date for MediaWiki 1.21 to be released so we can use this without having to use a wmf-n release? [02:13:36] Well, 1.22 will be explicitly branched on Monday or so [02:13:42] *1.21 [02:13:46] and master will become 1.22 [02:13:53] nice [02:14:45] thingles: https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap [02:15:47] What's the overhead like to convert data from PHP to Lua and back e.g. function arguments ? [02:16:14] xyzram: Would it take anything? It's only strings as far as I know. [02:16:28] Not hashtables ? [02:16:39] xyzram: Or do you mean like the library contents? [02:17:16] No if you have an array, doesn't it need to converted to pass as a data structure ? [02:17:40] This sort of this has a large overhead for Java to C and back. [02:17:48] xyzram: I don't think you can pass an array into Lua from Wikitext... [02:18:11] Ah, ok, non-issue then. [02:18:46] xyzram: Yeah, I guess the answer would be "PHP doesn't call Lua, it just sets up libraries for Lua to use and lets wikitext call into Lua", maybe with some implementation details that I'm ignoring. [02:19:38] Ok. [02:20:02] Nikerabbit: you still want help with that rebase? [02:21:36] cndiv: you have some directions documented for how to start hangout on air properly? 
(there was another meetup this evening where they were doing it differently than y'all do) [02:21:53] ironically said meeting was at google [02:22:07] and the guy i was discussing the wrong link with was a googler. :) [02:27:06] marktraceur- I guess xyzram left, but to answer the question: using the luasandbox php extension to embed Lua in PHP, it's very fast, just wrapping the Lua values in PHP zvals or unwrapping them. For the LuaStandalone engine, it's a bit more overhead as it has to serialize the data structure on either end to send it as a string over the pipe; then the receiver just calls loadString() (Lua) or unserialize() (PHP) to convert it back into a native da [02:27:06] ta structure. [02:29:03] luasandbox does copy arguments, so there is some overhead depending on the data size [02:29:24] except for closures [04:54:49] New patchset: Krinkle; "Install JSDuck 4.6.2 into tools/gem_modules." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53929 [04:56:13] New patchset: Krinkle; "Rename tools/mw-doc-gen.sh to tools/mwcore-docgen.sh." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53931 [04:56:50] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53931 [04:57:04] New patchset: Krinkle; "Add mediawiki-core-docgen job" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39210 [05:14:59] New patchset: Krinkle; "Trigger tests-mediawiki-docgen from the postmerge pipeline." [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53933 [05:14:59] New patchset: Krinkle; "Trigger mediawiki-core-docgen from the postmerge pipeline." [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53934 [05:16:33] New patchset: Krinkle; "Trigger mediawiki-core-docgen from the postmerge pipeline." [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53934 [05:16:34] New patchset: Krinkle; "Trigger tests-mediawiki-docgen from the postmerge pipeline." [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53933 [05:17:02] Change merged: Krinkle; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53933 [05:25:18] New patchset: Krinkle; "mwcore-docgen: Fix script to work with postmerge parameters." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53936 [05:25:29] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53936 [06:01:16] New review: Krinkle; "Perhaps only do it for jobs of which the results do not matter in detail:" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/53357 [06:12:40] Change merged: Krinkle; [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39210 [06:21:44] New patchset: Krinkle; "mwcore-docgen: Don't hard code mediawiki-core" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53947 [06:21:57] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53947 [06:28:30] New review: Krinkle; "(1 comment)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/39212 [06:35:00] Change merged: Krinkle; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53934 [06:43:51] New review: Krinkle; "For the record here is how they do the logs thing at openstack:" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/53357 [07:36:49] jeremyb_: yes please [07:38:07] hrmmm... 
but it's past 3:30am :/ [07:42:13] so you have to be fast ^^ [08:02:25] New patchset: Krinkle; "mwcore-docgen: Use --version option." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53956 [08:02:48] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53956 [10:38:25] anyone around to enlighten me about a gerrit mystery? [10:38:31] https://gerrit.wikimedia.org/r/#/c/53570/ [10:38:43] it sais "Can Merge: No". Why? [10:39:50] oh, merge conflict, I suppose [10:40:20] how does it know? dues it try to merge whenever a new patch set is uploaded? [13:16:38] hello [13:18:58] hi hashar [13:19:26] hey there Jarry1250 how are you? [13:19:30] good to see you [13:19:41] Hiya sumanah, just got on vacation [13:19:52] Well, a week, but dying under the paperwork [13:20:06] So many things stored up "for the vacation" [13:20:18] Then you get there and *bam*, all waiting for you [13:20:20] :P [13:20:35] sumanah: hello ) [13:20:58] Jarry1250: yeah, I totally hear you [13:21:15] ^demon: hi there! Turns out Gerrit might be overwhelmed during rush hours :/ I have noticed a change that took roughly 5 minutes to get merged, the GUI showed "submitted, pending merge" [13:21:59] <^demon> Could be. [13:25:50] Reedy:Replied at https://gerrit.wikimedia.org/r/#/c/25838/ . If you do merge, I promise to test it on test2 once it's deployed there [13:28:04] ^demon: and I suspect Zuul to be doing too many ssh queries to gerrit :/ [13:28:08] I should fill a bug [13:28:54] <^demon> Does it do any git operations anymore, or does it entirely use the local clones now? [13:29:25] the Jenkins jobs mostly use the local git replicates [13:29:31] <^demon> mmk. [13:29:32] but Zuul spam Gerrit with ton of ssh commands [13:29:45] I noticed most of the slowness while the i18n bot submitted its hundred of changes [13:29:48] <^demon> Could we disable the "Starting gate and submit" message? [13:29:53] I will try to gather some metrics from Zuul log [13:29:58] <^demon> That'd save one query, and a rather pointless one tbh. [13:30:32] not going to save much in my opinion and that prevent people from complaining to me about : "what the hell is jenkins doing is it going to merge? " :-] [13:30:37] will parse the log and submit you a report [13:31:03] <^demon> Well, it's going to post the results, so you know if it's done. [13:31:20] <^demon> If we added a generic "zuul status" link to the change, that would alleviate the need to add it as a comment. [13:31:30] <^demon> Or "testing status" [13:31:33] <^demon> :) [13:31:34] yeah I though about it, adding a javascript in Gerrit to poll Zuul status page [13:31:49] zuul as a json status page :-] [13:32:02] http://integration.mediawiki.org/zuul/status.json [13:32:55] I also need to add some metrics in graphite [13:38:56] <^demon> https://noc.wikimedia.org/~demon/gerrit-change-ui.png - how about something like this? [13:39:27] <^demon> If I can get my JS changes upstream, it'll be easy to do this. [13:44:57] the test status could even list the pipeline and job currently running :-] [13:45:42] 3275 ssh commands yesterday :-] [14:17:39] <^demon> hashar: Couple of performance improvements I want to deploy soon--that'll help too. 
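The Zuul status page linked above (http://integration.mediawiki.org/zuul/status.json) can be polled by any small client, which is essentially what a JavaScript poller added to Gerrit would do. A minimal sketch follows; the layout of status.json (a "pipelines" list containing change queues with "heads") is an assumption about that page, not a documented contract.

```python
# Minimal sketch: poll Zuul's status.json and summarise what is queued.
# The field names ("pipelines", "change_queues", "heads") are assumed,
# so every lookup falls back to an empty default.
import json
from urllib.request import urlopen

ZUUL_STATUS_URL = "http://integration.mediawiki.org/zuul/status.json"


def fetch_status(url=ZUUL_STATUS_URL):
    """Fetch and decode the Zuul status page."""
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)


def summarise(status):
    """Print how many changes sit in each pipeline, if the keys exist."""
    for pipeline in status.get("pipelines", []):
        queued = sum(
            len(head)
            for queue in pipeline.get("change_queues", [])
            for head in queue.get("heads", [])
        )
        print("%s: %d change(s) queued" % (pipeline.get("name", "?"), queued))


if __name__ == "__main__":
    summarise(fetch_status())
```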
[14:18:28] Zuul too has ton of perf enhancements [14:18:38] but they all depends on a python-module :-] [14:19:00] and I haven't managed to find out how to measure the time it takes for a change to be merged after --submit :/ [14:38:58] !g Ia70316f973f220170c03d9a2dddc9897164d889a [14:38:58] https://gerrit.wikimedia.org/r/#q,Ia70316f973f220170c03d9a2dddc9897164d889a,n,z [15:00:50] !g Iae192f6380e72c46822f0a738ab88a1d452aad2e [15:00:50] https://gerrit.wikimedia.org/r/#q,Iae192f6380e72c46822f0a738ab88a1d452aad2e,n,z [15:56:40] New patchset: Krinkle; "Install JSDuck 4.6.2 into tools/gem_modules." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53929 [15:56:46] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/53929 [16:43:15] New patchset: Krinkle; "Add mwext-VisualEditor-docgen job" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/53995 [16:43:33] hashar: Can you explain in more detail how you had the rscync in mind? ^ [16:43:42] (in gerrit) [16:43:49] hashar: I need to run in a few minutes, I'll amend it when I get back [16:43:50] thx [16:44:05] https://gerrit.wikimedia.org/r/#/c/53995/1/mediawiki-extensions.yaml [16:52:13] New patchset: Krinkle; "Add mwext-VisualEditor-docgen job" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/53995 [16:54:35] New patchset: Krinkle; "Enable mwext-VisualEditor-docgen job" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/53997 [17:05:33] New review: Krinkle; "the rm -rf/mkdir/mv is a bit odd. Should be fine, but it leaves a slight window where it is gone. Th..." [integration/jenkins-job-builder-config] (master) C: -1; - https://gerrit.wikimedia.org/r/53995 [17:12:33] New patchset: Hashar; "fix /srv/org/mediawiki/integration symlink" [integration/docroot] (master) - https://gerrit.wikimedia.org/r/53999 [17:13:28] New review: Hashar; "should fix https://integration.mediawiki.org/ giving a 403 forbidden." [integration/docroot] (master) - https://gerrit.wikimedia.org/r/53999 [17:14:50] hashar: Is Krinkle|detached nearly done adding jsduck calls to the tests? Are they stable enough that I can throw my hat into that arena? [17:16:53] marktraceur: no idea. I know he has been working on it [17:17:01] Hm. [17:17:12] marktraceur: a prerequisite was to refactor the integration website which timo has done over the week [17:17:17] The weird thing is, James_F|Away had said he wasn't going to do it for a while. [17:17:25] marktraceur: we are deploying his change with andrew in #wikimedia-operations :-] [17:17:31] * marktraceur joins [17:17:33] I guess once it is done jsduck will land on the site very soon [17:24:07] qgil, hello [17:24:28] hi Rahul_21 ! [17:25:00] i shall talk here,i guess its convenient for you too [17:25:28] Rahul_21, and convenient for you if you ask questions I don't know the answer (but others do) :) [17:25:56] i wanted to ask about gsoc [17:26:03] yes [17:26:46] i have been into mediawiki development for a month or little more! [17:27:05] so the thing i am doing right now is fixing bugs [17:27:09] and i like it! [17:27:23] \o/ !! 
[17:27:24] but when it comes to gsoc its something much bigger [17:28:01] so how do i choose a project to work on [17:28:16] there is also more time, and probably more help [17:28:31] since 49 days are left(if my maths is right) , [17:28:47] Rahul_21, you can keep fixing bugs and getting familiar with the projects and the people here [17:29:20] Rahul_21, we are trying to have a proper list of projects at http://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects but that page is today far from final [17:29:28] denis_o and se4598 : I am very happy to say that the database solution proposed by Matthias has just been approved by Asher, so that blocker has now been removed for today's deployment of AFT5 on the German Wikipedia. [17:29:51] On that basis, mlitn will now attempt to re-deploy the new AFT5 tool to German Wikipedia today (13k articles), and will keep us appraised on his progress. I am cautiously optimistic' that we may be able to deploy the tool on German Wikipedia by about 12pm PT (~20:00 CET, German time). [17:30:00] Rahul_21, you can start searching for something you would like to do, or you can look at the list of projects proposed [17:30:24] Rahul_21, in the meantime, we (Wikimedia) are the ones having the closest deadline since first we need to apply as organization [17:30:37] qgil, and does fixing bugs helps the cause of selection [17:30:42] New patchset: Hashar; "jslint url to wm.o and with https" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/54001 [17:30:43] Rahul_21, then Google will decide whether we are accepted or not, and how many slots we have available [17:30:47] Let's coordinate this AFT5 German deployment on this IRC channel (#wikimedia-dev), and update our Etherpad checklist as we complete each task: [17:30:47] http://etherpad.wikimedia.org/AFT5-release [17:31:03] New patchset: Hashar; "jslint url to wm.o and with https" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/54001 [17:31:11] Rahul_21, with that information, we will select among the student-project-mentor combinations [17:31:11] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/54001 [17:31:12] qgil, I have already assumed that you are going to be selectd [17:31:19] For others following this thread, we will postpone the deployment of AFT5 on the English Wikipedia to Tuesday morning (on 830 articles), once last year's data conversion is complete (about 50% done now). The tool will then be deployed to the French Wikipedia by Thursday (just a dozen articles at first, going up to 42k articles by April 2nd). [17:31:20] qgil, :) [17:31:27] Rahul_21, that is a safe assumption, yes :) [17:31:46] qgil, Hmm... that's an emphatic statement [17:31:54] Now you all know as much as I do :) Fingers crossed … [17:32:20] sankarshan, looking at our history and the status of https://www.mediawiki.org/wiki/Summer_of_Code_2013 today [17:33:11] but anyway, Rahul_21 what matters is that you keep developing your contributor karma and your experience and contacts in this project. Fixing bugs is probably the best you ca do to prepare yourself for GSOC! 
[17:33:16] fabriceflorin: thx a lot, good to know that everything is running fine [17:33:24] qgil, wikipedia being among the top 5 sites visited ,google has but no choice to offer you a stay and on top of that wikimedia foundation has been participating since the genesis of gsoc [17:34:04] Rahul_21, keep watching https://www.mediawiki.org/wiki/Summer_of_Code_2013 and keep looking for a potential project AND a potential mentor [17:34:17] Rahul_21, there is time, still [17:35:00] qgil, its a dream,i am sharpening my php skills too in the meantime [17:35:57] Rahul_21, as said you are already doing the right steps. Put your time in PHP skills and don't bother much about GSOC practicalities before your deadline arrives [17:36:13] New patchset: Zfilipin; "Fixed Serbian language codes" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54002 [17:37:00] Rahul_21, actually wikipedia is #6 - behind google, facebook, youtube, yahoo and baidu [17:37:33] New review: Cmcmahon; "needed correct language codes" [qa/browsertests] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/54002 [17:37:33] Change merged: Cmcmahon; [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54002 [17:37:36] google + youtube = Google, sin't it. Anyway, details. [17:37:58] Krenair, is it official?alexa rank? [17:38:12] Rahul_21, no, it's an alexa rank [17:39:14] qgil, exactly that brings wikipedia to 5 ,yeaay ! [17:55:00] New patchset: Zfilipin; "A few improvements to the readme file" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54005 [17:58:17] chrismcmahon: have a minute? [17:58:37] hi zeljkof sure [17:58:46] zeljkof: I have a question for you also [17:59:03] chrismcmahon: where on mediawiki.org did you write on how to write tests? [17:59:11] I remember seeing a few pages [17:59:23] I would like to link to them in browsertests readme [17:59:43] zeljkof: http://www.mediawiki.org/wiki/QA/Browser_testing#How_to_contribute [18:00:04] thanks [18:00:14] I will add it to the readme right now [18:00:20] we need "read more" link there [18:01:04] zeljkof: something I forgot about, I want to add IRC notifications to the Jenkins builds [18:01:30] chrismcmahon: are they disabled now? [18:01:48] zeljkof: I am starting to think it would be useful to have a Bugzilla component for tests, I'm going to ask Andre if can [18:01:52] I still have to debug jenkins irc plugin, there was something wrong there [18:01:54] if he can set that up [18:02:12] zeljkof: it's reporting on mobile but the plugin is missing from the browser_tests template [18:02:20] chrismcmahon: there is now a generic "testing infrastructure" category in bugzilla [18:02:32] * chrismcmahon looks [18:02:54] chrismcmahon: now i remember, I have it on the todo list to update the template :) [18:05:35] New patchset: Zfilipin; "Readme file now has "read more" link" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54007 [18:07:06] zeljkof: OK, I found the 'infrastructure' component. Let's get IRC notifications for the other builds working. [18:08:35] chrismcmahon: just to finish some rvmrc file cleanup, will set up irc notifications [18:09:37] mark: the bgp dumps i've found are old. would it be possible to make our own regularly? [18:10:11] can certainly do so [18:10:19] i even have python code that could do so [18:10:27] although there's also ExaBGP [18:11:19] my code's here: https://svn.wikimedia.org/viewvc/mediawiki/trunk/routing/twistedbgp/src/ [18:11:41] 2007? 
geez, time flies [18:12:00] ExaBGP is probably a better choice now though [18:12:06] New review: Cmcmahon; "improve README (thanks Siebrand)" [qa/browsertests] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/54005 [18:12:06] Change merged: Cmcmahon; [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54005 [18:12:57] New review: Cmcmahon; "link to mw.o in README" [qa/browsertests] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/54007 [18:12:57] Change merged: Cmcmahon; [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54007 [18:13:14] mark: the reason i'm interested in this - we're throwing away ip addresses for the browser side performance data from NavigationTiming, though doing a country lookup first. saving network or route specific info (leslie suggested asn) could help us pinpoint network specific performance issues [18:14:07] i basically wrote that code for almost exactly that in 2007 [18:14:16] but to build a geodns solution on it [18:14:23] based on network and captured performance info [18:14:33] never happened, but writing the BGP lib was fun [18:14:49] ASN sorta works but ASNs can be extremely large, spanning the entire globe [18:15:55] binasher: can I suggest to save the AS path instead of the ASN? [18:16:05] so all paths up to and including the ASN that is owner of the ip addresses [18:16:22] then we can figure out where the problem is, because of a common AS in the path [18:16:39] an AS path is (usually) just a list of ASNs [18:16:49] [14907 2828 6908 43821] [18:17:24] but you'd need _our_ BGP data for that of course, and it also only works for outbound packets, you don't know how packets came in [18:17:31] could still be very useful [18:18:34] mark: i'm looking at bgp.py and not understanding how it gets our bgp data? [18:19:06] binasher: simple, it talks BGP to our router(s) [18:19:19] ExaBGP is the same, a python BGP library, but way bigger now and more features [18:19:39] twistedBGP is also what's used by pybal to announce ips to our networks, i.e. inject routes [18:20:00] (but there it doesn't parse route info, just announces) [18:21:52] actually, that code must be old [18:22:02] I fixed several bugs in twistedbgp when working on pybal for ipv6 last year [18:22:41] binasher: latest code is in git: operations/debs/pybal [18:23:20] pybal/bgp.py ? [18:23:24] yep [18:23:38] this is making my head spin :) [18:23:43] heh [18:23:54] then probably ExaBGP is easier, although Not Invented Here ;) [18:23:59] it has more users [18:24:03] but I'm happy to help out [18:24:19] you probalby haven't done much with BGP yet [18:24:41] nope [18:25:02] so basically what you'd like is a service that takes an IP address, and then returns an AS PATH of our routers to that ip [18:25:05] what would be involved in going from ip -> ASPathAttribute ? [18:25:09] :) [18:25:17] shouldn't be too hard [18:25:21] but it will be different for the US vs esams [18:26:10] I can take a stab at that next week if you like [18:26:24] the test.py in svn already has most of that code iirc [18:26:28] that would be great :) [18:26:37] ok [18:27:05] sounds fun [18:27:12] but i'm going off now [18:27:40] have a good weekend! [18:29:16] you too [18:32:25] New patchset: Zfilipin; "Added IRC notification to Jenkins job template" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54015 [18:32:58] chrismcmahon: IRC notifications enabled: https://gerrit.wikimedia.org/r/#/c/54015/ [18:33:08] I have started one job to test it [18:33:21] zeljkof: that was fast, thanks! 
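Going back to the ip -> AS-path question earlier in this exchange: the lookup half of such a service is just a longest-prefix match over whatever routes the BGP listener has collected. A toy sketch, in which the route table and AS paths are made-up example values rather than real data from pybal's bgp.py or ExaBGP:

```python
# Toy longest-prefix-match lookup from IP address to AS path.
# The table below is fabricated; a real service would fill it from live
# BGP data (e.g. via pybal's bgp.py or ExaBGP), one table per site.
import ipaddress

# prefix -> AS path as announced to our router (example values only)
ROUTES = {
    ipaddress.ip_network("192.0.2.0/24"): [14907, 2828, 6908, 43821],
    ipaddress.ip_network("198.51.100.0/22"): [14907, 1299, 64496],
    ipaddress.ip_network("0.0.0.0/0"): [14907],  # default route
}


def as_path_for(ip):
    """Return the AS path of the most specific prefix covering `ip`."""
    addr = ipaddress.ip_address(ip)
    best, best_path = None, None
    for prefix, path in ROUTES.items():
        if addr in prefix and (best is None or prefix.prefixlen > best.prefixlen):
            best, best_path = prefix, path
    return best_path


print(as_path_for("192.0.2.42"))  # -> [14907, 2828, 6908, 43821]
```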
[18:33:25] I have to go to pick up the kid now, I will be back online in an hour or two [18:33:55] chrismcmahon: it will probably not work correctly, sending notifications to #wikimedia-mobile or something :) [18:34:00] but we will figure it out today [18:34:05] it is a start [18:34:16] * zeljkof is back in an hour or two [18:35:27] did something just break mw core repo? I'm getting lots of problems trying to do git pull (--rebase)? [18:37:30] New review: Cmcmahon; "add IRC notifications to browser test builds" [qa/browsertests] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/54015 [18:37:30] Change merged: Cmcmahon; [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54015 [18:37:51] can someone verify if it is just us or not? [18:38:32] Project browsertests-linux-chrome build #210: UNSTABLE in 6 min 47 sec: https://wmf.ci.cloudbees.com/job/browsertests-linux-chrome/210/ [18:38:33] * zeljko.filipin: Fixed Serbian language codes [18:38:33] * zeljko.filipin: A few improvements to the readme file [18:38:34] * zeljko.filipin: Readme file now has "read more" link [18:41:16] hello wmf-selenium-bot [18:41:42] nobody? [18:41:50] ^demon: ? [18:43:19] <^demon> Hm? [18:44:00] <^demon> I think it's just you. [18:44:22] git is broken then? [18:44:42] or what would explain that suddenly at least two separate clones start failing in similar way [18:44:59] <^demon> Not sure, but I was able to do a pull --rebase just fine a moment ago. [18:45:03] <^demon> (Since you asked) [18:45:40] hmph [18:45:53] git diff HEAD..origin/master shows nothing but still get conflicts for commits I never made [18:53:49] ^demon: thanks anwyway, I was able to get those into consistent state [20:17:14] Project browsertests-linux-firefox build #198: UNSTABLE in 15 min: https://wmf.ci.cloudbees.com/job/browsertests-linux-firefox/198/ [20:17:15] * zeljko.filipin: Fixed Serbian language codes [20:17:15] * zeljko.filipin: A few improvements to the readme file [20:17:16] * zeljko.filipin: Readme file now has "read more" link [20:17:16] * zeljko.filipin: Added IRC notification to Jenkins job template [20:22:35] robla: fyi ^^ [20:22:53] \o/ [20:25:09] robla: also fwiw, that build is failing right now because it found a real bug earlier today https://bugzilla.wikimedia.org/show_bug.cgi?id=46168 [20:28:31] hi [20:38:06] New review: Hashar; "lets try out :-]" [integration/docroot] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/53999 [20:38:07] Change merged: Hashar; [integration/docroot] (master) - https://gerrit.wikimedia.org/r/53999 [20:49:23] Project browsertests-windows-internet_explorer_9 build #233: UNSTABLE in 15 min: https://wmf.ci.cloudbees.com/job/browsertests-windows-internet_explorer_9/233/ [20:49:24] * zeljko.filipin: Fixed Serbian language codes [20:49:24] * zeljko.filipin: A few improvements to the readme file [20:49:25] * zeljko.filipin: Readme file now has "read more" link [20:49:25] * zeljko.filipin: Added IRC notification to Jenkins job template [21:01:44] binasher: would you have time to do a review of a CentralNotice schema change today or monday? [21:02:24] mwalker: sure [21:03:20] cool; it's this one: https://gerrit.wikimedia.org/r/#/c/48332/ "Initial CentralNotice changes for v2.3" -- should be in your gerrit queue [21:03:40] shoot [21:03:52] not that one; it's predessesor [21:03:53] https://gerrit.wikimedia.org/r/#/c/52913/4 [21:03:54] mwalker: not 52913? [21:03:58] :) [21:04:31] mwalker: how will cn_notice_log and cn_template_log be used? 
[21:05:10] we use them to log any changes to a banner/campaign -- so anyone with centralnotice_admin who makes a change will have it recorded [21:05:23] ok [21:11:09] New patchset: Hashar; "Revert "Trigger mediawiki-core-docgen from the postmerge pipeline."" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/54076 [21:11:33] New patchset: Hashar; "Revert "Trigger mediawiki-core-docgen from the postmerge pipeline."" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/54076 [21:11:44] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/54076 [21:12:06] hashar: I just got an unexpected error back from jenkins -- "This change was unable to be automatically merged with the current state of the repository. Please rebase your change and upload a new patchset." I'm curious why there's a requirement for a rebase? Shouldn't it just be looking at the patch? [21:12:50] Is your commit ona dependancy of master? [21:13:51] not sure what you mean by that; it is a leaf of a leaf off of master [21:14:01] and the parent of the top level leaf is not at HEAD [21:14:06] so; it's correct that it needs a rebase [21:14:08] that's probably why then [21:14:12] jfdi ;) [21:14:58] Reedy: sure; but it seems odd that it didn't used to be a requirement and now it is [21:15:13] Things change [21:16:02] certainly -- but if I'm a couple of leafs down in a topic; always keeping them up to date is a PITA [21:17:10] mwalker: generally looks good, just had one change request (adding a pk to cn_template_devices) [21:17:35] cool; I'll do that [21:17:42] thanks :) [21:21:29] mwalker: yeah we attempt to merge the change against the current master. If there is a conflict we do not bother running tests and ask for a rebased change. [21:42:33] binasher: I moved some other indexes around because they didn't make sense; but it's ready for your watchful eye again (compare patchset 4 to 6 -- 7 I was attempting to unify my whitespace which apparently got all borked somewhere along the line) [21:43:54] Krinkle|detached, hashar, how's the jsducking? [21:44:00] mwalker: ok, thanks! [21:47:24] marktraceur: I had the popcorn out for jsduck, still watching what happens next. [21:47:54] chrismcmahon: I would require a GIF of a duck eating popcorn to capture the image of us waiting for jsduck to work :) [21:48:41] oh maybe I'll have the spouse make some Indian-style popcorn tonight, that'd be nice [21:50:17] sumanah marktraceur "jsduck" is a ruby gem that depends on other ruby gems that may or may not have been installed properly on gallium in the last 24 hours [21:50:42] if y'all are up to it, I'm ready to be taught to write a given/when/then test right now [21:52:59] sumanah: I have to step away briefly in about 10 minutes, but we have some background: http://www.mediawiki.org/wiki/QA/Browser_testing/How_to_contribute [21:53:05] OK. [21:54:09] marktraceur: krinkle working on it :-] He had to redo the whole contint website which is done now. [21:54:30] sumanah: you might be amused that the browser tests caught a bug today: https://bugzilla.wikimedia.org/show_bug.cgi?id=46168 [21:54:49] chrismcmahon: OK, I've read How To Contribute [21:55:17] so now I'll just add something to https://www.mediawiki.org/wiki/QA/Browser_testing/Test_backlog [21:55:47] hashar: I saw those patches, but I thought I saw finished jsduck ones too [21:55:50] sumanah: sounds great! is there a feature you have in mind? 
we were using Search as an example Wed [21:55:53] hashar: I guess I'll check back later [21:56:01] hashar: Time to pack for and execute my ski trip plans. Ta! [21:56:10] have fun mwalker [21:56:10] err [21:56:11] marktraceur: [21:56:18] chrismcmahon: Not yet. I will try to think of something. [21:56:19] Both! :) [21:56:34] oh no! not jsduck! [21:56:36] YuviPanda: Your IRC client is smarter than you give it credit for maybe :) [21:56:39] attack of the rubies! [21:56:48] yes, that is true :D [21:56:57] mwalker: var quack = function () {}(); [21:57:25] It's kind of hilarious that the JavaScript documentation builder is written in Ruby. But I digress! Packing! [21:59:06] mwalker marktraceur at one point I did some research into the Ruby-Ubuntu feud, and how people like Groupon get around it. tl'dr: we ain't them. [21:59:44] ok, added 1, chrismcmahon [21:59:47] there's a ruby/ubuntu feud? [21:59:55] https://www.mediawiki.org/wiki/QA/Browser_testing/Test_backlog#Search [22:01:28] mwalker: here's one example: http://ryanbigg.com/2010/12/ubuntu-ruby-rvm-rails-and-you/ [22:02:50] so, chrismcmahon how is my addition? is it reasonable? what do I need to learn to do better? [22:03:04] I added https://www.mediawiki.org/w/index.php?title=QA%2FBrowser_testing%2FTest_backlog&diff=660264&oldid=659471 . [22:03:10] hah; so situation normal for ubuntu then -- packages massively out of date [22:05:01] chrismcmahon: An old roommate (he's price on Freenode) works for tddium, has some horror stories about Ruby that you should hear sometime - maybe next time you're in the Bay Area we can meet him :) [22:05:40] sumanah: I would say "Given I am searching enwiki (or whatever wiki that is)/And ..." We have a number of targets for these tests, so good practice is to specify target in Given statement [22:06:07] aha. I'll put that advice at the top of the Search backlog list [22:06:19] or actually, where could that go? [22:06:53] mwalker marktraceur yeah, the other side of the story: http://www.lucas-nussbaum.net/blog/?p=566 [22:07:04] "on any random page in any wiki" is the example at How To Contribute, chrismcmahon. [22:07:12] Do you want to change that? [22:07:31] sumanah: perhaps. let me re-read, it was appropriate at the time [22:07:34] okay. [22:11:49] sumanah: I think I should add another Plain English example that e.g. is searching contents of a file uploaded to a particular wiki. [22:12:11] sumanah: I made the general case a little too general :) [22:13:25] okay. chrismcmahon how real do these need to be? do I actually need to check whether there is a file on Commons with the relevant characteristics? [22:14:25] I've changed my scenarios to name the beta cluster as the target. [22:14:27] sumanah: the audience for G/W/T output is humans not machines. real is good [22:14:44] OK. Look at the last 2 in https://www.mediawiki.org/wiki/QA/Browser_testing/Test_backlog#Search now chrismcmahon [22:14:49] (updated to point to the beta cluster) [22:16:50] sumanah: +1 thank you, very clear [22:17:18] OK. chrismcmahon now what do you suggest? [22:17:30] are these good enough to turn into cucumber tests? [22:18:40] sumanah: they are. [22:19:30] sumanah: if you'd like, zeljkof or I can pair on turning those into running browser tests [22:19:42] ok. Let's do it. [22:21:09] sumanah: back in a few minutes... 
[22:22:04] sumanah: we'll probably be adding scenarios to https://github.com/wikimedia/qa-browsertests/blob/master/features/search.feature [22:28:45] ok, chrismcmahonbrb do you want to look at https://github.com/wikimedia/qa-browsertests/pull/1 ? [22:45:00] sumanah: wow, I didn't expect a pull request. you can also clone from gerrit at https://gerrit.wikimedia.org/r/#/admin/projects/qa/browsertests [22:45:32] chrismcmahon: do you need me to do that, or can you deal with the pull request? [22:46:10] sumanah: eventually it will go through gerrit one way or another. github is just easier to read. [22:46:28] ok. I'll leave it to you to look at it and give me feedback/critique [22:47:16] chrismcmahon: want me to squash the pull req and submit to gerrit? [22:47:18] i've been meaning to try that [22:48:50] YuviPanda: that would be nice to see [22:49:02] yeah trying it out [22:53:12] * sumanah waits for feedback [22:53:57] sumanah: I wish zeljkof were around but it's late for him. [22:54:05] okay. [22:54:18] If the answer is: no feedback right now, wait till Monday, I'm fine with that [22:54:26] chrismcmahon, sumanah: I am here, but brain is at 5% :) [22:54:32] okay :) [22:55:01] sumanah: I will take a look tomorrow, now is really not a good time for anything that involves brain [22:55:01] sumanah: I was also thinking about getting it so you can run tests locally, but I'm a little concerned that rvm.io has an expired certificate. the README has basic instructions. [22:55:32] The most important thing for the purposes of this test exercise, chrismcmahon, is teaching the contributor how to write tests well. [22:55:36] So, fast code review helps. [22:55:54] New patchset: Yuvipanda; "find tokens in the captions of uploaded files" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54097 [22:56:06] sumanah: chrismcmahon https://gerrit.wikimedia.org/r/54097 [22:56:08] zeljkof sumanah I'll be out on Monday, but zeljkof will be here early [22:56:10] Found! [22:56:14] ok. [22:56:17] Well, it [22:56:29] It's not urgent to me. I just wanted to go through the experience you've been providing volunteers [22:56:33] manual Github pull request to Gerrit was not that hard. [22:56:43] the commit hook does not work, and that sucks. [22:57:48] New review: Yuvipanda; "Source: https://github.com/wikimedia/qa-browsertests/pull/1" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54097 [22:58:39] chrismcmahon: to improve the volunteers' experience, in the future, for events like this, we should improve the sample scenario at https://www.mediawiki.org/wiki/QA/Browser_testing/How_to_contribute and for any feature we aim to test we should include a BZ query so people can look at bugs to turn into scenarios. [22:58:59] sumanah: sounds good [22:59:09] I will instruct Quim about this. [22:59:41] sumanah: with some wiggle room. there are 4 BZ components for Search, of various utility. [22:59:46] Whatever. [22:59:49] Something is better than nothing. [23:02:54] qgil: hey there. Just posted comment to https://www.mediawiki.org/wiki/Talk:QA/Browser_testing/How_to_contribute#improving_the_volunteers.27_experience_-_bz_queries_25127 [23:03:17] YuviPanda: thank you for the pullpush or squashcommit or what have you. :) [23:03:30] sumanah: i'm writing an email to wikitech with the process I used. [23:03:47] I am glad to work with someone with such good instincts. 
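For reference, the manual GitHub-pull-request-to-Gerrit conversion being done here boils down to a handful of git commands. The sketch below assumes a clone whose `origin` points at Gerrit; the repository URL, pull-request number and commit message are placeholders, and the exact steps YuviPanda used may have differed (for instance `git am` instead of a squash merge):

```python
# Rough sketch (assumptions, not the exact procedure used above): pull a
# GitHub pull request into a local clone, squash it into one commit and
# push it to Gerrit for review.
import subprocess

GITHUB_REPO = "https://github.com/wikimedia/qa-browsertests.git"
PULL_REQUEST = 1  # hypothetical pull request number
COMMIT_MESSAGE = "find tokens in the captions of uploaded files"


def run(*cmd):
    """Run a git command, echoing it first."""
    print("$ " + " ".join(cmd))
    subprocess.check_call(cmd)


# GitHub exposes each pull request as refs/pull/<n>/head on the repo.
run("git", "fetch", GITHUB_REPO, "refs/pull/%d/head" % PULL_REQUEST)
# Squash the fetched commits into the index without committing...
run("git", "merge", "--squash", "FETCH_HEAD")
# ...then commit once; the commit-msg hook, when installed, adds the
# Change-Id Gerrit expects; otherwise it has to be pasted in by hand,
# as discussed above.
run("git", "commit", "-m", COMMIT_MESSAGE)
# Finally, send the squashed commit to Gerrit for review.
run("git", "push", "origin", "HEAD:refs/for/master")
```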
[23:03:48] sumanah: I personally think that we should encourage people getting started to send in github pull requests, since they are multiple orders of magnitude easier than gerrit [23:03:51] YuviPanda: me too [23:03:58] chrismcmahon: it wasn't that hard. [23:04:00] That may be a good choice [23:04:10] i'd like to see how to work with multiple commits though [23:04:16] YuviPanda: nope. really easy to see, and the manual step isn't too bad. [23:04:27] yeah. [23:04:30] is it documented somewhere? [23:04:53] YuviPanda: not to my knowledge [23:05:10] sumanah: do you want to fix the one error in that commit (trailing space in the 'Given I am on ' line) so that I can see how to handle multiple commits? [23:05:16] chrismcmahon: then I'll continue writing the email :) [23:05:18] okay! [23:05:20] I will do that YuviPanda [23:05:25] sumanah: thank you sumanah [23:06:34] chrismcmahon: multiple commits will be a little harder, since I'll have to squash them manually [23:06:49] git am doesn't do squash, i think [23:06:59] chrismcmahon: plus I'll have to use personal discretion in the new commit message. [23:07:38] YuviPanda: yeah, I think github pulls are always discrete and can't be made cumulative. (I'm mostly a n00b though) [23:07:51] cumulative as in? you can add more commits to them [23:07:59] they're not static. [23:08:02] but you can't modify a commit [23:08:06] gh doesn't like force pushing [23:08:10] right [23:08:30] chrismcmahon: I also have to manually copy and maintain the Change-ID [23:08:34] which isn't that great [23:08:36] but that's ok [23:08:57] YuviPanda: nice that you're writing this down! [23:08:57] ok YuviPanda I think I did so [23:09:03] chrismcmahon: yeah, doing it [23:09:06] sumanah: okay, checking [23:09:26] okay, I see two commits [23:09:29] * YuviPanda does git magicks [23:10:04] I see dead people. [23:11:11] hmm, we're working in the master branch, which is not super great but not really a problem in this case [23:11:33] whoops [23:12:26] I shall amuse myself by remembering back when I had made, like, 2 pull requests ever via GitHub, and a recruiter emailed me [23:12:40] :D [23:12:42] basically saying they had noticed me on GitHub and did I want to be a web application engineer at Twitter? [23:12:49] ha! [23:12:57] these pull requests had been, like, typofixes for READMEs [23:13:21] now that I've made more like TEN trivial commits via GitHub I figure it's CTO time [23:17:09] hmm [23:17:14] git merge --squash doesn't work like i expect it to [23:17:36] "there is a pumpkin here now!" [23:17:43] (what I imagine "squash" does) [23:17:54] sumanah: last time I was in Palo Alto there were posters on the streetlight poles with little tear-off phone numbers of CEOs looking for CTOs. [23:18:05] OMG you're serious aren't you [23:18:22] dead serious [23:18:44] I am laughing very hard [23:18:51] and you have made my spouse laugh as well [23:19:17] hai Leonard! [23:19:40] New patchset: Yuvipanda; "Squashed commit of the following:" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54097 [23:19:41] ah, right. that works [23:19:57] chrismcmahon: sumanah ^ [23:19:57] Hi Chris! Leonard said. Then he said I was the CTO of "heartco, Leonard Division" [23:20:11] and now he wants to talk about dinner. Spouses! Whatcha gonna do. [23:21:05] :D [23:21:14] YuviPanda: wrong author. 
wrong parent (" [23:21:15] 6dda3e357e86e45e13d36815eaff6d1d583eb746 Added IRC notification to Jenkins job template") [23:21:17] WRONG FOR AMERICA [23:21:26] ouch [23:21:27] let me fix that [23:22:16] New patchset: Yuvipanda; "Squashed commit of the following:" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54097 [23:22:30] sumanah YuviPanda I think that's b/c we're in master branch (parent at least) [23:22:39] still a weird parent - 6dda3e357e86e45e13d36815eaff6d1d583eb746 Added IRC notification to Jenkins job template [23:22:53] what should the parent be? [23:22:59] sumanah: that was the last merged change to master [23:22:59] this was the last commit before this one [23:23:06] is every changeset the child of a parent? [23:23:09] yes [23:23:25] ok. [23:23:37] I thought that they were connected causally but I see it's just time-based [23:23:44] last merged change to master; kinda causal [23:23:50] sumanah: in the git data structure, every commit has: 1 (or more) parents, a diff, and a commit message. [23:24:07] sumanah: merges have two parents, for example [23:24:37] sumanah: it has correct author now. [23:24:40] yes! [23:24:54] that's one of the nice things about git. immensely powerful. [23:25:01] that's also one of the horrible things about git. immensely powerful. [23:25:03] Indeed. [23:25:15] Yeah. I am familiar with unixy tools and their terrible genie-like nature [23:25:22] YuviPanda: OK, that was fairly magical what you just did [23:25:26] sumanah: also, I volunteer to do this manual job of moving pull reqs to gerrit changesets until gerrit can do it automatically. [23:25:29] wow! [23:25:30] okay! [23:25:47] you may in that case wish to automatically get some kind of notification for pull requests on the wikimedia repos [23:25:55] thank you, that would be a wonderful gift [23:26:09] although of course I hope that, once you have documented it, others step up [23:26:14] sure. [23:26:23] script kiddies with hearts of gold may be able to automate it further [23:26:29] yes, that would be me :) [23:26:43] but I'm unsrue if the current commit summary would be okay [23:26:48] it's just an amalgam of multiple ones [23:26:57] it might have to be rewritten, for example. that would be manual. [23:27:01] but most of this is automatable [23:27:06] i"m writing a script right now, actually [23:27:18] YuviPanda: I am pretty sure ^demon has gone a long way down this path [23:27:19] that is the 1 thing I was going to nitpick about. the first line would be nice if it were a bit more .... drawn from the first 72 characters of the GitHub pull req commit summary [23:27:33] Chad & qchris both [23:27:47] sumanah: sure. [23:28:19] so chrismcmahon now that the change has been pulled into a Gerrit change request; shall I wait till Monday for substantive code review? it's ok if the answer is yes [23:28:47] New patchset: Yuvipanda; "find tokens in the captions of uploaded files" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54097 [23:28:51] sumanah: fixed too [23:29:22] sumanah, commented too [23:29:25] sumanah: and yes, I think "GitHub PullReq Title \n GitHub PullReq Description \n Commits Summary List" is a good idea [23:29:52] YuviPanda, it would be great if you could have those commons uploads stats during your Monday... [23:29:55] sumanah: I think yes. zeljko's reviews of my first few commits were right on, and I'd like to keep that up until we have more experience [23:30:03] ok, that's fine chrismcmahon [23:30:31] chrismcmahon, no asnwer form weekers testers or E3. 
Should I say something? [23:30:52] * sumanah looks at https://www.mediawiki.org/wiki/Talk:QA/Browser_testing/How_to_contribute thanks qgil  [23:30:59] YuviPanda: thank you! glad to have been the test rabbit [23:31:04] qgil: I think it can wait until next week. I'm not worried (yet) [23:31:17] sumanah: thank you :) I'm also unsure how to tell github 'notify me for every pull request in all these repos' [23:31:19] YuviPanda: thank you from me as well [23:31:24] qgil: doing that now [23:31:36] YuviPanda, thanks! [23:31:50] chrismcmahon, ok [23:31:52] Oh, sumanah, just wanted to let you know I submitted a panel proposal with 3 other interns for GHC. Marina was a _huge_ help! [23:32:07] qgil: Justin isn't particularly fast and Steven will have more info later [23:32:08] GREAT! [23:32:28] thank you, valeriej! good luck on getting the proposal accepted [23:33:19] Thanks, sumanah! I'll let you know how it goes! [23:34:47] * qgil is approaching Palo Alto and Mountain View, two little towns where Internet doesn't play any role, and this is why my connection will probably drop [23:35:42] YuviPanda: So I can see how you would be alerted of every new commit, issue, & pull request in a certain repository, but it's hard for me to see how you could tell GH "just the pull requests" [23:36:02] sumanah: usually GH sends you email notifications for pull requests and issues [23:36:03] although you could configure an email filter to simply discard/archive the ones that are not about pull requests [23:36:06] and no emails for commits [23:36:16] at least, that is what happens when you 'subscribe' to one [23:36:17] " Watching a repository lets you follow new commits, pull requests, and issues that are created. " [23:36:31] commits show up in your 'github dashboard' [23:36:41] ok. maybe "watching" is different from "subscribing" [23:36:45] probably [23:36:57] i have a fair number of repos (mine and others) for which I get emails only for pull requests [23:37:37] no, there's no such thing as subscribing, just "watching" and "starring" [23:40:13] qgil: responded [23:40:15] sumanah: mm [23:40:39] The docs may not follow the reality, or you may have a special subpreference checked or unchecked that I cannot see [23:40:41] qgil: let me know if that is enough info [23:40:41] I do not know [23:40:56] YuviPanda, thank you very very much. And sleep well (at some point) [23:41:01] i'm looking into it. IIRC this is also the default [23:41:07] qgil: pfft, it's only 5 AM on a saturday :P [23:42:10] sumanah: okay, watching them is what I want to do [23:42:23] sumanah: watching them sends me emails for pull requests and issues, and dashboard for commits [23:42:29] sumanah: do you know how I can watch all our repos? [23:42:51] I fear it is tedious YuviPanda [23:42:58] click all the things? [23:43:01] unless there's a cool API call [23:43:30] there is [23:43:42] http://developer.github.com/v3/activity/watching/ [23:43:46] http://developer.github.com/v3/activity/watching/#set-a-repository-subscription [23:44:05] YAYPI [23:44:11] yay-pi [23:44:48] sumanah: another way is to give me push access toa ll the repos :P [23:44:58] > When you're given push access to a repository, we automatically watch the repository for you. [23:45:00] https://github.com/blog/1204-notifications-stars [23:45:10] but... I don't think that's a good idea, since I don't really *need* push access. [23:45:23] I agree [23:45:45] so I'll figure out how to write the API call tomorrow :) [23:47:38] thanks YuviPanda sumanah. 
I'm dropping off, gotta drive across Colorado tomorrow :-) [23:47:47] Thanks all [23:47:53] have fun chrismcmahon [23:47:54] :) [23:51:03] New patchset: Yuvipanda; "Squashed commit of the following:" [qa/browsertests] (master) - https://gerrit.wikimedia.org/r/54103 [23:51:13] sumanah: I wrote the script :D [23:51:25] let me add the finishing touch
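The "watch every repo" API call discussed above could look roughly like this, using the GitHub v3 endpoints linked earlier (GET /orgs/:org/repos plus PUT /repos/:owner/:repo/subscription). The organisation name, the token placeholder and the use of the requests library are assumptions, not what was actually written:

```python
# Sketch of "watch every repository in an organisation" via the GitHub v3
# API. ORG and TOKEN are placeholders; real use needs a personal access
# token with an appropriate scope.
import requests

API = "https://api.github.com"
ORG = "wikimedia"            # assumption: the organisation to watch
TOKEN = "YOUR_GITHUB_TOKEN"  # placeholder

session = requests.Session()
session.headers["Authorization"] = "token " + TOKEN


def org_repos(org):
    """Yield full_name for every repo in the org, following pagination."""
    url = "%s/orgs/%s/repos?per_page=100" % (API, org)
    while url:
        resp = session.get(url)
        resp.raise_for_status()
        for repo in resp.json():
            yield repo["full_name"]
        url = resp.links.get("next", {}).get("url")


def watch(full_name):
    """Subscribe to notifications (pull requests, issues) for one repo."""
    resp = session.put(
        "%s/repos/%s/subscription" % (API, full_name),
        json={"subscribed": True, "ignored": False},
    )
    resp.raise_for_status()


for name in org_repos(ORG):
    watch(name)
    print("now watching", name)
```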