[14:58:51] hmm, can anyone see a reason why the gray edit links show up on https://www.mediawiki.org/w/index.php?title=Wikimedia_engineering_report/2012/May&printable=yes#Readers , for example, despite the fact that they have a noprint? https://www.mediawiki.org/w/index.php?diff=545863&oldid=545856
[15:55:15] does the mediawiki API support stashed uploads when uploading from a URL?
[15:55:32] Reedy: ^
[17:18:18] hi, people who are checking in re 20% time today
[17:19:12] bsitu: I think you're the only one who's online right now :)
[17:19:42] Hi sumanah
[17:19:57] bsitu: so, what are your 20% plans for today? need any suggestions?
[17:21:07] no plan yet, I will do whatever you think is needed most
[17:21:09] bsitu: I see there are some more AFTv5 revisions that need reviewing: https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/ArticleFeedbackv5,n,z
[17:22:01] bsitu: is that an extension where you feel comfortable reviewing?
[17:23:04] yeah, sure
[17:23:23] bsitu: ok, and one more thing that would also be useful - you're somewhat familiar with MoodBar?
[17:23:30] bsitu: https://bugzilla.wikimedia.org/buglist.cgi?list_id=123278&resolution=---&query_format=advanced&component=Moodbar&product=MediaWiki%20extensions - does that search work for you?
[17:23:55] yeah
[17:24:25] looks like there are quite a lot of bugs :)
[17:24:33] bsitu: the somewhat older bugs there, the ones from more than 2 months ago -- some of them might have been fixed already
[17:25:01] I will go over the list and double-check them
[17:25:30] bsitu: chrismcmahon might have a better eye re https://bugzilla.wikimedia.org/buglist.cgi?list_id=123278&resolution=---&query_format=advanced&component=Moodbar&product=MediaWiki%20extensions , which ones are more likely to need re-validating
[17:25:54] chrismcmahon: can you tell just from metadata which bugs are likely to need re-validating, to check whether those bugs still exist?
[17:33:16] sumanah: looking
[17:37:25] actually, bsitu, when I look at https://www.mediawiki.org/wiki/Wikimedia_engineering_20%25_policy I see that validating oldish bugs does not explicitly fall into its scope... "tasks which directly serve the Wikimedia developer and user community, ideally in ways which can't be easily done by most volunteers, and/or which increase volunteer capacity." I think that this kind of work would be good, though, as a way to clear out invalid bugs, in case chrismcmahon or another person wants to run a bug triage or a testing event on MoodBar
[17:37:37] to repeat in case IRC cut me off: as a way to clear out invalid bugs, in case chrismcmahon or another person wants to run a bug triage or a testing event on MoodBar
[17:37:53] bsitu: so I'm comfortable asking you to do this, but I won't make a habit of it. OK with you?
[17:39:11] sumanah: that would be an interesting case for a community testing event, marking bugs for a particular extension INVALID or FIXED.
[17:39:31] chrismcmahon: right. A reproduction sprint.
[17:39:49] chrismcmahon: you already have the test plan all laid out! it's the steps-to-repro from the existing bugs! :D
[17:40:08] sumanah: I checked recently and the number of open bugs has been growing by about 3000/year since I was hired.
[17:40:34] so marking INVALID/FIXED is a worthy cause
[17:41:30] where's a Volunteer QA Coordinator when you need one? :)
[17:46:36] chrismcmahon: you know as well as I do -- out there, waiting to be hired.
[18:46:00] chrismcmahon: I'm skedaddling to try to finish up assessments, but I'll be on Freenode in case you want to talk. And if tparveen comes back, tell her the same?
[18:46:23] sumanah: will do, I need to finish mine as well
[18:46:26] k
[20:06:45] hashar: Can you sum up for me the blockers on moving all of contint to a labs project? Things to be puppetized? Things to be unpuppetized (hardcoded stuff for int.mw.org)? Uncommitted changes? Access to LDAP? ... etc.
[20:06:53] (per email is fine :) )
[20:07:03] I have no clue
[20:07:36] why would there be a blocker?
[20:07:59] only the development should live in labs, though
[20:08:09] last time I asked he said it couldn't be moved yet because there are too many undocumented installation processes; "just" setting it up there wouldn't do anything.
[20:08:09] the production server should always be in the cluster
[20:08:19] Ryan_Lane: oh? that's not what I heard.
[20:08:28] production stuff lives in production
[20:08:33] labs is not for production things ;)
[20:10:05] Krinkle: the main blocker right now is me
[20:10:24] I am distracted with too many things :-/
[20:35:25] I am out of Cognac though :-(
[20:35:31] so re
[20:35:34] about linking our code
[20:35:40] hashar: how do you want to handle jenkins jobs for mw-extensions?
[20:35:57] I have no idea
[20:36:04] well not really
[20:36:05] hmm
[20:36:26] ideally we'd have a generic job that installs mediawiki + the extension repo that triggered the job, and have it trigger on a whitelist of extension repos
[20:36:37] my idea was to first get a copy of MediaWiki/core master by cloning the repo locally
[20:36:52] then add the extension + its specific LocalSettings.php
[20:36:55] then run the installer
[20:37:09] that would craft a MediaWiki install with just that extension enabled
[20:37:12] then run unit tests
[20:37:19] yup
[20:38:15] hashar: we should also move the install + phpunit logic out of the job-specific thing so that we can re-use it between mw-core and mw-whitelisted-extensions
[20:38:22] so that we can also add testswarm to both
[20:38:41] * RoanKattouw wants QUnit for VE in Jenkins as well ;)
[20:39:27] RoanKattouw: To get the QUnit results from testswarm you'll probably need to wait for TestSwarm 1.0; getting them from a headless webkit is easier, but that requires phantom-js + node
[20:39:37] Right
[20:39:50] So Jenkins just kicks off TestSwarm but doesn't get the results back?
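(Editor's sketch: the clone/install/test sequence hashar and Krinkle outline above could look roughly like the script below. All names -- the extension, the paths, the installer arguments -- are illustrative, and the `run` wrapper only prints each command rather than executing it, so the sequence can be inspected without a network or database.)

```shell
#!/bin/sh
# Hypothetical sketch of the generic per-extension Jenkins job.
# `run` is a dry-run wrapper: it prints commands instead of executing them.
run() { echo "+ $*"; }

EXT="ExampleExtension"   # illustrative: the extension repo that triggered the job

# 1. get a copy of MediaWiki/core master by cloning the repo locally
run git clone https://gerrit.wikimedia.org/r/mediawiki/core.git workspace
# 2. add the extension that triggered the job
run git clone "https://gerrit.wikimedia.org/r/mediawiki/extensions/$EXT.git" "workspace/extensions/$EXT"
# 3. run the installer, crafting an install with just this extension enabled
run php workspace/maintenance/install.php --dbtype=sqlite ...
#    ... plus the extension's specific LocalSettings.php additions
run sh -c "cat jobs/$EXT/LocalSettings.php >> workspace/LocalSettings.php"
# 4. run the unit tests
run php workspace/tests/phpunit/phpunit.php "workspace/extensions/$EXT/tests"
```

Keeping the steps behind a wrapper like this is also in the spirit of Platonides' later suggestion: each task as a standalone script, with Jenkins as a silly runner.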
[20:39:58] both are currently held up
[20:40:03] ?
[20:40:07] RoanKattouw: right now it makes a curl request to submit to TestSwarm and that's it
[20:40:10] Right
[20:40:21] no, nothing looks back at it -- it's completely ignored, as is the population of the swarm (mostly empty)
[20:41:05] http://www.mediawiki.org/wiki/Git/New_repositories/Requests looks broken, my request at least isn't showing up
[20:41:36] I've made a tremendous amount of progress on the automation of that, though; we're up to the point (at jQuery.org, but I made everything generic so that we can use it as-is for mediawiki) where a commit to the repo at github means 6 minutes later you get a link in IRC with a page like this: http://swarm.jquery.org/job/4
[20:41:46] which then links to stuff like this: http://swarm.jquery.org/result/571
[20:41:56] Krinkle: +2 on having a way to easily install MediaWiki
[20:42:12] the actual ant script more or less does that
[20:42:13] marktraceur, maybe because it was already fulfilled?
[20:42:21] but it still needs to be largely improved to be reusable
[20:42:51] Platonides, http://www.mediawiki.org/wiki/Git/New_repositories/Requests/Entries still has my request in the source code, and several others that aren't showing up
[20:43:21] what's your request?
[20:43:52] hashar, why not make each task a standalone script, and make jenkins a silly runner, instead of having the orders in the xml file?
[20:43:54] I'm working on a new extension called EtherEditor for integrating Etherpad Lite into the edit page
[20:44:09] I don't see it on http://www.mediawiki.org/w/index.php?title=Git/New_repositories/Requests/Entries&action=edit
[20:44:36] it was moved to the archive: http://www.mediawiki.org/w/index.php?title=Git/New_repositories/Requests/Entries&diff=550290&oldid=550287
[20:44:44] it should be created
[20:44:56] Oh, awesome
[20:46:19] there was a big backlog, but chad has been cleaning it up :)
[20:47:11] New patchset: Ottomata; "Changing timestamp format to YYYY-MM-DD in data files." [analytics/gerrit-stats] (master) - https://gerrit.wikimedia.org/r/11667
[20:48:18] New patchset: Ottomata; "Changing timestamp format to YYYY-MM-DD in data files." [analytics/gerrit-stats] (master) - https://gerrit.wikimedia.org/r/11667
[20:53:19] Platonides: I am not sure what you mean
[20:59:35] hashar?
[21:00:10] hashar, why not make each task a standalone script, and make jenkins a silly runner, instead of having the orders in the xml file?
[21:00:16] Platonides: sorry I am lagging
[21:00:52] I think that right now jenkins is run by an ant xml file (similar to a Makefile), right?
[21:01:12] if the commands were in standalone shell scripts
[21:01:19] they could be used outside of ant
[21:02:00] gerrit-wm seems to be lagging too
[21:02:45] ah, the patchsets shown here are filtered
[21:03:11] ant == makefile indeed
[21:03:35] the xml file defines targets, which can be called independently
[21:03:40] you could: ant mediawiki-install
[21:03:42] for example
[21:04:07] it's ugly :(
[21:05:50] Platonides: have a look at http://dpaste.org/jULiO/
[21:06:03] you can list described targets with `ant -p`
[21:06:11] (I could describe more of them)
[21:06:19] and you can even ask the user for interactive input
[21:14:12] do you need to run it in some folder?
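(Editor's sketch: a minimal build.xml fragment in the style hashar describes. Target names and paths are hypothetical, not the actual contents of the dpaste link; the point is that only targets carrying a `description` attribute are listed by `ant -p`, and each target can be invoked independently, e.g. `ant mediawiki-install`.)

```xml
<!-- Hypothetical fragment; target names and paths are illustrative. -->
<project name="mediawiki-ci" default="mediawiki-install" basedir=".">

  <!-- Only targets with a description show up in `ant -p`. -->
  <target name="mediawiki-install"
          description="Install MediaWiki into the workspace">
    <exec executable="php" failonerror="true">
      <arg value="workspace/maintenance/install.php"/>
      <arg value="--dbtype=sqlite"/>
      <!-- further installer arguments would go here -->
    </exec>
  </target>

  <target name="phpunit" depends="mediawiki-install"
          description="Run the PHPUnit test suite">
    <exec executable="php" failonerror="true">
      <arg value="workspace/tests/phpunit/phpunit.php"/>
    </exec>
  </target>

</project>
```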
[21:16:09] the folder where the build.xml is, Platonides
[21:19:29] Platonides: sorry can't stay any longer, heading to bed :-(
[21:20:17] and how does it know the mediawiki folder?
[21:20:21] no problem
[21:20:23] good night
[21:21:14] hey bsitu
[21:21:27] yes?
[21:21:29] I'm a bit confused by the FeedbackDashboard
[21:21:47] I see new users showing up who registered after midnight 2012-06-15
[21:23:49] I expect you mean 00:00
[21:24:12] the "next midnight" hasn't arrived yet at UTC
[21:27:53] Platonides: if you are interested, the repo is integration/jenkins
[21:28:01] the build script is in jobs/_shared/build.xml
[21:28:12] you can pass properties to the ant script through a properties file
[21:28:35] IIRC the mediawiki install is expected to be in $builddir/workspace
[21:28:47] well it is a bit nasty though :-(
[21:29:07] well I am out for real now. Have a good weekend everyone
[21:50:30] New review: Diederik; "Ok." [analytics/gerrit-stats] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/11667
[21:50:32] Change merged: Diederik; [analytics/gerrit-stats] (master) - https://gerrit.wikimedia.org/r/11667
[22:28:13] Anyone here with admin rights on gerrit who can abandon this? https://gerrit.wikimedia.org/r/#/c/3687/
[22:31:41] Krinkle: Done
[22:33:40] thx
[23:07:54] i'm working on an extension that requires browser-feature detection, to check whether a user has a webgl-enabled browser or not. if the browser has webgl enabled, then a 3d model is requested from the server; if not, then a static image is requested. it's important to get the 3d model or image from the server and render it as quickly as possible, though not important enough to stop the rest...
[23:07:55] ...of the document's content (i.e. a Wikipedia article's text) from rendering more than an extremely small amount of time. given that, i'm thinking it'd be best to put the feature detection method within a <script> in the <head> (e.g. via ResourceLoader with 'position'=>'top'), yet injecting inline scripts into the head is discouraged?
i don't know if i'm using the terms 'inject' and 'inline script' a bit too loosely, but that confuses me. does ResourceLoader only load (i.e. request) external resources?
[23:34:57] It won't be an inline script, it'll be loaded dynamically
[23:35:06] So let me rephrase then
[23:35:14] You shouldn't inject inline scripts into the head *yourself*
[23:35:20] So should let ResourceLoader do it for you
[23:35:23] *You should
[23:39:38] so let me check whether i understand: will the non-external script be loaded into the <head> dynamically on the server side by php, or get added dynamically on the client side via javascript? if the former is true, then that gels well with my idea. if the latter is true, then the resources would load slower than they would with the former (and if that's a strong convention and my use case isn't a reasonable exception, i'm more than fine with that).
[23:41:06] Emw: the slowness is either not true or insignificant
[23:42:02] Emw: It'll be loaded dynamically, but it'll still happen before the page loads
[23:42:12] If you set 'position'=>'top' at least
[23:42:36] It's the same effect as putting the script in the <head> directly; the browser won't start parsing the <body> until after it has run it
[23:46:27] ah, i think i see. so ResourceLoader would be some javascript function itself loaded at the top of the <head>, which would append this feature-detection inline script to the <head>, and the detection method would run before the <body> gets parsed (running the detection and sending the asynchronous request before the browser's "load" event fires)?
[23:47:04] the head-part is blocking
[23:47:26] i guess my main goal is to run the detection and send the asynchronous requests before the document's "load" event fires.
[23:47:27] the load event is at the end of the <body>; the position-top scripts finish before the <body> even begins
[23:48:37] ok, ResourceLoader it is. thanks for the information and clarification.
[23:51:10] (i've been diving into http://www.html5rocks.com/en/tutorials/internals/howbrowserswork/ to refresh on the low-level details of blocking, etc., and what i've heard here matches my understanding of that material)
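(Editor's sketch: the feature-detection step Emw describes might look like the snippet below, using the common canvas-based WebGL check. The function name and URLs are made up, and the `typeof document` guard is there only so the sketch is self-contained outside a browser; delivered via a ResourceLoader module with 'position'=>'top', it would run before the <body> is parsed, as Krinkle explains above.)

```javascript
// Hypothetical sketch of WebGL feature detection for the extension
// discussed above. Function name and resource URLs are illustrative.
function hasWebGL() {
    // Guard so the sketch also runs outside a browser (e.g. under node).
    if (typeof document === 'undefined') {
        return false;
    }
    try {
        var canvas = document.createElement('canvas');
        // A context is only returned when WebGL is actually usable.
        return !!(window.WebGLRenderingContext &&
            (canvas.getContext('webgl') || canvas.getContext('experimental-webgl')));
    } catch (e) {
        return false;
    }
}

// Request the heavy 3D model only when WebGL is available,
// otherwise fall back to a static image (placeholder URLs).
var resourceUrl = hasWebGL() ? '/w/model.3d' : '/w/model.png';
```

Because this runs in the blocking head-part, the asynchronous request for `resourceUrl` can be fired well before the document's "load" event.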