[00:00:45] ah; yep [00:01:02] to: window.setTimeout( mw.centralNotice.waitForCountry.bind( mw.centralNotice ), 100 ); [00:01:16] so between those 5 changes, hopefully something will improve ;) [00:01:20] (if you need a this object, else just preserve the initial whitespace format [00:01:40] marktraceur: it doesn't use 'this', i checked :) [00:01:46] and .bind is ES5, and will fail in IE8 [00:01:51] Excellent! Forward with the other change! [00:01:53] you'd want to use $.proxy [00:02:07] .... [00:02:15] but: good instinct nonetheless. [00:02:15] Can we just....shoehorn in a bind method? [00:02:21] * mwalker hates JS/ES/*S [00:02:33] * mwalker also hates PHP for good measure [00:02:48] I prefer BS :) [00:02:54] mwalker: clojurescript. shhh. [00:02:58] mwalker: CS (computer science)? [00:03:10] hello, i've posted this on #wikipedia, was redirected here. im facing a very strange issue, there is a particular edit that i'm unable to submit! i keep getting an error "Our servers are currently experiencing a technical problem ...[]", but i do other edits and they pass fine! [00:03:11] marktraceur: I'm actually an EE :p [00:03:18] mwalker: Well fine! [00:03:28] (I keep my CS degree on the downlow) [00:03:35] what i get is : Request: POST http://ar.wikipedia.org/w/index.php?title=%D8%A5%D8%B3%D9%84%D8%A7%D9%85&action=submit, from 41.43.16.246 via cp1002.eqiad.wmnet (squid/2.7.STABLE9) to 10.64.0.141 (10.64.0.141) [00:03:36] Error: ERR_READ_TIMEOUT, errno [No Error] at Fri, 09 Nov 2012 23:53:01 GMT [00:03:41] uwe: does it take a very long time to return the error message to you? [00:03:41] page is ar:إسلام [00:04:01] well, relatively yes [00:04:09] but the consistency is the strange thing [00:04:21] only this submission is failing [00:04:31] uwe: are you adding any templates? [00:04:50] i can even view diff, but not submit or preview !!! [00:05:01] nop, regular edits [00:05:15] I can't even load the page :P [00:05:25] i know it might be just a timeout, but really i tried more than a dozen of times [00:05:31] ori-l: No, serious question, can we make a hack to add Function.prototype.bind, or is that just a foolish thing to throw in blindly? [00:06:04] finally... [00:06:34] marktraceur: $.proxy does the same thing. hacks to 'back-port' ES5 features to older browsers are called 'polyfills' [00:07:42] ori-l: jQuery is great and does all things? :P [00:08:15] well, it's often hard (sometimes outright impossible) to get polyfills to behave exactly the same way as native implementations. in such cases, the presence of the polyfill can be confusing, because it allows someone to persist in the error that something is implemented natively across platforms [00:08:33] and polyfills that support 90% of a feature are false friends for that reason [00:08:54] better to know that Function.prototype.bind isn't universally available, and stick to $.proxy for now [00:09:04] uwe: it seems to be a pretty long article, so I'm guessing it's the normal problem: mediawiki can't parse the page fast enough. This is a common problem on many wikis now, especially for articles with lots of templates. [00:09:06] kaldari, i even thought the article is big and since i'm editing only the intro i tried to do action=edit&section=0 and it worked, but still could not submit the edits [00:09:17] :) [00:09:33] kaldari: that's pretty fucked up. i hadn't realized things were that severe. [00:10:04] try editing the article 'a' on english wiktionary :) [00:10:47] kaldari, should the section editing reduce the risk of timing out ?
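[Editor's note: a minimal sketch of the alternatives discussed above (00:01:02–00:08:54). The mw.centralNotice.waitForCountry call is taken from the log; the rest is illustrative. $.proxy is jQuery's own context-binding helper and works in IE8, and the naive Function.prototype.bind shim below shows concretely why ori-l calls partial polyfills "false friends".]

    // ES5 only: Function.prototype.bind does not exist in IE8, so this line throws there.
    window.setTimeout( mw.centralNotice.waitForCountry.bind( mw.centralNotice ), 100 );

    // jQuery equivalent that works in older browsers: $.proxy( fn, context )
    // returns a new function whose `this` is fixed to `context`.
    window.setTimeout( $.proxy( mw.centralNotice.waitForCountry, mw.centralNotice ), 100 );

    // A naive partial polyfill, shown only to illustrate the "90% of a feature" problem:
    if ( !Function.prototype.bind ) {
        Function.prototype.bind = function ( context ) {
            var fn = this,
                args = Array.prototype.slice.call( arguments, 1 );
            return function () {
                // Covers plain calls and pre-bound arguments, but not `new boundFn()`
                // semantics or the bound function's .length, so code that relies on
                // those still breaks even though .bind appears to be "supported".
                return fn.apply( context, args.concat( Array.prototype.slice.call( arguments ) ) );
            };
        };
    }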
[00:11:09] no, it has to reparse the entire page since you're creating a new version [00:12:04] uwe: sometimes, even though it times-out, it will still actually save your edit [00:12:09] why would it not just submit it in the database then struggle later to reparse it :) ; sorry if this is bad taste :) [00:12:13] but it's hit or miss [00:12:15] kaldari: this article is ridiculous... I mean seriously; 25 sections, at least 60 templates, AND an appendix [00:13:08] try https://en.wikipedia.org/wiki/Srebrenica_massacre [00:13:21] it's a freakin novel [00:13:23] kaldari, so i should just keep trying [00:14:16] uwe: unfortunately, yes. one option is to try to replace some of the templates with regular text, which will reduce parsing time. [00:14:38] or wait until Scribunto is deployed to ar.wiki [00:14:57] * uwe googles Scribunto [00:15:10] kaldari: But how can you replace the templates, if you can't edit the page? :/ [00:15:26] uwe: Have you tried editing a section at a time? [00:15:27] good point! [00:15:49] yeah marktraceur , i just asked if that should help [00:16:09] @marktraceur : kaldari, i even thought the article is big and since i'm editing only the intro i tried to do action=edit&section=0 and it worked, but still could not submit the edits [00:16:40] i did not expect section=0 trick to work, but well ; it did :) [00:18:33] but if there is a severe existing problem, then it must be that! i'll just keep trying [00:19:14] I just tried myself and got a timeout after 1 minute [00:19:42] might be hopeless, some articles are now effectively locked due to the timeout problem [00:19:56] i was wondering if its not something else, like choking on a certain sequence of characters or some similar black magic issues :P [00:19:58] maybe we should up the timeout [00:20:16] kaldari: But then nobody will learn! [00:20:46] marktraceur, well, otherwise, those who learn, will never be able to apply their knowledge :D [00:21:03] uwe: I doubt it, the simplest explanation is that it's just taking too long to parse [00:21:19] since it's a very long article [00:21:49] uwe: you might try editing it at night when the server load is less and see if you have better luck [00:21:59] uwe: The result will be that new articles get created that mimic the same content, but people are better about splitting pages off :) [00:22:08] yes, of course. i got worried when the section thing did not work either, but as you say its irrelevant, its not it [00:22:15] uwe: hopefully Scribunto will be deployed early next year and fix this problem [00:22:23] kaldari, hehee ... its 2:22 am here :P [00:22:32] hehe [00:22:33] yeah, ill try tomorrow [00:22:51] OK, then try at a normal mid-day time (which will be night here) [00:23:11] marktraceur, true :) but a bit sadistic , no ;) ? [00:23:28] uwe: and also keep in mind that even if it times-out, it may have saved your edit [00:23:38] uwe: I prefer the term "negative reinforcement" [00:23:43] for example, I just had an edit go through, even though it times out. [00:23:45] kaldari, yeah, i've checked the history , not there [00:24:39] I imagine your edit might be putting it over the line, I was trying to add a comment, which the parser strips out anyway... [00:24:42] damn! your edit passed and mine did not !! grrrr .... [00:25:02] uwe: try removing some other text from the article at the same time that you add your new text.
[00:25:30] especially if there are any templates that can be removed or subst'ed [00:25:31] hmm, not really an option, this is the introduction of a pretty sensitive article :) [00:25:47] oh, maybe from later parts of the body, i'll try that [00:26:16] my 2nd try totally failed though :( [00:26:52] robla: any thoughts on this? [00:29:26] jesus! this is the article on Islam! [00:29:32] that's pretty important [00:29:49] kaldari: That was a hilarious interjection to use [00:29:56] haha [00:30:20] well, since i'm here already, there was a discussion about using etherpad + mediawiki ... i found some discussion here: http://strategy.wikimedia.org/wiki/Proposal:Etherpad-based_editing ... I'm not sure what that status means on that page [00:30:41] let me find the bugzilla bug for you... [00:30:50] uwe: Oh, that's me! [00:31:04] uwe: I created the EtherEditor extension, which implemented most of this functionality [00:31:08] !e EtherEditor [00:31:11] ... [00:31:13] Damn. [00:31:17] https://bugzilla.wikimedia.org/show_bug.cgi?id=19262 [00:31:20] http://www.mediawiki.org/wiki/Extension:EtherEditor [00:31:38] oh oh!! [00:31:40] uwe: please chime in on the bug above [00:31:50] damn! i did not know the extension existed !!! YAY [00:32:24] uwe: It's still experimental and unfinished, though. [00:32:34] uwe: File bugs if you find issues, and let me know if you need help [00:32:50] uwe: But I'm currently on another project, so I can't finish it right now :) [00:33:05] uwe: However, we do have an EPL server up: http://etherpad.wmflabs.org/pad/ [00:33:42] uwe: Also, the Etherpad people hang out in #etherpad-lite-dev on Freenode, if you'd like to keep in touch with that group. [00:34:25] marktraceur, i was so excited about getting etherpad+mediawiki+wysiwyg(wiki syntax) working (just in my head of course); that would just revolutionize the way office work is being done without switching to google docs and similar !! [00:35:00] *nod* [00:35:40] i'll sure deploy that extension ASAP [00:36:09] kaldari, im sorry, got drifted away, so i should add the article i was trying to edit to the list in the bug report ? [00:36:24] uwe: If you have enough interest, I may forego other contributions to some projects and try to get it working better for you. [00:37:00] uwe: yes, and please mention it is the article for Islam, so that non-Arabic speakers will understand the importance [00:37:26] uwe: do you have a bugzilla account? [00:37:41] creating one now [00:39:32] * uwe wonders why he does not have one (yet)! [00:40:30] since it has 386 refs it's going to be tough to get it under the parse time-out limit :( [00:40:55] it would be a shame to remove refs. Can any of them be combined? [00:40:58] kaldari, and i was adding yet another one :) [00:41:11] that would be why it can't save it [00:41:33] refs take extra parsing time, especially if they use citation templates [00:42:06] if there are any duplicate references that can be combined, that would help [00:42:07] kaldari: just got back to my computer. what did you want my thoughts on?
[00:42:30] the article on Islam on the Arabic Wikipedia is currently uneditable due to the parser timeout [00:42:59] maybe we should adjust the timeout as a stopgap [00:43:13] Hm so those echo notifications still overlap with [00:43:33] Krenair: Benny's working on that one [00:44:05] it doesn't matter much anyway, echo-error-no-formatter shouldn't ever happen and I doubt it'll get seen in production [00:44:55] robla: Also the article 'a' on English Wiktionary has the same problem [00:46:02] I notice that the default notification email always gets sent as wikitext [00:46:12] marktraceur, I definitely want to get somewhere with that one! let me start by deploying it and then see [00:46:38] 127.0.0.1 mentioned you in a [[User talk:Wef#Test|discussion]] on the "User:Wef" talk page [00:46:45] uwe: I'm actually hearing good things, potentially I can work on it sometime in the next few months [00:47:58] * robla is looking [00:50:26] robla: I was able to save the addition of an HTML comment, but only after a couple tries (and it timed out on all tries regardless of whether the edit was successful). uwe's been trying to add a ref, but gave up after about a dozen tries. [00:53:27] uwe: some editors have resorted to subst'ing all the refs for articles that have parser lock-out [00:53:41] marktraceur, can i kiss you ? :D [00:53:51] uwe: That might be a little weird :) [00:53:56] ;) [01:00:26] a few of us on the third floor are puttering with this now [01:02:12] greens fees. [01:02:32] marktraceur: any objections to merging the Flickr code as it stands now? [01:03:08] kaldari: I haven't looked at it recently, but I'm sure it's fine if you think it's fine. [01:04:10] it looks good, and I don't think anyone else is working on UW right now, so it would probably be minimally disruptive [01:05:00] it has a lot of TODOs, but those can wait for another changeset [01:06:25] OK, it's merged. [01:07:14] only took 5 months :) [01:07:29] and 25 patchsets [01:09:50] y'all prepared to debug if it breaks when we deploy next week? [01:11:41] * marktraceur didn't merge the patch :) [01:11:54] kaldari: Now, the HTML5 drag-n-drop patch! [01:12:19] robla: yep [01:12:19] robla: speaking of which, when's wmf4 deploying? monday or tuesday? [01:12:43] Sam's going to do it on Monday [01:13:01] * robla starts packing up his desk because of hard deadline [01:18:40] my toolserver jobs are timing out too, but that would be the toolserver's fault :( [01:30:22] kaldari: I think we're probably going to have to leave it to the Arabic Wikipedians to figure out which template is the issue [01:31:00] there's probably something that changed here: http://ar.wikipedia.org/wiki/%D8%AE%D8%A7%D8%B5:%D8%A3%D8%AD%D8%AF%D8%AB_%D8%A7%D9%84%D8%AA%D8%BA%D9%8A%D9%8A%D8%B1%D8%A7%D8%AA_%D8%A7%D9%84%D9%85%D9%88%D8%B5%D9%88%D9%84%D8%A9/%D9%82%D8%A7%D9%84%D8%A8:%D9%82%D8%A7%D8%A6%D9%85%D8%A9_%D9%85%D8%AE%D9%81%D9%8A%D8%A9 [01:31:01] yeah, it's hard to debug in Arabic :P [01:31:09] marktraceur, i have a few questions/ideas about the pad extension, should i post it here or somewhere else, or maybe pm ? [01:31:13] yeah [01:31:27] er....did I get the right one? [01:31:50] prolly doesn't matter. basically, "related changes" on the page....you know the drill [01:32:17] one edit of mine passed ! it was *removing* a broken ref [01:32:34] yay [01:33:11] im retrying to submit my earlier edit [01:34:20] uwe: this is the problem on ar.wikipedia, or a different problem? [01:34:41] oh oh ! both edits passed ! [01:34:47] :D [01:35:00] yep ! 
[01:35:09] robla, yes :) [01:35:36] excellent work! [01:37:20] thank you robla, kaldari , everyone for the great effort ! [01:43:54] uwe: thanks for being persistent :) [10:47:15] wtf [10:47:28] Hmm, Hashar, Krinkle|detached and werdna are AWOL [10:49:18] Plus all the Germans [10:49:22] well, "WMDE" [10:50:03] thedj: ohai [10:51:52] thedj[work]: ohai [10:51:53] ;) [12:58:51] * werdna waves [12:58:51] ree [12:58:51] Reedy: so I'm sitting on my sixth train/bus of the day, a high speed train from Breda to Amsterdam [12:58:51] fingers crossed, I'll be there at 3. [12:58:59] three hours later than I expected [12:59:02] :( [12:59:25] I ran into a friend from Maastricht at Den Bosch, though [12:59:28] but she's working [13:39:55] New patchset: Hashar; "gruntjs 3eff3f8 (devel branch) + related modules" [integration/gruntjs] (master) - https://gerrit.wikimedia.org/r/32722 [14:55:43] hashar: "grunt-contrib-wikimedia" [14:57:01] Change merged: Krinkle; [integration/gruntjs] (master) - https://gerrit.wikimedia.org/r/32722 [14:58:00] Krinkle: done integration/grunt-contrib-wikimedia.git [15:02:51] New patchset: Hashar; "updates gruntjs v0.3.17 -> devel (3eff3f8)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32728 [15:03:11] New review: Hashar; "Updated for Jenkins with https://gerrit.wikimedia.org/r/32728" [integration/gruntjs] (master) - https://gerrit.wikimedia.org/r/32722 [15:03:17] Krinkle: gruntjs Updated for Jenkins with https://gerrit.wikimedia.org/r/32728 [15:03:50] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32728 [15:11:57] Krinkle: is it correct for testswarm et al. under integration not to be on ohloh? https://www.mediawiki.org/wiki/Community_metrics#3rd_parties [15:12:31] don't know, don't care.. [15:12:44] TestSwarm is not a wikimedia product, that I know [15:13:12] doesn't it need some configuration and such? [15:13:19] sure [15:13:46] but that's just a few small files very specific to our setup, most of which can be auto-generated with TestSwarm install [15:40:58] New patchset: Krinkle; "initial commit" [integration/grunt-contrib-wikimedia] (master) - https://gerrit.wikimedia.org/r/32733 [15:41:09] hashar: initial commit: https://gerrit.wikimedia.org/r/32733 [15:50:23] New review: Hashar; "nice. I guess that later on we could use some documentation in a README.md file at the root of the r..." [integration/grunt-contrib-wikimedia] (master); V: 0 C: 2; - https://gerrit.wikimedia.org/r/32733 [15:50:32] New review: Hashar; "nice. I guess that later on we could use some documentation in a README.md file at the root of the r..." [integration/grunt-contrib-wikimedia] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/32733 [15:50:32] Change merged: Hashar; [integration/grunt-contrib-wikimedia] (master) - https://gerrit.wikimedia.org/r/32733 [15:50:40] Krinkle: https://gerrit.wikimedia.org/r/#/c/32733/ merged [16:34:51] Krinkle: Reedy: beer ??? [16:35:10] Reedy: hashar: roarrr [16:35:16] Krinkle: Reedy: ice tea ? [16:35:18] water ??? [16:39:07] roarrr is good [18:18:49] wooooooooo re: grunt [18:20:37] ori-l: oooh ori [18:20:41] ori-l: are you working ? ;-D [18:21:05] zeljkof has been using your mediawiki / vagrant stuff and has some issue. 
I think he replied to your wikitech-l message [18:21:17] but that can most probably wait till monday [18:21:34] Reedy: https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blobdiff;f=maintenance/tables.sql;hb=bf4fb5ee9ea251512e736be36d9d6bca271c58d4;hpb=f0ee7f9b4a1d091a57ef236c6550b4ab7549731f [18:21:52] hashar: looking! [18:25:41] hashar: is zeljkof on irc? [18:25:53] ori-l: he is :) [18:26:02] heh. sorry, i just woke up :| [18:26:20] zeljkof: can you try typing "vagrant ssh" in the root folder of the repo? [18:26:30] yes [18:26:36] tried it already [18:26:42] then running "sudo apt-get update" [18:26:43] anything specific that I should do? [18:26:47] that works [18:26:54] hashar tried it already [18:27:10] then ctrl+d to exit back to the host, and run "vagrant provision"? [18:27:13] hummmm. [18:27:24] ori-l: just a sec to do vagrant up [18:28:11] i think s page ran into a problem with a copy of ubuntu that had uk.ubuntu.com references in its apt.list, maybe some kind of weird geoip restriction [18:28:30] so maybe the vagrant vm references us.ubuntu.com.. if so we should change that to the generic one [18:28:34] * ori-l looks [18:31:43] yup: "deb http://us.archive.ubuntu.com/ubuntu/ precise multiverse" [18:31:45] * ori-l frowns. [18:32:05] just tried sudo apt-get update [18:32:10] * zeljkof $ sudo apt-get update [18:32:10] * zeljkof Ign http://security.ubuntu.com precise-security InRelease [18:32:13] ... [18:32:23] looks ok [18:32:41] okay, so exit the ssh session and run "vagrant provision" [18:32:45] to force a puppet run [18:33:03] ori-l: running [18:33:22] done [18:33:36] anything about mysql? [18:33:43] looking... [18:34:05] debug: importing '/tmp/vagrant-puppet/modules-0/mysql/manifests/init.pp' in environment production [18:34:17] debug: Automatically imported mysql from mysql into production [18:34:30] debug: Adding relationship from Service[mysql] to Exec[mysql-set-password] with 'before' [18:34:40] debug: Exec[mysql-set-password]: Adding default for path [18:34:49] looks right to me [18:34:56] debug: /Stage[main]/Mysql/Service[mysql]/require: requires Package[mysql-server] [18:34:57] debug: /Stage[main]/Mysql/Service[mysql]/before: requires Exec[mysql-set-password] [18:35:05] debug: /Stage[main]/Mediawiki/Exec[mediawiki_setup]/require: requires Package[mysql-server] [18:35:15] so far so good [18:35:47] there is ton of mysql debug lines [18:35:54] should I paste them all? [18:36:04] nope [18:36:13] unless you see anything colored red [18:36:14] when I ssh to vm and do mysql --version it says mysql is not installed [18:36:44] debug: /Stage[main]/Mysql/Exec[mysql-set-password]/unless: mysqladmin: connect to server at 'localhost' failed [18:37:04] not colored red [18:37:39] must mean some of the packages failed to install [18:37:47] let me see if mysql is installed now [18:38:10] vagrant@precise32:~$ mysql --version [18:38:10] mysql Ver 14.14 Distrib 5.5.28, for debian-linux-gnu (i686) using readline 6.2 [18:38:14] now it is there [18:38:26] run vagrant provision one more time? [18:38:36] but when I did "vagrant up" then it did not install it [18:38:36] i think to fix this i ought to do two things [18:38:55] I should run vagrant provision one more time? [18:38:59] yeah [18:39:05] should I do vagrant halt before?
[18:39:16] this shouldn't happen, but you got into a weird state because the first provisioning failed [18:39:22] no, not necessary [18:39:26] so to fix it i need to [18:39:27] 1) remove geo-specific ubuntu servers and replace with generic [18:39:28] running it then [18:39:35] 2) make mysql task dependent on apt-get update [18:39:47] ori-l: sounds good to me [18:40:03] vagrant provision is done [18:40:16] anything I should look for in output? [18:40:22] try to browse to http://127.0.0.1:8080/wiki/Main_Page [18:41:01] it works! :) [18:41:06] but it used to work before too [18:41:17] but then it opened "install mediawiki" page [18:41:33] and now it just opens the wiki, there is no "install mediawiki step" [18:41:40] yeah, that's as it should be [18:41:48] great [18:41:51] thanks a lot [18:42:00] please let me know when you push to github [18:42:05] or is the repo in gerrit now? [18:42:09] no problem, sorry the experience sucked.. but the information you provided will help a lot [18:42:21] no, github [18:42:45] I am watching https://github.com/wikimedia/wmf-vagrant so I will get mail when you push [18:42:57] will delete everything and do a clean install when I see the push [18:43:03] just to make sure it works [18:43:10] great! [18:43:18] ori-l: I am a tester, after all :) [18:43:34] :) [18:43:55] thanks for setting it up, it is really simple to do [18:44:31] the only thing left now is this small bug :) [18:45:53] Reedy: https://bugzilla.wikimedia.org/show_bug.cgi?id=41973#c1 [18:46:13] ChadHorohoe: YUNOUPDATEJENKINS!? [19:02:55] Reedy: no i don't care :-D [19:03:06] I don't have access on manganese [19:03:10] probably gerrit just restart itself [19:03:13] :( [19:04:21] hashar: Don't start committing code drunk, now [19:06:34] ohh [19:06:38] where is my beer? [19:07:07] <^hashar> With me. I stole it :) [19:08:02] oh noo [19:08:43] throw new OOBException( "You ran out of beer, go grab another one\n" ); [19:10:01] <^hashar> You should tweak the constructor, so it passes the message to the parent's constructor without having to pass it at throw time. 
[19:10:08] <^hashar> So you can just throw new OOBException; [19:10:21] Or have a paramter for type of beer [19:10:43] <^hashar> throw new OOBException( Beer::LAGER ); [19:12:27] Change restored: Hashar; "(no reason)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26959 [19:12:47] Change restored: Hashar; "(no reason)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26960 [19:12:52] Change restored: Hashar; "(no reason)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26961 [19:12:57] Change restored: Hashar; "(no reason)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26962 [19:13:02] Change restored: Hashar; "(no reason)" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26963 [19:15:32] New patchset: Hashar; "Add job for PagedTiffHandler extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26961 [19:15:32] New patchset: Hashar; "Add job for unit testing Echo extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26960 [19:15:32] New patchset: Hashar; "Add job for PageTriage extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26963 [19:15:32] New patchset: Hashar; "Add job for EducationProgram extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26962 [19:15:33] New patchset: Hashar; "Add job for testing DataValues extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26959 [19:15:38] spam [19:18:43] zeljkof: i think i fixed it.. starting from scratch to test one more time [19:19:00] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26963 [19:19:01] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26962 [19:19:01] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26961 [19:19:01] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26960 [19:19:01] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/26959 [19:19:12] ori-l: looking forward to the fix :) [19:19:38] if you push it in the next 10 minutes I can show it at the hackaton [19:27:09] thedj: https://test.wikipedia.org/wiki/User:Krinkle/CollapsingTestpageMw [19:29:45] zeljkof: done! [19:29:50] well, someone needs to pull it [19:29:54] https://github.com/wikimedia/wmf-vagrant/pull/3 [19:29:58] will do right now [19:30:04] otherwise just use directly from atdt/wmf-vagrant [19:30:12] ori-l: you are our hero ::-]]]]]]]]] [19:30:29] :)))))))) [19:30:41] I would probably hire you to work for the WMF ;-D [19:31:22] hashar, you wouldn't be the first to make that mistake :P [19:33:29] New patchset: Hashar; "update Ext-WebFonts job although it is disabled." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32764 [19:33:29] New patchset: Hashar; "Wikibase disabled for a few: HUGE mem leak issue :(" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32765 [19:33:44] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32764 [19:33:53] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32765 [19:34:09] ori-l: yeah, seems like the WMF has an habit of hiring awesome people [19:35:12] hashar: where are you guys, btw? india or netherlands? 
i lost track of the hackathons currently happening [19:35:47] Sam Reed, Timo, Jeroen, Daniel Kinzler, Zeljko, I and others are in Amsterdam [19:36:05] Siebrand (who is dutch) is in India :-D [19:36:47] tell timo not to give up on me :P (re catching trivial mistakes when reviewing js code) [19:36:58] * ori-l pokes Krinkle [19:37:06] he is spying us [19:37:11] i think he was thoroughly disgusted by some patch i approved yesterday [19:37:40] 47,000 lines fixed this week to pass jshint. Lets hope it is the last major pass before getting linting into jenkins. [19:37:53] which then got broken within 24 hours by jquery.badge [19:38:06] an jquery.tablesorter [19:38:09] Krinkle: :( [19:38:10] and mediawiki.language [19:38:23] on the plus side, that is a lot of cruft we removed from that module [19:38:24] (I believe none of these are ori-l though) [19:38:28] and i'm learning to spot things better [19:38:47] tablesorter and language were partly auto-generated/upstream [19:38:52] and jquery.badge was kaldari [19:38:59] anyway, just 1 line. [19:39:15] * ori-l fixes [19:39:36] * hashar now listens to zeljkof presentation of vagrant [19:39:39] ori-l: oh, you mean the innerHTML, yes that was a bit annoying. Though jshint didn't catch that, its not a syntax error. [19:39:54] hackathon presentations, seeya later :) [19:39:58] bye [19:40:34] * ori-l hopes vagrant works [19:42:41] hashar: make sure to let people know that there's a pull request pending on wikimedia/wmf-vagrant that fixes a bug, otherwise they might encounter the same problem zeljkof did (https://github.com/wikimedia/wmf-vagrant/pull/3) [19:42:50] * Krinkle is currently updating the schema diagram (mwb, [[mw:DB]]) [19:43:18] MySQLWorkbench doesn't support sql patches, so I need to visually apply 12 months of changes to table.sql in the diagram [19:43:27] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blobdiff;f=maintenance/tables.sql;hb=bf4fb5ee9ea251512e736be36d9d6bca271c58d4;hpb=f0ee7f9b4a1d091a57ef236c6550b4ab7549731f [19:55:47] Dammit Jenkins-bot just went around mediawiki/extensions/Echo and failed everything -.- [19:59:33] !g 30066 [19:59:59] https://gerrit.wikimedia.org/r/#q,30066,n,z [20:07:16] Patch Set 4: No score [20:07:16] Build Failed [20:07:29] That doesn't even make sense. [20:08:35] hashar, werdna: who is that triggering failing echo jobs? [20:08:59] hashar :) [20:09:10] * werdna waves Reedy / hashar. I'm on a train! [20:09:15] Missed the one I was meaning to catch [20:09:17] but oh well [20:09:44] Ah yes, it says manually triggered by him on the job page [20:14:53] I rebased two of them, no point with the XMPP change [20:15:43] Btw werdna, did I mention the XMPP change relies on a now missing github repository? [20:15:55] awkward [20:16:10] I thought I cloned it? [20:16:13] so I figured it would stick around [20:16:32] You cloned one of them [20:16:42] The other came from an 'ivan1986': https://gerrit.wikimedia.org/r/#/c/16580/6/.gitmodules [20:17:39] ori-l-away: mediawiki install via vagrant works when I pull from your repo! :) thanks [20:17:51] woooooo! 
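[Editor's note: on the jshint/grunt exchange above (19:37–19:40) — a minimal sketch of what a grunt 0.3-era lint configuration looks like, since the log mentions gruntjs v0.3.17 and the new integration/grunt-contrib-wikimedia repo but not their contents. File paths, options and globals here are illustrative assumptions, not the actual Wikimedia config.]

    // grunt.js (grunt 0.3 used this filename; the devel/0.4 branch renames it to Gruntfile.js)
    module.exports = function ( grunt ) {
        grunt.initConfig( {
            lint: {
                // Illustrative path; a real job would point at the MediaWiki modules
                // mentioned above (jquery.badge, jquery.tablesorter, mediawiki.language, ...).
                all: [ 'resources/**/*.js' ]
            },
            jshint: {
                options: {
                    curly: true,
                    eqeqeq: true,
                    undef: true,
                    browser: true
                },
                globals: {
                    jQuery: true,
                    mediaWiki: true
                }
            }
        } );

        // Running `grunt lint` in a Jenkins job fails the build on jshint errors.
        // Note that jshint only catches syntax/style problems, not logic issues
        // like the innerHTML case Krinkle mentions above.
        grunt.registerTask( 'default', 'lint' );
    };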
[20:17:57] thanks for letting me know :) [20:22:43] So now Jenkins-bot has been failing them with this error: https://integration.mediawiki.org/ci/job/Ext-Echo/8/console [20:23:24] 20:06:44 [exec] "CREATE INDEX user_timestamp ON echo_notification (notification_user,notification_timestamp) [20:23:43] Krenair: It sounds like the index is in the first file, and then there's another file that also tries to create it also [20:24:05] $updater->addExtensionTable( 'echo_subscription', $baseSQLFile ); [20:24:05] $updater->addExtensionTable( 'echo_event', $baseSQLFile ); [20:24:05] $updater->addExtensionTable( 'echo_notification', $baseSQLFile ); [20:24:05] $updater->modifyField( 'echo_event', 'event_agent', [20:24:05] "$dir/db_patches/patch-event_agent-split.sql", true ); [20:24:06] $updater->modifyField( 'echo_event', 'event_variant', [20:24:09] "$dir/db_patches/patch-event_variant_nullability.sql", true ); [20:24:11] $updater->modifyField( 'echo_event', 'event_extra', [20:24:13] "$dir/db_patches/patch-event_extra-size.sql", true ); [20:24:43] $baseSQLFile = "$dir/echo.sql"; [20:24:43] $updater->addExtensionTable( 'echo_subscription', $baseSQLFile ); [20:24:43] $updater->addExtensionTable( 'echo_event', $baseSQLFile ); [20:24:43] $updater->addExtensionTable( 'echo_notification', $baseSQLFile ); [20:24:46] There's the problem [20:24:50] it does the same call three times [20:25:41] https://gerrit.wikimedia.org/r/32774 [20:29:04] re-triggered the jobs [20:30:41] Reedy, do they need to be rebased on top of that? [20:30:55] Possibly... [20:31:07] Certainly shouldn't do any harm [20:31:16] Well they all failed again. [20:31:47] With the same error [20:33:24] now 503s... [20:33:37] Gerrit died [20:33:38] RiP [20:35:03] Reedy, well that time it saw your fix change, but still gave the error... [20:35:07] You did test it, didn't you? [20:35:30] No [20:35:43] JeroenDeDauw: hashar is me :-) [20:35:55] hashar: this is the stuff we use to diff the JSON: https://www.mediawiki.org/wiki/Extension:Diff [20:35:56] JeroenDeDauw: not to be confused with binasher : Asher Feldman [20:36:00] the sql boss :-] [20:36:35] Update.php works fine locally with Echo installed [20:50:04] Reedy, hashar: wikidata's search index is now an innodb table? but innodb doesn't even support fulltext search! no wonder the results suck... [20:50:06] http://dev.mysql.com/doc/refman/5.1/en/fulltext-search.html [20:50:20] DanielK_WMDE: what is innodb ? [20:50:24] The real solution, of course, is getting the OAI stuff merged and switching to Lucene [20:50:31] DanielK_WMDE: no, it's myisam [20:50:41] It won't work if it's innodb [20:51:02] MW complains if you try and use innodb [20:51:19] Reedy: because it's not webscale? [20:51:23] Yup [20:51:34] Reedy: yea, i know [20:51:44] hashar: sorry, that was for asher :P [20:51:53] Jenkins-bot just failed an I18n update to PageTriage... https://gerrit.wikimedia.org/r/#/c/32795/ [20:52:23] DanielK_WMDE: see #wikimedia-operations - Asher is trying to fix basic lucene at least... We can see if that works (even without your OAI updates) [20:52:49] 20:43:59 [exec] from within function "DatabaseBase::sourceFile( /var/lib/jenkins/jobs/Ext-PageTriage/workspace/extensions/PageTriage/sql/PageTriageTags.sql )". [20:52:50] 20:43:59 [exec] Database returned error "1: near ",": syntax error" [20:53:26] mwsearch should be left enabled on wikidatawiki [20:54:11] there's an index built now, though it won't get pushed for a while [20:54:40] hmmm [20:54:45] Any idea when it does it? 
[20:55:53] looks like around 6-7am UTC [21:00:03] leaving out the hackaton :( [21:00:15] Reedy, I just set up Echo with an SQLite database and get the same error as Jenkins did [21:00:38] yeah need to disable the builds I guess [21:01:31] Krenair: ahh good to know :-] [21:01:44] Usually I use MySQL... [21:01:46] Krenair: I will disable it so [21:01:59] SQLite sucks ;) [21:02:03] yeah [21:03:03] Krenair: I have disabled it in Jenkins [21:03:08] ok [21:03:11] PageTriage as well? [21:03:12] Krenair: thanks for your investigation :-]]]]] [21:03:20] yup [21:03:39] Krenair: Siebrand added them a few weeks ago [21:03:56] Krenair: I have enabled them without properly checking them beforehand :( [21:04:05] at least we now have Ext-DataValues in :-] [21:04:13] I am out [21:04:16] really [21:11:46] Guess I'll file a bug for SQLite support then [21:16:05] done: https://bugzilla.wikimedia.org/41987 [22:53:21] New patchset: Jeroen De Dauw; "Add job for testing Diff extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32832 [22:54:48] Change abandoned: Jeroen De Dauw; "huh - what happened to the symlinks?" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32832 [22:56:55] New patchset: Jeroen De Dauw; "Add job for testing Diff extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32833 [23:05:23] New patchset: Jeroen De Dauw; "Add job for testing Maps extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32834 [23:06:48] New patchset: Jeroen De Dauw; "Add job for testing Validator extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32835 [23:12:42] New patchset: Jeroen De Dauw; "Add job for testing SemanticMediaWiki extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32836 [23:14:08] New patchset: Jeroen De Dauw; "Add job for testing Maps extension" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32834 [23:16:37] New patchset: Jeroen De Dauw; "minor cleanup of extrasettings for Wikibase" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32837 [23:18:54] New review: Aude; "Wikibase Client shouldn't be in here and think it can be removed with this cleanup. Otherwise the c..." [integration/jenkins] (master) C: -1; - https://gerrit.wikimedia.org/r/32837 [23:20:32] New review: Jeroen De Dauw; "Make a follow up then or so - idk if it should be removed or not, and it's not the issue I want to t..." [integration/jenkins] (master) C: 0; - https://gerrit.wikimedia.org/r/32837 [23:24:03] New review: Aude; "I'll remove the -1 but note there is a problem with the settings file." [integration/jenkins] (master) C: 0; - https://gerrit.wikimedia.org/r/32837