[00:04:05] [WikibaseDataModel] thiemowmde pushed 1 new commit to from_numeric_id: http://git.io/epotJg
[00:04:05] WikibaseDataModel/from_numeric_id ac13d9e thiemowmde: Update EntityId.php
[04:25:47] hi folks, i have a 200G dataset, and I would like to count the number of entries for each userid, I would do this in R normally, but this file is too large for that. any ideas on where to start with python?
[04:41:30] https://www.wikidata.org/wiki/Wikidata:Project_chat#Wikiquote_notability
[04:41:31] go comment!
[08:41:50] http://ultimategerardm.blogspot.nl/2014/03/wikidata-replication-to-labs-works-again.html :)
[08:41:55] the dailies do not work yet
[09:08:57] [puppet-wikidata-test] filbertkm created updates (+1 new commit): http://git.io/Av1bmg
[09:08:57] puppet-wikidata-test/updates c811398 Aude: add check that ExtensionMessages.php exists...
[09:11:44] aude: any idea how I can pull from the puppet-jenkins repo on the new jenkins instance?
[09:11:51] what user should I use?
[09:13:47] no idea how we set it up
[09:14:32] aude: hm.. have to ask addshore
[09:14:39] https://github.com/wmde/puppet-jenkins/blob/master/README.md
[09:14:50] jenkins user, it seems
[09:22:17] aude: hm.. I don't have a password for the jenkins user
[09:22:35] you can sudo as jenkins
[09:22:50] if I do sudo, I'm asked for the password
[09:22:52] * aude doubt he has a password
[09:23:03] your password, although it shouldn't ask
[09:23:15] sudo -s
[09:23:18] then sudo jenkins
[09:24:18] as jenkins user: sudo git pull
[09:24:29] no idea
[09:24:32] I get: [sudo] password for jenkins:
[09:24:51] things might be setup differently in new instance
[09:25:06] aude: I got this on the old instance too
[09:25:11] we had a cron job (under jenkins) to update
[09:25:52] need to wait for addshore
[09:26:07] seems there is no cronjob or it is not working
[09:26:17] files are still from march 14th
[09:26:29] although I changed them on the repo yesterday
[09:26:40] looks like everything is owned by root
[09:27:03] aude: if I try git pull with root it asks me for github password
[09:27:10] oh, it needs deploy key
[09:27:23] then setup ssh://git.... as a remote
[09:27:25] and use it
[09:27:28] aude: I'm wondering how addshore pulled
[09:28:25] no idea
[09:28:32] maybe just cloned it
[09:28:37] aude: the repo has a deploy key entered. and it is from jenkins..
[09:30:24] if the key is set in ssh config
[09:30:38] then using ssh://git remote should work
[09:31:19] addshore would know
[09:32:07] aude: giving up for the moment
[09:32:12] installing rvm now
[10:22:54] oh no! aude Tobi_WMDE the build tests failed!
[10:22:58] https://gerrit.wikimedia.org/r/#/c/119254/
[10:23:01] no!
[10:23:07] jslint looks eww and 2 phpunit test fails
[10:23:50] my patches fail
[10:23:57] only in the new jenkins
[10:28:39] The enWS move has a message that WD will update following the move, this isn't happening, so is this meant to be a local bot job?
[10:28:55] sDrewth: ? it should be automatic
[10:29:00] no bot
[10:29:20] it doesn't seem to happen and I cannot work out where to track it
[10:29:28] or where it tracks
[10:29:37] * aude try on test2
[10:29:48] that is a cryptic comment
[10:29:59] see if i can confirm
[10:30:12] do you have a particular page to look at?
[10:30:13] oh, you are trying on test 2
[10:30:36] I have moved a couple today, but have updated them since
[10:30:49] I will have more later, what state do you want them?
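The 200G per-userid question above (04:25) never gets an answer in the log. The usual approach is a single streaming pass that keeps a running count per user id instead of loading the whole file; in Python this is a few lines with collections.Counter over the open file. A minimal sketch of the same idea, shown in PHP for consistency with the rest of this log's code; the file name dataset.tsv and the assumption that the user id is the first tab-separated column are hypothetical:

```php
<?php
// Sketch only: stream a large delimited file once and count lines per user id.
// Memory use grows with the number of distinct ids, not with the file size.
$counts = array();
$handle = fopen( 'dataset.tsv', 'r' );

while ( ( $line = fgets( $handle ) ) !== false ) {
	$fields = explode( "\t", rtrim( $line, "\n" ) );
	$userId = $fields[0]; // assumed: user id is the first column

	if ( !isset( $counts[$userId] ) ) {
		$counts[$userId] = 0;
	}
	$counts[$userId]++;
}

fclose( $handle );

arsort( $counts ); // most active users first
foreach ( $counts as $userId => $count ) {
	echo "$userId\t$count\n";
}
```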
[10:31:05] i'm trying to remember my password for test2
[10:31:08] need a minute
[10:40:45] sDrewth: it seems to work but there could still be an issue
[10:41:07] if you can give examples, that would help
[10:41:21] define, give an example?
[10:41:33] which author page at enws didn't move?
[10:41:50] https://test2.wikipedia.org/w/index.php?title=Parrot&action=history
[10:41:59] https://test.wikidata.org/w/index.php?title=Q356&curid=858&action=history
[10:42:15] a wikisource page that moved
[10:42:36] sure, looking
[10:51:13] addshore: daily? :)
[10:56:01] aude: I will undertake some live operations, and not fix, and record that data
[10:56:15] what would be the expected time scale for WD to update data?
[11:00:13] sDrewth: should be a minute, but maybe the job queue backlog is longer
[11:00:54] DanielK_WMDE_: watchlist patch: https://gerrit.wikimedia.org/r/#/c/118458/
[11:10:36] sDrewth: my guess is maybe the issue is with pages in the author namespace
[11:10:45] maybe not handled correctly by wikidata
[11:11:25] okay, I will have one in five minutes or so
[11:11:29] ok
[11:11:51] i can see the job queue log and see wikisource, but only very few log entries
[11:11:54] and main namespace
[11:14:00] aude: page moved
[11:14:04] great
[11:14:08] 11:13, 18 March 2014 Billinghurst (Talk | contribs | block) m . . (354 bytes) (0) . . (Billinghurst moved page Author:Joseph Hatton to Author:Joseph Paul Christopher Hatton: expand name)
[11:15:04] i don't see anything yet
[11:15:13] i just moved a page in test2 and see that in the log
[11:15:53] i'll open a bug ticket
[11:16:03] I am happy to do it
[11:16:45] I will leave it ten minutes or so, and update it manually
[11:16:55] ok
[11:17:13] basically, wikidata never gets notified (while it should)
[11:18:06] so it is a push notice on a move?
[11:18:16] yes
[11:18:21] i wasn't sure whether it was push or pull
[11:50:31] Lydia_WMDE : Though my proposal is incomplete till now, will u please go through the deliverables and synopsis of my proposal and help me to improve it?
[11:53:13] shrees: can you please send me an email? i have meetings coming up but will try to go through it tonight
[11:53:28] yeah sure
[11:53:30] :)
[12:06:08] hoo: https://bugzilla.wikimedia.org/62779 (if you are areound)
[12:06:11] around*
[12:06:47] oh, will have a look later... got a lot of emails I have to skim through first
[12:06:51] ok
[12:50:31] [travis-ci] wikimedia/mediawiki-extensions-Wikibase/master/40dd50e : Thiemo Mättig The build was fixed. http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/21012756
[13:02:07] looks like my pidgin isn't motivated too much after all... back on irssi now :/
[13:16:47] aude: did you tricker that one per hand? https://test.wikidata.org/w/index.php?title=Q22&curid=203&diff=4902&oldid=4901
[13:16:52] * trigger
[13:29:36] Hello everyone, can I get a little feedback on my GSoC proposal: https://www.mediawiki.org/wiki/Wikidata_annotation_tool, if anything is left.
[13:32:47] aude: Lydia_WMDE: Guided tours is now live on Wikidata \o/ :)
[13:39:41] Anybody else having problems with editing in clients on master?
[13:43:12] DanielK_WMDE: --^
[13:43:33] hoo: yay :)
[13:43:36] in meeting atm
[13:43:46] ;)
[13:45:22] aude: --1
[13:45:25] * --^
[13:54:59] does nobody care about client editing?
[13:55:06] *sigh*
[14:31:27] * hoo wtfs hard
[14:31:44] ?
[14:32:00] Lydia_WMDE: Why should a job only have scalar parameters?
[14:32:20] What the heck did Aaron think we want to do? Calculate something? :P
[14:33:22] heh
[14:34:49] DanielK_WMDE: pin
[14:34:50] g
[14:35:10] he is in the sprint start meeting with everyone else
[14:35:13] MEH
[14:35:16] * meh
[14:35:21] How long?
[15:02:35] hoo: like what?
[15:02:53] * aude escaped from the meeting
[15:03:07] heh
[15:03:17] aude: What are you talking about, the client editing?
[15:03:31] what's the issue?
[15:03:53] It fatals for me, if I have a Lua invoke in the page I want to edit
[15:04:04] i have not tried that
[15:04:28] Please do :) (on master)
[15:04:39] The invoked Lua code has to use mw.wikibase of course
[15:07:27] we used lua on beta wikipedia and it's ok
[15:07:38] beta labs?
[15:09:51] http://en.wikipedia.beta.wmflabs.org/wiki/0.07172310128917514
[15:10:11] apparently lua does not work with the label, though
[15:10:25] no, there's a bug for that
[15:10:37] ok
[15:11:03] hoo: https://gerrit.wikimedia.org/r/#/c/119207/ might be an issue
[15:11:12] let me find the bug
[15:11:58] https://bugzilla.wikimedia.org/show_bug.cgi?id=62754
[15:12:02] [Tue Mar 18 14:49:02.491955 2014] [:error] [pid 14671] [client 127.0.0.1:44754] PHP Fatal error: Maximum function nesting level of '100' reached, aborting! in /var/www/html/mwt/wiki/extensions/Wikibase/vendor/composer/ClassLoader.php on line 266, referer: http://localhost/mwt/wiki/dewiki/index.php?title=Berlin&action=edit&oldid=161
[15:12:06] that's the bug I saw
[15:12:11] * error
[15:12:32] hmmm
[15:14:03] maybe I should set xdebug.max_nesting_level = 200 or so
[15:14:32] might not really be an endless loop but rather expected behavior (which is yikes)
[15:15:28] * hoo tries that
[15:16:19] ok, that seems to fix it
[15:17:02] reaching a nesting level of over 100 might be worrysome,
[15:17:06] * worrisome
[15:17:11] performance wise
[15:23:03] hoo: back now
[15:23:05] what's up?
[15:23:28] DanielK_WMDE: Any idea why Aaron restricted the new job specification stuff to use scalar values only
[15:23:38] I was WTFing about that really hard
[15:24:34] hoo: no, i don't know. didn't look that closely yet. perhaps "it seemed like a good idea at the time"?
[15:24:51] hoo: i think it should not allow objects, but it should allow arrays.
[15:25:03] which would mean you'd have to look inside the arrays and check for objects...
[15:25:04] DanielK_WMDE: At least strings... I mean wtf
[15:25:21] it doesn't allow strings?? whut?
[15:25:27] It only allows key => scalar (nested)
[15:25:47] but no other values than scalar or further arrays (with the same scheme)
[15:25:59] JobSpecification::validateParams
[15:26:13] I locally overrode that function and it worked like a charm
[15:26:45] hoo: err - strings are scalars.
[15:27:24] wasn't sure about php terminology wrt "scalars", but they are in perl (where I know the term from), and i just tried this:
[15:27:32] php > var_dump( is_scalar( "foo" ) );
[15:27:33] bool(true)
[15:27:36] so, strings are ok.
[15:27:48] what the heck
[15:28:27] ah, PHP is seeing scalar as a simple data type and not as a numerical value (like in maths)
[15:29:11] yes, same in perl.
[15:29:30] Oh right... in perl that's a more common thing
[15:29:36] in java, strings are objects, so i guess they wouldn't be called scalars... but java doesn't use the term.
[15:29:43] @wtf
[15:29:52] it's confusing anyway. wikipedia is telling me that "scalar" just means "variable":
[15:30:00] In computer programming, a variable or scalar is a storage location and an associated symbolic name (an identifier) which contains some known or unknown quantity or information, a value.
[15:30:09] https://en.wikipedia.org/wiki/Variable_%28computing%29
[15:30:20] not my intuitive definition, but whatever :)
[15:30:35] so, anyway. it seems aaron's code makes sense
[15:30:41] so can be anything... even a pointer, but not a null pointer, probably :P
[15:30:44] it disallows objects. which it should.
[15:30:53] But we're not using objects
[15:31:08] $params['entityId'] = $entityId;
[15:31:10] i hope we aren't
[15:31:12] ah, that might be the problem
[15:31:16] hehe...
[15:31:29] maybe update the code so it provides a better error message
[15:31:38] at least, include the key in the message
[15:31:49] Sounds sane
[15:32:23] DanielK_WMDE: Oh, one more thing... shall we overwrite JobSpecification to set the name in there or just do new JobSpecification( 'UpdateRepoOnMove', ... );
[15:32:40] that would be a 5 line class or so only hard coding that first param.
[15:36:02] DanielK_WMDE: https://gerrit.wikimedia.org/r/119298
[15:37:02] merged
[15:37:23] hoo: if there is only a single place where we construct it, there's no need for subclassing.
[15:37:31] Ok :)
[15:37:33] actually, there probably never really is
[15:37:44] Only thing left then are the unit tests...
[16:15:13] Let's totally beat the graph this time? http://wmde.wmflabs.org/t/wikidata-developers/2014-03-18/ ie stay below the green line
[16:17:20] yeah :D
[16:24:15] Lydia_WMDE: so if I finish an item I do what exactly? Rage to you? Set it to solved on bugzilla? Nothing?
[16:24:34] JeroenDeDauw: set it to fixed in bugzilla :)
[16:31:48] [WikibaseDataModelSerialization] JeroenDeDauw pushed 1 new commit to master: http://git.io/gpj-KA
[16:31:48] WikibaseDataModelSerialization/master 88ae656 jeroendedauw: Add link to README
[16:32:03] [WikibaseInternalSerialization] JeroenDeDauw created docs (+1 new commit): http://git.io/f0_qTQ
[16:32:03] WikibaseInternalSerialization/docs 8c0cd9a jeroendedauw: Add basic docs to README
[16:38:20] [WikibaseInternalSerialization] JeroenDeDauw opened pull request #16: Add basic docs to README (master...docs) http://git.io/Z236aA
[16:43:13] [travis-ci] wmde/WikibaseInternalSerialization/docs/8c0cd9a : jeroendedauw The build passed. http://travis-ci.org/wmde/WikibaseInternalSerialization/builds/21029625
[16:48:30] ahhh
[16:48:34] jenkins!!
[16:48:38] drives me crazy
[16:48:59] it sometimes just forgets to run some of the jobs..
[16:51:56] DanielK_WMDE: https://gerrit.wikimedia.org/r/119306
[16:52:22] Still have to test the two-way compatibility... old client -> new repo should work and vice versa
[16:54:18] aude: addshore: seems we're running out of space for jenkins on the new instance.. what did you do last time to solve this issue?
[16:55:35] move build storage to /mnt or some place (/data/something?)
[16:55:45] it's in the readme how we did
[16:55:58] aude: in what readme?
[16:56:04] puppet-jenkins
[16:56:42] aude: reading..
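The parameter validation discussed above (around 15:25-15:31) boils down to: job parameters may be scalars or arrays of scalars, nested arbitrarily, but never objects, so an EntityId object has to be turned into a plain string before it goes into $params. A rough sketch of that kind of check, not the actual JobSpecification::validateParams code:

```php
<?php
// Sketch only, not the MediaWiki implementation: arrays may nest,
// but every leaf value must be scalar (bool, int, float, string).
function paramsAreScalar( array $params ) {
	foreach ( $params as $value ) {
		if ( is_array( $value ) ) {
			if ( !paramsAreScalar( $value ) ) {
				return false;
			}
		} elseif ( !is_scalar( $value ) ) {
			return false;
		}
	}

	return true;
}
```

Under such a rule, $params['entityId'] = $entityId; with an EntityId object is rejected, while passing the id's string serialization (e.g. "Q42") goes through, which matches the problem spotted above.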
[16:56:56] addshore might have better idea
[16:57:41] aude: I think it worked out pretty well for the old instance
[16:57:45] ok
[16:57:52] we can even increase the time we keep builds
[16:57:58] Thiemo_WMDE would like that
[16:58:01] so I'm going to add that bit to the fstab
[16:58:23] that should be everything according to the readme
[16:58:27] damn
[16:58:35] ok
[16:58:45] DanielK_WMDE: The change from using an ItemId to using a string broke b/c
[16:58:45] i might have done something manual the first time
[16:59:27] that means that we can't do this in the same release :(
[16:59:52] Tobi_WMDE: we can see if the fstab thing is enough and reboot the instance
[17:00:04] aude: seems that the bit from the readme is already done..
[17:00:19] so, the fstab has already that entry
[17:00:20] aude: Did you branch yet?
[17:00:24] For wmf19
[17:00:30] hoo: dont think so
[17:00:46] There's something I really want in there to not block other stuff further up
[17:01:09] hoo: but we will do that soon
[17:01:26] I guess I can backport, if needed
[17:01:42] if it's a big issue or bug, backport is fine
[17:01:44] hoo: no
[17:01:56] we might use the build from today but then will backport some stuff
[17:02:11] hoo: can't this be made b/c?
[17:02:31] DanielK_WMDE: No, we can only make the old job forward compatible to the new data
[17:02:38] that's what I want to do
[17:02:38] aude: guess we need to make a new build though
[17:02:46] tests were broken in todays build
[17:02:52] we do
[17:02:57] hoo: oh, right
[17:03:16] todays build was not merged yet
[17:03:57] aude: no idea why jenkins still complaining about disc space, the fix which is mentioned in the readme is already in it seems..
[17:04:07] hmmm
[17:06:33] Thiemo_WMDE: i just updated https://gerrit.wikimedia.org/r/#/c/118277/
[17:06:46] hope you like it better, took me half of today to refactor it :)
[17:10:10] I've just enabled the jobs on the old jenkins instance again and disabled the new ones until we've solved the problems there.. no need to be blocked on a broken jenkins..
[17:10:15] ok
[17:10:27] i'll maybe be able to look tomorrow or tonight
[17:10:47] so, if someone has failures on one of his patches, just trigger a new patch to rerun the tests..
[17:11:20] DanielK_WMDE: hoo https://gerrit.wikimedia.org/r/#/c/119207/ we probably want in
[17:11:35] core thing, but i'm uneasy having new branch go out without it
[17:12:13] aude: won't the new branch go out at the same time as that?
[17:12:21] Or do we plan another rhythm?
[17:12:21] they will
[17:12:26] aude: probably the instance just needs to be restarted?
[17:12:34] the core thing can get in before the core branch is made, ideally
[17:12:39] could be that it wasn't after the fstab got changed?
[17:12:49] DanielK_WMDE: aude: https://gerrit.wikimedia.org/r/119312
[17:12:51] otherwise, concerned we'll see such issues as i reported in the bug report
[17:14:37] I don't see the issue, the core change has been merged and should go live just fine with our stuff, or am I missing something?
[17:14:49] has it been merged?
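The backward-compatibility concern around 16:58-17:02 (the job parameter changing from an ItemId to a string) is typically handled by letting the receiving side accept both formats during the transition, which is the "old repo and new client work together" behaviour tested a bit later in the log. A hedged sketch of that idea only, not the actual UpdateRepo code; the old-format array shape shown here is hypothetical:

```php
<?php
// Sketch only: tolerate both the old and the new 'entityId' job parameter
// format while old and new code are deployed side by side.
function extractEntityIdSerialization( array $params ) {
	$value = $params['entityId'];

	if ( is_string( $value ) ) {
		// New format: the plain id serialization, e.g. "Q42".
		return $value;
	}

	// Old format (assumed shape): some array representation of the
	// ItemId that used to be passed in directly.
	return $value['serialization'];
}
```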
[17:14:58] * aude is blind
[17:15:00] ignore
[17:15:03] Sure, I approved it earlier today
[17:15:10] no grrrrt
[17:15:40] maybe i should try to get grrrrt back
[17:15:48] I'd love that :)
[17:18:16] [WikibaseInternalSerialization] JeroenDeDauw created entity (+3 new commits): http://git.io/Ypu7YA
[17:18:16] WikibaseInternalSerialization/entity 571b2a0 jeroendedauw: Increase robustness of EntityIdDeserializer
[17:18:16] WikibaseInternalSerialization/entity 776f2ff jeroendedauw: Add EntityDeserializer and property test data
[17:18:16] WikibaseInternalSerialization/entity 2393800 jeroendedauw: Use EntityDeserializer in ReadEntitiesTest so properties are also covered
[17:18:58] [WikibaseInternalSerialization] JeroenDeDauw opened pull request #17: Add EntityDeserializer and add more integration tests and test data (master...entity) http://git.io/GuwV3Q
[17:28:27] hoo: https://gerrit.wikimedia.org/r/#/c/119312/
[17:29:10] looking
[17:30:59] Thiemo_WMDE: please to review https://github.com/wmde/WikibaseInternalSerialization/pull/17 (very similar to what you looked at yesterday)
[17:34:07] http://wdjenkins.wmflabs.org/ci/job/wikibase-build-browsertests-sauce/27/ : FAILURE wtf
[17:34:26] DanielK_WMDE: Updated and replied
[17:43:48] oh, the bot was here
[17:46:23] still not workin
[17:46:24] g
[17:46:51] is it able to connect to the ssh stream from gerrit?
[17:47:12] aude: WikidataJenkins says "Aborted"?
[17:48:31] [WikibaseDataModelSerialization] JeroenDeDauw pushed 1 new commit to master: http://git.io/auPwkQ
[17:48:31] WikibaseDataModelSerialization/master 1453e23 Jeroen De Dauw: Update composer.json
[17:48:35] or is that just what it says if there is a merge conflict?
[17:57:46] [WikibaseInternalSerialization] thiemowmde comment on pull request #17 2393800: Both "@return Deserializer" and "@return PropertyDeserializer" are true. In my opinion being more specific is better. Am I wrong? http://git.io/elOotg
[17:59:05] DanielK_WMDE: Ok, two way compatibility works fine for me :)
[17:59:46] * hoo tests a little further
[18:01:12] ok, works flawless
[18:06:18] hoo: two way compat? in what way?
[18:06:35] you mean, just support the old and the new way on the receiving end? sure.
[18:06:41] DanielK_WMDE: Old repo and new client work together (and vice versa)
[18:08:13] * aude kicks the bot
[18:08:18] hope it works this time
[18:08:35] you could test it by approving my change *hides*
[18:16:43] poor WikidataJenkins )-;
[18:28:43] aude: https://gerrit.wikimedia.org/r/#/c/118458/
[18:30:35] gah, about to edit the commit message
[18:33:23] aude: why are you updating the commit msg all the time.. that is just stashing more jobs to the queue..
[18:33:40] see http://wikidata-jenkins.wmflabs.org/ci/
[18:33:53] sorry, to get gerrit bot back
[18:34:03] just keep calm and wait until the jobs are finished.. that takes about 7 minutes for every patchset
[18:34:04] should update non wikibase jobs
[18:34:05] Tobi_WMDE: is jenkins totally bogged down, or just dead?
[18:34:10] it doesn't seem to do anything
[18:34:27] see http://wikidata-jenkins.wmflabs.org/ci/
[18:34:35] why can't they run in parallel?...
[18:34:37] cancel mine
[18:34:53] DanielK_WMDE: they cant
[18:35:10] aude: thats hard to figure out which are the correct ones
[18:35:25] I guess you just have to wait
[18:35:25] [WikibaseDataModelSerialization] JeroenDeDauw created newserialization (+1 new commit): http://git.io/1mcEqg
[18:35:25] WikibaseDataModelSerialization/newserialization f35fc99 jeroendedauw: Make Component work with new version of the serialization library...
[18:35:35] [WikibaseDataModelSerialization] JeroenDeDauw opened pull request #46: Make Component work with new version of the serialization library (master...newserialization) http://git.io/1YIXtA
[18:35:37] why are you thinking it doesn't do anything, look at e.g. https://gerrit.wikimedia.org/r/#/c/119058/
[18:36:26] Tobi_WMDE: queue is empty of my stuff
[18:36:42] aude: but you also aborted the jobs for your last patch..
[18:36:48] thats bad
[18:36:57] which patch?
[18:37:04] Tobi_WMDE: because it took 35 minutes for it to do something :)
[18:37:12] aude: https://gerrit.wikimedia.org/r/#/c/118458/11
[18:37:18] that's fine
[18:37:24] i need to make a new patch
[18:37:31] not bad
[18:37:37] DanielK_WMDE: blame it on people updating the commit messages just for fun
[18:37:42] :P
[18:37:45] not for fun :)
[18:37:55] want grrrrt back? :)
[18:38:01] .oO(can't be so hard to have multiple instances run in parallel...)
[18:38:06] what gerrit?
[18:38:20] anyway, time to go home
[18:38:27] Lydia_WMDE : Did you go through my proposal?
[18:38:33] Tobi_WMDE: the grrrrit bot. it's gone.
[18:38:36] DanielK_WMDE: if we had multiple nodes, yes
[18:38:51] yea yea...
[18:39:04] DanielK_WMDE: is this just for us or for all the gerrit repos?
[18:39:23] shrees: i just got home. need to eat and will then look at it
[18:39:41] yeah ok thanx
[18:39:46] I always thought the gerrit bot is something that acts the same for all repos
[18:39:48] Tobi_WMDE: all
[18:39:52] ok
[18:40:04] so, this is not our issue alone
[18:40:16] no but i can fix it
[18:40:18] [Serialization] JeroenDeDauw created implement (+1 new commit): http://git.io/hq4WVA
[18:40:18] Serialization/implement 0a85eba jeroendedauw: Explicitly implement DispatchableDeserializer in TypedObjectDeserializer
[18:40:21] and I guess it wont come back by updating commit msgs
[18:40:31] it will come back by fixing it
[18:41:00] sure.
[18:41:05] [travis-ci] wmde/WikibaseDataModelSerialization/newserialization/f35fc99 : jeroendedauw The build failed. http://travis-ci.org/wmde/WikibaseDataModelSerialization/builds/21039266
[18:41:12] we can spam core next time.. :-P
[18:41:25] [Serialization] thiemowmde closed pull request #6: Explicitly implement DispatchableDeserializer in TypedObjectDeserializer (master...implement) http://git.io/hZDpWg
[18:46:08] [Serialization] JeroenDeDauw pushed 1 new commit to master: http://git.io/sONqyg
[18:46:08] Serialization/master 3cea2bb jeroendedauw: Bump to 3.1
[18:46:11] ok, I think we'll just use tomorrow's build as wmf19 branch then.. ok?
[18:46:25] [travis-ci] wmde/Serialization/implement/0a85eba : jeroendedauw The build has errored. http://travis-ci.org/wmde/Serialization/builds/21039636
[18:46:33] [Serialization] JeroenDeDauw pushed 1 new commit to master: http://git.io/9Ws2Wg
[18:46:33] Serialization/master 6e9a5c6 jeroendedauw: Run tests also with php 5.6
[18:48:32] DanielK_WMDE: multiple nodes for jenkins is actually something we want. and once we have puppetized most of the stuff it's easy to add new nodes.. but we're not there yet..
[18:48:35] [WikibaseDataModelSerialization] JeroenDeDauw force-pushed newserialization from f35fc99 to 5042df0: http://git.io/8t4KPw
[18:48:35] WikibaseDataModelSerialization/newserialization 5042df0 jeroendedauw: Make Component work with new version of the serialization library...
[18:49:31] DanielK_WMDE: and if you only have one node, unfortunately you can run the same job only once at a time. even if it were triggered by a different change..
[18:49:36] [WikibaseInternalSerialization] thiemowmde deleted entity at 2393800: http://git.io/UjhEnQ
[18:49:50] [WikibaseInternalSerialization] thiemowmde pushed 1 new commit to master: http://git.io/Jhhccw
[18:49:50] WikibaseInternalSerialization/master 169a847 thiemowmde: Merge pull request #16 from wmde/docs...
[18:52:25] addshore: ping, I can has a review?
[18:52:34] aude: https://gerrit.wikimedia.org/r/#/c/118292/
[18:53:13] aude: i like the refactoring you are doing btw. don't be misled by all the complaints :P
[18:57:10] thanks DanielK_WMDE :)
[18:57:29] The other change has to wait until we have branched... I should probably -1
[18:58:04] probably. currently reviewing that. looking nice so far.
[19:04:23] [travis-ci] wmde/Serialization/3.1/3cea2bb : jeroendedauw The build passed. http://travis-ci.org/wmde/Serialization/builds/21040096
[19:07:17] hoo: got time for a review?
[19:07:45] JeroenDeDauw: Yep
[19:08:35] hoo: https://github.com/wmde/WikibaseDataModelSerialization/pull/46
[19:09:17] That is mostly kicking out some not needed and wrongly implemented stuff (the is(Des/S)erializerFor methods)
[19:10:01] looking
[19:14:50] hoo: i commented on https://gerrit.wikimedia.org/r/#/c/119306/
[19:15:56] "why clone? shouldn't the service be stateless?" Indeed, but that for some reason didn't work... so I added that during testing
[19:16:15] If you can figure out why, I'd totally fix that
[19:18:08] hoo: hm, SiteLinkLookup is an interface, no need to do the disableOriginalConstructor thing. just use getMock().
[19:18:28] can't think of a reason why a clone would be needed, or how it would help...
[19:18:59] I can poke at that again, I guess... that was what I came up with somehow and then forgot about it as it didn't seem critical
[19:19:13] it's not, just makes me wonder
[19:19:28] If you have it checked out, just try it
[19:19:35] I mean remove and then run the client test
[19:22:04] hoo: it fails if i remove the "clone" for the second parameter (the SiteLinkLookup), complaining about the *third* parameter being null.
[19:22:07] WTF??
[19:22:36] DanielK_WMDE: Third parameter to the getSpecification function
[19:22:50] which calls out to the sitelinklookup in order to get an entityid
[19:24:36] JeroenDeDauw: Looks sane and well unit tested, but I don't think I'm able to do a qualified sign-off on it
[19:27:29] hoo: ok, thanks. Can you leave a +1 comment?
[19:27:59] oh, sure :)
[19:29:44] hoo: i don't know what is happening, but the issue does point towards a bug: getEntityIdForSiteLink *may* return null. that should be handled gracefully, not with a fatal error.
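A sketch of the defensive pattern the discussion here and just below converges on: do the sitelink lookup once, remember the result, and make callers treat null as "no connected item" instead of failing hard. Class and property names are illustrative only, not the actual Wikibase client code; getEntityIdForSiteLink() is the lookup method named above.

```php
<?php
// Illustrative sketch, not the real UpdateRepo implementation.
class UpdateRepoSketch {

	private $siteLinkLookup;
	private $siteLink;
	private $entityId = false; // false = not looked up yet, null = no item found

	public function __construct( $siteLinkLookup, $siteLink ) {
		$this->siteLinkLookup = $siteLinkLookup;
		$this->siteLink = $siteLink;
	}

	/**
	 * Looks the id up only once and caches the result, so the null check in
	 * the hook and the later job creation cannot see different answers
	 * (the race condition discussed just below).
	 */
	public function getEntityId() {
		if ( $this->entityId === false ) {
			$this->entityId = $this->siteLinkLookup->getEntityIdForSiteLink( $this->siteLink );
		}

		return $this->entityId;
	}

}
```

Callers would then bail out explicitly when getEntityId() returns null, as the client hook quoted below already does, rather than letting the job specification fail hard on a null id.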
[19:30:35] * aude shall start naming all methods after kittens :)
[19:30:41] DanielK_WMDE: Thought about that as well, but our only caller (the client hook) already checks for the return value of the function
[19:30:44] then the reviewer can suggest a name
[19:31:10] public function aCuteLittleKitten()
[19:32:38] [WikibaseDataModel] thiemowmde pushed 1 new commit to from_numeric_id: http://git.io/WH2XZw
[19:32:38] WikibaseDataModel/from_numeric_id b066021 thiemowmde: Update EntityId.php
[19:32:42] hoo: createJob() calls getEntityId(), which may return null. But the value is never checked, it's passed directly to getJobSpecification, which will fail hard if the itemId is null.
[19:33:18] DanielK_WMDE: the client hooks check that it's !== null
[19:33:28] if ( !$updateRepo || !$updateRepo->getEntityId() || !$updateRepo->userIsValidOnRepo() ) {
[19:33:34] hoo: "it" being what, exactly?
[19:33:34] it then returns
[19:34:02] oh, we are calling the database multiple times?...
[19:34:13] it's still a race condition
[19:34:22] mh, why?
[19:35:19] because the entry could be removed between the time we check, and the time it gets used
[19:35:38] it's external state. we can't make assumptions about it.
[19:35:46] DanielK_WMDE: https://github.com/wmde/WikibaseDataModelSerialization/pull/46
[19:35:53] we could just get the value once, then cache it. that would work :)
[19:36:14] DanielK_WMDE: I can make that function cache the value
[19:36:29] thought about that before to save a lookup but I thought it might already be cached
[19:36:43] probably not worth it for performance only
[19:41:15] hoo: not for performance, but to avoid race conditions
[19:41:18] Lydia_WMDE : Are you busy?
[19:41:31] hoo: sitelink lookups are not cached as far as I remember
[19:41:49] Will change that, then :)
[19:41:55] would be nice to cache them
[19:45:43] ok, it's certainly not cached... not sure it's worth adding an in process cache
[19:47:54] JeroenDeDauw: http://www.omfgdogs.com/
[19:48:28] addshore: yes
[19:48:46] addshore: Lydia_WMDE has approved this as our new wikidata.org homepage
[19:48:56] ;>
[19:49:00] Sounds legit
[19:49:04] * hoo deploys that
[19:49:08] :>
[19:52:21] [WikibaseDataModelSerialization] brightbyte comment on pull request #46 5042df0: Why call this on objects that don't even have the isDeserializerFor() method? Why support this? http://git.io/m1Ykhg
[19:56:11] [WikibaseDataModelSerialization] brightbyte closed pull request #46: Make Component work with new version of the serialization library (master...newserialization) http://git.io/1YIXtA
[19:56:36] [WikibaseDataModelSerialization] JeroenDeDauw deleted newserialization at 5042df0: http://git.io/gI3txQ
[20:02:42] DanielK_WMDE: mh... maybe I could implement createJob in the abstract base class and then have a getJobName and a getParameters method or so?
[20:04:56] hoo: ok, so that mock object thing is just *odd*. what the fuck??
[20:05:01] giving up on that for now.
[20:05:32] I played with it before and couldn't get anywhere either... I even let it run 10 times in a loop and it worked every time... but in the object it just fails...
[20:06:28] it's very strange indeed.
[20:06:57] leaving it for now. but perhaps make a big HERE BE DRAGONS comment
[20:07:30] i recall seeing comments like "for some reason, removing this comment causes a core dump!"
[20:07:31] yeah, will do... if it were in production code I'd be more worried, though
[20:07:33] gotta love php
[20:10:08] hoo: oh, we could revive this one, once the job handler is in the repo: https://gerrit.wikimedia.org/r/#/c/88440/
[20:10:45] good idea, forgot about that one already :P
[20:11:01] me too, i was just going through old patches :)
[20:11:07] want to pick it up? it's trivial enough
[20:11:20] will do that after I got the current version straight :)
[20:11:30] excellent!
[20:15:21] [WikibaseInternalSerialization] JeroenDeDauw created SerializerFactory (+1 new commit): http://git.io/Z-N61w
[20:15:21] WikibaseInternalSerialization/SerializerFactory bee1d30 jeroendedauw: Add SerializerFactory which just delegates to the DataModel Serialization component
[20:17:33] [travis-ci] wmde/WikibaseInternalSerialization/SerializerFactory/bee1d30 : jeroendedauw The build passed. http://travis-ci.org/wmde/WikibaseInternalSerialization/builds/21046133
[20:21:31] Tobi_WMDE: http://wdjenkins.wmflabs.org/ci/job/wikibase-repo-tests/198/console what's going on here?
[20:25:56] wait... what wdjenkins.wmflabs and wikidata-jenkins.wmflabs...
[20:27:24] wdjenkins is the new one but not working 100% correct
[20:27:32] we switched back to the old one for now
[20:27:39] old = wikidata-jenkins
[20:27:50] Ok, can't we shut up the not working one then? That's confusing...
[20:27:58] we did
[20:28:07] is it still reviewing?
[20:28:35] aude: Yes https://gerrit.wikimedia.org/r/119306
[20:28:42] huh
[20:28:48] i thought it was disabled
[20:30:23] nope :/
[20:30:29] i don't have login there
[20:30:35] addshore: Tobi_WMDE ?
[20:47:11] [WikibaseInternalSerialization] JeroenDeDauw opened pull request #18: Add SerializerFactory which just delegates to the DataModel Serializatio... (master...SerializerFactory) http://git.io/98fKCg
[20:47:15] Thiemo_WMDE: easy review ^
[20:47:25] * aude going to restart wdjenkins
[20:47:29] reboot*
[21:05:15] (CR) WikidataJenkins: "Build Failed" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/119392 (owner: L10n-bot)
[21:07:14] the bot is back!
[21:08:36] (CR) WikidataJenkins: "Build Successful" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/119392 (owner: L10n-bot)
[21:09:52] \o/ :)
[21:26:59] [WikibaseInternalSerialization] thiemowmde deleted SerializerFactory at bee1d30: http://git.io/rG1cRw
[21:28:25] [WikibaseInternalSerialization] JeroenDeDauw created narrow (+2 new commits): http://git.io/xMRbyA
[21:28:25] WikibaseInternalSerialization/narrow 54c30db jeroendedauw: Prefix legacy deserializers with "Legacy" to avoid name comflicts
[21:28:25] WikibaseInternalSerialization/narrow 7a33756 jeroendedauw: Limit exposed interface of DeserializerFactory and add todos...
[21:28:40] [WikibaseInternalSerialization] JeroenDeDauw opened pull request #19: Limit exposed interface of DeserializerFactory and add todos (master...narrow) http://git.io/beFY0g
[21:36:35] [travis-ci] wmde/WikibaseInternalSerialization/narrow/7a33756 : jeroendedauw The build passed. http://travis-ci.org/wmde/WikibaseInternalSerialization/builds/21051389
[21:41:14] Hey Lydia_WMDE and JohnLewis
[21:41:21] Hey twkozlowski
[21:41:35] Belated congratulations on 100+ issues of your weekly Wikidata newsletter!
[21:41:47] Yeah :)
[21:41:56] I think it's 102 on Friday.
[21:47:09] :)
[22:32:11] (CR) Aude: [C: -1] "nitpicks" (7 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/118313 (owner: Daniel Kinzler)
[23:56:18] Hi.
[23:57:57] If a wiki has multiple variants, which variant is used for the sitelink?
[23:58:10] How is it chosen?