[04:17:11] PROBLEM - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1423 bytes in 0.129 second response time
[04:48:10] RECOVERY - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is OK: HTTP OK: HTTP/1.1 200 OK - 1416 bytes in 0.112 second response time
[10:18:48] benestar: Script still not working. :(
[12:14:50] benestar: merge all the things? ;)
[12:15:07] addshore: looking into the test which looks a bit strange
[12:15:54] addshore: why did 'invalid-claim' change to 'modification-failed'?
[12:18:43] * benestar thinks he found the reason *facepalm*
[12:19:02] yep
[12:19:14] :P
[12:21:01] addshore: kill kill kill, then merge merge merge :P
[12:21:25] *facepalm, why kill the method? :P
[12:22:17] don't clutter WikibaseRepo
[12:22:25] there is a getInternalSerializerFactory method, to avoid confusion there should also be a getSerializerFactory, or there should be none!
[12:22:50] * I meant getStatementDeserializer and getInternalStatementDeserializer
[12:23:12] If you want in a followup I will make getInternalDeserializerFactory and getDeserializerFactory public
[12:23:22] and get rid of those methods
[12:23:28] oh, I see
[12:23:39] that's a good point, let's fix it in another patch then
[12:24:01] :D
[12:24:07] meeeeeeerged :D
[12:24:09] YAY
[12:24:15] *waits for the train*
[12:24:16] I think about 6 patches get submitted now xD
[12:24:27] yup
[12:24:47] poor jenkins
[12:24:47] though, see my comment on https://gerrit.wikimedia.org/r/#/c/227425/ which you should also +2 ;)
[12:25:23] and probably https://gerrit.wikimedia.org/r/#/c/227427/ ;)
[12:26:16] gah, it's always evil to use c&p as an argument for bad code but you're right :P
[12:26:45] ;)
[12:29:54] 2 merged, 6 queued to go
[12:29:56] mwhahahaaa
[12:30:00] addshore: \o/
[12:30:14] then just 7 to go until the one that removes all of the code
[12:30:23] * benestar should've required a PR on WikibaseDataModelSerialization before merging
[12:30:33] mwhahahaaa, nah ;)
[12:30:55] addshore: should we follow the design in the serialization component and split the serializer and deserializer?
[12:31:02] looks stupid imo
[12:31:28] yeh, we may as well :P
[12:31:35] *lets you make the PR ;)
[12:31:52] heh :P
[12:32:02] addshore: first merge https://github.com/wmde/WikibaseDataModel/pull/513
[12:32:09] + https://github.com/wmde/WikibaseDataModel/pull/529
[12:34:22] mhhhm, FingerprintHolder
[12:48:26] benestar: also this https://gerrit.wikimedia.org/r/#/c/225933/
[13:50:18] addshore: rebase hell? =o
[13:50:23] ya
[13:50:33] eating now though
[15:06:42] benestar|cloud: :(
[15:41:45] * aude waves
[15:50:26] * addshore waves
[15:50:31] * addshore is about to pop out
[15:54:53] ok
[18:00:12] * sjoerddebruin wants to do some work, but a script is broken. :(
[18:06:01] fix it then
[18:06:02] duh
[18:09:44] Reedy: Tried, can't guess why it's not working.
[18:09:53] js/
[18:09:55] *js?
[18:10:00] Jup.
[18:10:45] https://www.wikidata.org/wiki/User:Bene*/DuplicateReferences.js
[18:11:07] Asked benestar|cloud to look at it, but he's still in his inbox I think.
[18:12:42] I think it has something to do with the fact you can add sources now when you add a new statement.
[18:13:05] Anything in your js console?
[18:13:34] TypeError: undefined is not an object (evaluating 'statementview._referencesChanger.setReference')
[18:14:28] It was working before this week's deploy.
[18:19:11] Probably an easy fix
[18:19:48] Yup, but I can't see it.
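The TypeError at 18:13 suggests that DuplicateReferences.js reads a private member of the statementview widget that no longer exists after the deploy. Below is a minimal sketch of that failure mode, not code from the actual script: apart from _referencesChanger and setReference, which come from the error message, every name is an illustrative stand-in.

    // Illustrative reconstruction of the breaking access pattern (assumed, not
    // copied from DuplicateReferences.js). "statementview" stands for whatever
    // widget instance the script already holds; how it obtains that instance
    // is outside this sketch.
    function copyReferenceTo( statementview, reference ) {
        // Before the deploy the widget carried a private _referencesChanger member.
        var changer = statementview._referencesChanger;

        // After the deploy that member is gone, so "changer" is undefined and the
        // next call throws:
        // TypeError: undefined is not an object (evaluating 'statementview._referencesChanger.setReference')
        changer.setReference( reference );
    }

Relying on an underscore-prefixed member ties the script to Wikibase widget internals, which is why an otherwise routine deploy can break it.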
[18:27:37] hi sjoerddebruin
[18:27:46] Hey aude
[18:27:58] * aude takes a look
[18:33:10] Did they rename something?
[18:35:20] is there something which can make a list including qualifiers using up to date data?
[18:36:31] I was hoping listeria could, but it doesn't work unless you also know the value of the property that has a qualifier
[18:37:24] all the sparql things I've tried have old data, and as far as I can tell quarry doesn't have statements (if it does, I'd love to know where)
[18:37:56] and tabernacle doesn't seem to support qualifiers either
[18:44:53] addshore: you keeping track of this? https://gerrit.wikimedia.org/r/#/c/227427/
[18:59:18] sjoerddebruin: i'm not totally sure how duplicate references is supposed to work :/
[18:59:28] i'm sure benestar|cloud can fix it though
[18:59:29] Yikes.
[18:59:49] "insert references" is grayed out
[19:00:11] You need to copy first.
[19:00:22] oh, i see
[19:00:25] Then you can insert, but it's staying on "saving..."
[19:00:35] yeah
[19:00:48] Cannot read property 'setReference' of undefined
[19:07:12] sjoerddebruin: https://www.wikidata.org/w/index.php?title=User%3AAude%2FDuplicateReferences.js&type=revision&diff=238230540&oldid=238230503 works for me
[19:07:19] but not sure that's the best approach
[19:07:30] As long as I can work faster again...
[19:07:42] ok :)
[19:08:11] _referencesChangers got removed from statementview but still is available in options
[19:10:11] Okay, can you update the other script? It's used by others.
[19:22:10] addshore: I turned your chain into a tree https://gerrit.wikimedia.org/r/#/c/227436/17 muhahahaha
[19:27:16] sjoerddebruin: i am not an admin, so don't think i can edit it
[19:27:33] Oh?
[19:31:48] Done
[19:32:44] ok
[19:34:42] Should discuss with benestar|cloud what needs to be done before we can convert it to a gadget.
[21:16:18] SMalyshev: regarding query.wikidata.org do we use the misc cluster and then send the traffic to one server?
[21:16:41] jzerebecki: not sure I understand, what is misc cluster?
[21:20:27] SMalyshev: so there are different load balancers / cache clusters, one of them called misc, which serves small random things like phabricator
[21:21:31] jzerebecki: yes, that'd be fine. I'm not sure if I want traffic on both servers or just one but given we don't have much of it anyway, doesn't matter too much I guess
[22:38:28] benestar|cloud: reviewwwww https://gerrit.wikimedia.org/r/#/c/227436/17
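Regarding aude's note at 19:08 that the references changer was removed from statementview but is still available in the widget's options: the following is a minimal sketch of that workaround, assuming the option key is called referencesChanger and that the widget exposes a plain options object (neither detail is confirmed by the log).

    // Hedged sketch of the 19:08 workaround: use the old private member if it is
    // still there, otherwise fall back to the widget's options. The option key
    // "referencesChanger" is an assumption.
    function getReferencesChanger( statementview ) {
        if ( statementview._referencesChanger ) {
            // Pre-deploy layout: private member still present.
            return statementview._referencesChanger;
        }
        // Post-deploy layout: the changer reportedly lives in the options.
        return statementview.options && statementview.options.referencesChanger;
    }

Callers would then obtain the changer through a helper like this instead of touching the private member directly.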