[09:16:02] This is really annoying, as this stuff shows up in the apps etc. https://www.wikidata.org/w/index.php?title=Q12078&action=history
[10:22:45] DanielK_WMDE: there?
[10:24:05] hoo: hey
[10:24:17] hoo: is it just me, or do the external identifiers no longer have suggestions for sources?
[10:24:59] DanielK_WMDE: Regarding the don't-fall-back-to-master thing ( T108929 )
[10:25:00] T108929: [Task] only fallback to master in WikiPageEntityMetaDataLookup when explicitly requested - https://phabricator.wikimedia.org/T108929
[10:25:28] I created a somewhat premature patch that introduces EntityRevisionLookup::LATEST_FROM_SLAVE_WITH_FALLBACK
[10:27:02] DanielK_WMDE: Now I'm struggling a little… look at ChangeRunCoalescer, it uses EntityRevisionLookup::getEntityRevision with a revision id as second parameter
[10:27:39] I don't want to make that interface even weirder by making it getEntityRevision( $entityId, $revisionIdOrMode, $mode )
[10:27:45] or something along these lines
[10:27:47] DanielK_WMDE: I summarized the Quantity stuff: https://phabricator.wikimedia.org/T115269#2511555
[10:28:33] sjoerddebruin: hm, could be due to the fact that we remove all external identifiers from the suggester's data basis
[10:28:40] hoo: if we have the case that we want to load a specific revision, and we usually want to do that from slave, but sometimes we want to fall back, then you will have to add a parameter, i'm afraid. i see no way around that
[10:28:59] DanielK_WMDE: Ok, will have to do that :/
[10:29:10] hoo: ChangeRunCoalescer can't just use the latest revision, that wouldn't work
[10:29:25] What do you think about my approach to test the fallbacks, btw? I mocked DatabaseBase to omit rows :P
[10:29:32] To simulate lag
[10:29:34] Thiemo_WMDE: yay, more pull requests!
[10:29:37] was the best I could come up with
[10:30:28] hoo: i haven't looked at that in detail, sounds scary :P
[10:30:56] but yea, probably the best you can do. it's a database-level thing, it has to be emulated on the database level
[10:30:59] https://gerrit.wikimedia.org/r/#/c/302199/1/repo/tests/phpunit/includes/Store/Sql/WikiPageEntityMetaDataLookupTest.php
[10:31:52] hoo: always dropping the first row is a bit obscure, but should work here (we are only expecting one row anyway, right?)
[10:32:38] DanielK_WMDE: The test is loading data for four entities (one of them doesn't exist)
[10:32:58] and we test that it goes to the master after trying the slave to load the two missing entities (one of them it can actually load)
[12:17:53] aude: ready
[12:17:56] coming?
[12:18:21] Lydia_WMDE: here
[12:18:33] :3
[12:18:39] aude: yay - i am in the hangout
[12:18:43] for the sprint start
[12:19:14] i must be in the wrong hangout
[12:19:22] aude: https://hangouts.google.com/hangouts/_/wikimedia.de/wikidata-story?authuser=0
[13:28:26] Lydia_WMDE: can I drop by to cowork tomorrow? The rain is a great encouragement for indoor activities :-)
[13:32:22] dachary: yeah totally
[13:32:26] when?
[13:34:13] 10am? I'm not sure what the custom is :-)
[13:34:29] Lydia_WMDE: any time during the day works for me.
[13:34:41] 10 sounds good :D
[13:36:05] Great!
[13:42:08] Here is a thread related to Wikidata & https://www.softwareheritage.org/ for people with a focus on software: https://sympa.inria.fr/sympa/arc/swh-devel/2016-08/msg00000.html
[14:13:09] Lydia_WMDE: Thiemo_WMDE https://gerrit.wikimedia.org/r/#/c/298752/
[14:13:40] thanks Katie!
[14:14:38] i think up to there can be merged
[14:14:49] thiemo had some issues with https://gerrit.wikimedia.org/r/#/c/298749/7..9/view/resources/wikibase/view/ToolbarViewController.js but i think that can be done as a follow-up
[14:15:32] otherwise we skip the deploy again this week, and i'm not sure about my availability next week :/
[14:16:29] ok
[14:57:37] !panic
[14:57:37] https://dl.dropboxusercontent.com/u/7313450/entropy/gif/omgwtf.gif
[15:00:03] :O
[15:49:08] Lydia_WMDE: Thiemo_WMDE do you need help with reviewing anything?
[15:49:54] Thiemo_WMDE: ^
[15:50:12] Thiemo_WMDE: do you think you can do it today? otherwise we'll not be able to tag this week
[15:50:21] and next week is probably not happening either
[15:50:33] * aude is moving next week :)
[15:50:36] again
[15:51:34] I think all the relevant patches have +2. I just don't understand how to merge them. +2 sometimes does not trigger the merge job.
[15:51:53] it's dependency hell
[15:51:55] for gerrit
[15:52:01] aude: What's wrong here? I don't see it. https://gerrit.wikimedia.org/r/298750
[15:52:16] i think we need https://gerrit.wikimedia.org/r/#/c/298750/10
[15:52:53] I could just click the submit button, but I learned this is bad bad bad.
[15:53:02] * aude tries +2
[15:53:28] https://integration.wikimedia.org/zuul/
[15:53:37] looks to be doing something
[15:54:15] I need to go soon. These 3 must go in: https://gerrit.wikimedia.org/r/298750 https://gerrit.wikimedia.org/r/298751 https://gerrit.wikimedia.org/r/298752
[15:54:18] ok
[15:54:36] or it might need a manual rebase
[15:55:12] Rebases are all already done.
[15:55:17] ... for these 3.
[15:55:34] ok
[16:04:09] aude: 2 are in. i retriggered the 3rd by removing and re-adding my +2.
[16:04:16] thanks
[17:32:05] Is Sumit here?
[19:24:33] aude: master should be fixed and ready to deploy
[19:44:33] Jonas_WMDE: \o/
[19:44:47] thanks for helping review, rebase and amend adrian's patches :)
[19:46:44] thanks katie for deploying all the kittens :)
[19:48:46] :)
[20:16:01] \o/
[20:16:16] thanks aude and Jonas_WMDE
[20:16:21] and Thiemo
[20:29:58] is there a limit on the number of bytes or lines that query.wikidata.org can take as a query?
[20:30:28] i'm hitting ERROR: couldn't contact server and it's getting unnerving :D
[20:38:24] hi aude :)
[20:38:37] Thanks for taking care of the branch
[20:49:29] hoo: it seems that I have reached the Lua limit :( https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples
[20:50:01] Indeed :/
[20:50:45] oh dear
[20:51:13] Can't we have a special namespace on Wikidata itself?
[20:51:27] Guess it's about time we split that page up :(
[20:51:33] sjoerddebruin: I'd love that
[20:52:01] Jonas_WMDE: Probably best for now to split it up by topic (into subpages) and to just load all the subpages in the UI then
[20:52:11] probably not too hard to do, there's an API for getting subpages
[20:57:09] SMalyshev: your fix seems to help, a deleted item doesn't show up in my query anymore. :)
[20:57:09] hoo: I guess I am too lazy to do that - interestingly, it is working on https://www.mediawiki.org/wiki/Wikibase/Indexing/SPARQL_Query_Examples
[20:57:56] sjoerddebruin: cool! if you see anything else misbehaving, please tell
[20:58:02] will do
[20:59:54] $ curl 'https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples' 2>/dev/null | grep -i 'expensive parser function count: '
[20:59:54] Expensive parser function count: 573/500
[21:00:18] $ curl 'https://www.mediawiki.org/wiki/Wikibase/Indexing/SPARQL_Query_Examples' 2>/dev/null | grep -i 'expensive parser function count: '
[21:00:19] Expensive parser function count: 0/500
[21:00:23] That's interesting
[21:00:50] The fact that you can exceed the limit sometimes is https://phabricator.wikimedia.org/T93885
[21:01:04] But I wonder why it's 0/500 for the mediawiki.org page
[21:01:40] Ah, I see
[21:01:49] the template on Wikidata actually accesses Wikidata
[21:02:04] the one on MediaWiki.org just uses data given to it (so no expensive calls)
[21:02:07] that makes sense
[21:02:10] but still :/
[21:02:25] Above I meant https://phabricator.wikimedia.org/T106190
[21:02:35] hoo: ah, yes, {{Q}} on mediawiki is fake
[21:02:54] it doesn't really access wikidata.
[21:04:06] hoo: does the expensive function limit apply to Lua too? If not, I could make the template fetch names via Lua. The SPARQL2 template uses Lua anyway
[21:04:26] SMalyshev: Yes, and it's even enforced there, so you can't exceed the 500
[21:04:52] hoo: 500 item names per page? does it include repetitions?
[21:05:01] i.e. if two templates fetch the same name
[21:05:20] mw.wikibase.label is not expensive
[21:05:58] getEntity/getEntityObject is always expensive, unless you fetch the entity linked with the page in question
[21:06:16] So if you just use the label, mw.wikibase.label is the cheapest way to go
[21:06:30] hoo: so check out https://www.wikidata.org/wiki/Module:SPARQLMentions - does it do anything expensive?
[21:07:11] No, looks fine
[21:07:54] hoo: ok, so we could probably make some templates SPARQL2 then. The GUI can't handle them yet, but we can probably fix that
[21:08:13] so I wonder then why {{Q}} is expensive - isn't it the same as mw.wikibase.label?
[21:09:48] SMalyshev: Guess they load the whole item
[21:09:51] let me check
[21:10:52] yeah, they do
[21:11:51] ok, so it probably needs some work: either using a different template, or using SPARQL2, which also auto-calculates the properties/items used.
[21:12:06] Jonas_WMDE: ^^
[21:13:21] SMalyshev: Btw, your module could use https://www.mediawiki.org/wiki/Extension:Wikibase_Client/Lua#mw.wikibase.getEntityUrl
[21:13:29] which would make it independent from Wikidata
[21:14:14] (regarding the wrong heading on that doc page, fixed in https://gerrit.wikimedia.org/r/302606, just needs a +2)
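
(A rough sketch of the cheap pattern discussed above - illustrative only, not the actual {{Q}} template or Module:SPARQLMentions code. It uses mw.wikibase.label, which hoo notes is not an expensive call, together with the just-suggested mw.wikibase.getEntityUrl, instead of mw.wikibase.getEntity, which loads the whole item and counts against the 500-call limit. Module layout and function name are hypothetical:)

    -- Label-only lookup for a {{Q}}-style template.
    local p = {}

    function p.q( frame )
        local id = frame.args[1]                    -- e.g. 'Q42'
        local label = mw.wikibase.label( id )      -- cheap: fetches only the label
        local url = mw.wikibase.getEntityUrl( id ) -- cheap: only builds the entity URL
        if label and url then
            return '[' .. url .. ' ' .. label .. ']'
        end
        return id                                   -- fall back to the bare ID
    end

    return p
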
[21:16:12] hmm... I don't remember why I didn't use getEntityUrl. either it didn't work on mediawiki or I for some reason didn't like it...
[21:16:15] I'll check
[21:16:35] oh, it won't work on mediawiki.org
[21:16:41] it needs the Wikibase client
[21:16:53] it was also introduced fairly recently
[21:16:56] ah, so that's the reason - the module was first written for MW
[21:17:02] I'll check if the Wikidata one works
[21:17:13] * SMalyshev also wants cross-wiki templates
[21:17:55] yeah, that would be awesome to have
[21:18:04] it has been on the list since forever
[21:19:07] seems to be working fine with getEntityUrl
[21:19:25] https://www.wikidata.org/wiki/Module:SPARQLMentions
[21:21:11] yes, so SPARQL2 works: https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples#Sandwich_ingredients
[21:21:22] now we need to make the GUI work properly with it
[21:45:00] I'm calling it a day now… cu o/
[21:45:25] sleep well hoo <3
[21:45:33] thanks
[23:45:13] i want you to think of a teardrop. a teardrop that runs down someone's cheek. someone who tried to do it all in one fell swoop, and had a SPARQL query that could potentially do it, but is forced to run it three times over 97 groups of items. think of that teardrop ^^°
[23:45:42] (long story short, stuff's working, but will require much more manual work than expected :D)
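
(The batching lamented here typically looks something like the following - a hypothetical Lua helper, not the actual script used: it splits a long item list into VALUES groups so that each individual query stays within whatever size query.wikidata.org accepts, as with the "couldn't contact server" errors seen earlier:)

    -- Split item IDs into fixed-size chunks and build one VALUES clause per
    -- chunk; each clause is then substituted into the query template and the
    -- query is run once per batch, merging the results afterwards.
    local function valuesBatches( ids, batchSize )
        local clauses = {}
        for i = 1, #ids, batchSize do
            local chunk = {}
            for j = i, math.min( i + batchSize - 1, #ids ) do
                chunk[#chunk + 1] = 'wd:' .. ids[j]
            end
            clauses[#clauses + 1] = 'VALUES ?item { ' .. table.concat( chunk, ' ' ) .. ' }'
        end
        return clauses
    end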