[15:52:46] hoo: Shouldn't the readonly errors show up in logstash?
[15:53:00] Not sure we logstash them
[15:53:01] let me see
[15:53:24] I can't find them
[15:53:37] Tons of database warnings when you query for "Wikidata"
[15:54:36] doesn't seem we log these at all currently
[15:54:40] Yeah, I can guess
[15:58:05] Some of them should go away next week
[15:58:44] Why aren't the read-only warnings logged? I was debugging GWtoolset the other day and that has tons of (useful) logging
[15:59:09] Probably someone forgot to add them to the logging or they got removed
[15:59:15] I guess I simply forgot about it
[16:10:24] hoo: Can you add it? :-)
[16:12:49] Not today… people don't like Sunday deploys…
[16:14:18] Someday, not today :P
[16:14:32] Lydia_WMDE: Broke https://test.wikidata.org/wiki/Q42770 for https://phabricator.wikimedia.org/T58615 ;-)
[16:15:13] multichill: Yeah… added a note
[16:15:35] multichill: thanks :)
[16:16:08] * Lydia_WMDE is currently looking at tickets that have not been touched since XXXX and is closing most of them
[16:16:18] currently looking at all before September 2015
[16:16:41] the fun stuff you get to do on Sunday evenings ;-)
[16:18:21] Going through the backlog can be fun
[16:19:39] At work I tend to leave notes like "what do you want from us? please explain" in stories, and when I go through them a couple of weeks later with no update, I just close them
[16:19:55] yeah
[16:20:16] i have a lot of internal tickets also - code cleanup and so on
[16:20:28] pretty useless if you don't look at them for 3 years -.-
[16:21:40] Themed sprints can be quite useful. For example, the next 2 weeks we're doing an operations sprint to get all open changes/incidents/problems etc. down to zero
[16:34:04] closing all the bugs?
[16:35:35] finally… so much old stuff :O
[16:36:42] anything particular that i should work on or review?
[16:37:17] i would like to work towards no longer needing the wikidata build
[16:40:30] aude: I'm waiting for releng on that one :/
[16:41:07] what are they doing?
[16:41:39] aude: if you want you can help me close old tickets ;-)
[16:42:18] aude: Well, see the updates on https://phabricator.wikimedia.org/T95663
[16:42:23] hoo: https://phabricator.wikimedia.org/T100750 <- can we close this and file it under "will be solved with sites rewrite"?
[16:42:34] https://phabricator.wikimedia.org/T141488
[16:43:38] Lydia_WMDE: Not yet, please… that one is very dangerous
[16:43:42] hoo: i don't think we would use the composer merge plugin for this
[16:43:47] hoo: ok
[16:43:51] aude: Chad suggested that
[16:44:11] re media info
[16:44:18] we where?
[16:44:21] where*
[16:44:32] we need to register these paths using a callback function in the old style
[16:44:41] like we do in all/most of our code
[16:44:49] aude: On the Dev summit
[16:45:18] maybe we could have a Jenkins job or bot that submits updates to mediawiki/vendor when composer.json is updated in wikibase
[16:45:30] not sure....
[16:45:43] it's chicken and egg....
[16:46:13] when wikibase uses a new version of something, then other extensions need to be updated to use the new thing
[16:46:20] and mediawiki/vendor
[16:46:57] aude: https://phabricator.wikimedia.org/T96598 still relevant?
[16:47:30] aude: I'm looking for the task to index strings in statements in the search engine. Quite sure we had one, but can't find it.
See https://www.wikidata.org/wiki/Wikidata:Project_chat#Tool_to_look_for_the_existence_of_one_identifier_in_Wikidata
[16:48:04] Lydia_WMDE: don't think so
[16:48:13] we generate it for the deployment builds at least
[16:48:20] aude: Well, yes… IMO just updating vendor by hand is good enough
[16:48:30] and use regular wmf branches
[16:48:35] that should just work
[16:48:37] multichill: T99899
[16:48:37] T99899: [Story] Allow looking up Entities by external identifiers. - https://phabricator.wikimedia.org/T99899
[16:48:39] yeah, something like that
[16:49:52] Lydia_WMDE: nah, not really, https://phabricator.wikimedia.org/T88534 is closer maybe
[16:50:24] ok
[16:50:53] I want to search for strings like "SK-C-5" and find https://www.wikidata.org/wiki/Q219831
[16:51:24] So my story is quite simple: get all the strings in statements indexed
[16:51:46] Maybe I'll just make a new one?
[16:51:55] i thought we had a task for that but can't find it immediately
[16:52:07] Me too
[16:53:09] The simplest implementation would be to just append all the strings to the text at https://www.wikidata.org/w/index.php?title=Q219831&action=cirrusdump
[16:53:25] idk, make a new task and it can get merged if someone finds the duplicate
[16:53:50] aude: https://phabricator.wikimedia.org/T163551 :/
[16:54:02] That table is such a mess
[16:54:10] saw that :/
[16:54:24] I'm still trying to query the number of duplicates
[16:54:28] it's not all that easy
[16:54:33] don't know how that happened
[16:55:13] Oh, there's a dump of that data
[16:55:26] I might be able to import that locally, create the right index and then query it
[16:57:37] Lydia_WMDE: i'm going through some of the tasks that i created
[16:57:48] aude: thanks :)
[16:58:15] https://phabricator.wikimedia.org/T109694 - i want to work on this one
[16:59:05] aude: that would be nice :)
[17:01:09] https://phabricator.wikimedia.org/T163642 for the search
[17:01:20] multichill: thanks
[17:02:30] aude: How does the wikidata object get serialized into the search engine? Maybe you can leave a pointer in the task?
[17:05:00] multichill: think we would have the content handler expose a field
[17:53:11] hoo: can we close https://phabricator.wikimedia.org/T90098 ?
[18:00:24] No, this is still an issue
[18:00:39] although trivial to fix at this point, I guess
[19:20:41] Lydia_WMDE: Spring cleaning decline spree? :P
[19:43:32] edoderoo: https://www.wikidata.org/w/index.php?title=Q73991&type=revision&diff=277630639&oldid=276406646 promyshlenniki?
[19:44:38] multichill: yeah :P
[19:49:04] hmmm, https://gerrit.wikimedia.org/r/#/c/349796/ should not pass on Jenkins :/
[22:40:36] Anyone really good at SPARQL who can get me a list of items that have P2860 claims with references that have two reference URLs? (That is, one reference that has *two* reference URLs.)
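For that last question, a minimal untested sketch against the Wikidata Query Service: P2860 ("cites work") is the property from the question, P854 is assumed here to be the "reference URL" property, and the p:/prov:/pr: prefixes are the ones WDQS predefines for statements and references.

```python
# Sketch, not a tested answer: list items whose P2860 statements have a
# reference carrying at least two distinct reference URLs (P854).
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT DISTINCT ?item WHERE {
  ?item p:P2860 ?statement .               # statement node of a P2860 claim
  ?statement prov:wasDerivedFrom ?ref .    # one of its references
  ?ref pr:P854 ?url1 , ?url2 .             # two reference-URL values on that reference
  FILTER(?url1 != ?url2)
}
LIMIT 100
"""

response = requests.get(
    WDQS_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "p2860-reference-check/0.1 (example)"},
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["item"]["value"])
```

The SELECT on its own can also be pasted into the query.wikidata.org GUI; the Python wrapper is only there to show the JSON result shape.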
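And for the indexing discussion around 16:53 (appending statement strings to the indexed text so that "SK-C-5" finds Q219831): a rough way to see what CirrusSearch currently indexes for an item is to pull its cirrusdump, as in the sketch below. It assumes action=cirrusdump returns JSON; the exact wrapper and field names of that output are not guaranteed here.

```python
# Sketch: fetch the CirrusSearch dump for Q219831 and report which indexed
# fields (if any) already contain a statement string such as "SK-C-5".
# Assumes action=cirrusdump returns JSON; field names are whatever the dump has.
import json
import requests

def cirrus_dump(title: str) -> dict:
    response = requests.get(
        "https://www.wikidata.org/w/index.php",
        params={"title": title, "action": "cirrusdump"},
        headers={"User-Agent": "cirrusdump-check/0.1 (example)"},
    )
    response.raise_for_status()
    data = response.json()
    # The dump may be a list with one document per search index; take the first.
    return data[0] if isinstance(data, list) else data

doc = cirrus_dump("Q219831")
needle = "SK-C-5"
hits = [key for key, value in doc.items() if needle in json.dumps(value)]
print("fields mentioning %r:" % needle, hits or "none (not indexed yet)")
```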
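On counting the duplicates from T163551 once the dump is imported locally: the table isn't named in the log, so the table and column names below are purely hypothetical placeholders, but the usual pattern is a GROUP BY over the columns that are supposed to be unique.

```python
# Sketch with hypothetical names: after loading the dumped table into a local
# SQLite file, count how many key combinations occur more than once.
import sqlite3

conn = sqlite3.connect("local_dump.sqlite")  # hypothetical local import of the dump
duplicate_groups = conn.execute(
    """
    SELECT COUNT(*) FROM (
        SELECT key_column_a, key_column_b    -- hypothetical "should be unique" columns
        FROM dumped_table                    -- hypothetical table name
        GROUP BY key_column_a, key_column_b
        HAVING COUNT(*) > 1
    )
    """
).fetchone()[0]
print("duplicate key groups:", duplicate_groups)
```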