[00:17:16] Creating items are taxing on my coffee pot https://www.wikidata.org/w/index.php?title=Q42819428#P1420
[11:00:54] Hm, I wonder what happened to events... https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-statements?refresh=30m&panelId=3&fullscreen&orgId=1
[15:03:39] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @addshore & @Christoph_Jauera_(WMDE) - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[16:03:50] Technical Advice IRC meeting starting now in channel #wikimedia-tech, hosts: @addshore & @Christoph_Jauera_(WMDE) - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[16:21:28] what's up with wikidata dumps? looks like this week's dumps are not there...
[16:33:24] SMalyshev: the process has slowed since Wikidata grew so much
[16:33:40] since lat week?
[16:33:44] last week
[16:34:10] Since July
[16:34:31] last week's dump was done by 1 Nov 8:00. No dump today
[16:36:08] that's Nov 8 16:30. So we're doing significantly worse than last week, not july
[16:36:12] something is wrong
[16:36:35] Maybe something went wrong with https://phabricator.wikimedia.org/T178247?
[16:36:37] didn't the dump process change last week ?
[16:36:53] thedj: I don't know, did they?
[16:38:06] https://lists.wikimedia.org/pipermail/wikitech-l/2017-October/089040.html
[16:38:38] so, new process, and follow up mails indicate that this also kicked off a few days later than normal
[16:39:15] isn’t this unrelated to the Wikidata dumps?
[16:42:00] i wouldn't know
[16:42:03] near hating people repurposing items without cleaning them up entirely
[16:42:18] really that's something that make me grind my teeth
[16:42:50] We should still have a task force for the list/position held mess.
[16:43:15] like Infovarius changing Latin-script names to Cyrillic ones, without cleaning the descriptions
[16:43:18] like, seriously
[16:58:29] oh my
[16:58:36] he didn't *even* clean the uses
[16:58:40] seriously!
[17:00:12] I clean up everything, labels, descriptions, aliases, properties, uses
[17:00:27] and he revert me halfway without even saying why
[17:01:16] The way of communication with Russian users is just more... different.
[17:01:37] sjoerddebruin: actually I think I know "why"
[17:01:50] he want the Slavic names to have the Qid the most ancient
[17:02:00] ...
[17:02:04] even if it's easier to clean the other way
[17:02:25] that's what he once said to me
[17:02:44] and that's the only reason I can think of for the mess he made today
[17:04:37] and of course, he only clean up descriptions in Russian, doesn't seem to care that if the other 290 descriptions are wrong there will be wrong uses
[17:04:43] I'm tired
[17:05:10] I clean up one day and a week later they make a mess again because they dislike the Qids
[17:09:41] sjoerddebruin: do you think that I can speak of this as desorganisation on the sysop page?
[17:09:57] sjoerddebruin: if he do it again without cleaning up?
[17:10:07] I presume, you "warned" him.
[17:10:48] hmm, I was angry when I wrote, not sure it will count as proper warning
[17:11:15] I will write a correct warning the next time and then involve sysops
[17:11:24] because he does that since at least last year
[17:12:09] (last year he merged "Alexei" and "Алексей")
[17:12:23] (yes, they are thousand of entries with these names)
[17:12:52] (I had a vivid nightmare about it after spending a whole week cleaning up)
[18:19:02] Does anyone know what's the biggest Wikidata-Item? (or the one with most statements)
[18:19:13] https://www.wikidata.org/wiki/Special:LongPages
[18:19:48] thanks!
[18:19:52] Not sure if biggest == most statements.
[18:20:25] (pretty sure it's not)
[18:21:23] Not sure if we can query on all items and then filter on number of statements.
[18:21:40] I'm sure it timeout :)
[18:21:54] (I tried at a workshop a few months ago)
[18:22:06] I do know the number of statements. https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-statements?refresh=30m&panelId=6&fullscreen&orgId=1
[18:22:07] there is actually a predicate for the number of statements (wikibase:statements)
[18:22:28] Lucas_WMDE: yes but it timeout if you try it on all items
[18:22:29] but WDQS doesn’t seem to index it, or something – apparently it’s not efficient to look for the items with the largest value for that predicate
[18:22:32] yeah
[18:22:35] 5113 statements.
[18:23:07] I wonder how many of the daily queries are in fact wrongly written queries with SPARQL mistakes
[18:23:09] Number of statements is also in pageprops right?
[18:23:26] Harmonia_Amanda: https://grafana.wikimedia.org/dashboard/db/wikidata-query-service-ui?refresh=1m&panelId=11&fullscreen&orgId=1
[18:23:34] (I made several stupid mistakes right now and still tried to run the queries)
[18:23:54] oooohh
[18:24:46] So like 800 per week.
[18:25:08] 1k*
[18:25:10] I thought it would be higher
[18:25:34] Well, this is UI only I think.
[18:25:47] like, I kept timeouting in the afternoon queries that ran fine in the mornings
[18:26:09] and running them again and again
[18:40:50] is there a way to check that ttl dumps for wikidata is actually still running?
[21:24:48] !admin could you delete https://www.wikidata.org/wiki/Q20890665 for me, thanks.
[21:25:27] Can't it be merged? I'm not aware of the current consensus of Commons sitelinks. :|
[21:25:55] no, it won't merge. It refuses to merge, it refuses to change descriptions, it's stuck in some endless loop of hellishness.
[21:26:32] I didn't have any problems with merging.
[21:26:50] I get Failed to merge items, please resolve any conflicts first.
[21:26:56] Error: Conflicting descriptions for language en.
[21:27:15] What did you use to merge?
[21:27:35] Special:MergeItems
[21:27:50] Hm, I used the merge gadget. :|
[21:28:56] it was only needing fixed so I could remove the old interwiki link from the Commons category.
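
[Editor's note] The 18:22 exchange about the `wikibase:statements` predicate can be illustrated with a minimal SPARQL sketch against the public WDQS endpoint (query.wikidata.org, where the `wd:` and `wikibase:` prefixes are predefined). The single-item lookup below is cheap; the commented-out all-items variant is the kind of query the participants report timing out, since the predicate is not indexed for ordering:

```sparql
# Read the statement count recorded for one item (wd:Q42, Douglas Adams,
# chosen here only as an example item).
SELECT ?count WHERE {
  wd:Q42 wikibase:statements ?count .
}

# The expensive variant discussed in the log: every item, sorted by count.
# This scans all items and reportedly exceeds the WDQS timeout.
# SELECT ?item ?count WHERE { ?item wikibase:statements ?count . }
# ORDER BY DESC(?count)
# LIMIT 10
```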