[09:43:00] Anyone have an idea why packagist might not pick up the new data values number release?
[09:43:07] JeroenDeDauw: addshore ^
[09:48:00] multichill: almost 300k NTA, SUDOC better watch out :P
[10:00:05] sjoerddebruin: Yeah, the first run is done. I have some problems getting the full list in SPARQL to progress, might need to switch to the VIAF dump
[10:09:39] hoo: Quick check, a sitelink can only have one badge, right?
[10:10:15] No, it's a list
[10:11:29] ok
[10:16:36] hoo: https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Label_collisions is still finding new instances.....
[10:17:29] Do we have a ticket about that?
[10:17:52] I was wondering about that too
[10:18:25] Oh wait, I hit the limit, so it might be a fixed bug and just the leftovers
[10:19:11] The page size just changes because some titles are longer or shorter...
[10:19:34] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1964 bytes in 0.105 second response time
[10:19:43] arghg -.-
[10:19:44] Looking
[10:19:48] multichill: Took a note
[10:19:55] Will hopefully be able to look today
[10:19:59] but definitely during the week
[10:21:01] multichill: looks like you could use FILTER(STR(?item1) < STR(?item2)) to avoid the duplicate results
[10:21:41] https://wikidata.wikiscan.org/?menu=live&date=6&list=users&sort=weight&filter=all <- Romaine is going way too fast
[10:22:58] You mean Renamerr
[10:24:20] Romaine was creating new items
[10:25:53] You might want to check your Nagios check "- pattern not found "
[10:26:18] The check should be fine… the lag actually is above 300s atm
[10:26:44] The check basically requests the lag from the API and then matches it against a regexp that matches numbers up to 300s
[10:26:49] Heh, dirty way of making a check
[10:27:36] You could just say 0-200 OK (green), 200-300 WARNING (yellow), 300+ CRITICAL (red) and have a message to explain it
[10:27:39] Indeed… but it works(tm) :D
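The tiered check suggested above can be sketched as a small Nagios-style plugin. This is a hypothetical sketch, not the actual check in production: the function name `lag_status` and the message format are made up, but the 0-200/200-300/300+ thresholds and the graded OK/WARNING/CRITICAL states follow the suggestion in the conversation.

```shell
#!/usr/bin/env bash
# Hypothetical Nagios-style dispatch-lag check. Nagios plugins signal
# state via exit codes: 0 = OK, 1 = WARNING, 2 = CRITICAL.
# Thresholds per the suggestion above: 0-200 OK, 200-300 WARNING, 300+ CRITICAL.

lag_status() {
    local lag=$1
    if [ "$lag" -lt 200 ]; then
        echo "OK - dispatch lag ${lag}s"
        return 0
    elif [ "$lag" -lt 300 ]; then
        echo "WARNING - dispatch lag ${lag}s"
        return 1
    else
        echo "CRITICAL - dispatch lag ${lag}s"
        return 2
    fi
}
```

Compared to matching the raw API response against a "numbers up to 300" regexp, this reports a graded state plus an explanatory message instead of an opaque "pattern not found".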
[10:28:09] Fair enough
[10:29:07] It's slowly improving… I guess no action is needed for now
[10:29:48] Also, I'm having trouble accessing https://www.wikidata.org/wiki/Special:RecentChanges … but that might just be the WMDE internet
[10:32:18] Leaving for lunch
[10:32:39] If this continues, message or block either KrBot or Renamerr :/
[12:08:02] dispatch should recover any second
[12:09:42] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1971 bytes in 0.085 second response time
[12:10:39] Let's see whether it lasts… I ran some dispatchers manually before
[12:10:41] but stopped now
[12:11:31] sneak hoo
[12:11:33] sneaky
[12:11:55] heh… it was (slowly) improving
[12:11:59] so I sped it up some
[12:18:27] :)
[12:30:22] hoo: let's make a job that shells out to run dispatch changes? ;)
[12:30:36] voilà, we are using jobs for dispatching
[12:31:33] addshore: That would actually work… but grr, also so, so evil
[12:31:51] but it would also work for most 3rd-party installs
[12:31:55] which we want as well
[12:46:44] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1967 bytes in 0.099 second response time
[13:01:45] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1971 bytes in 0.325 second response time
[14:02:55] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @addshore & @Pablo_WMDE - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:47:22] Amir1: You're handling the suggester update this week, right?
[14:47:33] yup
[14:47:34] Suggester update? :O
[14:47:41] Cool :)
[14:48:11] sjoerddebruin: Yeah, Amir1 will do it this week… I shared the scripts and steps I use to do that
[14:48:20] Cool!
[14:48:49] Having more eyes is always a good thing
[14:49:36] Correct me if I'm wrong, but I don't think that sandbox properties are meant for testing on 100+ items https://www.wikidata.org/w/index.php?title=Topic:Ub2r65dysqlzyqid&topic_showPostId=ub2rk1wpoyfuv7hx&fromnotif=1#flow-post-ub2rk1wpoyfuv7hx
[14:50:04] Ugh
[14:50:18] uuuuuh
[14:50:34] yeah, that’s definitely not how I thought the properties were supposed to be used
[14:51:38] Ehm... https://www.wikidata.org/w/index.php?title=Q4781968&type=revision&diff=619598910&oldid=619598678
[14:52:37] Wow, no… that's not nice
[14:55:52] would be nice if quickstatements supported test.wikidata.org
[14:56:20] if he really needs to test on hundreds of items
[14:56:46] using them on real items also clutters the history of those items
[14:57:14] So, he's basing it on "I do not see anything wrong in using it other items." said by a single admin in 2013.
[14:57:16] Yeah, and it potentially causes needless constraint violations
[15:03:44] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1981 bytes in 0.144 second response time
[15:05:00] looking
[15:06:27] https://www.wikidata.org/w/index.php?title=Special:WhatLinksHere/Property:P1106&namespace=0&limit=500 pfff
[15:07:07] "Just don't expect them to remain there forever."
[15:07:55] that means you can just remove them
[15:23:44] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1969 bytes in 0.074 second response time
[16:23:57] multichill: Do you have any recent examples of duplicate labels+descriptions?
[16:24:18] I just checked, and this is not possible to create via wbsetlabel, wbeditentity or the special page
[16:25:38] The only way I can see right now is by exploiting a race condition, when creating two items very close (in time) to each other
[16:39:06] (got disconnected) -- does anyone remember where I can find the graph of "Wikidata labels by language, over time"?
[17:06:24] nikki: that sounds like something you might still be interested in? https://www.wikidata.org/wiki/Talk:Q22674925
[17:07:15] (that is, have an opinion to give, as per your older comment there)
[17:07:37] addshore: are you around? have a question about wdqs docker setup
[17:07:42] o/
[17:09:28] abartov: the over-time graph isn't updating anymore, maybe you can generate something with https://www.wikidata.org/wiki/User:Pasleim/Language_statistics_for_items
[17:10:09] addshore: so I set it up, and I add items to wikibase, but they do not show up in the db. So I assume the updater is not working properly. Is it supposed to, in a default install (I didn't change anything in configs, etc.)?
[17:10:42] so not show up in the query service db?
[17:10:44] *do not
[17:11:01] addshore: also, I wonder where the binaries for wdqs/updater come from - if I want the latest build, is there a way to use it?
[17:11:04] addshore: yes, it does not show up in the db
[17:11:42] SMalyshev: does anything appear in the logs?
[17:11:50] are you using the example docker-compose or a custom setup using the images?
[17:12:19] addshore: which logs?
(assume my knowledge about docker is close to zero, so I have no idea which logs I'm supposed to look at :)
[17:12:37] addshore: I am following https://github.com/wmde/wikibase-docker/blob/master/README-compose.md
[17:13:11] no customization at all - my purpose is to discover the path to a working environment and document it, so that's the first step
[17:13:36] so, you can run "docker-compose logs" and it will give you the logs
[17:13:40] you can also filter it by service
[17:13:54] if you remove the -d when you run docker-compose up, it will print the logs to stdout
[17:14:26] The binaries come from https://github.com/wmde/wikibase-docker/blob/master/wdqs/0.3.0/Dockerfile#L6
[17:14:40] currently there is no tag set up for the 'latest' / current master
[17:14:45] although that could be possible
[17:15:31] ah, 0.3.0 is old... I wonder if it's possible to get one from Archiva instead? or from https://github.com/wikimedia/wikidata-query-deploy ?
[17:15:47] Hey, folks, can I ask you something about the WDQS? :)
[17:15:49] does docker know how to do git-fat?
[17:15:52] 0.3.0 is old? but you only released it a week or so ago? :D
[17:15:58] abian: yes
[17:16:11] I have a CSS patch for https://phabricator.wikimedia.org/T191749
[17:16:12] SMalyshev: docker images / building can know how to do everything
[17:16:22] But no idea on where to apply it
[17:16:28] addshore: no, 0.3.0 was released in February, 2 months ago
[17:16:39] (and for me a week is old too ;)
[17:16:42] SMalyshev: hehe, it just took me a while to update it / use the new release :D
[17:17:07] *idea where, in what files
[17:17:33] SMalyshev: can you link me to the latest version on Archiva?
[17:17:44] abian: put it on gerrit and tag Lucas_WMDE and myself as reviewers
[17:18:01] Hmm... as a new file?
[17:19:00] abian: hmm, I am not sure I can answer that without looking at the patch...
generally it's preferable to keep everything in style.less, I think
[17:19:46] However, I can't find all the modified rules in there :/
[17:19:54] addshore: hmm, let me remember how that thing with git-fat actually works...
[17:20:02] Should I write some new ones that override the others?
[17:22:13] abian: probably? I am not sure, I'd need to look at the patch, and Lucas probably knows better than me. I'd start with doing what you think is reasonable, and if it's not correct, we'll modify it
[17:22:21] on review
[17:23:30] addshore: I am not sure how to extract things from git-fat directly. Any chance your setup can use rsync? it works over rsync
[17:24:03] it can use anything the base image can use
[17:24:26] well, actually, that's a lie, yes, it can use anything
[17:24:33] it can rsync the files in if that is really what needs to happen :P
[17:25:01] maybe it can use git-fat then? https://wikitech.wikimedia.org/wiki/Archiva
[17:25:56] basically, if you have git-fat installed, then you do a repo checkout, git fat init if it's a fresh one, and then git fat pull
[17:26:07] this gets you the binaries, which you can then copy around
[17:27:00] SMalyshev: if you write up a phab ticket with the instructions to follow (rsync command / git clone or something with git-fat installed) then we can probably alter the images :)
[17:28:05] addshore: ok, will do shortly
[17:28:14] awesome!
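The debugging and binary-fetching steps discussed above, collected in one place. This is a sketch under assumptions: it presumes the stock wmde/wikibase-docker compose setup, and the `wdqs-updater` service name is an assumption that may differ in a customized docker-compose.yml; the git-fat steps require git-fat to be installed.

```shell
# View the logs of every service in the compose setup:
docker-compose logs

# Or only one service (service name assumed; check your docker-compose.yml):
docker-compose logs wdqs-updater

# Alternatively, drop -d so everything runs in the foreground
# and logs stream to stdout:
docker-compose up

# Fetching the current WDQS binaries via git-fat, following the
# steps described above (git-fat must be installed):
git clone https://github.com/wikimedia/wikidata-query-deploy.git
cd wikidata-query-deploy
git fat init   # only needed on a fresh checkout
git fat pull   # replaces the placeholder files with the real binaries
```

These commands are environment-dependent (they need Docker and git-fat present), so treat them as a walkthrough rather than a script to run blindly.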
:)
[17:29:57] * addshore leaves
[17:32:16] reosarevok: when I created it, it was for things which are no longer populated, and I don't get the distinction the IP is making there; they both mean to me that the population is 0
[17:32:30] if it's not 0, then it's a populated place
[17:33:42] nikki: the IP's idea is that the *place* might no longer officially exist, because say a village was absorbed by a neighbouring city, so it's a former populated place, but it's not "formerly populated", but "formerly a place"
[17:34:08] then that's something different to what I intended
[17:35:01] Welcome back, Lucas_WMDE :)
[17:35:26] I'm pretty sure it's been used for that quite a bit at the Estonian level at least; wonder what it should be replaced with if it's not appropriate for that use
[17:36:16] there are a few items for country-specific things, dunno if there's anything generic
[17:36:34] https://www.wikidata.org/wiki/Q19953632 perhaps
[17:37:17] Heh
[17:37:26] The cswiki-linked https://cs.wikipedia.org/wiki/B%C3%BDval%C3%A9_s%C3%ADdlo is specifically about merged ones, it seems
[17:37:37] So I guess we probably need a generic option for that
[17:37:49] :/
[17:38:03] Lucas_WMDE: Can you read us?
[17:38:23] hi, yes
[17:38:28] sorry, had internet problems earlier
[17:38:30] * Lucas_WMDE reads logs
[17:38:34] Great :D
[17:38:43] Don't, worry, I tell you
[17:39:06] I was wondering where to upload a CSS patch for https://phabricator.wikimedia.org/T191749, for the Query Helper
[17:39:30] I see that style.less doesn't have all the rules
[17:39:46] *Don't worry, argh
[17:40:01] haha, I like the version with the comma too :D
[17:40:06] but style.less should be the place
[17:40:20] https://www.youtube.com/watch?v=LbTxfN8d2CI
[17:41:30] all the other stylesheets come from npm dependencies, as far as I'm aware
[17:41:35] so it would be difficult to change those anyway
[17:41:48] Okay
[17:42:36] But it's going to be hard for me to test the rules :/
[17:43:55] if you open the website locally, it should automatically reload any time you change style.less
[17:44:02] without needing to reload the page
[17:44:51] Do you mean query.wikidata.org?
[17:45:03] no, your local copy
[17:45:17] do you have the query UI cloned locally?
[17:45:24] Hehe, okay, I'm using the web interface
[17:45:35] But let's clone the repository, yeah :)
[17:45:54] yeah, you'll need that to send the change anyway :)
[17:47:14] Not really, I've made the patches via the web interface until now
[17:47:26] Not many, nor complex ones, that's true :))
[17:48:22] oh, I didn't realize you could do that, sorry :D
[17:49:39] You're used to doing things right xD
[17:55:21] I think I left my chocolate in my basket :(
[18:00:13] hoo: My assumption is that it's only old items
[18:00:27] hm ok
[18:00:30] at least something
[18:01:02] Could sort the query by timestamp, let me poke at it a bit.
Lucas_WMDE also had a good suggestion to reduce the dupes
[18:07:28] Lucas_WMDE: Now I remember, adding the filter will make the query time out :-(
[18:07:44] multichill: weird, I think I got a result earlier today…
[18:08:09] Feel free to improve the query
[18:10:19] multichill: 1000 results in 30 seconds http://tinyurl.com/ybwghlao
[18:11:33] Lucas_WMDE: It looks like sometimes it works and sometimes it times out?
[18:11:44] possible
[18:17:04] SMalyshev: The conversation about indexing ExternalIds is pretty active. So what about strings? Do you have any idea what the impact would be of indexing those too?
[18:17:13] (another bug, but related)
[18:17:38] That would be https://phabricator.wikimedia.org/T163642
[18:24:39] Ну шо ["So then", Russian]
[18:24:46] Будем ебашить в контру? ["Shall we go frag some Counter-Strike?", Russian]
[18:43:20] What did that say?
[19:16:13] SMalyshev: oh noes
[19:16:23] Does that mean the current 0.3.0 image is kind of broken?
[19:17:19] $V002 -> $V003
[21:04:39] hoo: let's hope we can improve the suggester this year <3
[21:05:20] sjoerddebruin: I really hope so
[22:19:54] hey, I don't really know how to research this: is there a hard maximum of values for a property, e.g. P1476 (title)?
[23:04:54] addshore: the image is not broken per se, but uses an old dictionary. The old dictionary can be used with new code, at the price of some performance hit and a bunch of warning
[23:04:57] *warnings
[23:05:04] but there's no real reason to
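For reference, the deduplication trick Lucas suggested earlier — FILTER(STR(?item1) < STR(?item2)) — shown in context. This is a hypothetical, simplified shape of a label-collision query, not multichill's actual query (which is only linked, not shown); and as noted in the conversation, the extra FILTER can push an already heavy query over the timeout.

```sparql
# Simplified sketch of a label-collision query (assumed query shape).
SELECT ?item1 ?item2 ?label WHERE {
  ?item1 rdfs:label ?label .
  ?item2 rdfs:label ?label .
  # Without this, every collision appears twice: once as (A, B) and
  # once as (B, A). Keeping only the ordering where item1 sorts before
  # item2 halves the output and also excludes ?item1 = ?item2.
  FILTER(STR(?item1) < STR(?item2))
}
LIMIT 1000
```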