[07:51:00] Another day. https://www.wikidata.org/w/index.php?title=Q599373&action=history
[09:30:02] Not sure if I did this correctly https://www.wikidata.org/w/index.php?title=Q3064310&type=revision&diff=281227169&oldid=244089664
[09:30:08] feels so wrong adding multiple different dates
[09:33:47] Josve05away: It's not weird. What's the most correct one?
[09:33:53] You could adjust the ranks.
[09:35:01] that's the tricky bit
[09:35:18] Trying to gather more sources and add them to Wikidata before I choose the rank
[09:35:44] Wait, I'll do something.
[09:37:26] The last statement should be the most neutral. https://www.wikidata.org/wiki/Q3064310#P569
[09:38:21] but the sources say he died at 80 years of age, and that would be in 1935
[09:38:53] this is why I hate contradicting sources
[09:40:53] Hm
[09:43:52] still https://tools.wmflabs.org/reasonator/?q=Q3064310&lang=en says "February 25, 1935"
[09:45:21] Think Reasonator doesn't support ranks
[09:46:25] hoo: https://phabricator.wikimedia.org/T116773 became worse, no results anymore.
[09:47:11] (also, ping aude)
[09:50:30] sjoerddebruin: :(
[09:51:03] "wereldbeker" (Dutch for "world cup") is already enough to fuck it up
[09:52:25] like how slow is it?
[09:53:03] a few seconds?
[09:54:02] 5 seconds now, but a few minutes ago it kept loading after one minute.
[09:54:21] (it's always working when you want to test it. :) )
[09:55:48] well, the query is cached now
[09:55:57] i'll have to try later, but i believe you
[09:56:22] and maybe there is something in the logs about a slow query for this
[09:59:17] Yeah, that's why I pinged hoo
[09:59:27] i see stuff about the query "lost connection" and maybe it retries more than once
[10:00:36] meh :/
[10:00:55] I would need to look at the concrete query... it really shouldn't be slow
[10:01:03] I mean it's awful... but not *that* slow
[10:02:46] hoo: https://gist.github.com/filbertkm/b5b473bae4caf35d023f
[10:05:42] runs in 3s on db1071
[10:06:09] it's probably cached at the moment
[10:06:12] same on db1070
[10:07:33] Even in that case it shouldn't be so much worse
[10:07:41] aude: Do you know which server it timed out on?
[10:07:55] it really shouldn't have ended up on one of the weak 64GB RAM slaves
[10:07:57] but it might
[10:08:39] db1017
[10:08:46] 1017?
[10:08:52] ah
[10:08:56] db1071
[10:09:06] Ok, that's weird
[10:09:46] that's one of the strong slaves, it really should be ok with that query
[10:10:00] and db1070 also
[10:10:00] the index on the terms table is large, but still bearable, I guess
[10:10:14] yeah, those are the only ones that still have a weight assigned
[10:10:32] the others (weaker slaves) just don't cut it anymore
[10:12:10] I can't see how that query could really be improved :S
[10:14:07] Maybe I'm just nitpicking.
[10:15:27] * hoo_ loves power outages -.-
[10:17:46] Anyway, I'm off
[10:20:02] =o
[10:31:25] * aude sighs
[10:31:43] external validation needs to be adjusted again for changes in data type registration
[11:31:17] @Thiemo_WMDE https://www.lieferheld.de/lieferservices-berlin/?lat=52.4984352&lon=13.3811952
[11:31:26] @jzerebecki https://www.lieferheld.de/lieferservices-berlin/?lat=52.4984352&lon=13.3811952
[11:31:52] https://www.lieferheld.de/lieferservices-berlin/restaurant-thai-huong-snack/98/
[11:32:37] :P
[11:35:53] Jonas_WMDE: This may not be relevant for the Wikidata channel, just for those in the office :)
[12:24:07] How do I get the list of all items having a P963 property?
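The question above has a direct SPARQL answer on query.wikidata.org, in addition to the Special:WhatLinksHere approach given below. A minimal sketch, assuming P963 is the streaming media URL property (the later mention of "the value of the stream" suggests as much):

```sparql
# Every item that has at least one P963 (streaming media URL)
# statement; run against query.wikidata.org.
SELECT DISTINCT ?item WHERE {
  ?item wdt:P963 [] .
}
```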
[12:25:55] https://www.wikidata.org/wiki/Special:WhatLinksHere/Property:P963
[12:26:11] namespace: (main)
[12:26:44] Yup.
[12:26:56] petterson: https://goo.gl/y1IE9d
[12:27:03] :P
[12:31:46] petterson: developing the addshore query further, you can also print the label of the item and the value of the stream
[12:31:49] http://tinyurl.com/q3999tg
[12:32:49] thanks
[12:39:17] Indeed!
[13:23:37] Wondering why people keep doing stuff like this... Something wrong with the labels in one language? https://www.wikidata.org/w/index.php?title=Q6153778&action=historysubmit&type=revision&diff=281265794&oldid=280514269
[13:23:43] nikki: do you have a clue?
[14:23:15] addshore: can I has quick review before SWAT?: https://gerrit.wikimedia.org/r/#/c/257318/
[15:16:00] jzerebecki: in fact, we could import the MASSIVE revision to test wikidata, and then find out if the setting is correct in group0 ;)
[15:18:35] addshore: we could do the same thing on a vagrant instance ;)
[15:18:52] indeed, but that would require ever so slightly more time ;)
[15:32:17] Where's Lydia?
[15:32:39] * hoo needs a PM decision
[15:34:28] lydia says yes! :P
[15:35:01] :O
[15:35:18] Now that I have the answer, I can phrase the question to get it "right"! :D
[15:35:35] heh
[15:35:42] But more seriously, the current JSON dump is incomplete, because we messed up exception handling
[15:35:50] Eeeks
[15:35:51] Missing maybe 20k items
[15:35:56] ugh
[15:36:40] More like 110k items :S
[15:36:56] Am working on a patch
[15:36:58] i'm trying to think of how to deploy "ChangeDispatcher should use locks on the local DB": https://gerrit.wikimedia.org/r/#/c/253898/
[15:37:22] without needing root to change the cron job
[15:37:37] Why would you need to change the cron?
[15:37:54] We didn't alter the public interface AFAIR
[15:38:23] to deploy it I need to stop all dispatchers before deployment and ensure they don't start until it is done
[15:39:10] so manually put an exit 1 in dispatchChanges.php on the host with the cron and then let scap overwrite it?
[15:39:21] cherry pick, pull on tin, killall on terbium, sync-dir …, profit
[15:39:40] If you think that it's needed, you can also null it on terbium locally
[15:39:46] sync-common will fix that afterwards
[15:41:10] ok
[16:21:00] hoo: sync-file or sync-dir would also fix a 0-byte PHP file that was manually changed on one server, so there is no need to sync-common in addition?
[16:21:46] yeah, it should
[16:31:41] aude: jzerebecki: https://gerrit.wikimedia.org/r/257354
[16:32:04] Would like to deploy ASAP and regenerate the dump
[16:42:41] sjoerddebruin: oh, there you are :P the only thing I can think of is that people are misunderstanding the wording, i.e. they're reading it as "this person is the head of government of (insert item)" rather than "the head of government of this item is (insert person)" :/
[16:52:47] nikki: yeah
[16:53:09] not sure how we could make that clearer though :/
[17:42:10] hoo: Read-only again. Waiting for 4 lagged database(s); 11:38 CET judging from my last bot edit
[17:42:55] Looks fine from here
[17:43:03] but am probably too slow again
[17:44:26] It's about 7 hours ago, hoo
[17:44:42] Amir1: Can you rerun the fuzzy bot? :-)
[17:46:03] multichill: Whoops, didn't see the time :P
[17:46:58] multichill: hey
[17:47:01] okay
[17:47:07] can you re-run the scripts?
[17:48:31] Amir1: Good point, forgot about that. Will do tonight and ping you when done
[17:48:41] great
[17:48:44] thanks :)
[18:17:06] dennyvrandecic: How long will the Freebase site be up?
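Back in the P963 thread, the shortened query linked at 12:31 (item label plus stream value) can no longer be expanded; a plausible reconstruction, fetching the English label via an optional rdfs:label rather than any particular label service:

```sparql
# Hypothetical reconstruction of the 12:31 query: items with a
# streaming media URL (P963), their English label if one exists,
# and the stream URL itself. The original tinyurl is unverifiable.
SELECT ?item ?itemLabel ?stream WHERE {
  ?item wdt:P963 ?stream .
  OPTIONAL {
    ?item rdfs:label ?itemLabel .
    FILTER(LANG(?itemLabel) = "en")
  }
}
```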
[18:40:04] Which API should I use for getting site links on Wikidata?
[18:42:07] wbgetentities
[18:43:47] Mavrikant: https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42&props=sitelinks
[18:44:18] What do you want to use it for exactly? You might be able to reuse one of the client libraries instead of starting from scratch
[18:46:17] sjoerddebruin: dunno. at least for three months once the deprecation is announced, which has not happened yet
[18:48:48] Okay, it's taking longer than expected...
[18:48:49] multichill, I'm a bot owner on trwiki. I will fix violations on https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P2123
[18:49:33] sjoerddebruin: yes
[18:52:00] Mavrikant: Did you try pywikibot? No need to talk to the API directly
[18:53:41] multichill, I know pywikibot but no
[18:53:57] hoo: ping me if I should look at the patch fixing dumps again
[18:56:00] Mavrikant: https://tr.wikipedia.org/wiki/A%C4%9Fa%C3%A7l%C4%B1,_Ey%C3%BCp and https://tr.wikipedia.org/wiki/Abbaslar,_Kahramanmara%C5%9F have the same link
[18:56:04] How are you going to fix that?
[19:01:22] Mavrikant: Added some more constraints at https://www.wikidata.org/wiki/Property_talk:P2123
[19:07:43] What is the API query for no label in Armenian (hy)? Something like NOLABEL[hy]?
[19:09:23] I want this query but only the items that are missing an Armenian label: http://tools.wmflabs.org/autolist/autolist1.html?lang=hy&q=CLAIM[1343%3AQ16387823]
[19:10:38] tobias47n9e__: Labels are not in that database, you have to use query.wikidata.org for that. How is your SPARQL?
[19:12:07] multichill, I still haven't had time to look at more than 2 SPARQL examples yet. And I didn't want to use pywikibot for this.
[19:14:35] amir1: Wow, last time (20 Oct?) I ran a query for all painters the result was 164236, now it's 241923. That's quite the increase!
[19:15:28] I guess alias import added quite a bit to that
[19:16:40] Amir1: Anyway, ran the queries for https://www.wikidata.org/wiki/User:Multichill/Fuzzy_painter_matches . Can you run the fuzzy thingie?
[19:27:29] tobias47n9e__: http://tinyurl.com/zefn3xs seems to work
[19:30:43] http://tinyurl.com/gohddrn an alternative which shows all the items with the hy label, then the ones without appear at the top
[19:38:57] Exception encountered, of type "Diff\Patcher\PatcherException"
[19:39:03] at https://www.wikidata.org/w/index.php?title=Q9206170&action=edit&undoafter=235683457&undo=276078677
[19:40:04] https://phabricator.wikimedia.org/T97146 I think
[19:40:14] hoo__: Still looking at those exception logs? ^^
[19:40:32] Didn't look yet
[19:45:58] dennyvrandecic: I guess you've already seen https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2015-12-02/Op-ed ?
[19:49:07] multichill: I did, and I am not sure how to react
[19:49:34] "Look mama, I'm in the paper!"? ;-)
[19:49:46] :D
[19:50:32] I'm probably the most hated person among SEO experts now. <3
[19:52:34] well, it is an opinion piece.
[19:56:03] dennyvrandecic: For one thing he seems to be only pointing out problems and not providing solutions. I might have skimmed past them because it was a bit tl;dr
[20:00:46] This looks like the tl;dr version of "Wikidata needs more references"
[20:01:17] Working every day on that.
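The two tinyurl links above are opaque now, but the surrounding chat (autolist's CLAIM[1343:Q16387823], restricted to items missing an hy label) makes the first one easy to approximate. A sketch, with the item ID taken verbatim from the autolist URL:

```sparql
# Plausible reconstruction of http://tinyurl.com/zefn3xs: items with
# a "described by source" (P1343) statement pointing to Q16387823
# that have no Armenian (hy) label. The exact original query is
# unverifiable.
SELECT ?item WHERE {
  ?item wdt:P1343 wd:Q16387823 .
  FILTER NOT EXISTS {
    ?item rdfs:label ?label .
    FILTER(LANG(?label) = "hy")
  }
}
```

The second link presumably swaps the FILTER NOT EXISTS for an OPTIONAL label plus an ORDER BY, so that unlabeled items sort to the top, as nikki describes.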
[20:02:59] Finally some of the data sets for people are opening up, which means it becomes easier to have a bot import and source data from them
[20:03:12] multichill: I was afk
[20:03:14] sure
[20:03:18] Funny to see how many of these sources disagree about something as simple as a date of birth
[20:13:16] Lydia_WMDE: hey, around?
[20:13:32] Amir1: hey
[20:36:09] jzerebecki: aude: Could you have (another) look at https://gerrit.wikimedia.org/r/257354 please
[20:46:00] hoo: does ASAP mean you are not going to wait for the train for that patch?
[20:46:39] I kind of assumed I missed it
[20:46:49] hijacking it there sounds fine
[20:47:21] * doing
[20:53:34] Is this the correct way to indicate that something is disputed by the source? https://www.wikidata.org/wiki/Q2080887#P812
[20:54:30] shit, wrong property
[20:55:38] https://www.wikidata.org/wiki/Q2080887#P512 *
[21:11:31] aude: Lydia_WMDE: We need to upgrade the office… https://www.youtube.com/watch?v=UCaukTe2J3I
[21:12:54] :D
[23:30:25] nikki, Awesome, thanks. I will try to build a few maintenance templates from that query!
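As a closing aside, the painter tally quoted earlier in the log (164,236 rising to 241,923) presumably comes from a count along these lines; occupation (P106) and painter (Q1028181) are assumptions, since the chat never spells the query out:

```sparql
# A guess at the "all painters" tally: counting items whose
# occupation (P106) is painter (Q1028181). Both IDs are assumed.
SELECT (COUNT(DISTINCT ?item) AS ?count) WHERE {
  ?item wdt:P106 wd:Q1028181 .
}
```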