[00:19:44] hoo: thanks for the review, I'm working on them
[00:19:52] :)
[00:36:04] hoo: done!
[00:36:32] Great :)
[00:36:37] I'll have a look in a bit
[00:39:44] hoo: Also it would be great if you could take a look at https://gerrit.wikimedia.org/r/314322, leszek wrote CI tests in another patch
[00:40:26] Will see what I can do there
[00:40:55] Thanks :)
[01:28:09] hoo: Thanks for +2
[01:29:05] You're welcome
[01:29:28] I'm actually wondering whether I should also add tasks to our team's sprint… because I reviewed them
[01:29:31] hm
[01:30:18] I have no idea what the process is in these cases
[01:38:09] Well, I'm just going to spend a lot of time doing cross-team CR then, I guess
[06:36:42] Harmonia_Job: good morning. :)
[06:37:37] hello sjoerddebruin
[06:37:46] Working on surnames, I see? :)
[06:38:06] sjoerddebruin: I think I'll do the same as you: just delete the wrong surnames and add the family as such
[06:38:14] I've been working on surnames since August
[06:38:40] the constraint violations went from more than 5500 to less than 3000
[06:38:47] There are some errors around those families though: some are marked as noble but they aren't.
[06:39:24] actually I think we really should talk about that property
[06:39:36] and transform it to all families
[06:39:41] and not only noble ones
[06:39:48] Same
[06:39:57] we had the problem with a Swedish one, a few months ago
[06:40:12] https://www.wikidata.org/wiki/Property_talk:P53#Other_families
[06:40:13] some branches of the family were noble, others weren't
[06:41:40] sjoerddebruin: ahhh you mean as P31?
[06:41:58] families marked as noble when they aren't in their P31?
[06:42:41] Yes.
[06:42:55] sorry, ignore what I said then
[06:42:59] I misunderstood
[06:43:06] yes, that's a problem
[06:43:11] Yeah, but you've got a point too.
[06:43:32] but I'm only working on names now (and Broadway, and sled dog racing, and Tolkien...)
[06:43:36] (but not families)
[06:43:47] https://www.wikidata.org/wiki/Q14512814 for example
[06:44:14] I'm disentangling (is that a word?) family names and disambiguation pages
[06:44:18] slow going
[06:44:26] I know what you mean.
[06:44:51] But the item I've just mentioned is a patrician family and is wrongly marked as a noble one.
[06:45:39] sjoerddebruin: I think we'll have to query all items with P31 subclass of family one day and verify all of them
[06:46:05] I would love to fix them, but want to focus on my current stuff first. :P
[06:46:47] well, I won't do it
[06:46:59] I have my own list of stuff to do :p
[06:47:10] * Harmonia_Job still hopes the RADA will answer her
[08:20:45] hoo: https://gerrit.wikimedia.org/r/#/c/315623
[08:22:06] Will have a look… currently going through my email
[08:24:02] thanks
[08:24:19] Amir1: thanks for adding tests. I'll leave some comments later. Please still consider what Marius mentioned in the review of the other patch. Let's say there is the setup as in the test now, plus page 11 uses entity Q4 with some other aspects, for instance 'L.en'
[08:24:49] if I run the SQL query for a local db, L.en will sneak into the entity usage page of Q3
[08:25:02] which is obviously not correct
[08:25:31] leszek_wmde: hey, okay
[08:30:06] leszek_wmde: I just added them. Tell me if that's what you mean: https://gerrit.wikimedia.org/r/#/c/315623/8..9/client/tests/phpunit/includes/Specials/SpecialEntityUsageTest.php,unified
[08:30:43] Amir1: yes thanks! the test should now fail :)
[08:31:39] leszek_wmde: It already failed: https://integration.wikimedia.org/ci/job/mwext-Wikibase-repo-tests-sqlite-hhvm/12196/testReport/junit/(root)/Wikibase_Client_Tests_Specials_SpecialEntityUsageTest/testReallyDoQuery/
[08:31:45] https://integration.wikimedia.org/zuul/
[08:31:51] hmm
[08:31:57] how can I fix this?
[08:33:03] Amir1: the where condition for the groupconcat should probably be something like eu_page_id=page_id and eu_entity_id=serialization of the current entity id
[08:33:29] so it only concatenates aspects for the page and the entity
[08:33:29] okay
[08:33:36] Indeed
[08:33:40] that's what I commented yesterday
[08:34:32] I missed it somehow
[08:40:40] Good morning everyone. :)
[08:42:07] hi sjoerddebruin :)
[08:42:23] good morning :)
[08:44:16] Spot the problem :)))) https://integration.wikimedia.org/ci/job/mwext-Wikibase-client-tests-sqlite-hhvm/12190/testReport/(root)/Wikibase_Client_Tests_Specials_SpecialEntityUsageTest/testReallyDoQuery/
[08:51:07] leszek_wmde: I added them and the tests pass now
[09:11:50] Amir1: You might want to have a look at my comments on https://gerrit.wikimedia.org/r/315623
[09:11:56] apart from that, I think we're good to go here
[09:13:44] hoo: I'll do it now
[10:33:42] :D
[10:34:01] sjoerddebruin: I started looking into the suggester workarounds
[10:34:14] hoo: yeah I saw my mail
[10:34:25] is it possible to influence the color of map layers on WDQS? with two layers, I'm getting blue and light blue, which is terrible, especially on a map with a light blue ocean :D
[10:34:59] WikidataFacts: No, don't think so
[10:35:07] :( okay, thanks
[10:35:11] you can try opening a bug and suggesting more sensible colors
[10:35:45] Thiemo_WMDE: FYI https://phabricator.wikimedia.org/T147307#2710914
[10:35:49] well, if I capitalize the first letter, I get a different color (orange), so I guess renaming the layer works
[10:38:32] hah, I can add U+200B (zero width space) and it changes the color but doesn't show in the legend! victory! :D
[10:38:49] Nice hack :D
[10:39:02] in the code it shows as •, but I don't mind that ;)
[10:39:49] WTF, but not in the embed.html version
[10:43:01] okay, in embed.html the name of the layers seems to have no influence at all
[10:44:31] no, actually, apparently it's not directly influenced by the name… every time I render a new map in the same tab without reloading the page, it starts cycling through the same colors
[10:44:49] Why can't we specify colours?
[10:44:56] layers with the same name keep their color across the various maps, new layers get a new color assigned
[10:45:04] sjoerddebruin: Removing P31 from the qualifiers… from P569 and P570 only, right?
[10:45:30] hoo: ehm, let me check
[10:47:41] hoo: better add P571 and P576 too.
[10:48:38] Ok
[10:48:44] The suggestions of https://www.wikidata.org/wiki/Property:P585 are already terrible so that one can be ignored.
[10:50:42] sjoerddebruin: ok
[10:50:45] Running the script
[10:51:25] :/
[10:51:33] Now all suggestions for the properties are gone
[10:51:39] because the probability is low
[10:51:56] ugh
[10:53:35] sadness :/
[10:54:55] For "inception" it's still ok
[10:54:55] So the only way to get those back is to add more qualifiers or fix those Julian dates... :/
[10:55:48] Yeah, because the difference between the P31 and P1480 on that one is just 2k.
[10:56:06] http://tinyurl.com/jcfuypl
[10:59:06] The min probability is 0.069 and we're at 0.0650071 with P569 -> P1480 for example
[11:02:10] * hoo gives up for today
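A minimal SPARQL sketch of the map-layer colour behaviour WikidataFacts describes between 10:34 and 10:45 above, assuming the WDQS map view's ?layer variable for grouping markers into coloured layers. The items and properties used here (lighthouses Q39715, windmills Q38720, country P17, Netherlands Q55, coordinates P625) and the \u200B escape are illustrative choices, not a query anyone in the channel actually ran:

#defaultView:Map
SELECT ?item ?itemLabel ?coord ?layer WHERE {
  VALUES (?class ?layer) {
    (wd:Q39715 "lighthouse")         # first layer
    (wd:Q38720 "windmill\u200B")     # second layer, with an invisible U+200B appended
  }
  ?item wdt:P31 ?class ;             # instance of lighthouse / windmill
        wdt:P17 wd:Q55 ;             # limited to the Netherlands to keep the map small
        wdt:P625 ?coord .            # coordinate location, required for the map view
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}

Colours are still assigned automatically per layer name; the zero-width space only nudges a layer onto a different colour in the interactive UI (per the chat, embed.html ignores the name), and if the endpoint does not accept \u codepoint escapes the invisible character has to be pasted in directly.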
[11:02:50] Well that was for the qualifiers, the other properties can just be added right?
[11:03:03] I already did that
[11:03:14] Oh, will check my reading list then. Saved some pages.
[11:03:28] pids=(17 18 276 301 373 463 495 571 641 1344 1448 1476)
[11:03:40] that's the list of properties we clear item context suggestions for
[11:04:26] I think the Chinese ID is at least gone from most items.
[11:06:27] hoo: https://phabricator.wikimedia.org/T148022
[11:06:58] Films don't get mixed up with scientific articles anymore. \o/
[11:07:31] WikidataFacts: Subscribe Jonas_WMDE, he will probably have a look
[11:08:18] hoo: thanks
[11:08:29] * Subscribed
[11:08:31] meh
[11:08:42] and I was wondering why it wouldn't let me do it – you beat me to it :D
[11:08:55] I thought you were telling me to subscribe him ;)
[11:09:46] the hamming distance of English is clearly too low :P
[11:11:00] Streets don't ask for a heritage status anymore. \o/
[11:12:17] Well, I think this fixes most of the small annoyances; better start working on a new suggester. :)
[11:13:15] I hope I can find a Saturday to sit down with the suggester all day at some point
[11:25:41] WikidataFacts: is it possible to get a list of wanted surnames, stripping the last word of people's labels and counting them up?
[11:26:07] you mean from people without a surname statement?
[11:26:15] I can try, but sounds like it might be prone to timeout
[11:26:19] string operations are expensive
[11:26:26] but it could probably work if restricted to citizens of one country
[11:26:39] Yeah, we can try that. Or occupation etc.
[11:27:17] I'm searching for an easy way to boost the surname property.
[11:37:43] I made https://www.wikidata.org/wiki/User%3ASjoerddebruin%2FDutch_people_without_family_name meanwhile, giving Dutch people with a given name but no family name, sorted on family name.
[11:38:11] Oh, and a sitelink to nlwiki
[11:38:56] It still works without that, let me try that...
[11:44:27] Thiemo_WMDE: Could you please remove your -2 from https://gerrit.wikimedia.org/r/288524
[11:44:41] This is no longer a backwards compatibility breaking change
[11:46:33] Ugh, should start a discussion about Dutch surnames first....
[11:47:06] sjoerddebruin: http://tinyurl.com/hb72hx5
[11:47:28] for the Netherlands there are almost no results. Gee, I wonder who's responsible for that ;)
[11:47:52] :P
[11:48:25] This is Belgium though, they have a unique thing around family names.
[11:49:03] well I'm running out of smaller countries here, France already times out :(
[11:49:32] and with the commented-out code (that's supposed to find the name item if it exists), even Monaco is already too much
[11:49:41] If I had been born in Belgium, my name would be Sjoerd De Bruin and would be sorted on De Bruin. I was born in the Netherlands though, where my name is "Sjoerd de Bruin". The surname consists of a "tussenvoegsel" and the actual surname. Sorting is on the actual surname. :)
[11:50:18] heh, names are terrible :D have you seen this before? https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-believe-about-names/
[11:50:55] The same thing can be said about dates...
[11:51:08] yeah, there's a few of those lists flying around
[11:51:26] most stick to the term "falsehoods" so you can find them
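A rough SPARQL sketch of the kind of query being discussed here, not the exact one behind WikidataFacts' tinyurl link: take citizens of one country who have no family name (P734) statement, cut the English label down to its last word, and count the guesses. As noted above, the string handling is expensive (larger countries are likely to time out) and the last-word heuristic mishandles names with a tussenvoegsel such as "de Bruin":

SELECT ?guessedSurname (COUNT(?person) AS ?people) WHERE {
  ?person wdt:P31 wd:Q5 ;                       # instance of: human
          wdt:P27 wd:Q55 ;                      # country of citizenship: Netherlands
          rdfs:label ?label .
  FILTER(LANG(?label) = "en")
  FILTER NOT EXISTS { ?person wdt:P734 [] }     # no family name statement yet
  BIND(REPLACE(STR(?label), "^.* ", "") AS ?guessedSurname)   # keep only the last word
}
GROUP BY ?guessedSurname
ORDER BY DESC(?people)
LIMIT 100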
[11:52:53] The Peeters one looks good though, running that one now
[11:54:23] sjoerddebruin: what if you go to Belgium with your Dutch name and they want to sort you in a list? wouldn't they still sort you under "de bruin"?
[11:56:15] sjoerddebruin2: did you see the question I just asked?
[11:56:26] What is just?
[11:56:40] a few seconds before you joined again
[11:56:42] As you can see I'm having some connection problems.
[11:56:51] anyway, I asked: what if you go to Belgium with your Dutch name and they want to sort you in a list? wouldn't they still sort you under "de bruin"?
[11:57:26] Good question.
[11:58:10] They're probably going to write Sjoerd De Bruin :P
[12:05:55] But Belgium seems like an easy one to solve then.
[12:33:16] hey aude :)
[12:34:34] hoo: regarding https://gerrit.wikimedia.org/r/#/c/315623/, the parent patch isn't merged. Do you want to review it later or does someone else need to review the parent patch?
[12:35:25] Amir1: I planned to review it
[12:35:33] but it kind of got lost in between
[12:35:35] will do in a bit
[12:35:55] Thanks
[12:35:58] Awesome
[12:55:40] hi :)
[13:01:22] I am going to add to wikidata.org a couple of languages that are not defined in MediaWiki
[13:01:27] https://gerrit.wikimedia.org/r/#/c/312944/ adds sje and smj
[13:01:29] that is for Thiemo_WMDE / Jonas_WMDE I guess :)
[13:03:44] hashar: thanks :)
[13:07:15] hoo: https://test.wikidata.org/wiki/Q16170?uselang=kittens
[13:07:30] Existing entitytermsforlanguagelistview DOM does not match configured languages so the JS UI doesn't work
[13:07:44] oh wait
[13:08:02] allows editing but the backend doesn't allow saving
[13:08:13] :P
[13:08:51] but kittens is in the language box :P
[13:09:40] all solved for me
[13:09:49] ok
[13:10:20] see them in https://test.wikidata.org/w/api.php?action=help&modules=wbsetdescription
[13:57:05] Amir1: Follow-up to your patch: https://gerrit.wikimedia.org/r/315674
[13:57:11] missed this in the review
[14:35:19] hoo: merged
[14:36:29] Amir1: Thanks :)
[14:59:00] Labs certificate revoked?
[15:03:16] No, https://twitter.com/globalsign/status/786505261842247680
[15:03:28] Can also affect production, if you're unlucky
[15:03:40] It's hitting tools.wmflabs :(
[17:16:41] is this the right place to ask for help building a query, or is this more of a site/policy channel?
[17:18:33] iotashan: WikidataFacts can help you. :)
[17:19:09] hello :)
[18:17:10] Jonas_WMDE: click edit where? https://phabricator.wikimedia.org/T148060
[18:17:53] on a statement
[18:19:07] Hm, above a reference?
[18:23:32] oh, I've noticed that
[18:24:25] see the gif
[18:25:00] That's probably a bug in DuplicateReferences.
[18:25:10] I thought it was just some custom CSS messing it up, but thinking about it now, it doesn't make sense to have "remove" there afterwards, whether it overlaps or not
[18:25:10] I talked to you about that after the first fix, but you said back then that it would be fixed by the next deploy.
[18:31:54] yes, because I couldn't reproduce it on my local wiki
[18:32:05] aude do you know when we last deployed?
[18:41:22] the references header doesn't exist in vanilla Wikibase, DuplicateReferences inserts it to display the "copy reference" link
[18:41:37] (it doesn't exist in read mode, I mean)
[18:43:53] the bug also appears without the gadget
[18:45:00] I can't confirm it in private mode.
[18:45:22] I don't get it in a private window either
[19:42:58] Good evening. sjoerddebruin, do you know if quickstatements is back up?
[19:47:32] multichill: it should be.
[19:52:23] sjoerddebruin: https://www.wikidata.org/wiki/Special:Contributions/Multichill <- yeah, it works. Can finish the P&P stuff I started yesterday
[19:52:31] :)
[19:53:00] I couldn't do much because of https://phabricator.wikimedia.org/T148045 in the last 4 hours
[20:03:49] That's annoying sjoerddebruin. Back online with a different browser?
[20:04:12] No, used the workaround listed there and they installed another certificate now
[20:09:06] It's so sad that the Wikidata search of mix-n-match random mode is so slow
[20:13:25] sjoerddebruin: I did most of the import. Might contain some mistakes, but those will surface as dupes I think
[20:14:08] You mean https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P1749#.22Unique_value.22_violations? :)
[20:14:24] (merging those now)
[20:14:29] yup
[20:14:53] Oh, did I do that? ;-)
[20:15:25] If you get two different persons, the first one probably lost the election to the other and that is sourced to our favorite site
[20:16:36] Errors like this. https://www.wikidata.org/w/index.php?title=Q20113628&action=history
[20:17:00] (will fix in mix-n-match too)
[20:17:33] Yeah, relative errors too I guess
[20:19:00] There is something strange going on in http://www.biografischportaal.nl/persoon/11042750
[20:20:10] idk where that birth date comes from
[20:21:14] I think I exchanged emails with the person who built that one, sjoerddebruin. Not maintained at all atm :S
[20:21:29] :(
[20:35:50] multichill: https://www.wikidata.org/wiki/Q13136088 same author, same year...
[20:40:00] Ah, there is Josve05a. You've made an error here. :O https://www.wikidata.org/w/index.php?title=Q20113628&action=history
[20:40:09] multichill: fixed all doubles.
[20:40:09] :O
[20:40:29] drats!
[20:40:39] yah! We should be able to get https://www.wikidata.org/wiki/User%3ASjoerddebruin%2FDutch_politics%2FTweede_Kamer complete P&P-wise. I think about 20 left
[20:40:49] I already did a bunch of them
[20:41:00] 21,8% to go on mix-n-match
[20:45:00] Merging more items before they appear.
[20:47:53] sjoerddebruin: http://tools.wmflabs.org/multichill/painters/index_combined_new.php?propa=1749&propb=651 in case you feel like making some new ones
[20:48:12] I'd rather clean the current shit up. :)
[20:58:35] sjoerddebruin: pff, still plenty to match!
[20:59:51] multichill: https://nl.wikipedia.org/wiki/François_d%27Hoffschmidt and https://nl.wikipedia.org/wiki/Ernest_François_Joseph_d%27Hoffschmidt
[21:00:13] https://nl.wikipedia.org/w/index.php?title=François_d%27Hoffschmidt&type=revision&diff=44164208&oldid=41808354 error?
[21:00:52] Maybe, you should ask Lodewijk
[21:01:54] *mailing*
[21:12:28] sjoerddebruin: Matching is quite slow without mix'n'match....
[21:12:35] Database reports!
[21:14:18] Duplicates: done. Now need to fix the errors on mix-n-match
[21:20:03] multichill: does mix-n-match solve redirects automatically?
[21:23:23] no clue
[21:23:58] But you can sync it, so just fix Wikidata and sync it, sjoerddebruin?
[21:24:30] Yeah, but it will still say "x connections here, but not on Wikidata"
[21:25:00] I'll just wait a few minutes. :P
[21:45:01] hi everybody