[00:22:48] :D Flow https://www.wikidata.org/wiki/User_talk:Josve05a
[06:53:48] good morning
[09:14:08] Amir1: this is the "vandalism" I hate. https://www.wikidata.org/w/index.php?title=Q879275&action=history
[09:47:33] sjoerddebruin: I just saw it, I think my AI tool can handle that
[09:47:56] We have an abuse filter for it, but it doesn't seem to work...
[13:00:50] the observation that Wikidata is dead (not mine) seems to be correct
[13:01:02] loads of lurkers, no conversation
[13:02:01] Yup, some things in the project chat go away unnoticed.
[13:04:19] if it is not noticed, it did not happen
[13:04:37] or it might as well not have happened
[13:12:58] https://www.wikidata.org/wiki/Wikidata:Project_chat header should respect http://www.w3.org/QA/Tips/noClickHere
[13:14:33] 372 "actual" watchers out of 793, not bad. https://www.wikidata.org/w/index.php?title=Wikidata:Project_chat&action=info
[13:22:53] no conversation? that's not the case on https://fr.wikipedia.org/wiki/Discussion_Projet:Wikidata (page created on 22 August) :D
[13:27:20] how do I know there is a conversation, and what is the point made there?
[13:29:09] good points are made... but how do they get heard?
[13:30:02] the one thing I find interesting is that much is about infoboxes
[13:30:22] we are deploying Wikidata on the French infoboxes.
[13:30:37] good for you
[13:31:03] it will be a bit rough, but it is preferable to do it before the English do
[13:33:11] it will be more international as a result
[13:36:34] what is the biggest issue so far?
[13:37:36] we started by adding infoboxes to articles without one, but a lot of items were not in French (paintings in Sweden, football clubs, items in Japanese).
[13:38:45] some Wikipedians were also against Wikidata. They don't know the project, don't trust it, it's in English...
[13:39:21] that's why we started a talk page on fr: to discuss Wikidata
[13:39:55] Living under a rock? :)
[13:39:58] the biggest issue right now is how to easily edit Wikidata from Wikipedia
[13:40:08] ruwiki has some good gadgets for that
[13:42:35] The community is organizing an RFC ;) Right now it looks like this https://fr.wikipedia.org/wiki/Eug%C3%A8ne_Hugo
[13:47:53] how is that a Request for Comment... is it for the French community?
[13:48:26] yes
[13:50:18] there is redundant data in there
[13:51:01] no need to include brothers
[13:51:19] then again, the software is probably not good enough to make that inference
[13:51:40] ok solid
[13:52:12] Does it include references when they exist?
[13:53:12] LUA experts of fr: are working on references.
[13:53:29] I saw an example of that somewhere...
[13:54:17] cool
[13:54:33] it is important that we communicate about such things
[13:54:54] one reason to blog is to inform about things that I learn
[13:56:04] Ah, the Russians again. https://ru.wikipedia.org/wiki/Зимар,_Губерт_Теофил
[14:02:20] he was awarded several times
[14:03:02]
[14:03:17] hard to get that right
[14:34:09] Amir1: You around? You still operate a bot to automagically set labels, right?
[14:34:56] hey, yeah
[14:35:18] Also stuff like "American painter" based on claims?
[14:36:35] for descriptions?
[14:36:42] yeah I do that for Persian
[14:37:02] I gave my code to some people and they used it for their own language, I recall Norwegian
[14:55:00] Amir1: Ok. Would be nice to have it as a service
[14:57:07] I'd like to work more on this. Can you explain more?
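A toy Python sketch of the claim-based descriptions being discussed here (multichill spells out the occupation + country-of-citizenship idea just below). The property IDs P106 (occupation) and P27 (country of citizenship) are real, but the demonym/occupation tables and the phrasing are made up for illustration; Amir1's actual code may work quite differently.

```python
# Toy "American painter"-style description builder from item claims.
# Assumes an item JSON blob as returned by wbgetentities; the demonym and
# occupation tables below are illustrative stand-ins, not real data sources.

DEMONYMS = {"Q30": "American", "Q55": "Dutch"}             # country of citizenship (P27)
OCCUPATIONS = {"Q1028181": "painter", "Q36180": "writer"}  # occupation (P106)

def claim_values(item, prop):
    """Return the item IDs referenced by all claims of the given property."""
    return [
        c["mainsnak"]["datavalue"]["value"]["id"]
        for c in item.get("claims", {}).get(prop, [])
        if c["mainsnak"].get("snaktype") == "value"
    ]

def english_description(item):
    """Build a short description such as 'American painter', or None."""
    countries = [DEMONYMS.get(q) for q in claim_values(item, "P27")]
    jobs = [OCCUPATIONS.get(q) for q in claim_values(item, "P106")]
    if countries and countries[0] and jobs and jobs[0]:
        return f"{countries[0]} {jobs[0]}"
    return None
```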
[14:58:38] multichill: ^
[14:58:45] I always forget to ping
[15:00:06] SetLabelsAsAService
[15:00:08] SLAAS
[15:00:23] Say for example an item has a claim occupation -> painter and country of citizenship -> Netherlands and no description in some language
[15:00:31] A bot could add the missing description based on that
[15:00:41] Like what Magnus is doing with reasonator
[15:09:49] hmm
[15:10:12] okay, I'll start working on it in mid-September
[15:10:35] pretty busy with the anti-vandal AI system for Wikidata
[15:11:15] Multichill, but would that be why the person is notable?
[15:11:59] arguably with a few more statements he would also be a sculptor and a writer... what then?
[15:13:27] we would be better served with a tool that helps people with problematic statements or problematic info in a Wikipedia
[15:16:18] we already have tools to create descriptions
[15:16:39] use the ones Magnus uses and make them fixed and you are done
[15:16:58] we do not have enough workflows
[15:55:25] addshore: around? :)
[16:05:47] Almost hitting the 300k...
[16:12:18] benestar: ya
[16:12:44] addshore: why do we have date serializations with zero months and days?
[16:12:52] like 2013-00-00Tetc
[16:13:29] that way I cannot parse it using .NET DateTime :S
[16:13:39] benestar: crap unit tests? :D
[16:13:48] Or it's only specifying the year
[16:14:06] Reedy: it means that the month/day are unknown
[16:14:09] but that's not valid ISO :S
[16:14:27] lol
[16:14:34] indeed not valid ISO
[16:14:37] DateTime.TryParse()
[16:14:42] but I don't think things get serialized like that now, do they?
[16:14:43] obj["value"]["time"] = obj["value"]["time"].ToString().Replace("-00", "-01"); // hack!
[16:14:44] hehe
[16:14:51] it's just legacy data
[16:17:16] benestar: I finally added support for generators... it was easier than I thought.... care to merge this? https://github.com/addwiki/mediawiki-api/pull/26
[16:17:53] also Reedy do you like how confusing I am trying to be? See the example usages in the readme! https://github.com/addwiki/addwiki/blob/master/README.md
[16:18:48] * Reedy slaps addshore
[16:19:01] yeh... I didn't actually notice until someone pointed it out...
[16:19:08] I guess I'll change it to aw ;)
[16:31:32] addshore: bah, .NET also doesn't support negative dates and very large years
[16:31:57] hi
[16:32:37] hi benestar
[16:32:43] hi eurodyne ;)
[16:53:05] benestar: neither does PHP
[17:06:18] addshore: I think I need to use something different than the standard implementation then :S
[17:07:50] yes :P
[17:08:02] you need to implement DataValues/Time in .net ;)
[18:10:50] addshore: Does LUA have a way to access user #babel?
[18:13:26] related: https://phabricator.wikimedia.org/T104912
[18:15:03] Nemo_bis: Thanks for the pointer
[18:16:59] multichill: urrr, before you start messing with that, talk to Hoo
[18:17:39] multichill: splitting the parser cache by user language is one thing. splitting the parser cache by any possible combination of babel boxes... urrrr...
[18:17:43] DanielK_WMDE: No messing, just writing some documentation and wondering how to only show the links relevant to that user instead of all 65 links
[18:18:09] showing what links where?
[18:18:25] multichill: hm, put css classes on the links, make babel emit rules for these?
[18:18:45] That sounds like a good approach from a caching perspective
[18:21:23] multichill: Special:MyLanguage ?
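For reference, a small Python sketch of the time-value problem benestar and Reedy discuss above: Wikidata serializes values like "+2013-00-00T00:00:00Z", where a zero month or day means the precision is coarser than a day rather than valid ISO 8601. The helper name and the clamp-to-01 behaviour simply mirror the chat's Replace("-00", "-01") hack; this is not an official parser, and it deliberately ignores negative years and very large years.

```python
from datetime import datetime

def parse_wikidata_time(time_str):
    """Parse a Wikidata time string such as '+2013-00-00T00:00:00Z'.

    Zero month/day components are clamped to 01 before handing the string
    to datetime (the same trick as the .NET Replace("-00", "-01") hack).
    Negative years and years outside datetime's range are not handled.
    """
    s = time_str.lstrip("+")
    date, _, clock = s.partition("T")
    year, month, day = date.split("-")
    month = month if month != "00" else "01"
    day = day if day != "00" else "01"
    return datetime.strptime(f"{year}-{month}-{day}T{clock}", "%Y-%m-%dT%H:%M:%SZ")

# Example:
# parse_wikidata_time("+2013-00-00T00:00:00Z") -> datetime(2013, 1, 1, 0, 0)
```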
[18:22:03] Take a look at https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Find_paintings_on_Wikipedia
[18:22:08] It now has a hardcoded list
[18:22:31] What I would like to do is use some LUA magic to generate the links from https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Find_paintings_on_Wikipedia
[18:23:07] multichill: yea, but caching...
[18:34:48] addshore: I think it isn't possible to remove a label using your PHP wikibase API client
[18:37:09] benestar: use a LabelSetter? Term with a language and value = ''
[18:37:12] i think
[18:37:18] addshore: hmm, I see
[18:37:37] afaik it can do everything ;)
[18:37:38] but what if I want to modify the whole entity and also remove a label?
[18:38:04] well, you can use the revision saver
[18:38:24] make a new revision holding the changed entity and save it.. should work!
[18:38:27] yeah, I meant that, but afaik you have to submit a diff to that
[18:39:03] a diff?
[18:39:15] ohhhhhh, hmmmmm
[18:39:17] yes, add remove: true
[18:39:19] that may actually be true!
[18:39:30] does the serialized entity contain empty labels?
[18:39:33] I guess it doesn't
[18:41:25] well, the api may also work just setting a label's value to "" too
[18:41:25] not 100% sure though
[18:41:25] if you're passing back the whole entity
[18:41:25] if it doesn't work, file a bug ;)
[18:49:14] wow, massive lag
[18:49:17] addshore: yes but the datamodel doesn't support empty labels afaik
[18:50:22] oh, it does :o
[18:50:45] :P
[18:51:07] but if this doesn't work, this is probably a use case for a setentity module....
[18:51:18] ;0
[18:53:50] or an option for editentity, something like 'exact' => true ? :P
[19:00:14] addshore: setentity would be editentity with the "clear" flag set.
[19:00:27] that's why we have that flag
[19:00:41] mhhhm, i guess, but then we need to stop making that summary be "Cleared an entity"
[19:00:46] :P
[19:01:01] o_O
[19:01:11] that's not at all what that flag does...
[19:01:13] I think that's the default summary when you use the clear flag
[19:01:17] we use that summary? wtf?
[19:01:33] even if there is new data?
[19:01:36] and then when using edit entity normally it is something like "edited an entity"
[19:01:47] DanielK_WMDE: yes, the summaries are dumb,
[19:01:54] really it should do a diff and make a summary from that
[19:02:24] see my draft thing @ https://github.com/wmde/WikibaseDataModelServices/commit/7eaaf4ed7f3ce16333eabafbd9b0659715c6697c ;)
[19:03:01] addshore: clear=true should do it ;)
[19:03:32] yeh, I think my api thing only sets that if the entity is empty currently though
[19:03:47] would make sense to always set it?
[19:04:31] even with the ugly summaries? :P Maybe always force a default different summary too ;)
[19:05:12] I will try whether empty labels also remove them
[19:05:30] but what about removed statements...
[19:07:15] yeh, I guess it should always set clear, want to make a PR? ;)
[19:19:20] sjoerddebruin: If you feel like a puzzle, check out https://www.wikidata.org/wiki/Q245230
[19:19:23] You touched it before
[19:19:43] ugh...
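As a point of comparison to the addwiki label-removal discussion above, here is a hedged Python sketch of the raw action API route: as far as I know, wbsetlabel with an empty value drops the label, and wbeditentity with clear=true replaces the whole entity with the posted data. Double-check both against the live API documentation before relying on this; the snippet also skips login and CSRF-token handling entirely.

```python
import requests

API = "https://test.wikidata.org/w/api.php"  # test instance; be careful on wikidata.org

def remove_label(session, csrf_token, item_id, language):
    """Remove a label by setting it to an empty value via wbsetlabel.

    As far as I know an empty value removes the label; verify against the
    API documentation before running this against real items.
    """
    return session.post(API, data={
        "action": "wbsetlabel",
        "id": item_id,
        "language": language,
        "value": "",          # empty value -> label removed
        "token": csrf_token,
        "format": "json",
    }).json()

# Authentication (login + CSRF token fetch) is omitted here for brevity.
```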
[19:20:37] It's an artistic theme, but all the labels are still wrong
[19:20:54] And some of the connected pages might be disambiguation pages :S
[19:22:12] removed the desc
[19:23:12] nothing weird in the sitelinks
[19:28:19] Other ones: http://tools.wmflabs.org/autolist/index.php?language=nl&project=wikipedia&category=Wikipedia%3ADoorverwijspagina&depth=12&wdq=claim%5B31%3A4167410%5D%20or%20claim%5B31%3A(tree%5B82799%5D%5B%5D%5B279%5D)%5D&mode=undefined&statementlist=&run=Run&label_contains=&label_contains_not=&chunk_size=10000
[19:30:05] Weird things like https://nl.wikipedia.org/wiki/Dinitrofenol
[19:36:31] benestar: https://github.com/addwiki/wikibase-api/pull/36
[19:37:19] addshore: I cannot merge that =o
[19:37:24] ohrly?
[19:37:54] "Only those with write access to this repository can merge pull requests." :(
[19:38:16] invited you, thought you were already in it!
[19:44:16] addshore: I think in Wikibase.git "Cleared an entity" should become either "Set an entity" or "Overrode an entity"
[19:44:33] yeh, I think "Set an entity" would make sense
[19:44:47] make a PS for that? ;)
[19:44:48] Maybe just use the same summary as without setting clear=true
[19:45:08] yeh, or that, I mean that's just "Edited an entity" I believe
[19:56:45] benestar: what would your opinion be if I were to move those libs to gerrit? or keep them in both?
[19:57:13] addshore: it would be a next step to make it like pywikibot
[19:57:21] haha :P
[19:57:30] did I already show you the command line bit?
[19:58:49] nope
[19:59:03] benestar: look at the readme of https://github.com/addwiki/addwiki
[19:59:14] of course I have barely implemented anything, but you can kind of see where it is going ;)
[20:01:18] hmm, awb?
[20:01:30] yeh, *facepalm* I'm going to just change it to aw.....
[20:02:15] just change it to addshore
[20:02:19] xD
[20:02:27] so every time it doesn't work I can say "addshore isn't working for me"
[20:02:35] mwahahahaaa
[20:02:37] addshore fix wikidata
[20:02:47] addshore task:restore-revisions --wiki localwiki --user localadmin 555
[20:02:55] I might call it aww
[20:03:00] and you get a notification on your phone and start restoring the revisions *muhaha*
[20:03:05] not sure what the last w is for, but aww!
[20:03:10] yes, that's it :D
[20:03:27] cool, it shall be called aww!
[20:03:45] if you ever have spare time I have a long todo list ;)
[20:03:49] * benestar thinks mediawiki will one day use mediawiki-datamodel
[20:03:49] hehe
[20:04:22] * addshore would love to see more value objects in mediawiki..... I mean they could already make a value library and put title value in there
[20:04:48] addshore: is TitleValue actually used anywhere?
[20:04:58] I think it is in some places
[20:32:26] multichill: 12462 humans on nlwiki left without birth date...
[20:33:21] sjoerd, is that also because of Kian?
[20:33:39] A very little bit.
[20:33:59] Kian can't add birth dates
[20:34:29] But that number could grow thanks to Kian
[20:34:30] I can make it add the year, but it isn't worth the programming hassle
[20:35:40] Wish the Wikidata Game had an option to skip people with a birth or death date in the date game.
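One way to reproduce sjoerddebruin's "humans on nlwiki without a birth date" count today is a SPARQL query against the Wikidata Query Service, wrapped in Python. The query below is a plausible reconstruction for illustration, not the tool actually used for the 12462 figure.

```python
import requests

# Counts items that are human (P31 = Q5), have a nlwiki sitelink,
# and have no date of birth (P569). A reconstruction of the figure
# mentioned above, not the original query.
QUERY = """
SELECT (COUNT(?item) AS ?count) WHERE {
  ?item wdt:P31 wd:Q5 .
  ?article schema:about ?item ;
           schema:isPartOf <https://nl.wikipedia.org/> .
  FILTER NOT EXISTS { ?item wdt:P569 ?dob . }
}
"""

response = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "example-birthdate-count/0.1"},
)
count = response.json()["results"]["bindings"][0]["count"]["value"]
print(f"Humans with an nlwiki article but no P569: {count}")
```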
[20:36:44] A lot of birth dates weren't imported by multichill because there was no comma where there should be one, like https://nl.wikipedia.org/w/index.php?title=Charles-Thomas_de_Schietere_de_Lophem&type=revision&diff=44849406&oldid=44810652
[20:46:27] aude: https://www.mediawiki.org/wiki/User:Duesentrieb/Fix_interwiki
[20:50:21] Amir1: would be interesting to see if it could guess the correct century, though :)
[20:52:51] Back
[20:53:29] sjoerddebruin: We could play around with more regular expressions
[20:53:51] Funky dashes are fun
[21:12:49] You can also do a run for pages created after the time you've done the last batch
[21:16:24] multichill: https://commons.wikimedia.org/w/index.php?title=Creator:Abraham_Samuel_Fischer&diff=prev&oldid=170960125
[21:17:18] Ah, can you import the | Occupation = painter, Amir?
[21:17:47] | Nationality = DE is also useful I guess, but harder. Gender and dates should be easier
[21:18:10] Amir1: Or are you still importing?
[21:18:57] we can do that in other scripts
[21:19:09] I'll wait for this to finish
[21:19:52] Since it does the difficult job of finding out which Wikidata item relates to whom
[21:20:23] (It gets the VIAF from the Commons template, sends it to autolist, gets the item and checks it)
[21:22:17] multichill: An idea, we can make a category for each of the creator arguments, e.g. if nationality is in Commons but not in Wikidata, or they differ.
[21:22:21] and so on
[21:22:45] That's not possible yet
[21:23:04] We'll figure something out
[21:23:12] arbitrary access?
[21:23:44] oh I see now
[21:23:53] Okay
[21:39:25] What is the deal with those random qunit test failures on the Wikibase CI?
[22:49:11] hey I need some help please
[22:51:15] alo
[22:55:50] Amir1, hi
[22:56:41] are you Amir1 from wikifa?
[23:58:32] hey, I need some bot help
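For context on the VIAF-to-item step Amir1 describes above (Commons Creator template → VIAF → Wikidata item): his script goes through autolist, but the same lookup can be sketched against the query service, since VIAF IDs live in property P214. The Python below is an illustrative alternative, not his actual workflow.

```python
import requests

def items_with_viaf(viaf_id):
    """Return the Wikidata item IDs whose VIAF ID (P214) matches viaf_id.

    Illustrative lookup via the query service; Amir1's bot resolves the
    match through autolist and then double-checks the item instead.
    """
    query = f'SELECT ?item WHERE {{ ?item wdt:P214 "{viaf_id}" . }}'
    data = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": query, "format": "json"},
        headers={"User-Agent": "example-viaf-lookup/0.1"},
    ).json()
    return [
        b["item"]["value"].rsplit("/", 1)[-1]
        for b in data["results"]["bindings"]
    ]

# A result list longer than one entry would itself be worth flagging,
# since a VIAF ID should normally match a single item.
```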