[07:50:44] Hi.. I need to know how to link this category https://en.wikipedia.org/wiki/Category:Films_directed_by_Charlie_Chaplin with https://ml.wikipedia.org/wiki/%E0%B4%B5%E0%B5%BC%E0%B4%97%E0%B5%8D%E0%B4%97%E0%B4%82:%E0%B4%9A%E0%B4%BE%E0%B5%BC%E0%B4%B3%E0%B4%BF_%E0%B4%9A%E0%B4%BE%E0%B4%AA%E0%B5%8D%E0%B4%B2%E0%B4%BF%E0%B5%BB_%E0%B4%9A%E0%B4%B2%E0%B4%9A%E0%B5%8D%E0%B4%9A%E0%B4%BF%E0%B4%A4%E0%B5%8D%E0%B4%B0%E0%B4%99%E0%B5%8D%E0%B4%99%E0%B5%BE
[07:51:29] the latter is the Malayalam translation of the first.. I don't know how to link them.. can anyone help..?
[07:53:36] iBoT: remove the mlwiki link from the repository and then connect it to the English category
[07:54:16] how do I do that..? I'm completely new to Wikidata..
[07:55:05] https://www.wikidata.org/wiki/Q1152989 is the place where the sitelink is connected
[07:55:37] just remove the ml link
[07:56:45] oh.. got it.. thanks a lot...
[07:59:39] removed it.. now I'm on https://www.wikidata.org/wiki/Q8453315 ... but there is no option to add https://ml.wikipedia.org/wiki/%E0%B4%B5%E0%B5%BC%E0%B4%97%E0%B5%8D%E0%B4%97%E0%B4%82:%E0%B4%9A%E0%B4%BE%E0%B5%BC%E0%B4%B3%E0%B4%BF_%E0%B4%9A%E0%B4%BE%E0%B4%AA%E0%B5%8D%E0%B4%B2%E0%B4%BF%E0%B5%BB_%E0%B4%9A%E0%B4%B2%E0%B4%9A%E0%B5%8D%E0%B4%9A%E0%B4%BF%E0%B4%A4%E0%B5%8D%E0%B4%B0%E0%B4%99%E0%B5%8D%E0%B4%99%E0%B5%BE
[08:01:27] iBoT: click edit next to Wikipedia; at the bottom an empty field appears
[08:02:02] or go to the page on mlwiki and click "edit links"
[08:02:45] I've gone to mlwiki and clicked on edit links.. I'm getting "create a new item"... what's that..?
[08:03:37] matej_suchanek: with many options such as label, description and so on..
[08:03:49] hm, strange
[08:04:04] you should see a JavaScript widget
[08:04:32] never mind, there is still the second way: connect it directly in the repository
[08:08:48] iBoT: I have just made the connection; if you need more help, you can also read [[Help:Items]]
[08:08:49] https://www.wikidata.org/wiki/Help:Items
[09:28:29] addshore: https://gerrit.wikimedia.org/r/#/c/149183/
[09:40:59] hi hashar
[09:41:17] hashar, the bug in question is https://phabricator.wikimedia.org/T110518
[09:41:19] addshore: jzerebecki: aude: we would need a patch backport for mediawiki/extensions/Wikidata
[09:41:46] https://gerrit.wikimedia.org/r/#/c/232272/ changed the Wikidata.php entry point for CI/Jenkins so it is no longer hardcoded to a specific job name
[09:42:29] and Wikidata is now included in the ContentTranslation job, but that job has a different name, hence Wikidata is not properly loaded :-/
[09:43:00] hashar: we can do that
[09:43:12] aude: though you probably don't want to backport the whole patch :-(
[09:43:23] aude: this would be very helpful, because it's a blocker for deploying another urgent ContentTranslation patch
[09:43:27] but just the Wikidata.php bit, aka https://gerrit.wikimedia.org/r/#/c/232272/1/Wikidata.php,unified
[09:43:38] aharoni: we lost you from the hangout
[09:43:41] seems it is generated by the Wikidatabuilder
[09:43:49] Jan can probably take a look if he is here
[09:43:56] WAT
[09:43:56] else I can look
[09:44:29] I should be back
[09:44:48] aharoni: once the Wikidata extension has the fix included for both wmf branches, you can recheck your CX patch in Gerrit and it should work ™
[10:13:42] benestar: as far as I understood, DanielK_WMDE is ok with a fake API search module for the moment, but would also appreciate research into an integration with CirrusSearch, because that's what we ultimately want to have
[10:14:01] Jonas_WMDE: thanks for that summary :)
[10:14:09] that's also what jdlrobson would be ok with
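The sitelink move walked through at the top of the log can also be scripted. A minimal pywikibot sketch, assuming a configured and logged-in pywikibot setup; the two item IDs are the ones from the conversation, and this shows one way to do it, not what was actually run:

```python
# Move the mlwiki category sitelink from one Wikidata item to another.
import pywikibot

site = pywikibot.Site('wikidata', 'wikidata')
repo = site.data_repository()

old_item = pywikibot.ItemPage(repo, 'Q1152989')  # item wrongly holding the mlwiki sitelink
new_item = pywikibot.ItemPage(repo, 'Q8453315')  # item for the English category

# Read the sitelink title before removing it, so the same page can be re-attached.
ml_title = old_item.getSitelink('mlwiki')

old_item.removeSitelinks(['mlwiki'],
                         summary='moving sitelink to the matching category item')
new_item.setSitelink({'site': 'mlwiki', 'title': ml_title},
                     summary='connecting Malayalam category to its English counterpart')
```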
[10:16:23] jzerebecki, aude - any chance you could resolve https://phabricator.wikimedia.org/T110518 any time soon? It should probably take you just a few minutes, and it's a blocker for us.
[10:33:34] aharoni: made patches for the branches
[10:34:04] even though they don't really affect production, we'll want to sync them
[10:36:00] aude: thanks for the patch backports :-}
[10:37:07] I can probably just sync them, assuming Jenkins approves
[10:37:12] aude, hashar - who will merge and deploy them?
[10:37:18] aharoni: I can
[10:37:22] aude: wonderful, thank you
[10:37:29] it's Friday though, but I think this is ok :)
[10:38:14] I can formally +1 them for approval if you want
[10:38:38] waiting for the tests to complete
[10:40:28] ok
[10:46:28] multichill: started dewiki https://www.wikidata.org/w/index.php?title=Q20853366&diff=prev&oldid=246048613
[10:51:08] Jenkins approves :)
[10:51:29] wait.... we don't need this on wmf19
[10:51:32] aharoni:
[10:51:42] only wmf20 is deployed now
[11:10:07] aharoni: merged + synced for wmf20
[11:24:05] multichill: btw, can you run your query for fawp?
[12:48:32] Lydia_WMDE: https://gerrit.wikimedia.org/r/#/c/234488/
[12:48:39] someone has to give it +2
[13:13:55] multichill: around?
[13:18:18] aude + hashar_, again, thank you so much for the help [ cc Lydia_WMDE ;) ]
[13:21:01] aude: merge? https://gerrit.wikimedia.org/r/#/c/233994/
[13:22:03] aharoni: did your patch land?
[13:24:56] hashar_: it will go out in Monday's SWAT
[13:27:08] aharoni: excellent!
[13:46:43] addshore: https://phabricator.wikimedia.org/T110668
[14:11:10] BAHSKJHKSHJ
[14:11:13] aude: I hate this
[14:11:51] fix the JSON, make the XML look stupid :/
[17:10:46] Lydia_WMDE: arbitrary access on the biggest wikis is delayed due to performance, right?
[17:44:05] lazowik: that's correct :/
[17:45:07] no promises, but I'm thinking we can enable it after the next deployment (on September 10)
[17:45:21] i.e. sometime after September 10
[17:48:12] mhm
[17:48:29] are you solving that at the code level or the infrastructure level?
[17:49:28] and if in code, do you just lower the rate or whatever, or are there actual complexity wins to be gained?
[18:01:50] lazowik: what do you mean by lower the rate?
[18:02:06] of updates :p
[18:02:12] we are changing how some queries are done
[18:02:40] adding some deduplication of jobs
[18:02:44] (for updating usage)
[18:03:29] we will add a setting that limits the number of entities accessed on a given page (full access, not stuff like mw.wikibase.label, which uses a term lookup)
[18:04:04] the limit will be generous, but it will help if someone does something insane
[18:04:39] we are also improving how some of the Lua methods work, to use lookups instead of accessing full entities
[18:04:52] and we could add batching of things like label lookups
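To make aude's distinction above concrete: a term lookup (what mw.wikibase.label does) fetches only a label, while "full access" pulls the whole entity with all labels, claims and sitelinks. A rough sketch of the two request shapes against the public wbgetentities API, using the requests library; Q42 is just an arbitrary example item:

```python
import requests

API = 'https://www.wikidata.org/w/api.php'

def get_label(qid, lang='en'):
    """Cheap term lookup: ask only for the label in one language."""
    r = requests.get(API, params={
        'action': 'wbgetentities', 'ids': qid, 'format': 'json',
        'props': 'labels', 'languages': lang,
    })
    return r.json()['entities'][qid]['labels'][lang]['value']

def get_full_entity(qid):
    """Full access: the whole entity, with all labels, claims and sitelinks."""
    r = requests.get(API, params={
        'action': 'wbgetentities', 'ids': qid, 'format': 'json',
    })
    return r.json()['entities'][qid]

print(get_label('Q42'))                        # 'Douglas Adams'
print(len(get_full_entity('Q42')['claims']))   # a far bigger payload
```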
[18:13:14] !admin
[18:13:39] legoktm: hi
[18:13:47] 25 seconds??
[18:14:10] Such an ambiguous statement. Maybe? :)
[18:14:23] * aude going home :)
[18:14:27] JohnFLewis: could you remove the login-issues notice from https://www.wikidata.org/wiki/Wikidata:Main_Page/Content ? the bug has been fixed now
[18:14:42] https://phabricator.wikimedia.org/T109038#1584889
[18:15:09] I was just noting that it took 25 whole seconds to get an admin. I was used to instant response times back in the day ;)
[18:16:50] legoktm: I think it is a good time to say my last admin action was about 2 weeks ago ;)
[18:17:01] And done, marked for translation and so on
[18:17:06] thanks :D
[18:19:29] I just sent a verrrrrrrry long email to wikidata-l
[18:19:33] :D
[18:21:14] Amir1: not long enough to get moderated though :p
[18:21:32] I'll try harder next time :D
[18:23:59] what sort of things does it look at?
[18:25:04] I see quite a few pairs of people in jaHuman where it thinks the pair is a person
[18:26:39] yeah, it can have false positives
[18:26:46] depends on how the wiki treats duos
[18:26:53] etc.
[18:26:58] but most of them I can guarantee are mistakes
[20:03:25] hello, I had a closer look at the JSON dumps
[20:03:30] they are super complex ><
[20:03:48] Am I mistaken, or is it a hypergraph?
[20:05:27] Also I started looking at Wiktionary
[20:05:48] I think it will be easier for me to start by making the link between Wikidata and ConceptNet
[20:05:52] Amir1: Kian seems to be busy
[20:06:11] multichill: yessss
[20:06:29] more importantly, I'm getting a list of possible mistakes
[20:06:46] and seeing mess everywhere
[20:06:54] So what languages did you work on?
[20:07:34] de, ja, fa are finished
[20:07:39] fr is training
[20:07:55] How much stuff did you find for de? Do you have an overview?
[20:08:11] The next thing to train on for claimless items is sports
[20:08:13] and I'm training different models for different wikis too, e.g. for Japanese I have P31:Q5 and P31:film
[20:08:19] You can tag shitloads of items with some sport
[20:08:54] for de I got the report of mistakes
[20:08:57] which is huge
[20:09:04] and the bot is adding them
[20:09:13] but for sports, good point
[20:09:22] can you give me some value pairs to work on?
[20:09:43] Have to figure that out.
[20:10:08] Amir1: Something else. The LACMA import is done. You had a bot to add images based on collection and inventory number, right?
[20:10:29] I think so
[20:10:43] Can you have it run on all subcategories of https://commons.wikimedia.org/wiki/Category:Paintings_in_the_Los_Angeles_County_Museum_of_Art ?
[20:10:54] It will probably find close to 2000 paintings to illustrate
[20:11:09] sure thing
[20:11:15] just let me find the code
[20:11:40] Haha, I have that problem too. Sometimes I write code and later realize that I already solved that problem before
[20:12:23] I publish most of my scripts now
[20:12:30] but finding them on GitHub and Gist is hard too
[20:14:28] I also have scripts with names like phab104707_2.py
[20:18:32] Amir1: Yeah, I've seen your naming conventions :P Did you find it?
[20:18:52] :))))))
[20:19:00] I found two scripts
[20:19:11] one adds the inventory number to the Wikidata item
[20:19:52] That's not the one
[20:20:05] the second one adds the wikidata parameter to the file on Commons
[20:20:27] Hmm, images baby, we want images!
[20:20:43] Doesn't the first one add images to Wikidata items based on inventory number?
[20:21:26] do you mean P18?
[20:22:48] Writing the code from my old scripts is easy
[20:23:29] Yeah, P18; it would be nice if it also adds the backlink from the image to Wikidata
[20:24:13] Yeah
[20:24:15] okay
[20:24:37] Would it be okay if I run it tomorrow?
[20:24:55] I lost my wallet somewhere in my home (I hope) and I need to find it
[20:31:15] Amir1: Yeah sure, the items will still be there tomorrow
[20:31:49] thanks :)
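For the image run discussed above, a sketch of what the P18 step could look like in pywikibot. This is an assumption about the script's shape, not Amir1's actual code; find_matches() is a hypothetical placeholder for pairing Commons files with items via collection (P195) and inventory number (P217), which is the hard part and is left out here:

```python
import pywikibot

site = pywikibot.Site('wikidata', 'wikidata')
repo = site.data_repository()
commons = pywikibot.Site('commons', 'commons')

def add_image(item_id, filename):
    """Add an image (P18) claim to the item unless it is already illustrated."""
    item = pywikibot.ItemPage(repo, item_id)
    item.get()
    if 'P18' in item.claims:
        return
    claim = pywikibot.Claim(repo, 'P18')
    claim.setTarget(pywikibot.FilePage(commons, filename))
    item.addClaim(claim,
                  summary='adding image based on collection and inventory number')

# find_matches() is hypothetical: it would compare P195/P217 values on the
# painting items against the metadata of files in the LACMA subcategories.
for item_id, filename in find_matches():
    add_image(item_id, filename)
```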
[20:33:51] Amir1: To get more, see http://www.zone47.com/crotos/?p276=1641836
[20:36:07] loading
[20:39:38] hey Amir1 and multichill :)
[20:39:55] hey benestar :)
[20:40:01] working on your bot again? :)
[20:40:27] multichill: It's awesome
[20:40:40] benestar: on Kian?
[20:40:43] aude: See PM, in case you're around ;)
[20:41:30] yes, I saw it on the mailing list
[20:42:18] I've been working on it for the last two months
[20:42:40] It finished yesterday (I'm still working on it)
[20:42:52] but thanks :)
[21:56:16] There is massive documentation for this project; it's not only the dataset
[22:01:28] what is the difference between the wikidata and wikidata-l mailing lists?
[22:02:41] amz3-: nothing; wikidata-l is the old one and it was renamed to wikidata
[22:11:27] IIUC, Wiktionary -> Wikidata will happen mostly manually
[22:11:32] Amir1: thx :)
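On amz3-'s earlier question about the structure of the JSON dumps: the full dump (latest-all.json.gz) is one large JSON array with one entity serialized per line, so it can be streamed entity by entity without loading everything into memory. A minimal sketch, assuming the dump file has been downloaded locally:

```python
import gzip
import json

def iter_entities(path='latest-all.json.gz'):
    """Yield one entity dict per line of the dump."""
    with gzip.open(path, 'rt', encoding='utf-8') as f:
        for line in f:
            line = line.strip().rstrip(',')
            if line in ('[', ']', ''):
                continue  # skip the surrounding array brackets
            yield json.loads(line)

for entity in iter_entities():
    label = entity.get('labels', {}).get('en', {}).get('value', '')
    print(entity['id'], label)
    break  # first entity only, for the example
```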