[02:31:50] (PS2) Aude: Use SiteStoreFactory in top-level factories [extensions/Wikibase] - https://gerrit.wikimedia.org/r/174992
[02:32:01] (CR) jenkins-bot: [V: -1] Use SiteStoreFactory in top-level factories [extensions/Wikibase] - https://gerrit.wikimedia.org/r/174992 (owner: Aude)
[08:47:26] Amir1: Did you see the report updates?
[08:48:58] Amir1: https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P1612#.22Conflicts_with.22_violations is about done, just some left
[08:49:28] https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P1472 contains quite a few interesting things to import
[09:06:13] multichill: hey :)
[09:06:18] I was afk for lunch
[09:07:59] ah, what have you been up to Amir1?
[09:09:50] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: I was not sure about that. It is never set via a constructor and stands for itself here. http://git.io/fePKNA
[09:10:00] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: Logically, you are right and I had the same thinking when writing the initial documentation. However, technically, it is wrong and the JSDuck parser complains about that. You cannot have a required attribute after an optional attribute. Anyway, I will correct the documentation since the languageCode param may also be a MultiTermMap. http://git.io/Gz94vw
[09:10:47] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: You are right, it is technically correct at the moment. But actually, looking at the code, that error is missing. Setting a ```MultiTermMap``` with a language code does not make any sense. http://git.io/v7DKLQ
[09:11:32] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: Would need to implement an extra JSDuck tag parser for that special purpose and since there is no compatible jQuery documentation to jump to, it does not make much sense anyway. All common uses of ```@see``` have been replaced by ```@inheritdoc```. http://git.io/SY9DTQ
[09:12:49] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: If you find one on https://github.com/senchalabs/jsduck/wiki#syntax, you can tell me. Could use ```@class wikibase.datamodel``` with ```@singleton```, but that would render differently. http://git.io/CuIJ7A
[09:13:45] multichill: I had harvested info from commons
[09:13:55] but I haven't harvested place of birth
[09:14:13] That's harder I guess. Occupation is probably easier to start with
[09:15:13] Amir1: Take for example https://commons.wikimedia.org/w/index.php?title=Creator:Karl_Josef_Gollrad&action=edit
[09:15:18] You should probably make a lookup table
[09:15:35] yes, I was checking https://commons.wikimedia.org/w/index.php?title=Creator:Bruno_Lechowski&action=edit
[09:15:38] And some logic if it's "painter / writer" etc
[09:16:24] What I would do is make the lookup table. Loop over the creators. If it's one and it's in the lookup table -> add it
[09:16:39] If it's multiple, split it, if all of them are in the lookup table -> add all of them
[09:16:55] That way you can do multiple iterations
[09:17:09] It's easy for me to do, I can consider "," and similar things too
[09:19:43] For place of birth, place of death you have to puzzle a bit more
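A minimal sketch of the lookup-table approach multichill describes above; the table contents, item ids, and split pattern are illustrative assumptions, not harvested data:

```python
import re

# Occupation label -> Wikidata item id; entries here are examples only.
OCCUPATION_LOOKUP = {
    'painter': 'Q1028181',
    'writer': 'Q36180',
    'sculptor': 'Q1281618',
}

def match_occupations(field):
    """Split an occupation field like "painter / writer" and return the
    matching item ids, but only when every part is in the lookup table,
    so unknown parts can be handled in a later iteration."""
    parts = [p.strip().lower() for p in re.split(r'[/,]', field) if p.strip()]
    matches = [OCCUPATION_LOOKUP.get(p) for p in parts]
    if parts and all(matches):
        return matches
    return []  # skip for now
```

Unknown labels then surface naturally: extend the table, rerun the loop, and each iteration picks up more creators.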
[09:48:13] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: Missing param documentation. http://git.io/5zQbTw
[09:53:21] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: Probably needs to be declared ```@protected``` as the member is accessed by subclasses (see ```ClaimList```). Should probably be applied in ```Map``` and ```Set``` as well. http://git.io/e2xFeA
[10:16:28] [WikibaseDataModelJavaScript] snaterlicious comment on pull request #20 eb3ae1a: ```wikibase.datamodel.Statement.RANK``` http://git.io/dPz5WQ
[10:54:41] Amir1: Back, you still around?
[10:54:58] multichill: yeah, https://www.wikidata.org/wiki/Special:Contributions/Dexbot
[10:55:14] finished writing it, the bot is working
[10:55:40] Ah, nice, that should make these reports much smaller
[10:55:49] I think so
[10:55:59] we can see it tomorrow
[10:56:29] You can use the autolist link to see the (almost) live status
[10:57:48] Amir1: Look at https://www.wikidata.org/wiki/Property_talk:P1472 . The links are in the templates
[10:58:23] That uses Wikidata query. You know you can use WDQ as a generator in Pywikibot?
[10:58:38] neat
[10:58:46] yes I use that
[11:00:17] howdy
[11:00:20] anyone awake?
[11:00:50] mdupont: It's midday in Europe so some people are ;-)
[11:01:18] Even jayvdb from Australia is awake
[11:03:03] :D
[11:03:21] the Special:MergeItems of Q18152333 into Q5964 says: Error: Conflicting labels for language sq., but i removed all the sq labels in the dest.
[11:03:36] the special merge never worked once for me
[11:04:58] * multichill looks
[11:06:31] mdupont: WORKSFORME.....
[11:06:37] Are you using the merge gadget?
[11:07:11] no
[11:07:14] the special merge
[11:07:23] the merge gadget works,
[11:07:39] mdupont: Don't use that, just use the merge gadget and tick redirect
[11:08:20] ok, i am going to report a bug on the merge special
[11:08:34] they should remove that cause it cost me hours of pain
[11:08:39] I'm not sure if it's a bug
[11:08:47] it is a bug for sure
[11:08:54] it sees sq in the target
[11:09:08] but it was removed, there was no sq in there
[11:09:24] mdupont: See https://www.wikidata.org/w/index.php?title=Q18152333&diff=175948292&oldid=175948282
[11:09:24] also the site is unusable in firefox
[11:09:30] It still had an sq label
[11:09:30] yes, thanks multichill
[11:09:35] what?
[11:09:38] let me see
[11:10:19] that is the source
[11:10:23] And the destination also had an sq label. So that causes the label conflict
[11:10:23] Kategoria:Candidates for speedy deletion is the right one
[11:10:31] the dest had one?
[11:10:39] how can i see that? i deleted them
[11:10:55] Kategoria:Faqe për grisje
[11:11:15] mdupont: You're mixing up labels and sitelinks
[11:11:25] ok
[11:11:33] let me double check this, thanks
[11:11:38] https://www.wikidata.org/wiki/Wikidata:Glossary
[11:12:30] is there any way to render the RDF of the page?
[11:14:27] mdupont: Not sure, what do you want to use it for? It's mostly JSON
[11:15:29] ok, then can i see the JSON page?
[11:15:43] i mean i would like to see the full page in some computer-readable form?
[11:15:54] Amir1: Are you also working on the locations with instance of?
[11:16:00] https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q1999&props=claims <- example
[11:16:20] Manual at https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities
[11:16:26] ok thanks!
[11:16:46] I check for having P131
[11:16:58] to be sure it is a location, not something else
[11:17:05] mdupont: It really depends on what you want to do.
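For reference, a sketch of fetching that JSON programmatically with the wbgetentities module linked above; the item id is just the example from the chat:

```python
import requests

# Fetch the claims of an item as JSON, as in the example URL above.
resp = requests.get(
    'https://www.wikidata.org/w/api.php',
    params={
        'action': 'wbgetentities',
        'ids': 'Q1999',
        'props': 'claims',
        'format': 'json',
    },
)
claims = resp.json()['entities']['Q1999']['claims']
# e.g. the P131 check Amir1 mentions (located in an administrative entity)
print('P131' in claims)
```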
[11:17:09] multichill, has the pywikibot been adapted to wikidata?
[11:17:18] Of course
[11:17:24] ok, need to check that out.
[11:17:34] How do you think me and Amir1 are doing so many edits? ;-)
[11:17:43] well i am working on a bot and want to use the categories from wikidata as input
[11:17:49] :D
[11:18:00] https://www.mediawiki.org/wiki/Manual:Pywikibot/Scripts#Wikidata
[11:18:04] alright, Amir1 you have some code online?
[11:18:22] https://www.mediawiki.org/wiki/Manual:Pywikibot/Wikidata
[11:18:25] Ouch, https://www.mediawiki.org/wiki/Manual:Pywikibot/Wikidata is quite outdated
[11:18:39] mdupont: Don't, I stress, don't use compat, use core
[11:19:27] ok
[11:19:51] WikidataBot as a base class in pywikibot
[11:20:08] ok thanks!
[11:20:25] Yes, extend on that one
[11:20:42] yes, use core
[11:20:45] alright!
[11:20:52] Amir1: I meant the huge list on https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P1566
[11:21:31] Amir1: And the suggestion at https://www.wikidata.org/wiki/Wikidata:Project_chat#Having_separate_properties_for_controlled_ontologies_is_silly to just move it doesn't seem very appealing to me
[11:22:48] It's possible to get the country from coordinates
[11:23:11] I don't know if any library is written for that but it won't be a hard thing to do
[11:23:38] Ehm, yes, I did that in the past
[11:23:50] Reverse geocoding lookup. Nominatim I think
[11:24:22] and see my talk page in wikidata
[11:24:31] this user wrote this to me too
[11:25:22] Amir1: http://nominatim.openstreetmap.org/reverse?format=xml&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1
[11:26:06] Screw xml, go json: http://nominatim.openstreetmap.org/reverse?format=json&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1
[11:27:03] that's amazing, why is it not fixed yet
[11:27:05] ?
[11:27:13] Fixed what?
[11:27:25] you can add country to the items easily
[11:27:28] P1566
[11:28:14] Yeah, that's the easy part. Adding instance of is a bit harder
[11:28:21] ok i found the hidden labels page
[11:28:35] and was able to fix that
[11:28:36] Wow, that's quite a few items to do
[11:28:59] Should be able to fix about 28K items
[11:29:25] Be sure to source it to Nominatim if you use it Amir1, that makes debugging easier
[11:29:40] It isn't that good near borders
[11:30:12] Oh, and don't add country to country items
[11:30:15] multichill: I think there should be a way to ignore coordinates around borders
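A sketch of that reverse-geocoding lookup using the Nominatim URL multichill links above; the User-Agent contact is a placeholder, and per the caveats above results near borders deserve suspicion:

```python
import requests

# Reverse-geocode a coordinate to a country code via Nominatim.
resp = requests.get(
    'http://nominatim.openstreetmap.org/reverse',
    params={
        'format': 'json',
        'lat': 52.5487429714954,
        'lon': -1.81602098644987,
        'zoom': 18,
        'addressdetails': 1,
    },
    # Nominatim's usage policy asks clients to identify themselves.
    headers={'User-Agent': 'ExampleBot/0.1 (contact: example@example.org)'},
)
address = resp.json()['address']
print(address['country_code'])  # 'gb' for the example coordinate
```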
[11:35:59] Database reports. <3
[11:36:07] Who wants to make database reports an extension?
[11:36:11] Or fix all the broken ones?
[11:43:21] Amir1: I wonder why we were not able to match https://commons.wikimedia.org/wiki/Creator:Willem_Wenckebach
[11:43:35] because I just started
[11:43:40] it's "W"
[11:44:17] Amir1: You're matching creator templates with items again? I thought you were done with that part
[11:45:14] http://wdq.wmflabs.org/wdq/ <-- When will something equivalent to this be available on en.wiki?
[11:45:19] I think there's a Wikibase Query something.
[11:45:31] I was hoping by the end of 2014.
[11:45:32] oh I thought about info
[11:45:37] let me check
[11:46:14] Take creator template without wikidata link, go to home category. Use wdq to see if some item uses that Commons category. If that item is human, connect
[11:46:30] Amir1: That logic would be straightforward, right?
[11:46:56] i used VIAF id
[11:47:03] I can use commons cat too
[11:48:04] We're in the last bit we can match, can be a bit dirtier
[12:11:36] I just realized we can't use wdq in labs
[12:11:47] we can but I have to change it to another url
[12:11:55] http://wdq.wmflabs.org/api?q=string[373:%22Ludwig%20Willem%20Reymert%20Wenckebach%22]
[12:12:01] this url doesn't work inside labs
[12:13:40] I can use a proxy :D
[12:26:02] multichill: url = u"http://5.hidemyass.com/includes/process.php?action=update&obfuscation=1&u="
[12:26:03] f=urllib2.urlopen(url + urllib2.quote('http://wdq.wmflabs.org/api?q=string[373:%s]' % json.dumps(tem.get('Homecat',"").strip())))
[12:26:34] Amir1: Why?
[12:26:51] since the inside urls don't work in labs
[12:27:09] Oh, just use the output from the twitter link
[12:27:35] I used hidemyass, no difference
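A minimal sketch of the matching step outlined above, querying WDQ for an item whose Commons category (P373) equals a creator template's home category; it reuses the string[373:...] query form from the chat, and the "is it a human" check is left out:

```python
import json
import urllib2

def item_for_commons_category(homecat):
    """Return the item id if exactly one item has P373 == homecat."""
    # json.dumps adds the quotes WDQ expects around the string value.
    query = 'string[373:%s]' % json.dumps(homecat.strip())
    url = 'http://wdq.wmflabs.org/api?q=' + urllib2.quote(query)
    data = json.loads(urllib2.urlopen(url).read())
    items = data.get('items', [])
    if len(items) == 1:
        return 'Q%d' % items[0]
    return None  # no match or ambiguous; skip
```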
[12:31:52] multichill, ok, i was able to generate a list of cats from the wikidata, this is great. here is my hack: https://github.com/speedydeletion/pywikibot/blob/master/scripts/speedydeletion.py#L3700
[12:32:09] it creates a set of calls to the transfer bot to save those articles
[12:33:18] How do I indicate that people are in a hall of fame?
[12:33:38] multichill: https://commons.wikimedia.org/wiki/Special:Contributions/Dexbot
[12:33:59] sjoerddebruin: Maybe member of? Part of? Not sure
[12:34:02] Or awards?
[12:34:24] I think member of, award is a little bit weird imo.
[12:35:47] Same thing, how do you indicate that someone has a star on the walk of fame...
[12:35:55] Probably easiest to look for someone else who is in the same hall of fame
[12:36:01] (whatlinkshere)
[12:36:11] No one. ;P
[12:36:18] Lydia_WMDE: data access for commons? :D
[12:36:24] JohnFLewis: yes yes
[12:36:36] 'lo people
[12:36:40] <- tired in stockholm
[12:37:06] Why Stockholm?
[12:37:32] giving a talk on monday
[12:37:38] Ah, they use a separate item for that. http://tools.wmflabs.org/reasonator/?&lang=nl&q=17985761
[12:37:38] at internetdagarna
[12:37:40] Ah
[13:25:43] Amir1: https://commons.wikimedia.org/w/index.php?title=Creator:Jean_Daret&diff=prev&oldid=140248368 , you lost the Q
[13:26:25] oh thanks
[13:26:35] let me fix things that need to be fixed
[13:26:56] Last time that happened it was quite a bit of work to clean up....
[13:27:31] I will clean all of them up
[13:27:57] I don't know why I didn't use re.sub when I wrote it
[13:28:26] hehe
[13:35:07] multichill: all of the mistakes are reverted and I fixed the code and now it's working correctly
[13:37:52] yah!
[13:38:49] :)
[13:52:59] I got to go
[17:43:37] hi! is there a wikidata sandbox or a testing site?
[17:45:41] Sure there is
[17:45:51] https://www.wikidata.org/wiki/Q4115189 is a sandbox item
[17:45:59] https://test.wikidata.org/
[17:46:04] that is a whole testwikidata ;)
[17:48:21] perfect
[19:11:10] Lydia_WMDE: I demand we make a #FeministHackerBarbie about Wikidata
[19:11:33] JeroenDeDauw: lol totally!
[19:11:48] a what?
[19:12:11] hoo: you've not been on the internets for a day or two?
[19:12:14] :D
[19:12:15] hoo: https://twitter.com/BoingBoing/status/535864808164585472
[19:12:32] JeroenDeDauw: ideas for one?
[19:13:25] https://twitter.com/CLFLN/status/535458870093967360/photo/1 oO
[19:14:01] :D
[19:16:39] https://twitter.com/JenJenMi/status/535138862730076160/photo/1 wow :P I need to read up more I guess
[19:18:11] JeroenDeDauw: :D
[19:18:17] not in the office on monday!!!
[19:18:55] she's working in a different country because JeroenDeDauw's been annoying :p
[19:19:13] i deny any comment on this! :D
[19:19:27] but secretly she acknowledges it is correct!
[19:19:28] Lydia_WMDE: You can haz make captions here https://computer-engineer-barbie.herokuapp.com/new
[19:19:50] How can that be correct? Lydia would always be out of the country if that was correct
[19:20:02] JeroenDeDauw: So... can you print again? :D
[19:20:09] lol
[19:20:13] hoo: yeah I can actually
[19:20:22] we're doooooooomed
[19:20:27] totally
[19:21:12] Not entirely. I have to walk over to the printer with a USB key. That's way too much effort
[19:21:24] And the tard thing does not recognize pngs and jpgs
[19:21:53] JeroenDeDauw: But the real question is: Can it print animated gifs?
[19:22:36] ok we're only dooomed
[19:22:38] \o/
[19:24:20] hoo: funnily enough it recognizes animated gif files, though sadly it prints them in non-animated form
[19:24:30] Would be funny if it printed one page per frame :)
[19:25:24] :D
[19:34:28] [WikibaseDataModel] JeroenDeDauw closed pull request #272: Add requirements for language fall backs. (master...languageFallBack) http://git.io/kKxPEg
[20:34:52] delete Q18566009: spam
[20:35:20] done
[20:35:29] sjoerddebruin: urg :(
[20:35:38] :D
[20:35:42] * JohnLewis really needs log actions
[20:36:19] Here. https://www.wikidata.org/w/index.php?title=Special:NewPages&hideliu=1&hidepatrolled=1
[20:36:28] I see vandalism. <3
[20:37:11] heh why is labs editing yet again
[20:37:23] idk
[20:37:40] But, delete... https://www.wikidata.org/wiki/Q18562485
[20:39:08] yay
[20:40:17] sjoerddebruin: 5 deletions there :p
[20:40:28] just 5 more and I meet activity for another 6 months :D
[20:40:32] And mark the good pages please.
[20:40:38] no :p
[20:40:54] :(
[20:48:09] delete/undelete your user pages a few times and you get your log actions ;)
[20:48:45] A sysop on nlwiki created redirects to get above the local rules
[20:48:50] Stryn: genius but I'd prefer it be your page :p
[20:48:58] xD
[21:53:45] Lydia_WMDE: Has something changed in the sorting of results?
[21:54:28] Oh wait, the number of sitelinks of https://www.wikidata.org/wiki/Q225298 changed.
[21:54:49] Now the good men-property is unreachable again, need to load the second batch every time.
[21:54:55] item*
[21:55:11] sjoerddebruin: hey :)
[21:55:17] sjoerddebruin: k
[21:55:23] sorting of suggestions when adding a new property? or value?
[21:55:27] value
[21:55:36] nothing should have changed
[21:55:50] Yeah, read the rest.
[21:55:51] Lydia_WMDE: heh 'Ok peeps. What's the best item out there?' :p
[21:56:06] JohnLewis: not enough answers :(
[21:56:13] Q2013 of course
[21:57:00] But now I need to tell a new contributor that he needs to load a second batch of results to mark someone as male.
[21:57:17] Oh, wait (s)he. ;)
[21:59:19] just say he
[21:59:22] generic masculine FTW
[22:01:19] Aaargh, why is all this based on number of sitelinks... -_-
[22:07:50] Lydia_WMDE: tell me the dev team have done more than that 1 thing :p
[22:09:11] JohnLewis: lol yes. let me look into that now
[22:09:45] sjoerddebruin: how much further down is it? maybe we can just increase the number of results shown?
[22:09:55] 1 position.
[22:10:32] There is a group of people above it with two sitelinks.
[22:10:42] And the correct one has only 1 sitelink. :(
[22:11:11] I also see two disambiguation pages, booooring.
[22:11:27] Lydia_WMDE: k :p
[22:12:34] sjoerddebruin: ok let me put that on my todo to investigate some more next week
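For context on the "second batch" above: a sketch of what fetching those suggestions looks like, assuming the selector is backed by the wbsearchentities API module (an assumption here; the ranking itself happens server-side):

```python
import requests

def search_entities(term, offset=0, limit=7):
    """Search items the way the suggestion widget does; 'continue'
    fetches the next batch when the wanted item ranks too low."""
    resp = requests.get(
        'https://www.wikidata.org/w/api.php',
        params={
            'action': 'wbsearchentities',
            'search': term,
            'language': 'en',
            'type': 'item',
            'limit': limit,
            'continue': offset,
            'format': 'json',
        },
    )
    return resp.json()['search']

first_batch = search_entities('male')
second_batch = search_entities('male', offset=7)  # the second batch
```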
[22:13:02] Here they are. https://www.dropbox.com/s/hf42leytjkysria/Schermafdruk%202014-11-22%2023.12.57.png?dl=0
[22:14:15] thx
[22:22:19] JohnLewis: there you go :)
[22:22:31] k :p
[22:38:21] Lydia_WMDE: Status update: "Bug fixes" I think we should be more or less specific than that :D
[22:39:26] hoo: heh maybe...
[22:39:52] but it's so much small stuff that in itself isn't really news
[22:40:06] it's more the general thing that counts: we care and fix things
[22:40:09] imho
[22:40:25] Like we didn't last week? :P
[22:40:30] Just sounds like that a bit
[22:41:05] heh fair enough
[22:41:36] hoo: less specific? :p
[22:41:46] JohnLewis: * Did things
[22:41:47] 'we may have fixed or introduced bugs...'
[22:42:00] ah
[22:45:13] rofl
[22:58:46] Lydia_WMDE: https://www.wikidata.org/wiki/User_talk:Magnus_Manske#Oauth_problem
[22:59:00] Seeing a shitload of edits coming from https://www.wikidata.org/wiki/Special:Contributions/10.68.17.174
[23:01:34] multichill: hmmmm never heard of that problem -.-
[23:01:36] thanks
[23:01:45] New for me too
[23:01:52] Magnus should be using assert
[23:02:02] i don't know if there is anything we can do besides speeding up the move to hhvm
[23:02:32] We can't... Magnus could use API asserts, though
[23:02:50] if the edits are a problem, I can softblock the IP for a bit
[23:03:19] The edits are alright, otherwise I would already have placed a block on it
[23:03:45] Pff, why don't people just use assert? It's there to prevent this kind of crap....
[23:04:21] He's probably not aware of it
[23:04:35] it's not really an advertised feature
[23:13:07] JohnLewis: ok i think that's all from my side for the weekly summary
[23:13:18] ok
[23:13:27] https://www.wikidata.org/wiki/Wikidata:Status_updates/Next <- everyone have a look if anything important is missing :)
[23:16:10] Lydia_WMDE: Well... we're almost done with Statements on Properties. Probably
[23:16:45] hoo: i hope so too but nothing yet for the weekly summary i'd say :/
[23:17:01] though...
[23:17:04] The code is there... I just need to make someone press the merge button
[23:17:17] let me link to the discussion for preparing for statements on properties
[23:17:30] there was one on project chat
[23:19:06] hmmm it's already in the archive
[23:24:42] hoo: give me merge and I will ;)
[23:25:58] Sounds legit :D
[23:26:22] Wow, I could actually do that (technically)
[23:26:37] But I guess Lydia would like the idea... somewhat less ... :P
[23:28:07] ;-)
[23:28:09] correct
[23:29:28] Lydia_WMDE wouldn't mind. She loves me too much to say no :D
[23:29:58] JohnLewis: sush! :D
[23:32:13] * aude cries.... wants to report a bug
[23:32:25] aude: me too! :D
[23:32:35] heh
[23:32:54] i want to break "Do not load full entities when invoking LinkBegin hook" into smaller tasks
[23:33:03] aude: Just fix it :P
[23:33:08] hoo: what?
[23:33:19] :)
[23:33:20] The bug, easier than reporting right now
[23:33:44] ah
[23:34:05] lol
[23:36:32] * JohnLewis mumbles
[23:36:45] the bz->phab migration is the worst planned thing
[23:39:37] here https://www.mediawiki.org/wiki/User:Aude/Bugs :)
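On the assert discussion above: a sketch of the API assert parameter multichill mentions, which makes an edit fail instead of silently saving logged-out when the session is lost; the page title and token here are placeholders, not a working edit:

```python
import requests

# 'assert=user' makes the API return an 'assertuserfailed' error
# instead of saving the edit as an anonymous IP when the session
# has been lost (e.g. the OAuth/session trouble discussed above).
resp = requests.post(
    'https://www.wikidata.org/w/api.php',
    data={
        'action': 'edit',
        'title': 'User:ExampleBot/sandbox',  # placeholder
        'text': 'test',
        'token': 'EDIT_TOKEN_HERE',          # placeholder
        'assert': 'user',
        'format': 'json',
    },
)
error = resp.json().get('error', {})
if error.get('code') == 'assertuserfailed':
    print('Session lost; log in again before retrying')
```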