[04:10:57] wait, what is ceb.wiki?
[04:14:26] I find it odd that this "ceb" language has pages on places in Norway that the Norwegian wiki does not :/
[04:15:39] also they seem to link to Swedish pages, but those pages do not exist on the sv. side
[04:15:46] examples: https://ceb.wikipedia.org/wiki/Nordre_H%C3%B8gmyra
[04:16:02] https://ceb.wikipedia.org/wiki/S%C3%B8ndre_H%C3%B8gmyra
[04:16:29] https://ceb.wikipedia.org/wiki/H%C3%B8gmyr%C3%A5
[07:06:12] CatQuest: https://www.quora.com/Why-are-there-so-many-articles-in-the-Cebuano-language-on-Wikipedia
[07:16:54] (for more discussion on the issues, https://meta.wikimedia.org/wiki/Proposals_for_closing_projects/Closure_of_Cebuano_Wikipedia)
[11:08:05] Is nobody in the Wikidata team working today?
[11:08:13] um
[11:08:20] we thought we were in the hangout…
[11:08:27] leszek_wmde: ^
[11:08:49] :O
[11:09:09] Am I in the wrong hangout?
[11:09:20] https://hangouts.google.com/hangouts/_/wikimedia.de/wmde-wd-daily
[11:09:26] well, we're done already
[11:09:35] no idea if that *was* the hangout we were in or not
[11:09:51] There are two: https://hangouts.google.com/hangouts/_/wikimedia.de/wikidata-daily
[11:10:13] oops
[11:10:34] wikidata-daily is the right one
[11:10:47] wmde-wd-daily must be the old one that got dropped last week
[11:10:52] sorry for the confusion
[11:11:05] Why did we drop the hangout link we used for years?
[11:11:40] Thiemo_WMDE: please feel free to do daily-over-IRC at #wikimedia-de-tech
[11:12:04] Thiemo_WMDE: re the hangout link, I believe that was an accident
[11:12:34] Lydia created a new calendar entry starting Thursday last week. o_O???
[11:14:08] Thiemo_WMDE: yup, I deleted the old one. The old entry was created in a way that made it difficult to update the participant list (my mistake a long time ago), and it also couldn't be shared in some other calendars (or something like that; some Google Calendar weirdness)
[11:15:05] Ok. Shouldn't it be possible to edit the new calendar entry then? For me it isn't.
[11:16:21] Thiemo_WMDE: looks like only Lydia has edit permissions
[11:20:05] Great.
[11:21:51] I can't even say whether I accept the calendar entry; it's like I'm not even invited.
[11:23:05] hm, it shows you as a guest on my end…
[11:24:42] I think I need to step back from Google Calendar for today.
[12:38:55] "three million articles, as is the case with Cebuano today, and only ten editors to maintain it all"
[12:38:55] oh dear
[12:39:04] that doesn't sound good at all :(
[13:11:14] seems to me the solution is to delete the bot and promote the Cebuano-language Wikipedia to speakers, to get a large enough community to fix issues
[16:47:56] Hi SPARQL wizards, is there any way I can check whether a particular QID is an instance of place / whether a QID has coordinates?
[16:51:25] the latter would be FILTER EXISTS { ?item wdt:P625 ?coordinates. } or just ?item wdt:P625 ?coordinates.
[16:55:05] Thanks for the response. That works great
[16:55:46] cool!
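
(Aside: a minimal Python sketch of the coordinates check discussed above, run against the public Wikidata Query Service. The wdt:P625 triple is from the conversation, and Q1353 is the example item pasted shortly after this exchange; the surrounding requests call and JSON handling are one plausible way to wire it up, not anything shown in the log.)

    import requests

    # Ask WDQS whether an item has any coordinate location (P625) statements.
    ENDPOINT = "https://query.wikidata.org/sparql"
    QUERY = """
    SELECT ?coordinates WHERE {
      VALUES ?item { wd:Q1353 }
      ?item wdt:P625 ?coordinates.
    }
    """

    response = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"})
    response.raise_for_status()
    # Standard SPARQL JSON results: an empty bindings list means no match.
    bindings = response.json()["results"]["bindings"]
    print("has coordinates" if bindings else "no coordinates")
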
[16:56:34] Also trying the former one
[16:56:42] SELECT ?coordinates
[16:56:42] {
[16:56:44] VALUES ?item { wd:Q1353 }
[16:56:44] ?item wdt:P31/wdt:P279* wd:Q486972
[16:56:49] }
[16:57:13] yeah, that looks about right
[16:58:14] Lucas_WMDE, doesn't seem to work though http://tinyurl.com/ycq3h6cb
[16:58:55] well, you're selecting only ?coordinates, and ?coordinates doesn't appear anywhere in the query now
[16:59:29] oh
[16:59:44] btw if you really only want a yes/no answer, you can also use a slightly more exotic query form: http://tinyurl.com/y7qxjt24
[17:00:43] oh wow, didn't know about the ASK functionality
[17:00:48] that's the best
[17:01:09] basically I want to know, given a QID, whether it is a geographical feature or not
[17:02:07] Also Lucas_WMDE, a question. Can I assume that if a feature in OpenStreetMap is assigned a Wikidata ID, it will always have coordinates in Wikidata?
[17:02:17] probably not
[17:02:20] or that it will be an instance of some subclass of human settlement?
[17:02:33] I don't think you can assume either of those things
[17:03:08] in particular, I suspect a feature in OSM might also be assigned the item ID of some organization, for example
[17:03:12] for the headquarters on the map
[17:03:18] but the organization wouldn't be a human settlement
[17:03:27] sorry, afk for a few mins
[17:03:36] ooh yes, makes sense
[17:03:40] no worries
[17:03:50] thanks for helping me out :bow:
[17:14:41] Hey hoo. :)
[17:15:18] (I'm back btw)
[17:23:08] hi sjoerddebruin :)
[17:23:42] Busy weeks?
[17:27:24] Yeah, exams
[17:27:42] Aww, good luck then.
[17:50:01] thanks :)
[17:53:21] https://www.wikidata.org/wiki/Wikidata:Property_proposal/Authority_control#MuIS_person_or_group_ID is it a problem that it's RDF only?
[17:53:36] (they don't have normal pages for the artists, just RDF/XML API entries)
[17:54:04] It'd be nice for bot use, but it might be weird for people if they click and their browser downloads a file...
[18:04:55] I don't see a problem.
[18:09:13] Ok then :) Wanted to make sure
[18:09:39] (I wanted to excitedly tell multichill about our painting plans, but he's still not around. Oh well)
[19:13:15] Hey, are there any Wikidata peeps with OTRS access?
[19:16:05] I think so
[19:17:46] I have.
[20:14:45] reosarevok: you're painting Wikidata blue? tsk tsk, only a few days on the EE board and already delusions of grandeur
[20:14:52] ;)
[20:18:40] reosarevok: there is multichill
[20:18:47] oooh
[20:18:50] thanks :)
[20:18:58] Hi multichill!
[20:19:15] sjoerddebruin: You'd know: https://www.wikidata.org/w/index.php?title=Q19832283&curid=21433998&diff=634088911&oldid=580352061 <- en variants, we weren't doing those twice, were we?
[20:19:28] multichill: sometimes I don't know anymore.
[20:19:44] multichill: https://www.wikidata.org/wiki/Wikidata:Property_proposal/Authority_control#MuIS_person_or_group_ID :) We're planning to start importing data for ~4k paintings soonish :)
[20:19:54] WHO SUMMONED ME? ;-)
[20:19:57] https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Labels_and_descriptions_in_language_variants
[20:20:36] https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/NikkiBot seems to say it is acceptable
[20:21:33] reosarevok! eek!
[20:21:36] :D
[20:22:05] reosarevok: Looks nice. What collection?
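
(Aside: picking up the SPARQL thread from earlier this hour — a sketch of the ASK query form mentioned at 16:59, which returns a bare true/false instead of result rows. The P31/P279* path and Q486972 (human settlement) come from the pasted query; the Python wrapper is an assumed harness, not something from the log.)

    import requests

    # ASK yields {"head": {}, "boolean": true/false}, which suits
    # "is this QID an instance of some subclass of Q486972?" checks.
    ENDPOINT = "https://query.wikidata.org/sparql"
    QUERY = """
    ASK {
      VALUES ?item { wd:Q1353 }
      ?item wdt:P31/wdt:P279* wd:Q486972.
    }
    """

    response = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"})
    response.raise_for_status()
    print(response.json()["boolean"])  # True or False
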
[20:26:42] multichill: https://www.wikidata.org/wiki/Wikidata%3AWikiProject_sum_of_all_paintings%2FCollection%2FTartu_Art_Museum
[20:26:48] (as you can see there's quite a bit to add :p )
[20:27:18] For other big collections I split it up at some point into the different time periods
[20:28:14] reosarevok: Did you find my code to re-use?
[20:28:53] multichill: which one? (I guess that's a no? :D )
[20:29:47] multichill: https://github.com/wikimediaeesti/muis-data/blob/master/import-tkm-paintings.py is what I have right now for this one (I haven't started writing the turning-data-into-statements part though)
[20:31:38] reosarevok: I am running https://github.com/multichill/toollabs/blob/master/bot/wikidata/auckland_import.py right now
[20:31:44] They have a nice JSON-based API
[20:32:04] Also running https://github.com/multichill/toollabs/blob/master/bot/wikidata/yale_uni_art.py , that's scraping
[20:32:51] This one is XML/RDF
[20:33:10] But hmm, artdatabot, huh. That might be easier than setting every property one by one :p
[20:33:42] artdatabot contains all the logic; for each source you just have to write a generator
[20:33:48] Although of course, "Dude, write your own bot" :p
[20:33:54] I might have an XML example, let's see
[20:34:10] I mean, I have already extracted all the info
[20:34:37] That's the most work. Now you just have to dump it in the right fields and right format and yield it as a dict
[20:34:40] (except I only have author IDs and author name strings, so I need the property I linked earlier to match them by ID rather than guessing)
[20:34:58] I did that for the Berlin thing
[20:35:41] I should be able to just use your bot - what's the license? Can I just throw artdatabot.py into my repo? :p
[20:36:36] (I was going to just write it all myself, but it seems your stuff does everything I wanted it to do, so eh)
[20:36:45] reosarevok: https://github.com/multichill/toollabs/blob/master/bot/wikidata/berlinische_galerie_import.py
[20:37:13] Contains some simple lookup code for GND (German authority control)
[20:37:47] It's pywikibot-based, so MIT
[20:38:08] k
[20:38:25] Our current plan is to import all the metadata, then use author death dates to figure out what we can ask the museum to send us in high res :)
[20:39:00] I use pre-1923 right now for the automated import
[20:39:20] https://commons.wikimedia.org/wiki/Special:ListFiles/BotMultichill
[20:39:58] A bot runs every night that matches paintings to painters based on the name of the painter
[20:42:55] reosarevok: Use requests
[20:43:33] sjoerddebruin: Would you mind a PM about an OTRS ticket?
[20:43:42] No problem.
[20:45:06] multichill: requests for what?
[20:45:34] Don't use urllib, use http://docs.python-requests.org/en/master/
[20:47:32] Does it really make a difference for literally just "urlopen"? I mean, I'm not using anything else...
[20:48:26] Good best practice; not sure if urllib even works in Python 3
[20:52:21] I'll keep it in mind for when I look into it more tomorrow :)
[21:46:24] *sigh* https://www.wikidata.org/wiki/Special:Contributions/Sjoerddebruin
[22:01:50] abian: who gets to decide what something is a subclass of?
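
(Aside: a sketch combining the two pieces of advice above — fetch records with requests rather than urllib, and feed an artdatabot-style bot one dict per painting from a generator. RECORD_URLS, the Dublin Core title lookup, and the dict keys are hypothetical placeholders; neither the MuIS API layout nor artdatabot's expected dict format is shown in the log.)

    import requests
    import xml.etree.ElementTree as ET

    # Hypothetical record URLs -- the real MuIS API layout is not shown here.
    RECORD_URLS = ["https://example.org/museum/record/1.rdf"]

    def painting_generator(urls):
        """Yield one metadata dict per RDF/XML record."""
        for url in urls:
            response = requests.get(url)
            response.raise_for_status()  # fail loudly on HTTP errors
            root = ET.fromstring(response.content)
            # The extraction below depends entirely on the source schema;
            # a Dublin Core title lookup is shown purely as an illustration.
            title = root.findtext(".//{http://purl.org/dc/elements/1.1/}title")
            yield {"title": title, "url": url}

    for painting in painting_generator(RECORD_URLS):
        print(painting)
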