[04:32:30] !admin
[04:32:30] Please visit https://www.wikidata.org/wiki/WD:AN
[19:59:58] reosarevok: do you know if ELNET is officially participating in VIAF or still in a testing phase? :))
[20:42:32] Lydia_WMDE: Do you know how often the property suggestion table is updated?
[20:42:41] I just ran the query for https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Most_used_painting_properties and got the exact same result as in December last year
[20:43:02] multichill: we probably haven't updated it since then. it's a manual thing
[20:43:18] hoo does it mostly
[20:44:29] Maybe a good thing to automate to run every once in a while?
[20:45:03] It still says 325,000 paintings and we're at 355,000 now, so it must have been some time ago :-)
[20:45:22] yeah, it's on the list of things to automate indeed
[20:46:20] I recall some cutoff minimums, do you know if those are documented somewhere?
[20:50:17] I am not sure, sorry
[20:55:00] Nvm, I can always just look at the source code ;-)
[21:02:53] :D
[21:02:58] true
[21:12:19] All sorts of mentions of csv files. I quickly closed it...
[21:18:15] Hi:
[21:18:16] What's the recommended workflow to upload a JSON-LD dataset to Wikidata? Maybe something more elegant than converting it into tables?
[21:19:01] Guest14481: What kind of data is it? What is the license?
[21:23:05] It's a database of historical heritage items
[21:23:51] I still have to fully check the licensing, but now I'm intrigued about how to work with JSON-LD data here.
[21:24:24] Good! You have it in a structured format, which makes it easier than, say, scraping
[21:24:32] You would have to map the data to the Wikidata model
[21:24:52] I know, I tried OpenRefine before
[21:25:22] For example I just imported works like https://data.collectienederland.nl/data/aggregation/catharijneconvent/ABM-s371.json-ld
[21:25:30] and I've been reading about other import methods, but they all seem to be some form of converting to tables for import
[21:25:35] And they're now visible at https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Collection/Museum_Catharijneconvent
[21:26:06] I'm wondering if there is a more RDF-native way, or something.
[21:26:18] How are your programming skills? Do you know any Python?
[21:27:19] My colleague and I are taking our first steps, absolute newcomers but pretty interested in this project.
[21:27:53] I found the PyLD library too, but I'm still not sure which workflow to use.
[21:28:05] Do you have a link to an example record? What kind of heritage items exactly? Are we talking buildings or objects?
[21:28:49] buildings and geolocated places, yes
[21:29:12] For what country? We might have existing items you can expand on
[21:29:55] the example: http://olea.org/tmp/ficha-inmueble-1867.jsonld
[21:30:00] We have a lot of heritage data in https://commons.wikimedia.org/wiki/Commons:Monuments_database/Statistics . A lot has been moved to Wikidata, but there is still plenty to do
[21:30:30] from Andalusia, Spain, and yes, I've been adding updates of this data by hand before
[21:31:06] in any case this is the source data we ultimately use at WD
[21:33:31] So let's see, you probably assign a unique id to each building?
[21:33:38] 01040500002 in this case?
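On the property counts from the 20:42 exchange: the query behind the "Most used painting properties" page isn't shown in the log, but a sketch of this kind of count against the public Wikidata Query Service could look like the following. The query text and the LIMIT are illustrative assumptions, not the report's actual query, and at full scale such a query can hit the WDQS timeout.

```python
# Illustrative only: count how often each property is used on paintings
# (Q3305213) via the Wikidata Query Service SPARQL endpoint.
import requests

QUERY = """
SELECT ?property (COUNT(*) AS ?uses) WHERE {
  ?painting wdt:P31 wd:Q3305213 ;
            ?claim ?value .
  ?property wikibase:claim ?claim .
}
GROUP BY ?property
ORDER BY DESC(?uses)
LIMIT 20
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "painting-property-counts-sketch/0.1 (example)"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["property"]["value"], row["uses"]["value"])
```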
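For the RDF-native workflow Olea is asking about, a minimal first step could use the PyLD library he mentions: expand the JSON-LD so every field becomes a full IRI, then map those IRIs to Wikidata properties. A sketch under those assumptions, fetching the example record shared above; the mapping table is a hypothetical placeholder to be filled in once the hand-modelled record is agreed on.

```python
import json
from urllib.request import urlopen

from pyld import jsonld

# Fetch the example heritage record shared in the discussion.
with urlopen("http://olea.org/tmp/ficha-inmueble-1867.jsonld") as resp:
    record = json.load(resp)

# Expansion resolves the @context, so each key becomes a full IRI and the
# Wikidata mapping can be a plain IRI -> property-id lookup table.
expanded = jsonld.expand(record)

# Hypothetical mapping; the real table comes out of the modelling step.
IRI_TO_WIKIDATA = {
    # "http://example.org/vocab/denominacion": "label",
}

for node in expanded:
    for predicate, values in node.items():
        target = IRI_TO_WIKIDATA.get(predicate)
        print(predicate, "->", target, values)
```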
[21:34:52] A good strategy is to take one record (building) and completely model it on Wikidata by hand and get feedback on that
[21:36:59] This is a current entry for the same database: https://www.wikidata.org/wiki/Q61045615
[21:37:46] > A good strategy is to take one record (building) and completely model it on Wikidata by hand and get feedback on that
[21:37:47] Yep, I have this in mind because I never did an exhaustive transcription; I think I know where to ask about these details
[21:38:39] And you already have the property https://www.wikidata.org/wiki/Property:P3318 to link records
[21:38:50] My doubt is more about how to upload the data itself, after the modelling/mapping is approved
[21:39:01] > And you already have the property https://www.wikidata.org/wiki/Property:P3318 to link records
[21:39:01] yes, exactly
[21:39:34] the idea is to upload the whole database in full with some scripting, you know.
[21:39:36] Are you in touch with Wikimedia Spain?
[21:39:47] sure
[21:39:48] :)
[21:39:50] I'm a member too
[21:40:48] Excellent. Do you know https://www.wikidata.org/wiki/User:Escudero ?
[21:41:16] He did a lot of work on the Prado import
[21:41:58] you can see that at https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Collection/Museo_del_Prado
[21:43:14] > Excellent. Do you know https://www.wikidata.org/wiki/User:Escudero ?
[21:43:15] I think not, but I know David Abián is working with WD too
[21:45:17] Would be nice to have it done before September (Wiki Loves Monuments)
[21:46:26] sure, there are more than 22K items!
[21:49:26] The certificate on https://guiadigital.iaph.es/bien/inmueble/19519 is broken, btw. Do you have any place to report that?
[21:50:34] The SSL certificate? It seems correct on my system
[21:52:23] The chain is incomplete, see https://www.ssllabs.com/ssltest/analyze.html?d=guiadigital.iaph.es&latest
[21:55:38] Maybe I installed the root certificates by hand in the past
[21:56:44] Anyway: SSL on Spanish government websites has been more of a nightmare for years.
[21:57:44] Platonides might have mentioned that before....
[21:58:24] :)
[22:00:04] (I think Matrix is not showing my real Freenode ID, Olea)
[22:01:36] Your hostmask contains it, but you didn't set your nick
[22:19:40] olea: Good luck with the import
[23:07:30] good night
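The conversation doesn't settle on an upload tool, but Pywikibot is one common choice for scripted Wikidata edits once the modelling is approved. A minimal sketch, assuming a configured Pywikibot login (user-config.py), that links the example item Q61045615 back to its source record with the P3318 property and record id mentioned above:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

item = pywikibot.ItemPage(repo, "Q61045615")
item.get()  # load existing claims so we don't add duplicates

if "P3318" not in item.claims:
    claim = pywikibot.Claim(repo, "P3318")
    claim.setTarget("01040500002")  # record id from the source database
    item.addClaim(claim, summary="Link item to source heritage record")
```

At the scale of 22K items, a dedicated bot account and a bot approval request on Wikidata would normally come before running anything like this in bulk.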
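On the certificate issue from 21:49: an incomplete chain means the server omits an intermediate certificate, so strict clients that haven't already cached that intermediate fail validation while others still succeed, which is why the site looked fine on one system but not in the ssllabs test. A small sketch reproducing the strict check with Python's standard ssl module:

```python
# Verify the server's certificate chain the way a strict client would.
# A server with an incomplete chain typically fails this handshake, even
# though a browser with a cached intermediate may still accept it.
import socket
import ssl

HOST = "guiadigital.iaph.es"

context = ssl.create_default_context()
try:
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("chain verified:", tls.getpeercert()["subject"])
except ssl.SSLCertVerificationError as err:
    print("verification failed:", err)
```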