[00:03:40] Hi
[00:03:45] some human here?
[00:08:46] some humans here are, tykoth
[00:21:15] can somebody bock https://www.wikidata.org/wiki/Special:Contributions/95.49.123.216 please
[00:21:31] *block
[09:33:49] Hello, I am trying to bring civilisation to Commons, for the moment by importing data to automatically document categories for museum objects
[09:34:11] leszek_wmde: I had a look at the Serialization 2.3 release. I'm looking into https://github.com/wmde/WikibaseDataModelSerialization/pull/207 . Please don't merge this.
[09:34:17] example at https://commons.wikimedia.org/wiki/Category:Olmec_scupture-10.582852
[09:34:34] leszek_wmde: Everything else is fine and can be merged in my opinion. The missing patch I talked about is in another component.
[09:34:55] Thiemo_WMDE: I was about to ask. I'll merge the other one then. Thanks. And then look at the DataValues one
[09:35:36] I am very limited at the parser level, and the useful Lua modules do not exist, so I'd have to write them myself
[09:36:13] problem is that basic functions of mw.wikibase seem to be non-functional there
[09:36:38] Auregann_WMDE: do you know who can help rama ^ ?
[09:36:39] rama: What is "civilisation" on Commons?
[09:36:47] Harmonia_Boulot advised mentioning the issue here, so if anybody has an idea...
[09:36:51] Thiemo_WMDE: wikidata, obviously :p
[09:37:06] Oh, got it. ;-)
[09:38:18] Thiemo_WMDE: the piece of civilisation I am after right now is replacing code like https://commons.wikimedia.org/wiki/Category:Golden_pendant-70.2003.14.1 with direct import from Wikidata
[09:38:31] we are duplicating information
[09:38:52] and all your information is belong to Wikidata, obviously
[09:39:03] :D
[09:39:47] (I didn't expect this joke and now I'm giggling at work, thanks rama ^^^)
[09:39:55] -^
[09:40:07] rama: That's a wonderful project. I suggest starting with a single one of these template parameters. Start simple.
[09:40:46] I think some parameters are already working?
[09:40:48] Thiemo_WMDE: I realised I needed more than the parser when I stumbled upon the "Reference" field
[09:40:57] which ones are bugged?
[09:41:18] well, basically, the proof-of-concept is essentially working, with the parser only
[09:41:30] but there are lots of things that could be much much nicer
[09:42:11] for instance, the Reference gives the URL; it should give the URL and the title of the targeted page so it's all nice and fluffy
[09:43:32] one nice consequence of using Wikidata is that there is no need to set the title and description in N languages, since everything gets imported in the language that the user is using
[09:44:06] when you start getting effects like this for free, you know the gods of good design are smiling upon you
[09:51:11] so I'd need some rather basic functions, such as importing property P_id for a given Q_id
[09:51:48] which I already have with the parser, but I need that as a stepping stone towards importing sub-properties
[09:52:09] for instance, the human-nicely-readable title of the web page given as reference
[09:52:39] which is a property "Title" for the property "Reference URL" for the object
[09:53:27] at the moment I seem to be stumbling already at the level of mw.wikibase.getEntity(q_id), which returns nothing
[09:57:38] rama: I'm sorry, but a feature to fetch the title from a website does not exist, at least not as a function that's part of Wikidata.
[09:58:15] If it's stored in a separate "Title" snak next to the URL, you can fetch it obviously.
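A minimal Scribunto sketch of what fetching such a URL/Title snak pair from a reference could look like, under assumptions the log does not spell out: the statement is assumed to live under P217 (inventory number), the reference is assumed to use P854 (reference URL) and P1476 (title), and the module and function names are invented for this example.

    local p = {}

    -- Return "[url title]" for the first reference that carries both a URL and
    -- a title snak, or an empty string if nothing usable is found.
    function p.referenceLink( frame )
        local qId = frame.args[1]
        if not qId or qId == '' then
            return ''
        end

        local entity = mw.wikibase.getEntity( qId )
        if not entity or not entity.claims then
            return ''
        end

        -- Illustrative choice: look at the statements for P217 (inventory number).
        local statements = entity.claims['P217']
        if not statements then
            return ''
        end

        for _, statement in ipairs( statements ) do
            for _, reference in ipairs( statement.references or {} ) do
                local urlSnaks = reference.snaks['P854']      -- assumed: reference URL
                local titleSnaks = reference.snaks['P1476']   -- assumed: title (monolingual text)
                if urlSnaks and titleSnaks
                    and urlSnaks[1].snaktype == 'value'
                    and titleSnaks[1].snaktype == 'value'
                then
                    local url = urlSnaks[1].datavalue.value          -- plain string
                    local title = titleSnaks[1].datavalue.value.text -- monolingual text
                    return '[' .. url .. ' ' .. title .. ']'
                end
            end
        end
        return ''
    end

    return p

Invoked from wikitext as {{#invoke:SomeModule|referenceLink|Q28739036}}, this would render an external link only when the reference actually carries both snaks.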
[09:58:27] Thiemo_WMDE: no, to fetch the title from p:title used as a qualifier on Wikidata
[09:58:38] You mean a reference?
[09:59:20] yes
[09:59:49] ah ok, the Olmec statue is a bad example, we do not have a reference for this one, but look
[09:59:56] You start with the Q-ID and fetch the entity. It contains a list of statements. Find the statement you are looking for. The statement has a list of references. Each reference is a list of snaks. In your example there are two snaks: URL and Title.
[10:00:31] yes, fetching the entity fails
[10:00:56] https://commons.wikimedia.org/wiki/Category:Shark_god-71.1969.51.25 <--- here is another example which has a reference URL
[10:01:42] I have this draft module here: https://commons.wikimedia.org/wiki/Module:RamaTesting
[10:07:59] What does "fail" mean? What is your actual example? Which entity ID are you using, and on what page is your current code?
[10:09:05] ok, for instance
[10:09:15] this category: https://commons.wikimedia.org/wiki/Category:Shark_god-71.1969.51.25
[10:09:33] the table is generated by https://commons.wikimedia.org/wiki/User:Rama/Catdef
[10:09:53] which uses a few things from https://commons.wikimedia.org/wiki/Module:RamaTesting
[10:10:00] but these things fail.
[10:11:04] I don't see a failure there.
[10:11:36] If you look at https://commons.wikimedia.org/w/index.php?title=Category:Shark_god-71.1969.51.25&action=edit, all Wikidata objects are listed there at the end.
[10:13:19] ok, look at the title: it's supposed to show the title from Wikidata, the word "plouf", and the Wikidata label
[10:13:35] but we get "Lua error in Module:RamaTesting at line 36: attempt to index local 'entity' (a nil value). wikidata:Q28739036"
[10:17:42] which I understand as saying that it cannot fetch the entity
[10:18:54] and this is a very simple example: just fetch the entity and its label. We're not into sorting properties or anything
[10:32:17] if you return just the q_id, it works. Trouble begins with mw.wikibase.getEntity(q_id)
[10:37:02] The q_id was empty
[10:37:17] That was the first problem. I edited your code a bit. Have a look.
[10:38:41] Note that there are multiple ways to fetch a label. What you did was fetching the full entity and then requesting a specific label from that entity. You do not have language fallbacks when you do this.
[10:38:59] Much more convenient: use mw.wikibase.label( q_id ). This even includes language fallbacks.
[10:39:53] Rendezvous with Rama
[10:42:52] I am very surprised that q_id was empty, I managed to print it with this code
[10:44:35] thank you very much, I'll try with the entity and property then
[10:45:56] JeroenDeDauw: this is indeed the origin of the nick
[10:49:41] How do you guys know this game? I learned about it just recently, literally 2 weeks ago. ;-)
[10:50:08] leszek_wmde: I reworked https://github.com/wmde/WikibaseDataModelSerialization/pull/207 and *introduce* a 1-line private method now.
[10:50:31] Thanks for your work Rama, let us know how it goes and when you have feedback from other Commonists :)
[14:55:29] ok, it works now
[14:55:53] I don't fully understand why, but that's still better than not understanding why it does not
[15:22:39] Is it possible to use Wikidata in other Wikibase installations in a simple way, like InstantCommons? Or is it more complicated?
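A sketch of the two label-fetching routes mentioned above, with the empty-ID guard that the nil-entity error made necessary; the function names are invented for this example and only illustrate the difference between the two calls.

    local p = {}

    -- Route 1: fetch the full entity, then ask it for a label.
    -- No language fallback, and getEntity returns nil for unknown items, so guard it.
    function p.labelViaEntity( frame )
        local qId = frame.args[1]
        if not qId or qId == '' then
            return ''   -- the empty q_id case from the log
        end
        local entity = mw.wikibase.getEntity( qId )
        if not entity then
            return ''   -- avoids "attempt to index local 'entity' (a nil value)"
        end
        return entity:getLabel() or ''
    end

    -- Route 2: mw.wikibase.label() goes straight to the label and applies
    -- language fallbacks, which is what was recommended above.
    function p.labelDirect( frame )
        local qId = frame.args[1]
        if not qId or qId == '' then
            return ''
        end
        return mw.wikibase.label( qId ) or ''
    end

    return p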
[23:53:40] WikidataFacts: are you there?
[23:53:58] sjoerddebruin: yeah, but not much longer
[23:54:10] fire away :)
[23:54:15] oh wait, i was too soon
[23:54:23] nah, the report worked suddenly again
[23:54:34] This makes me fucking sad https://www.wikidata.org/wiki/Wikidata:Database_reports/Complex_constraint_violations/P172
[23:54:41] (and I need it to combat an LTA)
[23:55:23] ouch, that sucks
[23:55:24] LTA?
[23:56:15] https://www.wikidata.org/wiki/Wikidata_talk:Abuse_filter#Block_.28unsourced.29_additions_of_P172_done_by_anonymous_users
[23:57:02] sorry, I don’t know what LTA means
[23:57:13] long-term abuse
[23:57:17] oh :(
[23:57:25] Sorry, I thought it was a pretty well-known term.