[05:59:48] !admin Please add the article '위키낱말사전:질문방' in Q4026300.
[05:59:48] Attention requested  HakanIST sjoerddebruin revi
[06:00:15] [[d:Q4026300]]
[06:00:15] [1] https://www.wikidata.org/wiki/Q4026300
[06:00:18] @link
[06:00:18] https://www.wikidata.org/wiki/Q4026300
[06:02:23] Thank you for the quick response. 지체 없는 빠른 답변 감사합니다 [Thank you for the prompt reply]. :)
[07:57:51] https://petscan.wmflabs.org/ down? :(
[15:11:26] I have a quick question: I see Wikidata has pages like this: https://www.wikidata.org/wiki/Q2360101 about the "list of PlayStation games", or https://www.wikidata.org/wiki/Q861322 about "Alundra", a PlayStation game. (But no link between the two?) I guess I can export the data I see there in various formats. Does it also have things like the data in the list on, for example, https://en.wikipedia.org/wiki/List_of_PlayStation_games_(A%
[15:12:01] ...publisher, release dates, etc. So Wikidata has metadata for Wikipedia articles (and Wiktionary etc.), but not things like a database of all (or most) known PlayStation games, and, I assume, the list at https://en.wikipedia.org/wiki/List_of_PlayStation_games_(A%E2%80%93L) is edited by hand and not generated from a "proper" database. Is this correct?
[15:12:38] (Sorry, I think https://en.wikipedia.org/wiki/List_of_PlayStation_games_(A%E2%80%93L) is not clickable, you need to add ")")
[15:13:59] !admin low priority but maybe someone can give a quick answer :-)
[15:13:59] Attention requested  HakanIST sjoerddebruin revi
[15:14:40] There is no guarantee that the content of lists on Wikipedia is the same as in items on Wikidata
[15:14:52] Sure, but is there even such data in Wikidata?
[15:15:09] Something like a (long) list of video games, movies, etc.
[15:15:18] You can generate that with a query, yes.
[15:15:44] Would I do a query for a category?
[15:16:25] This is a list of PlayStation games: http://tinyurl.com/y7f7bmqb
[15:17:53] ah, nice, thank you. You wrote a SPARQL query for that?
[15:18:14] It was quite easy with the Query Helper on https://query.wikidata.org
[15:18:35] Just click on "filter" and enter "video game". Then click again on "filter" and enter "PlayStation".
[15:19:36] Oh, okay, I think that's just the pointer I needed. I'm sure there are lots of tutorials etc. on SPARQL and its use on Wikidata. Thanks a lot!
[15:20:07] No problem, have fun!
[15:20:25] I will! Bye :)
[15:22:20] Hey, sjoerddebruin :)
[15:22:30] Hey ;)
[15:22:34] How are you doing, my friend? Will you attend the Conference?
[15:24:08] The conference this week? No, sadly.
[15:24:14] maybe we should ping more adminz
[15:24:25] Wasn't really an admin question anyway :P
[15:24:32] well yeah
[15:24:34] but anyway
[15:25:34] In other channels we have helpers: lists that aren't related to permissions in the Wikimedia projects but to volunteers on IRC
[15:26:43] So these helpers are notified by a bot to help solve an issue
[16:05:35] abian: will you be there?
[16:08:32] No, I won't either, unfortunately :/
[16:08:36] I have never attended a Wikimedia Conference or a Wikimanía, not even when I was on the Board of WMES
[16:08:44] *Wikimania
[16:10:25] But I'd love to attend sometime...
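For anyone wanting to reproduce the PlayStation query above: the exact SPARQL behind the tinyurl link is not shown in the log, but the two Query Helper "filter" steps described at 15:18:35 correspond roughly to the sketch below. The property and item IDs used (P31 "instance of", P400 "platform", Q7889 "video game", Q10677 "PlayStation") are assumptions to double-check on Wikidata; the snippet simply runs the query from Python against the public query service endpoint.

```python
import requests

# Public endpoint behind https://query.wikidata.org
WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# Roughly what the Query Helper builds from the two "filter" steps described above:
# items that are an instance of (P31) "video game" (Q7889) and whose
# platform (P400) is "PlayStation" (Q10677). IDs are assumptions -- verify them.
QUERY = """
SELECT ?game ?gameLabel WHERE {
  ?game wdt:P31 wd:Q7889 ;
        wdt:P400 wd:Q10677 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
"""

response = requests.get(
    WDQS_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "playstation-list-example/0.1"},  # be polite to the service
)
response.raise_for_status()

# Print one game label per line.
for row in response.json()["results"]["bindings"]:
    print(row["gameLabel"]["value"])
```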
[16:11:06] Hackathon would seem especially convenient this year…
[16:14:10] I'm stupid and forgot the deadline; otherwise, that would have been a great chance, I can go from my home town to Barcelona in an hour and a half
[16:15:10] I was too busy and stressed some weeks ago, and forgot the hackathon
[16:20:29] (The Hackathon isn't organised by WMES but by Amical Wikimedia, so I'm not part of the organization team)
[16:23:00] oh, right
[21:51:53] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1973 bytes in 0.127 second response time
[22:11:53] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1950 bytes in 0.106 second response time
[22:16:46] Look mom, no hands!
[23:26:01] sjoerddebruin: Be careful where you stick that thing
[23:26:09] .pic :P
[23:30:18] why is Wikidata untyped?
[23:30:24] for instance, in the "languages" section, you can put something other than a language
[23:30:30] that is error-prone
[23:30:41] also, in "references", I can't put a precise URL
[23:31:03] like this https://translationproject.org/domain/bash.html as a reference for the languages of the Bourne-Again Shell
[23:31:05] galex-713: To allow for flexibility. But we have soft constraints to look after that
[23:31:33] Though I can put the entry "The Translation Project", but I'm not sure it can point automatically to the "bash" domain web page of the Translation Project
[23:31:49] hoo: what soft constraints? why is it still suggesting, on the same level, anything other than a language?
[23:32:37] galex-713: because it can be. Is dance a language? Is gesture a language? It could be, depending on context
[23:33:09] like suggesting Galician, German, Galicia, Germany, etc. before Greek, Greece, etc. instead of listing all the languages first
[23:33:12] that's why Wikidata does not prescribe what data can be modelled in it. That's up to the editors
[23:33:15] galex-713: If you actually want to put a URL, there are specific properties for that
[23:33:47] SMalyshev: what about some specific constraints, for instance not allowing anything else when the item is, say, a piece of software
[23:33:57] dance, gesture, etc. can't be the language of a piece of software
[23:33:59] right?
[23:34:06] hoo: didn't find it :/
[23:34:27] galex-713: there are bots that implement soft constraints. But Wikibase has very few hard constraints
[23:34:55] how sad :/
[23:35:02] however Wikidata seems heavily optimized to be edited by bots
[23:35:12] that's a bit exclusionary towards users who don't write bots
[23:35:36] Wikidata is optimized not to have a data model that fits one particular use case, but to accommodate all (or most) possible use cases
[23:35:39] how sad there's nothing automated, for instance, for importing data from translationproject.org
[23:35:48] (optional)
[23:35:57] or an option to script imports instead of writing a full bot
[23:36:11] galex-713: if it's so sad, you are welcome to provide said automation :)
[23:36:18] The difference between a script and a full bot is minimal
[23:37:31] Reedy: isn't the latter a bit harder to program?
[23:37:37] Why?
[23:37:45] The latter is just a more complex script, effectively
[23:38:07] giving a form where you can put a shell script that produces a list would be a lot more doable than writing a bot that interacts with MediaWiki and having to learn the whole Wikidata API
[23:38:47] But how would that know where you wanted to put what data?
[23:39:01] SMalyshev: I would currently have the time to make a shell script scraping all the languages from this page with grep, or to manually add all the languages, but not to dive into the MediaWiki code, program a bot, etc.
[23:39:26] Reedy: maybe with some built-in command or with a specified simple output format
[23:39:35] Plenty of bot libraries, so you don't have to reinvent the wheel interacting with the API
[23:39:43] you could have an item_add shell function, or some format like one entry per line
[23:40:04] Reedy: Wikidata bot libraries? in bash, C, Lisp, etc.?
[23:40:26] https://www.wikidata.org/wiki/Wikidata:Creating_a_bot
[23:41:09] none of these languages…
[23:41:29] I'm pretty sure if you can do all those languages... you could manage Python, for example
[23:42:26] There are a lot more options for plain MediaWiki
[23:42:26] https://www.mediawiki.org/wiki/API:Client_code
[23:42:56] I don't like Python :/ and, above all, I'm not used to it
[23:43:39] I think the answer is tough then
[23:43:45] maybe getting used to reading its syntax, and maybe filling a small form with Python code, would be okay, but figuring out how to write a whole bot would definitely take far longer than the time I'd need to enter all the languages manually without automation
[23:43:48] People do this as a volunteer effort
[23:43:53] You can't expect to have your cake and eat it
[23:44:07] I'm not expecting, I'm commenting :)
[23:44:22] The other option is providing it in a machine-readable format (or creating an API for the Translation Project) and asking someone else to import it
[23:44:33] when I have time, now that you've given me this link and information, I'll certainly try to make a library usable from Lisp, or C/shell
[23:45:01] that would be the clean long-term option :)
[23:45:14] involving even more research, even more time and work, but paying off in the long term
[23:45:32] Reedy: also, I didn't find how to add a URL as a reference
[23:45:44] via the UI?
[23:46:21] Use "reference URL" for that
[23:46:26] Click "+ add reference", type the URL in the first box?
[23:47:12] it becomes red
[23:47:16] it?
[23:47:20] with "no match found" written
[23:47:44] galex-713: On the left side, use "reference URL"
[23:47:47] (I'm translating this message from French, I don't know how to get Wikidata in English without an account, let alone temporarily)
[23:47:53] you can then add a URL on the right
[23:48:31] I don't see this, I only see "affirmed in" and "consultation date"
[23:48:42] (not sure these are the correct English words)
[23:48:45] Use "URL de la référence" in that case
[23:48:51] just URL should do
[23:49:08] oh, it works!
[23:49:10] thank you
[23:49:13] Cool :)
[23:49:16] I had to type it in
[23:49:21] it wasn't displayed as a suggestion
[23:49:30] was I supposed to guess or already know it?
[23:49:40] maybe it would be better if it were suggested…?
[23:49:52] It should actually be suggested
[23:49:53] hm
[23:50:00] hoo: are you French too?
[23:50:14] No
[23:50:37] If I add a new reference and don't type anything yet, the reference URL is suggested to me
[23:50:45] third option
[23:51:13] not for me
[23:51:28] it only suggests the first two to me
[23:53:22] hoo: how do I publish? there's no such button for references
[23:53:48] The "publier" button above should work
[23:53:54] or is it grayed out?
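As an aside on the "one entry per line" import idea discussed around 23:39: a minimal sketch with pywikibot (the Python library linked from Wikidata:Creating_a_bot) might look like the snippet below. It is only an illustration under stated assumptions, not a documented workflow: the input file name is made up, Q189248 is a guess at the item for GNU Bash, and P407 ("language of work or name") is the assumed property for a work's languages; verify all of these before running anything like it.

```python
import pywikibot

# Connect to Wikidata and get its entity repository.
site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

BASH_ITEM = "Q189248"   # assumed item ID for GNU Bash -- verify before use
LANGUAGE_PROP = "P407"  # assumed property: "language of work or name"

# "One entry per line": a text file of language item IDs (e.g. Q150 for French),
# which could be produced by a shell script scraping the Translation Project page.
with open("languages.txt") as f:
    language_qids = [line.strip() for line in f if line.strip()]

item = pywikibot.ItemPage(repo, BASH_ITEM)
item.get()  # fetch existing claims so duplicates can be skipped

existing = {
    claim.getTarget().id
    for claim in item.claims.get(LANGUAGE_PROP, [])
    if claim.getTarget() is not None
}

for qid in language_qids:
    if qid in existing:
        continue  # language already present on the item
    claim = pywikibot.Claim(repo, LANGUAGE_PROP)
    claim.setTarget(pywikibot.ItemPage(repo, qid))
    item.addClaim(claim, summary="Add language from Translation Project list")
```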
[23:54:25] oh, Enter did, once I removed the consultation date
[23:54:29] it was grayed out
[23:55:47] hoo: I suppose, since this was meant to be used by automated tools anyway, that if I have the same reference for each language, then I should add the URL for each language?
[23:56:12] Yeah
[23:56:45] We hope to make that nicer one day (and there also is a JS gadget for that), but you essentially need to re-create the same references for every value
[23:56:46] :/
[23:57:07] much work is needed for Wikidata in the future…
[23:57:12] at least its purpose is beautiful
[23:57:15] and it does progress
[23:57:28] I don't recall seeing infoboxes directly displaying Wikidata content
[23:57:36] I fully agree with both :)
[23:57:42] I do however recall Wikidata being filled from Wikipedia infoboxes
[23:57:49] And we are (gradually) improving
[23:58:08] hoo: MediaWiki, Wikibase, etc. are all PHP, right? :/
[23:58:36] and they only have sparse bots and the web as their main standard clients anyway, too?
[23:58:37] Yeah, we use PHP for basically everything (but the frontend)
[23:58:53] What do you mean by that?
[23:59:42] (by "but the frontend", you mean displaying, which is done in CSS, and maybe a bit of optional JavaScript too?)
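To make the 23:56:45 remark concrete (the same reference currently has to be re-created for every value): a hedged pywikibot sketch that attaches one "reference URL" reference to each existing language statement could look like the following. P854 is the "reference URL" property; the item ID is again only a placeholder guess, and P407 is the assumed language property, so treat this as an illustration under those assumptions rather than a documented procedure.

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

ITEM_ID = "Q189248"     # placeholder/assumed item ID for GNU Bash -- verify first
LANGUAGE_PROP = "P407"  # assumed: "language of work or name"
REF_URL_PROP = "P854"   # "reference URL"
SOURCE_URL = "https://translationproject.org/domain/bash.html"

item = pywikibot.ItemPage(repo, ITEM_ID)
item.get()

# Re-create the same reference on every existing language statement,
# skipping statements that already cite this URL.
for claim in item.claims.get(LANGUAGE_PROP, []):
    already_cited = any(
        REF_URL_PROP in source
        and any(ref.getTarget() == SOURCE_URL for ref in source[REF_URL_PROP])
        for source in claim.getSources()
    )
    if already_cited:
        continue
    ref = pywikibot.Claim(repo, REF_URL_PROP)
    ref.setTarget(SOURCE_URL)
    claim.addSources([ref], summary="Add Translation Project reference URL")
```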