[00:07:54] I am new
[00:07:59] to wikidata
[00:08:07] where do I start?
[00:22:00] septrillion: find an obscure, but interesting to you, category on a Wikipedia editition.. have the category page open in one tab. take some items and search for them in wikidata in a second tab. if the object in wikidata has almost no properties.. click to add one, and it will auto suggest to add some
[00:22:30] for example if you are looking at persons or bands.. it might ask for "date of birth" or "inception"
[00:22:44] and then you could see if the WP article has the date and enter it
[00:28:30] mutante https://en.wiktionary.org/wiki/editition is a red link
[00:28:47] what does "editition" mean?
[00:29:24] septrillion: typo of https://en.wiktionary.org/wiki/edition
[00:29:28] "language version"
[00:29:47] English Wikipedia, German Wikipedia.. etc all the others
[00:30:12] septrillion: are you interested in a particular topic / subject area?
[00:30:40] it would be easier to explain with an example
[00:30:47] it's ok
[00:30:55] i understand now
[00:31:01] thanks
[00:31:05] you're welcome
[00:32:01] also see: https://tools.wmflabs.org/wikidata-game/distributed/#
[00:52:05] ok thanks
[10:22:57] !help Hello, I would like to know the SPARQL query to determine which items are referenced by a specific P
[10:24:13] I would like to know the SPARQL query to determine which items are referenced by a specific P
[10:29:59] I would like to know the SPARQL query to determine which items are referenced by a specific P
[10:30:07] hi accosta
[10:30:19] What specific P?
[10:30:28] For example P31
[10:30:42] I want to know all the items referenced by this P
[10:30:48] using SPARQL
[10:30:48] That's going to time out, I bet
[10:30:53] But let's try with a small limit
[10:34:25] accosta: P31 dies horribly
[10:34:30] But as an example: https://query.wikidata.org/#SELECT%20DISTINCT%20%3Fitem%20%3FitemLabel%20WHERE%20%7B%0A%20%20%5B%5D%20wdt%3AP1080%20%3Fitem%20.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%7D
[10:34:44] :/
[10:34:53] and how can i get this data?
[10:34:56] That's all the items for P1080 ("for fictional universe")
[10:35:30] Can I know all the values from P31?
[10:36:44] from a web page for example?
[10:37:31] Not directly that I know of
[10:37:34] But hmm
[10:37:43] Query.wikidata has a timeout of 30s, so it will always die with this
[10:37:56] Someone on Stack Overflow suggests using SQL and https://quarry.wmflabs.org/ which has a timeout of 30 minutes
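For reference, this is the query behind the link above, decoded and wrapped in a small Python sketch against the public WDQS endpoint. The query itself is the P1080 example from the log; the LIMIT clause, the requests wrapper, and the User-Agent string are additions for illustration, not part of the original.

```python
# Minimal sketch: run the decoded P1080 query from the log against WDQS.
# A LIMIT is added here to stay well under the endpoint's timeout; the
# original query (and anything with wdt:P31) times out without one.
import requests

QUERY = """
SELECT DISTINCT ?item ?itemLabel WHERE {
  [] wdt:P1080 ?item .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
LIMIT 100
"""

response = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "example-script/0.1"},  # WDQS asks clients to set a UA
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["item"]["value"], row.get("itemLabel", {}).get("value", ""))
```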
[10:38:26] accosta: What do you need to do?
[10:38:31] If you are only interested in the top ones you could use https://tools.wmflabs.org/sqid/#/browse?type=classes
[10:39:10] I need to know all items referenced by the most used Ps for my research
[10:40:23] ¿The number or the complete list?
[10:40:31] *The
[10:40:39] The complete list
[10:40:46] I already have the number
[10:41:12] You could get a dump... https://dumps.wikimedia.org/wikidatawiki/entities/
[10:41:44] I already parsed the dump mate
[10:42:33] And that's a headache :)
[10:42:46] So what should be your next step?
[10:43:22] Classify items checking their properties
[10:45:23] And don't you get it with the parsed dump?
[10:45:25] accosta: you don't happen to be Spanish by any chance? Looking at your IP, it seems so. Both abian and I are Spanish, so in case Spanish might be a better language here :p
[10:46:43] Anyway, if you have a full dump it should be possible to get the info by parsing, no? I mean, kinda annoying maybe, but possible
[10:46:54] :D
[10:47:12] Just parse every entry and throw everything into a dictionary or something and add 1 to the specific value whenever it repeats?
[10:47:42] The dump is more than 400 GB; if you got it parsed, the rest is easier I think
[10:49:09] reosarevok: Where are you from? :)
[10:49:42] Gijón, but I've been in Estonia for like 7 years now
[10:50:42] :)
[10:50:51] Interesting! Nice to meet you, I had no idea :)
[10:53:33] accosta: Do we know each other? Where do you research? :)
[10:53:59] I'm from Barcelona
[10:55:33] Your name sounds familiar to me :)
[10:55:39] I'm trying to use https://quarry.wmflabs.org/query/27046 but I cannot get the info. reosarevok, can you?
[10:55:57] Reply the same query you used, using SQL
[10:56:02] ?
[10:57:32] accosta: you need to start with use wikidatawiki_p;
[10:59:35] And check out https://www.mediawiki.org/wiki/Wikibase/Schema
[11:00:32] abian: I'm not seeing docs about how to query triples with Quarry though, so maybe it does not support it? :/
[11:02:44] Tables aren't as expressive as triples, so you can only filter directly by row (entity)
[11:03:34] But you can manage to get those entities and process them afterwards, or use more complex filters
[11:04:55] I still think that transforming the dump would be easier given that you've already parsed it
[11:22:11] Hey, Lucas_WMDE
[11:22:16] hi!
[11:22:20] Your IP isn't from Barcelona :D
[11:22:39] where is it from? I’m on the hotel network…
[11:24:46] Forget it, just a lack of precision I think
[11:24:50] How was your trip? :)
[11:25:47] booking an early flight was a terrible idea, but apart from that it went okay :D
[11:26:19] reosarevok: the WDQS timeout was actually raised to 60 s some time ago
[11:26:52] Lucas_WMDE: ooh, neat. But still not long enough to get all items linked to with P31, I suspect :p
[11:27:00] perhaps not ;)
[11:28:42] Many algorithms aren't linear, so even if we extended the limit to 90 s, that wouldn't be enough for many queries :/
[11:29:45] well, just getting a list of all entities which use P31 should be cheap and linear
[11:29:57] I got some 37 million rows of JSON before the timeout
[11:30:08] and I think the limiting factor was just my network connection
[12:24:38] !help In the JSON wikidata dump, how can I get the property name?
[12:24:49] for example, P31, I want to get "instance of"
[12:25:46] accosta: look up the labels of the P31 entity, it should be included in the dump too (somewhere)
[12:30:36] I could not find it..
[12:30:43] on labels
[13:34:36] accosta: did you find it eventually?
[13:37:34] well, I'm parsing the dump again, saving more information that I didn't have before
[14:25:47] RECOVERY - puppet last run on wdqs1008 is OK: OK: Puppet is currently enabled, last run 4 minutes ago with 0 failures
[14:27:57] RECOVERY - puppet last run on wdqs1006 is OK: OK: Puppet is currently enabled, last run 3 minutes ago with 0 failures
[17:47:14] anyone have an idea what's wrong with https://tppr.me/K94rG this?
[17:47:24] item is https://www.wikidata.org/wiki/Q33570421?uselang=en
[19:00:28] revi: you mean the display?
[19:00:58] I get the same
[19:01:07] me too :p
[19:01:40] I guess it's taking it from some wiki page that just got marked for translation
[19:02:36] or it was marked in the code?
[19:04:40] for ?uselang=ko, it shows a paragraph without translation tags
[19:04:46] but then it’s damn outdated
[19:15:20] it also shows up when I don't use the ?uselang option :p
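As a closing note on the dump-parsing thread above ([10:47:12] and [12:24:49]): a minimal Python sketch of the approach discussed, counting in a dictionary how many entities use each property, and reading property labels such as "instance of" for P31 from the property entities included in the same dump. The file name and gzip compression are assumptions based on the standard layout of the Wikidata JSON dumps (a single JSON array with one entity per line); adjust to whatever dump file you actually have.

```python
# Sketch of the parse-and-count approach from the log, assuming the standard
# Wikidata JSON dump layout: one entity object per line inside one big array.
# "latest-all.json.gz" is a placeholder for the dump file you downloaded.
import gzip
import json
from collections import Counter

property_counts = Counter()  # Pxx -> number of entities using it
property_labels = {}         # Pxx -> English label, e.g. P31 -> "instance of"

with gzip.open("latest-all.json.gz", "rt", encoding="utf-8") as dump:
    for line in dump:
        line = line.strip().rstrip(",")
        if line in ("[", "]", ""):
            continue  # skip the enclosing array brackets
        entity = json.loads(line)
        # Property entities (P31, ...) carry their own labels in the dump;
        # this is where a name like "instance of" comes from.
        if entity["type"] == "property":
            label = entity.get("labels", {}).get("en", {}).get("value")
            if label:
                property_labels[entity["id"]] = label
        # Count each property once per entity that uses it ("add 1 to the
        # specific value whenever it repeats").
        for pid in entity.get("claims", {}):
            property_counts[pid] += 1

for pid, count in property_counts.most_common(10):
    print(pid, property_labels.get(pid, "?"), count)
```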