[08:23:46] !admin I have created a bot request (https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/Pathwaybot). The bot did some test runs and the bot code is available on GitHub. Yet the bot request does not show up in the list of bot requests https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot
[08:24:14] Are there some steps I am missing?
[08:25:11] It seems like you didn't add the request to the list, like others did https://www.wikidata.org/w/index.php?title=Wikidata%3ARequests_for_permissions%2FBot&type=revision&diff=486093320&oldid=483831785
[08:25:29] The edit notice says "Please transclude your request to Wikidata:Requests for permissions/Bot after you created the request page."
[08:25:38] yes, the bot request page needs to be transcluded
[08:40:28] @sjoerddebruin Thanks for the suggestion, it now is.
[09:45:39] Hi to anyone listening, new here and am just browsing out of interest
[09:51:27] ...
[11:58:19] Hm, the constraint table hasn't been updated yet, right?
[12:24:59] sjoerddebruin: apparently not…
[12:25:32] Or does it require today's database dump?
[12:33:30] hi sjoerddebruin
[12:33:36] hi aude
[12:34:13] you can create properties, right?
[12:34:17] you're an admin?
[12:34:24] Yes, I can.
[12:34:31] * aude needs someone to check if tabular data is now an option on https://www.wikidata.org/wiki/Special:NewProperty
[12:34:38] otherwise i need to log in as staff
[12:34:44] It is.
[12:34:47] cool
[12:34:50] thanks
[12:34:59] Reminds me to translate that shit.
[12:35:04] :)
[12:35:11] i can look at updating constraint also
[12:35:14] constraints*
[12:35:30] Would be nice! It was promised for the 11th.
[12:35:33] yeah
[14:18:27] sjoerddebruin: constraint tables are now updated :)
[14:18:32] \o/
[14:19:37] Can confirm.
[14:44:52] Hi, anyone, is navigational popups working?
[14:52:27] infovarius: where? context?
[15:01:24] infovarius: nope, the feature has been broken on Wikidata since they made a radical code change.
[15:22:31] :(
[15:47:17] My coworker is just getting started with Wikidata, and she's trying to get only elements that have an id attribute with a certain value, "P227", but she's getting all kinds of other matches that contain this sequence somewhere.
[15:47:33] this is her uri: https://www.wikidata.org/w/api.php?action=wbgetentities&ids=P227&sites=dewiki&titles=Gerhard_Bohner&props=claims&format=xml
[15:48:51] I know nothing about this topic. She just doesn't IRC.
[15:50:27] looking
[15:51:19] If it helps to know, this is the GND number. GND is a shared database of library metadata for German-speaking countries.
[15:53:16] what about using "What links here"?
[15:53:29] https://www.wikidata.org/wiki/Special:WhatLinksHere/Property:P227
[15:55:26] https://www.wikidata.org/w/api.php?action=help&modules=wbgetclaims seems to only give one statement, but only works with Q IDs.
[16:02:54] I passed along your responces, infovarius and sjoerddebruin. Thanks.
[16:03:03] responses. damnit.
[16:03:06] spelling is hard.
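A note on the API question above: in wbgetentities, "ids=P227" asks for the property entity P227 itself; it does not filter the item's statements down to P227, which explains the unrelated-looking matches. There is no per-property filter in wbgetentities, so one workable pattern is to resolve the article to its item and keep only the P227 claims on the client side. A minimal Python sketch, assuming the requests library (the helper name and sample title are illustrative only):

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def gnd_for_article(title, site="dewiki"):
    """Return the GND identifier (P227) for a wiki article, or None."""
    r = requests.get(API, params={
        "action": "wbgetentities",
        "sites": site,      # which wiki the title lives on
        "titles": title,    # article to resolve to its Wikidata item
        "props": "claims",  # we only need the statements
        "format": "json",
    })
    r.raise_for_status()
    for entity in r.json().get("entities", {}).values():
        # The response carries *all* claims of the item; keep only P227.
        for claim in entity.get("claims", {}).get("P227", []):
            snak = claim["mainsnak"]
            if snak["snaktype"] == "value":
                return snak["datavalue"]["value"]
    return None

print(gnd_for_article("Gerhard_Bohner"))
```

Once the item's Q-id is known, "action=wbgetclaims&entity=Q...&property=P227" also returns just that one property, which matches the wbgetclaims observation above.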
[17:08:54] !admin https://www.wikidata.org/wiki/Special:Contributions/Okbear
[18:47:11] Amir1_: Can you have a look at https://phabricator.wikimedia.org/T165249 ? I think it's an upstream bug (not Pywikibot but Wikibase)
[18:47:16] or a feature :P
[18:48:38] sure
[19:07:01] Amir1_: Thanks! Are you coming to the hackathon this weekend?
[19:07:30] multichil: yeah, I just got my visa ^_^
[19:11:10] :-D
[21:50:34] yo
[21:50:44] and thanks for wikidata.org
[21:51:09] trying to make sense of some things that should be expressed as triplets, or graphs described with triplets
[21:51:45] I've been trying to uncompress
[21:52:23] but 'bzip2 -d latest-all.json.bz2' takes ages and the extracted file grows to over 100GB
[21:53:58] well, there's a *lot* of data :)
[21:55:23] sorry, i feel somewhat responsible for that :-(
[21:55:52] I bought a 1TB server
[21:56:02] 11.78€ + VAT quarterly
[21:56:05] if you only want to look at a few items, rather than dealing with the entire dump, you could access individual items, there's a concept uri link in the sidebar for items (you'll need to add .json to the end of the url to get json in a browser)
[21:56:09] so you rented it ;)
[21:56:32] nikki: oooh, nice, didn't know about that!
[21:56:52] nikki: me neither, thanks :)
[21:56:52] nikki: I'm trying to get something imported to Semantic MediaWiki
[21:57:21] Alphos: It's called IaaS
[21:57:24] I wish they'd do something about that link so that it doesn't just redirect straight back to the html page
[21:57:28] * Alphos hi5s WikidataFacts
[21:57:36] * WikidataFacts waves
[21:58:02] nikki: sayyyy, common.js ? :p
[21:58:51] My home machine is at 81GB decompressed on the 'bzip2 -d latest-all.json.bz2' job
[21:58:56] common.js is great for me, but not much use when I want to help someone else :P
[21:59:02] I hope at least one of my machines finishes it
[21:59:20] nikki: MediaWiki:common.js ;)
[21:59:46] FWIW, the dumps are available on PAWS (https://paws.wmflabs.org/) under /public/dumps/public/wikidatawiki
[22:00:06] though I have no idea how much CPU time you get on PAWS before people start sending you sternly worded messages :D
[22:00:29] besides, depending on what you want to do, you may not need a full dump
[22:06:56] So how do I upload the wikidata.org dataset into Semantic MediaWiki 2.5.1 on MediaWiki 1.28.2?
[22:07:13] I think that would be possible
[22:09:17] 94GB extracted
[22:09:39] last I saw of the previous attempt was 101GB and still going
[22:16:12] Is Wikidata using a native triplestore, a graph database, or piggybacking on an RDBMS? Any recommendation on software to try for that?
[22:41:06] jubo2: wikidata itself just stores JSON blobs in the same database that the rest of MediaWiki uses (RDBMS, usually MySQL afaik)
[22:41:36] WikidataFacts: isn't there a maria backend to store the json blobs as objects?
[22:41:42] the query service runs on BlazeGraph, which has its own copy of the data (triplestore) and is kept in sync by watching recent changes (I think)
[22:42:12] Alphos: haven't heard of that, but might be
[22:42:45] although for most intents and purposes, you can mostly consider it all plaintext regardless
[22:43:35] although a mariadb would've helped with the old query interface, the one that didn't rely on blazegraph or sparql
[22:53:06] hmm.. decompression finished
[22:53:18] "only" 135GB resulting .json file
[22:54:20] jubo2: so somewhere between peanuts and nuts? :D
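Since the decompressed dump runs well past 100GB, it is often easier to stream the .bz2 than to unpack it first. The JSON dump is one large array, but each entity sits on its own line, so it can be read with constant memory. A minimal Python sketch of that pattern (the P227 filter at the end is just an illustrative placeholder for whatever you actually want to extract, e.g. for a Semantic MediaWiki import):

```python
import bz2
import json

# Stream the compressed dump instead of decompressing it to disk first.
with bz2.open("latest-all.json.bz2", "rt", encoding="utf-8") as dump:
    for line in dump:
        # Each entity is one line, ending in a comma; the array
        # brackets "[" and "]" sit on their own lines and are skipped.
        line = line.strip().rstrip(",")
        if not line or line in ("[", "]"):
            continue
        entity = json.loads(line)
        # Illustrative filter: print IDs of entities with a GND ID (P227).
        if "P227" in entity.get("claims", {}):
            print(entity["id"])
```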
[23:40:01] Hi everyone!
[23:42:11] I am working with Wikibase in a project. One question that I have:
[23:42:38] * Ivanhercaz thinking how to explain it and writing :)
[23:43:24] the suspense is killing me! :p
[23:44:47] I have a page ("Example") and then an item (Q1). I want to transclude some properties of Q1 into "Example".
* Ivanhercaz keeps typing
[23:45:23] I know that I can use {{#property:Px|from=Q1}}
* Ivanhercaz keeps typing
[23:46:21] But how can I link "Example" with Q1 to use only {{#property:Px}}?
[23:47:31] I think you need to add "Example" as a sitelink to Q1?
[23:47:44] Yes, I imagined it
[23:48:43] But in the item I only see options to add sitelinks to Wikimedia projects
[23:52:44] I am searching for some option to make it possible
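For the archive: {{#property:Px}} without |from= only works on a page that is connected to the item through a sitelink, and a Wikibase repo only offers sitelinks for sites registered in its sites table. On wikidata.org that table holds the Wikimedia projects, which is why only those show up; on your own Wikibase you would register your client wiki there first (Wikibase ships a populateSitesTable.php maintenance script, and rows can also be added by hand), after which the sitelink can be set in the UI or via the API. A hedged Python sketch using the standard wbsetsitelink module; the endpoint, the site ID "examplewiki", and the simplified token handling are placeholders to adapt to your installation:

```python
import requests

# Placeholder endpoint: point this at your own Wikibase installation.
API = "https://wikibase.example.org/w/api.php"

session = requests.Session()
# A real edit needs a logged-in session; here we only fetch a CSRF token.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

# Link the client page "Example" to item Q1. Afterwards {{#property:Px}}
# on "Example" resolves against Q1 without an explicit |from=Q1.
# "examplewiki" must already exist in the repo's sites table.
r = session.post(API, data={
    "action": "wbsetsitelink",
    "id": "Q1",
    "linksite": "examplewiki",
    "linktitle": "Example",
    "token": token,
    "format": "json",
})
print(r.json())
```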