[00:19:55] Just finished successfully installing wikibase on a localhost where I've used a mediawiki wiki for several years. I see I'll have to set some basic things like Property:P31 instance of and other things to really get started, but my first question is how to make my existing wiki (same namespace) a "Site" to which I can "SetSiteLink." Any advice, or directions to a manual? [00:20:37] Adding a new site is not really nice [00:21:44] There's an XML representation which you could probably use, but not sure it's well documented [00:22:02] You would need to write the XML by hand and import it using a maintenance script [00:22:26] alternatively, you could (using eval.php) live-hack your site object and store it [00:22:37] both options aren't especially convenient… [00:23:27] hoo: I may be wording this poorly. I just mean to be able to use the wikibase installation at all, to have the labels associated with articles in the wiki; at present there is no "site" to which I can link wikibase items. Is that more clear? [00:24:08] isaaccurtis: So... you don't need your local site, but Wikimedia's sites for example would do? [00:25:29] sounds like he/she wants to have an item namespace (and items) alongside existing content [00:26:17] and be able to link pages to items [00:26:25] Yeah. Sounds like it, but adding any site definitions is hard (unless SiteMatrix is set up, which probably only Wikimedia has really set up) [00:27:05] hoo: I do actually need the local site (mw wiki installation) linked (to the local wikibase installation). I've been working about 3 years on a research project where I've taken all the notes in the wiki, and would like to begin sorting that into structured data (for example, to have 10 pages about different presidents in the wikibase, as in Thomas Jefferson (Q1) is an instance of (P31) a president (Q2), or whatever).
Does that make sense? [00:27:06] for client + repo on the same wiki, this should be done automatically (imho) but that's not the case [00:27:25] This is a local repo installation. [00:27:36] Wow... unpacked JSON dumps are at almost 60GiB [00:27:58] Do I simply need to add a local client installation as well? This is my first time doing anything more complicated than the wiki locally. I've done a lot of wikidata editing, nothing fancy, but I know how it works. [00:27:59] Yeah, what aude says [00:28:21] You will need that as well in order to show/use the data in your articles, yes [00:30:00] Is this as simple as setting the $wgEnableWikibaseClient and $wgUseWikibaseClient in LocalSettings both to true? Or is there something fancier to it? [00:32:03] Depends on what you include/require in your LocalSettings… usually $wgEnableWikibaseClient = true; should do [00:33:01] hoo: thanks, I'll give it a shot and see what happens [00:33:16] Nice! :) [00:33:24] https://github.com/wikimedia/mediawiki/blob/master/docs/sitelist.txt For adding a custom site definition [00:33:33] not sure there's a nicer way. aude might have an idea [00:34:11] i'm hacking up an example [00:35:08] https://gist.github.com/filbertkm/0254b6fc417ae12bc1b1 [00:35:25] After having saved the LocalSettings changes, do I need to run $ php maintenance/update.php again?
[00:35:39] isaaccurtis: no [00:35:45] Oh well, in that case yes [00:35:56] Because WikibaseClient adds its own tables [00:36:00] but not in general [00:36:35] aude: Missing site identifiers, I guess [00:36:46] ./tests/phpunit/includes/site/SiteImporterTest.xml is the example in core [00:37:10] We need site identifiers for linking :/ [00:37:34] hoo: i used exportSites [00:37:43] maybe wikidata has no identifiers for me [00:37:54] aude: Oh, think that's the brokenness I ran into [00:38:02] remember when I screwed production by using that script [00:38:05] think we have a bug for that [00:38:54] updated example using enwiki [00:39:59] https://gerrit.wikimedia.org/r/#/c/259198/ (admittedly not that nice, but hacked the script) [00:41:06] aude: https://phabricator.wikimedia.org/T100750 [00:41:14] isaaccurtis: so, using https://gist.github.com/filbertkm/0254b6fc417ae12bc1b1 example, i would make globalid same as your database name [00:41:27] If you have any findings, you might want to add them there [00:41:34] never looked into that again [00:41:48] not sure you need the localids or if they would work correctly in this case [00:41:57] group can be whatever [00:42:04] aude: Just to be clear, this is simply to be able to have the localhost wiki entries (e.g. http://localhost/mediawiki/index.php/Jacco) be linkable to the associated wikibase entry (e.g. http://localhost/mediawiki/index.php/Item:Q2) ? I'm neither trying to create data for nor pull data from anything except my computer. [00:42:20] isaaccurtis: yes [00:42:58] we might also need to configure a sitegroup setting (to match whatever you pick as the site group name) [00:43:22] Ok thanks. So, back to the beginning, what's my first step? [00:43:53] isaaccurtis: maybe start with https://gist.githubusercontent.com/filbertkm/0254b6fc417ae12bc1b1/raw/78728ef2603ec7870df1191abeeb4dedb3420930/sites.xml [00:44:06] and replace the globalid, group and paths [00:44:21] aude: okay, into what file does that need to be pasted?
[00:44:22] then use maintenance/importSites.php script [00:44:42] isaaccurtis: you can call it anything [00:46:12] (sorry this is so difficult and not well documented :/) [00:49:11] aude: I'm working at a very 101 level here. What do I look at to find my local versions of the globalid, group, and path? Since the url of a sample page is http://localhost/mediawiki/index.php/Jacco, I presume page path is http://localhost/mediawiki/index.php/$1, but not sure about the others. [00:50:03] globalid should be your database name (you can find it in LocalSettings.php) [00:50:57] Look for $wgDBname there [00:51:06] file_path... if you click, for example, the history tab (e.g. https://en.wikipedia.org/w/index.php?title=Berlin&action=history) [00:51:07] Okay, so globalid from LS.php looks like $wgSitename = "Recollection"; so Recollection (case sensitive presumably) [00:51:28] Correction: $wgDBname = "recollection_wiki"; so recollection_wiki [00:51:36] then you see, for enwiki, it's https://en.wikipedia.org/w/$1 [00:51:53] and think page_path is http://localhost/mediawiki/index.php/$1 [00:53:57] For file path, clicking page history, I have http://localhost/mediawiki/index.php?title=Jacco&action=history, so then the file path is http://localhost/mediawiki/$1 ? [00:54:14] isaaccurtis: yes [00:54:24] And since the page itself is http://localhost/mediawiki/index.php/Jacco, the page path would be http://localhost/mediawiki/index.php/$1 ? [00:54:45] isaaccurtis: think so [00:55:37] Just to be clear again, these are paths for the main wiki articles, or paths for the local wikidata items? [00:55:44] main wiki articles [00:56:09] when you add a site link, the code will check that the page exists [00:56:15] k thanks, using this to edit what you jotted up, one second [00:57:30] and the "group" ? [00:59:29] just not sure where to look in LS.php for what to substitute in the line wikidata [01:00:00] group can be whatever (e.g. 
'researchwiki' ) [01:00:21] maybe if you had multiple wikis, then something more generic to group them together [01:01:26] no multiples (unless wikibase counts as a multiple from the wiki itself). So I've just set the group name to researchwiki for now. So this little script, it gets copy-pasted into another file, or saved as text and imported, or...? [01:02:14] then you can add in local settings [01:02:38] $wgWBRepoSettings['siteLinkGroups] = array( 'researchwiki' ); [01:02:49] $wgWBRepoSettings['siteLinkGroups'] = array( 'researchwiki' ); [01:04:00] then use importSites.php [01:04:05] e.g. php maintenance/importSites.php --file sites.xml [01:05:22] okay, I'll add that line (the second one with both ' marks) to the end of local settings. then from command line I run the importsites on the text I've modified, having saved it as sites.xml. Okay to just save that file right in the mediawiki folder and run it from there? [01:06:09] isaaccurtis: ok to save it there [01:06:22] you can delete it, move it or whatever after [01:09:10] Okay: $ sudo php maintenance/importSites.php --file sites.xml Done. [01:09:13] Next step from here? [01:09:52] if it all worked (and we didn't forget a step), then adding site links should work now [01:13:09] Okay, I see "In other languages - addlinks" now in the sidebar of pages. But from my Item:Q2 page, for example, I have now (0 entries) to choose from, but clicking edit doesn't give me any text box to choose an entry from. Thoughts? [01:14:43] might be caching :/ [01:16:05] It definitely changed the interface, which used to have basically the same site options as wikidata proper, but, as before, clicking "edit" doesn't yield a text box to choose the site and page from. [01:16:31] isaaccurtis: hm... so the group of your site is "researchwiki"? [01:16:34] to change the heading, you can set it in MediaWiki:Wikibase-sitelinks-researchwiki [01:18:11] Yes, the group, as set in that sites.xml file, was researchwiki.
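[Editor's note: pulling together the pieces worked out above, a site-definition file for this setup might look like the sketch below. The element names follow MediaWiki core's docs/sitelist.txt and the SiteImporterTest.xml example mentioned earlier; the concrete values (globalid = the $wgDBname, the arbitrary 'researchwiki' group, and the localhost paths) are the ones established in this conversation, and are assumptions you would replace with your own.]

```xml
<!-- sites.xml — a sketch based on the values discussed above.
     globalid: your $wgDBname; group: an arbitrary label; paths: your wiki's URLs. -->
<sites version="1.0">
  <site>
    <globalid>recollection_wiki</globalid>
    <group>researchwiki</group>
    <path type="page_path">http://localhost/mediawiki/index.php/$1</path>
    <path type="file_path">http://localhost/mediawiki/$1</path>
  </site>
</sites>
```

[As described in the messages above, such a file is then loaded with the maintenance script, e.g. `php maintenance/importSites.php --file sites.xml`.]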
I just put that there arbitrarily; if there was a place I was supposed to get a correct name from, I did not do that. [01:18:52] hoo: we had isaaccurtis add $wgWBRepoSettings['siteLinkGroups'] = array( 'researchwiki' ); to local settings [01:18:58] am i missing anything else? [01:19:37] Progress! [01:19:42] No, that should be it [01:19:52] I had not saved the line at the end of localsettings: $wgWBRepoSettings['siteLinkGroups'] = array( 'researchwiki' ); (it was there but not saved) [01:19:59] Resource loader purging of the sites can take up to 15m, keep that in mind [01:20:02] Now it's saved, and yes it pops up a text box when I click edit. [01:20:10] isaaccurtis: are you using any form of caching? e.g. memcached? [01:20:19] 10m from our cache + 5m from RL [01:20:23] isaaccurtis: ok [01:20:38] probably typing 'r' would invoke autocomplete for the site [01:20:59] to choose the site [01:21:11] aude / hoo: So, from that box, I see recollection_wiki is the site that works, and then from there typing anything in the "page" box autocompletes to my pages. This looks great! [01:21:27] (exactly, you were typing the same thing as I was doing it) [01:21:30] Wait... Wikipedia becomes 15 in a month already? oO Feels like it only got 10 last year [01:21:39] isaaccurtis: \o/ [01:21:46] Nice! :) [01:22:11] we should definitely document this some place and make it easier :/ [01:22:29] Totally [01:22:52] All of this is a huge mess :( [01:23:18] whew... ! So, this seems like plenty for one day, but if tomorrow I want to be able to start assigning properties, like "Jacco" is an instance of a "runaway slave", right now it seems properties like P31 aren't up and running. Where would I start reading tomorrow to learn how to either import a bunch of existing core properties like P31, or to start writing them myself? [01:24:01] As for documentation, that is something within my skill level.
I have to clean up and simplify these notes for my own purposes, and once I do that I'll see if I can find an appropriate place to add them to the online resources. [01:24:13] isaaccurtis: You can either create them yourself on Special:NewProperty or do an import of Wikidata's [01:24:33] Okay. Export/Import work basically the same as with wikipedia? [01:24:34] isaaccurtis: Nice, let us know if you have any further questions :) [01:24:44] isaaccurtis: i have https://github.com/filbertkm/WikibaseImport (experimental) that might work for importing stuff [01:24:45] No :/ aude has a nice tool for that, though [01:25:02] you can import all properties or only specific ones using the --entity option [01:25:02] lol [01:25:19] Yeah Reedy I'm still at it. But 99% functional! [01:25:36] You're making people fix the shit they wrote a while ago ;D [01:26:38] Reedy: It's not broken... the implementation is just very incomplete [01:26:45] !bug 1 [01:26:46] http://phabricator.wikimedia.org/1 [01:26:56] We still have three systems for sites in MediaWiki :P [01:27:26] Reedy: It's T2001 by now… so definitely not as severe anymore [01:27:28] Ask Services to create a 4th [01:28:30] Then we could write an abstraction that somehow (but not really) fits them all... [01:29:03] WikiMap already falls back to sites, but using sites still is slow, so it also still uses the $wgConf stuff [01:29:59] https://phabricator.wikimedia.org/T113034 [01:30:02] aude: That WikibaseImport tool. I don't want to import items in general, but will there be any harm in importing all properties from wikidata with $ php maintenance/importEntities.php --all-properties ? [01:30:21] Even if I don't use most, or is it better to add them one by one as I find myself needing them?
[01:30:41] isaaccurtis: probably one by one is better in your case if you only need a few [01:31:24] (note that my tool adds referenced items also, without statements) [01:31:47] so it would add the example items in https://www.wikidata.org/wiki/Property:P31 [01:31:54] Cool. Thank you all again very very much. I may be back harassing you for a bit in the morning (Arizona time), but for now I feel very optimistic. (And copy that re: referenced items being added, got it) [01:32:04] (i could probably add an option to the script to not add the linked items) [01:33:05] * aude needs coffee and food :) [01:33:30] back later... [01:33:47] Breakfast? [01:33:51] Is Aude in Germany? [01:34:05] Reedy: nope :) [01:34:10] Seems a bit late for Kaffee und Kuchen [01:35:00] :O [01:35:20] Kaffee und Kuchen at 2 o'clock is normal! [01:36:19] I just had some Kuchen as well :P [01:36:39] When making properties manually, is it possible to set the P number? Like to write my own P31 and have it be P31 (rather than P2, which is what happens if I just Create a New Property) [01:36:47] isaaccurtis: No [01:37:11] Well, you can if you don't mind laying hands on the database manually [01:38:04] meh... I think I should quit while I'm ahead. : ) [01:38:28] "P2:Instance of" is good enough for now [01:53:47] addshore: around? [01:53:50] I slept [01:58:01] hi, need help please [01:58:31] I want to have a list of articles without interwiki (but maybe with a wikidata page) for a specific category in a specific language [01:58:35] / or made by a specific person [01:59:19] I guess you will need to write your own query for that and run it on tool labs [01:59:40] Information about pages that are *not* linked to Wikidata is not part of Wikidata [01:59:50] and neither are category memberships [01:59:56] ok let's say linked to wikidata [02:00:30] Still, category membership is not part of Wikidata [02:07:51] hoo, a list of articles created by the same user but without interwiki ?
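[Editor's note: for the labs/SQL suggestion that follows, a rough sketch of such a query is given below. Pages connected to a Wikidata item carry a 'wikibase_item' entry in MediaWiki's page_props table, so unconnected articles are those where that entry is missing. The category name is hypothetical, and join details vary by wiki and MediaWiki version — a starting point, not a definitive query.]

```sql
-- Sketch: articles in a given category with no connected Wikidata item,
-- run against a wiki replica on labs. Connected pages have a page_props
-- row with pp_propname = 'wikibase_item'.
SELECT p.page_title
FROM page p
JOIN categorylinks cl ON cl.cl_from = p.page_id
LEFT JOIN page_props pp ON pp.pp_page = p.page_id
    AND pp.pp_propname = 'wikibase_item'
WHERE p.page_namespace = 0             -- main (article) namespace
  AND cl.cl_to = 'Some_category'       -- hypothetical category name
  AND pp.pp_page IS NULL;              -- no Wikidata connection recorded
```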
[02:09:57] You could probably do that on labs via SQL [02:10:08] but who created an article is not part of Wikidata as well [02:10:13] so you need to use other tools [02:18:19] hoo, is there a channel for bot specialists [02:18:22] ? [02:20:15] Amir1: hi [02:20:26] Amir1, get my PM? [02:20:34] hey [02:20:37] yeah [09:31:57] aude: thanks for the merges [09:32:29] sure [09:32:42] can't believe that stuff has been wrong for so long too ... [10:05:49] Lydia_WMDE: around? :P [10:39:10] addshore: jep [10:42:41] hello Wikidatians! :) [10:42:41] I get some timeout errors on the Wikidata API: [10:42:41] "Failed to connect to 2620:0:862:ed1a::1: The network can't be reached" [10:42:41] Could it be that you are working on some ipv6 transition or something? [10:44:36] this error was with /usr/bin/curl and server-side requests, but the problem seems to disappear when I use a browser user-agent o_O [10:45:23] maxlath: what if you use an arbitrary user-agent identifying your tool? [10:46:01] oh, actually, using a custom user agent seems to work in my JS server but not with curl [10:47:41] so, maybe it doesn't have anything to do with user agents, it's just random timeouts (443), maybe depending on which machine you're loadbalanced to? [11:53:08] Lydia_WMDE: I was going to ask you a question about counting references, and exactly what we want, but I don't think I need to any more ;) [11:53:22] addshore: hehe ok [11:53:26] I rewrote the reference dump counting thing in java using the toolkit and am running it now, and will see how quick it is [11:53:34] hopefully it doesn't take 6 hours like the php one... [11:53:42] \o/ [11:53:46] keeping fingers crossed [11:53:54] but then I should be able to run it over all the dumps we still have and AFAIK that's one of the last things ticked then [11:54:23] well, it says it has done 1 million already... [11:54:27] heh, woo!
[13:37:23] hi [13:37:30] I am bumping composer on CI to 1.0.0-alpha11 https://gerrit.wikimedia.org/r/#/c/258933/ [13:40:13] hashar: i think ok with us [13:40:26] aude: seems to fail on integration/config :D [13:40:28] err [13:40:31] operations/mediawiki-config [13:40:32] o_O [13:43:11] Lydia_WMDE: so the java one runs in 22 mins instead of 6 hours ;) [13:45:53] reverted it [13:54:20] addshore: wohooooo! [13:54:33] will shove it into graphite now [15:02:12] Lydia_WMDE: https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-references [15:03:49] addshore: i can haz percentages? absolute makes it look like we're doing very poorly while we're actually doing rather well ;-) [15:04:00] percentages are top right [15:04:43] alternatively put the total into the other two graphs? [15:04:51] it is very tempting to misinterpret them [15:04:56] total statements? [15:04:59] (and ill-meaning people will) [15:05:01] yeah [15:05:21] and referenced means any reference? or only external? [15:05:52] i think the combination that magnus did wasn't bad there [15:05:54] referenced means has any reference [15:05:59] ok [15:06:33] wikipedia is just wikipedia? or also other wikimedia projects (not sure how many we have to the other projects) [15:06:42] also yay :) [15:07:10] this list https://github.com/wmde/wikidata-analysis/blob/master/java/analyzer/src/main/java/org/wikidata/analyzer/Processor/ReferenceProcessor.java#L29 [15:07:29] looks like that is just wikipedias currently [15:07:48] yeah [15:07:53] ok [15:08:49] reload, I updated it [15:09:03] k [15:09:36] yay [15:09:38] :) [15:10:56] is it possible to split the referenced vs unreferenced graph? 
[15:11:21] what people are interested in most i guess is the split between wikimedia/non-wikimedia [15:11:28] and how that compares to all statements [15:11:38] i think magnus got that right in his first graph [15:11:45] on https://tools.wmflabs.org/wikidata-todo/stats.php [15:12:38] should be able to, give me 2 ticks [15:12:43] :D [15:13:02] * Lydia_WMDE is currently fighting with the challenge proposal... need to work in reviewer comments [15:25:55] Lydia_WMDE: refresh [15:26:07] k [15:27:01] addshore: hmm? [15:27:19] looks the same? [15:27:28] *clicks save again [15:27:53] middle right you should have reference mainsnak types [15:28:00] as a percentage [15:28:10] ah! [15:28:59] you do not like the combination like magnus did it in the first one here? https://tools.wmflabs.org/wikidata-todo/stats.php [15:29:33] i can get the same info from the graphs you have now but it is harder to get and understand it [15:29:47] well, not entirely true :P [15:29:56] ? [15:30:14] statements referenced to wikipedia is not equal to number of Wikimedia references [15:30:40] ahhhhhh [15:30:45] that part.... [15:30:46] yeah [15:30:47] narf [15:30:59] I tried to break it down at a statement level in the top row, then a reference level in the next row, and then a per-reference snak at the bottom [15:31:21] *nod* [15:31:25] ok i see the issue now [15:31:45] hmmmmm [15:31:57] thinking: could a breakdown like this work:? [15:32:01] * unreferenced [15:32:06] * referenced to wikimedia project [15:32:10] * referenced to outside [15:32:20] * referenced to mix of wikimedia and outside [15:32:22] ?
[15:32:38] and then each statement would fall into one of those categories [15:32:55] well, that's basically what you have on the right, but it's in 2 graphs right now :P [15:33:14] *nod* [15:33:35] *will wait to see what it looks like when it's all filled up from the last dumps and then look again* [15:33:41] ok [15:36:13] right now it looks like all of the trends are going in the right direction Lydia_WMDE ;) [15:36:22] \o/ [17:23:48] Good morning all. Doing some fine-tuning of a basically functional localhost wikibase installation (in the same namespace with local mediawiki). I'm noticing that, viewing a mediawiki entry that has a corresponding wikibase item, the mediawiki link to the wikibase item links to wiki/index.php/Q2 (an empty link) rather than wiki/index.php/Item:Q2 (where the item actually is). Any idea what would be causing this? The path entry in the site [17:26:37] iscutwo: on the client wiki, you need to set the repoNamespaces option in $wgWBClientSettings. see docs/options.wiki for details. [17:29:36] DanielK_WMDE: I'll have a look at it. Thanks! [17:31:40] DanielK_WMDE: Actually I'm not seeing an options.wiki in my mediawiki/docs folder. Maybe I misunderstood where you were directing me? [17:31:53] iscutwo: extensions/Wikibase/docs/options.wiki [17:32:19] DanielK_WMDE: ahhhh gotcha. On it. Thanks again! [17:32:34] np [18:00:22] Lydia_WMDE: right, now to throw some more time at that reference extractor thingy ;) [18:07:11] addshore: what! y u no join teh gamez! [18:07:21] cause im in teh wrong coutries! [18:07:57] addshore: then take over teh worldz and make one big country!
[18:08:05] =o [18:08:21] only if commute travel and be 20 mins from anywhere to anywhere [18:19:03] hi, in https://www.wikidata.org/wiki/Q82516 I see " [18:19:03] subclass of [18:19:08] sorry [18:19:29] hi, in https://www.wikidata.org/wiki/Q82516 I see "subclass of" Microprocessor (Q5297) [18:19:34] that's not a microprocessor [18:19:36] that's a SOC [18:19:42] (System on a chip) [18:20:01] which contains an ARM microprocessor (like cortex A8 or something like that) [18:20:04] GNUtoo-i1ssi: feel free to edit [18:20:04] ok [18:20:06] thanks [18:20:12] I was told to ask before editing [18:20:17] By who? [18:20:52] I'm a total newbie with wikidata, so the general advice I was given was: [18:21:01] Ask before doing anything problematic [18:21:19] This doesn't sound problematic to me. [18:21:26] ok [18:21:26] thanks [18:29:05] Hi all. Sorry for re-asking a question, I asked this an hour or so ago but my irc session crashed after DanielK_WMDE replied with a very helpful answer: I'm trying to correct the default path for links between my localhost wiki and same-namespace-localhost wikibase (currently, "wikibase item" links to Q2, not Item:Q2). I know that for more information I need to read the options.wiki file, and I am, but can someone point me to the specific variable I need to chan [18:33:05] iscu: in the LocalSettings.php file for your client wiki, you should see $wgWBClientSettings somewhere [18:33:08] do you? [18:34:28] I do not, actually. I have RepoSettings but not Client. I added the RepoSettings line myself yesterday with help from aude. [18:35:10] iscu: do you have separate repo and client wikis? [18:35:17] or is it all inside one wiki? [18:35:59] DanielK_WMDE: It is all inside one wiki, with enable & use of both repo & client set to true in LocalSettings.php. [18:36:07] ah, i see [18:36:18] that's why it works without the client settings. [18:36:34] and it would actually be nice if you didn't need to set repoNamespaces in that configuration...
[18:36:46] DanielK_WMDE: And to get the links to begin to work yesterday between wbase and wiki, aude helped me yesterday and I added $wgWBRepoSettings['siteLinkGroups'] = array( 'researchwiki' ); which got me up and running [18:36:48] I can't find how to do that [18:36:50] https://www.wikidata.org/wiki/Q82516 [18:37:11] the edit button within the content permits me to edit "Set label, description and aliases" [18:37:36] I didn't find how to edit "subclass of" [18:38:14] anyway. just add this: $wgWBClientSettings['repoNamespaces'] = array( 'item' => "Item", 'property' => "Property" ); [18:38:18] iscu: --^ [18:38:29] hm, the documentation seems to be a bit off there... let me check that.. [18:38:44] DanielK_WMDE: Okay, giving it a shot. Or, are you saying that might be incorrect, and to wait first? [18:39:01] Microprocessor -> https://www.wikidata.org/wiki/Q610398 (SOC) [18:40:19] iscu: ah, my example was wrong, you need to use "wikibase-item" instead of just "item" as the key in the array. [18:40:29] (there is no good reason for this... ugh) [18:41:10] iscu: so $wgWBClientSettings['repoNamespaces'] = array( 'wikibase-item' => 'Item', .... ); [18:41:28] DanielK_WMDE: So, if I understand correctly: $wgWBClientSettings['repoNamespaces'] = array( 'wikibase-item' => "Item", 'property' => "Property" ); [18:41:52] iscu: no, also 'wikibase-property' => 'Property' [18:42:23] DanielK_WMDE: Okay: $wgWBClientSettings['repoNamespaces'] = array( 'wikibase-item' => "Item", 'wikibase-property' => "Property" ); Giving it a shot. [18:42:31] yea [18:45:12] DanielK_WMDE: We're in business! Thank you. And for more information on these settings, you recommended reading options.wiki, yeah? [18:45:53] iscu: yes. though I see that there is a slight mistake in the documentation of this setting. fixing that now. [18:46:39] DanielK_WMDE: Got it. Thanks again. For now, back to work in the wiki & database. Warm thanks from chilly, snowy, southwestern usa.
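[Editor's note: summarizing the single-wiki (repo + client) configuration that ended up working in the conversation above, the relevant LocalSettings.php additions are collected below. The 'researchwiki' group name is the arbitrary label chosen earlier in the discussion; everything else is exactly as stated in the chat.]

```php
// LocalSettings.php additions for one wiki acting as both Wikibase repo and client,
// as worked out in the conversation above.

// Make the site link group defined in the imported sites.xml usable on the repo
// ('researchwiki' is the arbitrary group label chosen earlier):
$wgWBRepoSettings['siteLinkGroups'] = array( 'researchwiki' );

// Point client-side links at the repo's Item/Property namespaces.
// Note the 'wikibase-item' / 'wikibase-property' keys — not 'item' / 'property'.
$wgWBClientSettings['repoNamespaces'] = array(
	'wikibase-item' => 'Item',
	'wikibase-property' => 'Property',
);
```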
[18:47:22] benestar: WikimediaMediawikiFactoryFactory :D [18:48:16] addshore: \o/ [18:48:32] *headasplode* [18:48:38] :D [18:49:02] addshore: gaaah [18:49:06] It's a factory for wikimedia that gives you mediawiki factories when you pass it a wikimedia siteid :D [19:31:03] I've got new PhpStorm licenses if anyone is in need [20:19:47] addshore: If you say stuff like that I have to post http://projects.haykranen.nl/java/ ;-) [20:20:29] multichill: heh! I love that thing ;) [20:21:16] Hehe, it's quite useful when people come up with silly names :-) [23:02:53] Lydia_WMDE: it's super late, but meh, I did some refactoring to make that thing I proofed easier to add to, and then added referencing actors in about 2 mins https://www.wikidata.org/w/index.php?title=Q43968&diff=prev&oldid=283814380 [23:03:38] Amir1! you pinged me last night? :P [23:07:11] * aude enabling data access for meta-wiki in a few minutes :) [23:07:17] :) [23:12:09] metawiki doesn't sound like a big consumer of wikidata ;) [23:17:08] Reedy: but i can haz arbitrary kitten on my global user page :D [23:25:32] xD [23:27:35] and there were references https://www.wikidata.org/w/index.php?title=Q638386&diff=283817893&oldid=283448243 [23:27:57] global kitten! (e.g. https://ak.wikipedia.org/wiki/Odwumany%C9%9Bni:Aude)