[01:12:24] addshore: yes, I'm reloading it. I told icinga to shut up, but it looks like it ignored it
[02:08:31] hi everyone! I'm in the middle of editing this item: https://www.wikidata.org/wiki/Q21546271
[02:08:33] based on http://adb.anu.edu.au/biography/bradley-joseph-5332
[02:08:54] I'm wondering: do I have to attach references to each of the statements I make based on the ADB entry?
[02:09:24] and do I list the source as the ADB or as the specific volume of the ADB in which this person is described?
[07:02:00] Hi
[10:33:09] Lydia_WMDE: I think there might be things for all of the things now
[10:33:28] addshore: lol. excellent. will have a look right now
[10:33:35] excluding the few things that are at the bottom of the description on https://phabricator.wikimedia.org/T119182, which I'm not 100% sure how best to tackle yet
[10:34:34] Lydia_WMDE: if anything is missing just make a ticket for it! :)
[10:34:36] Lydia_WMDE: am I right here? https://phabricator.wikimedia.org/T118322#1830867
[10:35:01] addshore: all looking good \o/
[10:35:04] benestar: looking
[10:35:55] benestar: commented
[10:37:50] thanks!
[11:18:13] oh my, are there even any indexes on wb_terms?
[11:18:23] the performance is so bad...
[11:20:01] well, I guess I'm not doing a query it was designed for, but bah!
[12:11:55] (╯°□°)╯︵ ┻━┻ https://www.wikidata.org/wiki/Wikidata:Wikimania_2016 <-- Last week for comments!
[12:41:35] Adrian_WMDE: I'm looking at where the filter method can be used
[12:41:55] And I'm kinda thinking perhaps forcing the user to have an implementation of this interface was not a good idea
[12:42:01] Kinda bothersome
[12:42:10] Esp since PHP 5.x does not have anon classes
[12:42:28] Yeah
[12:42:37] And you can still put the stuff in an object if you want, without using an interface and without wrapping it in an anon function, by using __call
[12:44:00] If you want to use magic you can also do it the other way round (see http://www.clock.co.uk/blog/mimicking-anonymous-classes-in-php-using-closures)
[12:44:19] i.e. new ClosureStatementFilter(function () {})
[12:44:36] with ClosureStatementFilter::statementMatches calling the constructor parameter
[12:45:00] Hmm, good point
[12:45:13] I think we might already have stuff like this somewhere in WB
[12:45:23] Might be good to add this to DM
[12:50:20] hey JeroenDeDauw and Adrian_WMDE :)
[12:50:40] Adrian_WMDE: quickly, hide, I'll distract him with a NyanData
[12:56:09] Adrian_WMDE: benestar: https://gerrit.wikimedia.org/r/#/c/255370/1
[12:56:13] Also see the follow-up
[12:57:01] Some prayer-driven development here, as I've got no working MW install :D
[12:57:27] Which means I can only do very simple refactoring like this
[12:58:41] JeroenDeDauw: is that the implementation of DM or of thiemo's thing in Services?
[12:59:15] benestar: the DM one
[12:59:25] benestar: else StatementList->filter would not take it
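
A minimal sketch of the closure-wrapping pattern discussed at 12:44, assuming DataModel's StatementFilter interface with its single statementMatches() method as named in the chat; whether such an adapter actually lands in DM is exactly what is being weighed above, and the P31 check in the usage example is purely illustrative:

    use Wikibase\DataModel\Statement\Statement;
    use Wikibase\DataModel\Statement\StatementFilter;

    /**
     * Adapter that lets PHP 5.x callers (no anonymous classes) satisfy
     * StatementFilter with a plain closure instead of a dedicated class.
     */
    class ClosureStatementFilter implements StatementFilter {

        private $function;

        public function __construct( callable $function ) {
            $this->function = $function;
        }

        /**
         * @see StatementFilter::statementMatches
         */
        public function statementMatches( Statement $statement ) {
            return call_user_func( $this->function, $statement );
        }

    }

    // Usage: keep only P31 statements without writing a filter class.
    $instanceOf = $statements->filter( new ClosureStatementFilter( function ( Statement $s ) {
        return $s->getPropertyId()->getSerialization() === 'P31';
    } ) );

This keeps StatementList::filter() typed against the interface while sparing callers a one-off class per filter.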
[13:00:21] oops, I broke it
[13:00:23] meh
[13:00:45] * JeroenDeDauw grumbles about evil frameworks being such a pain
[13:03:00] JeroenDeDauw: cool to finally see that stuff in action :)
[13:03:54] How the smag did I refactor the ->toArray() call away >_>
[13:05:38] benestar: Adrian_WMDE: take two https://gerrit.wikimedia.org/r/#/c/255374/
[13:13:11] Looking at this stuff and being reminded how broken PHP collection support is makes me depressed
[13:18:49] !merge 255374
[13:18:50] merge merge merge MERGEEEEEEEEEEEEE https://gerrit.wikimedia.org/r/#/c/255374/
[13:33:21] this may be important (especially for JeroenDeDauw) https://github.com/aleju/cat-generator
[13:36:51] Jens_WMDE: yep, work done today: -1
[13:43:47] addshore: y u no merge
[13:53:08] (╯°□°)╯︵ ┻━┻ https://www.wikidata.org/wiki/Wikidata:Wikimania_2016 <-- Last week for comments!
[15:26:59] hello, how can I link a wikidata wiki to other wikis? Here http://wikidata-reading-web-staging.wmflabs.org/wiki/Q2 I want to link the wikidata item to this wiki page http://en-reading-web-staging.wmflabs.org/wiki/Book but I don't see that wiki in the list of wikis.
[15:31:11] btw, I'm using vagrant
[17:37:24] bmansurov: AFAIK your wiki needs to be added to the sites table
[17:37:43] Tobi_WMDE_SW_NA: thanks!
[17:37:58] bmansurov: have a look at https://www.mediawiki.org/wiki/Manual:Sites_table
[17:38:22] cool!
[18:15:27] frimelle: http://link.springer.com/chapter/10.1007/3-540-44681-8_80
[18:18:51] frimelle: http://eprints.dcs.warwick.ac.uk/1453/1/cs-rr-339.pdf
[18:35:55] JeroenDeDauw: a test for GetClaimsStatementFilter.php would be great..
[18:58:38] :P
[18:58:45] benestar: if it is just copied code though ;)
[19:00:29] SMalyshev: you fixed wdqs1002 then? :)
[19:06:11] addshore: it's in the process of being reloaded
[19:06:36] ahh cool, I noticed the lag going up, but the triple count is now staying even :p
[19:07:05] should be up there in a couple of hours I think
[19:07:16] but no queries are going to it, right?
[19:07:26] Which makes me wonder what the hell the DoneCount is counting...
[19:07:29] not yet. When it's fully synced I'll turn it on
[19:07:40] it's counting the updater, I imagine
[19:07:58] the dump reload is 2-stage - load the dump, then run the updater to catch up with the delta since the dump
[19:08:15] since the dump is from Monday and now is Wednesday :)
[19:15:06] =]
[20:24:52] hi
[21:24:17] SMalyshev: it's interesting: as you're syncing, the dataset is slowly getting behind; every 1 hour it falls 30 mins behind :P
[21:24:52] somebody running a bot again?
[21:25:23] you think it's just from fast editing? *looks*
[21:25:52] we could probably set up an icinga check for the lag now
[21:26:16] it looks like some non-bots are going pretty fast
[21:27:04] https://www.wikidata.org/wiki/Special:Contributions/Coyau
[21:27:08] yeh
[21:27:20] I guess it would be useful to also have a graph of the edit rate on wikidata? :P
[21:29:54] https://www.wikidata.org/wiki/Special:Contributions/KrBot
[22:06:59] hello, anyone here... ?
[22:09:41] I have just found out that a new wikidata property has been created following my proposal, but what is the best thing to do next to get it spread?
[22:14:59] Migrant, which proposal where?
[22:17:40] I'm asking cos I am fairly new to wikidata, but the property is a clear sports-related one: https://www.wikidata.org/wiki/Property:P2350
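
A rough sketch of what Tobi_WMDE_SW_NA suggests at 17:37: registering the missing wiki in the sites table via MediaWiki's Site API, run from e.g. maintenance/eval.php on the repo wiki. The global ID and group below are made-up placeholders, and Wikibase may additionally need that group listed in its siteLinkGroups setting:

    // Hypothetical values for the staging wikis mentioned at 15:26.
    $site = new MediaWikiSite();
    $site->setGlobalId( 'enstagingwiki' );  // unique key, conventionally the dbname
    $site->setGroup( 'staging' );           // sitelink group the repo matches against
    $site->setLanguageCode( 'en' );
    $site->setLinkPath( 'http://en-reading-web-staging.wmflabs.org/wiki/$1' );
    $site->setFilePath( 'http://en-reading-web-staging.wmflabs.org/w/$1' );
    SiteSQLStore::newInstance()->saveSite( $site );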
[22:19:20] (╯°□°)╯︵ ┻━┻ https://www.wikidata.org/wiki/Wikidata:Wikimania_2016 <-- Last week for comments!
[22:20:33] SMalyshev: Are you around?
[22:20:49] and another reason I am asking is that I am thinking of suggesting other same-sport-related ones, from other databases, later.
[22:21:30] tarrow: yes
[22:22:04] Hi, I thought I might be more helpful on IRC than keeping up my email chain about blazegraph
[22:22:36] have you tried running the update script a second time? That is when it breaks for me.
[22:28:56] In any case the dumps are very small, so you can take a look if you like. I just need to find a way to get them to you
[22:33:22] http://librarybase.wmflabs.org/rdf.ttl
[22:33:35] SMalyshev: ^^
[22:34:25] tarrow: works for me the second time too
[22:34:34] ah, how odd
[22:34:39] hi guys, I need some help parsing the wikidata property descriptions into a postgres table
[22:35:06] tarrow: let me load it into a clean namespace and try again
[22:35:13] I wonder if the wikidata weekly dumps have the property descriptions defined within the value snaks.
[22:35:41] SMalyshev: thanks! I really appreciate the help.
[22:39:10] has anyone tried any such thing?
[22:39:52] tarrow: how many updates do you see after you load the dump?
[22:40:34] Quite a few I think.
[22:42:30] tarrow: the ttl there is a raw dump, right? not processed by munge?
[22:42:42] yeah, that is right
[22:43:09] and, in possibly dodgy fashion, I just gzip it and rename it to what loadData.sh expects
[22:43:26] wikidump-00001.gz or something
[22:44:16] tarrow: I think you still need to use munge
[22:44:31] ah, perhaps that is the problem then!
[22:44:38] let me see
[22:47:19] tarrow: yes, after I do munge/load and then run the updater with the proper -w option, it all seems to work ok
[22:47:49] thanks, I guess it must just be the lack of munge then
[22:48:11] Do you do both the munging and loading with the included scripts?
[22:48:25] tarrow: yes, you have to do munge since the internal data format is not exactly the same as RDF, see: https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#WDQS_data_differences
[22:48:37] just so I can replicate what you do exactly
[22:48:47] the differences are small but important
[22:49:13] I did this: munge.sh -f ~/Downloads/libase.ttl -d . -- -w librarybase.wmflabs.org
[22:49:27] where libase.ttl is the file you sent me
[22:49:49] then: sh loadData.sh -d `pwd` -n libase
[22:50:09] then, from ./tools: java -cp target/wikidata-query-tools-*-SNAPSHOT-jar-with-dependencies.jar org.wikidata.query.rdf.tool.Update --sparqlUrl http://localhost:9999/bigdata/namespace/libase/sparql -w librarybase.wmflabs.org -v
[22:50:36] (of course if you use the standard wdq prefix then use it; I used another one because I have data that I need at the wdq prefix :)
[22:50:45] s/prefix/namespace/
[22:51:09] awesome; and just to be super specific in case I made a mistake: you aren't running from the packaged version, right?
[22:51:43] hi guys, can someone guide me to where I can find the property descriptions within the json dumps?
[22:52:14] because for me the .jar is in service-0.1.1-SNAPSHOT/lib/
[22:55:48] tarrow: yes, I'm running on my working checkout. But that shouldn't be different from the packaged one functionally. Unless I miss something, of course :)
[22:56:27] if that still doesn't work for you, please send me the full sequence of commands from an empty DB, plus the log, and I'll try to reproduce here and see what's up
[22:57:08] cool; I'm afraid I can't test right now. I'm not at work and where I'm staying has an awful connection, but I'll let you know how it goes tomorrow.
[22:57:44] Thanks for all of the help; it's very kind of you. I'll try and write this all up for the next fool who comes along and wants to do the same thing as me!
[22:57:47] hey wikidata960! you have to scan the dumps, find all entities that are properties and then look in the structure for a description in a given language
[23:00:59] addshore: I actually have parsed a small part of it for my testing, but are you saying that somewhere within the terms there are properties too?
[23:01:17] I believe the properties are in there, yes!
[23:01:30] but of course, there are 19000000 items and only 2000 properties
[23:01:51] addshore: do they have the same 'Q' in front of them or are they represented with a 'P'?
[23:02:02] P!
[23:02:28] there is a bug open somewhere about adding an index of the dump, to make finding specific things easier
[23:02:55] ok awesome, then I guess I need to parse the whole dump into my db to find out. :)
[23:03:16] so far I use some wildcard queries but they return empty.
[23:03:51] wikidata960: https://phabricator.wikimedia.org/T85101
[23:04:33] right, it's after midnight again, time to hit the hay!
[23:05:11] thanks, I'll continue
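
Finally, a short sketch of the scan addshore describes at 22:57: stream the weekly JSON dump (a JSON array with one entity per line), keep only the P-prefixed entities, and print their descriptions in one language as TSV. The dump filename and the 'en' language code are placeholders, and the stream wrapper assumes PHP's bz2 extension:

    $in = fopen( 'compress.bzip2://wikidata-all.json.bz2', 'r' );
    while ( ( $line = fgets( $in ) ) !== false ) {
        // Each entity sits on its own line, terminated by a comma.
        $line = rtrim( trim( $line ), ',' );
        if ( $line === '[' || $line === ']' || $line === '' ) {
            continue; // skip the array wrapper lines
        }
        $entity = json_decode( $line, true );
        // Only ~2000 of the ~19000000 entities are properties.
        if ( !is_array( $entity ) || !isset( $entity['id'] ) || $entity['id'][0] !== 'P' ) {
            continue;
        }
        if ( isset( $entity['descriptions']['en']['value'] ) ) {
            echo $entity['id'], "\t", $entity['descriptions']['en']['value'], "\n";
        }
    }
    fclose( $in );

The TSV on stdout could then be loaded into the postgres table wikidata960 mentions with a plain COPY.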