[11:23:17] (PS1) Daniel Kinzler: Move data type code to separate directory. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/181555
[12:14:12] Merge mode on Wikidata game never works for me
[12:36:45] YuviPanda: That's quite a few!
[12:37:34] multichill: yeah
[12:37:56] multichill: I wonder what the referrer data says….
[12:38:55] Hey multichill
[12:42:52] YuviPanda: You have access to that, I don't :P
[12:42:58] hi sjoerddebruin, holiday?
[12:43:06] Yeppp
[12:43:31] multichill: these are already stripped of IP addresses, so can be made publicly accessible if needed :)
[12:45:09] sjoerddebruin: I see that Gerard is in his puppy mode again
[12:45:22] multichill: A list is a list imo
[12:45:32] yup
[12:45:46] Mixing concepts gives a huge mess
[12:46:11] I'm still not happy with P360 on categories.
[12:46:44] That's another example of mixing concepts and he's the only one using it.
[12:46:53] I understand, but I'm willing to accept that a category is some sort of list
[12:48:25] A category is an ordered set without duplicates. That's pretty close to a list ;-)
[12:49:34] But I'm not a huge fan of quantity over quality like Gerard.
[12:49:58] He goes overboard every once in a while
[12:50:27] I've encountered the item on his talk page because no properties were left.
[12:50:39] That's just rushed.
[13:07:09] sjoerddebruin: The bot got stuck for a bit, but is now running at full speed again
[13:07:14] <3
[13:07:38] Would be nice if the NTA report could be switched to once every 12 hours.
[13:08:03] Wish I could speak Dutch XD
[14:02:29] multichill: What's the counter at now?
[14:02:42] Which counter exactly?
[14:02:48] I check autolist for the numbers
[14:03:39] NTA. :P
[15:53:21] Lydia_WMDE: Thanks for adding Wikidata to https://phabricator.wikimedia.org/T85105, it wasn't possible for me.
[15:53:36] https://www.dropbox.com/s/qv851ytuo6qkx1y/Schermafdruk%202014-12-23%2016.53.34.png?dl=0
[15:53:39] sjoerddebruin: yeah but in phabricator...
[15:53:48] fix is coming i am told
[15:54:01] You can clearly call that search broken.
[15:54:12] Anyway, do you like the idea?
[15:59:03] hey wikidata - is there a convenient anchor to click to help me make links like http://www.wikidata.org/wiki/Q1#P580 ?
[15:59:21] manybubbles: nope
[15:59:46] sjoerddebruin: we definitely need to make fallbacks visible. but my thinking is we should show that for everyone
[15:59:47] Lydia_WMDE: thanks! I figured I could get it from reading the source but I wondered if there was a clicky
[16:00:12] manybubbles: well the # is simply the id of the property
[16:00:25] yeah - it's easy to build by hand
[16:00:28] manybubbles: so you can get that without reading the sourcecode. but yeah no nice clicky way
[16:00:31] right
[16:01:12] sjoerddebruin: so yeah. need to think more about how we actually want to show them. maybe some css rule is good and just have that for everyone
[16:01:31] Anyway, it needs custom styling. ;)
[16:01:36] agreed
[16:01:56] Before, when I saw an item ID, I just added a label.
[16:02:01] right
[16:02:06] that's currently not good
[16:02:13] i fear we're losing out there
[16:02:28] especially since the search doesn't take fallbacks into account yet
[16:02:51] not more search work!
[16:03:06] :P
[16:03:20] I thought the Reasonator uses a dotted underline for fallback labels.
[16:03:25] Lydia_WMDE: do you want me to add wikidata to all of our query service stuff?
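On the anchor question above: as noted, the fragment in a link like http://www.wikidata.org/wiki/Q1#P580 is simply the property id, so such links are easy to build by hand or in a script. A minimal Python sketch (the function name is made up for illustration):

    def statement_anchor(item_id, property_id):
        """Link that jumps to a property's statement group on an item page."""
        # The fragment is just the property id, e.g. Q1#P580.
        return 'https://www.wikidata.org/wiki/{}#{}'.format(item_id, property_id)

    print(statement_anchor('Q1', 'P580'))  # https://www.wikidata.org/wiki/Q1#P580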
[16:04:11] manybubbles: yeah we use it for everything the wikidata team should keep on their radar and that the community might want to follow for wikidata
[16:04:18] k
[16:33:41] Lydia_WMDE: fun: https://github.com/elasticsearch/elasticsearch/issues/9048
[16:34:37] :)
[16:36:14] wikidata doesn't seem to have _hypothetical_ dates
[16:36:25] like http://www.wikidata.org/wiki/Q858624 doesn't have one
[16:36:36] which means I have fewer good examples for the future
[16:38:27] manybubbles: Are you asking to index dates in arbitrary calendars and ranges?
[16:38:32] or just ranges?
[16:39:23] bd808: more like arbitrarily far in the past and the future.
[16:39:55] I feel like calendars are sort of irrelevant here - like, we can totally handle those transformations at parse/format times
[16:40:09] but you physically can't stick a date that far in the past in elasticsearch
[16:40:15] because it wants to use millis since epoch
[16:40:26] *nod*
[16:40:31] and that only lets you go back about 200 million years
[16:40:34] and forward that many
[16:40:53] so no indexing the hypothetical date for the heat death of the universe
[16:40:57] and you know you need to be able to do that
[16:41:15] and, now that I think about it, I'm not sure how much precision it's ok to lose
[16:41:20] date math is fun and horrible and confusing and never right
[16:41:33] because inflation happened really really really close to the inception of the univers
[16:41:36] universe
[16:41:56] bd808: yeah - I'm not even to the date math part yet
[16:42:11] I'm pretty sure Joda Time does a decent job of it though
[16:43:37] Joda-time's website says "The library internally uses a millisecond instant which is identical to the JDK and similar to other common time representations."
[16:43:49] * bd808 has not looked at the source
[16:43:52] Is the resorting of statements currently broken?
[16:44:19] I am trying to change the order of two values for a property, and it lets me change it, but I cannot save it then :(
[16:47:46] bd808: DateTime also keeps hold of a Chronology which it can use to do date math more properly. The javadocs talk about what happens when you do 2012-3-31 minus one month. At least it thinks about these things
[17:13:33] just because it became relevant for me right now: https://www.wikidata.org/wiki/Wikidata:Project_chat#COI_and_editing
[17:13:45] I shouldn't start such things just before xmas, though :P
[17:32:15] dennyvrandecic: seems like something we'll need to discuss in all languages...
[17:32:36] dennyvrandecic: yes, a very good question
[17:32:49] Harmonia_Amanda: like (almost) everything in Wikidata :)
[17:33:01] yes
[17:33:27] if the project chat is the wrong forum for the discussion, feel free to take it to the right forum, or point me to the right forum
[17:35:07] Maybe start an RfC
[17:35:46] :-/ I am not sure I am the right person to champion an RfC considering that people might think I am CoI'd about the topic itself
[17:36:28] where's GerardM when you need him?
[17:37:15] dennyvrandecic: That Freebase dataset. Does it contain a lot of authority control data? Overview somewhere maybe?
[17:37:42] multichill: with authority control data, do you mean foreign keys to other data sources?
[17:37:55] Yup, stuff like Viaf, ULAN, etc
[17:38:05] yes, quite a few of these
[17:38:13] https://www.wikidata.org/wiki/Special:Contributions/BotMultichill <- dennyvrandecic like this
[17:39:04] yes, it does
[17:40:24] what would be an appropriate reference for such authority control statements?
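A quick back-of-the-envelope check of the Elasticsearch limit discussed above: dates stored as milliseconds since the epoch in a signed 64-bit long can only cover a few hundred million years in either direction, which is why deep-time values like the heat death of the universe cannot be indexed that way. Plain arithmetic, not tied to any particular Elasticsearch version:

    # Rough range of a date stored as signed 64-bit milliseconds since the epoch.
    MS_PER_YEAR = 1000 * 60 * 60 * 24 * 365.25
    max_millis = 2 ** 63 - 1
    print('about {:.0f} million years back or forward'.format(max_millis / MS_PER_YEAR / 1e6))
    # -> about 292 million years back or forward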
[17:41:51] dennyvrandecic: https://www.wikidata.org/w/index.php?title=Q8009301&diff=182933194&oldid=158302299 is what I'm doing
[17:41:57] <3
[17:42:17] looks good
[17:42:36] Date is the date of the dump
[17:42:36] hmmm
[17:44:17] dennyvrandecic: I did quite a bit of automated processing on Commons and having an obvious data trace is very important to hunt down mistakes
[17:44:30] agreed
[17:44:49] let's say we cut such authority data out of the freebase dump
[17:45:06] would we want to just upload it (after reviewing a sample)
[17:45:12] or would we still want to curate it fully?
[17:45:28] Guess so. I wonder how many new claims this would give
[17:45:39] I only saw VIAF in Freebase
[17:45:49] It's extracted from Wikipedia and we already pretty much completely imported viaf from Wikipedia
[17:49:07] dennyvrandecic: Viaf is a really good start because http://viaf.org/viaf/data/ is available
[17:55:38] dennyvrandecic: Do you know if someone already downloaded the freebase dump somewhere on toollabs?
[17:56:01] dunno
[17:56:21] note that the freebase dump is not CC0 as it is now
[17:56:48] but we need to rerelease the parts as CC0 that are wanted
[17:57:18] but if someone cuts out the VIAF part, I'll go through the rerelease process internally
[17:57:53] Did you read https://meta.wikimedia.org/wiki/Wikilegal/Database_Rights ?
[17:58:12] I sure did :)
[17:58:32] I'm using the viaf set based on that
[17:59:59] The OCLC datasets as a whole have a license, but according to that article the individual records are not copyrightable in the US.
[18:00:08] And OCLC happens to be in the USA
[18:00:29] I am not a lawyer, so I will refrain from commenting
[18:01:03] It's the same reasoning we use for importing data from Wikipedia
[18:03:01] looks like the vi and war wiki have every animal article known to man o.o
[18:03:39] what's the difference between using "imported from" and "stated in"? I see both used for viaf for different statements on https://www.wikidata.org/w/index.php?title=Q8009301&diff=182933194&oldid=158302299
[18:04:23] nikki: We use imported from for Wikipedia. It's like a lightweight reference; it should be replaced by a real one
[18:05:47] nikki: https://www.wikidata.org/w/index.php?title=Q833774&diff=183041359&oldid=166047865 is from a different source
[18:07:27] This linking is tiring D:
[18:07:32] hmm
[18:08:58] on that one, the isni uses imported from, but the freebase id uses stated in... would that be because the freebase one refers to a specific data dump, whereas the isni one doesn't?
[18:23:10] sjoerddeafwas: Can you keep up with the NTA dishes? :P
[18:36:14] I don't need to do anything.
[18:38:43] multichill: I can't wait for the top 100 of properties. <3
[18:43:08] aude: Here? :)
[18:54:50] I'm surprised we don't have "kingdom", "phylum", "class", "order" and "family" properties
[18:57:54] GeorgeEdwardC: I think we used to, but then it was changed to "has parent taxon"
[18:58:14] less denormalization, I guess
[18:59:24] I suppose
[18:59:52] I've been adding links to enwiki on species articles. I think it'd be better if a bot would do this.
[19:05:27] GeorgeEdwardC: not sure I understand - do you mean there are missing links to enwiki articles on species?
[19:05:45] I mean this: https://www.wikidata.org/wiki/Special:Contributions/George.Edward.C
[19:06:02] There are hundreds, if not thousands, that need to be done
[19:09:09] putnik: ping
[19:09:35] hoo: ?
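For the species-linking task GeorgeEdwardC describes above, a bot could connect unconnected enwiki articles to items. A rough pywikibot sketch of the idea; the method and exception names follow pywikibot's documented API but should be treated as assumptions, and a real bot would also have to handle the monotypic-taxon cases mentioned later in the log:

    import pywikibot
    from pywikibot.exceptions import NoPageError

    site = pywikibot.Site('en', 'wikipedia')
    repo = site.data_repository()

    def ensure_item(title):
        """Return the item connected to an enwiki article, creating one if missing."""
        page = pywikibot.Page(site, title)
        try:
            # Already connected to an item.
            return pywikibot.ItemPage.fromPage(page, lazy_load=False)
        except NoPageError:
            item = pywikibot.ItemPage(repo)  # new, unsaved item
            item.editEntity({
                'labels': {'en': {'language': 'en', 'value': page.title()}},
                'sitelinks': {'enwiki': {'site': 'enwiki', 'title': page.title()}},
            }, summary='Create item for unconnected species article')
            return item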
[19:09:44] hi aude :)
[19:10:01] I've been profiling api stuff today
[19:10:11] ah
[19:10:22] mostly get entities (as that is being used by the new, awesome, version of authority control)
[19:11:00] We spend quite some time with Wikibase\Lib\Store\WikiPageEntityRevisionLookup::selectPageLatest
[19:11:34] :/
[19:11:53] what do you think about adding a getEntityRevisions (plural) function to EntityRevisionLookup?
[19:12:00] That could batch the queries and all
[19:12:17] whenever we load more than one entity (mostly going to happen in the API, I presume)
[19:12:35] if feasible and not too ugly, sounds good
[19:12:49] Shouldn't be too ugly
[19:13:05] Maybe I'll take a stab at that later on
[19:13:33] ok
[19:14:15] aude: I just saw that, whenever you go to "Germany" on ruwiki, they load that: https://www.wikidata.org/w/api.php?callback=jQuery111102415136429040846_1419362002729&format=json&action=wbgetentities&props=claims&ids=Q183&_=1419362002730
[19:14:28] Just to get the commons category to link it in the sidebar
[19:14:39] Hope I can make them use wbgetclaims
[19:14:40] what? really?
[19:14:45] yep :(
[19:15:31] is it a gadget?
[19:15:46] for all page views?
[19:15:53] https://ru.wikipedia.org/wiki/MediaWiki:Sidebar-related.js
[19:15:55] not a gadget
[19:15:59] loaded via common.js
[19:16:03] a script
[19:16:12] all the time, yes
[19:16:21] would be ugly but they could get it from lua
[19:16:31] true
[19:16:31] kind of like itwiki does, then it's only on parse
[19:16:45] but would require adding a template to *all* pages
[19:16:46] ugly
[19:16:55] :/
[19:17:08] At least using wbgetclaims should help for now
[19:17:28] Just getting the Germany json took 1.1s for me
[19:17:39] that should be about the backend response time
[19:17:44] that's quite slow
[19:19:26] aude: Does WMF use MemcachedPeclBagOStuff?
[19:19:55] i think so
[19:20:04] per config, afaik
[19:20:06] ok, then we can also make use of getMulti
[19:20:30] 'class' => 'MemcachedPeclBagOStuff',
[19:20:46] :)
[19:20:54] Away for food
[19:20:58] ok
[19:20:59] going to have a look after
[19:28:49] ugh - http://www.wikidata.org/wiki/Q30 is really slow in my firefox
[19:28:58] javascript spins for seconds
[19:33:02] manybubbles: :/
[19:37:17] manybubbles: a bit slow but loading here
[19:37:50] * aude has used up my high speed data for the month and thus doesn't dare to look :P
[19:38:03] shall have more datas tomorrow :)
[19:45:54] manybubbles: Fairly slow for me
[19:46:09] GeorgeEdwardC: I'm having a quick look at it
[19:46:22] Then again it's a nearly 300,000 byte item
[19:46:59] Well Germany is 1 mil bytes o.o
[19:47:01] GeorgeEdwardC: which is 1/3 the size of Germany
[19:50:09] Question: Why are there gaps in the Q numbers, with no deletion log?
[19:50:43] Because we reserve ids before it's absolutely sure that a page can be created
[19:51:04] Thus we sometimes reserve ids that will never be used because the page can't be created for some reason
[19:51:31] Ah
[20:00:07] Label collector doesn't work
[20:00:14] Either that or it just takes an age to load
[20:01:28] GeorgeEdwardC: We killed those properties. Now it's taxon rank and parent taxon
[20:02:31] sjoerddebruin: I meant the constraint violations. Quite a few new ones for NTA
[20:02:52] Hm, maybe take a look later tonight.
[20:03:21] hoo, pong.
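The lighter call hoo wants the ruwiki script to use fetches just the one claim the sidebar needs (P373, the Commons category) instead of every claim on Q183. A minimal sketch of such a request with Python's requests library, error handling omitted; the parameters match the wbgetclaims example given to putnik later in the log:

    import requests

    # Ask only for the Commons category (P373) claim on Germany (Q183),
    # rather than loading all of the item's claims via wbgetentities.
    resp = requests.get('https://www.wikidata.org/w/api.php', params={
        'action': 'wbgetclaims',
        'entity': 'Q183',
        'property': 'P373',
        'format': 'json',
    })
    claims = resp.json().get('claims', {}).get('P373', [])
    if claims:
        print(claims[0]['mainsnak']['datavalue']['value'])  # the Commons category name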
[20:03:56] putnik: https://ru.wikipedia.org/wiki/MediaWiki:Sidebar-related.js
[20:04:06] We should probably get a bot to link species articles to items
[20:04:12] Since there are so many
[20:04:15] Could you please change that to not load all claims?
[20:04:25] You should use https://www.wikidata.org/w/api.php?action=help&modules=wbgetclaims
[20:04:29] and only load the data you really need
[20:05:26] Like this: https://www.wikidata.org/w/api.php?action=wbgetclaims&entity=Q42&property=P373&format=json
[20:06:12] GeorgeEdwardC: Can you give an example? AFAIK loads of them are linked
[20:06:50] Woo-hoo!
[20:07:09] I found several of them in Special:UnconnectedPages over at En
[20:07:14] I've already attached load
[20:07:16] *loads
[20:07:18] I been waiting for this for a long time =)
[20:07:43] putnik: Has been there for a long time :D But not sure it has been there for as long as the script has existed
[20:08:07] anyway, that can cut down page load times quite a bit for pages that have big items associated (worst example: Germany)
[20:08:46] GeorgeEdwardC: enwp is a bit of a mess when it comes to taxons because enwp has one article for monotypical taxons and we will have two items (genus and species)
[20:09:46] There just seems to be an overabundance of unconnected species pages
[20:10:06] They're often sub-stubs with a taxobox
[20:20:48] putnik: If you need help / code review, I can do that :)
[20:25:57] aude: getEntityRevisions or getEntitiesRevisions
[20:26:01] mh, first one, I guess
[20:26:07] (function name)
[20:28:01] hoo: probably the first
[20:28:39] still could be confusing though... does it get multiple revisions of one entity or of many entities?
[20:29:04] but since we have "EntityRevision" object, then not as confusing
[21:11:15] hoo, done.
[21:12:10] Awesome!
[22:25:18] Anyone able to edit https://www.wikidata.org/wiki/Q5582? I want to add nm0994883 but I don't see an edit button
[22:26:05] multichill: Broken for me :(
[22:26:24] I can edit.
[22:26:27] For some reason I think the protection is involved
[22:28:34] wbUserCanEdit
[22:28:34] 23:28:23.572 true
[22:29:14] purging fixed it
[22:29:25] Nasty bug... we probably cache the edit buttons in whatever state they are
[22:29:30] and that depends on the user's rights
[22:29:41] ouch
[22:29:49] hoo: will you open a bug?
[22:29:51] Lydia_WMDE: ^ :(
[22:30:08] Given that Lydia is not around, yes
[22:30:37] hoo: I'm an admin on wikidata so I should be able to edit it even if it's protected
[22:30:52] multichill: Sure... but you need to purge it, because of said bug :(
[22:31:24] yes, but that means we're showing the same page to everyone regardless of rights....
[22:31:34] Yes
[22:31:45] they still can't edit it (will get an error then)
[22:31:47] that's nasty
[22:34:40] https://phabricator.wikimedia.org/T85252
[22:38:34] * hoo rages
[22:38:43] can anyone add project Wikidata to that task?
[22:39:21] Phab won't let me...
[22:41:07] ah, darn, locations are still earth only...
[22:41:33] hoo: ditto
[22:43:02] ditto
[22:43:07] hoo: I had that too earlier this week! I thought it was just me being an idiot with phab
[22:43:24] dennyvrandecic: Are you sure about that?
I can vaguely remember having a talk with aude about that
[22:43:35] hmm
[22:43:42] looks like phab shows only 5 suggestions
[22:43:44] I wanted to add the coordinates for this https://www.wikidata.org/wiki/Q11762294
[22:43:51] and 'wikidata' does not make it into them
[22:44:06] Yeah, I guess so
[22:44:15] and there's no apparent way to bypass that thing
[22:44:20] hoo: sub'd -bugs as a sub :p
[22:44:24] it's on Triton (Just pressed random item accidentally, and saw this, and tried to figure out what it even is)
[22:45:32] hmm, maybe like this
[22:46:47] dennyvrandecic: Ah, I think I remember now... at some point adding non-earth coordinates was possible via the API, but not the UI
[22:46:54] that's why they were removed
[22:47:13] hoo: ah, makes sense. so it's future work :)
[22:47:47] hoo: We imported quite a few non-Q2 coordinates
[22:47:51] Pywikibot supports it
[22:47:59] Is that still working?
[22:48:21] Probably, done importing them quite some time ago. katie knows best I guess
[22:56:37] hoo: mwahahaha >:D
[22:56:51] the api actually returns the whole suggestion list
[22:56:54] How'd you do that?
[22:57:08] I copied the DOM element of the existing project
[22:57:18] inside it is a hidden input
[22:57:22] with project id
[22:57:34] I swapped the project id with the one returned by the api
[22:57:39] :p
[22:58:11] but that limiting is kinda lame
[23:39:25] !admin
[23:40:21] Ber, ?
[23:40:25] yes?
[23:40:36] hi Ash_Crow_ and hoo
[23:40:46] I need a protect here --> https://www.wikidata.org/wiki/Q293130
[23:41:15] edit war at es.wiki. Coat of arms maybe wrong
[23:41:43] we are waiting for some sources
[23:42:49] Ber, I protected the article
[23:43:32] thanks a lot, Ash_Crow :)
[23:58:12] Feels odd to be creating opbject with no name in a language I speak, lol.
[23:58:23] *objects
[23:58:48] (is entering data for Dutch artists from the RKDartists database
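On the non-earth coordinates discussed above ("Pywikibot supports it"): pywikibot's Coordinate value takes a globe, so such statements can be written through the API even while the UI only offers Earth. A rough sketch against the sandbox item, with made-up coordinate values; 'mars' is used as the globe purely for illustration, and whether other globes such as Triton (for Q11762294) are in pywikibot's globe table is an assumption to verify:

    import pywikibot

    site = pywikibot.Site('wikidata', 'wikidata')
    repo = site.data_repository()

    item = pywikibot.ItemPage(repo, 'Q4115189')        # Wikidata sandbox item
    claim = pywikibot.Claim(repo, 'P625')               # coordinate location
    target = pywikibot.Coordinate(lat=10.0, lon=20.0,   # made-up values
                                  precision=0.01,
                                  globe='mars',          # a non-earth globe
                                  site=repo)
    claim.setTarget(target)
    item.addClaim(claim, summary='Testing a non-earth coordinate')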