[00:06:56] rschen7754: you're a polar bear
[00:08:47] Vogone: You're a xchannel spammer.
[00:54:08] multichill: You asked "Do you know why the English Wikipedia mixes nth_century and nth-century? See for example https://en.wikipedia.org/wiki/Category:14th_century_in_Asia". Yes, the difference can be seen in the examples "14th-century monarchs in Asia" and "14th century in Mongolia". A hyphen is used between "14th" and "century" when the two words act as a single adjective phrase...
[00:54:09] ...(https://en.wikipedia.org/wiki/Adjective_phrase). E.g. in "14th-century monarchs", "14th-century" is a single adjective. In the phrase "14th century in Mongolia", "14th century" is a noun phrase, not an adjective phrase. Adjective phrases are often hyphenated to help clarity. More: https://en.wikipedia.org/wiki/English_compound#Hyphenated_compound_modifiers.
[00:54:12] Romaine still here? I have some issues with pages which have been deleted
[00:54:40] hi
[00:54:56] Emw: Ah right, but it looks like some people messed this up, because a lot of them were renamed
[00:55:32] I left the state library and am at home now
[00:57:52] I see well-educated native English speakers omit hyphens in compound modifiers all the time. I'm sure I use them inconsistently.
[00:58:34] Romaine, what is your user name? Are you an admin?
[04:14:53] Wikidata makes me sad, properties should automatically sort by P number by now (or something). It's weird going to different items and seeing properties in different orders
[04:15:56] Moe_Epsilon: We have ordering now. {{sofixit}} :p
[04:16:44] {{sofixalltheitems}}
[04:16:44] XD
[04:51:05] How about reading all of the things and ordering them in your mind? All this user interface stuff is so overrated!
[04:53:38] Yeah what
[04:53:53] Take the JSON output and stick it in a Python dictionary, which has no order!
[04:54:04] well, that'd be great, if we didn't already have 100 million items xD
[05:59:03] legoktm: Can you help me determine a property violation for Property talk:P40?
[05:59:23] possibly, what's up?
[05:59:27] trying to ensure that the property isn't being used for non-humans
[06:00:06] i.e., a horse parent doesn't have a child
[06:00:16] the > Type "person (Q215627)": element must contain property instance of (P31) with class person (Q215627) or its subclasses (defined using subclass of (P279)) < constraint should do it, I think
[06:00:47] https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P40#Type_Q215627
[06:00:48] hrmmmm
[06:01:25] * Gloria violates legoktm's constraints.
[06:01:37] sDrewth: https://www.wikidata.org/wiki/Q44204 is a weird edge case.
[06:02:24] sDrewth: also, unsure if it makes sense to use Q5, human
[06:02:46] https://www.wikidata.org/wiki/Q79999 is similar
[06:03:08] hmm
[06:03:37] all fictional characters!
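(Not from the log: a minimal Python sketch of the client-side workaround joked about above, i.e. fetch the raw item JSON yourself and sort the properties by P number. It assumes the standard Special:EntityData JSON endpoint; Q42 is just a placeholder item ID, and a real client should also send a descriptive User-Agent header.)

    import json
    from urllib.request import urlopen

    # Special:EntityData serves the full JSON for an item; Q42 is an example.
    url = "https://www.wikidata.org/wiki/Special:EntityData/Q42.json"
    with urlopen(url) as response:
        data = json.load(response)

    claims = data["entities"]["Q42"]["claims"]

    # The claims come back keyed by property ID in no useful order; sorting
    # the keys numerically gives the "sort by P number" display wished for above.
    for pid in sorted(claims, key=lambda p: int(p[1:])):
        print(pid, "-", len(claims[pid]), "statement(s)")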
[06:04:06] hahaha
[06:04:36] and is probably being used in fictional stuff too :-/
[06:05:54] we never did figure out the fictional properties versus real ones :/
[06:06:05] * sDrewth splits/forks child to have real and make-believe
[06:06:34] leave it then, too hard for today
[06:07:35] still enough issues getting PEOPLE removed from fictional characters
[06:08:14] "fake child" xD
[08:37:23] legoktm, sDrewth: please only use P31 Q5 on real humans
[08:38:39] see https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Migrating_away_from_GND_main_type#P31_value_for_things_like_Coco_Chanel
[08:39:02] and https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Migrating_away_from_GND_main_type#Fictional_entities
[08:40:25] the most interesting task force on Wikidata is also relevant here: https://www.wikidata.org/wiki/Wikidata:Fictional_universes_task_force
[09:33:23] Emw: umm, of course
[09:35:10] that was not the basis of the discussion, it was about checking for the use of the property child applied to non-humans
[09:54:17] does anyone know of a pre-populated {{infobox}} template with the #property completed?
[09:54:30] or have one with all the bits matching?
[12:57:44] http://ultimategerardm.blogspot.nl/2014/01/eusebio.html
[12:58:02] How do I qualify a soccer season?
[13:12:43] GerardM-: I think you're linking to the wrong Eusebio in your blog post...
[13:13:57] and have the picture of the wrong guy...
[13:14:16] (or the right picture linked to the wrong guy)
[13:18:04] mineo??
[13:18:34] you link to an Eusebio who's 49
[13:18:50] you are right
[13:19:33] I have updated the article
[13:19:40] now about the question ... ?
[14:48:25] Is autolist updated every day? The number of people dead in 1943 has been the same since Wednesday despite my edits https://twitter.com/MonsieurBraun/statuses/418224453676326912
[15:10:50] (PS1) Mushroom: Fix ItemByTitleHelper site/title combinations [extensions/Wikibase] - https://gerrit.wikimedia.org/r/105511
[15:26:34] (CR) Hoo man: [C: -1] "This no longer allows an equal number of sites and titles" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/105511 (owner: Mushroom)
[17:47:49] question, using the API on a given wiki, how do I know what Wikidata record is linked to a page?
[17:48:57] (or do I have to regex the parsed page for the Wikidata URL ending in Q\d+?)
[17:49:02] Beetstra: Are you asking "how do I get the Wikidata ID of a page using the API?"
[17:49:12] yes
[17:50:30] I searched 'api.php' on en.wikipedia for 'wikidata' and similar - no hits
[17:50:46] Either not documented, or not implemented?
[17:51:59] Nothing is jumping out at me on the extension page for the client. https://www.mediawiki.org/wiki/Extension:Wikibase_Client
[17:52:03] for API
[17:52:05] which is weird
[17:52:13] (because that would make sense, yeah?)
[17:52:24] yep
[17:53:22] Do I have to do it the other way around then ..
[17:53:50] I'm skeptical that you need to parse the whole page. What exactly are you trying to do?
[17:55:10] OK, say we have a piece of data A on en.wikipedia.org on page B (that data A is part of the content, say the birthday of John Doe). Now, I expect that the data on Wikidata is correct .. and I can bot-parse it out of the en.wikipedia page .. how do I figure out what the data is on Wikidata
[17:55:33] Now, I would click in the tools bar on the left on 'Data item', find the birthday - compare (that is manual)
[17:56:05] validity checking?
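(Not from the log: a minimal Python sketch of the kind of type check behind the P40 constraint thread above: look up an item's instance of (P31) claims and test for human (Q5). The wbgetclaims call is the same one that turns up later in this log; the is_instance_of_human helper name is invented for illustration, and the sketch deliberately skips following subclass of (P279), which the real constraint definition requires.)

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    API = "https://www.wikidata.org/w/api.php"

    def is_instance_of_human(item_id):
        """Check whether an item has an instance of (P31) claim with value human (Q5)."""
        query = urlencode({"action": "wbgetclaims", "entity": item_id,
                           "property": "P31", "format": "json"})
        with urlopen(API + "?" + query) as response:
            claims = json.load(response).get("claims", {})
        for claim in claims.get("P31", []):
            # Skip novalue/somevalue snaks, which carry no datavalue.
            value = claim["mainsnak"].get("datavalue", {}).get("value", {})
            if value.get("id") == "Q5":
                return True
        return False

    # Q44204 is the edge case from the log: a fictional character, so a
    # "P40 only on humans" check should come back False for it.
    print(is_instance_of_human("Q44204"))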
[17:56:29] What I can also do is 'Birthday -> Property X', and find on 'Property X' what links back to the en.wikipedia page .. but that .. seems weird
[17:56:42] Yes, see [[:w:en:User:CheMoBot]]
[17:57:12] (I know, it will become obsolete when the data can be back-linked from Wikidata - until then ..)
[17:57:55] What CheMoBot now does is that it has a revid for a given page where it knows the data is correct, and it compares with that ..
[17:58:06] It makes sense to compare with Wikidata ..
[17:58:15] It's a good starting place, sure.
[17:58:33] But maybe you don't know that birth date isn't type Item but type Time, for example.
[17:58:37] And I have free time now, so I'd like to at least start with the implementation of it
[17:58:43] You wouldn't have to bounce all over the place for it.
[17:59:11] There are caveats - I have to see whether it is even possible
[17:59:46] But some 'immutable' data is - most 'identifiers' can only be represented in one way - birthdays, boiling points, weights, densities are another problem
[18:02:11] I mean, the infobox on [[:w:en:Benzene]] has 'CASNo = 71-43-2', Wikidata has Q2270, P231 = 71-43-2 ... I tell the bot that CASNo in ChemBox = Wikidata P231, and get both sides and compare
[18:02:39] But .. how does my bot know that Benzene = Q2270?
[18:03:55] You might have a look at https://en.wikipedia.org/wiki/Module:Authority_control
[18:04:10] I know they do checking in that template
[18:06:25] alternatively, https://en.wikipedia.org/wiki/Module:Wikidata
[18:06:30] might be moderately more helpful
[18:07:44] bingo!!
[18:07:46] https://en.wikipedia.org/w/api.php?action=query&prop=pageprops&titles=Benzene
[18:08:17] :D
[18:09:01] Thanks!
[18:09:58] (Which of my links helped, btw? hah)
[18:11:17] oh, the first one talked about 'property' .. which led me to think that it is info about, or a property of, the page - which I knew I could get from the API
[18:11:48] :D
[18:12:23] Now, to get readable data from the API on Wikidata
[18:12:42] .. https://www.wikidata.org/w/api.php?action=query&titles=Q2270&prop=revisions&rvprop=content&format=xml ... hmm ..
[18:12:58] https://www.mediawiki.org/wiki/Extension:Wikibase/API
[18:13:58] ... {"m":["value",231,"string","71-43-2"],"q":[],"g":"q2270$597c9dfa-4bcb-ee30-7af4-6cda1f579ac8","rank":1,"refs":[]} ...
[18:16:14] better: https://www.wikidata.org/w/api.php?action=wbgetclaims&entity=Q2270&property=P231
[18:17:42] Aaah .. this is going to be easy :-D
[18:18:10] And this is why Wikidata exists.
[18:18:12] :P
[18:18:17] eh .. no
[18:18:39] the hard work is to parse the right data out of an infobox from the wikicode of a page on a wiki ..
[18:19:24] Indeed.
[18:20:01] And that part is done .. but now the 'verified' data is not coming from an old revid anymore
[18:20:50] This may be up after two or three hours of programming
[18:20:58] Should run before the weekend, I would say
[18:22:22] Actually, the hard part is to botwise update the infobox ..
[18:23:46] * Beetstra sees 15,000 edits coming for his bot :-D
[18:25:58] Anyway, thanks for the hints!
[18:27:06] I'll try
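(Not from the log: a minimal Python sketch of the full round trip Beetstra ends up with above: prop=pageprops to map a page title to its item ID, then wbgetclaims to pull the identifier itself. Both API calls are taken straight from the log; the api_get helper is invented for this sketch, and error handling and a proper User-Agent header are omitted for brevity.)

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def api_get(base, **params):
        """Tiny helper (invented for this sketch) for GET requests to a MediaWiki API."""
        params["format"] = "json"
        with urlopen(base + "?" + urlencode(params)) as response:
            return json.load(response)

    # Step 1: page title -> item ID, via the pageprops trick found above.
    pages = api_get("https://en.wikipedia.org/w/api.php",
                    action="query", prop="pageprops", titles="Benzene")["query"]["pages"]
    item_id = next(iter(pages.values()))["pageprops"]["wikibase_item"]  # "Q2270"

    # Step 2: item ID + property -> claim value, via wbgetclaims.
    claims = api_get("https://www.wikidata.org/w/api.php",
                     action="wbgetclaims", entity=item_id, property="P231")["claims"]
    cas_number = claims["P231"][0]["mainsnak"]["datavalue"]["value"]  # "71-43-2"

    # Step 3: compare against whatever the bot parsed out of the on-wiki {{Chembox}},
    # which the log rightly calls the hard part.
    print(item_id, cas_number)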