[00:03:48] https://phabricator.wikimedia.org/T47839#1806763 the fact that removing enwiki added bytes is odd... whitespace in the json formatting, affecting the way it's parsed?... again, just bouncing what comes to mind...
[00:09:06] https://3v4l.org/WJ8sV \json_decode doesn't seem to mind whitespace where appropriate though
[09:18:35] Hello everyone, I'm having some issues with my Wikidata installation, any help would be appreciated
[09:23:21] I'm getting stuck on an "Internal server error" after a MediaWiki + Wikibase repo installation via git, plus an update of Wikibase via composer. Wikibase is functional, but after adding 1 item or 1 property, any further addition of items returns an internal server error, and I have to wait several minutes before it works again, for only 1 more addition and then the error again.
[09:24:34] I have no particular errors in my PHP logs or Apache logs, just error 500.
[13:56:45] is there a wikibase-specific db layout
[14:02:37] Wikibase adds some tables Alphos
[14:02:47] i know :)
[14:03:00] wbc_entity_usage for instance ^^
[14:03:18] So the database layout isn't Wikibase-specific. It's general MediaWiki + extras
[14:03:22] i want to know how they relate to each other
[14:04:19] well, i want the layout for tables not in https://www.mediawiki.org/wiki/Manual:Database_layout
[14:04:47] Alphos: https://www.mediawiki.org/wiki/Wikibase/Schema
[14:04:55] thanks !
[14:05:06] Looks outdated and incomplete
[14:05:24] But it's a start
[14:06:37] Alphos: The Wikibase developers are very good at hiding the documentation. Quite a few things are documented, but no clue where, and a lot of stuff just isn't documented
[14:06:43] It's almost a real software project
[14:06:50] :D
[14:07:57] wb_items_per_site seems promising for what i'm looking for
[14:11:28] Filed https://phabricator.wikimedia.org/T124603
[14:11:35] What do you want to do exactly?
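The wb_items_per_site table mentioned above maps items to their sitelinks, which is the starting point for a "sitelinks that are redirects" report. A minimal sketch of the kind of query involved, using the real column names of wb_items_per_site (Wikibase) and page (MediaWiki core) but an in-memory SQLite stand-in with toy data, so it is an illustration of the join rather than a production query:

```python
# Sketch: sitelinks whose target page is a redirect. Real column names,
# toy data in an in-memory SQLite database standing in for the wiki DB.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE wb_items_per_site (
    ips_row_id   INTEGER PRIMARY KEY,
    ips_item_id  INTEGER,  -- numeric part of the Q-id
    ips_site_id  TEXT,     -- e.g. 'enwiki'
    ips_site_page TEXT     -- title of the linked page
);
CREATE TABLE page (
    page_id          INTEGER PRIMARY KEY,
    page_title       TEXT,
    page_is_redirect INTEGER
);
INSERT INTO wb_items_per_site VALUES (1, 42, 'enwiki', 'Douglas Adams'),
                                     (2, 64, 'enwiki', 'Old Title');
INSERT INTO page VALUES (10, 'Douglas Adams', 0), (11, 'Old Title', 1);
""")

# Sitelinks pointing at redirects: candidates for a periodic report.
rows = conn.execute("""
    SELECT ips_item_id, ips_site_page
    FROM wb_items_per_site
    JOIN page ON page_title = ips_site_page
    WHERE page_is_redirect = 1
""").fetchall()
print(rows)  # → [(64, 'Old Title')]
```

On a real wiki the join would also need the namespace and the redirect target (the redirect table) to check whether the target links back to another item.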
[14:12:31] Alphos: ^
[14:15:09] looking for items that point to wikilinks that are redirects to pages that link back to another item
[14:15:20] and publishing periodic reports on that ^^'
[14:18:28] the good news is there may yet be a somewhat sqlish solution to my problem !
[14:40:22] fun fun fun https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Bot/Statut#ListeriaBot
[14:45:54] Nemo_bis be quiet, everybody knows wikidata is a bad idea ! :p
[14:46:12] sometimes, i'm ashamed to call myself a french wikipedian... -_-°
[16:16:52] addshore i have something that *partially* works !
[17:19:16] addshore : scratch that : i have something that *completely* works !
[17:19:47] bit sluggish, elwiki (114k pages) takes a little over 2 minutes
[17:21:24] but assuming it follows a law in O(n^2), it would mean roughly 2500 times that for enwiki, so, give or take, three and a half days
[17:27:04] Alphos : have you tried wikibase-api for wikidata? (at https://github.com/addwiki/wikibase-api)
[17:29:46] HakanIST in general, or for this current project ?
[17:30:23] in general
[17:30:32] in general : no. it wouldn't especially have made my life simpler, so i just stuck to guzzle. current project : most if not all of the work can be done using sql ;)
[17:30:53] well, by "in general" I mean the previous project, which did call the api
[17:31:34] it didn't particularly work on items as items, merely as things that have revisions
[17:32:24] the example on wikidata (https://www.wikidata.org/wiki/Wikidata:Creating_a_bot#Example_1:_Basic_example) refers to it
[17:32:56] it didn't work "out of the box" for me, was wondering if you had given it a run
[17:34:01] not so far ; guzzle really was what i needed, as i didn't need to really look at the object, just its revisions, without knowledge of their actual meaning
[17:34:50] this way, rollbot works with pretty much anything mediawiki-based, although i've limited it for now to wmf wikis
[19:46:24] hi
[19:46:35] do we already read data from all templates?
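The "just look at revisions" approach described above maps onto the MediaWiki action API's action=query&prop=revisions request, which works the same against Wikidata as against any other MediaWiki wiki. A hedged sketch of what such a request looks like; only the URL construction runs here, and the actual HTTP call is left commented out so the example stays offline:

```python
# Sketch: building an action=query&prop=revisions request against the
# MediaWiki action API (real parameter names). The fetch itself is
# commented out so nothing here needs network access.
import json
import urllib.parse
import urllib.request

def revisions_url(api: str, title: str, limit: int = 5) -> str:
    """Build a request URL for the last `limit` revisions of `title`."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    return api + "?" + urllib.parse.urlencode(params)

url = revisions_url("https://www.wikidata.org/w/api.php", "Q19652")
print(url)

# To actually fetch (needs network):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     pages = data["query"]["pages"]
```

The point made in the chat holds: nothing here depends on the page being an item, so the same code works on plain wiki pages and on entities alike.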
[19:46:37] or?
[19:47:13] Nope?
[20:01:20] Hi. Should items that are not a proper noun be capitalized? Item in question: https://www.wikidata.org/wiki/Q19652
[20:08:47] acebarry: it should only be capitalised if it's a proper noun, although you'll see lots of items where the label hasn't been fixed yet (and so the label starts with a capital letter because the wikipedia page name has to)
[20:09:24] (https://www.wikidata.org/wiki/Help:Label has more information about it)
[20:09:26] Follow-up question: if I am using this property in an infobox, how would I make it capitalized without changing the Wikidata item?
[20:09:36] Thank you for taking the time to respond by the way :)
[20:11:16] I don't know the answer to that question, someone who has more experience with templates might (and if you don't get an answer here, I would suggest asking on the project chat page... more people pay attention to that :))
[20:11:34] Meaning #wikipedia ?
[20:11:54] I mean https://www.wikidata.org/wiki/Wikidata:Project_chat
[20:13:07] Awesome. Thank you again!
[20:24:36] ok I guess not, because then the template with data looks different
[20:25:06] what is the road map to harvest template data into wikidata? what needs to be done so that wikidata gets it?
[20:28:44] Juandev: There is no road map for that.
[20:28:56] Sounds to me like what people have been doing for years already.
[20:29:21] well
[20:29:41] so is there something the local community can do about it?
[20:30:05] Use https://tools.wmflabs.org/pltools/harvesttemplates/ if you want to do it yourself, or write a bot.
[20:42:34] I guess you could also start using tracking categories (like https://en.wikipedia.org/wiki/Category:Wikidata_tracking_categories), it won't import the data, but it helps with finding data that is still missing in wikidata
[20:54:32] nikki: That's an exponential problem.
Templates × fields per template × states
[20:55:03] That will give a lot of categories, so I've been a bit reluctant to do this with infobox templates and only did it for simple templates
[21:02:51] yeah, it would get a bit ridiculous if you have categories for each parameter of each template, but the ones I've seen so far haven't mentioned the template or parameter
[21:04:14] that doesn't help narrow down the exact template/parameter you want to import, but it's still useful for finding out where there's still lots to do and where there's only a handful of things that you might as well do manually
[21:06:02] nikki: The ideal situation would be to track all the template/key/value combinations in the database
[21:06:12] Then some logic could use that to compare it to Wikidata
[21:06:53] I think I talked with Magnus about that. Just take an xml dump and parse that into a database
[21:07:19] that sounds like it'd be cool
[21:13:47] I like lawiki templates
[21:13:50] https://la.wikipedia.org/wiki/Noam_Chomsky
[21:14:31] Useless shadows.
[21:16:17] I'm glad I'm not the only one who doesn't like the shadows :)
[21:17:18] Just like the gradients on some Indian projects
[21:17:51] I like the way it works off wikidata, not the shadows in particular
[21:17:53] ;)
[21:19:15] nikki: The Wikidata logo link is awful
[21:20:41] which indian ones are you thinking of? the only one that comes to mind which has gradients is the navajo one
[21:20:58] I guess the navajo one mostly stands out because of the colours
[21:27:18] sjoerddebruin: oh, btw, I did see your comment on that phabricator ticket about the maps, haven't had time to look into it yet though
[21:27:29] no worries
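The idea discussed at 21:06, take an XML dump and parse the template/key/value combinations into a database, can be sketched roughly as below. The <page>/<revision>/<text> structure matches real MediaWiki export dumps, but the template parsing here is a deliberately naive regex that only handles flat, pipe-separated parameters (a real importer would use a proper wikitext parser such as mwparserfromhell to cope with nested templates and links):

```python
# Rough sketch: extract (page, template, key, value) rows from a
# MediaWiki-style XML export. Toy inline dump; the regex only handles
# flat template calls with simple |key=value parameters.
import re
import xml.etree.ElementTree as ET

DUMP = """<mediawiki>
  <page>
    <title>Noam Chomsky</title>
    <revision>
      <text>{{Infobox person
| name = Noam Chomsky
| birth_date = 1928
}}</text>
    </revision>
  </page>
</mediawiki>"""

def template_kv(wikitext):
    """Yield (template, key, value) triples from flat template calls."""
    for m in re.finditer(r"\{\{(\w[^|}]*)((?:\|[^|}]*)*)\}\}", wikitext, re.S):
        name = m.group(1).strip()
        for part in m.group(2).split("|")[1:]:
            if "=" in part:
                key, value = part.split("=", 1)
                yield name, key.strip(), value.strip()

rows = []
for page in ET.fromstring(DUMP).iter("page"):
    title = page.findtext("title")
    text = page.findtext("./revision/text") or ""
    rows.extend((title, *kv) for kv in template_kv(text))

print(rows)
```

Rows like these, loaded into a database, are exactly the template/key/value tracking that could then be diffed against Wikidata, which is what tools like HarvestTemplates automate per template and parameter.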