[10:10:06] Thiemo_WMDE leszek_wmde https://gerrit.wikimedia.org/r/#/c/348756/2/resources/templates.php
[10:23:04] Lydia_WMDE: https://de.wikipedia.org/wiki/Spezial:ApiSandbox#action=query&format=json&list=wblistentityusage&wbeuentities=Q1
[10:23:39] Lydia_WMDE: https://www.wikidata.org/wiki/Special:ApiSandbox#action=query&format=json&list=wblistentityusage&wbeuentities=Q1
[10:24:54] Lydia_WMDE: https://www.wikidata.org/wiki/Special:ApiSandbox#action=query&format=json&list=wbsubscribers&wblsentities=Q1
[10:29:19] hiya people, bit of a suggestion: when "wikibase-validator-missing-field" occurs, it would be helpful to know which field is missing validation ^^' all we get as an error message from the api is "Missing required field \"precision\"", that's a bit unhelpful ^^'
[10:31:05] Alphos: Huh? So it says which field is missing? What do you miss then?
[10:31:22] Thiemo_WMDE precision for which property, for instance
[10:31:50] You mean when you are doing a more complex API request that edits multiple things at the same time?
[10:31:57] yup
[10:32:16] essentially repushing an old version of a given entity
[10:32:26] (which means that old version is lacking that precision)
[10:33:33] it seems geographic coordinates are the most lacking in that regard, but it could be time properties as well
[10:38:56] case in point, https://www.wikidata.org/w/index.php?title=Q35672&action=history : planemad requested a mass revert of his bot, and i had my bot (RollBot, currently with training wheels) try to push the version from 09:58, 17 April 2017 (also by planemad's bot), but it seems https://www.wikidata.org/w/index.php?title=Q35672&oldid=475759488#P625 was missing the required precision, which failed the edit
[14:09:11] don't you just love it when the geographic precision for an indian state that's over a hundred thousand square miles is a tenth of an arcsecond, which is roughly 3 meters? :D https://www.wikidata.org/w/index.php?title=Q1188&diff=476812663&oldid=476490969
[14:13:08] Alphos: Blame enwp
[14:13:35] multichill they're second on my list, after enwikivoyage. trwiki is third.
[14:14:39] If you find the source of the problem, the import can be done again
[14:14:44] Now with the right scale
[14:16:43] Alphos: https://en.wikipedia.org/w/api.php?action=query&prop=coordinates&titles=Madhya_Pradesh&coprop=type|name|dim|country|region|globe
[14:17:49] According to https://www.mediawiki.org/wiki/Extension:GeoData that's 10 km
[14:17:51] currently exploring why lots of reverts by RollBot couldn't be performed. most of them failed because precision wasn't indicated on previous versions of entities, so it's the first thing i check. the second reason is a redirect on enwikivoyage pointing to a page that is linked to a different item (and the same in one case with frwikivoyage)
[14:18:50] third is trwiki, which had a bit of an RfD spree in late march, and some of those sitelinks are still in wikidata; but also some entities about provinces point to a trwiki sitelink covering both the province and the city it's named after
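A minimal sketch of the precision check described above, assuming nothing about RollBot's actual code: fetch the entity JSON of an old revision and flag globe-coordinate values whose precision is missing, which is exactly what makes re-pushing such a revision fail with "Missing required field \"precision\"". The revision id is the one from the Q35672 history linked above; the helper name is made up.

```python
import json
import requests

API = "https://www.wikidata.org/w/api.php"

def coordinates_missing_precision(revid: int) -> list:
    """Return (property, value) pairs whose globe coordinate has no precision."""
    resp = requests.get(API, params={
        "action": "query", "prop": "revisions", "revids": revid,
        "rvprop": "content", "rvslots": "main",
        "format": "json", "formatversion": 2,
    })
    page = resp.json()["query"]["pages"][0]
    entity = json.loads(page["revisions"][0]["slots"]["main"]["content"])
    flagged = []
    for prop, statements in entity.get("claims", {}).items():
        for statement in statements:
            datavalue = statement["mainsnak"].get("datavalue", {})
            if datavalue.get("type") != "globecoordinate":
                continue
            value = datavalue["value"]
            if value.get("precision") is None:  # old revisions often store null here
                flagged.append((prop, value))
    return flagged

# The revision of Q35672 that RollBot tried and failed to restore:
print(coordinates_missing_precision(475759488))
```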
[14:19:09] Alphos: https://www.wikidata.org/w/index.php?title=Q1188&type=revision&diff=63409508&oldid=60980253 was the edit
[14:19:23] 2013 was, I think, before we cleaned up the whole dim thing
[14:19:38] i wish bot owners would be a little smarter when it comes to such things
[14:20:00] had a 0.001 arcsecond precision on a region in saudi arabia about the same size as madhya pradesh
[14:20:20] i'm not entirely sure which grain of sand is the correct location :p
[14:21:06] Alphos: https://www.wikidata.org/w/index.php?title=Q1188&type=revision&diff=476815693&oldid=476815668 is what a normal bot edit would do
[14:21:47] it would be soooo great to have a popup/modal showing up when someone selects <1' precision, saying "are you sure? you're going into millimeter precision there, is that needed for something that's probably bigger than a mile?" :D
[14:23:57] This is all unattended, alphos. You can review the logic at https://phabricator.wikimedia.org/diffusion/PWBC/browse/master/scripts/coordinate_import.py
[14:23:59] and https://phabricator.wikimedia.org/diffusion/PWBC/browse/master/pywikibot/__init__.py;e72afad7b02506b35a97fe8c75af2afb313174ef$337
[14:24:21] i know ^^
[14:24:47] I doubt addshore was using that one, because according to https://phabricator.wikimedia.org/diffusion/PWBC/history/master/scripts/coordinate_import.py I wrote it in 2014
[14:25:42] We could set up a report to get a list of crazy dim differences?
[14:26:30] I think everything is available through either SQL or SPARQL
[14:29:30] i'm definitely thinking about it, and it's my next project after i'm done with this one
[14:29:48] both missing dim, and dim values that are grossly inadequate
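A sketch of what that report could start from, under loud assumptions: this is not an existing report, the 1000 km² and 1e-6° thresholds are invented for illustration, and wdt:P2046 (area) values are taken at face value even though the real data mixes units (the psn: normalized values would avoid that).

```python
import requests

# Items whose claimed coordinate precision (wikibase:geoPrecision, in degrees)
# looks far too fine for their stated area (P2046).
QUERY = """
SELECT ?item ?precision ?area WHERE {
  ?item p:P625/psv:P625 [ wikibase:geoPrecision ?precision ] ;
        wdt:P2046 ?area .
  # a millionth of a degree is roughly 0.1 m on the ground; flag anything
  # claiming that on an object larger than 1000 (presumably km²)
  FILTER(?precision < 0.000001 && ?area > 1000)
}
LIMIT 100
"""

resp = requests.get("https://query.wikidata.org/sparql",
                    params={"query": QUERY, "format": "json"})
for row in resp.json()["results"]["bindings"]:
    print(row["item"]["value"], row["precision"]["value"], row["area"]["value"])
```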
[15:59:50] oh god, i didn't sign up for this :-(
[16:00:03] the whole item needs splitting >_<
[16:00:23] one of the districts of cyprus coincides with one of the turkish districts of cyprus
[16:00:53] and even though there appear to be two items, one of them seems to be both >_<
[16:21:54] Alphos: may the force be with you
[16:22:11] i'm saving it for last :-(
[16:28:47] ohai multichill
[16:29:05] I indeed didn't use that python script, I used something I wrote in php
[17:31:21] addshore: Thanks for helping Hanno with https://phabricator.wikimedia.org/T163101 :-)
[17:58:16] Amir1: https://phabricator.wikimedia.org/T163475
[18:01:09] can i die now? :D
[18:01:48] oh right, i can't, i still have the mess with the district of cyprus and the one with the town/province of turkey to deal with
[18:02:09] i think i'm still entitled to a break to get my smokes :D
[18:02:35] Alphos: Duty calls? https://xkcd.com/386/ ;-)
[18:03:00] how weird is it that i knew which one it was before i clicked on that link? :-(
[18:03:28] does anyone know/remember if it's intentional that dates with year +0000 or -0000 are type-tagged xsd:string instead of xsd:dateTime on WDQS?
[18:03:29] I found T94064, but I'm not sure what its resolution was, and anyway BlazeGraph can now handle those literals without a problem if I create them manually (no more exceptions)
[18:03:29] T94064: Date of +0000-01-01 is allowed but undefined in wikibase but is not allowed in xsd:dateTime as implemented by blazegraph - https://phabricator.wikimedia.org/T94064
[18:04:01] WikidataFacts there is no year 0 in the julian and gregorian calendars, so i'm guessing yes
[18:04:17] Alphos: according to XSD 1.1 it's interpreted as 1 BCE
[18:04:50] and that's also how the UI displays it: http://tinyurl.com/m46y6xo
[18:27:42] WikidataFacts can't seem to find any reference to that in XSD 1.1 :/
[18:29:09] oh right, datatypes, brain currently rebooting, kinda sluggish ^^'
[18:30:15] "a ·year· value of zero represents 1 BCE, −1 represents 2 BCE" alrighty then :)
[18:30:30] Alphos: yeah, it took me a while to find too yesterday; I was just about to send you the link :)
[18:31:12] should I create a task for changing the datatype to xsd:dateTime, now that blazegraph can deal with it? or just comment on T94064?
[18:31:13] T94064: Date of +0000-01-01 is allowed but undefined in wikibase but is not allowed in xsd:dateTime as implemented by blazegraph - https://phabricator.wikimedia.org/T94064
[18:37:06] could it be a recurrence, under another guise, of this bug? https://phabricator.wikimedia.org/T143897
[18:38:15] > The bug is triggered by comparing date with non-date (string) literal that somehow ended up in the DB.
[18:38:43] hm, that could be the cause of those non-date literals
[18:40:09] well, one shouldn't compare date with non-date, it doesn't make sense
[18:40:38] there's STRDT() to convert a "date" string to an actual xsd:date iirc
[18:40:43] well sure, but there are a few xsd:string's stored in wdt:P569 (date of birth) and similar triples
[18:40:56] yup
[18:41:12] STRDT( ?dateTimeString, xsd:dateTime ) to be accurate
[18:41:26] if only there was a way to find those directly ;)
[18:42:06] Alphos: I already have an issue report written up, with two queries to find those 87 literals, and also downloaded results because the queries take so long to run :)
[18:42:17] I'm just not sure if I should submit that bug report :D
[18:42:57] i'd mention it in T.94...
[18:43:12] not sure it deserves a bug report of its own
[18:43:18] alright, then I'll do that
[18:43:19] thanks
[18:43:37] stick around though, i might need your help in the next few minutes ^^
[18:44:23] sure :D
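A sketch of the kind of query being discussed, assuming it is not WikidataFacts's actual one: list date-of-birth values whose literal datatype is xsd:string rather than xsd:dateTime. As noted in the chat, a scan like this over all of wdt:P569 runs very slowly and may need to be split up or run against a dump.

```python
import requests

# Date-of-birth values stored as plain strings; STRDT(?dob, xsd:dateTime)
# is what would reinterpret such a string as a proper dateTime literal.
QUERY = """
SELECT ?person ?dob WHERE {
  ?person wdt:P569 ?dob .
  FILTER(DATATYPE(?dob) = xsd:string)
}
LIMIT 100
"""

resp = requests.get("https://query.wikidata.org/sparql",
                    params={"query": QUERY, "format": "json"})
for row in resp.json()["results"]["bindings"]:
    print(row["person"]["value"], row["dob"]["value"])
```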
[18:53:00] WikidataFacts: how would you get the precision of geographic coords or time values?
[18:54:08] Alphos: something like this: http://tinyurl.com/knhruy9
[18:54:32] precision of a datetime is similar, but under wikibase:timePrecision
[18:54:33] oh right, thanks!
[18:54:42] np :)
[18:54:58] (the editor can autocomplete all those predicates btw, if you roughly know what they're called)
[19:05:22] oh boy http://tinyurl.com/jw9g6cy ping multichill
[19:07:34] Alphos: slight optimization: http://tinyurl.com/l55ta8h
[19:08:17] WikidataFacts yeah, but it's only the first step, so i'm keeping mine: afterwards, i'd like to find items with an absurd amount of precision
[19:08:24] okay, sure
[19:08:33] say, 0.00001 for something that's 100,000 km²
[19:08:42] and yes, we do have plenty of those >_>
[19:09:43] wow, 1452569 items with geo precision 0
[19:14:58] giddy up >_>
[19:15:37] seriously though, the smallest item with precision 0 is recorded at 0.43 square meters
[19:16:05] someone used square feet instead of square miles
[19:16:12] it's kind of a big diff :D
[19:19:12] is there a non-SERVICE way of getting an item's description?
[19:19:21] rdfs:description doesn't seem to be it :D
[19:19:46] Alphos: schema:description
[19:19:52] right
[19:21:35] ok then, i have a few SPARQuickL in store now :)
[19:22:26] Alphos: Eh, one big Magnus import?
[19:22:49] multichill could be, not sure
[19:23:01] could be several big magnus imports :D
[19:23:09] along with a lot of smaller ones :D
[19:23:20] https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P1459 ?
[19:23:41] uh?
[19:24:04] He did a couple of heritage imports a while ago
[19:24:19] oh
[19:24:26] yeah, not all of them are heritage
[19:24:36] https://www.wikidata.org/wiki/Q2581974 case in point ;p
[19:24:39] If it's something like that you can guess a precision that would make sense
[19:25:51] i'm seriously thinking of square-rooting the area, if available
[19:25:53] 1.0E-7
[19:25:58] huh?
[19:26:39] the precision of an item should be somewhat related to its area
[19:26:56] it's absurd to have 1e-7 precision for the antarctic, for instance
[19:27:18] technically, a precision of 10° is more than enough for a continent
[19:28:06] it's obviously not proportional, and there are edge cases such as chile (the country), which has one dimension much smaller than the other
[19:29:14] but the precision is actually an angle, and that angle should pretty much be the square root of the solid angle that the area covers from the center of the earth. that angle should thus be proportional to the sqrt of the area of the object
[19:30:03] simply put, an arcminute of precision is roughly 2 km on the surface of the earth; that should be perfect for areas between 1 km² and 10 km²
[19:31:10] and yes, even as low as 1 km²: the way i see it, lower (better) precisions are for really minute objects
[19:43:54] let's first try and remove the obvious mistakes
[19:46:13] You can try the approach I used at https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Image_suggestions/Higher_resolution : from biggest difference to smallest
[19:50:57] i'll check areas below 10 000 m² for accuracy of that value, then let a script do the actual work of assigning the precision
[19:52:05] simply put, sqrt(area) / 40000 km * 360°; it's not perfect but it'll definitely do the job rather well
[19:52:24] (at least for the items that have an area)
[19:52:44] (which is not a lot)
[20:26:21] what do you think should be done for buildings? should the usable area be listed as "area"?
[20:26:37] usable area != ground area :/
[20:50:04] up to (and not including) 10 000 m², i've checked all the values; it all seems in order now
[21:04:07] helloooooo brain saturation >_<
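The sqrt(area) / 40000 km * 360° rule of thumb from the chat, spelled out as a small sketch (the function name is made up, and per Alphos's own caveat it is approximate, not a definitive assignment rule):

```python
import math

EARTH_CIRCUMFERENCE_KM = 40_000  # rough circumference of the Earth

def suggested_precision_degrees(area_km2: float) -> float:
    """Coordinate precision (an angle, in degrees) suggested by an object's area."""
    linear_extent_km = math.sqrt(area_km2)  # rough side length of the object
    return linear_extent_km / EARTH_CIRCUMFERENCE_KM * 360

# Sanity checks against figures mentioned in the chat:
print(suggested_precision_degrees(1))        # ~0.009 degrees, about half an arcminute
print(suggested_precision_degrees(308_000))  # Madhya Pradesh-sized: ~5 degrees
```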