[01:06:38] !admin Anyone? http://www.adnradio.cl/noticias/sociedad/salfate-desmintio-su-propia-muerte/20160328/nota/3094796.aspx protection please: https://www.wikidata.org/wiki/Q5947173 another TUS Radio death troll (started in https://es.wikipedia.org/wiki/Juan_Andr%C3%A9s_Salfate )
[01:15:54] I guess that command didn't work
[01:16:12] Anyway, let's hope they don't get to Wikidata
[01:16:19] Last death troll they did
[07:34:44] ...
[08:48:18] @jzerebecki https://github.com/Modernizr/Modernizr/blob/f839e2579da2c6331eaad922ae5cd691aac7ab62/feature-detects/css/fontface.js
[09:35:06] * aude waves
[09:37:35] :)
[17:05:38] * yurik pokes Lydia_WMDE & DanielK_WMDE__ re =raw
[17:18:40] Lydia_WMDE, DanielK_WMDE__ how much time would it take to implement the raw format for geo coordinates?
[17:19:52] yurik: the more important question is: is it a good idea to implement it? how exactly should it work? what does it mean for other formats?
[17:20:08] before we have answers to these questions, it's hard to tell how long it would take.
[17:21:31] it's not a hard thing to implement. but it has implications for how people use wikidata. it adds a commitment to our public interface, and it would be very hard to change anything about the raw format later
[17:21:39] so we should be very careful about it.
[17:25:04] DanielK_WMDE__, well, we need to figure out how other extensions should consume geo coordinates. When geodata outputs it in a localized fashion, we will have to parse it, which is obviously a pain. If we rely on Lua to serve as a converter, we will force all wikis to implement it, which is obviously not good
[17:25:32] and yes, i understand that it may mean different things for other datatypes
[17:25:52] DanielK_WMDE__, are you coming to israel?
[17:25:54] yurik: you could just use the parser wikibase uses. that way, all this would be nice and consistent.
[17:26:10] i think it's already in a separate component
[17:26:26] yes, i'm flying out tomorrow morning
[17:26:37] awesome, i'm in haifa already
[17:27:17] re-parsing would also be an option, albeit maybe not as clean - there are tons of different ways to enter it
[17:27:32] could you point to the right code?
[17:27:35] I was not suggesting to use Lua to convert anything. The question is just whether the parser function should support the raw flag, or whether we leave that for the formatter function that is *available* in lua (not implemented in lua).
[17:27:53] not sure i follow
[17:28:13] yurik: we support tons of ways to enter it :) And the only clean way would be fully structured parameter passing in mediawiki. i'm all for it.
[17:28:28] my idea was to do this:
[17:28:51] that will give you localized wikitext. possibly including markup.
[17:29:34] yeah, that's what i was trying to avoid somehow. allowing this code ^ makes the tag very portable cross-wiki, without any helper lua
[17:29:54] yurik: there is a Lua equivalent of #property, which can be used by Lua code. It's not implemented in Lua. It's implemented in PHP, by the Wikibase extension. It does the same formatting that #property does.
[17:30:05] But in Lua, we can support a lot more options more cleanly.
[17:30:31] we are trying to avoid any smart stuff for #property.
[17:30:42] DanielK_WMDE__, if we require a wiki to use lua, that code will have to be copied to all wikis
[17:30:48] or i'm not understanding something
[17:30:59] in my example above, no lua is needed at all
[17:31:07] yurik: oh, by the way... no, it can't ever work with #property. #property can return multiple values.
[17:31:15] I see no way to work around that
[17:31:43] #property returns a rendering of *all* statements for a property. potentially even including qualifiers and sources.
[17:31:45] hmm... time to allow {{#lua:lambda expression}} :)
[17:32:37] you could make a lua module that returns the main value of the first preferred statement in raw format.
[17:32:41] and then call that
[17:32:58] i see the problem... can i get the coordinates in "data-ready" mode, as opposed to a localized pretty-printed format?
[17:32:59] you can already do that, but it's not very easy. we could make it easier
[17:33:15] in lua
[17:33:23] in Lua, yes - you have access to the JSON structure just like you see it in the API output
[17:33:40] it's the same JSON data converted into Lua tables
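A minimal sketch of the kind of local module described above: one that returns the main value of the first preferred statement in raw form by reading the same JSON-shaped Lua tables the API exposes. The property ID P625 (coordinate location), the module and function names, and the plain "latitude,longitude" output are assumptions for illustration, not something shipped with Wikibase.

local p = {}

-- Return the main value of the first preferred coordinate statement
-- (falling back to the first statement of any rank) as "latitude,longitude".
function p.rawCoordinate( frame )
	local entity = mw.wikibase.getEntityObject()
	if not entity or not entity.claims or not entity.claims['P625'] then
		return ''
	end
	local statements = entity.claims['P625']
	local best = statements[1]
	for _, statement in ipairs( statements ) do
		if statement.rank == 'preferred' then
			best = statement
			break
		end
	end
	local snak = best.mainsnak
	if snak.snaktype ~= 'value' then
		return ''
	end
	-- snak.datavalue.value mirrors the JSON structure from the API output
	local value = snak.datavalue.value
	return value.latitude .. ',' .. value.longitude
end

return p

Called from wikitext as {{#invoke:RawCoordinate|rawCoordinate}} on a page connected to an item (the module name is made up); the point is only that the Lua tables follow the API's JSON layout, so no parsing of localized output is needed.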
[17:34:31] perhaps that's the answer... Still it would suck to have identical lua code on every wiki
[17:34:49] That Lua code could ship with the extension
[17:34:53] that's easy enough
[17:35:18] hm, i think you would still need a local page in the Module namespace... maybe that restriction can be fixed, though
[17:35:31] or we wait for shadow namespaces to solve this issue :)
[17:36:31] yurik: but anyway - how do you want to handle multiple values? Just using the first seems bad.
[17:37:44] how to handle that kind of thing depends on the use case. which is why we opted to not even try to support that kind of thing via parser functions.
[17:37:44] don't really know. Hard problem, as it may be different depending on qualifiers, etc. Not ready to solve it. Hence, no pre-shipped lua module. Shadow namespaces.... right.... might be a while :)
[17:38:03] you need script code that knows about the specific use case. there is no generic solution
[17:38:20] exactly
[17:38:31] so yeah, i guess disregard the raw=
[17:39:11] raw could use "|" to concatenate multiple values. might still be useful. but it's not really a solution to your problem
[17:39:15] at least not a good one
[17:39:31] exactly, so i would rather not have it at all
[17:39:54] so, think about what you would rather have :)
[17:39:58] eventually i think there will be a complex module that will allow qualifiers as filtering parameters, etc
[17:40:06] any ETA for geoshapes?
[17:40:21] no. it's on the "nice to have" list afaik.
[17:40:31] i think it's at the top of it ;)
[17:40:38] if you have someone who wants to implement it, we'll be happy to do some hand holding.
[17:40:51] aude does :D
[17:41:29] but we now have an identical problem :)
[17:41:32] yea, after implementing proper search integration, optimizing change propagation, supporting derived values in api output, supporting unit conversion, cleaning up variant handling...
[17:41:56] Oh, and we need to get ready for federated data access.
[17:42:31] meh, should all be done during this hackathon
[17:42:55] go ahead :P
[17:43:02] i wonder if we will have to do the same thing for the geo shape as for the coordinates
[17:43:26] most likely, geo-shapes will live on separate wiki pages, and will only be referenced
[17:43:26] after all, if we have a {geoshape}, we will have to pick one as well
[17:43:34] also, i don't think we will have localization for shapes
[17:43:46] in that case, add other data types as well :)
[17:43:58] so... you'll have raw json to work with. or whatever other format we pick.
[17:44:00] e.g. TSV :)
[17:44:09] json is good
[17:44:25] GEOJSON/TOPOJSON is what we use for now
[17:44:37] we might, actually. yes. Though TSV in wiki pages will not work for really big data sets.
[17:45:11] DanielK_WMDE__, awesome, we have to discuss this - milimetric is now leading that effort
[17:45:18] also, we'd kind of like to be able to use our value representation in the tables. so... it would probably end up not being TSV, but a JSON based table format.
[17:45:35] that should definitely be coordinated, yes
[17:45:52] anyway, need to get my daily refactoring done.
[17:46:11] yeah, that's what i also suggested -- { "headers": [{},{},...], "values": [[1,2,3],[...],[...]] }
[17:46:11] [1] https://www.wikidata.org/wiki/1%2C2%2C3
[17:46:18] DanielK_WMDE__, ^^
[17:46:22] ok, chat later
[17:47:02] not sure yet. plain old TSV also has its merits. but you need metadata (at least the column type) to make it useful
[17:47:09] yea, see you tomorrow :D
[17:47:47] DanielK_WMDE__, tomorrow?
[17:47:49] day after
[17:48:02] it starts on thu
[17:48:10] i will be in haifa until then
[17:48:41] ah, right. I'm arriving in Jerusalem tomorrow
[17:49:11] enjoy :) awesome city
[17:50:10] Lydia_WMDE, i think https://en.wikipedia.org/wiki/Wikipedia:Wikidata#Parser_function needs to be updated to say that it may return multiple values. DanielK_WMDE__ - not sure what the proper format of the multi-value may be
[18:02:12] yurik: https://en.wikipedia.org/w/index.php?title=Wikipedia%3AWikidata&type=revision&diff=712538147&oldid=708428171
[18:02:21] hrm, where does the extra space come from?...
[18:16:48] jzerebecki: https://integration.wikimedia.org/ci/job/mediawiki-core-qunit/62063/console
[18:16:49] ??
[20:25:13] I had a question about the usage of the query service https://query.wikidata.org/
[20:25:21] I want to search for an entity with a label matching a regex
[20:25:26] but I couldn't find a way to do that
[20:25:32] any help will be much appreciated
[20:26:14] https://www.w3.org/2001/sw/DataAccess/rq23/#funcex-regex could be helpful?
[20:27:05] thanks let me try that
[20:34:15] I tried to use the regex with something like this https://gist.github.com/anonymous/2810eb5747e51a9ae746183a43f20771. However, it is not working. Sorry but this is the first time I am using sparql
[20:35:11] I'm also not a real sparql expert...
[20:54:07] is there a way (maybe a special property) that can be set to specify how an item is supposed to be formatted when it is being displayed using the #property parser function in the wikitext?
[20:56:14] sounds like a good question
[21:37:41] to answer my question... the only way I could find is to extend the lua module https://www.wikidata.org/wiki/Module:Wikidata
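As an illustration of that last answer (not something spelled out in the chat): a Module:Wikidata-style wrapper can take over the rendering itself instead of relying on the default #property output. The function name, the way the property ID is passed, and the italic post-processing below are assumptions; entity:formatPropertyValues() is the Scribunto call that should give roughly the rendering the parser function produces.

local p = {}

-- Fetch a property of the connected item and apply the module's own
-- formatting instead of the default #property rendering.
function p.formattedProperty( frame )
	local propertyId = frame.args[1]  -- e.g. 'P36'
	local entity = mw.wikibase.getEntityObject()
	if not entity then
		return ''
	end
	-- formatPropertyValues gives the localized rendering; the module can then
	-- post-process it (here: italics) or format the raw claims itself instead
	local formatted = entity:formatPropertyValues( propertyId )
	if not formatted or not formatted.value or formatted.value == '' then
		return ''
	end
	return "''" .. formatted.value .. "''"
end

return p

Invoked from wikitext as {{#invoke:Wikidata|formattedProperty|P36}} (argument and output shown here purely as an example); anything beyond that, such as per-property display rules, would live in the module's own logic.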