[09:11:33] Hello, and thank you for the semantic database, the UI to alter it, and the automation that provides services to the wikis
[09:12:12] This year I've been active on Wikivoyage and I must say it is screaming for automation solutions based on structured semantic data
[09:12:50] if someone feels like chatting with me I'll be here
[09:13:05] I'll give you an example
[09:13:07] *waves* :) (although I'm just about to have a meeting)
[09:14:42] Instead of the article on "X" saying in its == Get in == section that there is a connection to "Y", and "Y" saying there is a connection to "X", the simplest improved implementation would just have a "transport link" object and use it to generate the contents of the "Get in" and "Go next" sections
[09:15:13] If a link changes you only need to update it in one place and all the relevant articles would be updated automatically
[09:15:23] another example
[09:15:33] We have free coordinate data, right..
[09:17:06] Instead of having to manually say in the "Go next" section that "X is to the east, Y is to the west and Z is to the south-west", one could just have a machine look at the coordinates of the neighbouring entities and fill these in automatically for each article
[09:19:39] With only the data of the "transport link" objects, knowledge of the entities and their locations, and GIS knowledge of which entities border each other, we could massively automate template content generation
[09:20:02] and this is simple-as-123 stuff compared to what is genuinely complicated from an information technology point of view
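A minimal sketch of the direction-from-coordinates idea above, assuming nothing about existing Wikivoyage or Wikidata code; the function name and the example coordinates are purely illustrative.

```python
import math

# Eight compass sectors for "Go next"-style wording.
DIRECTIONS = ["north", "north-east", "east", "south-east",
              "south", "south-west", "west", "north-west"]

def compass_direction(lat1, lon1, lat2, lon2):
    """Rough compass direction from point 1 to point 2 (degrees, WGS84)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Initial great-circle bearing: 0 deg = north, increasing clockwise.
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(x, y)) + 360) % 360
    return DIRECTIONS[round(bearing / 45) % 8]

# Example: Tampere as seen from Helsinki.
print(compass_direction(60.17, 24.94, 61.50, 23.76))  # -> "north-west"
```

A bot with access to item coordinates could string such directions together into the "X is to the east, Y is to the west" sentences described above.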
[09:32:00] A complex system would be able to do similar stuff to this: http://rome2rio.com
[09:32:39] It is a multimodal global traveller router that additionally gives cost information
[09:32:56] apparently their current revenue is from Google Ads
[09:33:27] but I'm sure they are working on revenue sharing schemes with the booking sites and the actual travel service providers
[09:34:11] Getting this type of behaviour requires a complex algorithm
[09:34:37] first you store each transport connection as links between stops, from one end to the other
[09:35:35] then you put all of those together into a directed graph with some special properties and use a network traversal algorithm (which of course finds all the optima while doing its job)
[09:35:54] cache those in a massive data structure in RAM and you're almost done
[09:36:28] jubo2: doesn't OSM store that information already?
[09:36:42] I would also like to see how well Wikivoyage could do with Wikidata utilisation and specialist code for specific Wikivoyage features
[09:36:56] jzerebecki: OpenStreetMap
[09:37:25] I'm not very up to date on that.. All I know is that it's got to be an ethical alternative, because it competes with Google Maps
[09:38:01] jzerebecki: Whether you're tackling the car navigator task or the traveller router task, it's not so different under the hood
[09:39:38] The car navigator (afaik) works so that each point where a choice between "go to road A" and "go to road B" can be made is included in the network as a node, and all these nodes are then connected to their neighbouring nodes
[09:40:24] so instead of having bus stops, train stations, tram stops and subway stations as nodes, you just have a node for each junction in the network
[09:41:45] each road junction is a node and each node has links to its neighbouring nodes
[09:42:13] surely there are various search-accelerating optimization tricks one can pull
[09:43:08] but this way it is once again a network traversal problem with costs on the vertices
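To make the stop/junction graph description above concrete, here is a minimal sketch using plain Dijkstra over a directed graph of invented stops and edge costs; this only illustrates the general traversal idea, not what rome2rio or any real car navigator actually runs.

```python
import heapq
from collections import defaultdict

graph = defaultdict(list)  # node -> list of (neighbour, cost in minutes)

def add_link(a, b, cost):
    """Directed transport link / road segment from a to b."""
    graph[a].append((b, cost))

# Made-up network: stops or junctions as nodes, links as weighted edges.
add_link("X central station", "Z harbour", 30)
add_link("Z harbour", "Y bus terminal", 20)
add_link("X central station", "Y bus terminal", 45)

def cheapest_route(start, goal):
    """Dijkstra: return (total_cost, stops) for the cheapest path, or None."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, edge_cost in graph[node]:
            if neighbour not in seen:
                heapq.heappush(queue, (cost + edge_cost, neighbour, path + [neighbour]))
    return None

print(cheapest_route("X central station", "Y bus terminal"))
# -> (45, ['X central station', 'Y bus terminal'])
```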
[09:45:01] the ULTIMATE IoT thing, besides the remote-controllable sauna stove, is obviously when the features of a car navigator and a traveller router are put into a single device/app/collaboration
[09:45:08] I can give this away
[09:45:45] I've never gotten any money for any of these "a child could easily see it"-style innovations anyway, so how could I change that
[09:48:43] Then you can ask your ubiquitous traveller router: "This car is going to X, but there are passengers A and B on board who need to get to Y and Z respectively. Give optimal routings plus time and cost estimates for all travellers, for car and for car + public transport, please."
[09:48:52] It is so fucking obvious this is coming
[09:51:22] Or are the devs gonna be like "No, we don't care about innovating and providing new and improved services to the customers of our capitalist masters, and we will not work to implement obviously technically feasible new stuff like this."
[09:51:38] I'm for hire btw
[09:52:04] also for other contractual relationships (VAT-inclusive billing is ok)
[09:52:30] and of course I will listen to revenue and ownership sharing scheme propositions, if any
[11:13:34] Hello
[11:13:39] Hi, is there a tool to count the number of labels in any given language?
[11:14:08] labels or properties*
[11:50:15] fdf
[12:29:00] fdf
[13:43:40] Is Lydia not in the office?
[14:40:22] * aude waves
[14:52:45] hi
[14:55:14] I get an error when adding Q16698801 as a child to Q64976
[14:55:32] melderick: as a child?
[14:55:49] P40
[14:55:50] P40 silly nagios - https://phabricator.wikimedia.org/P40
[14:55:58] oh right
[14:56:24] the error is "The link cawiki:Nicolau d'Oldenburg is already used by item Q11938734. You may remove it from Q11938734 if it does not belong there or merge the items if they are about the exact same topic."
[14:56:35] seems weird to me ^^
[14:56:35] was thinking too much as a database
[14:56:41] when adding statements?
[14:56:45] yeah
[14:58:48] sorry, my computer likes to crash at moments like this :(
[14:59:00] Q16698801 is a redirect to Q64976
[14:59:08] the weird part is that item Q11938734 is redirected to item Q64976, so I fail to understand why it could be a problem
[14:59:11] exactly
[14:59:41] do you have the same problem ?
[15:00:00] yeah wait
[15:00:03] i said something wrong
[15:00:08] sure
[15:00:24] why is it complaining about Q11938734...
[15:01:11] wait
[15:01:24] Q11938734 doesn't seem to be empty
[15:01:34] somehow
[15:01:43] the software still thinks there is a link to cawiki on it
[15:02:14] guess so
[15:02:31] any idea on how to correct this ?
[15:02:44] let me redo the merges...
[15:03:51] that worked
[15:04:06] great !
[15:04:10] thank you :)
[15:04:13] weird hiccup
[15:04:26] yeah
[16:36:50] Lydia_WMDE: You were a bit too hasty with the identifiers to convert
[16:37:46] multichill: we have been dragging this out forever. i am sure not all of them are perfect but that is why there is a review period now :)
[16:38:02] i saw you have issues with the airport codes. i'll move those
[16:38:09] any others?
[16:38:54] Everything below 90% has issues
[16:39:08] And should be looked at on an individual basis
[16:39:17] Mass converting the lot is not a good plan
[16:39:41] ok, but then we need to do it and get our collective ass up
[16:40:36] It's like with "Historic Scotland ID (P709)" on https://www.wikidata.org/wiki/User:Addshore/Identifiers/0 . Checked, reason found -> convert
[16:40:36] P709 Deployment Sequence Diagram.uml - https://phabricator.wikimedia.org/P709
[16:41:07] The abbreviations shouldn't slip through
[21:38:00] I'm trying to get sitelinks from the WDQS. I don't think that sitelinks to enwiki have schema:isPartOf, unlike e.g. sitelinks to Commons. Is this what I should expect? What would be the best way to get the enwiki sitelink for an item?
[21:43:31] tarrow: are you sure schema:isPartOf . doesn't work?
[21:45:57] I don't think so. http://tinyurl.com/gmphytl vs http://tinyurl.com/zljw3t9
[21:47:26] unfortunately I think the UI might be a little broken, so you might have to go to link, REST Endpoint to see the result
[21:51:55] :/
[21:52:30] is that for the UI or for the RDF? :)
[22:28:35] aude, that's an interesting wikidata usage :) https://en.wikivoyage.org/wiki/User:Matroc
[22:29:06] scroll down
[22:31:17] yurik: :)
[22:31:42] yurik: https://www.mediawiki.org/wiki/EMWCon_Spring_2016 (see the infobox)
[22:32:41] aude, nice! you might want to change the icon to something "enterprisey"
[22:33:09] where is the list of icons?
[22:34:19] the ones on https://en.wikivoyage.org/wiki/User:Matroc ?
[22:35:41] aude, https://www.mapbox.com/maki-icons/
[22:36:04] {{done}}
[22:36:04] How cool, aude!
[22:36:07] :)
[22:37:30] tarrow: i could be wrong, but maybe the only way to match a sitelink is with a hacky SUBSTR
[22:37:58] like FILTER (SUBSTR(str(?sitelink), 1, 37) = "https://de.wikipedia.org/wiki/Orangey")
[22:38:11] http://tinyurl.com/zg67zoa
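For the sitelink question at the end of the log, a minimal sketch of the prefix-matching workaround aude describes at 22:37, sent to the public Wikidata Query Service from Python. Q42 is only an example item, STRSTARTS is used here as the built-in equivalent of the SUBSTR trick, and whether the schema:isPartOf pattern also works for enwiki is left open, exactly as in the discussion.

```python
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# Sitelink pages point at the item via schema:about; narrow them down to
# English Wikipedia by URL prefix instead of relying on schema:isPartOf.
QUERY = """
SELECT ?sitelink WHERE {
  ?sitelink schema:about wd:Q42 .
  FILTER (STRSTARTS(STR(?sitelink), "https://en.wikipedia.org/wiki/"))
}
"""

response = requests.get(
    WDQS_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "sitelink-example/0.1 (illustrative script)"},
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["sitelink"]["value"])  # e.g. https://en.wikipedia.org/wiki/Douglas_Adams
```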