[10:20:32] I made a new wikidata map and difference map covering a period of 6 months @ https://addshore.com/2018/10/wikidata-map-october-2018/
[10:20:52] please let me know if you know of any of the users or projects that caused the increase in items with coordinates in the areas that I highlighted!
[11:04:13] addshore: sweet.
[14:00:08] Auregann_WMDE: are you around?
[14:02:53] I wish the "contact the development team" page was actually a way to contact the development team
[14:07:10] I feel like I'm just talking to myself in here :(
[14:08:01] nikki: hm?
[14:08:19] oh, so other people *can* see what I say
[14:08:27] * abian looks at nikki and blinks
[14:09:24] Lea was finding a ticket for me several days ago and she hasn't said anything since
[14:09:55] damn, there isn't much chatting here anymore :(
[14:10:00] yeah :(
[14:10:41] I figured she didn't respond to my ping on Wednesday because it was a public holiday
[14:10:49] could be it
[14:10:52] but then she didn't respond yesterday either
[14:11:15] and she doesn't seem to be around today
[14:11:21] Our town was crowded with Germans...
[14:11:35] going to the supermarket?
[14:11:45] Or shopping in general
[14:11:59] I dream of living on the border for that reason :P
[14:12:50] (also so that I can buy the sort of bread I'm used to)
[14:13:35] Can't find the ticket either and have to do groceries now :(
[14:14:57] Lydia isn't at the office these days, maybe Léa is traveling too
[14:16:13] addshore: Everything's okay at the office, right?
[14:16:18] how inconvenient
[15:03:28] Office live stream: https://www.youtube.com/watch?v=2z460nEsNww
[15:10:27] nikki: both Lydia and Lea are indeed off currently. Lea is back on Tuesday
[15:13:23] ok
[15:24:40] abian: the office is fine as far as I know, but I am also not there!
[15:25:01] Cool :)
[15:25:06] Just the goat is there :P
[15:25:18] yup, just a goat dancing around
[15:25:22] with leszek_wmde filming it
[15:56:51] RECOVERY - Check systemd state on wdqs1010 is OK: OK - running: The system is fully operational
[16:09:32] PROBLEM - High lag on wdqs1010 is CRITICAL: 1.719e+04 ge 3600 https://grafana.wikimedia.org/dashboard/db/wikidata-query-service?orgId=1&panelId=8&fullscreen
[16:10:42] RECOVERY - High lag on wdqs1010 is OK: (C)3600 ge (W)1200 ge 37 https://grafana.wikimedia.org/dashboard/db/wikidata-query-service?orgId=1&panelId=8&fullscreen
[16:16:02] PROBLEM - High lag on wdqs1010 is CRITICAL: 1.382e+04 ge 3600 https://grafana.wikimedia.org/dashboard/db/wikidata-query-service?orgId=1&panelId=8&fullscreen
[16:22:32] RECOVERY - High lag on wdqs1010 is OK: (C)3600 ge (W)1200 ge 30 https://grafana.wikimedia.org/dashboard/db/wikidata-query-service?orgId=1&panelId=8&fullscreen
[19:30:52] PROBLEM - WDQS HTTP Port on wdqs1009 is CRITICAL: HTTP CRITICAL: HTTP/1.1 503 Service Temporarily Unavailable - 387 bytes in 0.002 second response time
[19:45:02] RECOVERY - WDQS HTTP Port on wdqs1009 is OK: HTTP OK: HTTP/1.1 200 OK - 434 bytes in 0.062 second response time
[20:02:05] addshore: Did you deop Sigyn?
[20:03:02] If so, feel free to deop again, but I don't see the deop in my immediate backscroll.
[20:03:30] So I also don't see any discussion about why she would be deopped. Sorry if it was intentional!
[20:04:14] Freso: nobody deopped it
[20:04:19] it quit and rejoined
[20:04:26] Guess it was just... yeah.
[20:04:43] 07:39:12 -!- Sigyn [sigyn@freenode/utility-bot/sigyn] has quit [Quit: People always have such a hard time believing that robots could do bad things.]
[20:04:43] It was restarted a few days ago, could be that.
[20:04:46] 07:39:51 -!- Sigyn [sigyn@freenode/utility-bot/sigyn] has joined #wikidata
[20:05:01] on 3rd October
[20:05:03] (It's now running on Python 3! :o)
[20:05:50] That might've been then. I don't remember the exact time niko restarted it.
[20:22:23] Hello structured data wizards!
[20:22:43] Is there wikidatization of Wiktionary or Wikibooks going on?
[20:25:54] See, I have these lists of loan words in Albanian on my own semi-private study wiki: https://wiki.study/regarding/Learning_Albanian#Lists_of_loan_words and I would like to contribute the lists to the Wikibook on Albanian, _but_ at the same time I'd also like to store the loan words in a database, because there are a lot of words that fit under the headings "English", "French" and "Italian" (mostly) and I would really like a database .. uhh .. ontology (?)
[20:25:55] that I could enter the data into and generate the lists from, instead of needing to settle for listing each word under just one heading
[20:27:27] If someone "gets" me and could offer some help on how to .. uhh .. create .. an ontology (?) so I could enter the entries into Wikibase and have the lists automatically updated, I would be much obliged. Thanks!
[20:29:08] If such an ontology exists in Wiktionary and/or Wikibooks, please point me in the right direction. Cheers!
[20:30:53] Let me make my request more general: is there an introduction to designing ontologies somewhere? I need to get to grips with the task for other purposes too.
[20:36:38] wikidata does have some basic support for lexicographical stuff now
[20:37:08] but it's still missing some key features, like senses (i.e. saying what a word means), but apparently we're getting that soon
[20:41:15] also, because it's still quite new and incomplete, we don't have support for queries yet either
[20:41:47] nikki: ç'kemi ("hi" in Albanian)
[20:43:03] but I imagine that eventually it'll be possible to do a query for words which come from a particular language, and if it's possible to do that, it should be possible to use tools like listeria to include dynamic lists in other pages
[20:43:20] (although listeria will probably need updating to support lexemes first, right now it expects an item)
[20:43:23] jubo2: ^
[20:43:27] hello
[20:43:57] hi
[20:44:00] where do you know jubo2 from?
[20:44:47] I was responding to something they wrote in here
[20:44:57] ylel9: stop stalking me here and go back to ##learnanylanguage. I'm trying to get info here.
[20:45:08] ok
[20:49:37] for loan words, the "derived from" property would be the one you'd need, like https://www.wikidata.org/wiki/Lexeme:L1898 says it comes from "lemon"
[20:53:02] Freso: nope
[20:53:56] I see you already figured that out though!
[20:55:10] addshore: Yep. Thanks. :)
[20:57:19] nikki: Thank you for the information, though some of it is lost on my newbness. I dunno what lexicography is, nor what a lexeme is
[20:58:16] a lexeme is a word, more or less
[20:58:20] nikki: Listeria?
[20:58:42] In Finland "listeria" is some disease or virus or bacterium or something
[20:59:08] but we treat different forms (e.g. plural forms of nouns, past tense forms of verbs) as part of a single lexeme
[21:00:16] and they can have spaces in them too (e.g. "ice cream" would be a lexeme)
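As an aside to nikki's pointer above: lexeme entities such as L1898 are served as plain JSON via Special:EntityData, so the "derived from" links can already be read programmatically even before query support lands. A minimal Python sketch; the property ID P5191 ("derived from lexeme") is an assumption here, so verify it against the statements shown on the lexeme page:

```python
import json
import urllib.request

# Special:EntityData serves any entity, lexemes included, as JSON.
URL = "https://www.wikidata.org/wiki/Special:EntityData/L1898.json"
with urllib.request.urlopen(URL) as response:
    entity = json.load(response)["entities"]["L1898"]

# A lexeme has one lemma per spelling variant; take the first for display.
lemma = next(iter(entity["lemmas"].values()))["value"]

# P5191 ("derived from lexeme") is assumed; check the property ID on the page.
for claim in entity.get("claims", {}).get("P5191", []):
    value = claim["mainsnak"].get("datavalue", {}).get("value", {})
    print(f"{lemma} is derived from lexeme {value.get('id')}")
```

Once the query service gains lexeme support, the same relation should become queryable in bulk (and usable by Listeria-style tools), as nikki describes, rather than one entity at a time.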
"ice cream" would be a lexeme) [21:03:00] listeria does mean that, but it's also the name of a tool made by magnus (see https://meta.wikimedia.org/wiki/User:ListeriaBot) [21:03:24] nikki: thanks for the link to Listeria [21:04:10] bookmarked [21:06:30] nikki: Also bookmarked https://www.wikidata.org/wiki/Lexeme:L1898. But that one is difficult to yield without doing a lot of research for each word in order not to name the wrong source language [21:08:12] nikki: With the loan words in Albanian very many of them would fit under French, Italian and English so I'm kinda looking for a way to express in a semantic database that "This word can be memorized from languages X, Y and Z for its similarity" [21:08:54] I see [21:09:03] I don't think we have anything like that right now [21:10:10] Often the similarity with the English word (and the anglocentrism or the fact that so many people speak it) is obvious but the pronounciation is clearly more from French I'm now needing to select one of the lists and put it there based on more-or-less case-by-case evaluation [21:12:09] I'm such a beginner in designing semantic databases that I don't want to mess this one up and come up with something I cooked up at home that doesn't integrate into the much finer constructions that the knowledgeable people in the WMF efforts come up with sooner-or-later [21:14:01] I need to take a good, hard look at https://www.wikidata.org/wiki/Wikidata:Lexicographical_data and get my head around the issue I'm having. [21:14:29] But right now it is past midnight and I've been at the keyboard all day long so my brain is not working in the best possible way [21:14:56] I can understand that :) [21:15:58] So not trying to come up with a solution right now.. just digesting this new info. Thank you nikki very much for your kind attention to a semantical newb's questions [21:17:24] you're welcome :) [21:17:36] feel free to come back and ask more questions later too [21:18:24] I did some wikiditing of Wiktionary in 2003-2004 but then gave up on it because the Mediawiki devels seemed to pay no or little attention to Wiktionary's needs, but seems that soon with Wikidata Wiktionary will have the system it needs to stop putting people off by the manual and redundant replication requirement that it suffered a lot from back in the day [21:19:31] jubo2: good night [21:42:51] hi, how does wikidata site manages multilingual WIKI content (not the items/properties). E.g. community portals could be in multiple languages. Does wikidata use any kind of items to store sitelinks that all point to wikidata? [21:56:13] yurik: generally policies and things like that are in english and then enabled for translation and those pages have a box at the top with all the languages linked (the list is generated automatically afaik) [21:57:03] other pages, like project chat, have a manually maintained template with links to other pages at the top [21:57:58] sitelinks doesn't really work for wikidata itself, because you can only have one sitelink to a given wiki, so we would need as many items as we have sitelinks [21:58:22] nikki, but how does it get generated? e.g. a wiki page in frwiki is translated to enwiki because both are listed as sitelinks. But it is not possible to list multiple pages in the same wiki in a single item [21:58:57] right, but i was thinking maybe there is a multilingual string support of sorts? 
[21:59:18] or perhaps there is a string property with a facet
[21:59:29] string = page title, facet = language
[21:59:32] I don't really understand your question :(
[22:00:26] nikki, if you go to https://wiki.openstreetmap.org/wiki/Key:bridge:movable you see a list of translations at the top. That template takes a very long time for MediaWiki to generate. I was hoping to store the list of translations in the corresponding wikidata item
[22:00:59] and i was wondering if wikidata, which in theory should have a similar problem, has solved it in a creative way
[22:01:51] I think when pages are marked for translation, that means they use https://www.mediawiki.org/wiki/Extension:Translate and the translations are stored as subpages with the language code as the page name
[22:02:18] so https://www.wikidata.org/wiki/Help:Contents/de is the german translation of https://www.wikidata.org/wiki/Help:Contents
[22:02:57] gotcha, and how does it iterate through the available translations? is that some magical mw template, or lua, or does the extension itself create a list of available langs?
[22:03:04] no idea, sorry
[22:03:09] np, thanks!
[22:03:20] I think mediawiki has a way to fetch a list of subpages though, so maybe it's based on that
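Following up on that last guess: because Extension:Translate stores each translation as a subpage named after its language code, the list of available translations for a page can be fetched with an ordinary prefix query against the MediaWiki API. A minimal sketch using Help:Contents on wikidata.org as the example (the namespace number and the prefix are the only page-specific parts):

```python
import requests

# List all subpages of Help:Contents by title prefix; each translation
# subpage ends in a language code, e.g. Help:Contents/de.
resp = requests.get("https://www.wikidata.org/w/api.php", params={
    "action": "query",
    "list": "allpages",
    "apnamespace": 12,        # the Help: namespace
    "apprefix": "Contents/",  # everything under Help:Contents/
    "aplimit": "max",
    "format": "json",
}, headers={"User-Agent": "translation-list-sketch/0.1 (example)"})
resp.raise_for_status()

for page in resp.json()["query"]["allpages"]:
    print(page["title"])
```

Whether the Translate extension builds its own language bar this way or keeps an internal index is left open in the chat; this only shows that the subpage layout makes the list recoverable through the standard API.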