[06:09:41] !admin
[06:09:58] * Cameron11598 lurks but is not an admin
[06:10:05] !admin Hello guys, I am a bureaucrat on mk.wiki and I have an issue
[06:10:29] on our wiki with the transclusion of numbers (population and dates) from Philippine Wikidata sets
[06:11:52] I just cannot get my head around why it creates the problem, giving awful decimal numbers and showing errors for dates. I have looked both at the PH Wikidata template and at Module:String, and neither seems to be wrong
[06:12:01] I am not sure if this will be the case with other countries in the future
[06:13:05] an example would be https://mk.wikipedia.org/wiki/%D0%90%D0%B1%D1%80%D0%B0
[06:14:11] the population as a number is fine, but uses the wrong delimiter (comma instead of point for thousands, and vice versa for decimals; as in many languages, it is the opposite of the rule in English). It therefore also gets the density wrong, with ridiculous numbers
[06:14:31] and it does not show the date of the information, showing an error in red text
[06:48:27] !admin
[06:48:46] sorry, my computer somehow froze, so I have had no chance to see if there was any reply to my query
[11:30:15] hoo: any progress in the suggester process?
[11:50:40] sjoerddebruin: Nope
[11:50:52] awww
[11:50:58] Maybe I'll find time for that this week, but doesn't seem very likely :/
[11:52:19] ok
[12:13:16] DanielK_WMDE: https://gerrit.wikimedia.org/r/323817 Would like to backport that today
[12:13:20] so please review
[14:58:05] edoderoo: https://www.wikidata.org/w/index.php?title=Q2261335&type=revision&diff=286607010&oldid=280341929 ehm? Hehe, I guess you're not parsing the dates correctly
[15:04:25] Nice growth this month tho. https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-terms?panelId=5&fullscreen&from=now-1y&to=now
[15:08:57] For Arabic?
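(Editor's aside on the 06:14 separator problem: the mismatch can be sketched in a few lines. This is an illustrative Python sketch only — the actual wiki code is a Lua template/module and would normally rely on MediaWiki's locale-aware `mw.language:formatNum` rather than hand-swapping separators; the function names below are hypothetical.)

```python
def en_to_mk_number(text: str) -> str:
    """Convert an English-formatted number string (e.g. "1,234.56")
    to the opposite convention used by Macedonian and many other
    locales ("1.234,56"): point for thousands, comma for decimals.
    A placeholder is used so the two replacements don't collide."""
    return text.replace(",", "\x00").replace(".", ",").replace("\x00", ".")

def parse_en_number(text: str) -> float:
    """Parse an English-formatted number for arithmetic (e.g. computing
    density); feeding the localized string to float() directly is what
    produces the 'ridiculous numbers' described above."""
    return float(text.replace(",", ""))
```

For example, `en_to_mk_number("1,234.56")` yields `"1.234,56"`, while `parse_en_number("1,234.56")` yields `1234.56` for use in calculations.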
https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-terms?panelId=5&fullscreen&from=now-1y&to=now&var-lang=nl is Dutch
[15:09:34] oh, weird
[15:09:51] Yeah, something fishy going on there
[15:10:07] yea, i was looking at the same thing. for "all" the curve looks pretty smooth
[15:10:30] I must have set it to ar by accident, or there must be a wrong setting
[15:10:36] let me try incognito
[15:11:07] multichill: i wanted to ask you about the pywikibot breakage. We need to make breaking changes every now and then. How could we have avoided the trouble? How can we handle this better in the future?
[15:11:20] DanielK_WMDE: FYI: wfLoadExtension( 'WikibaseLexeme' ); $wgLexemeNamespace = 130; That's it. ;-)
[15:11:21] Would it be sufficient to warn about breaking changes earlier?
[15:11:31] Thiemo_WMDE: \o/
[15:11:45] I had a better and easier idea: just file a task in Phabricator?
[15:12:04] multichill: for every client?
[15:12:20] No, Pywikibot. Name 5 other clients that are in Phabricator?
[15:12:46] Or 4, or 3, or 2, or 1? (can't think of one)
[15:13:14] multichill: well, that's kind of the point, we'd have to hunt them down.
[15:13:25] i think wd toolbox is on phab. or just GitHub?
[15:13:26] whatever
[15:13:47] No, you haven't. No other project has asked you to please file a bug.
[15:14:00] multichill: so, the argument is that pywikibot is so widely used that it should get special treatment? How about Magnus' tools? wd toolkit?
[15:14:34] Say you have a breaking API change every month (I doubt that); it would be 12 bugs a year. How much work is that?
[15:14:36] I'm not opposed to giving pywikibot special treatment wrt JSON changes.
[15:14:49] i'm just wondering where we should stop, and what the criteria are
[15:15:21] Don't go down the slippery slope, Daniel
[15:15:46] Just keep it scoped clearly and see how it goes.
If it doesn't go well, don't continue; if needed, expand
[15:15:46] exactly :)
[15:16:14] avoiding the slippery slope means not taking the first step.
[15:16:49] No, that's a logical fallacy
[15:17:10] multichill: i'm open to the "please notify pywikibot directly, because it's BIG" argument. I'm just wondering where to put that, and whether it needs to be formalized.
[15:17:34] Do you keep some documentation on what to do on an API change?
[15:17:44] That would be the logical place to document it
[15:18:02] multichill: https://www.wikidata.org/wiki/Wikidata:Stable_Interface_Policy
[15:18:29] you can propose to amend the policy to include notification of pywikibot on the talk page
[15:19:46] i wonder whether "file a ticket" is a good notification mechanism. maybe the pywikibot mailing list would be better?
[15:20:13] I think the main problem is that people don't realize a change is breaking and don't act
[15:21:14] well, we did send mail to two lists with BREAKING CHANGE in the subject. we also put it on the project chat
[15:21:22] i can try to make the BREAKING blink, next time...
[15:21:54] my experience is that people don't act until stuff breaks, period. no matter when or how often you notify them.
[15:21:56] Haha, too much junk mail going over these lists?
[15:22:06] wikidata-tech is pretty quiet
[15:22:23] So the moment you committed it to master, one of our tests should have failed
[15:22:35] should, yes
[15:22:41] But some people went apeshit on that, so now it's a complicated monster I don't understand at all
[15:22:58] what is? the pywikibot tests?
[15:23:05] The testing, yes
[15:23:15] *sigh*
[15:24:25] ok, so my suggestion: subscribe to wikidata-tech, and suggest a change to the policy on https://www.wikidata.org/wiki/Wikidata_talk:Stable_Interface_Policy
[15:24:53] Don't really feel like a big vote right now; I'm just going to add pywikibot-l to the notification list and see how that goes for, say, the next 6 months
[15:24:54] we'll also try to give the warning earlier. time was a bit tight this time.
[15:25:46] multichill: no big vote. the policy is subject to itself, so breaking changes to the policy should be announced in advance.
[15:26:00] adding another list isn't breaking, i guess :)
[15:26:14] though it may be considered "significant".
[15:28:01] https://www.wikidata.org/w/index.php?title=Wikidata%3AStable_Interface_Policy&type=revision&diff=412100914&oldid=404590759 as easy as that ;-)
[15:28:28] heh, be bold ;)
[15:29:46] Talking about testing: do you guys have code to generate incorrect output, to see if the client survives it?
[15:32:57] aude: here?
[15:33:01] multichill: i don't think so. "incorrect" is also a bit broad. binary garbage is probably not helpful. things that are "nearly" right are the interesting cases, right?
[15:33:25] basically, we'd want a bunch of correct and incorrect json files.
[15:33:28] DanielK_WMDE: Like, for example, I would like to force a database lag error or non-JSON output (for example when a proxy died)
[15:34:01] can't you just read from a file?
[15:34:29] that's all mediawiki core (or varnish or whatever) stuff btw, not under the control of wikibase
[15:34:37] DB lagging I know, that was lag=-1.
Sure, that would work; just wondering what you guys already have in that direction, so I could re(ab)use it ;-)
[15:34:46] DanielK_WMDE: aude Lydia_WMDE Heads up: I want to SWAT https://gerrit.wikimedia.org/r/323347 and https://gerrit.wikimedia.org/r/323817 today
[15:35:16] multichill: We only have database mocks that "randomly" omit results, to test master fallbacks
[15:35:19] but that's about it
[15:35:38] multichill: we have very little code that calls a web API. well, except in the JS code. no idea how they test error cases. I'm afraid there are no such tests there.
[15:35:43] ask Jonas_WMDE
[15:35:53] We don't
[15:35:59] \o/
[15:36:02] lovely
[15:36:05] There's a bug for that, but we don't integration test it
[15:36:12] we unit test it, though
[15:36:24] but nothing that does API + JS
[15:37:00] a unit test for handling erroneous API responses would already be good
[15:38:22] hoo: you have my +1; i'd want to test manually for a +2. won't get around to that until later, so maybe someone else can.
[15:38:39] in particular, i want to see all three modes work
[15:38:54] The tests do that, but I can see the motivation for doing it by hand
[15:39:22] We have https://doc.wikimedia.org/pywikibot/api_ref/tests/wikibase_tests.html
[15:43:09] DanielK_WMDE: Thanks for your input.
[17:05:24] hm… does anyone have more information about https://www.wikidata.org/wiki/Property:P1921? Is there a relevant bug?
[17:05:40] I couldn't find the property proposal for it, nor could I find anything on Phabricator
[17:06:07] P1921 is "RDF URL template"
[17:07:29] hoo: https://www.wikidata.org/wiki/Wikidata:Property_proposal/Archive/33#P1921 ?
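(Editor's aside on the 15:29–15:37 testing discussion: the "unit test for handling erroneous API responses" could be sketched as below. This is a hedged sketch, not pywikibot or Wikibase code — `fetch_entity` and `ApiError` are hypothetical names, and the fixtures stand in for the "read from a file" idea mentioned above.)

```python
import json

class ApiError(Exception):
    """Hypothetical well-defined error for any malformed API response."""

def fetch_entity(read_response) -> dict:
    """Parse an entity from a raw response body, converting both
    non-JSON garbage and API-level error payloads into ApiError."""
    body = read_response()
    try:
        data = json.loads(body)
    except ValueError as e:          # covers json.JSONDecodeError
        raise ApiError(f"non-JSON response: {e}") from e
    if "error" in data:
        raise ApiError(data["error"].get("info", "unknown API error"))
    return data

# Fixtures in the spirit of "a bunch of correct and incorrect json files",
# inlined here as callables instead of files:
good = lambda: '{"id": "Q42", "labels": {}}'
garbage = lambda: "<html>502 Bad Gateway</html>"                     # dead proxy
lagged = lambda: '{"error": {"code": "maxlag", "info": "DB lagged"}}'  # lag error

assert fetch_entity(good)["id"] == "Q42"
for broken in (garbage, lagged):
    try:
        fetch_entity(broken)
    except ApiError:
        pass  # handled gracefully, which is the point of the test
    else:
        raise AssertionError("broken response was not caught")
```

The same fixtures could of course live in files on disk, as suggested at 15:34, so the correct and "nearly right" cases can grow independently of the test code.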
[17:07:34] It's linked from the talk page
[17:08:24] Ah, that makes sense
[17:08:36] https://www.wikidata.org/w/index.php?title=Special%3AWhatLinksHere&target=Property%3AP1921&namespace=120 <- it's used more than I expected
[17:09:23] the property itself also makes sense
[17:09:46] but not sure we had that in mind when working on the actual RDF mapping
[17:10:00] Otherwise you'll have to work with qualifiers for the other property. That might become a bit messy
[17:11:58] You should probably include both as schema:sameAs? And one with some sort of qualifier to indicate it's the RDF representation?
[17:15:47] hoo: Related: how would you add the URL pattern https://api.rkd.nl/api/record/images/$1?format=json to https://www.wikidata.org/wiki/Property:P350 ? See for example https://api.rkd.nl/api/record/images/32610?format=json
[17:17:20] multichill: hm… make the current one preferred and add that one as normal?
[17:17:39] You can add further qualifiers, if you feel like it, but not sure what would be useful there
[17:22:07] https://www.wikidata.org/w/index.php?title=Property%3AP350&type=revision&diff=412119269&oldid=411621845 ?
[17:24:05] yeah, probably
[17:24:49] hoo: Hey...
[17:25:01] hi d3r1ck
[17:27:01] Was just thinking about how I can use Wikidata here in my community
[17:27:05] hoo: Any ideas?
[17:27:22] Or probably get some people interested in Wikidata
[17:28:00] Use it for what, precisely?
[17:28:17] I have two perspectives:
[17:28:23] 1. Data mining
[17:28:51] 2. Prediction of trends using Wikipedia and Wikidata
[17:29:05] *prediction using trends, rather
[17:31:45] hoo: Just still thinking, though; maybe better ideas will come.
[17:32:44] I still don't really know what you're trying to do: use the data in Wikidata? Have people edit Wikidata? Just introduce Wikidata? …?
[17:33:29] hoo: Basic steps for the introduction will be done
[17:33:47] After that, people will know how to edit Wikidata etc...
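(Editor's aside on the 17:15 question: URL-pattern properties of this kind work by substituting the stored external identifier for the `$1` placeholder, which is trivial to sketch. The function name below is made up for illustration.)

```python
def apply_formatter_url(pattern: str, external_id: str) -> str:
    """Substitute an external identifier into a $1-style URL pattern,
    as used by Wikidata formatter-URL values like the one above."""
    return pattern.replace("$1", external_id)

# The RKDimages example from the conversation:
url = apply_formatter_url(
    "https://api.rkd.nl/api/record/images/$1?format=json", "32610")
# url == "https://api.rkd.nl/api/record/images/32610?format=json"
```

Since a property can carry several such patterns (here, one preferred and one normal rank, as decided above), a consumer would pick a pattern by rank before substituting.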
I am trying to figure out a way in which Wikidata can be used for prediction or data mining.
[17:37:23] hoo: But I will send you an email with my thoughts, sounds good?
[17:54:37] d3r1ck: Go ahead
[17:54:41] sorry, I was afk for a bit
[18:28:16] hoo: Ok
[18:37:56] good evening
[18:38:03] hi melderick!
[18:39:22] any news about the JSON dump? it wasn't generated last week ^^;
[18:40:37] which one?
[18:40:47] is there a task in the issue tracker about it?
[18:41:12] the ones that go here: https://dumps.wikimedia.org/wikidatawiki/entities/
[18:41:28] yes please, we gotta check what happened by looking into the maintenance crons
[18:41:52] dumps is the place where they end up, but only after they are generated
[18:42:14] * andre__ looks at https://phabricator.wikimedia.org/maniphest/query/Djk4JXMGOqEw/#R
[18:42:15] by some scripts run from a cron maintained by hoo/wikidata
[18:42:29] hoo: Ping regarding https://phabricator.wikimedia.org/T126146#2819350 - what do we need to do, and if something, can it be done soon? I understand it is still a requirement to update wikidata/sites and related file caches before/when updating a site's language code?
[18:42:42] We need to move forward with this; it's been blocking the config change for a long time now.
[18:42:56] could you create a new ticket, please, melderick? that would be great
[18:43:21] mutante: sure
[18:43:29] i'll also take a look for obvious errors, or whether it was disabled by a person
[18:43:36] thanks
[18:47:35] Feel free to create a ticket
[18:47:51] I'll kick off the dump in a bit
[18:48:00] am currently working on fixing the dumpers
[18:48:30] We have https://phabricator.wikimedia.org/T151356 for that
[18:48:42] but that's not specifically about the dumpers
[18:48:53] if you want, you can create a parent ticket for that one, melderick
[18:49:34] Krinkle: I'll put it on my list
[18:51:25] great hoo, thanks
[18:55:09] hoo: Thanks
[18:56:23] https://phabricator.wikimedia.org/T151787
[18:56:49] melderick: Great, thanks
[18:57:08] sorry, just saw your message; I haven't linked it to yours
[20:26:51] DanielK_WMDE: ping?