[05:26:39] RECOVERY - High lag on wdqs1002 is OK: OK: Less than 30.00% above the threshold [600.0]
[05:28:49] RECOVERY - WDQS SPARQL on wdqs1002 is OK: HTTP OK: HTTP/1.1 200 OK - 13048 bytes in 0.003 second response time
[05:29:09] RECOVERY - WDQS HTTP on wdqs1002 is OK: HTTP OK: HTTP/1.1 200 OK - 13048 bytes in 0.001 second response time
[05:39:31] Change on www.mediawiki.org a page Extension:Wikibase Client was modified, changed by SamanthaNguyen link https://www.mediawiki.org/w/index.php?diff=2515885 edit summary: [+47] mark extension type as "data extraction", "parser function"
[08:26:00] Do we have a problem with PAWS again? I can't get my scripts running...
[08:37:52] MisterSynergy: any specific error you get?
[08:39:30] http://paws-public.wmflabs.org/paws-public/User:MisterSynergy/Untitled1.ipynb
[08:40:31] that always worked for me
[08:43:36] hm, that's strange indeed
[08:44:08] perhaps try flushing the cache and running in verbose mode
[08:44:12] already tried stopping and starting the script/server, as well as fresh logins
[08:54:56] matej_suchanek: what exactly do you mean by "verbose mode"? Do I have to go to the command line? If so, which command?
[08:55:34] https://www.mediawiki.org/wiki/Manual:Pywikibot/Global_Options (-verbose)
[09:07:35] matej_suchanek: I hope I got this now (I ran a .py file with the demo script using: "pwb.py Untitled1 -verbose"). There seems to be an authentication problem
[09:07:47] have a look at the transcript linked above again
[09:08:17] it happens in two different browsers, one of which is new to PAWS
[09:08:47] that's the old PAWS problem...
[09:09:01] which convinced me that using PAWS is a no-go for me
[09:09:45] what can I do? my Tool Labs pywikibot install is not really ready for use yet, and PAWS is a convenient tool to test things
[09:13:04] take a look at https://phabricator.wikimedia.org/T136114, perhaps someone has suggested a workaround there
[09:22:05] matej_suchanek: thanks for this phab.
looks complicated, unfortunately
[09:36:17] Lucas_WMDE: earlier yair rand was asking about http://tinyurl.com/ybfgdjh4 and if anyone knew why it returns blanks, I didn't see any responses and I certainly can't figure out why it does, any ideas?
[09:38:55] that looks like a BlazeGraph bug to me
[09:39:05] http://tinyurl.com/ya36o4oo is also weird
[09:39:10] that query still returns two nonempty results
[09:39:30] but if you use BIND instead of a triple (comment out L3, add L2), the results are already empty without the UNION
[09:41:02] yeah, I was thinking it seemed like a bug
[09:56:51] nikki: should I report it?
[09:56:57] please do
[09:57:05] sure
[10:19:25] done: https://phabricator.wikimedia.org/T170915
[10:33:30] thanks :)
[10:38:27] oh
[10:38:37] I said "thanks :)" but apparently you weren't here
[10:40:01] just had to reboot, sorry :D
[10:42:37] * nikki wonders when the unique value constraints will start showing the other items with the value
[10:43:41] as soon as the next deployment happens… not sure when that is
[10:43:56] should happen before Thursday, hopefully (unless there are more build problems)
[10:45:09] oh good
[10:45:45] because Thursday is when we want to enable constraint statements, and that definitely needs the updated code
[13:01:13] matej_suchanek: Muhammad Abul-Futooh is still editing at 159 edits per minute.
[13:01:29] Wait, 103*
[13:03:11] are they?
[13:03:59] According to http://wikidata.wikiscan.org/?menu=live&date=12&list=users&sort=edit&filter=all, they've stopped now.
[13:04:12] Still a huge difference from other users and bots.
[13:05:10] https://www.wikidata.org/w/api.php?action=query&format=json&list=recentchanges&rcuser=Muhammad+Abul-Futooh&rclimit=10
[13:05:23] their most recent edits were one per second.
[13:05:48] Magnus is also to blame imo
[13:12:37] Hello.
[13:14:21] When will Wikidata be enabled for the new Dinka Wikipedia?
( din.wikipedia.org , https://phabricator.wikimedia.org/T168518 )
[13:15:41] Usually I don't push for this and just wait patiently, but this case is a bit special, as there's an interesting story behind it...
[13:17:04] It's one of the biggest languages of South Sudan, a country that recently became independent.
[13:17:18] There's a professor of linguistics who is researching this language.
[13:17:36] He frequently travels to South Sudan for this, and he's trying to convince the government there to invest more in producing school textbooks in this language.
[13:18:00] someone probably forgot a step
[13:18:13] He says that currently they aren't investing in anything but English, which is not so sensible given that most people in South Sudan don't speak it.
[13:18:33] But they think that it's impossible to write books about education and science in Dinka.
[13:18:58] you could try pinging some of the people who were involved in getting jamwiki sitelinks enabled (see the comments after https://phabricator.wikimedia.org/T134017#2277505)
[13:19:06] So he wants to prove to them that it is possible by showing them that there's a Wikipedia in that language,
[13:19:35] and he's sure that the most impressive step will be showing them an interlanguage link from English to Dinka.
[13:20:45] nikki: hmmmm... don't the relevant steps appear at https://wikitech.wikimedia.org/wiki/Add_a_wiki ?
[13:21:46] aude: ^
[13:21:51] as far as I can tell it's there, with a warning saying we might want to ask hoo or aude to do it
[13:21:51] heh yeah
[13:25:16] nikki, aude - https://phabricator.wikimedia.org/T170930 , for what it's worth.
[13:25:45] yeah, that's probably a better idea, aude doesn't seem to be around in here much :/
[17:31:54] A bit sad that they break page previews on Wikidata but fixing it isn't a priority for them.
[18:02:49] DanielK_WMDE or anyone: given that I have a string, how do I quickly construct the correct EntityId (ItemId, PropertyId) instance?
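The BIND-versus-triple discrepancy reported earlier (the suspected BlazeGraph bug filed as T170915) can be probed with a pair of queries that standard SPARQL defines as equivalent. The sketch below uses placeholder IDs (wd:Q42 / wdt:P31 — not the items from the original tinyurl queries, which aren't shown in the log) plus a small stdlib-only helper for building a query service request URL.

```python
import urllib.parse

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# Two forms that SPARQL 1.1 defines as equivalent: binding ?class via a
# triple pattern vs. via BIND. If the query service returns different
# results for them, that points at an engine bug, not at the query.
TRIPLE_FORM = """
SELECT ?class WHERE {
  wd:Q42 wdt:P31 ?class .          # bound directly by the triple pattern
}
"""

BIND_FORM = """
SELECT ?class WHERE {
  wd:42 wdt:P31 ?c .
  BIND(?c AS ?class)               # same value, bound via BIND
}
""".replace("wd:42", "wd:Q42")     # keep the two queries over the same item

def wdqs_url(query: str) -> str:
    """Build a GET URL for the query service, asking for JSON results."""
    params = urllib.parse.urlencode({"query": query, "format": "json"})
    return f"{WDQS_ENDPOINT}?{params}"
```

Fetching both URLs and diffing the result bindings is one way to turn a "this returns blanks" observation into a minimal reproduction for a Phabricator report.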
[18:38:29] matej_suchanek: I guess this isn't helping with the dispatch either? https://www.wikidata.org/wiki/Special:Contributions/Nikosguard
[18:38:33] Should I ask him to postpone?
[18:38:56] Also 73 edits per minute.
[18:40:07] matej_suchanek: would you be able to add an abuse filter to prevent edits with "CoordinateScraper" in the summary? (because of https://www.wikidata.org/wiki/Wikidata:Administrators%27_noticeboard#Another_fight_with_bot if you didn't see that)
[18:40:37] https://www.wikidata.org/wiki/Topic:Tu1xoy8ppki62iw7
[18:41:19] It seems like everything above 60 is problematic.
[19:02:08] sjoerddebruin: thanks
[19:02:31] nikki: why can't we stop the tool directly?
[19:02:42] anyway, it should be easy
[19:05:49] magnus isn't responding, and blocking the account would mean blocking other people using quickstatements
[19:06:13] so I was hoping we could just prevent the bad edits until magnus reappears
[19:06:26] nikki: magnus is ill, apparently https://twitter.com/MagnusManske/status/887059573630259205
[19:06:51] I missed that...
[19:09:23] nikki: https://www.wikidata.org/wiki/Special:AbuseFilter/98
[19:10:48] I set it to disallow, feel free to suppress it as soon as possible
[19:10:59] thanks
[19:12:08] I'm concerned it might break that account, though
[19:12:47] oh?
[19:12:50] if I'm not mistaken, those batches are queued
[19:13:23] https://www.wikidata.org/wiki/Special:Contributions/Mr.Ibrahembot is still running at 100 edits per minute.
[19:13:27] two things can happen: either each error will be skipped, or the first one will cause it to hang
[19:13:42] the former is preferred, of course
[19:13:48] we will see :)
[19:15:15] sjoerddebruin: it's too fast, yeah
[19:15:40] tell me, why does everyone need to run their bots *now*?
[19:15:54] All I got was "Hi, Okay I will do it."
[19:18:03] I replied there
[19:23:33] WikidataFacts: "Berg im Saudi-Arabien" is wrong, isn't it?
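For reference, a filter that disallows edits carrying that tool's summary can be a one-line condition in AbuseFilter rule syntax. This is only an illustrative sketch; the actual conditions of Special:AbuseFilter/98 aren't shown in the log:

```
/* Disallow edits whose summary mentions the runaway QuickStatements batch.
   Illustration only — not the deployed filter 98. */
contains_any( summary, "CoordinateScraper" )
```

Set to "disallow", this blocks only the matching edits, so other people's QuickStatements batches through the same shared account keep working.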
[19:23:41] yeah, *in
[19:23:47] thought so
[19:24:06] <3 German
[19:24:22] Only 1122 to fix. :P
[19:24:31] but im Iran, im Libanon… :D
[19:24:54] in der Schweiz, in den something plural... such fun XD
[19:25:05] e.g. Niederlanden
[19:25:06] yeah
[19:25:25] in den Vereinigten Staaten :D
[19:25:36] or that :D
[19:26:02] ...von Mexiko.
[19:26:41] "US-amerikanischer"
[19:26:48] oh, you want to start with demonyms?
[19:26:53] „monegassischer Ruderer“
[19:27:13] I just want to smash these backlogs.
[19:27:54] The ones that you take care of yourself and are scared to teach others because you think they will do something wrong.
[19:28:36] monegassisch doesn't seem too bad, I don't even remember how to spell the English word
[19:30:28] wow, I didn't know the English one was just as bad
[19:30:40] I blame French :P
[19:37:06] Monacosian ?
[19:37:30] Monegasque, apparently
[19:38:11] People native to Monaco are called Monegasque. A person born in a foreign country but resident in Monaco is a Monacoian.
[19:38:38] what, seriously?
[19:38:52] Still interesting how cases like this are still quite unique in Wikidata. https://www.wikidata.org/wiki/Q32313737#P27
[19:39:01] source: http://www.tenfactsabout.co.uk/0017monaco.htm
[19:39:36] "Monaco is famous for its casino in Monte Carlo. However, residents of Monaco are not allowed to gamble or even enter the casino!" lol
[19:47:18] any Russian speakers around? what little I remember about Russian grammar is making me think "гора в Саудовская Аравия" is probably also wrong
[19:49:17] I think so, but I can't tell you the correct one
[19:49:53] I suppose the suffixes will be -oj -ii but I'm not sure about that really
[19:53:27] google translate agrees with you, at least
[19:53:41] lol :)
[19:54:24] yeah, that could mean anything
[20:05:09] does anyone _actually_ know how to use SPARQL? I still keep coming across query results that make no sense, and unfamiliar syntax and quirks that don't appear in the spec.
[20:06:36] YairRand: did you see that WikidataFacts made a ticket about the thing you asked about last night?
[20:07:32] and SPARQL is a bit of a mystery to me at times too
[20:07:43] like just now I had to wrap something in an extra SELECT to make it stop giving me an error
[20:08:10] I did not see the ticket
[20:08:46] * YairRand searches
[20:08:48] https://phabricator.wikimedia.org/T170915 this one
[20:10:26] just found out about the "include" syntax. very confusing.
[20:11:07] I'm still trying to figure out a way to have items linked to each other by "same as" treated as a unit in queries.
[20:11:17] it is not going very well.
[20:14:30] YairRand: try to say that another way, perhaps I can help you...
[20:16:49] matej_suchanek: Suppose items X and Y are linked to each other using P460. Z wdt:PW Y. X wdt:PW V. Search for a graph of PWs. Desired result: Z > ( X "/" Y ) > V
[20:17:09] (actually, that probably wasn't any clearer. let me try again.)
[20:20:46] Suppose that we're querying for a tree of items linked using Property:W. Item Z's P:W is Item Y, and Item X's P:W is V. We query for ?subject ?subjectLabel ?target ?targetLabel, with ?subject wdt:PW ?target, or some variation. Desired result includes: (on row 1) ?subject = Z, ?targetLabel = "X / Y", (on row 2) ?subjectLabel = "X / Y", ?target = V
[20:21:11] oh
[20:21:15] let me think...
[20:23:43] Z -> Y and X -> V are independent of each other, aren't they?
[20:24:01] sorry, I forgot to mention that X and Y are linked by P460
[20:24:28] aha
[20:24:37] (that was an important point to forget, sorry)
[20:27:40] a bit complicated, I must admit...
[20:28:38] Yeah, I spent a lot of time on it today and yesterday without much progress
[20:57:54] Well, one backlog down...
[21:02:39] matej_suchanek: https://www.wikidata.org/wiki/Wikidata:Administrators%27_noticeboard#Block_my_bot
[21:05:32] so finally...
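One way to approach the "treat P460-linked items as a unit" problem discussed above is to collapse an item and its P460 partner into a single combined label with GROUP_CONCAT over a zero-or-one property path. This is a sketch, not a tested solution to the full tree query: the patterns that actually select the ?subject items, and the wdt:PW tree property, are placeholders.

```python
# Sketch: give each ?subject a label of the form "X / Y", where one label
# comes from the item itself and the other from its P460 ("said to be the
# same as") partner, if any. The path wdt:P460?/rdfs:label reads "the
# label of the item or of its P460 partner" — the ? makes the P460 step
# optional (zero-or-one in SPARQL 1.1 property paths).
MERGED_LABEL_QUERY = """
SELECT ?subject (GROUP_CONCAT(DISTINCT ?l; separator=" / ") AS ?mergedLabel)
WHERE {
  # ... the triple patterns selecting ?subject go here ...
  ?subject wdt:P460?/rdfs:label ?l .
  FILTER(LANG(?l) = "en")
}
GROUP BY ?subject
"""
```

Joining this back against the ?subject wdt:PW ?target tree query (for instance as a nested subquery) would then put the merged "X / Y" label on both the subject and target sides of each row.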
[21:11:29] that seems to have been a response to me pointing out the grammatical errors in the descriptions
[21:12:04] but any reason that helps bring the lag down is a good one :P
[21:12:29] the first user ever to thank me for blocking their bot
[21:23:04] Yeah, just do more edits when we already have lag. http://wikidata.wikiscan.org/gimg.php?type=edits&date=201707&size=big
[21:24:51] this chart refuses to go down: https://grafana.wikimedia.org/dashboard/db/wikidata-dispatch?panelId=2&fullscreen&from=now-12d
[21:26:39] Give it some time, I think the charts have lag as well.
[21:26:49] this is a neverending story...
[21:26:55] https://www.wikidata.org/wiki/Special:Contributions/Muhammad_Abul-Futooh
[21:27:16] I’ve lost track of how many different high-frequency bots are causing problems now
[21:27:26] And https://www.wikidata.org/wiki/Special:Contributions/Nikosguard is still running
[21:27:54] Again, http://wikidata.wikiscan.org/?menu=live&filter=all&sort=weight&date=6&list=users gives a good view.
[21:28:10] at least the query service is healthy again
[21:34:10] I hope I can leave now, hopefully things get better while I sleep :)
[21:35:12] I wish I could do a polar projection for antarctic coordinates
[21:35:47] nikki: I feel like that's probably possible with Vega
[21:36:43] what's vega?
[21:36:58] https://www.mediawiki.org/wiki/Extension:Graph
[21:37:51] ah
[21:38:24] I was thinking more about the query service and geohack
[21:38:50] the extension hooks into the query service, iirc
[21:39:00] like I was trying to compare two pairs of coordinates the other day to see how far away they were, but one was on a normal map and the other on a polar map >_<
[21:39:12] so I had to compare the shape of the coastline
[22:21:55] Interesting: a lot of these users never made so many edits before.
[23:37:55] Dispatch seems steady now. https://grafana.wikimedia.org/dashboard/db/wikidata-dispatch?refresh=1m&orgId=1&from=now-12h&to=now