[13:20:21] Hi tim_WMDE
[13:20:37] Hey
[13:21:14] is this the Technical Advice IRC Meeting
[13:22:10] That will take place tomorrow!
[13:22:50] oh sorry, I did not notice that
[13:23:21] No problem
[13:23:39] Also keep in mind it's 3 pm UTC+0
[13:26:45] thank you. I might be able to make it to the 11pm slot
[13:42:57] tim_WMDE: cool, on https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting#{{/timezonelink|WMF|2018|09|05}} there is a timezone converter. I can be back in Berlin at 1 am even with Deutsche Bahn being late
[15:00:55] Hi, how do I enable the new RecentChanges UI (live updates etc.) in MediaWiki, please?
[18:49:03] paladox: are you talking about a private MediaWiki or a WMF one?
[18:51:36] he's asking for a non-WMF MediaWiki cluster
[18:51:51] and I _think_ they got it solved
[19:20:33] yup, I managed to find the config
[20:18:01] thedj: regarding https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/FlaggedRevs/+/457992/ - we could use mwgrep to check other wikis.
[20:18:06] can you provide a regex?
[20:18:16] I meant https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/Cite/+/449710/
[21:36:40] How do I determine whether a Wikipedia page exists using a cross-origin request? I'm working locally, so the origin will be something like file:// or localhost. I tried sending a HEAD request for the page, with origin=* in the query string, but there's no CORS header on the response. Manual:CORS didn't enlighten.
[21:39:36] hmm, why do you want to know that?
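(Editor's note, not part of the log: the answer given below is that anonymous CORS only works on the Action API endpoint, api.php, with origin=* as a query parameter; plain page URLs never send CORS headers, which is why the HEAD request above failed. A minimal sketch of building such a query URL in Python; the helper name and title list are illustrative, not from the chat.)

```python
from urllib.parse import urlencode

# The Action API endpoint honours "origin=*" for anonymous CORS requests;
# index.php and plain article URLs do not.
API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def build_existence_query(titles):
    """Return an api.php URL whose response reports whether each title exists.

    An anonymous request may batch up to 50 titles, joined with "|".
    """
    params = {
        "action": "query",
        "prop": "info",
        "format": "json",
        "formatversion": "2",
        "origin": "*",               # enables anonymous CORS on api.php
        "titles": "|".join(titles),
    }
    # safe="*|" keeps the origin and title separators readable in the URL
    return API_ENDPOINT + "?" + urlencode(params, safe="*|")

print(build_existence_query(["Cheese", "Bazqux"]))
```

A GET on the resulting URL from browser JavaScript (fetch/XHR) will then carry the Access-Control-Allow-Origin header in its response.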
[21:41:05] LeoTal: origin only works on api.php, not the main web interface
[21:41:07] Given a word, I want to determine whether it has a disambiguation page on Wikipedia
[21:41:46] you probably want api.php?action=query&prop=info&origin=*
[21:42:13] https://www.mediawiki.org/wiki/Manual:CORS but I'd recommend using JSONP
[21:42:21] I think you can also directly query whether it's a disambig page or a normal page
[21:42:37] Boo jsonp :P
[21:42:56] it just seems easier to use :)
[21:43:19] Well, I have certain biases
[21:43:36] as a security person, when it comes to JSONP
[21:43:40] Oh, then I can query several pages at once by passing titles=Foobar|Bazqux|Fumzot?
[21:43:56] LeoTal: up to 50, I think
[21:45:18] on a serious note: JSONP has the benefit of being really easy. Downside: it's bad for security (you are giving the other site control of your page) and it's harder to handle errors/timeouts
[21:56:22] I'm trying to GET https://en.wikipedia.org.wiki/api.php?action=query&prop=info&format=jsonfm&formatversion=2&origin=*&titles=Cheese but it's still failing the cross-origin preflight. Should I be POSTing? Sending credentials of some kind?
[22:02:49] Umm. What's with the .wiki TLD?
[22:09:07] bawolff: operated by Raymond King of ICANN
[22:09:34] he was at the WMF office a long time ago, and then WMF got all the domains
[22:10:02] https://toplevel.design/wiki/
[22:10:41] https://phabricator.wikimedia.org/rODNS4d6c5488a758a84b431d38d1ed6f1af806f2292c https://phabricator.wikimedia.org/T88873
[22:10:58] they are not being used. only "w.wiki" is used, for the URL shortener
[22:11:38] also see the "related objects" in the phab ticket above
[22:12:28] mutante: I meant, why is LeoTal connecting to en.wikipedia.org.wiki instead of en.wikipedia.org?
[22:13:00] and yeah... it saddens me how ICANN has sold out
[22:13:04] Because I copy-pasted something dumb from Stack Overflow. https://en.wikipedia.org/w/api.php is answering, hooray
[22:13:38] bawolff: oh, I didn't get the connection to LeoTal. I see it now
[22:13:46] heh, ok
[22:59:19] Thanks a lot for the help, y'all. I have exactly what I want now
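(Editor's note, not part of the log: interpreting the query response is the other half of the task. In the formatversion=2 shape, a missing page carries a "missing" flag, and disambiguation pages can be detected by also requesting prop=pageprops and checking for the "disambiguation" page prop. A sketch under those assumptions; the sample response is hand-written, not captured live, and the helper name is made up.)

```python
# Hand-written sample in the formatversion=2 response shape; the page ids
# and titles are illustrative only.
sample_response = {
    "query": {
        "pages": [
            {"title": "Cheese", "pageid": 101, "ns": 0},
            {"title": "Mercury", "pageid": 102, "ns": 0,
             "pageprops": {"disambiguation": ""}},   # set by the Disambiguator extension
            {"title": "Fumzot", "ns": 0, "missing": True},
        ]
    }
}

def classify_pages(response):
    """Map each returned title to 'missing', 'disambiguation', or 'article'."""
    result = {}
    for page in response["query"]["pages"]:
        if page.get("missing"):
            result[page["title"]] = "missing"
        elif "disambiguation" in page.get("pageprops", {}):
            result[page["title"]] = "disambiguation"
        else:
            result[page["title"]] = "article"
    return result

print(classify_pages(sample_response))
```

With titles batched as titles=Foobar|Bazqux|Fumzot (up to 50 per anonymous request, as noted in the chat), one round trip classifies every word at once.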