[14:00:31] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @CFisch_WMDE & @nuria - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:50:15] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @CFisch_WMDE & @nuria - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:00:23] Hi \o/
[15:01:02] hi Gopa
[15:02:37] \o/
[15:02:53] CFisch is busy so I think I’m substituting for him :)
[15:02:59] welcome everyone!
[15:04:42] hello Lucas_WMDE
[15:04:46] I have hosted the back-end of VideoCutTool on Toolforge: https://tools.wmflabs.org/video-cut-tool-back-end/
[15:04:46] I'm trying to implement MediaWiki OAuth, taking https://github.com/srish/nodejs-mediawiki-oauth-tool as reference; I have implemented it here: https://github.com/gopavasanth/video-cut-tool-back-end/blob/master/routes/index.js#L25
[15:04:46] but when I use the https://tools.wmflabs.org/video-cut-tool-back-end/login route
[15:04:46] I'm getting the error "OAuth authentication requires session support"
[15:04:46] I tried to fix this myself; I searched around and also went through https://stackoverflow.com/questions/22298033/nodejs-passport-error-oauthstrategy-requires-session-support but failed to fix it :(
[15:04:47] Any idea where I went wrong?
[15:05:33] hm, have you seen https://wikitech.wikimedia.org/wiki/Help:Toolforge/My_first_NodeJS_OAuth_tool already?
[15:05:40] yes yes :)
[15:05:44] I’ve never used OAuth from JS but I had very good experiences with the Python version of that page
[15:05:46] oh okay :)
[15:08:46] FWIW this is something internal to passport, OAuth does not require session support in the MediaWiki sense
[15:11:44] Any ideas to fix this? https://ibb.co/qD0BvmK
[15:12:30] the URL callback is working, but the problem is session support :(
[15:14:47] Gopa: your best bet is stepping through the JS code to see what is missing; you can do that with a JS debugger, Chrome's for example
[15:15:33] Gopa: https://developers.google.com/web/tools/chrome-devtools/javascript/
[15:16:00] * Gopa going through the provided resources :)
[15:18:23] Hello everybody. I'm here to ask for help with the Popups extension. In short, I want to use popups to show a special page, but this is not working for several reasons. There are some mechanisms in the Popups extension to avoid calling special pages. When I bypass these parts, it turns out that the TextExtracts extension does not extract the text from special pages.
[15:18:49] So I have the feeling there is a reason why popups should not visualize special pages, but I don't know any reason.
[15:18:57] Is there any documentation on hosting a React front-end app (video-cut-tool-front-end) on Toolforge?
[15:18:57] I tried to find it too, and tried to host it by following https://wikitech.wikimedia.org/wiki/Help:Toolforge/Web#node.js_web_services but :(
[15:24:11] To put some context to my question from above, here is the Phabricator ticket I'm working on: https://phabricator.wikimedia.org/T208758
[15:27:28] andreg-p9199: popups need to have "summary" versions of pages to be able to show them
[15:28:42] Yes, but the Popups extension just fetches some data via the mw.Api.get method, puts it into an internal model and then visualizes the data from the model.
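For reference, the usual fix for passport's "OAuth authentication requires session support" error is registering the express-session middleware before passport's session handling. A minimal sketch, assuming an Express app; the secret value and the strategy wiring are placeholders, not taken from the tool's actual code:

    // Sketch only: the session middleware must be registered before
    // passport.session(), otherwise passport's OAuth1 strategy throws
    // "OAuth authentication requires session support".
    const express = require('express');
    const session = require('express-session');
    const passport = require('passport');

    const app = express();

    app.use(session({
        secret: 'replace-with-a-real-secret', // assumption: any non-empty secret
        resave: false,
        saveUninitialized: false
    }));

    // passport hooks into the session created above
    app.use(passport.initialize());
    app.use(passport.session());

    // ... register the MediaWiki OAuth strategy and the /login route here,
    // as in the nodejs-mediawiki-oauth-tool example linked above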
[15:29:13] The get method I'm talking about is here: https://github.com/wikimedia/mediawiki-extensions-Popups/blob/1912ea943d893c7a56b3e1d286605c1591ca2512/src/gateway/mediawiki.js#L36
[15:29:21] andreg-p9199: and the api call is working well?
[15:30:31] I think it does. The prop two lines below is given as: It calls the TextExtracts extension
[15:30:38] andreg-p9199: did you try to make that api call (with curl, say) for the pages you are interested in?
[15:30:39] oops... sorry. The prop is given as: prop: 'info|extracts|pageimages|revisions|info'
[15:31:13] andreg-p9199: try to do the API call that that code is executing with "curl" and see what you get
[15:31:24] Not with curl, but I manipulated the TextExtracts extension and I verified that the API call worked well
[15:31:42] But the TextExtracts extension does not extract the text from the special page
[15:31:49] only from normal other pages
[15:32:12] andreg-p9199: right, because special pages do not exist "as text" I think
[15:32:33] ah... that's new to me
[15:32:41] Lucas_WMDE: I also tried to debug and find something. Can you suggest someone who can help me with the MediaWiki JS OAuth?
[15:34:04] maybe Lucas_WMDE can verify: do special pages exist as "text" such that they can be parsed? (probably not, right?)
[15:34:09] nuria: can you explain that in a bit more detail? Why does it not exist "as text"?
[15:34:19] nuria: I’m not sure, but you *can* transclude them, at least
[15:34:25] {{Special:PrefixIndex/}} is sometimes used, for example
[15:34:52] but perhaps that only works for certain special pages? no idea
[15:36:11] Gopa: I’m not sure who the right people would be, but it looks like Srishti Sethi wrote the initial version of that NodeJS OAuth page, perhaps she can help?
[15:36:17] (doesn’t seem to be online right now as far as I can tell)
[15:37:02] nuria: ah, there’s a class IncludableSpecialPage and a method isIncludable(), so I guess that controls transclusion
[15:37:09] not sure if that also affects TextExtracts
[15:37:19] (ping andreg-p9199 for that part too)
[15:38:05] yeah thanks Lucas_WMDE :)
[15:38:05] And about hosting a React front-end app on Toolforge? any guide for that?
[15:38:09] I'm following your advice. transclusion is new to me as well
[15:38:10] TextExtracts basically approximates the plaintext rendering of the first paragraph of the article
[15:38:10] special pages have no such thing
[15:40:08] Gopa: all I know is that Magnus used Vue for a tool a while ago http://magnusmanske.de/wordpress/?p=441
[15:40:13] not the same thing as React, of course
[15:40:47] "transclusion" for special pages just means that the SpecialPage object is asked to generate some wikitext
[15:40:59] @tgr_ So I guess TextExtracts should not be extended to handle special pages, since this would affect too many other classes.
[15:41:15] that wikitext is very unlikely to be suitable for getting a summary from, so TextExtracts doesn't even try
[15:42:01] I doubt it would do any harm, I also doubt you could generate any kind of meaningful extract though
[15:43:36] So it makes more sense to extend the Popups extension for my task (popups for math that show information from Wikidata)
[15:43:49] and it would be a major re-engineering effort: text extracts are generated on page save, there is no such thing for special pages, the extension would have to be invoked whenever someone wants an extract of a special page, the special page would have to know how to invalidate that extract when the underlying data changes....
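For reference, replaying the gateway's API call outside the browser, as suggested above, might look like this. A minimal sketch: the prop list is the one quoted in the discussion, while the title and the TextExtracts options (exintro, explaintext) are assumptions; the real gateway in mediawiki.js#L36 passes more parameters:

    // Sketch: replay the Popups gateway query against a wiki's api.php
    // and inspect what comes back for a given title (e.g. a special page).
    const params = new URLSearchParams( {
        action: 'query',
        format: 'json',
        formatversion: '2',
        prop: 'info|extracts|pageimages|revisions|info', // as quoted above
        titles: 'Special:RecentChanges', // assumption: any page you want to test
        exintro: '1',    // TextExtracts: intro section only (assumed option)
        explaintext: '1' // TextExtracts: plain text instead of HTML
    } );

    fetch( 'https://en.wikipedia.org/w/api.php?' + params )
        .then( ( res ) => res.json() )
        .then( ( data ) => console.log( JSON.stringify( data.query.pages, null, 2 ) ) );
    // For a special page, the "extract" property is expected to be missing
    // entirely, which is consistent with the popup coming back empty.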
[15:44:12] what do you want to show in the popups?
[15:45:05] When somebody hovers over a math equation that is annotated with a Wikidata ID, I want to visualize a summary (or at least something) from Wikidata in the popup
[15:45:56] So we thought it would be best to create a special page that fetches the information from Wikidata by a given ID, and this special page can be visualized in the popup
[15:46:04] That was our first approach
[15:48:02] I could perform the fetch from Wikidata directly in the Popups extension, but since this is very math-related and the popups are quite open, I doubt such changes would pass review
[15:49:24] hm, not sure what would be better here
[15:50:27] well, wait
[15:50:48] why do you need to go through TextExtracts for this?
[15:51:06] I assume you already have a special case so that, for Wikidata items, you get the special page from Wikidata for the preview
[15:51:09] I don't need to. It's just what the Popups extension does now
[15:51:39] can you bypass it?
[15:51:53] I assume Reference Previews don’t use TextExtracts https://www.mediawiki.org/wiki/Extension:Popups#Reference_previews_content
[15:51:57] perhaps you can do something similar
[15:51:58] Yes. But the preview is empty, since no data was fetched from the special page via the API call
[15:52:18] oh
[15:52:25] the “mechanisms in the popup extension to avoid calling special pages” you mentioned?
[15:53:02] Yeah, there is also a switch-case block that simply ignores special pages
[15:53:13] So I thought there must be a reason for that.
[15:53:28] Here is the API call the Popups extension uses:
[15:53:29] https://github.com/wikimedia/mediawiki-extensions-Popups/blob/1912ea943d893c7a56b3e1d286605c1591ca2512/src/gateway/mediawiki.js#L36
[15:54:20] Maybe I can extend the API call... and handle that call in the Math extension. That extension can then fetch the data from Wikidata
[15:54:36] yeah, that would need to use a different API call
[15:54:47] the special page won’t have pageimages or revisions either, for example
[15:55:15] yeah, I noticed that already
[15:55:49] As I said, when I bypass every mechanism a popup appears, but it's always empty :/
[15:56:24] I don't really know how the API call works. The props argument specifies which extensions are called, right?
[15:56:52] If I add the Math extension and somehow handle that request in the Math extension, that would be the best option I guess?
[15:57:02] But I don't really know how to do that
[15:58:13] props controls which properties about a page are returned – it looks like extensions can register additional properties, like the extract
[15:58:47] I would try to modify the JS to make a completely separate API call in this case
[15:59:08] oh really?... damn
[15:59:21] I worry that would not pass review
[16:01:39] perhaps I’m still not understanding how some of these bits fit together
[16:01:48] but I think that would make more sense
[16:02:09] in my view, you would have three different kinds of popups – page previews (extracts), reference previews, and math/Wikidata previews
[16:02:18] and only page previews would use that API call
[16:02:30] yeah
[16:02:45] page previews and reference previews are already handled differently
[16:03:08] anyway, our time is up – I think that concludes today’s Technical Advice IRC Meeting!
[16:03:12] thanks to everyone who participated :)
[16:03:20] don’t forget you can always ask questions over on Wikimedia Developer Support: https://discourse-mediawiki.wmflabs.org/
[16:04:00] thank you very much. Thanks to nuria and tgr_ as well!
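Following up on the thread above: the "completely separate API call" suggested for the math/Wikidata previews could look roughly like this. A hypothetical sketch, not Popups' actual gateway interface: the function name and the preview-model fields are assumptions; wbgetentities is the real Wikidata API module for fetching labels and descriptions by item ID:

    // Hypothetical gateway for math/Wikidata previews: given the item ID
    // a formula is annotated with, fetch label and description directly
    // from Wikidata instead of going through TextExtracts.
    function fetchWikidataPreview( entityId, language ) {
        const params = new URLSearchParams( {
            action: 'wbgetentities',
            format: 'json',
            ids: entityId,               // e.g. 'Q11518' (Pythagorean theorem)
            props: 'labels|descriptions',
            languages: language,
            origin: '*'                  // anonymous cross-origin API access
        } );
        return fetch( 'https://www.wikidata.org/w/api.php?' + params )
            .then( ( res ) => res.json() )
            .then( ( data ) => {
                const entity = data.entities[ entityId ];
                // Assumed shape of a minimal preview model
                return {
                    title: entity.labels[ language ].value,
                    extract: entity.descriptions[ language ].value
                };
            } );
    }

    // Usage sketch:
    // fetchWikidataPreview( 'Q11518', 'en' ).then( ( preview ) => console.log( preview ) );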