[02:08:11] hello! shouldn't this return the namespaces in German, since I added uselang=de? https://meta.wikimedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces&uselang=de
[02:09:35] or rather, how can I get localized namespace names for metawiki? is this not possible using the API?
[03:34:37] [[Tech]]; Killiondude; /* Do Wikimedia sites have sitemaps? */ re; https://meta.wikimedia.org/w/index.php?diff=19159551&oldid=19159407&rcid=13727227
[03:44:04] musikanimal: Meta's namespaces are just in English
[03:44:49] legoktm: but I see localized names in the dropdown at https://meta.wikimedia.org/wiki/Special:Contributions?uselang=de
[03:45:06] (same for any wiki)
[03:46:11] sure, but like the namespace is still "User", e.g. https://meta.wikimedia.org/wiki/User:Legoktm?uselang=de
[03:46:34] https://meta.wikimedia.org/wiki/Benutzer:Legoktm?uselang=de doesn't work of course
[03:46:59] yeah, I guess I'm just asking how to get that list you see at https://meta.wikimedia.org/wiki/Special:Contributions?uselang=de
[03:47:22] they don't appear to be normal system messages, even. uselang=qqx doesn't show a key
[03:49:40] so I'm looking at Html::namespaceSelectorOptions()
[03:49:54] Language::getFormattedNamespaces() apparently
[03:51:18] ah, that's a start. I see ApiQuerySiteinfo uses it https://gerrit.wikimedia.org/g/mediawiki/core/+/7462d3075a23095fd627cebeb0f66e081ad2bede/includes/api/ApiQuerySiteinfo.php#287
[03:51:41] but it uses the content language, not the user language
[03:52:02] oh right
[03:52:39] Html::namespaceSelector() has an option called 'in-user-lang' to make it use $wgLang (user language) instead of the default content language
[03:54:18] mkay. So not exposed through the API I take it
[03:54:33] do you think it would make sense for meta=siteinfo&siprop=namespaces&uselang=de to show localized names?
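[An editorial aside, not from the chat: siteinfo returns namespace names in the wiki's content language, and the core namespaces (IDs 0-15) are the same across Wikimedia wikis, so one workaround for German names is to query a German-content wiki's siteinfo and filter. The `core_ns` helper and the canned response fragment below are illustrative assumptions; real responses need `formatversion=2` for the names to appear under `name`.]

```shell
#!/bin/sh
# jq filter that pulls the id and localized name of the core namespaces
# (IDs 0-15) out of a formatversion=2 siteinfo response.
core_ns() {
  jq --raw-output '.query.namespaces | to_entries[]
    | select((.key | tonumber) >= 0 and (.key | tonumber) <= 15)
    | "\(.key)\t\(.value.name)"'
}

# Canned fragment of a siteinfo response, for illustration only (real
# responses list every namespace; names come back in the content language):
sample='{"query":{"namespaces":{"-1":{"id":-1,"name":"Spezial"},"2":{"id":2,"name":"Benutzer"},"100":{"id":100,"name":"Portal"}}}}'
printf '%s' "$sample" | core_ns

# Real usage (not executed here): core namespaces are shared across wikis,
# so a German-content wiki yields their German names:
#   curl -s 'https://de.wikipedia.org/w/api.php?format=json&formatversion=2&action=query&meta=siteinfo&siprop=namespaces' | core_ns
```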
seems intuitive to me
[03:56:39] I can create a phab, and maybe attempt to implement this myself
[03:57:38] all I want is the localized names of the basic namespaces, everything with an ID <= 15
[04:01:18] the default uselang value is the user language, so I think it would be a bit confusing if the namespaces suddenly weren't the correct names just because the user changed their interface language
[04:01:44] a different parameter like &sinslang or something might make sense
[04:02:18] musikanimal: regardless, filing a ticket sounds like a good idea. probably a.nomie has some suggestions on how to implement it
[04:02:34] okay! will do
[04:04:23] in the meantime, I'm okay with just fetching the messages by i18n key, if I can figure out what they are...
[04:09:37] they're not i18n messages
[04:09:48] namespaces are still localized in PHP files :(
[04:10:41] bummer
[04:13:58] https://phabricator.wikimedia.org/T226072 should you be interested. Thanks for the help!
[05:08:06] musikanimal: localised namespaces are not a property of the wiki itself
[05:08:49] there's also no way to know if a local alias is a translation (hopefully not)
[07:39:57] and you need to be careful with translations of namespaces, as they can invalidate existing proper ns:0 titles that happen to use the same prefix, so it's better to have them under stricter control to begin with.
[14:00:27] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @CFisch_WMDE & @amir1 - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:02:50] I forgot what day it was. Can't attend anyway as I'm on holiday.
[14:50:12] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @CFisch_WMDE & @amir1 - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:01:45] \o/
[15:02:03] Hello hello and welcome to the Technical Advice IRC meeting!
[15:02:09] o/
[15:02:57] We don't have any topics pre-posted on the wiki so just fire away with your questions ;-)!
[15:03:33] Why is the job queue broken in .10?
[15:03:34] * Reedy hides
[15:03:52] :-D
[15:04:41] Hi CFisch_WMDE
[15:04:53] Hi Gopa!
[15:05:59] Reedy: I don't know the answer to your question but I think Reedy knows better
[15:08:56] yesterday I asked in the same group regarding this, I couldn't fix it so I'm asking again :)
[15:08:56] >I'm working on video cut tool, an online tool to trim videos in commons,
[15:08:57] To concatenate videos using ffmpeg, I'm doing it this way https://github.com/gopavasanth/video-cut-tool-back-end/blob/error/routes/index.js#L99
[15:08:57] this command works and the videos are concatenated when I try it separately in my console,
[15:08:57] But if I run this file, I'm getting this error: https://ibb.co/bBzBbj2
[15:08:57] I tried to debug in several ways and tried to find some solution but couldn't. Any solution please?
[15:09:25] Could you paste text as text into a text pastebin?
[15:09:41] That blurry image is hard to read, plus text should be text
[15:09:41] andre__: yeah :)
[15:10:30] Plus it lacks which exact command was called, I think.
[15:10:59] https://pastebin.com/W4fkHf4K
[15:11:28] it seems the bash script has a syntax error
[15:11:30] Gopa: That's a bash/ffmpeg question? :)
[15:11:38] Have you tried replacing <( by < (
[15:11:44] that seems obvious given the error
[15:12:08] Amir1: how to fix this then ?
[15:12:11] wellifyoudontseparatecommandsitshardforcomputerstorealizethatanothercommandisstarting
[15:12:27] Gopa: bynotconcatenatingthingsasIalreadywrote
[15:12:30] andre__: bless you
[15:12:32] I need to see the bash file it tries to run :D
[15:13:03] oh yeah andre__ is right
[15:13:04] yeah.
Hard to say what's wrong in code when nobody can see the code :)
[15:13:14] 'ffmpeg -f concat -safe 0 -i <(for f in ./trimmed/*.webm; do echo "file $PWD/$f"; done) -c copy ./trimmed/output.webm' }
[15:13:21] This seems problematic
[15:13:30] https://github.com/gopavasanth/video-cut-tool-back-end/blob/error/routes/index.js#L99
[15:14:04] Gopa: Have you tried replacing "<(" by "< (" and "done)" by "done )"?
[15:14:06] Amir1: yes that command is the problem, how to correct that?
[15:14:17] andre__: yes I have tried that too :(
[15:14:19] My bash knowledge is as good as a toddler's but "ffmpeg -f concat -safe 0 -i < $(for f in ./trimmed/*.webm; do echo "file $PWD/$f"; done) -c copy ./trimmed/output.webm"
[15:14:28] Gopa, ah, okay
[15:14:47] you need to add a dollar sign I think
[15:15:06] Amir1: Ooh let me try with that :)
[15:15:54] I don't know if "<" is also needed
[15:16:09] try with different variants
[15:16:27] Unfortunately, I can't entertain you with getting oauth to work as I'm off my computer
[15:16:52] Amir1: okay, I'm trying :)
[15:20:02] Does https://meta.wikimedia.org/w/index.php?title=Template:Globalrenamequeue&action=history load for you? It keeps loading and loading for me...
[15:20:22] yup, loads for me.
[15:20:38] hauskatze: loads for me too
[15:21:03] thanks for checking - I seem to be able to get only a plain text version :)
[15:21:05] paladox: +1
[15:21:29] Now it seems to work
[15:25:54] https://pastebin.com/xH8riGiN
[15:25:54] yeah, thanks Amir1
[15:25:54] I think that '$' solves that error; I've now run into multiple other errors. Ahh, I got this new error:
[15:25:54] this concatenation process is starting before the videos are trimmed (an async problem)
[15:25:54] first the video has to be trimmed, generating multiple new trimmed videos, and then they are concatenated into one (if the user requests).
[15:26:11] I will try to solve this :)
[15:26:57] ^_^
[15:33:09] Soooo anyone else?
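[An editorial note on the ffmpeg error above, a sketch under assumptions rather than the fix the channel settled on: `<( ... )` is bash-only process substitution, and node's `child_process.exec` runs commands via `/bin/sh`, where that syntax fails with exactly this kind of syntax error. Writing the concat list to a real file sidesteps the shell feature entirely. The placeholder `.webm` files below stand in for real trimmed clips, and the ffmpeg invocation itself is left commented out.]

```shell
#!/bin/sh
# Portable alternative to bash process substitution: build ffmpeg's concat
# list as an actual file. Paths mirror the pasted command; the .webm files
# here are empty placeholders for illustration only.
mkdir -p trimmed
: > trimmed/clip1.webm
: > trimmed/clip2.webm

list=trimmed/concat-list.txt
: > "$list"
for f in ./trimmed/*.webm; do
  printf "file '%s'\n" "$PWD/$f" >> "$list"
done
cat "$list"

# Real invocation (not run here, since the placeholders are not valid video):
#   ffmpeg -f concat -safe 0 -i "$list" -c copy ./trimmed/output.webm
```

This also avoids the `< $( ... )` variant suggested in the chat, which would splat the file names onto the command line instead of feeding ffmpeg a list.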
:-)
[15:34:00] CFisch_WMDE: I don't understand why my clock within firefox is 1 hour different to my OS time :/
[15:34:17] You have a clock within FF? o.O
[15:34:36] Why would you need that? :-D
[15:34:39] well, for chat things in browser tabs / windows :) that have times in them :P
[15:35:19] Uhhhh, that sounds weird.
[15:36:29] silly technology
[15:43:27] I have followed this https://wikitech.wikimedia.org/wiki/Help:Toolforge/Web#node.js_web_services to host my front end of the application https://tools.wmflabs.org/video-cut-tool-front-end/ but :(
[15:43:27] This is working fine over here though https://video-cut-tool.netlify.com/
[15:44:00] https://tools.wmflabs.org/static/js/bundle.js
[15:44:13] it looks like it's missing your tool name from the prefixes it's using
[15:44:37] yes but this works fine in netlify ?
[15:44:52] And? This isn't netlify
[15:44:52] and in my local system :D
[15:45:07] I think if it didn't have the leading / it'd be ok
[15:45:12] as it'd actually be relative
[15:45:18] rather than trying to use the root
[15:46:04] https://github.com/gopavasanth/video-cut-tool-front-end/
[15:46:04] This is the code repo :)
[15:46:48] any suggestions within the repo ?
[15:47:47] Gopa: What do you mean by "any suggestions within the repo?" Do you have a specific question?
[15:47:49] I know I still wrote the localhost here https://github.com/gopavasanth/video-cut-tool-front-end/blob/master/src/components/home.js#L130 but that doesn't matter for now :)
[15:48:40] andre__: I meant suggestions to fix the error Reedy pointed out in this repo
[15:48:56] Well, where to look in that repo?
[15:49:08] You don't expect someone in this room to now start reading each and every code line in your repo, do you?
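[An editorial aside on the leading-/ asset problem above, an assumption about the build setup rather than something confirmed in the chat: if the front end is a create-react-app build, the usual way to make the bundler emit tool-prefixed asset URLs instead of root-relative `/static/...` ones is the `homepage` field in package.json. The exact tool URL is taken from the chat but its applicability here is a guess:]

```json
{
  "homepage": "https://tools.wmflabs.org/video-cut-tool-front-end"
}
```

After a rebuild, asset paths in the generated index.html would then start with `/video-cut-tool-front-end/static/...` rather than `/static/...`, which is what the Toolforge URL layout needs.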
https://github.com/gopavasanth/video-cut-tool-front-end/blob/master/src/components/home.js
[15:49:39] This is the main file of the front-end code :)
[15:51:35] As I want to host this on Toolforge, and I don't have any idea how to fix that issue; I tried to solve it myself in the past days but couldn't achieve it, so I asked in this group for suggestions :)
[15:52:45] so a question for the IRC meeting: MW has two memcached clients, a pure PHP one and another one that wraps the PECL memcached extension. Does using one of them over the other bring any advantage?
[15:54:24] mszabo-wikia: did git blame help you with anything?
[15:55:10] hi!
[15:55:11] "Implemented a wrapper for the memcached PECL client" - https://github.com/Wikia/mediawiki/commit/3c62077fe28778eb41bde7549fbae402c8be0ef7
[15:55:27] not many details there
[15:55:30] I would have one question
[15:55:56] how can I import articles from Special:LintErrors? https://hu.wikipedia.org/wiki/Speci%C3%A1lis:LintErrors
[15:56:17] I would need them for AWB
[15:56:42] Bencemac: you can query them in quarry
[15:56:52] mszabo-wikia: I would assume (without knowing exactly, though) that the PECL memcached extension performs better and the PHP implementation is just a fallback for users that cannot install PECL extensions on their hosts.
[15:56:54] Do you know how to use quarry?
[15:57:05] not really
[15:57:24] but if you create it for me, I will be able to do it myself
[15:57:43] https://phabricator.wikimedia.org/T177813
[15:57:53] yeah that is a reasonable guess, but I was wondering in case there was some extra context
[15:58:02] Just needs a simple listprovider making
[15:58:29] would be very useful
[15:59:14] I also need to make a long overdue release :)
[15:59:48] new AWB version?
:)
[16:00:37] Bencemac: this is the help for Quarry: https://meta.wikimedia.org/wiki/Research:Quarry
[16:01:42] for writing the proper query, you can take a look at https://github.com/wikimedia/mediawiki-extensions-Linter/blob/master/sql/linter.sql
[16:02:00] you can also just use the API: https://www.mediawiki.org/w/api.php?action=query&list=linterrors&lntcategories=obsolete-tag
[16:02:39] ^
[16:02:53] that and a simple find and replace to strip the bloat might work
[16:05:44] well, it will not be easy
[16:06:50] https://www.mediawiki.org/w/api.php?action=query&list=linterrors&lntcategories=obsolete-tag&format=xml
[16:07:06] title="Extension:SphinxSearch/Windows install"
[16:07:19] You only need to match between the title quotes...
[16:07:40] API + curl + jq makes a super simple client
[16:07:58] curl -s 'https://www.mediawiki.org/w/api.php?format=json&action=query&list=linterrors&lntcategories=obsolete-tag' | jq --raw-output '.query.linterrors[].title'
[16:09:16] dealing with continuation is a little harder but still doable
[16:10:52] tgr: that's what limit=max is for ;)
[16:11:20] and potentially minimising properties if you can
[16:17:12] Hi. Sorry for the late entry. Is the technical advice IRC meeting still in progress?
[16:18:17] kaartic: Technically speaking, no, but please go ahead :-)
[16:18:40] * CFisch_WMDE has to leave now but I'm quite confident someone will help
[16:22:28] Recently we were wondering if it was possible to somehow get a list of the Wikimedia Commons campaigns that are currently in progress?
[16:22:48] Does anybody know about any sources which might be related to this?
[16:28:29] For some background, this is for a feature of the Wikimedia Commons android app which shows campaigns which are currently in progress. As of now, this is implemented by manually maintaining a repository of the list of Wikimedia Commons campaigns.
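[An editorial aside on the continuation remark above: MediaWiki API responses include a `continue` object whose key/value pairs have to be appended to the next request. A sketch of the mechanics, with an illustrative `next_params` helper and a canned sample response; values are assumed not to need extra URL-encoding here, which a real client should not rely on.]

```shell
#!/bin/sh
# Turn a MediaWiki API "continue" object into extra query parameters.
next_params() {
  jq --raw-output '.continue // empty | to_entries | map("&\(.key)=\(.value)") | join("")'
}

# Canned sample response for illustration (shape per the API's JSON output):
sample='{"continue":{"lntfrom":"12345","continue":"-||"},"query":{"linterrors":[]}}'
printf '%s' "$sample" | next_params

# The real loop (not executed here) would wrap tgr's one-liner like:
#   url='https://www.mediawiki.org/w/api.php?format=json&action=query&list=linterrors&lntcategories=obsolete-tag&lntlimit=max'
#   extra=''
#   while :; do
#     page=$(curl -s "${url}${extra}")
#     printf '%s\n' "$page" | jq --raw-output '.query.linterrors[].title'
#     extra=$(printf '%s' "$page" | next_params)
#     [ -n "$extra" ] || break
#   done
```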
[16:29:40] I imagine an API module probably needs writing for the Campaigns extension and/or EventLogging
[16:34:39] what's the campaigns extension? is it something that exists?
[16:35:23] https://www.mediawiki.org/wiki/Extension:Campaigns
[16:37:50] Reedy: wrong campaigns, these are UploadWizard campaigns :|
[16:38:16] https://www.mediawiki.org/wiki/Extension:UploadWizard/Campaigns
[16:38:40] {{disambiguation}}
[16:38:47] https://www.mediawiki.org/wiki/Campaigns !!
[16:38:58] https://commons.wikimedia.org/w/api.php?action=help&modules=query%2Ballcampaigns exists - you'd presumably have to filter it yourself at the moment
[16:39:18] (that UploadWizard stalkword continues to be useful)
[16:39:21] kaartic: There is https://commons.wikimedia.org/w/api.php?action=help&modules=query%2Ballcampaigns
[16:39:30] Reedy: Jinx!
[16:39:40] https://commons.wikimedia.org/w/api.php?action=query&list=allcampaigns&uwcenabledonly=
[16:39:41] Yeah...
[16:39:52] So maybe requesting a non-ALL version of that could work
[16:40:09] I can't imagine that's a small download to keep doing on mobile devices and be nice about it
[16:40:44] 151KB atm
[16:40:49] 151KB just for enabled
[16:41:02] without a continue
[16:41:10] Probably don't need to update it more than once a day, but yeah, not great
[16:41:55] kaartic: File a task to request a list=campaigns rather than list=allcampaigns
[16:43:15] This is a fairly common request for stuff like this
[16:43:19] we have list=pages and list=allpages
[16:43:20] etc
[17:07:22] Reedy: marktraceur: legoktm: Thanks a lot for the information. Will look into it and come back with doubts if any. Thanks a lot.
[17:08:33] Got to go for now. Bye :-)
[17:21:42] [[Tech]]; Þjarkur; /* Do Wikimedia sites have sitemaps? */; https://meta.wikimedia.org/w/index.php?diff=19160552&oldid=19159674&rcid=13728497
[17:25:12] [[Tech]]; Legoktm; /* Do Wikimedia sites have sitemaps?
*/ phab link; https://meta.wikimedia.org/w/index.php?diff=19160557&oldid=19160552&rcid=13728502
[18:59:01] Hi, I have problems with Template:Maplink. Can anyone tell why the first map here renders (Interstate 2) but the second one (Vltava river) does not? Demo is here: https://en.wikipedia.org/wiki/User:Kozuch/Sandbox3
[20:27:32] On several wikiprojects the sites don't stop loading. Some objects are missing. Is there an overload?
[20:39:33] oh, I thought that was my connection
[20:41:02] happens in incognito mode too; revision histories with a custom number of entries per page also take ages to load
[20:41:08] everything else is cached, I guess
[20:41:20] I can't provide a link, that would be cached again
[20:43:20] can't reproduce it anymore. Happened occasionally in the last few hours
[20:44:22] I've had the revision history being loaded in visible, small chunks onto the screen, like 5 edits per chunk
[21:09:48] Der_Keks, ToBeFree: Are you in Europe?
[21:09:55] * ToBeFree nods
[21:10:00] Wuppertal, NRW, Germany
[21:10:11] Der Keks is likely, judging by their name
[21:10:14] ^
[21:10:15] ^^*
[21:10:29] Thanks. Might be https://phabricator.wikimedia.org/T226048 - please no "me too" comments, but see the comments for how to provide info
[21:10:31] They speak German in other places ;)
[21:11:10] Reedy: outside of Europe? Namibia? :)
[21:11:25] * ToBeFree gave a me-too-thumbsup-token
[21:14:42] I wonder if this is related to wikiblame not working...
[21:14:53] andre__, yeah
[21:15:14] ah thanks :)
[21:19:51] AntiComposite: why should it?
[21:20:19] I thought about it more after saying that and decided it was unlikely
[21:21:01] AntiComposite: Where is it reported that Wikiblame is not working? Any link?
[21:21:15] https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#Is_Wikiblame_broken%3F https://de.wikipedia.org/wiki/Benutzer_Diskussion:Flominator/WikiBlame#WIkiblame_down?
[21:22:13] AntiComposite: Ah thanks.
I know I saw a report somewhere but could not find it again... Meh, that's stuff on some external website, hence way harder to analyze
[21:22:42] such stuff should be on toolforge imo :-/
[22:48:47] Some people {citation needed} say Wikipedia is sluggish. Any problems after the hw update?
[22:53:10] hmm, I also experienced that a minute ago, now it's faster.
[23:07:09] jeblad: what hw?
[23:07:50] Slow again.
[23:09:31] "You will be able to read but not to edit Wikimedia Commons for 30 minutes on 19 June at 05:00 (UTC). This is to fix a hardware problem."
[23:10:43] 23:10:29 Thanks. Might be https://phabricator.wikimedia.org/T226048 - please no "me too" comments, but see the comments how to provide info
[23:13:18] Actually, sometimes it does not load at all.
[23:13:56] jeblad: an article page or something like your watchlist?