[07:55:31] [[Tech]]; ArchiverBot; Bot: Archiving 1 thread (older than 30 days) to [[Tech/Archives/2015]].; https://meta.wikimedia.org/w/index.php?diff=14247710&oldid=14230750&rcid=6916392 [11:17:16] Hey, I'm looking for a programmer willing to work on a bot for the Medical Translation Project (we have a coding budget): Quick specs https://meta.wikimedia.org/wiki/User:CFCF/botspecs [15:56:22] Hi [15:56:53] https://commons.wikimedia.org/wiki/Special:Upload says: [15:56:58] "Maximum file size: 1,000 MB" [15:57:38] but uploading anything ~130MB I got: [15:57:39] 413 Request Entity Too Large [15:57:47] ankry: https://commons.wikimedia.org/wiki/Help:Server-side_upload [15:57:48] what am I doing wrong? [15:57:54] https://phabricator.wikimedia.org/T115984 [15:58:59] andre__: I am asking why it says 1000 if "Only 100 MB uploads are supported"? [15:59:10] where is it improperly set? [16:00:59] ankry: the 100MB limit is set at some higher layer (Varnish or something), your request gets rejected before MediaWiki starts processing it [16:01:13] you can upload bigger files using chunked upload only (see the linked Phab task) [16:01:27] ankry: 1000MB is the point where MediaWiki itself will reject your file [16:04:20] hmm, UploadWizard actually knows the real limit, not sure why Special:Upload doesn't [16:07:06] Does special:upload use chunks? [16:07:34] Oh, the "real limit" is 100mb. Got it. [16:08:04] MatmaRex: shouldn't special:upload say the limit is 100MB? [16:08:20] and if so, how do I make it say so? [16:08:32] it should [16:08:37] users are complaining because they are being misled [16:08:46] i don't know, but if you file a task and assign it to me, i'll find out at some point [16:09:31] MatmaRex: OTRS [16:10:06] ankry, https://phabricator.wikimedia.org/maniphest/task/create/?projects=MediaWiki-Uploading [17:02:09] mobrovac: Have time for a few questions about T116147? 
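The chunked upload mentioned above works by sending the file to the API in pieces: each `action=upload` request carries `stash`, `filesize`, `offset` and a `chunk` of bytes, and the server returns a `filekey` that subsequent requests repeat. A minimal sketch of just the chunking arithmetic (the 5 MiB chunk size and the `planChunks` helper are illustrative, not from any real client):

```javascript
// Sketch: split a large file into chunks for MediaWiki's chunked upload
// (action=upload). Each entry describes one request's 'offset' and the
// number of bytes to send in its 'chunk' parameter.
function planChunks(fileSize, chunkSize) {
    const chunks = [];
    for (let offset = 0; offset < fileSize; offset += chunkSize) {
        chunks.push({
            offset: offset,
            length: Math.min(chunkSize, fileSize - offset)
        });
    }
    return chunks;
}

const CHUNK_SIZE = 5 * 1024 * 1024;  // 5 MiB per request (illustrative choice)
const fileSize = 130 * 1024 * 1024;  // the ~130 MB file from the discussion
const plan = planChunks(fileSize, CHUNK_SIZE);
console.log(plan.length);            // 26 requests
```

Each of those requests stays well under the ~100 MB front-end limit, which is why chunked upload gets through while a single Special:Upload POST is rejected with 413 before MediaWiki ever sees it.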
[17:02:52] csteipp: in a meeting, should have 30 mins from now [17:03:04] Cool [17:39:50] Hi all [17:41:34] csteipp: kk, i'm here [17:47:13] csteipp: ah i see you replied on the ticket :) [17:47:27] cheers [17:47:30] mobrovac: Yeah, I think I understand it now [18:28:55] So before anybody says "I told you so", is there a way to reset my Phabricator 2FA without having a committed identity hash? [18:30:58] parent5446: beg admins to disable 2fa via .. a phab ticket ..ehm.. [18:31:08] i think you can mail task@ [18:31:15] to create one without the login [18:31:20] Yeah was gonna say, b/c the Create Task page is forcing me to log in [18:31:50] parent5446: try asking #wikimedia-devtools what to do [18:32:05] there are phabricator admins there [18:32:10] So many channels... But yeah I'll try that [18:32:15] heh, i know, yea [19:43:06] https://www.mediawiki.org/wiki/Special:Watchlist became very slow for me today [21:00:48] Krinkle: is our CI convention for jshint: Put directories to ignore in .jshintignore, and paths to process in Gruntfile.js's jshint.all? It feels weird to have the filespec in two places, but grunt-contrib-jshint still requires a .jshintignore to process files. [21:05:32] spagewmf: No, .jshintignore is optional. It is not required in any way [21:05:54] spagewmf: I've been experimenting lately with making the grunt config match how you would use jshint on the command line [21:06:00] spagewmf: that is to say, set all: '.' in gruntconfig [21:06:08] which will use the current directory recursively [21:06:12] just like "$ jshint ." [21:06:18] and then the ignorefile takes care of the rest [21:06:37] In that case, however, you must have jshintignore to at least exclude node_modules/ [21:06:40] don't you need a .jshintignore to override core's if you keep your extensions in a subdirector? 
[21:06:47] subdirectory* [21:06:55] He didn't specify that this is a MediaWiki extension [21:07:27] Yes, if you run in MediaWiki extension context, and the extension is cloned inside mediawiki core, then you need a jshintignore because otherwise jshint traverses up and discovers mediawiki-core's jshintignore file which excludes extensions/* [21:07:46] Krinkle: legoktm: this is for MediaWiki skins and extensions. Without a .jshintignore, ">> 0 files linted. Please check your ignored files." [21:08:03] however in any other context, or if you clone extensions in the natural structure that matches Gerrit (e.g. mediawiki-core and mediawiki-extensions/Foo as siblings, not children) then it's fine without [21:08:16] spagewmf: Yes, I explained why that 0 happens a few lines up [21:08:21] Does that answer your question? [21:16:08] Krinkle: sort of. Both Gerrit and Diffusion show the "natural structure" of extensions is mediawiki/extensions/NAME, so extensions need a .jshintignore to avoid "WTF" (thanks for your explanation!). I propose .jshintignore contain only a comment "// Block discovering .jshintignore in a parent directory. The file spec for jshint is in Gruntfile.js." That means extensions need to add '!node_modules/**' to Gruntfile.js. [21:16:42] spagewmf: No, avoid that. [21:17:05] spagewmf: Also that is inaccurate. If you choose to use the natural order (most people don't, I do) then you won't need this [21:17:13] since then extensions would be alongside core, not inside [21:17:18] mediawiki/core, mediawiki/extensions/Foo [21:18:29] Also you wouldn't leave it empty, you'd put node_modules inside of jshintignore [21:18:31] not in Grunt [21:18:45] Krinkle: https://gerrit.wikimedia.org/r/#/admin/projects/mediawiki/extensions/BoilerPlate <-- natural order. [21:18:59] Yes. 
[21:19:11] and then /mediawiki/core, not just /mediawiki/ [21:19:11] Not mediawiki/core/extensions/BoilerPlate [21:20:18] For a developer environment, you'll need to set wgExtensionsDirectory accordingly so that wfLoadExtension works correctly (or require_once from there, for non-json extensions) [21:20:27] Anyway, besides the point. [21:25:27] Krinkle: ah, I understand structure. But you say both ".jshintignore is optional" and "you must have .jshintignore to at least exclude node_modules/" which Krinkle is right? :-) [21:26:00] spagewmf: Neither grunt, grunt-jshint nor jshint requires jshintignore in any way, and it works fine (no "0 files") in normal circumstances. [21:26:51] However if the repo in question is a skin or extension and you choose to clone that repo as a subdir of mediawiki-core, then by default it will discover mediawiki's jshintignore, which excludes extensions/ and skins/ [21:27:26] So you need to counteract it by creating a jshintignore file [21:27:39] The mere existence will make it not discover mediawiki's file [21:28:03] however it must exclude node_modules as otherwise you will end up linting all files fetched as npm packages [21:28:12] e.g. grunt, and other upstream packages you use. [21:29:08] Krinkle: Yes, I'm only talking of skins and extensions. Our convention seems to be node_modules in there. But if you want to exclude e.g. vendor or lib/bootstrap, would you add to .jshintignore or to Gruntfile.js jshint.all? [21:29:26] spagewmf: jshintignore [21:29:51] spagewmf: there should be little to no configuration in Gruntfile for software that is not grunt-only, as otherwise you block discovery by other consumers. [21:30:18] E.g. jscs and jshint configuration should not be in Gruntfile as otherwise, running them standalone or via your text editor plugins etc., they can't find the configuration. [21:30:29] so always use native config where possible; the grunt plugin in question should make use of that. 
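Put together, the setup Krinkle describes would look roughly like this (a sketch; the exact file contents are illustrative, not quoted from any repository). The Gruntfile carries only the file selection, while all linter options live in the native `.jshintrc`:

```js
// Gruntfile.js (sketch) — keep only file selection here; linter options
// stay in .jshintrc so editors and standalone CLI runs find them too.
module.exports = function (grunt) {
    grunt.loadNpmTasks('grunt-contrib-jshint');
    grunt.initConfig({
        jshint: {
            options: { jshintrc: true },  // read .jshintrc instead of inline options
            all: '.'                      // recurse from the repo root, like "$ jshint ."
        }
    });
    grunt.registerTask('test', ['jshint']);
};
```

The repo's own `.jshintignore` then lists `node_modules/` (plus anything like `vendor/` or `lib/bootstrap/`). As discussed above, for an extension cloned inside mediawiki-core the file's mere existence also stops jshint from walking up and discovering core's ignore file, which would otherwise exclude extensions/ and skins/.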
[21:30:56] So you configure jshint: { options: { jshintrc: true }, all: '.' } [21:32:03] Krinkle: thanks for the detailed explanation, I shall put it in the docs somewhere :-) [21:40:24] addshore, https://phabricator.wikimedia.org/T9148 is marked in phab for #user-notice for next Monday's Tech/News. But phab/gerrit say the feature is currently disabled by default, and I can't see any way to enable it at beta cluster. Do I need to add it to the draft Tech/News now (this week), and if yes, how should I summarize? (How do editors access it?) [21:42:15] (sidenote, categories at beta cluster seem to have a problem: they're showing the anchor name/ID - e.g. http://en.wikipedia.beta.wmflabs.org/wiki/Category:Felids ) [21:46:50] quiddity: it's disabled, and I didn't get around to enabling it on beta yet (I'll do that today). Next Thursday we'll turn it on on test.wp and go from there. So it probably shouldn't be announced yet [21:47:01] perfect, thanks [21:48:04] (del/undel) 21:47, 22 October 2015 Legoktm (Talk | contribs | block) deleted page MediaWiki:Category header (view/restore) [21:48:05] should fix it [21:56:59] legoktm, was that related to the new watchlisting code? (I.e. should I file a task to check other customized MediaWiki:Category_header pages? There's one at Enwiki: https://en.wikipedia.org/w/index.php?title=MediaWiki:Category_header&action=edit ) [21:57:48] quiddity: no, that was an old change [21:57:57] quiddity: judging by https://en.wikipedia.org/w/index.php?title=MediaWiki:Category_header&diff=645801609&oldid=86207875 , it was in February [21:58:08] and just nobody cared to fix the message on beta [21:58:14] k, ty. :) [22:44:05] Volker_E: I'm on https://gerrit.wikimedia.org/r/#/c/240580, I wish Paladox were on IRC to benefit from the exegesis from Timo [22:44:29] (Note he is on IRC) [22:45:30] Krinkle: where? He's changing that patch set in the midst of our illuminating conversation :-/ [22:45:33] Not right now, I don't know his hours. 
Hashar explained to him how to use IRC. He's in -releng and some other channels [22:45:47] he left a few mins ago [22:45:51] Krinkle: ah thanks [22:51:03] quiddity: legoktm knows all ;) I'm travelling all of tomorrow! :) [23:13:58] hoo: (or another German speaker) could you translate what is being asked for at https://de.wikipedia.org/wiki/Wikipedia:Umfragen/Technische_W%C3%BCnsche_2015/Schwesterprojekte_und_Globales#Mehrsprachige_globale_Benutzerseiten ? [23:15:37] legoktm: the ability to create multilingual global user pages? [23:16:07] yeah, I got that from Google Translate. I'm trying to figure out the specifics of what they want, because I'm fairly sure it's already implemented [23:17:10] legoktm: Showing a localized user page (instead of the generic one) based on the wiki's language, and preferably even based on the user's Babel (like on Wikidata) [23:18:59] hoo: would the current LangSwitch implementation suit their needs, or do they want something more? (e.g. https://meta.wikimedia.org/w/index.php?title=User:Vogone&action=edit) [23:20:16] legoktm: The requester is saying that he doesn't believe that local templates are going to help, but it should be built in [23:21:03] But I guess they're not really aware of the hacks that are possible right now [23:21:17] You could even use langswitch and have a /de subpage and stuff like that [23:21:19] I guess [23:21:19] I think it's the latter bit that is important to the requester -- displaying the user page in the language that is most relevant to the specific reader [23:21:45] Yeah, that as well [23:21:57] we have many multilingual wikis by now [23:22:08] right now if the page has conditional language stuff, it'll display it in whatever your user language is set to [23:22:55] but no Babel-based fallback and stuff, I guess? [23:23:00] * I know, actually [23:23:57] no, it just passes the language to the parser [23:24:29] apparently https://meta.wikimedia.org/wiki/Template:LangSwitch supports fallbacks. 
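The LangSwitch hack being discussed would look roughly like this on a global user page (the wording and language choices are invented for illustration; assuming the Meta template behaves like its Commons counterpart, with per-language parameters and a default):

```wikitext
{{LangSwitch
 | en = Hi! I mostly work on medical articles.
 | de = Hallo! Ich arbeite vor allem an medizinischen Artikeln.
 | fr = Bonjour ! Je travaille surtout sur des articles médicaux.
 | default = Hi! I mostly work on medical articles.
}}
```

When rendered, the template picks the variant matching the reader's interface language and otherwise drops to |default= — the piece the requester wants built in is choosing that language from the reader's Babel information rather than only from the interface setting.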
[23:24:53] I guess they want to e.g. show a French version as a fallback on ocwiki [23:24:56] and stuff like that [23:25:12] not sure oc actually falls back to fr, but oc speakers will know French [23:25:29] (which can be found in their Babel) [23:25:44] right... [23:25:56] this'll be fun to implement :) [23:26:20] legoktm: While you're at it, you can also fix https://phabricator.wikimedia.org/T90398 [23:26:22] * hoo hides fast [23:26:45] oh hmm [23:26:50] I wanted to fix that differently [23:26:58] and make Babel not rely on categories. [23:27:04] that would also work [23:27:19] but would break Babel working with the ancient template things [23:27:25] which I think are still around on many wikis [23:27:29] in one way or another [23:27:46] meta is fully transitioned, right? [23:28:08] I was thinking it would still add categories, but we also update a database table, and expose a proper API for it [23:28:13] that Wikibase could use [23:28:50] meta is, I think [23:29:06] That sounds good [23:29:18] having Wikibase not bind against Babel directly would be nice [23:47:50] I'm having some trouble with a skin installation, can anyone spare a minute to look? [23:48:01] MediaWiki skin installation, that is.