[01:30:27] can there be an effort to translate Cantonese?
[01:30:33] it tends to fail to be translated
[01:33:27] Interface language?
[01:33:41] Find people to do it and point them at translatewiki
[01:34:55] it's a lot of languages
[01:35:18] I didn't ask that
[01:35:19] I wish to reach someone who handles such stuff
[01:35:28] who can handle this for me
[01:35:30] Your best bet is Siebrand
[01:35:32] It's done by volunteers
[01:35:37] You can't make people translate stuff
[01:35:47] can you actually tell me which languages Cantonese does not have a translation for?
[01:36:00] Sorry, what?
[01:36:54] zh-yue.
[01:36:58] http://translatewiki.net/w/i.php?title=Special:Translate&group=ext-0-all&language=yue
[01:36:59] 20%
[01:37:36] 83% for MW core
[01:38:33] ah, that's not what I was looking for
[01:38:50] http://commons.wikimedia.org/wiki/Template:Assessments/translate/mt
[01:38:55] check the last template
[01:39:11] Ċiniż language then Cantonese language
[01:39:24] Ċiniż is "Chinese" in the mt language
[01:39:45] no yue equivalent for mt
[01:40:10] So fix it?
[01:40:11] Add one
[01:40:15] You know how wikis work
[01:57:26] Reedy, the thing is I am trying to figure out what isn't working
[01:58:29] ..?
[01:59:14] I believe the code is {{#language:Catonese|xx}}
[01:59:21] *I believe the code is {{#language:Cantonese|xx}}
[01:59:47] what am I trying to get translated on mediawiki?
[01:59:59] it's probably not the regular interface
[02:01:17] https://www.mediawiki.org/wiki/Help:Magic_words#Miscellaneous
[02:01:24] it's a parser function
[02:01:39] yes
[02:01:48] but what needs to be translated here?
[02:03:14] $lang = Language::fetchLanguageName( $code, $inLanguage );
[02:04:37] ok...
[02:04:53] so $inLanguage holds the complete list?
[02:05:20] cldr
[02:05:20] $wgHooks['LanguageGetTranslatedLanguageNames'][] = 'LanguageNames::coreHook';
[02:05:46] It's an upstream thing
[02:05:46] http://cldr.unicode.org/
[02:06:20] If you look at CldrNamesMt, there is no 'yue' entry
[02:06:25] which is the problem here
[02:06:57] There's apparently also no Yue file, so you can't get any translations as they would be in Cantonese
[02:07:19] http://cldr.unicode.org/index/bug-reports
[02:07:37] Stuff could be added to the cldr extension temporarily
[02:07:45] but it needs to go upstream; we shouldn't have to maintain it
[02:13:48] ah, I see
[02:14:03] there are several ways to say Cantonese though
[02:15:00] seems like they use zh-yue instead of yue
[02:16:10] or not
[02:16:14] I am confused by this site
[02:54:39] ***MEMORY-ERROR***: rsvg-convert[9385]: GSlice: failed to allocate 496 bytes (alignment: 512): Cannot allocate memory
[03:05:26] ToAruShiroiNeko: make your SVGs smaller
[03:06:29] not mine
[03:06:50] it's 23.85 MB in size
[13:07:50] New patchset: Hashar; " now supports 'failonerror' passed to " [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14477
[13:07:51] New patchset: Hashar; "ant: supports options and failonerror" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14478
[13:07:51] New patchset: Hashar; "ant: redo copy-project to use --ref cloning" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14479
[13:08:32] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14479
[13:08:33] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14478
[13:08:34] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14477
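
The "added to the cldr extension temporarily" workaround from the 02:07 exchange would go through the same hook the cldr extension registers above. A minimal LocalSettings.php sketch, assuming a wiki that wants a stopgap until CLDR itself gains the entry; the Maltese name used here is a hypothetical placeholder, not a verified translation:

    // Hypothetical stopgap for the missing CldrNamesMt entry: supply a Maltese
    // ('mt') display name for Cantonese ('yue') until it exists upstream in CLDR.
    $wgHooks['LanguageGetTranslatedLanguageNames'][] = function ( &$names, $code ) {
        if ( $code === 'mt' && !isset( $names['yue'] ) ) {
            $names['yue'] = 'Kantoniż'; // placeholder; needs a real mt translation
        }
        return true;
    };

If that works, {{#language:yue|mt}} should then render the supplied name; either way the entry still needs to be filed upstream via the CLDR bug tracker linked above.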
[13:26:50] 13:26:06 [exec] error: RPC failed; result=22, HTTP code = 502
[13:26:51] ...
[14:58:42] New patchset: Hashar; "ant: target to easily apply a change" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14486
[14:58:43] New patchset: Hashar; "ant: experimental Wikidata build target" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14487
[14:59:15] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14487
[14:59:16] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14486
[15:21:10] New patchset: Hashar; "ant: apply change macro get refspec/branch params" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14489
[15:21:10] New patchset: Hashar; "ant: Wikidata build did not correctly apply change" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14490
[15:21:30] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14489
[15:21:43] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14490
[15:57:23] New patchset: Hashar; "ant: make sure we import a clean & uptodate Gerrit project" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14496
[15:58:31] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14496
[16:06:20] hmm
[16:06:33] almost able to build and test out Wikidata using Jenkins!
[16:10:22] hashar: what do you think about deploying to beta labs (including some extensions) with Jenkins?
[16:10:47] hey chris :)
[16:11:15] chrismcmahon: I have no idea :( still a goal I want to achieve though
[16:11:16] hey Antoine :) nice work on Wikidata and Jenkins
[16:11:24] would need a funny script to handle that
[16:11:39] the Wikidata one I have been hacking on for most of the week :/
[16:11:55] hashar: I was reading about deploying PHP to Ubuntu with Jenkins yesterday; there are some good articles via Google
[16:12:58] chrismcmahon: I did write some docs at https://labsconsole.wikimedia.org/wiki/Deployment/Overview
[16:13:22] still need to write some maintenance docs, such as how to update extensions, how to update core, update the db ...
[16:15:41] chrismcmahon: I did a demo of Jenkins yesterday at my local hacker group
[16:15:51] (well, more a group where we troll and drink beer, but still)
[16:15:56] ;0
[16:16:14] some expressed interest and did not even know about that software
[16:16:30] most probably because it is a Java-world tool and the group does not do Java :-D
[16:17:44] hashar: it's way beyond Java; it's pretty much the industry-standard CI tool now, much more popular than CruiseControl these days I think
[16:19:08] probably
[16:21:27] anyway, it would be great to see MediaWiki and the AFT/E2 projects deployed to beta labs from Jenkins. from your doc, it sounds like that might be close.
[16:21:28] <^demon> chrismcmahon: Indeed, we played with CruiseControl before using Jenkins.
[16:21:56] <^demon> CruiseControl was a PITA to configure and didn't give nearly as useful output
[16:23:32] yeah, I don't think ThoughtWorks wants to put in the effort to really have CC compete with Jenkins. CC broke the ground, but Jenkins is pretty much the future from what I can tell.
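
The "apply a change" ant targets above wrap what is, underneath, ordinary Gerrit plumbing: fetch the change's ref and check it out, ideally cloning against a local reference repository (the --ref cloning from r/14479) so Jenkins does not re-download every object. A rough shell sketch; the mirror path, change number, and patchset are illustrative, not taken from the actual build scripts:

    # Clone against a local mirror so most objects come from disk, not the network
    git clone --reference /srv/git/integration-jenkins.git \
        https://gerrit.wikimedia.org/r/p/integration/jenkins workspace
    cd workspace
    # Gerrit publishes patchsets under refs/changes/<last 2 digits>/<change>/<patchset>
    git fetch origin refs/changes/86/14486/1
    git checkout FETCH_HEAD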
[16:25:12] New patchset: Hashar; "(bug 37050) enable Jenkins for Wikidata" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14500
[16:25:41] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/14500
[16:26:36] enough for this afternoon, will be back later this evening
[20:45:22] how big is ArticleFeedbackv5 for you?
[20:48:56] hmm, CategoryTree is 6.5M locally but 1.5M after a new clone
[20:49:04] and 7.4M after a repack!
[20:50:18] <^demon> How did you repack it?
[20:50:28] git repack -A -F
[20:50:54] and then again with --window=20, hoping it would improve
[20:51:04] <^demon> No -d?
[20:51:13] <^demon> -d removes redundant packs.
[20:52:15] I just looked up the incantation :P
[20:52:31] <^demon> My usual repack is `git repack -a -d -f --depth=250 --window=250`
[20:52:41] <^demon> Which is roughly what you'd get with `git gc --aggressive`
[20:53:04] how big is your ArticleFeedbackv5 repo?
[20:53:16] <^demon> I don't think I have it cloned, one sec.
[20:54:03] <^demon> 6.2M on a fresh clone.
[20:54:17] 75M here :P
[20:54:56] before -> 75M
[20:54:58] git repack -A -F -> 78M
[20:55:05] git repack -A -F -d -> 6.4M
[20:55:16] git repack -a -d -f --depth=250 --window=250 -> 5. M
[20:55:23] *5.0M
[20:55:28] <^demon> Fresh clone -> 6.2M.
[20:55:38] <^demon> git repack -A -F -> 5.9M
[20:56:08] <^demon> git repack -A -F -d -> 6.0M
[20:56:25] I think I'll start repacking repos
[20:57:09] <^demon> Generally git is pretty good about garbage collection, but usually doing some repacking helps.
[20:57:16] one wonders why that should have to be done manually...
[20:57:17] <^demon> Core especially has the habit of getting absurdly large.
[20:58:16] well, in this case it shouldn't need to garbage-collect
[20:58:23] I don't think I have committed anything there
[20:58:23] <^demon> git repack -a -d -f --depth=250 --window=250 -> 4.8M
[20:58:24] <^demon> :)
[20:58:47] different git version?
[20:59:09] <^demon> git version 1.7.7.5 (Apple Git-26)
[20:59:18] git version 1.7.11.1
[20:59:54] maybe the blobs were considered in a different order...
[20:59:58] <^demon> I wonder if delta compression is OS/platform-specific in some manner.
[21:00:12] I don't see how it would be
[21:00:24] also, isn't that little-endian, too?
[21:00:52] well, maybe Linus put in some arch-specific optimizations
[21:02:33] <^demon> yeah, OS X is little-endian. I honestly don't know, really.
[21:03:02] how many cores?
[21:03:15] <^demon> Dual.
[21:03:28] so no difference there, either
[21:05:58] 6.2M ArticleFeedbackv5-cloned
[21:05:58] 4.8M ArticleFeedbackv5-cloned-repacked
[21:06:12] so it was how it considered the blobs
[21:17:28] AbuseFilter: 73.1M
[21:17:47] sending your spell
[21:18:08] <^demon> I have that aliased as `git repack-everything` :)
[21:18:38] then I'll probably copy your alias name :)
[21:18:50] Translate: 76M
[21:19:19] <^demon> I really don't understand it sometimes :p
[21:19:47] what's "it"?
[21:20:19] <^demon> git ;-)
[21:24:06] heh :)
[21:24:32] it wasn't a good idea to start both repack-everything runs at the same time
[21:24:45] it started swapping :P
[21:25:18] not really strange if one of them uses 2GB of memory...
[21:26:55] why isn't the total of "Compressing objects" the same as the output of counting objects?
[21:30:35] AbuseFilter went down to 7.4M
[21:30:39] almost 10%
[21:31:43] <^demon> Maybe not all objects could be compressed?
[21:31:45] <^demon> Dunno
[21:35:51] Translate 76M -> 15M
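
For reference, the `repack-everything` alias ^demon mentions can be recreated in one line; the flag set is taken verbatim from the incantation quoted above, and only the alias name itself comes from the log:

    # One-time setup; afterwards `git repack-everything` works in any repo.
    # -a repacks everything into a single pack, -d drops the now-redundant old
    # packs, -f recomputes deltas from scratch, and the large --depth/--window
    # values trade CPU time and memory for a smaller pack.
    git config --global alias.repack-everything \
        'repack -a -d -f --depth=250 --window=250'

The swapping complained about at 21:24 is the expected cost of that trade: two concurrent runs with --window=250 can easily exhaust RAM.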
[21:36:00] ^demon?
[21:36:24] 73.1M -> 7.4M is very good compression
[21:36:58] WebFonts is also big, but it contains big objects
[23:08:24] jorm, are you around by any chance?
[23:08:47] could you tell me the number of edits Commons can take every minute?
[23:09:13] I am.
[23:09:23] yay
[23:09:37] no; I suggest you ask... Tim Starling.
[23:09:45] oh, ok
[23:09:49] disregard the PM then
[23:09:57] since he had a thing about this recently, iirc.
[23:10:01] or disabled someone's bot.
[23:10:06] yes
[23:10:23] performance statistics are not my forte.
[23:10:39] Roan might know as well. also Aaron Schulz.
[23:10:42] maybe Reedy.
[23:32:05] UploadWizard was 64M -> 6.7M
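
The size figures traded throughout the evening (75M -> 5.0M, 64M -> 6.7M, and so on) appear to be plain on-disk footprints. A sketch of how such before/after numbers can be reproduced, assuming a POSIX shell and using AbuseFilter as the illustrative repo name:

    du -sh AbuseFilter        # whole clone; the .git object store dominates the total
    cd AbuseFilter
    git count-objects -v      # loose/packed object counts; size-pack is reported in KiB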