[00:02:06] hey! :p [00:05:17] werdna: https://gerrit.wikimedia.org/r/17366 [00:06:04] ah yes I forgot [00:06:20] heh [00:06:28] among everything else [00:06:29] man [00:06:37] Reedy: btw, you're in Hull, right? [00:06:45] Not quite [00:06:48] 30 miles or so west [00:07:05] that being Kingston upon Hull? [00:07:16] Nope [00:07:29] Kingston upon Hull is the proper name [00:07:35] I'm almost in another county [00:07:36] right [00:07:43] Why, are you going to Hull? :p [00:07:44] yeah I'm just figuring out vaguely where you are [00:08:34] 30 miles west? that's pretty much leeds, right? [00:08:43] Leeds is 60 odd [00:08:48] well, edinburgh [00:08:54] and London [00:09:14] My front door leads to the english channel [00:09:49] * Danny_B|backup is sad :-( that extension does not work properly :-((( [00:10:23] What's wrong with it? [00:11:17] the first item is not returned as list but as wikitext [00:11:21] so there is [00:11:28] # subpage1 [00:11:32] 1. subpage2 [00:11:57] lol [00:11:58] awesome [00:12:18] Link? [00:12:27] let me try with #tag [00:12:34] Reedy: i got it in preview [00:13:28] #tag also did not work [00:14:31] Reedy: https://cs.wiktionary.org/wiki/Wikislovn%C3%ADk:%C5%A0ablony put on top of page in preview [00:17:21] * Danny_B|backup facepalms... wanted to start to develop the system and can't. and obviously won't be able for some time since the extension needs to be fixed... :-/ [00:19:20] mmmm [00:19:25] deprecatedness [00:19:49] is it anything easily fixable or not? [00:26:12] hmm, i don't see any obvious issue in its source [00:27:32] it looks like the first # isn't being parsed [00:28:23] same with unordered [00:28:29] bar doesn't put the first . seemingly [00:28:56] though, that looks intentional if show parent.. [00:28:56] try showparent=yes [00:29:06] it gives way so weird output [00:29:12] it doubles the nesting [00:29:37] of lists [00:30:03] ## subpage1 [00:30:10] 1. 1. subpage2 [00:33:25] Now, have I any subpages locally..
[00:35:16] hmm, it will never show parent obviously [00:35:32] string(268) "# [[Wikia code/api.php|api.php]] # [[Wikia code/includes|includes]] # [[Wikia code/index.php|index.php]] # [[Wikia code/languages|languages]] # [[Wikia code/maintenance|maintenance]] # [[Wikia code/opensearch desc.php|opensearch desc.php]] # [[Wikia code/skins|skins]]" [00:35:38] the output of makelist looks vaguely right [00:35:39] because it does $list = array(); on 461 [00:35:50] Sans newlines [00:36:22] after the parent has been set on 456 [00:36:42] if( count( $list ) > 0 ) { [00:36:42] $retval = implode( "\n", $list ); [00:36:56] RoanKattouw: it's just browser display [00:37:00] source is "right" [00:37:03] Oh OK [00:37:08] I had to check also [00:37:09] Yeah that should just work [00:37:14] I saw the implode and was like wtf [00:37:26] so it's how it's parsed after makelist [00:37:45] obviously it is not used on wikiversities since nobody complained ;-)) [00:41:17] http://p.defau.lt/?iR6RxI0oEUU6gZN8RZRPvQ [00:41:39] again, parsed stuff looks correct [00:44:04] Ah [00:44:08] That's very interesting [00:45:47] If I prefix the output with a \n it works fine [00:47:29] Danny_B|backup: try it now [00:48:47] yeah, better. [00:49:07] however that error i described above still needs to be fixed but that should be simple i guess [00:49:15] sure, 1 thing at a time [00:49:23] (not showing parent because of overwriting the array) [00:49:27] I'm not sure if this fix is so valid, feels a bit hacky [00:50:44] Nemo_bis: I forgot, how do you mark messages as optional to translate? [00:55:08] TimStarling: So it seems like in its current format, what subpagelist3 outputs doesn't get correctly parsed, leaving the first item in a list unparsed, but then following items are parsed fine. Prefixing a \n to what's outputted by the function "fixes" the problem, in that all of the items get parsed correctly. [00:55:32] It however feels a bit hacky. Any ideas if there'd be another way to fix it?
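For illustration, the failure mode Reedy describes can be modeled in a few lines: a doBlockLevels-style pass only treats `#` as a list marker at the start of a line, so extension output spliced into the middle of a line loses its first item, and prefixing a `\n` restores it. This is a toy sketch with hypothetical function names, not MediaWiki's real parser code:

```javascript
// Toy model of MediaWiki's block-level pass: "#" only counts as a list
// marker when it appears at the START of a line. (Hypothetical and
// heavily simplified; the real logic lives in Parser::doBlockLevels().)
function renderLists(text) {
  return text
    .split("\n")
    .map(line =>
      line.startsWith("#") ? "<li>" + line.slice(1).trim() + "</li>" : line
    )
    .join("\n");
}

// Tag output spliced in mid-line: the first "#" is not at a line
// start, so it stays literal while the rest become list items.
const spliced = "intro: " + "# subpage1\n# subpage2";

// The workaround from the discussion: prefix a "\n" so the first item
// also lands at the start of a line.
const fixed = "intro: " + "\n# subpage1\n# subpage2";
```

Running `renderLists` over `spliced` leaves `# subpage1` as plain text while `subpage2` becomes a list item, exactly the first-item symptom from the log; over `fixed`, both items parse.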
[00:55:45] Reedy: Is it parsing with $wgParser? [00:55:52] Is it using recursiveTagParse()? [00:56:37] It is using that, but not $wgParser, it uses ParserFirstCallInit [00:57:02] and then passes $parser from the resultant call of its hook subscription into the constructor and uses that [00:57:16] first "that" being recursiveTagParse [00:58:02] we use a \n prefix for the bug 529 hack [00:58:12] !b 529 [00:58:12] https://bugzilla.wikimedia.org/show_bug.cgi?id=529 [00:58:23] kaldari: it's in translatewiki repo, lemme a sec [00:58:46] search Parser.php for "Bug 529" to see what I mean [00:59:53] recursiveTagParse() doesn't actually run doBlockLevels(), that's actually done after unstrip [00:59:54] kaldari: should be documented here https://www.mediawiki.org/wiki/Help:Extension:Translate/Group_configuration#TAGS [01:00:29] that's why there's no $linestart parameter to recursiveTagParse(), only to parse() [01:00:31] In the translatewiki config then? [01:00:50] kaldari: ^ [01:01:00] kaldari: like https://gerrit.wikimedia.org/r/#/c/17188/ [01:01:08] thanks! [01:01:17] the implication is that after unstrip(), the resulting text has to have a line break pattern that doBlockLevels() will understand, regardless of how hacky that is [01:02:07] I have some vague plans for how to fix the interaction between unstrip() and doBlockLevels() in general, there are other bugs, not just this [01:02:51] but that would be a larger project [01:03:34] Aha, on that ever growing "todo later" list [01:03:43] :o) [01:04:00] So adding a newline as I am is the workaround?
[01:04:19] I'm sure some of the wiser devs store their "todo later" list in an append-only data store [01:05:36] yes, I think it's a reasonable workaround [01:05:50] /dev/null would probably be easier [01:06:12] to fix it properly, the plan would be to integrate doBlockLevels() with unstrip() [01:06:29] recursiveTagParse() would call doBlockLevels() [01:06:47] then doBlockLevels() would expand any strip markers it finds as it goes, and avoid double-parsing the contents [01:07:26] also, doBlockLevels() would need to check the contents of the strip markers for block-level elements, so that it knows whether <p> elements need to be broken up [01:07:47] e.g. <p>foo<pre>…</pre></p> is invalid [01:07:50] * Danny_B|backup hides from wizards, black magic is being discussed here... [01:08:09] so yeah, you can do it either way [01:08:19] add a linebreak or rewrite doBlockLevels [01:09:38] Rewriting doBlockLevels doesn't seem the best idea at after 2am... [01:11:49] Reedy: gonna fix that parent or do you want me to file a bug for it? [01:12:03] * Danny_B|backup fortunately doesn't need parent for their purposes [01:12:21] Probably not tonight.. I was just looking at it [01:12:43] imo just move the $list=array() upwards [01:13:02] haha [01:13:05] I think you're right [01:13:35] 461 to 448.5 [01:14:07] Yup [01:14:14] That at least gives you the parent title [01:15:20] ah, i reloaded, so 462->449.5 [01:17:21] awesome! [01:17:32] If someone reviews and submits them, I'll remove the live hack from WMF and push a newer version ;) [01:17:33] * Danny_B|backup puts Reedy on his candy bribelist [01:17:39] Pffft [01:17:43] I wasn't already!? :p [01:17:57] let me think... :-P [01:18:03] Wait, I'm having trouble imagining how anyone would bribe Reedy with candy [01:18:08] Given that Reedy is our candy dealer [01:18:25] not *my* candies [01:18:44] He doesn't think in pieces of candy or grams of candy, he thinks in suitcases of candy [01:19:05] RoanKattouw & Krinkle are the only devs who had a chance to taste my candies [01:19:13] True, it's not Czech candy [01:19:42] well, and whoever was to wikimedia conference this april [01:19:52] because i brought like 200 candybars there [01:20:29] but different than those for roan and timo [01:20:54] Reedy: if you'll be to glamcamp and me as well, i'll bring some [01:21:15] and many thanks!!!!! [02:18:44] Reedy: works now as expected for the beginning, many thanks on behalf of cswikt [03:05:17] Wait... can a chocolate trade be anything *but* black market?
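The showparent bug pinned down above (the parent entry being wiped because `$list = array()` runs after the parent is appended) is easy to model. This is a hypothetical JavaScript reconstruction of the control flow, not the extension's actual PHP:

```javascript
// Buggy ordering, as on the extension's original line ~461: the array
// is (re)initialized AFTER the parent entry was pushed, so the parent
// is silently discarded.
function makeListBuggy(parent, subpages) {
  let list = [];
  list.push("# [[" + parent + "]]"); // parent added first...
  list = [];                         // ...then clobbered by the late init
  for (const p of subpages) {
    list.push("# [[" + parent + "/" + p + "|" + p + "]]");
  }
  return list.join("\n");
}

// The fix agreed on in the log: move the initialization above the
// point where the parent is appended ("461 to 448.5").
function makeListFixed(parent, subpages) {
  let list = [];                     // init first
  list.push("# [[" + parent + "]]");
  for (const p of subpages) {
    list.push("# [[" + parent + "/" + p + "|" + p + "]]");
  }
  return list.join("\n");
}
```

With the buggy ordering the parent line never survives into the output; with the init moved up, the list starts with the parent title, which matches Reedy's "That at least gives you the parent title".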
[11:36:43] Hi everybody, I wanted to ask if there is an automatized way to download all djvu files from commons [11:38:21] or at least a way to obtain all the djvu file names from Commons [12:33:56] CristianCantoro: You can query the database. [12:34:03] Or have someone do it for you. [12:35:36] Brooke: thy, I would like to know how to do so... I know some basic SQL so I should be able to do that myself but I don't know where to start [12:35:52] Do you have a Toolserver account? [12:40:10] You'd think media would be categorized by file type. [12:40:38] > select img_name from image where img_name like '%.djvu'; [12:40:45] Slow query is slow. [12:44:34] Brooke: no, I don't have a toolserver account :) [12:44:49] (anyway, speed is not a problem for now) [12:44:53] In a sane world, you'd be able to query the API for this. [12:44:57] But I'm not sure there's any easy way. [12:45:08] You'd think all the DJVU files would be categorized as such. [12:45:21] 22,508 files on Commons end in ".djvu". [12:46:32] http://p.defau.lt/?CBx_9_8ibLvHRBKZOY0duA [12:46:39] Anyway, you don't really want this channel. [12:46:53] You want #wikimedia-tech or #wikimedia-toolserver or even #mediawiki or #wikimedia-commons. [12:47:07] I was actually thinking that I was missing some API call that would do that... 
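One API route that comes close to what is being asked for: `list=allimages` can filter by MIME type via `aimime`, which catches DjVu files regardless of how they are categorized. A sketch of building such a request follows; the parameter names are taken from the MediaWiki action API, and continuation handling plus the actual HTTP fetch are omitted:

```javascript
// Build an allimages query URL filtered by MIME type. The helper name
// is hypothetical; "aimime" and "ailimit" are MediaWiki API parameters
// for list=allimages. DjVu files use the MIME type image/vnd.djvu.
function allImagesQuery(apiBase, mime) {
  const params = new URLSearchParams({
    action: "query",
    list: "allimages",
    aimime: mime,
    ailimit: "500",
    format: "json",
  });
  return apiBase + "?" + params.toString();
}

const url = allImagesQuery(
  "https://commons.wikimedia.org/w/api.php",
  "image/vnd.djvu"
);
```

Fetching that URL (and following the API's continuation parameter in a loop) would yield the full list of DjVu file names without a Toolserver account.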
[12:47:18] Brooke: thy [13:54:12] New patchset: Hashar; "ant: git-archive macro" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17416 [13:54:12] New patchset: Hashar; "Job to generate MediaWiki core nightly archive" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17417 [13:55:19] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17417 [13:55:19] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17416 [14:05:33] New patchset: Hashar; "mw nighly: build only once at 3am" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17419 [14:05:51] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17419 [15:31:54] To ssh://krenair@gerrit.wikimedia.org:29418/mediawiki/extensions/WikidataRepo.git [15:31:55] ! [remote rejected] HEAD -> refs/publish/master/bug/37982 (can not upload a change to this reference) [15:32:22] Shouldn't I be able to do that..? [15:50:26] hexmode: hi :) [15:50:45] hexmode: I have created a Jenkins job to generate the mediawiki nightly : https://integration.mediawiki.org/nightly/ [15:50:50] hashar: :) [15:50:54] hexmode: should be generated at 3am UTC every day :) [15:51:01] hashar: you are awesome [15:51:06] where are they hosted? [15:51:08] danke :) [15:51:11] on gallium [15:51:15] the contint (continuous integration) server [15:51:37] hashar: so, what url can I use to fetch them? [15:51:49] and does it change if you move them to swift? [15:52:46] hexmode: how swift is related ?
[15:52:57] Jenkins fetches the latest master, then uses git-archive to build a zip [15:53:16] and copies it locally to /srv/org/mediawiki/integration/nightly/ [15:53:23] which is made public by Apache on https://integration.mediawiki.org/nightly/ [15:53:31] hexmode: the nightlies are at https://integration.mediawiki.org/nightly/mediawiki/core/ [15:53:31] hashar: see robla's follow-up on the list [15:53:37] I am going to reply to your mail on wikitech-l [15:54:03] hashar: could you have it run the "make-release --snapshot" [15:54:23] that will bundle some extensions, too [15:56:11] hexmode: yeah that would be possible to handle :-D [15:56:12] for later [15:57:47] hashar: I'm willing to try if you just point me at the code [15:58:34] hashar: In fact, it might be a way for me to begin to grok the build process more [16:00:07] hexmode: I am using the ant script you wrote a long time ago [16:00:12] git archive macro: https://gerrit.wikimedia.org/r/#/c/17416/ [16:00:22] the whole nightly build job https://gerrit.wikimedia.org/r/#/c/17417/ [16:00:24] heh [16:00:31] the code https://gerrit.wikimedia.org/r/#/c/17417/1/jobs/_shared/build.xml,unified [16:00:41] note that I am on vacation for 3 weeks starting tomorrow evening [16:00:55] so will be unlikely to review anything :D [16:01:21] hashar: we should probably catch up about labs then today :p [16:01:35] oh yeah completely forgot about that :( [16:02:03] hexmode: we would need to find out how to pass a list of extensions (probably a flat file), then clone those extensions and bundle them in a mediawiki-bundle zip [16:02:10] hexmode: but that would be for later :-) [16:02:37] hexmode: the ant target can be tested using something like ant nightly-mediawiki-core -Dgit.shared.dir=/path/to/git/repos -Dnightly.dir=/tmp/nightly [16:02:58] hexmode: where /path/to/git/repos holds snapshots of the mediawiki/core.git repo in mediawiki/core [16:02:59] hashar: well, it sounds like I probably have some latent familiarity with it right now, so if I
get a chance, I'll look. [16:03:33] * hexmode copies hashar's comments to his todo list [16:49:17] Reedy: I have been reviewing the doc on https://labsconsole.wikimedia.org/wiki/Deployment/Overview would drop an email later tonight [16:52:14] out for now [17:19:51] So is there anywhere that usercreate.php is documented? [17:20:11] svn.wikimedia.org/doc ?? [17:20:58] No dice [17:21:03] Moment [17:21:29] I mean, I can get the source there. But I was hoping to not have to read php ;-) [17:21:32] well, actually about 5 minutes for this netbook to process that page. [17:21:37] Ugh sorry [17:21:56] [17:23:39] where, in the directory structure, is that file located StevenW? [17:24:34] includes > templates ? [17:26:09] got it. [17:26:19] Trying to find the class QuickTemplate now. [17:31:26] Reedy: where is the class documentation? thought it used to be at the svn docs? [17:32:45] Amgine_: I broke it [17:32:53] will be fixed next time the doc is regenerated [17:33:05] Ah. StevenW: come back tomorrow. [17:35:17] No worries. Thanks for trying to help Amgine. :) [17:35:31] yw. [17:37:47] [17:54:34] !class QuickTemplate [17:54:35] See http://svn.wikimedia.org/doc/classQuickTemplate.html [17:54:42] Amgine_: ^ [17:55:19] [17:56:17] StevenW: wm-bot saves the bacon per Reedy: ^ [17:57:04] http://svn.wikimedia.org/doc/annotated.html <- all the classes. [20:06:02] <^demon> Ryan_Lane: Would you mind taking care of https://gerrit.wikimedia.org/r/#/c/16841/ ? [20:06:13] <^demon> Whoops, dunno why I asked here. [20:32:45] yeah for Echo!! moooarrr notifications! [20:33:49] hashar: Getting high off low-friction social interaction?
:P [20:34:03] I am afraid I can't parse your english :-D [20:34:43] I am not entirely sure what "getting high" mean, probably being happy or something [20:34:53] and low-friction is totally unknown to m [20:34:54] e [20:34:58] marktraceur: sorry =) [20:35:27] hashar: Getting high usually involves drugs, so the implication is that the indirect object is a drug of some kind [20:35:52] Low-friction means "without much difficulty", so in this case, it means you don't need to refresh the page or anything nasty like that [20:36:21] ^demon: Ryan_Lane: have you made a change to Gerrit gitweb ? Somehow all the text has the same color on https://gerrit.wikimedia.org/r/gitweb?p=integration/jenkins.git;a=tree [20:36:35] no [20:36:37] ^demon: Ryan_Lane I am pretty sure dirs had a different style [20:36:50] well, I didn't [20:36:54] <^demon> It uses the gerrit's css. [20:36:55] maybe I have dreamed about it so :) [20:36:55] <^demon> haha [20:37:08] <^demon> So the new css made gitweb look bad [20:37:17] gitlist [20:37:18] <^demon> Even more reason to ditch this stupid crap. [20:37:20] gitlist [20:37:21] gitlist [20:37:22] :) [20:37:27] linus way: http://git.kernel.org/?p=linux/kernel/git/torvalds/linux-2.6.git;a=tree;h=fc6bdb59a501740b28ed3b616641a22c8dc5dd31;hb=fc6bdb59a501740b28ed3b616641a22c8dc5dd31 [20:37:34] dirs are links with underline [20:37:39] ours are similar [20:37:42] * Ryan_Lane nods [20:38:05] also that might be just me but I find the contrast to be very low on the new change list [20:38:15] the text is not dark enough or the background is too dark [20:38:21] but at least there is no more greeny stuff :-D [20:38:25] <^demon> Ryan_Lane: So, I'm thinking of just "forking" it to our gerrit install so we can get all the latest fixes (0.2 release sucks), but still have the vendor/ shit included and not have to install composer and shit. 
[20:38:43] sounds fine to me [20:38:58] <^demon> I'll do that *right now* [20:39:03] \o/ [20:39:29] ^demon: also I get a 404 loading https://gerrit.wikimedia.org/r/static/openstack-page-bkg.jpg :-D [20:40:26] <^demon> Refresh. [20:40:27] ahhh #484848 for body color http://www.colorhexa.com/484848 [20:40:28] jorm: can you draw a fire-breathing bear as a logo for gerrit? we all know it's a beast, let's give it a logo to reflect that [20:40:44] fire breathing bear? [20:40:47] yeah, i can do that. [20:40:50] \o/ [20:40:57] ^demon: yeahhhhhhhhhh :-))) [20:40:59] ^demon: and this is how we end up with awesome logo [20:41:01] *logos [20:41:10] fire farting bear. [20:41:11] <^demon> Gerrit the Grizzly. [20:41:12] <^demon> :) [20:41:14] gimme the weekend. [20:41:15] jorm: dimensions might be weird [20:41:17] then get someone to hack it to get WMF and we get a round of beer [20:41:23] rectangular? [20:41:23] *another awesome logo [20:41:27] yeah [20:41:32] * hashar sends Friday night beer to jorm [20:41:38] see the current logo [20:42:05] how about this drawing style: https://office.wikimedia.org/wiki/File:Hr-contractors.jpg [20:42:10] ^demon: on https://gerrit.wikimedia.org/r/#/c/5424/ commit message, I can't tell the difference between the change id link and the regular text [20:42:17] ^demon: but overall kudos, that looks very nice [20:42:46] jorm: looks good [20:43:25] echo, btw, has been deployed to mw.org [20:43:29] sweet [20:43:30] ^demon: I would just put the body color to plain black for nicer contrast [20:43:43] <^demon> Just play with the CSS. [20:43:47] <^demon> This is why we have git :) [20:44:04] need to write mooaar doc :-D [20:44:20] ... [20:44:27] <^demon> Docs on how to edit css? [20:44:34] jorm: can you please destroy the bar at the top of the page with fire [20:44:36] ? [20:44:42] na I meant … I need to write doc [20:44:45] there's now 8 things in it [20:45:04] jorm: and 4 of them basically have to do with notifications [20:45:51] heh.
I guess all of your current designs already have them purged with fire :) [20:45:51] ^demon: what is the repo name ? [20:46:00] <^demon> For what? gerrit's stuff? [20:46:02] <^demon> It's in puppet. [20:46:53] gitweb looks strange [20:46:54] heh. actually the bar at the top have 9 items when "Select font" is there [20:48:10] <^demon> Ok, forked gitlist to our gerrit. [20:48:26] <^demon> https://gerrit.wikimedia.org/r/gitweb?p=operations%2Fsoftware%2Fgitlist.git;a=shortlog;h=refs%2Fheads%2Fmaster [20:49:53] once we have echo integrated with lqt better, we'll ditch that one, i think. [20:49:58] ^demon: https://gerrit.wikimedia.org/r/17462 :-) [20:50:28] <^demon> hashar: Talk with RoanKattouw and Krinkle, I think they have some other fixes they want to make too. [20:50:33] <^demon> Best to do them all at once. [20:50:34] it would be better if all of them didn't have the stupid "my" in front of them. [20:50:40] thanks [20:51:31] hashar: Yes, I'm working on gerrit css now [20:52:14] I don't know why the css change was just blindly merged. It looks horrible imho now, worse than it was (double borders almost invisible, link colors indistinguishable and the logo image was 404 for a bit) [20:52:15] Krinkle: I have made body color to be black with https://gerrit.wikimedia.org/r/17462 ;; I guess I can abandon it [20:53:13] Krinkle: have fun fixing the style, the new one surely have good potential [20:57:22] <^demon> Krinkle: It wasn't blindly merged--I asked people and they liked it. And the header missing was a mistake when I copied it from labs to puppet--easily fixed. But I welcome further improvements. [20:57:53] a designer looked at it? [20:58:12] (again, the general stretch is an improvement, but it has a few serious unacceptable drawbacks) [20:59:19] <^demon> It's identical to what openstack uses. [20:59:41] <^demon> With the orange shifted to wmf red. 
[21:00:15] we have a very different kind of audience (both in developer experience and device capability) [21:00:40] anyway, past tense. moving on. [21:00:43] robla: "cookie-licked"? :) [21:01:40] jorm / Ryan_Lane: New messages is going to die when that is fixed [21:01:46] \o/ [21:02:02] I've always hated the "new" part of new messages [21:12:22] <^demon> Krinkle: Are you including Trevor's logo with your fixes? [21:12:58] Yres [21:13:00] Yes [21:13:10] already live on gerrit-dev [21:15:31] <^demon> Krinkle: Looks good :) [21:15:59] guys, say truth - who broke servers? [21:18:55] do you still want a fire-breathing bear? [21:19:26] * TrevorParscal rolls eyes [21:27:17] Krinkle: is there any common function in rl to ajax load content of the given page? [21:31:33] Danny_B|backup: I know its just terminology but thats in no way related to ResourceLoader. ResourceLoader just loads files and packages/caches them. You mean mediawiki javascript. [21:31:48] Then, no. Not in MediaWIki javascript. That's just plain $.ajax [21:32:13] Works out of the box [21:32:35] it would be practical to have some function say loadPage("page_title") [21:32:53] for $.ajax you still have to put bunch of params [21:33:21] you need to know how to get rendered content of the page [21:33:31] etc. [21:35:55] New patchset: Hashar; "disable some experimental jobs" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17466 [21:36:11] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17466 [21:37:49] New patchset: Hashar; "Ext-WikiBase : disable StickToThatLanguage ext" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17467 [21:38:03] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/17467 [21:57:24] I am out to bed. See you tomorrow [21:59:59] <^demon|away> Krinkle: Were you done making changes on gerrit-dev? I can submit them now if you'd like. 
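The loadPage("page_title") helper Danny_B asks about can be sketched as two pure pieces: the `action=parse` query that asks for a page's rendered HTML, and the unwrapping of the JSON response envelope (`parse.text['*']`). The function names here are hypothetical; on a wiki page you would feed the params to `$.getJSON` against `mw.util.wikiScript('api')`:

```javascript
// Build the action=parse query for a page's rendered HTML. Hypothetical
// helper; "page", "prop=text" and "format=json" are MediaWiki action
// API parameters.
function parsePageParams(title) {
  return {
    action: "parse",
    page: title,
    prop: "text",
    format: "json",
  };
}

// Pull the HTML out of the API's JSON response envelope, which nests
// the rendered text under parse.text["*"].
function extractParsedHtml(response) {
  return response.parse.text["*"];
}
```

A loadPage wrapper then reduces to chaining these through whatever transport is at hand, e.g. `$.getJSON(mw.util.wikiScript('api'), parsePageParams(title)).then(extractParsedHtml)`, which is roughly the "bunch of params" Danny_B would rather not repeat at every call site.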
[22:00:22] still working on it [22:05:22] RoanKattouw: https://gerrit.wikimedia.org/r/#/c/17444/1 [22:40:35] siebrand, about? [22:40:51] MaxSem: not really. Only when brief. [22:42:08] * siebrand cries when seeing ugly logos and thinks WHY!? [22:42:36] siebrand: what logo did you see? [22:42:49] gerrit.wikimedia.org [22:43:03] MaxSem: sorry, I'm off to bed now [22:43:20] siebrand, I'll poke you tomorrow [22:43:26] k [22:44:38] hmm, whoever designed the skin should be punished for not testing it in classical screen resolutions [22:48:07] otoh much better than the original view