[00:03:47] ok, so that's a working console thingy where i can input lua and see what output it gives. now if only i knew how to input the params
[00:06:45] legoktm: ?
[00:07:05] just call your count function
[00:07:15] =p.count('inputtext') doesn't seem to be working
[00:07:44] you have to paste the whole function count(…. stuff first I believe
[00:08:04] apparently not
[00:08:15] it gave me module specific errors without that
[00:08:59] anyways, was that the correct way to call a lua function? were the args right? or do i have to make a frame object thingy for the args?
[00:09:24] count('text') should work
[00:10:58] count('text') → Lua error in console input at line 7: attempt to call global 'count' (a nil value). p.count('text') → Lua error in Module:प्रयोगस्थल/Siddhartha_Ghai/अ� at line 10: attempt to index field 'args' (a nil value).
[00:12:10] oh :/
[00:12:15] looks like you might need a frame thing
[00:12:18] i'm not really sure sorry
[00:13:15] * Sid-G sighs
[00:30:20] So I'd have to do something like msg:title( Category:Foo_Bar ) followed up with mw.message:parse
[00:30:44] To get the HTML of the category?
[00:30:53] er
[00:31:00] mw.message probably isn't what you want
[00:33:01] * T13 goes back to digging through the docs.
[00:35:10] getContent() perhaps?
[03:04:08] hey all, can @import rules be used in user CSS?
[03:10:22] i think so
[03:10:34] https://meta.wikimedia.org/wiki/User:Isarra/common.css
[03:10:36] yes
[03:10:46] Writ_Keeper: ^
[03:10:59] hm
[03:11:01] thanks
[03:11:23] You can use @import, but you have to make sure you don't have any @import after other CSS.
[03:11:51] So if you want to use it in your user skin css, you can't have any css in your common.css.
[03:12:26] Because those are merged into one file and @import only works at the beginning of the file.
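[editor's note] The @import ordering rule described above can be illustrated with a small sketch; the URL and rules below are placeholders:

```css
/* Valid: the @import precedes every other rule in the merged stylesheet */
@import url("https://example.org/extra.css");
body { background: #f8f9fa; }

/* Invalid: once any ordinary rule has appeared, a later @import is
   silently ignored by the browser, which is why an @import in one user
   subpage breaks if another subpage's CSS is concatenated before it */
body { background: #f8f9fa; }
@import url("https://example.org/extra.css");
```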
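[editor's note] Returning to the unresolved Scribunto console question at the top of the log: the debug console runs plain Lua, so a module function that reads frame.args can be handed either a real child frame or any table with an args field. A minimal sketch, with a stand-in module body (not the actual module's code):

```lua
-- Stand-in module: counts the characters of its first argument.
local p = {}

function p.count(frame)
    return mw.ustring.len(frame.args[1])
end

-- In the debug console, either build a real frame:
--   =p.count(mw.getCurrentFrame():newChild{ args = { 'inputtext' } })
-- or fake one with a plain table, since p.count only reads frame.args:
--   =p.count({ args = { 'inputtext' } })

return p
```

Calling bare count('text') fails because count is a field of the table p, not a global; calling p.count('text') fails because a string has no .args field, which matches both errors pasted above.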
[03:12:51] Writ_Keeper: ^
[03:13:33] hmm, I was reading about LESS, which sounds like it should allow for that
[03:13:54] not that it matters; it doesn't seem to be working at the top of common.css anyway
[03:14:40] oh, that's not in MW yet
[03:14:43] i guess
[03:16:08] LESS is only implemented for server side stuff, not user.less
[03:16:32] yuvipanda: did a bug ever get filed for that? ^
[03:19:58] okay, well, I guess the next question is
[03:20:16] can css files from one's local machine be imported through the @import statement?
[03:20:38] (presumably through some kind of file:// url or something)
[03:21:48] You... could try, but depending on how ResourceLoader handles the files I kind of doubt it.
[03:21:56] But I don't know.
[04:23:53] Isarra: yes, @import works, for both CSS and LESS
[04:24:33] @import is discouraged for CSS files, because you get the same functionality by declaring stylesheets together to ResourceLoader in PHP code
[04:24:47] As I recall, @import also has some funky rules.
[04:24:55] Regarding its usage.
[04:25:01] doing it in PHP means MediaWiki can reason about it and concatenate the files together to minimize the number of requests
[04:28:39] @import is appropriate to use in LESS files to import mixins
[04:29:06] I explained that in a wikitech-l e-mail that I promised to port to MediaWiki.org and didn't
[04:29:11] * ori-l does that
[04:29:55] legoktm: I don't think LESS for user modules will be implemented
[04:30:06] why not?
[04:31:16] because it requires implementing a large set of guards to ensure malicious LESS files don't @import "/etc/passwd"; or trick functions into doing inappropriate things.
[04:31:48] that's not insurmountable, and it's worth doing for certain things
[04:32:37] but users can still write LESS, compile it by hand, and upload the generated CSS
[04:33:21] so it would be a lot of trouble for very little gain
[04:33:55] I don't want to be glib about it, I think the fact that there is a sense of collective ownership over the code is one of the things that makes MediaWiki awesome
[04:34:27] and having something shiny only available on the backend soils that a little bit
[04:35:11] but I think the 'solution' to that is for more people to be involved via git / gerrit
[04:36:12] Because when you want to change the background color to red, what you really want is to spend the next two months figuring out Git and Gerrit.
[04:36:59] well, again, browsers don't speak LESS, so at the end of the day mediawiki has to output CSS
[04:37:02] Hmm.
[04:37:11] so even if there was some intent of deprecating CSS, it's not really technically feasible
[04:37:31] so it's not like changing the background color to red becomes more difficult
[04:38:13] but there is a degree of convenience afforded by LESS and access to that convenience is not even
[04:38:50] i recognize that, but i think if you weigh all the advantages and disadvantages, it's ok
[04:41:42] well,
[04:41:56] 'less security risk' was a very useless search
[04:42:05] Heh.
[04:42:09] i'm not opposed to it at all, i just don't expect anyone to actually feel like doing it
[04:42:16] so if you want to file a bug, go ahead
[04:42:23] i'll help if someone takes it up
[04:42:27] doing it properly *
[04:46:54] https://bugzilla.wikimedia.org/show_bug.cgi?id=54864
[04:47:13] Thanks for cc'ing me.
[04:48:14] Did I say @import didn't work?
[04:49:09] oh, I misread
[04:49:11] I think that ping was meant for Writ_Keeper.
[04:49:20] I tried to correct, but Writ_Keeper is gone.
[04:49:21] RIP.
[04:49:34] Well, that was user stuff, but... otherwise, aye.
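[editor's note] ori-l's distinction above (@import discouraged in plain CSS, but appropriate in LESS for mixins) can be sketched with two hypothetical files; note the compile-time import leaves no import in the generated CSS:

```less
// mixins.less: a parameterized mixin; defined with parens, it
// produces no CSS output of its own
.rounded(@radius: 4px) {
  border-radius: @radius;
}

// panel.less: @import here is resolved by the LESS compiler,
// so the browser never sees an import at all
@import "mixins.less";
.panel {
  .rounded(8px);
}
```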
[04:50:08] Holy cheese waffles, my script finished without the server going down. Yippy horayoo or something.
[04:50:13] * Isarra collapses on Elsie.
[11:12:50] mark: any chance of RT744-RIPE's mnt-by being changed to TORCHBOX-MNT? (i don't have access to auth as WIKIMEDIA-MNT anymore)
[11:23:13] felicity: done!
[11:24:13] i removed the org reference while I was at it, left the remark in place though
[11:33:55] mark: thanks!
[12:19:20] you can't update mntner objects with password auth via auto-dbm anymore?
[12:21:12] I can't use picture select on Norwegian wikinews
[12:51:12] mark: sorry, could you do the same for PGPKEY-2B9CE6F2 please?
[12:51:31] (not urgent but i have to send all my updates as river@wikimedia.org or the database ignores them ;)
[12:53:51] felicity: want me to update the mail address in that as well?
[12:56:22] hm, yes might as well change it to river@torchbox.com
[14:02:14] felicity: done
[14:03:17] thanks
[14:27:38] Reedy: around? i've got an ops question about Vector
[14:28:24] Reedy: https://bugzilla.wikimedia.org/show_bug.cgi?id=45051#c21 will you guys undeploy it? (not right now, there's a blocker still) or will it just be updated to git master and left there?
[14:29:02] Usually we disable it and then stop it being branched in future
[14:30:25] All the depends-on bugs are closed...
[14:43:47] Reedy: okay
[14:44:03] Reedy: on an unrelated note, i submitted https://gerrit.wikimedia.org/r/#/c/86319/ which you asked for
[14:45:42] do we have a user agent policy?
[14:46:06] for the api
[14:47:31] 1. provide a user-agent header that will allow ops to contact you if necessary (it probably won't ever be done) 2. there is no step two.
[14:47:52] ok
[14:47:59] thx
[14:48:00] technically, you can also not provide a user-agent and hope for the best. :P
[14:48:16] No you can't
[14:48:25] No user-agent won't be serviced
[14:48:35] you'll get a domas-implemented error message
[14:48:57] oh. that's new, isn't it?
[14:49:25] Nope
[14:49:37] Unless it's regressed/been turned off...
[14:50:01] >curl "pl.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json" -A ""
[14:50:03] no error
[14:50:28] :/
[14:50:59] thx
[15:36:16] apergos: around?
[15:43:34] yep
[15:44:06] yuvipanda:
[15:44:20] apergos: woops, sorry - wandered away
[15:44:35] apergos: is there a tool that'll let me split pages from the xml dump into smaller chunks?
[15:45:37] heh
[15:45:52] tell me what you want and how you intend to use the chunks
[15:45:56] but the answer is 'probably'
[15:46:22] apergos: so a friend of mine is trying to exercise hadoop by running some wikipedia dumps through it, so asked me if I wanted to do something
[15:46:27] ah
[15:46:29] apergos: so I want to split the logging dump
[15:46:39] and I guess en meta hist is a bit much :-D
[15:46:41] sure, so
[15:46:46] apergos: into multiple fragments, and then do things like aggregate delete actions and stuff
[15:47:21] so it's not going to be fast because it's going to write one piece for each time you run it
[15:47:50] but in my branch of operations/dumps git repo you want mwbzutils, see 'writeupto' I think it's called
[15:47:55] ah
[15:47:58] checking out...
[15:47:59] if you are on linux or bsd it should build
[15:48:17] i've more labs machines than I've fingers :D
[15:48:23] give it -h or --help to see args
[15:48:27] sweet
[15:48:34] mm there's also a prebuilt package for precise
[15:48:42] apergos: I was planning on building a small tool myself, in Scala, to do the splitting
[15:48:43] in our repos
[15:48:47] ah
[15:48:53] apergos: think it'll be useful at all?
[15:48:58] well this could just as easily be extended
[15:49:00] apergos: mostly because I'm learning Scala on the side.
[15:49:22] if you write it there will be another tool, just lob it into the batch of tools already floating around
[15:49:33] sure, but it'll be in scala, and so nobody will use it :P
[15:49:38] some such tools (misc dump related) are described on meta
[15:49:50] link?
[15:49:50] well you'll have to write scalawpbot then :-P
[15:49:52] uh
[15:49:57] haha
[15:49:59] sees tickets to install scala packages on prod servers ..heh
[15:50:17] oh also to get the last page id which you will want for this tool, is
[15:50:28] dumplastbz2block I think it's called
[15:50:30] mutante: I've a commit for that in a branch, actually...
[15:50:32] same directory etc
[15:50:37] yuvipanda: *g*
[15:50:41] which will uncompress the last block,
[15:50:45] apergos: ah!
[15:50:51] oh you know what, I dunno if the log files have page tags
[15:51:04] if they don't you'll have to change what it looks for in there
[15:51:25] I almost never look at the logging xml dumps so don't remember what they have
[15:51:44] should be self explanatory though
[15:54:16] anyways have a look at those
[15:56:38] apergos: will do! Thanks!
[15:57:21] since this is twice in two days maybe I should fix up that tool to do a few more kinds of splitting (like: multiple files in one pass, logging files, etc)
[16:02:46] yuvipanda: for 'other tools' and add yer scala thing when done:
[16:02:48] https://meta.wikimedia.org/wiki/Data_dumps/Other_tools
[16:02:59] heh, yeah
[16:03:00] will do, apergos
[17:30:09] heh
[17:46:24] Aaron Schulz: ping
[17:47:35] or guillom
[17:52:21] On the Andorra wikipedia article, there is an iso3166 code in the infobox but not in the source. Anyone know where it comes from?
[18:13:36] or Nemo_bis O_O
[18:17:44] Steinsplitter: I'm around, but not for long.
[18:18:39] guillom: can you add me please, on test to the sysop and ta group?
[18:19:07] Steinsplitter: If you're asking me, I assume I'm a 'crat there?
[18:20:19] Guillom (talk | contribs) (bureaucrat, patroller) (Created on 26 June 2007 at 10:06)
[18:20:36] Steinsplitter: ok, thanks for checking. I'll get to it in a minute
[18:20:43] thanks
[18:22:40] Steinsplitter: {{done}} :)
[18:23:03] thx :-)
[18:23:37] sure
[20:25:25] it looks like the "uccontinue" flag is not working anymore (ucstart instead), but this is not reflected in the documentation :-(
[20:29:26] Nirvanchik: it's to check whether you were paying attention!
[20:39:29] hm. actually, mediawiki api source code supports this, but with the flag "multiUserMode" enabled
[20:39:33] and what is multiUserMode?
[20:42:08] ah, got it
[20:42:18] If you're not using userprefix and the count of provided users is over 1
[20:44:16] Reedy: yes, I see, thanks!
[20:46:23] git.wikimedia.org is better documentation than ru.wikipedia.org/w/api.php )
[20:50:35] https://commons.wikimedia.org/wiki/Special:GlobalUsage/Wappen_von_Xanten.svg
[20:50:40] Usage on de-labs.wikimedia.org
[20:50:47] de-labs.wikimedia.org does not exist :/
[20:51:12] It was deleted
[20:52:09] If there's a lot of those kinds of things, writing a script to clean them up should be fairly trivial
[20:52:10] <^demon> Obviously we didn't clean up GU tables :p
[20:52:29] ^^
[20:52:52] * Reedy blames hoo
[20:53:03] * ^demon blames Reedy, he probably deleted the wikis.
[20:53:42] https://git.wikimedia.org/blame/operations%2Fmediawiki-config.git/2eb4cbbb419aec87d14d4c30bd7a7c65a12009cf/deleted.dblist
[20:53:46] Damn, you people noticed :D
[20:53:55] * hoo has no idea what you're talking about
[20:54:08] It's your fault
[20:55:34] Ouch
[20:55:37] We probably should
[20:55:38] <^demon> greg-g: Is today's LD spoken for yet?
[20:55:45] There's 56012 for de_labswikimedia alone
[20:55:52] ^demon: nope!
[20:56:41] Steinsplitter: Want to open a bug?
[20:56:45] <^demon> greg-g: Want to roll out Cirrus master to affected wikis, do in-place index rebuilds on english projects. :)
[20:56:58] Reedy: yes.
[20:57:18] ^demon: which?
[20:57:24] * greg-g is kind of lost
[20:58:03] <^demon> testwiki, test2wiki, mw.org, itwikt, enwikisource, cawiki + all closed wikis get new code
[20:58:14] <^demon> english speaking ones of those will get in-place index rebuild.
[20:58:27] oh, I see
[20:58:38] I was confused what "affected" meant
[20:59:01] <^demon> I'm thinking of making a dblist for it, so we can easily say what that is :)
[20:59:07] :)
[20:59:32] sure, I guess that fits within mark's concerns :)
[20:59:51] <^demon> Yeah no we're not rolling out further, just pushing some fixes for reported issues.
[21:00:17] * greg-g nods
[21:00:19] sure
[21:01:03] short one this month https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2013/September
[21:02:46] Reedy: done https://bugzilla.wikimedia.org/show_bug.cgi?id=54894
[21:05:11] Steinsplitter: Just about written a script to do it ;)
[21:08:40] Reedy: thank you :)
[21:16:58] Reedy: ^demon: how do you do this?
[21:17:15] <^demon> Huh?
[21:17:44] "Reedy blames hoo"
[21:17:47] "1
[21:18:17] /me
[21:18:34] * Nirvanchik tests the test
[21:18:39] ahahahah))) thanks
[21:19:31] oh damn, time to sleep, bye!
[21:36:44] Reedy: I'm a bit confused, the globalblocking/torblock/wikimediamessages patch was causing problems in production but is not in https://www.mediawiki.org/wiki/MediaWiki_1.22/wmf19#GlobalBlocking ?
[21:37:17] is it correct that it's going to be in 1.22wmf20 or is the page outdated?
I didn't check the exact hour it was branched
[21:39:55] 1.22wmf20 hasn't been branched yet
[21:40:13] Nemo_bis: There were actually a couple of parts to it
[21:40:17] You were calling the hooks wrongly
[21:40:25] I know I know
[21:40:26] and you were also returning the wrong value
[21:40:37] Let me update the changelogs
[21:40:39] but they were all merged at the same time I believe
[21:40:50] Indeed
[21:40:53] And they were all broken
[21:41:18] I'm open to suggestions on how to prove my sorriness
[21:41:42] So far, my proposal is to update the changelog so that people can more easily blame me ;)
[21:42:16] git blame blamed you!
[21:42:20] * Nemo_bis goes to add a yellow star parameter to {{git}} used on changelog
[21:42:51] I'm just updating my deployment branches so I can update the changelogs
[21:42:58] great
[22:43:51] felicity: Welcome back. :-)
[23:22:03] Elsie: i'm only pretending to be back, but hi ;)
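[editor's note] As an aside to the dump-splitting thread earlier in the log (apergos/yuvipanda, 15:44 onward), here is a minimal sketch of chunking a logging dump. It assumes entries are <logitem> elements and that the input fits in memory; real dumps are bzip2-compressed, namespaced XML, so a real tool would stream with iterparse instead.

```python
# Minimal sketch: split a MediaWiki logging dump into fixed-size chunks.
# The <logitem> element name and the in-memory parse are assumptions for
# illustration, not a drop-in replacement for mwbzutils' writeupto.
import xml.etree.ElementTree as ET

def split_log_items(xml_text, chunk_size):
    """Yield lists of at most chunk_size <logitem> elements, in document order."""
    root = ET.fromstring(xml_text)
    chunk = []
    for item in root.iter('logitem'):
        chunk.append(item)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:  # emit the final, possibly short, chunk
        yield chunk

# Toy input standing in for a real dump fragment.
sample = """<mediawiki>
  <logitem><id>1</id><type>delete</type></logitem>
  <logitem><id>2</id><type>delete</type></logitem>
  <logitem><id>3</id><type>block</type></logitem>
</mediawiki>"""

chunks = list(split_log_items(sample, 2))
print([len(c) for c in chunks])  # [2, 1]
```

Each yielded chunk could then be serialized to its own file for a Hadoop job, e.g. to aggregate delete actions per chunk.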