[03:09:04] Is there a simple way to just strip wikimarkup from a chunk of wikitext?
[03:09:18] Other than parsing it and removing all the HTML?
[03:09:53] gwicke: ^
[03:10:19] kaldari: there are simple regexp-based approximations
[03:10:28] that would work
[03:10:36] where can I find such things?
[03:10:40] if you want expanded templates etc., then just stripping the HTML is easier
[03:10:50] $.text for example does it in a single line
[03:11:06] what I actually want to do is convert an edit summary to plain text
[03:11:15] no wikimarkup, no HTML
[03:11:34] k- I wrote such a thing in the past, but am sure that there are better solutions around now
[03:11:59] am not very up to date on the 450 extensions there are ;)
[03:12:46] my extension was called catnews, but it was not moved to git afaik
[03:12:53] still in SVN somewhere
[03:12:54] that's OK
[03:13:06] any pointers on where to look in catnews?
[03:13:22] it's just a single file
[03:14:29] hmm, don't see the extension in SVN either
[03:14:34] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/?dir_pagestart=0
[03:15:00] it is really old ;)
[03:15:05] ca. 2005
[03:15:08] oh
[03:15:09] :)
[03:15:26] I can dig it out for you- still have it somewhere
[03:15:38] crap, what's the command to show deleted files in that interface?
[03:16:33] if this works, I owe you a wikibeer! It will save me much time and headache.
[03:17:22] kaldari: http://pastie.org/5545747
[03:17:41] excellent! thanks!!
[03:17:52] sorry, missed the last lines
[03:18:16] http://pastie.org/5545750
[03:19:23] the htmlspecialchars call is important
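The pastie links above have since expired, so here is a minimal stand-in sketch of the kind of regexp-based approximation discussed here for flattening an edit summary to plain text. The patterns are illustrative assumptions, not the original catnews code; note the htmlspecialchars() step that gwicke flags as important:

    <?php
    // Rough, intentionally lossy regexp-based approximation for stripping
    // wikimarkup from an edit summary. Illustrative only -- not the
    // original catnews code from the (now dead) pastie links.
    function summaryToPlainText( $summary ) {
        // Drop /* section */ autocomment markers.
        $text = preg_replace( '!/\*.*?\*/!s', '', $summary );
        // [[target|label]] -> label, [[target]] -> target.
        $text = preg_replace( '/\[\[(?:[^|\]]*\|)?([^\]]*)\]\]/', '$1', $text );
        // [http://example.org label] -> label.
        $text = preg_replace( '/\[\S+ ([^\]]*)\]/', '$1', $text );
        // Remove bold/italic quote runs, then any leftover HTML tags.
        $text = preg_replace( "/'{2,}/", '', $text );
        $text = strip_tags( $text );
        // Escape for safe output -- the htmlspecialchars call noted above.
        return htmlspecialchars( trim( $text ) );
    }

For the other route mentioned above (expanding templates first, then stripping the HTML), jQuery's .text() on the parsed HTML really does do it in a single line on the client side.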
[08:02:38] Hi, I have created a new version of the Math extension that allows displaying MathML using the LaTeXML converter instead of rendered images. Furthermore, I have created a test wiki that contains all pages of the English Wikipedia project that contain math. Now I'd be interested to test this feature inside the Wikimedia framework. Do you have a proposal for how to do that?
[08:04:35] physikerwelt: is the code on gerrit?
[08:06:19] physikerwelt: labs.wikimedia.org provides VMs for MediaWiki testing and development, but if it's already working, I suggest you simply get the code in for review, and we can try deploying it to a smaller wiki.
[08:07:45] yes
[08:08:11] physikerwelt: would it make sense to submit a patch against the Math extension, or create a new extension repository?
[08:08:17] did you improve it or rewrite it?
[08:09:06] I tried for a long while to commit a change that migrates from the old version to the new one step by step
[08:09:14] but I had little success with it
[08:09:34] in addition to that, someone has created a new branch
[08:09:39] called latexml for me
[08:09:54] is it up to date?
[08:10:02] can you have a look at the Math extension?
[08:10:45] * ori-l looks
[08:10:49] I will check again if it's up to date
[08:11:44] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fextensions%2FMath.git;a=shortlog;h=refs%2Fheads%2FLaTeXML
[08:12:27] yes, that's the one I was referring to
[08:15:49] physikerwelt: can you try to merge to master and submit a review?
[08:17:02] I could, but I don't know if that's a good idea
[08:18:01] I tried to get a review for the very first step, which only restructures things and doesn't change any functionality; see https://gerrit.wikimedia.org/r/#/c/30177/
[08:19:40] 0:18 (NEW) Review LaTeXML branch of Extension:Math for deployment - https://bugzilla.wikimedia.org/43222 normal; MediaWiki extensions: Math; ()
[08:20:32] It'll need to go through review, but I'll try to generate some interest so it gets done quickly.
[08:21:17] physikerwelt: can you add yourself as CC on that bug and post a comment in a week if you don't hear back by then?
[08:22:23] ok
[08:22:27] thanks for your help
[08:22:44] np; thanks for your work!
[08:23:08] Sorry not to be able to give more definite answers. I need to familiarize myself with the extension and your change a bit first.
[08:24:09] yes, otherwise the review would be pointless
[08:27:24] you can help move things along by explaining some of the rationale for the change in a comment on that bug and suggesting a good candidate wiki for initial deployment.
[08:27:32] you can also explain a little how you tested it.
[08:28:47] can you send me your email address?
[08:29:03] I have a paper about that change
[08:29:19] i'm ori@wikimedia.org, but you should reply using the form on https://bugzilla.wikimedia.org/show_bug.cgi?id=43222
[08:30:23] http://wiki.physikerwelt.de/images/text_math_search.pdf
[08:31:24] physikerwelt: could you add that link (plus the other details I suggested above) in a comment on https://bugzilla.wikimedia.org/show_bug.cgi?id=43222?
[08:31:56] I'm waiting for the confirmation email for the creation of the account
[08:31:59] at bugzilla
[08:32:12] great
[08:34:58] i noted the paper in the bug
[08:35:19] i'm off to sleep, good night.
[08:35:28] good night
[08:36:41] physikerwelt: you might also want to join #mediawiki, where there's a bot that announces new reviews and bug activity
[09:39:29] hello
[10:07:48] yuvipanda: is there a project page somewhere for https://bugzilla.wikimedia.org/show_bug.cgi?id=41252#c18 ?
[10:13:46] Nemo_bis: for ShortURL?
[10:14:11] Nemo_bis: no. It was never a foundation project, and I didn't really make one for it :)
[10:14:18] there's the extension's page.
[10:17:47] yuvipanda: no, for the WP:whatever thingy you mentioned
[10:18:00] oh
[10:18:02] yes
[10:18:13] Nemo_bis: https://en.wikipedia.org/wiki/Wikipedia:DRREFORM it's an RFC
[10:18:28] Zhang, who was a former WMF fellow, asked me to help out and implement it, and I *just* started
[10:18:37] like, 12 hours ago. I'll put up a page soon.
[10:18:49] ah
[10:18:58] I thought WMF was stopping doing that stuff
[10:19:34] Nemo_bis: they did
[10:19:40] Nemo_bis: i'm doing this as a vol.
[10:19:45] I did ShortURL as a vol too.
[10:19:52] and am currently not on the foundation's payroll :)
[10:20:06] (Will be starting January, but only 30h/week)
[10:20:10] ou
[10:22:11] Nemo_bis: nothing involved here is on WMF dole.
[10:23:39] i just realized dole wasn't the best word to use here. sorry.
[10:23:53] * yuvipanda smacks self with a cultural appropriateness trout, and goes back to JS
[10:24:25] :)
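For context on what a trial deployment of the LaTeXML branch could look like, here is a hypothetical LocalSettings.php sketch for a test wiki. The variable names are assumptions modeled on the Math extension's $wgMath* conventions, not taken from the branch under review:

    <?php
    // Hypothetical configuration sketch -- the setting names below are
    // assumptions, not confirmed by the LaTeXML branch discussed above.
    require_once "$IP/extensions/Math/Math.php";

    // Render <math> tags to MathML via an external LaTeXML web service
    // instead of texvc-generated PNG images.
    $wgMathUseLaTeXML = true;                                 // assumed name
    $wgMathLaTeXMLUrl = 'http://latexml.mathweb.org/convert'; // assumed name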
[12:58:28] <^demon> hashar: Good morning. I'm finishing up breakfast, then we'll do this :)
[12:58:38] ^demon: good morning :-)
[12:58:54] ^demon: take your time! I am finishing lunch myself
[12:59:17] <^demon> I was thinking about the descriptions for the values. Right now we have Fails/No Score/Verified for -1/0/1.
[12:59:41] <^demon> For the new range, how about Fails/Problems/No Score/Checked/Verified
[12:59:49] <^demon> (for -2/-1/0/1/2)
[13:01:29] do we really need both -2 and -1? :-D
[13:01:50] <^demon> Well, what you drafted (and I sent) said that.
[13:02:00] :-D
[13:02:05] <^demon> Which kind of confused me, since we just said -1..+2 before ;-)
[13:03:35] maybe I made a mistake
[13:03:38] or Timo changed it
[13:04:49] ^demon: yup, Timo did it in r289 of http://etherpad.wmflabs.org/pad/p/NotifyVerified2/timeslider :-D
[13:05:34] ^demon: since we can tweak the messages later on, I would go for your proposal Fails/Problems/No Score/Checked/Verified
[13:08:16] <^demon> But do we need -2?
[13:09:15] ^demon: I don't think so
[13:09:29] Jenkins / Zuul reports a message that gives enough detail, in my opinion
[13:13:12] to be more clear: I do not really care about the range as long as we have +2 :-] I'll let you choose whether we get -1 and -2 or just -1 :-]
[13:13:31] <^demon> Let's just add the +2. Less change.
[13:13:38] <^demon> Less buttons to click too for people :)
[13:13:54] <^demon> s/too//
[13:13:59] <^demon> Dunno where that came from.
[13:14:18] and V+1 should prevent submit :-)
[13:14:46] <^demon> V+2 is required for submit. We're still using MaxWithBlock as the rule.
[13:15:02] <^demon> (Require the maximum possible score, with the minimum possible score being a veto)
[13:15:13] nice
[13:16:25] <^demon> So with the new range of -1..+2, the values can be: "Failed / No Score / Checked / Verified"
[13:16:38] <^demon> Good, I was iffy about "Problems" anyway, since the rest are verbs.
[13:16:57] I problem, you problem, we Google.
[13:17:22] <^demon> lol.
[13:24:23] !g I54b230376ed682d6a6b7d6fa4fd5ab133c8c8b1b
[13:24:23] https://gerrit.wikimedia.org/r/#q,I54b230376ed682d6a6b7d6fa4fd5ab133c8c8b1b,n,z
[13:42:15] re
[13:42:43] ^demon: getting the change in?
[13:43:00] <^demon> Started it. Just renamed V+1 to "Checked"
[14:07:42] <^demon> hashar: The new field is in the database. It'll show up in the UI and stuff after we restart Gerrit (will do in a moment, grabbing a glass of water)
[14:08:18] <^demon> http://p.defau.lt/?Tc9OaZhPqmdlubH4u9PTEg
[14:09:12] youhhouuu
[14:09:25] merging the Zuul related change
[14:09:36] then I guess we want to restart both Gerrit and Zuul
[14:11:15] New review: Hashar; "Change made in Gerrit (pending restart):" [integration/zuul-config] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/39082
[14:11:23] Change merged: Hashar; [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39082
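The label change above was made directly in Gerrit's database (hence "the new field is in the database"), but the agreed outcome corresponds to what later Gerrit releases express declaratively in a project's project.config. A sketch in that syntax:

    # Sketch of the agreed Verified label, written in the project.config
    # syntax of later Gerrit releases (in 2012 this lived in a DB table).
    [label "Verified"]
        function = MaxWithBlock   # max score required; min score is a veto
        value = -1 Fails
        value =  0 No score
        value = +1 Checked
        value = +2 Verified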
[15:00:51] is there an InitializeSettings group that contains precisely the same wikis as wikipedia.dblist?
[15:01:25] cause 'wiki' also contains stuff like Commons and Meta
[15:14:16] New patchset: Hashar; "mw-ref-updates pipeline for mw doc generation" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39207
[15:24:20] New patchset: Hashar; "mediawiki-core-doc-generation job" [integration/jenkins-job-builder-config] (master) - https://gerrit.wikimedia.org/r/39210
[15:59:24] New patchset: Hashar; "mw/core doc generation wrapper based on Zuul reference" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/39212
[16:00:05] New patchset: Hashar; "mw/core doc generation wrapper based on Zuul reference" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/39212
[16:00:12] daughter time
[16:00:14] cya
[17:07:32] anomie: hi, I hope I explained my reasons clearly, but if you have questions, feel free to ask here
[17:07:53] hi Yurik
[17:11:21] anomie: did I address your concerns in the email, or am I still misunderstood?
[19:05:46] csteipp: did I hear correctly that you have been working on using CORS to change how CentralAuth works?
[19:06:30] awjr: It's planned, I'm not actively working on it
[19:06:43] csteipp: very cool - do you have an idea of the timeline?
[19:07:05] Late Jan at the earliest
[19:07:57] csteipp: ok cool, thanks :)
[19:15:17] I am going to install a new table on test/test2/mediawiki; do I just run the SQL manually on each of them, or is there a standard procedure to install it? Thanks
[19:16:27] bsitu: There are scripts for this, yes.
[19:16:32] Should not be run manually
[19:17:12] bsitu: Probably best to ask someone in operations first who knows this.
[19:17:28] Krinkle: will do, thx
[19:18:32] Also, don't run on more than 1 wiki at a time. Meaning, run on test before any others (the whole procedure of the tables, extension enabling, etc.), then if that is all fine, do it from a-z on other wikis.
[19:18:52] so you don't have to revert it on 3 wikis :)
[19:20:21] yeap, :)
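The scripted route Krinkle is pointing at is MediaWiki's standard schema-update machinery: the extension registers its table via the LoadExtensionSchemaUpdates hook, and update.php creates it per wiki. A minimal sketch, with the extension, table, and file names invented for illustration:

    <?php
    // Minimal sketch of shipping a new extension table so update.php can
    // install it, instead of running the SQL by hand. The table and file
    // names here are invented for illustration.
    $wgHooks['LoadExtensionSchemaUpdates'][] = function ( DatabaseUpdater $updater ) {
        $updater->addExtensionTable(
            'myext_items',                    // table to create if missing
            __DIR__ . '/sql/myext_items.sql'  // patch file that defines it
        );
        return true;
    };

On the Wikimedia cluster the actual application still goes through operations, as advised above, but the hook keeps the schema change reproducible for every other install.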
[19:57:02] robla, regarding test2 - so the sequence should be test -> test2 -> prod?
[19:57:24] yup
[19:57:42] and how much time between the arrows? :P
[19:59:34] robla, ^^
[19:59:36] This is Robert Miller, host for the brownbag today. Setting up now.
[20:01:57] MaxSem: in a mtg right now, but quick answer: it can be minutes, it just needs to be deployed to test2 so that Chris can do write testing
[20:02:16] this is the link to the youtube stream http://youtu.be/BYtkl9AD04M
[20:09:43] this is the link to join the brownbag google hangout https://plus.google.com/hangouts/_/fe5ac9fee322036c50f1bb8744c09f549c557bea#
[20:11:12] google hangout link for the brownbag https://plus.google.com/hangouts/_/fe5ac9fee322036c50f1bb8744c09f549c557bea#
[20:19:15] There is an unmerged change on CommonSettings.php: if ( $wgDBname === 'testwiki' ) $wgCaptchaFileBackend = 'global-multiwrite';
[20:20:54] <^demon> AaronSchulz: Was that you? ^
[20:20:56] bsitu: yeah, that was there so a script could run
[20:21:01] <^demon> :)
[20:21:06] the python script is giving me trouble though
[20:21:15] and I was off in an ops meeting
[20:21:25] "Unable to find valid word combinations"
[20:22:41] AaronSchulz: I can't merge my change; do you want me to reset the file and apply the change via gerrit?
[20:23:18] I just reset
[20:23:38] AaronSchulz: thx
[20:34:38] ok, this --blacklist param seems useless
[21:04:02] New patchset: Hashar; "run tests for some whitelisted users" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39310
[21:06:41] New patchset: Hashar; "run tests for some whitelisted users" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39310
[21:21:51] New patchset: Hashar; "run tests for some whitelisted users" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39310
[21:43:33] AaronSchulz, are captchas seriously broken?
[21:44:01] cause we're trying to test mobile account creation and... :P
[21:49:46] ori-l: FYI, concerns raised about E3 cookies on wikitech-l
[21:50:14] I think Adam wants to verify that they were vetted by legal.
[21:51:32] prolly at lunch
[21:52:25] ori-lunch
[21:53:01] I always wondered what the L was for ;)
[22:04:21] New patchset: Hashar; "run tests for some whitelisted users" [integration/zuul-config] (master) - https://gerrit.wikimedia.org/r/39310
[22:05:55] MaxSem: on testwiki? yes
[22:26:00] sleep time *waves*
[22:26:45] AaronSchulz: yeah, on testwiki.
[22:29:39] I need a wikitech-l moderator to help me fix a problem with my buggy mailman aliases
[22:32:34] awjr: sorry, I've been running into one bug after another with this captcha script
[22:32:39] no one has run this stuff in ages
[22:33:28] /home/wikipedia/common/php-1.21wmf6/bin/ulimit4.sh: line 4: 17455 Killed
[22:33:30] * AaronSchulz sighs
[22:36:23] doh!
[22:37:27] AaronSchulz: is that for generating the captcha images?
[22:38:26] yes
[22:38:51] awjr: do you need testwiki and not test2wiki?
[22:39:03] AaronSchulz: yeah
[22:39:07] maybe I can just switch to running it via another wiki
[22:39:17] we can't do accurate MobileFrontend testing on test2 yet
[23:34:10] AaronSchulz: is it easier to rewrite the captcha, maybe? :p a new image generation method implementation was proposed some months ago on wikitech
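For reference, the script AaronSchulz is wrestling with above is the FancyCaptcha generator (captcha.py in the ConfirmEdit extension); "Unable to find valid word combinations" is apparently the error it emits when the wordlist, after blacklist filtering, yields nothing usable. A hedged sketch of an invocation — the flag names are recalled from the 2012-era script and may differ by version, and all paths are placeholders:

    # Sketch only: flags recalled from memory of the 2012-era script
    # (--blacklist is confirmed by the log above); paths are placeholders.
    python extensions/ConfirmEdit/captcha.py \
        --wordlist=/usr/share/dict/words \
        --blacklist=badwords.txt \
        --key="$CAPTCHA_SECRET_KEY" \
        --font=/usr/share/fonts/truetype/freefont/FreeSansBold.ttf \
        --output=/tmp/captchas \
        --count=1000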