[02:34:05] Prodego: thanks [06:22:57] Mono[Away]: why don't you follow docs? you'd save time http://www.mediawiki.org/wiki/Special:MyLanguage/Help:Extension:Translate/Page_translation_administration [07:42:24] 1.22wmf6 - START OF ONE WEEK DEPLOY CYCLE [07:42:26] scary [08:20:30] :D [11:09:10] What is the advantage of the multistream bz2 archives + the index file vs the usual single stream bz2 archive? I'm referring here to the Wikipedia dump file formats. dewiki-latest-pages-articles-multistream.xml.bz2 + dewiki-latest-pages-articles-multistream-index.txt.bz2 vs dewiki-latest-pages-articles.xml.bz2 [11:10:03] BadDesign: you can avoid extracting / loading the entire thing at once [11:10:22] BadDesign: you'd just extract the index, and use it to find out which stream contains the page you wanted to see, and only extract that one [11:11:21] and of course the files are smaller, which can sometimes be easier to work with [11:20:18] MatmaRex: I'm thinking of downloading the Wikipedia dump files and while still downloading unzip the chunk and XML parse it, can I do something like this? [11:21:01] Ultimately I need to go through each page of the dump and extract something from the tag [11:22:29] BadDesign: uh, i have no idea if that's possible [11:22:47] i always downloaded the entire thing, unzipped the entire thing, and only then played with it [11:23:04] no idea if a "streaming" approach would work, you'd probably need some customized software [11:24:11] MatmaRex: Do you unzip the dump inside the code or through some script / shell?
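(The streaming question above is doable in principle. A minimal Python sketch, assuming a single-stream bz2 dump and an iterable of compressed chunks such as a streaming HTTP download would give you; the function name and the choice of pulling out `<title>` elements are mine, purely for illustration:)

```python
import bz2
import xml.etree.ElementTree as ET

def titles_from_stream(chunks):
    """Yield page titles from a bz2-compressed XML dump without ever
    holding the whole file in memory.  `chunks` is any iterable of
    compressed byte chunks, e.g. resp.iter_content() of a streaming
    HTTP download.  Assumes a single bz2 stream; the multistream
    variant would need a fresh BZ2Decompressor per stream (re-seeded
    from .unused_data)."""
    decomp = bz2.BZ2Decompressor()
    parser = ET.XMLPullParser(events=("end",))
    for chunk in chunks:
        parser.feed(decomp.decompress(chunk))
        for _event, elem in parser.read_events():
            # the real dump XML is namespaced, so match on the local tag name
            if elem.tag.rsplit("}", 1)[-1] == "title":
                yield elem.text
            elem.clear()  # free elements as soon as we've seen them
```

(So "download, decompress, and parse at the same time" works as long as both the decompressor and the XML parser are incremental.)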
[11:24:55] BadDesign: i wget it, then bunzip2 it: :) [11:25:26] BadDesign: and then i go through it with a little script that is barely more than a grep, but AWB also includes a full-blown database dump scanner [11:25:33] https://en.wikipedia.org/wiki/Wikipedia:AWB [11:30:12] I've written the XML parsing code in C++; I just need to figure out how to input the dump file to it [11:30:36] *the best way to get the dump file to it [11:31:16] But I think I'm going with a bash script that I'll put inside a cronjob that once a month fetches the dumps I need [11:35:11] MatmaRex: So the multistream files are only used when I programmatically unzip the bz2 file? [11:37:12] BadDesign: i have never used them myself, so i can't really tell you [11:37:35] BadDesign: i think they are just parts of the full XML data [11:38:05] BadDesign: with the index being a mapping of article title to which file it's in [11:40:39] So for example if I have this entry in the index: 508:84:Ardentiner-Massakren the third column is the article title, the first column is the stream and the second column is the page ID or what? [11:49:31] BadDesign: i have absolutely no idea, sorry :) [11:49:40] probably stream and page's position in it? [11:49:59] it seems a little short for a page id, and a page id would be a little useless in the index anyway [11:57:53] oohhh people talking about the dumps [11:58:04] * apergos lurks [11:58:26] the multistreams are multiple bz2 streams concatted together, with some indexing [11:58:51] you can use them to (sort of) seek to a page. well close to it anyways, to the block containing the page [11:59:27] there's a little python script in the... ah. heh. lemme find it.
in the repo that's there as an example of how one could use the file and the index [11:59:42] BadDesign: [11:59:45] y [12:00:44] The documentation on this matter is extremely scarce; even the bzip2 website docs barely mention a thing [12:00:52] https://gerrit.wikimedia.org/r/gitweb?p=operations/dumps.git;a=tree;f=toys/bz2multistream;h=95916c873c8c691a3450a919bdbc2f227fd9afb8;hb=refs/heads/ariel [12:00:59] well there's not that much to it [12:01:15] so the story is that bz2 is block oriented but the blocks are not guaranteed to be byte aligned [12:01:30] so while there is a start of block marker and an end of block marker in the format, finding them turns out to be [12:01:35] quite annoying [12:02:12] the easier thing (and one that most bindings will support) is being able to feed a complete bz2 object with header etc, because that is guaranteed to be byte aligned [12:02:28] you can think of it as a pile of bz2 files just concatenated together [12:05:01] Good, thanks for the info. Using this approach renders my XML parsing code useless, right? [12:05:18] I don't know what your xml parsing code looks like [12:05:45] I parse the XML dump after it has been already completely decompressed [12:05:50] ah [12:05:57] well here you might want to get the specific page [12:06:05] then parse it for the entities and information you want [12:06:15] so presumably that part of your code would still be helpful [12:09:02] yes, the extracting of useful info from the page would be similar, but if I need to go through each page anyway and not a specific page isn't the single bz2 stream the same as multistream?
[12:09:15] *approach [12:10:24] If I've not misunderstood the differences between the two [12:11:51] file-offset:page-id:page-title is what's in the one index [12:12:09] if you need to do them all then there are still a couple of choices [12:12:33] you could run a few jobs in parallel by taking the index and starting at a few different points in the file [12:12:44] that is, specify a starting and ending point or something [12:12:46] to each job [12:13:44] you might have to uncompress each stream as a complete bz2 file (i.e. using the appropriate library calls) in order, that would be the only little hack depending on your language bindings [12:14:11] that would not be convenient to do with the regular bz2 file (running multiple jobs on different parts of the input) [12:14:50] these are compressed with 100 pages to a stream, so you could divide it up in multiples of a hundred I suppose [12:18:08] I'll go with the native C API offered by libbzip2 and add multithreading support around that, as I see the parallel bzip2 http://compression.ca/pbzip2/ doesn't expose a library [12:22:04] any WMF people here? stuff's broken: https://bugzilla.wikimedia.org/show_bug.cgi?id=48693 [12:22:08] ("Gadget settings cannot be changed on Mediawiki 1.22wmf4") [12:22:19] ok. In the meantime I have added a line or two of docs about the index file format, thanks for pointing that out [12:29:10] apergos: what docs? [12:30:33] in the toys/bz2multistream section of the dumps repo [12:30:41] where I linked to above [12:31:07] https://gerrit.wikimedia.org/r/gitweb?p=operations/dumps.git;a=tree;f=toys/bz2multistream;hb=refs/heads/ariel [12:34:00] Reedy: ping [12:34:20] (not ignoring you, just have no idea about the gadgets issue) [12:34:32] apergos: yeah, assumed so :) [12:35:06] what other ops do we have in european timezones?
roan's away, so's krinkle [12:35:29] most people are probably shuffling off to the hackathon around now [12:35:58] it's really only mark and paravoid but is this an ops issue? it sounds like a mw issue [12:36:15] well, it's deployed just about everywhere by now [12:36:20] uh huh [12:36:23] and will be deployed fully everywhere in a few hours [12:36:37] so i dunno, someone competent should look at it [12:36:43] well if people show up to deploy it elsewhere they can be told that there is this issue [12:36:50] that will be on the "dev" side [12:36:57] i marked it as a deployment blocker [12:36:58] I can bring it up if I'm online tonight [12:37:06] ok, presumably folks will see it [12:37:22] yeah. i was just wondering if someone's online now to do that [12:38:10] looks like not [12:38:17] * apergos goes to look at the deployment calendar [12:38:59] 9 pm my time, reedy [12:39:10] ok, I can probably point it out [12:39:24] and frankly he'll probably show up before then, so... [13:26:36] * hashar shows up [13:29:29] * valhallasw is surprised by the puff of smoke [13:40:05] MatmaRex: I'm back [13:40:20] (And only temporarily in Europe this week and next :) ) [13:40:54] RoanKattouw: there's trouble :) https://bugzilla.wikimedia.org/show_bug.cgi?id=48693 [13:41:58] Yeah I saw in backscroll [13:42:02] I'll e-mail greg-g and Reedy [13:43:33] Sent [13:43:56] MatmaRex: ...which reminds me, are you coming to Amsterdam this weekend? [13:45:50] RoanKattouw: yup [13:46:40] Awesome [13:46:59] I was thinking you and I could sit down at some point and get the VE unit test suite running in Opera and IE [13:47:32] From cursory testing it seems there are a couple dozen test failures in Opera, and last time I tried IE I needed to fix half a dozen things before it ran without crashing [13:47:58] (yay better automated testing) [13:48:17] Ya srsly [13:48:37] The automated testing of this part is actually good, it's telling us exactly how badly we support those browsers [13:48:53] ha!
[13:48:54] RoanKattouw: hm, interesting [13:49:00] So we'd be fixing the issues the tests expose, rather than the tests themselves [13:49:10] RoanKattouw: has anybody found the time to look at the two opera-related changes i submitted some time ago? [13:49:21] RoanKattouw: https://gerrit.wikimedia.org/r/#/c/61365/ and https://gerrit.wikimedia.org/r/#/c/61358/ [13:49:25] MatmaRex: This is something I might do by myself anyway, but since you've been poking at VE Opera support I figured you might be interested [13:49:29] RoanKattouw: they're currently both -2'd [13:49:32] Hmm, IIRC those are the changes we keep bugging Inez to review [13:49:33] Let me see [13:49:57] https://gerrit.wikimedia.org/r/#/c/61365/ is a "proof of concept" and waiting for a response from Rob [13:50:00] i mean, i could just wrap that in a user-agent check, but i'm trying to understand why it was done like this in the first place [13:50:10] Which you're not gonna get any time soon because he shattered his ankle [13:50:20] I would love to know that too [13:50:24] oh. :/ [13:50:37] i mentioned rob there since he's the author of the original patch i'm "reverting" [13:50:48] Unfortunately Rob needs to stay at home with his leg held up for three weeks and that clock starts running /after/ he has surgery (which I think was supposed to be yesterday) [13:50:54] D: [13:51:10] sumanah: Yeah the VE team is on a great roll healthwise [13:51:16] aw man :/ awful [13:51:30] Before this, I got hit by a car while riding my bike to work and had my knees banged up for a week [13:51:47] Before that, Christian tried to stop a guy from stealing someone's backpack on BART and broke his index finger [13:52:33] The universe is conspiring against VE? [13:52:43] Clearly [13:52:51] Have you made propitiations to Athena or Saraswati or somebody?
[13:52:55] We were also all sick more or less simultaneously during the latest wikiplague [13:53:29] Hmm https://gerrit.wikimedia.org/r/#/c/61358/ just needs a response from Inez, I'll harass him about it when he's online tonight [13:54:06] sumanah: Maybe if we get Brandon to come up with a Greek mythology-based name for the project, it'll go away? xD [13:54:26] ha! [13:54:30] All I know is VisualEditor has never been renamed [13:54:42] Saraswati is a goddess of knowledge. [13:54:43] And every project we have cared about in the past few years was renamed at least once [13:55:02] (I probably also have an aunt or cousin named Saraswati) [13:55:04] Even Echo got its rename at the last minute [13:55:12] sumanah: Hah, thanks for the Greek mythology lesson [13:55:21] Saraswati is actually Hindu, Roan [13:55:24] * RoanKattouw has a knowledge gap when it comes to Gods and stuff [13:55:28] D'oh! [13:55:30] See [13:55:34] * sumanah laughs aloud [13:55:35] I don't know anything about any mythology [13:55:51] Anyway, I will just wait for July 29th, i.e. V-E day (like at the end of WWII) [13:56:05] * sumanah leaves y'all alone to do this work; she must pack [17:15:09] YuviPanda: around? [17:15:24] hey [17:15:25] maty [17:15:27] matanya: [17:15:28] sortof [17:15:33] about to crash in about 5 minutes [17:15:34] 'sup? [17:16:11] mostly fine, wanted to bug you about a regression in the wiki app for android [17:16:54] oh? [17:16:58] ah, just saw bugmail [17:17:07] https://bugzilla.wikimedia.org/show_bug.cgi?id=48716 [17:17:17] which might be the same as [17:17:18] https://bugzilla.wikimedia.org/show_bug.cgi?id=35270 [17:17:29] and looks like this: https://upload.wikimedia.org/wikipedia/commons/b/b8/Bug_in_wikipedia_app.png [17:17:57] hmm, I remember fixing this [17:18:00] but it was a long time ago [17:18:08] brion: ^ bug report on RTL stuff [17:18:22] seems to have regressed again [17:18:40] I remember speaking to aharoni about this some time ago [17:18:41] bleh.
what device is it? [17:18:55] also, another problem is that the android app is essentially in full blown maintenance mode right now [17:18:57] tablet lenovo ideapad k1 [17:19:13] Phonegap was a massive PITA, so ideally we wouldn't want to touch it [17:19:15] 'ideally' [17:19:15] why is it so? [17:19:30] looks a lot like the bug that I reported at last year's Wikimania... [17:19:40] aharoni: but we fixed it, didn't we? [17:19:53] I think I remember fixing *everything* you reported, for the most part... [17:20:09] looks like it's back :) [17:20:24] no, I think that 35270 is indeed fixed. [17:20:44] should it be reopened or is it a new bug? [17:21:09] I think that it's reported somewhere, although possibly closed as WONTFIX (tsk tsk tsk). [17:21:44] aharoni: https://bugzilla.wikimedia.org/show_bug.cgi?id=35270 [17:21:46] ? [17:21:53] no. [17:21:55] let me see... [17:23:59] yeah we gotta start planning a non-sucky wikipedia app :) [17:25:19] brion: too many things to do, too little time :( [17:25:32] wheeeee [17:25:55] you can count on my hand if it helps :) [17:26:09] ok i gotta pack everything up for the flight to AMS, be back on irc later :D [17:26:12] matanya: wheee! Plan is to make it 'native Android' [17:26:26] there was a mail to wikitech-l about why we were ditching PhoneGap [17:26:42] i remember something like that [17:29:54] flashing echo bar is so naughty [17:31:09] matanya: it's gonna be a while though. [17:31:19] good things take time [17:31:39] matanya: our current 'thing' is the Android Commons app, and I think when I come back to work (I'm technically offline now), we'll be going up full steam on getting 'campaigns' into the app [17:31:43] WLM, etc type stuff [17:32:07] yes, i have bugs in that app too :) [17:32:16] :D reported? [17:32:23] listing them up to file in bugzilla now [17:32:28] niicee!
[17:32:41] 4.2+ 'flipping' RTL support landed yesterday [17:33:15] for example, it doesn't show images smaller than 40k in the app [17:33:26] matanya: no, it doesn't show images less than 640px :) [17:33:28] known bug [17:33:36] I meant [17:33:44] as in, 640 is hardcoded in the app. [17:33:59] you can't search [17:34:14] yup. And I don't think search is on the roadmap right now... [17:34:17] and boy do i need this function [17:34:23] it's primarily 'first point of entry' [17:34:38] got that [17:34:43] matanya: we'll add search as soon as we have some sort of editing in place [17:34:46] not intended to hardcore as me :) [17:34:53] :D [17:35:19] matanya: how many uploads do you have, btw? [17:35:40] want to throw a number? [17:35:48] approximate number is good enough [17:36:00] i'm trying to check if my code is performant enough ;) [17:36:05] and are you seeing perf issues? [17:36:11] shoot a number, let's see if you are close [17:36:35] answer the second question? [17:36:37] and i'll guess the number [17:36:40] are you seeing perf issues? [17:37:15] matanya: ^ [17:37:16] a bit [17:37:34] hmmm [17:37:38] somewhat slow in loading the photos [17:37:41] throwing out a random number, 30,000? [17:37:48] more like, crashing, scrolling, etc? [17:37:53] as in, scrolling - is it bad? [17:38:09] double [17:38:15] about 65,000 or so [17:38:28] yes, scrolling is a nightmare [17:38:44] didn't check it until now [17:38:53] ah [17:38:55] and ow :( [17:38:56] what phone? [17:39:02] tablet [17:39:04] as above [17:39:15] lenovo ideapad k1 [17:39:25] so we're planning on shifting to a different networking library [17:39:30] with better caching / image support [17:39:35] should make performance much better [17:39:43] but hey, at least it didn't crash and burn :) [17:39:59] hmm, I think adding indexes should speed up your case a fair bit [17:39:59] i see i stand on some toes here :) [17:40:07] no, not crashing at all!
[17:41:32] i think there was someone else with 27,000 images [17:41:42] and didn't have *too many* problems [17:41:52] matanya: problem with perf testing for me is that I don't have that many images :P [17:42:11] and it'll be hard to make that many artificially, in testwiki or local [17:42:16] that can be easily solved [17:42:21] how? [17:42:28] upload using a bot [17:42:30] hmm, I can easily do 'sudo' [17:42:35] in the app [17:42:37] that too :) [17:42:40] since it is only listing your pics [17:42:42] doesn't need [17:42:45] auth [17:42:53] i'll do that when i'm back to work on the android stuff ;) [17:43:02] but do file the bugs you were going to :) [17:43:09] i will [17:43:13] matanya: and if you're going to test RTL, use latest build from https://integration.wikimedia.org/nightly/mobile/android-commons/ :) [17:44:46] that most likely won't work well on the device without testing [17:44:58] my rom is custom built [17:45:00] ah [17:45:17] like, you built it? [17:45:19] the stock stuff was pure crap [17:45:22] or is it CM / another ROM? [17:45:30] you had 2.3 on that? [17:45:33] yes [17:45:55] i took aosp and added a little customization [17:46:03] ah [17:46:07] nice! [17:46:13] i've never built all android from scratch [17:46:18] i had to pry out the IME and build it by itself [17:46:20] was a major PITA [17:46:30] much like building a linux from scratch [17:46:58] just more python and less shell [17:47:14] heh [17:47:17] and other stupid make file issues [17:47:23] Android.mk files aren't meant to be used standalone [17:47:29] but i'm stuck in a machine with only 128G space [17:47:29] :( [17:47:43] i found that out the hard way [17:48:04] 128G is a lot [17:48:13] i have only 25G [17:49:14] not 128G empty space [17:49:16] just 128G space [17:49:23] about 2G empty space [17:49:25] (laptop) [17:49:54] that is a bit [17:50:28] go get a drive :) it is very cheap [17:50:37] O BTW YuviPanda !
[17:50:49] matanya: laptop [17:50:52] Are you done with exams and all that? [17:50:57] aharoni: nope. 3 left [17:50:58] but [17:50:58] Are you available tomorrow? [17:50:59] but [17:51:01] next one is on 31st [17:51:10] Are you going to AMS? [17:51:11] so i'll be online around until saturday [17:51:15] aharoni: sadly no AMS [17:51:15] COOLNESS [17:51:17] :'( [17:51:23] aharoni: what tomorrow? [17:51:30] twn.net app meeting? [17:51:32] i can make it [17:51:36] YuviPanda: so, can't replace HD in a laptop? [17:51:42] No AMS for me, either, so I made myself a hackathon in TLV. [17:51:42] matanya: macbook air. [17:51:47] aharoni: oooh [17:51:49] nice! [17:51:54] how can I hlep? [17:51:56] *help [17:51:58] you deserve it then, YuviPanda :) [17:51:59] Or and Tomer will be there, [17:52:05] so they may have some questions, [17:52:12] aha! [17:52:13] and there probably will be a few more mobile devs. [17:52:19] sure, I'll make sure I'm on IRC [17:52:25] can you tell me the timings? [17:52:27] and TZ? [17:52:28] a few new ones to add to the community, hopefully. [17:52:35] India minus 3.5. [17:52:39] again: [17:52:44] 10:00- 22:00 IST [17:52:45] India minus 3:30. [17:52:53] what matanya said. [17:53:23] 10:00 AM I might not be up *that* early, but I'll be around by 11:00 AM or so [17:53:31] OK [17:53:57] :) [17:54:04] this one will be small and local, and if it goes well, I'll have a good reason to make the next one bigger and at least a bit international :) [17:54:25] I wish I was there [17:54:46] :) [17:55:10] though now I have no idea how good it will actually be... there will be some people who I know are serious, and some more people that I don't know at all, but I hope for the best. [17:57:12] Reedy: Hey reedy, we have a fix merged for the gadgets bug [17:57:25] aharoni: :D indeed. [17:57:28] aharoni: good luck. [17:57:33] https://gerrit.wikimedia.org/r/#/c/64953/ [17:57:40] oh please merge that already!
[17:57:42] aharoni: there *might* be one this weekend in my city (Chennai). I've mostly not been involved.... [17:58:00] kaldari: Please go ahead and cherry-pick that to wmf4 as well [17:58:00] (by mostly I mean not at all) [17:58:07] ok [17:58:34] aharoni: oooh, btw - I have a Firefox phone :) [17:58:37] I've abandoned my reverts [17:58:50] haven't tested Indic rendering support on it, but last time I checked it was horrible [17:59:01] gerrit seems slow today [17:59:05] Theo10011: I guess you have your answer now :-) [17:59:09] YuviPanda: is it any good? :) [17:59:16] aharoni: the phone? [17:59:20] yes. [17:59:22] aharoni: well, it's a developer preview.... [17:59:32] i have one with a very old OS build, and I don't quite know how to reinstall it. [17:59:42] (too lazy^H^H^H^H busy to check.) [17:59:49] aharoni: i've been using it as a dumbphone. When I'm studying, I keep all my other devices off, and only a few friends know the number in the FF phone. [17:59:58] :) [18:00:02] aharoni: and I literally can't seem to do much else in that, so i'm using it as a dumbphone :) [18:00:20] aharoni: text input in particular is very bad. Plus plenty of performance issues. [18:00:30] but it's far less 'obviously' buggy than I expected [18:00:54] OK [18:01:07] aharoni: incredible battery life though [18:01:13] aharoni: anyway, let me know if you want me to do some tests on it [18:01:23] Ok [18:02:10] RoanKattouw: half the time I do a cherry pick it works and half the time I get "fatal: bad object". Do you know how to prevent that? I usually just switch to the more explicit cherry-pick syntax rather than git cherry-pick . [18:03:44] I guess I just need to explicitly fetch it in those cases [18:06:41] greg-g: Who's doing the WMF deployment today? [18:09:43] AaronSchulz: do you know who's doing the WMF deployment today?
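(On the "fatal: bad object" cherry-pick issue above: that error usually means the commit isn't in the local repository yet, so explicitly fetching the change from Gerrit first does fix it. A hedged sketch; the repo path and patchset number below are illustrative, the real ref comes from the change's Download box in Gerrit:)

```shell
# Fetch the change ref from Gerrit (refs/changes/<last two digits of
# change number>/<change number>/<patchset>), which puts the commit
# into FETCH_HEAD locally...
git fetch https://gerrit.wikimedia.org/r/mediawiki/core refs/changes/53/64953/1

# ...and then cherry-pick it onto the current branch.
git cherry-pick FETCH_HEAD
```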
[18:11:10] I guess everybody's gone :P [18:20:07] greg-g: Since it looks like no one from platform is around (probably drinking beers in Amsterdam), I'll just do the deployment myself for the fix. [18:23:21] kaldari: i think roan is around [18:23:33] RoanKattouw: ^ [18:23:39] that's OK, I'm almost finished [18:23:42] k [18:27:36] all done [18:27:41] :) [18:28:37] aude, kaldari: I'm doing it as Reedy isn't here [18:28:43] But I have to run around and do a few things first [18:28:53] kaldari: Are you deploying the preferences fix? [18:28:58] already did :) [18:28:59] done already [18:29:58] OK [18:30:17] Give me a minute to take care of some stuff back home and I'll be all up in ur wikis, updatin ur code [18:32:04] RoanKattouw_away: wmf4? [18:39:25] hey guys, gerrit is 500ing. [18:39:34] Guice provision errors: [18:39:38] Cannot open ReviewDb [18:39:38] at com.google.gerrit.server.util.ThreadLocalRequestContext$1.provideReviewDb(ThreadLocalRequestContext.java:71) [18:39:38] while locating com.google.gerrit.reviewdb.server.ReviewDb [19:05:27] AaronSchulz: wmf4! [19:05:29] If you wanna do it go ahead [19:05:36] I've never actually done one of these in the new system before [19:05:39] already done [19:05:43] and I only took it because no one else seemed to be around [19:05:45] Oh awesome xD [19:05:47] Thanks man [19:17:21] I need to pick your brains. What whitelist do we need to update again if we know the IPs we're going to use at the hackathon? [19:18:10] multichill: file a bug? [19:18:25] multichill: it's somewhere in mediawiki-config repo, i think [19:18:47] I'm sure apergos knows, but not sure if he's online [19:19:02] ? [19:19:17] it's 10 pm for me so I'm here but not here [19:19:29] yes, you can bugzilla it and yes it's a mw config setting [19:19:40] jeremyb would know [19:19:53] apergos: What is it called again?
[19:19:57] it's good to file it in advance, not wait til the day of [19:19:59] uhh [19:20:10] you're gonna make me look for the setting aren't you :-P [19:20:18] apergos: I just installed the wireless, we start in about 24 hours ;-) [19:20:38] And maybe you can add 213.127.161.46 and 77.170.89.227 ? :P [19:20:52] (yes, we got two internet connections, don't ask) [19:21:13] wmf-config/throttle.php [19:21:18] apparently they go in there now [19:21:21] new system [19:21:55] wmfThrottlingExceptions [19:22:25] http://noc.wikimedia.org/conf/throttle.php.txt [19:26:29] Hmm, that's just new account creation. I thought we had something else too, but I probably just mixed something up [19:28:34] I'm afk for the night (don't break the site, kids) [19:28:43] :) [20:52:13] search error: pool queue full [20:52:22] this is new for me, always nice surprises :) [20:55:12] Nemo_bis, where did you see that? [20:56:43] Special:Search [20:57:00] the new nice non-misleading errors which replaced bogus 0 results [21:03:38] MatmaRex: what's the bug number for the broken custom edit toolbar? [21:04:11] uhhh [21:04:19] !bug mwcustom [21:04:19] https://bugzilla.wikimedia.org/buglist.cgi?quicksearch=mwcustom [21:04:26] those two. [21:04:38] that one, actually [21:04:40] the other is old [21:04:55] Nemo_bis: ^ [21:07:25] MatmaRex: do I guess correctly that I just have to give up hoping? [21:13:09] heyas. Is this the getting-help-writing-a-bot-with-pywikipedia channel? [21:14:13] eptalon: #pywikipediabot is the right channel, technically [21:14:33] ok. [21:14:39] but it's quite low-volume. Depending on the question, you can also try the mailing list, pywikipedia-l@lists.wikimedia.org [21:18:00] valhallasw: I technically can separate a wikitext, but I can't seem to parse '====h4====somesectiontext...'
into a dict of 'h4:somesectionText' [21:19:22] eptalon: https://github.com/earwig/mwparserfromhell might help you [21:20:13] other than that, you'll have to rely on string manipulation and regexps [21:21:43] Nemo_bis: most likely. this is quite hairy [21:21:53] Nemo_bis: and the fix is a pretty simple search and replace [21:21:57] the fix for users* [21:22:31] valhallasw: I am currently doing that (string parsing, in essence find()) [21:23:28] MatmaRex: that's not pretty simple [21:23:45] the large, large majority of users just copy and paste them [21:23:52] docs have not been clear for years [21:23:58] and nobody knows about the breakage [21:24:24] anyway, nothing to discuss, we'll just need someone to clean up the mess [21:24:31] Nemo_bis: is anybody actually complaining? [21:24:45] Nemo_bis: because i've got a hunch that most people with this still in their user js are inactive by now [21:24:48] I don't get the question [21:24:57] I'm active [21:25:02] (by which i mean, super-low priority) [21:25:35] people don't complain because they are used to stuff just breaking [21:25:45] and they don't know who/where to ask [21:25:56] because in that case you'd be the third person having issues [21:26:10] they certainly know where their local village pump is [21:26:29] and i'm sure that every reasonably large project has at least one active person with a bugzilla account [21:26:38] so what [21:26:45] i haven't seen complaints about this on VP/T [21:26:55] i haven't heard any at pl.wp either [21:26:58] and i mean literally none [21:27:00] * Nemo_bis doesn't care at all about WM/T [21:27:03] and i watch all the subpages [21:27:05] *WP/T [21:27:07] and i'm the go-to tech guy [21:27:21] zero people noticed [21:27:27] and we're the fifth largest project or something [21:27:53] with like every third admin still using monobook and old edit toolbar [21:28:14] yes, there's very poor communication [21:28:17] we already knew [21:28:20] * Nemo_bis going to bed now [21:28:22] oh
please [21:28:27] there *is* communication at pl.wp [21:28:38] at least between people and me (sort of a tech ambassador) [21:28:52] we're considerably less bad at it than en.wp, for example, i believe [21:29:51] well this issue shouldn't exist ideally, but it's a noop now, not 'crashing'. [21:30:00] as far as i understand. [21:33:15] perhaps we can put some sort of watch condition on the global and then 'update' the toolbar. [22:29:10] I'm curious about why shutting down just a few of the app servers affected the incoming network traffic so dramatically: http://j.mp/14yRfyw
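(On eptalon's section-splitting question from earlier, besides mwparserfromhell, the "string manipulation and regexps" route can be sketched like this in Python. A hedged example: it assumes headings sit on their own lines, drops any text before the first heading, and ignores edge cases a real wikitext parser handles, like headings inside <nowiki> or templates:)

```python
import re

def split_sections(wikitext):
    """Split wikitext into a {heading: body} dict using ==...==
    heading markers (== for h2 up to ====== for h6)."""
    sections = {}
    # a heading is a line like ====h4==== with matching = runs on
    # both sides; the backreference \1 enforces the matching run
    parts = re.split(r"(?m)^(={1,6})\s*(.*?)\s*\1\s*$", wikitext)
    # parts = [lead, markers1, title1, body1, markers2, title2, body2, ...]
    for i in range(1, len(parts), 3):
        sections[parts[i + 1]] = parts[i + 2].strip()
    return sections
```

(For anything beyond quick scripting, mwparserfromhell's node tree is the safer choice, as suggested above.)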