[00:09:38] marktraceur: Elsie: please file a bug so this issue can be resolved before the next deployment. Importance->High i'd say.
[00:09:44] don't want to lose it
[00:10:20] Api is down.
[00:10:22] https://meta.wikimedia.org/w/api.php?format=jsonfm&action=query&meta=globaluserinfo&guiprop=merged&guiuser=Rillke
[00:10:43] Again?
[00:10:52] again?
[00:10:52] en.wikipedia.org seems fine.
[00:10:53] Wikimedia Foundation Error Request: GET http://meta.wikimedia.org/w/api.php?format=jsonfm&action=query&meta=globaluserinfo&guiprop=merged&guiuser=Rillke, from 208.80.152.16 via sq62.wikimedia.org (squid/2.7.STABLE9) to ()
[00:10:53] Error: ERR_CANNOT_FORWARD, errno (11) Resource temporarily unavailable at Fri, 30 Aug 2013 00:10:26 GMT
[00:10:59] ugh, same error
[00:10:59] Technical_13: What do you mean by again?
[00:11:06] It loaded for me.
[00:11:10] Took a bit, though.
[00:11:13] Elsie: it happened today around.... 3pm pacific?
[00:11:18] Api was down earlier too.
[00:11:21] Problems in es.wiki also
[00:11:21] Elsie: the issue was with esams
[00:11:26] What'd you do to the API?
[00:11:33] so, not an issue for us 'mericans
[00:11:39] Hmm.
[00:11:45] addshore broke it.
[00:11:57] At least that is my theory...
[00:12:00] And in every wiki I'm testing
[00:12:15] jem-: you're located in Europe, right?
[00:12:41] rillke: where are you physically located? which continent?
[00:12:42] Yes, I am.
[00:12:45] * greg-g nods
[00:13:01] * greg-g goes to read backscroll from when this happened earlier today
[00:13:02] Can I somehow connect to the U.S. servers?
[00:13:10] I don't see anything in the SAL.
[00:15:57] commons seems slow for me.
[00:19:50] greg-g: Yes, in Spain more precisely
[00:20:04] rillke: working now?
[00:20:04] jem-: ^^ ?
[00:20:22] Not yet
[00:20:46] (I'm checking just the API with my bot)
[00:21:01] No.
[00:21:08] hm, well I switched the service to a working datacenter
[00:21:17] Error: ERR_CANNOT_FORWARD, errno (11) Resource temporarily unavailable at Fri, 30 Aug 2013 00:20:56 GMT
[00:22:06] I must go to bed... good luck with it :)
[00:22:08] Do I have to flush my DNS cache?
[00:26:15] https://commons.wikimedia.org/w/api.php?action=tokens&format=jsonfm also fails
[00:26:26] this means UploadWizard fails
[00:28:14] rillke: no, it seems my change didn't switch things
[00:31:37] It's back.
[00:58:08] Hi! Was there any recent config change which might have caused the bold items from my https://pt.wikipedia.org/wiki/Special:Watchlist not to be bold anymore?
[00:59:17] How recent? 6-7 hours ago?
[02:40:52] Looks like the HTTPS switch broke Flickr importing on Commons. Anyone want to +2 the fix: https://gerrit.wikimedia.org/r/#/c/81890/
[02:47:48] kaldari, default UploadWizard config should be modified as well
[02:48:03] true, i'll do that now
[04:10:31] wtf, i am logged out and seeing commons in the dyslexic font
[05:58:14] oh, overzealous magic linking https://bugzilla.wikimedia.org/show_bug.cgi?id=37583#c3
[06:00:07] ori-l: could you check this in the logs for me? https://bugzilla.wikimedia.org/show_bug.cgi?id=28827#c19
[06:00:38] sure
[06:01:53] it happened again today
[06:02:03] [29-Aug-2013 15:45:50] Catchable fatal error: Argument 1 passed to EditPage::__construct() must be an instance of Article, null given, called in /usr/local/apache/common-local/php-1.22wmf14/extensions/LiquidThreads/classes/View.php on line 734 and defined at /usr/local/apache/common-local/php-1.22wmf14/includes/EditPage.php on line 272
[06:02:24] let's see what lqt is trying to do there
[06:03:38] ori-l: you should go to sleep soon
[06:03:57] probably
[06:04:45] lol
[06:05:05] Reedy: I don't even know what to say to you ;)
[06:05:23] Reedy: how long have you been up?
[06:06:30] 16 hours or so I think
[06:07:37] golly gee
[06:07:43] or something
[06:08:16] I went to bed 3 or 4 hours ago but apparently couldn't sleep
[06:08:40] ok, time for me to go. 4 things checked off my "moving out of this crazy house" todo list
[06:08:47] Reedy: ugh
[06:08:52] good luck for tomorrow/today
[06:10:07] LQT has methods like 'assertSingularity'
[06:10:18] I'm not making that up
[06:11:09] good night greg-g
[06:11:31] hmm ori-l, I've pasted your line in the bug but now I wonder if it actually is the same thing
[06:12:03] dunno yet
[06:12:23] LQT has plenty of bugs with X must be an instance of Y fatals
[06:15:03] this must be an instance of one of those
[06:25:59] Nemo_bis: sorry, I'm running out of steam tracing this down
[06:26:09] i got to a comment that reads: // Yuck.
[06:26:36] i'd be happy to keep looking tomorrow, feel free to ping me about it
[06:36:27] ori-l: haha, LQT does that
[06:38:18] it seems Krenair has a special mithridatisation against LQT madness, we should isolate his anti-toxin and distribute the vaccine
[11:28:38] hello, i wanna help wikimedia, how can i contribute ?
[11:28:39] i mean in the tech field
[11:36:30] cortexA9: https://www.mediawiki.org/wiki/Special:MyLanguage/How_to_contribute
[11:37:00] MatmaRex: thanks
[11:37:30] cortexA9: the good stuff for developers is at https://www.mediawiki.org/wiki/Developer_hub :)
[11:40:28] MatmaRex: it's not possible to contribute in the sysadmin operations?
[11:41:49] cortexA9: i don't really know, sorry
[11:42:09] ok no problem thanks MatmaRex
[11:42:24] cortexA9: but wikimedia's operations stuff is usually handled by wmf's operations team :)
[11:42:49] cortexA9: you can help with e.g. coding up configuration changes projects ask for, but that's usually not too exciting
[11:43:37] cortexA9: their channel is #wikimedia-operations, feel free to hang out there :)
[11:44:07] MatmaRex: good, thanks :)
[13:15:19] Anyone that can explain why the search results in the drop-down menu suddenly have a very low hit rate?
[13:17:08] jeblad: where? mw wiki?
[13:17:47] It's at nowiki, but I've heard people say it is also at other wikis
[13:18:19] I have only observed it on nowiki and it seems like obvious articles are left out
[13:19:29] central login has been logging me in to wikis without actually properly applying user settings such as skin
[13:19:39] i feel it's a new thing, started happening today
[13:20:03] with popup, similar to the 'edit was saved' one (never saw these before either)
[13:20:15] you need to refresh for user preferences such as skins, you have always needed to do that
[13:20:22] gry: that message has been around for ages
[13:20:34] unless you had js to disable it
[13:20:46] edit message yes, 'central bla bla logged you in' is new
[13:21:35] i mean edit message is old, but the central auth one is new, i know these popups but they appeared from central auth only today
[13:21:39] I think the skin is failing more often now, I've been asked about it several times lately
[13:22:27] is there a way to disable these popups without js tricks btw? i would prefer messages similar to 'you have new talk messages' which are static and dismissable
[13:22:45] without having to re-do js stuff at each new computer
[13:23:29] them moving around is more or less ok but them /popping up/ disturbs me
[13:53:19] okay, I am still having problems where rather often page load takes a long time because requests to mediawiki's resourceloader do not complete. is this a known problem?
[15:01:49] parent5446: hello
[15:02:44] Hey
[15:03:14] apergos?
[15:04:05] apergos is on vacation today (and i think also on monday)
[15:04:47] Ah OK, well for some reason he's still on IRC
[15:05:25] yeah
[15:06:13] so, today i'm working on using zdelta (to compare it with other options)
[15:07:53] OK. How many algorithms are you comparing in total again?
[15:09:08] well, according to that paper, zdelta is best for delta compression, so i'm not planning on trying another delta compression algorithm
[15:09:25] but i want to compare it with using LZMA on a group of revisions
[15:09:43] OK. Makes sense since zdelta is LZ77-based.
[15:11:07] and in both cases, i want to try some options, like the size of a group or what happens if i don't make one huge delta chain for all revisions of a page, but a separate chain for each n revisions (so that loading the last revisions doesn't require all the previous ones)
[15:14:03] and first results look somewhat promising: the tenwiki dump i'm testing it on has 100 MB in uncompressed XML, 2 MB as xml.7z, if i remember correctly, ~30 MB with each revision separately compressed using LZMA, and ~5 MB for zdelta
[15:14:35] that's still more than 2x the size of xml.7z, but i think it could work
[15:14:57] Hmm, yeah it's still bigger, but still has a good compression ratio.
[15:16:19] yeah
[15:17:11] OK, well I don't have anything else. Hopefully the test results stay the same or get better.
[15:17:27] yeah, we'll see about the other options
[15:17:48] i don't have anything either; see you monday
[15:17:55] See you monday
[16:04:22] https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2013-08-28/Technology_report <- If anyone sees anything that's been left out of this, please let me know
[16:08:18] I'm holding back the new VE buttons until they get a bit nearer en-wiki, since "Faster loading" is a pretty good VE lead for this week.
[16:08:48] Galleries get a big writeup as they've just reached en-wiki.
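[Editor's note] The delta-chain grouping discussed above (a separate chain for each n revisions, so reading a recent revision doesn't require decompressing the whole history) can be sketched in a few lines. This is a hypothetical illustration only: it uses zlib preset dictionaries as a stand-in for zdelta, which has no stdlib binding, and all names are made up.

```python
import zlib

# Sketch of the delta-chain grouping idea: each revision is compressed
# against the previous one (as a zlib preset dictionary); every
# `chain_len` revisions a fresh chain starts, so reading the latest
# revision never walks the page's entire history.

def compress_revisions(revisions, chain_len=4):
    chains = []
    for i, rev in enumerate(revisions):
        if i % chain_len == 0:
            chains.append([])
            comp = zlib.compressobj()  # chain head: no dictionary
        else:
            comp = zlib.compressobj(zdict=revisions[i - 1])  # delta vs. previous
        chains[-1].append(comp.compress(rev) + comp.flush())
    return chains

def read_revision(chains, index, chain_len=4):
    # Only the links of one chain, up to `index`, are decompressed.
    chain = chains[index // chain_len]
    text = b""
    for j, blob in enumerate(chain[: index % chain_len + 1]):
        dec = zlib.decompressobj() if j == 0 else zlib.decompressobj(zdict=text)
        text = dec.decompress(blob)  # becomes the dictionary for the next link
    return text

revisions = [b"revision %d: mostly unchanged article text " % i * 8 for i in range(10)]
chains = compress_revisions(revisions)
restored = read_revision(chains, 5)
```

The trade-off is exactly the one raised in the conversation: shorter chains mean faster random access to recent revisions but a worse compression ratio, since each chain head is compressed without a dictionary.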
[16:08:58] And, well, they're kind of gamechangers
[16:24:51] AdamCuerden: In regards to your post to wikitech-ambassadors about if you missed anything - The campaign namespace already exists: example: https://commons.wikimedia.org/wiki/Campaign:wlm-nl YuviPanda is currently working on making a less technical pretty display for them like http://blue-dragon.wmflabs.org/wiki/Campaign:show-off-campaigns
[16:28:00] Right
[16:28:14] Let's see...
[16:28:40] I think that got edited down in Kirill's copyedit
[16:30:28] * '''Campaign infrastructure improvements planned for Commons''': [[:mw:User:Yuvipanda/Campaigns_namespace_proposal|Improvements to the "campaigns" namespace]] have been proposed for Commons. The proposed infrastructure will allow programs like Wikipedia Loves Monuments to easily preformat file information and document uploads. Although a [[:commons:Campaign:wlm-nl|basic form of the idea]] exists already, this should vastly improve efficiency.
[16:30:35] Does that sound good?
[16:31:01] * AdamCuerden appends "and useability
[16:41:16] sounds good to me
[16:42:17] AdamCuerden: s/useability/usability/
[18:11:30] Mark: Technically, that's a BE / AE issue.
[18:11:53] AdamCuerden: Me?
[18:12:57] Aye. Isn't "Useability" the British spelling?
[18:14:05] wiktionary calls it simply an alternative spelling but doesn't claim British/American
[18:14:10] I um
[18:14:14] Hmm.
[18:14:15] I didn't think that was true
[18:14:27] Eh, well. Usability looks weird.
[18:14:27] James_F: "Useability" - en_GB or just en?
[18:15:07] "variant spelling"
[18:15:15] according to OED
[18:15:15] * marktraceur thinks there's a used car marketing campaign in "USA-bility", maybe for President's Day or something
[18:15:17] all you variants
[18:35:22] Eh, well. Acceptable variant is acceptable variant =)
[18:40:19] http://en.wiktionary.org/wiki/variant#Alternative_forms
[18:40:27] Hah
[18:40:44] Yo dawg, I herd you like varients.
[18:41:31] variaunt vagrant = future Ubuntu release?
[18:41:52] https://github.com/DataDog
[18:42:16] <-- open source stuff of https://www.datadoghq.com/
[18:42:50] mutante: IF it gets that far :)
[18:43:20] hah, yea
[18:56:34] What is Ubuntu up to? U or so?
[18:56:45] Q, methinks
[18:56:55] October will be R IIRC
[18:57:35] Oh, maybe April was R
[18:58:09] "Raring Ringtail"
[18:58:13] Next up "Saucy Salamander"
[18:58:22] better than sassy salamander?
[18:58:47] Sadistic Sasquatch
[18:59:22] Wonder what moneybags will use for T
[18:59:39] Oh, it'll be an LTS
[18:59:46] Terrifying Tapir
[18:59:47] The adjective had better be "Trusty"
[19:00:01] Trusty Turtle
[19:00:12] Trusty Tortoise, you mean
[19:00:13] :P
[19:00:17] Ooh, better.
[19:00:25] Trusty Tortoise, you heard it here first
[19:00:49] Though they might go with Tapir
[19:01:00] Them liking their exotic animals so much
[19:01:32] Terrifying Tarantula
[19:01:34] Treacherous Tarantula?
[19:01:37] gah, beat me
[19:01:39] ;)
[19:01:44] Terrible Tarantula, perhaps
[19:01:46] it would be a problem for them to decide between Tortoise and Turtle :P
[19:01:49] * YuviPanda likes spiders, they're cute.
[19:02:23] Platonides: if it was the wikimedia community deciding that, we'd have 80 RFCs and we'd have massive fights after one was actually decided upon
[19:02:40] then a staff member would mispell it
[19:04:08] and then there'd be 20 threads on different VPTs asking for all the staff to be fired
[19:04:16] VPs, not VPTs
[19:04:42] and then of course.. the Unbelievable Unicorn
[19:04:55] mutante: that would be a great name
[19:04:58] too bad that's not an LTS
[19:05:16] hah, yea:p
[19:05:43] Ryan_Lane: October 2014 is obviously when they start rolling releases - Unstable Unicorn will be that branch, and they'll start releasing a separate line for stable use
[19:05:45] Unfindable Unicorn?
[19:06:03] they'll probably start using absolutely random chinese words :P
[19:06:05] if it was LTS it should be 'unbreakable'
[19:06:38] emoji
[19:06:45] 'pile of shit'
[19:07:01] Promising Panda
[19:07:16] marktraceur: eh? I thought the code name was directly associated with the release date?
[19:07:17] Panda is clearly not obscure enough for them
[19:07:25] 💩
[19:07:40] brion: hehe, was too lazy to search
[19:07:41] Ryan_Lane: 14.10 will be the "U" release
[19:07:51] 2014, 14, October, 10
[19:07:51] Utopian Unicorn
[19:07:58] yeah, that's not an LTS
[19:08:05] Right, 14.04 is
[19:08:06] oh
[19:08:07] wait
[19:08:08] it is
[19:08:11] Which will be a "T" release
[19:08:25] I keep forgetting we're in 2013
[19:08:32] Tempting Trilobite
[19:08:47] Heh
[19:08:55] i used to have a trilobite
[19:08:58] Even-numbered .04 releases are LTS
[19:09:16] Troubled Tribbles
[19:09:23] :D
[19:09:31] ST + Ubuntu!
[19:09:36] YuviPanda: Oh no, they're totally going to do that
[19:09:38] Damn it
[19:09:47] I only wish.
[19:09:55] I was so excited about "Trusty" and the potential of tapirs.
[19:10:11] Tribbles got that beat
[19:10:18] so if we come up with release names for labs or something ..Worrisome Wikipedian
[19:10:43] Totally
[19:11:04] mutante: Dapper Dramah
[19:11:14] Or Deletionist
[19:11:34] Intrepid Inclusionist?
[19:12:13] marktraceur: Useability. "Usability" would canonically be pronounced US-AH-BIL-IT-IE, not EUSE-AH-BIL-IT-IE.
[19:12:37] James_F: No it wouldn't - the "a" after the "s" means "u" is pronounced "you"
[19:13:16] * marktraceur goes to lunch to debate this further
[19:27:11] Hi all
[19:45:08] robla: looks like we're converting the page into a DOMDocument and then using XPath to remove the stuff we don't want. Obviously not a very performant solution.
[19:47:35] robla: see MobileFrontend/includes/formatters/HtmlFormatter.php
[20:13:10] kaldari, could've been worse: https://graphite.wikimedia.org/dashboard/temporary-29 :)
[20:17:59] MaxSem: what are these graph lines showing?
[20:19:54] total transformation time
[20:20:06] what's the difference between the 2 lines?
[20:20:21] avg vs 99th percentile
[20:20:23] ah
[20:20:39] and this is in milliseconds?
[20:20:48] or seconds?
[20:20:53] ms
[20:20:54] :)
[20:21:07] seconds would've been a disaster :)
[20:22:44] that's not as bad as I would expect, especially for the 99th %.
[20:25:46] MaxSem: fwiw, Chad is looking into an HTML parser that Google wrote as something to bolt in as a C extension
[20:26:24] their new pure-C HTML5 parser? that has a certain appeal yes
[20:26:50] iirc speed wasn't a goal but i haven't compared it to libxml2 for speed
[20:26:51] brion: I think so. all I know is what I learned in our 2 min conversation on the subject
[20:26:55] for compatibility, it should be way better
[20:27:04] libxml2 occasionally really fucks up odd-looking html
[20:27:59] robla, for which purpose?
[20:28:23] MaxSem: they need to strip out some navigation stuff prior to indexing for search
[20:28:34] manybubbles: you may be interested in this conversation ^^
[20:28:52] heh, I had entertained myself with similar stuff too :)
[20:29:34] hey, yeah!
[20:29:37] * robla wonders about this: https://code.google.com/p/streamhtmlparser/
[20:30:18] so right now I've got a patch set that uses https://github.com/Masterminds/html5-php and it is slow. We're going to sit on it for a bit.
[20:30:19] meh, dead project, nevermind
[20:30:24] what are you trying to do?
[20:30:43] strip out some navigation stuff prior to indexing for search
[20:31:30] I'm not sure if there's an equivalent of SAX for HTML (or if SAX is more suitable for it these days)
[20:32:09] ori-l: pretty much. What I wanted was to be able to $('.whatever').remove() from php but what I got was xpath
[20:32:12] which isn't too bad.
[20:32:34] well, i know that using string processing methods to handle HTML is a way to get hospitalized in an asylum, but
[20:32:51] mediawiki html is uniform in some key respects
[20:32:54] manybubbles, is that parser faster than libxml? though yeah, I guess any random parser you find on the internetz is way less buggy than it :}
[20:33:19] i wonder if you could just grep for
[20:33:24] in any case, it looks like our options for actually parsing html5 in php are that html5-php which is slow-ish and gumbo-parser which doesn't support php yet so isn't really an option now.
[20:33:26] i guess you need to figure out the matching tag
[20:33:35] ori-l: well, http://stackoverflow.com/a/1732454/17865
[20:33:46] yes, i know :)
[20:34:01] no NO NOO̼O​O NΘ stop the an​*̶͑̾̾​̅ͫ͏̙̤g͇̫͛͆̾ͫ̑͆l͖͉̗̩̳̟̍ͫͥͨ
[20:34:26] so you can do some stuff with a regex - it is pretty fragile and you end up relying on the whitespace behavior of our html renderer
[20:34:30] which is not a good idea
[20:34:45] we could also patch mediawiki to add a comment if there isn't a suitably unique string that demarcates the end of page-specific content
[20:35:15] libxml doesn't support html5. it just blows up
[20:35:18] it is worse than that though - we need to remove some content from the page because it shouldn't be searchable.
[20:35:24] does mediawiki output html5-specific things?
[20:35:24] wait, what do you grab, manybubbles?
[20:35:24] stuff like the contents of
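[Editor's note] The parse-then-remove approach discussed above (MobileFrontend's HtmlFormatter builds a DOMDocument and uses XPath to detach unwanted nodes) can be sketched briefly. The actual code is PHP; this is a hypothetical Python stand-in using stdlib ElementTree, with made-up class names and a made-up sample page. It also sidesteps the channel's real pain point: ElementTree only accepts well-formed XML, whereas real page HTML needs an HTML5-aware parser like the ones being debated.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample page: one navigation block to strip before
# search indexing, one content block to keep.
PAGE = """<html><body>
<div class="navbox"><a href="#">navigation junk</a></div>
<div class="content"><p>searchable article text</p></div>
</body></html>"""

def strip_unwanted(xhtml, selector=".//div[@class='navbox']"):
    """Parse the page, select unwanted nodes with a (limited) XPath
    expression, detach them, and serialize what's left."""
    root = ET.fromstring(xhtml)
    # ElementTree nodes don't know their parent, so build a
    # child -> parent map first in order to call remove().
    parents = {child: parent for parent in root.iter() for child in parent}
    for node in root.findall(selector):
        parents[node].remove(node)
    return ET.tostring(root, encoding="unicode")

cleaned = strip_unwanted(PAGE)
```

This is roughly the `$('.whatever').remove()` that manybubbles wanted, expressed as XPath; the fragility of the grep/regex alternative raised in the channel is exactly what a real parser avoids.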