[00:02:41] MaxSem how can I look up the page id though?
[00:02:54] do I need to download another bajillion gigabytes of data
[00:03:30] I should try for the toolserver, although not sure I am established enough
[00:03:47] md_5: You can look it up using the API
[00:03:55] I need it offline
[00:04:03] has to be fast (thousands of lookups / sec)
[00:05:11] md_5: You need foowiki-blah-page.sql.gz
[00:05:51] I hope it's not big
[00:06:02] depends on the wiki
[00:06:11] enwiki-latest-page.sql.gz 2013-Jan-02 15:22:55 909.9M application/x-gzip
[00:06:14] that's ok
[00:06:26] still importing the pagelinks table
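For reference, the offline lookup discussed above reduces to a single indexed query once the page dump is imported (the API route mentioned at 00:03:47 returns the same pageid via action=query). A minimal sketch, assuming enwiki-latest-page.sql.gz has been loaded into a local MySQL database; the title used is a hypothetical example:

  -- Look up a page id offline from an imported page table.
  -- Titles are stored with underscores instead of spaces, and the
  -- unique name_title index on (page_namespace, page_title) is what
  -- makes thousands of lookups per second feasible.
  SELECT page_id
  FROM page
  WHERE page_namespace = 0             -- 0 = main (article) namespace
    AND page_title = 'San_Francisco';  -- hypothetical example title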
[00:10:57] chrismcmahon: The JS/CSS caching bug strikes again: https://bugzilla.wikimedia.org/show_bug.cgi?id=43805
[00:11:48] kaldari: jeez. also wtf?
[00:12:18] yeah, this is a pain in the neck
[00:12:48] at least this time it's been caught early so hopefully people can help debug the problem
[00:13:32] any suggestions for who would be good to look at this further?
[00:14:06] who's the expert on caching issues?
[00:14:38] I'd think Roan at least to start
[00:17:17] I cced him on the bug
[00:21:37] I just hope somebody confirms the bug before it disappears. Sometimes I feel like Big Bird trying to explain Snuffleupagus :)
[00:24:06] RoanKattouw_away: ^
[00:26:42] chrismcmahon: kaldari: might caching be a reason for the upload wizard to break on test2?
[00:27:00] when i use ?debug=true or use it on my test wiki, then no problems
[00:27:13] yes, that's definitely a possibility
[00:27:44] * aude searches for the bug
[00:28:03] aude: we had a real bug in UW on test2 earlier today, it's behaving properly now
[00:28:26] aude: are you still seeing the UploadWizard problem currently?
[00:28:30] looks like it's fixed
[00:28:32] https://bugzilla.wikimedia.org/show_bug.cgi?id=43791
[00:28:48] there was an actual bug in the code but seems ok now
[00:29:05] Cool, I thought you meant it was still broken
[00:29:16] was when i tried earlier
[00:30:06] kaldari: AaronSchulz and I were the emergency response team earlier
[00:30:20] yeah, it's good
[00:30:24] marktraceur: nice firefighting :)
[00:30:34] yay browser tests
[00:30:42] yay!
[00:30:46] :)
[00:30:53] * aude on the lookout for bugs
[00:31:42] aude: if you're ever interested, current browser test status is at https://wmf.ci.cloudbees.com/ (kaldari and marktraceur too for that matter)
[00:32:01] chrismcmahon: nice
[00:32:18] bookmarking
[00:32:36] sorry about the failing IE6/IE7 tests, we changed the way we exclude tests from being run recently and those aren't caught up
[00:33:03] that's ok
[00:33:30] are there tests for stuff like watchlists and recent changes? (browser tests)
[00:34:26] aude: not yet. http://www.mediawiki.org/wiki/Qa/test_backlog Right now I'm working on one that checks that history is preserved when deleting/restoring a file.
[00:35:55] aude: I am looking forward to extending the test coverage a lot, we've worked out most of the architecture issues, now we can start producing nice tests more quickly
[00:37:36] Anyone know if Roan is around today?
[00:38:43] I'm tempted to try re-syncing it if he's not going to be able to look at it today.
[00:44:04] chrismcmahon: ok, thanks
[00:48:31] aude: docs are still in flux a bit too, but this might be of interest: http://www.mediawiki.org/wiki/QA/How_to_contribute_to_browser_testing
[01:08:11] http://screencloud.net/v/gd2B
[01:08:23] and I was getting 400kbps from that mirror yesterday
[02:42:34] lwelling questioned the use of varbinary for timestamps in a code review. I was about to point him to http://www.mediawiki.org/wiki/Manual:Coding_conventions/Database, but it occurred to me that I don't know the reason behind this convention. Is it purely for the sake of consistency?
[04:04:11] Anyone have free time to help with some custom js and css stuff?
[05:26:00] domas: see ori-l above
[05:36:14] ori-l: For new code (or particularly for code in an extension), I'd say it's fine to use a timestamp field if you want.
[05:36:38] At least one timestamp field in MediaWiki core is marked as such.
[05:36:49] (categorylinks.cl_timestamp)
[05:37:33] ori-l: There's also a note in tables.sql explaining the rationale.
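The convention ori-l is asking about: core MediaWiki declares timestamp fields as fixed-width binary strings holding 'YYYYMMDDHHMMSS' values rather than as native TIMESTAMP columns, broadly for portability across database backends and to sidestep MySQL's implicit TIMESTAMP auto-update behavior (the note in tables.sql is the authoritative rationale). A minimal sketch of the pattern; example_log is a hypothetical table, not part of core:

  -- Hypothetical table following the core timestamp convention:
  CREATE TABLE example_log (
    el_id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    el_timestamp VARBINARY(14) NOT NULL  -- e.g. '20130104053733'
  );

  -- Producing a value in that format from the current time:
  SELECT DATE_FORMAT(NOW(), '%Y%m%d%H%i%s');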
[07:54:23] Can someone with access to the toolserver please run a quick query to find the count of the link table on enwiki?
[07:54:56] I really need to see how much longer this dump has to import, been going for hours
[08:37:52] md_5: what would the query be?
[08:40:06] Nemo_bis something like: SELECT COUNT(*) FROM pagelinks; <-- I assume there is a link table since there is a dump of it
[08:40:41] yeah
[08:40:56] SELECT COUNT(*) FROM `pagelinks`; should do it
[08:41:17] on that note, I requested a toolserver account
[08:42:48] http://ganglia.wikimedia.org/ requires a login now? Thought it was a public ganglia. Wanted to see a demo :C
[08:45:47] JesperHansen: there are some security issues with it atm, it's only temporarily locked down
[08:46:14] * Nemo_bis hopes it's not as temporary as the createpage restriction on en.wiki
[09:00:33] p858snake|l: anything I should know about when I am about to choose between cacti, ganglia and more? Or is it specific to wikimedia?
[09:01:16] no idea
[09:02:29] I would assume that if it's a problem with the actual ganglia base, the ops team would have filed an upstream ticket so that it can be fixed for everyone
[09:08:48] md_5: too slow
[09:31:54] damn that sucks Nemo_bis, does phpmyadmin not have a row count for that table?
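An exact COUNT(*) on a table the size of enwiki's pagelinks scans a whole index on InnoDB, hence "too slow". For a rough progress check, the table statistics give an instant estimate, and that estimate is what phpMyAdmin displays as the row count for InnoDB tables. A sketch:

  -- Fast estimated row count from table statistics
  -- (approximate on InnoDB, but fine for a progress check):
  SHOW TABLE STATUS LIKE 'pagelinks';

  -- Or just the estimate itself:
  SELECT TABLE_ROWS
  FROM information_schema.TABLES
  WHERE TABLE_SCHEMA = DATABASE()
    AND TABLE_NAME = 'pagelinks';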
[10:01:56] At [[Special:Version]] why would it list <timeline> in the Parser extension tags when the extension isn't shown to be installed in the list?
[12:58:03] Is it me or is mediawiki.org super sluggish today?
[13:01:12] guillom, doesn't seem like that to me - are you on HTTPS?
[13:01:21] yes
[14:56:56] hi! how is it possible to remove a log from the recent changes feed?
[15:00:59] lol https://www.mediawiki.org/w/index.php?title=Bugzilla/Fields&diff=0&oldid=621749
[15:01:33] so in the end we just replaced LATER with lowest and shifted all the other priority definitions up a level
[15:29:33] Nemo_bis, yes, that's my plan.
[15:30:08] Nemo_bis, that's an easier way to fix years of optimistic priority setting than retriaging a few thousand reports
[15:30:17] I'd like to merge Low and Lowest priority anyway.
[15:30:31] but you found out now ;)
[15:33:31] low / normal / high / immediate would be enough :-]
[15:33:41] andre__ i don't think we need high / highest either
[15:34:36] hashar: consider high as normal, and highest as high, and you're happy :)
[15:35:01] lol
[15:35:01] At [[Special:Version]] why would it list <timeline> in the Parser extension tags when the extension isn't shown to be installed in the list?
[15:35:43] * sDrewth slaps frWP
[17:26:05] \quit
[17:32:48] aww no more henna
[18:01:10] RoanKattouw: By any chance did you get to take a look at https://bugzilla.wikimedia.org/show_bug.cgi?id=43805 before it resolved itself?
[18:01:39] I saw you pinged me yesterday but I wasn't around, sorry
[18:01:45] I was able to reliably reproduce the problem this time and got the headers at least
[18:01:48] I am now going to school Monday and Wednesday
[18:01:56] (I should e-mail engineering@ about that I guess)
[18:01:59] ah, good to know
[18:02:23] kaldari: It's very hard to debug such issues after the fact
[18:02:34] yes, that's why it's such a frustrating bug :)
[18:03:15] it always seems to resolve itself after a day or two
[18:03:31] but in the meantime it can cause chaos
[18:14:15] RoanKattouw: happy studying! what are your courses this term?
[18:15:54] sumanah: Automata, advanced networking, intro crypto
[18:16:42] "Intro Crypto" is my favorite noisepunk band
[19:09:05] does anyone know where to find the historical wikipedia xml dumps, i.e. the earliest versions?
[19:12:04] notconfusing: I would ask apergos
[19:12:12] apergos: ^^
[19:12:40] ah
[19:12:51] yeah I think we have the archives of those public
[19:12:55] sec lemme find the link
[19:13:51] http://dumps.wikimedia.org/other/
[19:13:55] see where it says
[19:14:03] Historical material only: archives of sql/XML dumps for previous years starting from 2001
[19:14:08] see what suits you
[19:14:27] there's some pretty weird formats there
[19:14:31] notconfusing:
[19:16:46] apergos thanks, i'm looking for something that would be compatible with the parser that i wrote that accepts the modern format
[19:18:24] so probably 2006? from the site:
[19:18:24] 2005 (MediaWiki 1.5?) XML files, warning: old schema!
[19:18:25] 2006 (Mediawiki 1.5+) XML files
[19:19:36] try either of those
[19:21:10] cheers apergos
[19:23:05] good luck!
[19:25:38] thanks I wrote a tool that analyzes the level of "advancedness" of the wikipedia by using Citation data and the advancedness of those books, so I wanted to see if it's changed over time
[19:26:05] but download speeds are slow (30K/s) maybe it's cos i'm in the netherlands
[19:43:35] you can download from a mirror site, I think ftp.mirror.your.org has those
[19:43:54] (although it might be in the middle of a firmware update right now, but it should be back in 15 min or so I would think)
[19:44:02] lemme find the list of mirror sites
[19:44:11] http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors
[19:51:35] apergos woohooo 2M/s on the mirror thx
[19:52:42] yw
[19:52:54] notconfusing: for some of them you can also use archive.org
[19:53:18] notconfusing: especially the torrents, which are webseeded from two different locations and are faster than the normal HTTP downloads
[19:54:11] oh, good advice
[20:00:55] so…oddball question - do we have control over wikivoyager.org?
[20:01:42] http://whois.net/whois/wikivoyager.org
[20:01:56] No, but you could probably ask Stefan/Hans to transfer it like they did the others.
[20:02:22] that'd be good - Jimmy Wales kept saying WikiVoyager instead of WikiVoyage on TV - so I suspect some folks are going to the wrong domain
[20:03:19] Reedy: I'll try later today to track them down and make the request
[20:03:43] Might want to try Erik
[20:04:02] I suspect they're all in that office session
[20:44:29] Greetings room, I'd like to call attention to my post @ http://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Urgent:_API_returning_.22textmissing.22_breaking_tools
[20:48:22] westandrewg: known issue
[20:48:23] https://bugzilla.wikimedia.org/show_bug.cgi?id=43820
[20:49:30] Thanks, I'll post on the VP accordingly
[21:07:31] is bits running Apache or something else? looks like Apache
[21:07:37] I need to tweak a config setting
[21:08:39] it's just normal app servers
[21:08:53] ah good
[21:08:58] but a different group receiving only bits requests, but identically configured
[21:09:04] then i should be able to find the config in puppet somewhere
[21:09:13] need to add a mime type, real easy
[21:09:40] for .webapp manifest files
[21:10:34] i'll futz with it after meeting
[21:24:01] mark, preilly: does this look right? https://gerrit.wikimedia.org/r/43344
[21:28:40] brion: actually, it's not in the puppet repository at all
[21:28:45] poop
[21:28:49] you need to have operations/apache-config, I think
[21:28:50] i've been fooled
[21:29:06] which is just the old apache style config you're familiar with in a git repo
[21:29:12] old style apache config
[21:29:28] (if it's not, i've been fooled too ;)
[21:29:32] pulling...
[21:30:45] <^demon> clone all the repos, can't go wrong :)
[21:32:30] mark: how's this https://gerrit.wikimedia.org/r/43346
[21:33:20] <^demon> lgtm.
[21:34:05] i think that should work
[21:34:11] excellent
[21:34:14] i just can't babysit it right now
[21:34:16] i'll +1 it
[21:34:20] no rush thanks
[21:34:30] i'll poke at it tomorrow, got other fixes to make on that app
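The contents of the Gerrit changes above aren't quoted in the log, but in an Apache-style configuration like operations/apache-config, registering a MIME type for .webapp manifest files typically comes down to a single AddType directive; application/x-web-app-manifest+json was the registered type for Open Web App manifests at the time. A hedged sketch, not necessarily what change 43346 actually contains:

  # Serve .webapp manifest files with the Open Web App MIME type
  AddType application/x-web-app-manifest+json .webapp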
[21:36:17] If the search on mediawiki.org does not work reliably and does not always provide search results, is there any way on the client side to provide some useful debug info for a developer to track it further down?
[21:36:40] we're after https://bugzilla.wikimedia.org/show_bug.cgi?id=42423 and valeriej can reproduce it sometimes, so any hints appreciated.
[22:50:28] I need a varnish change pushed, assuming it's not already live
[22:50:35] https://gerrit.wikimedia.org/r/#/c/42867/ appears to be merged
[22:50:52] anything need doing for such?
[23:37:57] ori-l: i see unmerged changes in php-1.21wmf6 for event logging?
[23:38:39] on fenari, i mean
[23:38:50] spagewmf: ^^
[23:39:11] is it safe for me to git pull?
[23:40:59] hrm, let me take a peek
[23:41:49] ori-l this is what i see: https://gist.github.com/ca0ac9087cb5dfdb6820
[23:42:31] awjr: I must have forgotten to sync it. It is safe to sync / pull
[23:42:40] ori-l cool thanks
[23:42:45] sorry about that!
[23:42:50] no worries
[23:53:43] At [[Special:Version]] why would it list <timeline> in the Parser extension tags when the extension isn't shown to be installed in the list? They are not aligned?
[23:55:02] Where?
[23:55:16] which wiki?
[23:55:38] ya
[23:55:46] [[:w:sw]]
[23:56:16] they came to me with a non-functioning timeline (copies from another wiki)
[23:56:23] copied
[23:56:32] sw.wikipedia.org?
[23:56:50] yes https://sw.wikipedia.org/wiki/Special:Version
[23:56:52] EasyTimeline (6bda302) Adds <timeline> tag to create timelines Erik Zachte
[23:56:55] WFM
[23:57:21] hmm
[23:57:51] I know that it was late, I didn't think that it was that late
[23:59:44] * sDrewth just buries his head in shame and crawls away, thx