[00:00:07] But if there's a good reason for doing so...
[00:00:17] YairRand: Yeah, we should bump it up, after abstracting it.
[00:00:23] I have notes to file a few bugs about that.
[00:00:26] Haven't gotten to it.
[00:00:30] Is that a new wiki a-coming?
[00:00:31] so, it's actually feasible?
[00:00:41] Yeah
[00:00:45] To increase the size? Definitely.
[00:00:53] It's just currently hardcoded stupidly in a lot of different places.
[00:00:57] ah, great :)
[00:00:57] obviously varchar can't be larger than 255
[00:01:16] I'm not sure if it is possible and/or efficient to index a text/blob column
[00:01:42] I can't tell if you're being sarcastic about varchar. Is it really hard-limited to 255 bytes?
[00:01:44] Elsie: "title varchar(255)" 16 matches in tables.sql
[00:02:04] I could swear I've seen varchar(1000) before...
[00:02:28] "Values in VARCHAR columns are variable-length strings. The length can be specified as a value from 0 to 255 before MySQL 5.0.3, and 0 to 65,535 in 5.0.3 and later versions"
[00:02:30] mwalker: probably so, if you have privs on deployment-prep host(s). I'm looking for the wikitech page but getting no response from https://wikitech.wikimedia.org/wiki/Nova_Resource:Deployment-prep
[00:02:33] https://dev.mysql.com/doc/refman/5.0/en/char.html
[00:02:55] I was wondering about that recently. The 255 thing dates back to like r51, I think.
[00:03:08] chrismcmahon: wikitech.wm is down :/
[00:03:11] I didn't realize it was an SQL thing.
[00:03:28] greg-g: can't win for losing today
[00:03:34] interesting
[00:03:47] "The effective maximum length of a VARCHAR is subject to the maximum row size (65,535 bytes, which is shared among all columns) and the character set used"
[00:03:49] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/51
[00:04:16] I also thought that's where hit counters were introduced, but it turns out they date to r45, sort of.
[00:04:56] chrismcmahon: I'm not part of that group; so no then :p
[00:05:36] Who's in charge of wikitech.wikimedia.org?
[00:05:39] Ryan L.?
[00:05:50] mwalker: I think it is a reasonable thing to do in beta, but it might be hashar/Antoine who can make it possible
[00:05:58] Elsie: that's virt0 being down
[00:06:06] Elsie: well, whatever you wanna call down
[00:06:17] Not up
[00:06:19] mutante: Planned or unplanned? :-)
[00:06:28] Elsie: unplanned
[00:06:31] Hrm.
[00:06:36] I dunno what's going on yet
[00:06:40] K.
[00:06:41] it has 94% packet loss
[00:06:43] but not 100%
[00:06:48] looks like networking
[00:06:53] Ah, so netw... yeah.
[00:07:09] Faidon's not around?
[00:07:16] It's after 1am for him
[00:07:19] He'd be my next choice if you can't reach Mark or Leslie.
[00:07:39] Hi springle.
[00:07:47] Similarly not nice to poke Mark at this time ;)
[00:08:07] springle: some wikis have many archive revisions missing from their slaves but not their masters
[00:08:12] see https://bugzilla.wikimedia.org/show_bug.cgi?id=56577
[00:08:18] looking
[00:08:27] it's not quite as widespread as I had feared
[00:08:35] frwiki and itwiki are affected, per the bug reports
[00:08:40] Based on count(*)s?
[00:08:49] but bgwiki, ruwiki and jawiki are OK
[00:08:56] yes, based on count(*) from archive
[00:09:20] I'm wondering if the *links tables are also affected.
[00:09:30] archive is being reported because it's most obvious to local admins.
[00:09:51] It's alright, you can purge all the broken pages
[00:10:11] I'm not concerned about missing *links, just curious if the issue extends outside of the archive table.
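
For context on the VARCHAR limits quoted earlier in this exchange: a rough sketch of what a wider title column could look like on MySQL 5.0.3 or later. The table and column names below are hypothetical, not MediaWiki's actual schema or its schema-change process; the 65,535-byte shared row limit and the index-prefix behaviour are the constraints from the quoted documentation.

    -- Hypothetical test table; MediaWiki's real tables.sql still declares
    -- title columns as varchar(255), so this only illustrates the server-side limits.
    CREATE TABLE title_width_test (
      twt_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
      -- Allowed on MySQL >= 5.0.3, as long as the total row size stays
      -- under the 65,535 bytes shared by all columns.
      twt_title VARCHAR(1000) NOT NULL DEFAULT '',
      -- Long VARCHAR (and TEXT/BLOB) columns can be indexed, but only via a
      -- prefix; this indexes the first 255 bytes of the title.
      KEY twt_title_prefix (twt_title(255))
    ) ENGINE=InnoDB DEFAULT CHARSET=binary;
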
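
The count(*) cross-check mentioned at the end of the exchange above amounts to running the same aggregate on the master and on each slave and comparing the numbers. A minimal sketch, assuming direct read access to both hosts; the frwiki database name comes from the bug discussion, everything else is illustrative.

    -- Run the same query on the master and on the suspect slave, then compare.
    -- A slave returning a smaller total suggests missing archive rows,
    -- as reported in bug 56577.
    SELECT COUNT(*) AS archive_rows FROM frwiki.archive;

    -- Optionally break the count down by namespace to narrow down where
    -- rows have gone missing on a drifted slave.
    SELECT ar_namespace, COUNT(*) AS rows_in_ns
    FROM frwiki.archive
    GROUP BY ar_namespace;
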
[00:10:44] springle: Elsie doesn't appear to know what he is talking about
[00:10:59] so don't let him distract you
[00:11:05] :-)
[00:12:33] the only tables populated by INSERT SELECT are archive and oldimage
[00:12:43] and oldimage has a very low write rate on frwiki and itwiki
[00:14:30] it's after 2 am for people in this time zone actually
[00:15:08] archive recently had ar_id PK added. that was done using triggers on the master for archives over 1M rows, but it's possible statement-based repl has drifted slaves
[00:15:21] I'll run pt-table-sync to investigate
[00:16:06] * AaronSchulz really doesn't like statement-based replication
[00:16:18] yeah
[00:16:30] toolserver locks us into it for a while yet
[00:41:54] springle: can you do an !log when you do something?
[00:42:09] will do
[15:15:52] is the API on Commons returning incorrect values right now? Popups is acting strangely.
[15:16:51] You tell us?
[15:19:16] let me take a screenshot
[15:23:58] reedy: http://tinypic.com/r/hx0iv6/5
[15:24:03] notice all the "undefined"s
[15:24:36] Undefined would point to javascript rather than the api
[15:25:10] Hmm
[15:25:15] Looks to be something to do with history pages
[15:26:23] https://commons.wikimedia.org/wiki/MediaWiki:Gadget-popups.js
[15:26:25] * Reedy tests enwiki
[15:26:51] Fine on enwiki
[15:27:08] I'm guessing it's broken on 1.23wmf2 then
[15:27:11] obviously it's a JS issue
[15:27:18] but what's *causing* the JS problem
[15:27:23] it doesn't usually act like this
[15:27:24] Reedy: the javascript thing on Special:Upload is broken too (reported by a user today)
[15:27:39] so either someone sneakily changed the JS today, or something is misbehaving in the API
[15:28:30] http://lists.wikimedia.org/pipermail/wikitech-ambassadors/2013-October/000471.html maybe this?
[15:28:31] https://en.wikipedia.org/w/index.php?title=MediaWiki:Gadget-popups.js&action=history
[15:28:32] 3 edits today
[15:28:53] Certainly looks fishy
[15:30:15] Popups don't seem to work at all for me on mediawiki.org
[15:31:11] I am notifying the script owner and starting a thread on the talk page
[15:31:29] Krinkle|detached is away
[15:32:13] Updated mw.o to use enwiki's popups; working, but broken in the same way for history
[15:35:15] Krinkle|detached: FYI ^^
[16:03:33] Reedy: https://en.wikipedia.org/wiki/User_talk:Amalthea#Security_fix_-.3E_breaking_the_script.3F
[16:03:50] the last comment is for you :)
[16:04:02] Uhh
[16:04:24] * Reedy looks for escapeQuotesHTML
[16:06:08] mw.log.deprecate( win, 'escapeQuotesHTML', $.noop,'Use mw.html instead' );
[16:06:23] * To be removed in MediaWiki 1.23.
[16:06:24] *
[16:06:24] * @deprecated since 1.18 Use mw.html instead
[16:06:43] * Reedy replies
[16:06:54] I will post that
[16:07:07] I'm doing it ;)
[16:07:14] oh good
[16:07:23] I don't have to worry about attribution and licensing and the like then :)
[16:08:05] yikes; I know the devs want to purge old functions, but that sounds like it could cause other problems too
[16:08:41] It's been deprecated for a long time
[16:10:04] Of course
[16:10:09] It's not said how it's broken
[16:10:45] I'm guessing broken means it doesn't work at all
[16:10:53] As there is only one usage of escapeQuotesHTML in all of core
[16:11:21] well it's not like he can't write his own custom script
[16:11:29] it's what?
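
A footnote to the replication-drift investigation earlier in this segment: pt-table-sync, mentioned above, can find and fix row-level differences between master and slave chunk by chunk. A much cruder manual check, assuming direct SQL access to both hosts, is MySQL's built-in CHECKSUM TABLE; it only says whether a table differs, not which rows, and the EXTENDED form reads every row, so it would be slow on a table the size of archive.

    -- Run on the master and on the suspect slave and compare the output.
    -- Differing checksums confirm drift in frwiki.archive; matching checksums
    -- (given identical table definitions and server versions) suggest the
    -- data is in sync.
    CHECKSUM TABLE frwiki.archive EXTENDED;
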
[16:11:57] string.replace('\"', "&quot;"), or something of the like
[16:12:01] I've no idea
[16:12:09] It's not in core any more :P
[16:14:24] * Reedy greps all the things
[16:15:44] FYI, Commons has a huge brain drain problem
[16:15:48] all the page patrollers are gone
[16:15:53] copyright violations run rampant
[16:16:04] deletions are literally down by 50% since a few years ago
[16:25:08] what is a "brain drain"?
[16:26:06] hashar: people leaving
[16:26:09] ahh
[16:26:12] particularly, 'smart people leaving'
[16:26:34] wondering why
[16:26:39] like, Michigan had a brain drain during its recession (before the big US recession) due to the auto industry dying; people went other places
[16:27:33] much like Detroit?
[16:27:45] right
[16:27:51] (Detroit is in Michigan ;) )
[16:28:03] ahhh
[16:28:16] on this side of the earth, we have no clue what Michigan is
[16:28:19] the only place in Michigan that didn't have the problem was Ann Arbor, where the University of Michigan is. It's fairly insulated from the surrounding issues
[16:28:29] but we had multiple documentaries regarding Detroit huhu
[16:28:36] yeah :)
[18:07:12] 17:15:51 - Magog_the_Ogre: copyright violations run rampant
[18:07:12] Fae: can you scan uploads by brand new users and check them with Google Image match?