[00:04:12] I'm going to do a quick unscheduled deployment of UploadWizard since it is mostly broken right now.
[00:04:35] kaldari: I've merged some things today
[00:04:51] In a futile attempt to get things done.
[00:04:54] marktraceur: thanks
[00:05:33] I guess 'quick deployment' is an oxymoron though :)
[00:06:38] 'round here it sure is.
[00:06:59] What happened to that git-deploy thing that was circling the lists a while back?
[00:12:15] kaldari: what's broken about UW? (and can I add whatever it is to the browser tests we have?)
[00:13:27] chrismcmahon: https://bugzilla.wikimedia.org/show_bug.cgi?id=45939 is one thing that is broken
[00:13:49] ah, flickr
[00:13:59] the UploadWizard default config got messed up, so I'm not sure what all is actually broken, but probably several things.
[00:14:40] unfortunately, the flickr uploading is a bit hard to write tests for
[00:15:16] !log payments cluster updated to 207a112b5b36a9e125999acc6f4098d9058ef594
[00:15:23] Logged the message, Master
[00:18:57] kaldari: bah, got a browser test failure too upon selecting "This file is my own work", I wasn't looking b/c I had a slew of other failures today. I put a Flickr check in the backlog.
[00:22:13] chrismcmahon: that's ok, should be all fixed in a little while
[00:22:21] IE8 and below, looks like
[00:23:04] FF and Chrome OK for UW
[00:23:59] * chrismcmahon considers more granular Jenkins builds
[00:24:33] * kaldari considers tallying how many hours of my life I've spent waiting for deployment scripts to complete :)
[00:27:54] kaldari: You're not slacking off....
[00:28:27] at least now I have more time for swordfights
[00:28:45] Aww, I don't have anything that would enable that.
[00:29:43] chrismcmahon, marktraceur: OK, deployment to test.wiki is finally complete, if anyone wants to help test UploadWizard there.
[00:30:18] kaldari: did you put it on test2wiki?
[00:30:19] Cool.
[00:31:48] chrismcmahon: just test.wikipedia.org right now
[00:36:32] It's looking good to me
[00:37:51] good night, folks
[00:37:58] yeah, looks like the main bug is fixed at least
[00:38:02] deploying to cluster...
[00:39:26] I'll watch it tomorrow
[04:29:23] looks like the bike shed needs to be recoated and painted with math symbols now >.>
[04:29:58] * Jasper_Deng suggests Maxwell's equations
[04:40:49] it's pretty stupid to label any code review "bikeshedding"
[13:33:52] http://www.w3.org/TR/webaudio/
[13:34:06] Are there plans to implement support for this?
[13:39:46] Qcoder00: none I know of
[13:39:58] OK
[13:40:03] Qcoder00: feel free to talk about it on wikitech-l :-]
[14:21:43] DaB|Eingeschneit, is it snowing where you are?
[14:22:58] Cyberpower678: ~10cm and rising
[14:23:53] DaB|Eingeschneit, and here it's only raining.
[14:24:17] Cyberpower678: that CAN be better ;)
[14:25:03] I want snow :(
[14:26:53] Cyberpower678: where are you?
[14:27:13] DaB|Eingeschneit, Pennsylvania.
[15:51:57] should've asked here in the first place: does anyone know a tool for mass revision deletion?
[15:53:42] No
[15:53:52] Just use the multiple selection thing
[15:54:46] I need it for multiple pages, not revisions of a single page.
[15:55:36] i have a script that does it
[15:57:02] legoktm: really? you're a godsend.
[15:57:22] yeah, you just need pywikipediabot
[15:57:39] specifically the rewrite branch
[15:58:21] Any explosions from the UploadWizard deploy last night?
[15:58:56] oh wait
[15:58:58] it's using trunk
[15:59:00] Vito-Genovese, what sorts of things are you deleting?
[15:59:13] Vito-Genovese: https://gist.github.com/legoktm/5089173
[15:59:30] it's hardcoded to enwiki though
[15:59:48] that may be a bit tricky, first off, I don't know the first thing about it. not to mention that it'd violate the bot policy (I think)
[16:00:20] And how many revisions are we talking about?
[16:00:26] Krenair: stuff like these: http://tr.wikipedia.org/wiki/%C3%96zel:Katk%C4%B1lar/62.201.207.14
[16:00:43] this time I caught him early
[16:00:52] well there's a quick start guide at https://www.mediawiki.org/wiki/Manual:Pywikipediabot/Quick_Start_Guide
[16:01:15] someone in #pywikipediabot might also be able to help, i'm in class right now so i can't
[16:01:19] sometimes he manages to edit several hundred times
[16:01:49] thank you, legoktm
[16:02:39] np
[16:02:45] hopefully someone solves https://bugzilla.wikimedia.org/show_bug.cgi?id=23005 soon :)
[16:02:49] * legoktm looks at Krenair ;)
[16:03:35] gurch!
[16:03:46] Huh. I was unaware of that bug
[16:03:51] I'll try it, legoktm
[16:03:54] yay!
[16:04:04] I found it last week when someone asked me for a mass-revdel script
[16:04:28] And after half an hour of looking for documentation on the revdel API, I found that and wrote a script that screen-scraped :/
[16:08:34] and any way of stopping him altogether? here are the IPs: http://tr.wikipedia.org/wiki/%C3%96zel:G%C3%BCnl%C3%BCk/block All proxy, of course.
[16:08:34] Gerrit query "project:mediawiki/core message:delete OR message:deletion" returns 500 ISE
[16:08:36] ^demon ^
[16:09:02] <^demon> Krenair: Known bug, have a patch in progress upstream.
[16:09:07] he's been quite persistent. goes back to summer, I think.
[16:10:21] If he's got lots of different proxy IPs then I'm not sure there's much you can do
[16:10:39] Just play whack-a-mole really….
[16:17:09] *sigh*
[16:24:48] I so wanna make legal threats right now
[16:26:06] * Nemo_bis prepares Special:Block
[16:27:22] * Cyberpower678 blocks Vito-Genovese :p
[16:27:45] might as well
[16:28:17] Vito-Genovese: I'm sure there are plenty of big companies with lawyers who would love to play "let's get sued" with you
[16:29:46] Well, we do have [[Adnan Oktar]]. That's more than enough :)
[16:30:40] Reedy: you around?
[16:32:12] is vito-genovese=vito ?
[16:32:21] nope
[16:32:47] ah ok.
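Bug 23005 above asked for revision deletion through the API rather than screen-scraping; later MediaWiki versions expose this as action=revisiondelete. A minimal sketch of what a mass-revdel call looks like against that module — the wiki URL and revision IDs are placeholders, and login/cookie handling is omitted:

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://test.wikipedia.org/w/api.php"  # placeholder wiki URL


def build_revdel_params(rev_ids, reason, token):
    """POST parameters for MediaWiki's action=revisiondelete, hiding the
    content, edit summary, and username of each listed revision."""
    return {
        "action": "revisiondelete",
        "type": "revision",
        "ids": "|".join(str(r) for r in rev_ids),
        "hide": "content|comment|user",
        "reason": reason,
        "token": token,  # CSRF token from action=query&meta=tokens
        "format": "json",
    }


def post_revdel(params):
    """Send the request. In practice this needs a logged-in session with
    the deleterevision right; shown here only to sketch the call shape."""
    data = urllib.parse.urlencode(params).encode()
    with urllib.request.urlopen(API_URL, data) as resp:
        return json.load(resp)
```

Unlike the hardcoded-to-enwiki gist, pointing this at another wiki is just a matter of changing `API_URL`.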
vito is usually not as wordy
[16:33:08] :)
[16:39:53] hey guys
[16:40:02] can i get two backports deployed?
[16:40:17] https://gerrit.wikimedia.org/r/53383 and https://gerrit.wikimedia.org/r/53384
[16:40:24] (this is the same change for wmf10 and wmf11)
[16:41:23] this is fixing broken display on category pages on sv.wiki per the bug linked
[16:41:32] and it would be nice to get it fixed soon-ish.
[16:41:41] soon, even.
[16:42:47] and then purge an entry from the object cache per https://bugzilla.wikimedia.org/show_bug.cgi?id=45446#c9 (comment 9)
[16:42:56] it would be nice if someone did that. thanks.
[19:54:42] qgil: PDT now :)
[19:55:12] also, don't really see the relevance to the SF list
[19:57:37] hi jeremyb_ wikimedia-sf is a list also used by https://www.mediawiki.org/wiki/Groups/San_Francisco
[19:58:22] qgil: i've heard of the group but not yet seen it used like that. i also don't necessarily read everything that appears on the list
[19:58:37] qgil: anyway, PDT!
[19:58:47] jeremyb_, it was discussed and agreed on the list back in December
[19:58:59] ok
[20:00:38] jeremyb_, ok, thanks for making me realize that people "change" timezone from PDT to PST
[20:00:57] jeremyb_, I don't think we have such things in Europe but might be wrong :)
[20:01:00] qgil: CET/CEST
[20:01:15] BST/EET/EEST
[20:01:41] jeremyb_, ok I keep learning. I'll use SFT from now on ;)
[20:01:48] i usually tell people that if they don't care then they should just say "Pacific"
[20:02:17] jeremyb_, good advice. In my case it was a matter of ignorance. Will remember now.
[20:02:44] :)
[20:03:20] ... and at least https://www.mediawiki.org/wiki/QA/Browser_testing/Search_features references UTC only :)
[20:28:19] where does the "something is wrong" feedback in the VisualEditor go?
[20:30:06] no idea :)
[20:33:58] 'lo
[20:35:38] I saw a request for allowing doc pages for Scribunto modules to be elsewhere than in a subpage.
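The PDT/PST confusion in the exchange above is just North American daylight saving: the same zone, America/Los_Angeles, reports a different abbreviation and offset depending on the date. A quick check with Python's zoneinfo module (3.9+, assuming system tzdata is available):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

pacific = ZoneInfo("America/Los_Angeles")

# Same zone, different abbreviation across the DST switch
winter = datetime(2013, 1, 15, 12, 0, tzinfo=pacific)
summer = datetime(2013, 7, 15, 12, 0, tzinfo=pacific)

print(winter.tzname())  # PST (UTC-8)
print(summer.tzname())  # PDT (UTC-7)
```

Europe has the same mechanism under different names (CET/CEST, EET/EEST), which is why "Pacific", like the advice above, is the safe thing to say when the distinction doesn't matter.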
but the version being pushed these days still has "scribunto*-subpage-*" configuration
[20:35:59] might that change in the next version, or will it stay this way?
[20:36:59] (if it will change, we can delay creating docs to avoid renaming later; we don't need that many docs at this time)
[21:00:51] Hi. I need some help from a dev, I just got a database error
[21:00:58] DUPLICATE ENTRY
[21:01:13] could somebody check that nothing went wrong, please?
[21:05:27] Quentinv57, was that it?
[21:05:28] just 'DUPLICATE ENTRY'?
[21:05:34] Do you have a screenshot or something?
[21:18:50] Krinkle: have you lost the reply on commit message guidelines because of that bug?
[21:19:34] Nemo_bis: ULS?
[21:19:46] Yeah, I had to type it again.
[21:20:33] Krinkle: I don't see it?
[21:22:32] Nemo_bis: you replied to it, of course you've seen it
[21:22:33] https://www.mediawiki.org/wiki/Thread:Manual_talk:Coding_conventions/Release_notes_and_bug_fixes#x.5B.5BManual:Coding_conventions.23Release_notes.7CRelease_notes.5D.5D_and_bug_fixes_25022
[21:23:02] Krinkle: I mean the part "More on that over on that talk page."
[21:23:33] there is more info on that page + if you want more discussion, go to that page to start a discussion.
[21:23:44] ah ok :)
[21:23:48] I didn't reply on that talk page, got nothing to say (yet).
[21:24:11] I was just worried LQT had eaten that reply :)
[21:26:48] is there an official mediawiki method for browser version checking in javascript?
[21:27:44] binasher: jquery.client, probably
[21:27:46] or jquery.profile
[21:27:55] one of those is deprecated, and i forgot which one
[21:28:02] thanks, i'll check both!
[21:30:53] MatmaRex: I think you were thinking of jquery.browser which is deprecated
[21:31:06] binasher: actually, jquery.browser is the deprecated one, jquery.client is okay, and i have no idea where i got jquery.profile.
[21:31:10] argh
[21:31:12] edit conflict :)
[21:31:16] :)
[21:32:14] MatmaRex: is jquery.client a plugin?
[21:32:49] binasher: yeah, i guess it is
[21:32:50] Does ipv6 work with wikimedia wikis?
[21:33:01] binasher: it's an RL module for sure :) (included in core)
[21:33:42] ah, yup, it's in resources/jquery
[21:37:29] Isarra: it should. doesn't it?
[21:43:17] Some folks were working on an ipv6 router and couldn't get wikipedia to work.
[21:43:59] Although... there were also dns issues - are there addresses for all the projects?
[21:44:04] * MatmaRex has no idea
[21:44:16] i managed to make an edit or two via ipv6, though. so at least sometimes it works.
[23:01:28] I've read Wikipedia from home using IPv6
[23:01:33] haven't tried editing though
[23:02:14] Works fine
[23:02:15] Isarra: if that happens again can you get their dns resolver and what sites they don't see ipv6 addresses for?
[23:02:23] Isarra: because ipv6 should be working well
[23:03:11] lots of people have edited via IPv6, I have checked the stats
[23:06:00] LeslieCarr: I'm not sure they actually had dns.
[23:06:17] Is that a requirement with wikimedia sites?
[23:07:14] um, everyone in the world has dns, it's how the internet works
[23:07:30] it's what looks up a name and translates it to the ip address
[23:07:52] it's usually run by the isp
[23:07:57] But if you only have an ip, can you still get there, or does the server need the name that was used to sort it out for ipv6 too? (or is that even how it's done?)
[23:08:21] The server will need the hostname at some stage, yes
[23:08:34] Although if you know the IP, you can put the IP-hostname association in /etc/hosts and use the hostname, that should work
[23:08:34] hehe, man lemme look for a minute if there's a good primer of how your computer works to get a webpage
[23:08:42] because i am sure someone else can explain it way better than i can
[23:09:07] You mean it doesn't just wait patiently at the tubes for interesting information?
[23:09:12] I know how the computer gets the webpage, but the server can decide which to yield different ways. vhosts, or whatever.
[23:10:08] Which is needed since there aren't enough ipv4 addresses - you need the hostname to figure out which site it is, right? But there are plenty of ipv6... if that makes any sense. For all I know I'm blathering,
[23:10:31] Isarra:
[23:10:45] Isarra: You could use a separate IPv6 address for each hostname, yes. Not sure anyone does that in practice
[23:10:55] Google apparently does. >.<
[23:10:58] Because existing infrastructure can already do vhosts and has to for IPv4
[23:11:25] i think this works okay http://computer.howstuffworks.com/web-server.htm
[23:11:33] But perhaps that is good practice, what with folks blocking based on domain, and dns sometimes going down, and considering just how many ipv6 addresses folks have...
[23:11:36] oh okay
[23:12:06] so since the ipv4 pool is low, we do need to use the http host header to tell the server which page it should fetch
[23:12:33] we could theoretically with ipv6 have a 1:1 address to host ratio
[23:12:45] but since everyone supports vhosts, we figure we make ipv6 identical to ipv4
[23:12:50] Right. That does explain that.
[23:12:58] Eeeeegh don't say that.
[23:13:21] You know there are people who want to make ipv6 identical to ipv4 and NAT it, right?
[23:13:34] Because ipv4 uses nat! We should use nat!
[23:13:41] there's no stable ipv6 nat standard out there though
[23:13:48] i know, i've talked about that at a network conference
[23:13:55] i think it's due to the fear of having to deal with this again
[23:13:56] :)
[23:14:00] Yeah, there be folks pushing to make the natting the standard! Why would anyone do that? WHYYYY?
[23:14:08] most everyone now just gives out a /64 or larger
[23:14:28] /64 is many, many. /56 or /48 should be standard, though.
[23:14:31] Eventually.
[23:14:33] well there is the security/privacy thing -- if you nat then the outside world doesn't have direct access to your internal boxes
[23:14:33] Probably.
[23:14:42] well arin only gives out ip addresses in /48 chunks
[23:14:53] For now!
[23:14:57] so we'd have to get them to allocate a chunk per server instead of a chunk per dc
[23:14:57] hehe
[23:15:35] Anyway, thanks. That clears that up.
[23:18:40] no prob
[23:18:51] these things are so much easier to describe on a whiteboard than via conversations
[23:18:54] online
[23:39:44] !log updating PHP to php5-3.5+wmf1 for new ICU
[23:39:52] Logged the message, Master
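The name-based virtual hosting LeslieCarr describes above — many sites on one IPv4 address, disambiguated by the HTTP/1.1 Host header — can be sketched in a few lines. The hostnames and document roots here are illustrative, not Wikimedia's actual layout:

```python
# Name-based virtual hosting: one listening address, many sites,
# selected by the Host header of each request.
VHOSTS = {
    "en.wikipedia.org": "/srv/enwiki",
    "commons.wikimedia.org": "/srv/commons",
}


def route(request_headers, default="/srv/default"):
    """Pick a document root from the Host header, as a vhost-aware
    server would. Strips an optional :port suffix and ignores case."""
    host = request_headers.get("Host", "").split(":")[0].lower()
    return VHOSTS.get(host, default)
```

With IPv6's address space each site could get its own address instead, making the Host lookup unnecessary; but since every server already supports vhosts for IPv4, operators (as noted above) usually keep the IPv6 setup identical.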