[00:17:01] hello ? anyone who understand norwegian here ?
[00:17:03] Me and bsitu are going to use the lightning deploy window if no one objects
[00:17:31] got an interesting tech suggestion ?
[00:17:58] about ggogle-doodle and Wikipedia-articles
[00:18:36] I only know 1 norweigan wiki person.. and he's not here atm
[00:19:22] http://no.wikipedia.org/wiki/Brukerdiskusjon:Danmichaelo#Kan_vi_bruke_Google_doodle.3F <-- here's the suggestion in norwegian
[00:24:57] reedy; well if i try to explain it english then... it is about making a bot for google doodles and sending out a message to wikipedians who have subscribed for this thing to update the doodle-article it links too.
[00:57:31] hi; I'm now getting a "Permission denied (publickey)" when I try to ssh into my instance
[00:57:36] this worked fine last week
[00:57:53] gribeco: For Wikimedia Labs issues, please seek help in #wikimedia-labs
[00:58:03] oops, thanks
[00:58:16] overlapping screens :p
[03:24:16] gn8 folks
[03:24:33] DaBPunkt: oh, and i thought its too late to ping you an hour ago
[03:24:36] lets talk tomorrow
[03:24:43] good night
[03:25:01] mutante: it IS too late ;)
[03:25:10] ping me tomorrow
[03:25:14] yes, will do.. cu
[03:25:22] cu
[03:32:26] Wikipedia has been on and off for me the past 20 minutes or so
[03:32:27] is it just me?
[03:32:32] going very slow now
[03:42:28] we are looking at this
[03:42:44] seems to be improving a bit
[03:43:46] Thank you for working on it.
[03:44:10] yeah it seems to be working properly now, but I said that before and it went down again
[03:44:30] * Jasper_Deng always remarks how we manage to serve so many users with so few resources. Kudos to all sysadmins.
[03:45:30] :)
[03:45:49] uhoh Soapy don't say it's working properly again ;)
[03:56:52] Soapy: where are you?
[04:14:50] gone
[05:46:17] hey is there a reason the database dump server is so slow?
[05:46:28] in terms of download speeds, I'm seeing 50kb/s
[05:46:32] sorry kB*
[05:46:40] and that isn't at my end :)
[05:48:58] Is there a faster way I can get some partial database dumps? I'm looking at using the pagelink data to construct relations/paths/networks between pages, but the 4gb download is going to take over 20 hours. Its pretty slow for everyone that has tried, not my internet
[05:50:24] It's a known issue Prodego and md_5
[05:50:36] Have you tried grabbing from one of the mirrors?
[05:50:52] I shall take a look
[05:51:20] I know the person responsible for it is aware of the issue
[05:51:33] sorry Reedy , the file I want isnt on the torrent list
[05:52:00] I was meaning these mirrors: http://dumps.wikimedia.org/mirrors.html
[05:53:51] getting 110kbs from the chicago one
[05:53:59] I imagine thats best even though I am in Australia
[05:54:14] Ah :(
[05:54:23] Download manager with multiple streams?
[05:54:43] I'll probably sit out the 7h:19m
[05:54:57] Surely you must be used to this problem? ;)
[05:55:05] torrents?
[05:55:16] slow internets down under
[05:55:20] Reedy: i keep forgetting to ask at a more reasonable hour... do you know anything else about the dumps is slow issue?
[05:56:05] jeremyb: I know Ariel is both aware and has looked at it
[05:56:07] Reedy I think 30/1mbps is actually pretty good :P
[05:56:27] IIRC there is nothing obviously wrong. But no idea if much time was spent on it
[05:56:30] Reedy: all i know is that i looked at dataset2 in ganglia a few days back and it looked ok
[05:57:10] md_5: you're welcome to become a mirror or find someone else in australia to do so. ;-)
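Reedy's "download manager with multiple streams" suggestion comes down to issuing parallel HTTP Range requests against the dump file. Below is a minimal Python sketch of that idea, assuming the server honours Range requests and advertises Content-Length; the URL is only an example path, and each chunk is held in memory, which a real download tool would avoid for a 4 GB file.

```python
# Minimal sketch of a multi-stream download: fetch a large dump file in
# parallel byte ranges and stitch the parts together on disk.
# Assumes the mirror supports HTTP Range requests (example URL, not endorsed).
import concurrent.futures
import requests

URL = "http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pagelinks.sql.gz"  # example path
PARTS = 4

def fetch_range(start, end):
    # Request one inclusive byte range of the file.
    headers = {"Range": "bytes=%d-%d" % (start, end)}
    resp = requests.get(URL, headers=headers, timeout=60)
    resp.raise_for_status()
    return start, resp.content  # sketch only: a real tool would stream to disk

def main():
    size = int(requests.head(URL, timeout=60).headers["Content-Length"])
    step = size // PARTS + 1
    ranges = [(i, min(i + step, size) - 1) for i in range(0, size, step)]
    with open("pagelinks.sql.gz", "wb") as out, \
            concurrent.futures.ThreadPoolExecutor(max_workers=PARTS) as pool:
        for start, data in pool.map(lambda r: fetch_range(*r), ranges):
            out.seek(start)
            out.write(data)

if __name__ == "__main__":
    main()
```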
[05:57:21] that should definitely speed it up for you! ;)
[05:57:32] all my servers are overseas, hosting is ridiculous here
[05:57:40] I might shoot an email to AARNET though
[05:57:45] they host the ubuntu mirrors etc
[05:58:09] md_5: well do let us know what you hear from them
[05:58:14] * jeremyb crosses fingers
[05:58:48] md_5: that won't tell you much, i run TZ=UTC
[05:59:01] trying to figure out if you are australian
[05:59:07] He's not ;)
[05:59:09] * jeremyb is actually America/New_York atm
[05:59:13] Reedy: shhhhhhh
[05:59:18] I wonder if ULSFO might help here..
[05:59:32] Reedy: what about that new pipe they were building? ;)
[05:59:36] Though, we don't cache/similar dumps
[05:59:39] (undersea)
[05:59:44] right
[05:59:57] If it was a pipe that went via space I'd be very intrigued...
[06:00:07] Reedy is there rsync or something for use by mirrors?
[06:00:12] Reedy's so funny
[06:00:12] Reedy: how about just airmailing harddrives
[06:00:13] ftp maybe
[06:00:27] Prodego thats cheaper per terabyte in lots of cases
[06:00:33] md_5: yes. even rsync from mirrors that you can use now
[06:00:35] md_5: it's on the wiki...
[06:00:35] Reedy: any time someone wants a DB jump, just mail them a harddrive
[06:00:50] Reedy: it can come out of the WMF budget for stupid things, which I understand is very large
[06:00:52] http://en.wikipedia.org/wiki/Wikipedia:Database_download <-- rsync is not listed on that page
[06:01:03] :)
[06:01:07] I didn't think there was rsync
[06:01:33] md_5: http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Media
[06:01:38] Reedy: there is, i've used it even
[06:01:39] I know (possibly Tim else ariel) it was setup for a period for WhiteCat
[06:01:42] orly?
[06:01:58] Reedy: from a downstream mirror
[06:01:59] yeah but the main one doesnt have one jeremyb
[06:02:03] only downstream
[06:02:11] Prodego: Random sized drives with random wikipedia articles on them?
[06:02:27] md_5: right. but if you became a mirror then you could maybe get upstream rsync access
[06:02:32] "I've sent an email to AARNet, a mirror that was set up in 1998. Although I can't guarantee that it would allow us to have the XML dumps content (since they only focus on open source software), but cross your fingers! --Hydriz 10:55, 24 November 2011 (UTC)
[06:02:32] And they rejected, sigh. --Hydriz 03:54, 30 November 2011 (UTC) "
[06:02:46] I guess I will give it another shot.
[06:02:49] Prodego: https://twitter.com/LilyLivingstone/status/287822929314594817
[06:02:50] Reedy: just make sure they are all SSDs
[06:02:50] some of those
[06:03:25] ok, sleepy time...
[06:04:27] * Prodego waves goodnight to jeremyb
[06:04:30] md_5: there's a much faster way, funny that nobody has mentioned it yet
[06:04:38] good night Prodego
[06:04:59] what is that
[06:05:03] (faster way)
[06:05:12] Reedy: (there's nothing tracking it yet? e.g. RT/bz)
[06:05:29] There is a bug for it
[06:05:35] AWS has some Wikipedia backups as shared volumes, I think they are even uncompressed
[06:05:48] with cheap access
[06:06:10] TimStarling: updated how often?
[06:06:16] not sure
[06:06:33] but the general principle with big data is that processing is easier to move around than data, right?
[06:06:49] Reedy: looks like bug 43647
[06:07:04] http://aws.amazon.com/datasets/2506?_encoding=UTF8&jiveRedirect=1 claims "Last Updated: September 29, 2009 1:09 AM GMT"
[06:07:21] but it could be updated under a different name or something
[06:11:29] no, not updated
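For the rsync route discussed above, pulling from a downstream mirror that exposes an rsync module is a one-liner; the meta page linked at 06:01:33 lists the actual endpoints. The host and module in this sketch are placeholders, not real mirrors, and --partial simply lets an interrupted multi-gigabyte transfer resume.

```python
# Sketch of fetching a dump file from a downstream mirror over rsync.
# The host and module path below are placeholders -- consult the mirroring
# page on meta for real endpoints.
import subprocess

MIRROR = "rsync://mirror.example.org/wikimedia-dumps/enwiki/latest/"  # placeholder

subprocess.check_call([
    "rsync", "-av", "--partial", "--progress",
    MIRROR + "enwiki-latest-pagelinks.sql.gz",
    "./",
])
```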
[06:13:37] TimStarling: how are you doing by the way? I've been away a while, and haven't run in to you since I've been back frequenting IRC
[06:13:49] I'm doing fine
[06:14:15] awesome, awesome glad to hear it
[06:56:25] Hi Michael,
[06:56:26]
[06:56:26] I remember the last request for this, and unfortunately our reasoning this time will be the same. The archive is too big for the present available capacity, and we've had no requests from any researchers (identifying themselves as such) for the data to be mirrored.
[06:56:26]
[06:56:26] We may revisit the decision later this year once a new iteration of Mirror is deployed, but until then, sorry, we won't mirror it.
[06:56:29] oops
[06:56:32] oh well, there is the answer
[07:18:08] md_5: I do believe that quite a bit of data would be lost if the WMF lost all copies of wikipedia
[07:18:21] it really isn't hosted elsewhere
[07:18:45] there are 3 mirrors, and I dont think wmf is going to crash and burn across their entire network
[07:21:50] it is unlikely
[07:21:56] but hypothetically
[07:22:13] there was a loss of images a while ago
[07:22:47] bottom line, they have used up their 84TB of space and are waiting to upgrade
[07:24:39] I don't blame them, I doubt it would get too much use
[07:48:19] dat pastebn
[07:48:21] pastebin
[08:50:15] is anyone else getting a large white box that loads on pages when they render, which then disappears and leaves a white gap at the top?
[08:56:15] see also http://oi46.tinypic.com/16bncec.jpg
[09:01:55] Moe_Epsilon: that's a mighty fine white box
[09:02:22] indeed it is
[09:02:32] do you see it as well, or is it just me?
[09:03:02] nope.
[09:04:10] well that's weird
[09:04:37] Moe_Epsilon: seems a harmless error
[09:05:13] well when it shows up on every page you load it gets annoying :p so I have to find the source
[09:05:57] Source and gravy
[09:06:45] Moe_Epsilon: can you pastebin a copy of your {view/page}source?
[09:07:16] sure
[09:09:36] p858snake|l: http://pastebin.com/W2GiNiJf
[09:10:25] pagesource of [[Main Page]]
[09:16:40] p858snake|l: ahh, I right clicked on the page and see "This frame", and I see some kind of ad thing in it
[09:17:00] Moe_Epsilon: check the extensions you have installed and running on firefox then
[09:18:33] p858snake|l: yep, it was in relation to a file I recently downloaded, and uninstalled
[09:20:16] and fixed
[09:20:50] thats what I get for downloading a program from cnet
[09:56:41] andre__: is it possible not to show the "Spam" product on https://bugzilla.wikimedia.org/enter_bug.cgi ? feels silyl
[09:57:12] Nemo_bis, I have no idea anyway why somebody ever created it. Using RESOLVED INVALID is totally fine.
[09:57:27] anyway, you can close certain products for new bug entry.
[09:57:27] sure, there's a lot of wth-products in that page, but soo much clutter
[09:57:34] and I think it's a good idea to do it in this case.
[09:57:39] andre__: well, it says why, to filter them out automatically
[09:57:53] doesn't make much sense to me.
[09:58:07] Who wants to find INVALID reports anyway?
[09:58:23] CLosing for new bug entry also means that you cannot move reports to that product anymore, but I think that's not a big loss here.
[09:58:42] andre__: I do search invalid reports sometimes
[09:58:56] but admittedly it's quite rare
[09:59:10] in that case you can probably cope with six invalid ones more ;P
[09:59:20] Nemo_bis, are you fine with filing a ticket in bugzilla against Wikimedia/Bugzilla about it?
[09:59:20] only six bugs there?
[09:59:34] Yes of course!!! As if somebody used the stuff that somebody set up ages ago ;)
[09:59:44] (like a "Spam" product)
[10:01:45] https://bugzilla.wikimedia.org/show_bug.cgi?id=43767
[10:02:03] Nemo_bis, yay, thanks. And also thanks for bringing this up :)
[10:02:18] andre__: also, is it really so hard to move some products under a single product to make the new bug page less cluttered?
[10:02:40] soo many entries when 97 % of the people only want MediaWiki
[10:02:55] (of course exaggerating, but...)
[10:05:10] Nemo_bis, I guess there could be some hacks to do that. I am not aware of any "Move these products to the top" setting in Bugzilla.
[10:05:34] Nemo_bis, again I'd appreciate a bug report so I don't forget. Once I've done with some other code changes in Bugzilla that sounds indeed like a very good thing to do
[10:08:33] nice
[10:08:36] andre__: also https://bugzilla.wikimedia.org/show_bug.cgi?id=43768
[10:08:49] thanks a lot :)
[10:08:59] coming from http://quominus.org/archives/714#comment-688 , I must say
[10:14:43] for the collapsing I don't know, we alredy have the bugzilla taxonomy bug and RfC
[10:39:28] Nemo_bis: yeah, that posting had quite some good and interesting ideas. And I'm working on a few (e.g. new frontpage, see https://bugzilla.wikimedia.org/show_bug.cgi?id=22170 )
[10:39:34] * andre__ refering to the blogpost
[13:34:24] andre__: can't the reminders be sent only to the person in question?
[13:34:39] Nemo_bis, by private email?
[13:35:00] what's the advantage?
[13:35:28] not cluttering the bug with comments
[13:35:37] oh, "Send Mail to Bug Assignees" is just a mailto, tsk
[13:35:57] I hoped for some magic
[13:36:49] I never clicked it to keep this hope
[13:37:41] I guess I hope that by having such reminders visible that more people keep an eye on their assigned to list. I might be considered naive.
[13:40:45] andre__: sure, I don't mind either, just throwing ideas and hoping for you to extract the usual rabbit from your bugzilla admin magic hat
[13:40:59] haha
[15:01:10] anyone know chrome on windows? trying to find an easy way to tell someone how to get timing for HTTP reqs (i.e. initial connect to connection close. not to page load event firing)
[15:01:51] * jeremyb hasn't windows handy
[15:01:57] jeremyb: Yuvi knows Chrome stuff and MaxSem knows Windows but I don't know whether either of them knows both!
[15:02:19] sumanah: presumably chrismcmahon would know too. not on yet i guess
[15:02:27] correct
[15:04:11] sumanah: is this restaurunt thing serious? i can't tell ;P
[15:04:23] jeremyb: Please be more specific.
[15:04:24] sumanah: also, wow, i've never seen you with a pocket protector
[15:04:47] sumanah: http://identi.ca/notice/98769411
[15:05:30] jeremyb: No, he's joking. Also, I presume you have recently seen a photo of me wearing a pocket protector?
[15:05:43] sumanah: nope!
[15:05:44] Oh I see.
[15:05:56] http://www.crummy.com/2001/01/03/0 right.
[15:06:25] yah ;)
[15:06:55] * jeremyb wanted to go eat at the restaurunt ;-(
[15:07:08] we could have even had wikipedia meets there!
[15:07:27] * jeremyb waits for a windows person
[15:07:58] I still own that Linux pocket protector, and have now gotten it signed by the members of They Might Be Giants in a bid to make it the geekiest pocket protector ever. I rarely wear it; maybe I shall wear it tonight.
[15:08:47] well how often do you wear a pocket in the right place/size/etc. ?
[15:08:52] to put it in
[15:10:43] Right.
[15:23:19] sumanah: my first round of googling came up with something related but not quite what i wanted. tried again with different terms and i think i got it. sent links to https://developers.google.com/chrome-developer-tools/docs/overview#access && https://developers.google.com/chrome-developer-tools/docs/timeline
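The Chrome DevTools panels linked above are the recommended answer for jeremyb's question. For the timing he describes (initial connect through connection close, independent of any page-load event), a rough out-of-browser alternative is to time the request yourself; a standard-library Python sketch, not what DevTools reports, just a comparable wall-clock measurement:

```python
# Time an HTTP request from opening the TCP connection until the body is
# fully read and the connection is closed (not until a page-load event).
import http.client
import time

def time_request(host, path="/"):
    start = time.monotonic()
    conn = http.client.HTTPSConnection(host, timeout=30)
    conn.connect()                       # initial connect
    connected = time.monotonic()
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read()                          # drain the body
    conn.close()                         # connection close
    done = time.monotonic()
    return connected - start, done - start

if __name__ == "__main__":
    connect_s, total_s = time_request("en.wikipedia.org", "/wiki/Main_Page")
    print("connect: %.3fs, connect-to-close: %.3fs" % (connect_s, total_s))
```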
[15:23:25] good luck
[16:41:40] PHP Notice: Undefined variable: wmfRealm in /home/wikipedia/common/wmf-config/InitialiseSettings.php on line 12244
[16:45:17] qgil: hey, saw your name recently while browsing mako's bookshelf. was a surprise because i hadn't heard your name until a few months ago but mako caught me up a little :) (mako's name was on the same book. can't really remember which one except that it was yellow or orange.)
[16:45:50] jeremyb, :D Mmm Ubuntu manual?
[16:46:12] idk. the the ubuntu books are mostly purple?
[16:46:56] <^demon> MaxSem: Hmm... $wmfRealm should be defined before InitialiseSettings is included.
[16:47:02] qgil: it was near the middle of the back outside cover
[16:47:04] jeremyb, The Art of Community was blue, I think
[16:47:48] jeremyb, http://www.amazon.com/The-Official-Ubuntu-Book-ebook/dp/product-description/B000OZ0NA8
[16:48:33] jeremyb, so yes, I have been around in the free software community for some time, but changing jobs - areas
[16:49:07] qgil: oh, yeah, that looks familiarish. i remember rotating the boook to look at all the people on the cover :)
[16:49:26] jeremyb, I see you had a deep look at the book :D
[16:50:01] qgil: i looked at a bunch of them :)
[17:54:15] notpeter: I hear chrismcmahon needs to talk with you re that checklist https://wikitech.wikimedia.org/view/Eqiad_Migration_Planning/Checklist :-)
[19:01:55] hi henna, how are you?
[19:02:37] sumanah: ok, I'll touch base with him today
[19:03:04] Thanks notpeter
[19:03:09] chrismcmahon: ^
[19:03:17] you can expect a notpeter later :)
[19:03:21] henna: http://dpaste.com/869761/plain/
[19:04:07] ooh, deploy time
[19:08:18] sumanah: busy :(
[19:18:08] jeremyb: was that for me?
[19:23:21] I've read the blog post about using swift for the 320px pictures and i'd be very interested to hear more about how its working out. I've seen some mentions of hardware failures and such in the engineering reports but i'd like to know more about the hardware specs. I'm building a similar platform with 150MM images now and increasing, roughly 1k RPS.
[19:41:48] xmltok: you want to ask paravoid. He is working on the hardware/ops side of our swift installation
[19:42:07] xmltok: AaronSchulz might know about it too. He is in charge of the MediaWiki counterpart
[19:42:12] more people involved probably
[19:42:45] xmltok: if they don't answer here (paravoid might be off, he is in Europe and Aaron probably going to eat soon), you could try our wikitech-l mailing list.
[19:45:27] xmltok: if you search for "swift" on mediawiki.org and on wikitech.wikimedia.org you will get some more insight
[19:46:43] cool, i'll keep digging for now. i'm very interested in how you guys have scaled it, i had something similar to the original spec but it basically transformed the nodes into pointless replication load generators
[19:46:48] thanks
[19:57:25] xmltok: we also have doc on the ops wiki : https://wikitech.wikimedia.org/view/Main_Page
[19:57:52] xmltok: https://wikitech.wikimedia.org/view/Category:Swift
[19:58:17] xmltok: you have some basic hardware specs at https://wikitech.wikimedia.org/view/Swift/Server_layout
[19:58:33] the wiki might not be up to date though
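For anyone building a similar thumbnail store, the client side of talking to a Swift cluster is small regardless of the deployment details discussed above. A minimal sketch using python-swiftclient; this is generic Swift usage, not Wikimedia's MediaWiki integration, and the auth URL and credentials are placeholders assuming a simple tempauth v1.0 setup:

```python
# Store and fetch a thumbnail in a Swift cluster with python-swiftclient.
# Auth URL and credentials are placeholders (tempauth v1.0 style).
from swiftclient.client import Connection

conn = Connection(
    authurl="http://swift-proxy.example.org:8080/auth/v1.0",  # placeholder
    user="account:user",
    key="secret",
)

conn.put_container("thumbs")

with open("example-320px.jpg", "rb") as f:
    conn.put_object("thumbs", "wikipedia/en/example-320px.jpg",
                    contents=f, content_type="image/jpeg")

headers, body = conn.get_object("thumbs", "wikipedia/en/example-320px.jpg")
print(headers.get("content-length"), "bytes fetched")
```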
[20:35:27] Reedy: do you know if anyone altered test2wiki today? UploadWizard seems to have stopped working out there recently.
[20:51:34] chrismcmahon: WFM mostly..
[20:51:34] https://test2.wikipedia.org/wiki/File:Reedytestfishthing.png
[20:51:50] Going from describe I see a JS error
[20:51:51] Uncaught ReferenceError: response is not defined
[20:52:29] Reedy: https://bugzilla.wikimedia.org/show_bug.cgi?id=43791 all the browser tests failed and I can repro manually too.
[20:53:01] Yeah, I get that I think
[20:53:05] but the file does get uploaded
[20:53:06] sure enough
[20:53:43] updating BZ, thanks
[20:56:54] you beat me to ie :)
[20:56:57] it
[20:59:43] * aude tries uploading
[21:00:24] Uncaught ReferenceError: response is not defined
[21:00:46] <^demon> chrismcmahon: Beat you to ie? Why would you race to open IE?
[21:00:52] <^demon> "I can open a crappy browser before you can!"
[21:00:56] <^demon> :)
[21:01:20] I have automation to open crappy browsers these days, makes me happy.
[21:20:52] chrismcmahon: Yes, UW got updated, what broke?
[21:21:20] Oh I see
[21:21:36] marktraceur: yep, https://bugzilla.wikimedia.org/show_bug.cgi?id=43791
[21:22:09] I don't think I merged anything that touched that part
[21:26:14] Reedy: could the CommonSettings.php thing you fixed 30 min ago have anything to do with bug 43791?
[21:26:37] * robla is just casting about for a few minutes
[21:28:03] nm....I see the chatter in #mediawiki
[22:37:36] henna: yeah, it was about the same time that sumanah poked you and I rarely see you active here and I usually associate you and stroopwafel with eachother :)
[22:41:00] mmmm stroopwafel
[23:13:16] o_o https://meta.wikimedia.org/wiki/Help_talk:Import#XML-Import_only_for_small_Files_posible
[23:22:43] $ mwscript eval.php testwiki
[23:22:43] > $wgConf->loadFullData();
[23:22:43] > var_dump($wgConf->get('wmgEnableGeoData', 'eswiki'));
[23:22:43] bool(false)
[23:22:57] what am I doing wrong?
[23:36:18] hashar: sorry for the belated response, those links are exactly the kind of information i was looking for. thanks a bunch
[23:41:49] Can anyone explain how the pagelinks table works?
[23:41:58] 10 0 Computer_accessibility
[23:42:07] pl_from, namespace, pl_title
[23:42:38] from - page containing the link
[23:43:04] namespace and title are the page the link points to
[23:43:30] it can be nonexistent hence ns/title instead of page id
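Tying this back to the earlier pagelinks question: with the three-column layout just described (pl_from is the id of the page containing the link, namespace plus title name the target, which may not exist yet), the SQL dump can be streamed straight into an adjacency list. A rough sketch, assuming the usual mysqldump "INSERT INTO `pagelinks` VALUES (...),(...);" format of the public dumps; the simple regex skips titles containing escaped quotes.

```python
# Build an adjacency list from a gzipped pagelinks SQL dump:
# pl_from (source page id) -> set of target titles in the chosen namespace.
import gzip
import re
from collections import defaultdict

# Matches tuples like (10,0,'Computer_accessibility'); titles with escaped
# quotes are skipped by this deliberately simple pattern.
ROW = re.compile(r"\((\d+),(-?\d+),'([^']*)'\)")

def load_links(path, namespace=0):
    links = defaultdict(set)
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as dump:
        for line in dump:
            if not line.startswith("INSERT INTO"):
                continue
            for pl_from, pl_ns, pl_title in ROW.findall(line):
                if int(pl_ns) == namespace:
                    links[int(pl_from)].add(pl_title)
    return links

if __name__ == "__main__":
    graph = load_links("enwiki-latest-pagelinks.sql.gz")  # example filename
    print(len(graph), "pages with outgoing main-namespace links")
```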