[00:00:10] ok, checking out the load balancer...
[00:00:21] well, probably browser dependent I guess. Opera shows the page without CSS until IPv4 fallback
[00:00:55] connection to :b working too
[00:08:05] MrOmNom: can you try doing a refresh of a page?
[00:10:12] alright, a refresh doesn't hit :a again because it's still smart enough, but I moved to a different browser and did the load again (same problem, :1, :b works, can't get to :a)
[00:10:39] yeah, i see that… weird, lots of syn packets
[00:13:13] i'm checking out all of the backend hosts for bits now
[00:15:56] I just noticed the same problem on youtube (some hosts in a subnet respond, others don't)... so it must not be a problem with wikipedia. Either a local issue or a Comcast issue.
[00:17:06] that's so strange though ...
[00:21:48] thanks for looking into it though, I appreciate it
[00:21:54] no problem
[00:22:05] if you find out what the issue was soon, pop in and let me know :) very curious
[00:23:50] word
[01:05:08] binasher: got a minute to talk about varnish?
[01:05:52] ori-l: not right now, helping ryan with some labs db stuff
[01:06:06] binasher: k, np
[01:46:32] Hi guys. Does anybody know if it is possible to rename an account having more than 50,000 edits?
[01:48:52] sometimes it works
[01:49:05] it shouldn't be encouraged
[01:49:38] White Cat does it about once a month, I think his edits are scattered over 2 or 3 usernames now
[01:50:01] ok maybe that's a slight exaggeration
[01:50:08] maybe once a year
[01:50:11] A user needs to change his username due to privacy issues
[01:50:49] (right to vanish)
[01:51:58] feel free to try it
[01:52:18] Only devs can do that, right?
[01:54:53] no, I think anyone can
[01:55:26] there used to be a low limit, but that was removed years ago I think
[01:55:42] for a while the limit was 200k, but eventually that was removed too
[01:55:50] humm
[01:56:08] so, could i do that locally as a crat?
[01:56:29] yes
[01:57:08] it will send a lot of jobs to the job queue; the name you see in article histories will change as the job queue catches up with them
[01:57:26] it may take a while
[01:59:00] ok, so i will try this
[01:59:07] thanks, tim
[01:59:47] yw
[02:23:27] !log LocalisationUpdate completed (1.20wmf7) at Thu Jul 19 02:23:27 UTC 2012
[02:23:39] Logged the message, Master
[02:31:02] !log LocalisationUpdate completed (1.20wmf6) at Thu Jul 19 02:31:02 UTC 2012
[02:31:10] Logged the message, Master
[03:58:53] James_F: Is there an updated demo of VisualEditor somewhere? Are you all still looking for bugs?
[03:59:16] Brooke: No, not until Monday's deploy-train. Yes.
[03:59:26] Brooke: In git, I guess is my smart-arse answer. :-)
[03:59:47] Heh.
[04:00:01] I don't think I've checked out any code from Wikimedia's git repo yet.
[04:00:16] VE can't just be installed as an extension, can it?
[04:00:22] No, it can.
[04:00:27] It is on my local dev instance.
[04:00:32] Oh, hmm.
[04:00:35] Maybe I'll install it, then.
[04:00:36] You'll need to point it at a Parsoid instance, though.
[04:00:42] I thought it had a Parsoid dependency or something.
[04:00:44] Ah.
[04:00:51] So it'll pass all my data to some foreign host?
[04:00:58] Unless you want to run nodeJS, yes.
[04:01:09] No issue with using wmflabs.org if you're OK with it.
[04:01:17] It's only for testing. :-)
[04:01:38] You mean it's already on labs or I could set it up there?
[04:01:52] I figured there'd be a visualeditor.wmflabs.org or something.
[04:02:24] Parsoid's already on labs.
[04:02:28] That's what my dev box uses.
[04:02:33] * James_F hunts for the config line.
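(For context: the "config line" being hunted for here is the VisualEditor setting that tells the extension which Parsoid service to talk to. Before wiring that up, it can help to confirm that a Parsoid HTTP service answers at all. A minimal shell sketch, assuming a local Parsoid listening on port 8000 with a /<prefix>/<title> URL layout — the host, port and path are assumptions, not details taken from this conversation:)

    # Sketch only: host, port and URL layout are assumptions about a typical local setup.
    curl -sI "http://localhost:8000/" | head -n 1           # does the service answer?
    curl -s  "http://localhost:8000/en/Main_Page" | head    # does it return rendered HTML for a page?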
[04:02:40] http://wmflabs.org/ is still broken?
[04:03:21] https://bugzilla.wikimedia.org/show_bug.cgi?id=36885
[04:04:21] I fell down the Labs rabbit hole again.
[04:05:00] I was reading Mozilla's commit access process today. Wikimedia's procedure has a long way to go before it's as awful as Mozilla's. ;-)
[04:05:06] Requires signed paperwork and such.
[04:05:15] And about ten steps.
[04:05:37] Brooke: Are you asking for it to be more bureaucratic?
[04:06:12] James_F: I'm the one filing the bugs to remove the required human intervention for things like account creation. ;-)
[04:06:20] * James_F grins.
[04:06:20] I find the current system very walled garden-y.
[10:02:47] hello!
[10:03:13] i need the rights to push tags to a git repo. halp? ^^
[10:51:27] Jens_WMDE: ^demon would be able to do so. He should be connected "soon" if he is working today
[13:10:27] !log Deleted php-1.20wmf6/cache/l10n from mediawiki-installation
[13:10:35] Logged the message, Master
[13:12:55] !log Pointed /h/w/c/php to php-1.20wmf7
[13:13:02] Logged the message, Master
[13:36:43] grmrmblbl
[13:36:49] java loooves long paths
[13:37:05] cd hooks-jira/src/main/java/com/googlesource/gerrit/plugins/hooks/jira/
[13:38:33] that's nom
[13:38:35] that's normal
[13:38:37] I'm sure you can be working on java.com.googlesource.gerrit.plugins.org.wikimedia.users.hashar.plugins.src soon
[13:39:03] ;-D
[13:43:21] hate gerrit hate hate
[13:46:50] paravoid: hello Faidon. How can I help you today?
[13:46:57] hahaha
[14:18:03] !log reedy synchronized wmf-config/
[14:18:11] Logged the message, Master
[14:53:32] anyone from features team around?
[15:32:23] matanya: probably not yet
[15:32:36] matanya: I guess they are mostly in SF where it is 6:30am right now
[15:32:42] or is it 8:30 am
[15:32:49] well not in office yet ;-]
[17:01:36] !log Running copyFileBackend.php for commons (shards c-f)
[17:01:43] Logged the message, Master
[18:52:09] !log preilly synchronized wmf-config/mobile.php 'fix subdomain check'
[18:52:17] Logged the message, Master
[18:52:28] !log fixing subdomain check for zero vs m
[18:52:35] Logged the message, Master
[20:21:43] when I try to run git branch --track wmf/1.20wmf7 origin/wmf/1.20wmf7, I get the error 'fatal: Not a valid object name: 'origin/wmf/1.20wmf7'.'
[20:22:40] which has never happened before
[20:25:47] did a git fetch and now it works
[20:25:50] weird
[20:29:07] Most likely your local repo didn't know about wmf7 yet
[20:31:58] ah
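(In other words, the remote-tracking ref for the new wmf7 branch didn't exist locally until a fetch brought it in. A minimal shell sketch of the working sequence, using the same branch name mentioned above — nothing here beyond standard git:)

    # Refresh remote-tracking refs first, then create a local branch that tracks the remote one.
    git fetch origin
    git branch --track wmf/1.20wmf7 origin/wmf/1.20wmf7
    git checkout wmf/1.20wmf7
    # Equivalent one-step form: git checkout -b wmf/1.20wmf7 origin/wmf/1.20wmf7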
[21:02:05] RoanKattouw: brt
[21:33:05] running scap now
[21:36:26] !log catrope synchronized php-1.20wmf7/extensions/LastModified 'Remove E3Experiments cruft from LastModified'
[21:36:34] Logged the message, Master
[21:39:40] !log kaldari Started syncing Wikimedia installation... :
[21:39:47] Logged the message, Master
[22:10:14] !log catrope Started syncing Wikimedia installation... :
[22:10:22] Logged the message, Master
[22:45:04] gn8 folks
[22:48:47] !log catrope Finished syncing Wikimedia installation... :
[22:48:54] Logged the message, Master
[23:34:33] Hi, I'm a researcher and I have a question about the wikimedia servers (specifically the servers running the en wikipedia). Is this the right channel to ask in?
[23:34:49] DavidJurgens: Yes
[23:36:53] We're working on a research project that looks at how people gather information to make decisions. Given that wikipedia is a great source of information, we've been looking at page-views as a proxy for how interested people are by topic. Would it be possible to get the page views broken down by geographic region?
[23:38:06] I don't think any geographic information is included (or IP addresses for doing a location lookup), so I wasn't sure if this data was retained at all, let alone publicly available
[23:38:36] not on a per-page basis, but for all WMF hosted sites: http://stats.wikimedia.org/wikimedia/squids/SquidReportPageViewsPerCountryOverview.htm
[23:38:58] DavidJurgens: On http://stats.wikimedia.org/ you can find a lot of data
[23:39:30] Is it possible to obtain it on a per-page basis? The choice of which page matters quite a bit for our research
[23:42:09] DavidJurgens: I'm not a system administrator, but with the current infrastructure, that wouldn't be possible easily
[23:42:17] For example, I can get access to the per-page access logs here: http://dumps.wikimedia.org/other/pagecounts-raw/
[23:42:29] yes, these are public
[23:42:36] which is a fantastic resource. However, we'd love to see if we can get those on a localized level
[23:43:50] I realize it's not a trivial task at all, but I wasn't sure who specifically I could talk to about eventually getting access
[23:44:32] especially since I realize there are privacy concerns with the data and I would want to fully explain what we do with it to ensure it meets the Wikimedia Foundation's standards
[23:45:01] DavidJurgens: See https://wikimediafoundation.org/wiki/Contact_us
[23:45:05] I have to leave, good night ;)
[23:46:09] thank you for your help!
[23:46:32] Does anyone have a suggestion for which address is best to email with research-based questions?
[23:47:02] hang on
[23:48:50] maybe Erik Zachte (ezachte@...) would be a good place to start
[23:49:40] TimStarling: Thanks! What is the domain for that email address?
[23:49:47] wikimedia.org
[23:50:10] okay great! I'll send him an email first. Thanks for helping me figure out who to talk with. There are so many options :)
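(For anyone following up on the pagecounts-raw dumps mentioned above: each hourly file is gzipped plain text with one line per title, in the form "project page_title view_count bytes_transferred", aggregated globally — which is why they don't answer the geographic question discussed here. A rough shell sketch of pulling one article's hourly counts; the directory layout and file name below are illustrative guesses, so check the actual listing under the URL given above:)

    # Illustrative path/file name; browse http://dumps.wikimedia.org/other/pagecounts-raw/ for real ones.
    wget http://dumps.wikimedia.org/other/pagecounts-raw/2012/2012-07/pagecounts-20120719-000000.gz
    # Fields: project title views bytes ("en" = English Wikipedia).
    zcat pagecounts-20120719-000000.gz | awk '$1 == "en" && $2 == "Main_Page" {print $3 " views, " $4 " bytes"}'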