[10:01:22] hashar, Reedy: you guys up for the hangout in about 15 minutes? [10:01:33] DanielK_WMDE: I am awake :-] [10:01:39] ;) [10:01:43] DanielK_WMDE: Sam usually shows up a bit later [10:01:54] I mean later than 10am ;-] [10:02:06] yea [10:02:25] i don't have that much to discuss... but maybe we can talk about the profiling stuff and get it merged [10:02:53] ah, aude just mentioned https://gerrit.wikimedia.org/r/#/c/33734/ [10:02:56] haven't looked yet [10:03:02] anyway, talk to you in a minute [10:05:28] hi [10:12:03] getting a coffee [10:15:08] hashar: can i have one too [10:16:07] ori-l: too late for you :-D Get a pillow instead? [10:16:30] :< [10:16:34] heh [10:19:43] hashar: DanielK_WMDE Reedy https://gerrit.wikimedia.org/r/#/c/32574/ also [15:26:07] New patchset: Hashar; "minor cleanup of extrasettings for Wikibase" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32837 [15:26:30] New review: Hashar; "rebased, fixed merge conflict. Deploying." [integration/jenkins] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/32837 [15:26:30] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/32837 [15:49:02] PHP 5.5 [15:49:04] "We also dropped support for Windows XP and 2003." [15:49:09] Seems a little premature [15:50:13] yer still on XP? [15:51:43] I'm not [15:52:18] Though, I guess MS have dropped support for it now [16:11:35] On April 10, 2012, Microsoft reaffirmed that extended support for Windows XP and Office 2003 would end on April 8, 2014 and suggested that administrators begin preparing to migrate to a newer OS.[11][12][13] [16:11:40] The NT-based versions of Windows, which are programmed in C, C++, and assembly,[14] are known for their improved stability and efficiency over the 9x versions of Microsoft Windows.[15][16] Windows XP presented a significantly redesigned graphical user interface, a change Microsoft promoted as more user-friendly than previous versions of Windows. 
A new software management facility called side-by-side assembly was introduced to ameliorate the "DLL hell" t [16:11:46] During Windows XP's development, the project was codenamed "Whistler", after Whistler, British Columbia, as many Microsoft employees skied at the Whistler-Blackcomb ski resort.[19] [16:11:50] According to web analytics data generated by W3Schools, from September 2003 to July 2011, Windows XP was the most widely used operating system for accessing the w3schools website, which they claim is consistent with statistics from other websites. As of October 2012, Windows XP market share is at 22.1% after having peaked at 76.1% in January 2007.[5] [16:11:55] Contents [16:11:58] 1 User interface [16:12:01] 2 New and updated features [16:12:03] 3 Removed features [16:12:05] 4 Editions [16:12:08] 4.1 Editions for specific markets [16:12:11] 4.2 Languages [16:12:13] 4.3 ATMs and Vendors [16:12:16] 5 Service packs [16:12:18] 5.1 Service Pack 1 [16:12:20] 5.2 Service Pack 2 [16:12:23] 5.3 Service Pack 3 [16:12:26] 6 System requirements [16:12:28] 6.1 Physical memory limits [16:12:30] 6.2 Processor limits [16:12:33] 7 Support lifecycle [16:12:36] 8 License and media types [16:12:38] 8.1 Retail [16:12:41] 8.2 Volume License [16:12:43] 8.3 Original Equipment Manufacturer (OEM) [16:12:46] 9 See also [16:12:48] sh*t sorry for that [16:19:52] saper: Great job! :P [16:23:46] yeah [16:23:52] wikipedia2irc gateway :( [16:26:34] New review: Erik Zachte; "one issue with .gitignore, if you fix I will +2" [analytics/wikistats] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/33858 [16:26:59] New review: Erik Zachte; "one issue with .gitignore, if you fix I will +2" [analytics/wikistats] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/33858 [16:28:00] <^demon> saper: I can haz code review? [17:08:45] btw Reedy Score is getting deployed in about 7 hours to test2 [17:18:05] pop quiz andre__ [17:18:11] what site(s) are we deploying to today? 
[17:18:13] oh noes [17:18:19] 1.21wmf4 [17:18:31] en.wp [17:18:38] as wmf4 got out last week [17:19:01] good job :) [17:19:17] in a month you'll be quizzing me [17:19:18] ;-) [17:19:27] qgil: how was your weekend? [17:19:53] sumanah, relaxed. :) Yours? [17:20:11] Not bad! my spouse threw a party that had a bunch of our friends and it was lovely to see so many [17:21:09] sumanah, I was actually in a (baby birthday) party. :) The food was excellent (not baby food) [17:21:35] sumanah I'm writing an email inviting people to take Kevin's survey about newcomers [17:21:38] HA [17:21:40] okay! [17:30:50] ^demon: hope to finish unpacking stuff after move and then sit down to gerrit; I guess I need a working ldap env to test :/ [18:02:12] New patchset: Krinkle; "Cleanup: Remove crap" [integration/grunt-contrib-wikimedia] (master) - https://gerrit.wikimedia.org/r/34108 [18:02:24] Change merged: Krinkle; [integration/grunt-contrib-wikimedia] (master) - https://gerrit.wikimedia.org/r/34108 [18:09:23] New patchset: Krinkle; "Fix jshint .bin path" [integration/grunt-contrib-wikimedia] (master) - https://gerrit.wikimedia.org/r/34110 [18:09:34] Change merged: Krinkle; [integration/grunt-contrib-wikimedia] (master) - https://gerrit.wikimedia.org/r/34110 [18:13:33] New patchset: Krinkle; "Grunt: Add basic build file for linting with jshint." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34112 [18:45:25] sumanah: When you get a chance, I could use a push-through of some OPW candidates' requests, in particular https://www.mediawiki.org/wiki/Developer_access#User:Ilonakz_2 [18:45:43] jeremyb: want to help? ^ [18:46:00] * sumanah waits for bits to load :/ [19:14:42] sumanah: sure [19:14:45] Thanks [19:14:55] what's OPW? [19:15:12] Outreach Program for Women. 
[19:15:17] https://blog.wikimedia.org/2012/11/15/apply-for-the-foss-outreach-program-for-women-internships/ [19:15:34] oh, right [19:15:54] the thing that i pointed nicole ebber to [19:29:32] http://jsfiddle.net/XQpzU/4358/ [19:30:27] plop [19:35:31] hey Nikerabbit [19:36:14] hi DanielK_WMDE [19:36:29] Nikerabbit: what do you think of https://bugzilla.wikimedia.org/show_bug.cgi?id=42159 ? [19:37:03] i think that's the best we can currently do without having to wait for a patched version of squid, or for varnish [19:37:08] http://jsfiddle.net/XQpzU/4361/embedded/result/ [19:38:02] DanielK_WMDE: doable I guess [19:38:32] is someone going to work on that... I've been allocated 2h for ULS issues in this sprint [19:38:51] i might. can't promise it though [19:39:05] depends on how other wikidata stuff is progressing [19:39:58] Does anyone know what the name of this pattern is ? http://jsfiddle.net/XQpzU/4361/embedded/result/ http://cl.ly/image/2R0d1e3w270R [19:40:11] pie-angle-turn-draw pattern I call it.. [19:40:21] pi * [19:42:20] Krinkle: do you happen to know how we keep squids from caching pages for logged in users (or more importantly, from returning cached pages to logged in users) [19:42:54] DanielK_WMDE: I think I can find out, but is there something you need help with? [19:43:15] It has to do with headers I believe (Vary: Cookie, perhaps) [19:43:22] Krinkle: i'm trying to figure out how to implement bug 42159. [19:43:34] well, not just "cookie" but something like it [19:43:37] the question is - can it be done without changing the squid config? [19:43:44] A header from mediawiki, not from the browser. [19:43:52] !bug 42159 [19:44:01] http://bugzilla.wikimedia.org/42159 [19:44:15] Krinkle: can it be done with cache control headers? does it need that XVO stuff? 
[19:44:31] Krinkle: sorry, link is a few lines up in the backlog :) [19:44:49] DanielK_WMDE: Wait, so you're saying you want to disable squid caching for pages that use ULS - like we disable squid caching for any logged-in user page views? [19:45:02] that means all wikidata pages will not have squid caching, right? [19:45:25] Krinkle: right. where "use" ULS means the user has actively set their preferred language, and ULS is remembering it using a cookie [19:45:49] Krinkle: no, they will use squids for all pages that are accessed by anons without manually changing the language [19:45:50] But why would that require a cache miss? USL is all javascript, right? [19:46:06] ULS* [19:46:14] Krinkle: but the page content is not translated by javascript. mediawiki actually serves different content [19:46:26] hey :) [19:46:26] What does ULS do that makes mediawiki do that [19:46:31] ULS just sets the user language. for an anon user. [19:46:47] hashar: There you are, when did you join? I've been looking for you. [19:46:55] Krinkle: ULS forces the user language of the anon user. Wikibase renders page content according to the user language [19:46:59] DanielK_WMDE: define "set user language" [19:47:02] together, you get what we have on wikidata.org [19:47:08] DanielK_WMDE: Set uselang query param? A cookie? Session ? [19:47:14] Krinkle: just joined. Keep on with daniel :-) [19:47:24] hashar: k, look at the gerrit changes I assigned to you. [19:47:26] Krinkle: set $wgLang and $wgUser->whatever [19:47:38] Krinkle: ...and set a cookie, so the next request triggers the same thing [19:47:39] DanielK_WMDE: No, at that point stuff has already happened. [19:47:46] DanielK_WMDE: What does ULS js do that makes mediawiki do that. [19:47:55] Ah, so it sets a cookie [19:48:10] yes. but we can't use that for per-language caching [19:48:23] so you just want to fragment the cache by user language, because it is not user specific right? Just language specific. 
[19:48:26] see https://bugzilla.wikimedia.org/show_bug.cgi?id=41451 for exhaustive discussion [19:48:50] Krinkle: no, actually fragmenting the cache is for later, because it is not feasible with squid as it is. [19:49:06] Krinkle: there has been a long debate involving tim, mark and eric on this. [19:49:14] DanielK_WMDE: There is no parser cache for wikidata pages? [19:49:37] Hm.. I suppose not, since no wikitext is being parsed. [19:49:51] Krinkle: that's a completely different level. but at the moment, there is no parser cache. because of the magic "canonical" parser options, which can not account for the user language [19:50:02] that's a similar story on a completely different level [19:50:18] Well, I'd say this is outside my knowledge base where I feel comfortable giving recommendations. [19:50:22] Krinkle: what is wrong with using mw.foo() instead of mediaWiki.foo() ? [19:50:39] Krinkle: the parser cache can be used for any kind of generated html, no matter how it was produced. but not having to parse wikitext helps when the cache is off - generating a wikidata page isn't horribly slow [19:50:42] hashar: We never use mw as a global variable. [19:50:52] hashar: We always have closures that keep a permanent reference. [19:50:59] hashar: These files don't have that yet. [19:51:06] ahhh [19:51:11] Krinkle: so back to my original question - how is squid caching suppressed for logged in users? I just want to understand the mechanism, or at least know where to find that mechanism [19:51:14] is it in OutputPage? [19:51:20] I don't know. [19:51:38] ok, thanks then. [19:51:41] DanielK_WMDE: we also have ideas for fragment caching with Parsoid, but that is future stuff [19:51:44] i guess i'll just have to dig in [19:51:44] DanielK_WMDE: Afaik it is using Vary options, and mediawiki sends some special header for logged in users. [19:51:51] oh hi gwicke [19:52:00] DanielK_WMDE: Tim would know, did you ask him? 
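The Vary mechanism Krinkle alludes to here can be sketched as a toy model in Python. This is not MediaWiki's or Squid's actual code, just an illustration of the principle: a cached response is only reused when the URL and all request headers named in the response's Vary list match. With `Vary: Cookie`, any cookie (a login session, or a language cookie like ULS sets) changes the cache key, which is why cookie-bearing users never receive an anonymous user's cached copy.

```python
def cache_key(url, request_headers, vary_headers):
    """Build a cache key from the URL plus only those request headers
    that the response's Vary list names (a toy model of HTTP caching)."""
    varied = tuple(
        (h, request_headers.get(h, "")) for h in sorted(vary_headers)
    )
    return (url, varied)

# With "Vary: Cookie", any difference in the Cookie header (a login
# session cookie, or a forced-language cookie) produces a different key,
# so those requests cannot be answered from the anonymous user's copy.
# The cookie names below are purely illustrative.
anon = cache_key("/wiki/Q1", {}, ["Cookie"])
logged_in = cache_key("/wiki/Q1", {"Cookie": "session=abc"}, ["Cookie"])
forced_lang = cache_key("/wiki/Q1", {"Cookie": "language=de"}, ["Cookie"])
assert anon != logged_in and anon != forced_lang
```

The downside discussed in the log follows directly from this model: keying on the whole Cookie header disables caching for those users entirely, whereas true per-language fragmentation would need the cache to vary on just the language value, which plain Squid of that era could not do.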
[19:52:01] might get to it in February or so [19:52:17] It is 7AM where Tim lives, I guess he'll join within a few hours. [19:53:17] DanielK_WMDE: in theory things like infoboxes are self-contained in the DOM and can thus be cached / updated separately [19:53:25] DanielK_WMDE: as I said this morning, definitely send Tim an email [19:53:38] DanielK_WMDE: I will talk about it during our weekly conf call which is in 2 hours [19:54:20] Krinkle: the submodule from https://gerrit.wikimedia.org/r/#/c/34112/ is using an ssh://Krinkle@… URL for the submodule. Should be https :-] [19:54:41] hashar: Thanks [19:55:36] Krinkle: i'll sleep in a few hours :) I'll dig into the code tonight and tomorrow, and then ask tim. [19:55:41] hi TrevorParscal [19:55:50] hello [19:56:19] hashar: Can I just change the gitmodules file, or what? I see no "git submodule del" or "git submodule rm" or " change" [19:57:24] Krinkle: either manually change the .gitmodule file or change the remote in the sub directory? [19:57:28] no idea honestly [19:57:35] hm.. [19:59:10] hashar: I updated the file, rm-rf the dir, and submodule update. It is still pointing to krinkle@ (the change will be fine, but my local copy will not) [19:59:23] looks like it kept the .git dir still [19:59:30] since the .git dir is actually inside the root .git dir [19:59:33] anyhow, fixed [19:59:39] maybe git submodule sync [20:00:16] check out line 3608 in OutputPage.php, " $wgUseSquid && session_id() == '' && !$this->isPrintable() && [20:00:00] submodule interface is really a mess in git [20:00:34] New patchset: Krinkle; "Grunt: Add basic build file for linting with jshint." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34112 [20:02:32] saper: 1844: # Add an X-Vary-Options header for Squid with Wikimedia patches [20:02:34] also interesting [20:04:45] GRmnlnlnlblb nodejs [20:05:28] ./tools/gruntjs/bin/grunt [20:05:29] Fatal error: You must install grunt-cli to use the "grunt" command. 
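The OutputPage.php fragment quoted above is the answer to DanielK_WMDE's question, and it can be restated as a simple predicate. A rough Python restatement follows; the real check is PHP and has more clauses (the quoted line ends in `&&`), so this captures only the part visible in the log:

```python
def is_squid_cacheable(use_squid, session_id, printable):
    """Restate the quoted cacheability condition: a page may only be
    stored in / served from the squid cache when squid support is
    enabled, there is no PHP session (so no logged-in user), and this
    is not a printable view. The real MediaWiki check has further
    clauses not shown in the log."""
    return use_squid and session_id == "" and not printable

# An anonymous, non-printable page view is cacheable ...
assert is_squid_cacheable(True, "", False)
# ... but any active session (i.e. a logged-in user) suppresses caching.
assert not is_squid_cacheable(True, "abc123", False)
```

This matches the earlier part of the conversation: caching is suppressed per request whenever a session exists, rather than fragmented per user.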
[20:05:30] I am cursed [20:05:47] I thought we had installed grunt-cli under tools/gruntjs [20:06:10] well it is there [20:25:55] Krinkle: I am wondering how one could launch grunt-cli :-] [20:26:37] it seems the correct path is tools/gruntjs/node_modules/grunt-cli/bin/grunt :-( [20:27:17] Yes [20:27:23] which is hmm [20:27:25] long [20:27:27] ;-] [20:27:41] hashar: obviously, if no global stuff and no npm stuff is used, things suck [20:27:47] what puzzles me is that when we install grunt with "npm install grunt" [20:27:52] we also install grunt-cli as a dependency [20:28:11] Yes, that's what you just pointed out in that path [20:28:14] but the gruntjs/bin/grunt still has the lame error message [20:28:24] but it does have it [20:28:27] Because that's an old legacy thing for compatibility [20:28:28] that is really dumb [20:28:28] anyway [20:28:44] I guess we could add a symbolic link under /bin [20:28:45] hashar: Once we have vagrant stuff, maybe we can just go with npm install [20:28:50] not sure how well it will play though [20:33:35] New patchset: Hashar; "symlink to grunt-cli under ~/bin" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34213 [21:12:48] New patchset: Krinkle; "Add symlink for grunt-cli to ./bin" [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34213 [21:15:40] Change merged: Krinkle; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34213 [21:20:02] Hiya, just wanted to ping people about https://bugzilla.wikimedia.org/show_bug.cgi?id=42271 " Special:ExtensionDistributor gives 404 error page when downloading from master " [21:20:38] in case it's something easy to fix (a server/service restart, etc) [21:20:47] thxbai [21:43:52] TrevorParscal: Can you ping James_F to #wikimedia-staff when you/he gets back? [21:44:26] yes [21:44:32] he is eating [21:44:44] will you be on for a bit? [21:45:02] Ah, I see. [21:45:06] Now you're both back. [21:45:21] I'll wait until he's back. 
Want to talk about the dates of me being in SF. [21:45:39] TrevorParscal: [21:46:18] sure [21:46:19] TrevorParscal: yes, I'll be on for a bit. Just finished some work on continuous integration. We're getting _really_ close now to enabling jshint on repos. [22:15:31] hashar: ping [22:15:40] Krinkle: in conf call :-) [22:15:47] ok [22:15:53] Krinkle: i have added a very trivial change to add a symbolic link to grunt-cli [22:16:02] for the rest, ping me by email :-] [22:16:03] and I merged it [22:16:10] nice! thanks. [22:16:26] I will head to bed just after the conf call so can't really follow up on anything today [22:18:23] Nikerabbit, any reason why I shouldn't deploy Solarium today? I was thinking about deploying the code but not actually enabling it anywhere so that it could be enabled later at any moment without requiring a scap [22:39:51] New patchset: Hashar; "Grunt: Add basic build file for linting with jshint." [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34112 [22:40:59] New review: Hashar; "nice :-] As per our IRC conversation, lets keep /tools/ for libraries." [integration/jenkins] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/34112 [22:40:59] Change merged: Hashar; [integration/jenkins] (master) - https://gerrit.wikimedia.org/r/34112 [23:16:32] is wikivoyage on the central login system? [23:16:45] it won't let me log in, says " .*[A-Z -]{5,}.* <newaccountonly|casesensitive>> " [23:17:01] hashar: any idea why AFTv5 stopped working in beta just now? 
[23:17:07] sounds like the system's there but the local blacklist forbids my account from being created locally [23:17:11] no idea, not there [23:17:13] :( [23:17:21] chrismcmahon: really need to get to bed, I am sorry :-( [23:17:32] brion: your last name must be offensive ;) [23:17:40] just cause i'm SHOUTING [23:17:40] chrismcmahon: maybe some faulty commit has been automatically deployed [23:20:05] brion, it's probably in the central login system but won't let you merge your account there because of the blacklist [23:20:32] yeah [23:21:02] I wonder if there are any local sysops around willing to fix it [23:22:21] asking on the pub: https://en.wikivoyage.org/wiki/Wikivoyage:Travellers%27_pub#Can.27t_log_in_with_WMF_account [23:22:56] ……… [23:23:39] + [23:23:40] + [23:24:58] bah [23:25:03] keyboard :P [23:27:20] that'll teach ya not to yell with your last name :P [23:57:14] hey, what exactly is this bit of code trying to express: wfGetCache( CACHE_ANYTHING ) [23:57:42] i get the idea that you get a cache handler appropriate for the type of things you want to store, but i'm not sure how/why CACHE_ANYTHING maps onto a particular cache driver.
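To make the closing question concrete: CACHE_ANYTHING is commonly described as "use whatever cache is available", i.e. pick the first configured backend that works, falling back to the database cache, which is always there. MediaWiki's real selection logic lives in PHP (ObjectCache); the sketch below is a hedged Python model of that idea, and the backend names and fallback order are illustrative, not MediaWiki's actual configuration.

```python
# Sentinel for "no cache configured", analogous to MediaWiki's CACHE_NONE.
CACHE_NONE = "none"

def get_cache_anything(configured_backends):
    """Toy model of a 'use anything that works' cache selector: return
    the first configured backend that is not CACHE_NONE, falling back
    to a database-backed cache, which is assumed to always exist."""
    for backend in configured_backends:
        if backend != CACHE_NONE:
            return backend
    return "db"

# A configured memcached wins over later options ...
assert get_cache_anything(["memcached", "apc"]) == "memcached"
# ... and with nothing configured, the database cache is the last resort.
assert get_cache_anything([CACHE_NONE]) == "db"
```

Under this reading, "appropriate for the type of things you want to store" means the caller does not care which driver backs the cache, only that some working store is returned.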