[00:09:57] I'm having a baffling login problem on Meta. Are other people able to log into Meta right now and then access their watchlist? I can get past the initial login, but I seem to be automatically logged out after I click anywhere else after logging in.
[00:10:16] Other WM sites work just fine.
[00:10:55] GreenPine: that's strange - I was able to (however, I have nothing in my watchlist) - let me add a few things to my watchlist
[00:11:14] the good old-fashioned "clear all Wikimedia cookies" may be in order ...
[00:11:31] I thought about that, but I was logging into Meta FIRST
[00:11:39] So there were no WM cookies before I went there
[00:11:46] I only went to other sites after having problems with Meta
[00:12:40] there might be some old stale ones - after adding some pages to my watchlist I still am not seeing the problem
[00:13:12] I cleared all cookies, logged back into Meta, and then clicked through to the main page. Same problem, I'm automatically logged out
[00:14:19] I'm even allowing 3rd-party cookies and the problem still happens
[00:16:11] GreenPine - just a very random gamble: you don't happen to log in with HTTPS, but end up being redirected to http?
[00:16:56] hrm
[00:17:04] Excirial: good guess, but no, HTTPS in both places
[00:17:45] This worked fine yesterday. I'm wondering if this is due to the NoScript update today.
[00:18:03] But I selected "allow all" and the problem still happens
[00:18:14] Can't reproduce it.
[00:18:35] And I'm using the latest NoScript and Ghostery as well.
[00:18:44] Hm
[00:18:56] Does it do the same in another browser? Or just Firefox?
[00:19:41] Unfortunately IE seems to be broken on this PC or I'd try it
[00:20:33] Hm, I can even see the cookie with my username in it
[00:21:29] and ENWP works fine from the same login session
[00:21:41] For some reason Meta isn't catching that I'm logged in
[00:22:41] I wonder if I offended Geoff and he used his magic powers to block my accounts on Meta...
[00:22:51] haha
[00:23:50] If you're feeling adventurous, you could install Firebug and see if that cookie is actually forwarded to the server :).
[00:25:46] hmm, that is an idea
[00:30:44] Excirial: how would I see in Firebug if the cookie is being forwarded?
[00:33:30] Firebug => Net monitor section => Open a request => Header => Request headers => Cookie.
[00:35:10] It should have something like "centralauth_User=Excirial;"
[00:35:19] Along with a bunch of session strings.
[00:36:02] I'm not sure I'm in the right place. I have the Firebug window open and a sub-window open for cookies with a list of cookie names like centralnotice_bucket. Is that the right place?
[00:37:26] That one also works, yes :)
[00:38:00] Mind you, I have no idea what it SHOULD be sending, but you should at least see if it submits something.
[00:38:19] I see a number of cookies, but none of them are centralauth
[00:39:08] For me it includes a centralauth_User: Excirial along with centralauth_Session, enwikiUserID, enwikiUserName and enwiki_session
[00:39:17] I see centralauth cookies on ENWP but not on Meta
[00:40:28] If I go to Meta, it just gives centralauth_User and centralauth_Session.
[00:41:00] Interestingly, there are 3 of those - one for commons.wikimedia.org, one for meta.wikimedia.org and one for .meta.wikimedia.org
[00:42:09] ok, so on the page that says successful login, I see a ton of centralauth_User and centralauth_Session cookies
[00:42:40] when I go to the main page I see only the centralauth cookies for Commons.
[00:42:47] They're not retained for Meta.
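As an aside to the Firebug steps above: another way to confirm whether the server is actually receiving a valid session is to ask the MediaWiki API who it thinks you are. This is a minimal sketch using the modern fetch API rather than Firebug; the /w/api.php path and the meta=userinfo query are standard MediaWiki, but the snippet itself is illustrative and not taken from the discussion.

    // Illustrative only: run in the browser console on the affected wiki.
    // If the session cookies reach the server, userinfo names your account;
    // if they are being dropped, it reports an anonymous user instead.
    fetch( '/w/api.php?action=query&meta=userinfo&format=json', {
        credentials: 'same-origin' // send the wiki's own cookies with the request
    } )
        .then( function ( response ) { return response.json(); } )
        .then( function ( data ) { console.log( data.query.userinfo ); } );

An anonymous result here, while ENWP reports the account, would match exactly the symptom GreenPine describes.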
[00:44:01] Then I guess that might be the problem :)
[00:44:15] ok so how do I fix that?
[00:45:16] Yep, if I block the .meta cookies, I am booted out of Meta, but enwiki is working just fine.
[00:45:57] I have the Meta cookies, just not the Meta centralauth cookies
[00:46:38] That's what I meant - I removed those.
[00:46:50] (Just how do I get those back now... hmm)
[00:46:50] In fact, looking at the login page, I never get Meta centralauth cookies when I log in
[00:47:03] I get other Meta cookies but not Meta centralauth
[00:47:36] * GreenPine wonders if someone is eating his cookies
[00:48:15] Try disabling all addons, quit Firefox, reopen it and see if it persists?
[00:48:33] If NoScript or another addon is eating them, you should have a working Meta sans addons.
[00:48:50] ok maybe I'll try Firefox safe mode and see what that does
[00:48:52] BBL :)
[01:01:28] Safe mode didn't fix the issue with the Meta login. Any other suggestions?
[02:10:45] for what it is worth, I can't log in to Meta either (as discussed earlier) - have tried in four different browsers
[02:13:39] I asked someone who was already logged in to log off and then log in again, and it failed for him as well (but no problems before he logged off manually)
[02:15:24] I can reproduce it
[02:17:23] Set-Cookie:centralauth_Session=e899e10e8932d47d97efce9d73b2abb3; path=/; domain=meta..wikimedia.org; secure; httponly
[02:17:40] two dots?
[02:17:43] yep
[02:39:08] hoo: I found a centralauth cookie with ".meta.wikimedia.org" in the domain
[02:39:30] Kaare: And I found the source of the problem
[02:39:36] even better :-)
[02:45:29] Kaare: https://gerrit.wikimedia.org/r/41018 Now we only need someone to merge and deploy that fix...
[02:48:19] great, I'm glad I don't *have to* be logged in to Meta right now
[02:48:47] Kaare: If you log in somewhere else it probably still works
[02:49:11] Ok, I was wrong :P
[02:50:30] I'm logged in on my homewiki, but had to delete cookies to succeed :-)
[02:58:08] Kaare: Should be fine now ;)
[02:59:32] hoo: so it is - thank you! :-)
[03:01:31] You're welcome... tomorrow I probably would have been caught by that bug as well :D
[03:01:37] :-)
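The actual fix is the Gerrit change linked above and is not quoted in the log. The sketch below is only a hypothetical illustration of how a cookie domain like "meta..wikimedia.org" typically comes about, namely a subdomain being joined to a base domain that already carries its leading dot, and why the browser then never stores the session cookie.

    // Hypothetical sketch - variable names and values are assumptions, not
    // taken from the real configuration or from the Gerrit change.
    var baseDomain = '.wikimedia.org'; // assumed base cookie domain, leading dot included
    var subdomain = 'meta';

    var broken = subdomain + '.' + baseDomain; // "meta..wikimedia.org"
    var fixed = subdomain + baseDomain;        // "meta.wikimedia.org"

    // A Set-Cookie domain containing an empty label never matches the host,
    // so the browser discards the cookie and the next request arrives with
    // no centralauth_Session - which looks exactly like being logged out.
    console.log( broken, fixed );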
[15:06:17] https://bugzilla.wikimedia.org/show_bug.cgi?id=43486 Full text search broken in German Wikipedia
[15:06:48] ^ for me it's not every request, but some
[15:07:50] se4598: three in a row work for me
[15:08:55] yep, but some requests took an unusually long time and then returned no results
[15:09:13] give me one?
[15:09:28] one that's doing it now, I mean
[15:10:24] puppet run mebbe?
[15:10:38] something that made them spike, yeah perhaps
[15:11:04] I don't see anything in ganglia
[15:11:12] yeah I looked there already
[15:11:36] the three hosts that were a bit spikey - I restarted lucene on them just in case, but I think it probably didn't make any difference; they weren't in a hung state
[15:11:39] had one with "Biologie", but the request worked after a refresh
[15:11:56] does our monitoring check if every node actually returns results?
[15:12:16] I don't know if it does that or just connects
[15:12:45] anyways, several more random words and they all work
[15:12:50] quick, responsive...
[15:15:07] apergos, are you on duty?
[15:15:25] rt?
I don't think so any more :-D
[15:15:34] I know my name is on the channel but that was for last week
[15:15:45] it's just that this week there's a lot of vacationing going on
[15:16:09] so whom shall I annoy for review? :P
[15:16:12] though I have looked at the incoming tickets the last few days anyways
[15:16:15] hmm
[15:16:29] well, you can annoy me, and if I feel like I'm not qualified I'll let you know
[15:16:52] what tz are you?
[15:17:00] UTC+4
[15:17:17] ah so not that far off from me (EET)
[15:17:39] Leslie was on duty for this week
[15:18:04] so I would be grateful if someone looked at https://gerrit.wikimedia.org/r/#/c/40983/
[15:18:20] btw, why is there password protection on ganglia (again)?
[15:18:32] *reason
[15:18:38] efen: there's a vulnerability that we became aware of
[15:18:44] efen, XSS
[15:19:06] Chris is looking at it
[15:19:16] he's going to fix it & discuss it with upstream I guess
[15:19:27] as soon as we feel safe, we'll open it up again
[15:19:56] ah ok, any chance to get the pw for interested folk?
[15:20:19] last time I heard, it was at NDA level
[15:20:32] I've been told it's a matter of days, maybe weeks
[15:20:33] but not more
[15:24:34] do you use the "results" check anywhere, MaxSem?
[15:25:17] what do you mean?
[15:26:22] right now the default is "service" I guess...
[15:26:33] ah
[15:26:43] no, currently "results" shouldn't be used
[15:26:45] ok
[15:26:58] maybe for TTM, but that's up to Niklas
[15:27:11] it's in there just in case?
[15:27:44] I was going to use "results" later, once there is replication, so that all hosts are guaranteed to have a non-empty index
[15:28:00] gotcha
[15:28:01] ok then
[15:30:43] what's a good host to check this on? yttrium?
[15:30:51] solr1001
[15:30:55] ok
[15:34:05] I suppose that we have to wait for a good puppet run on spence first
[15:34:20] every time I run it by hand it breaks, so I won't :-P
[15:35:06] thanks!
[15:35:48] yw
[18:03:28] hi
[18:03:28] [20:00:46.999] TypeError: mw.Api is not a constructor @ http://bits.wikimedia.org/uk.wikiquote.org/load.php?debug=false&lang=uk&modules=user&only=scripts&skin=vector&user=Base&version=20121228T171719Z&*:1
[18:03:28] what's wrong?
[18:15:20] is anybody here?
[18:15:57] Base: I'm going to take a look
[18:16:23] hoo: that would be nice ))
[18:17:44] Base: https://uk.wikiquote.org/wiki/User:Base/vector.js that's the problem
[18:17:59] you need to load mediawiki.api before using it
[18:18:05] you can do it like this:
[18:18:53] hoo: I tried to find out how, but I couldn't find it here: www.mediawiki.org/wiki/ResourceLoader/Default_modules
[18:18:56] mw.loader.using( 'mediawiki.api', function() { /* Your stuff here */ } );
[18:20:27] Base: http://www.mediawiki.org/wiki/ResourceLoader/Default_modules#mediaWiki.loader ;)
[18:21:39] oh, I am on duty this week
[18:21:42] sorry apergos :(
[18:21:45] hoo: so there should be more links to it)
[18:22:40] hoo: It works now. Thank you!
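For reference, a slightly fuller version of the fix hoo suggests for a user script such as vector.js: wrap anything that constructs mw.Api in mw.loader.using so the mediawiki.api module is loaded before it is used. The API call inside the callback is only an illustrative placeholder, not the contents of the actual script.

    // Sketch of the pattern, not the actual User:Base/vector.js contents.
    mw.loader.using( 'mediawiki.api', function () {
        var api = new mw.Api(); // safe now: the module is guaranteed to be loaded
        api.get( {
            action: 'query',
            meta: 'userinfo'
        } ).done( function ( data ) {
            console.log( 'Logged in as:', data.query.userinfo.name );
        } );
    } );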
[20:41:25] LeslieCarr: no worries, I did a few, but it has either been a slow traffic week or you did most of them :-D
[20:41:39] slow traffic week
[20:41:50] well, that plus Daniel seems to be a ticket machine no matter what week :)
[21:04:02] fyi: 2620:0:861:1::2 resolves to lists.wikimedia.org, but the corresponding forward DNS entry is missing
[21:07:58] felicity: https://bugzilla.wikimedia.org/show_bug.cgi?id=41939 ?
[21:08:05] felicity: long time no see, how are you doing?
[21:35:28] https://bugzilla.wikimedia.org/show_bug.cgi?id=43457 - how many wikis are missing SecurePoll tables? E.g. 'securepoll_entity'
[21:36:39] Why should it matter?
[21:36:45] Aren't they created when needed?
[21:37:37] Because going to Special:SecurePoll gives an SQL error instead of "You have requested an invalid special page."
[21:39:24] well, it can provide another error
[21:39:39] "no tables, bye"
[21:39:46] rather than "table does not exist"
[21:39:47] :)
[21:40:26] It would be better to only enable it on wikis with the tables (or vice versa).
[21:50:53] Reedy: thanks; reviewing -- sorry I haven't gotten to it myself yet.
[21:51:03] heh
[21:51:18] I was just doing the admin after reading the bug comment ;)
[21:51:50] If you can hold off for a few more minutes, I'll do a proper review
[21:51:58] Yeah, I'm in no rush
[21:52:09] As long as we have it tidied up for the next set of deployments :)
[21:55:38] ugh.
[21:55:51] You probably made the right choice, but this sucks.
[21:57:02] Reedy: do you know when Scribunto is expected to be deployed to all wikis?
[21:57:10] Nope
[21:57:14] err
[21:57:44] "early 2013"
[21:57:46] helpful.
[21:58:05] January-March 2013
[21:58:05] Full deployment of Lua to the production cluster (contingent upon October assessment)
[21:59:35] if Scribunto were not in the picture, I would push for just fixing mod so that we have one mod operator that works well
[21:59:53] as opposed to an ecosystem of templates that rely on PHP implementation details leaking out
[22:00:27] but with Scribunto in the works, it seems better to minimize disruption to existing templates
[22:01:13] just thinking out loud. I think you made the right choice.
[22:02:03] ParserFunctions.php, line 26: "WARNING: enabling this may have an adverse impact on the sanity of your users."
[22:08:13] ori-l: are you deploying something right about now? thinking not, it being Friday
[22:08:25] no, I'm not
[22:09:17] * DaBPunkt knows "Friday: no meat"; "Friday: no deployment" is new ;)
[22:35:48] * Reedy deploys ori-l. Only because it's a Friday
[22:36:20] It's Friday, Friday
[22:36:22] Friday, Friday, Friday, gotta get down on Friday
[22:36:23] Gotta deploy on Friday
[22:37:44] now deploying on a Friday and doing it on a plane >.>