[00:07:12] csteipp, are you around?
[06:20:30] There are no more bits issues, are there?
[08:46:43] Nemo_bis: ^
[08:46:51] yay legoktm
[08:47:12] I probably should have checked that the site was actually up before doing that :P
[08:48:31] twkozlowski: the overload lasted perhaps 20 min, but network stayed higher than normal for a few hours I'd say https://ganglia.wikimedia.org/latest/?r=day&cs=1%2F9%2F2014+18%3A11&ce=1%2F9%2F2014+21%3A14&m=cpu_report&s=by+name&c=Bits+application+servers+eqiad&h=&host_regex=&max_graphs=0&tab=m&vn=&hide-hf=false&sh=1&z=small&hc=4
[08:48:31] legoktm: You would know if it was down.
[08:48:43] not necessarily
[08:49:42] Nemo_bis: He /would/ know.
[08:57:55] odder: congratulations on shedding light on the topic of paid editing at the WMF
[08:59:34] Sir_Designer_: only the English-language media so far, I think
[08:59:46] Sir_Designer_: plus I didn't know you were Polish / knew Polish :)
[08:59:52] surprises, surprises
[08:59:54] but Ars Technica, heh heh
[09:00:01] or arse, as I call it
[09:00:13] ah, /whois is your friend :)
[09:00:35] Sir_Designer_: is it really as bad as "arse"?
[09:00:36] twkozlowski: don't be silly, or take your tongue out of your cheek :/ /ns info
[09:00:51] Sir_Designer_: I just checked :-)
[09:00:58] twkozlowski: on the contrary. Very good. Outstanding, even.
[09:03:28] twkozlowski: and this, on the subject of explaining my new nick: http://share.shutterfly.com/share/received/welcome.sfly?fid=8716ad8f412f3a56&sid=0AasXDJu0as27mY
[09:04:32] Sir_Designer_: :-)
[09:05:09] Sir_Designer_: well, the article on Ars is certainly much closer to the truth than the one on the Daily Dot
[09:13:35] Ars is written reliably and professionally, even when it gets bloggy, by (well-)known authorities, so...
[11:44:00] !summon Thehelpfuloe
[11:44:07] one
[11:52:12] WARNING: after wandering whole heaven and hell, we confirm that you tried to summon a non-existing soul. Satan is not amused.
[11:52:36] Nemo_bis: *gg*
[12:09:07] Could someone tell me where I can find the bug mentioned in https://bugzilla.wikimedia.org/show_bug.cgi?id=45771 in the code?
[12:12:17] arav93_: since it says Component: Special Pages, a rough guess would be [mediawiki/core.git] /includes/SpecialPageFactory.php
[12:13:05] maybe https://git.wikimedia.org/blob/mediawiki%2Fcore.git/HEAD/includes%2FSpecialPageFactory.php but I'm not a dev
[12:13:26] Factory for handling the special page list and generating SpecialPage objects.
[12:28:26] Could you please type the exact command? I'm a newbie here.
[12:31:51] arav93_: git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
[12:32:04] What does https://gerrit.wikimedia.org/r/#/c/106537/ mean?
[12:32:54] twkozlowski: looks like it means that Commons is temporarily back to the old search system
[12:33:19] Does it have any consequences for me as a user?
[12:33:30] wondering if we should feature this in Tech News #3
[12:33:54] no wait, looks more complicated. see the "commons is special" comment.. hrmm
[12:34:13] i don't know, let's ask the search people
[12:34:32] they're asleep :)
[12:35:18] so there is no extra index for the File: namespace
[12:35:29] is what it says, right
[12:35:57] $wgCirrusSearchExtraIndexes[ NS_FILE ] = array( 'commonswiki_file' );
[12:36:30] that would mean searching for stuff in the File: NS would be slower than before
[12:36:48] okay...
[12:36:52] thanks mutante :)
[12:37:47] also mutante, while I have you here :-P
[12:38:00] any ideas what caused the bits problems yesterday?
[12:39:40] sorry no, i'm not up to date on that
[12:39:52] do you mean ULSFO?
[12:39:54] or other
[12:40:17] eqiad
[12:40:31] https://ganglia.wikimedia.org/latest/?r=day&cs=1/9/2014+18:11&ce=1/9/2014+21:14&m=cpu_report&s=by+name&c=Bits+application+servers+eqiad&h=&host_regex=&max_graphs=0&tab=m&vn=&hide-hf=false&sh=1&z=small&hc=4
[12:41:22] hmm, no, i don't
[12:43:24] twkozlowski: unrelated note.
PDF has issues
[12:43:30] pdf1 died
[12:43:55] Yes, I noticed in the git log
[12:44:08] "pdf2 and pdf3 are taking a heavy beating" hrmmm
[12:45:13] the Wikipedian in me cries that there is no source to use for the bits load spike :-(
[12:45:58] twkozlowski: re: PDF, we still have hope the DC tech might save it.
[12:46:17] yes
[12:47:15] wanna bring it up on the list? bbl .. lunch
[12:47:52] mutante: not sure it's newsworthy for an audience like ours on Tech News
[15:19:13] hashar: Hi! ProofreadPage now has 1.23wmfX branches created automatically. https://gerrit.wikimedia.org/r/#/admin/projects/mediawiki/extensions/ProofreadPage,branches Does it change anything in the backport procedure?
[15:24:17] Tpt: I have no idea
[15:24:26] Tpt: I guess you have to cherry-pick your changes from master to the wmf branch
[15:24:37] and I have no idea how the extensions are now deployed in production
[15:26:38] hashar: OK. Thanks. I have two questions about testing of file-based features.
[15:26:39] 1) About https://gerrit.wikimedia.org/r/#/c/106507/ I haven't managed to create a fake RepoGroup based on a file directory. Do you know if there is an easy way to do it?
[15:27:47] The test file is: https://gerrit.wikimedia.org/r/#/c/106507/1/tests/includes/FileProviderTest.php
[15:33:08] ahhh
[15:33:29] the fake filesystem is a bit crazy, I think I coded it to avoid hitting disk
[15:35:13] hashar: isn't the operating system's VM machinery responsible for deciding what goes on disk or into memory these days?
[15:41:06] saper: we don't use vm :(
[15:43:22] Tpt: my use case was preventing parser tests from creating files on disk https://gerrit.wikimedia.org/r/#/c/61276/
[15:43:59] and I do new MockFileBackend() instead of new FSFileBackend()
[15:45:06] hashar: Thanks :-) I'll use it.
[15:46:00] Tpt: AaronSchulz can help there
[15:46:03] I've also tried to add to core a parser test for the [[File:|page=Xx]] parameter, but I haven't managed to load the test DjVu file: https://gerrit.wikimedia.org/r/#/c/102475/
[15:46:11] I must say I don't fully understand how our FS system works
[15:49:24] hashar: I'll ask Aaron. Thanks for your help :-)
[18:10:47] http://guillaumepaumier.com/timelines/wikipedia-in-2013/
[18:43:40] +1
[20:51:39] hello
[20:52:02] when logged in, my mw-content-text is capped at 715px (Chrome, Windows)
[20:52:42] does anyone know what could cause that?
[20:53:16] effeietsanders: beta feature - Typography refresh
[20:54:00] se4598: what does that mean? will it disappear, or do I have to do something to make it go back to normal?
[20:54:10] (I'm guessing it's a bug and not a feature)
[20:55:00] effe: have a look at https://en.wikipedia.org/wiki/Special:Preferences#mw-prefsection-betafeatures
[20:55:04] it causes the page content to be squeezed into a small column :)
[20:55:16] "Typography refresh" is one of the features listed there :)
[20:55:21] effeietsanders: It's "by design" (original quote) since the last update. You have to disable the rule in your user CSS, or disable the beta feature itself
[20:56:38] ok, so I just have to disable the typography thing then
[20:56:58] effeietsanders: see https://www.mediawiki.org/wiki/Talk:Typography_Update#Max_width_.2F_Narrow_pages_opt_out_option for a CSS rule to disable the width
[20:57:30] se4598: I just disabled the whole thing. For beta features, I'd rather experience the real feature, or not at all :)
[20:57:36] otherwise I'd just be fooling myself
[20:57:50] (the reason I enabled them at all is to see what is coming up for new users)
[20:58:11] thanks for helping me find the source! :)
[21:19:01] may someone approve a new OAuth consumer, please?
https://www.mediawiki.org/wiki/Special:OAuthListConsumers/view/eec12bb221b2e8e4e5dafbbacf949505
[21:19:16] It's only a modified version of https://www.mediawiki.org/wiki/Special:OAuthListConsumers/view/efb0f3953364514e15ff4d4ac0dc881a
[23:14:02] hi ori! I have just drafted a Schema for an EventLogging run
[23:14:24] at https://meta.wikimedia.org/wiki/Schema:CommonsCategoryTreeUse
[23:14:33] how should I proceed?
[23:16:51] hey dschwen, that's pretty cool
[23:18:16] so, eventlogging will provide a means for you to dispatch the data to our servers for analysis, but you still need to have a bit of javascript code that generates the event object on some condition
[23:20:41] for impressions, it'll be something like: if ( mw.config.get( 'wgCanonicalNamespace' ) === 'Category' ) { mw.eventLog.logEvent( 'CommonsCategoryTreeUse', { isAnon: mw.user.isAnon(), action: 'impression', pageNs: mw.config.get( 'wgNamespaceNumber' ) } ); }
[23:21:05] though you don't want to generate an event on *every* impression, that could be a lot
[23:21:24] sure
[23:21:38] the code needs to live somewhere; for one-off data-collection jobs the natural place is the WikimediaEvents extension
[23:21:48] I would have to bind click handlers to the links as well?
[23:22:13] capturing link clicks is trickier. there's no nice way of doing it, so i never added it to eventlogging
[23:22:37] Couldn't I just put the code into MediaWiki:Common.js (on Commons)?
[23:22:48] no, because there's a php snippet too
[23:22:53] Oh
[23:22:59] if you click on the red icon in the top right on the schema page you'll see it
[23:23:12] ok, but I cannot do that, can I?
[23:23:23] you can submit a patch and ask to have it merged
[23:23:24] with gerrit access maybe?
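A rough sketch of what ori's impression snippet looks like with the sampling he hints at ("you don't want to generate an event on *every* impression"). The mw.* calls are MediaWiki's front-end API exactly as quoted in the log; the shouldSample helper, the wrapper function, and the 1% rate are illustrative assumptions, not part of EventLogging itself.

```javascript
// Hypothetical helper: fire only for a random fraction of page views.
// Math.random() is in [0, 1), so rate 0 never samples and rate 1 always does.
function shouldSample( rate ) {
	return Math.random() < rate;
}

// Sketch of a sampled impression logger, following ori's pasted snippet.
// Not called here; in practice this would run on page load from a
// deployed extension module, not from this file.
function logCategoryImpression() {
	if ( mw.config.get( 'wgCanonicalNamespace' ) !== 'Category' ) {
		return;
	}
	// Assumed 1% sampling rate to keep event volume manageable.
	if ( !shouldSample( 0.01 ) ) {
		return;
	}
	mw.eventLog.logEvent( 'CommonsCategoryTreeUse', {
		isAnon: mw.user.isAnon(),
		action: 'impression',
		pageNs: mw.config.get( 'wgNamespaceNumber' )
	} );
}
```

Random sampling like this is the simplest option; it trades per-user consistency (the same reader may be sampled on one page view and not the next) for a one-line implementation.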
[23:23:26] ok
[23:23:31] hm
[23:23:53] you can also log document.referrer on category pages instead
[23:23:58] if you want to know how people are getting to them
[23:24:12] The problem with link clicking is that asynchronous operations are killed as soon as the link is followed, right?
[23:24:35] I could add a URL parameter to each link and fire the event upon arrival at the new page
[23:25:00] or the referrer
[23:25:08] you could; you could also cancel the click event, log the link click, then change window.location
[23:25:19] yeah
[23:25:20] but it's easy to screw up, and if you do, it breaks things rather profoundly for the user
[23:25:25] yep
[23:25:51] but in general i recommend that you write up a short description of what you want to do and e-mail it to the wikimedia analytics mailing list
[23:26:08] because while we don't have the full request log anywhere just yet, we can capture specific slices of it
[23:26:09] the stuff that is on the schema talk page?
[23:26:31] ah yeah, that's great!
[23:27:09] i have to run in a moment, but if you get stuck you can email me directly at ori@wikimedia.org
[23:27:45] "you could; you could also cancel the click event, log the link click, then change window.location" please don't
[23:28:08] this breaks middle-clicks, shift-clicks, ctrl-clicks, right-clicks and god knows what else when not done right
[23:28:19] dschwen: MatmaRex is pretty knowledgeable about front-end code too, so I'd be inclined to take his advice :)
[23:28:23] and it's hard to get right, it seems
[23:28:35] since i've seen it broken a few times already :D
[23:28:56] i think you can get this via document.referrer on the actual category page or by filtering the request log on the server side
[23:29:07] probably best to discuss on the analytics list
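A minimal sketch of the defensive version of the pattern MatmaRex is warning against: the handler intercepts only plain left-clicks and lets middle-clicks, right-clicks, and modified clicks (new tab, new window, context menu) fall through to the browser untouched. logLinkClick is a hypothetical logging function standing in for an EventLogging call, and the Element.closest lookup is an assumption about the page structure; nothing here is an API of the extension itself.

```javascript
// True only for an unmodified left-click; middle-clicks (button 1),
// right-clicks (button 2), and ctrl/shift/alt/meta-clicks return false.
function isPlainLeftClick( e ) {
	return e.button === 0 &&
		!e.ctrlKey && !e.shiftKey && !e.altKey && !e.metaKey;
}

// Guarded so the sketch can be loaded outside a browser as well.
if ( typeof document !== 'undefined' ) {
	document.addEventListener( 'click', function ( e ) {
		var link = e.target.closest ? e.target.closest( 'a' ) : null;
		if ( !link || !isPlainLeftClick( e ) ) {
			// Anything other than a plain left-click on a link:
			// do nothing and let the browser behave normally.
			return;
		}
		e.preventDefault();
		// logLinkClick is hypothetical: log the click, then follow the
		// link once the log call has completed (or timed out).
		logLinkClick( link.href, function () {
			window.location.href = link.href;
		} );
	} );
}
```

Even with the guard, this still delays navigation on every plain left-click and breaks if the logging callback never fires, which is why the channel's advice (log document.referrer on the target page, or filter the server-side request log) is the safer route.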