[07:06:56] Why are the Wikipedia dumps not UTF-8 encoded? [07:07:00] i.e. why don't they specify encoding="UTF-8"? [07:47:55] BadDesign: What's the issue you're having? [07:50:45] Susan: The XML parser I am using complains of "Encountered incorrectly encoded content" [07:51:04] when I try to parse any Wikipedia dump, no matter the language [07:53:29] and I suspect it is because the dumps don't specify the XML document encoding [07:54:15] BadDesign: Which XML parser? [07:56:45] Susan: https://qt-project.org/doc/qt-5.0/qtcore/qxmlstreamreader.html [08:26:03] hello Reedy [08:26:42] https://bugzilla.wikimedia.org/show_bug.cgi?id=20825#c12 - can you remove the TTC file first at least? [08:29:26] BadDesign: what do the XML specs say? [08:30:16] i.e. is UTF-8 assumed by default or does it need to be declared? depending on the answer, file as a MediaWiki or parser bug [08:30:35] (though in any case the parser could be a bit smarter) [08:40:13] Lydia_WMDE: please thank jeblad and danwe for the answers [08:40:32] Nemo_bis: :) [08:40:34] ok [10:32:25] Hello y'all, I'm experiencing some anomalous behavior connected to: http://en.wikipedia.org/wiki/Template:GL_Photography_reply [10:35:34] It displays fine in the WikEd preview but merely flashes and disappears on actual pages. Anyone care to speculate? [10:36:25] I first noticed the behavior last night after creating and editing a /doc page for said template. [10:36:35] flashes? [10:37:22] may be seen for a brief (fractional-second) 'flash' when the page loads and then nuthin'. [10:38:00] http://en.wikipedia.org/wiki/User_talk:Invertzoo#GL_Photography_reply wfm [10:40:43] Do you see the (blue field) template displayed? The text below I entered manually when I left that message. [10:40:56] It's not showing on my end. [10:43:04] Kevjonesin: http://gyazo.com/ba19e299686709188c4d89d5a49b7c2f [10:46:19] Yep that's it! Damn, that means there's likely an issue on my end. I've reloaded pages with Ctrl-Shift-R and still get the issue. I'll restart Firefox and see what happens. Maybe completely clear the cache. [10:47:20] Tanxs : } [10:47:49] might be something in your monobook.js [10:54:09] Out of curiosity, what browser and OS are you using? I'm running Firefox on Bodhi Linux. [10:54:21] firefox on windows 7 [10:54:29] Right on. [10:55:11] How'd you know I'm using monobook? [10:57:24] i just assumed, because it's what i've always used [10:58:33] Right on. I switched to it from Vector fairly recently. Just discovered WikEd as well. [11:05:01] ...& only added something (1st time) to my .js recently. hmm... [11:17:20] Ah, the .js change was on my Commons account, not Wikipedia. Rules that out. Added a CropBot link to the sidebar. [11:21:03] Anyway, new topic: I'm wondering about the possibility of automating (scripting?) some of the functions of http://en.wikipedia.org/wiki/Wikipedia:Graphics_Lab/Photography_workshop/Eight_Requests [11:21:41] I'm envisioning crowdsourcing suggestions by having users note image files in need of work on a page with data fields, which would feed into a database for a bot to pull from to automatically update the display after archiving "done" requests. [11:21:58] Crowdsource rather than pull from categories at random, so as to prioritize in-use files and avoid files that have already been fixed but still have old maintenance tags. [11:23:03] Anyone care to speculate? [11:39:00] p.s. It was on my end. Cleared cache and cookies and restarted Firefox and now 'all is well' as regards template display.
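On the 07:06 encoding question: the XML 1.0 spec (§4.3.3) requires a parser to assume UTF-8 (or UTF-16, detected via the byte-order mark) when an entity carries no encoding declaration, so the dumps omitting encoding="UTF-8" is perfectly legal and not a MediaWiki bug by itself; an "incorrectly encoded content" error more likely points at invalid byte sequences, or at the file having been re-decoded before reaching the parser. A minimal way to check whether a dump really is well-formed UTF-8 XML, sketched in PHP with a placeholder filename:

```php
<?php
// Sketch only: the filename is a placeholder. libxml assumes UTF-8 when
// no encoding declaration is present (per the XML spec) and reports
// malformed byte sequences as parse errors while we walk the document.
libxml_use_internal_errors( true );
$reader = new XMLReader();
$reader->open( 'enwiki-latest-pages-articles.xml' );
while ( $reader->read() ) {
    // no-op: we only care whether parsing survives the whole file
}
foreach ( libxml_get_errors() as $error ) {
    printf( "line %d: %s\n", $error->line, $error->message );
}
$reader->close();
```

If this passes but QXmlStreamReader still complains, the problem is on the consuming side rather than in the dump itself.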
[11:49:06] If English WS wants to get a new namespace, and it may be something that all the WSes may use, how do we identify (and pretty much reserve) a namespace number? [11:49:24] all use = use on the respective wikis [11:52:56] Brief interjection: The issue came back after I logged back in to Wikipedia. I'm guessing the issue lies within my 'preferences' settings. I played with them quite a bit yesterday. [17:30:35] greg-g: Hey greg, I need a deployment window to add the new Disambiguator extension as a new submodule and turn it on on test and test2. It looks like all the lightning windows are already claimed though and we don't have any other windows this week. Any ideas? [17:32:41] kaldari: how long do you need? [17:32:57] probably 20 minutes [17:33:33] is E3 using the lightning window today? [17:33:41] "configuration change to enable new account creation and login as default for all projects" [17:33:50] kaldari: wanna work with whoever is doing the config change from E3 today? [17:33:53] yeah [17:33:55] sure [17:34:38] none of them are here right now, but I'll ask them later [17:34:56] kaldari: lazy gits [17:35:22] :) [17:37:52] hi greg-g [17:37:58] hello aude [17:38:19] * aude reading https://wikitech.wikimedia.org/wiki/Deployments#Week_of_June_3rd [17:38:30] are we really starting with test2 etc. on thursdays tomorrow? [17:39:56] aude: yeppers, see https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap#Schedule_for_the_deployments [17:40:11] hmmmm, ok [17:41:03] suppose we can keep wikibase at the same commit point/branch it's on now [17:41:19] should be fine, though we'll run our tests to make sure it works with the new core branch [17:41:27] ok, cool [17:41:46] we just made our branch for next week today [17:42:12] it might not be tested enough for deploying but we can decide tomorrow [17:42:27] then wikidata on monday? [17:42:47] aude: right right [17:43:13] ok [17:43:16] we'll be ready [17:43:39] might be fine to update test2 and wikidata at the same time then [17:43:40] thanks aude! [17:43:48] (with our extensions) [17:49:31] aude: hey, i'm back and just looked at the patch set from the hackathon again and the new one by Michal, seems like we're back to "proper MediaWiki entry point" .. tons of comments on both :p [17:50:19] sounds like we have to use a local path / internal redirect [17:50:52] hi mutante [17:51:20] hey [17:51:42] maybe we can try to get https://gerrit.wikimedia.org/r/#/c/65463/ done first then [17:52:10] daniel had a comment about the rewrite rule redirecting to http://www.wikidata.org$1 [17:52:11] has the same issue with the protocol :p [17:52:18] oh really? [17:52:33] * aude thought http just magically worked as i see it that way in other rules [17:52:52] Daniel: "needs to use https if the request came in via https" Tim: "In general, you can't have different redirects for different protocols using mod_rewrite" [17:53:32] see Tim's comment on https://gerrit.wikimedia.org/r/#/c/47088/4 [17:53:39] hmmm ok [17:53:43] and there's another older patch he links to there [17:54:01] so i had coded something for the entry point [17:54:31] there was some hesitance about duplicating code but i can factor stuff out to make it as small as possible [17:54:36] and share code with the special page [17:54:46] i would need advice on how to handle headers etc [17:55:40] made one more patch set anyways that simplifies it..
but it doesn't solve the protocol issue https://gerrit.wikimedia.org/r/65443 [17:56:47] hmm.. yea, so we'd need a mediawiki dev to help with that [17:57:13] and i hear currently we're blocking Reedy from using test.wikidata [17:57:20] oh no [17:57:38] i can post my patch as a work in progress [17:57:56] DanielK_WMDE_: ^ [17:58:01] Reedy: hey, what could we do to not block you if we're not merging those patches? [17:58:39] we'll have to use local paths for these [17:59:45] As long as apache doesn't treat test.wikidata like it would en.wikidata (etc)... [17:59:48] Which it does currently [18:00:35] Not sure if we can just modify the current rules, or if we need to duplicate out the vhost and amend as necessary for test.wikidata [18:01:34] http://test.wikidata.org [18:01:34] * 301 Moved Permanently http://www.wikidata.org/wiki/Wikidata:Main_Page [18:01:34] http://wikidata.org [18:01:34] * 301 Moved Permanently http://www.wikidata.org/ [18:01:34] http://www.wikidata.org [18:01:36] * 301 Moved Permanently http://www.wikidata.org/wiki/Wikidata:Main_Page [18:03:33] mutante: hm? [18:04:12] ok, looking at a way to just not redirect test then first [18:04:24] DanielK_WMDE_: so mutante thinks having a simple redirect to the special page won't work [18:04:31] DanielK_WMDE_: there are 2 patch sets with lots of comments but they all run into the same problem in the end [18:04:32] with the various protocols and other issues [18:04:33] because of https? [18:04:36] yea [18:04:41] among others, yes [18:04:56] mutante: you can make a RewriteCond and check the protocol [18:05:03] "In general, you can't have different redirects for different protocols using mod_rewrite, see https://gerrit.wikimedia.org/r/#/c/13293/ ." [18:05:07] and have two otherwise identical rewrite rules [18:05:15] or you can just always redirect to https :) [18:05:38] do you want that? [18:05:40] mutante: hm? you can check the protocol in a RewriteCond, can't you? [18:05:42] always https? [18:05:54] https://gerrit.wikimedia.org/r/#/c/13293/5 [18:06:03] see that example that has been tried in the past :p [18:06:35] so, if we want an entry point, here is my initial patch from the hackathon [18:06:36] https://gerrit.wikimedia.org/r/#/c/67123/ [18:06:38] mutante: http://www.askapache.com/htaccess/http-https-rewriterule-redirect.html [18:06:48] we'd want to factor out all the wikibase-specific code and share it with the special page [18:06:53] then make the entry point tiny [18:06:58] DanielK_WMDE_: the Apaches don't speak HTTPS [18:07:01] * aude not sure about the headers, etc [18:07:18] but this would handle edge cases like enwiki/What's My Line? [18:07:21] ah, right, i see the problem [18:07:28] you'd have to know if the original request was https [18:07:30] apache strips the question mark and then can never handle that [18:07:34] herm... right... [18:07:35] DanielK_WMDE_: so all you could do is try to use HTTP_X_Forwarded_Proto [18:07:45] and ENV:HTTP_X_Forwarded_Proto doesn't do that? [18:07:46] but that has been abandoned too :p [18:07:57] * aude thinks the entry point is much better to handle all that [18:08:15] aude: i don't see how a separate entry point would solve the issue, really... [18:08:26] how does the entry point know where to redirect to? [18:08:34] if PHP knows, why can't mod_rewrite know? [18:08:35] errr making new patch :) [18:09:44] mutante: but really... inside php, we *can* figure out what the original protocol was, right? [18:09:54] why can't we do the same in mod_rewrite?
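For reference, DanielK_WMDE_'s two-rule idea would look roughly like this. Since the Apaches never terminate SSL themselves, the condition has to key off the X-Forwarded-Proto request header that the SSL terminators set; whether that header reliably reaches mod_rewrite in this setup is exactly the open question in the discussion:

```apache
# Sketch of the RewriteCond-per-protocol approach; assumes the SSL
# terminators set X-Forwarded-Proto, as described later in the log.
RewriteCond %{HTTP:X-Forwarded-Proto} =https
RewriteRule ^/entity/(.*)$ https://www.wikidata.org/wiki/Special:EntityData/$1 [R=301,QSA,L,NE]
RewriteCond %{HTTP:X-Forwarded-Proto} !=https
RewriteRule ^/entity/(.*)$ http://www.wikidata.org/wiki/Special:EntityData/$1 [R=301,QSA,L,NE]
```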
[18:10:05] it has to be in the headers *somewhere* [18:11:40] * aude okay with whatever way to solve this [18:11:44] if it works [18:13:06] i don't know why it's so hard to do with apache [18:13:36] but there must be reasons for the fancy entry point stuff used by wmf [18:19:23] DanielK_WMDE_: i don't know better than the comments on 13293 .. like e.g. "After discussion with Asher, it may not be possible to properly send a vary header for the redirects, as it seems mod_rewrite will strip all response headers" [18:20:34] if local path won't work maybe it would all have to be in varnish [18:20:36] maybe apache 2.4 will be better [18:20:36] "mod_rewrite will strip all response headers"? what? [18:20:40] once we get it someday [18:20:51] when doing a rewrite? very unlikely, that would apply to all /wiki/Foo replies too [18:21:04] on a redirect? well, in that case, it generates the response header... [18:21:09] i don't understand what that means [18:22:00] * DanielK_WMDE_ is confused and wants to talk to an apache wizard [18:22:47] mutante: how exactly is https/http handled for wikipedia? [18:22:54] they all use the same entry point [18:23:34] i know in commonsettings.php $wgServer gets set [18:24:01] if ( isset( $_SERVER['HTTP_X_FORWARDED_PROTO'] ) && $_SERVER['HTTP_X_FORWARDED_PROTO'] == 'https' ) { [18:24:15] then $wgServer gets set to https [18:24:58] MWMultiVersion handles the part after the protocol [18:26:58] https://wikitech.wikimedia.org/wiki/Https#SSL_termination [18:27:09] "The nginx servers answer requests on IP based virtual hosts and proxy the requests directly to the backends unencrypted. Headers are set for the requested host, the client's real IP, forwarded-for information, and forwarded-protocol information." [18:27:19] * aude clicks [18:28:28] ok, that's how the forwarded header is set [18:28:47] Vary: Accept-Encoding,X-Forwarded-Proto,Cookie [18:29:07] yea, sorry, i know this same thing kept coming up a couple times and i don't have the solution either. Maybe Ryan can comment [18:29:21] no apache at all? [18:29:24] let me try to create a new separate patch set to just not block "test" anymore [18:29:31] what kept coming up? [18:29:38] the protocol issue with redirects [18:29:42] ah [18:29:47] as in https://gerrit.wikimedia.org/r/#/c/13293/5 [18:29:47] good idea for test [18:30:02] yeah. it's that apache itself isn't sending redirects based on the current protocol [18:30:10] Ryan_Lane: "After discussion with Asher, it may not be possible to properly send a vary header for the redirects, as it seems mod_rewrite will strip all response headers" [18:30:14] i quoted you there [18:30:46] and Tim's last comment on https://gerrit.wikimedia.org/r/#/c/47088/ [18:31:09] mutante, Ryan: how about a rewrite rule in nginx? [18:31:28] oooh [18:31:42] DanielK_WMDE_: that wouldn't work [18:31:47] can someone try to block the message 女秘书想找火包友! sent to wikitech-l? [18:31:47] why? [18:31:48] :( [18:31:50] DanielK_WMDE_: unless we want to duplicate all of the rules [18:32:08] Ryan_Lane: ah, http doesn't go through nginx? [18:32:12] no [18:32:19] apache for http [18:32:19] ic [18:32:23] nginx is a transparent proxy for ssl [18:32:25] nginx for https? [18:32:43] it sits in front of the squid/varnish frontends [18:32:43] Ryan_Lane: after nginx, https then hits the apache? [18:32:45] aude: nginx acts as a proxy, it doesn't replace the apache. mediawiki itself always runs on apache. [18:32:58] Ryan_Lane: is nginx before or after the squids?
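The CommonSettings.php line aude quotes at [18:24:01] continues roughly as follows; this is a sketch of the shape of the logic, not the exact wmf-config code:

```php
// Sketch: how PHP recovers the original protocol from the header the
// SSL terminator sets. The real wmf-config may differ in detail.
if ( isset( $_SERVER['HTTP_X_FORWARDED_PROTO'] ) &&
    $_SERVER['HTTP_X_FORWARDED_PROTO'] == 'https'
) {
    $wgServer = 'https://' . $_SERVER['SERVER_NAME'];
} else {
    $wgServer = 'http://' . $_SERVER['SERVER_NAME'];
}
```

This is the asymmetry the discussion keeps circling: PHP sees the forwarded header on every request, while the mod_rewrite equivalent runs into the Vary-header concerns quoted above.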
[18:33:01] ok [18:33:11] "it sits in front of the squid/varnish frontends" :) [18:33:20] ok [18:33:39] * aude understands [18:33:40] so, nginx -> squid/varnish -> apache [18:33:51] or no apache if it's anon / cached [18:34:01] if nginx would also handle plain http, this would be simple :) [18:34:16] someday. [18:34:55] * aude got as far as setting up squid for my wiki farm :) [18:35:02] no varnish and nginx yet [18:35:40] can we take a step back? why are we doing a redirect again, not a rewrite? [18:35:52] a rewrite wouldn't care about protocols. [18:36:22] DanielK_WMDE_: a rewrite just sends a redirect you know, right? :) [18:36:37] Ryan_Lane: hm? no it doesn't. [18:36:47] yes, yes it does :) [18:37:05] DanielK_WMDE_: we need to know the subdomain [18:37:24] aude: i thought we were talking about /entity/ [18:37:25] there's a way to do that with rewrite? [18:37:28] both [18:37:37] Ryan_Lane: it sends a redirect where? [18:37:38] apparently we have issues with both [18:37:45] DanielK_WMDE_: to the browser [18:37:59] Ryan_Lane: i mean, /wiki/Foo is a rewrite to index.php/Foo. no redirect to the browser [18:38:02] * aude would like the entity one resolved first [18:38:20] most of the redirects we have are not set up that way [18:38:32] and are not meant to be set up that way. they are meant to be redirects [18:38:34] that one is handled as an alias [18:38:35] * aude thinks [18:38:48] Ryan_Lane: but that's what a *rewrite* is. as opposed to a *redirect*. [18:38:55] RewriteRule ^/entity/(.*)$ /wiki/Special:EntityData/$1 [QSA] ? [18:38:58] that's how i do my test wikis and i learned it from wmf config [18:39:10] most mod_rewrite rules send redirects [18:39:30] Ryan_Lane: and i'm asking why we have to. if we don't, the protocol isn't a problem at all [18:39:48] I'd recommend using Alias for what you suggested, rather than rewrite [18:39:59] I have no clue what you guys are trying to do, btw [18:40:10] meta.wikimedia.org/wiki/Wikidata/Notes/URI_scheme [18:40:15] https://gerrit.wikimedia.org/r/#/c/65463/4 [18:40:15] https://meta.wikimedia.org/wiki/Wikidata/Notes/URI_scheme [18:40:15] I'm just answering the questions you're asking me :) [18:40:34] it's multiple things but we can worry about the entity one first [18:40:39] Ryan_Lane: using mod_rewrite for redirects is actually abusing it, it's meant for the other thing (or was originally) [18:41:02] Ryan_Lane: I don't care about the mechanism, i care that it would be easier as a rewrite. i think. [18:41:12] DanielK_WMDE_: the comment you made on that one, it's something i introduced with that patch set [18:41:26] DanielK_WMDE_: i was just moving existing stuff to one location to make it less confusing [18:41:51] of course it might still be valid [18:42:10] arg, i meant to say "did not introduce with that patch" [18:42:20] yes, a proper rewrite is likely what you want [18:42:27] and then protocol won't matter [18:43:14] Ryan_Lane: yay! [18:43:34] so let's just do rewrites for language subdomains as well as /entity/ [18:43:43] then we don't have to worry about https at all [18:43:54] mutante: ---^ [18:44:12] mutante: the comment about the recursion guard? but that rule is specific to that rewrite, no?
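Ryan_Lane's point, made concrete: only an external redirect has to commit to a protocol, because it sends the browser a new absolute URL; an internal rewrite answers under whatever URL was requested. A sketch of the contrast, built around aude's rule from [18:38:55]:

```apache
# External redirect: mod_rewrite must pick http or https for the
# Location header it sends back -- the root of the whole problem.
# RewriteRule ^/entity/(.*)$ http://www.wikidata.org/wiki/Special:EntityData/$1 [R=301,QSA]

# Internal rewrite (aude's rule): the special page is served under the
# original URL, nothing goes back to the browser, and the protocol never
# comes into play. PT lets later alias/rewrite stages still apply.
RewriteRule ^/entity/(.*)$ /wiki/Special:EntityData/$1 [PT,QSA]
```

(Ryan_Lane's Alias suggestion would reach the same end without mod_rewrite at all.)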
[18:44:27] * DanielK_WMDE_ takes a quick pizza break [18:44:28] no, the comment here https://gerrit.wikimedia.org/r/#/c/65463/4/main.conf [18:44:37] RewriteRule ^(.*)$ http://www.wikidata.org$1 [R=301,L,NE] [18:44:50] <-- it might look like i added that but it exists, i just moved it [18:45:30] ok, quick break sounds good [18:46:01] i dropped the whole recursion guard thing, it wasn't necessary [18:46:21] ah, right. never mind that then [18:51:28] mutante: if https://gerrit.wikimedia.org/r/#/c/65443/7/main.conf gets QSA that might work? [18:51:36] let the special page figure it out then [18:51:43] (aside from the other issues) [19:03:54] Reedy: Tests now passing @ https://gerrit.wikimedia.org/r/25838 thanks to bawolff [19:04:56] From #wikimedia-dev [19:04:58] [20:03:40] bawolff: I should probably merge it now.. ;) [19:04:58] ;) [20:06:38] A database error has occurred. Did you forget to run maintenance/update.php after upgrading? See: https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script [20:06:39] Query: INSERT INTO `page_restrictions` (pr_page,pr_type,pr_level,pr_cascade,pr_expiry) VALUES ('0','aft','aft-noone','0','infinity') [20:06:39] Function: DatabaseBase::insert [20:06:39] Error: 1062 Duplicate entry '0-aft' for key 'PRIMARY' (10.64.32.26) [20:07:01] i was trying to salt a page on enwiki [20:07:12] [01:06:27 PM] [[Special:Log/protect]] protect * Legoktm * protected Asian Fox Developments ‎[create=sysop] (indefinite): [[WP:SALT|Repeatedly recreated]] [20:07:31] it didn't go through though [20:07:31] heh, aft again [20:07:41] binasher: ^ [20:08:11] aft, wtf [20:08:18] i vaguely remember some AFT database trouble from earlier [20:08:24] binasher: so i poked you [20:08:27] the page is https://en.wikipedia.org/w/index.php?title=Asian_Fox_Developments&action=protect [20:08:31] as you probably know more [20:08:39] article feedback is already set to "Disable for all users" [20:08:51] i just modified the create protection part [20:08:52] mlitn: ^^^ [20:09:08] that's https://gerrit.wikimedia.org/r/67065 [20:09:12] i think [20:09:36] oh, yeah.. looks like it is [20:09:39] hm [20:09:47] that's correct, what se4598 said [20:10:03] can we get that deployed soon? [20:12:30] legoktm: that is the plan, yes [20:12:53] ok thanks. is there any possible workaround in the meantime? [20:15:35] greg-g: could you get https://gerrit.wikimedia.org/r/67065 deployed soon? looks like you've got a reserved window coming up [20:16:07] legoktm: unfortunately, no [20:17:29] binasher: who wants to deploy it? [20:17:57] mlitn: could you? [20:18:57] I can [20:19:22] hi YuviPanda! [20:19:43] greg-g: what time would be good? [20:20:10] hey rfarrand! [20:21:22] mlitn: cool. today's LD is pretty packed... E3 and Kaldari are doing some changes. But maybe all three of you could coordinate? [20:30:20] greg-g: alright!
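On the [20:06:38] database error: the query and the duplicate-key message together tell the story. A title that is only create-protected has no page row, so AFT's restriction row is written with pr_page = 0, and since page_restrictions' primary key (going by the '0-aft' in the error) spans (pr_page, pr_type), a later salted title collides with the first. A sketch of the failure, restating the logged query:

```sql
-- First create-protected title AFT touches: succeeds.
INSERT INTO page_restrictions (pr_page, pr_type, pr_level, pr_cascade, pr_expiry)
VALUES (0, 'aft', 'aft-noone', 0, 'infinity');

-- Any later one writes the same key and fails:
-- ERROR 1062: Duplicate entry '0-aft' for key 'PRIMARY'
```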
[21:11:03] mwscript extensions/WikimediaMaintenance/addWiki.php en wikimedia testwikidatawiki test.wikidata.org [21:11:16] mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki en wikimedia testwikidatawiki test.wikidata.org [21:11:34] :) [21:12:14] I wonder if it should go on s3 or s5 [21:12:25] I suspect it's going to be relatively low load, so s3 should be fine [21:12:39] sure [21:13:11] I'm really not sure about the database name either [21:16:00] Or is it going to end up as testwikidata [21:16:34] testwikidatawiki <3 [21:17:19] pretestlabsdbdbwikidatawiki [21:17:30] Slightly confused about how the docroot is going to end up as a language [21:17:57] Reedy: hmm, how does it work for test.wikipedia? [21:18:37] I think it ends up with test being the language [21:18:48] and wiki as the project [21:20:06] } elseif ( preg_match( '/^\/usr\/local\/apache\/(?:htdocs|common\/docroot)\/([a-z]+)\.org/', $docRoot, $matches ) ) { [21:20:12] } elseif ( preg_match( "/^\/usr\/local\/apache\/(?:htdocs|common\/docroot)\/([a-z0-9\-_]*)$/", $docRoot, $matches ) ) { [21:21:04] http://p.defau.lt/?IbcOeFzke94Hnnw6P3eoXA [21:21:30] * @param $serverName the ServerName for this wiki -- $_SERVER['SERVER_NAME'] [21:21:30] * @param $docRoot the DocumentRoot for this wiki -- $_SERVER['DOCUMENT_ROOT'] [21:22:39] mutante: Noting this is the code blocking the combining of all the wikimania docroots (not sure how it'll behave) [21:23:40] ServerName test.wikidata.org [21:23:41] ServerName test.wikidata.org [21:23:43] fail [21:23:49] DocumentRoot "/usr/local/apache/common/docroot/wikidata_test" [21:24:13] It's going to end up in [21:24:23] } elseif ( preg_match( "/^\/usr\/local\/apache\/(?:htdocs|common\/docroot)\/([a-z0-9\-_]*)$/", $docRoot, $matches ) ) { [21:24:24] $site = "wikipedia"; [21:24:24] $lang = $matches[1]; [21:24:31] hrrmmm [21:24:32] So language will be wikidata_test [21:24:32] :/ [21:25:00] so "wikidata" has to be a project and test a language [21:26:33] Aaand no gitweb [21:27:49] wikidata goes through with site being wikipedia and lang being wikidata [21:28:11] mutante: using testwikidata as the docroot seems like the way forward with how things are.. [21:28:40] happy to change it [21:28:47] which will then be a dbname of testwikidatawiki [21:28:57] testdatawikitestwiki [21:29:23] There's no reason for us to make wikidata a "suffix" like wikipedia et al, as there aren't going to be loads of language sites [21:30:59] it is a special case [21:31:25] no need for fancy tricks [21:31:39] https://gerrit.wikimedia.org/r/67160 [21:33:22] https://gerrit.wikimedia.org/r/67161 [21:38:53] greg-g: it's not just you, it's an upstream issue [21:39:08] andre__: usually, when it comes to bugzilla ;) [21:39:14] yeah :-/ [21:52:01] Reedy: moved to testwikidata [21:55:13] * MaxSem scaps [21:59:43] aude: so this would work: RewriteRule ^/entity/(.*)$ http://www.wikidata.org/wiki/Special:EntityData/$1 [R=301,QSA] .. result: [21:59:48] http://en.wikidata.org/entity/foo * 301 Moved Permanently http://www.wikidata.org/wiki/Special:EntityData/foo [22:00:38] bbiaw [22:03:29] we're getting a test wikidata? :) [22:04:38] hi [22:05:40] how do i list all pads on etherpad, not via sql? [22:05:40] i see that it has an api query for it but idk how to use it [22:16:42] mutante: hmmm, ok [23:04:07] spagewmf: lemme know whenever the road is clear [23:04:49] you can start. Ours is just a wmf-config change, so long as we have the last 10 minutes...
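Reedy's pastes above, put together: DocumentRoot .../wikidata_test misparses because it misses the first MWMultiVersion pattern (no ".org" suffix) and falls through to the second, which hard-codes the site. A sketch, with the regexes copied from the pasted code rather than the full class:

```php
// Sketch assembled from Reedy's pastes; not the complete MWMultiVersion code.
$docRoot = '/usr/local/apache/common/docroot/wikidata_test';
if ( preg_match( '/^\/usr\/local\/apache\/(?:htdocs|common\/docroot)\/([a-z]+)\.org/', $docRoot, $matches ) ) {
    // not taken: "wikidata_test" has an underscore and no ".org" suffix
} elseif ( preg_match( "/^\/usr\/local\/apache\/(?:htdocs|common\/docroot)\/([a-z0-9\-_]*)$/", $docRoot, $matches ) ) {
    $site = 'wikipedia';   // hard-coded fallback
    $lang = $matches[1];   // "wikidata_test" -- hence the ":/" above
}
```

A docroot of plain "testwikidata" takes the same fallback branch but at least yields a usable name, which is why that spelling (and the testwikidatawiki dbname) won out below.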
[23:06:05] sweet [23:06:23] syncs apache conf [23:06:30] is almost done though [23:11:05] done [23:12:52] I just tested now by blocking a vandal and verified there was an autoblock at the time of the block. [23:13:19] Then did another test by reblocking them with no talk page and again checked whether the autoblock was still there. [23:13:21] It wasn't. [23:13:41] bugzilla? [23:14:03] Meh. [23:14:21] Jasper_Deng: Where is a recent change to blocking? [23:14:24] In the code [23:14:31] Can you look? [23:14:37] * Jasper_Deng digs in Gerrit [23:16:42] Krenair can probably provide a better answer. [23:17:02] I know that you should be able to see the commit history of a file somewhere. [23:17:18] So the issue is reblocking removes autoblocks? [23:17:35] You can see commits which affected a given file by doing 'git log filename' [23:18:00] you can see the last commit to touch each individual line by doing 'git blame filename' [23:18:55] * Jasper_Deng doesn't want to clone the entire core just to do that.... [23:19:24] I don't know if this occurs on the English Wikipedia. [23:21:32] Can someone look at https://gerrit.wikimedia.org/r/#/c/52749/ ? [23:21:42] Siebrand -1'ed, but he hasn't responded to my follow-ups. [23:21:51] It's adding aliases for login/logout. [23:23:14] spagewmf: I'm all finished, but I haven't synced anything. Would it be possible for you to just run a scap after you update your config file? [23:24:15] kaldari, I was just going to sync-file the two config changes [23:25:07] if you want to do that I'll need to sync-dir the extensions dir first [23:25:16] for the new submodule [23:27:06] kaldari OK, let me know. Only 4 minutes left! [23:29:42] kaldari: OK, checking with matthias to see if syncing extensions would be OK now [23:29:58] OK, he's done [23:30:46] spagewmf: if it's all the same to you, I'd prefer a scap when you're done rather than individual syncs [23:30:56] I really hope it hasn't been occurring since this got fixed, Jasper_Deng: https://bugzilla.wikimedia.org/show_bug.cgi?id=5445 [23:31:07] oh wait [23:31:11] wrong bug [23:31:15] sorry, hold on [23:31:23] kaldari, sure, OK. BTW is mlitn's change to AFTv5 just now part of your LD? [23:31:27] https://bugzilla.wikimedia.org/show_bug.cgi?id=5445 [23:31:39] never mind [23:32:30] This gerrit change: https://gerrit.wikimedia.org/r/#/c/3841/ [23:33:43] ok, that code definitely looks suspect. [23:33:53] I don't fully understand it, but it looks suspect. [23:33:59] spagewmf: not sure what you mean by your question; I asked Kaldari if it was ok to deploy some bugfixes to extensions/ArticleFeedbackv5 alongside his LD [23:34:53] mlitn thanks for the explanation. [23:35:20] spagewmf: technically it was spage's lightning window since he claimed it first, but greg said we could share it [23:35:27] oops wrong person [23:35:32] mlitn: ^ [23:36:17] oh ok; well thanks both of you ;) [23:36:34] thanks S! [23:36:36] uh, did anything y'all did do this? https://test.wikipedia.org/wiki/Main_Page [23:36:39] ? [23:37:07] kaldari: mlitn spagewmf ^^ [23:37:11] greg-g ?!? [23:37:13] Jasper_Deng: Do you have a test account on that wiki? [23:37:21] Let's test it out some more. [23:37:21] on testwiki? [23:37:24] yeah [23:37:26] yeah, I have Jasper Deng (alternatE) [23:37:31] Jasper Deng (alternate) [23:37:42] Mind if I test block it?
[23:37:46] no [23:37:47] greg-g: I imagine that's from the i18n files not being synced yet for the new extension [23:37:49] go ahead [23:37:54] it's scapping now though [23:37:57] (and I can verify on-wiki if needed) [23:38:13] wait, Jasper_Deng [23:38:18] internal errorz [23:38:36] on testwiki? [23:38:39] Yes [23:38:43] Take a look. [23:38:46] kaldari: gotcha, other wikis aren't affected, from my quick check [23:38:55] "Error: invalid magic word 'disambiguation'" [23:39:07] wat [23:39:24] greg-g: for historical reasons, test2wiki is more reliable than testwiki in many ways [23:39:39] Bsadowski1: Jasper_Deng: see kaldari's last msgs, it's about this subject ;) should be fixed soon [23:39:42] chrismcmahon: yeah.... [23:40:21] So many channels. [23:40:43] it's to confuse people [23:40:49] https://bugzilla.wikimedia.org/show_bug.cgi?id=49220 [23:41:13] You didn't say the magic word. [23:42:36] I thought test.wikipedia.org didn't require scap. [23:42:40] next time I guess I should fully deploy the extension first before turning it on on testwiki, instead of trying to do both at the same time [23:42:50] * greg-g nods [23:42:51] Hmm, right. [23:43:12] I assumed it would sync the core files and i18n before the config files, but maybe not [23:43:52] OK, testwiki should be back to normal now [23:44:02] I did sync-common on testwiki, so that pulled in the files (but didn't rebuild the i18n) - I think scap indeed rebuilds i18n first [23:44:04] greg-g, spagewmf: ^ [23:44:19] kaldari: yep, all good, thanks [23:44:31] mlitn: ah, that was probably it then [23:44:41] https://doc.wikimedia.org/mediawiki-core/master/php/html/Block_8php_source.html @ Jasper_Deng [23:44:44] Is that current? [23:44:59] idk [23:45:03] for doc.wikimedia.org [23:45:05] https://github.com/wikimedia/mediawiki-core/blob/master/includes/Block.php [23:45:06] :o [23:45:10] mmk [23:45:10] Bsadowski1: doc.wikimedia is autogenerated [23:45:18] Ah cool [23:45:25] Bsadowski1: it's at worst a few minutes behind git master, i think [23:45:45] it says "Generated on Wed Jun 5 2013 19:35:23" in the footer [23:45:46] :) [23:46:34] Bsadowski1: depending on what you are doing, it might be too current - these docs are not for the stable release version, they're for git master [23:46:36] * MatmaRex has no context [23:46:48] ah okay thanks
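For completeness, the file-history commands Krenair spells out at [23:17:35], as they would be run in the mediawiki/core checkout Jasper_Deng was reluctant to clone, while chasing the autoblock regression in Block.php:

```sh
# In a clone of mediawiki/core; both commands are the ones Krenair
# names, with a readability flag added.
git log --oneline includes/Block.php   # commits that touched the file
git blame includes/Block.php           # last commit to touch each line
```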