[06:42:00] To link a Wikimedia Commons category, there's the possibility to place a commons link under "Other sites" or under "Statements" to add a "Commons category".
[06:42:13] I'm a little bit confused here.
[06:42:25] Both seem kinda expected - but redundant.
[06:42:56] Can someone fill me in a bit more on the pros and cons or something?
[06:44:31] Mostly do both, but with the other sites I only connect articles with galleries and categories with categories.
[07:06:59] sjoerddebruin: thx, that offers at least a little guidance
[07:10:00] tmpnck: note that Wikipedia and Commons people prefer the opposite, articles linked with categories
[08:52:22] *pokes people* What's up with WikidataQualityExternalValidation? Can't we merge in there now?
[08:54:09] I don't know!1!
[09:33:17] addshore: I do have merge rights on WikidataQualityExternalValidation. Or what are you asking for?
[09:33:35] well, jenkins doesn't run there... and if I +2V I cannot submit
[09:33:53] has something changed in the past week?
[09:33:58] The Wikidata team should own that extension
[09:34:05] See https://gerrit.wikimedia.org/r/#/c/245826/
[09:34:06] Also the mediawiki group owns all extensions
[09:34:16] hoo: indeed, but also jenkins should still be running afaik
[09:34:57] https://git.wikimedia.org/commitdiff/mediawiki%2Fextensions%2FWikidataQualityExternalValidation/5c2b73093ae46dfd48a671df3c94519fff4f1ef6
[09:35:00] Jan changed that
[09:35:02] dunno why
[09:35:05] You should poke him
[09:35:32] jzerebecki: ^^
[09:36:35] Oh, it's the wrong extension :P
[09:36:47] Ahhh, wikidata > Wikibase!
[09:37:18] Shouldn't the old ones be marked as inactive?
[09:37:59] hm... yeah
[09:38:04] They could also redirect
[09:38:07] if gerrit supports that
[09:38:28] not sure about redirects
[09:39:07] I have flagged WikidataQuality* as read-only
[09:39:19] Which I think means you can't push patches to it any more ;)
[09:41:23] Yeah, probably
[09:41:47] Lydia_WMDE: https://tools.wmflabs.org/wikidata-exports/
[09:42:03] Lydia_WMDE: https://tools.wmflabs.org/wikidata-exports/rdf/index.php?content=dump_download.php&dump=20150928
[09:42:20] DanielK_WMDE: Lydia_WMDE: Any thoughts about also providing our json dumps as bzip2?
[09:43:02] It provides better compression, and the analytics team (or whatever they're called now) needs it, as it works better with Hadoop
[09:43:22] hoo: "why not"?
[09:43:30] what DanielK_WMDE said
[09:43:41] Ok, great :)
[09:44:02] hoo: we should look into using pbzip2 for creating them
[09:44:47] I doubt that's needed yet... but at some point, probably
[09:45:00] hoo: gzip streams can simply be concatenated, right? that's nice for sharded processing. does that work with bz2 as well?
[09:45:26] well, parallelizing only makes sense if we are not competing for cpu cycles with anything important
[09:46:00] We have that server almost to ourselves on Mondays/Tuesdays, so CPU cycles aren't an issue
[09:46:03] hoo, addshore: if you want to join the daily - we're in the hangouts
[09:46:05] writing over nfs could be
[09:46:21] I'm only going to work for another 30 or 45m, so no thanks
[09:46:30] And branch later today, follow up on CR
[09:47:59] Tobi_WMDE_SW: can't see you in either one ;) - I think your message was a bit delayed :p
[09:59:48] addshore: oh, ok
[09:59:50] :)
[09:59:52] sorry
[09:59:53] :D
[09:59:56] no worries :)
[10:00:28] hoo: ok, fine
[10:10:20] :)
[10:10:28] DanielK_WMDE: Yes, bzip2 supports concatenation as well, btw.
[10:10:34] But we won't make use of that (yet)
[10:11:17] hoo: sure, this was more a long-term consideration
[10:11:31] nice to know, one more item checked on the list
[10:12:43] One funny thing with concatenation is that the headers are off then... if you do gzip -l on one of our dumps it shows funny things
[10:13:53] the compression ratio e.g. was -181008363633.3% for our October 5 dump :D
[10:14:24] haha
[10:14:46] The first "shard" is just a compressed "[\n", that's why
[10:15:00] seems like it's just reading the header for one chunk, and then comparing it to the total file size
[10:15:05] pretty sucky compression, that
[10:15:08] Yeah :D
[10:17:52] Anyway, I need to go
[10:18:04] Would appreciate reviews (or even merges ;)) on the ORM stuff
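An aside on the compression exchange above: the member-concatenation property hoo and DanielK_WMDE discuss is easy to demonstrate. A minimal PHP sketch (file names and shard contents are made up; assumes the bz2 extension is loaded):

```php
<?php
// Each shard is compressed independently; the outputs are then simply
// concatenated. gzip and bzip2 decompressors treat the result as one
// continuous stream, which is what makes sharded dump writing possible.
$shards = [ "[\n", "{\"id\":\"Q1\"}\n", "]\n" ];

$gz = '';
$bz = '';
foreach ( $shards as $shard ) {
	$gz .= gzencode( $shard, 9 );   // one gzip member per shard
	$bz .= bzcompress( $shard, 9 ); // one bzip2 stream per shard
}
file_put_contents( 'dump.json.gz', $gz );
file_put_contents( 'dump.json.bz2', $bz );

// `gzip -dc dump.json.gz` and `bzip2 -dc dump.json.bz2` both yield the full
// JSON. `gzip -l`, however, only inspects a single member's size fields
// instead of walking all members, hence the absurd ratio quoted above.
```

bzip2's block-oriented format is also what makes it splittable, and hence the Hadoop-friendly choice hoo mentions.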
[10:57:35] DanielK_WMDE: why, but of course, Wikibase constructs a ChangesList object in tests ;) *goes to fix*
[12:01:09] addshore, hoo: there is a ticket about deleting the old WikidataQuality ones https://phabricator.wikimedia.org/T103543 . they are already set to read only.
[12:01:29] jzerebecki: they weren't read-only (until I did it this morning) ;)
[12:01:42] addshore: thx
[12:03:35] sjoerddebruin: Daniel and I just looked into your watchlist issue
[12:03:53] sjoerddebruin: the reason the discussion page is showing up is that data from the item about Germany is used on the discussion page
[12:04:01] via a parser function
[12:12:57] Ah, okay...
[12:13:53] That wasn't communicated, and it's very annoying to see three pages every time imo
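For context on the watchlist puzzle above: Wikibase's client-side usage tracking records which pages use which entity's data. A hedged sketch (assuming the wbc_entity_usage schema; this is not code from Wikibase itself) of how one could list the pages that light up in watchlists when Q183 changes:

```php
<?php
// A talk page embedding {{#property:...}} for Q183 gets a usage row, which
// is why it shows up in watchlists when the item changes.
$dbr = wfGetDB( DB_REPLICA ); // DB_SLAVE in MediaWiki cores of that era

$res = $dbr->select(
	'wbc_entity_usage',
	[ 'eu_page_id', 'eu_aspect' ],
	[ 'eu_entity_id' => 'Q183' ],
	__METHOD__
);
foreach ( $res as $row ) {
	// eu_aspect says *what* is used, e.g. 'S' (sitelinks) or 'L' (labels)
	echo "page {$row->eu_page_id} uses aspect {$row->eu_aspect}\n";
}
```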
[12:35:30] Lydia_WMDE: are you aware of the discussion on wikidata:forum about the name of the references section in German? I only skimmed over it (so maybe I misunderstood it), but it seems like they want to change the terminology
[12:35:43] and I know a bunch of people working on wikidata are German speakers, so thought it might be relevant
[13:29:55] woo, ORM removal :)
[13:30:41] Do we know roughly when that will be deployed? Working out a plan for removing it from core + moving it to EP
[13:59:48] Reedy: probably this week
[14:05:09] nikki: yeah, i've been following it a bit. thanks for the pointer :)
[14:05:29] sjoerddebruin: yeah, agreed. this is a weird corner case though, i hope
[14:05:45] will think about how to improve it
[14:05:56] Make it longer and sound more angry
[14:09:45] Lydia_WMDE: around?
[14:09:55] aude: jep :) how are you feeling?
[14:11:08] somewhat okayish
[14:11:34] but need to be careful that I'm okay to be on an airplane in 2 days :)
[14:11:55] good good
[14:11:58] take it easy
[14:12:02] just want to get https://gerrit.wikimedia.org/r/#/c/243625/ unstuck, and maybe the best way is to pull it out of the chain for now
[14:12:16] it needs more work to set other params like globe
[14:12:27] looking
[14:12:39] the intention of the rest of the chain is good, but maybe it needs reworking and rethinking
[14:13:29] or if the chain is almost ready to merge and then we do follow-ups, that's ok
[14:13:42] aude: All my smaller patches *after* this patch can wait, in my opinion. They are more reminders so we do not forget all these possible refactorings.
[14:13:53] Thiemo_WMDE: ok
[14:14:14] So I think we have the same idea. Merge the stuff including "Add geodata to parser output". Wait with the rest.
[14:14:18] Ok?
[14:14:26] good from my side
[14:14:32] ok for you, aude?
[14:14:51] ok with me
[14:14:59] i don't want this deployed in the branch today
[14:15:11] which, since GeoData is not enabled, it won't be
[14:16:08] Right now I'm doing the last patch in this chain. All it does is adding fixmes to the code. Then we can look at these fixmes and see when and how we are resolving all these smaller nitpicks and architectural refactorings.
[14:16:45] ok
[14:17:08] :)
[14:17:22] aude: are you coming in on Monday again if all is good? or already Friday?
[14:17:35] Lydia_WMDE: probably Friday for the beer and cake :)
[14:17:40] haha
[14:17:40] ok
[14:17:49] :O
[14:17:54] :D
[15:06:06] Is there a way to make creation of entries quicker by importing data from a Wikipedia article? At least location should be really easy to import.
[15:06:46] There is a script for that, created by the Russian community.
[15:07:17] Also P18 (use the image from the infobox, or one provided by https://www.mediawiki.org/wiki/Extension:PageImages) would be a good heuristic. It would require verification, but in 99% of cases it would require just accepting it.
[15:07:41] sjoerddebruin: "There is a script for that, created by the Russian community." do you have a link or name to find it?
[15:08:08] https://ru.wikipedia.org/wiki/Википедия:WE-Framework, don't know how it works. But sometimes I see new items created with this.
[15:09:03] Also, the page_image provided by the Wikipedia API nearly always matches the P18 property. Is there some existing tool that would allow quick import (obviously after verification, as Extension:PageImages sometimes fails)?
[15:09:40] Not sure.
[15:10:09] The image game of the Wikidata Game would be something, maybe. https://tools.wmflabs.org/wikidata-game/
[15:12:00] * aude goes back to being sick
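Picking up the P18/PageImages heuristic floated above: the PageImages pick for an article is one API call away, so a comparison tool is straightforward to sketch. The endpoint and prop=pageimages are real; the title, the item lookup, and the comparison policy are placeholders:

```php
<?php
// Fetch the PageImages selection for an article (formatversion=2 gives
// pages as a plain array).
$url = 'https://en.wikipedia.org/w/api.php?action=query&prop=pageimages'
	. '&piprop=name&format=json&formatversion=2&titles=' . urlencode( 'Berlin' );
$data = json_decode( file_get_contents( $url ), true );
$pageImage = $data['query']['pages'][0]['pageimage'] ?? null;

// Compare $pageImage against the item's P18 value (lookup not shown).
// If they match, the import could be auto-accepted; if not, queue the
// pair for human review, since PageImages sometimes picks the wrong image.
```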
[15:21:35] get well soon, aude!
[15:21:58] http://cdn.meme.am/instances/10853705.jpg
[15:24:44] :)
[15:25:08] http://www.quickmeme.com/img/43/438219332e404e3094986e23f58348828b5a39544ec60b92d1092fa36377fe32.jpg
[15:27:56] :D
[15:29:16] :D
[16:17:02] is there an open bug for the special search page? https://www.wikidata.org/w/index.php?search=&search=earth&title=Special%3ASearch&go=Go
[16:18:05] dennyvrandecic: yea, somewhere...
[16:19:26] dennyvrandecic: find anything here? https://phabricator.wikimedia.org/T46529
[16:19:42] just looking through that very list :)
[16:19:50] this, maybe: https://phabricator.wikimedia.org/T110648
[16:20:44] dennyvrandecic: the underlying issue is https://phabricator.wikimedia.org/T89733
[16:21:01] a temporary workaround would be possible...
[16:21:44] both these bugs should be blocking the first list
[16:21:51] I hope I can log in...
[16:25:10] is there a special:instant_search for mediawiki / wikibase? that would be nifty
[16:34:22] After 15 minutes of fighting with KDE, here I am
[16:36:44] hoo: great, now you can fight the KDE end boss, Lydia_WMDE
[16:37:36] :P
[16:37:56] I can't figure out why it's acting up like that
[16:38:06] I ended up deleting all the kscreen config, but that didn't help a bit
[16:46:27] * nikki wonders what lydia's attacks would be as an end boss
[16:58:40] nikki: i'm totally not revealing my secrets, obviously :P
[16:58:53] aww
[16:59:43] nikki: you have to buy the "Big Book of Lydia's Secrets", being released December 25th in all good Berlin shops!
[17:00:25] only in Berlin? that sounds like a trap! :P
[17:00:43] it's Lydia_WMDE - of course it is!
[17:01:48] i see you are all sufficiently afraid
[17:01:49] very good
[17:01:50] :P
[17:02:04] muahahaha
[17:03:39] !run
[17:04:53] Anything I need to review or so?
[17:05:02] Otherwise, I'll probably kill more ORM stuffs
[17:05:15] kill all the ORM stuffz
[17:06:28] hoo: you need to submit a book review for the "Big Book of Lydia's Secrets" ;)
[17:07:14] I don't want to end up spoiling it :D
[17:11:13] heh... I accidentally pasted an item ID into a date field, and it still managed to parse it as a date
[17:11:48] That's interesting
[17:12:20] php > echo strtotime( 'Q1234' );
[17:12:20] 1444754040
[17:12:22] wow
[17:12:42] Reminds me of php > echo strtotime( 'a potato' );
[17:12:42] 1444752749
[17:13:02] haha, what
[17:13:18] strtotime( 'kittens' ); :D
[17:13:43] >9000 ?
[17:13:49] aude: echo strtotime( 'a kitten' ); << That works
[17:14:08] hoo: go fix PHP, it should also support what aude wrote
[17:15:57] CatTimeParser
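The joke has a real lesson: strtotime() is extremely forgiving, and unrecognized tokens can still combine into a valid timestamp, which is why garbage like 'Q1234' parses. A defensive sketch; the strict Y-m-d format is an assumption about what such a date field should accept:

```php
<?php
function parseDateStrict( $input ) {
	// createFromFormat() with an explicit format rejects 'Q1234',
	// 'a potato' and friends; '!' zeroes out the unspecified fields.
	$dt = DateTime::createFromFormat( '!Y-m-d', $input );
	if ( $dt === false ) {
		return false;
	}
	// Also reject trailing garbage like '2015-10-13 kittens', which
	// only produces a "Trailing data" warning rather than a failure.
	$errors = DateTime::getLastErrors();
	if ( $errors && ( $errors['warning_count'] || $errors['error_count'] ) ) {
		return false;
	}
	return $dt->getTimestamp();
}
```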
[17:17:37] jzerebecki: $wgMFNearbyRange (in meters, default is 10km)
[17:18:53] aude: I'm not happy with PropertyDataTypeMatcher being in Lookup\. Thoughts on where best to put it?
[17:20:21] JeroenDeDauw: agree, that's not a great place, but I didn't think of a better one
[17:21:06] aude: Entity\ seems the best one
[17:21:12] Could also make a new Property\ one
[17:21:23] aude: which of those do you think is best?
[17:22:21] Entity\ is ok
[17:22:40] there is some discussion of having a more generic "StatementMatcher" interface
[17:22:47] you can ask DanielK_WMDE about it :)
[17:23:15] although not sure i like that (yet)
[17:24:16] aude: could just as well be a SnakMatcher interface
[17:24:21] depending on the use cases we have
[17:24:30] there are some nitty-gritty design challenges here
[17:26:18] DanielK_WMDE: yeah
[17:26:36] for geodata, we potentially want reference snaks with coordinate properties
[17:26:44] errr, qualifiers (or maybe both)
[17:27:05] * aude thinks qualifier coordinates should be included in secondary coordinates
[17:27:46] DanielK_WMDE: feel free to come talk about it
[17:28:53] JeroenDeDauw: probably not today. focussing on getting the essentials merged. looks like we are going to do this with lots of todos left for later.
[17:29:33] I'm sorry Lydia_WMDE, I won't be there for the birthday :s
[17:30:41] DanielK_WMDE: I'm not in the office for the next 3 weeks
[17:31:58] aude: namespace Wikibase\Lib\Tests\Store; :D hahaha
[17:32:04] aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
[17:32:05] JeroenDeDauw: too bad, we'll need to make some decisions regarding the future of the data model pretty soon.
[17:34:02] aude: can you explain why GeoDataDataUpdate takes $statementsByGeoProperty in the constructor?
[17:34:28] DanielK_WMDE: for testing
[17:35:21] aude: what issue does it solve for testing?
[17:35:23] but possibly i want to refactor it again
[17:35:43] to test updateParserOutput separately from processStatement
[17:36:14] but i think we'll end up refactoring and not having both
[17:36:18] but this is an aggregator class. one method is for putting stuff in, one is for getting stuff out. their contracts are relative to each other.
[17:36:26] so their behavior needs to be tested together
[17:36:34] ORM will be gone from Wikibase after https://gerrit.wikimedia.org/r/245949
[17:36:41] testing one without the other would be testing implementation details, not the interface contract
[17:36:47] hoo: \o/
[17:37:02] DanielK_WMDE: ok, so i should refactor
[17:37:13] There's still a lot of refactoring to do, but it's by far easier to do that without having to work around the ORM stuff all the time
[17:41:41] hoo|away: well done sir, well done
[17:46:25] aude: yea, but don't mess with the patch now, make a follow-up please
[17:51:24] DanielK_WMDE: ok
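A sketch of DanielK_WMDE's testing point, since it generalizes well: for an aggregator, the put-stuff-in and get-stuff-out methods only have a contract relative to each other, so a test should drive both. All names and signatures below mirror the discussion but are assumptions, not the actual Wikibase API:

```php
<?php
class GeoDataDataUpdateTest extends PHPUnit_Framework_TestCase {

	public function testCoordinateStatementEndsUpInParserOutput() {
		// Exercise the aggregator end to end instead of asserting on what
		// processStatement() stored internally (an implementation detail).
		$update = new GeoDataDataUpdate( /* real collaborators, no partial mocks */ );
		$parserOutput = new ParserOutput();

		// newCoordinateStatement() is a hypothetical test helper.
		$update->processStatement( $this->newCoordinateStatement( 52.52, 13.41 ) );
		$update->updateParserOutput( $parserOutput );

		// Assert only on the observable outcome; the key name is hypothetical.
		$this->assertNotNull( $parserOutput->getExtensionData( 'geodata' ) );
	}
}
```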
[17:52:41] wow, we put a displaytitle entry in the page_props table for every item
[17:52:58] but it's not split by language or such
[18:03:24] aude: want to fix the commit message? https://gerrit.wikimedia.org/r/#/c/243613/15..17//COMMIT_MSG
[18:03:28] hoo: so my plan was to update the wikidata wmf3 branch after Daniel merged some nearby-related patches. what do you prefer for wikibase: backports or recreating the branch from master?
[18:03:39] aude: i'm about to merge most of the chain
[18:04:03] jzerebecki: Both ok with me, given the branch is quite new
[18:04:08] DanielK_WMDE: thought thiemo could
[18:04:13] Branching from master is probably easier
[18:04:22] since he split the stuff
[18:04:44] ok, redoing wikibase wmf3 then, I'll ping you when we are ready
[18:04:51] jzerebecki: not sure i want all the nearby stuff in yet
[18:04:59] it's still a bit WIP
[18:05:11] aude: thiemo isn't here, nor online.
[18:05:16] aude: on master?
[18:05:17] grrrr
[18:05:29] hoo: what?
[18:05:55] why recreate the branch?
[18:05:58] aude: You consider it a little WIP on master
[18:06:28] Apparently there has been progress on that since you branched, but no idea
[18:06:30] aude: merging it now would give us an opportunity to have it on test before deploying it
[18:06:31] not following there much
[18:06:52] hoo: i'm not happy with how the nearby stuff is now, but it's a big chain and we have to fix things in follow-ups
[18:07:01] aude: there are open design issues, but I don't see anything that would be a problem in terms of security or performance
[18:07:05] any showstoppers?
[18:07:19] i think it will be ready for next week (we can branch again)
[18:07:28] I'm not going to be around tomorrow evening
[18:07:30] DanielK_WMDE: geodata is not ready
[18:07:35] aude: branching again next week is the plan, yes
[18:07:42] aude: not ready how?
[18:07:45] needs to add globe and other params
[18:07:53] needs proper display text
[18:07:57] works for me
[18:08:16] and i want to polish the design of the code more
[18:08:21] to make sure it's performant enough
[18:08:27] aude: we would still leave the special page disabled on wikidatawiki
[18:08:33] we have a working hack for the display text. the real solution needs conceptual work, not doable in a week anyway
[18:08:42] jzerebecki: I don't want to even populate it yet
[18:09:17] DanielK_WMDE: i am working on a patch that might work (to get the displaytitle, with our stuff hooked in)
[18:09:31] not sure mobile people are happy with the hack
[18:09:57] except maybe if they see we have a real solution in the works (one that also works for the mobile watchlist, search, etc.)
[18:10:23] aude: can you point me to the patch?
[18:10:26] aude: the globe is correctly set in the geo_tags table
[18:10:31] jzerebecki: no it's not
[18:10:34] but in any case, that isn't going in yet anyway
[18:10:46] i would really like to have the code for populating this stuff in
[18:10:48] not for Mars / the Moon, etc.
[18:10:53] we have been discussing that for a week now
[18:11:19] DanielK_WMDE: i haven't uploaded the patch yet
[18:11:48] Population should really be started early next week, at the latest
[18:12:02] hoo: sure
[18:12:02] Can't wait until the Wednesday evening deploy
[18:12:10] (next week)
[18:12:19] aude: that means we have to have the code live. and deploy it now.
[18:12:49] We could also branch on Friday and then deploy that ourselves on Monday
[18:12:58] that's going to cost us some time, but it's certainly doable
[18:13:01] aude: we can still wait a few days to repopulate the data, but having it in now could decrease the amount of backports we need
[18:13:47] jzerebecki: we definitely would need to repopulate
[18:14:10] * aude is super unhappy
[18:14:56] even if we had it perfect now, we would need to repopulate
[18:15:20] aude: do you have a way we can do this that would make you happy?
[18:15:30] jzerebecki: if we branch next week
[18:15:31] If you can get it ready by Monday, we could just do a new wmf3 build from master then and scap that out
[18:15:36] would be rather safe to do
[18:16:25] * aude doesn't like stuff merged last minute for the deployment branch, like this
[18:16:33] often a bad idea
[18:16:48] aude: i'm not here next week, neither is thiemo. pushing things to next week is not going to help.
[18:17:54] aude: yes, last-minute merges are bad. all-in-one-big-chunk deployments are also bad. relying on running maintenance scripts at the last minute is bad too.
[18:18:00] so, which shall it be?
[18:18:19] if hoo and jzerebecki want to take care of deployments and issues, then ok
[18:18:57] I'm in San Francisco next week
[18:18:58] fwiw, geodata is not enabled on wikidata
[18:19:07] so also not 100% available
[18:19:21] so if we could enable it when some of the follow-ups go in, then that might work
[18:19:24] * Lydia_WMDE read up
[18:19:36] aude: oh, good point. so the code wouldn't do anything, right?
[18:19:40] DanielK_WMDE: yeah
[18:19:51] Could we safely test it on testwikidata?
[18:19:56] hoo: yeah
[18:20:05] Or are there other potentially risky refactorings in?
[18:20:12] per the ticket, we need globe + dim + type
[18:20:26] which i wanted to work on but feel blocked at this point
[18:20:52] i have a bad feeling about delaying this now if thiemo and daniel are not going to be there next week
[18:21:21] we're not going to get much more done by delaying, is my impression
[18:21:34] well, we have broken geodata then :(
[18:21:52] aude: broken in that some coordinates that should not be on Earth are?
[18:21:53] but if it's only on test, then i suppose it's ok
[18:21:56] or more?
[18:22:01] Lydia_WMDE: and no dimension, no type
[18:22:16] Lydia_WMDE: without the GeoData extension enabled, it's not going to work anyway.
[18:22:18] dimension and type meaning?
[18:22:39] * aude digs up the ticket
[18:22:40] DanielK_WMDE: not going to work meaning? we can't populate it?
[18:22:44] aude: thx
[18:22:55] Lydia_WMDE: yes. the table we want to populate is defined by the extension.
[18:23:00] https://phabricator.wikimedia.org/T75482#1713211
[18:23:31] ok
[18:23:35] dim and type are needed for when we use geodata to get coordinates for larger areas
[18:24:01] aude: what timeline would you suggest?
[18:24:03] then we need to rank and filter to get the most important things, or filter in other ways, in order to get a reasonable number of results
[18:24:21] once the chain is unstuck, i can work on this
[18:25:38] aude: ok, when would you start populating? and when deploy the code?
[18:25:57] aude: dim and type are not that easy
[18:26:09] jzerebecki: sure
[18:26:15] Lydia_WMDE: maybe Monday or Tuesday
[18:26:16] Lydia_WMDE: without the GeoData extension, the discussion is pretty moot.
[18:26:36] DanielK_WMDE: well, we can enable that at the same time, no?
[18:26:48] Lydia_WMDE: you mean *now*?
[18:27:00] well, we can probably enable it before running refreshLinks. That would work.
[18:27:04] yeah
[18:27:10] that is what i mean
[18:27:21] What does it need besides a config change?
[18:27:25] New table?
[18:27:31] yup
[18:27:36] As long as it's just that, we can do that all ourselves
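For the record, "just a config change plus a new table" breaks down roughly like this. A hedged sketch: the modern extension.json loading is shown (at the time this was a require_once), and the maintenance steps are assumptions about the usual extension workflow:

```php
<?php
// In LocalSettings.php (or the wmf config): turn the extension on.
wfLoadExtension( 'GeoData' );
// At the time this was: require_once "$IP/extensions/GeoData/GeoData.php";

// Then, outside PHP:
//   php maintenance/update.php        # creates the extension's geo_tags table
//   php maintenance/refreshLinks.php  # re-runs LinksUpdate so pages repopulate it
```

The order matters, as DanielK_WMDE notes: the extension has to be enabled before refreshLinks runs, because geo_tags is defined by GeoData itself.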
[18:28:57] hoo: would Tuesday as the latest day still be ok for populating? /me is worried, but if we agree it would still work...
[18:29:41] Potentially... but if something goes wrong, we're going to have a fun Wednesday
[18:29:46] so how about: we try to get it into this branch; we don't enable GeoData or run linksUpdate right now. then, once we are fine on Monday or Tuesday, after possibly backporting something, enable it and run linksUpdate
[18:29:52] * aude would be more comfortable to do this next week
[18:30:30] aude: this =?
[18:30:36] as jzerebecki suggests?
[18:30:41] Lydia_WMDE: populate stuff
[18:30:45] ok
[18:31:02] Sure
[18:31:03] aude: so you are fine with my suggestion?
[18:31:18] jzerebecki: ok with me
[18:31:39] alright. i am worried we're going to delay it too much, but if you all say this is better, let's do it
[18:31:39] good. agreed then.
[18:32:01] So... we'll get it ready to go on Monday then?
[18:32:13] ready to go = ready to populate
[18:32:26] hoo: wrt populating: even if we have, say, just the first 50% of the items done in time, we should still be ok
[18:32:32] * aude should eat now and get some rest
[18:32:32] the first ones are the more important ones
[18:32:41] ok
[18:32:49] and relax :)
[18:32:52] try to relax
[18:33:35] * Lydia_WMDE hands aude some hot tea
[18:33:40] :)
[18:36:50] JohnFLewis: just got another subscription for wikidata-l. would be great if you could look into that soon
[18:37:04] Lydia_WMDE: sure sure, doing now
[18:37:12] thanks!
[18:38:03] Lydia_WMDE: this is confusing now, heh? :)
[18:38:17] heh
[18:38:18] you mean sub requests to -l because apache redirects, or?
[18:38:18] a bit
[18:38:39] yeah, people don't seem to get redirected from the -l listinfo page to the other one
[18:38:44] ack.
[18:38:47] hoo: yes, ready to populate on Monday or Tuesday
[18:38:49] so they subscribe to the old list
[18:38:56] bah, neither of them are in this channel :(
[18:39:13] * Lydia_WMDE goes and has food
[18:40:22] hoo: i'm surprised the wikidata branch was correctly picked up into core wmf3. or did you do something to make that happen?
[18:40:52] No, I just watched it :P
[18:41:02] It matches up names, I think
[18:45:33] aude: jzerebecki pointed out that we shouldn't rely on class_exists for checking whether an extension is enabled. If the extension uses composer, the class may be found by the autoloader anyway (depending on how composer is used).
[18:45:49] this isn't a problem with GeoData (we checked), but something to keep in mind for the future
[18:46:18] made the mistake with WikibaseRepo yesterday - for that, it really doesn't work.
[18:46:26] that's only really a problem with our extensions and stuff in vendor
[18:46:33] this is the ticket to remove class_exists: https://phabricator.wikimedia.org/T102246
[18:46:57] hoo: well, everything that uses composer. so, in the future, potentially a lot of things
[18:47:01] Right now we're the only ones having autoload entries for several extensions at one point
[18:47:22] yes, and we only load those when the wikidata extension is enabled
[18:47:24] No, not everything using composer
[18:47:28] stuff that uses composer and is loaded
[18:47:30] hoo: once it's used with core, all autoloader entries will be in a single place
[18:47:37] Yeah, probably
[18:47:52] so the classes would be found even if the extension isn't initialized
[18:48:32] Yeah, but that's only an issue if the way people use composer changes fundamentally
[18:51:46] hoo: so basically that becomes a concern once my streamlining-composer-usage RFC is implemented
[18:52:40] Yeah, by that time
[18:52:59] depending on how we choose to implement autoloaders
[18:53:17] one big autoloader vs. one per extension, managed by extension registration or so
[18:54:55] yea, still quite a lot of things to think about until then
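The class_exists pitfall above in one sketch. With composer-managed autoloading, a class can be resolvable even though its extension was never loaded, so the first check below is unreliable; asking the extension registry is the safer probe (the class name is a placeholder, not a real GeoData class):

```php
<?php
// Unreliable: may return true merely because the composer (or core)
// autoloader knows where the class file lives.
if ( class_exists( 'GeoData\SomeGeoDataClass' ) ) {
	// ... the extension *might* be enabled, or might not
}

// More reliable: ask whether the extension was actually registered.
// (ExtensionRegistry only knows about extensions registered with it,
// e.g. via extension.json.)
if ( ExtensionRegistry::getInstance()->isLoaded( 'GeoData' ) ) {
	// the extension really is enabled on this wiki
}
```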
[18:56:39] hoo: just so you are aware: https://gerrit.wikimedia.org/r/#/c/245954/
[18:57:20] JeroenDeDauw: Nice :)
[19:02:59] JeroenDeDauw: https://gerrit.wikimedia.org/r/245966
[19:16:38] hoo: ok, rebranch?
[19:17:14] If aude is ok
[19:18:52] hoo: 20:31 < aude> jzerebecki: ok with me
[19:19:07] Then ok
[19:19:28] made an additional manual test on vagrant that worked.
[19:39:28] Lydia_WMDE: solving the issue in the easiest way possible - I've killed the wikidata-l list and left the archives online (until we redirect them via apache anyway)
[19:54:30] hoo: i failed: https://gerrit.wikimedia.org/r/245979
[19:57:09] JeroenDeDauw: https://gerrit.wikimedia.org/r/245991
[19:58:48] hoo: huh? why did you redo this? ;p
[19:58:59] redo?
[19:59:14] Did that against master
[19:59:28] hoo: https://gerrit.wikimedia.org/r/245979