[12:07:41] is there any way to get a notification when a particular dump has been run?
[12:08:09] Not really
[12:08:51] ah, a pity, thanks
[12:09:27] TheDaveRoss: What sort of notification would you like?
[12:09:50] It's probably worth filing a task, as there might be a way around it
[12:10:03] Uh, a way to implement it
[12:10:07] really anything would be of use, whether on wiki, email, RSS, etc
[12:10:10] Such as a mailing list, and when jobs finish, automated emails are sent out
[12:10:18] Have you got an account on phabricator?
[12:10:22] yes
[12:10:34] Want to file a request for it? :
[12:10:36] :)
[12:10:54] I suppose I could do that
[12:11:22] If you don't ask, you won't get etc
[12:11:53] true. Is Ariel still the point person on that?
[12:17:02] Yup. But if you tag it Dump-Generation, it'll find its way to the right place
[12:21:22] So tagged. I am always hesitant about bug filing since inevitably as soon as I enter one I find the duplicate that I couldn't find in the hour I spent searching
[12:22:33] 2 is better than 0
[12:30:18] hm... 12:12:08 PHP Warning: file_put_contents(/src/coverage.html): failed to open stream: No such file or directory in /home/jenkins/workspace/mwext-phpunit-coverage-patch/src/extensions/Babel/vendor/mediawiki/phpunit-patch-coverage/src/CheckCommand.php on line 228
[12:32:31] https://gerrit.wikimedia.org/r/c/430350/#message-2c9ed0c5_0b48e4fc
[12:32:31] ideas?
[14:44:33] Technical Advice IRC meeting starting in 15 minutes in channel #wikimedia-tech, hosts: @Tonina_WMDE & @Lucas_WMDE - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:01:57] Alright, welcome to this week's Technical Advice IRC meeting :)
[15:02:20] Hello! \o/
[15:03:25] Bencemac: I believe you had some pre-announced topics, which one would you like to discuss first?
[15:03:34] Greetings!
[15:04:44] Maybe T189376. Is it possible that AWB is using IE's core?
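[Editorial aside on the dump-notification question at the top of the log: until a built-in notification exists, one workaround is polling. This is a minimal sketch, assuming the per-run dumpstatus.json file that dumps.wikimedia.org publishes alongside each dump (a "jobs" mapping with a "status" field per job); verify the exact schema and job names before relying on it.]

```python
import json
from urllib.request import urlopen


def job_is_done(status: dict, job: str) -> bool:
    """Return True if the named dump job reports status 'done'."""
    return status.get("jobs", {}).get(job, {}).get("status") == "done"


def fetch_status(wiki: str) -> dict:
    """Fetch the latest dump status for a wiki (requires network access).

    The URL pattern is an assumption based on the public dumps site layout.
    """
    url = f"https://dumps.wikimedia.org/{wiki}/latest/dumpstatus.json"
    with urlopen(url) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Offline demonstration with a hand-written sample status document.
    sample = {"jobs": {"articlesdump": {"status": "done"},
                       "metahistorybz2dump": {"status": "waiting"}}}
    print(job_is_done(sample, "articlesdump"))        # True
    print(job_is_done(sample, "metahistorybz2dump"))  # False
```

A cron job calling `fetch_status()` and emailing when `job_is_done()` flips to True would cover the "any notification at all" use case until something official exists.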
[15:04:44] T189376: Ignoring letters using acute accent - https://phabricator.wikimedia.org/T189376
[15:05:37] Bencemac: regarding AWB we are not able to reproduce the issue due to lack of Windows software :) It's really nice that you opened a phab ticket on their workboard. Hopefully they will take a look soon. Other than that AWB has an IRC channel where you can try asking: #AutoWikiBrowser
[15:06:59] I see.
[15:07:01] https://en.wikipedia.org/wiki/Wikipedia:AWB/MW <-- it can work on linux :P
[15:07:10] Hello! This is my first meeting :)
[15:07:32] (Not that I expect you to try that, setting it up on linux is probably a lot of work)
[15:07:39] hello Gopa
[15:07:48] Hi! Unfortunately, IE doesn't work on Linux. :)
[15:07:58] Tonina_WMDE: Hi
[15:08:25] Bencemac: can you perhaps check whether you get the same behavior in Internet Explorer as well, when you move the cursor there?
[15:08:40] Gopa: you got a question for us? :)
[15:08:45] I am stuck at this, can anyone help me out? https://gerrit.wikimedia.org/r/#/c/418585/
[15:08:54] let's see
[15:09:05] Tonina_WMDE: Thanks a lot
[15:11:25] Gopa: you mean regarding Thiemo's comment on the last patch?
[15:11:42] I cannot now, I don't have IE. Tacsipacsi said that he can reproduce the bug, so I'll give it a try with AWB's IRC:
[15:11:47] Tonina_WMDE: Yes
[15:12:38] Gopa: On line 77, you need to pass a ->numParams() to the wfMessage()
[15:13:08] I think you'd want to pass ->numParams( 0 )
[15:13:11] yeah, similar to classes/Hooks.php L561
[15:13:42] So the line would be wfMessage( 'lqt-newmessages-n' )->numParams( 0 )->text()
[15:13:43] Thanks, I will check it out
[15:14:32] Number zero? or some variable?
[15:14:33] What you have currently would look like "New messages ($1)", since without specifying ->numParams(), the $1 will never be changed
[15:14:58] In that code path, it's immediately after marking all messages read, so the user should have no new messages at that point
[15:15:23] bawolff: Ohh
[15:15:29] in that case, a literal 0 would be appropriate
[15:16:55] [On a different topic] - from wikitech-l - https://gerrit.wikimedia.org/r/#/c/7738/ ... poor patch is from 2012(!)
[15:17:36] wow, I didn't notice that when I saw the patch earlier
[15:17:43] four-digit change number…
[15:18:06] but unfortunately I don't have +2 rights in core :)
[15:18:12] who wants to be the hero?
[15:21:02] Lucas_WMDE: I could merge it, but I'd like to hear someone who speaks an actual RTL language say it's ok first
[15:21:10] because I don't speak any RTL languages
[15:21:42] oh, Huji said it was good
[15:21:44] doesn't eranroz satisfy that requirement?
[15:21:47] that's probably good enough for me
[15:22:06] I thought he was the patch author
[15:22:29] In any case, between the two of them, that's good enough
[15:23:28] I'll just give it a quick test to make sure
[15:24:32] Since yesterday I am not able to open the login page due to some database issues, I guess. The error message was https://ibb.co/fHhw17 If you have any idea that would be helpful to me. Thanks
[15:24:44] Lucas_WMDE: I guess we should change the invlid to invalid? I agree that's a super weird abbreviation
[15:25:19] yeah, I think a typo is more likely :)
[15:25:34] Gopa: Sounds like the DB load balancer is misconfigured
[15:25:42] if you want I could upload a new version that fixes it, then you don't need to +2 your own patch set
[15:25:46] e.g. you have the wrong database name for the centralauth db or something like that
[15:25:54] Lucas_WMDE: That'd be great :)
[15:26:16] Oh, I will check it out. Thanks
[15:28:37] bawolff: pushed
[15:30:18] In connection with T169964, what can I do?
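[Editorial aside on the ->numParams() exchange above: this fragment depends on MediaWiki's runtime (wfMessage() and the LiquidThreads 'lqt-newmessages-n' message), so it is an illustrative sketch of the fix being discussed, not standalone runnable code.]

```php
// Before: $1 in the message is never substituted, so the UI would
// literally render something like "New messages ($1)".
$label = wfMessage( 'lqt-newmessages-n' )->text();

// After: pass a literal 0 through numParams() so number/PLURAL
// formatting applies. In this code path all messages were just
// marked read, so the count is always zero.
$label = wfMessage( 'lqt-newmessages-n' )->numParams( 0 )->text();
```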
[15:30:18] T169964: Counter of the numbers of the pages on a category shows negative result - https://phabricator.wikimedia.org/T169964
[15:30:53] Thanks. Just doing a quick test of it, and then I'll +2
[15:31:15] Bencemac: Run updateCategoryCounts.php
[15:31:31] Or, I mean, get someone to do that for you
[15:34:05] bawolff: As an editor, the problem is not a big deal (-2, who cares?). But from a different view, the software is wrong because it should not do that.
[15:34:29] Hey, I have a small idea on wiki pages. If willing, can you please go to this: https://www.mediawiki.org/wiki/User:Gopavasanth#IDEA
[15:34:47] Our category counts don't happen in the same transaction (for performance reasons, since the count is non-critical), which means sometimes the db makes a boo-boo and they get out of sync
[15:35:46] I know it. Should I care about it or let it go?
[15:37:20] let it go / let it go / can't hold it back anymore :D
[15:37:40] Bencemac: It's really not a big deal to get someone to run the script
[15:37:44] I would like to listen to your views on my idea. Sorry if I am wrong in any way.
[15:37:47] especially if the wiki is not English
[15:38:30] Could you give me a manual or something?
[15:38:47] I don't find it on MediaWiki.
[15:39:01] Gopa: There have been some previous attempts in that direction that didn't work out so well
[15:39:24] Gopa: See Extension:ArticleFeedback and Extension:ArticleFeedbackToolV5
[15:39:58] Oh, thanks for letting me know this. Sure, I will go to those links
[15:41:26] Bencemac: I don't think you can run that script yourself, someone with production access will need to do that
[15:41:32] Bencemac: Sorry, I got mixed up, it's actually called recountCategories.php https://www.mediawiki.org/wiki/Manual:RecountCategories.php
[15:41:34] I'm not sure what the process for requesting that is… probably a Phabricator ticket
[15:41:59] You know, I think I can just do that
[15:42:05] or that :D
[15:42:41] I know Tgr, he can help if it is necessary.
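[Editorial aside for readers following along: recountCategories.php, linked above, is run once per count mode. This dry-run sketch only prints the invocations rather than executing them; MW_ROOT and the --wiki value are assumed placeholders for a WMF-style multiwiki setup, and the --mode values are the ones documented on Manual:RecountCategories.php.]

```shell
# Dry run: print the recountCategories.php invocations instead of running them.
MW_ROOT=/srv/mediawiki   # assumption: adjust to your MediaWiki root
WIKI=huwiki              # the wiki discussed in this log

for MODE in pages subcats files; do
    echo "php $MW_ROOT/maintenance/recountCategories.php --wiki $WIKI --mode $MODE"
done
```

Dropping the `echo` (and running as a user with database access) would perform the actual recount.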
[15:43:14] This is on huwiki
[15:43:18] o/
[15:44:24] how long does that take?
[15:45:32] I mean, does that need scheduling in the deployment calendar or does it finish in a couple minutes?
[15:46:10] I'm not sure. I'm reading what the script does now
[15:47:39] Hi, I'm trying to use the Wikidata Tools plugin for Google Sheets and for some reason it can't load any info from Wikidata (columns previously loaded fine), is there something broken at the moment?
[15:47:42] huwiki has 60K categories
[15:48:24] And about 2 million categorylinks
[15:49:12] Whoa, a familiar face! Hi Tgr!
[15:49:42] hey Bencemac
[15:49:42] There's a related bug T170737
[15:49:43] T170737: Run recountCategories.php on Wikimedia wikis - https://phabricator.wikimedia.org/T170737
[15:52:18] tgr: So my totally rough guess is the script will take around an hour to run on huwiki
[15:52:54] or somewhere in the range of 20 minutes to maybe an hour and a half (very rough guess)
[15:53:39] So I think it might be fast enough
[15:55:15] I'm not sure how strictly the 1 hour limit is interpreted
[15:57:18] Lucas_WMDE: Thanks for helping me out with that previous patch.
[15:57:28] Gopa: you're welcome :) I hope someone will merge it now
[15:57:51] Bencemac: aah, we're almost out of time and I wanted to talk to you about the huwiki tool!
[15:58:04] so I'm not sure how hard it would be to write a dedicated tool for that
[15:58:35] but I recall hearing that some other wiki (Commons, I think?) changed their infobox template so that it would automatically compare the parameter data with Wikidata
[15:58:46] and do something (extra style, maintenance category, not sure) if they were different
[15:58:57] That would be great.
[15:58:58] that might be easier to implement
[15:59:15] but I can't find anything about that right now… perhaps someone in #wikidata remembers?
[15:59:43] but in any case, that only covers the infobox, not the lead section
[15:59:52] unless you use templates for biographical data in the lead section as well
[16:00:55] FYI, Edgars2007 said that she/he would make the tool (maybe). T185423
[16:00:55] T185423: Make a tool to find articles with different birth and death dates in lead section, infobox and Wikidata - https://phabricator.wikimedia.org/T185423
[16:01:16] OK
[16:01:17] At more than 95% we don't.
[16:01:28] just wanted to mention this alternative approach
[16:01:44] hat was ruwiki I think
[16:01:54] *that*
[16:01:57] If you find it, could you ping me? I'd be grateful. :)
[16:02:04] ah, that's possible
[16:02:07] [16:01] wikibugs_ (CR) Huji: "Holy blank! Someone actually merged this!! It has been waiting for such a long time. Made my day!" [extensions/Cite] - https://gerrit.wikimedia.org/r/7738 (https://phabricator.wikimedia.org/T15673) (owner: Eranroz)
[16:02:12] another early wikidata adopter wiki
[16:03:03] Bencemac, tgr: this PDF seems to mention it https://commons.wikimedia.org/wiki/File:Wikidata-infoboxes-short.pdf
[16:03:18] Thanks for your help, I have to leave. Have a good night! (or day)
[16:03:45] and I think that wraps up this Technical Advice IRC Meeting! (we're already a few minutes over time, oh noes)
[16:03:57] thanks to everyone who participated, I think this was a very productive meeting today :)
[16:30:05] Are there recommended ways to rename a wiki? By rename, I mean change the name of the database (or table prefixes) and corresponding $wgSiteName plus settings files in a farm?
[16:32:22] No
[16:34:06] @Reedy, how does WMF manage things like https://noc.wikimedia.org/conf/highlight.php?file=InitialiseSettings.php and wgConf.php
[16:34:17] Manage?
[16:34:25] They're in a git repo
[16:34:34] Are they simply edited, and tested in dev environments?
[16:34:44] No
[16:35:07] Right, I was just wondering if there was any sort of administrative script or interface to the files
[16:35:12] They aren't testing
[16:35:15] *tested
[16:35:15] https://github.com/wikimedia/operations-mediawiki-config
[16:35:21] Well, they are sort of, in betawiki
[16:35:29] but it's so different from production as to be next to useless
[16:36:08] freephile: you can git clone the repo, edit them with any editor and then upload them to Gerrit for code review to suggest changes
[16:37:06] freephile: git clone https://gerrit.wikimedia.org/r/operations/mediawiki-config
[16:37:11] Reedy: thanks for the link to the repo... I'm trying to learn how it's done at WMF to replicate it
[16:37:21] It's not a good example
[16:37:47] Yeah, lots of historical baggage, and the largest possible example
[16:38:06] A good example of how not to sanely do things
[16:39:20] * freephile waves hi to bawolff:
[16:39:33] Hi
[16:39:58] tgr: Wow, that script finished in like under 20 seconds
[16:46:01] question: when registering a new account on https://wikitech.wikimedia.org is the UNIX shell name the same as the LDAP user name?
[16:47:20] Christian75: not necessarily, no
[16:47:58] it can be but doesn't have to be
[16:49:54] ok, re this request: https://phabricator.wikimedia.org/T193616 I am creating an account called CristianCantoro_SUL
[16:50:04] and wanted to create an LDAP user with the same name
[16:50:12] but it says the name is already taken
[16:52:31] CristianCantoro: I can't confirm an LDAP user with the name CristianCantoro_SUL exists
[16:52:47] but there is always confusion around them because an LDAP user has at least 3 different fields
[16:52:50] cn, sn and uid
[16:52:55] and they can all be different
[16:53:53] CristianCantoro: I _do_ see a user called "cristiancantoro"
[16:54:05] and that is also a shell user name.
looks like this:
[16:54:11] uid: cristiancantoro
[16:54:18] that should be me
[16:54:22] cn: CristianCantoro
[16:54:28] sn: CristianCantoro
[16:54:35] homeDirectory: /home/cristiancantoro
[16:54:41] that's me
[16:54:59] so that exists in LDAP
[16:55:44] but that's the only one I see
[16:55:59] and that seems normal to me as well, because "SUL" has no relation to LDAP
[16:56:48] mutante: I know, I made a mess because I did not realize that wikitech was not using SUL, and then I tried to log in to Phabricator and then saw the request above
[16:57:07] it's normal to have 2 separate users, I also have 2 users
[16:57:15] one is LDAP and one is wiki
[16:58:01] I am not sure I understand what is happening in the ticket, but I can confirm for you what currently is in LDAP
[16:58:08] if that helps
[17:01:11] Gopa: I don't have the rights to merge https://gerrit.wikimedia.org/r/#/c/428068/ . You'd have to ask someone like Reedy, hashar, or addshore
[17:07:24] mutante: if I request an LDAP account in https://toolsadmin.wikimedia.org/register/ldap I get this https://i.imgur.com/FZ7KHQe.png also "cristiancantoro-sul" or similar does not work
[17:07:42] mutante: I have created two accounts on Phabricator by mistake
[17:08:11] one is the main one (CristianCantoro), which is associated with my LDAP account; CristianCantoro is also my username on the projects
[17:08:44] the other (CristianCantoro_SUL) is associated with my mediawiki/Wiki* account (CristianCantoro)
[17:09:24] I wanted to merge them and/or discard CristianCantoro_SUL on Phabricator and associate my MediaWiki/Wiki* account with my CristianCantoro account on Phabricator
[17:09:30] but that is not possible
[17:12:15] CristianCantoro: "already in use OR invalid", I guess it's the invalid part, because of the space?
[17:12:30] CristianCantoro: I didn't know toolsadmin.wm.org was a place to register LDAP users
[17:13:01] mutante: also "cristiancantorosul" does not work
[17:13:12] mutante: which is the "canonical" place?
[17:13:54] CristianCantoro: I thought it was still https://wikitech.wikimedia.org/
[17:14:44] CristianCantoro: better report it to andre__, my information is maybe outdated
[17:15:42] mutante: when I created the account "CristianCantoro_SUL" I put "cristiancantoro-sul" as the UNIX shell name
[17:17:55] CristianCantoro: I can confirm that exists in LDAP
[17:18:00] uid: cristiancantoro-sul
[17:18:06] cn: CristianCantoro SUL
[17:18:15] sn: CristianCantoro SUL
[17:19:03] so yes, you have 2 LDAP users
[17:19:25] one is UID: cristiancantoro and one is UID: cristiancantoro-sul
[17:19:50] since you already have 2, does it mean no more issue?
[17:19:58] you didn't want to create a 3rd one, right
[17:28:04] mutante: I changed browser and I was able to log in to Phabricator with "cristiancantoro-sul" and then linked the account correctly
[17:28:27] CristianCantoro: woohoo, glad to hear that :)
[17:28:44] if I told you that I have understood what happened I would be lying
[17:29:56] CristianCantoro: same here :)
[18:08:48] bawolff: Ok no worries. Thanks ;)
[18:43:05] Anyone know of any examples of a grunt job running MediaWiki (extension) phpunit tests? I know how to set it up to run phpunit in general, but running inside MW and triggering the MediaWiki test runner is different
[18:43:12] I wonder if there are any examples of this already
[19:42:19] mooeypoo: uh, why would you have grunt trigger phpunit?
[19:50:56] OK, probably a silly question, but how do I search the global rename log by old username via the API? I can see there's a field to do it on the Special:Log GUI, but I can't see a way from the API. (My query is currently: action=query&list=logevents&letype=gblrename )
[19:54:57] It doesn't look like you can
[19:57:58] hmm, that's roughly the conclusion I'd come to, but was hoping to be wrong :) Cheers
[19:59:41] stwalkerster: It doesn't work anyway, does it?
[19:59:58] if you put &oldname=foobar into the query...
[20:00:34] No, I tried that
[20:00:46] Reedy: it works
[20:00:50] (in the GUI)
[20:01:08] Just get the "Unrecognised parameter" warning, and the same resultset as before
[20:01:25] seems you will need to do that starting with a feature request, stwalkerster ;)
[20:01:29] I wonder how much this feature is used in MW
[20:01:33] I think that textbox is hardcoded to the UI
[20:01:34] LogEventsListGetExtraInputs etc
[20:01:41] legoktm: It's done via a hook
[20:02:04] Reedy: I wrote it ;)
[20:02:15] legoktm: Well why didn't you implement it in the API? :P
[20:03:04] :< no idea, probably no one asked at that time. GlobalRename + SULF was a crazy time
[20:03:14] heh
[20:03:39] Of course, this is only a feature CA seems to use
[20:03:57] But a couple of simple enough hooks in core... Some subscribers in CA
[20:04:01] Handful of messages
[20:04:10] It doesn't alter the output, just the output filtering
[20:04:48] Reedy: can you please merge https://gerrit.wikimedia.org/r/#/c/430326/ ? thanks
[21:10:47] legoktm: I am dropping what I wanted since I think it's not quite worth it in my specific case, but the idea was that when you run "grunt test" locally, it should also run tests for you; we do qunit tests with grunt, but I figured why not also have grunt trigger the php testing too
[21:32:39] mooeypoo: we could theoretically do it with composer (which I think makes more sense than grunt/js)
[21:34:29] legoktm: good point, yeah
[21:34:56] might be good in general, btw.. the command to run tests for an extension that uses the mw-core tests is not very straightforward to remember
[21:35:06] I am still not sure how to run parser tests on a specific input
[21:38:42] php tests/parser/parserTests.php --file=....
[21:38:47] (https://www.mediawiki.org/wiki/Parser_tests)
[21:41:07] Unlike unit tests, which have the memorable `sudo -u www-data php tests/phpunit/phpunit.php --wiki wiki path/to/tests/to/run` ;-)
[21:41:14] hm why didn't I find this...
O.o I may have been confusing concepts when searching. The phpunit tests are also not completely straightforward in vagrant; I managed to run individual test files but not a whole folder for some reason
[21:41:49] Also, they failed locally even though they clearly pass in CI, but that's a different issue I probably have specifically with local installation or something
[21:41:51] (in vagrant)
[21:42:05] James_F: See, that's why things are simpler when not in vagrant :P
[21:42:23] mooeypoo: which test is failing and how?
[21:42:24] I've always had problems running full phpunit tests
[21:42:31] * bd808 deletes the git repo and lets y'all fend for yourselves
[21:42:33] I think last time I tried was MediaWiki 1.17 though
[21:42:34] bawolff: except when you need multiple wikis :P or... ORES... or... VE... or... or or or
[21:42:45] bd808: DONT. YOU. DARE. <3
[21:42:45] Vagrant is wonderful
[21:42:59] If you just don't use optional features, things are soo much easier
[21:43:01] You're giving me a panic attack just suggesting it
[21:43:11] Also if you just don't turn on your computer, all your computer problems go away
[21:43:27] bawolff comes up with the real solution!
[21:43:33] computers are the worst
[21:43:48] bd808: btw, I had to go back to VirtualBox :\ lxc started making really weird errors that seem to exist for others online (with variation) but no one has clear solutions and nothing worked. but, knock-on-wood, it's working for now.
[21:44:51] mooeypoo: :sad face: Most of the 'issues' with mw-vagrant are really problems with the virtualization service(s)
[21:46:17] legoktm: Kartographer extension, I run 'php tests/phpunit/phpunit.php --wiki=wiki extensions/Kartographer/tests/phpunit/KartographerTest.php' and I get tons of failures for testTagData but clearly these passed the last time we merged anything on master..... right?
[21:47:02] bd808: yeah I know, but those are, unfortunately, what makes it work.... or not work
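[Editorial postscript on legoktm's composer suggestion above: composer.json supports a "scripts" section, so the hard-to-remember test-runner invocations from this log could be given memorable names per extension. This is a hedged sketch only; the relative `../../` paths assume the extension lives in core's extensions/ directory, and the parser-test file name is a hypothetical example, not an actual Kartographer file.]

```json
{
    "scripts": {
        "phpunit": "php ../../tests/phpunit/phpunit.php --wiki wiki tests/phpunit/",
        "parser-tests": "php ../../tests/parser/parserTests.php --file=tests/parser/exampleParserTests.txt"
    }
}
```

With something like this in an extension's composer.json, `composer phpunit` (run from the extension directory) would stand in for the longer MediaWiki-specific command lines quoted in the log.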