[00:38:41] mutante: iegreview is still being used as far as I know. It would be nice to actually move those trivial php apps to a Kubernetes cluster somewhere, but the last time I checked that was not something the serviceops folks were excited about.
[00:41:58] bd808: alright. well.. i found this https://phabricator.wikimedia.org/T222665
[00:42:12] code stewardship request for iegreview
[00:42:27] yes. I am very aware
[00:42:58] so i have 3 apps on https://phabricator.wikimedia.org/T247648 and all 3 have more or less a ticket to discuss if decom or upgrade
[00:43:53] whatever remains could also be folded into misc_static or maybe k8s, yea
[00:44:03] If the goal is end of Q4 I'm pretty sure the answer will be to upgrade those Ganeti instances
[00:45:22] the saga of scholarships this year has been weird. They wanted to update it, then kill it, then update it.
[00:45:32] lots of minor drama
[00:47:06] who do we have to ask though?
[00:47:57] mutante: Isabel Cueva and Joël Letang
[00:48:31] bd808: heh, never heard the names i think. thanks!
[00:49:15] mutante: Joël is Rachel's boss and Isabel is her coworker on the events team
[00:49:55] *nod* ok
[00:53:29] I vaguely remember there being some automated email processing / sending instructions for grid engine somewhere regarding error reports and such, but is there documentation on an equivalent for k8s?
[00:54:07] what are you trying to do?
[00:55:43] stuff like sending email alerts if the bot can't edit/upload due to being blocked, for example
[00:55:59] or permissions errors, etc
[00:57:41] the page you're probably thinking of is https://wikitech.wikimedia.org/wiki/Help:Toolforge/Email
[00:58:33] the k8s containers don't have exim enabled, so you'll have to use something else to send mail from inside
[00:58:43] in python the built-in smtp module works
[00:58:57] primarily using PHP
[01:02:10] there's probably a library or tool to use
[01:02:20] you just need something that can speak SMTP
[01:09:37] check the package bsd-mailx
[01:09:50] we use that to send logmail from phabricator server.. f.e.
[01:10:21] also the package called "s-nail"
[01:12:00] ah, yea. so what it is is https://gerrit.wikimedia.org/r/c/operations/puppet/+/542191
[01:12:08] "heirloom-mailx was
[01:12:09] a transitional package and has been replaced by s-nail"
[01:14:48] if you have that then you can do this in bash scripts:
[01:14:59] hmm, might just end up using calls to api.php instead
[01:15:04] cat < BLA BLA BLA
[01:15:10] EOF
[01:15:27] so just email address, subject and body
[01:32:26] how would I go about calling that through the webservice?
[01:36:55] DSquirrelGM: i think it's entirely unrelated to webservice
[01:37:03] it's this https://wikitech.wikimedia.org/wiki/Help:Toolforge/Email#Sending_via_the_command_line
[01:37:20] but i noticed the "not available in k8s" comment.. so i dont know
[01:38:33] "Tools running in Kubernetes should instead send email using an external SMTP server. The mail.tools.wmflabs.org service name should be used as the target SMTP server."
[01:40:54] DSquirrelGM: alright.. so what you do is use the "mailx" command as i mentioned earlier but you add parameters for the external SMTP server
[01:41:01] see https://www.binarytides.com/linux-mail-with-smtp/
[01:41:28] echo "something" | mailx -v -r "someone@example.com" -s "This is the subject" -S smtp="mail.example.com:587" ...
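(Editor's note: the relay approach quoted above — no exim inside the k8s containers, so mail goes out through the external SMTP server mail.tools.wmflabs.org — can also be done from Python with the standard-library smtplib and email modules instead of mailx. This is only a sketch: the sender/recipient addresses and subject are placeholders, and plain SMTP on port 25 is my assumption; the relay may need different TLS/auth settings, as noted later in the chat.)

```python
import smtplib
from email.message import EmailMessage

RELAY = "mail.tools.wmflabs.org"  # Toolforge SMTP relay quoted from the docs above
PORT = 25                         # assumption: plain SMTP; adjust if TLS/auth is required


def build_alert(sender: str, recipient: str, subject: str, body: str) -> EmailMessage:
    """Assemble an alert message (addresses here are placeholders)."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg


def send_alert(msg: EmailMessage) -> None:
    """Hand the message to the external relay (no local MTA needed)."""
    with smtplib.SMTP(RELAY, PORT) as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    alert = build_alert(
        "tools.mybot@tools.wmflabs.org",  # placeholder sender
        "maintainer@example.org",         # placeholder recipient
        "Bot blocked",
        "The bot could not edit: blocked on enwiki.",
    )
    send_alert(alert)
```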
see that example there
[01:41:39] replace smtp server with mail.tools.wmflabs.org
[01:42:26] you might or might not need the other options for TLS and auth
[01:43:18] or swaks as in the second example
[03:30:28] I created the `signatures` tool today, and I didn't get a replica.my.cnf with it
[03:41:39] https://phabricator.wikimedia.org/T247654
[04:31:25] how long did you wait between creating it and trying to log in?
[04:43:30] I cloned the repo about 2 hours later
[04:44:51] which is more than I usually do
[11:32:01] hey guys, can someone help me update my OAuth consumer redirect url to match our new tools urls?
[11:33:36] (or disregard it, found another issue)
[11:47:04] tonythomas: FWIW, you can't update oauth stuff. They're basically immutable
[11:50:51] Reedy: alright. Wonder how things are going to look after we change our domains to toolname.toolforge.com though
[11:51:12] Create a new oauth consumer, get it re-approved
[11:51:48] I think at lease some of the cloud team can approve them, so it should be possible to do it as part of the migration workflow
[11:51:54] *at least
[11:55:21] I think so. Also, just asking, the new urls are not ready yet, right?
[11:57:39] AFAIK not
[11:57:57] We should definitely make sure Bryan/similar is aware to make sure that's part of the migration checklist for tools
[11:58:09] Especially if it's gonna be on demand task requests/similar in the first instances
[11:58:52] tonythomas, you mean toolname.toolforge.org?
[11:59:05] Krenair: yes.
[11:59:17] haha, not .com of course.
[12:00:20] Reedy: true. I see https://phabricator.wikimedia.org/T244473 though
[12:01:04] You can have multiple oauth providers enabled...
[17:32:28] Is there someone around that can help with my missing replica.my.cnf? https://phabricator.wikimedia.org/T247654
[17:39:24] hmm...
IIRC this type of thing is beyond my permissions
[17:39:35] AntiComposite: should be good now, I had to restart the maintain users process
[17:39:44] AntiComposite, I see a replica.my.cnf file
[17:39:54] ah
[17:40:08] !log admin restart maintain-dbusers on labstore1004 T247654
[17:40:20] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Admin/SAL
[17:40:20] T247654: New tool `signatures` did not include a replica.my.cnf - https://phabricator.wikimedia.org/T247654
[17:43:43] looks to be working now, thank you
[18:03:52] !log tools.zppixbot-test Finish setup of k8s, tool is now functioning as expected
[18:03:53] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.zppixbot-test/SAL
[21:13:32] hi
[21:13:58] Does someone know an easy way to get all the image titles in a category in python?
[21:14:23] I'm just a bit tired to do everything, I just want to copy and paste
[21:15:13] maybe https://gist.github.com/hay/665734/6262e8a3c02af714f07a518a8f3eac2bbc12f862 ?
[21:15:43] I'm using python 3.8
[21:17:59] like https://commons.wikimedia.org/w/api.php?action=query&prop=images&titles=Commons:Quality%20images&imlimit=500&format=json&utf8
[21:26:12] hi
[21:27:43] hi
[21:27:55] let's see if we can find the right API query for this
[21:28:10] we want media, specifically images, in a category
[21:28:54] https://commons.wikimedia.org/w/api.php?action=query&list=categorymembers&cmtitle=Category:Physics&cmtype=file
[21:29:28] you can probably determine whether it's an image rather than a video or something by looking at the extension?
[21:29:53] I think it would be better to add the filter before
[21:30:00] in the api request
[21:30:16] oh I know
[21:32:28] https://commons.wikimedia.org/w/api.php?action=query&generator=categorymembers&gcmtitle=Category:Physics&gcmtype=file&prop=imageinfo&iiprop=mime
[21:33:10] or https://commons.wikimedia.org/w/api.php?action=query&generator=categorymembers&gcmtitle=Category:Physics&gcmtype=file&prop=imageinfo&iiprop=mediatype
[21:33:38] i can see videos in the list
[21:33:46] yeah you'd just filter that out
[21:34:02] but is there some way to filter in the request?
[21:34:07] why would you want to?
[21:34:22] isn't filtering the results enough?
[21:34:27] i prefer to get a list already filtered from the server
[21:34:46] off the top of my head I don't know if the MW API can do that
[21:35:34] I remember a way to add filters
[21:42:08] Wilfredor, why would you prefer to get the list pre-filtered instead of filtering it yourself?
[21:43:57] because the server doesn't need to return a big list, it's faster for the server and for me because i don't need to filter it
[21:55:06] realistically unless you try to pick out all images in a category intended for videos, I'd expect the majority to be images
[21:55:44] it might make a small difference but is it worth the effort
[22:03:31] re: your comment about resource usage elsewhere, as long as it's not being directly run on the login server and doesn't require custom installations, you might not need to request a separate instance. (Wilfredor)
[22:04:30] but if running interactive shells doing the requests, use the dev server instead of login
[22:07:22] dev server is more secure
[22:18:53] https://commons.wikimedia.org/w/api.php?action=query&format=json&list=categorymembers&imlimit=500&cmtitle=Category%3AQuality_images_by_Wilfredor
[22:19:11] why is it returning just 10 items?
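(Editor's note: the generator=categorymembers query above can be sketched in Python with only the standard library, filtering videos out client-side on the imageinfo mediatype as suggested in the chat. A sketch only — treating BITMAP and DRAWING as "images" is my assumption, and Category:Physics is just the example category from the discussion.)

```python
import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

IMAGE_TYPES = {"BITMAP", "DRAWING"}  # assumption: DRAWING covers SVGs; drop it if you only want bitmaps


def categorymembers_params(category):
    """Build the generator=categorymembers query used in the chat (mediatype via imageinfo)."""
    return {
        "action": "query",
        "format": "json",
        "generator": "categorymembers",
        "gcmtitle": category,
        "gcmtype": "file",
        "gcmlimit": "max",
        "prop": "imageinfo",
        "iiprop": "mediatype",
    }


def only_images(pages):
    """Keep titles whose mediatype is an image type, i.e. drop VIDEO, AUDIO, etc."""
    titles = []
    for page in pages.values():
        info = page.get("imageinfo", [])
        if info and info[0].get("mediatype") in IMAGE_TYPES:
            titles.append(page["title"])
    return sorted(titles)


def fetch_image_titles(category):
    """Run one API request and return the filtered titles (first batch only)."""
    url = API + "?" + urllib.parse.urlencode(categorymembers_params(category))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return only_images(data.get("query", {}).get("pages", {}))


if __name__ == "__main__":
    print(fetch_image_titles("Category:Physics"))
```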
[22:21:04] because you're passing imlimit
[22:21:16] you need to pass cmlimit as you're dealing with categorymembers
[22:21:23] https://commons.wikimedia.org/w/api.php?action=query&format=json&list=categorymembers&cmlimit=500&cmtitle=Category%3AQuality_images_by_Wilfredor
[22:27:36] oh ok thanks
[22:30:23] Wilfredor, what do you mean by "dev server is more secure"?
[22:32:29] i'm trying to get more than 500 results, I think that performing this request directly on the server side could return a result faster because it is on the same network
[22:36:14] Wilfredor: are you running it on labs?
[22:36:20] you could query the db directly
[22:36:52] i can't run it on labs because of high CPU consumption
[22:38:57] you can pass cmlimit=max and it'll give you the max number of results you're allowed as a user
[22:39:02] Any more than that, you need to paginate
[22:39:17] maybe a continue?
[22:39:44] max is 500
[22:41:25] i'm looking for how to do a pagination implementation in python
[22:46:31] https://stackoverflow.com/questions/56101933/wikipedia-all-pages-api-after-30-requests-returns-same-pages-titles
[22:54:11] that script is quite dangerous, it could generate a DDoS attack
[22:59:24] Not really
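(Editor's note: the cmlimit=max plus continuation approach discussed above can be sketched like this — the MediaWiki API returns a "continue" object with each partial result, and you merge it back into the next request until the server stops sending one. The query parameters mirror the example URLs in the chat; the category name is a placeholder.)

```python
import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"


def next_request(params, continue_block):
    """Merge the API's 'continue' block (if any) into the base query params."""
    merged = dict(params)
    if continue_block:
        merged.update(continue_block)
    return merged


def all_category_members(category):
    """Yield every file title in the category, following 'continue' past the 500-result cap."""
    base = {
        "action": "query",
        "format": "json",
        "list": "categorymembers",
        "cmtitle": category,
        "cmtype": "file",
        "cmlimit": "max",  # 500 for normal users, per the discussion above
    }
    continue_block = None
    while True:
        url = API + "?" + urllib.parse.urlencode(next_request(base, continue_block))
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        for member in data.get("query", {}).get("categorymembers", []):
            yield member["title"]
        continue_block = data.get("continue")
        if not continue_block:
            break


if __name__ == "__main__":
    for title in all_category_members("Category:Quality_images_by_Wilfredor"):
        print(title)
```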