[05:25:43] !log tools restarted tools-sgeexec-0906 and tools-sgeexec-0904; they seem better now but I have not repooled them yet
[05:25:44] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[10:08:58] hi hi Krenair
[10:09:08] i guess you're here? :P you're always here!
[10:09:09] hi
[10:09:19] I'm sort of here
[10:09:23] what's up?
[10:09:24] I was just looking at https://phabricator.wikimedia.org/T214278, is there a way to figure out which backend that tool is using?
[10:10:03] No idea, I shifted it into the toolforge subproject, I know little about the inner workings of tools
[10:10:08] it looks like the load on the tool is just too high? I was wondering if switching to the k8s backend might help / the possibility of scaling up once there
[10:10:11] Krenair: okay!
[10:10:42] arturo: ^^? :D
[10:11:30] the backend can be found in the tool manifest file
[10:24:16] arturo: thanks for the review on https://gerrit.wikimedia.org/r/c/operations/puppet/+/485193 ! I have a follow-up question: ATM the script autodetects the project via a file in /etc, but I can't find the same for the region, so eqiad1-r is hardcoded now. Do you know if it is possible to fetch the region too?
[10:27:05] godog: I think you can 'list' regions
[10:27:05] https://docs.openstack.org/python-keystoneclient/latest/api/keystoneclient.v3.html#keystoneclient.v3.regions.RegionManager
[10:30:43] arturo: i guess that can't be found publicly?
[10:30:57] arturo: nice, another question while we're at it :) if a project has instances in multiple regions, I take it instances can communicate across regions? the reason I'm asking is that an improvement would be to list instances in all regions and merge the results
[10:32:11] godog: right now yes, instances can communicate across regions. An instance can only exist in one region. Instance names are unique across all regions.
[10:33:00] addshore: it depends on the tool. Some tools have their files in github, etc. I'm not sure about the general case. I don't think we have a 'toolforge file browser' or 'tool details viewer' or something like that
[10:33:04] sounds good, thanks again arturo
[10:33:37] arturo: they can communicate, but they will need a security group rule opening the ports, right?
[10:34:06] gtirloni: it depends on which project/instance we are talking about
[10:34:21] I only experienced it with toolforge
[10:38:42] Instance names *should* be unique across all regions; I've seen violations of this which likely lead to crazy behaviour
[10:51:07] !log wikidata-dev wikidata-constraints update MediaWiki and extensions to current master
[10:51:09] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikidata-dev/SAL
[10:52:36] !log wikidata-dev wikidata-constraints remove WikibaseQuality extension (no longer required for WikibaseQualityConstraints)
[10:52:37] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikidata-dev/SAL
[10:53:45] !log wikidata-dev wikidata-constraints remove WikibaseImport extension (WikibaseQualityConstraints can now import entities by itself)
[10:53:46] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikidata-dev/SAL
[11:04:44] arturo: ok, I think this topic is ready for review when you can: https://gerrit.wikimedia.org/r/q/topic:%22bug%252FT214058%22+(status:open%20OR%20status:merged)
[11:05:00] gtirloni: too ^ if you can/want, I'd appreciate it
[11:05:20] * arturo reviewing
[11:08:36] volans would tell you to use .format() instead of % for string formatting :-P
[11:09:39] other than that, if the logic itself of the code is right (didn't check), LGTM
[11:10:39] heheh fair enough, old habits die hard
[11:10:45] I'll fix the %
[11:15:48] in the form of another review, that is: https://gerrit.wikimedia.org/r/c/operations/puppet/+/485623
[11:45:24] ok, almost there, python3-keystoneclient needs to be pulled in from jessie-backports, fixing
[12:20:30] godog: on which operating system?
[13:45:50] arturo: on jessie, stretch is fine
[13:46:04] cool
[13:46:51] I wonder if you should require in puppet the proper ::clientpackages puppet class
[13:47:03] (profile)
[13:59:20] no idea, might end up being necessary, but for now I've fixed it in prometheus::wmcs_scripts
[14:48:45] Where is the wikimedia_nets variable defined for rate limiting purposes, and are the new Tools machines in it? https://en.wikipedia.org/?diff=879480043&oldid=879465978
[14:53:32] Nemo_bis, that would be puppet.git
[14:54:29] modules/varnish/templates/vcl/wikimedia-common.inc.vcl.erb does acl wikimedia_nets {
[14:54:29] <% scope.lookupvar('::network::constants::aggregate_networks').each do |entry|
[14:54:46] which comes from modules/network/data/data.yaml
[14:55:56] network::aggregate_networks for production does not include 172.16.0.0/12
[14:56:43] just the old 10/8 range
[15:00:39] that's it then
[16:05:31] Nemo_bis: please open a phab task
[16:14:32] Nemo_bis: or comment in https://phabricator.wikimedia.org/T213475#4871816 if it's just about the rate limits. But depending on the application, overall it might be better to just respect the rate limits
[16:14:48] I would at least advise so
[16:16:13] Too late: https://phabricator.wikimedia.org/T214313
[16:17:33] merge in T213475 as a duplicate?
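As an aside on the review comment at 11:08 about preferring `.format()` over `%`: both styles produce the same result, which is why this is purely a style cleanup. A minimal sketch (the instance and region names are made up for illustration):

```python
# Old-style % interpolation vs. str.format(), the change requested in review.
# The values below are hypothetical examples, not taken from the log.
instance = "tools-sgeexec-0906"
region = "eqiad1-r"

old_style = "instance %s is in region %s" % (instance, region)
new_style = "instance {} is in region {}".format(instance, region)

# Both render identically; only the style differs.
assert old_style == new_style
```

The follow-up patch mentioned at 11:15 (Gerrit change 485623) is exactly this kind of mechanical substitution.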
[16:17:34] T213475: Difficulties to create offline version of Wikipedia because of HTTP 429 response - https://phabricator.wikimedia.org/T213475
[16:17:42] it's the exact same issue
[16:21:48] !log math T204509 shutdown ubuntu images: drmf-beta, drmf, mathoid2, math-ru
[16:21:51] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Math/SAL
[16:21:51] T204509: cloudvps: math project trusty deprecation - https://phabricator.wikimedia.org/T204509
[16:22:17] !log math T204509 shutdown ubuntu *VM instances*: drmf-beta, drmf, mathoid2, math-ru
[16:22:19] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Math/SAL
[16:30:50] !log getstarted T204508 shutdown ubuntu VM: vrt
[16:30:53] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Getstarted/SAL
[16:30:54] T204508: cloudvps: getstarted project trusty deprecation - https://phabricator.wikimedia.org/T204508
[16:34:27] !log dumps T204503 shutdown ubuntu VM: bugzilla
[16:34:29] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Dumps/SAL
[16:34:29] T204503: cloudvps: dumps project trusty deprecation - https://phabricator.wikimedia.org/T204503
[16:35:14] wikibugs: test
[16:35:31] I just filed a task, but it looks like the bot isn't reporting
[16:36:44] !log design T204502 shutdown ubuntu VMs: design-lsg3, living-style-guide, lsg-01
[16:36:47] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Design/SAL
[16:36:47] T204502: cloudvps: design project trusty deprecation - https://phabricator.wikimedia.org/T204502
[16:38:03] wb2-phab tools.wikibu Rr <-- looks good
[16:40:56] !log T204695 shutdown ubuntu VM: wikidataconcepts
[16:40:56] gtirloni: Unknown project "T204695"
[16:40:57] T204695: cloudvps: wikidataconcepts project trusty deprecation - https://phabricator.wikimedia.org/T204695
[16:41:05] !log wikidataconcepts T204695 shutdown ubuntu VM: wikidataconcepts
[16:41:07] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikidataconcepts/SAL
[16:43:11] !log osmit T204527 shutdown ubuntu VM instances: osmit-due osmit-tre osm-serv
[16:43:13] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Osmit/SAL
[16:43:13] T204527: cloudvps: osmit project trusty deprecation - https://phabricator.wikimedia.org/T204527
[16:43:23] !log queryrapi T204683 shutdown ubuntu VMs: queryr-db-01, queryr-api-01
[16:43:25] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Queryrapi/SAL
[16:43:26] T204683: cloudvps: queryrapi project trusty deprecation - https://phabricator.wikimedia.org/T204683
[16:47:39] !log T204703 wildcat shutdown ubuntu VM: danny-b
[16:47:39] arturo: Unknown project "T204703"
[16:47:40] T204703: cloudvps: wildcat project trusty deprecation - https://phabricator.wikimedia.org/T204703
[16:47:49] !log wildcat T204703 shutdown ubuntu VM: danny-b
[16:47:50] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wildcat/SAL
[16:50:08] !log dumps T204503 shutdown ubuntu VM: dumps-stat
[16:50:11] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Dumps/SAL
[16:50:11] T204503: cloudvps: dumps project trusty deprecation - https://phabricator.wikimedia.org/T204503
[16:50:16] !log dumps T204503 shutdown ubuntu VM: dumps-stats*
[16:50:18] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Dumps/SAL
[18:29:21] !log admin installed libguestfs-tools on cloudvirt1021
[18:29:23] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Admin/SAL
[18:50:40] uhm, is there no longer any way to suspend a job on the grid?
[18:55:23] Apparently not on the Stretch grid, since qmod can't be used, according to bstorm_
[19:02:56] zhuyifei1999_: I'd like to answer your question wrt tools.meta, but I'm not the owner of the tool and I don't have access to it to know how it works in the background
[19:03:13] why is wikibugs not reporting on this channel?
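The advice earlier in the log to "just respect the rate limits" (re T213475, the HTTP 429 responses hit when building offline copies of Wikipedia) usually amounts to honoring the `Retry-After` response header and backing off. A minimal, generic sketch of that pattern; the `fetch` callable and its response interface (`.status_code`, `.headers`, like a `requests.Response`) are assumptions for illustration, not anything from the log:

```python
import time

def fetch_with_backoff(fetch, max_retries=5):
    """Call fetch() repeatedly, backing off on HTTP 429 responses.

    fetch is any zero-argument callable returning an object with
    .status_code and .headers (a requests.Response fits this shape).
    """
    for attempt in range(max_retries):
        resp = fetch()
        if resp.status_code != 429:
            return resp
        # Prefer the server-supplied Retry-After delay; otherwise
        # fall back to simple exponential backoff (1s, 2s, 4s, ...).
        delay = float(resp.headers.get("Retry-After", 2 ** attempt))
        time.sleep(delay)
    raise RuntimeError("giving up after repeated 429 responses")
```

This is the client-side counterpart of the `wikimedia_nets` ACL discussion above: traffic outside the exempted networks is expected to throttle itself rather than retry immediately.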
[19:03:30] it's in -cloud-feed
[19:03:32] valhallasw`cloud: ^^ :)
[19:03:35] Hm, that means restarting a job (qmod -rj) is also no longer possible.
[19:03:45] aaah, idk about -feed
[19:05:20] hooray for hiding bots away where no one will find them
[19:05:39] heh
[19:06:06] Hauskatze: did you find it?
[19:06:08] Krenair: ;)
[19:06:33] check /topic
[19:08:18] I don't feel particularly strongly about wikibugs in #-feed. If anyone wants to submit a patch, I'll +2 and merge it in a sec
[19:09:22] avgas: there was some restructuring in the new grid and, since it's a new thing, we're collecting feedback. please open a phab task if you will and we can discuss it there. I wasn't aware this wasn't working like that anymore, and I'm sure others will be interested too
[19:10:38] * zhuyifei1999_ likes it in -feed better
[19:15:32] gtirloni: the task regarding qmod is https://phabricator.wikimedia.org/T213656
[19:15:47] avgas: ^
[19:16:17] zhuyifei1999_: cool, thanks! I invite people to give feedback there
[19:16:25] k
[19:47:20] gtirloni, zhuyifei1999_, thanks!
[20:33:42] Hello! I'm new here and have a question about ssh access to the bastion host(s), as described here: https://wikitech.wikimedia.org/wiki/Help:Access -- my account needs to be a member of the "Shell users" group, and it isn't. What should I do?
[20:38:15] ivoras, I believe that comes automatically when you get added to a project?
[20:38:16] ivoras: the shell group information there is outdated. I'll fix that up.
[20:39:00] ivoras: the bigger question is what instance were you expecting to be able to ssh to? Are you a member of a Cloud VPS project?
[20:41:20] looks like they're not
[20:42:00] bd808: yeah, that's the next question I had. Basically, I started all this because I googled around and found that's the way to get access to a db replica. The host I *think* I need to access is login.tools.wmflabs.org (in the above doc: ssh -i private_key user@login.tools.wmflabs.org). But I'm not sure.
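For readers following the ssh discussion above: the `ssh -i private_key user@login.tools.wmflabs.org` command from Help:Access can be captured in a client config stanza instead of typed each time. A hypothetical `~/.ssh/config` fragment; the username and key path are placeholders, and only the bastion hostname comes from the log:

```
# ~/.ssh/config -- hypothetical example; replace the user and key path
Host toolforge
    HostName login.tools.wmflabs.org
    User your-shell-username
    IdentityFile ~/.ssh/id_rsa
```

With this in place, `ssh toolforge` is equivalent to the full command quoted in the log (assuming the account has been granted access as discussed below at 20:55).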
[20:42:59] https://wikitech.wikimedia.org/wiki/Help:Getting_Started#Get_started_with_Toolforge
[20:52:21] bd808: Thanks, I've submitted a request! Do I need to create a "tool" if I just occasionally need replica db access?
[20:55:36] ivoras: no, once your access to toolforge is approved, your user account will have its own replica.my.cnf file that will allow you to connect to the Wiki Replica database servers. You may also want to look at https://quarry.wmflabs.org/ to see if it actually gives you all the database access you need.
[20:56:23] Thanks!
[20:58:17] ivoras: I just read your access request. If the data you are looking for is the page contents, you will not find that in our replica databases. The replicas only provide "metadata" about activities performed on the wikis. Page contents can be fetched using the Action API or from https://dumps.wikimedia.org/
[20:59:34] bd808: ah, thanks! Then I guess I don't need the access :)
[21:00:52] so, there's no way to get database-like access to page contents?
[21:01:06] there is not
[21:02:55] not from labs
[21:06:32] Krenair: do you mean there are other ways? What I'd need is a way to get a "full copy" and, after that, "deltas": differences across some time periods. I was hoping that the db replicas are actual replicas, so I could pull content at a point in time and then query for updated content periodically.
[21:06:39] the best way to query page contents imo: Special:Search with insource:/a regex/
[21:07:12] ^ doesn't apply if you need old revisions
[21:08:53] ivoras: how many pages are you processing? I'd suggest either dumps (if you are processing a lot of pages but don't need the newest data) or the api (if you are processing fewer pages and need the newest data)
[21:09:17] ivoras, I mean that the page contents are ultimately stored in production databases known as ExternalStore, but they are not open to the public
[21:09:41] the db replicas are actual replicas
[21:09:44] but not of ExternalStore
[21:09:51] just of the s* shards containing MW's metadata
[21:10:40] the MW API will give you revision contents and diffs
[21:10:50] (for most revisions)
[21:13:21] Ok. It seems like I'll need to set up automated pull & restore from dumps. Since I'm probably not the first one, are there any scripts which do that?
[21:14:25] https://meta.wikimedia.org/wiki/Data_dumps#Existing_tools ?
[21:33:39] !log telnet shutdown ubuntu VM: telnet2
[21:33:41] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Telnet/SAL
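The "MW API will give you revision contents" advice above translates to an `action=query` / `prop=revisions` request against the Action API. A minimal sketch; the endpoint is the real public one for English Wikipedia, the page title is just an example, and the network call itself is left commented since it needs the third-party `requests` package:

```python
# Fetching the current wikitext of a page via the MediaWiki Action API,
# the alternative to the Wiki Replicas suggested in the log above.

API_URL = "https://en.wikipedia.org/w/api.php"

def revision_query_params(title):
    """Build Action API parameters requesting the latest revision's wikitext."""
    return {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "content",
        "rvslots": "main",  # slot-aware form, available since MediaWiki 1.32
    }

# Actual fetch (requires network and the requests package):
# import requests
# data = requests.get(API_URL, params=revision_query_params("Wiki")).json()
```

For bulk "full copy plus deltas" use like ivoras describes, the dumps route still wins; the API is better suited to fetching a modest number of specific, up-to-date pages.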