[11:37:10] Can't we add external scripts on projects? I am getting warnings
[11:39:42] gyan: what is an external script? are you talking about CloudVPS projects? which warnings are you getting?
[11:40:46] [Report Only] Refused to load the script 'https://code.jquery.com/jquery-3.3.1.min.js' because it violates the following Content Security Policy directive: "default-src 'self' 'unsafe-eval' 'unsafe-inline' blob: data: filesystem: mediastream: wikibooks.org *.wikibooks.org wikidata.org *.wikidata.org wikimedia.org *.wikimedia.org wikinews.org *.wikinews.org wikipedia.org *.wikipedia.org wikiquote.org *.wikiquote.org
[11:40:46] wikisource.org *.wikisource.org wikiversity.org *.wikiversity.org wikivoyage.org *.wikivoyage.org wiktionary.org *.wiktionary.org *.wmflabs.org wikimediafoundation.org mediawiki.org *.mediawiki.org wss://tools.wmflabs.org". Note that 'script-src' was not explicitly set, so 'default-src' is used as a fallback.
[11:41:16] I am loading cdn scripts
[11:41:20] gyan: IIRC the new CSP policy is not to allow external resources, no
[11:41:34] https://tools.wmflabs.org/gyan/
[11:41:45] Please give me the link of the mediawiki cdn
[11:41:46] but I'm not totally sure, feel free to reach out to the security folks for advice
[11:42:17] There is a cdn for mediawiki, I believe
[11:42:39] https://tools.wmflabs.org/cdnjs/
[11:42:42] I got it
[11:43:37] Question 2: How to enable mod_rewrite on kubernetes using php7.2
[11:44:52] * arturo has no idea
[13:27:43] !log mwoffliner migrating project to eqiad1
[13:27:45] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Mwoffliner/SAL
[15:00:51] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @Thiemo_WMDE & @chiborg - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:08:44] !log tools.wikitext-deprecation deleted k8s scheduledjob that was inactive since 2017 (T211772)
[15:08:47] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.wikitext-deprecation/SAL
[15:08:47] T211772: wikitext-deprecation: Cannot determine if wikitext-deprecation/wikitext-deprecation-generator needs to be started - https://phabricator.wikimedia.org/T211772
[15:23:03] !log tools.replag deleted broken k8s deployment 'test'
[15:23:04] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.replag/SAL
[15:23:17] !log tools.ebraminio-dev deleted broken k8s deployment 'hello-minikube'
[15:23:18] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.ebraminio-dev/SAL
[23:23:46] !help I'm working on https://phabricator.wikimedia.org/T208890 One aspect of that is determining the best way to store and deliver files for download to users through an app on Cloud VPS. Our app will generate the files (which are JSON) but we aren't sure if they should be saved to disk or some other storage mechanism. Ideally, we'd love for them to live for some period of time and possibly be URL addressable.
[23:23:46] aezell: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[23:25:32] aezell: I'm not sure exactly what you are asking. Are you wondering if there is a ready-made Cloud Services project for serving these files?
[23:26:20] That would be ideal. If not, some guidance on whether writing to a VPS disk is acceptable would be helpful also.
[23:26:59] We are sensitive to filling up disks or causing undue I/O load on the VPS platform. So, I'm looking for best practices from that perspective.
[23:27:39] I think I'm going to need a lot more context here to be helpful
[23:27:50] Fair enough :)
[23:28:18] Hey, is grafana-labs.wikimedia.org for beta cluster only, or can other VPS projects write to it too?
[23:28:35] aezell: something... somewhere... will make a json file and then ... something... somewhere... will make it downloadable via a url?
[23:29:55] MaxSem: it's not a deployment-prep thing, it's a general Cloud VPS thing
[23:30:27] @bd808 We will have an application (on Toolforge, not VPS, I was mistaken there) that will allow users to request the download of a file full of data. The application will gather this data from the replicas and create the file. Then, we need to provide a link/URL to the user for them to download the file via their browser.
[23:31:12] aezell: and I assume this is not a direct download from RAM because the process of collecting the data will take a long time?
[23:32:16] @bd808 Correct. While we've not built the tool yet, we anticipate that gathering the data from a multitude of tables would take a long time. This tool may create even larger files in the future, so we'd like to not rely on streaming out of memory.
[23:32:57] bd808: If we have to, we can do RAM, especially if we all agree a Toolforge app could perform correctly there. This should not be a very high traffic tool.
[23:33:30] Writing things to a tool's $HOME on Toolforge is not cheap or fast because it is NFS backed, but there is no other durable file storage that a Toolforge tool has access to. There is ToolsDB, but MariaDB is neither cheap nor fast for writes either.
[23:34:23] The main concern I would have with a "click to create file and come back later to download" tool is that it find a reasonable way to clean up after itself and not leak generated files all over the place
[23:36:36] Yeah, we've considered tracking the downloads in a DB and using a cron to delete them after the fact.
[23:36:41] aezell: from the data description on T208636, you are going to have to have access to something beyond the wiki replicas as well, because the user's email address is not available there.
[23:36:41] T208636: Give users a download of their "User Data" - https://phabricator.wikimedia.org/T208636
[23:37:16] bd808: Cripes. You're right. I had forgotten about that. That's what a vacation will do to your brain.
[23:37:20] although I would assume you are going to also use OAuth to gate access to this so that random people can't get these reports on others?
[23:37:34] Correct.
[23:37:47] Ok, maybe I've asked prematurely.
[23:38:22] Let me circle back with the team and get some more info instead of floundering about wasting your time. I can hopefully come back with a better formulated question.
[23:39:14] aezell: ok :) I'm pretty sure the final answer will be "build what you want, but if it puts too much stress on NFS or ToolsDB we will ask you to fix it" ;)
[23:39:30] Got ya. ;)
[23:42:16] isn't toolforge inherently unsafe for private user data?
[23:42:52] Yeah, so now that I'm reviewing my notes, this will actually be part of MW core.
[23:43:01] It'll be a link on your Preferences page.
[23:43:11] I was all kinds of wrong.
[23:43:18] Sorry about that.
[23:45:04] you could build a toolforge tool that collects the data and just dumps it into the download as it proceeds; you'd have to be clever about the data format though
[23:47:10] IIRC, we decided to do this in core because it might be useful across a variety of wiki projects/installs. Also, we wanted to avoid having to be clever.
[23:48:04] I'm vaguely remembering the conversation that way.
[23:48:33] that makes sense, if the intent is to make wikis GDPR-compliant
[23:50:51] That may be part of future plans of which I'm unaware. This tool isn't *directly* aimed at that.
[23:52:14] Thanks bd808
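The CSP warning pasted at 11:40 means scripts may only be loaded from the allowlisted Wikimedia domains (which is why the cdnjs mirror at https://tools.wmflabs.org/cdnjs/ works while code.jquery.com does not). A minimal sketch of checking a script URL against that allowlist; the helper name is made up here, and the suffix list is abbreviated from the full policy quoted above:

```python
from urllib.parse import urlparse

# Hosts permitted by the default-src directive quoted in the log
# (abbreviated; the real policy lists every Wikimedia project domain).
ALLOWED_SUFFIXES = (
    "wikipedia.org", "wikimedia.org", "mediawiki.org",
    "wikidata.org", "wmflabs.org",
)

def csp_allows(url):
    """Return True if the script URL's host matches the CSP allowlist.

    A host matches if it equals an allowlisted domain or is a subdomain
    of one (the policy lists both 'example.org' and '*.example.org').
    """
    host = urlparse(url).hostname or ""
    return any(host == s or host.endswith("." + s) for s in ALLOWED_SUFFIXES)
```

So `csp_allows("https://code.jquery.com/jquery-3.3.1.min.js")` is False, which is exactly the load the browser refused, while any URL under https://tools.wmflabs.org/cdnjs/ passes because it is served from *.wmflabs.org.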
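bd808's 23:45 suggestion of a tool that "dumps it into the download as it proceeds" can be sketched with a generator that emits JSON Lines, so the full export never has to sit in RAM or on NFS at once; the function name and row source are hypothetical, not from the log:

```python
import json

def stream_user_data(rows):
    """Yield one JSON Lines record per row as it is fetched.

    Because this is a generator, each chunk can be written to the HTTP
    response as soon as the replica query produces it, instead of
    buffering the whole export in memory first.
    """
    for row in rows:
        yield json.dumps(row) + "\n"
```

A web framework that accepts iterables as response bodies (Flask's `Response`, for example) can consume the generator chunk by chunk, which is the "be clever about the data format" part: JSON Lines streams naturally, whereas a single top-level JSON object would not.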
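The cleanup concern raised at 23:34 (a "come back later to download" tool leaking generated files) and the cron-based deletion aezell mentions at 23:36 can be sketched as a small purge job; the `.json` suffix, directory layout, and one-week retention window are assumptions for illustration:

```python
import os
import time

RETENTION_SECONDS = 7 * 24 * 3600  # assumed one-week retention window

def purge_stale_exports(directory, now=None):
    """Delete generated .json exports older than the retention window.

    Returns the names of the files removed, e.g. for logging from a
    cron job so the tool can show what it cleaned up.
    """
    now = time.time() if now is None else now
    removed = []
    for name in os.listdir(directory):
        if not name.endswith(".json"):
            continue
        path = os.path.join(directory, name)
        if now - os.path.getmtime(path) > RETENTION_SECONDS:
            os.remove(path)
            removed.append(name)
    return removed
```

Run from cron (or a k8s CronJob on Toolforge), this keeps the tool from leaking files; tracking each download in a DB row, as discussed above, would additionally let the purge job invalidate the corresponding download URLs.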