[00:22:09] I like how we still refer to MariaDB as MySQL internally
[11:54:56] I created a new tool repo on diffusion using the web interface. how do I grant the tool access to the repo, or is the standard procedure that the tool has only read access to the repo?
[12:55:56] Can someone fix the link in https://tools.wmflabs.org for creating a new tool? https://toolsadmin.wikimedia.org/tools/create/ is returning 403
[12:56:02] If you report this error, please include the details below.
[12:56:02] Request ID
[12:56:02] 6174d9656ea74b01875786d45b462f1
[12:56:45] lol, session expired I suppose
[12:56:56] needs to be handled better
[12:57:21] at least redirect to login page?
[13:04:05] Could you file a bug report in Phabricator?
[13:04:09] Jyothis, ^
[13:08:13] andre__: will do.
[13:12:56] andre__: https://phabricator.wikimedia.org/T192450
[13:13:44] thanks
[13:59:43] jynus: it's returning 403 because you need to be logged in to toolsadmin before you can create a tool
[13:59:51] jynus: sorry, autocomplete fail
[14:44:04] Afternoon
[14:44:10] Is there a lag on replication?
[14:45:09] ShakespeareFan00: https://grafana.wikimedia.org/dashboard/db/mysql-replication-lag?panelId=4&fullscreen&orgId=1&from=now-7d&to=now&var-dc=eqiad%20prometheus%2Fops
[14:45:14] if you mean wikireplicas, yes
[14:45:47] https://tools.wmflabs.org/replag/
[14:47:02] physikerwelt: what do you mean by the tool having access to the repo?
you as the maintainer should have read/write access to the repo
[14:51:19] jynus: Okay
[14:51:30] Just some Quarry queries I use were not updating
[14:52:00] you can programmatically query the heartbeat table to know the delay
[14:52:22] also some servers are more delayed than others; probably quarry is on the slow ones right now
[14:53:05] Fair enough
[14:53:14] I am used to reports only updating weekly
[14:53:32] from when the replicas were not done as often
[14:54:49] this is important production maintenance that will change some important core tables (users and revision)
[14:55:26] because we have no redundancy, lag cannot be avoided
[14:55:28] Okay, will I need to change my queries?
[14:55:40] not yet, but maybe in the future
[14:55:47] Okay
[14:55:56] check announcements done and to be done on cloud and wikitech
[14:56:10] about work on comments restructuring and user restructuring
[14:56:35] on the bright side, some queries may be faster
[14:56:45] and renames will be instant for wikis
[14:57:22] Renames as in file renames or as in users?
[14:57:29] user renames
[14:57:33] Ah Okay
[14:57:41] And hopefully account merges as well?
[14:57:52] not sure about that
[14:58:18] check work to be done at
[14:58:45] https://phabricator.wikimedia.org/T33863
[14:59:04] and https://phabricator.wikimedia.org/T167246
[15:00:07] The current hope by the Cloud Services team is that we will be able to keep a backwards-compatibility layer in the Wiki replica views for the MediaWiki schema changes. We believe that it will be very difficult to reach all users of the databases with instructions on how to use the new tables in a reasonable amount of time.
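The heartbeat check mentioned at [14:52:00] can be sketched roughly as below. This assumes the `heartbeat_p.heartbeat` view that the Wiki replicas expose (with `shard` and `lag` columns) and the standard `~/replica.my.cnf` credentials file on Toolforge; treat the hostname as an example, not a prescription.

```shell
# The lag query itself; heartbeat_p.heartbeat is the view the Wiki replicas
# expose with per-shard replication delay (column names assumed from that view).
QUERY='SELECT shard, lag FROM heartbeat_p.heartbeat ORDER BY lag DESC;'

# On Toolforge you would run it against a replica host, e.g.:
#   mysql --defaults-file="$HOME/replica.my.cnf" \
#         -h enwiki.analytics.db.svc.eqiad.wmflabs -e "$QUERY"
echo "$QUERY"
```

This answers "is my server lagged?" directly from SQL, without leaving a Quarry or command-line session.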
[15:00:46] yeah, but still, using the real structure will be much easier and probably faster
[15:00:53] certainly
[15:00:58] bd808: Okay, thanks
[15:01:09] https://quarry.wmflabs.org/ShakespeareFan00
[15:01:17] My queries aren't that advanced
[15:02:04] in most cases, just an extra join may be needed, if that is not already done automatically
[15:05:01] it would be nice to pre-compute some of those queries on production regularly, so you don't even have to wait
[15:05:22] (get a table with those results "for free" on wikireplicas)
[15:06:14] Well, the important ones are the ones to do with images
[15:06:30] They don't reference the user name table at all
[15:06:45] yeah, so you may not be affected
[15:06:53] If there's a way to run them as automated queries
[15:07:15] there is not, but that is something we get asked from time to time
[15:07:17] I think SQL has 'views' or stored procedures?
[15:07:23] but not materialized
[15:07:32] so it is as if you run the SQL yourself
[15:07:51] Arguably on Quarry, it would be nice to be able to set an auto-update
[15:07:52] the idea is to set up something so users can submit queries, they are approved, and the results are regularly updated automatically
[15:08:00] (PS1) Andrew Bogott: Revert "Add Chicocvenancio's key for Cloud Services" [labs/private] - https://gerrit.wikimedia.org/r/427401
[15:08:08] (PS2) Andrew Bogott: Revert "Add Chicocvenancio's key for Cloud Services" [labs/private] - https://gerrit.wikimedia.org/r/427401
[15:08:29] there is nothing in place yet; this is just an idea of a solution with no implementation
[15:08:33] jynus: Worth raising a ticket
[15:08:37] ?
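The "extra join" mentioned at [15:02:04] can be illustrated with the user restructuring as the example. The column and table names below (`rev_user_text`, `rev_actor`, `actor_id`, `actor_name`) assume the pre- and post-migration MediaWiki schemas; the report itself is hypothetical.

```shell
# Before the user restructuring: the user name sat directly on revision.
OLD='SELECT rev_user_text, COUNT(*) FROM revision GROUP BY rev_user_text;'

# After: the same report needs one extra join through the actor table.
NEW='SELECT actor_name, COUNT(*)
FROM revision JOIN actor ON rev_actor = actor_id
GROUP BY actor_name;'
echo "$NEW"
```

As noted in the discussion, a compatibility view may hide this join for a while, but rewriting against the real structure is the durable fix.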
[15:08:42] I was about to suggest that :-)
[15:08:51] if you want to share ideas, please do
[15:09:09] Running regular queries should be well within the capabilities of any modern SQL backend
[15:09:28] well, that is possible on mariadb/mysql with events
[15:09:39] the new thing would be to centralize it so it only has to run once
[15:09:55] so not many people are doing the same kind of "reports" independently
[15:10:20] Indeed
[15:10:26] and doing inefficient queries independently, when it could be optimized and organized
[15:10:36] It would also be nice to have a localisation option
[15:10:52] So that you don't have duplicates for different language versions
[15:11:05] of what are essentially the same query
[15:11:37] also some queries may need access to private data
[15:11:47] but don't produce private data
[15:11:55] e.g. number of watchers of a page
[15:12:26] the main issue is there is not an automated way to propose those kinds of pregenerated reports
[15:13:15] Other than locally on a project
[15:13:27] WP:DBR on English Wikipedia for example
[15:14:10] (CR) Andrew Bogott: [V:+2 C:+2] Revert "Add Chicocvenancio's key for Cloud Services" [labs/private] - https://gerrit.wikimedia.org/r/427401 (owner: Andrew Bogott)
[15:14:21] but that is not really infrastructure-wide
[15:15:06] nor can the results be reused easily, I guess, as they could if they were additional tables on wikireplicas
[15:16:10] (and it should work for all projects)
[15:17:19] e.g.
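The MariaDB "events" mechanism mentioned at [15:09:28] could drive such pregenerated reports. A minimal sketch, with the caveat that the event name, the results table, and the report query below are all invented for illustration:

```shell
# A server-side scheduled report: MariaDB re-runs the SELECT daily and
# refreshes a results table that everyone can then read "for free".
# refresh_file_report and report_file_pages are hypothetical names;
# namespace 6 is the File: namespace in MediaWiki.
EVENT_SQL='CREATE EVENT refresh_file_report
  ON SCHEDULE EVERY 1 DAY
  DO REPLACE INTO report_file_pages
     SELECT page_id, page_title FROM page WHERE page_namespace = 6;'
echo "$EVENT_SQL"
```

The centralization idea in the chat is exactly this, plus a submission/approval workflow in front of it so the event runs once for everyone instead of per user.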
you cannot query things on top of that because it is not on quarry
[15:18:13] I think doing that on the server side is doable, we just need a workflow + code
[15:18:50] alternatively, an SQL endpoint could be set up where those queries are instant, with an analytics-focused engine
[15:19:09] so real-time analytics are possible
[15:39:09] There may be some interesting new possibilities once we have a replica of the Analytics data lake to query -- https://wikitech.wikimedia.org/wiki/Analytics/Data_Lake -- T169572
[15:39:10] T169572: Provide mediawiki history data to Cloud Services users - https://phabricator.wikimedia.org/T169572
[16:13:20] zhuyifei1999_: I mean after I "become toolname"
[16:20:05] physikerwelt: that's an LDAP user, and that user is only applicable to stuff that is attached to LDAP for authentication. Toolforge is one. However, the repos are hosted on Phabricator, and Phabricator users are a different entity that may be linked to an LDAP user
[16:20:55] usually when one writes to the repo it checks the ssh keys of the users with write access for authentication
[16:21:43] physikerwelt: I do not personally recommend making it trivial for your tool to push data back to the git repo.
[16:22:39] I would recommend managing git from your local computer and only pulling new changes as the tool, using an unauthenticated https connection
[16:23:14] yeah, what bd808 said
[16:23:33] if you make a hot fix on the tool and want to push it into the repo you can do that by creating a patch and using scp to pull it down to your local computer
[16:23:48] it's often easier to edit files locally anyway :)
[16:31:53] zhuyifei1999_ bd808 thank you. I think read access is the way to go anyhow. Do you think it's worthwhile to mention that in the readme?
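The hot-fix workflow bd808 and zhuyifei1999_ describe above (commit on the tool, export a patch, copy it down with scp, apply locally, push from your own machine) might look like this. The demo exercises only the patch-export half in a throwaway repo; the `become`, `scp`, and repo URL lines are placeholders, not real paths.

```shell
set -e
# Throwaway repo standing in for the tool's checkout on Toolforge.
repo=$(mktemp -d)
cd "$repo"
git init -q
echo 'quick fix' > app.py
git add app.py
git -c user.name=tool -c user.email=tool@example.org \
    commit -q -m 'hotfix on the tool'

# Export the hot fix as a mailbox patch instead of pushing from the tool:
git format-patch -1 HEAD -o "$repo/patches"

# From your local machine you would then (placeholder host/paths):
#   scp login.tools.wmflabs.org:/path/to/patches/0001-*.patch .
#   git am 0001-hotfix-on-the-tool.patch
# and push from there, so no write credentials ever live with the tool.
ls "$repo/patches"
```

The tool itself then only ever runs `git pull` against the public https URL, which is the read-only setup recommended in the chat.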
[16:33:37] physikerwelt: sure, the README and/or a Tool namespace page like https://wikitech.wikimedia.org/wiki/Tool:Stashbot
[16:34:06] it's a good idea to have some "how to run this service" document somewhere
[16:41:26] bd808: it's my first tool ever... so I started with https://wikitech.wikimedia.org/wiki/Help:Toolforge#Using_Diffusion but I was unsure about the write permissions, since a git user is configured for the tool further down on that page: https://wikitech.wikimedia.org/wiki/Help:Toolforge#Using_Github_or_other_external_service
[16:44:33] !help labtestnet2001 has apparently run out of space (in case it is not known)
[16:44:33] jynus: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[16:44:53] jynus: thanks, I'll look
[17:03:39] bd808: do we have a username/password for logstash-beta stored somewhere?
[17:07:53] DowagerCountess: yes. It is stored in ~root/secrets.txt on deployment-tin. If you are using any browser other than Chrome it should be documented in the authentication prompt. Chrome is "fancy" and decided that people do not need to see that text.
[17:08:15] * bd808 grumbles that he was forced to put a password back on this at all
[17:08:41] bd808: ok, I have access to deployment-tin, thanks
[17:08:51] I think something is broken though
[17:09:13] https://phabricator.wikimedia.org/T192471#4140018 <-- this is not the output I remember from showJobs
[17:36:58] !log quarry deployed 8eeeff8 to quarry-main-01
[17:36:59] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[19:35:21] !log quarry deployed c6cd55e to quarry-main-01
[19:35:22] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[21:26:18] hi! the newest entry in the database on labs (sql dewiki) seems to be almost 48h old. did i miss some change in using the db?
[21:28:07] seth: I think they were updating category collations on the production wikis and that might have caused some load, but maybe others know best
[21:33:15] I found some information about current replags in https://wm-bot.wmflabs.org/logs/%23wikimedia-cloud/20180417.txt
[21:34:21] ah, I did not know https://tools.wmflabs.org/replag/
[21:46:19] i've read https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database now. it seems like i had to change the db host from ${PROJECT}.labsdb to ${PROJECT}.analytics.db.svc.eqiad.wmflabs
[21:52:32] seth: yep, they've changed servers
[21:52:36] i missed that change: https://wikitech.wikimedia.org/w/index.php?title=Help%3AToolforge%2FDatabase&type=revision&diff=1775692&oldid=1774526
[21:52:54] web for "instant" and "cheap" queries, analytics for all others
[21:53:47] yes, i use queries for a constantly working bot, so no need for fast response.
[21:53:55] what is a good way not to miss such changes in the future?
[21:57:51] in former times (i.e. toolserver) there were newsletters that informed about breaking changes. but for a while i haven't gotten any messages concerning wmflabs. where do i have to sign up to be kept up-to-date?
[21:58:12] I was informed via cloud@lists.wikimedia.org
[21:58:29] formerly labs-l@lists.wikimedia.org
[21:59:47] how can i subscribe there?
[22:00:16] found it: https://lists.wikimedia.org/mailman/listinfo/cloud
[22:00:18] https://lists.wikimedia.org/mailman/listinfo/cloud
[22:00:21] :-)
[22:00:22] yep, that one
[22:12:05] thanks, cu!
[22:21:47] !log quarry +Framawiki project admin & Gerrit +2
[22:21:48] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[22:25:27] ooh, a new volunteer?
[22:27:17] yeah
[22:27:55] I asked if they want to maintain this, and yeah https://gerrit.wikimedia.org/r/#/c/427025/ :)
[23:12:57] zhuyifei1999_: thanks !
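The host change seth describes at [21:46:19], spelled out with dewiki as the example. The hostnames follow the pattern given in the chat (web for cheap/instant queries, analytics for everything else); `~/replica.my.cnf` is the standard Toolforge credentials file.

```shell
# Old alias (retired):           dewiki.labsdb
# New, for slow/batch queries:   dewiki.analytics.db.svc.eqiad.wmflabs
# New, for fast/cheap queries:   dewiki.web.db.svc.eqiad.wmflabs
HOST='dewiki.analytics.db.svc.eqiad.wmflabs'

# A bot that can tolerate lag, like seth's, would connect with e.g.:
#   mysql --defaults-file="$HOME/replica.my.cnf" -h "$HOST" dewiki_p
echo "$HOST"
```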
[23:21:35] !log Quarry deployed 02049d9 to quarry-main-01
[23:21:35] framawiki: Unknown project "Quarry"
[23:21:54] !log quarry deployed 02049d9 to quarry-main-01
[23:21:55] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[23:22:47] \o/