[08:11:11] * Kelson42 has added user Automactic to the mwoffliner project
[09:35:52] !log quarry deployed 575fc1c T209783 and 06a1f9f T205151 on quarry-web-01
[09:35:56] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[09:35:57] T205151: Create api health point for monitoring - https://phabricator.wikimedia.org/T205151
[09:35:57] T209783: Handle meta endpoints when the specified id doesn't exist - https://phabricator.wikimedia.org/T209783
[17:07:08] Is there a task tracking progress on upgrading Toolforge to a newer Debian version? Specifically I'm interested in the availability of Python 3.6 (and hence Django 2.1).
[18:55:19] yurb: https://phabricator.wikimedia.org/T199271 I think
[19:01:19] zhuyifei1999_: thanks
[21:37:09] !log quarry deployed till f9ad985
[21:37:10] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[22:22:27] !help hi, can I ask, is anyone having problems with rev_comment? When I query that in cloud, I am getting empty fields, but the downloaded data has content
[22:22:27] tomthirteen: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[22:43:17] tomthirteen, I think the usage of that column changed
[22:43:34] you're supposed to use another field to find the ID of a row in another table, which MediaWiki will use as the comment
[22:45:46] ugh, when did that happen?
[22:48:09] I think it's https://phabricator.wikimedia.org/T166733
[22:48:58] Presumably the revision table was getting too big, so they separated it out
[22:49:27] revision_temp_actor, I think it is.
[22:49:40] isn't that an entirely different thing, paladox?
[22:49:53] Oh, right, yes, sorry.
[22:49:56] we're talking about the comment field, not the actor stuff
[22:54:10] @krenair Thank you
[22:55:26] @krenair, actually one more question, is it supposed to contain all the data? When I look at one title, it's only giving me the past two edits
[22:55:58] I don't understand
[23:03:27] I did a search for a title in the recentchanges table
[23:03:35] is that the new table?
[23:04:04] no
[23:04:22] recentchanges has been there for years
[23:05:09] which table now has the revision history?
[23:05:17] you want the `comment` table to find comments
[23:05:24] revision still has the revision history, tomthirteen
[23:05:34] it just doesn't contain the comments
[23:25:27] @krenair thank you
[23:25:49] tomthirteen, have a look at "select rev_comment, rev_comment_id, comment.* from revision join comment on (revision.rev_comment_id = comment.comment_id) limit 10;"
[23:27:56] I understand
[23:27:57] thanks
[23:33:58] zhuyifei1999_: hmm, so cdnjs is now a proxy and not a mirror?
[23:34:09] yes
[23:34:25] let me find the task
[23:35:22] https://phabricator.wikimedia.org/T182604
[23:38:16] thanks
[23:42:44] zhuyifei1999_: I'm mostly just curious, is there a major performance difference between locally hosting it and proxying it?
[23:43:21] you mean cdnjs? not really. the major difference is that the cdnjs git repo is so large
[23:44:05] you end up with 100+ GiB with the git checkout
[23:44:24] although the git repo itself is just a few GiB
[23:44:58] so we need some way to de-duplicate and compress the checked-out tree
[23:45:48] I would imagine one way is to use FUSE and read from the repo directly, but idk if FUSE is really used in Wikimedia
[23:45:49] yeah
[23:46:01] another way is to use btrfs
[23:46:13] with transparent compression
[23:46:49] but I don't think that gives as good a compression ratio as reading directly from the git repo
[23:47:45] so, basically they decided to make a proxy instead of mirroring
[23:54:29] (or we could figure out how to make nginx read from the git repo itself somehow)
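Editor's note: the comment-storage split discussed above (T166733) can be sketched with a minimal in-memory SQLite mock-up. The schema here is heavily simplified and the rows are invented for illustration (real MediaWiki has many more columns, and during the migration also used a `revision_comment_temp` mapping table), but it shows why `rev_comment` reads back empty and how the join from the log resolves the comment text.

```python
# Sketch of the MediaWiki comment-table split (T166733), using invented data.
# Comment text now lives in a separate `comment` table; revision rows carry
# only rev_comment_id, and the legacy rev_comment column is left empty.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE comment (comment_id INTEGER PRIMARY KEY, comment_text TEXT);
CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_page INTEGER,
                       rev_comment TEXT, rev_comment_id INTEGER);
INSERT INTO comment VALUES (1, 'fix typo'), (2, 'revert vandalism');
-- rev_comment is empty; the text is reachable only via rev_comment_id
INSERT INTO revision VALUES (100, 7, '', 1), (101, 7, '', 2);
""")

# The join suggested in the log: resolve rev_comment_id to the comment text.
rows = conn.execute("""
    SELECT rev_id, comment_text
    FROM revision
    JOIN comment ON revision.rev_comment_id = comment.comment_id
    ORDER BY rev_id
""").fetchall()
print(rows)  # [(100, 'fix typo'), (101, 'revert vandalism')]
```

On the real replicas the same join runs against the live `revision` and `comment` tables, as in the query quoted at 23:25:49.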
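Editor's note: the two de-duplication ideas floated at the end (transparent filesystem compression vs. reading straight from the git object store) can be sketched as follows. The device path, mount point, repo path, and file path are all placeholders, not actual Wikimedia configuration.

```shell
# Idea 1 (btrfs): mount the volume holding the cdnjs checkout with
# transparent zstd compression. /dev/sdX and /srv/cdnjs are placeholders.
mount -o compress=zstd /dev/sdX /srv/cdnjs

# Idea 2 (no checkout at all): git can serve a file's contents directly
# from the packed object store of a bare clone, avoiding the 100+ GiB
# working tree entirely. The repo and file paths are hypothetical.
git -C /srv/cdnjs.git show HEAD:ajax/libs/jquery/3.3.1/jquery.min.js
```

The second approach is roughly what a FUSE layer or an nginx module reading the repo would do under the hood; git's packfiles already delta-compress the many near-identical library versions, which is why the repo is a few GiB while the checkout is over 100 GiB.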