[00:07:34] Zppix: which file are you trying to take?
[00:07:50] oh never mind
[00:08:11] and what command are you running that gets cert issues?
[00:34:54] !log quarry Deploy b5fd6b0 on quarry-main-01
[00:34:55] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL
[09:51:23] !help Is there anybody who can help me turn off my 2FA? I read about it at https://wikitech.wikimedia.org/wiki/Help:Horizon_FAQ#What_happens_if_I_lose_my_phone_and_my_backup_codes?
[09:51:24] wi24rd: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[09:51:54] hi wi24rd, what happened?
[09:52:23] I flashed my phone, and I didn't know its storage was offline before.
[09:52:34] so, I can't log in
[09:53:10] ok, I can disable 2FA for you, but I would like to check your identity beforehand
[09:53:36] My user name is 'wizard'. But actually I'm a novice :)
[09:54:05] I have SSH access to login.tools.wmflabs.org
[09:54:43] great, would you please start by filing a phabricator task requesting this? That way we have some record of the operations. Then I would add there what I do on my side
[09:55:56] I also can't log in to Phabricator because of that.
[09:56:24] And I see that to reset 2FA in Phabricator you need a hash
[09:56:27] and to wait for a month.
[09:56:37] And I don't have one now.
[09:57:01] https://www.mediawiki.org/wiki/Phabricator/Help/Two-factor_Authentication_Resets
[09:58:22] ok, we can skip the phabricator thing, it's not mandatory
[10:00:12] !log wikispeech Deploy latest from Git master: 3d93359, bcb22a5 (T191758)
[10:00:14] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikispeech/SAL
[10:00:14] T191758: Migrate to new Wikispeech wiki server - https://phabricator.wikimedia.org/T191758
[10:01:06] wi24rd: I just put a file in your home directory at the toolforge bastion. Could you please edit it to something else?
[10:01:29] So what should I do next?
[10:03:47] wi24rd: I need to confirm your identity before disabling 2FA for you
[10:04:11] that's why the file edit
[10:05:55] https://wikitech.wikimedia.org/wiki/Help:Horizon_FAQ#What_happens_if_I_lose_my_phone_and_my_backup_codes
[10:06:13] say What happens if I lose my phone and my backup codes?
[10:06:14] A member of the WMF Operations team can turn off 2fa for your developer account, at which point you can re-enable it and reset your code. To do this you will need to verify your identity, most likely by logging into a Cloud VPS instance with an ssh key and editing a file as requested by the Op who is helping you. Just in case, make sure you have a working ssh login to Cloud VPS before you enable 2fa.
[10:06:24] I can do that to prove it's me
[10:06:57] yeah, that's what we are doing, right?
[10:07:52] wi24rd: I put a file in your home directory. Since you logged in with your SSH key to login.tools.wmflabs.org, you should be able to edit the file and put something inside it
[10:08:31] !log wikispeech Deploy latest from Git master: 8fcefab (T191758)
[10:08:33] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikispeech/SAL
[10:08:33] T191758: Migrate to new Wikispeech wiki server - https://phabricator.wikimedia.org/T191758
[10:10:45] a bot relaying from IRC to somewhere else imo violates "Using Wikimedia Labs as a network proxy: Do not use Wikimedia Labs servers or projects to proxy or relay traffic for other servers. Examples of such activities include running Tor nodes, peer-to-peer network services, or VPNs to other networks. In other words, all network connections must originate from or terminate at Wikimedia Labs." from the Labs TOU, but someone else claims it does not
[10:10:52] wi24rd: would you like some help or further details/explanations?
[10:11:24] Can someone give me an answer on whether an IRC relaying bot violates the Labs TOU or not?
[10:12:26] Yes. I see two files named 2fa_auth.txt and replica.my.cnf, and I created the README file.
[10:12:47] ok, thanks wi24rd
[10:13:57] wi24rd: 2FA should be disabled now
[10:14:29] Thx, I will check it later. BTW, could you tell me what the content of replica.my.cnf means?
[10:15:25] wi24rd: it's a mysql config file, for connecting to the wiki-replicas databases
[10:15:27] Right, it's done.
[10:15:57] wi24rd: about data services: https://wikitech.wikimedia.org/wiki/Portal:Data_Services
[10:16:45] Thanks for telling me that.
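To make the replica.my.cnf explanation above concrete: a minimal sketch of the kind of query those credentials allow against the wiki replicas, e.g. via Quarry or the mysql client on the bastion. The page table and its columns are standard MediaWiki schema; the hostname and database name below are illustrative only, check the Data Services portal for the current connection details.

    -- Connect with, e.g.: mysql --defaults-file=$HOME/replica.my.cnf -h fiwiki.labsdb fiwiki_p
    -- (hostname/database are example values; see the Data Services portal)
    SELECT page_id, page_title
    FROM page
    WHERE page_namespace = 0      -- main (article) namespace only
    ORDER BY page_touched DESC
    LIMIT 10;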
[10:18:20] wi24rd: it would be good to get your Phabricator access fixed, since it is one of our main communication channels (especially for bug/feature tracking)
[10:22:20] It seems like it will take at least one month before I can log in to Phabricator again. :(
[10:58:40] revi: a bot that serves wikimedia channels hosted on cloud is ok
[10:59:01] using cloud infrastructure to host a relay for you to connect to irc is not ok
[10:59:04] ok
[10:59:15] thanks
[11:00:21] what if you're relaying wikimedia channels?
[11:02:21] in my last sentence, relay == IRC bouncer
[11:03:49] consuming, for example, the IRC recentchanges feed is expected
[11:04:20] that is wikimedia infrastructure
[11:04:27] freenode is not
[11:05:26] we are ok with people anonymizing their source IP, but if cloud fails, it is also a SPOF
[12:02:42] !help I asked musikanimal for help via mail, but to be fast and bold: I got a question from a local newspaper about the most-read pages on Finnish Wikipedia of all time. In theory I can read the information from the pageview stats in /public/dumps/pageviews
[12:02:42] zache: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[12:02:53] however those files are huge
[12:03:08] so grepping them on toollabs is very slow for me
[12:03:27] and also downloading the whole dataset takes too much time
[12:07:28] I know that grepping them shouldn't be that slow, so it must be because I am on a virtual server with CPU limits. I guess what I am asking is whether somebody with fast access to the data OR without hard CPU limits could grep the information I need out of the data files
[12:08:16] which is pretty much "zcat /public/dumps/pageviews/2018/2018-03/pageviews-201803* | grep '^fi '" run over all the files
[12:37:16] I guess some tool exists to show pageview stats; I believe that's common data anybody would like to know
[12:38:58] https://tools.wmflabs.org/topviews/ and one can download the data from https://dumps.wikimedia.org/other/pageviews/
[12:40:00] but the basic problem with that is that you can't currently get the most-read pages of Wikipedia per year OR for all time
[12:40:56] all time is a bit problematic - analytics were not recorded at first, and for the first years the way it was recorded differs from the latest years
[12:41:35] I can handle those complications, that is not the problem; the problem is that I can't get the data out of the files in the current time frame
[12:42:03] we could do it since July 2015 (when pageviews first started being recorded), but you'll need to manually run a query on Hive
[12:43:11] I'm currently staying at a place that blocks the SSH ports, but come Friday I can run the query for you
[12:45:59] you could do it through the dumps too, but that's probably a lot harder
[12:46:06] the Hive query would be something like https://phabricator.wikimedia.org/T183903#3874401
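The query in the linked paste isn't reproduced here, but as a sketch of the shape it takes: the analytics cluster exposes pageviews in the wmf.pageview_hourly Hive table (partitioned by year/month/day/hour, with project, page_title, agent_type and view_count columns, per the Wikitech documentation). A top-5000 query for fiwiki along those lines, assuming that schema, might look like:

    -- Sketch: top 5000 most-viewed fi.wikipedia pages for one year,
    -- against the wmf.pageview_hourly Hive table (schema as documented
    -- on Wikitech; the actual query referenced in T183903 may differ).
    SELECT page_title,
           SUM(view_count) AS views
    FROM wmf.pageview_hourly
    WHERE year = 2017
      AND project = 'fi.wikipedia'
      AND agent_type = 'user'     -- filter out known bots/spiders
    GROUP BY page_title
    ORDER BY views DESC
    LIMIT 5000;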
[12:50:06] It seems that having yearly stats may be a commonly requested feature
[12:50:24] apparently already filed https://phabricator.wikimedia.org/T154446
[12:51:16] yeah, specifically https://phabricator.wikimedia.org/T154381
[12:51:37] every year I get a bunch of requests, and manually run a query for said wiki
[12:55:11] can I just ask for access to Hive via phab (e.g. is analytics-users enough)?
[13:02:13] I recommend creating a task asking someone to run the query for you, like https://phabricator.wikimedia.org/T183903
[13:02:19] so tag it with "Reading-analysis"
[13:02:39] ok, thanks
[13:15:17] @musikanimal: is it hard or trivial to include the wikidata properties in the resulting datasets? I would need to know whether the topic is a human, and the gender, but I can do it myself too.
[13:17:16] You can't do that through Hive, no
[13:18:10] Or at least I don't think you can. The analytics cluster also has production replicas, which are fast, so whoever tends to your request may be able to help
[13:19:09] actually, production replicas are way slower than cloud wikireplicas
[13:20:23] Oh really, well there ya go! You have access to the cloud replicas through Quarry
[13:20:59] but it is not easy to query wikidata data from any of those
[13:21:16] the wikidata query service is more appropriate
[13:21:42] https://phabricator.wikimedia.org/T192360
[13:21:50] When I did the new page patrol analysis last year I found the analytics replicas to be faster, which is why I say that, but that may have been before the new cloud replicas were rolled out
[13:22:15] musikanimal: maybe also, if they are massively queried, they can get slower
[13:22:26] musikanimal: can you get the wikidata QIDs from Hive?
[13:22:28] but the hardware is over 6 times faster
[13:22:48] because they are in page_props
[13:23:24] No, I think Hive is separate from all things MediaWiki, but don't quote me!
[13:24:05] anyway, I already have gender data mapped against the articles, so as long as I can get the top 5000 articles with page_ids I am happy.
[15:18:58] !log puppet3-diffs add vgutierrez as project admin
[15:18:59] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Puppet3-diffs/SAL
[16:09:21] Is there a reason interval 1 day would work in a query (against logging, for instance), but interval 1 hour would not? It was working last night.
[16:12:22] SQL: https://phabricator.wikimedia.org/P7003
[16:12:53] mediawiki, however, doesn't use SQL timestamps (it stores them as YYYYMMDDHHMMSS strings)
[16:13:00] Indeed.
[16:15:15] you have to calculate them on the app side and operate with the result: https://phabricator.wikimedia.org/P7003#39975
[16:17:12] Fair enough, and that's the route I ended up going, just odd that it works: https://quarry.wmflabs.org/query/26500
[16:17:23] (well, for day at least)
[16:19:39] can't you cast the text to a proper timestamp and then compare?
[16:20:16] SQL: I think you were getting bogus results - newusers since 20180101
[16:20:43] maybe not, depends on the collation
[16:21:49] DATE_FORMAT(), what chicocvenancio suggests, may work for you
[16:21:50] e.g.
[16:21:58] No idea, it was an easy workaround ($lasthour = date( "YmdGis", time() - 3600 );), was just idly curious :P
[16:24:02] SQL: try this -- select count( log_id ) from logging where log_timestamp > date_format(now() - interval 1 day, '%Y%m%d%H%i%s') and log_type = "newusers";
[16:24:03] you can convert mediawiki timestamps to sql dates with "str_to_date(afl_timestamp, '%Y%m%d%H%i%s')"
[16:24:33] SELECT DATE_FORMAT(now() - INTERVAL 1 DAY, '%Y%m%d%H%m');
[16:25:02] I got my time codes wrong, but you get the idea
[16:25:22] yeah, thanks guys
[16:25:43] I hadn't thought to mess with the formatting heh
[16:26:46] this should probably be it https://quarry.wmflabs.org/query/26510
[16:35:32] So - yeah, same deal, works for daily, but not hourly. https://quarry.wmflabs.org/query/26512 doesn't return any results past 1am UTC this morning unless I've got this very simple query wrong.
[16:37:16] SQL: ah, that is another thing
[16:37:25] see: https://tools.wmflabs.org/replag/
[16:37:47] AH. That would explain why it was working 10 hours ago LOL
[16:37:55] thanks man, I shoulda known to check that
[16:38:00] well, the query "works"
[16:38:07] :-)
[16:39:48] you can automate when to do reports by checking heartbeat_p.heartbeat
[16:40:00] which will register the time of the last update
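In other words, a report job can check the replica's own lag before trusting recent rows. A minimal sketch of that check, assuming the heartbeat_p.heartbeat view on the wiki replicas with the shard/last_updated/lag columns described on Wikitech:

    -- Sketch: per-shard replication lag on the wiki replicas
    -- (columns as documented on Wikitech for heartbeat_p.heartbeat).
    -- A scheduled report can skip its run, or move its cutoff back,
    -- when the lag for its shard is too high.
    SELECT shard, last_updated, lag
    FROM heartbeat_p.heartbeat
    ORDER BY lag DESC;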
[16:45:21] zhuyifei1999_: I'm looking at https://phabricator.wikimedia.org/T190895 and I can't actually find the part where it says what quotas you want :) Is that in there someplace? If not, can you add specific total numbers to that ticket?
[16:46:45] ah, ok, maybe it's in here already
[17:00:45] andrewbogott: the ask was for doubling it, but I've no problem with the current raise
[17:00:53] ok, thanks
[17:01:17] we might find that it needs to be raised again, but it is a guess how much we actually need
[17:01:19] +1 to what chicocvenancio said (sorry, was on a bus)
[17:02:09] this is probably enough for the gsoc dev environment
[17:02:40] I'll try to set up the instances on Thursday. Too busy with stuff today & tomorrow
[18:34:05] !help Hey. There is a huge lag on the Wiki replicas. Any chance to fix it?
[18:34:06] pasleim: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[18:38:37] jynus: ^
[18:39:28] (replag is 60k+ seconds on s1, s3, & s5)
[18:54:28] I believe the lag is from schema updates that are being done on the production servers
[19:16:28] bd808: Hm.. but given it increases by 1s/s, I guess it stopped replicating.
[19:16:39] But presumably prod slaves are still replicating and we're not in read-only mode.
[19:17:47] 17h for s1, s3, s5 and 37h for s8.
[19:19:16] We could probably look through SAL to see what's happening. One guess is that replication is stopped because they are actively applying migrations to the sanitarium servers. I know that J.aime is aware of the size of the current lag. He linked people to https://tools.wmflabs.org/replag/ a few hours ago
[19:24:39] There is definitely a lot of db-related activity in SAL -- https://tools.wmflabs.org/sal/production?q=jynus+marostegui+jynus%40tin+marostegui%40tin
[21:46:11] Alters are clearly marked on the Deployments page https://wikitech.wikimedia.org/wiki/Deployments#Week_of_April_16th
[21:48:14] also on SAL:
[21:48:34] 05:33 marostegui: Deploy schema change on db1087 with replication (this will generate lag in labs) - T187089 T185128 T153182
[21:48:35] T187089: Fix WMF schemas to not break when comment store goes WRITE_NEW - https://phabricator.wikimedia.org/T187089
[21:48:35] T153182: Perform schema change to add externallinks.el_index_60 to all wikis - https://phabricator.wikimedia.org/T153182
[21:48:35] T185128: Schema change to prepare for dropping archive.ar_text and archive.ar_flags - https://phabricator.wikimedia.org/T185128
[22:59:59] !log quarry forgot to restart uwsgi on the last deployment; restarted it now
[23:00:01] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL