[06:33:00] is there any restriction on formatting nonces for OAuth?
[06:34:03] specifically does it support A-Za-z0-9_- ?
[13:27:08] [[Tech]]; 41.113.57.251; /* What is love */ new section; https://meta.wikimedia.org/w/index.php?diff=20139979&oldid=20135123&rcid=15615293
[13:28:57] [[Tech]]; Reedy; Reverted changes by [[Special:Contributions/41.113.57.251|41.113.57.251]] ([[User talk:41.113.57.251|talk]]) to last version by Stanglavine; https://meta.wikimedia.org/w/index.php?diff=20139986&oldid=20139979&rcid=15615308
[15:03:40] 12:03 PM is there any restriction on formatting nonces for OAuth?
[15:03:40] 12:04 PM specifically does it support A-Za-z0-9_- ?
[15:03:40] just repeating my query if anyone knows hmm
[15:04:47] OAuth 1, I assume?
[15:04:58] OAuth 1a yep
[15:05:24] no restrictions whatsoever: https://oauth.net/core/1.0/#nonce
[15:05:43] alright thanks @tgr :)
[15:06:43] probably better not to use fancy characters like control chars because they might break whatever storage layer the nonce is put into, but those above are fine
[15:07:21] i'm using them with mediawiki's oauth
[15:09:38] tildes should also be fine I guess?
[15:09:50] yeah, and we put it into Redis or Kask or something like that for the duplicate check
[15:10:18] which in theory should handle any string, but I'd err on the safe side
[15:10:35] how do these questions even come up? it's a random-generated string :)
[15:11:24] was migrating from UUID to nanoid, and nanoid has extra symbols, hence just wondering if it's alright
[15:11:42] just use a random number or something
[15:12:35] Magnus wrote the nonce using UUID (safety?)
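[Editor's note: the A-Za-z0-9_- set discussed above is the 64-symbol URL-safe alphabet that nanoid uses by default, and since it has exactly 64 symbols it can be generated without a crate and without modulo bias. A minimal sketch, assuming a Unix-like system where /dev/urandom is readable; the `nonce` helper name is illustrative, not from the thread:]

```rust
// Crate-free nonce generator for the 64-symbol A-Za-z0-9_- alphabet
// (the same URL-safe set nanoid uses by default).
// Assumes a Unix-like OS where /dev/urandom is available.
use std::fs::File;
use std::io::Read;

const ALPHABET: &[u8; 64] =
    b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789_-";

fn nonce(len: usize) -> std::io::Result<String> {
    let mut bytes = vec![0u8; len];
    File::open("/dev/urandom")?.read_exact(&mut bytes)?;
    // 64 symbols divide 256 evenly, so masking each byte down to its
    // low 6 bits introduces no modulo bias.
    Ok(bytes
        .iter()
        .map(|&b| ALPHABET[(b & 63) as usize] as char)
        .collect())
}

fn main() -> std::io::Result<()> {
    let n = nonce(10)?;
    println!("{}", n);
    assert_eq!(n.len(), 10);
    assert!(n.bytes().all(|c| ALPHABET.contains(&c)));
    Ok(())
}
```

[Per the thread, 10 characters of this alphabet (64^10 ≈ 2^60 values) is already far more than the ~10^6 space suggested for the 5-minute duplicate-check window.]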
I'm trying to cut down on the crates being used by the Rust library mostly
[15:16:04] all that nonce is used for is to reject the request if the same app used the same nonce within the last 5 minutes
[15:16:38] the same app used by the same user, where that makes sense
[15:17:08] something like a 10-character random string should be perfectly safe
[15:17:30] probably way less, too
[15:18:13] might as well not use a crate prolly then, yep
[15:19:57] if you expect to make a thousand requests per user in 5 mins, you'd want a larger than 10^6 nonce space, that's about four lowercase letters
[15:20:09] so yeah, don't sweat it
[15:41:30] I was wondering if storing all previously used nonces was required, because I'm just generating a somewhat large random number out of laziness under the assumption that if I hit a duplicate it's very unlucky. I think the space in which I'm generating nonces is more like 2^32 though, so highly unlikely to cause problems.
[15:43:32] with a large enough range space, you would need very few characters
[15:44:26] yeah, I'm base64-ing the 2^32 output for sanity
[16:31:00] All Wikimedia websites became unavailable to me. This includes Wikipedias, Phabricator, and Grafana. Is it me or is something going on?
[16:32:20] Your ISP suck? :P
[16:32:22] Huji: sorry to hear, can you have a look at https://wikitech-static.wikimedia.org/wiki/Reporting_a_connectivity_issue and follow the instructions there to get us more information?
[16:33:14] Reedy it does, but that is a chronic issue. This seems acute :/
[16:33:39] @rz
[16:33:58] @rzl the problem is those instructions start at Phabricator, which doesn't open for me. Let me restart my modem ...
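[Editor's note: picking up the nonce thread above — base64 of a 2^32 value is exactly six characters, since 32 bits split into six 6-bit groups (the last group carries four zero pad bits, where a standard encoder would then emit "=="). A crate-free sketch of that encoding with the URL-safe alphabet; the `b64url_u32` name is illustrative, not from the thread:]

```rust
// Crate-free sketch of the "~2^32 random value, then base64 it"
// approach from the thread above. The u32 comes from /dev/urandom
// (assumes a Unix-like OS); the encoder uses the URL-safe base64
// alphabet and drops padding, so every u32 maps to exactly 6 chars.
use std::fs::File;
use std::io::Read;

const B64URL: &[u8; 64] =
    b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_";

fn b64url_u32(n: u32) -> String {
    // 32 bits split into six 6-bit groups; the final group keeps the
    // last two bits shifted into the high positions, exactly as a
    // standard base64 encoder does before emitting "==" padding.
    [
        (n >> 26) & 0x3F,
        (n >> 20) & 0x3F,
        (n >> 14) & 0x3F,
        (n >> 8) & 0x3F,
        (n >> 2) & 0x3F,
        (n & 0x03) << 4,
    ]
    .iter()
    .map(|&g| B64URL[g as usize] as char)
    .collect()
}

fn main() -> std::io::Result<()> {
    let mut buf = [0u8; 4];
    File::open("/dev/urandom")?.read_exact(&mut buf)?;
    println!("{}", b64url_u32(u32::from_be_bytes(buf)));
    Ok(())
}
```

[By the birthday bound, the chance that a thousand requests in the 5-minute window collide in a 2^32 space is roughly 1000^2 / 2^33 ≈ 0.01%, which matches the "highly unlikely to cause problems" assessment above.]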
[16:34:14] even if you ca-- ah
[16:34:50] He's right though
[16:34:55] We should probably improve that wording
[16:35:13] yeah, I was going to say mention some of the results here instead of phab, but that's true
[16:36:04] huji should also know better than reporting "it doesn't work" :P
[17:00:44] there started being some packet loss towards our sites in North America around 15:50 UTC
[17:04:36] I'd presume Huji would be connecting to ESAMS
[17:05:22] ah okay, esams doesn't look affected
[17:08:16] Though... His IP seems to be https://www.rcn.com/hub/customer-center/
[17:08:27] So it might've been that
[17:09:33] @Reedy i'm getting occasional drops on a traceroute, i'm probably on esams/eqsin based on location
[17:09:41] That was a US IP, not sure he'll be connecting from ESAMS
[17:09:55] by occasional i mean 50% ish so quite a lot
[17:10:29] Reedy: that's interesting, RCN is my ISP and I'm also seeing connectivity problems to eqiad :)
[17:10:44] heh
[17:10:58] * Reedy had it in his head that Huji was in the ME area
[17:13:16] https://www.irccloud.com/pastebin/czdk2LNf/traceroute.txt
[17:15:16] qedk: if you have mtr installed, could you do a: mtr -zw --tcp --port 443 dyna.wikimedia.org
[17:15:42] i don't
[17:15:47] is it on homebrew?
[17:15:52] yeah
[17:15:55] I believe so
[17:20:40] https://www.irccloud.com/pastebin/UybbPqZQ/mtr.txt
[17:20:44] @cdanis
[17:20:44] thank you!
[17:21:02] glad to help :)
[17:36:45] Reedy: sync-with-gerrit works like a charm now: https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/+/602118/
[17:48:00] and finished with the setup/import
[17:48:10] too many manual steps involved
[17:56:17] How long does it take for a newly created extension repo to be available on Special:ExtensionDistributor?
[17:57:54] 30-60 minutes it looks like
[17:58:53] Should be 30
[17:59:50] Reedy: labs-tools-extdist takes care of it right?
[18:00:13] Hmm. Well...
Two different questions :P
[18:00:23] The cache for the list of extensions from gerrit is 30 minutes
[18:00:29] I think extdist is a daily cron
[18:00:42] it looks like ExtensionDistributor uses that toolforge tool to generate and distribute the tarballs
[18:02:20] https://github.com/wikimedia/labs-tools-extdist/blob/master/nightly.py
[18:02:24] the fact it's called nightly...
[21:00:12] [[Tech]]; Ruslik0; /* Restricting the use of Content translation tool */; https://meta.wikimedia.org/w/index.php?diff=20140644&oldid=20139986&rcid=15616654