[15:07:57] I just got hit with an HTTP 429 ("Please respect our robot policy") for normal, flesh-and-blood click-on-links browsing activity on enwiki and performance.wikimedia.org
[15:08:24] Is this worth reporting somewhere, or a known issue as the kinks are being worked out?
[15:09:11] (Request served via cp1100, Varnish XID 525207352)
[15:21:27] <_joe_> ori: if you can provide the full text you got, that might help debugging the issue
[15:22:25] <_joe_> there should typically be a short sha1 in parentheses; you might have hit a rule that was enabled to combat scraping yesterday, it was quite a rough day
[15:23:48] I got one earlier for performance.wm.o FWIW
[15:24:22] <_joe_> sigh
[15:24:25] (I was thinking of reporting it by email [IRC didn't occur to me], but then it worked again after a little bit so ¯\_(ツ)_/¯)
[15:24:30] <_joe_> was c677ce2 the sha1, by any chance?
[15:24:40] let me check, one sec
[15:24:57] believe it was c677ce2, yep
[15:25:40] <_joe_> yeah, it's a rule created to fight a scraper that was killing performance.wikimedia.org
[15:25:53] <_joe_> but... it uses the Google Chrome TLS stack as a fingerprint. Sigh
[15:26:28] <_joe_> IT HAS BEEN 0️⃣ DAYS SINCE WE BLOCKED CHROME
[15:27:25] <_joe_> ok, restricted the rule to residential proxy addresses at least
[15:27:29] <_joe_> apologies to both of you
[15:32:19] no worries from my perspective :) i do not envy the job of SREs with the current amount of scraper requests being sent
[17:51:45] thank you <3 this wasn't urgent.
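For context on the "Google Chrome TLS stack as fingerprint" problem _joe_ describes: TLS client fingerprinting is commonly done with a scheme like JA3, which hashes fields from the ClientHello (TLS version, cipher suites, extensions, elliptic curves, point formats). Because every Chrome install sends an essentially identical ClientHello, a rule keyed on a scraper's JA3 hash also matches every real Chrome user, which is exactly the false-positive mode above. This is a generic sketch of JA3, not Wikimedia's actual rule; the numeric field values in the usage line are illustrative.

```python
import hashlib

def ja3_fingerprint(version, ciphers, extensions, curves, point_formats):
    """Compute a JA3 hash from TLS ClientHello fields.

    JA3 joins the five fields with commas (list items joined with
    hyphens) and takes the MD5 of the resulting string.
    """
    fields = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    ja3_string = ",".join(fields)  # e.g. "771,4865-4866,0-11,29-23,0"
    return hashlib.md5(ja3_string.encode()).hexdigest()

# Illustrative values only: two clients with the same TLS stack
# (e.g. any two Chrome installs) produce the same hash, so blocking
# on it blocks them all.
print(ja3_fingerprint(771, [4865, 4866], [0, 11], [29, 23], [0]))
```

Restricting the rule to residential-proxy source addresses, as done at 15:27:25, narrows the match by combining the fingerprint with a second signal instead of relying on the TLS stack alone.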