[00:46:47] Reedy: UploadBlacklist has a log somewhere of its hits. Any chance you could look at it?
[01:37:30] Elsie:
[01:37:31] lol
[01:37:57] Reedy: You found the log?
[01:38:03] reedy@fluorine:/a/mw-log$ grep -v MISS upload-blacklist.log -c
[01:38:03] 0
[01:38:03] reedy@fluorine:/a/mw-log$ grep MISS upload-blacklist.log -c
[01:38:03] 12688
[01:38:16] The miss entries are just useless
[01:38:43] So nobody's hitting it, right?
[01:38:55] I think even if you try to upload Goatse, it works.
[01:38:57] The hash is outdated.
[01:39:07] It may also be an MD5 hash, heh.
[01:39:23] Well, no rows that didn't include MISS in any of July
[01:39:39] I'm pretty sure the extension can be disabled.
[01:39:42] ^
[01:39:46] I'm pretty sure too
[01:39:49] Though it occurred to me that it also needs to be removed from the wmf script.
[01:39:55] Whichever script that is.
[01:39:59] wmf script?
[01:40:06] Isn't there one to create a branch?
[01:40:09] 0 rows that don't include the text MISS in any of the rotated logs
[01:40:11] Oh, yeah
[01:40:19] But that's very trivial to do
[01:40:25] :-)
[17:57:36] Hi
[17:57:43] I wanted to log in to edit some outdated information
[17:57:50] But Wikipedia doesn't accept my password
[17:57:58] I thought I may have used a different one and wanted to use the Password Reset function
[17:58:14] But for my username Vampire0 it says there is no email recorded
[17:58:29] I associated two mails with that account, I still have the confirmation mails for those
[17:58:37] And if I put in the email address it says a password reset mail is sent, but no mail arrives
[18:13:06] Anyone any idea or ability to help?
[18:47:11] BjoernKautler, is it possible that your account was renamed?
[18:47:15] which wiki is it?
[20:55:58] MaxSem: you're geodata, right? [[:mw:Extension:GeoData]] has the wrong docs for gsprimary
[20:57:34] (btw)
[21:20:30] MaxSem or others: Is there a way of running a less good but faster nearby search of Wikipedia articles that's more bot-friendly?
[21:21:11] I've got 30,000 requests, I'm guessing that's either going to be really slow, hurt the servers, or both.
[21:22:19] e.g. a WHERE `Latitude` LIKE '71.566%'-style search
[21:22:48] jarry1250: dump all coordinates and search them locally?
[21:23:04] i think apergos was doing something about geodata sql dumps
[21:23:18] or just download it all via API, it's not *that* much data
[21:23:26] fewer than 30k requests needed, i think :)
[21:26:45] MatmaRex: hmm
[21:28:55] jarry1250: The servers handle about 100,000K requests/second.
[21:29:05] Err, 100,000, rather.
[21:29:12] Though 100,000K would be more impressive!
[21:29:20] In any case, I'm not sure why you think 30,000 requests would be noticeable.
[21:30:29] Elsie: The geodata API request calls a Solr(?) backend
[21:31:13] It's hard to tell for me on my slow connection, but I *think* each request is taking a coupla seconds of processing time
[21:31:39] Hi rainman-sr.
[21:31:55] hello elsie
[21:32:40] (I've thought of a workaround for my particular setup.)
[23:21:47] jarry1250, Solr is teh fastest
[23:22:18] jarry1250, what workflow do you have in mind?
[23:24:26] MaxSem: I was originally going to run down a list of coordinates from a database finding the nearest article to them, but I came up with a better plan in the end
[23:24:35] My internet was going at like 56kbps
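
The nearby-search workflow jarry1250 describes (one lookup per coordinate) corresponds to one list=geosearch call per point against the GeoData API. A minimal sketch, assuming the English Wikipedia endpoint and a placeholder coordinate; gscoord, gsradius and gslimit are the standard geosearch parameters:

```python
# Per-point lookup discussed in the log: one list=geosearch call per coordinate.
# The endpoint and the sample coordinate are assumptions for illustration.
import requests

API = "https://en.wikipedia.org/w/api.php"

def nearby_articles(lat, lon, radius_m=10000, limit=10):
    """Return pages indexed by the GeoData extension near (lat, lon)."""
    params = {
        "action": "query",
        "list": "geosearch",
        "gscoord": f"{lat}|{lon}",
        "gsradius": radius_m,   # metres
        "gslimit": limit,
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["query"]["geosearch"]

# One request per input point: 30,000 points means 30,000 round trips,
# which is the cost jarry1250 is worried about.
for page in nearby_articles(71.566, 25.78):
    print(page["title"], page["dist"])
```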
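MatmaRex's alternative is to dump all coordinates once and filter locally, which also covers the rough WHERE `Latitude` LIKE '71.566%' idea. A sketch under the assumption that the coordinates have already been exported to a local CSV with title, lat and lon columns (the file name and layout are placeholders, not an actual dump format):

```python
# Local filtering instead of 30,000 API round trips: load a one-off dump of
# coordinates, then run cheap in-memory filters per target point.
import csv

def load_coords(path="coords.csv"):
    """Read (title, lat, lon) rows from an assumed local dump."""
    with open(path, newline="", encoding="utf-8") as f:
        return [(row["title"], float(row["lat"]), float(row["lon"]))
                for row in csv.DictReader(f)]

def prefix_match(coords, lat_prefix="71.566"):
    """The LIKE '71.566%' idea from the log: keep rows whose latitude
    starts with the given decimal prefix."""
    return [c for c in coords if str(c[1]).startswith(lat_prefix)]

def bounding_box(coords, lat, lon, delta=0.01):
    """A slightly better cheap filter: a small box around the target point."""
    return [c for c in coords
            if abs(c[1] - lat) <= delta and abs(c[2] - lon) <= delta]
```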