[07:54:42] For the past few days, many page loads take forever for me, usually stalling on a single request which takes ~30 seconds (if I cancel loading the page, the content displays normally). https://phabricator.wikimedia.org/F12782767
[07:55:10] Known problem, however rare, or not reported yet?
[08:00:12] Sorry, https://phabricator.wikimedia.org/F12782787
[09:51:08] \q Groovier
[09:51:26] sorry typo :)
[14:10:53] anyone about? i'm not sure if this is a -tech question or not, but is there any global equivalent of this script?
[14:11:01] importScript('User:Keegan/MarkBlocked.js');
[14:11:11] i.e. something that will strike out blocked/glocked users?
[17:11:11] Are there any developers here from the UK?
[18:02:22] MndCtrl: I identify as one :)
[18:02:38] (but not a Wikimedia employee)
[21:55:09] Hi
[21:55:30] hey
[21:55:35] Does anyone here want to assist in coming up with a complex query?
[21:55:43] I have a Wikisource issue
[21:56:00] Namely, find all pages that have a specific string at their start " "
[21:56:13] to give an example
[21:56:50] guess you need a search query of some sort :/
[21:57:29] Yes, but I'm not sure how you look for particular body text
[21:57:50] Essentially it's a mass grep at the start of the wikitext block
[21:59:26] searching for this in the page text is probably the wrong approach. it would actually be pretty difficult to do
[21:59:39] but, i would guess that ProofreadPage stores this in the database somewhere in a machine-readable way?
[22:01:01] -- Holds a count of the number of pages at each quality level
[22:01:10] MatmaRex: That string is HOW ProofreadPage stores it
[22:01:12] So yes... Without the user
[22:01:17] No, that's how MW stores it
[22:01:19] In wikitext
[22:01:23] It's UGLY
[22:01:35] And something that I find really BADLY designed :(
[22:02:26] Someone was working on moving it into the database properly instead of wikitext, but that hasn't happened yet
[22:03:06] So it's not possible to find this information without grepping the wikitext directly
[22:03:13] Which is time-consuming
[22:03:57] Especially when someone on Wikisource wants a list of stuff they "validated" so they can recheck it
[22:04:04] that sounds unlikely… if it were true, rendering an index like https://en.wikisource.org/wiki/Index:Ruffhead_-_The_Statutes_at_Large_-_vol_9.djvu (which indicates the quality for each page with colors) would be a very expensive operation
[22:04:17] since it would have to parse each page to get that data out of them
[22:04:28] I think that's what it's actually doing
[22:04:55] Extracting it directly from pages... If it's doing it in some other way...
[22:05:18] to do the colouring on Index: pages... it would be nice on how to do this in Quarry for example
[22:05:44] *to know
[22:06:16] What I was wanting was a list of pages where I'd set the pagequality to 4 (Validated)
[22:06:36] Which means looking not only at the wikitext, but at when it changed
[22:06:50] This is, at present, a nightmare to do by hand
[22:07:37] no, the coloring is not *that* insane. it looks at the categories like https://en.wikisource.org/wiki/Category:Not_proofread for each page, and colors based on that. whew.
[22:07:37] Which is why I am asking if someone technical here is able to generate a "Special:PagesIValidated" report for me
[22:08:22] Currently there isn't a "Special:" page I can visit to find out which pages I validated
[22:08:23] :(
[22:08:39] It would be nice if this were possible..
[22:08:50] Worth raising a Phab ticket?
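The "pages I validated" report asked about above can be roughly approximated on Quarry without grepping wikitext, by leaning on the per-quality tracking categories mentioned at 22:07:37. The following is a minimal sketch only: it assumes the enwikisource_p replica, namespace number 104 for the Page: namespace, the "Validated" tracking category for quality level 4, and the pre-actor-migration revision schema (rev_user_text) in use at the time of this discussion; the user name is a placeholder. It can only find validated pages the named user has edited at some point, because the actual validating user is recorded only in the pagequality header inside the wikitext, which the replicas do not expose.

    -- Sketch only: Page:-namespace pages currently marked Validated that a
    -- given user has edited at least once. Run against enwikisource_p on Quarry.
    SELECT p.page_title
    FROM page AS p
    JOIN categorylinks AS cl ON cl.cl_from = p.page_id
    JOIN revision_userindex AS r ON r.rev_page = p.page_id
    WHERE p.page_namespace = 104              -- Page: namespace (assumed)
      AND cl.cl_to = 'Validated'              -- quality-level-4 tracking category
      AND r.rev_user_text = 'ExampleUser'     -- placeholder user name
    GROUP BY p.page_title;

This overcounts (any edit by the user to a now-validated page qualifies), which is why a proper Special: page, or moving the quality data out of wikitext into the database as discussed above, would be the real fix.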
[22:11:18] Also I'd like to cross-reference "Pages I validated" against the output of the Linter extension.
[22:11:43] So I can get a "Pages I edited that may have issues with the new parser"
[22:11:46] report as well
[22:12:09] Typically on Wikisource, contributors can edit a batch of pages together.
[22:12:34] And it would be nice to have a combined report for a "group" of pages that may share a similar issue
[22:13:30] such as a common template or mistyped heading that's causing a parser issue on many pages, because of one missing '' or line-feed buried 3 levels deep in a complex template
[22:13:41] Time to file some tickets
[22:13:45] ShakespeareFan00: sorry, i got disconnected. probably worth filing a phab task
[22:13:53] Will certainly consider it
[22:14:16] I've asked for a prefix filter on Linter's output already..
[22:14:36] ShakespeareFan00: i think currently, your only option is searching a database dump. these seem to be stripped from the normal search index, somehow: https://en.wikisource.org/w/index.php?search=insource%3A%2F%5C%2F (0 results, there probably should be some)
[22:14:55] Quite
[22:14:56] ShakespeareFan00: and you can't search this with a SQL query because page contents are not stored in the SQL databases
[22:15:33] * ShakespeareFan00 makes noises of screaming as though I'd seen a Lovecraftian elder one...
[22:16:22] Definitely worth opening a ticket
[22:33:05] MatmaRex: https://phabricator.wikimedia.org/T185722
[22:33:08] https://upload.wikimedia.org/wikipedia/commons/9/9b/WMF_Varnish_and_Swift.svg anyone else get this as plain text?
[22:33:32] Seen as plain text for me?
[22:33:42] MIME type being incorrectly sent?
[22:34:16] hm yeah it's got Content-Type: text/plain
[22:34:36] if you wget it and run the file command:
[22:34:49] ASCII text, with very long lines, with no line terminators
[22:35:54] Krenair: plaintext for me. there's a bug about this
[22:51:14] MatmaRex, like https://phabricator.wikimedia.org/T131012 ?
[22:51:53] Krenair: https://phabricator.wikimedia.org/T150929
[22:52:10] Access Denied: Restricted Task
[22:52:10] You do not have permission to view this object.
[22:52:10] Users with the "Can View" capability:
[22:52:10] This object has a custom policy controlling who can take this action.
[22:52:10] The owner of a task can always view and edit it.
[22:52:11] OK
[22:52:21] can't believe Phabricator still hasn't fixed this nonsense
[22:54:05] MatmaRex
[22:54:21] oh
[22:54:38] hmm
[22:55:01] Krenair: sorry, my fault, i thought i cc'd you
[22:55:04] fixed now
[22:55:11] now phab's fault… *this* time.
[22:55:13] not*
[22:55:46] huh
[22:55:53] it looks like what you did... should've CC'd
[22:56:08] maybe phab noticed it was private and didn't add to CC like normal
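For the Linter cross-reference idea raised at 22:11:18, a similarly rough Quarry sketch could join the category-based approximation above against the Linter extension's linter table. This assumes that table is actually exposed on the Wiki Replicas, which may not be the case, and carries the same caveats as the previous sketch (namespace number, category name, pre-actor-migration schema, placeholder user name). linter_cat is a numeric id internal to the extension, so this lists every flagged page rather than filtering by lint type.

    -- Sketch only: validated Page:-namespace pages edited by a given user
    -- that Linter has flagged for some lint category.
    SELECT p.page_title, l.linter_cat
    FROM page AS p
    JOIN categorylinks AS cl ON cl.cl_from = p.page_id
    JOIN revision_userindex AS r ON r.rev_page = p.page_id
    JOIN linter AS l ON l.linter_page = p.page_id
    WHERE p.page_namespace = 104
      AND cl.cl_to = 'Validated'
      AND r.rev_user_text = 'ExampleUser'     -- placeholder user name
    GROUP BY p.page_title, l.linter_cat;

Anything more precise, such as which revision introduced the problem or which pagequality header a user actually set, would still need the dump grepping described above.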