[04:51:48] How do I include multipart/form-data in an API request when using https://github.com/wikimedia/mediawiki-oauthclient-php?
[10:06:27] I'm disappointed
[10:06:41] Plenty of the lint errors I am finding on Simple English Wikipedia
[10:07:00] are down to a subtle difference in the way whitespace handling is done
[10:07:30] meaning that certain constructions that were written in good faith for Tidy
[10:07:48] throw obscure Missing END/Stripped TAG errors
[10:07:59] under the new Remex parser
[10:08:25] Whilst it's possible to repair most of them by removing the "implied" line breaks in whitespace
[10:08:35] it's tedious and confusing
[10:08:56] It would be nice to have ONE consistent rule set for whitespace handling
[10:09:06] across ALL markup
[17:57:26] @reauth wm-bot
[17:57:33] Rejoining all channels on instance wm-bot
[17:57:33] @system-rejoin-all wm-bot
[17:57:45] Sorry, it wasn't in the control channel.
[18:56:10] Matthew: what caused the bots to ping out?
[18:56:37] I don't quite know; I suspect Labs maintenance, since many bots disconnected at the same time.
[19:05:59] Ahh
[19:10:01] Yeah, there have been updates and migrations ongoing.
[19:17:12] It was because the network went down (scheduled maintenance on labnet), I think.
[23:19:35] Is there any tool or bot that scans page content, checks whether pages already exist for strings in that content, and proposes linking to those wiki pages?
[23:23:36] Hmm, maybe http://tools.wmflabs.org/navlink-recommendation/ covers that.
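
On the 23:19:35 question, the core check such a tool has to make is whether candidate strings already exist as page titles, and the MediaWiki query API answers that directly. Below is a minimal Python sketch of that check, assuming the simple.wikipedia.org endpoint and a small candidate list; existing_titles is a hypothetical helper name, not part of any existing tool.

import requests

API = "https://simple.wikipedia.org/w/api.php"

def existing_titles(candidates):
    # Query up to 50 titles at once; pages returned without a "missing" or
    # "invalid" key already exist and are therefore candidates for linking.
    resp = requests.get(API, params={
        "action": "query",
        "titles": "|".join(candidates),
        "format": "json",
    }).json()
    pages = resp["query"]["pages"]
    return {p["title"] for p in pages.values()
            if "missing" not in p and "invalid" not in p}

print(existing_titles(["Whitespace", "Definitely not a real page title"]))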
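
On the 04:51:48 question, the usual pattern for multipart/form-data with OAuth 1.0a is to sign only the OAuth parameters and send the file as a multipart body. The sketch below shows that pattern in Python with requests and requests-oauthlib rather than the PHP client that was asked about; the credentials, filename, and csrf_token are placeholders, and it assumes a CSRF token has already been fetched via action=query&meta=tokens.

import requests
from requests_oauthlib import OAuth1

# Placeholder OAuth 1.0a credentials from a completed handshake or owner-only consumer.
auth = OAuth1("consumer_key", "consumer_secret", "access_token", "access_secret")

API = "https://commons.wikimedia.org/w/api.php"
csrf_token = "..."  # assumed to have been fetched beforehand with action=query&meta=tokens

# Passing files= makes requests send multipart/form-data; the OAuth signature
# then covers only the OAuth parameters, not the multipart body.
with open("Example.png", "rb") as f:
    resp = requests.post(
        API,
        data={
            "action": "upload",
            "filename": "Example.png",
            "format": "json",
            "token": csrf_token,
        },
        files={"file": ("Example.png", f, "image/png")},
        auth=auth,
    )
print(resp.json())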