[00:00:55] No.
[00:01:04] You'd need to pass their content to parse, or save them onwiki first
[00:01:20] Why does the bot need access to the scribunto/lua mw library?
[00:01:25] Nearly all of it is available elsewhere
[00:02:29] I'm using Python to parse and store Wiktionary content, and I've run into the issue of needing the data from several Modules, but I can't access them without having Scribunto running since many Modules need the `mw` library
[00:03:14] Which modules?
[00:04:52] Most of them. Right now, I need the accent qualifier Module https://en.wiktionary.org/wiki/Module:accent_qualifier
[00:07:15] Seems like it wouldn't be difficult to edit that script to not use `mw` and pipe the data in some other way
[00:09:53] I was able to do something like that and just have them require the data from my local filesystem, but the other problem is that there are many Modules that use functions from `mw`, like `mw.ustring` and things like that
[00:11:08] You can use Special:Export to get the module dependency tree
[00:16:01] I have all the Modules locally already via the Wiktionary XML dump
[06:48:09] [[Tech]]; TonyBallioni; /* (Personally) Opt Out of Partial Block */ user script for opt-out; https://meta.wikimedia.org/w/index.php?diff=19704024&oldid=19703126&rcid=14704010
[09:37:23] Urbanecm: thanks for rebasing your patch, it is on my TODO list for last review & +2 after the branch cut (since there are translation strings)
[09:37:52] kostajh: nice, thanks for letting me know!
[13:46:29] importDump is broken?
[13:46:34] * Reedy shrugs
[14:01:48] mwdumper still needs to be updated, correct; I'm not sure which importDump issue you mean... ah, you're gone anyway
[14:01:52] ... for 1.34
[14:02:36] lol
[18:24:34] Hey friends... does someone know what the usergroup/rights/etc are for Global Sysops? Is there an actual global flag for that in the API?
[18:24:54] I'm starting to worry Global Sysop is kind of an amalgamation.
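The Special:Export tip in the module discussion above can be sketched in Python. This is only a sketch: the parameter names (`pages`, `templates`, `curonly`) come from the Special:Export form, and the helper name is my own invention, not anything from the log.

```python
from urllib.parse import urlencode

EXPORT = "https://en.wiktionary.org/wiki/Special:Export"

def export_url(title, include_templates=True):
    """Build a Special:Export URL for one page.

    With templates=1 the exported XML also pulls in transcluded
    templates and modules, which is one way to walk a module's
    dependency tree instead of resolving require() calls by hand.
    """
    params = {"pages": title, "curonly": "1"}
    if include_templates:
        params["templates"] = "1"
    return EXPORT + "?" + urlencode(params)

# e.g. the module mentioned above:
url = export_url("Module:accent qualifier")
```

Fetching that URL yields an XML dump of the page plus its transclusions, which a local parser can then feed to stubbed-out modules.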
[18:26:40] https://meta.wikimedia.org/wiki/Special:GlobalGroupPermissions/global-sysop
[19:18:03] Thank you Ebe123, but I was working towards trying to get that response from the API
[19:18:38] https://meta.wikimedia.org/w/api.php?action=query&list=globalgroups&ggpprop=rights
[19:20:05] Thank you Reedy. I've located that as well, but I'm trying to find account names that have the global-sysop flag.
[19:20:44] https://meta.wikimedia.org/w/api.php?action=query&list=globalallusers&agugroup=global-sysop
[19:22:55] possibly with a &agulimit=max
[19:26:20] oh sweet baby linux
[19:26:29] Reedy I owe you a six pack of whatever your poison is
[19:29:13] Tea
[19:43:13] are we expecting some changes for the server?
[19:43:43] Which server?
[19:43:47] Reversion links do not load in Page history and User Contributions on zhwp
[19:44:36] https://zh.wikipedia.org/wiki/Special:%E7%94%A8%E6%88%B7%E8%B4%A1%E7%8C%AE/Reedy
[19:44:50] https://zh.wikipedia.org/w/index.php?title=Wikipedia:%E9%A6%96%E9%A1%B5&action=history
[19:44:52] Both load for me
[19:45:08] See if you can reproduce on this
[19:45:09] https://zh.wikipedia.org/w/index.php?title=YouTube&action=history
[19:45:18] Page loads fine
[19:45:48] Are you getting an error? Or a blank page? Or something else?
[19:46:51] https://i.imgur.com/WmR9X97.png
[19:47:34] I'm seeing links in the brackets on en
[19:47:38] What interface language are you set to?
[19:47:43] The two pages you gave me work fine for me too
[19:47:45] zh-hans
[19:48:18] Even with https://zh.wikipedia.org/w/index.php?title=YouTube&action=history&uselang=zh-hans
[19:48:21] cur prev 18:56, 14 January 2020 (Tue)‎ 110.54.244.222 talk‎ 75,289 bytes +52‎ undo
[19:48:22] cur prev 18:23, 14 January 2020 (Tue)‎ 110.54.244.222 talk‎ 75,237 bytes +12‎ undo
[19:48:29] I see links in brackets, which aren't there in your screenshot
[19:48:49] yes, no links on mine
[19:49:08] Any idea if it's just you? Or are other users seeing it too?
[19:49:42] I will check.
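The API calls traded above fit together as one small helper. A sketch assuming only the standard `list=globalallusers` query module from the URLs in the log; the function name and the continuation handling are my own additions.

```python
from urllib.parse import urlencode

API = "https://meta.wikimedia.org/w/api.php"

def global_sysops_url(continue_from=None):
    """URL for one batch of accounts holding the global-sysop flag.

    agulimit=max asks for as many accounts per request as the API
    allows; if the JSON response contains a 'continue' block, pass
    its continuation value back in as `continue_from` (assumed here
    to map to the agufrom parameter) to fetch the next batch.
    """
    params = {
        "action": "query",
        "list": "globalallusers",
        "agugroup": "global-sysop",
        "agulimit": "max",
        "format": "json",
    }
    if continue_from:
        params["agufrom"] = continue_from
    return API + "?" + urlencode(params)
```

Fetching the returned URL and reading `query.globalallusers` from the JSON gives the account names directly, which is what the asker was after.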
I was patrolling the RC and noticed this user
[19:49:42] https://zh.wikipedia.org/wiki/Special:Contributions/110.54.244.222
[19:50:01] it has the same symptom as the history page
[19:50:09] Actually...
[19:50:16] Yeah, the Contributions pages are pretty sparse
[19:50:28] Something isn't right, for sure
[19:51:28] mys_721tx: Do you want to file a task? :)
[19:52:41] Will do, which board should I file it under?
[19:53:20] MediaWiki-Special-Pages and MediaWiki-Page-History I guess
[19:53:33] We'll get it tagged as blocking the deploy, as something is definitely not right there :)
[19:54:09] Hang on... mys_721tx: When did this start?
[19:54:12] zhwiki is still on .14
[19:54:33] I only noticed it in the last 5 min
[20:00:31] Reedy: Looks like that's my problem
[20:00:41] Disabled the adblocker and they come back
[20:01:02] That's... Not good
[20:01:13] It's not clear to me how one is supposed to access Wiktionary in a structured way
[20:01:31] and it's nigh on impossible to search for using a search engine, since all you get is definitions of the other words in your query, lol.
[20:02:29] mys_721tx: Which adblocker?
[20:02:42] 1blocker on macOS
[20:03:13] Special:Contributions still looks wrong to me
[20:03:37] But maybe it's just the lack of edit summaries on those pages
[20:04:27] microcolonel: Wikidata? ;P
[20:07:40] Reedy: that would be helpful if it were helpful
[20:07:43] lol
[20:07:53] Structured data is still kinda new around here
[20:08:02] I mean, we've not been getting it on Commons that long etc
[20:30:42] Hey guys, the hamsters at WMFlabs are out of food
[20:31:04] That's nice
[20:31:15] flickr2commons, OAbot, and reFill are all failing
[20:31:25] #wikimedia-cloud
[20:31:30] There was some maintenance earlier
[20:31:35] also Earwig, but I haven't tried that recently
[20:31:51] how long will the maintenance be?
[20:32:08] It was finished hours ago
[20:32:13] But it's possible there's some fallout
[20:32:20] #wikimedia-cloud is the correct place to mention it
[20:32:57] ok, trying a third channel to report this then....
[20:35:35] that channel is pretty quiet
[21:03:09] it's louder when it's work hours
[21:14:14] Anyone around to help? Having a performance issue
[21:15:21] ?
[21:16:13] Reedy: Getting a fatal error while trying to view https://meta.wikimedia.org/wiki/Special:Undelete/Meta:Sandbox due to it having 37,000 revisions
[21:16:28] lol
[21:16:58] yup..
[21:17:29] The problem is that I had to do a history clear, which we have done before, but it won't load
[21:17:40] So now I am unable to restore previous revisions
[21:21:39] Reedy: Ideas?
[21:22:05] Why do you need to undelete it?
[21:22:16] Might be easier just to create it from scratch
[21:23:08] I thought about not undeleting it; I think we did that once in 2015, but I notice that enwiki's sandbox, which is used more, has its original revisions going back to 2002..
[21:23:13] Feel kinda bad if I
[21:23:16] 'm ruining its history
[21:23:45] Why did you delete it then? :P
[21:24:16] 'cause I expected to be able to undelete it >.<
[21:24:22] I needed about 20 revisions gone
[21:24:40] classic history clean by any standard, until I hit the fatal error
[21:26:37] Wouldn't revdel have been better?
[21:28:31] Usually if there are that many revisions needing revdelete, it's better to history clear than have half a page of struck edits
[21:28:49] Knowing you can break the sandbox, the answer to that question is now: yes!
[21:28:59] oh well, it's a sandbox..
[22:31:09] Reedy: Sorry, I was in a lab meeting. I'll contact the 1blocker dev and see if their rules have any problem
[22:59:59] hi, is there a way in the abusefilter, when a page in ns 0 gets edited, to get the corresponding ns 1 page id?
[23:00:00] or the reverse?
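The log ends on that open question. As far as I know, AbuseFilter exposes the edited page's namespace and title but no variable for the associated talk page's id, so the id would have to come from an API lookup. The namespace mapping itself is simple, though: MediaWiki subject namespaces are even and each talk namespace is the subject namespace plus one. A sketch of that rule (the helper name is mine):

```python
def associated_namespace(ns):
    """Map a subject namespace to its talk namespace and vice versa.

    MediaWiki convention: subject namespaces are even and each talk
    namespace is subject + 1, so ns 0 (main) <-> ns 1 (Talk).
    Virtual namespaces (negative ids) have no talk pages.
    """
    if ns < 0:
        raise ValueError("virtual namespaces have no associated talk namespace")
    return ns + 1 if ns % 2 == 0 else ns - 1
```

Given the edited title plus this mapping, a bot or tool outside the filter could then resolve the talk page's id with an ordinary `action=query` titles lookup.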