[00:00:27] Well, this is part of serializing to the API format (typically JSON).
[00:01:02] yeah, the API formatter serializes to JSON
[00:01:14] it serializes the array you pass to it
[00:01:28] if you serialized the array before it got it, it would be serialized twice, wouldn't it?
[00:02:28] TimStarling, ultimately https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FFlow.git/a36502ae5228df254aabed4b7ec6c097c32074cc/includes%2FFormatter%2FTopicListFormatter.php#L71 .
[00:02:38] TimStarling, yeah, you're right, it's not literally serializing to a string.
[00:02:54] To be fair, it is called RevisionFormatter. I guess the writer hedged a bit. :)
[00:03:30] RevisionFormatter::formatApi()
[00:04:33] public function __construct( UrlGenerator $urlGenerator, RevisionFormatter $serializer ) {
[00:04:44] funny
[00:05:11] Yeah. :) Fair point.
[00:05:55] so TopicListFormatter::formatApi() is called with an array of revisions, $found
[00:06:16] typically how many revisions? 10? 50?
[00:08:05] I think the max number of topics is 100, then it will find posts (and their author links) within those topics.
[00:08:42] There would tend to be repeats as people reply to each other, post on the same board multiple times, etc., which the caching should help somewhat with.
[00:08:54] I think you should add a method to RevisionFormatter which does batch queries on the complete revision list
[00:09:21] mmm, you already have a thing called UserNameBatch
[00:11:00] Yeah, that does something different, though.
[00:11:07] wiki+userid => username.
[00:11:29] Whereas here, we want username => 'Should it be blue?' (or 'new' or whatever).
[00:11:54] If we make a new hook for this (batching isKnown checks) we could call it in Flow.
[00:11:59] I'm just wondering if you can implement this global user page batch feature by analogy
[00:12:46] well, LinkHolderArray is not designed to be used outside the parser
[00:12:54] you could potentially factor out the relevant part of it
[00:13:10] provide some usable interfaces
[00:13:30] TimStarling, if it started using a hook for this batching and GlobalUserPage implemented it, we could potentially call the same hook.
[00:14:19] I guess, but it sounds inelegant
[00:14:42] don't you think it would be better to have a batched equivalent of Title::isKnown() which is hookable and works for both the parser and external callers?
[00:15:10] maybe too big a project
[00:15:35] That was basically what I was suggesting. I meant, "call the same hook" that LinkHolderArray did.
[00:16:15] I mean split LinkHolderArray::replaceInternal
[00:17:11] maybe everything down to line 380, including doVariants(), would go into a new class
[00:17:53] LinkHolderArray, being a parser class, would do the part labelled "Construct search and replace arrays" based on the return value from the new class
[00:18:18] TimStarling, yeah, basically separate the special parser stuff from the part that would go into the new class.
[00:18:23] I see what you're saying.
[00:19:11] you know we have LinkBatch already
[00:19:46] I started working on a hook in LinkHolderArray where Tim suggested, so that GUP can batch the color stuff
[00:20:58] mmm, LinkBatch has even acquired some extra logic since I last looked at it
[00:21:17] gender queries
[00:21:59] when I introduced it, it was simply page existence, not link colouring
[00:23:14] It looks like it still is, except that it also caches gender information.
[00:23:36] But there could be a subclass or something that worked with known/unknown instead of exists/doesn't exist and allowed hooks.
[00:24:04] yeah, could do
[00:24:09] subclass or composition
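(The idea floated above — a LinkBatch variant that answers "known or unknown?" for a whole set of titles in one go, and lets extensions like GlobalUserPage weigh in via a single batched hook — might look roughly like this. A minimal sketch: LinkBatch, Title, and Hooks are real MediaWiki core classes, but the subclass name, the isKnownBatch() method, and the 'TitleIsKnownBatch' hook are hypothetical, for illustration only.)

```php
<?php
// A minimal sketch, not an existing MediaWiki API. LinkBatch, Title and
// Hooks are real core classes (as of 1.25); the subclass, method and hook
// names here are hypothetical.
class KnownLinkBatch extends LinkBatch {
	/**
	 * Resolve every queued title to known/unknown in one pass, letting
	 * extensions (e.g. GlobalUserPage) upgrade titles to "known" via a
	 * single batched hook call instead of one hook call per title.
	 *
	 * @return bool[] Map of prefixed DB key => whether the title is known
	 */
	public function isKnownBatch() {
		// One query for page existence; results land in the LinkCache.
		$this->execute();

		$known = array();
		foreach ( $this->data as $ns => $dbkeys ) {
			foreach ( $dbkeys as $dbkey => $unused ) {
				$title = Title::makeTitle( $ns, $dbkey );
				// Cheap now: existence was cached by execute() above.
				$known[$title->getPrefixedDBkey()] = $title->isKnown();
			}
		}

		// Hypothetical hook: handlers may flip false => true for titles
		// they can render, e.g. user pages that exist on a central wiki.
		Hooks::run( 'TitleIsKnownBatch', array( &$known ) );

		return $known;
	}
}
```

(Flow's RevisionFormatter could then queue every link from the revision list with add()/addObj() and make a single isKnownBatch() call per request, and the parser-side code split out of LinkHolderArray could share the same hook.)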
[00:47:38] I'll summarize at the bug. Feel free to post there as well.
[02:04:26] Hmm, I keep getting error 503 when attempting to assign rights from Special:UserRights
[02:13:15] A shell user can take a look at the error log for you.
[02:13:31] QueenOfFrance: If it's happening, file a Phabricator task.
[02:13:42] Err, happening consistently.
[02:13:52] I guess even if it's intermittent, it shouldn't be doing that.
[02:13:56] Though isn't 503 capacity?
[02:15:17] I seem to be doing fine on a local wiki (i.e. testwiki on testwiki)
[02:15:37] Fiona: another steward is testing it now
[02:15:50] Cool.
[02:16:49] > 503 Service Unavailable The server is currently unavailable (because it is overloaded or down for maintenance). Generally, this is a temporary state.
[02:16:56] Mhm
[02:17:05] So, yeah, capacity. Makes me think it might be an operations issue?
[02:17:09] But it's been going on for 18 minutes now, so I figured I should probably tell somebody
[02:17:17] Oh, for sure.
[02:17:18] Makes sense
[02:17:29] And if it keeps happening, file a task and we can track it down.
[02:17:48] I'm just not sure whether it's a PHP exception or an overloaded/busted server.
[02:17:51] Yeah, another steward tried giving me the rights and they had no luck either
[02:18:07] I wonder if I can use the form...
[02:18:29] Well, you can, but not for remote wikis, right? Which is what the problem seems to be about
[02:18:49] If it's interwiki, it's more likely an application problem, not a server problem.
[02:19:07] Tim might be around in a bit to take a look at logs.
[02:20:20] Or Reedy, Lego, or Matt, maybe.
[02:20:35] Hi
[02:20:56] legoktm: QueenOfFrance is getting a 503 on Meta-Wiki's Special:UserRights for interwiki rights changes.
[02:20:56] It's a problem with cp1066?
[02:21:01] what's that?
[02:21:09] I doubt a caching proxy is doing this, but maybe!
[02:21:23] tim broke something earlier O_O
[02:21:27] Request: POST http://meta.wikimedia.org/wiki/Special:GlobalUserRights, from 10.20.0.138 via cp1054 cp1054 ([10.64.32.106]:3128), Varnish XID 850297480
[02:21:27] Forwarded for: 95.73.201.180, 91.198.174.68, 91.198.174.68, 10.20.0.138
[02:21:27] Error: 503, Service Unavailable at Tue, 03 Feb 2015 20:19:21 GMT
[02:21:37] QueenOfFrance: UserRights or GlobalUserRights?
[02:21:42] UserRights
[02:21:44] What's rr0?
[02:21:54] 2 stewards besides me tried
[02:21:56] Fiona: a person, read up in your scrollback
[02:22:06] But it's so long!
[02:22:10] And I've tried using UserRights on meta with a meta user instead of interwiki, and it worked fine
[02:22:50] legoktm: do you want me to file a phab report?
[02:23:08] Feb 4 02:19:36 mw1247: #012Fatal error: Call to undefined method UserRightsProxy::equals() in /srv/mediawiki/php-1.25wmf15/includes/specials/SpecialUserrights.php on line 231
[02:23:08] this needs a quick fix, surely?
[02:23:10] QueenOfFrance: yes please
[02:23:18] legoktm: k, on its way in a minute
[02:25:03] this would also explain why Special:GlobalUserRights is broken, it uses the same code
[02:25:36] I'm not sure I knew we had Special:GlobalUserRights.
[02:25:38] legoktm: should it go under CentralAuth?
[02:25:42] Doesn't matter.
[02:25:50] just somewhere
[02:26:28] https://phabricator.wikimedia.org/T88505 created
[02:26:42] Bsadowski1, I couldn't find you in the list of users or I would have cc'ed you, sorry
[02:28:07] I'm not registered on there
[02:29:13] Now I am
[02:29:15] :D
[02:33:04] I uploaded a patch and a backport, but I have to go afk in 10 minutes, so I don't think it would be a good idea for me to deploy it right now.
[02:33:59] TimStarling: around?
[02:34:06] yes
[02:34:51] TimStarling: do you think you could review and deploy my patch to unbreak interwiki userrights? https://gerrit.wikimedia.org/r/188504 is the patch and https://phabricator.wikimedia.org/T88505 is the bug
[02:38:31] doing it
[02:38:57] thanks :). QueenOfFrance, Fiona ^
[02:41:39] Sweet. :-)
[02:49:03] done
[02:49:12] \o/
[02:49:17] test it, QueenOfFrance
[02:56:11] * TimStarling discovers rdiffdir
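(For the record: the fatal above means SpecialUserrights.php started calling equals() on user objects, and UserRightsProxy — MediaWiki's stand-in for a user on a remote wiki — never defined that method, unlike User. The actual fix is the Gerrit change linked above, whose exact content isn't reproduced here; what follows is only a minimal sketch of the shape such a method could take, assuming comparison by user name is the right semantics, mirroring User::equals() in core.)

```php
<?php
// A minimal sketch of the kind of fix, not the actual content of Gerrit
// change 188504. Assumes comparing by name is sufficient, mirroring
// User::equals() in MediaWiki core.
class UserRightsProxy {
	// ... existing factory methods, getName(), etc. ...

	/**
	 * True if this proxy refers to the same user as $user, so code in
	 * SpecialUserrights.php can call equals() without caring whether it
	 * holds a local User or a remote-wiki proxy.
	 *
	 * @param User|UserRightsProxy $user
	 * @return bool
	 */
	public function equals( $user ) {
		return $this->getName() === $user->getName();
	}
}
```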
[07:19:36] WikiTech doesn't have unified login yet?
[07:30:47] No
[07:30:53] and it's not in the near-term plans, AFAIK
[07:34:29] By the way, {{urlencode:}} works fine in my template, thank you.
[07:43:02] Reedy: Can I request a username change on WikiTech?
[07:43:42] wiki username? not shell username?
[07:44:06] wiki
[07:50:14] https://wikitech.wikimedia.org/wiki/Special:SpecialPages doesn't seem to have a rename page
[08:32:20] DrSkyLizard: Special:Version lists it
[08:33:37] Reedy: yes, but there is no way to request a renaming
[08:33:55] Probably because it's not done much
[08:34:00] if at all
[08:34:26] Might be easiest to just open a ticket in Phabricator requesting it
[08:34:35] You can log in with your wikitech or your unified account there
[08:35:34] the icon for login in Phabricator looks like logout to me
[08:35:39] lol
[08:37:43] I think the 'enter door' (arrow pointing inwards) means login and the 'exit door' (arrow pointing outwards) is logout
[08:38:45] Username or password are incorrect.
[08:39:02] when using my 'unified' login credentials
[08:39:36] Did you click the correct login button?
[08:39:38] i.e. to use mw.org?
[08:41:52] ah, didn't see that one
[08:42:14] too much multitasking :-)
[16:26:55] there is really a problem with the wikidata dumps, isn't there? http://dumps.wikimedia.org/wikidatawiki/20150113/ - the status hasn't changed in a week
[16:27:20] Guest97095: XML dump creation is stalled right now
[16:27:30] For all of Wikimedia
[16:27:33] oh
[16:27:36] why?
[16:27:36] our json dumps are more up to date, though
[16:27:49] Because stuff broke after a server reboot in weird ways
[16:28:06] there are json dumps?
[16:28:08] and there are more problems, so not the highest priority
[16:28:22] Our json dumps are unaffected by that
[16:28:39] where can I find the json dumps?
[16:29:44] https://dumps.wikimedia.org/other/wikidata/
[16:30:55] how do I get to this page from https://dumps.wikimedia.org ?
[16:31:01] is it linked anywhere?
[16:31:17] It is linked from Wikidata, but not from anywhere on dumps.wikimedia.org
[16:31:32] are there json dumps for wikipedia too?
[16:31:34] That's because dumps.wikimedia.org is stuck in 2005 and really needs some love
[16:31:42] no, those are Wikidata-specific
[16:31:46] ok thank you
[16:31:48] they only contain entities, no wikitext
[16:31:53] oh
[16:33:33] so what's missing if there's no wikitext?
[16:34:22] The json dumps are only supposed to cover entities
[16:34:31] so there's nothing really missing there, it's just out of scope
[16:37:20] unfortunately it would require a lot of work to support the json instead of the xml for my project. I will just hope that the xml gets fixed
[16:37:48] it will
[16:38:02] yeah
[16:38:15] ops were working on it today
[16:38:19] but you should consider switching anyway, the json dumps are weekly
[16:38:32] and generally better suited if you only want the entity data
[16:39:29] but directly handling a json file of multiple gigabytes will be pretty hard, no?
[16:39:41] hahaha, of course
[16:39:45] but you wouldn't do that
[16:39:52] like you also wouldn't read the whole xml at once
[16:39:54] I hope
[16:40:06] Just read it line by line... the json has one entity per line
[16:40:10] i import it to a database
[16:40:12] the xml
[16:43:50] yeah, I could also convert everything to the database
[16:46:39] does it also contain the "wb_items_per_site" information?
[16:47:00] It has the whole entity json, which contains that information, yes
[16:47:06] but not the table as-is
[16:49:29] thanks
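(Since the JSON dump has one entity per line, as noted above, it can be streamed without ever holding the multi-gigabyte file in memory. A minimal sketch in PHP, assuming the dump has been decompressed to a local file; the file name and the handleEntity() callback are placeholders. The dump is one big JSON array, so the first and last lines are the enclosing brackets and each entity line ends with a comma:)

```php
<?php
// A minimal sketch: stream a Wikidata JSON dump one entity at a time.
// Assumes a decompressed dump (for the .gz file, gzopen()/gzgets() work
// the same way); the file name and handleEntity() are placeholders.
$fh = fopen( 'wikidata-all.json', 'r' );
if ( !$fh ) {
	die( "Could not open the dump file\n" );
}

while ( ( $line = fgets( $fh ) ) !== false ) {
	// Each entity line ends with ",\n"; the first and last lines are
	// the enclosing "[" and "]" of the JSON array.
	$line = rtrim( $line, ",\r\n " );
	if ( $line === '[' || $line === ']' || $line === '' ) {
		continue;
	}

	$entity = json_decode( $line, true );
	if ( $entity === null ) {
		continue; // malformed line; real code should log this
	}

	// The 'sitelinks' member carries the same data as the
	// wb_items_per_site table asked about above.
	$sitelinks = isset( $entity['sitelinks'] ) ? $entity['sitelinks'] : array();
	handleEntity( $entity['id'], $sitelinks );
}
fclose( $fh );
```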
[20:36:25] Hi, I'm trying to join the VE triage meeting, can someone help me with that?
[20:36:33] James_F: ^
[20:37:12] WikiGnom1: Hey. We just finished the test meeting, sorry.
[20:37:42] James_F: Haha, okay, sorry then.
[20:37:55] WikiGnom1: Next week we'll have the "real" meeting.
[20:40:19] James_F: Well, then please explain how it works, because otherwise I'll fail to join next week as well.
[20:40:53] WikiGnom1: Elitre will have the exact instructions then.
[20:41:57] James_F: Sorry if I sound a bit grumpy, but she had sent me some instructions and I was unable to join you following them.
[20:42:19] WikiGnom1: Yeah, that's why we had the test meeting, to work out how to get it working. Sorry. :-(
[21:45:06] Hello, I need some help
[21:45:20] I will probably soon be an administrator at a wiki
[21:45:39] and we have a problem with Unicode for a language that displays in a horrible font on most browsers.
[21:45:48] Can you manually change the font displayed for all browsers on the wikipedia?
[21:46:38] You can, but you really shouldn't.
[21:46:52] Also, loading fonts from 3rd-party sources (e.g. Google) is a no-go
[21:47:26] hoo: You say that I shouldn't, but I'm really serious that this language, Gothic, for some reason gets a horrible pre-installed font in most browsers
[21:47:31] no matter if you use Linux, Windows, etc.
[21:47:37] I could use my own webserver, which is quite stable
[21:52:46] Gothicspeakerr, if there is a free font for your alphabet, UniversalLanguageSelector might help
[21:53:09] MaxSem: What is that?
[21:53:13] also, are you really a speaker? I thought Gothic was a dead language :P
[21:53:35] MaxSem: Well, I write news articles in Gothic and I have studied several Gothic grammars in my free time and do translations in it
[21:53:45] MaxSem: So yes, I speak it to a certain extent.
[21:53:56] MaxSem: I just don't have many people to speak it with
[21:53:59] https://www.mediawiki.org/wiki/Extension:UniversalLanguageSelector - basically, it can load web fonts
[21:54:28] so why write an encyclopedia in a language that's not used in practice?
[21:55:28] Well, I didn't start it, but I got fascinated with the language and wanted to increase its use
[21:55:36] so I have taken basic steps which are used in language revival too
[21:55:50] I first started with a news website, Himma Daga, which gets visitors every day
[21:55:55] the second step is publishing books
[21:56:05] So I translated a book into Gothic with a linguist; it will be published and for sale
[21:56:24] Thirdly, most languages have a Wikipedia; the Gothic one was created about 10 years ago, when new languages were accepted easily
[21:56:27] the problem is, it is f
[21:56:41] f*cked up by a lot of people, as they haven't studied Gothic grammars like me and haven't read the ancient writings in it
[21:56:45] so I need to do a lot of fix-up work now
[21:56:50] And with a lot I mean a load of.
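(For context on the "you can, but you really shouldn't" answer above: the do-it-yourself route hoo warns about boils down to an @font-face rule in the wiki's MediaWiki:Common.css pointing at a freely licensed font served from the wiki's own infrastructure — roughly what ULS automates for you. A minimal sketch in CSS; the font name and URL are placeholders:)

```css
/* A minimal sketch for MediaWiki:Common.css. The font name and URL are
   placeholders; the font must be freely licensed and hosted locally,
   never pulled from a third party such as Google. */
@font-face {
	font-family: 'FreeGothicExample';
	src: url('/fonts/FreeGothicExample.woff') format('woff');
	unicode-range: U+10330-1034F; /* the Gothic Unicode block */
}

/* Try the web font before the browser's (often ugly) built-in fallback. */
body {
	font-family: 'FreeGothicExample', sans-serif;
}
```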
[22:07:52] Hey Gothicspeakerr, just FYI
[22:07:59] The translatewiki stuff might be working
[22:08:09] But I think local admins have done a lot in the MediaWiki: namespace
[22:08:34] What do you mean by that?
[22:08:36] Gothicspeakerr: https://got.wikipedia.org/wiki/Special:PrefixIndex/MediaWiki: will show you what they've added locally - it may be overriding some of the messages you're trying to fix
[22:08:55] Gothicspeakerr: We gave the users a terrible, terrible power. They're allowed to screw with system messages locally.
[22:09:15] marktraceur: Well, the advantage is that I as a user can fix all these mistakes now
[22:09:28] marktraceur: Because if I needed admin status, I would have to wait until Sunday for that
[22:09:35] Well
[22:09:40] Gothicspeakerr: Actually it's admin-only
[22:09:41] But still!
[22:09:49] Yes, I see
[22:09:54] Gothicspeakerr: Now I guess you know where to go to fix it
[22:09:56] well, at least I can already write down what to change
[22:10:04] Mmhmm.
[22:10:35] Actually, it would be better if everyone could change this
[22:10:43] because I'm looking at all these system messages with bad grammar now
[22:11:44] Unfortunately, editing system messages can have many more side effects
[22:11:46] Hence locking it down
[22:11:54] Right.
[22:12:09] That's what translatewiki is for
[22:12:10] I get a headache of this bad grammar
[22:12:21] Irony, thy name is...
[22:12:32] Gothicspeakerr: So anyway, yes, it sounds like you're on your way, which is great
[22:13:08] marktraceur: Yes, I got 3 people who voted for me to be admin, and I submitted a request at the admin page
[22:13:34] I see that :)