[00:05:33] Dereckson: awesome, thank you :DD
[10:19:43] Hi, I'm searching for the Wikipedia MySQL statistics (queries, threads and users); this is to calculate the scalability of the site :)
[10:25:46] Nux_: https://www.mediawiki.org/w/index.php?title=File%3AMySQL_at_Wikipedia.pdf has some stats
[10:26:26] You could also try asking in #wikimedia-databases, because that's where our DBA is
[10:29:54] Glaisher: OK, I will ask my question to the databases staff, thanks for your help :)
[16:39:18] quiddity: Do you know which channel CX is in?
[16:39:49] oh, nvm :D
[16:39:57] Josve05a, I believe #mediawiki-i18n
[16:40:26] yeah, found it in my archives :D
[19:02:35] mobrovac: hi! are you around?
[21:02:00] RFC meeting starting now in #wikimedia-office: Overhaul Interwiki map, unify with Sites and WikiMap
[22:08:39] SMalyshev: I can't talk long because I've got to look after kids
[22:08:49] ok, thanks
[22:08:57] but one issue is that there are two layers of configuration
[22:09:15] so basically what I wonder is what the diff is between what we have in wg* configs and DB/wiki configs
[22:10:32] there are two layers because MW core has a concept of the "current" wiki
[22:10:41] and $wg* reflects the configuration of that wiki
[22:11:26] and then in WMF configuration we have this massive system which arranges for the current wiki to be determined and for all the $wg* settings to be set correctly
[22:12:27] the interactions between the WMF configuration system and the core are a bit weird and incomplete
[22:12:47] fyi: minutes of last office hour posted here: https://phabricator.wikimedia.org/E171#2016
[22:12:55] TimStarling: and SiteMatrix somehow gets things like wgLanguageCode for each wiki from the WMF configuration, which populateSitesTable.php then stuffs into the DB?
[22:13:15] right. So my question is - there's a lot of data in those files (which eventually become wg* values for the current wiki)
[22:13:27] and this data is then partially duplicated in other places, right?
[22:14:21] using wgConf IIRC?
[22:14:33] there is some duplication between InitialiseSettings.php (i.e. $wgConf) and dumpInterwiki.php
[22:14:40] SMalyshev: i think we could dedup it, by leaving it out of the pre-deploy hand-maintained interwiki.json and adding them in when the cache is built
[22:14:57] as I see it, dumpInterwiki hardcodes a whole lot of stuff
[22:15:28] yeah, and some of that would find a new home if we get rid of dumpInterwiki
[22:16:11] Daniel is saying that we should just run dumpInterwiki one last time and commit its output to gerrit (essentially)
[22:16:25] so I'm not even sure why there should be something like getSites() - MW* scripts know which sites there are, right? So we could use that info to compose it, no?
[22:16:27] and so then that output would be canonical
[22:17:01] TimStarling: well, but what if I run my own wiki install with different wikis? Would I have to edit it manually?
[22:18:04] if you have your own install with different wikis, then you don't have the WMF configuration system, you only have the bits and pieces that have found their way into core
[22:18:17] or you can use a third-party extension which does a similar thing
[22:18:30] TimStarling: e.g. the vagrant install has pretty much the same things, no?
[22:18:41] probably
[22:18:54] mw-vagrant uses a cut-down version of MultiWiki
[22:18:58] I mean, of course, if I have an install that is managed by the /var/www/MW* scripts
[22:19:39] right. So MultiWiki knows a whole lot about which wikis there are. And it looks like dumpInterwiki doesn't use it at all, and most other places don't either
[22:19:44] I don't think anyone other than WMF and mw-vagrant uses the crazy multiwiki scripts
[22:19:57] bd808: cut down, as in fork?
[22:20:05] yes
[22:20:10] *sigh*
[22:20:24] by MultiWiki do you mean multiversion?
[22:20:30] y
[22:20:41] jzerebecki: I wasn't about to refactor prod config just to get a shim library for running a wikifarm
[22:20:51] yes, multiversion
[22:20:56] multiversion is really a third layer
[22:21:21] got to go
[22:21:49] thx
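A minimal sketch of the two configuration layers TimStarling describes above, loosely modeled on InitialiseSettings.php. SiteConfiguration and its get() and extractAllGlobals() methods are real core APIs; the wiki names and values below are illustrative, not WMF's actual configuration:

    // Farm-level layer: per-wiki settings keyed by dbname or by suffix.
    $wgConf = new SiteConfiguration;
    $wgConf->wikis    = [ 'enwiki', 'ruwiki', 'ruwikibooks' ];
    $wgConf->suffixes = [ 'wikibooks', 'wiki' ];
    $wgConf->settings = [
        'wgLanguageCode' => [
            'default'     => 'en',
            'ruwiki'      => 'ru',
            'ruwikibooks' => 'ru',
        ],
        'wgSitename' => [
            'wiki'      => 'Wikipedia',   // selected via the dbname suffix
            'wikibooks' => 'Wikibooks',
        ],
    ];

    // Ask about any wiki in the farm without "being" that wiki:
    $lang = $wgConf->get( 'wgLanguageCode', 'ruwiki', 'wiki' ); // 'ru'

    // "Current wiki" layer: explode the settings into $wg* globals, which is
    // roughly what WMF's CommonSettings.php arranges on every request.
    $wgConf->extractAllGlobals( 'ruwiki', 'wiki' );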
[22:21:57] I remember when I needed a map from something like ru.wikipedia.org to a DB name (ruwiki), I was forced to use my own config...
[22:23:03] SMalyshev: https://www.wikidata.org/w/api.php?action=sitematrix and I think there is another one
[22:23:27] so this is the part I really dislike: https://github.com/wikimedia/mediawiki-extensions-CirrusSearch/blob/master/CirrusSearch.php#L836
[22:24:23] that should be in sitematrix, I think.
[22:24:25] ru does not necessarily equal ruwiki
[22:25:01] it does if you are on a wikipedia
[22:25:08] which isn't guaranteed
[22:25:22] I remember something didn't work there with Sitematrix...
[22:26:38] probably because "ru" was an interwiki prefix and sitematrix doesn't understand that? And also I didn't need the whole matrix...
[22:28:09] yeah, it seems like sitematrix does not know about interwikis at all
[22:28:51] sitematrix is a list of the wikis in the farm
[22:28:52] right, so that's my issue: these systems seem to be disconnected
[22:29:14] yup. they totally are
[22:29:19] another horrible thing is SiteConfiguration::getConfig - it actually uses the shell to get a config
[22:29:50] SMalyshev: so you first need to look at https://ru.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=interwikimap&sifilteriw=local (the domain is important), then look in sitematrix
[22:30:03] the idea of a family of wikis is bolted onto the side of MediaWiki
[22:30:41] and not with very strong bolts ;)
[22:30:49] so I wonder if it's possible to reconnect these systems so that there would be one API that lets you get the list of wikis we know about and all the info about them - like interwiki prefix, DB prefix, name, etc.
[22:31:19] interwiki prefixes would vary depending on which wiki you're looking at them from
[22:31:49] Krenair: do we really have different prefixes on different wikis, or is it just a theoretical possibility?
[22:31:57] SMalyshev: yes we have :(
[22:31:58] yes, really
[22:32:14] as you saw, ru: on wikipedias will take you to a wikipedia
[22:32:20] it shouldn't do that outside wikipedia
[22:32:27] ugh, that sucks
[22:32:33] if you use ru: on wikibooks it'll take you to ruwikibooks
[22:32:42] What? How does that suck? It makes perfect sense
[22:33:26] do we set up the project-scoped aliases everywhere too? Like, does w:ru: work on enwiki?
[22:33:48] well, maybe it doesn't suck if we have a good API to switch between languages and prefixes and wikis
[22:33:58] https://en.wikipedia.org/wiki/w:ru: -> https://ru.wikipedia.org/wiki/%D0%97%D0%B0%D0%B3%D0%BB%D0%B0%D0%B2%D0%BD%D0%B0%D1%8F_%D1%81%D1%82%D1%80%D0%B0%D0%BD%D0%B8%D1%86%D0%B0
[22:34:13] Krenair: it seems sucking and making perfect sense are not a contradiction.
[22:34:21] but I couldn't find such an API
[22:34:56] SMalyshev: you and wikidata are both cobbling it together as you go
[22:35:12] so you should probably get together on a common solution
[22:35:49] on wikibooks, w:ru: will also take you to ruwiki
[22:36:39] so where is this information stored? And is there an API that does wikiDbName('w:ru') == 'ruwiki'?
[22:36:45] https://en.wikibooks.org/wiki/w:ru: -> https://en.wikipedia.org/wiki/ru: -> https://ru.wikipedia.org/wiki/%D0%97%D0%B0%D0%B3%D0%BB%D0%B0%D0%B2%D0%BD%D0%B0%D1%8F_%D1%81%D1%82%D1%80%D0%B0%D0%BD%D0%B8%D1%86%D0%B0
[22:37:15] interwiki.php, automatically generated by WikimediaMaintenance's dumpInterwiki.php
[22:37:32] Krenair: but dumpInterwiki is a bunch of hardcoded stuff, as I see...
[22:37:37] yes
[22:38:13] so is it all hardcoded in the code of dumpInterwiki?
[22:38:21] Using just a language prefix is consistently ambiguous. --- Using a project prefix + language prefix is consistent and predictable.
[22:38:25] Note, it always follows the string to the end, e.g. [[:ru:wikt:es:v:fr:de:b:]] will take you to the German Wikibooks. See links at https://en.wikisource.org/wiki/User:Quiddity_%28WMF%29/sandbox
[22:38:44] don't think there's an API that has calls quite like that, but there is this: https://meta.wikimedia.org/w/api.php?action=query&meta=siteinfo&siprop=interwikimap
[22:39:00] lol
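A rough illustration of Krenair's two-step recipe (read the starting wiki's interwikimap, then match in the sitematrix). Nothing named wikiDbName() exists in core; this hypothetical helper resolves a single prefix only, not chains like w:ru:, and uses the standard api.php responses:

    // Hypothetical helper, not a core API.
    function wikiDbName( $prefix, $fromApi ) {
        // Step 1: the interwiki map differs per wiki, so the domain matters.
        $data = json_decode( file_get_contents(
            $fromApi . '?action=query&meta=siteinfo&siprop=interwikimap&format=json'
        ), true );
        $targetUrl = null;
        foreach ( $data['query']['interwikimap'] as $row ) {
            if ( $row['prefix'] === $prefix ) {
                $targetUrl = $row['url']; // e.g. "https://ru.wikipedia.org/wiki/$1"
                break;
            }
        }
        if ( $targetUrl === null ) {
            return null;
        }
        $targetHost = parse_url( $targetUrl, PHP_URL_HOST );

        // Step 2: find the dbname for that host in the sitematrix.
        $matrix = json_decode( file_get_contents(
            'https://meta.wikimedia.org/w/api.php?action=sitematrix&format=json'
        ), true );
        foreach ( $matrix['sitematrix'] as $group ) {
            if ( !is_array( $group ) ) {
                continue; // skip the 'count' entry
            }
            // Language groups keep wikis under 'site'; 'specials' is a flat list.
            $sites = isset( $group['site'] ) ? $group['site'] : $group;
            foreach ( $sites as $site ) {
                if ( is_array( $site )
                    && parse_url( $site['url'], PHP_URL_HOST ) === $targetHost
                ) {
                    return $site['dbname'];
                }
            }
        }
        return null;
    }

    // From enwiki's point of view, 'ru' resolves to 'ruwiki':
    echo wikiDbName( 'ru', 'https://en.wikipedia.org/w/api.php' );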
[22:39:37] I think there should be some API that integrates all this info
[22:39:51] each wiki has a different view of the interwikis
[22:39:55] by API I mean a PHP API (a class), not api.php
[22:40:06] oh, sure, one sec
[22:40:06] Krenair: that should be abstracted inside the API then
[22:40:32] That API calls these functions:
[22:40:44] well, I mean it contains this code:
[22:40:46] $getPrefixes = Interwiki::getAllPrefixes( $local );
[22:40:46] $extraLangPrefixes = $this->getConfig()->get( 'ExtraInterlanguageLinkPrefixes' );
[22:40:46] $localInterwikis = $this->getConfig()->get( 'LocalInterwikis' );
[22:42:17] take a look at ApiQuerySiteInfo::appendInterwikiMap
[22:45:36] so this one seems to go to the database?
[22:46:05] how are the DB and interwiki.php related?
[22:49:52] SMalyshev, it uses wgInterwikiCache and getAllPrefixesCached
[22:49:54] not getAllPrefixesDB
[22:52:10] operations/mediawiki-config.git's wmf-config/CommonSettings.php contains '$wgInterwikiCache = include_once( "$wmfConfigDir/interwiki.php" );'
[22:54:38] Krenair: ah, so at Wikimedia we never use the DB, but other MediaWiki installs would use the DB instead
[22:54:48] yeah
[22:55:33] ok, that makes it a bit clearer
[22:58:36] Krenair: thanks! it's still not the API I'd like, but at least now there's a path to the data...
[22:59:51] I don't think anyone particularly likes the wgConf/interwiki/WikiMap/SiteMatrix code
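A simplified sketch of the lookup path just described; Interwiki::fetch(), Interwiki::getAllPrefixes() and the $wgInterwikiCache switch are real core entry points of the MediaWiki of this era, while the title used is arbitrary:

    // On WMF wikis, CommonSettings.php swaps the backend from the database to
    // the pregenerated array produced by dumpInterwiki.php:
    //   $wgInterwikiCache = include_once( "$wmfConfigDir/interwiki.php" );

    // Callers don't see which backend is active:
    $iw = Interwiki::fetch( 'ru' ); // an Interwiki object, or false
    if ( $iw ) {
        echo $iw->getURL( 'Main_Page' ); // e.g. https://ru.wikipedia.org/wiki/Main_Page
        var_dump( $iw->isLocal() );      // true for wikis in the same farm
    }

    // This is what ApiQuerySiteInfo::appendInterwikiMap iterates over; with
    // $wgInterwikiCache set it goes through getAllPrefixesCached(), and only
    // falls back to getAllPrefixesDB() on a stock install.
    $rows = Interwiki::getAllPrefixes( null );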
[23:09:27] Hi Michboard :D
[23:09:33] Hey there. :D
[23:10:28] So yes. MediaWiki allows you to host your own wiki (Wikipedia-like) site. However, it must be "self-hosted", either on-site or with an external hosting provider. If your company has an intranet site, it can usually be added to it
[23:10:42] Okay. So my company was asking me to make a wiki for them. It would have different links, articles, etc. For instance, we handle a lot of Dell server calls. I would have different sections in there for our techs to access for information. A single page wouldn't be enough.
[23:10:44] Ohhh, I see.
[23:10:45] Many webhosts also have "1-click scripts" which let you set up MediaWiki easily
[23:11:07] Sure. MediaWiki could be used for a knowledgebase-type system :)
[23:11:48] MediaWiki is free. Hosting can vary. If you're dealing with Dell servers, you most likely have a larger company and can perhaps host it on-site
[23:13:15] Does that kind of make sense? You can add your logo and (I THINK) incorporate single sign-on (e.g. same username/pass for email, logon, and wiki).
[23:13:46] Yeah, that does make sense. We are currently at about 60+ employees. :P
[23:13:50] Not a huge amount, but we're getting there.
[23:13:59] ah yeah. A knowledgebase makes sense with that amount of staff
[23:14:19] maybe a better channel for this is #mediawiki
[23:14:42] lol sorry, Krenair. We were just told to come here from #wikipedia-en-help. We can switch :)
[23:14:50] I don't mind much
[23:14:54] no need to apologise
[23:28:50] Is there a RoanKattouw here? I need help!
[23:29:20] It is about a Flow page mess-up on nowiki; a page is wrong
[23:29:42] and I can't move it...
[23:29:42] he just started scap
[23:29:49] should be around
[23:30:21] I guess a steward can move a Flow page too... I'll hang around a bit, thanks!
[23:30:45] 23:26 logmsgbot: catrope@tin Started scap: Updating wmf23 Echo to wmf1
[23:31:09] Yeah, I'm here-ish
[23:31:31] Between internet connections right now while the i18n rebuild part of the scap happens
[23:31:40] It seems like the bot for the beta feature moved a talk page back to itself, but then got lost
[23:32:05] It's no hurry, the user has gone to sleep!
[23:33:02] https://no.wikipedia.org/wiki/Brukerdiskusjon:3s/Flow-arkiv_1
[23:33:09] This is a Flow page
[23:37:03] It should be here: https://no.wikipedia.org/wiki/Brukerdiskusjon:3s
[23:37:03] I have moved the old page, but could not move the Flow page
[23:37:06] He asked me for help here: https://no.wikipedia.org/wiki/Topic:T3s3eqex9abtcg4n
[23:38:43] I'm not sure what happened, but it might be messy HTML on the talk page, and then the bot tried to salvage the page
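To make the knowledgebase suggestion from the 23:10-23:14 exchange concrete: a minimal LocalSettings.php fragment for running MediaWiki as a private internal wiki. These are standard core settings; the sitename is made up, and single sign-on would come from an extension rather than from anything shown here:

    $wgSitename = "Example Corp KB"; // hypothetical company wiki name

    // Staff only: anonymous users can't read, edit, or register themselves.
    $wgGroupPermissions['*']['read']          = false;
    $wgGroupPermissions['*']['edit']          = false;
    $wgGroupPermissions['*']['createaccount'] = false;

    // Single sign-on against a company directory is handled by extensions
    // (LDAP-based authentication, for example), not by core itself.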