[02:11:17] Hi. I'm a bureaucrat and administrator of a Wikimedia-based site, and I'm creating and editing some articles. I want to create some links with a replacement title (when the mouse pointer is over the link, it will show a brief description). This link is internal (another article). I tried "[[My_another_article]]", but it doesn't work.
[02:11:53] I then tried "", and this is interpreted as wikitext
[02:12:52] How can I add more attributes to a wiki internal link?
[02:13:01] (And external too, if possible)
[02:13:30] you mean mediawiki ?
[02:13:39] blah should work iirc
[02:13:48] Yes, a site created over the MediaWiki platform
[02:13:58] please see #mediawiki then
[02:15:00] i see you've found it
[02:16:44] Brooke: ping
[02:16:58] or anyone i guess: do you know a page offhand that gives a 500?
[02:17:17] so i can see the current error msg first hand ;)
[02:19:28] Sorry, my connection is unstable
[02:20:45] I tried the [[My another article]] but it shows the article title instead of my brief description
[02:25:56] diegojsRW: please stay in #mediawiki
[07:26:00] is this the correct channel for asking questions about Wikipedia and its API?
[07:30:05] no one?
[07:31:59] n4h1: It is, but most of the devs are asleep. For general API stuff, #mediawiki might be a good bet too.
[07:34:09] Matthew_ thanks :)
[07:34:20] Sure :)
[07:34:26] can I ask you? Mine are, I assume, quite simple questions
[08:03:32] anyone awake? :D
[08:04:09] No. Go to #mediawiki
[17:02:02] Hey AaronSchulz: do you know the minimum config needed for openForeignConnection to work? Does it need the whole LBFactory_Multi setup?
[17:02:31] I think it needs that, yes
[17:03:54] I have a simple test config, but it's on a computer at home
[17:05:26] you'd want to set $wgLBFactoryConf, making a simplified version of our db.php, and use $wgAllDBsAreLocalhost
[17:08:16] Yeah, I'm actually trying to get it to use 2 db servers (beta labs)
[17:09:41] csteipp: Can't you just do wfGetDB( DB_SLAVE, array(), 'dbname' ) ?
[17:11:12] RoanKattouw: the wiki's code is opening a foreign wiki, that sits on a different server in beta. And Beta doesn't currently use LBFactory_Multi, so it's erroring out
[17:12:40] who do I have to bribe to get access to #wikimedia-staff ? (I asked more than a week ago and got my IRC cloak now)
[17:14:20] andre__: I put JamesF on iy
[17:14:20] andre__: Me. :-)
[17:14:21] *it
[17:14:27] hehe
[17:17:03] andre__: Should now work.
[17:17:14] James_F, yay! thanks a lot!
[17:17:29] * James_F lives to serve.
[17:17:48] James_F: Found it. LQT hijacks section=new via the PerformAction hook, but it doesn't make any attempt to hijack section=new in the API. Instead, LQT has its own API for posting things
[17:17:58] * James_F sighs.
[17:18:02] Aye
[17:18:17] RoanKattouw: Sounds like a bit of work to fix?
[17:23:07] Yeah
[17:23:28] AaronSchulz: Do you know if sectionsByDB can have a 'default'? So, any other wiki we don't set uses this section?
[17:25:36] csteipp: yeah, don't put anything
[17:25:40] https://noc.wikimedia.org/conf/highlight.php?file=db.php
[17:25:49] a la our s3 catch-all cluster isn't in sectionsByDB
[17:29:00] the key 'DEFAULT' helps :)
[17:29:24] in addition to wikis in the default not being in sectionsByDB
[17:38:10] hey all, anyone know of any tutorials or docs that explain how to best use UploadBase?
[17:40:55] dan-nl: probably best to ask in #mediawiki
[17:41:11] Ryan_Lane: cool, thanks
[17:41:15] yw
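
For context on the $wgLBFactoryConf / sectionsByDB exchange above, here is a minimal sketch of the kind of simplified LBFactory_Multi config AaronSchulz describes. The key names follow the production db.php linked in the discussion, but the host names, IPs and section labels are placeholders, so treat this as an illustration rather than a tested setup.

// Hypothetical minimal $wgLBFactoryConf for a two-section test install.
// Wikis not listed in 'sectionsByDB' fall through to the 'DEFAULT' section,
// which is the "catch-all" behaviour discussed at 17:23-17:29 above.
$wgLBFactoryConf = array(
	'class' => 'LBFactory_Multi',
	'serverTemplate' => array(
		'dbname'   => $wgDBname,
		'user'     => $wgDBuser,
		'password' => $wgDBpassword,
		'type'     => 'mysql',
		'flags'    => DBO_DEFAULT,
	),
	'sectionsByDB' => array(
		'enwiki' => 's1', // only wikis that need a non-default section
	),
	'sectionLoads' => array(
		'DEFAULT' => array( 'db1' => 0 ),
		's1'      => array( 'db2' => 0 ),
	),
	'hostsByName' => array(
		'db1' => '10.0.0.1', // placeholder addresses
		'db2' => '10.0.0.2',
	),
);
// For a single-machine test, make every host resolve to localhost:
$wgAllDBsAreLocalhost = true;
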
[18:03:54] Reedy: how have commons and meta held up with 1.21wmf2?
[18:04:22] robla: Not seen any issues whatsoever
[18:04:51] https://bugzilla.wikimedia.org/showdependencytree.cgi?id=38865&hide_resolved=1
[18:05:20] 41178 I just haven't closed...
[18:05:49] 41174 isn't deployment related
[18:06:04] Most likely swift
[18:06:34] I merged 41155 also
[18:06:40] might be a couple of little outstanding issues
[18:06:53] "Annoying as it is, this was already the case prior to ContentHandler."
[18:06:54] Oh, no
[18:07:06] * Reedy closes that one too
[18:07:15] * aude can poke at https://bugzilla.wikimedia.org/show_bug.cgi?id=41244 if it's critical
[18:07:37] I've not seen that one...
[18:08:02] ok
[18:08:13] looks related to some extension but can still investigate it
[18:11:44] aude: yeah, thanks. there's not quite enough information on 41244 to know if we should worry about it
[18:11:56] !b 41244
[18:11:56] https://bugzilla.wikimedia.org/show_bug.cgi?id=41244
[18:12:24] robla: ok
[18:12:53] if it pops up in the logs, i can poke but otherwise can let daniel handle that one
[18:13:06] sounds great, thanks!
[18:13:07] ok
[18:15:03] to enwiki
[18:15:04] !
[18:15:05] ok, you just typed an exclamation mark with no meaning in the channel, good job. If you want to see a list of all keys, check !botbrain
[18:15:12] :)
[18:15:27] lies wm-bot, lies.
[18:15:27] * aude braces for more bugs
[18:15:30] hope not!
[18:16:24] let's seeee
[18:17:20] Catchable fatal error: Argument 1 passed to ApiParse::getSectionContent() must be an instance of Content, null given, called in /usr/local/apache/common-local/php-1.21wmf2/includes/api/ApiParse.php on line 359 and defined in /usr/local/apache/common-local/php-1.21wmf2/includes/api/ApiParse.php on line 377
[18:17:49] and it's gone again
[18:18:53] thought we resolved that one
[18:19:32] I think that's new...
[18:19:44] The others were parseroptions related
[18:20:04] htrm
[18:20:07] hrm
[18:21:22] $this->content = $this->getSectionContent(
[18:21:22] $this->content,
[18:21:22] !is_null( $pageId ) ? 'page id ' . $pageId : $page->getTitle()->getText() );
[18:22:16] We must have more CologneBlue users on enwiki
[18:22:22] MatmaRex!!
[18:22:28] Warning: Invalid argument supplied for foreach() in /usr/local/apache/common-local/php-1.21wmf2/skins/CologneBlue.php on line 631
[18:22:30] :o
[18:22:43] those aren't new
[18:22:45] Reedy: thousands https://en.wikipedia.org/wiki/Wikipedia:Database_reports/User_preferences#Skin
[18:22:51] just more in numbers
[18:22:51] lol
[18:24:02] the fix was merged recently, i think.
[18:24:10] link? :D
[18:24:20] I think I'll fix the log noise ;)
[18:24:39] https://gerrit.wikimedia.org/r/#/c/27192/
[18:25:03] thanks
[18:25:19] and https://gerrit.wikimedia.org/r/#/c/27968/ too, probably
[18:26:15] thanks MatmaRex
[18:26:53] meh, i'm the one who broke it in the first place
[18:26:57] ;)
[18:33:52] aude: http://en.wikipedia.org/w/api.php?action=parse&page=vbx&format=json&prop=text&section=0
[18:34:00] It does actually look familiar..
[18:35:44] ok
[18:40:54] Reedy: is that one associated with any bugs yet?
[18:41:10] https://bugzilla.wikimedia.org/41277
[18:42:39] @return Content|null The content of the current revision
[18:42:55] - if ( $this->section !== false ) {
[18:43:08] + if ( $this->section !== false && $this->content !== null ) {
[18:43:08] aude: ^
[18:44:45] ok
[18:48:15] I think that's too simplistic..
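
For clarity, here is a sketch of how the one-line guard quoted above would sit around the getSectionContent() call. The surrounding ApiParse.php method is abbreviated and paraphrased, so this is not the patch that was eventually merged, just the shape of the proposed change.

// Simplified paraphrase of the suggested ApiParse change, not actual source.
// The guard skips section extraction when the page or revision yielded no
// Content object; a null here is what triggered the "must be an instance of
// Content, null given" fatal shown in the log above.
if ( $this->section !== false && $this->content !== null ) {
	$this->content = $this->getSectionContent(
		$this->content,
		!is_null( $pageId ) ? 'page id ' . $pageId : $page->getTitle()->getText()
	);
}

As noted in the chat, this alone may be too simplistic: rather than silently parsing nothing, the API should probably report a proper error for a missing page or revision, which is where the conversation goes next.
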
[18:51:31] wassup
[18:52:24] i think it should work
[18:52:44] hi Nikerabbit
[18:52:45] although api parse should be looked over carefully to check for any other issues
[18:55:35] and see if we can get the section some other way
[19:07:49] http://en.wikipedia.org/w/api.php?action=parse&page=vbx&format=json&prop=text&section=0
[19:07:54] http://en.wikipedia.org/w/api.php?action=parse&page=vbx&format=json&prop=text
[19:07:58] WFM
[19:08:27] revision 0 is slightly strange
[19:09:02] the page doesn't exist
[19:26:21] hoo: how hard is https://jira.toolserver.org/browse/DBQ-193 ?
[19:27:31] Nemo_bis: Define hard
[19:28:25] 10 on the Mohs scale.
[19:28:39] hoo: something that doesn't get done by just pinging people
[19:30:14] For me it's not much work to write SQL queries of that size... only filling them with the data from text files like the given one can be annoying
[19:31:03] hoo: I know regex etc.
[19:31:25] Nemo_bis: And so do I, but still it's annoying
[19:31:26] or tell me what the filling consists of
[19:31:41] aww regexes are so cuuute, how can you say so
[19:31:42] It consists of the attached staff2 filte
[19:31:45] * file
[19:31:57] I mean the format you need
[19:32:15] huh? Do you want me to re-run it?
[19:32:21] 'foo', 'bar'
[19:32:25] simple CSV list
[19:32:29] CSV?
[19:32:38] comma separated values
[19:32:39] yes if you run it all the better
[19:33:04] SQL queries can use data in CSV?
[19:33:04] alternative would be to use bash or so to loop through it
[19:33:27] SELECT * FROM user WHERE user_name IN('Hoo man', 'Nemo bis');
[19:33:33] ;)
[19:34:15] hmpf
[19:39:12] Nemo_bis: Things like this https://jira.toolserver.org/browse/DBQ-196 are more nasty as they need a select on one server, then a dump of that data, then an import on another server, and only then can the wanted data be selected ...
[19:39:40] that's another reason for user tables on labs, btw
[19:40:19] Reedy: seems pretty quiet, yes?
[19:41:10] * aude hopes so
[19:41:24] Yup
[19:41:28] just swift stuff appearing and disappearing
[19:41:30] which is usual
[19:41:38] I don't see any public squawking at all.
[19:42:06] * aude is checking village pump, etc. .... nothing
[19:42:20] agreed
[19:43:19] * Damianz squawks at chris
[19:49:56] Error logs are the usual noise, so a good sign that nothing major is going on
[19:52:32] good :)
[20:21:54] !log Running p[opulateFileArchiveSha1.php on enwiki
[20:22:03] Logged the message, Master
[20:22:27] Reedy: it's really called p[opulateFileArchiveSha1.php ?
[20:22:43] Unfortunately, no
[20:22:55] heh heh
[21:08:01] would anyone mind if i sync extension PostEdit to roll out a couple of bugfixes?
[21:08:39] ^ Reedy (since you were the last one to sync)
[21:09:41] Yeah, my deployment was done pretty soon after it started
[21:10:15] cool; i'll wait another 5 mins for someone to object and then sync.
[21:20:10] done; thanks.
[21:45:56] jeremyb: https://en.wikipedia.org/w/&
[21:45:59] Not sure if that's a 500.
[21:46:19] 403, I guess.
[21:47:23] z
[21:49:19] Brooke: good enough, danke
[21:53:11] gn8 folks
[22:01:46] okay, my deployment window
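
Going back to the DBQ-193 exchange above ("filling" the IN() query from an attached text file): a small helper along the following lines is one way to do that filling step. It is a sketch only; "staff2.txt" is a placeholder name for the ticket's attachment, and the quoting is deliberately naive.

// Hypothetical helper: turn a text file of user names (one per line) into
// the quoted, comma-separated list the IN() query above expects.
$names = array_filter( array_map( 'trim', file( 'staff2.txt' ) ) );
$quoted = array_map(
	function ( $name ) {
		// naive quoting for illustration only; use a real escaper such as
		// the DB layer's addQuotes() for anything beyond a one-off query
		return "'" . str_replace( "'", "\\'", $name ) . "'";
	},
	$names
);
echo "SELECT * FROM user WHERE user_name IN(" . implode( ', ', $quoted ) . ");\n";

The bash-loop alternative mentioned in the chat would achieve the same thing, just with more round trips to the server.
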
[22:10:17] Reedy: mw-update-l10n passes --quiet to the rebuildLocalisationCache.php script
[22:10:21] Reedy: which prevents it from giving any output
[22:10:31] Reedy: what do you think about making it a bit more verbose?
[22:12:36] hashar: sorry?
[22:17:55] Reedy: mw-update-l10n does not show any progress while it rebuilds the cache
[22:18:19] Reedy: which leaves me staring at a stalled screen: Updating LocalisationCache for master...
[22:18:39] lol
[22:18:42] so I thought making it a bit more verbose (aka showing the list of languages as they are being rebuilt) would be more entertaining to the user :-]
[22:18:49] it's usually run via cron..
[22:19:04] so less cron spam?
[22:19:51] i'd imagine so
[22:20:06] if it works, you don't really care
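
For illustration, the per-language progress hashar is asking for would look roughly like the loop below. The loop shape and variable names are a simplified assumption, not the actual rebuildLocalisationCache.php source (the real script also splits the work across threads).

// Assumed, simplified shape of the rebuild loop in rebuildLocalisationCache.php.
foreach ( $codes as $code ) {
	// Maintenance::output() is suppressed when --quiet is passed, which is
	// why mw-update-l10n currently prints nothing; dropping --quiet (or
	// adding a --verbose flag) would make these progress lines visible.
	$this->output( "Rebuilding localisation cache for $code...\n" );
	$lc->recache( $code );
}

Run by hand without --quiet, this would print one line per language code instead of sitting silently on "Updating LocalisationCache for master...", while the cron invocation could keep --quiet to avoid the spam Reedy mentions.
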