[00:00:22] your change *plus* lowering $wgRejectedAccountMaxAge should do
[00:00:28] in ConfirmAccount.config.php i see # When configuring globals, set them at LocalSettings.php instead
[00:00:30] so that's good
[00:01:52] this might do. Set wgRejectedAccountMaxAge (How long to store rejected requests) = 0. Then prune. Should remove the rejected request immediately. I'll try it
[00:02:03] The other switch seems unrelated
[00:07:02] Platonides still getting "Username is already in use in a pending account request."
[00:07:17] Do i need to run the wiki update process first?
[00:07:53] no
[00:08:09] what did you set Print_antifeature_intro-sandbox
[00:08:15] what did you set wgRejectedAccountMaxAge to?
[00:08:20] 1
[00:09:55] Print_antifeature_intro-sandbox does not exist in LocalSettings.php
[00:10:06] i set wgRejectedAccountMaxAge = 0
[00:10:30] some progress, i see that previous remnant rejects are gone, and they can re-request
[00:11:00] so they got pruned. Question is whether they got pruned by my new code, or by the default code
[00:11:24] that is, the one new request that i just tested is still there, cannot re-request
[00:11:31] only previous ones are gone
[00:12:18] maybe the prune has to happen after existing code runs, in a separate process. Caching issue?
[00:13:21] possible fix: run prune at beginning of new request
[00:13:24] i'll try
[00:13:34] (instead of at end of reject action)
[00:17:30] have to find php that runs at beginning of new request....
[01:03:12] Platonides, looks like i solved it
[01:03:51] in /ConfirmAccount/frontend/specialpages/actions/RequestAccount_body.php
[01:05:23] function showForm
[01:06:00] add very last command in the function
[01:06:00] ConfirmAccount::runAutoMaintenance();
[01:06:17] thx for help!
[14:27:18] Hello!
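The fix worked out above (lower the retention window for rejected requests, then prune whenever the request form is shown) can be sketched as follows. This is a sketch, not the exact extension code: `$wgRejectedAccountMaxAge` and `ConfirmAccount::runAutoMaintenance()` come from the ConfirmAccount extension as discussed in the log, but the surrounding method signature and visibility are assumptions.

```php
<?php
// In LocalSettings.php (not ConfirmAccount.config.php, per the comment
// quoted above): keep rejected requests for 0 seconds, so the next
// automatic prune removes them immediately.
$wgRejectedAccountMaxAge = 0;

// Patch sketch for
// ConfirmAccount/frontend/specialpages/actions/RequestAccount_body.php:
// inside the existing showForm() method, add one call as the very last
// statement, so stale rejected requests are pruned whenever the account
// request form is rendered (i.e. at the beginning of a new request,
// rather than only at the end of a reject action):
//
//     function showForm( /* ...original parameters... */ ) {
//         // ...existing body unchanged...
//         ConfirmAccount::runAutoMaintenance();
//     }
```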
[14:28:11] https://tools.wmflabs.org/persondata/test.php <-- what is wrong here with connecting to the database (the website shows its source code)
[14:32:28] Wurgl: your error handling looks a little weird, but I don't actually see a connection error
[14:32:44] should dewiki.labsdb work?
[14:33:25] $dbi->connect_error is null when I loaded the page, which suggests that there's no connection error
[14:34:57] it works when started on the command line
[14:35:20] yeah, but on the command line maybe there is extra magic :-)
[14:35:37] I am checking docs
[14:36:33] https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database#PHP recommends a longer alias
[14:36:45] maybe the one you use works, not sure
[14:37:32] dewiki.analytics.db.svc.eqiad.wmflabs or dewiki.web.db.svc.eqiad.wmflabs
[14:37:43] but please don't take my word for it
[14:37:49] just looking at the wiki
[14:39:18] I tried with dewiki.analytics.db.svc.eqiad.wmflabs before (changed the code now), but it does not work either :-(
[14:39:36] then it is something else, sorry
[14:39:59] Wurgl: What are you expecting to happen instead?
[14:41:26] On the command line I see public $client_info => string(79) "mysqlnd 5.0.11-dev - 20120503 - $Id: bf9ad53b11c9a57efdb1057292d73b928b8c5c77 $"
[14:41:41] Just one of the lines of var_dump
[14:42:18] error.log does not show anything
[14:45:48] I would suggest trying to do a query, and seeing whether that triggers an error or succeeds
[14:47:37] Hmm … does var_dump behave differently when called from the webserver? I added a line "print $dbi->client_info;" and this one shows some data?
[14:48:22] strange
[14:49:42] It's possible it's different versions of php
[14:50:20] And it's not a normal php object, so lots of magic is happening behind the scenes. It could be lazy-loaded in a way that var_dump() doesn't load the field, or something
[15:10:53] … or var_dump behaves differently on objects for security reasons?
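The suggestion at 14:45 (actually run a query and see whether that surfaces an error) can be sketched like this. Assumptions are flagged: the host alias is the one from the Help:Toolforge/Database page linked above, and the `replica.my.cnf` credentials file and `dewiki_p` database name are the usual Toolforge conventions, not something stated in the log; adjust all three for your tool.

```php
<?php
// Sketch only: a Toolforge-style connection test with explicit error
// handling at every step, so a failure shows up in the web output
// rather than passing silently.
$cnf = parse_ini_file( $_SERVER['HOME'] . '/replica.my.cnf' );
$dbi = new mysqli(
    'dewiki.analytics.db.svc.eqiad.wmflabs',  // alias per the wikitech docs
    $cnf['user'],
    $cnf['password'],
    'dewiki_p'
);
if ( $dbi->connect_error ) {
    die( 'Connect failed: ' . $dbi->connect_error );
}
// Run a trivial query: if the connection is actually unusable, this is
// where the error surfaces, even when connect_error was null.
$res = $dbi->query( 'SELECT 1' );
if ( $res === false ) {
    die( 'Query failed: ' . $dbi->error );
}
echo 'connection and query OK';
```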
[19:24:08] hi, I found a weird schema situation: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/4124558d7b40df818337f1d7eeae10c9ee9688e2%5E%21/#F7 changed the primary key of the revision table to just rev_id rather than (rev_page,rev_id), but I see no corresponding updater step
[19:24:49] so it seems old wikis are stuck with the old revision schema... however it seems like it's not causing issues (at least on mysql)
[19:28:00] however at first glance this might mean the rev_page,rev_id index might not need to be a UNIQUE one
[19:28:44] maybe it could be a simple index on rev_page
[19:35:55] I'll need to go now but I'll catch up on any replies
[19:40:58] TK-999: if you mean for Wikimedia: on large wikis we use a non-standard schema for revision in order to do partitioning, for performance reasons (I believe)
[19:41:05] I guess he left
[19:42:37] Hi, does anyone have an idea what happened to my post: https://www.mediawiki.org/wiki/Talk:Reading/Web/PDF_Functionality#What%20happened%20to%20my%20post???
[19:42:53] I asked a steward and he couldn't tell me
[19:43:11] Seems to be some bug with Flow
[19:46:37] Wow, following that Flow history page is hard
[19:49:19] thanks for having a look
[19:49:44] Stryn and I could see it here: https://www.mediawiki.org/wiki/Special:Contributions/Debenben
[19:50:01] but not here: https://www.mediawiki.org/w/index.php?title=Topic:U7yyjlehi3rvc4b7&action=history
[19:50:15] and it is not shown on the page anymore
[19:51:10] yeah, I totally don't understand Flow
[19:51:17] so I have no idea
[19:51:29] me neither
[19:51:46] everyone I asked said they don't understand Flow
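For reference, the missing updater step discussed at 19:24 would amount to DDL roughly like the following. This is a sketch of what such a migration *might* look like, not an official MediaWiki update step: MySQL syntax is assumed, the index name `rev_page_id` is an assumption that should be checked against the actual schema, and any real migration on a partitioned or replicated setup would need far more care.

```sql
-- Sketch only: align an old revision table with the post-change schema,
-- where rev_id alone is the primary key. Verify index names with
-- SHOW CREATE TABLE revision before running anything.
ALTER TABLE revision
  DROP PRIMARY KEY,
  ADD PRIMARY KEY (rev_id);

-- As noted above, once rev_id is the primary key, the (rev_page, rev_id)
-- index no longer needs to be UNIQUE; a plain index suffices.
ALTER TABLE revision
  DROP INDEX rev_page_id,
  ADD INDEX rev_page_id (rev_page, rev_id);
```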