[02:25:51] Anyone know what's going on with global login?
[02:26:07] For the last week, I keep having to log in on Metawiki
[02:26:36] I will be handling GS deletion requests, deleting on several wikis, then go to mark the request as done, and not be logged in
[02:26:52] I have had two accidental edits from my IP in the last week, and caught myself two other times
[10:15:37] riley: I didn't notice anything so drastic in the past few weeks. Generally speaking, login has gotten less and less reliable for many years now.
[10:15:58] Strange
[10:16:10] What you describe reminds me of https://www.mediawiki.org/wiki/Firefox_users_and_session_loss_bug because visiting many other wikis would trigger the exhaustion of your cookie limit
[10:17:08] I have network.cookie.maxPerHost set to 1000
[10:17:50] Firefox ~75 also made it harder to see what cookies you have (they're no longer easy to see in Ctrl+I) in the usual misguided race to copy whatever Chromium does
[10:21:59] riley: so what's your browser, and what's your network.cookie.maxPerHost if you use Firefox?
[10:22:16] I don't use Firefox
[10:22:19] ok
[10:22:27] I use Chrome
[10:22:33] I thought we had some mitigation strategy for this bug, but I'm not sure how comprehensive it was
[10:23:43] I don't even know how to count cookies in Chromium, so many clicks necessary
[10:23:56] I'll try to reset my cache again
[10:24:15] Does constantly changing wifi networks have an effect?
[10:24:56] I guess it wouldn't; I never had issues going from work to home before
[10:25:06] I don't know... it shouldn't, but I sometimes had a suspicion it did
[10:25:47] Maybe the easiest place is devtools (F12) > Application > Storage > Cookies (and friends https://developers.google.com/web/tools/chrome-devtools/storage/sessionstorage )
[10:27:39] riley: and by the way, do give login some time to take effect? I often have to wait ~10 seconds after loading a page on a "new" domain
[10:27:58] Sometimes a refresh is enough; sometimes you click login and then it recognises the global login
[10:29:15] There are several updates in https://developers.google.com/web/updates/2020/01/devtools#cookies
[11:23:36] Is there a mapping somewhere between the results returned by the MediaWiki API and where that data resides in the database dumps from dumps.wikimedia.org?
[11:24:33] I'm trying to figure out how to replicate the functionality of a script that pulls specific data from the commons.wikimedia.org API regarding image files.
[11:25:08] Not all of the imageinfo props appear to be in the 'image' table dump.
[11:42:30] mathemancer: where have you looked so far?
[14:16:35] Nemo_bis: I've checked the SQL dump for the image table, and the XML articles dump.
[14:18:43] Neither contains all the info, and even their union doesn't seem to contain everything. For example, the data found in extmetadata.LicenseShortName via the API doesn't seem to appear in either of those dumps.
[14:25:23] That'd be because LicenseShortName isn't stored in the image table
[14:27:37] mathemancer: Some things are stored in the DB and others are generated dynamically, AFAIK. We have some documentation on how things are supposed to look, but for an exact "mapping" I guess you need to study the code itself.
[14:28:38] I doubt it's a wise idea to reimplement in your tool whatever conversion the CommonsMetadata extension is doing. The whole point of making that extension is to make sure that people don't have to process the raw data themselves, because it's generally too hard.
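(Editor's note: for reference, a minimal sketch of asking the API for extmetadata instead of reimplementing the CommonsMetadata conversion locally. The file title, User-Agent string, and response handling are illustrative assumptions, not part of the script discussed above.)

```python
# Hypothetical example: fetch extmetadata (including LicenseShortName) for one
# file from the Commons API. Title and User-Agent are placeholders.
import requests

API_URL = "https://commons.wikimedia.org/w/api.php"

def fetch_extmetadata(title):
    """Return the extmetadata dict for a single File: page, or None."""
    params = {
        "action": "query",
        "prop": "imageinfo",
        "iiprop": "extmetadata",
        "titles": title,
        "format": "json",
        "formatversion": 2,
    }
    headers = {"User-Agent": "example-metadata-fetcher/0.1 (you@example.org)"}
    resp = requests.get(API_URL, params=params, headers=headers, timeout=30)
    resp.raise_for_status()
    page = resp.json()["query"]["pages"][0]
    info = page.get("imageinfo")
    return info[0]["extmetadata"] if info else None

meta = fetch_extmetadata("File:Example.jpg")
if meta:
    print(meta["LicenseShortName"]["value"])
```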
[14:30:23] metadata is in the dump, extmetadata is dynamic
[14:30:56] there is no dump unfortunately
[14:31:52] and it's pretty hopeless to reimplement on your side - the logic is not complicated, but it's based on the HTML, not the wikitext, and there is no dump for HTML, either
[14:32:52] on the bright side, Commons is slowly transitioning to storing image metadata as structured data, with proper dumps
[14:33:26] tgr: that is something to look forward to
[14:34:47] In the meantime, it seems like I'll be limited to the speed at which I can pull from the API... I wonder, is it possible to use the dumps in combination with the MediaWiki software to replicate just some of the functionality (e.g., the creation of the extmetadata field)?
[15:00:54] mathemancer: in theory yes, but I doubt it would be faster
[15:02:34] you would need to parse pages containing complicated templates; even for the production servers that takes something like a hundred milliseconds
[15:03:03] while you can probably fetch a few thousand pages per second from the API
[15:05:18] tgr: Actually, if I hit the API too fast, I get a whole lot of internal server errors. That was an original inspiration for trying to use the DB dumps.
[15:06:25] mathemancer: can you define "too fast"?
[15:06:31] I'm currently running three instances pulling the data in parallel, letting the server set the speed (i.e., sending the next request once the previous one receives a response).
[15:06:55] If I try 4 in parallel, I start seeing quite a few errors
[15:08:18] Ok. That's not ideal, but not entirely unexpected either.
[15:09:52] When I'm getting the error, it's either 'internal_api_error_WMFTimeoutException' or an actual 5xx internal server error HTTP status.
[15:10:52] mathemancer: what interface language are you using?
[15:11:09] I'm just calling the API directly
[15:11:24] With the `requests` library in Python
[15:11:41] I'm trying to think what might cause some caches to be cold.
[15:12:36] I think it's actually some computation happening (perhaps pulling the extmetadata, since I now understand that's a bit complicated)
[15:12:55] yeah, it might do parsing on the fly
[15:12:58] And, if I retry after a failure, it will often work the second time around (I suppose it caches the result)
[15:13:33] are you requesting batches of 500 pages?
[15:13:42] tgr: yep
[15:14:45] I'd try to find the smallest batch size that works reliably
[15:15:40] but in the end the servers will perform the same work you'd have to do locally on the dump, and very likely perform it much faster
[15:15:57] so it's still your least bad option
[15:17:20] tgr: I'll still try smaller batches to see if the overall throughput is better. My only goal is to pull the data quickly without somehow taking down WMC (definitely don't want to do that, lol)
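(Editor's note: a minimal sketch of the kind of batched pull discussed above — a smaller batch size plus a simple retry with backoff on 5xx or API-level timeout errors. The batch size, retry limit, sleep intervals, and User-Agent are arbitrary placeholder assumptions, not tested values.)

```python
# Hypothetical example: batched extmetadata pulls with a reduced batch size and
# retries on server errors. All tuning values are placeholders.
import time
import requests

API_URL = "https://commons.wikimedia.org/w/api.php"
HEADERS = {"User-Agent": "example-metadata-fetcher/0.1 (you@example.org)"}
BATCH_SIZE = 50    # well below the 500 titles per request mentioned above
MAX_RETRIES = 3

def query_batch(titles):
    """Fetch imageinfo/extmetadata for one batch of titles, retrying on failures."""
    params = {
        "action": "query",
        "prop": "imageinfo",
        "iiprop": "extmetadata",
        "titles": "|".join(titles),
        "format": "json",
        "formatversion": 2,
    }
    for attempt in range(MAX_RETRIES):
        resp = requests.get(API_URL, params=params, headers=HEADERS, timeout=60)
        if resp.status_code >= 500:
            time.sleep(2 ** attempt)  # back off and retry; retries often succeed
            continue
        resp.raise_for_status()
        data = resp.json()
        if "error" in data:  # e.g. internal_api_error_WMFTimeoutException
            time.sleep(2 ** attempt)
            continue
        return data["query"]["pages"]
    raise RuntimeError(f"batch failed after {MAX_RETRIES} attempts")

def pull_extmetadata(all_titles):
    """Yield (title, extmetadata) pairs for a list of File: titles."""
    for i in range(0, len(all_titles), BATCH_SIZE):
        for page in query_batch(all_titles[i:i + BATCH_SIZE]):
            info = page.get("imageinfo")
            if info:
                yield page["title"], info[0]["extmetadata"]
```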
[17:38:38] did any updates get pushed live today?
[17:42:36] Betacommand, https://wikitech.wikimedia.org/wiki/Deployments#deploycal-item-20200504T1100
[17:43:32] The MediaWiki train doesn't start rolling until tomorrow, but a few patches went out this morning
[17:43:54] Well, not quite
[17:44:00] The train moved forward today
[17:44:06] After being rolled back last week
[17:44:12] so .30 is now everywhere again
[21:53:40] I've enabled "New wikitext mode" at https://test.wikipedia.org/wiki/Special:Preferences#mw-prefsection-betafeatures
[21:53:41] and I was able to add a custom Tool to its toolbar (it appears when I go to "Insert" > "More").
[21:53:46] However, I would like to add the tool to a *new* custom group instead, and this didn't work:
[21:53:48] https://test.wikipedia.org/wiki/User:He7d3r/common.js?diff=430667&oldid=430663
[21:53:50] What am I missing?
[21:54:03] Does anyone know how to make the new group appear in the toolbar?
[22:27:29] Sometimes I miss Tidy. A single apostrophe at the end of the document is nowadays able to inflict italics on an entire page. (See the end of https://meta.wikimedia.org/?diff=20042641&oldid=20042636 )
[22:28:02] Helder: There is an option in the preferences which no longer does anything apart from killing the toolbar entirely
[22:28:14] Allegedly this is a "feature"
[22:29:39] it's more that no one's bothered to redesign MediaWiki editor preferences into something sensible
[22:31:11] Wrong. We did, but then all the work was reversed.
[22:31:55] Because every other week someone comes up with a new feature which of course requires a new preference, and you need someone whose sole goal in life is to hit such proposals with a giant banhammer.
[22:33:34] Sadly I think Jenkins is not yet able to inflict a strong electroshock on everyone who submits a patch with a new preference.
[22:33:34] https://phabricator.wikimedia.org/T202921
[22:35:55] * Nemo_bis nostalgic https://bash.toolforge.org/quip/AU7VV0UU6snAnmqnK_0g
[23:21:44] i love Nemo_bis
[23:28:31] Hi juancarlos
[23:29:57] hi nemo