[01:41:38] Urbanecm: That's not what emergency means.
[01:43:34] And it's not really appropriate to spam multiple channels.
[01:56:11] is anyone here good with wikitext willing to help me make a signature that I'm bad at trying to make
[02:00:48] Vermont: The default user signature is fine.
[02:01:17] k
[02:01:36] Ivy: Sorry to bother ya
[07:12:07] Wm
[15:58:59] =o
[16:02:22] \o/
[16:02:32] \o
[16:02:36] It's that time again!
[16:02:46] It's that time again indeed!
[16:03:32] Technical Advice IRC meeting!
[16:03:35] which time?
[16:03:36] ah
[16:03:40] good good
[16:03:49] oh hi mainframe98 -- nice to see you
[16:04:01] Hey everyone! Glad to see you all around :)
[16:04:04] Hi! Glad I can finally attend one of these meetings
[16:04:17] Hi Hauskatze, hi d3r1ck, hi mainframe98 !
[16:04:35] mainframe98: I cannot believe the fix for https://gerrit.wikimedia.org/r/#/c/406988/ was that easy
[16:05:26] Sometimes I'm not online but want to access MW docs offline, is that possible? I'm aware that there is https://doc.wikimedia.org/. Is it possible to also have that bundled with MW offline?
[16:05:29] This is why using a static code analysis tool is important. Unfortunately, Phan doesn't run with Jenkins tests, or it would've caught it.
[16:06:03] I think I remember that they wanted to use phan at some point?
[16:06:20] I can't remember
[16:06:21] addshore: A little help here please? https://gerrit.wikimedia.org/r/#/c/406985/, https://gerrit.wikimedia.org/r/#/c/404400/
[16:06:38] CFisch_WMDE: any ideas if one can get offline docs?
[16:07:02] d3r1ck: you could generate your own copy of the documentation with doxygen. Additionally, there is quite some documentation available in the docs/ folder.
[16:07:12] Hauskatze: you can use phan on the CI alreaddy
[16:07:14] already
[16:07:27] d3r1ck: Not that I know of
[16:07:28] addshore / CFisch_WMDE -- not sure if this is the right time/way, but maybe you could review https://gerrit.wikimedia.org/r/#/c/406988/ and the associated task and see if the proposed patch fixes the issue?
[16:07:50] mainframe98: Yeah, thanks. I looked at the docs/ folder too, but I was thinking of a way that one can access the docs via a web browser as if it's online
[16:08:12] Is it just me or is the enwiki Watchlist maybe slightly in a broken state?
[16:08:23] https://en.wikipedia.org/w/index.php?title=Special%3AWatchlist&days=180&hidecategorization=1&hideWikibase=1&namespace=&action=submit gives zero hits
[16:08:38] mainframe98: You get the point I'm trying to pass across? So I can look at docs/ which is great, but can I access it offline via the web browser?
[16:08:55] jubo2: I see things in my watchlist at that url
[16:09:42] addshore: hmm.. I cannot believe on-one edited any of the 429 pages I watch in the last half year
[16:09:50] *no-one
[16:09:56] Hauskatze: I had added myself as a reviewer for that patch, although legoktm knows more about MassMessage things so it may be best to wait for him to take a look!
[16:10:14] addshore: ack, agree - thanks!
[16:10:26] d3r1ck: I'm pretty sure that you could use doxygen to manually generate the docs that you see on docs.wikimedia.org. That way you have a copy of docs.wikimedia locally.
[16:10:28] addshore: fwiw I think https://gerrit.wikimedia.org/r/#/c/406824/ is +2able
[16:10:32] jubo2: 30 days is the limit I'm pretty sure
[16:10:50] Hauskatze: done ;)
[16:11:06] :-D
[16:11:12] hmm.. I didn't know of a limit.. it does say 720 hours in the UI
[16:11:15] :D x2
[16:11:17] mainframe98: Yeah, thanks so much. But please I'm also asking if it's possible to get the docs (after generating) in the web browser. Is it accessible that way?
[16:11:44] MediaWiki changes and gets more features, and naturally some special arrangements for WMF wikis' needs
[16:12:51] jubo2: what sorts of pages do you have in your watchlist, something should probably be showing up
[16:12:52] So Watchlist is fine
[16:12:53] d3r1ck: it should? At least JavaDoc would generate documentation that you can open in your web browser. I'll see if I can generate it
[16:12:54] Ok
[16:13:03] have you tried adding a page to your watchlist that you know has recently changed?
[16:13:15] mainframe98: Okay great! Walk me through when you are done please :)
[16:13:17] No and I need to get going somewhere soon
[16:13:24] CFisch_WMDE: Thanks for the +2 :)
[16:14:05] You could have that jenkins job vote I guess
[16:15:46] I am new to mediawiki hacking, which tasks do you advise me?
[16:15:59] CFisch_WMDE: :D
[16:16:11] mwext-php70-phan-docker is run on 'check experimental'
[16:16:17] bam_: In Phabricator there are some tasks tagged as easy
[16:16:21] I'm running it on MassMessage now
[16:16:49] CFisch_WMDE: Do pass me the link, pls
[16:17:01] jepp give me a sec
[16:17:33] https://phabricator.wikimedia.org/maniphest/query/jsySdXllltwv/#R
[16:17:51] These are tasks from all over the place
[16:18:00] bam_: maybe you can help clear this? https://phabricator.wikimedia.org/T175794
[16:18:09] Even though Lego has a script that does that automatically
[16:18:15] But it can help you get started :)
[16:18:16] you could take a look into them and see if you can make any sense out of them
[16:18:32] bam_: and see if someone is assigned already
[16:18:54] if you really want to work on one of the things, assign yourself first of course
[16:19:03] bam_: Yes, follow CFisch_WMDE's advice :)
[16:19:18] and best ask questions on the ticket, like if you need more input for a better understanding
[16:19:34] you can also always ask here
[16:19:35] alright, d3r1ck
[16:19:42] :)
[16:19:57] or look at the board for devs https://discourse-mediawiki.wmflabs.org/
[16:20:23] bam_: I guess you read the basics already about becoming a mediawiki hacker?
[16:20:31] https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
[16:20:47] CFisch_WMDE: yes of course
[16:21:15] I just need some task to work on in my spare time
[16:21:21] :-)
[16:21:28] there is also the tag https://phabricator.wikimedia.org/tag/need-volunteer/
[16:21:47] bam_: I remember I merged your task last time right? Thanks for working on that :)
[16:21:55] Maybe some improvements on the docs can also help?
[16:22:07] CFisch_WMDE: Any help on this please? https://gerrit.wikimedia.org/r/#/c/406421/, thank you :)
[16:23:04] d3r1ck: Added myself as a reviewer ... will probably look into it in the next days :-)
[16:23:38] d3r1ck: you're welcome. I was busy, I couldn't work on other tasks for AWMD. Things will be ready for them now
[16:23:53] mainframe98: You something posted on the meeting page. Anything you would like to know around that topic?
[16:24:00] *had
[16:24:12] bam_: Okay great! Anytime I mean. :)
[16:24:14] ( not that I feel very prepared )
[16:24:19] CFisch_WMDE: Thanks :)
[16:25:20] CFisch_WMDE: Sure. I have an extension with a service that implements SalvageableService. I was wondering how that works in a multi-wiki environment.
[16:25:32] Are the values salvaged between wikis?
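The question above is answered a few lines further down (salvaging only ever happens within one request, when the service container is rebuilt), but a small sketch may help make the mechanics concrete. This is a hedged illustration, not code from mainframe98's extension: the interface shape follows MediaWiki core's SalvageableService (a single salvage() method; it lives under the MediaWiki\Services namespace in versions of that era, Wikimedia\Services in newer ones), while WikiSetFactory and everything inside it are hypothetical names.

```php
<?php
// Hedged sketch of a salvageable service. WikiSetFactory, getSet() and
// buildSet() are hypothetical; only the SalvageableService interface is core.

use MediaWiki\Services\SalvageableService;

class WikiSetFactory implements SalvageableService {
    /** @var array In-process cache of already-built wikiset objects */
    private $sets = [];

    public function getSet( $name ) {
        if ( !isset( $this->sets[$name] ) ) {
            $this->sets[$name] = $this->buildSet( $name ); // expensive to build
        }
        return $this->sets[$name];
    }

    /**
     * Called on the *new* service instance, with the old instance passed in,
     * when the container is reset mid-request (for example the
     * MediaWikiServices::resetGlobalInstance( ..., 'quick' ) call in Setup.php).
     * Nothing is carried across requests or across wikis this way.
     */
    public function salvage( SalvageableService $other ) {
        /** @var WikiSetFactory $other */
        $this->sets = $other->sets;
    }

    private function buildSet( $name ) {
        // ... placeholder for the real construction logic ...
        return (object)[ 'name' => $name ];
    }
}
```

In other words, salvage() only keeps per-request state alive across a container reset; sharing data across requests or wikis needs an actual cache, which is where the conversation goes next.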
[16:25:54] CFisch_WMDE: Hey, and this too please :), https://gerrit.wikimedia.org/r/#/c/370370/
[16:25:58] Thank you
[16:27:14] d3r1ck: as long as this meeting does not get flooded with review requests I guess it's ok ;-)
[16:27:17] I want to work on this task https://phabricator.wikimedia.org/T175794, but I see GCI
[16:27:34] mainframe98: ok, I will have a look ... addshore do you have any insights in that area?
[16:27:47] Was it part of GCI?
[16:27:52] Is there a bot or tool that one can use to read changes of content on a Google Spreadsheet and then update a wiki table?
[16:28:04] bam_: Yes it was part of the Google Code-In contest
[16:28:19] But GCI is over and that needs to be cleaned, I mean if someone wants to :)
[16:28:23] So, it's now free?
[16:28:27] mainframe98: SalvageableServices are only salvageable on a per-request basis
[16:28:36] CFisch_WMDE: Yeah, that's my last PS review request. Thanks :)
[16:28:46] bam_: Yeah it's free and you can work on it :)
[16:29:02] mainframe98: what service are you creating (I'm curious)
[16:29:06] bam_: so looking at the ticket I would say it's free ... there seems to be a lot of work done already
[16:29:06] bam_: ... and don't forget to keep in mind what CFisch_WMDE said if you have any issues :)
[16:29:07] But I think I need some clarification.
[16:29:14] who can review https://gerrit.wikimedia.org/r/#/c/406816/ ?
[16:29:19] but there are still some extensions left
[16:29:34] bam_: What clarifications? Please feel free to ask here :)
[16:29:36] Which ones exactly?
[16:30:01] bam_: Okay! If you read the ticket, you'll see that some have patches appended at the end of them, some don't
[16:30:04] Tasks to work on, because some extensions are already done
[16:30:09] Hauskatze: probably a combination of CFisch_WMDE and I
[16:30:20] So you just need to pick an extension that doesn't have the minus-x utility and then add that to it.
[16:30:29] addshore: A service that creates wikiset objects.
[16:30:52] hmm, mainframe98 which part of the service is salvageable?
[16:31:12] bam_: So you can see this in the ticket: "extensions/CommunityTwitter - no mediawiki/minus-x"
[16:31:18] /what resources are salvageable / hard / expensive to recreate?
[16:31:21] Hauskatze: I have that one on my monitor already, will probably look into it tomorrow
[16:31:25] Ok
[16:31:27] That extension doesn't yet have minus-x so you can add it
[16:31:52] So the ones that don't have patches are not yet done and the ones that have patches are already done and pending review I think
[16:32:09] bam_: So just read the task description and you'll see an example patch that solves the issue for each extension :)
[16:32:34] bam_: You can read what the utility is all about here: https://www.mediawiki.org/wiki/MinusX.
[16:32:36] It is what I am trying to do?
[16:33:11] bam_: I don't understand your question, please can you rephrase? Or does someone understand bam_? :)
[16:34:08] addshore: The sets themselves. It's not so much that the sets are hard to create, but they're requested every request, for each set the wiki is part of. The problem stems from the fact that it is used in the MediaWiki Setup process, when caching is not available.
[16:34:44] I mean I am trying to read the task description
[16:35:00] bam_: Okay! Sure! Feel free :)
[16:35:11] I saw the question mark and so I thought it was a question
[16:35:18] 17:34:37 bam_ | It is what I am trying to do?
[16:35:49] Sorry
[16:36:00] hmmmm
[16:36:03] It could be a period (.)
[16:36:17] Okay!
[16:36:26] mainframe98: they will be recreated every request no matter if they implement a SalvageableService or not
[16:37:19] mainframe98: and in a regular request you will have the same service object for the whole life of the request
[16:37:19] How do I assign myself a task on Phabricator?
[16:37:26] unless you're doing some really funky stuff
[16:38:01] bam_: That task is quite big to assign to yourself
[16:38:11] But to assign a task to yourself, you can follow 2 methods
[16:38:19] Ok
[16:38:45] What are those methods?
[16:38:49] addshore: I knew that the service object is retained the whole request, but how is anything that is in a service cached then?
[16:39:49] 1. Go to the text area below the task, click on the "Add action" dropdown, click on "Assign/Claim" (this will auto-assign it to you), then hit submit
[16:40:00] addshore: If I have an extension that obtains configuration from the database, and provides that to a custom config factory, how should it cache that configuration so it isn't requested each request? (since putting configuration for each set together in one large config object is expensive)
[16:40:45] 2. On the right menu, click "Edit Task", then in the "Assigned To" input field in the form, put your "phab name" and hit "Save changes"
[16:40:55] bam_: Method 1 is the fastest. I'll recommend that.
[16:40:56] so the services themselves are 'cached' in memory for the whole request within the service container (MediaWikiServices)
[16:40:58] mainframe98: ^^
[16:41:01] Do you understand? bam_ ? :)
[16:41:23] mainframe98: if you have a service that gets config from the db, you will need to cache that within the service itself if you want some sort of cache / not to hit the db each time
[16:41:37] mainframe98: do you have some code or a patch that I could look at?
[16:42:03] bam_: You can assign that task to yourself if you want to finish up adding minus-x to all the remaining extensions. But since it's many tasks in 1, you can just keep submitting patches :)
[16:42:41] Can't see the text area
[16:42:58] addshore: unfortunately, it's not public, since I keep it in my own offline git repo.
[16:43:14] bam_: It's because the task has many comments, scroll right to the bottom-most part of the page :)
[16:43:18] mainframe98: okay
[16:43:38] mainframe98: so yes, it sounds like your service that constructs these wikisets should have some sort of cache within it for caching the data
[16:44:08] mainframe98: take a look at WANObjectCache (for a complex use case that will work in most cases)
[16:44:31] d3r1ck: Unfortunately, I can't seem to successfully generate the mediawiki documentation offline. There's something that fails within it.
[16:45:01] mainframe98: you also have the implementations of BagOStuff which are much simpler, you can get a generic cache from MediaWikiServices::getMainObjectStash
[16:45:12] Oh, I see. Thank you so much
[16:45:21] mainframe98: Hmmm.. Okay! Thanks very much :), I'll also try to see if I can do that after the meeting :)
[16:45:28] bam_: Yeah, you're welcome anytime :)
[16:45:30] addshore: I had used WANObjectCache before, which seems to suit my needs. The only problem I ran into was getting that service injected since I need to have it before the config factory can be created.
[16:46:04] hmmm
[16:46:08] Phantom42: You around sir? :)
[16:46:37] d3r1ck: Yes, I am!
[16:46:39] mainframe98: you say this was being run in setup? do you have an exact line I could look at? where is the service that creates wikiset objects for you being created?
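The caching pattern addshore describes above (an in-process copy inside the service, backed by WANObjectCache for the richer case or a BagOStuff from MediaWikiServices::getMainObjectStash() for the simple one) could look roughly like the following. A hedged sketch: WANObjectCache, makeGlobalKey() and getWithSetCallback() are core MediaWiki APIs, but CachedWikiSetLoader, loadSetFromDb(), the cache key and the TTL are made up for illustration.

```php
<?php
// Hedged sketch of caching wikiset data inside a service. The class and its
// database lookup are hypothetical; the cache calls are MediaWiki core APIs.

class CachedWikiSetLoader {
    /** @var WANObjectCache Shared cache (memcached/APC/... behind it) */
    private $cache;
    /** @var array Per-request, in-process copies */
    private $sets = [];

    public function __construct( WANObjectCache $cache ) {
        $this->cache = $cache;
    }

    public function getSet( $name ) {
        if ( !isset( $this->sets[$name] ) ) {
            // makeGlobalKey() skips the per-wiki key prefix on purpose,
            // since set configuration is shared between wikis.
            $this->sets[$name] = $this->cache->getWithSetCallback(
                $this->cache->makeGlobalKey( 'example-wikisets', $name ),
                3600, // one hour, tune as needed
                function () use ( $name ) {
                    return $this->loadSetFromDb( $name ); // the expensive part
                }
            );
        }
        return $this->sets[$name];
    }

    private function loadSetFromDb( $name ) {
        // ... placeholder for the actual database lookup ...
        return [ 'name' => $name ];
    }
}

// Wiring sketch, e.g. in a ServiceWiring file:
// return new CachedWikiSetLoader( $services->getMainWANObjectCache() );
```

The in-process array avoids re-building a set within one request, while the WAN cache avoids re-querying the database on every request, which is the part SalvageableService alone cannot cover.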
[16:46:45] Phantom42: something that may interest you: https://phabricator.wikimedia.org/project/view/1424/.
[16:46:56] You may find some pleasure working on those? :)
[16:47:04] addshore, I'll try to post a gist.
[16:47:12] mainframe98: okay!
[16:47:14] Phantom42: Tony and I with others are always available to review :)
[16:47:27] Phantom42: In fact, you know the general process :)
[16:48:00] Phantom42: Just thought about you so I wanted to share that with you :)
[16:48:46] d3r1ck: Oh, thanks for sharing! I would be glad to work on this!
[16:49:02] Phantom42: Thank you so much :)
[16:51:23] addshore: https://gist.github.com/mainframe98/c449ef71bb87cdbc337dd7c55dc6defc
[16:51:28] I'm sorry, but I have to leave you.
[16:51:58] bam_: Thanks for checking in!
[16:52:01] See you around
[16:52:03] :-)
[16:52:24] Thank you too, for hosting
[16:52:26] And please always feel free to ask for help!
[16:52:41] addshore: This doesn't use the wikiset factory service, I now realize. The project has gotten big enough that I don't always know what uses which, unfortunately.
[16:52:48] Ok
[16:54:21] so mainframe98 the $bootstrapConfig is a GlobalVarConfig instance
[16:54:52] I guess that has config for caches in there
[16:55:29] Also, if the caching config is general / doesn't change per site, then you can still get a caching service from $services there
[16:57:53] addshore: Alright, that's good to know. The problem is that the caching service uses the MainConfig service, which calls the MediaWiki-provided ConfigFactory, which I'm trying to replace. Isn't that going to cause a conflict?
[16:58:47] What sort of cache are you looking to use? file? apc? memcached or similar?
[17:00:12] addshore: Anything that is shared between wikis, so probably apc, memcached or similar. Since configuration for sets is shared between wikis, it makes sense to share it between wikis.
[17:00:38] Depending on the order that the service actually gets replaced in, just calling $services->getMainObjectStash() could work
[17:01:34] So despite the very active discussion here the official part of the Technical Advice IRC meeting this week if over now. Thanks to all that participated, see you next week I guess :-)!
[17:01:44] o/ CFisch_WMDE :D
[17:01:56] *is
[17:02:35] addshore: True, but due to backwards compatibility, especially regarding extensions that do not use the Config factory yet, I have to export the configuration to $GLOBALS almost instantaneously for the extensions to work.
[17:03:03] oooof
[17:03:14] addshore: Regardless, you've helped me a great deal already. I can at least try and see if I can implement simple caching insertion and see where that brings me.
[17:03:30] addshore: So, thanks!
[17:04:49] mainframe98: I generally think that trying to redefine the ConfigFactory might result in even more errors further down the line
[17:05:39] mainframe98: something to take note of is in includes/Setup.php on line 609-ish there is a call to MediaWikiServices::resetGlobalInstance( new GlobalVarConfig(), 'quick' );
[17:07:24] it is indeed a sticky area :)
[17:10:14] addshore: That could actually work to my advantage. Thanks for the explanations. I now understand SalvageableService a lot better. I'll see what I can achieve.
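Given addshore's warning that redefining the core ConfigFactory tends to cause trouble further down the line, a gentler route is to register an extension-specific Config under its own name and leave the main factory alone. The sketch below is hedged: $wgConfigRegistry, the Config interface (get()/has()), ConfigException and ConfigFactory::makeConfig() are real MediaWiki pieces, but DatabaseBackedConfig and its loading logic are invented for illustration, and it deliberately ignores the $GLOBALS back-compat export discussed above.

```php
<?php
// Hedged sketch: a database-backed Config registered under its own name via
// $wgConfigRegistry, so the core ConfigFactory is not replaced.
// DatabaseBackedConfig is a hypothetical class; the interfaces are core.

use MediaWiki\MediaWikiServices;

class DatabaseBackedConfig implements Config {
    /** @var array|null Lazily loaded settings */
    private $settings = null;

    public function get( $name ) {
        $this->load();
        if ( !$this->has( $name ) ) {
            throw new ConfigException( "Undefined option: '$name'" );
        }
        return $this->settings[$name];
    }

    public function has( $name ) {
        $this->load();
        return array_key_exists( $name, $this->settings );
    }

    private function load() {
        if ( $this->settings !== null ) {
            return;
        }
        // Shared between wikis via the WAN cache, so the config table is not
        // hit on every request; the load is lazy, i.e. after setup has run.
        $cache = MediaWikiServices::getInstance()->getMainWANObjectCache();
        $this->settings = $cache->getWithSetCallback(
            $cache->makeGlobalKey( 'example-db-config' ),
            3600, // one hour, tune as needed
            function () {
                // ... placeholder: read and assemble rows from the config table ...
                return [];
            }
        );
    }

    public static function newInstance() {
        return new self();
    }
}

// Registration (extension setup file, or "ConfigRegistry" in extension.json):
$wgConfigRegistry['exampleext'] = 'DatabaseBackedConfig::newInstance';

// Consumers then ask the factory for it by name:
// $config = MediaWikiServices::getInstance()->getConfigFactory()->makeConfig( 'exampleext' );
```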
[17:13:30] CFisch_WMDE: thanks for the tips wrt phan and to mainframe98 for past/future tests on that patch :D
[17:13:43] I might create a task for that, but I'm not sure it's needed
[17:14:18] you're welcome
[17:14:26] what a silly fix, adding 'use MWNamespace;' ;-)
[17:14:34] wonder how it didn't fail in the past
[17:39:50] is the tech advice still on?
[17:40:06] 'cause I have https://gerrit.wikimedia.org/r/#/c/372803/ which could use some more eyes
[17:40:37] [not mine, from Melo-s, but for an extension I use a lot every day :) ]
[17:40:41] Nope, normally only 1 hour, but if you're lucky someone will pay attention anyway ;-)
[17:40:49] hah, well, not urgent
[17:44:07] thanks for merging the massmessage stuff
[17:44:14] hope that it resolves the issue :)
[19:21:29] doing some sockhunting on en-wiki: does anyone know of a tool to compare the editor intersection on a set of articles?
[19:31:00] alfie: I think https://tools.wmflabs.org/intersect-contribs/
[19:32:23] Stryn: this is the inverse - I want to look at a set of articles and see which users are common to them
[19:32:35] * alfie has a suspicion there's some undisclosed paid editing going on
[19:32:47] ah then I don't know unfortunately
[19:33:00] Stryn: no worries, this'll probably come in handy anyway!
[19:33:18] Wikipedia: Actually just an exercise in set theory.
[20:23:05] any hints how to force a 2-year-old thumbnail from an outdated version of a file to be regenerated?
[20:23:20] https://upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Karol_May_-_Old_Surehand_01.djvu/page256-854px-Karol_May_-_Old_Surehand_01.djvu.jpg
[20:23:34] I see p. 230 instead of 232 here
[20:33:13] Last-Modified: Fri, 08 Jan 2016 07:47:03 GMT
[20:33:13] Etag: a81b9d5936394d1d2a1a7341c53058c9
[20:33:13] X-Timestamp: 1452239222.85447
[20:33:13] X-Trans-Id: txd5b25d42412b41cab95b1-005a721e98
[20:33:13] X-Varnish: 499281636, 338202450 336658680, 921780112
[20:33:15] Via: 1.1 varnish-v4, 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
[20:33:18] Age: 2354
[20:33:21] X-Cache: cp1048 pass, cp3034 hit/10, cp3035 miss
[20:33:23] X-Cache-Status: hit-local
[20:35:30] _08_ _Jan_ _2016_ while the current version was uploaded 2016-08-23T23:28:43
[20:41:42] ankry: it looks like it at least needs to be purged from the Varnish cache layer -- https://wikitech.wikimedia.org/wiki/Multicast_HTCP_purging#One-off_purge -- but it may or may not also need to be somehow purged from the long-term image storage layer (swift).
[20:42:37] ankry: filing a phabricator task and tagging it with #traffic would probably be the way to start that process
[20:46:38] bd808: so no way to do it online?
[20:47:09] ankry: not that I can think of, primarily because that image size is non-standard
[20:47:36] I think that means it would only exist in the Varnish cache
[20:49:04] generating purges of all thumb sizes for a given image from the MediaWiki side is something that is not really possible today. This is in part a limitation of the way that Varnish keeps track of objects in its cache
[20:50:16] the best we can do is to enumerate the images that are known to exist in the Swift object store and then issue Varnish HTCP purges for each unique image size
[20:51:04] I'm not entirely sure if that even works for djvu and other multi-page media types
[21:05:38] bd808: https://phabricator.wikimedia.org/T186153
[21:05:59] can you help with tagging?
[21:06:31] ankry: I took some guesses :)
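For the stale-thumbnail problem, the "one-off purge" that bd808 links to boils down to sending HTCP purges for the exact thumbnail URL from a host that is allowed to do so. Below is a hedged sketch of what that can look like from a MediaWiki maintenance shell (e.g. maintenance/shell.php) on a wiki where $wgHTCPRouting is configured; CdnCacheUpdate::purge() is the core helper that emits the purges, and whether the object also has to be cleaned out of Swift is a separate question, as discussed above.

```php
// Hedged sketch for a one-off purge from a maintenance shell. It only evicts
// the edge-cache (Varnish) entry for the listed URLs; it does not regenerate
// the thumbnail or touch the Swift object store.
$urls = [
    'https://upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Karol_May_-_Old_Surehand_01.djvu/page256-854px-Karol_May_-_Old_Surehand_01.djvu.jpg',
];
CdnCacheUpdate::purge( $urls ); // sends HTCP purges per $wgHTCPRouting
```

For anyone without shell access, filing a Phabricator task and tagging it with #traffic, as suggested above, remains the practical route.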