[00:18:12] I updated my local bare metal (non-Vagrant) wiki to latest and now get "Notice: Object of class Closure could not be converted to double in WANObjectCache.php" on user pages (but not special pages), then InvalidArgumentException "Invalid cache miss callback provided". Anyone else?
[00:18:36] I'd ask Aaron Schulz
[00:20:06] Krenair: thanks, he did commit "objectcache: Allow bounded HashBagOStuff sizes and limit it in WANObjectCache".
[00:21:16] I didn't know I had a WANObjectCache, I feel like a top 50 web site :)
[00:55:15] Krenair: FYI it was the Gadgets extension, whose caching Aaron has updated. o7
[08:09:33] [[Tech]]; Michielesmith; /* Receive An Excellent QuickBooks Support From Experts Today */ new section; https://meta.wikimedia.org/w/index.php?diff=14461180&oldid=14414391&rcid=6968922
[08:35:05] [[Tech]]; Stemoc; Reverted changes by [[Special:Contributions/Michielesmith|Michielesmith]] ([[User talk:Michielesmith|talk]]) to last version by 201.19.107.105; https://meta.wikimedia.org/w/index.php?diff=14461458&oldid=14461180&rcid=6968979
[14:02:04] I just filed https://phabricator.wikimedia.org/T117697 because I could not find a dupe, but I doubt I'm the first one seeing that..?
[14:02:15] "Error loading data from server: ve-api..." when creating a new page with VisualEditor
[15:14:40] qgil, you're not the first one, I had forgotten about that though, thanks
[15:15:09] Good, and I see that andre__ has replied as well
[15:15:56] win 24
[15:16:02] ENOSLASH
[21:02:02] legoktm: you about?
[21:10:33] Betacommand: hey
[21:13:44] legoktm: any chance you can get https://phabricator.wikimedia.org/T44345 merged?
[21:16:38] Betacommand: hmm, I think I now disagree with the patches I uploaded back in 2013. I'll try to take another look this week
[21:17:33] legoktm: either way the ABF shouldn't be using the target's IP
[21:17:40] legoktm: but thanks
[21:18:27] legoktm: if you're looking at CU bugs, can I bribe you to look at https://phabricator.wikimedia.org/T49505
[21:21:12] I opened it in a tab
[21:21:22] but from a quick look it doesn't seem easy
[21:24:06] legoktm: converting a hard-coded message to the MediaWiki: namespace is difficult?
[21:47:21] Betacommand: it doesn't look hard-coded to me, it looks like Linker::userToolLinks()
[22:18:22] spagewmf, just ran into the same exception
[22:18:29] thank you!
[22:18:52] Krenair: did updating Gadgets make it go away?
[22:18:57] checking that now
[23:08:28] bd808, gwicke: if we can automate vendor, we could automate forking upstream repos for review
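The breakage spagewmf and Krenair hit above went away once the Gadgets extension was updated along with core. For a bare-metal Download-from-Git install, keeping extensions and vendored libraries in step with core looks roughly like the sketch below; the path and the per-extension loop are illustrative, not taken from the log.

    # Illustrative paths; core, extensions and libraries have to move together,
    # otherwise an extension can call a core API (here WANObjectCache) with a stale signature.
    cd /var/www/mediawiki
    git pull                              # update core

    for ext in extensions/*/; do          # update each git-managed extension, e.g. Gadgets
        git -C "$ext" pull --ff-only || true
    done

    composer update --no-dev              # refresh external libraries (or update mediawiki/vendor if you track it)
    php maintenance/update.php --quick    # apply any pending schema changes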
[23:08:34] DanielK_WMDE__: the problem with signed forks of upstream is the complexity of a version bump. Today it takes updating a composer.json file, running composer update and then pushing to Gerrit
[23:09:06] yeah, making the code review experience nice isn't easy in that model
[23:09:24] you first need to track down the origin repos, for example
[23:09:26] it's not particularly nice now either :) but I agree, the workflow could use some streamlining
[23:09:28] with signed forks it requires a pull from upstream, push to Gerrit, review, sign, update our local package index, then update composer.json in the right place
[23:09:34] ( ^^ is a continuation of https://phabricator.wikimedia.org/E85 which was on #wikimedia-office )
[23:09:45] bd808: then it is review; git tag --sign; git push tag; make patch to composer.json; merge
[23:09:48] bd808: before my juice runs out: I'm curious what you think of https://gerrit.wikimedia.org/r/#/c/251007/
[23:10:07] (totally o/t, sorry)
[23:10:19] DanielK_WMDE__: it's a start in the right direction I think. I need to stare at it a bit more
[23:11:55] bd808: OK. As I said, I don't really care what the service locator looks like in the end, as long as we have one and it's not abused. The hard-coded version was the most basic solution I could come up with. If we need more, I agree we should use a more flexible approach. My patch is a compromise.
[23:12:16] gwicke: "you first need to track down the origin repos": no, Composer does that for you, and clones it
[23:12:30] jzerebecki: I think your idea of making Composer capable of verification is a good one in general. I'm not against that
[23:12:38] jzerebecki: okay, none of those use tars or zips?
[23:13:14] gwicke: most of them optionally also do, but all of them can use git clone
[23:13:23] ok, running out of power. cu
[23:13:24] jzerebecki: I don't think you could use Composer in the signed fork workflow, could you?
[23:13:29] DanielK_WMDE__: cu
[23:13:35] jzerebecki: I'm just wondering about the recursive dependency part
[23:13:48] if all those already use git, then I guess we are fine
[23:14:05] but if some deep down the graph don't, then it gets ugly
[23:14:06] gwicke: with composer --prefer-source it is git clones all the way down
[23:14:22] okay
[23:14:39] bye DanielK_WMDE__
[23:15:23] bd808: yeah, you need to decide if you want to use unsigned stuff or not...
[23:15:24] jzerebecki: in any case, I think it would be nice to consider the vendor automation idea alongside the composer idea
[23:15:42] I don't see which goals of this discussion would not be solved by checking in composer.lock
[23:15:49] conceptually I like the git signature idea, but I'm not sure that it's worth the cost overall
[23:15:57] considering that it's specific to Composer
[23:16:44] tgr: the problem with versioning composer.lock for core is that it causes dirty diffs for anyone using Composer for extensions
[23:16:44] gwicke: Wikidata currently has patches to their build automatically prepared
[23:17:37] jzerebecki: okay, but the diffs are too huge for review?
[23:18:14] bd808: so don't use MediaWiki's composer file for extensions
[23:18:17] spagewmf: nobody should use mediawiki/vendor besides the deployment at WMF
[23:18:29] that's a stupid usage pattern and should not be supported
[23:18:41] put a composer.json in /extensions or whatever
[23:18:42] spagewmf: I'm pretty sure mediawiki/vendor is mentioned on the install-from-Git instruction page
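Spelled out, the signed-fork bump bd808 and jzerebecki describe at 23:09 would look roughly like the sketch below. None of this tooling exists today; the repository, remote, tag names and SHA are placeholders.

    # pull the new release from upstream into our reviewed fork (placeholder names throughout)
    git clone ssh://gerrit.wikimedia.org:29418/forks/example-lib && cd example-lib
    git remote add upstream https://github.com/example/example-lib.git
    git fetch upstream --tags

    # push the upstream release to Gerrit for review
    git push origin v1.2.3:refs/for/master

    # after review: sign the exact commit that was reviewed and publish the signed tag
    git tag --sign v1.2.3-reviewed <reviewed-sha>
    git push origin v1.2.3-reviewed

    # then update the local package index and make the composer.json patch that requires the signed tag

Everything before the final step is what makes this heavier than today's edit-composer.json-and-push workflow.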
[23:19:33] tgr: but we have the same problem for WMF. A growing number of libs in mediawiki/vendor are optional or required by extensions
[23:19:34] gwicke: here is a small one from one day of changes https://gerrit.wikimedia.org/r/#/c/250936/1
[23:20:13] tgr: and I think one of the main points of this RFC is that Wikidata wants to stop bundling their deps in the way that they do for WMF deployment
[23:20:16] gwicke: computers are better at binary-comparing two things
[23:20:37] jzerebecki: looks reasonable-ish, especially once you take into account which ones you trust (the ones you control yourself)
[23:20:42] bd808: I don't see the problem. Have a separate file for the dependencies of MediaWiki and the dependencies of your website, and use composer-merge-plugin to make one call the other
[23:20:59] but that all goes into composer.lock
[23:21:18] so you get a dirty diff and no freezing of known versions
[23:21:38] for optional dependencies
[23:21:51] gwicke: the generic idea is applicable to any package manager, but as people decided everyone needs their own package manager, every package manager needs to interface with e.g. GPG.
[23:21:52] ugh, true
[23:22:08] is it possible to have a merge plugin for lock files?
[23:22:33] it would have to go into the upstream
[23:22:47] and it may be time to start talking to them about that, I guess
[23:22:59] but I don't think they want it
[23:23:17] they are friendly to my plugin, actually very friendly
[23:23:25] a merge plugin for lock files won't work
[23:23:32] but in part because it takes care of a mess that they don't want
[23:23:53] conceptually, a lock file comes after dependency resolution
[23:24:14] so if you change which things to merge, you need to redo dependency resolution
[23:24:34] agreed
[23:24:55] but there could be a way to specify an alternate lock to read/write
[23:25:16] jzerebecki: I really like the idea in principle. My objection is more pragmatic: I simply doubt that we could add GPG support to all relevant package managers & signatures to dependencies without investing a lot of effort.
[23:26:29] gwicke: I think all the alternatives end up doing conceptually (from a trust perspective) the same thing, or degrade to curl|bash
[23:27:12] either you verify or you don't
[23:27:15] I thought signing was presented as a "this would be the alternative but it's obviously not possible" thing in the proposal, and people did not catch on to the "not possible" part
[23:27:32] spagewmf: I finally found it (but I had to look pretty hard) -- https://www.mediawiki.org/wiki/Download_from_Git#Fetch_external_libraries -- bullet #2
[23:27:38] if you verify, you need to trust other people to be able to handle such huge code bases
[23:28:04] there are shades of grey in how thoroughly you check dependencies
[23:28:50] that's how I prioritize those reviews, at least
[23:29:10] the same would apply to signing other people's code
[23:29:15] could we avoid the problem by making MediaWiki more library-like?
[23:29:26] gwicke: so once you have checked in the greyest way possible, how do you know later that this is the code you checked?
[23:29:45] jzerebecki: if it differs, it'll show up as a diff
[23:29:48] in the git review model
[23:30:12] i.e. separate composer.json for MediaWiki and for, say, the WMF website, and the latter requires the former?
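The split tgr is describing is roughly what core's composer-merge-plugin hook already supports: MediaWiki's composer.json merges a composer.local.json if one exists, so site- and extension-specific requirements can live outside the core file. A minimal sketch; the package name and the include pattern are only examples.

    # composer.local.json is merged into core's composer.json by wikimedia/composer-merge-plugin;
    # the package name and the include glob below are illustrative
    cat > composer.local.json <<'EOF'
    {
        "require": {
            "acme/example-lib": "^1.0"
        },
        "extra": {
            "merge-plugin": {
                "include": [ "extensions/*/composer.json" ]
            }
        }
    }
    EOF
    composer update --no-dev   # one solver run, one composer.lock, one vendor/ for core + site + extensions

As noted above, the catch is that everything still lands in the single composer.lock, which is where the dirty-diff and version-freezing complaints come from.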
[23:30:24] I guess that still does not make version pinning possible
[23:30:25] tgr: when making MW more library-like, I stumbled upon this problem
[23:30:33] in the deploy system, we reference the exact hash of the deploy repo
[23:30:48] so we know what we are getting there as well
[23:32:02] gwicke: so who does the grey review?
[23:32:28] jzerebecki: for services, we review each other's deploy repo updates
[23:32:38] tgr: I've never tested to see what happens if you have a library (e.g. MediaWiki) with a composer.lock and install it into another Composer-managed project, but I think the lock file would be ignored.
[23:33:12] I don't remember seeing any code in Composer that looks at locks other than the installer entry point
[23:33:12] bd808: I would expect so
[23:33:13] bd808, tgr: yes, the lock file is ignored in any dependencies
[23:33:23] Composer is not recursive, right?
[23:33:29] no
[23:33:39] in a sense it is recursive
[23:33:45] well, it is in gathering dependencies
[23:33:50] jzerebecki: a nice thing about the code-as-submodule model is that a simple code update (no dependency change) is just a hash diff
[23:33:50] yes
[23:33:52] to feed to the solver
[23:34:04] if we require library X and that requires Y, they both go to the vendor directory, right?
[23:34:34] so libraries don't each get their own clones of their dependencies
[23:35:30] gwicke: but MediaWiki on beta doesn't use submodules
[23:36:18] bd808: any idea how to solve the beta problem of a patch that gets deployed there needing a vendor update?
[23:36:18] that's different from e.g. npm, where you would get mainproject/node_modules/X/node_modules/Y
[23:36:41] but vendor and the patched repo are two different repos and we can only change one repo at a time
[23:36:47] jzerebecki: we have that problem too, but that's where the automatic vendor update could perhaps come in
[23:36:52] that difference makes Composer inherently hostile to version pinning, since pinned versions would conflict all the time
[23:37:28] gwicke: so you are suggesting automatic vendor updates without verification?
[23:37:40] yeah, for later code review
[23:37:49] we can lock down CI
[23:37:58] we need to, in any case
[23:38:10] jzerebecki: don't merge the dependent patch until vendor has been bumped? That has been the basic procedure so far. It sometimes requires forcing vendor if the new lib version breaks API compat and causes the tests to fail
[23:38:31] gwicke: we could do that, but I had the impression that we didn't want to
[23:38:53] with core, one issue I could see is somebody getting changes in random dependencies they don't know about
[23:38:56] we did fix the problem of it always needing to be forced by removing (or changing?) the check against core's composer.json
[23:39:12] so somebody knowledgeable about the entire system would need to review those dependency diffs
[23:39:18] I personally would find the lack of security appalling, but I can live with that
[23:40:16] but at least there would be a guarantee that the code has been tested with that exact version of dependencies
[23:40:24] yes
[23:41:19] gwicke: wanna propose that on the RFC's Phabricator task?
[23:41:27] I wonder how other large PHP-based websites deal with this
[23:41:38] jzerebecki: sure
[23:41:39] surely everyone uses Composer these days
[23:41:53] tgr: they don't care and do curl|bash
[23:42:05] jzerebecki: on my todo, possibly later tonight
[23:42:10] thx
[23:42:44] jzerebecki: I mean really large ones
[23:43:07] say, Etsy?
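The coordination bd808 describes at 23:38 (bump mediawiki/vendor first, then merge the core or extension patch that needs it) looks roughly like this today; the library name and version numbers are placeholders.

    # in a checkout of mediawiki/vendor.git
    git clone ssh://gerrit.wikimedia.org:29418/mediawiki/vendor && cd vendor
    # edit composer.json, e.g. "acme/example-lib": "1.2.0" -> "1.3.0", then re-resolve
    composer update --no-dev acme/example-lib
    git add -A
    git commit -m "Update acme/example-lib to 1.3.0"
    git push origin HEAD:refs/for/master    # send the composer.json change plus the vendored files to Gerrit

    # only once this change is merged does the dependent MediaWiki patch get merged,
    # so beta never sees code that needs a library version vendor does not have yet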
[23:43:26] I know that some do artifact-based deploys
[23:43:57] which are basically the automatic vendor update thing, except I'm not sure how much code review there is typically
[23:44:21] some also lock down dependencies via internal npm registries (in node land)
[23:45:08] Etsy could easily use versioned lock files.
[23:45:16] I still feel this would be solved by making MW more library-like
[23:45:19] we are the weirdos in the world on that point
[23:45:34] It could certainly be helped
[23:45:36] and have a composer.lock that's about the website, not the MW library
[23:45:40] I don't think anyone checks if what Composer outputs is actually what GitHub would give you directly
[23:45:52] (by making MediaWiki a library rather than an app)
[23:46:43] re: library. Most of it is already in the includes directory, right? TADA! :-)
[23:48:05] tgr: but checking each hash that is in the lock file seems even more work/complexity than what bd808 says my RFC would create
[23:48:11] This library + platform duality is exactly what led to inventing composer-merge-plugin and mediawiki/vendor.git
[23:48:25] bd808: thanks for the doc link. I updated https://www.mediawiki.org/wiki/Download_from_Git#Keeping_up_to_date . It used to say "picking up the latest changes is really easy" 8-|
[23:48:51] also, everyone running Composer is still required to do curl|bash
[23:49:04] jzerebecki: just run composer install and check the vendor file, it's not much different from how it's done now
[23:49:17] but nobody is required to run composer, the tarballs have vendor pre-populated
[23:49:25] then again, it doesn't offer any great benefit either
[23:49:27] for both core and extensions
[23:49:42] bd808: for any extensions?
[23:49:56] jzerebecki: I believe so, yes
[23:50:04] legoktm built all that tooling
[23:50:19] it's running composer when generating the tarball?
[23:50:25] yes
[23:50:29] heh
[23:50:41] so only the labs host gets p0wned
[23:50:47] yippee
[23:50:52] hasn't been pwn'd yet!
[23:50:54] and everyone using the tarball
[23:51:05] legoktm: how do you know ;)
[23:51:19] heh
[23:51:30] I haven't logged in since... a long time.
[23:51:50] https://phabricator.wikimedia.org/diffusion/TEXD/browse/master/nightly.py;8bd6c73cfbcd3ed484cb425ad35256d3ff6377ae$179
[23:51:52] that's the code
[23:53:03] fully puppetized, etc. I don't think I've even logged into the new labs VMs that run it
[23:53:03] so no merge-plugin?
[23:53:13] no, because each extension gets its own tarball
[23:53:41] so each extension has a local vendor that it has to autoload manually
[23:54:10] it's icky, but it works?
[23:54:13] if they happen to have different versions of the same thing... it won't work
[23:54:28] yeah.
[23:54:52] only for compat-breaking changes, but yeah
[23:55:00] first loader wins
[23:58:07] the RFC really needs a clearer problem statement, but as stated now, the path of least resistance seems by far to be improvements to our CI infrastructure to check out requirement changes
[23:58:47] tgr: what is unclear about https://www.mediawiki.org/wiki/Requests_for_comment/Streamlining_Composer_usage#Problems ?
[23:59:32] tgr: oh, and https://www.mediawiki.org/wiki/Requests_for_comment/Streamlining_Composer_usage#Package_integrity should probably be included there, at least that was what I argued
[23:59:35] i.e. Depends-On: mediawiki/vendor:I123456abc in the commit message, and then Jenkins needs to figure out to update mw/vendor to that
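tgr's closing idea, spelled out: the dependent patch would declare the mediawiki/vendor change it needs in a commit-message footer, and Jenkins/Zuul would have to check out vendor at that change before testing. This is a proposal, not current CI behaviour; the change ID is the placeholder from the discussion.

    # hypothetical footer on the dependent core/extension patch (proposal only);
    # CI would need to learn to check out mediawiki/vendor at change I123456abc before running the tests
    git commit -m 'Use the new library API

    Depends-On: mediawiki/vendor:I123456abc'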