[08:29:48] Are there any docs on using "modern JS builds" / requirements for MediaWiki / Wikipedia written out anywhere?
[08:29:56] as in guidelines that say that it should be used where possible?
[08:30:05] and is it the sort of thing that would come up in a performance review?
[15:16:49] We generally don't provide two versions of code (e.g. to save a few bytes), neither by default nor as a general recommendation. However, on a case-by-case basis it can come up in perf review when it makes a net win in terms of performance and complexity. For example, if you need WeakMap, need the feature to work in older browsers, and are okay with it not actually being a weak map in those older browsers, then you could ship the ES6 polyfill (e.g. the same one that Babel might include for you), which ResourceLoader can automatically leave out for modern browsers with the skipFunction feature.
[15:17:05] This is how we shipped the es5, json, and domnode polyfills in the past.
[15:18:05] In general though, we focus more on load time and interaction performance, where byte savings rarely help. So I'd generally recommend focusing efforts on reducing code complexity and dependencies, instead of splitting/compressing.
[15:19:28] thanks!
[20:43:01] dpifke: xhgui import LGTM
[20:43:04] https://performance.wikimedia.beta.wmflabs.org/xhgui/run/view?id=5ef5168d88e941be5935492c
[20:43:07] https://performance.wikimedia.beta.wmflabs.org/xhgui-old/run/view?id=5ef5168d88e941be5935492c
[20:43:16] assuming that the -old one here came in on Mongo
[20:43:30] and the canonical URL is imported into MySQL and served from there
[20:53:22] Yup. -old is just Mongo, new is just Maria.
[20:54:25] dpifke: is the new/empty instance in prod web-accessible currently?
[20:55:19] Not externally. You could SSH port-forward to it.
[20:55:23] right
[20:55:51] so in what order do you want to go from here with the write switch, web front switch, and import?
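The skipFunction idea mentioned above boils down to a feature-detection check evaluated in the browser: if the check passes, ResourceLoader skips loading the polyfill module entirely. A rough sketch of what such a check might look like for WeakMap (function name and exact checks are illustrative, not the actual shipped skip function):

```typescript
// Illustrative feature-detection check in the spirit of a ResourceLoader
// skip function: returns true when the browser already provides a usable
// WeakMap, meaning the polyfill module can be skipped.
function skipWeakMapPolyfill(): boolean {
  return (
    typeof WeakMap === "function" &&
    typeof WeakMap.prototype.get === "function" &&
    typeof WeakMap.prototype.set === "function"
  );
}
```

In a real module definition the skip function is registered alongside the polyfill, so modern browsers never pay the bytes for code they don't need, which matches the "net win in performance and complexity" criterion described above.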
[20:57:18] I'll do a first-pass data migration this afternoon.
[20:57:51] Then ideally repointing MediaWiki and swapping the destination of /xhgui happens ~simultaneously.
[20:58:10] Then we can do a second migration pass to catch any records written in the meantime.
[20:58:22] right, to see if anything breaks during the import, and because that'll likely be the slowest part
[20:58:34] how long do you estimate it would take if done simply/as-is?
[20:59:20] Should be on the order of minutes. There were only ~400 records in beta and it only took a couple of seconds.
[20:59:41] (So kinda hard to extrapolate.)
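The two-pass migration described above (bulk copy, cut over writes, then a catch-up pass for records written in the meantime) can be sketched with in-memory stand-ins for the Mongo source and MariaDB destination. All names and the timestamp cutoff logic here are hypothetical, not the actual import tooling:

```typescript
// Sketch of a two-pass migration. Dedup by id makes each pass idempotent;
// the `since` cutoff lets the second pass scan only records written after
// the first pass started.
type Run = { id: string; ts: number };

function migrate(source: Run[], dest: Map<string, Run>, since = 0): number {
  let copied = 0;
  for (const run of source) {
    if (run.ts >= since && !dest.has(run.id)) {
      dest.set(run.id, run);
      copied++;
    }
  }
  return copied;
}
```

First call: `migrate(allRuns, dest)` while writes still go to the old store; after repointing, `migrate(allRuns, dest, firstPassStart)` picks up only the stragglers. Because the destination check is by id, re-running a pass is harmless.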