[13:16:06] Oh wow
[13:16:19] The complete update of core and extensions took 15 minutes
[13:30:09] It's not quick, but I don't think it takes that long for me
[14:22:07] Why does profiling data take forever to open?
[14:26:47] vvv, git update takes me about 11m
[16:35:00] Takes just under 20 for me, Reedy. I blame it on the netbook, though.
[16:35:22] real 10m3.694s
[16:35:24] On a VM
[16:36:23] Okay, now again.
[16:36:55] My eee 901 has 2 SSDs :p
[16:37:43] This VM host is low-powered: dual-core Athlon II Neo 36L (2 x 1.297GHz)
[16:38:57] Most of the time, really, is in the network, but I'm on a wire.
[16:40:21] Is there any hypothetical way to optimize that?
[16:40:32] Probably by reusing the SSH connection or something like that
[16:41:11] parallel.
[16:43:41] Reedy: warning: remote HEAD refers to nonexistent ref, unable to checkout. (in mediawiki)
[16:44:08] That may be a good idea
[16:44:17] * YuviPanda gives GNU parallel a hug
[16:44:38] * vvv uses xargs for that
[16:44:50] vvv: xargs does parallel?
[16:44:57] without tricks?
[16:45:29] Yes, if you use -n and -P
[16:45:38] hmm, this I did not know
[16:46:04] -n 1 to run stuff in separate processes, -P N to run N processes simultaneously
[16:49:52] Oh wow
[16:50:03] find -maxdepth 1 -type d | sed -e'/^.$/d' -e'/.git/d' | xargs -n 1 -P 16 ./gitupdate
[16:50:10] That updates stuff in ~30 seconds
[17:34:25] Okay, I wrote a script for that
[17:43:04] Oh, forgot to report to Reedy: Total time: 1567.6870670319 seconds
[17:43:04] real 26m8.154s
[18:09:31] https://gerrit.wikimedia.org/r/#change,6072
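
A minimal, runnable sketch of the parallel-update pattern discussed above. `./gitupdate` in the log is the user's own per-repository script (not shown), so a stand-in that just echoes the directory it received is used here; the directory names are hypothetical. The key mechanism is the one described at 16:46:04: `-n 1` hands each child process exactly one argument, and `-P 16` keeps up to 16 children running concurrently.

```shell
# Hypothetical checkout layout: two extension dirs plus a .git dir to skip.
mkdir -p demo/ext_a demo/ext_b demo/.git
cd demo

# List immediate subdirectories, drop "." and the .git directory,
# then fan out: -n 1 = one directory per process, -P 16 = up to 16 at once.
# sh -c '...' receives the directory as $0. (sort only makes output stable.)
find . -maxdepth 1 -type d \
    | sed -e '/^\.$/d' -e '/\.git/d' \
    | xargs -n 1 -P 16 sh -c 'echo "updating $0"' \
    | sort
```

In the real run, the stand-in `echo` would be replaced by the per-repo update command, which is where the speedup comes from: the network-bound fetches overlap instead of running one after another.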