[04:25:37] DanielFriesen: i like the lib/cpucount.rb thing
[09:29:37] https://meta.wikimedia.org/w/index.php?diff=5369131&oldid=5366554&rcid=4044975
[10:30:45] [[Tech]]; Patrick; Specifying a color for interwiki links; https://meta.wikimedia.org/w/index.php?diff=5369221&oldid=5367411&rcid=4045084
[11:08:09] hi, anybody here? my posts to wikitech are not going through :/
[11:08:38] could something be done about it?
[11:08:56] MatmaRex: I could forward your post(s) for you to start with
[11:09:20] nemowiki at gmail
[11:09:56] let me try once again first
[11:10:39] i just noticed i might have been sending it from the wrong address
[11:10:50] :)
[11:11:00] but if that was the case, it would be nice of the mailman to point it out or something, eh
[11:11:15] it does
[11:11:33] but probably the list admin set it to reject everything silently due to excessive spam
[11:12:10] hm.
[11:12:21] well, it seems to have worked now, the post shows up in the archives
[11:12:26] false alarm, sorry :)
[12:11:57] [[Tech]]; Patrick; found answer myself after checking https://bits.wikimedia.org/en.wikipedia.org/load.php?debug=true&lang=en&modules=skins.vector&only=styles&skin=vector&*; https://meta.wikimedia.org/w/index.php?diff=5369343&oldid=5369221&rcid=4045250
[14:52:45] http://toolserver.org/~hersfold/newfakeSULutil.php?username=SyriaAram failed to connect on Cluster 6.
[14:54:25] Jeff_G: the toolserver channel might be a better location :)
[14:56:13] http://toolserver.org/~quentinv57/tools/sulinfo.php?showinactivity=1&showblocks=1&username=SyriaAram says "Warning : The SQL server s6 is down or having issues. Consequently, the following wikis won't be displayed : ruwiki, jawiki and frwiki." and "Due to an issue on Toolserver's s7 databases server, these informations cannot been displayed. Please wait that the problem is fixed or request help at #wikimedia-toolserver."
[14:56:31] SQL server s6 is handled in this channel, no?
[14:58:39] Jeff_G: not the toolserver replication of it, no
[15:00:26] ok, sorry for the intrusion.
[21:17:44] when was http://liquidthreads.labs.wikimedia.org re-killed? O_o
[21:17:48] I saw no notice about it
[21:20:06] IIRC all of the four-part domains should've been renamed/deleted
[21:21:38] https://gerrit.wikimedia.org/r/#/c/53487/
[21:21:52] ah, so it's just because renaming was too boring? :)
[21:23:05] Tim-away,
[21:23:46] Who knows a lot about editprotected?
[21:25:01] hoo: :D
[21:25:12] Yes? :P
[21:26:31] hoo, do you know a lot about editprotected?
[21:27:09] hoo is a mediawiki master
[21:27:17] :P
[21:27:27] Quite a lot, yes
[21:28:00] Does the right allow access to any page that is cascade-protected, either directly or indirectly?
[21:28:37] Cyberpower678: You mean editing? No
[21:28:44] It doesn't "break" cascade
[21:29:09] ?
[21:29:17] break?
[21:29:51] Cyberpower678: You can't edit cascade-protected pages with editprotected
[21:30:45] I thought so.
[21:44:04] * Cyberpower678 is away: auto-away
[21:52:32] Why are the dumps 'tar'ed and then '7z'ed?
[21:53:04] Bruntgrunt: 7z is extremely slow on a lot of files, AFAIR
[22:03:56] Bruntgrunt, which dumps?
[22:04:24] you probably refer to the static ones...
[22:08:50] hoo: I'm not sure that answer's Bruntgrunt's question :)
[22:08:55] answers*
[22:09:22] Krenair: huh? Running 7z on one file is ok, while on many it is a pain
[22:09:37] AFAIR
[22:09:47] Why are the dumps 'tar'ed and then '7z'ed?
[22:09:49] Bruntgrunt: 7z is extremely slow on a lot of files, AFAIR
[22:09:54] it's not that it's painful
[22:10:02] Bruntgrunt: In the Unix world it is normal that a compression program can only handle one file; tar is therefore used to collect the files first
[22:10:08] TimStarling: Just slow, no? :P
[22:10:10] it's just that 7z needs to load the whole directory into memory, and there wasn't enough RAM
[22:10:24] (no idea if 7z can handle more than 1 file)
[22:10:28] DaBPunkt, it can
[22:10:29] I think there was 8GB available, which was nowhere near enough
[22:10:40] tar doesn't have that limitation
[22:10:52] I guess another reason is that it compresses better when treating the input as a single file
[22:10:58] it just streams in the filenames
[22:11:09] no, it doesn't compress better
[22:11:28] 7z has a "solid archive" feature, which I used until I discovered the RAM problem
[22:12:02] I was wondering if it was an option
[22:12:33] can someone solve this bug? it is occurring again: https://bugzilla.wikimedia.org/show_bug.cgi?id=31577
[22:12:57] it's odd that it can't process the filenames on the fly
[22:13:07] otoh, it's uncommon to have so many files :)
[22:14:53] IIRC the limitation also applied to decompressing
[22:15:04] so you'd need 16GB to decompress as well
[22:15:15] anyway, I don't know what you'd want with a dump that's so old
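
A sketch of the tar-then-7z pipeline discussed above, in Python for concreteness. This is illustrative only: the helper name and paths are made up, and the real dump scripts are not shown in the log. The point is the one TimStarling makes: tar streams file names and contents as it goes, so piping its output into 7z as a single stream avoids 7z having to index the whole directory in RAM.

    import subprocess

    def tar_then_7z(src_dir, archive_path):
        """Stream a tar of src_dir straight into 7z, so 7z compresses one
        stream instead of indexing every file itself.

        Hypothetical helper for illustration; not the actual dump script.
        """
        # tar -cf - <dir>: write an uncompressed tar of the directory to stdout
        tar = subprocess.Popen(["tar", "-cf", "-", src_dir],
                               stdout=subprocess.PIPE)
        # 7z a -si <archive>: read the data to compress from stdin
        subprocess.run(["7z", "a", "-si", archive_path],
                       stdin=tar.stdout, check=True)
        tar.stdout.close()
        if tar.wait() != 0:
            raise RuntimeError("tar exited with an error")

    # Usage (hypothetical paths):
    # tar_then_7z("enwiki-static", "enwiki-static.tar.7z")

The shell equivalent is a one-liner: tar -cf - <dir> | 7z a -si <archive>. Per the log, the motivation for tar-first is memory use, not compression ratio.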
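
Earlier in the log, hoo states two facts about the editprotected right: it lets you edit directly protected pages, but it does not "break" cascade protection. A toy model of just those semantics, with made-up page flags and rights names (this is not MediaWiki's actual permission code):

    # Toy model of the editprotected exchange above; not MediaWiki's real
    # permission check, only the two facts stated in the log.

    def can_edit(page, user_rights):
        # A directly protected page is editable with the editprotected right.
        if page.get("protected") and "editprotected" not in user_rights:
            return False
        # Cascade protection (applied directly or inherited via transclusion)
        # is a separate check that editprotected alone does not pass; "sysop"
        # stands in here for whatever the real rule requires.
        if page.get("cascade_protected") and "sysop" not in user_rights:
            return False
        return True

    # A user whose only extra right is editprotected:
    # can_edit({"protected": True}, {"editprotected"})          -> True
    # can_edit({"cascade_protected": True}, {"editprotected"})  -> False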