[06:07:08] benjaoming: it did
[06:08:38] Even the thread you linked says that there is a ZIM with pictures
[06:20:23] Nemo_bis: Thanks again, yes, just saw that... trying it out. AFAIK it's a version with non-enlargeable images... for sure their rendering method is quite nice... they render the HTML through their own local MediaWiki installation and store it as static HTML, then download inline images from articles... quite a lot of computing... and a really nice format... will rethink whether it's worth distributing full MediaWiki installations with their own MySQL db.
[06:21:14] benjaoming: no, Kiwix no longer uses its own MediaWiki install, that's history luckily
[06:21:45] Also, I've signed up for the Wikimedia offline mailing list... didn't know there was a team like that -> https://lists.wikimedia.org/mailman/listinfo/offline-l
[06:21:54] Good!
[06:22:12] There are also other efforts in addition to Kiwix (all volunteer stuff)
[06:22:20] The embedded images can be made larger if you prefer, it's just an option IIRC
[06:22:44] Nemo_bis: Yes, but the creation of the ZIM files happens from a real installation of Wikipedia, pretty cool stuff, probably dumb to redo
[06:23:53] benjaoming: as I always say, doing wikitext parsing yourself is a last resort
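(A side note on reading those ZIM files locally: no MediaWiki install is needed at all. Below is a minimal sketch, assuming the python-libzim bindings, a tool that postdates this log; the file name and the article path are hypothetical and vary per archive.)

```python
# Minimal sketch of reading one article out of a ZIM archive, assuming
# the python-libzim bindings (pip install libzim). "wikipedia_en.zim"
# and the entry path "A/Finland" are hypothetical examples.
from libzim.reader import Archive

zim = Archive("wikipedia_en.zim")

# The main entry is the archive's landing page.
print("main entry:", zim.main_entry.get_item().path)

# Fetch one article by its internal path and decode its static HTML.
entry = zim.get_entry_by_path("A/Finland")
html = bytes(entry.get_item().content).decode("UTF-8")
print(entry.title, "->", len(html), "bytes of HTML")
```

The payoff of Kiwix's static-HTML approach is visible here: the entry's content is the finished page, so the reader never has to parse wikitext itself.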
[17:33:13] this has probably been asked before, but why aren't things at download.wikimedia.org set up as torrents?
[17:34:33] if they were, the server(s) everything is on could serve as a permanent leecher, and other downloaders could leech if they wanted; it would only decrease traffic to that server, not increase it
[17:34:47] download.wikimedia.org has many defects
[17:35:03] * seeder, not leecher
[17:35:04] So far, it was thought not to have bandwidth issues
[17:35:33] Do you have reason to believe it has issues with bandwidth?
[17:35:35] sure, it's probably negligible compared to what the wikis themselves use
[17:35:53] i just think it's much slower than what i can get with a torrent
[17:35:58] Not quite the point; there were in fact some times when the server was too slow
[17:36:14] That might be a networking issue between you and the server
[17:36:29] i'm downloading the most recent wikidata dump (4.5 GB), and it's set to be finished in around 24 hours; with torrents that size, it usually takes me one hour
[17:37:27] Hm https://ganglia.wikimedia.org/latest/graph_all_periods.php?h=dataset1001.wikimedia.org&m=cpu_report&r=month&s=by%20name&hc=4&mc=2&st=1421948209&g=network_report&z=large&c=Miscellaneous%20eqiad
[17:39:18] Jhs: I'm trying https://dumps.wikimedia.org/svwiki/20150101/svwiki-20150101-pages-meta-history.xml.7z from a Finnish server and it's on average 3 MB/s
[17:40:19] Finnish server?
[17:41:06] Yes, aren't you still in that area of the world?
[17:42:20] Nemo_bis, yeah, hehe, but where did you find this Finnish server? :P
[17:44:34] Jhs: I have two even
[17:45:06] I'm using the one mentioned in https://archive.org/details/wikiteam
[17:45:31] 7 minutes later... I'm at 2300 KiB/s average
[17:45:41] Quite slow still, but almost decent
[17:47:24] i'm around 70 KiB/s average... >_< but that's from the official site
[17:48:07] me too...
[17:49:30] Jhs: what's the exact URL you're downloading?
[17:49:46] Nemo_bis, http://dumps.wikimedia.your.org/wikidatawiki/20150113/wikidatawiki-20150113-pages-articles.xml.bz2
[17:50:01] I'm trying https://dumps.wikimedia.org/wikidatawiki/20150113/wikidatawiki-20150113-pages-articles.xml.bz2 and it does look slower, still above 1 MB/s though
[17:50:07] Ok, that's the same
[17:50:23] IMHO, you should file a bug report with a traceroute and everything
[17:51:14] https://old-bugzilla.wikimedia.org/show_bug.cgi?id=60283 should show the kind of questions you'll get
[17:54:19] i'll take a look
[19:12:10] Jhs: fwiw, download completed with an average of 1252 KB/s
[19:12:28] hmmm
[19:12:43] i'm still sitting at around 70-80 KB/s
[19:12:55] only 17 hours left
[21:12:27] * se4598_d rants: phab doesn't load / is veeery slow via my ISP atm, but if I access it via an external VPN (through which it loads normally), then YouTube is too slow... ;)
[22:59:26] Is commonswiki still in read-only mode?
[22:59:34] And why was it in read-only mode?
[23:00:09] The wiki is currently in read-only mode at Thu, 22 Jan 2015 21:23:36 GMT served by mw1117 ++++
[23:01:33] nothing in the logs
[23:01:34] hmm
[23:02:15] rillke: it doesn't look to be now
[23:02:22] I have reports from Thu, 22 Jan 2015 21:23:08 GMT to Thu, 22 Jan 2015 21:23:36 GMT
[23:02:35] so just a few seconds
[23:02:49] when does this usually happen? Replication lag?
[23:02:56] Someone pressing a button?
[23:03:06] a short period like that, yeah, likely replag
[23:03:17] Are there a lot of people voting in the photo competition atm?
[23:03:53] I'd say moderately, according to our recent changes
[23:05:15] We'll expect more when the CentralNotice banners are running
[23:06:01] a user can spawn tens of edits in seconds without having to wait for pages to load
[23:06:43] ah, that's the problem, I see
[23:07:13] https://noc.wikimedia.org/dbtree/
[23:07:23] It's a guess on my part, but it seems likely
[23:07:28] shows a lot of servers 'Not Reporting or Replicating'
[23:07:32] But making the UX crappier by requiring a page re-load or showing a countdown is not really an option
[23:07:44] Krenair: That's possibly because Ganglia was broken for MySQL
[23:07:51] rillke: I didn't say it was
[23:08:02] is it still?
[23:08:33] I thought Sean fixed it
[23:08:39] they're appearing at least
[23:09:44] Not completely
[23:09:51] slaves aren't showing MySQL stats in Ganglia
[23:09:54] let me reopen my task
[23:11:20] https://phabricator.wikimedia.org/T87209
[23:29:41] why can't I save anything...?
[23:36:10] Trijnstel, which wiki?
[23:36:19] well, I guess it's solved or so
[23:36:28] I got this error:
[23:36:30] 12:27 AM A database query error has occurred. This may indicate a bug in the software.
[23:36:30] 12:27 AM Function: WikiPage::updateRevisionOn
[23:36:30] 12:27 AM Error: 1205 Lock wait timeout exceeded; try restarting transaction (10.64.16.13)
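(On the mirror comparison above, roughly 70 KiB/s from dumps.wikimedia.org versus 3 MB/s from a mirror: a quick way to reproduce such figures is to stream part of the file and time it. A rough sketch, assuming the Python requests library; the URL is the one quoted in the log.)

```python
# Rough mirror speed test: stream the start of a dump and report the
# average rate, comparable to the KiB/s figures quoted in the log.
import time
import requests

URL = ("https://dumps.wikimedia.org/wikidatawiki/20150113/"
       "wikidatawiki-20150113-pages-articles.xml.bz2")
SAMPLE = 50 * 1024 * 1024  # sample the first 50 MiB, then stop

start = time.time()
received = 0
with requests.get(URL, stream=True, timeout=30) as resp:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=64 * 1024):
        received += len(chunk)
        if received >= SAMPLE:
            break

elapsed = time.time() - start
print(f"{received / elapsed / 1024:.0f} KiB/s average over {elapsed:.1f} s")
```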
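(On the read-only blips around 23:02, attributed to replication lag: MediaWiki exposes per-replica lag through the siteinfo API, siprop=dbrepllag. A quick check against Commons, sketched in Python with the requests library, looks roughly like this.)

```python
# Check current replication lag on Commons via the MediaWiki API
# (action=query&meta=siteinfo&siprop=dbrepllag).
import requests

resp = requests.get(
    "https://commons.wikimedia.org/w/api.php",
    params={
        "action": "query",
        "meta": "siteinfo",
        "siprop": "dbrepllag",
        "sishowalldb": 1,  # report every replica, not just the most lagged
        "format": "json",
    },
    timeout=10,
)
resp.raise_for_status()

for db in resp.json()["query"]["dbrepllag"]:
    print(f"{db['host']}: {db['lag']} s behind")
```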
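(And on the 1205 error Trijnstel pasted: that is MySQL's lock wait timeout, and the message's own advice is to restart the transaction. A sketch of that retry pattern, using PyMySQL purely for illustration; MediaWiki itself handles this in PHP, and run_with_retry is a hypothetical helper, not anything from the log.)

```python
# Error 1205 means a row lock was held longer than
# innodb_lock_wait_timeout; the transaction can simply be retried.
import time
import pymysql

def run_with_retry(conn, sql, params=(), attempts=3):
    """Run one statement in a transaction, retrying on lock wait timeouts."""
    for attempt in range(1, attempts + 1):
        try:
            with conn.cursor() as cur:
                cur.execute(sql, params)
            conn.commit()
            return
        except pymysql.err.OperationalError as e:
            conn.rollback()
            if e.args[0] != 1205 or attempt == attempts:
                raise  # not a lock wait timeout, or out of retries
            time.sleep(2 ** attempt)  # back off before restarting
```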