[01:10:15] Reedy: https://pt.wikipedia.org/w/index.php?search=media%3Atest.jpg&title=Especial%3APesquisar&uselang=en <- clues regarding that fatal error?
[01:10:34] [66e05f62] 2012-07-01 01:08:42: Fatal exception of type MWException
[01:10:47] We're getting a few appearing like that recently
[01:10:58] that one is repeatable.
[01:11:03] The others are
[01:11:08] or some of them are, at least
[01:11:39] https://bugzilla.wikimedia.org/show_bug.cgi?id=37978
[01:11:48] https://bugzilla.wikimedia.org/show_bug.cgi?id=38035
[01:12:13] Also, that link is logged on https://bugzilla.wikimedia.org/show_bug.cgi?id=37131
[01:12:15] 37131 too, apparently
[01:12:17] heh
[01:12:41] kk, thanks
[01:12:56] I think this needs a parent bug
[01:13:02] Something has changed with the error handling somewhere
[01:13:16]
[01:13:29] ^ person who pointed me at it
[01:13:58] heh
[01:14:54] hi! ;-)
[01:15:35] Doesn't even appear in the error logs due to it being seemingly "handled"
[01:16:07] try/catch is wonderful.
[01:16:54] stuck it in wikimedia for the time being
[01:16:54] https://bugzilla.wikimedia.org/show_bug.cgi?id=38095
[03:16:39] !log LocalisationUpdate completed (1.20wmf6) at Sun Jul 1 03:16:39 UTC 2012
[03:16:50] Logged the message, Master
[04:01:12] What's up with the lag? It looks like it's been high (over 5, see [[mw:Maxlag]]) since about 19:34 on 2012-06-29.
[04:12:42] db32 is being silly
[04:12:48] as usual!
[04:25:26] !log LocalisationUpdate completed (1.20wmf5) at Sun Jul 1 04:25:25 UTC 2012
[04:25:37] Logged the message, Master
[05:33:22] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php 'logging'
[05:33:32] Logged the message, Master
[09:33:52] Hi all. http://etherpad.wikimedia.org seems to have gone down?
[09:37:39] Currently reporting 502 Proxy Error.
[09:39:33] ah, phew, back up. :)
[12:10:15] Krenair: answered your memo :-]
[13:21:04] Where can I find info on the softwares performing the back-up, daily dump production and status.wikimedia.org of Wikimedia sites.. I need to set this up for Consumium ..
[13:21:45] and what tables are needed to dump in modern MW to comply with GFDL ?
[13:22:50] used to be cur and old and the other tables could be reconstructed from those tables .. but this was in the mid 00's ..
[13:23:16] I mean to comply with the "providing hardcopy" of the licenced material
[13:25:13] I really want people to be able to download dumps of Consumium so that even if catastrophic failure is experienced like a fire at the Server Hall and my flat and my friends flat catching fire at the same time and I lose my USB stick
[13:26:14] that would lead to catastrophic loss if it were not for the copies that have been downloaded and placed off-line
[13:26:23] jubo2: By any chance, do you mean Consumerium?
[13:26:35] Hydriz: yeah .. Consum(er)ium
[13:26:40] at http://consumerium.org/?
[13:27:11] if thats the case, okay :)
[13:27:30] yeah. http://develop.consumerium.org/wiki/ will be 10 yrs online using the wonderfully stabile and secure MW software on 2012-03-10 ..
[13:28:12] before 2003-03-10 we had only static site and PHBB discussion forum ..
[13:28:31] oops.. I meant 2013 in that first date nat'lly
[13:29:04] Seems like a small wiki to me...
[13:31:15] Hydriz: I'm going to make server no. 2 of the Consum(er)ium high security back-up machine that verifies the backups hourly or so called backup.consumium.org ( also runs status.consumium.org and dumps.consumium.org ), then the 3rd server will be the 2nd production server and that and 3, 4, 5, 6, 7 etc. all will be backupped to the server no.2
[13:31:53] use ssh keys instead of passwords.. one sudoer and the only user on the system ..
[13:32:13] server no. 3 can then provide for toolserver type of stuff
[13:33:05] use anachronistic schedulers and rsync-over-ssh and the said SSH keys instead of passwords
[13:34:12] So I need into technology behind status.wikimedia.org and dumps.wikimedia.org .. So I can implement similar servers
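A minimal sketch of the scheduled pull-backup job jubo2 outlines above: rsync over SSH with key-based authentication, driven from cron or anacron ("anachronistic schedulers" presumably means anacron). The hostnames, paths and key file below are placeholders, not Consumium's actual layout:

```python
#!/usr/bin/env python3
# Hypothetical hourly backup job for backup.consumium.org: pull files from the
# production host over rsync+SSH using a key instead of a password.
# Hostnames, paths and the key file are illustrative placeholders.
import os
import subprocess
import sys
import time

SOURCES = ["wiki@production.consumium.org:/var/www/wiki/"]  # hosts to back up (placeholder)
DEST_ROOT = "/srv/backups"                                  # local backup root (placeholder)
SSH_KEY = "/home/backup/.ssh/id_rsa_backup"                 # key-based auth, no passwords

def pull(source, dest):
    """Mirror one remote tree into a timestamped local directory."""
    cmd = [
        "rsync", "-az", "--delete",
        "-e", "ssh -i %s -o BatchMode=yes" % SSH_KEY,
        source, dest,
    ]
    return subprocess.call(cmd)

def main():
    stamp = time.strftime("%Y%m%d-%H")
    failures = 0
    for src in SOURCES:
        dest = "%s/%s/" % (DEST_ROOT, stamp)
        os.makedirs(dest, exist_ok=True)
        if pull(src, dest) != 0:
            failures += 1
    sys.exit(1 if failures else 0)  # non-zero exit so cron/anacron reports the failure

if __name__ == "__main__":
    main()
```

An hourly cron or anacron entry on the backup host would then match the "verifies the backups hourly" plan described above.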
[13:35:36] jubo2: The dumps.wm.o infrastructure is at https://gerrit.wikimedia.org/r/gitweb?p=operations%2Fdumps.git;a=shortlog;h=refs%2Fheads%2Fariel
[13:35:50] though I have never heard of anyone using the code there
[13:36:10] because, I believe, its messy and is only usable by Wikimedia
[13:36:42] but if you are intending to make backups of your wiki, I strongly recommend using dumpBackup.php in /maintenance/
[13:56:17] :|
[14:05:02] jubo2: And I have just archived the wiki: http://archive.org/details/wiki-develop.consumerium.org
[14:08:37] jubo2: And I have just archived the wiki: http://archive.org/details/wiki-develop.consumerium.org
[14:13:25] Hydriz: archive.org contains archives starting from 2002-12
[14:13:42] yeah, the text based
[14:13:57] the dump I made can be reimported into any MediaWiki installation
[14:14:29] Is gerrit having a touch time?
[14:14:34] tough, I mean.
[14:14:55] Hydriz: what is "developconsumeriumorg_w-20120701-history.xml.7z"
[14:15:08] the files for the XML dump
[14:15:19] thats also what Wikimedia generates for their own wikis too
[14:15:37] Hydriz: using Special:XML_dump or something similar ?
[14:15:48] Special:Export to be exact, but yeah
[14:16:18] then on another wiki, Special:Import can import the pages in
[14:16:21] Hydriz: and the XML dump contains what ?
[14:16:34] all the history of every page in the wiki
[14:16:34] Hydriz: oh cool i gotta try that ..
[14:16:59] but apparently not the deleted stuff
[14:17:03] eh, you got to extract the files out from the .7z files too
[14:17:08] yeah, not the deleted stuff
[14:17:55] deleted stuff ain't publicly accessible :P
[14:21:20] Hydriz: and should a catastrophic scenario of fire at my flat, fire at the hosting peoples, fire or thefth at friends and I lose my USB stick we can retrieve the whole contents of the wiki from the archive.org archive ?
[14:21:36] yeah
[14:21:43] kewl.tnxman
[14:21:49] and as you said, excluding deleted stuff
[14:22:13] and it has to be updated regularly later on
[14:22:17] Hydriz: to be honest we
[14:22:51] Hydriz: to be honest we've had two near catastrophes.. One in 2006 and one in 2010 but here we are 2012
[14:23:36] Hydriz: I'm looking for a system where in addition to taking back-ups the backups are verified to load correctly into MySQL
[14:23:38] hmm, this is why its important to have backups :)
[14:24:09] you can dump the SQL database, but it can only be done at your end
[14:24:17] Hydriz: 2006 incident was MW php inserting too large keys for MySQL4 ..
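A hedged sketch of the dump-and-verify step discussed above: generate a full-history XML dump with maintenance/dumpBackup.php (Hydriz's recommendation) and sanity-check the output before trusting it as a backup, as a cheap analogue of jubo2's wish that "backups are verified to load correctly". The wiki install path and output file are placeholders, and the exact flags should be checked against the target MediaWiki version's --help:

```python
#!/usr/bin/env python3
# Generate a full-history XML dump with MediaWiki's dumpBackup.php and do a
# basic sanity check on the result before calling the backup good.
# WIKI_DIR is a placeholder; flag names may differ between MW versions.
import subprocess
import xml.etree.ElementTree as ET

WIKI_DIR = "/var/www/wiki"              # MediaWiki install root (placeholder)
DUMP_FILE = "/srv/backups/wiki-full.xml"

def make_dump():
    with open(DUMP_FILE, "w") as out:
        # --full = all revisions of all pages; dumpBackup.php writes XML to stdout
        subprocess.check_call(
            ["php", "maintenance/dumpBackup.php", "--full"],
            cwd=WIKI_DIR, stdout=out)

def verify_dump():
    """Check the dump is well-formed XML and actually contains pages."""
    pages = 0
    for _, elem in ET.iterparse(DUMP_FILE):
        if elem.tag.endswith("}page") or elem.tag == "page":
            pages += 1
            elem.clear()                # keep memory use flat on bigger dumps
    if pages == 0:
        raise RuntimeError("dump parsed but contains no <page> elements")
    return pages

if __name__ == "__main__":
    make_dump()
    print("dump OK, %d pages" % verify_dump())
```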
[14:24:35] was on FreeBSD based solution and MySQL5 wasn't available
[14:24:44] the dump I generate can be imported into the wiki using some MediaWiki-based scripts
[14:26:02] then 2010 the back-ups were faulty too, contained only umpty tables of umptydumpty total tables for undisclosed reasons
[14:26:39] I've made sure that the old server cannot know the sudo password of the new server
[14:26:49] should it be compromised
[14:27:51] Hydriz: I'm going to go post that link to the http://archive.org/details/wiki-develop.consumerium.org onwiki
[14:28:17] remember: Its just a snapshot of the wiki at the time the dump it was made
[14:28:32] so, perhaps, in the future it has to be updated again
[15:18:22] jubo2, also user data is not at the xml dumps...
[15:37:05] Platonides: good. good. that way is better for us
[15:39:06] The user's voting pages in Consumium will not be copylefted, If users wanna make copies of own vote lists into another GFDL + CC-BY-SA'ed namespace
[15:39:56] then organizations can compile lists of those vote lists that are GFDL'ed by moving them from User-space
[15:40:06] which nat'lly is NOINDEX, NOFOLLOW
[15:40:31] jubo2, user pages are in the dumps
[15:40:40] (well, you could exclude them...)
[15:40:44] then what is user data
[15:40:48] I refered to things like user preferences
[15:40:52] or password hashes
[15:41:03] those would obviously need to be backed up separatedly
[15:41:52] Platonides: yeah.. dump and load to empty database.. strip User-namespace with raw SQL and dump again..
[15:42:27] but this needs to be modified as it makes more sense to dump XML instead of SQL
[15:44:01] jubo2, no need to do that
[15:44:38] it's strange
[15:44:47] I was sure maintenance/dumpBackup.php contained a parameter to choose which namespaces to dump
[15:44:53] but I don't see it now
[15:48:32] Platonides: I need to remove the pages listed in http://develop.consumerium.org/wiki/Voting as we do not want anyone to index or copy, we try to protect the copyright of the users .. we have to have an open voting system so that any attempts to skew the system can be met with formulating anti-skewing-formulae and distributing 'em to the populae en masse
[15:53:35] you probably don't want that in a wiki page
[16:03:27] Platonides: I keep hearing I want a CMS, not MediaWiki for some of the stuff I'm trying to do / to plan to do
[16:06:16] Platonides: but in order to gain information warfare resilience we are having to do the classic sacrifice of user data which opens especially those with real name used to serious consumption pattern profiling but this way we can make sure that Consumium cannot turn into some sort of corrupt machine from hell that no-one knows how to stop ..
[16:07:51] so it's a non-paid-trolls vs. paid-trolls situation .. this formeth equilibrium
[16:09:52] it's mothermothering armsrace !!! xexexe
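Since dumpBackup.php apparently lacks a namespace selector (per Platonides above), one way to get the "you could exclude them" behaviour, and to keep user pages or vote lists out of a published dump without raw SQL surgery, is to post-process the XML. A sketch assuming the export schema in which each <page> carries a <ns> element (2 = User); the file names are illustrative, older dumps without <ns> would need title-prefix matching, and whole-tree parsing is only reasonable for a small wiki:

```python
#!/usr/bin/env python3
# Strip all pages in a given namespace (e.g. 2 = User) out of a MediaWiki XML
# dump, writing a filtered dump that can be published or re-imported.
# Loads the whole dump into memory: fine for a small wiki, not for huge ones.
import xml.etree.ElementTree as ET

EXCLUDE_NS = "2"   # User namespace

def localname(tag):
    """Drop the '{http://...export-0.x/}' prefix from a tag name."""
    return tag.rsplit("}", 1)[-1]

def filter_dump(src, dst):
    tree = ET.parse(src)
    root = tree.getroot()
    removed = 0
    for page in list(root):
        if localname(page.tag) != "page":
            continue
        ns = next((c.text for c in page if localname(c.tag) == "ns"), None)
        if ns == EXCLUDE_NS:
            root.remove(page)
            removed += 1
    tree.write(dst, encoding="utf-8", xml_declaration=True)
    return removed

if __name__ == "__main__":
    print("removed %d pages" % filter_dump("wiki-full.xml", "wiki-public.xml"))
```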
[16:48:10] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php 'logging'
[16:48:20] Logged the message, Master
[16:53:47] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php
[16:53:56] Logged the message, Master
[17:30:49] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php
[17:30:59] Logged the message, Master
[17:32:11] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php
[17:32:21] Logged the message, Master
[17:43:30] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php 'more logging'
[17:43:38] Logged the message, Master
[17:45:00] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php 'more logging'
[17:45:15] Logged the message, Master
[17:55:27] !log aaron synchronized php-1.20wmf5/includes/WikiPage.php 'more logging'
[17:55:38] Logged the message, Master
[17:55:57] trying to pinpoint bug 37225, aaron?
[17:57:57] Platonides: just checking some basic assumptions...I'll probably give up again soon and get some food ;)
[18:00:39] I'd like to see it fixed
[18:01:38] it's moking us :(
[18:10:12] Platonides: reminds me of http://en.wikipedia.org/wiki/Repression_%28Star_Trek:_Voyager%29 for some reason
[18:14:48] "TUVOK: I have worked meticulously, yet he has outwitted me each time."
[18:14:51] "TUVOK: I can almost sense his presence. It's as though he's challenging me to find him."
[18:17:22] well, we could always bisect on wikipedia
[18:18:51] Platonides: I was wondering if we could scan RC for dups and see if it only affects wikis with a certain extension
[18:19:59] that's an option too
[18:20:06] from the code, I don't how an extension could break anything, but who knows
[18:20:20] I think we should be trying to reproduce it with a lagged slave
[18:20:43] still, I don't see how the changes between the two wmf versions could affect it
[18:21:41] as long as $oldid/$oldtext match, even if they were fetched from a slave it should work
[18:21:54] it's basically a 1-try cas
[18:22:40] anyway, from logging, I can tell that lots of races *are* detected, rollback is called, and it works
[18:23:35] "$result = $dbw->affectedRows() != 0" is fine since I don't see -1 showing up in the logs
[18:23:43] * Aaron|home may as well change that in master for sanity
[18:28:46] well, the double insert happens because it is inserted when it shouldn't
[18:54:47] Does anyone know of a better deal then VPS-from-cloud / shared CPU / 512MB ram / 100GB disk from farm of SAS-disks stuck to a 40Gb/s infiniband networking .. I intend to dedicate the 2nd server of the Consumium effort to be backup.consumium.org and also status.consumium.org and dumps.consumium.org, nothing else, nothing that could compromise it, hard-as-hell only sudo password and use ssh keys to access the back-uppable sites so it
[18:54:49] should stay safe
[19:12:13] !log reedy synchronized php-1.20wmf6/extensions/WikimediaMaintenance/ 'Update to master for hashar'
[19:12:21] Can anything be done about db32? It's been lagging on the order of 10-60 seconds for well over 24 hours.
[19:12:24] Logged the message, Master
[19:16:49] <60 seconds lag is nothing
[19:18:45] reedy, normally that's true, but when it continues for over 24 hours continuously there must be something seriously wrong
[19:19:22] bots are supposed to shut down when replag > 5, so this is preventing many tasks from running
[19:20:06] Not really
[19:20:12] ignore replag, it's easier
[19:22:04] err, maxlag, even
[19:27:42] It's not causing an outage, and it's a weekend... And IIRC Asher isn't reachable this weekend either
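An illustration, not MediaWiki's actual code, of the "1-try cas" Aaron describes above while chasing bug 37225: the UPDATE only succeeds while the row still holds the expected old value, and the affected-row count (the affectedRows() check in the log) tells the caller whether it won or lost the race. Sketched with sqlite3 and a toy schema purely to show the shape of the check:

```python
#!/usr/bin/env python3
# Shape of a one-try compare-and-set when saving a page: only move page_latest
# forward if it still points at the revision we loaded, and treat
# "0 rows affected" as a lost race (someone else saved first).
# Toy schema in sqlite3; not the real MediaWiki tables or code.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE page (page_id INTEGER PRIMARY KEY, page_latest INTEGER)")
db.execute("INSERT INTO page VALUES (1, 100)")

def save_edit(conn, page_id, expected_latest, new_rev_id):
    cur = conn.execute(
        "UPDATE page SET page_latest = ? WHERE page_id = ? AND page_latest = ?",
        (new_rev_id, page_id, expected_latest))
    conn.commit()
    # rowcount == 0 means the condition no longer held: an edit conflict,
    # so the caller should roll back / retry rather than insert a second time.
    return cur.rowcount != 0

print(save_edit(db, 1, expected_latest=100, new_rev_id=101))  # True: we won the race
print(save_edit(db, 1, expected_latest=100, new_rev_id=102))  # False: lost, someone saved first
```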
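And for the replag thread just above: the convention cited ("bots are supposed to shut down when replag > 5") is usually implemented by sending the maxlag parameter with API requests and backing off when the servers report more lag than that. A minimal client-side sketch, assuming the standard api.php maxlag error response; the endpoint and backoff values are illustrative:

```python
#!/usr/bin/env python3
# Bot-side handling of the maxlag parameter: ask api.php to refuse the request
# when replication lag exceeds 5 seconds, and sleep/retry instead of piling on.
# Endpoint, retry count and backoff are illustrative, not a recommendation.
import json
import time
import urllib.error
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"   # any MediaWiki api.php endpoint

def api_get(params, max_retries=5):
    params = dict(params, format="json", maxlag=5)
    for attempt in range(max_retries):
        url = API + "?" + urllib.parse.urlencode(params)
        try:
            with urllib.request.urlopen(url) as resp:
                data = json.loads(resp.read().decode("utf-8"))
        except urllib.error.HTTPError as e:
            if e.code != 503:        # some setups signal lag with 503 + Retry-After
                raise
            data = {"error": {"code": "maxlag"}}
        if data.get("error", {}).get("code") != "maxlag":
            return data
        time.sleep(2 ** attempt * 5)  # back off instead of hammering lagged slaves
    raise RuntimeError("gave up: replication lag stayed above the maxlag threshold")

if __name__ == "__main__":
    info = api_get({"action": "query", "meta": "siteinfo"})
    print(info["query"]["general"]["sitename"])
```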
[19:28:52] Can ask Tim to have a look when he's about
[19:32:28] aw sysadmin suggesting people to break policies
[19:33:25] I was told by someone more senior to do the same a while back ;)
[19:35:11] * russblau hears and obeys
[19:40:35] * jubo2 declares "We ♥ FOSS devels"-day
[23:55:58] Hmm. dblock