[00:08:13] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[00:08:33] RECOVERY - Puppet freshness on search22 is OK: puppet ran at Wed Jun 26 00:08:28 UTC 2013
[00:09:13] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 00:09:12 UTC 2013
[00:10:14] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[00:11:23] RECOVERY - Puppet freshness on mw1116 is OK: puppet ran at Wed Jun 26 00:11:16 UTC 2013
[00:15:43] RECOVERY - Puppet freshness on search21 is OK: puppet ran at Wed Jun 26 00:15:36 UTC 2013
[00:16:13] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 00:16:08 UTC 2013
[00:17:13] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[00:22:34] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 00:22:26 UTC 2013
[00:23:14] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[00:29:03] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 00:28:56 UTC 2013
[00:29:14] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[00:31:53] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[00:32:53] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 8.539 second response time
[00:32:54] RECOVERY - Puppet freshness on cp1041 is OK: puppet ran at Wed Jun 26 00:32:48 UTC 2013
[00:33:33] RECOVERY - Puppet freshness on cp1043 is OK: puppet ran at Wed Jun 26 00:33:30 UTC 2013
[00:37:24] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 00:37:17 UTC 2013
[00:38:14] RECOVERY - Puppet freshness on cp1042 is OK: puppet ran at Wed Jun 26 00:38:04 UTC 2013
[00:38:15] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[00:38:53] RECOVERY - Puppet freshness on cp1044 is OK: puppet ran at Wed Jun 26 00:38:45 UTC 2013
[01:02:28] RECOVERY - NTP on ssl3002 is OK: NTP OK: Offset 0.008463382721 secs
[01:05:08] PROBLEM - Puppet freshness on manutius is CRITICAL: No successful Puppet run in the last 10 hours
[01:09:05] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[01:16:55] ori-l: :( my local vagrant decided to just reinstall itself
[01:17:21] I had a shitload of local modifications
[01:21:41] guess I need to put some effort into getting the openstack stuff working properly in modules
[01:32:05] RECOVERY - NTP on ssl3003 is OK: NTP OK: Offset 0.009344935417 secs
[01:33:16] Ryan_Lane: it shouldn't do that of its own accord; what happened, exactly?
[01:33:33] at some point the vm swap deathed
[01:33:41] I killed the process
[01:34:01] I tried bringing it back up, but it said it had a lockfile held
[01:34:03] I rebooted
[01:34:14] then I did vagrant up, and it started from scratch
[01:34:49] hmmm
[01:34:56] does the vagrant stuff use a unionfs or some trick like that?
[01:35:13] 'VirtualBox Shared Folders'
[01:35:39] it can use NFS as well (and NFS is much faster), but NFS doesn't work on Windows and requires installing some packages on Ubuntu and other Linux distributions
[01:36:22] ah hah
[01:36:23] so I went with VirtualBox Shared Folders out-of-box, but documented the configuration changes you need to make to switch to NFS
[01:36:30] there's two virtual box images now
[01:37:24] there's going to be some metadata file under .vagrant that has a UUID that references one of those images
[01:37:42] you should be able to edit it to point to the correct image (if it isn't, presently)
[01:38:05] cool
[01:38:22] but regardless that's a serious upstream bug, so I'll look into it
[01:38:27] thanks
[01:40:37] !log catrope synchronized php-1.22wmf8/skins/vector/collapsibleTabs.js 'I6e277a9503a1e6003bc7bf9f9468ed7b35552e60'
[01:40:47] Logged the message, Master
[01:42:33] Ryan_Lane: http://robinwinslow.co.uk/2012/10/05/what-to-do-if-your-vagrant-vm-crashes/
[01:43:37] it doesn't identify the underlying problem, but it spells out the steps for pairing a strayed VM with vagrant
[01:43:45] * Ryan_Lane nods
[01:43:54] I don't actually have a .vagrant file
[01:44:04] but I do have a .vagrant/machines/default/virtualbox/id file
[01:44:44] oh yes, you're right -- that page is out-of-date
[01:45:07] vagrant status says "not created"
[01:45:07] heh
[01:45:07] the single-file .vagrant format is pre-1.2
[01:45:07] heh, sorry
[01:45:12] the process is the same
[01:45:33] looks like its creating a new install again
[01:45:34] * Ryan_Lane sighs
[01:46:10] indeed. there's now a third vm
[01:46:20] what's the output of 'VBoxManage list vms'?
[01:46:32] vboxmanage list vms
[01:46:32] "vagrant_1371577963" {50820ebb-8c7f-45ed-84a0-3545fd04fb67}
[01:46:32] "vagrant_1372208796" {0e1fa812-9159-42e0-9633-6001712b9138}
[01:46:32] "vagrant_1372211126" {ec06016f-7cab-4e50-b625-50945dbfc3bd}
[01:46:44] i presume the first is the authentic one
[01:46:47] yeah
[01:47:15] and it's wiped out the id in .vagrant/machines/default/virtualbox/id and replaced it
[01:47:26] and .vagrant/machines/default/virtualbox/id is '50820ebb-8c7f-45ed-84a0-3545fd04fb67'?
[01:47:26] right
[01:47:34] and the VM is in 'halt' state?
[01:47:53] i think it might not be able to reacquire a handle on the VM if it has already started
[01:48:06] it's not running
[01:48:24] ah. it's in the "aborted" state
[01:48:58] try "VBoxManage controlvm 50820ebb-8c7f-45ed-84a0-3545fd04fb67 poweroff"
[01:49:17] I just booted it through the gui
[01:49:22] then I'll shut it down
[01:49:35] i'm optimistic that this will fix it
[01:50:03] vagrant status
[01:50:09] default not created (virtualbox)
[01:51:12] gah, that's annoying
[01:52:35] god damn it https://github.com/mitchellh/vagrant/issues/1119
[01:52:41] "I think this is fixed now in 1.1."
[01:52:43] thanks for the details
[01:53:21] heh
[01:53:35] dinner time, i'll look into it more later
[01:53:40] ok. cool. thanks
[01:53:41] sorry :/
[01:54:16] I have a working environment in labs.
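The re-pairing procedure discussed above boils down to writing the surviving VM's UUID into Vagrant's machine-id file. A minimal sketch (Vagrant >= 1.2 layout; the UUID is the example from the log, and the VBoxManage call is left commented so the snippet runs without VirtualBox installed):

```shell
#!/bin/sh
# Sketch: re-attach a stray VirtualBox VM to a Vagrant (>= 1.2) project.
# Run from the directory containing the Vagrantfile. VM_UUID is the value
# reported by `VBoxManage list vms` for the VM you want to keep.
set -eu

VM_UUID="50820ebb-8c7f-45ed-84a0-3545fd04fb67"   # example UUID from the log
ID_FILE=".vagrant/machines/default/virtualbox/id"

# Make sure the VM is powered off before re-pairing it (requires VirtualBox):
# VBoxManage controlvm "$VM_UUID" poweroff

# Point Vagrant's machine-id file at the surviving VM (no trailing newline).
mkdir -p "$(dirname "$ID_FILE")"
printf '%s' "$VM_UUID" > "$ID_FILE"

# `vagrant status` should now report the existing machine instead of
# "not created" -- unless, as in the log, the VM is in the "aborted" state.
```

As the log shows, this did not help here because the VM was left "aborted" rather than cleanly powered off, which is the upstream bug referenced at the vagrant issue link below.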
it's no big deal for now
[02:04:39] New patchset: Ryan Lane; "Set libvirt based on variable for ubuntu specific config" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70580
[02:07:35] !log LocalisationUpdate completed (1.22wmf8) at Wed Jun 26 02:07:35 UTC 2013
[02:07:46] Logged the message, Master
[02:13:32] !log LocalisationUpdate completed (1.22wmf7) at Wed Jun 26 02:13:32 UTC 2013
[02:13:41] Logged the message, Master
[02:17:21] Change merged: Ryan Lane; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70580
[02:25:06] PROBLEM - Puppet freshness on ms-be1001 is CRITICAL: No successful Puppet run in the last 10 hours
[02:27:09] !log LocalisationUpdate ResourceLoader cache refresh completed at Wed Jun 26 02:27:09 UTC 2013
[02:27:18] Logged the message, Master
[02:56:50] !log Updated Parsoid config for bug 50173
[02:56:59] Logged the message, Mr. Obvious
[03:00:43] New patchset: Catrope; "Revert "Point the Parsoid cache in labs to parsoid-spof rather than deployment-parsoid2"" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70582
[03:09:37] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[03:57:46] !log catrope synchronized php-1.22wmf7/extensions/VisualEditor 'Updating VisualEditor to master'
[03:58:11] !log catrope synchronized php-1.22wmf8/extensions/VisualEditor 'Updating VisualEditor to master'
[04:07:53] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[04:08:24] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 04:08:17 UTC 2013
[04:08:53] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[04:15:33] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 04:15:30 UTC 2013
[04:15:53] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[04:34:43] RECOVERY - Puppet freshness on ms-be1001 is OK: puppet ran at Wed Jun 26 04:34:36 UTC 2013
[04:37:25] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 04:37:19 UTC 2013
[05:11:42] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[05:11:44] PROBLEM - NTP on ssl3003 is CRITICAL: NTP CRITICAL: No response from NTP server
[05:15:24] PROBLEM - NTP on ssl3002 is CRITICAL: NTP CRITICAL: No response from NTP server
[06:12:12] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[06:24:34] New patchset: Faidon; "Ceph: switch packages to the development releases" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70590
[06:24:34] New patchset: Faidon; "Ceph: adjust ceph.conf settings for v0.65" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70591
[06:28:53] New patchset: Faidon; "Ceph: switch packages to the development releases" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70590
[06:28:53] New patchset: Faidon; "Ceph: adjust ceph.conf settings for v0.65" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70591
[06:29:40] Change merged: Faidon; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70590
[06:30:03] Change merged: Faidon; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70591
[06:39:57] !log upgrading ceph/ms-fe.eqiad.wmnet to 0.65, ignore warnings/alerts for the next few hours
[06:40:07] Logged the message, Master
[07:02:04] PROBLEM - HTTP radosgw on ms-fe1001 is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[07:02:14] PROBLEM - HTTP radosgw on ms-fe1004 is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[07:02:22] bleh
[07:02:54] RECOVERY - HTTP radosgw on ms-fe1001 is OK: HTTP OK: HTTP/1.1 200 OK - 311 bytes in 0.004 second response time
[07:03:04] RECOVERY - HTTP radosgw on ms-fe1004 is OK: HTTP OK: HTTP/1.1 200 OK - 311 bytes in 0.005 second response time
[07:09:34] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[07:12:17] !log deploying squid config change to block API DoS attack
[07:12:26] Logged the message, Master
[07:53:06] New patchset: MaxSem; "Enable mobile subdomain for wmf.org" [operations/mediawiki-config] (master) - https://gerrit.wikimedia.org/r/70597
[07:58:54] MaxSem: wikimediafoundation.org, not wmf.org
[07:59:00] too confusing otherwise
[07:59:16] it's a well-known shorthand:)
[08:00:04] paravoid: anyone can edit the commit summary! >.>
[08:02:50] RECOVERY - NTP on ssl3003 is OK: NTP OK: Offset -0.006988167763 secs
[08:02:59] hmm, what's the status of our transition from squid to varnish?
[08:03:20] RECOVERY - NTP on ssl3002 is OK: NTP OK: Offset -0.002138733864 secs
[08:08:12] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[08:09:50] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 08:09:45 UTC 2013
[08:10:10] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[08:11:21] lo
[08:11:27] apergos: so I am more or less around again :-)
[08:11:44] been doing ton of continuous integration work lately, I need to focus a bit more on beta
[08:14:37] !log restarting Jenkins to apply plugins updates.
[08:14:46] Logged the message, Master
[08:17:11] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 08:17:04 UTC 2013
[08:18:10] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[08:19:25] ok, I'm here and roasting but at the keyboard
[08:21:23] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[08:22:10] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.128 second response time
[08:24:00] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 08:23:59 UTC 2013
[08:24:10] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[08:28:44] New patchset: Faidon; "Enable mobile subdomain for wikimediafoundation.org" [operations/mediawiki-config] (master) - https://gerrit.wikimedia.org/r/70597
[08:31:12] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 08:31:09 UTC 2013
[08:32:10] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[08:33:57] !log jenkins: restarted it twice to finally have it up and running.
[08:34:06] Logged the message, Master
[08:34:50] !log jenkins reducing number of executors on (master) from 10 to 4. Most jobs are now running on the 'gallium' slave node.
[08:34:59] Logged the message, Master
[08:39:33] !log jenkins installed Simple Theme Plugin
[08:39:41] Logged the message, Master
[08:41:25] !log recreating Solr index
[08:41:34] Logged the message, Master
[09:00:34] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[09:02:27] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.155 second response time
[09:06:44] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[09:07:34] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.129 second response time
[09:08:10] PROBLEM - Puppet freshness on erzurumi is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:10] PROBLEM - Puppet freshness on lvs1004 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:10] PROBLEM - Puppet freshness on lvs1005 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:10] PROBLEM - Puppet freshness on lvs1006 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:10] PROBLEM - Puppet freshness on mc15 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:10] PROBLEM - Puppet freshness on ms-fe3001 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:11] PROBLEM - Puppet freshness on sodium is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:11] PROBLEM - Puppet freshness on spence is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:12] PROBLEM - Puppet freshness on virt1 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:12] PROBLEM - Puppet freshness on virt3 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:13] PROBLEM - Puppet freshness on virt4 is CRITICAL: No successful Puppet run in the last 10 hours
[09:08:20] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[09:38:54] New review: Akosiaris; "LGTM. One minor point is the vim: noet line. I totally understand why you put it there but I don't l..." [operations/puppet] (production) C: 2; - https://gerrit.wikimedia.org/r/70427
[09:43:02] akosiaris: thanks :)
[09:44:33] I can remove the # vim: noet snippet from site.pp
[09:44:46] and submit another patch that put the vim modeline on most of our pp using tabs
[09:46:49] New review: Hashar; "(1 comment)" [operations/puppet] (production) C: -1; - https://gerrit.wikimedia.org/r/70427
[09:47:11] hashar: that would be cool. The only thing I am afraid of is that we will get too comfortable using it and never move to 4 space tabs but maybe that's just me. I don't even know if we should chase that goal, aggressively or not.
[09:47:58] akosiaris: indeed, I am removing it
[09:48:08] as well as the class dependencies definitions that are no more needed in the latest patchset
[09:48:25] New patchset: Hashar; "contint: vary tmpfs conf on master and slaves" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70427
[09:48:32] New review: Hashar; "(1 comment)" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70427
[09:49:19] akosiaris: if you feel brave enough we can get it merged on sock puppet and I can run puppet on gallium
[09:49:23] then amend any potential issue
[09:52:12] hashar: i see no reason not to... plus I have not yet brought anything down.. That is my chance!!! Running puppet-merge now
[09:52:58] New review: Akosiaris; "LGTM" [operations/puppet] (production); V: 2 C: 2; - https://gerrit.wikimedia.org/r/70427
[09:52:59] Change merged: Akosiaris; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70427
[09:53:28] running puppet on gallium
[09:54:29] I also need to finish the cleaning of the contint manifest
[09:58:28] tmpfs on /var/lib/jenkins-slave/tmpfs type tmpfs (rw,noatime,size=128M,mode=755,uid=997,gid=1004)
[09:58:29] yeahhh
[09:58:41] :-)
[09:58:53] puppet apply rocks
[09:59:07] while you are around, Faidon told me you wrote a few integrations tests for some puppet module
[09:59:15] do you have any example off hand?
[09:59:32] I could get them to run in Jenkins whenever someone send a change in Gerrit
[10:10:51] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[10:12:13] hashar: yeah look at the cdh4 module tests subdir
[10:12:31] re sorry
[10:14:19] crazy :-)
[10:14:38] and i already have automated the Makefile a bit more. I have a better example at a local bacula module i want to post these days. Just finishing up some last points
[10:15:03] do you know about puppet-rspec ?
[10:15:03] http://rspec-puppet.com
[10:15:29] rspec is a ruby testing tool, that project above adds support for testing puppet manifest
[10:15:47] I tried it out last year, but it was just too slow because it loaded site.pp before each test
[10:16:45] Yeah i have seen it too. But I have not yet run it in a real module. I should try it
[10:18:09] ah here it is https://gerrit.wikimedia.org/r/#/c/16139/
[10:19:04] let you write specifications such as https://gerrit.wikimedia.org/r/#/c/16139/1/spec/classes/squid_spec.rb,unified
[10:23:46] now that is interesting
[10:31:16] New patchset: Faidon; "Remove obsolete nrpe stanzas" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70606
[10:33:58] poor faidon
[10:34:31] New patchset: Mark Bergsma; "Correct domain" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70609
[10:35:32] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70609
[10:35:54] you don't say
[10:37:55] !log jenkins moving unit test jobs for mw extensions to use the slave tmpfs and let them roam on slaves
[10:38:05] Logged the message, Master
[10:39:33] Change merged: Faidon; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70606
[10:40:35] mark, can you deploy https://gerrit.wikimedia.org/r/66874 please?
[10:41:04] i can
[10:42:19] Change merged: Faidon; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70239
[10:42:26] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/66874
[10:42:56] merged on sockpuppet for you
[10:43:16] thanks!
[10:43:57] that wasn't merged yet as it said "needs to go live with MF deployment..."
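The rspec-puppet style of specification discussed above can be sketched like this: the spec file below follows the pattern of the linked squid_spec.rb, and the class/package names are illustrative stand-ins, not the real module's contents. Writing the file needs nothing installed; actually running it assumes the rspec-puppet and puppetlabs_spec_helper gems are present.

```shell
#!/bin/sh
# Sketch: scaffold a minimal rspec-puppet spec in the 2013-era style
# (it { should ... } matchers). Names are illustrative.
set -eu

mkdir -p spec/classes
cat > spec/classes/squid_spec.rb <<'EOF'
require 'spec_helper'

# rspec-puppet compiles the 'squid' class into a catalog and lets us
# assert on the resources that catalog contains.
describe 'squid' do
  it { should contain_class('squid') }
  it { should contain_package('squid').with_ensure('present') }
end
EOF

# With the gems installed, the spec would be run from the module root:
#   rspec spec/classes/squid_spec.rb
```

This is the mechanism hashar wanted to wire into Jenkins: a cheap, compile-only check of a manifest on every Gerrit change, without applying anything to a real node.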
[10:45:05] meh, I should've pinged you
[11:05:20] PROBLEM - Puppet freshness on manutius is CRITICAL: No successful Puppet run in the last 10 hours
[11:10:22] New patchset: Mark Bergsma; "Rename upload storage backends to be device name independent" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/68153
[11:12:58] hashar: I'm just going to merge that I think, so you can rebase on that
[11:13:05] apparently varnish doesn't complain when a backend doesn't exist
[11:13:21] so it's not a big deal that those names are mismatching, apparently that has been the case on dysprosium for a while already
[11:13:41] ah good to know!
[11:13:42] thanks :-)
[11:13:50] will rebase this afternoon
[11:24:04] crazyyy
[11:24:19] Gerrit build system is now Buck instead of Maven
[11:24:21] yet another java app to package :-D
[11:24:31] * Reedy points at #gerrit
[11:24:35] https://github.com/facebook/buck :-D
[11:24:40] Y U GO CHANGE BUILD SYSTEM!?
[11:32:52] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:35:10] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:36:00] we should switch to mediawiki codereview
[11:36:04] would be less work ;-p
[11:36:41] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:38:33] hashar: yeah, azatoth was already working on both buck & gerrit packaging
[11:38:57] great :)
[11:39:07] i think buck is ready
[11:39:11] qchris sent me an email about buck and was wondering what should be done
[11:39:15] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:39:50] buck: https://gerrit.wikimedia.org/r/#/c/67999/
[11:40:34] greeaaaat
[11:40:50] that will make qchris and ^demon very happy
[11:41:15] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:42:51] I got lame easy to review change : https://gerrit.wikimedia.org/r/70182 . That marks php5-dev as being required on contint servers
[11:47:13] New patchset: Hashar; "lanthanum as a jenkins slave" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/64601
[11:51:34] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:52:54] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[11:54:18] New patchset: Hashar; "lanthanum as a jenkins slave" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/64601
[11:55:15] New review: Hashar; "rebased, added in contint::packages" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/64601
[11:55:37] New patchset: Hashar; "beta: symlink /a/common" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/65254
[11:57:18] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[12:00:16] New patchset: Mark Bergsma; "Prepare upload cache manifests for the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[12:02:16] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/68153
[12:05:54] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70613
[12:07:01] !log Jenkins let jobs roam on gallium slave node and keep master only for tied jobs.
[12:07:09] Logged the message, Master
[12:11:30] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours
[12:12:36] New patchset: Mark Bergsma; "Rename the persistent storage files on the new servers" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70615
[12:13:10] PROBLEM - Apache HTTP on mw1157 is CRITICAL: Connection timed out
[12:13:30] PROBLEM - Apache HTTP on mw1154 is CRITICAL: Connection timed out
[12:13:30] PROBLEM - Apache HTTP on mw1159 is CRITICAL: Connection timed out
[12:13:30] PROBLEM - Apache HTTP on mw1155 is CRITICAL: Connection timed out
[12:13:30] PROBLEM - LVS HTTP IPv4 on rendering.svc.eqiad.wmnet is CRITICAL: Connection timed out
[12:14:04] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70615
[12:14:17] mark: that you?
[12:14:30] don't know
[12:14:30] swift had a load spike
[12:14:38] PROBLEM - Apache HTTP on mw1160 is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[12:14:38] PROBLEM - Apache HTTP on mw1158 is CRITICAL: Connection timed out
[12:14:48] PROBLEM - Apache HTTP on mw1156 is CRITICAL: Connection timed out
[12:14:48] PROBLEM - Swift HTTP on ms-fe4 is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[12:14:48] PROBLEM - Apache HTTP on mw1153 is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[12:15:17] had or has?
[12:15:39] RECOVERY - Swift HTTP on ms-fe4 is OK: HTTP OK: HTTP/1.1 200 OK - 2503 bytes in 0.064 second response time
[12:16:33] doesn't look good
[12:16:41] http://ganglia.wikimedia.org/latest/graph_all_periods.php?hreg[]=^ms-fe[1-9].pmtpa&mreg[]=swift_[A-Z]%2B_hits%24&z=large&gtype=stack&title=Swift+queries+per+second&aggregate=1&r=hour
[12:16:51] http://ganglia.wikimedia.org/latest/?r=hour&cs=&ce=&tab=v&vn=swift+backend+storage is the most interesting one
[12:16:54] still does
[12:17:17] i/o wait all of them
[12:20:05] i'm not sure why the varnish hit rate dropped
[12:24:19] PROBLEM - Swift HTTP on ms-fe4 is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[12:24:38] varnish front/back client requests aren't inflated as far as I can see
[12:25:21] i don't get it, i just renamed the storage backends
[12:25:23] huh, ganglia lost all stats from upload caches eqiad
[12:26:00] yeah I was wondering where those went all of a sudden
[12:26:22] RECOVERY - Swift HTTP on ms-fe4 is OK: HTTP OK: HTTP/1.1 200 OK - 2503 bytes in 0.060 second response time
[12:27:21] they are certainly doing conversions
[12:27:42] ?
[12:28:15] ah sorry, the image scalers, which show in ganglia as down [12:30:22] i mean, there's this: [12:30:23] director backend random { [12:30:23] { [12:30:23] .backend = ipv4_10_2_1_27; [12:30:23] - .weight = 20; [12:30:24] + .weight = 100; [12:30:27] } [12:30:29] } [12:30:33] but I don't see why that would be a problem [12:30:37] I just got that too [12:30:43] root@cp1036:~# diff -u /var/lib/puppet/clientbucket/c/c/5/1/6/b/1/7/cc516b17e1b677c22fa7d318eb6dd6a2/contents /etc/varnish/wikimedia_upload-backend.vcl [12:31:18] i've restarted cp1021-1028 now in case the storage name change is the problem [12:31:19] happened on 12:10 on cp1036 [12:32:59] oh [12:33:00] I think I know [12:33:11] backend dysprosium { [12:33:11] .host = "dysprosium.eqiad.wmnet"; [12:33:11] - .port = "80"; [12:33:11] + .port = "3128"; [12:33:11] .connect_timeout = 5s; [12:33:12] .first_byte_timeout = 35s; [12:33:14] .between_bytes_timeout = 4s; [12:33:16] .max_connections = 1000; [12:33:18] + .probe = varnish; [12:33:20] } [12:33:25] damn you [12:33:27] so now it's going straight to dysprosium backend [12:33:29] instead of via dysprosium frontend, which was wrong before [12:33:33] I'm on dysprosium about to hit enter on that [12:33:36] which means a lot more requests are no longer hashed [12:33:36] on that diff [12:33:48] dysprosium wouldn't have given you that [12:33:51] so no worries ;) [12:33:52] ah that's esams [12:33:54] this is on cp300* [12:33:57] yeah [12:34:12] so now all esams requests are random [12:34:17] instead of some hashed, some not [12:34:48] yeah, dysprosium served as our chash for esams [12:34:49] what's the best way to fix this [12:34:55] for its share of traffic yes [12:35:04] which explained why random there wasn't such a huge deal too doesn't it [12:35:08] I was wondering about that [12:35:08] yes [12:35:28] i kinda wanted to tackle that issue after getting the new servers in [12:36:06] revert now and chat later? 
:) [12:36:24] I guess [12:37:12] that was very subtle in the commit log [12:37:13] New patchset: Mark Bergsma; "Send esams backend traffic via dysprosium frontend for now" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70618 [12:37:51] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70618 [12:38:26] oh damn [12:38:29] there's more [12:39:45] New patchset: Mark Bergsma; "Restore esams backend weight" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70619 [12:40:17] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70619 [12:40:32] PROBLEM - Swift HTTP on ms-fe4 is CRITICAL: CRITICAL - Socket timeout after 10 seconds [12:41:15] RECOVERY - Swift HTTP on ms-fe4 is OK: HTTP OK: HTTP/1.1 200 OK - 2503 bytes in 0.061 second response time [12:43:05] PROBLEM - Swift HTTP on ms-fe2 is CRITICAL: CRITICAL - Socket timeout after 10 seconds [12:43:46] I see no change yet [12:43:55] that's because puppet isn't done yet [12:45:14] RECOVERY - Swift HTTP on ms-fe2 is OK: HTTP OK: HTTP/1.1 200 OK - 2503 bytes in 0.066 second response time [12:45:17] should start seeing changes now [12:46:27] on a side note, we have a very nice comparison of h310 vs. 
h710 [12:46:47] on the i/o wait graphs [12:47:41] hehe [12:47:49] no change yet [12:47:59] I wonder why we lost all ganglia graphs for varnish boxes [12:48:00] all esams have original chash weight now [12:48:08] eqiad & esams [12:48:26] oh, no just eqiad [12:49:11] starting to get down [12:49:30] yeah [12:50:05] RECOVERY - Apache HTTP on mw1156 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 5.704 second response time [12:50:26] RECOVERY - Apache HTTP on mw1158 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.037 second response time [12:50:28] RECOVERY - Apache HTTP on mw1154 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.052 second response time [12:50:28] RECOVERY - Apache HTTP on mw1157 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.040 second response time [12:50:28] RECOVERY - Apache HTTP on mw1155 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.043 second response time [12:50:35] RECOVERY - Apache HTTP on mw1159 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.045 second response time [12:50:35] RECOVERY - Apache HTTP on mw1153 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.062 second response time [12:50:35] RECOVERY - Apache HTTP on mw1160 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.052 second response time [12:50:35] RECOVERY - LVS HTTP IPv4 on rendering.svc.eqiad.wmnet is OK: HTTP OK: HTTP/1.1 200 OK - 66546 bytes in 0.156 second response time [12:50:57] good [12:51:17] i'll revert that dysprosium change in a bit [12:51:20] back to port 3128 [12:54:21] New patchset: Mark Bergsma; "Revert "Send esams backend traffic via dysprosium frontend for now"" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70620 [12:54:45] New patchset: Mark Bergsma; "Revert "Send esams backend traffic via dysprosium frontend for now"" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70620 [12:55:15] New patchset: 
Hashar; "zuul: craft a publicly readeable configuration file" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70621 [12:55:38] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70620 [12:56:25] any champion around to review a puppet exec that uses sed please ? https://gerrit.wikimedia.org/r/70621 :) [12:58:26] New patchset: Mark Bergsma; "Install new servers as upload caches" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70622 [12:58:37] hashar: will this ever run? [12:59:19] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70622 [12:59:21] huh ? confused... [12:59:26] so my idea is that whenever the template is expanded, the notify => would trigger the exec{} [12:59:55] the exec will only run if /etc/zuul/public.conf does not exist right ? [13:00:04] but the command does not create it ... [13:00:36] and you depend on refreshonly [13:00:43] ok.. now i see what you got here [13:00:50] oh man [13:01:09] I forgot to use > /etc/zuul/public.conf [13:01:27] ok that is what is was going to ask [13:01:46] I tested the sed command on my laptop [13:01:49] then copy pasted it [13:02:28] New patchset: Hashar; "zuul: craft a publicly readeable configuration file" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70621 [13:02:38] akosiaris: there is the > :-] [13:06:27] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [13:07:07] hashar: i have the feeling it will only run once... 
gimme 5 mins to verify smt [13:07:18] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 3.102 second response time [13:07:29] I am not sure if we need to do the notify or subscribe [13:10:27] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [13:11:29] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.135 second response time [13:11:47] PROBLEM - Apache HTTP on mw1159 is CRITICAL: CRITICAL - Socket timeout after 10 seconds [13:12:39] PROBLEM - Apache HTTP on mw1153 is CRITICAL: Connection timed out [13:12:39] PROBLEM - Apache HTTP on mw1154 is CRITICAL: Connection timed out [13:12:47] PROBLEM - Apache HTTP on mw1157 is CRITICAL: Connection timed out [13:12:47] PROBLEM - Apache HTTP on mw1160 is CRITICAL: Connection timed out [13:12:47] PROBLEM - Apache HTTP on mw1155 is CRITICAL: CRITICAL - Socket timeout after 10 seconds [13:12:47] PROBLEM - LVS HTTP IPv4 on rendering.svc.eqiad.wmnet is CRITICAL: Connection timed out [13:13:05] sigh [13:13:05] ok [13:13:07] heh [13:13:28] New patchset: Mark Bergsma; "Revert "Revert "Send esams backend traffic via dysprosium frontend for now""" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70626 [13:13:31] PROBLEM - Apache HTTP on mw1156 is CRITICAL: Connection timed out [13:13:31] New patchset: Mark Bergsma; "Revert "Revert "Send esams backend traffic via dysprosium frontend for now""" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70626 [13:13:41] PROBLEM - Apache HTTP on mw1158 is CRITICAL: Connection timed out [13:13:45] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70626 [13:16:18] i'll just fix it for the new servers only [13:16:35] it's kinda scary [13:16:40] hashar: yeah i was correct.. 
it will only run once [13:16:41] PROBLEM - Swift HTTP on ms-fe4 is CRITICAL: CRITICAL - Socket timeout after 10 seconds [13:16:42] I wonder how close we are to swift's limits [13:16:59] akosiaris: :( [13:17:08] hashar: so ... put a /bin/ in front of sed and ditch the creates [13:17:11] and you should be ok [13:17:28] akosiaris: will it run on every puppet run ? [13:17:37] only if /etc/zuul/zuul.conf changes [13:17:47] RECOVERY - Swift HTTP on ms-fe4 is OK: HTTP OK: HTTP/1.1 200 OK - 2503 bytes in 3.707 second response time [13:18:10] hashar: which is, I presume, what you wanted [13:18:46] akosiaris: yes sir :-) [13:18:48] New patchset: Hashar; "zuul: craft a publicly readeable configuration file" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70621 [13:19:00] so creates is just to make sure the file is there, kind of like present [13:19:04] yay, more puppet reviewers [13:19:12] I'm so glad [13:19:29] at the rate I exhaust puppet reviewers, ops need to hire a lot more often :-D [13:19:35] PROBLEM - Swift HTTP on ms-fe1 is CRITICAL: CRITICAL - Socket timeout after 10 seconds [13:19:55] New review: Akosiaris; "LGTM" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70621 [13:20:04] I think it's the other way around hashar [13:20:07] New review: Hashar; "got rid of the creates statement to make sure the public file is rebuild whenever the other file is ..."
[operations/puppet] (production) - https://gerrit.wikimedia.org/r/70621 [13:20:12] New review: Akosiaris; "LGTM" [operations/puppet] (production) C: 2; - https://gerrit.wikimedia.org/r/70621 [13:20:21] you're not exhausting me, but too few reviewers means a slow review queue which exhausts submitters [13:20:25] RECOVERY - Swift HTTP on ms-fe1 is OK: HTTP OK: HTTP/1.1 200 OK - 2503 bytes in 0.062 second response time [13:20:38] ah that too [13:20:49] the workaround I found is to submit more patches :-] [13:21:01] this way I still get one merged from time to time which keeps me in a happy mood [13:21:10] lol [13:21:13] hahaha [13:23:03] akosiaris: I guess you can merge in addition to CR+2 on https://gerrit.wikimedia.org/r/#/c/70621/ :-D [13:23:09] jenkins does not merge commits on operations/puppet [13:24:10] Change merged: Akosiaris; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70621 [13:25:29] hashar: your wish is my command :-). Merged an puppet-merge ran [13:25:40] running puppet [13:26:14] one day i will have to look at puppet-merge to make it report back in Gerrit that a change got merged on sockpuppet [13:27:23] that is a shell script .. [13:28:45] akosiaris: since the template did not change, the Exec did not get triggered hehe [13:30:10] I have created it manually [13:32:55] hashar: yeah makes sense.... [13:33:18] at least when the templates changes the exec will trigger so we are ok [13:33:31] yup [13:33:36] thank you for the review/merge! [13:34:41] :-) [13:38:56] swift still not happy? 
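[Editor's note] The exec-plus-sed pattern reviewed above can be sketched in shell. The file names (`zuul.conf`, `public.conf`) come from the log; the config keys (`server`, `apikey`) and the sed expression are invented stand-ins, since the log never shows zuul.conf's contents or the real command:

```shell
# Stand-in for /etc/zuul so the demo does not touch real paths.
workdir=$(mktemp -d)
cd "$workdir"
printf 'server=gerrit.example.org\napikey=s3cret\n' > zuul.conf

# The crucial "> public.conf" is what the first patchset forgot:
# without the redirect, sed writes to stdout and the publicly
# readable file is never created.
sed -e 's/^apikey=.*/apikey=REDACTED/' zuul.conf > public.conf
```

In the manifest, a command like this would sit in an `exec` with `refreshonly => true` subscribed to the `zuul.conf` resource, so it reruns exactly when the template changes; dropping `creates` (as the final patchset did) is what allows it to run more than once.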
[13:41:47] PROBLEM - Varnish HTTP upload-frontend on cp1064 is CRITICAL: Connection refused [13:42:07] PROBLEM - Varnish HTTP upload-frontend on cp1050 is CRITICAL: Connection refused [13:42:18] PROBLEM - Varnish HTTP upload-frontend on cp1062 is CRITICAL: Connection refused [13:44:02] not very, no [13:44:05] holy crap [13:44:11] we have a few hundred upload purges a second [13:44:18] like, 500 [13:44:37] PROBLEM - RAID on ms-be3 is CRITICAL: CHECK_NRPE: Socket timeout after 10 seconds. [13:45:22] aaaargh [13:45:34] dysprosium is sending traffic to the new servers [13:47:15] New patchset: Mark Bergsma; "Dysprosium is part of the "old" cluster" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70632 [13:47:30] Change merged: Mark Bergsma; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70632 [13:47:46] that's why swift remains unhappy [13:48:00] RECOVERY - Varnish HTTP upload-frontend on cp1062 is OK: HTTP OK: HTTP/1.1 200 OK - 673 bytes in 0.004 second response time [13:51:26] * apergos grits teeth and hopes it's just puppet being slow [13:52:00] I assume that missing thumbnails is a consequence of this?
[13:52:06] yes [13:53:00] thanks [13:53:50] RECOVERY - Apache HTTP on mw1153 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.036 second response time [13:54:00] RECOVERY - Apache HTTP on mw1156 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.037 second response time [13:54:01] RECOVERY - Apache HTTP on mw1155 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.059 second response time [13:54:01] RECOVERY - Apache HTTP on mw1158 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.071 second response time [13:54:01] RECOVERY - LVS HTTP IPv4 on rendering.svc.eqiad.wmnet is OK: HTTP OK: HTTP/1.1 200 OK - 66546 bytes in 0.153 second response time [13:54:10] akosiaris: Hi [13:54:22] RECOVERY - Apache HTTP on mw1154 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.055 second response time [13:54:22] akosiaris: I saw your +2 on https://gerrit.wikimedia.org/r/#/c/67999/ [13:54:22] RECOVERY - Apache HTTP on mw1160 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.058 second response time [13:54:22] RECOVERY - Apache HTTP on mw1157 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.060 second response time [13:54:22] RECOVERY - Apache HTTP on mw1159 is OK: HTTP OK: HTTP/1.1 301 Moved Permanently - 747 bytes in 0.061 second response time [13:54:45] akosiaris: However it is not merged. [13:54:45] finally [13:54:57] New review: Alex Monk; "(1 comment)" [operations/mediawiki-config] (master) - https://gerrit.wikimedia.org/r/70322 [13:55:16] akosiaris: Do you think it can get merged? 
[13:58:41] !log reedy synchronized wmf-config 'touch 1' [13:58:50] Logged the message, Master [13:59:46] !log reedy synchronized php-1.22wmf8/resources [13:59:55] Logged the message, Master [14:00:24] !log reedy synchronized php-1.22wmf7/resources [14:00:34] Logged the message, Master [14:01:32] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [14:01:41] PROBLEM - Host cp1049 is DOWN: CRITICAL - Plugin timed out after 15 seconds [14:01:52] PROBLEM - Host cp1050 is DOWN: PING CRITICAL - Packet loss = 100% [14:02:05] so, all is good now? [14:02:09] yes [14:02:22] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 4.320 second response time [14:02:41] PROBLEM - Host cp1051 is DOWN: PING CRITICAL - Packet loss = 100% [14:02:41] PROBLEM - Host cp1052 is DOWN: PING CRITICAL - Packet loss = 100% [14:02:47] qchris: sure.. I just left this to azatoth to do it whenever he felt like, but we can merge it now if you want [14:03:01] RECOVERY - Host cp1050 is UP: PING OK - Packet loss = 0%, RTA = 0.57 ms [14:03:01] RECOVERY - Host cp1049 is UP: PING OK - Packet loss = 0%, RTA = 1.41 ms [14:03:11] RECOVERY - Host cp1051 is UP: PING OK - Packet loss = 0%, RTA = 0.22 ms [14:03:11] PROBLEM - Host cp1061 is DOWN: PING CRITICAL - Packet loss = 100% [14:03:15] Oh, then I'll ask AzaToth [14:03:19] does AzaToth has merge rights ? [14:03:31] RECOVERY - Host cp1052 is UP: PING OK - Packet loss = 0%, RTA = 0.29 ms [14:03:42] PROBLEM - Host cp1062 is DOWN: PING CRITICAL - Packet loss = 100% [14:03:42] PROBLEM - Host cp1063 is DOWN: PING CRITICAL - Packet loss = 100% [14:03:42] PROBLEM - Host cp1064 is DOWN: PING CRITICAL - Packet loss = 100% [14:03:55] AzaToth: What do think about merging https://gerrit.wikimedia.org/r/#/c/67999/ (and do you have merging permission)? 
[14:04:01] RECOVERY - Varnish HTTP upload-frontend on cp1050 is OK: HTTP OK: HTTP/1.1 200 OK - 675 bytes in 0.009 second response time [14:04:02] RECOVERY - Host cp1061 is UP: PING OK - Packet loss = 0%, RTA = 1.35 ms [14:04:15] I don't think AzaToth is ops member, so he would not be able to merge / upload to apt.wm.o [14:04:46] paravoid, mark, I'm hoping for eyes on https://gerrit.wikimedia.org/r/#/c/70429/ (and, subsequently, https://gerrit.wikimedia.org/r/#/c/68584/ ) [14:05:02] RECOVERY - Host cp1063 is UP: PING OK - Packet loss = 0%, RTA = 0.76 ms [14:05:02] RECOVERY - Host cp1062 is UP: PING OK - Packet loss = 0%, RTA = 1.10 ms [14:05:16] andrewbogott: what's the naming conflict on that? [14:05:18] andrewbogott: perhaps I missed an answer yesterday, but can't we just move the roles last? [14:06:08] mark, I'd prefer to move them as I refactor individual manifests -- that allows me to test them as I go. For example, I already have a box set up to test exim, so would like to finish all the exim moves in one go. [14:06:11] RECOVERY - Host cp1064 is UP: PING OK - Packet loss = 0%, RTA = 0.21 ms [14:06:46] ottomata, if 'role::' refers to a module named 'role' then puppet isn't so great about picking up classes named 'role::' outside of the module [14:06:54] I've have role class naming conflicts too, and usually solved them by: [14:06:54] A. never declaring nested classes [14:06:54] B. including fully qualified class names, e.g. class { '::role::gerrit::production': } [14:06:56] Im not [14:07:01] PROBLEM - Varnish HTTP upload-frontend on cp1061 is CRITICAL: Connection refused [14:07:02] RECOVERY - Varnish HTTP upload-frontend on cp1064 is OK: HTTP OK: HTTP/1.1 200 OK - 673 bytes in 0.013 second response time [14:07:14] OH, this is a module named role...hmmmmm [14:07:16] AzaToth: Ok thanks. [14:07:19] hmmmm [14:07:27] andrewbogott: so you want to do one big rename of that module at the end or something? [14:07:30] do we have anything else in this 'wmrole' module yet? 
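[Editor's note] The fully qualified class name workaround ottomata describes can be sketched in a minimal manifest; the node and class names are illustrative, not from WMF's actual tree:

```puppet
node 'gerrit1001' {
    # Inside a module named "role", a bare reference like role::gerrit
    # resolves relative to the current scope; the leading :: forces
    # resolution from top scope, so this finds the role module's class
    # even when the surrounding scope has its own "role" namespace.
    class { '::role::gerrit::production': }
}
```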
[14:07:32] or is this brand new? [14:07:37] akosiaris: Since he cannot merge, could you? [14:08:13] akosiaris: the reason for using 0 as revision is that it's not in debian [14:08:20] mark, yeah, or just leave it. Maybe 'mwrole' offends you more than it does me though :) [14:08:30] yeah I hate it :) [14:08:31] ha, it kind of offends me too :p [14:08:36] um…wmrole [14:09:04] akosiaris: it's the same ubuntu does for packages not in debian [14:09:11] mark, in that email thread i suggested eventually using a secondary module directory in modulepath for roles [14:09:15] don't worry about testing, if modularization of role in the end fails, everything will break [14:09:18] i don' think anyone commented on that [14:09:32] to make it forward compatible for eventual inclusion in debian [14:09:37] * mark rereads [14:09:57] AzaToth: doesn't ubuntu do that ubuntu[0-9] thingy ? [14:10:02] when it's the same upstream version [14:10:12] akosiaris: yes, as their revision indication [14:10:34] New patchset: Jgreen; "new >= 2048 bit key for sahar" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70634 [14:10:35] but the -N is the debian revision, and for packages/version not in debian, the debian revision is 0 [14:10:51] could have made it -0+wmf1 [14:11:02] that one i like better [14:11:17] andrewbogott: I don't see that in that thread? [14:11:22] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [14:11:23] οκ ... i will try that and merge [14:11:41] mark, which? 
about a secondary module directory [14:11:58] akosiaris: I didn't focus on the changelog as the one actually building the package needs to update the changelog ツ [14:12:42] that was me [14:12:44] akosiaris: of course debian will never include buck 0.0+g410fcf34 ツ [14:12:45] lemme see [14:13:02] obviously :-) [14:13:29] ottomata, I think I got one of your 360s last week too :) [14:13:45] mark, forwarded [14:14:02] oh haha [14:14:09] akosiaris: sadly there are no version/tags in buck at all, so the version is all fuckedup [14:14:16] your full name is not far from mine [14:14:18] AzaToth: why not? [14:14:21] if bog was your middle name: [14:14:27] you would be Andrew Ott [14:14:33] 0.0+g410fcf34 is a bad version number [14:14:40] doh [14:14:40] paravoid: yes [14:14:41] sorry I mixed you guys up [14:14:43] there is a problem with that even for us [14:14:47] paravoid: yes [14:14:51] which explains why I didn't see the email when looking for andrew bogott's ;) [14:14:56] the fact that it's not monotonically increasing [14:15:02] paravoid: I never came to a resolution on what we should specify as the version [14:15:10] paravoid: git describe doesn't work [14:15:14] when packaging git snapshots, append the date [14:15:19] er, prepend [14:15:27] ottomata: so I don't see how that helps anything, it's not very related to this discussion as far as I can see [14:15:40] I also don't think we're gonna have very many non-WMF specific modules at all [14:15:47] well, the discussion was about role modules [14:15:50] yes [14:15:52] vs manifests/role [14:15:55] right now we have a name conflict [14:16:01] paravoid: in what format? [14:16:05] oh, totally [14:16:06] i'm not sure how another modules dir helps there [14:16:15] this is more abstract, i just remembered that no one commented [14:16:22] paravoid: prepend?
[14:16:26] 0~git20130626 should work [14:16:33] sorry, i will postpone the abstract discussion til it matters :p [14:16:52] ok, i thought you were proposing a solution here [14:16:55] I can have another go at making manifests/role and module/role coeexist. I didn't hammer on it all that long, just noticed that it was causing funny things to happen. [14:17:10] why do we have module/role? is that new? [14:17:16] andrewbogott: I think I would prefer to just keep roles in manifests/ for now until we're ready to move it over in one big role module [14:17:18] andrewbogott: imho, just go with manifests/role for now [14:17:26] that's the part I don't like right now [14:17:27] yeah [14:17:39] especially since that discussion isn't settled [14:18:15] as far as I see it, the role classes will stay a little bit spaghetti anyway [14:18:22] mark, paravoid, I can do that, but it seems a little bit weird that you want me to upend my work process due to not liking a module name. Aren't you deep into bikeshed territory? [14:18:45] paravoid: AzaToth: what about this ? 0.0~git20130611-wmf1 [14:19:02] no [14:19:04] you're trying to do two changes at once, one of which is quite fundamental in the way we work [14:19:13] andrewbogott: I believe the modules and roles related to those modules are separate [14:19:16] fundamentally [14:19:20] 0~git20130611-0+wmf1 is fine [14:20:01] Hm… several of my 'make modules' patches have been met with code reviews that say "you should spin this out into a role" [14:20:09] ok moving forward with this [14:20:18] that is probably true, doesn't mean you need to do it now [14:20:19] So it's connected in the sense that I'm making a bunch of new roles, and it grinds my buzz to create those new roles in a place that I know is the wrong place [14:20:25] or YOU need to do it in the first place [14:20:29] why is the wrong place? 
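[Editor's note] The objection to `0.0+g410fcf34` is that commit hashes carry no ordering, while date-stamped snapshot versions like the accepted `0~git20130611-0+wmf1` increase monotonically. A tiny illustration (the version strings beyond those quoted in the log are invented; real comparisons should use `dpkg --compare-versions`, which additionally sorts `~` before everything else):

```shell
# Three snapshot uploads in chronological order: with the date prepended,
# even a naive byte-wise sort keeps them in release order.
printf '%s\n' \
    '0~git20130611-0+wmf1' \
    '0~git20130626-0+wmf1' \
    '0~git20130705-0+wmf1' | sort -c && echo 'dates are monotonic'

# The same pattern with hash-based versions gives no such guarantee:
# g410fcf34 vs g22ab9c1 sort by hex digits, not by history.
printf '%s\n' '0.0+g410fcf34' '0.0+g22ab9c1' | sort -c 2>/dev/null \
    || echo 'hashes are not monotonic'
```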
[14:20:31] is it* [14:20:42] it's the right place [14:20:47] if you're just moving stuff into modules, then I think having some things currently in manifests which should move into roles is ok [14:21:01] we can move those to roles later [14:21:03] *shrug* because my goal is to eliminate modules, and adding new manifests is the opposite of that [14:21:10] yeah, having things in manifests/roles still is good [14:21:22] we can figure out how to get rid of that in a nice way later [14:21:23] I think right now, manifests/role is the exception to that [14:21:39] configuration specific values don't really belong in the modules anyway [14:21:45] Well, labs won't get any of the benefit of this work until everything is in modules. [14:21:45] I thought your goal was to de-spaghettize our codebase [14:21:50] so I won't ask you to move stuff into roles if all you're doing is moving existing badly designed stuff into modules [14:22:33] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [14:22:44] I also think the role module (or manifests, whatever) will always remain a bit spaghetti [14:22:44] frankly, I'm even more stringent than mark I think [14:22:49] hopefully a bit less than now [14:23:18] I think that we should take the opportunity of moving into modules and rethink where each part goes [14:23:24] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.129 second response time [14:23:25] akosiaris: I see there are many new commits to upstream buck [14:23:31] paravoid: that will slow it down though [14:23:32] !log reedy synchronized php-1.22wmf7/extensions/VisualEditor/modules/ve/ve.js [14:23:34] paravoid, sure, I don't object to that. [14:23:41] Logged the message, Master [14:23:43] I'm with andrewbogott that we don't have to make it perfect now [14:23:49] Reedy: Ta. [14:23:50] so? 
it works as it is now [14:23:57] not for labs [14:24:02] akosiaris: though due the way someone setup ourt repo (.gitreview + big merge) it's a hell to merge it [14:24:07] what do you mean? [14:24:22] AzaToth: tell me about it ... :-( [14:24:28] apparently everything needs to be moved into modules for something to work? [14:24:30] I don't know what really [14:24:34] * AzaToth looks at hashar with angry eyes [14:24:37] Yeah, I think there's a middle ground between 'perfect' and 'don't rearrange anything'. Basically once a refactor requires me to understand a tool rather than just understanding puppet I'll balk, but refactors that I can understand are fine :) [14:25:05] andrewbogott: are you actually working on going thorugh *everything* right now? [14:25:06] into modules? [14:25:11] or just pieces you need? [14:25:31] ottomata, I'm not going to do it all in one patch [14:25:38] akosiaris: if possible I would want to rebase our changes ontop upstream [14:25:38] right [14:25:38] !log reedy synchronized php-1.22wmf8/extensions/VisualEditor/modules/ve/ve.js [14:25:40] I'm not terribly excited into having the same spaghetti possibly badly written/layered code just under a different filepath [14:25:48] Logged the message, Master [14:25:51] but are you planning to move everything in this effort? [14:25:54] ymmv [14:26:20] so why exactly do we need to have *everything* in modules asap? [14:26:24] ottomata, yes (although realistically I'll probably get pulled away by something more urgent before I finish) [14:26:29] aye, hm [14:26:32] AzaToth: :-D [14:26:33] i think i agree wiht paravoid then [14:26:34] i'm working under the assumption that this is indeed needed [14:26:39] if there's a need to have everything modules asap I might reconsider though [14:26:42] it'd be better to keep the speghetti code in manifests [14:26:46] AzaToth: feel free to try... 
I will too [14:26:46] and then clean refactored stuff in modules/ [14:26:52] that helps us track our progress too [14:26:56] nod [14:27:00] i do agree with that [14:27:15] but I assume there's a reason why andrew is now rushing the move ;) [14:27:21] akosiaris: sadly I can't force push [14:27:23] but, if there a need to have everything in a module [14:27:31] then perhaps multiple module paths will help here? [14:27:34] Well… once things are in modules we get some benefits that could make the refactoring safer/easier. We can write unit tests for modules [14:27:44] we could have a different module directory for speghetti modules :p [14:27:45] AzaToth: crap... ok ... gimme a little time to figure it out [14:28:01] we can use some new puppet features that allow us to puppetize instances from git branches [14:28:09] akosiaris: unless mark & C:o gives me über power to the repo [14:28:14] which feature is that? [14:28:15] roles/ [14:28:15] modules/ [14:28:15] wmfmodules/ [14:28:16] whatever [14:28:22] ottomata: not spaghetti/ ? [14:28:25] haa [14:28:26] sure [14:28:28] :) [14:28:36] not everything is bad under manifests btw [14:28:40] * andrewbogott looks it up [14:28:42] a few things are, others are mostly okay [14:28:54] nodes will stay in manifests/ right? [14:28:58] nodes? [14:28:59] http://docs.puppetlabs.com/guides/environment.html [14:29:04] environments yeah [14:29:07] node definitions [14:29:25] I don't think environments needs modules necessarily [14:29:34] it works better with them, specifically for files [14:29:35] btw, you can also have $environment in modulepath [14:29:42] roles/$environment/ could be a module dir [14:29:58] that might be nice for our realm specific roles [14:30:02] anyway, ahh sorry [14:30:05] tangent! 
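[Editor's note] The `roles/$environment/` modulepath idea floated above can be sketched in puppet.conf; directory names are illustrative and follow the Puppet 3.x dynamic-environments pattern from the linked guide, not necessarily WMF's actual configuration:

```ini
# /etc/puppet/puppet.conf (master) -- hypothetical sketch
[master]
    # $environment is interpolated per agent request, so a "production"
    # node gets roles/production/ on its modulepath while a "labs" node
    # gets roles/labs/, each typically backed by a matching git branch.
    modulepath = $confdir/roles/$environment:$confdir/modules
    manifest   = $confdir/manifests/site.pp
```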
[14:30:05] that's fairly common, ottomata [14:30:08] ignore me :p [14:30:28] multiple branches corresponding 1:1 to multiple environments like that [14:30:42] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [14:30:43] My motivations may be half-baked… a) Ryan keeps muttering that 'if only we used modules' we could do [14:30:54] b) Any time I read about puppet techniques it assumes we're using modules [14:30:58] well [14:31:08] the puppet folks are basically pushing hard to get everyone to use modules [14:31:09] And from b) I conclude that we're stuck in a backwater [14:31:14] but that's not always on a solid foundation [14:31:27] a lot of these things can and/or could have been done without modules [14:31:27] i would *think* that ryan would also be saying "if we only had GOOD modules' we could do lots of things [14:31:38] if config values are all hardcoded in modules, they aren't very modular [14:31:42] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 5.864 second response time [14:31:50] I think our main motivation is readability/maintainability [14:32:04] modules is the means to that, but not if we just move the exact same code over [14:32:40] perhaps someone should try environments with our current puppet repo [14:32:41] in labs [14:32:55] if it's possible to get by without moving everything into modules right now, that would definitely be preferred [14:33:02] then we can do modules one by one in a less rushed manner [14:33:50] You keep saying 'asap' and 'rushed' as though you feel like my existing patches make things explicitly worse [14:33:55] Are they worse? Or just not perfect? 
[14:34:41] as I've barely done any reviews for you, I can't comment [14:35:40] akosiaris: if you can reset master to a3aadacd7c1ccd819420a73975a08ba67110decb I can push two new changes [14:36:42] PROBLEM - Disk space on ms-be1001 is CRITICAL: DISK CRITICAL - free space: / 5683 MB (3% inode=98%): [14:36:55] oh dear [14:37:44] (the nagios alert) [14:39:23] RECOVERY - Varnish HTTP upload-frontend on cp1061 is OK: HTTP OK: HTTP/1.1 200 OK - 675 bytes in 0.433 second response time [14:39:30] andrewbogott: the mail stuff I'm looking at right now still has quite a bit of layering violation and such... that's not at all your fault, but it does mean that it's even less likely to get fixed later [14:39:41] that would be ok if there's a really pressing need to move everything into modules now [14:39:44] I disagree that it's less likely [14:39:49] but if there isn't, I would prefer it to stay where it is [14:39:53] In a module we can write tests, and having tests allows for safer, easier refactoring [14:40:08] that doesn't hold for layering violations [14:40:19] if the exim shouldn't have simple-mail-sender *at all* [14:40:24] the exim module* [14:40:44] and you'd want to move that to e.g. a role [14:40:48] where would you put the unit test? [14:41:54] I'm going to go eat breakfast. [14:41:54] (even if we make the assumption that it is possible to write such extensive tests, which is probably more effort than cleaning up the module itself...)
[14:42:15] also currently there is quite a bit of spaghetti here and there, but at least that's sort of understood [14:42:29] yes, exactly my point too [14:42:32] once we start moving that into modules and splitting them up, that's getting even more confusing [14:42:42] it's spaghetti that we're used to, reordering all that up is just going to hurt us more [14:42:54] or having two places to look for a role, for example [14:42:56] again, if we absolutely have to, sure, but if not, i'd like to do them one by one [14:43:07] at least we can keep the modules more or less consistent then [14:43:40] I assume no then [14:45:23] AzaToth: a3aadacd7c1ccd819420a73975a08ba67110decb is not a commit in master [14:45:39] let's make a secondary spaghetti/ module dir! [14:45:40] akosiaris: it's the tip of upstream [14:46:03] then andrewbogott can move to 'modules' and we can keep track of the difference between the unholy and holy modules [14:46:19] there are no holy puppet modules [14:46:47] New patchset: Hashar; "do more spaghetti sauce" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70641 [14:46:49] andrewbogott: noone ever said anything about modules not being an improvement [14:46:54] not sure where your mail comes from [14:47:39] how the hell is gerrit meant to be useful when you need to merge commits from a upstream? [14:48:01] or is that nothing they remotely considered [14:48:43] PROBLEM - Disk space on ms-be1001 is CRITICAL: DISK CRITICAL - free space: / 5544 MB (3% inode=98%): [14:49:44] RECOVERY - Disk space on ms-be1001 is OK: DISK OK [14:49:54] akosiaris: reset upstream-google to that as well [14:50:38] i am starting to get the feeling gerrit won't be happy... 
[14:51:23] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [14:53:12] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.128 second response time [15:07:41] New patchset: BBlack; "copyright/licensing/docs" [operations/software/varnish/libvmod-netmapper] (master) - https://gerrit.wikimedia.org/r/70643 [15:08:11] Change merged: BBlack; [operations/software/varnish/libvmod-netmapper] (master) - https://gerrit.wikimedia.org/r/70643 [15:09:22] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [15:18:45] ^demon: so I messed up and submitted a bunch of change sets to gerrit that depend on one another. I _think_ I can figure out how to rebase them all against gerrit/master which looks like the right thing to do. [15:19:49] manybubbles: git remote update ; git checkout 'sha1 of the latest commit' ; git rebase [15:20:28] hashar: let me give that a shot [15:22:23] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [15:23:13] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.148 second response time [15:23:46] <^demon> manybubbles: I'm reviewing it now. I should've merged the HTML sanitizing yesterday, had already tested it [15:24:48] ^demon: in that case I'll just let you review them in order and it'll merge them. next time I'll do something different - are you making a branch for every change set, commiting it, and sending that to gerrit? [15:25:12] <^demon> That's usually what I do. [15:25:20] <^demon> Branch per set of connected changes I'm working on. [15:25:45] I'll do that from here on out [15:25:56] <^demon> I'm also very liberal with resetting and rebasing, so a lot of times I'll just work on master, push to gerrit, then toss it from my local history. 
[15:26:06] <^demon> s/lot of times/sometimes/ [15:27:30] because once you've pushed to gerrit you can always get it back anyway [15:27:45] <^demon> Yup :) [15:28:22] so if you don't branch you'll git commit, git review, then git reset --hard gerrit/master? [15:28:52] I'm just trying to connect all the dots.... [15:29:31] <^demon> I have an alias for some of my resetting. [15:29:36] I think when I went to distributed version control I got half of the experience with our workflow. This one is so different and somewhat liberating. [15:29:38] cool [15:29:45] <^demon> `git rollback` == `git reset --hard HEAD~1` [15:29:54] manybubbles: what I do: [15:29:54] if there are only a very small number of connected changes that I need to get reviewed soon, I use a topic branch: a local branch that tracks gerrit/master [15:29:56] <^demon> `git ohcrap` == `git reset --hard origin/master` [15:29:59] akosiaris: no luck I assume [15:30:19] if there is longer development work to do, then I use a local branch that also tracks a gerrit remote branch [15:30:19] and merge back into gerrit/master later [15:30:38] <^demon> manybubbles: I'm going to merge the namespace changes. Other than that todo I think it's working well. [15:31:03] ottomata: cool. gits efemeral branches offer lots of fun. [15:31:05] ^demon: sounds good. [15:31:29] AzaToth: you assume correctly... I am new to gerrit as well and the requirements it imposes puzzle me a bit.... [15:31:39] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [15:32:05] still trying to figure it out [15:32:15] <^demon> AzaToth: What seems to be the problem? [15:32:19] ^demon: in https://gerrit.wikimedia.org/r/#/c/70535/1/CirrusSearchUpdater.php you mention that __METHOD__ doesn't work in an anonymous function - it certainly "works" in that it doesn't crash and produces recognizable output - if we don't like it I can change it though. 
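[Editor's note] The topic-branch and hard-reset habits described above can be sketched end to end; the repository, file, and branch names are made up for the demo, and the Gerrit push itself is only mentioned in comments:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.org
git config user.name Dev
echo base > file.txt
git add file.txt
git commit -q -m initial
main=$(git symbolic-ref --short HEAD)   # master or main, depending on git version

# A topic branch for one set of connected changes, as ^demon suggests.
git checkout -q -b cirrus-updates
echo tweak >> file.txt
git commit -qam 'one change of the series'

# After "git review" has pushed the series, the local copy is disposable:
# an "ohcrap"-style reset discards it, since Gerrit keeps every pushed
# revision and it can always be fetched back.
git checkout -q "$main"
git branch -q -D cirrus-updates
```

The point of working this way is that local history stays clean (one branch per review topic, or none at all), while Gerrit acts as the durable store for anything already pushed.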
[15:32:30] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 0.187 second response time [15:34:22] <^demon> manybubbles: I'm trying to find it, but I remember having to deal with this before. [15:34:46] ^demon: I'll just change it. no worries. simple simple. [15:35:49] <^demon> https://gerrit.wikimedia.org/r/#/c/54884/2..3/MWSearch_body.php - here's from before. [15:36:39] PROBLEM - Puppetmaster HTTPS on stafford is CRITICAL: CRITICAL - Socket timeout after 10 seconds [15:38:39] RECOVERY - Puppetmaster HTTPS on stafford is OK: HTTP OK: Status line output matched 400 - 336 bytes in 5.362 second response time [15:41:32] I am off see you tomorrow :-D [15:41:48] Change merged: Jgreen; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70634 [15:42:36] ^demon: fixed the first one. I'm rebasing the others (now I really see why you use the topic branches) [15:42:44] or not [15:50:27] PROBLEM - Disk space on ms-be1002 is CRITICAL: DISK CRITICAL - free space: / 5697 MB (3% inode=98%): [16:02:46] Hey, does the creation of a mailing list require mgmt approval or do I just Do It? [16:03:07] is it for wikimedia (community) purposes? 
[16:03:23] internal [16:03:29] then do it [16:07:17] !log updated Parsoid to e79f70a [16:07:27] Logged the message, Master [16:09:11] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [16:16:07] Change abandoned: Hashar; "(no reason)" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70641 [16:34:20] ^demon|lunch: I want to reset buck to a real upstream instead of the ugly merge someone (hashar) did [16:37:08] !log catrope synchronized php-1.22wmf7/extensions/VisualEditor 'Update VisualEditor to master' [16:37:16] Logged the message, Master [16:37:32] !log catrope synchronized php-1.22wmf8/extensions/VisualEditor 'Update VisualEditor to master' [16:37:41] Logged the message, Master [16:45:35] New patchset: Jforrester; "Make 'visualeditor-enable' hidden if on for all users" [operations/mediawiki-config] (master) - https://gerrit.wikimedia.org/r/70652 [16:46:35] New review: Jforrester; "We'd need to re-support this being a hidden preference in VE." [operations/mediawiki-config] (master) C: -1; - https://gerrit.wikimedia.org/r/70652 [16:46:41] New review: Catrope; "Do not deploy, pending verification that the preference has the ignoreHidden thingy set correctly." 
[operations/mediawiki-config] (master) C: -2; - https://gerrit.wikimedia.org/r/70652 [17:07:57] greg-g, we deployed yesterday with mobile, skipping today [17:08:47] yurik: cool [17:08:50] RoanKattouw: ^ [17:08:57] Oh OK [17:09:00] I'm already done [17:09:09] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [17:09:09] oh right, I saw that before and forgot [17:09:35] I just didn't delete the mental "when yuri pings me, ping Roan" todo in my head [17:25:52] New patchset: BBlack; "fix up test/dist stuff" [operations/software/varnish/libvmod-netmapper] (master) - https://gerrit.wikimedia.org/r/70654 [17:27:22] Change merged: BBlack; [operations/software/varnish/libvmod-netmapper] (master) - https://gerrit.wikimedia.org/r/70654 [17:51:03] greg-g, RoanKattouw, is anyone deploying right now? there is a bug and I would like to quickly deploy something [17:51:38] yurik: what's up? (and no, to answer your question) [17:51:39] I am not [17:52:14] greg-g, ok, doing a quick deploy - trying to figure out what has caused a huge drop in stats two weeks ago [17:52:26] oh right, that [18:04:13] greg-g, syncing... [18:04:24] "here goes nothing..." [18:04:30] exactly! [18:04:45] don't agree with me on that! ;) [18:08:41] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [18:09:33] !log yurik synchronized php-1.22wmf8/extensions/ZeroRatedMobileAccess [18:09:41] Logged the message, Master [18:11:55] !log yurik synchronized php-1.22wmf7/extensions/ZeroRatedMobileAccess [18:12:04] Logged the message, Master [18:14:16] greg-g, done [18:14:28] yurik: did you break it? [18:15:03] greg-g, doesn't seem so...
despite me trying hard to do so :) [18:15:04] good [18:29:47] New review: coren; "LGM" [operations/puppet] (production) C: 2; - https://gerrit.wikimedia.org/r/70582 [18:30:38] New patchset: coren; "Revert "Point the Parsoid cache in labs to parsoid-spof rather than deployment-parsoid2"" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70582 [18:31:12] PROBLEM - Puppet freshness on wtp1011 is CRITICAL: No successful Puppet run in the last 10 hours [18:31:21] New review: coren; "Still looks good after rebase. :-)" [operations/puppet] (production) C: 2; - https://gerrit.wikimedia.org/r/70582 [18:31:22] Change merged: coren; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70582 [18:32:12] PROBLEM - Puppet freshness on mw10 is CRITICAL: No successful Puppet run in the last 10 hours [18:32:33] New patchset: Demon; "Updates for 2.7-rc2-505-g7502a46" [operations/debs/gerrit] (master) - https://gerrit.wikimedia.org/r/70667 [18:33:01] New review: Demon; "War is at https://integration.wikimedia.org/nightly/gerrit/wmf/gerrit-2.7-rc2-505-g7502a46.war" [operations/debs/gerrit] (master) - https://gerrit.wikimedia.org/r/70667 [18:33:11] PROBLEM - Puppet freshness on amssq31 is CRITICAL: No successful Puppet run in the last 10 hours [18:33:11] PROBLEM - Puppet freshness on amssq35 is CRITICAL: No successful Puppet run in the last 10 hours [18:33:11] PROBLEM - Puppet freshness on amssq37 is CRITICAL: No successful Puppet run in the last 10 hours [18:33:11] PROBLEM - Puppet freshness on cp1038 is CRITICAL: No successful Puppet run in the last 10 hours [18:33:11] PROBLEM - Puppet freshness on cp1060 is CRITICAL: No successful Puppet run in the last 10 hours [18:35:13] PROBLEM - Puppet freshness on amssq36 is CRITICAL: No successful Puppet run in the last 10 hours [18:35:13] PROBLEM - Puppet freshness on amssq43 is CRITICAL: No successful Puppet run in the last 10 hours [18:35:13] PROBLEM - Puppet freshness on amssq41 is CRITICAL: No successful Puppet 
run in the last 10 hours [18:35:13] PROBLEM - Puppet freshness on amssq50 is CRITICAL: No successful Puppet run in the last 10 hours [18:35:13] PROBLEM - Puppet freshness on amssq58 is CRITICAL: No successful Puppet run in the last 10 hours [18:36:17] !log commented out cron-msp queue flush in stat1:/etc/cron.d/sendmail to stop pointless cronspam, host is running exim [18:36:28] Logged the message, Master [18:36:54] Ryan_Lane: What /does/ cause those hosts to not run puppet? A quick check shows that running it by hand works fine. [18:37:12] PROBLEM - Puppet freshness on amssq33 is CRITICAL: No successful Puppet run in the last 10 hours [18:37:14] PROBLEM - Puppet freshness on amssq38 is CRITICAL: No successful Puppet run in the last 10 hours [18:37:14] PROBLEM - Puppet freshness on amssq42 is CRITICAL: No successful Puppet run in the last 10 hours [18:37:14] PROBLEM - Puppet freshness on amssq45 is CRITICAL: No successful Puppet run in the last 10 hours [18:37:14] PROBLEM - Puppet freshness on amssq49 is CRITICAL: No successful Puppet run in the last 10 hours [18:37:14] PROBLEM - Puppet freshness on amssq39 is CRITICAL: No successful Puppet run in the last 10 hours [18:38:08] New review: Catrope; "Blocked on https://gerrit.wikimedia.org/r/#/c/70655/ being deployed" [operations/mediawiki-config] (master) - https://gerrit.wikimedia.org/r/70652 [18:39:39] Coren: most likely the master being overloaded [18:39:51] 10 hours? [18:40:17] Do we normally do something about it or just let it recover next run(s)? [18:40:42] Coren: they're probably decom'ed but still alive [18:40:46] Coren: it should recover [18:41:04] I'm preparing something for that [18:41:11] ah, and they just haven't been pulled from icinga yet? [18:41:20] why wouldn't we continue running puppet on decom nodes? [18:41:36] we should until they are fully removed [18:42:05] if it's that then the cause could be that neon hasn't had a successful run at the right time yet. (it's only some shard of runs, right?) 
[18:42:33] did i get the name right? neon is the new spence? [18:43:13] we have all permutations of (machine is online, machine is in puppet, machine is in puppetca, machine is decommissioned) [18:43:25] news about all that soon [18:45:14] * jeremyb subscribes to the news feed [18:48:03] paravoid: Your ideas are intriguing to me and I wish to subscribe to your newsletter. [18:53:04] <^demon|lunch> AzaToth: I just put you in the 'gerrit' group in gerrit. You should be able to manage the buck/gerrit repos now. [18:53:13] <^demon|lunch> You've got owner on them [18:55:05] okidoki [18:57:13] ^demon|lunch: does gerrit actually announce changes in gerrit to any channel? [18:57:22] as they are per definition repos [18:57:25] <^demon|lunch> To -dev, I think. [18:57:31] <^demon|lunch> That's the default. [18:59:40] ^demon|lunch: I added force push to gerrit on https://gerrit.wikimedia.org/r/#/admin/projects/operations/debs/buck,access but haven't seen any notification in any channel [19:00:56] <^demon|lunch> Bot's never notified on acl changes. [19:00:56] <^demon|lunch> Unless it was proposed for review & merged. Bot kinda sucks. [19:01:12] ah [19:01:35] afaik acl changes are something that should be notified to the relevant parties [19:01:53] <^demon|lunch> I want to rewrite it to use stream-events instead. [19:01:56] k [19:02:19] ^demon|lunch: are you ok if I remake buck repo to remove that ugly merge hashar did? [19:02:58] i.e. atm it's empty + gitreview + merge [19:03:08] which is a pain [19:03:09] <^demon|lunch> Will it break the sha1s on https://gerrit.wikimedia.org/r/#/q/project:operations/debs/buck,n,z? [19:03:37] I will keep the shas [19:03:47] <^demon|lunch> k.
[19:04:07] except hashar's merge, which will disappear [19:04:53] New patchset: AzaToth; "Initial debian build" [operations/debs/buck] (master) - https://gerrit.wikimedia.org/r/67999 [19:04:53] Change merged: AzaToth; [operations/debs/buck] (master) - https://gerrit.wikimedia.org/r/67999 [19:05:21] damn [19:06:01] <^demon|lunch> I can just delete & recreate :p [19:06:05] was only going to push upstream-google [19:06:11] ok, do dat [19:08:29] PROBLEM - Puppet freshness on mc15 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:29] PROBLEM - Puppet freshness on lvs1006 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:29] PROBLEM - Puppet freshness on erzurumi is CRITICAL: No successful Puppet run in the last 10 hours [19:08:29] PROBLEM - Puppet freshness on ms-fe3001 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:29] PROBLEM - Puppet freshness on lvs1004 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:30] PROBLEM - Puppet freshness on sodium is CRITICAL: No successful Puppet run in the last 10 hours [19:08:30] PROBLEM - Puppet freshness on spence is CRITICAL: No successful Puppet run in the last 10 hours [19:08:31] PROBLEM - Puppet freshness on virt4 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:31] PROBLEM - Puppet freshness on virt1 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:32] PROBLEM - Puppet freshness on virt3 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:32] PROBLEM - Puppet freshness on lvs1005 is CRITICAL: No successful Puppet run in the last 10 hours [19:08:33] ^demon|lunch: done or lunch? [19:08:43] <^demon|lunch> Sec, lunch was hours ago [19:08:49] hehe [19:09:34] do you know if I can add upstream into gitreview? [19:10:59] <^demon> New repo created, completely empty.
[19:11:01] <^demon> You can push whatever you want :p [19:12:23] ty [19:13:02] ^demon: wondering if there is a legit way to define that we use upstream code from [19:13:28] true, can plaster it into a text file, but I thought about something more automagical, like in .gitreview [19:13:51] <^demon> Well upstream doesn't use gerrit, so .gitreview wouldn't help much. [19:14:00] <^demon> git-review would barf [19:14:14] our upstream for buck _is_ gerrit [19:14:33] i.e. it's gerrit's version of facebook's buck [19:14:40] aka fubar [19:16:09] ^demon: you removed ldap,ops from access? [19:16:22] or was that irrelevant for nothing? [19:16:44] <^demon> They have access by default [19:17:00] k [19:18:43] New patchset: AzaToth; ".gitreview file" [operations/debs/buck] (master) - https://gerrit.wikimedia.org/r/70672 [19:18:43] New patchset: AzaToth; "Initial debian build" [operations/debs/buck] (master) - https://gerrit.wikimedia.org/r/70673 [19:19:09] Change merged: AzaToth; [operations/debs/buck] (master) - https://gerrit.wikimedia.org/r/70672 [19:20:20] Change merged: Ryan Lane; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70448 [19:20:41] Change merged: Ryan Lane; [operations/debs/gerrit] (master) - https://gerrit.wikimedia.org/r/70667 [19:21:49] MaxSem: dunno if it was right or wrong, but I interpreted debs-osm as debs-osm whatever osm is but it's debs at least [19:22:16] probably Open Street Map [19:22:17] Original Synical Manager? [19:22:22] heh:) [19:22:31] oh [19:23:39] hashar: remade the repo [19:23:58] the big merge did fuck everything up [19:24:59] ^demon: do we have debian-glue enabled on jenkins? [19:25:21] would want to have something automagically verify [19:25:56] <^demon> Dunno, doubt it [19:26:44] k [19:26:58] what is debian-glue ?
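The "I will keep the shas" point above holds because commit SHA-1s depend only on the commit objects, not on which repository holds them: pushing unchanged history into a freshly created empty repo preserves every SHA-1. A throwaway sketch (all paths and contents here are hypothetical):

```shell
# Recreating an empty repo and pushing existing history into it keeps
# the original commit SHA-1s; repo names and file contents are made up.
set -e
work=$(mktemp -d)
git init -q "$work/src"
cd "$work/src"
git config user.email dev@example.org
git config user.name dev
echo hi > f; git add f; git commit -qm initial
sha=$(git rev-parse HEAD)
git init -q --bare "$work/recreated.git"   # the "completely empty" repo
git push -q "$work/recreated.git" HEAD:refs/heads/master
# Same commit object, hence the same SHA-1 in the new repo:
git --git-dir="$work/recreated.git" rev-parse refs/heads/master
```

This is why removing the unwanted merge (whose history is not pushed) makes it disappear while every other change keeps its SHA-1.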
[19:27:04] <-- is the jenkins guy [19:27:09] ok so http://jenkins-debian-glue.org [19:27:13] seems to be to build debian packages [19:27:22] which is something I really would like our Jenkins setup to manage [19:27:35] New patchset: Demon; "Set max commit summary lengths for Gerrit" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/66665 [19:27:47] but my past experience with build systems ended up with requiring some root password :D [19:27:49] New review: Demon; "I did 100 / 120." [operations/puppet] (production) - https://gerrit.wikimedia.org/r/66665 [19:29:28] ^demon: newest gerrit is in the repo [19:29:35] <^demon> ty! [19:30:32] AzaToth: do you have any past experience with debian glue ? ;D [19:36:19] hashar: aye [19:36:39] hashar: https://github.com/mika/jenkins-debian-glue/issues/55 [19:37:06] :-) [19:37:58] I am looking for a debian package to install it on labs :D [19:38:30] you can either build it using debian glue or find a pre-packaged one [19:38:59] http://jenkins-debian-glue.org/ [19:42:09] hashar: http://jenkins-debian-glue.org/getting_started/manual/ outlines how to make a job to actually build jenkins-debian-glue [19:47:03] hashar: you understand?
[19:47:09] yeah reading [19:47:33] it took time for me to actually understand how it worked [19:47:36] trying to remember how I have set up a jenkins slave in labs :D [19:47:44] easy [19:48:02] a slave just needs to be an ssh login with sudo access [19:48:24] yeah that is the "just" which is going to cause me trouble hehe [19:48:32] hehehe [19:49:28] there's of course the old never-fixed bug in jenkins where it can stall on copying artifacts after build is done [19:52:21] https://issues.jenkins-ci.org/browse/JENKINS-7641 [19:52:31] they seem to ignore this kind of error [19:52:45] afaik ssh is too alien to java programmers [19:52:49] AzaToth, I noticed you added me as a reviewer to debs [19:53:04] MaxSem: I added debs-osm :-P [19:53:10] which is you it seems [19:53:22] I've no idea about packages:P [19:53:29] I saw "debs" [19:53:41] I just committed other people's stuff [19:53:41] :-P [19:53:50] [06/26/13 19:53:40] [SSH] Authentication successful. [19:54:06] you are in the only debs group in jenkins, so :-P [19:54:14] hashar: that's good [19:55:23] hashar: when I was building I was building on slaves that were armel 1.2GHz 512MB [19:56:04] * hashar waits for java to install [19:56:46] hmm [19:59:21] RECOVERY - Disk space on ms-be1002 is OK: DISK OK [20:07:39] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [20:07:39] RECOVERY - Puppet freshness on amssq31 is OK: puppet ran at Wed Jun 26 20:07:36 UTC 2013 [20:07:39] RECOVERY - Puppet freshness on amssq35 is OK: puppet ran at Wed Jun 26 20:07:36 UTC 2013 [20:07:39] RECOVERY - Puppet freshness on mw1020 is OK: puppet ran at Wed Jun 26 20:07:37 UTC 2013 [20:07:48] RECOVERY - Puppet freshness on db77 is OK: puppet ran at Wed Jun 26 20:07:38 UTC 2013 [20:07:48] RECOVERY - Puppet freshness on mc4 is OK: puppet ran at Wed Jun 26 20:07:38 UTC 2013 [20:07:48] RECOVERY - Puppet freshness on mw10 is OK: puppet ran at Wed Jun 26 20:07:39 UTC 2013 [20:07:48] RECOVERY - Puppet
freshness on sq68 is OK: puppet ran at Wed Jun 26 20:07:39 UTC 2013 [20:07:48] RECOVERY - Puppet freshness on db65 is OK: puppet ran at Wed Jun 26 20:07:40 UTC 2013 [20:07:49] RECOVERY - Puppet freshness on db1058 is OK: puppet ran at Wed Jun 26 20:07:41 UTC 2013 [20:07:49] RECOVERY - Puppet freshness on ms-be1007 is OK: puppet ran at Wed Jun 26 20:07:42 UTC 2013 [20:07:50] RECOVERY - Puppet freshness on ms-be1009 is OK: puppet ran at Wed Jun 26 20:07:42 UTC 2013 [20:07:50] RECOVERY - Puppet freshness on mw55 is OK: puppet ran at Wed Jun 26 20:07:43 UTC 2013 [20:07:51] RECOVERY - Puppet freshness on wtp1002 is OK: puppet ran at Wed Jun 26 20:07:43 UTC 2013 [20:07:51] RECOVERY - Puppet freshness on cp1038 is OK: puppet ran at Wed Jun 26 20:07:43 UTC 2013 [20:07:52] RECOVERY - Puppet freshness on mw95 is OK: puppet ran at Wed Jun 26 20:07:44 UTC 2013 [20:07:52] RECOVERY - Puppet freshness on srv272 is OK: puppet ran at Wed Jun 26 20:07:44 UTC 2013 [20:07:53] RECOVERY - Puppet freshness on mw99 is OK: puppet ran at Wed Jun 26 20:07:46 UTC 2013 [20:07:53] RECOVERY - Puppet freshness on srv264 is OK: puppet ran at Wed Jun 26 20:07:46 UTC 2013 [20:07:54] RECOVERY - Puppet freshness on mw39 is OK: puppet ran at Wed Jun 26 20:07:47 UTC 2013 [20:07:58] RECOVERY - Puppet freshness on mw1198 is OK: puppet ran at Wed Jun 26 20:07:47 UTC 2013 [20:07:58] RECOVERY - Puppet freshness on mw1169 is OK: puppet ran at Wed Jun 26 20:07:48 UTC 2013 [20:07:58] RECOVERY - Puppet freshness on search1023 is OK: puppet ran at Wed Jun 26 20:07:48 UTC 2013 [20:07:58] RECOVERY - Puppet freshness on search1021 is OK: puppet ran at Wed Jun 26 20:07:49 UTC 2013 [20:07:58] RECOVERY - Puppet freshness on mw1083 is OK: puppet ran at Wed Jun 26 20:07:49 UTC 2013 [20:07:59] RECOVERY - Puppet freshness on analytics1008 is OK: puppet ran at Wed Jun 26 20:07:49 UTC 2013 [20:07:59] RECOVERY - Puppet freshness on mw1094 is OK: puppet ran at Wed Jun 26 20:07:50 UTC 2013 [20:08:00] RECOVERY - Puppet freshness 
on mw1053 is OK: puppet ran at Wed Jun 26 20:07:50 UTC 2013 [20:08:00] RECOVERY - Puppet freshness on brewster is OK: puppet ran at Wed Jun 26 20:07:51 UTC 2013 [20:08:01] RECOVERY - Puppet freshness on mw1116 is OK: puppet ran at Wed Jun 26 20:07:51 UTC 2013 [20:08:01] RECOVERY - Puppet freshness on mw1136 is OK: puppet ran at Wed Jun 26 20:07:51 UTC 2013 [20:08:02] RECOVERY - Puppet freshness on mw1074 is OK: puppet ran at Wed Jun 26 20:07:52 UTC 2013 [20:08:02] RECOVERY - Puppet freshness on sq50 is OK: puppet ran at Wed Jun 26 20:07:52 UTC 2013 [20:08:03] RECOVERY - Puppet freshness on mc14 is OK: puppet ran at Wed Jun 26 20:07:53 UTC 2013 [20:08:03] RECOVERY - Puppet freshness on db57 is OK: puppet ran at Wed Jun 26 20:07:53 UTC 2013 [20:08:04] RECOVERY - Puppet freshness on db1024 is OK: puppet ran at Wed Jun 26 20:07:54 UTC 2013 [20:08:04] RECOVERY - Puppet freshness on cp1065 is OK: puppet ran at Wed Jun 26 20:07:54 UTC 2013 [20:08:05] RECOVERY - Puppet freshness on db51 is OK: puppet ran at Wed Jun 26 20:07:54 UTC 2013 [20:08:05] RECOVERY - Puppet freshness on wtp1013 is OK: puppet ran at Wed Jun 26 20:07:55 UTC 2013 [20:08:06] RECOVERY - Puppet freshness on es5 is OK: puppet ran at Wed Jun 26 20:07:55 UTC 2013 [20:08:06] RECOVERY - Puppet freshness on mc1008 is OK: puppet ran at Wed Jun 26 20:07:55 UTC 2013 [20:08:07] RECOVERY - Puppet freshness on analytics1009 is OK: puppet ran at Wed Jun 26 20:07:56 UTC 2013 [20:08:07] RECOVERY - Puppet freshness on dataset2 is OK: puppet ran at Wed Jun 26 20:07:56 UTC 2013 [20:08:08] RECOVERY - Puppet freshness on mw33 is OK: puppet ran at Wed Jun 26 20:07:57 UTC 2013 [20:08:08] RECOVERY - Puppet freshness on mw117 is OK: puppet ran at Wed Jun 26 20:07:57 UTC 2013 [20:08:09] RECOVERY - Puppet freshness on pdf3 is OK: puppet ran at Wed Jun 26 20:07:58 UTC 2013 [20:08:09] RECOVERY - Puppet freshness on mw26 is OK: puppet ran at Wed Jun 26 20:07:59 UTC 2013 [20:08:10] RECOVERY - Puppet freshness on mw36 is OK: puppet ran 
at Wed Jun 26 20:07:59 UTC 2013 [20:08:10] RECOVERY - Puppet freshness on cp1025 is OK: puppet ran at Wed Jun 26 20:08:01 UTC 2013 [20:08:11] RECOVERY - Puppet freshness on stat1002 is OK: puppet ran at Wed Jun 26 20:08:02 UTC 2013 [20:08:11] RECOVERY - Puppet freshness on analytics1011 is OK: puppet ran at Wed Jun 26 20:08:02 UTC 2013 [20:08:12] RECOVERY - Puppet freshness on analytics1019 is OK: puppet ran at Wed Jun 26 20:08:03 UTC 2013 [20:08:12] RECOVERY - Puppet freshness on mw1138 is OK: puppet ran at Wed Jun 26 20:08:03 UTC 2013 [20:08:13] RECOVERY - Puppet freshness on db46 is OK: puppet ran at Wed Jun 26 20:08:04 UTC 2013 [20:08:13] RECOVERY - Puppet freshness on cp1060 is OK: puppet ran at Wed Jun 26 20:08:04 UTC 2013 [20:08:14] RECOVERY - Puppet freshness on tarin is OK: puppet ran at Wed Jun 26 20:08:07 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on mc16 is OK: puppet ran at Wed Jun 26 20:08:07 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on amssq37 is OK: puppet ran at Wed Jun 26 20:08:08 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on wtp1021 is OK: puppet ran at Wed Jun 26 20:08:09 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on db55 is OK: puppet ran at Wed Jun 26 20:08:09 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on mw1185 is OK: puppet ran at Wed Jun 26 20:08:10 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on ms-fe2 is OK: puppet ran at Wed Jun 26 20:08:10 UTC 2013 [20:08:19] RECOVERY - Puppet freshness on mw107 is OK: puppet ran at Wed Jun 26 20:08:10 UTC 2013 [20:08:20] RECOVERY - Puppet freshness on snapshot4 is OK: puppet ran at Wed Jun 26 20:08:11 UTC 2013 [20:08:20] RECOVERY - Puppet freshness on mw1001 is OK: puppet ran at Wed Jun 26 20:08:11 UTC 2013 [20:08:21] RECOVERY - Puppet freshness on es1009 is OK: puppet ran at Wed Jun 26 20:08:12 UTC 2013 [20:08:21] RECOVERY - Puppet freshness on labsdb1001 is OK: puppet ran at Wed Jun 26 20:08:12 UTC 2013 [20:08:22] RECOVERY - Puppet freshness on cp1023 is OK: puppet ran at 
Wed Jun 26 20:08:13 UTC 2013 [20:08:22] RECOVERY - Puppet freshness on mw1218 is OK: puppet ran at Wed Jun 26 20:08:13 UTC 2013 [20:08:23] RECOVERY - Puppet freshness on lvs6 is OK: puppet ran at Wed Jun 26 20:08:14 UTC 2013 [20:08:23] RECOVERY - Puppet freshness on db1010 is OK: puppet ran at Wed Jun 26 20:08:14 UTC 2013 [20:08:24] RECOVERY - Puppet freshness on mw109 is OK: puppet ran at Wed Jun 26 20:08:15 UTC 2013 [20:08:24] RECOVERY - Puppet freshness on sq55 is OK: puppet ran at Wed Jun 26 20:08:16 UTC 2013 [20:08:25] RECOVERY - Puppet freshness on sq64 is OK: puppet ran at Wed Jun 26 20:08:16 UTC 2013 [20:08:25] RECOVERY - Puppet freshness on mc1011 is OK: puppet ran at Wed Jun 26 20:08:17 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on amssq59 is OK: puppet ran at Wed Jun 26 20:08:18 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on mw40 is OK: puppet ran at Wed Jun 26 20:08:18 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on amssq40 is OK: puppet ran at Wed Jun 26 20:08:19 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on aluminium is OK: puppet ran at Wed Jun 26 20:08:19 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on mw70 is OK: puppet ran at Wed Jun 26 20:08:19 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on sq81 is OK: puppet ran at Wed Jun 26 20:08:23 UTC 2013 [20:08:29] RECOVERY - Puppet freshness on mw1040 is OK: puppet ran at Wed Jun 26 20:08:23 UTC 2013 [20:08:30] RECOVERY - Puppet freshness on mw1022 is OK: puppet ran at Wed Jun 26 20:08:24 UTC 2013 [20:08:30] RECOVERY - Puppet freshness on ms-fe1003 is OK: puppet ran at Wed Jun 26 20:08:24 UTC 2013 [20:08:31] RECOVERY - Puppet freshness on analytics1001 is OK: puppet ran at Wed Jun 26 20:08:26 UTC 2013 [20:08:31] RECOVERY - Puppet freshness on mw1132 is OK: puppet ran at Wed Jun 26 20:08:26 UTC 2013 [20:08:32] RECOVERY - Puppet freshness on mw87 is OK: puppet ran at Wed Jun 26 20:08:27 UTC 2013 [20:08:38] RECOVERY - Puppet freshness on mw111 is OK: puppet ran at Wed Jun 26 20:08:28 
UTC 2013 [20:08:39] RECOVERY - Puppet freshness on search1011 is OK: puppet ran at Wed Jun 26 20:08:28 UTC 2013 [20:08:39] RECOVERY - Puppet freshness on virt2 is OK: puppet ran at Wed Jun 26 20:08:29 UTC 2013 [20:08:39] RECOVERY - Puppet freshness on mw1219 is OK: puppet ran at Wed Jun 26 20:08:29 UTC 2013 [20:08:39] RECOVERY - Puppet freshness on mw1072 is OK: puppet ran at Wed Jun 26 20:08:29 UTC 2013 [20:08:39] RECOVERY - Puppet freshness on mw64 is OK: puppet ran at Wed Jun 26 20:08:30 UTC 2013 [20:08:39] RECOVERY - Puppet freshness on search1006 is OK: puppet ran at Wed Jun 26 20:08:31 UTC 2013 [20:08:40] RECOVERY - Puppet freshness on professor is OK: puppet ran at Wed Jun 26 20:08:31 UTC 2013 [20:08:40] RECOVERY - Puppet freshness on db29 is OK: puppet ran at Wed Jun 26 20:08:31 UTC 2013 [20:08:41] RECOVERY - Puppet freshness on mc2 is OK: puppet ran at Wed Jun 26 20:08:32 UTC 2013 [20:08:41] RECOVERY - Puppet freshness on db31 is OK: puppet ran at Wed Jun 26 20:08:32 UTC 2013 [20:08:42] RECOVERY - Puppet freshness on magnesium is OK: puppet ran at Wed Jun 26 20:08:33 UTC 2013 [20:08:42] RECOVERY - Puppet freshness on db60 is OK: puppet ran at Wed Jun 26 20:08:33 UTC 2013 [20:08:43] RECOVERY - Puppet freshness on sq78 is OK: puppet ran at Wed Jun 26 20:08:34 UTC 2013 [20:08:43] RECOVERY - Puppet freshness on db59 is OK: puppet ran at Wed Jun 26 20:08:34 UTC 2013 [20:08:44] RECOVERY - Puppet freshness on mc1013 is OK: puppet ran at Wed Jun 26 20:08:34 UTC 2013 [20:08:44] RECOVERY - Puppet freshness on search27 is OK: puppet ran at Wed Jun 26 20:08:34 UTC 2013 [20:08:45] RECOVERY - Puppet freshness on mc8 is OK: puppet ran at Wed Jun 26 20:08:34 UTC 2013 [20:08:45] RECOVERY - Puppet freshness on cp1015 is OK: puppet ran at Wed Jun 26 20:08:35 UTC 2013 [20:08:46] RECOVERY - Puppet freshness on db1011 is OK: puppet ran at Wed Jun 26 20:08:35 UTC 2013 [20:08:46] RECOVERY - Puppet freshness on lvs1001 is OK: puppet ran at Wed Jun 26 20:08:36 UTC 2013 [20:08:47] 
RECOVERY - Puppet freshness on ms-be10 is OK: puppet ran at Wed Jun 26 20:08:37 UTC 2013 [20:08:47] RECOVERY - Puppet freshness on ms-be1004 is OK: puppet ran at Wed Jun 26 20:08:37 UTC 2013 [20:08:48] RECOVERY - Puppet freshness on cp3006 is OK: puppet ran at Wed Jun 26 20:08:37 UTC 2013 [20:08:48] RECOVERY - Puppet freshness on amssq32 is OK: puppet ran at Wed Jun 26 20:08:38 UTC 2013 [20:08:49] RECOVERY - Puppet freshness on wtp1010 is OK: puppet ran at Wed Jun 26 20:08:38 UTC 2013 [20:08:49] RECOVERY - Puppet freshness on sq82 is OK: puppet ran at Wed Jun 26 20:08:39 UTC 2013 [20:08:50] RECOVERY - Puppet freshness on db73 is OK: puppet ran at Wed Jun 26 20:08:40 UTC 2013 [20:08:50] RECOVERY - Puppet freshness on srv240 is OK: puppet ran at Wed Jun 26 20:08:41 UTC 2013 [20:08:51] RECOVERY - Puppet freshness on srv280 is OK: puppet ran at Wed Jun 26 20:08:41 UTC 2013 [20:08:51] RECOVERY - Puppet freshness on mw1220 is OK: puppet ran at Wed Jun 26 20:08:42 UTC 2013 [20:08:52] RECOVERY - Puppet freshness on srv282 is OK: puppet ran at Wed Jun 26 20:08:43 UTC 2013 [20:08:52] RECOVERY - Puppet freshness on search1004 is OK: puppet ran at Wed Jun 26 20:08:43 UTC 2013 [20:08:53] RECOVERY - Puppet freshness on mw1134 is OK: puppet ran at Wed Jun 26 20:08:43 UTC 2013 [20:08:53] RECOVERY - Puppet freshness on srv270 is OK: puppet ran at Wed Jun 26 20:08:44 UTC 2013 [20:08:54] RECOVERY - Puppet freshness on mw1080 is OK: puppet ran at Wed Jun 26 20:08:44 UTC 2013 [20:08:54] RECOVERY - Puppet freshness on mw1178 is OK: puppet ran at Wed Jun 26 20:08:46 UTC 2013 [20:08:55] RECOVERY - Puppet freshness on mw1031 is OK: puppet ran at Wed Jun 26 20:08:47 UTC 2013 [20:08:55] RECOVERY - Puppet freshness on mw1203 is OK: puppet ran at Wed Jun 26 20:08:47 UTC 2013 [20:08:58] RECOVERY - Puppet freshness on srv259 is OK: puppet ran at Wed Jun 26 20:08:48 UTC 2013 [20:08:58] RECOVERY - Puppet freshness on mw1045 is OK: puppet ran at Wed Jun 26 20:08:48 UTC 2013 [20:08:58] RECOVERY - 
Puppet freshness on mw1139 is OK: puppet ran at Wed Jun 26 20:08:49 UTC 2013 [20:08:58] RECOVERY - Puppet freshness on srv296 is OK: puppet ran at Wed Jun 26 20:08:49 UTC 2013 [20:08:58] RECOVERY - Puppet freshness on mchenry is OK: puppet ran at Wed Jun 26 20:08:49 UTC 2013 [20:08:59] RECOVERY - Puppet freshness on ms-be1002 is OK: puppet ran at Wed Jun 26 20:08:50 UTC 2013 [20:08:59] RECOVERY - Puppet freshness on amssq43 is OK: puppet ran at Wed Jun 26 20:08:50 UTC 2013 [20:09:00] RECOVERY - Puppet freshness on mc1016 is OK: puppet ran at Wed Jun 26 20:08:50 UTC 2013 [20:09:00] RECOVERY - Puppet freshness on terbium is OK: puppet ran at Wed Jun 26 20:08:51 UTC 2013 [20:09:01] RECOVERY - Puppet freshness on pdf2 is OK: puppet ran at Wed Jun 26 20:08:51 UTC 2013 [20:09:01] RECOVERY - Puppet freshness on mw115 is OK: puppet ran at Wed Jun 26 20:08:52 UTC 2013 [20:09:02] RECOVERY - Puppet freshness on mw1215 is OK: puppet ran at Wed Jun 26 20:08:52 UTC 2013 [20:09:02] RECOVERY - Puppet freshness on mw114 is OK: puppet ran at Wed Jun 26 20:08:52 UTC 2013 [20:09:03] RECOVERY - Puppet freshness on analytics1018 is OK: puppet ran at Wed Jun 26 20:08:54 UTC 2013 [20:09:03] RECOVERY - Puppet freshness on mw65 is OK: puppet ran at Wed Jun 26 20:08:54 UTC 2013 [20:09:04] RECOVERY - Puppet freshness on mw1200 is OK: puppet ran at Wed Jun 26 20:08:54 UTC 2013 [20:09:04] RECOVERY - Puppet freshness on mw1062 is OK: puppet ran at Wed Jun 26 20:08:56 UTC 2013 [20:09:05] RECOVERY - Puppet freshness on linne is OK: puppet ran at Wed Jun 26 20:08:56 UTC 2013 [20:09:08] RECOVERY - Puppet freshness on db36 is OK: puppet ran at Wed Jun 26 20:08:57 UTC 2013 [20:09:08] RECOVERY - Puppet freshness on mw53 is OK: puppet ran at Wed Jun 26 20:08:58 UTC 2013 [20:09:08] RECOVERY - Puppet freshness on stat1001 is OK: puppet ran at Wed Jun 26 20:08:58 UTC 2013 [20:09:08] RECOVERY - Puppet freshness on cp1054 is OK: puppet ran at Wed Jun 26 20:08:59 UTC 2013 [20:09:08] RECOVERY - Puppet 
freshness on cp1068 is OK: puppet ran at Wed Jun 26 20:08:59 UTC 2013
[20:09:09] RECOVERY - Puppet freshness on db1045 is OK: puppet ran at Wed Jun 26 20:09:02 UTC 2013
[20:09:09] RECOVERY - Puppet freshness on db56 is OK: puppet ran at Wed Jun 26 20:09:02 UTC 2013
[20:09:10] RECOVERY - Puppet freshness on mc1006 is OK: puppet ran at Wed Jun 26 20:09:03 UTC 2013
[20:09:10] RECOVERY - Puppet freshness on db44 is OK: puppet ran at Wed Jun 26 20:09:03 UTC 2013
[20:09:11] RECOVERY - Puppet freshness on cp1021 is OK: puppet ran at Wed Jun 26 20:09:04 UTC 2013
[20:09:11] RECOVERY - Puppet freshness on es1008 is OK: puppet ran at Wed Jun 26 20:09:04 UTC 2013
[20:09:12] RECOVERY - Puppet freshness on db1057 is OK: puppet ran at Wed Jun 26 20:09:06 UTC 2013
[20:09:12] RECOVERY - Puppet freshness on cp1039 is OK: puppet ran at Wed Jun 26 20:09:07 UTC 2013
[20:09:13] RECOVERY - Puppet freshness on wtp1006 is OK: puppet ran at Wed Jun 26 20:09:07 UTC 2013
[20:09:18] RECOVERY - Puppet freshness on db1056 is OK: puppet ran at Wed Jun 26 20:09:08 UTC 2013
[20:09:18] RECOVERY - Puppet freshness on mw1059 is OK: puppet ran at Wed Jun 26 20:09:08 UTC 2013
[20:09:18] RECOVERY - Puppet freshness on analytics1017 is OK: puppet ran at Wed Jun 26 20:09:08 UTC 2013
[20:09:18] RECOVERY - Puppet freshness on sq59 is OK: puppet ran at Wed Jun 26 20:09:09 UTC 2013
[20:09:18] RECOVERY - Puppet freshness on mw1086 is OK: puppet ran at Wed Jun 26 20:09:09 UTC 2013
[20:09:19] RECOVERY - Puppet freshness on srv271 is OK: puppet ran at Wed Jun 26 20:09:09 UTC 2013
[20:09:19] RECOVERY - Puppet freshness on lvs2 is OK: puppet ran at Wed Jun 26 20:09:09 UTC 2013
[20:09:20] RECOVERY - Puppet freshness on mw91 is OK: puppet ran at Wed Jun 26 20:09:10 UTC 2013
[20:09:20] RECOVERY - Puppet freshness on mw1145 is OK: puppet ran at Wed Jun 26 20:09:11 UTC 2013
[20:09:21] RECOVERY - Puppet freshness on mw15 is OK: puppet ran at Wed Jun 26 20:09:11 UTC 2013
[20:09:21] RECOVERY - Puppet freshness on mw52 is OK: puppet ran at Wed Jun 26 20:09:11 UTC 2013
[20:09:22] RECOVERY - Puppet freshness on amssq50 is OK: puppet ran at Wed Jun 26 20:09:11 UTC 2013
[20:09:22] RECOVERY - Puppet freshness on mw113 is OK: puppet ran at Wed Jun 26 20:09:13 UTC 2013
[20:09:23] RECOVERY - Puppet freshness on sq37 is OK: puppet ran at Wed Jun 26 20:09:13 UTC 2013
[20:09:23] RECOVERY - Puppet freshness on mw1141 is OK: puppet ran at Wed Jun 26 20:09:14 UTC 2013
[20:09:24] RECOVERY - Puppet freshness on analytics1020 is OK: puppet ran at Wed Jun 26 20:09:14 UTC 2013
[20:09:24] RECOVERY - Puppet freshness on srv268 is OK: puppet ran at Wed Jun 26 20:09:16 UTC 2013
[20:09:25] RECOVERY - Puppet freshness on sq49 is OK: puppet ran at Wed Jun 26 20:09:17 UTC 2013
[20:09:28] RECOVERY - Puppet freshness on sq77 is OK: puppet ran at Wed Jun 26 20:09:17 UTC 2013
[20:09:28] RECOVERY - Puppet freshness on emery is OK: puppet ran at Wed Jun 26 20:09:20 UTC 2013
[20:09:28] RECOVERY - Puppet freshness on mw1158 is OK: puppet ran at Wed Jun 26 20:09:19 UTC 2013
[20:09:28] RECOVERY - Puppet freshness on mw1 is OK: puppet ran at Wed Jun 26 20:09:21 UTC 2013
[20:09:28] RECOVERY - Puppet freshness on mw1174 is OK: puppet ran at Wed Jun 26 20:09:21 UTC 2013
[20:09:29] RECOVERY - Puppet freshness on ms1004 is OK: puppet ran at Wed Jun 26 20:09:22 UTC 2013
[20:09:29] RECOVERY - Puppet freshness on ssl3001 is OK: puppet ran at Wed Jun 26 20:09:22 UTC 2013
[20:09:30] RECOVERY - Puppet freshness on mw1082 is OK: puppet ran at Wed Jun 26 20:09:22 UTC 2013
[20:09:30] RECOVERY - Puppet freshness on mw1026 is OK: puppet ran at Wed Jun 26 20:09:23 UTC 2013
[20:09:31] RECOVERY - Puppet freshness on cp1001 is OK: puppet ran at Wed Jun 26 20:09:24 UTC 2013
[20:09:31] RECOVERY - Puppet freshness on mw59 is OK: puppet ran at Wed Jun 26 20:09:24 UTC 2013
[20:09:32] RECOVERY - Puppet freshness on cp1053 is OK: puppet ran at Wed Jun 26 20:09:24 UTC 2013
[20:09:32] RECOVERY - Puppet freshness on mw1110 is OK: puppet ran at Wed Jun 26 20:09:25 UTC 2013
[20:09:33] RECOVERY - Puppet freshness on es2 is OK: puppet ran at Wed Jun 26 20:09:25 UTC 2013
[20:09:33] RECOVERY - Puppet freshness on arsenic is OK: puppet ran at Wed Jun 26 20:09:25 UTC 2013
[20:09:34] RECOVERY - Puppet freshness on ms-be1003 is OK: puppet ran at Wed Jun 26 20:09:26 UTC 2013
[20:09:34] RECOVERY - Puppet freshness on chromium is OK: puppet ran at Wed Jun 26 20:09:26 UTC 2013
[20:09:35] RECOVERY - Puppet freshness on mc13 is OK: puppet ran at Wed Jun 26 20:09:27 UTC 2013
[20:09:35] RECOVERY - Puppet freshness on mc1002 is OK: puppet ran at Wed Jun 26 20:09:27 UTC 2013
[20:09:38] RECOVERY - Puppet freshness on cp3021 is OK: puppet ran at Wed Jun 26 20:09:27 UTC 2013
[20:09:38] RECOVERY - Puppet freshness on ms-be1010 is OK: puppet ran at Wed Jun 26 20:09:28 UTC 2013
[20:09:38] RECOVERY - Puppet freshness on virt8 is OK: puppet ran at Wed Jun 26 20:09:28 UTC 2013
[20:09:38] RECOVERY - Puppet freshness on virt1005 is OK: puppet ran at Wed Jun 26 20:09:29 UTC 2013
[20:09:38] RECOVERY - Puppet freshness on wtp1008 is OK: puppet ran at Wed Jun 26 20:09:29 UTC 2013
[20:09:39] RECOVERY - Puppet freshness on wtp1020 is OK: puppet ran at Wed Jun 26 20:09:29 UTC 2013
[20:09:39] RECOVERY - Puppet freshness on wtp1016 is OK: puppet ran at Wed Jun 26 20:09:31 UTC 2013
[20:09:40] RECOVERY - Puppet freshness on harmon is OK: puppet ran at Wed Jun 26 20:09:32 UTC 2013
[20:09:40] RECOVERY - Puppet freshness on ssl1002 is OK: puppet ran at Wed Jun 26 20:09:32 UTC 2013
[20:09:41] RECOVERY - Puppet freshness on tmh1 is OK: puppet ran at Wed Jun 26 20:09:33 UTC 2013
[20:09:41] RECOVERY - Puppet freshness on mw3 is OK: puppet ran at Wed Jun 26 20:09:34 UTC 2013
[20:09:42] RECOVERY - Puppet freshness on srv293 is OK: puppet ran at Wed Jun 26 20:09:34 UTC 2013
[20:09:42] RECOVERY - Puppet freshness on mw92 is OK: puppet ran at Wed Jun 26 20:09:34 UTC 2013
[20:09:43] RECOVERY - Puppet freshness on cp3019 is OK: puppet ran at Wed Jun 26 20:09:37 UTC 2013
[20:09:48] RECOVERY - Puppet freshness on search23 is OK: puppet ran at Wed Jun 26 20:09:37 UTC 2013
[20:09:48] RECOVERY - Puppet freshness on lvs4 is OK: puppet ran at Wed Jun 26 20:09:38 UTC 2013
[20:09:48] RECOVERY - Puppet freshness on mw1108 is OK: puppet ran at Wed Jun 26 20:09:38 UTC 2013
[20:09:48] RECOVERY - Puppet freshness on mw1060 is OK: puppet ran at Wed Jun 26 20:09:40 UTC 2013
[20:09:48] RECOVERY - Puppet freshness on srv251 is OK: puppet ran at Wed Jun 26 20:09:42 UTC 2013
[20:09:49] RECOVERY - Puppet freshness on srv285 is OK: puppet ran at Wed Jun 26 20:09:43 UTC 2013
[20:09:49] RECOVERY - Puppet freshness on mw7 is OK: puppet ran at Wed Jun 26 20:09:43 UTC 2013
[20:09:50] RECOVERY - Puppet freshness on srv243 is OK: puppet ran at Wed Jun 26 20:09:44 UTC 2013
[20:09:50] RECOVERY - Puppet freshness on mw1148 is OK: puppet ran at Wed Jun 26 20:09:45 UTC 2013
[20:09:51] RECOVERY - Puppet freshness on mw1090 is OK: puppet ran at Wed Jun 26 20:09:47 UTC 2013
[20:09:58] RECOVERY - Puppet freshness on mw1073 is OK: puppet ran at Wed Jun 26 20:09:48 UTC 2013
[20:09:58] RECOVERY - Puppet freshness on mw1121 is OK: puppet ran at Wed Jun 26 20:09:48 UTC 2013
[20:09:58] RECOVERY - Puppet freshness on mw1199 is OK: puppet ran at Wed Jun 26 20:09:49 UTC 2013
[20:09:58] RECOVERY - Puppet freshness on mw1112 is OK: puppet ran at Wed Jun 26 20:09:49 UTC 2013
[20:09:58] RECOVERY - Puppet freshness on mw1120 is OK: puppet ran at Wed Jun 26 20:09:49 UTC 2013
[20:09:59] RECOVERY - Puppet freshness on mw1009 is OK: puppet ran at Wed Jun 26 20:09:50 UTC 2013
[20:09:59] RECOVERY - Puppet freshness on srv298 is OK: puppet ran at Wed Jun 26 20:09:50 UTC 2013
[20:10:00] RECOVERY - Puppet freshness on analytics1021 is OK: puppet ran at Wed Jun 26 20:09:50 UTC 2013
[20:10:00] RECOVERY - Puppet freshness on sq66 is OK: puppet ran at Wed Jun 26 20:09:51 UTC 2013
[20:10:01] RECOVERY - Puppet freshness on mw1135 is OK: puppet ran at Wed Jun 26 20:09:51 UTC 2013
[20:10:01] RECOVERY - Puppet freshness on mw1008 is OK: puppet ran at Wed Jun 26 20:09:52 UTC 2013
[20:10:02] RECOVERY - Puppet freshness on sq75 is OK: puppet ran at Wed Jun 26 20:09:52 UTC 2013
[20:10:02] RECOVERY - Puppet freshness on hydrogen is OK: puppet ran at Wed Jun 26 20:09:53 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on nitrogen is OK: puppet ran at Wed Jun 26 20:09:53 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on cp1009 is OK: puppet ran at Wed Jun 26 20:09:54 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on db1046 is OK: puppet ran at Wed Jun 26 20:09:54 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on cp1008 is OK: puppet ran at Wed Jun 26 20:09:55 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on sq36 is OK: puppet ran at Wed Jun 26 20:09:57 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on db1018 is OK: puppet ran at Wed Jun 26 20:09:58 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on db1015 is OK: puppet ran at Wed Jun 26 20:09:59 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on cp1064 is OK: puppet ran at Wed Jun 26 20:09:59 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on db64 is OK: puppet ran at Wed Jun 26 20:10:00 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on ms-fe1004 is OK: puppet ran at Wed Jun 26 20:10:01 UTC 2013
[20:10:09] RECOVERY - Puppet freshness on lvs3 is OK: puppet ran at Wed Jun 26 20:10:01 UTC 2013
[20:10:10] RECOVERY - Puppet freshness on db1049 is OK: puppet ran at Wed Jun 26 20:10:02 UTC 2013
[20:10:10] RECOVERY - Puppet freshness on snapshot3 is OK: puppet ran at Wed Jun 26 20:10:02 UTC 2013
[20:10:11] RECOVERY - Puppet freshness on cp1042 is OK: puppet ran at Wed Jun 26 20:10:03 UTC 2013
[20:10:11] RECOVERY - Puppet freshness on holmium is OK: puppet ran at Wed Jun 26 20:10:03 UTC 2013
[20:10:12] RECOVERY - Puppet freshness on mw48 is OK: puppet ran at Wed Jun 26 20:10:04 UTC 2013
[20:10:12] RECOVERY - Puppet freshness on srv257 is OK: puppet ran at Wed Jun 26 20:10:04 UTC 2013
[20:10:13] RECOVERY - Puppet freshness on srv274 is OK: puppet ran at Wed Jun 26 20:10:06 UTC 2013
[20:10:13] RECOVERY - Puppet freshness on srv235 is OK: puppet ran at Wed Jun 26 20:10:07 UTC 2013
[20:10:14] RECOVERY - Puppet freshness on mw56 is OK: puppet ran at Wed Jun 26 20:10:07 UTC 2013
[20:10:18] RECOVERY - Puppet freshness on mw1170 is OK: puppet ran at Wed Jun 26 20:10:07 UTC 2013
[20:10:18] RECOVERY - Puppet freshness on mw1061 is OK: puppet ran at Wed Jun 26 20:10:08 UTC 2013
[20:10:18] RECOVERY - Puppet freshness on mw1119 is OK: puppet ran at Wed Jun 26 20:10:08 UTC 2013
[20:10:18] RECOVERY - Puppet freshness on mw1070 is OK: puppet ran at Wed Jun 26 20:10:09 UTC 2013
[20:10:18] RECOVERY - Puppet freshness on mw1058 is OK: puppet ran at Wed Jun 26 20:10:09 UTC 2013
[20:10:19] RECOVERY - Puppet freshness on mw1157 is OK: puppet ran at Wed Jun 26 20:10:10 UTC 2013
[20:10:19] RECOVERY - Puppet freshness on mw1102 is OK: puppet ran at Wed Jun 26 20:10:10 UTC 2013
[20:10:20] RECOVERY - Puppet freshness on mw1085 is OK: puppet ran at Wed Jun 26 20:10:10 UTC 2013
[20:10:21] RECOVERY - Puppet freshness on ms-be1005 is OK: puppet ran at Wed Jun 26 20:10:11 UTC 2013
[20:10:22] RECOVERY - Puppet freshness on mc5 is OK: puppet ran at Wed Jun 26 20:10:11 UTC 2013
[20:10:22] RECOVERY - Puppet freshness on amssq36 is OK: puppet ran at Wed Jun 26 20:10:11 UTC 2013
[20:10:23] RECOVERY - Puppet freshness on db68 is OK: puppet ran at Wed Jun 26 20:10:13 UTC 2013
[20:10:23] RECOVERY - Puppet freshness on virt0 is OK: puppet ran at Wed Jun 26 20:10:13 UTC 2013
[20:10:24] RECOVERY - Puppet freshness on db1028 is OK: puppet ran at Wed Jun 26 20:10:14 UTC 2013
[20:10:24] RECOVERY - Puppet freshness on db38 is OK: puppet ran at Wed Jun 26 20:10:14 UTC 2013
[20:10:25] RECOVERY - Puppet freshness on wtp1014 is OK: puppet ran at Wed Jun 26 20:10:15 UTC 2013
[20:10:25] RECOVERY - Puppet freshness on mw1019 is OK: puppet ran at Wed Jun 26 20:10:15 UTC 2013
[20:10:26] RECOVERY - Puppet freshness on amssq41 is OK: puppet ran at Wed Jun 26 20:10:15 UTC 2013
[20:10:26] RECOVERY - Puppet freshness on mw1093 is OK: puppet ran at Wed Jun 26 20:10:16 UTC 2013
[20:10:27] RECOVERY - Puppet freshness on analytics1024 is OK: puppet ran at Wed Jun 26 20:10:16 UTC 2013
[20:10:27] RECOVERY - Puppet freshness on analytics1006 is OK: puppet ran at Wed Jun 26 20:10:17 UTC 2013
[20:10:28] RECOVERY - Puppet freshness on mw96 is OK: puppet ran at Wed Jun 26 20:10:17 UTC 2013
[20:10:28] RECOVERY - Puppet freshness on mw120 is OK: puppet ran at Wed Jun 26 20:10:17 UTC 2013
[20:10:29] RECOVERY - Puppet freshness on srv239 is OK: puppet ran at Wed Jun 26 20:10:18 UTC 2013
[20:10:29] RECOVERY - Puppet freshness on srv252 is OK: puppet ran at Wed Jun 26 20:10:18 UTC 2013
[20:10:30] RECOVERY - Puppet freshness on srv247 is OK: puppet ran at Wed Jun 26 20:10:19 UTC 2013
[20:10:30] RECOVERY - Puppet freshness on search20 is OK: puppet ran at Wed Jun 26 20:10:20 UTC 2013
[20:10:31] RECOVERY - Puppet freshness on ms-be7 is OK: puppet ran at Wed Jun 26 20:10:20 UTC 2013
[20:10:31] RECOVERY - Puppet freshness on mw1177 is OK: puppet ran at Wed Jun 26 20:10:22 UTC 2013
[20:10:32] RECOVERY - Puppet freshness on mw1172 is OK: puppet ran at Wed Jun 26 20:10:24 UTC 2013
[20:10:32] RECOVERY - Puppet freshness on search1009 is OK: puppet ran at Wed Jun 26 20:10:24 UTC 2013
[20:10:33] RECOVERY - Puppet freshness on mw67 is OK: puppet ran at Wed Jun 26 20:10:25 UTC 2013
[20:10:33] RECOVERY - Puppet freshness on mw78 is OK: puppet ran at Wed Jun 26 20:10:25 UTC 2013
[20:10:34] RECOVERY - Puppet freshness on mw1191 is OK: puppet ran at Wed Jun 26 20:10:27 UTC 2013
[20:10:38] RECOVERY - Puppet freshness on search1013 is OK: puppet ran at Wed Jun 26 20:10:27 UTC 2013
[20:10:38] RECOVERY - Puppet freshness on mw1039 is OK: puppet ran at Wed Jun 26 20:10:28 UTC 2013
[20:10:38] RECOVERY - Puppet freshness on mw1184 is OK: puppet ran at Wed Jun 26 20:10:28 UTC 2013
[20:10:38] RECOVERY - Puppet freshness on sq57 is OK: puppet ran at Wed Jun 26 20:10:29 UTC 2013
[20:10:38] RECOVERY - Puppet freshness on sq61 is OK: puppet ran at Wed Jun 26 20:10:29 UTC 2013
[20:10:39] RECOVERY - Puppet freshness on es7 is OK: puppet ran at Wed Jun 26 20:10:29 UTC 2013
[20:10:39] RECOVERY - Puppet freshness on ms-fe4 is OK: puppet ran at Wed Jun 26 20:10:30 UTC 2013
[20:10:40] RECOVERY - Puppet freshness on hooper is OK: puppet ran at Wed Jun 26 20:10:30 UTC 2013
[20:10:40] RECOVERY - Puppet freshness on cp1019 is OK: puppet ran at Wed Jun 26 20:10:31 UTC 2013
[20:10:41] RECOVERY - Puppet freshness on wtp1005 is OK: puppet ran at Wed Jun 26 20:10:33 UTC 2013
[20:10:41] RECOVERY - Puppet freshness on amssq58 is OK: puppet ran at Wed Jun 26 20:10:33 UTC 2013
[20:10:42] RECOVERY - Puppet freshness on db37 is OK: puppet ran at Wed Jun 26 20:10:33 UTC 2013
[20:10:42] RECOVERY - Puppet freshness on db1041 is OK: puppet ran at Wed Jun 26 20:10:34 UTC 2013
[20:10:43] RECOVERY - Puppet freshness on nfs2 is OK: puppet ran at Wed Jun 26 20:10:34 UTC 2013
[20:10:43] RECOVERY - Puppet freshness on gallium is OK: puppet ran at Wed Jun 26 20:10:35 UTC 2013
[20:10:44] RECOVERY - Puppet freshness on db1003 is OK: puppet ran at Wed Jun 26 20:10:36 UTC 2013
[20:10:44] RECOVERY - Puppet freshness on search35 is OK: puppet ran at Wed Jun 26 20:10:36 UTC 2013
[20:10:45] RECOVERY - Puppet freshness on mw84 is OK: puppet ran at Wed Jun 26 20:10:36 UTC 2013
[20:10:45] RECOVERY - Puppet freshness on mw17 is OK: puppet ran at Wed Jun 26 20:10:37 UTC 2013
[20:10:46] RECOVERY - Puppet freshness on mw71 is OK: puppet ran at Wed Jun 26 20:10:37 UTC 2013
[20:10:48] RECOVERY - Puppet freshness on srv254 is OK: puppet ran at Wed Jun 26 20:10:38 UTC 2013
[20:10:48] RECOVERY - Puppet freshness on mw69 is OK: puppet ran at Wed Jun 26 20:10:38 UTC 2013
[20:10:48] RECOVERY - Puppet freshness on cp1035 is OK: puppet ran at Wed Jun 26 20:10:38 UTC 2013
[20:10:48] RECOVERY - Puppet freshness on search26 is OK: puppet ran at Wed Jun 26 20:10:39 UTC 2013
[20:10:48] RECOVERY - Puppet freshness on mw25 is OK: puppet ran at Wed Jun 26 20:10:41 UTC 2013
[20:10:49] RECOVERY - Puppet freshness on db33 is OK: puppet ran at Wed Jun 26 20:10:42 UTC 2013
[20:10:49] RECOVERY - Puppet freshness on mw1055 is OK: puppet ran at Wed Jun 26 20:10:42 UTC 2013
[20:10:50] RECOVERY - Puppet freshness on mw1195 is OK: puppet ran at Wed Jun 26 20:10:43 UTC 2013
[20:10:50] RECOVERY - Puppet freshness on es1007 is OK: puppet ran at Wed Jun 26 20:10:43 UTC 2013
[20:10:51] RECOVERY - Puppet freshness on mw1013 is OK: puppet ran at Wed Jun 26 20:10:44 UTC 2013
[20:10:51] RECOVERY - Puppet freshness on mw1076 is OK: puppet ran at Wed Jun 26 20:10:44 UTC 2013
[20:10:52] RECOVERY - Puppet freshness on wtp1012 is OK: puppet ran at Wed Jun 26 20:10:45 UTC 2013
[20:10:52] RECOVERY - Puppet freshness on sq53 is OK: puppet ran at Wed Jun 26 20:10:47 UTC 2013
[20:10:53] RECOVERY - Puppet freshness on mw63 is OK: puppet ran at Wed Jun 26 20:10:47 UTC 2013
[20:10:58] RECOVERY - Puppet freshness on db1043 is OK: puppet ran at Wed Jun 26 20:10:48 UTC 2013
[20:10:58] RECOVERY - Puppet freshness on mw1167 is OK: puppet ran at Wed Jun 26 20:10:48 UTC 2013
[20:10:58] RECOVERY - Puppet freshness on analytics1016 is OK: puppet ran at Wed Jun 26 20:10:49 UTC 2013
[20:10:58] RECOVERY - Puppet freshness on snapshot1001 is OK: puppet ran at Wed Jun 26 20:10:49 UTC 2013
[20:10:58] RECOVERY - Puppet freshness on mw1035 is OK: puppet ran at Wed Jun 26 20:10:49 UTC 2013
[20:10:58] RECOVERY - Puppet freshness on es3 is OK: puppet ran at Wed Jun 26 20:10:50 UTC 2013
[20:10:59] RECOVERY - Puppet freshness on nfs1 is OK: puppet ran at Wed Jun 26 20:10:50 UTC 2013
[20:10:59] RECOVERY - Puppet freshness on silver is OK: puppet ran at Wed Jun 26 20:10:51 UTC 2013
[20:11:00] RECOVERY - Puppet freshness on mc1010 is OK: puppet ran at Wed Jun 26 20:10:51 UTC 2013
[20:11:00] RECOVERY - Puppet freshness on sq56 is OK: puppet ran at Wed Jun 26 20:10:51 UTC 2013
[20:11:01] RECOVERY - Puppet freshness on es1006 is OK: puppet ran at Wed Jun 26 20:10:52 UTC 2013
[20:11:01] RECOVERY - Puppet freshness on virt10 is OK: puppet ran at Wed Jun 26 20:10:52 UTC 2013
[20:11:02] RECOVERY - Puppet freshness on ssl3003 is OK: puppet ran at Wed Jun 26 20:10:53 UTC 2013
[20:11:02] RECOVERY - Puppet freshness on sq69 is OK: puppet ran at Wed Jun 26 20:10:53 UTC 2013
[20:11:03] RECOVERY - Puppet freshness on analytics1026 is OK: puppet ran at Wed Jun 26 20:10:54 UTC 2013
[20:11:03] RECOVERY - Puppet freshness on db1007 is OK: puppet ran at Wed Jun 26 20:10:55 UTC 2013
[20:11:04] RECOVERY - Puppet freshness on db1039 is OK: puppet ran at Wed Jun 26 20:10:55 UTC 2013
[20:11:04] RECOVERY - Puppet freshness on amslvs3 is OK: puppet ran at Wed Jun 26 20:10:55 UTC 2013
[20:11:05] RECOVERY - Puppet freshness on mw1216 is OK: puppet ran at Wed Jun 26 20:10:56 UTC 2013
[20:11:05] RECOVERY - Puppet freshness on sq44 is OK: puppet ran at Wed Jun 26 20:10:56 UTC 2013
[20:11:06] RECOVERY - Puppet freshness on search1002 is OK: puppet ran at Wed Jun 26 20:10:57 UTC 2013
[20:11:06] RECOVERY - Puppet freshness on mc1014 is OK: puppet ran at Wed Jun 26 20:10:57 UTC 2013
[20:11:08] RECOVERY - Puppet freshness on mw1149 is OK: puppet ran at Wed Jun 26 20:10:57 UTC 2013
[20:11:08] RECOVERY - Puppet freshness on mw32 is OK: puppet ran at Wed Jun 26 20:10:58 UTC 2013
[20:11:08] RECOVERY - Puppet freshness on mw1151 is OK: puppet ran at Wed Jun 26 20:10:58 UTC 2013
[20:11:08] RECOVERY - Puppet freshness on mw1202 is OK: puppet ran at Wed Jun 26 20:10:59 UTC 2013
[20:11:09] RECOVERY - Puppet freshness on mw1098 is OK: puppet ran at Wed Jun 26 20:10:59 UTC 2013
[20:11:09] RECOVERY - Puppet freshness on analytics1013 is OK: puppet ran at Wed Jun 26 20:11:00 UTC 2013
[20:11:10] RECOVERY - Puppet freshness on mw1133 is OK: puppet ran at Wed Jun 26 20:11:01 UTC 2013
[20:11:10] RECOVERY - Puppet freshness on cp3022 is OK: puppet ran at Wed Jun 26 20:11:01 UTC 2013
[20:11:11] RECOVERY - Puppet freshness on ersch is OK: puppet ran at Wed Jun 26 20:11:01 UTC 2013
[20:11:11] RECOVERY - Puppet freshness on mw86 is OK: puppet ran at Wed Jun 26 20:11:02 UTC 2013
[20:11:12] RECOVERY - Puppet freshness on mw1079 is OK: puppet ran at Wed Jun 26 20:11:02 UTC 2013
[20:11:12] RECOVERY - Puppet freshness on mw1051 is OK: puppet ran at Wed Jun 26 20:11:03 UTC 2013
[20:11:13] RECOVERY - Puppet freshness on cp1044 is OK: puppet ran at Wed Jun 26 20:11:03 UTC 2013
[20:11:13] RECOVERY - Puppet freshness on mw1109 is OK: puppet ran at Wed Jun 26 20:11:03 UTC 2013
[20:11:14] RECOVERY - Puppet freshness on lvs1003 is OK: puppet ran at Wed Jun 26 20:11:04 UTC 2013
[20:11:14] RECOVERY - Puppet freshness on mw88 is OK: puppet ran at Wed Jun 26 20:11:04 UTC 2013
[20:11:15] RECOVERY - Puppet freshness on mw18 is OK: puppet ran at Wed Jun 26 20:11:05 UTC 2013
[20:11:15] RECOVERY - Puppet freshness on mw1168 is OK: puppet ran at Wed Jun 26 20:11:05 UTC 2013
[20:11:16] RECOVERY - Puppet freshness on cp1011 is OK: puppet ran at Wed Jun 26 20:11:05 UTC 2013
[20:11:18] RECOVERY - Puppet freshness on mw1192 is OK: puppet ran at Wed Jun 26 20:11:08 UTC 2013
[20:11:18] RECOVERY - Puppet freshness on rdb1001 is OK: puppet ran at Wed Jun 26 20:11:09 UTC 2013
[20:11:18] RECOVERY - Puppet freshness on db49 is OK: puppet ran at Wed Jun 26 20:11:09 UTC 2013
[20:11:18] RECOVERY - Puppet freshness on search24 is OK: puppet ran at Wed Jun 26 20:11:12 UTC 2013
[20:11:18] RECOVERY - Puppet freshness on ms-be1008 is OK: puppet ran at Wed Jun 26 20:11:12 UTC 2013
[20:11:19] RECOVERY - Puppet freshness on db72 is OK: puppet ran at Wed Jun 26 20:11:13 UTC 2013
[20:11:19] RECOVERY - Puppet freshness on formey is OK: puppet ran at Wed Jun 26 20:11:13 UTC 2013
[20:11:20] RECOVERY - Puppet freshness on db74 is OK: puppet ran at Wed Jun 26 20:11:14 UTC 2013
[20:11:20] RECOVERY - Puppet freshness on virt1007 is OK: puppet ran at Wed Jun 26 20:11:14 UTC 2013
[20:11:21] RECOVERY - Puppet freshness on iron is OK: puppet ran at Wed Jun 26 20:11:15 UTC 2013
[20:11:28] RECOVERY - Puppet freshness on mw1014 is OK: puppet ran at Wed Jun 26 20:11:17 UTC 2013
[20:11:28] RECOVERY - Puppet freshness on sq70 is OK: puppet ran at Wed Jun 26 20:11:18 UTC 2013
[20:11:28] RECOVERY - Puppet freshness on virt7 is OK: puppet ran at Wed Jun 26 20:11:18 UTC 2013
[20:11:28] RECOVERY - Puppet freshness on oxygen is OK: puppet ran at Wed Jun 26 20:11:19 UTC 2013
[20:11:28] RECOVERY - Puppet freshness on db1016 is OK: puppet ran at Wed Jun 26 20:11:19 UTC 2013
[20:11:29] RECOVERY - Puppet freshness on ssl3002 is OK: puppet ran at Wed Jun 26 20:11:20 UTC 2013
[20:11:29] RECOVERY - Puppet freshness on db1030 is OK: puppet ran at Wed Jun 26 20:11:22 UTC 2013
[20:11:30] RECOVERY - Puppet freshness on cp1041 is OK: puppet ran at Wed Jun 26 20:11:23 UTC 2013
[20:11:30] RECOVERY - Puppet freshness on mw4 is OK: puppet ran at Wed Jun 26 20:11:23 UTC 2013
[20:11:31] RECOVERY - Puppet freshness on mw123 is OK: puppet ran at Wed Jun 26 20:11:24 UTC 2013
[20:11:31] RECOVERY - Puppet freshness on srv263 is OK: puppet ran at Wed Jun 26 20:11:24 UTC 2013
[20:11:32] RECOVERY - Puppet freshness on hooft is OK: puppet ran at Wed Jun 26 20:11:25 UTC 2013
[20:11:32] RECOVERY - Puppet freshness on mw97 is OK: puppet ran at Wed Jun 26 20:11:25 UTC 2013
[20:11:33] RECOVERY - Puppet freshness on search32 is OK: puppet ran at Wed Jun 26 20:11:25 UTC 2013
[20:11:33] RECOVERY - Puppet freshness on mw1183 is OK: puppet ran at Wed Jun 26 20:11:26 UTC 2013
[20:11:34] RECOVERY - Puppet freshness on mw72 is OK: puppet ran at Wed Jun 26 20:11:26 UTC 2013
[20:11:34] RECOVERY - Puppet freshness on db1053 is OK: puppet ran at Wed Jun 26 20:11:27 UTC 2013
[20:11:35] RECOVERY - Puppet freshness on mw85 is OK: puppet ran at Wed Jun 26 20:11:27 UTC 2013
[20:11:38] RECOVERY - Puppet freshness on cp1027 is OK: puppet ran at Wed Jun 26 20:11:28 UTC 2013
[20:11:38] RECOVERY - Puppet freshness on db78 is OK: puppet ran at Wed Jun 26 20:11:28 UTC 2013
[20:11:38] RECOVERY - Puppet freshness on db1059 is OK: puppet ran at Wed Jun 26 20:11:28 UTC 2013
[20:11:38] RECOVERY - Puppet freshness on ms1002 is OK: puppet ran at Wed Jun 26 20:11:29 UTC 2013
[20:11:38] RECOVERY - Puppet freshness on amssq62 is OK: puppet ran at Wed Jun 26 20:11:29 UTC 2013
[20:11:39] RECOVERY - Puppet freshness on cp1055 is OK: puppet ran at Wed Jun 26 20:11:29 UTC 2013
[20:11:39] RECOVERY - Puppet freshness on amssq46 is OK: puppet ran at Wed Jun 26 20:11:30 UTC 2013
[20:11:40] RECOVERY - Puppet freshness on mw1034 is OK: puppet ran at Wed Jun 26 20:11:30 UTC 2013
[20:11:40] RECOVERY - Puppet freshness on es8 is OK: puppet ran at Wed Jun 26 20:11:35 UTC 2013
[20:11:41] RECOVERY - Puppet freshness on mw1156 is OK: puppet ran at Wed Jun 26 20:11:35 UTC 2013
[20:11:48] RECOVERY - Puppet freshness on wtp1022 is OK: puppet ran at Wed Jun 26 20:11:37 UTC 2013
[20:11:48] RECOVERY - Puppet freshness on wtp1023 is OK: puppet ran at Wed Jun 26 20:11:38 UTC 2013
[20:11:48] RECOVERY - Puppet freshness on srv288 is OK: puppet ran at Wed Jun 26 20:11:39 UTC 2013
[20:11:48] RECOVERY - Puppet freshness on mw1050 is OK: puppet ran at Wed Jun 26 20:11:39 UTC 2013
[20:11:48] RECOVERY - Puppet freshness on ssl3 is OK: puppet ran at Wed Jun 26 20:11:40 UTC 2013
[20:11:49] RECOVERY - Puppet freshness on dysprosium is OK: puppet ran at Wed Jun 26 20:11:41 UTC 2013
[20:11:49] RECOVERY - Puppet freshness on searchidx2 is OK: puppet ran at Wed Jun 26 20:11:41 UTC 2013
[20:11:50] RECOVERY - Puppet freshness on srv279 is OK: puppet ran at Wed Jun 26 20:11:41 UTC 2013
[20:11:50] RECOVERY - Puppet freshness on mw1212 is OK: puppet ran at Wed Jun 26 20:11:42 UTC 2013
[20:11:51] RECOVERY - Puppet freshness on mw1036 is OK: puppet ran at Wed Jun 26 20:11:42 UTC 2013
[20:11:51] RECOVERY - Puppet freshness on snapshot1004 is OK: puppet ran at Wed Jun 26 20:11:43 UTC 2013
[20:11:52] RECOVERY - Puppet freshness on srv241 is OK: puppet ran at Wed Jun 26 20:11:43 UTC 2013
[20:11:52] RECOVERY - Puppet freshness on mw27 is OK: puppet ran at Wed Jun 26 20:11:44 UTC 2013
[20:11:53] RECOVERY - Puppet freshness on mw102 is OK: puppet ran at Wed Jun 26 20:11:44 UTC 2013
[20:11:53] RECOVERY - Puppet freshness on mw1077 is OK: puppet ran at Wed Jun 26 20:11:45 UTC 2013
[20:11:54] RECOVERY - Puppet freshness on mw6 is OK: puppet ran at Wed Jun 26 20:11:45 UTC 2013
[20:11:54] RECOVERY - Puppet freshness on search1016 is OK: puppet ran at Wed Jun 26 20:11:45 UTC 2013
[20:11:55] RECOVERY - Puppet freshness on search1017 is OK: puppet ran at Wed Jun 26 20:11:46 UTC 2013
[20:11:55] RECOVERY - Puppet freshness on mw1097 is OK: puppet ran at Wed Jun 26 20:11:47 UTC 2013
[20:11:56] RECOVERY - Puppet freshness on mw1029 is OK: puppet ran at Wed Jun 26 20:11:47 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on mw1163 is OK: puppet ran at Wed Jun 26 20:11:47 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on mw1096 is OK: puppet ran at Wed Jun 26 20:11:48 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on mw1002 is OK: puppet ran at Wed Jun 26 20:11:48 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on pc3 is OK: puppet ran at Wed Jun 26 20:11:49 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on sq80 is OK: puppet ran at Wed Jun 26 20:11:49 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on cp1052 is OK: puppet ran at Wed Jun 26 20:11:49 UTC 2013
[20:11:59] RECOVERY - Puppet freshness on rdb1004 is OK: puppet ran at Wed Jun 26 20:11:49 UTC 2013
[20:11:59] mark: !
[20:12:00] RECOVERY - Puppet freshness on es1010 is OK: puppet ran at Wed Jun 26 20:11:50 UTC 2013
[20:12:00] RECOVERY - Puppet freshness on virt6 is OK: puppet ran at Wed Jun 26 20:11:50 UTC 2013
[20:12:01] RECOVERY - Puppet freshness on db1008 is OK: puppet ran at Wed Jun 26 20:11:50 UTC 2013
[20:12:01] RECOVERY - Puppet freshness on cp1059 is OK: puppet ran at Wed Jun 26 20:11:50 UTC 2013
[20:12:02] RECOVERY - Puppet freshness on db1038 is OK: puppet ran at Wed Jun 26 20:11:50 UTC 2013
[20:12:02] RECOVERY - Puppet freshness on amssq38 is OK: puppet ran at Wed Jun 26 20:11:52 UTC 2013
[20:12:03] RECOVERY - Puppet freshness on ms-be1011 is OK: puppet ran at Wed Jun 26 20:11:52 UTC 2013
[20:12:03] RECOVERY - Puppet freshness on ms1001 is OK: puppet ran at Wed Jun 26 20:11:54 UTC 2013
[20:12:04] RECOVERY - Puppet freshness on mw1023 is OK: puppet ran at Wed Jun 26 20:11:54 UTC 2013
[20:12:04] RECOVERY - Puppet freshness on ms-be1001 is OK: puppet ran at Wed Jun 26 20:11:54 UTC 2013
[20:12:05] RECOVERY - Puppet freshness on wtp1019 is OK: puppet ran at Wed Jun 26 20:11:55 UTC 2013
[20:12:05] RECOVERY - Puppet freshness on mw44 is OK: puppet ran at Wed Jun 26 20:11:56 UTC 2013
[20:12:06] RECOVERY - Puppet freshness on virt5 is OK: puppet ran at Wed Jun 26 20:11:56 UTC 2013
[20:12:08] RECOVERY - Puppet freshness on cp1033 is OK: puppet ran at Wed Jun 26 20:11:59 UTC 2013
[20:12:08] RECOVERY - Puppet freshness on mw89 is OK: puppet ran at Wed Jun 26 20:12:00 UTC 2013
[20:12:08] RECOVERY - Puppet freshness on ms-be6 is OK: puppet ran at Wed Jun 26 20:12:00 UTC 2013
[20:12:08] RECOVERY - Puppet freshness on srv281 is OK: puppet ran at Wed Jun 26 20:12:01 UTC 2013
[20:12:08] RECOVERY - Puppet freshness on db47 is OK: puppet ran at Wed Jun 26 20:12:01 UTC 2013
[20:12:09] RECOVERY - Puppet freshness on analytics1027 is OK: puppet ran at Wed Jun 26 20:12:03 UTC 2013
[20:12:09] RECOVERY - Puppet freshness on db1006 is OK: puppet ran at Wed Jun 26 20:12:04 UTC 2013
[20:12:10] RECOVERY - Puppet freshness on db50 is OK: puppet ran at Wed Jun 26 20:12:04 UTC 2013
[20:12:10] RECOVERY - Puppet freshness on mw21 is OK: puppet ran at Wed Jun 26 20:12:05 UTC 2013
[20:12:11] RECOVERY - Puppet freshness on mw80 is OK: puppet ran at Wed Jun 26 20:12:05 UTC 2013
[20:12:11] RECOVERY - Puppet freshness on mw1152 is OK: puppet ran at Wed Jun 26 20:12:05 UTC 2013
[20:12:12] RECOVERY - Puppet freshness on pc1001 is OK: puppet ran at Wed Jun 26 20:12:06 UTC 2013
[20:12:12] RECOVERY - Puppet freshness on mw1028 is OK: puppet ran at Wed Jun 26 20:12:06 UTC 2013
[20:12:13] RECOVERY - Puppet freshness on db62 is OK: puppet ran at Wed Jun 26 20:12:07 UTC 2013
[20:12:13] RECOVERY - Puppet freshness on search1020 is OK: puppet ran at Wed Jun 26 20:12:07 UTC 2013
[20:12:18] RECOVERY - Puppet freshness on db35 is OK: puppet ran at Wed Jun 26 20:12:07 UTC 2013
[20:12:18] RECOVERY - Puppet freshness on locke is OK: puppet ran at Wed Jun 26 20:12:08 UTC 2013
[20:12:19] RECOVERY - Puppet freshness on wtp1017 is OK: puppet ran at Wed Jun 26 20:12:08 UTC 2013
[20:12:19] RECOVERY - Puppet freshness on db1047 is OK: puppet ran at Wed Jun 26 20:12:09 UTC 2013
[20:12:19] RECOVERY - Puppet freshness on mw1005 is OK: puppet ran at Wed Jun 26 20:12:09 UTC 2013
[20:12:19] RECOVERY - Puppet freshness on sq60 is OK: puppet ran at Wed Jun 26 20:12:10 UTC 2013
[20:12:19] RECOVERY - Puppet freshness on cp1031 is OK: puppet ran at Wed Jun 26 20:12:11 UTC 2013
[20:12:20] RECOVERY - Puppet freshness on mw1209 is OK: puppet ran at Wed Jun 26 20:12:11 UTC 2013
[20:12:20] RECOVERY - Puppet freshness on mw1067 is OK: puppet ran at Wed Jun 26 20:12:12 UTC 2013
[20:12:21] ^demon: shcpaaaam
[20:12:21] RECOVERY - Puppet freshness on amssq49 is OK: puppet ran at Wed Jun 26 20:12:12 UTC 2013
[20:12:21] RECOVERY - Puppet freshness on mw1016 is OK: puppet ran at Wed Jun 26 20:12:13 UTC 2013
[20:12:22] RECOVERY - Puppet freshness on amssq54 is OK: puppet ran at Wed Jun 26 20:12:13 UTC 2013
[20:12:22] RECOVERY - Puppet freshness on cp1022 is OK: puppet ran at Wed Jun 26 20:12:14 UTC 2013
[20:12:23] RECOVERY - Puppet freshness on mc1004 is OK: puppet ran at Wed Jun 26 20:12:14 UTC 2013
[20:12:23] RECOVERY - Puppet freshness on amssq42 is OK: puppet ran at Wed Jun 26 20:12:15 UTC 2013
[20:12:24] RECOVERY - Puppet freshness on amssq45 is OK: puppet ran at Wed Jun 26 20:12:15 UTC 2013
[20:12:24] RECOVERY - Puppet freshness on cp1014 is OK: puppet ran at Wed Jun 26 20:12:16 UTC 2013
[20:12:28] RECOVERY - Puppet freshness on kaulen is OK: puppet ran at Wed Jun 26 20:12:19 UTC 2013
[20:12:29] RECOVERY - Puppet freshness on es1001 is OK: puppet ran at Wed Jun 26 20:12:19 UTC 2013
[20:12:29] RECOVERY - Puppet freshness on cp1070 is OK: puppet ran at Wed Jun 26 20:12:19 UTC 2013
[20:12:29] RECOVERY - Puppet freshness on nescio is OK: puppet ran at Wed Jun 26 20:12:20 UTC 2013
[20:12:29] RECOVERY - Puppet freshness on cp1004 is OK: puppet ran at Wed Jun 26 20:12:20 UTC 2013
[20:12:29] RECOVERY - Puppet freshness on cp3020 is OK: puppet ran at Wed Jun 26 20:12:21 UTC 2013
[20:12:29] RECOVERY - Puppet freshness on db71 is OK: puppet ran at Wed Jun 26 20:12:21 UTC 2013
[20:12:30] RECOVERY - Puppet freshness on cp3008 is OK: puppet ran at Wed Jun 26 20:12:22 UTC 2013
[20:12:30] RECOVERY - Puppet freshness on cp1032 is OK: puppet ran at Wed Jun 26 20:12:22 UTC 2013
[20:12:31] RECOVERY - Puppet freshness on cp1040 is OK: puppet ran at Wed Jun 26 20:12:22 UTC 2013
[20:12:31] RECOVERY - Puppet freshness on mw11 is OK: puppet ran at Wed Jun 26 20:12:23 UTC 2013
[20:12:32] RECOVERY - Puppet freshness on amssq33 is OK: puppet ran at Wed Jun 26 20:12:23 UTC 2013
[20:12:32] RECOVERY - Puppet freshness on mw22 is OK: puppet ran at Wed Jun 26 20:12:24 UTC 2013
[20:12:33] RECOVERY - Puppet freshness on srv265 is OK: puppet ran at Wed Jun 26 20:12:24 UTC 2013
[20:12:33] RECOVERY - Puppet freshness on mc9 is OK: puppet ran at Wed Jun 26 20:12:25 UTC 2013
[20:12:34] RECOVERY - Puppet freshness on zhen is OK: puppet ran at Wed Jun 26 20:12:26 UTC 2013
[20:12:34] RECOVERY - Puppet freshness on search34 is OK: puppet ran at Wed Jun 26 20:12:26 UTC 2013
[20:12:35] RECOVERY - Puppet freshness on ms-fe1 is OK: puppet ran at Wed Jun 26 20:12:27 UTC 2013
[20:12:38] RECOVERY - Puppet freshness on srv248 is OK: puppet ran at Wed Jun 26 20:12:28 UTC 2013
[20:12:39] RECOVERY - Puppet freshness on cp1067 is OK: puppet ran at Wed Jun 26 20:12:29 UTC 2013
[20:12:39] RECOVERY - Puppet freshness on ssl2 is OK: puppet ran at Wed Jun 26 20:12:29 UTC 2013
[20:12:39] RECOVERY - Puppet freshness on mw1064 is OK: puppet ran at Wed Jun 26 20:12:29 UTC 2013
[20:12:39] RECOVERY - Puppet freshness on yvon is OK: puppet ran at Wed Jun 26 20:12:29 UTC 2013
[20:12:39] RECOVERY - Puppet freshness on srv277 is OK: puppet ran at Wed Jun 26 20:12:30 UTC 2013
[20:12:39] RECOVERY - Puppet freshness on db1035 is OK: puppet ran at Wed Jun 26 20:12:30 UTC 2013
[20:12:40] RECOVERY - Puppet freshness on mw1193 is OK: puppet ran at Wed Jun 26 20:12:31 UTC 2013
[20:12:40] RECOVERY - Puppet freshness on amssq61 is OK: puppet ran at Wed Jun 26 20:12:31 UTC 2013
[20:12:41] RECOVERY - Puppet freshness on amssq39 is OK: puppet ran at Wed Jun 26 20:12:32 UTC 2013
[20:12:41] RECOVERY - Puppet freshness on snapshot2 is OK: puppet ran at Wed Jun 26 20:12:32 UTC 2013
[20:12:42] RECOVERY - Puppet freshness on search15 is OK: puppet ran at Wed Jun 26 20:12:32 UTC 2013
[20:12:42] RECOVERY - Puppet freshness on cp1045 is OK: puppet ran at Wed Jun 26 20:12:32 UTC 2013
[20:12:43] RECOVERY - Puppet freshness on mw1006 is OK: puppet ran at Wed Jun 26 20:12:32 UTC 2013
[20:12:43] RECOVERY - Puppet freshness on wtp1009 is OK: puppet ran at Wed Jun 26 20:12:33 UTC 2013
[20:12:44] RECOVERY - Puppet freshness on tridge is OK: puppet ran at Wed Jun 26 20:12:33 UTC 2013
[20:12:44] RECOVERY - Puppet freshness on db1037 is OK: puppet ran at Wed Jun 26 20:12:34 UTC 2013
[20:12:45] RECOVERY - Puppet freshness on mw1012 is OK: puppet ran at Wed Jun 26 20:12:35 UTC 2013
[20:12:45] RECOVERY - Puppet freshness on xenon is OK: puppet ran at Wed Jun 26 20:12:35 UTC 2013
[20:12:46] RECOVERY - Puppet freshness on mw19 is OK: puppet ran at Wed Jun 26 20:12:37 UTC 2013
[20:12:48] RECOVERY - Puppet freshness on cp1047 is OK: puppet ran at Wed Jun 26 20:12:39 UTC 2013
[20:12:48] RECOVERY - Puppet freshness on mw24 is OK: puppet ran at Wed Jun 26 20:12:39 UTC 2013
[20:12:48] RECOVERY - Puppet freshness on srv236 is OK: puppet ran at Wed Jun 26 20:12:40 UTC 2013
[20:12:48] RECOVERY - Puppet freshness on mw73 is OK: puppet ran at Wed Jun 26 20:12:41 UTC 2013
[20:12:48] RECOVERY - Puppet freshness on strontium is OK: puppet ran at Wed Jun 26 20:12:42 UTC 2013
[20:12:49] RECOVERY - Puppet freshness on srv256 is OK: puppet ran at Wed Jun 26 20:12:42 UTC 2013
[20:12:49] RECOVERY - Puppet freshness on mw77 is OK: puppet ran at Wed Jun 26 20:12:43 UTC 2013
[20:12:50] RECOVERY - Puppet freshness on mw1071 is OK: puppet ran at Wed Jun 26 20:12:43 UTC 2013
[20:12:50] RECOVERY - Puppet freshness on analytics1005 is OK: puppet ran at Wed Jun 26 20:12:44 UTC 2013
[20:12:51] RECOVERY - Puppet freshness on analytics1025 is OK: puppet ran at Wed Jun 26 20:12:44 UTC 2013
[20:12:51] RECOVERY - Puppet freshness on mw1207 is OK: puppet ran at Wed Jun 26 20:12:45 UTC 2013
[20:12:52] RECOVERY - Puppet freshness on mw101 is OK: puppet ran at Wed Jun 26 20:12:45 UTC 2013
[20:12:52] RECOVERY - Puppet freshness on search28 is OK: puppet ran at Wed Jun 26 20:12:46 UTC 2013
[20:12:53] RECOVERY - Puppet freshness on mw94 is OK: puppet ran at Wed Jun 26 20:12:47 UTC 2013
[20:12:58] RECOVERY - Puppet freshness on mw105 is OK: puppet ran at Wed Jun 26 20:12:47 UTC 2013
[20:12:58] RECOVERY - Puppet freshness on mw103 is OK: puppet ran at Wed Jun 26 20:12:48 UTC 2013
[20:12:58] RECOVERY - Puppet freshness on mw1187 is OK: puppet ran at Wed Jun 26 20:12:48 UTC 2013
[20:12:58] RECOVERY - Puppet freshness on mw108 is OK: puppet ran at Wed Jun 26 20:12:48 UTC 2013
[20:12:58] RECOVERY - Puppet freshness on mw54 is OK: puppet ran at Wed Jun 26 20:12:49 UTC 2013
[20:12:59] RECOVERY - Puppet freshness on mw29 is OK: puppet ran at Wed Jun 26 20:12:49 UTC 2013
[20:12:59] RECOVERY - Puppet freshness on snapshot1003 is OK: puppet ran at Wed Jun 26 20:12:50 UTC 2013
[20:13:00] RECOVERY - Puppet freshness on cp3005 is OK: puppet ran at Wed Jun 26 20:12:50 UTC 2013
[20:13:00] RECOVERY - Puppet freshness on cp3004 is OK: puppet ran at Wed Jun 26 20:12:51 UTC 2013
[20:13:01] RECOVERY - Puppet freshness on mw1160 is OK: puppet ran at Wed Jun 26 20:12:52 UTC 2013
[20:13:01] RECOVERY - Puppet freshness on mw1037 is OK: puppet ran at Wed Jun 26 20:12:53 UTC 2013
[20:13:02] RECOVERY - Puppet freshness on mw1113 is OK: puppet ran at Wed Jun 26 20:12:53 UTC 2013
[20:13:02] RECOVERY - Puppet freshness on pc2 is OK: puppet ran at Wed Jun 26 20:12:54 UTC 2013
[20:13:03] RECOVERY - Puppet freshness on virt9 is OK: puppet ran at Wed Jun 26 20:12:54 UTC 2013
[20:13:03] RECOVERY - Puppet freshness on mc1003 is OK: puppet ran at Wed Jun 26 20:12:54 UTC 2013
[20:13:04] RECOVERY - Puppet freshness on srv287 is OK: puppet ran at Wed Jun 26 20:12:55 UTC 2013
[20:13:04] RECOVERY - Puppet freshness on maerlant is OK: puppet ran at Wed Jun 26 20:12:55 UTC 2013
[20:13:05] RECOVERY - Puppet freshness on srv300 is OK: puppet ran at Wed Jun 26 20:12:56 UTC 2013
[20:13:05] RECOVERY - Puppet freshness on mw51 is OK: puppet ran at Wed Jun 26 20:12:56 UTC 2013
[20:13:06] RECOVERY - Puppet freshness on search13 is OK: puppet ran at Wed Jun 26 20:12:56 UTC 2013
[20:13:06] RECOVERY - Puppet freshness on cp3003 is OK: puppet ran at Wed Jun 26 20:12:57 UTC 2013
[20:13:07] RECOVERY - Puppet freshness on sq67 is OK: puppet ran at Wed Jun 26 20:12:57 UTC 2013
[20:13:09] RECOVERY - Puppet freshness on mw1164 is OK: puppet ran at Wed Jun 26 20:12:58 UTC 2013
[20:13:09] RECOVERY - Puppet freshness on mc1015 is OK: puppet ran at Wed Jun 26 20:12:58 UTC 2013
[20:13:09] RECOVERY - Puppet freshness on search1010 is OK: puppet ran at Wed Jun 26 20:12:58 UTC 2013
[20:13:09] RECOVERY - Puppet freshness on mw1217 is OK: puppet ran at Wed Jun 26 20:12:59 UTC 2013
[20:13:09] RECOVERY - Puppet freshness on carbon is OK: puppet ran at Wed Jun 26 20:12:59 UTC 2013
[20:13:10] RECOVERY - Puppet freshness on mw1117 is OK: puppet ran at Wed Jun 26 20:13:00 UTC 2013
[20:13:10] RECOVERY - Puppet freshness on mw1100 is OK: puppet ran at Wed Jun 26 20:13:01 UTC 2013
[20:13:11] RECOVERY - Puppet freshness on mw1018 is OK: puppet ran at Wed Jun 26 20:13:01 UTC 2013
[20:13:11] RECOVERY - Puppet freshness on cp1007 is OK: puppet ran at Wed Jun 26 20:13:02 UTC 2013
[20:13:12] RECOVERY - Puppet freshness on analytics1012 is OK: puppet ran at Wed Jun 26 20:13:02 UTC 2013
[20:13:12] RECOVERY - Puppet freshness on mw1103 is OK: puppet ran at Wed Jun 26 20:13:04 UTC 2013
[20:13:13] RECOVERY - Puppet freshness on db1054 is OK: puppet ran at Wed Jun 26 20:13:05 UTC 2013
[20:13:13] RECOVERY - Puppet freshness on sq86 is OK: puppet ran at Wed Jun 26 20:13:05 UTC 2013
[20:13:14] RECOVERY - Puppet freshness on labsdb1003 is OK: puppet ran at Wed Jun 26 20:13:06 UTC 2013
[20:13:14] RECOVERY - Puppet freshness on sq83 is OK: puppet ran at Wed Jun 26 20:13:06 UTC 2013
[20:13:15] RECOVERY - Puppet freshness on sq71 is OK: puppet ran at Wed Jun 26 20:13:06 UTC 2013
[20:13:15] RECOVERY - Puppet freshness on cp1061 is OK: puppet ran at Wed Jun 26 20:13:06 UTC 2013
[20:13:16] RECOVERY - Puppet freshness on labsdb1002 is OK: puppet ran at Wed Jun 26 20:13:07 UTC 2013
[20:13:18] RECOVERY - Puppet freshness on cp1056 is OK: puppet ran at Wed Jun 26 20:13:09 UTC 2013
[20:13:18] RECOVERY - Puppet freshness on es1 is OK: puppet ran at Wed Jun 26 20:13:09 UTC 2013
[20:13:18] RECOVERY - Puppet freshness on db40 is OK: puppet ran at Wed Jun 26 20:13:09 UTC 2013
[20:13:18] RECOVERY - Puppet freshness on amssq52 is OK: puppet ran at Wed 
Jun 26 20:13:11 UTC 2013 [20:13:18] RECOVERY - Puppet freshness on db1017 is OK: puppet ran at Wed Jun 26 20:13:13 UTC 2013 [20:13:19] RECOVERY - Puppet freshness on cp1029 is OK: puppet ran at Wed Jun 26 20:13:13 UTC 2013 [20:13:19] RECOVERY - Puppet freshness on mw31 is OK: puppet ran at Wed Jun 26 20:13:14 UTC 2013 [20:13:20] RECOVERY - Puppet freshness on mw16 is OK: puppet ran at Wed Jun 26 20:13:14 UTC 2013 [20:13:20] RECOVERY - Puppet freshness on mw1088 is OK: puppet ran at Wed Jun 26 20:13:15 UTC 2013 [20:13:21] RECOVERY - Puppet freshness on cerium is OK: puppet ran at Wed Jun 26 20:13:15 UTC 2013 [20:13:21] RECOVERY - Puppet freshness on mw1176 is OK: puppet ran at Wed Jun 26 20:13:16 UTC 2013 [20:13:22] RECOVERY - Puppet freshness on mw1099 is OK: puppet ran at Wed Jun 26 20:13:16 UTC 2013 [20:13:22] RECOVERY - Puppet freshness on tmh2 is OK: puppet ran at Wed Jun 26 20:13:16 UTC 2013 [20:13:23] RECOVERY - Puppet freshness on mw122 is OK: puppet ran at Wed Jun 26 20:13:17 UTC 2013 [20:13:23] RECOVERY - Puppet freshness on mw82 is OK: puppet ran at Wed Jun 26 20:13:17 UTC 2013 [20:13:28] RECOVERY - Puppet freshness on srv238 is OK: puppet ran at Wed Jun 26 20:13:18 UTC 2013 [20:13:28] RECOVERY - Puppet freshness on mw46 is OK: puppet ran at Wed Jun 26 20:13:18 UTC 2013 [20:13:28] RECOVERY - Puppet freshness on srv297 is OK: puppet ran at Wed Jun 26 20:13:19 UTC 2013 [20:13:28] RECOVERY - Puppet freshness on search1018 is OK: puppet ran at Wed Jun 26 20:13:19 UTC 2013 [20:13:28] RECOVERY - Puppet freshness on search1019 is OK: puppet ran at Wed Jun 26 20:13:19 UTC 2013 [20:13:29] RECOVERY - Puppet freshness on mw1095 is OK: puppet ran at Wed Jun 26 20:13:20 UTC 2013 [20:13:29] RECOVERY - Puppet freshness on mw66 is OK: puppet ran at Wed Jun 26 20:13:21 UTC 2013 [20:13:30] RECOVERY - Puppet freshness on mw20 is OK: puppet ran at Wed Jun 26 20:13:21 UTC 2013 [20:13:30] RECOVERY - Puppet freshness on search1008 is OK: puppet ran at Wed Jun 26 20:13:22 UTC 
2013 [20:13:31] RECOVERY - Puppet freshness on mw1065 is OK: puppet ran at Wed Jun 26 20:13:25 UTC 2013 [20:13:31] RECOVERY - Puppet freshness on mw1144 is OK: puppet ran at Wed Jun 26 20:13:25 UTC 2013 [20:13:32] RECOVERY - Puppet freshness on mw1015 is OK: puppet ran at Wed Jun 26 20:13:25 UTC 2013 [20:13:32] RECOVERY - Puppet freshness on fenari is OK: puppet ran at Wed Jun 26 20:13:26 UTC 2013 [20:13:33] RECOVERY - Puppet freshness on mw1052 is OK: puppet ran at Wed Jun 26 20:13:26 UTC 2013 [20:13:33] RECOVERY - Puppet freshness on mw1042 is OK: puppet ran at Wed Jun 26 20:13:27 UTC 2013 [20:13:34] RECOVERY - Puppet freshness on neon is OK: puppet ran at Wed Jun 26 20:13:27 UTC 2013 [20:13:38] RECOVERY - Puppet freshness on sq65 is OK: puppet ran at Wed Jun 26 20:13:27 UTC 2013 [20:13:38] RECOVERY - Puppet freshness on es1003 is OK: puppet ran at Wed Jun 26 20:13:28 UTC 2013 [20:13:38] RECOVERY - Puppet freshness on db1023 is OK: puppet ran at Wed Jun 26 20:13:28 UTC 2013 [20:13:38] RECOVERY - Puppet freshness on cp3007 is OK: puppet ran at Wed Jun 26 20:13:28 UTC 2013 [20:13:38] RECOVERY - Puppet freshness on es1005 is OK: puppet ran at Wed Jun 26 20:13:29 UTC 2013 [20:13:39] RECOVERY - Puppet freshness on db1019 is OK: puppet ran at Wed Jun 26 20:13:29 UTC 2013 [20:13:39] RECOVERY - Puppet freshness on cp1043 is OK: puppet ran at Wed Jun 26 20:13:30 UTC 2013 [20:13:40] RECOVERY - Puppet freshness on srv295 is OK: puppet ran at Wed Jun 26 20:13:30 UTC 2013 [20:13:40] RECOVERY - Puppet freshness on ms-be1 is OK: puppet ran at Wed Jun 26 20:13:31 UTC 2013 [20:13:41] RECOVERY - Puppet freshness on amssq55 is OK: puppet ran at Wed Jun 26 20:13:31 UTC 2013 [20:13:41] RECOVERY - Puppet freshness on williams is OK: puppet ran at Wed Jun 26 20:13:32 UTC 2013 [20:13:42] RECOVERY - Puppet freshness on cp1028 is OK: puppet ran at Wed Jun 26 20:13:32 UTC 2013 [20:13:42] RECOVERY - Puppet freshness on ms-be8 is OK: puppet ran at Wed Jun 26 20:13:35 UTC 2013 [20:13:43] 
RECOVERY - Puppet freshness on srv258 is OK: puppet ran at Wed Jun 26 20:13:36 UTC 2013 [20:13:43] RECOVERY - Puppet freshness on cp1012 is OK: puppet ran at Wed Jun 26 20:13:36 UTC 2013 [20:13:44] RECOVERY - Puppet freshness on srv250 is OK: puppet ran at Wed Jun 26 20:13:36 UTC 2013 [20:13:44] RECOVERY - Puppet freshness on cp1020 is OK: puppet ran at Wed Jun 26 20:13:37 UTC 2013 [20:13:49] RECOVERY - Puppet freshness on mw34 is OK: puppet ran at Wed Jun 26 20:13:37 UTC 2013 [20:13:49] RECOVERY - Puppet freshness on dobson is OK: puppet ran at Wed Jun 26 20:13:38 UTC 2013 [20:13:49] RECOVERY - Puppet freshness on mc3 is OK: puppet ran at Wed Jun 26 20:13:38 UTC 2013 [20:13:50] RECOVERY - Puppet freshness on mw1182 is OK: puppet ran at Wed Jun 26 20:13:39 UTC 2013 [20:13:50] RECOVERY - Puppet freshness on mw1017 is OK: puppet ran at Wed Jun 26 20:13:39 UTC 2013 [20:13:50] RECOVERY - Puppet freshness on mw49 is OK: puppet ran at Wed Jun 26 20:13:39 UTC 2013 [20:13:50] RECOVERY - Puppet freshness on tin is OK: puppet ran at Wed Jun 26 20:13:40 UTC 2013 [20:13:51] RECOVERY - Puppet freshness on mw47 is OK: puppet ran at Wed Jun 26 20:13:41 UTC 2013 [20:13:51] RECOVERY - Puppet freshness on mw62 is OK: puppet ran at Wed Jun 26 20:13:41 UTC 2013 [20:13:52] RECOVERY - Puppet freshness on mw1214 is OK: puppet ran at Wed Jun 26 20:13:41 UTC 2013 [20:13:52] RECOVERY - Puppet freshness on analytics1010 is OK: puppet ran at Wed Jun 26 20:13:42 UTC 2013 [20:13:53] RECOVERY - Puppet freshness on sq72 is OK: puppet ran at Wed Jun 26 20:13:44 UTC 2013 [20:13:53] RECOVERY - Puppet freshness on pc1003 is OK: puppet ran at Wed Jun 26 20:13:44 UTC 2013 [20:13:54] RECOVERY - Puppet freshness on search22 is OK: puppet ran at Wed Jun 26 20:13:44 UTC 2013 [20:13:54] RECOVERY - Puppet freshness on sq45 is OK: puppet ran at Wed Jun 26 20:13:45 UTC 2013 [20:13:55] RECOVERY - Puppet freshness on gurvin is OK: puppet ran at Wed Jun 26 20:13:45 UTC 2013 [20:13:55] RECOVERY - Puppet freshness 
on pc1002 is OK: puppet ran at Wed Jun 26 20:13:45 UTC 2013 [20:13:56] RECOVERY - Puppet freshness on cp1051 is OK: puppet ran at Wed Jun 26 20:13:45 UTC 2013 [20:13:56] RECOVERY - Puppet freshness on es10 is OK: puppet ran at Wed Jun 26 20:13:45 UTC 2013 [20:13:57] RECOVERY - Puppet freshness on mw1175 is OK: puppet ran at Wed Jun 26 20:13:46 UTC 2013 [20:13:57] RECOVERY - Puppet freshness on db1052 is OK: puppet ran at Wed Jun 26 20:13:46 UTC 2013 [20:13:58] RECOVERY - Puppet freshness on cp1069 is OK: puppet ran at Wed Jun 26 20:13:46 UTC 2013 [20:13:58] RECOVERY - Puppet freshness on cp1013 is OK: puppet ran at Wed Jun 26 20:13:47 UTC 2013 [20:13:59] RECOVERY - Puppet freshness on es4 is OK: puppet ran at Wed Jun 26 20:13:47 UTC 2013 [20:13:59] RECOVERY - Puppet freshness on es1002 is OK: puppet ran at Wed Jun 26 20:13:47 UTC 2013 [20:14:00] RECOVERY - Puppet freshness on db53 is OK: puppet ran at Wed Jun 26 20:13:49 UTC 2013 [20:14:00] RECOVERY - Puppet freshness on cp1003 is OK: puppet ran at Wed Jun 26 20:13:49 UTC 2013 [20:14:01] RECOVERY - Puppet freshness on amssq57 is OK: puppet ran at Wed Jun 26 20:13:50 UTC 2013 [20:14:01] RECOVERY - Puppet freshness on mw1114 is OK: puppet ran at Wed Jun 26 20:13:51 UTC 2013 [20:14:02] RECOVERY - Puppet freshness on amssq34 is OK: puppet ran at Wed Jun 26 20:13:51 UTC 2013 [20:14:02] RECOVERY - Puppet freshness on mw1127 is OK: puppet ran at Wed Jun 26 20:13:52 UTC 2013 [20:14:03] RECOVERY - Puppet freshness on srv237 is OK: puppet ran at Wed Jun 26 20:13:52 UTC 2013 [20:14:03] RECOVERY - Puppet freshness on mw1123 is OK: puppet ran at Wed Jun 26 20:13:52 UTC 2013 [20:14:04] RECOVERY - Puppet freshness on mw1126 is OK: puppet ran at Wed Jun 26 20:13:55 UTC 2013 [20:14:04] RECOVERY - Puppet freshness on db52 is OK: puppet ran at Wed Jun 26 20:13:56 UTC 2013 [20:14:05] RECOVERY - Puppet freshness on cp3010 is OK: puppet ran at Wed Jun 26 20:13:56 UTC 2013 [20:14:05] RECOVERY - Puppet freshness on ms5 is OK: puppet ran 
at Wed Jun 26 20:13:56 UTC 2013 [20:14:10] RECOVERY - Puppet freshness on sq73 is OK: puppet ran at Wed Jun 26 20:13:58 UTC 2013 [20:14:10] RECOVERY - Puppet freshness on srv284 is OK: puppet ran at Wed Jun 26 20:14:01 UTC 2013 [20:14:10] RECOVERY - Puppet freshness on srv286 is OK: puppet ran at Wed Jun 26 20:14:01 UTC 2013 [20:14:10] RECOVERY - Puppet freshness on mw125 is OK: puppet ran at Wed Jun 26 20:14:01 UTC 2013 [20:14:10] RECOVERY - Puppet freshness on mw90 is OK: puppet ran at Wed Jun 26 20:14:02 UTC 2013 [20:14:10] RECOVERY - Puppet freshness on mw9 is OK: puppet ran at Wed Jun 26 20:14:02 UTC 2013 [20:14:11] RECOVERY - Puppet freshness on mw100 is OK: puppet ran at Wed Jun 26 20:14:03 UTC 2013 [20:14:11] RECOVERY - Puppet freshness on search14 is OK: puppet ran at Wed Jun 26 20:14:03 UTC 2013 [20:14:12] RECOVERY - Puppet freshness on mw5 is OK: puppet ran at Wed Jun 26 20:14:03 UTC 2013 [20:14:12] RECOVERY - Puppet freshness on srv275 is OK: puppet ran at Wed Jun 26 20:14:04 UTC 2013 [20:14:13] RECOVERY - Puppet freshness on mw81 is OK: puppet ran at Wed Jun 26 20:14:04 UTC 2013 [20:14:13] RECOVERY - Puppet freshness on mw1196 is OK: puppet ran at Wed Jun 26 20:14:05 UTC 2013 [20:14:14] RECOVERY - Puppet freshness on mw41 is OK: puppet ran at Wed Jun 26 20:14:05 UTC 2013 [20:14:14] RECOVERY - Puppet freshness on search21 is OK: puppet ran at Wed Jun 26 20:14:05 UTC 2013 [20:14:15] RECOVERY - Puppet freshness on lvs1 is OK: puppet ran at Wed Jun 26 20:14:06 UTC 2013 [20:14:15] RECOVERY - Puppet freshness on search1014 is OK: puppet ran at Wed Jun 26 20:14:06 UTC 2013 [20:14:16] RECOVERY - Puppet freshness on mw1162 is OK: puppet ran at Wed Jun 26 20:14:06 UTC 2013 [20:14:16] RECOVERY - Puppet freshness on mw1044 is OK: puppet ran at Wed Jun 26 20:14:07 UTC 2013 [20:14:18] RECOVERY - Puppet freshness on zirconium is OK: puppet ran at Wed Jun 26 20:14:07 UTC 2013 [20:14:18] RECOVERY - Puppet freshness on stafford is OK: puppet ran at Wed Jun 26 20:14:09 
UTC 2013 [20:14:18] RECOVERY - Puppet freshness on cp1006 is OK: puppet ran at Wed Jun 26 20:14:10 UTC 2013 [20:14:19] RECOVERY - Puppet freshness on db1048 is OK: puppet ran at Wed Jun 26 20:14:11 UTC 2013 [20:14:20] RECOVERY - Puppet freshness on lvs5 is OK: puppet ran at Wed Jun 26 20:14:12 UTC 2013 [20:14:20] RECOVERY - Puppet freshness on cp1050 is OK: puppet ran at Wed Jun 26 20:14:12 UTC 2013 [20:14:20] RECOVERY - Puppet freshness on mc1005 is OK: puppet ran at Wed Jun 26 20:14:14 UTC 2013 [20:14:20] RECOVERY - Puppet freshness on ms-be4 is OK: puppet ran at Wed Jun 26 20:14:15 UTC 2013 [20:14:21] RECOVERY - Puppet freshness on cp1046 is OK: puppet ran at Wed Jun 26 20:14:16 UTC 2013 [20:14:21] RECOVERY - Puppet freshness on sq79 is OK: puppet ran at Wed Jun 26 20:14:16 UTC 2013 [20:14:22] RECOVERY - Puppet freshness on wtp1024 is OK: puppet ran at Wed Jun 26 20:14:17 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on mw119 is OK: puppet ran at Wed Jun 26 20:14:17 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on search36 is OK: puppet ran at Wed Jun 26 20:14:19 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on mw38 is OK: puppet ran at Wed Jun 26 20:14:20 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on mw83 is OK: puppet ran at Wed Jun 26 20:14:20 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on mw1125 is OK: puppet ran at Wed Jun 26 20:14:21 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on mw1111 is OK: puppet ran at Wed Jun 26 20:14:22 UTC 2013 [20:14:29] RECOVERY - Puppet freshness on mw1124 is OK: puppet ran at Wed Jun 26 20:14:22 UTC 2013 [20:14:30] RECOVERY - Puppet freshness on mw1147 is OK: puppet ran at Wed Jun 26 20:14:22 UTC 2013 [20:14:30] RECOVERY - Puppet freshness on gadolinium is OK: puppet ran at Wed Jun 26 20:14:23 UTC 2013 [20:14:31] RECOVERY - Puppet freshness on mw1130 is OK: puppet ran at Wed Jun 26 20:14:23 UTC 2013 [20:14:31] RECOVERY - Puppet freshness on mw118 is OK: puppet ran at Wed Jun 26 20:14:24 UTC 2013 [20:14:32] 
RECOVERY - Puppet freshness on srv301 is OK: puppet ran at Wed Jun 26 20:14:24 UTC 2013 [20:14:32] RECOVERY - Puppet freshness on srv290 is OK: puppet ran at Wed Jun 26 20:14:25 UTC 2013 [20:14:33] RECOVERY - Puppet freshness on mw1190 is OK: puppet ran at Wed Jun 26 20:14:25 UTC 2013 [20:14:33] RECOVERY - Puppet freshness on sq51 is OK: puppet ran at Wed Jun 26 20:14:26 UTC 2013 [20:14:34] RECOVERY - Puppet freshness on mw1101 is OK: puppet ran at Wed Jun 26 20:14:26 UTC 2013 [20:14:34] RECOVERY - Puppet freshness on mw1161 is OK: puppet ran at Wed Jun 26 20:14:26 UTC 2013 [20:14:35] RECOVERY - Puppet freshness on search17 is OK: puppet ran at Wed Jun 26 20:14:27 UTC 2013 [20:14:39] RECOVERY - Puppet freshness on analytics1015 is OK: puppet ran at Wed Jun 26 20:14:27 UTC 2013 [20:14:39] RECOVERY - Puppet freshness on cp1063 is OK: puppet ran at Wed Jun 26 20:14:29 UTC 2013 [20:14:39] RECOVERY - Puppet freshness on mc1009 is OK: puppet ran at Wed Jun 26 20:14:29 UTC 2013 [20:14:39] RECOVERY - Puppet freshness on db1036 is OK: puppet ran at Wed Jun 26 20:14:30 UTC 2013 [20:14:39] RECOVERY - Puppet freshness on mw14 is OK: puppet ran at Wed Jun 26 20:14:30 UTC 2013 [20:14:40] RECOVERY - Puppet freshness on ms-fe3 is OK: puppet ran at Wed Jun 26 20:14:31 UTC 2013 [20:14:40] RECOVERY - Puppet freshness on db1020 is OK: puppet ran at Wed Jun 26 20:14:35 UTC 2013 [20:14:41] RECOVERY - Puppet freshness on db45 is OK: puppet ran at Wed Jun 26 20:14:35 UTC 2013 [20:14:41] RECOVERY - Puppet freshness on vanadium is OK: puppet ran at Wed Jun 26 20:14:36 UTC 2013 [20:14:42] RECOVERY - Puppet freshness on ms-be1012 is OK: puppet ran at Wed Jun 26 20:14:36 UTC 2013 [20:14:47] ori-l: ! 
[20:14:48] RECOVERY - Puppet freshness on wtp1011 is OK: puppet ran at Wed Jun 26 20:14:37 UTC 2013 [20:14:48] RECOVERY - Puppet freshness on wtp1018 is OK: puppet ran at Wed Jun 26 20:14:38 UTC 2013 [20:14:48] RECOVERY - Puppet freshness on db1004 is OK: puppet ran at Wed Jun 26 20:14:40 UTC 2013 [20:14:48] RECOVERY - Puppet freshness on solr2 is OK: puppet ran at Wed Jun 26 20:14:45 UTC 2013 [20:14:49] RECOVERY - Puppet freshness on db43 is OK: puppet ran at Wed Jun 26 20:14:45 UTC 2013 [20:15:28] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 20:15:22 UTC 2013 [20:15:39] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [20:16:03] don't look at me [20:16:08] AzaToth: paravoid is probably responsible, was talking about doing something about this a while back [20:16:29] YuviPanda: seems it stopped now (for now) [20:16:37] ori-l: ha ha [20:16:48] * AzaToth goes digging a hole for paravoid  [20:17:29] RECOVERY - Puppet freshness on cp1016 is OK: puppet ran at Wed Jun 26 20:17:18 UTC 2013 [20:17:32] AzaToth: well, at least your nick wasn't 'puppet' [20:17:35] or 'recovery' [20:17:37] hehe [20:17:40] :) [20:17:51] or freshness [20:17:56] or... [20:18:01] is? [20:18:02] or OK [20:18:10] or UTC [20:18:13] or - [20:18:24] hah! [20:18:28] gah [20:18:31] hehe [20:18:34] UTC: ha ha [20:18:37] geez [20:18:43] I kind of like it [20:18:46] hah! [20:19:04] AzaToth: something building at https://integration.wikimedia.org/ci/job/jenkins-debian-glue-source/1/console :D [20:19:08] better nick as PROBLEM [20:19:23] Finished: FAILURE [20:19:44] hashar: Please run the script in the jenkins workspace [20:20:12] Anyone here used rspec + puppet? [20:21:33] andrewbogott: Are you wanting RSpec tests for your Puppet manifests? [20:22:02] preilly, yeah, more or less. Right now I'm looking at a third-party module in our repo that already has tests, and just trying to get them to run. 
[20:22:24] andrewbogott: did you look at: http://rspec-puppet.com/setup/ [20:22:25] All tests fail because the tests can't find the module that contains the tests… I suspect user error [20:22:49] preilly, yep, I have that page up. Lots of docs about how to write tests but not much about how to invoke them. [20:24:37] For example, I run $ puppet/modules/apache rspec . [20:24:42] andrewbogott: don't you just use rake? [20:24:47] and I get a bunch of failures like "Could not find class apache for puppet-testing.pmtpa.wmflabs at line 2 on node puppet-testing.pmtpa.wmflabs" [20:25:28] preilly, maybe that's the bit that I'm missing. Trying... [20:25:43] like rake spec [20:26:06] so rake sets up a virtual env, and running rspec naked uses the current env? That's how it looks so far... [20:26:08] RECOVERY - Puppet freshness on celsus is OK: puppet ran at Wed Jun 26 20:26:01 UTC 2013 [20:26:28] Try: % rake spec [20:26:42] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [20:26:59] yep, running that now… so far tests are passing. Thanks! [20:27:09] Ooh, 0 failures! Way better than the 60 I got last time [20:27:31] ha ha nice [20:27:37] andrewbogott: I'm glad that I could help [20:28:11] andrewbogott: you might also want to gem install puppet-lint [20:28:35] then add require 'puppet-lint/tasks/puppet-lint' to the top of your rakefile [20:28:53] then you can run rake lint to run puppet-lint over your manifests. [20:29:11] Yeah… gotta figure out how to get all these gems politely packaged and installed :( [20:29:18] so that you can ensure that manifests comply with the Puppet Labs style guide [20:29:26] AzaToth: bah a jenkins multi configuration job can't be tied to a specific slave apparently :( [20:30:05] can be [20:30:46] hashar: http://jenkins-debian-glue.org/faq/#foreign_arch_builds [20:31:01] ahhh that is right [20:31:03] i.e.
you have to add a combo filter [20:31:05] they have their own setting [20:31:09] RECOVERY - Puppet freshness on ssl1004 is OK: puppet ran at Wed Jun 26 20:31:07 UTC 2013 [20:31:29] RECOVERY - Puppet freshness on tmh1002 is OK: puppet ran at Wed Jun 26 20:31:22 UTC 2013 [20:40:58] AzaToth: it is not building on the correct node :( [20:41:02] https://integration.wikimedia.org/ci/job/jenkins-debian-glue-binaries/8/console [20:43:02] ahh no it does [20:43:10] https://integration.wikimedia.org/ci/job/jenkins-debian-glue-binaries/architecture=amd64,node=integration-debian-builder/9/console [20:43:39] need to tweak sudo [20:47:20] something is happening https://integration.wikimedia.org/ci/job/jenkins-debian-glue-binaries/10/architecture=amd64,node=integration-debian-builder/console [20:52:15] AzaToth: that makes good progress on bug 36443 ("have jenkins build debian packages for us") https://bugzilla.wikimedia.org/show_bug.cgi?id=36443 [20:52:28] will have to puppetize all that mess though [20:52:40] hashar: wooooo, nice [20:53:16] hashar: btw, if you want something to try that on, Coren's 'log2udp2' (analytics/log2udp2.git) is as easy as it gets: one makefile, one source file, no dependencies [20:53:33] I have picked pybal :-] [20:53:42] the three or four I tried before just failed [20:53:44] that works too :P [20:54:41] I will have to puppetize that properly then write all the jenkins job builder integration [20:54:52] then we can get them triggered by Zuul whenever a patch is submitted [20:55:00] (hopefully) [20:55:38] sounds too good :P what's the catch? [20:57:02] 1 hour ago I had never heard of debian-glue :-D [20:58:11] hashar: seems you are making good progress [20:58:24] yup [20:58:25] + '[' -n 'Error: repository /srv/repository does not exist.' ']' [20:58:26] :) [20:58:33] progressing one error at a time [20:58:43] I know how it feels [20:59:00] can you puppetize jenkins config?
[20:59:07] no way [20:59:14] :-] [20:59:38] we are using a python script written by OpenStack that generates jobs for us out of yaml definitions [20:59:38] I thought you could use Scm-Sync-Config [20:59:47] ok [21:00:01] integration/jenkins-job-builder-config.git is da repo [21:00:39] !github integration/jenkins-job-builder-config [21:00:44] hmm bot is not smart enough :) [21:01:08] https://github.com/wikimedia/integration-jenkins-job-builder-config that is the yaml definitions. and our doc is at http://www.mediawiki.org/wiki/CI/JJB [21:02:16] so you make no modifications to jenkins jobs directly? [21:03:15] nop [21:03:29] Jenkins was originally set up to run MediaWiki tests [21:03:40] then we expanded it to MediaWiki extensions (a whole bunch of them) [21:03:47] so maintaining them manually was not an option [21:04:20] I had three possibilities: 1) using a Jenkins job templating plugin, 2) writing my own python/shell templating to generate XML job files 3) use the third party python script [21:04:33] I wanted to do some python and we have good contact with the third party (OpenStack) [21:06:00] PROBLEM - Puppet freshness on manutius is CRITICAL: No successful Puppet run in the last 10 hours [21:07:49] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [21:08:41] hashar: do you make the yamls by hand? [21:08:47] yup [21:09:11] I need to clean them up a bit [21:09:24] and my goal is to have some step by step tutorial written by the end of July [21:09:28] to get more people to edit them [21:10:07] Finished: SUCCESS [21:10:38] I got some debs at https://integration.wikimedia.org/ci/job/jenkins-debian-glue-binaries/architecture=amd64,node=integration-debian-builder/ :-]]]] [21:11:16] pybal? [21:11:39] that is a python load balancer we use in production [21:11:46] I assume the job is just a testjob?
[21:11:47] which is packaged under operations/debs/pybal.git [21:11:55] yeah that is a playground [21:12:01] from there I can write the YAML templates for Jenkins Job Builder [21:12:17] then list all our debs repos and generate all the jobs automaticall [21:12:17] y [21:12:32] you can also have it deploy the debs to a reprepro [21:12:53] then configure Zuul (a gateway between Gerrit and Jenkins) to trigger the generated job whenever someone submits a change on one of the debs projects [21:13:08] hmm [21:13:17] I guess the jenkins slave could be a reprepro [21:13:36] I always used Jenkins-Gerrit plugin [21:13:42] Gerrit Trigger [21:13:51] yeah we used that one before [21:13:55] k [21:14:01] I eventually was missing features in that plugin [21:14:09] okidoki [21:14:12] and since I do not know java at all, I switched to a python daemon (Zuul) [21:14:16] maintained by Openstack [21:14:17] !log changing fundraising db config to point to service aliases [21:14:21] the same people that maintain jenkins job builder :-] [21:14:25] Logged the message, Master [21:14:29] so I got support from #openstack instead of from #jenkins hehe [21:14:36] and contributed a bit, since I know python :) [21:15:22] hashar: http://jenkins-debian-glue.org/faq/#single_repository [21:16:11] I wouldn't place them on the slave [21:16:27] foremost, as they've already been moved to the master when the build is done [21:16:39] and secondly you might want to have multiple slaves available in the future [21:16:55] a slave is something you can shoot and replace [21:17:28] the master is the one sitting on all the goodies [21:18:59] hashar: did you enable lintian reports? [21:19:00] AzaToth: good news. I found out the license for adminbot [21:19:03] it's public domain [21:19:22] AzaToth: I left some notes [21:19:24] never heard about that license [21:19:31] AzaToth: are the artifacts copied back to the master?
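The YAML-driven job generation hashar describes above (Jenkins Job Builder expanding job templates once per repo) can be sketched roughly as follows. This is a toy illustration of the templating idea only; the template shape and field names are hypothetical, not the real JJB schema:

```python
# Toy sketch of the template-expansion idea behind Jenkins Job Builder:
# a job template with {placeholders} is instantiated once per project.
# (Field names and structure are illustrative, not the real JJB schema.)

def expand_jobs(template, projects):
    """Instantiate a job template once per project's parameter set."""
    jobs = []
    for params in projects:
        job = {key: value.format(**params) if isinstance(value, str) else value
               for key, value in template.items()}
        jobs.append(job)
    return jobs

template = {"name": "{repo}-debian-glue-source", "node": "{node}"}
projects = [
    {"repo": "pybal", "node": "integration-debian-builder"},
    {"repo": "adminbot", "node": "integration-debian-builder"},
]

jobs = expand_jobs(template, projects)
print(jobs[0]["name"])  # pybal-debian-glue-source
```

The point of the approach, as discussed in the log, is that hundreds of near-identical jobs are never edited by hand in the Jenkins UI; only the templates and the per-project parameter lists are maintained.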
[21:19:50] AzaToth: public domain means it's free to the world with no license [21:19:54] hashar: Archiving artifacts [21:20:21] hashar: well, I'd assume it depends on how you configure it [21:20:37] Ryan_Lane: was just sarcastic [21:20:42] ah. heh [21:21:01] AzaToth: seems they got copied somewhere on the master :-] [21:21:13] Ryan_Lane: problem is that it's only the US of A that has a relevant "public domain" declaration [21:21:37] the original version was public domain [21:21:49] we can relicense it however we want [21:22:02] AzaToth: I have no idea how to enable lintian reports :D [21:22:10] we should note that earlier versions were public domain, though [21:22:24] hashar: http://jenkins-debian-glue.org/getting_started/manual/ [21:22:33] section "Enable Lintian reports" [21:22:35] and based on that, we should probably use a permissive OSI license, like MIT, BSD or Apache2 [21:22:45] yea [21:23:25] hashar: below "Configure Sudo" [21:23:28] :-P [21:23:36] * AzaToth hides [21:26:07] AzaToth: I did the lintian steps :-) [21:26:15] heh [21:26:40] https://integration.wikimedia.org/ci/job/jenkins-debian-glue-source/9/testReport/(root)/lintian/lintian/? :--D Passed! [21:27:36] heh [21:28:07] :D [21:28:12] /tmp/hudson1373797359634596223.sh: line 2: report/lintian.xml: No such file or directory [21:28:13] Build step 'Execute shell' marked build as failure [21:28:16] New patchset: Jgreen; "pcoombe access to aluminium/grosley per RT #5280" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70754 [21:28:56] hashar: To enable Lintian reports for your jenkins-debian-glue jobs install the jenkins-debian-glue-buildenv-lintian package [21:29:05] you did do that right? [21:29:08] Change merged: Jgreen; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70754 [21:29:08] I did [21:29:53] hashar: you remembered mkdir on the binary?
[21:30:38] I don't see any mkdir in the output [21:30:41] ah true [21:31:40] sorry for throwing this pita on you [21:32:06] you are like a second brain :-] very helpful [21:32:55] rebuilding at https://integration.wikimedia.org/ci/job/jenkins-debian-glue-binaries/architecture=amd64,node=integration-debian-builder/14/console [21:34:52] https://integration.wikimedia.org/ci/job/jenkins-debian-glue-binaries/architecture=amd64,node=integration-debian-builder/14/testReport/(root)/lintian/lintian/? [21:34:55] niiiice [21:35:05] I hate the long URL but hey, we get a testReport :) [21:35:48] AzaToth: I am heading to bed now. Thank you very much for the suggestion and all your help :) [21:35:53] np [21:35:59] will look at making that a bit more robust next week [21:36:50] okidoki [21:37:38] New patchset: Jgreen; "remove stale comments from fundraising.pp" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70756 [21:37:56] Change merged: Jgreen; [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70756 [21:39:35] * hashar waves [21:42:39] binasher: is ishmael down?
[21:43:02] hrm [21:50:11] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 673 [21:50:51] jdlrobson: [21:52:12] RECOVERY - NTP on spence is OK: NTP OK: Offset -0.005134701729 secs [21:55:12] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 973 [22:00:05] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 1273 [22:04:51] New patchset: Demon; "Basic styling to make Gitblit look somewhat like Gerrit" [operations/puppet] (production) - https://gerrit.wikimedia.org/r/70762 [22:05:04] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 1573 [22:06:09] RECOVERY - SSH on spence is OK: SSH OK - OpenSSH_5.3p1 Debian-3ubuntu7 (protocol 2.0) [22:06:26] AaronSchulz: it's back [22:06:37] yay [22:06:39] PROBLEM - Puppet freshness on celsus is CRITICAL: No successful Puppet run in the last 10 hours [22:10:00] PROBLEM - Puppet freshness on mw1066 is CRITICAL: No successful Puppet run in the last 10 hours [22:10:09] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 1873 [22:15:11] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 2174 [22:18:33] paravoid: nice work on https://github.com/twitter/twemproxy/issues/120 [22:19:45] paravoid: did you find a way around the libyaml.tar.gz stuff? [22:20:09] PROBLEM - check_mysql on db1025 is CRITICAL: SLOW_SLAVE CRITICAL: Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 2473 [22:21:48] !log killed slow query on db1025 [22:22:03] Logged the message, Mistress of the network gear.
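A quick sanity check on the db1025 alerts above: the reported Seconds Behind Master grows by roughly 300 seconds per 300 seconds of wall clock, which means the slave SQL thread was making essentially no progress until the slow query was killed. A small sketch of that calculation, with timestamps approximated from the five-minute check interval in the log:

```python
# Lag samples from the check_mysql alerts above, as (wall-clock seconds
# since first sample, reported Seconds Behind Master). Timestamps are
# approximated from the ~5-minute check interval.

def lag_growth_rate(samples):
    """Seconds of lag gained per second of wall clock, first to last sample.

    1.0 means the SQL thread is fully stalled (lag grows in real time);
    0.0 means it is keeping pace with the master.
    """
    (t0, lag0), (t1, lag1) = samples[0], samples[-1]
    return (lag1 - lag0) / (t1 - t0)

samples = [(0, 673), (300, 973), (600, 1273), (900, 1573)]
rate = lag_growth_rate(samples)
print(rate)  # 1.0 -> fully stalled, consistent with one long query blocking replication
```

This matches what follows in the log: once the slow query is killed, the check flips to "Slave SQL: No ... Seconds Behind Master: (null)", since a stopped SQL thread has no meaningful lag value to report.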
[22:25:08] PROBLEM - check_mysql on db1025 is CRITICAL: Slave IO: Yes Slave SQL: No Seconds Behind Master: (null)
[22:30:08] PROBLEM - check_mysql on db1025 is CRITICAL: Slave IO: Yes Slave SQL: No Seconds Behind Master: (null)
[22:35:08] PROBLEM - check_mysql on db1025 is CRITICAL: Slave IO: Yes Slave SQL: No Seconds Behind Master: (null)
[22:36:29] LeslieCarr: are you now Slayer of Queries?
[22:40:06] PROBLEM - check_mysql on db1025 is CRITICAL: Slave IO: Yes Slave SQL: No Seconds Behind Master: (null)
[22:40:47] that mysql(?) problem makes no sense to me
[22:40:56] you still use mysql?
[22:43:27] omg trademark violation
[22:44:13] AzaToth: aren't we on MariaDB now?
[22:45:06] PROBLEM - check_mysql on db1025 is CRITICAL: Slave IO: Yes Slave SQL: No Seconds Behind Master: (null)
[22:45:29] db1025 doesn't even exist, according to DNS
[22:45:35] so I don't know how it can be lagging
[22:45:36] did Tim just say OMG?
[22:46:01] don't get too excited, it was only sarcasm
[22:46:36] TimStarling: I believe it's a fundraising box
[22:47:00] * AaronSchulz thinks of people that use "like" every other two words
[22:47:02] anyway, what was announced was a progressive migration to MariaDB
[22:47:34] I see that db1024 is still MySQL, and db1026 is MariaDB
[22:47:35] AaronSchulz: what's wrong with the word "like"? those of us from california are just very inclined to like things
[22:47:59] but obviously we don't change our nagios service check identifiers as each server is migrated
[22:48:19] notpeter: or to compare things with other things?
[22:49:33] TimStarling: yes, but we compare things and still like both sides of the comparison.
the thing that I like is like the other thing that I like
[22:49:37] all very likeable
[22:49:48] ACKNOWLEDGEMENT - check_mysql on db1025 is CRITICAL: Slave IO: Yes Slave SQL: No Seconds Behind Master: (null) Matt Walker Yes yes, I broke it
[23:00:47] PROBLEM - NTP on ssl3003 is CRITICAL: NTP CRITICAL: No response from NTP server
[23:01:28] TimStarling: ok
[23:01:40] so db1025 is in limbo ツ
[23:01:41] notpeter: that's like so totally, like, last-year, I mean like omg, come on
[23:02:48] "like" is soo facebook
[23:07:08] see, as the only person on earth who's not on facebook, I actually like less than just about anyone else
[23:09:16] PROBLEM - NTP on ssl3002 is CRITICAL: NTP CRITICAL: No response from NTP server
[23:09:18] notpeter, not the only
[23:11:48] greg-g, aaa, need to do a quick depl in a few min, discovered a bad bug with zero configurations. Anyone doing deployments now?
[23:12:06] AaronSchulz: yarrrr
[23:12:13] TimStarling: it's in fundraising
[23:12:50] LeslieCarr: Matt Walker acknowledged it
[23:13:57] maybe fundraising servers should have distinctive hostnames
[23:14:06] yurik: should be clear
[23:14:22] greg-g, thx, will do it in about 10 min
[23:14:35] they have explicitly made them not my problem by not giving me (or most ops) root access
[23:14:50] which is fine, and good security policy, it would just be nice to easily know which is which
[23:20:35] hrm, something like fr-db1025 ?
[23:21:05] yes
[23:21:19] !log updated Parsoid to 8c83524
[23:21:28] Logged the message, Master
[23:25:08] RECOVERY - check_mysql on db1025 is OK: Uptime: 2970102 Threads: 3 Questions: 55132591 Slow queries: 52076 Opens: 54646 Flush tables: 2 Open tables: 64 Queries per second avg: 18.562 Slave IO: Yes Slave SQL: Yes Seconds Behind Master: 0
[23:32:56] AaronSchulz: I did not know there was an array >= operator
[23:33:10] I checked the manual, it is undocumented
[23:34:26] so I tried it in PHP 4, and it was definitely there at the time
[23:35:49] I thought it was documented
[23:36:43] http://www.php.net/manual/en/language.operators.array.php
[23:37:22] it just says there is +, ==, ===, !=, <>, !==
[23:38:06] it's documented, just not there
[23:38:08] http://www.php.net/manual/en/language.operators.comparison.php#language.operators.comparison.types
[23:38:12] http://www.php.net/manual/en/language.operators.comparison.php
[23:38:46] documented as in "here is equivalent PHP code"
[23:40:31] ok, about to sync
[23:45:14] !log yurik synchronized php-1.22wmf8/extensions/ZeroRatedMobileAccess/
[23:45:24] Logged the message, Master
[23:46:09] Which repo is the code for gerrit-wm in? I want to set up the pywikibot commits to go to #pywikipediabot
[23:46:27] !log catrope synchronized php-1.22wmf7/extensions/VisualEditor 'Update VE to master'
[23:46:36] Logged the message, Master
[23:46:51] !log catrope synchronized php-1.22wmf8/extensions/VisualEditor 'Update VE to master'
[23:47:00] Logged the message, Master
[23:47:25] !log yurik synchronized php-1.22wmf7/extensions/ZeroRatedMobileAccess/
[23:47:34] Logged the message, Master
[23:48:27] Found it, I think. https://github.com/wikimedia/operations-puppet/blob/166fdf04ee85bbf8f21beebce2076d53d7f05e62/manifests/gerrit.pp
[23:52:26] done. If only "git pull" would take less than 5 min :(
[23:52:40] try "git pull --fast"
[23:53:40] I can't tell if you're joking.
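The "documented as in 'here is equivalent PHP code'" remark refers to the comparison-operators page of the PHP manual, which describes array comparison via an equivalent routine rather than listing relational operators on the array-operators page. A rough Python port of that documented algorithm (the function name and the None-for-"uncomparable" convention are mine, for illustration):

```python
def php_array_compare(op1, op2):
    """Sketch of PHP's documented array comparison, using dicts for
    PHP's associative arrays: the array with fewer members is smaller;
    at equal size, compare value-by-value over op1's keys.
    Returns -1, 0, or 1, or None when the arrays are "uncomparable"
    (a key of op1 is missing from op2)."""
    if len(op1) < len(op2):
        return -1
    if len(op1) > len(op2):
        return 1
    for key, val in op1.items():
        if key not in op2:
            return None  # the manual calls this case uncomparable
        if val < op2[key]:
            return -1
        if val > op2[key]:
            return 1
    return 0
```

Under this model, `$a >= $b` on arrays is true whenever the comparison yields 0 or 1, which is why the operator works even though the array-operators page only lists +, ==, ===, !=, <> and !==.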
[23:53:46] i am
[23:53:57] but
[23:54:07] i quickly checked to make sure it's not, in fact, an option
[23:54:13] because you never know, with git
[23:54:30] I feel like --quick is an option or something.
[23:54:37] git is just magic spells.