[00:21:26] PROBLEM - Puppet freshness on mw1072 is CRITICAL: No successful Puppet run in the last 10 hours
[00:42:50] RECOVERY - check_job_queue on hume is OK: JOBQUEUE OK - all job queues below 10,000
[00:45:50] PROBLEM - check_job_queue on hume is CRITICAL: CHECK_NRPE: Socket timeout after 10 seconds.
[00:58:40] PROBLEM - RAID on solr3 is CRITICAL: Timeout while attempting connection
[08:56:47] PROBLEM - Puppet freshness on analytics1021 is CRITICAL: No successful Puppet run in the last 10 hours
[09:06:48] wq notpeter
[09:06:50] er :)
[09:08:58] is that 'well quit'?
[09:09:12] (sorry! :P)
[09:45:37] PROBLEM - Host mw1085 is DOWN: PING CRITICAL - Packet loss = 100%
[09:46:07] RECOVERY - Host mw1085 is UP: PING OK - Packet loss = 0%, RTA = 0.26 ms
[10:21:53] PROBLEM - Puppet freshness on mw1072 is CRITICAL: No successful Puppet run in the last 10 hours
[10:36:26] (03PS1) 10MaxSem: No need to remove hiddenStructure by now [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85651
[10:45:58] (03PS1) 10MaxSem: 5 years later, we're not interested in a query.php translator [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85652
[10:57:02] ~~-~~
[11:51:52] PROBLEM - Disk space on analytics1003 is CRITICAL: DISK CRITICAL - free space: /var/lib/hadoop/data/g 11236 MB (3% inode=99%):
[11:51:52] PROBLEM - Disk space on analytics1004 is CRITICAL: DISK CRITICAL - free space: /var/lib/hadoop/data/g 11239 MB (3% inode=99%):
[12:55:10] (03CR) 10Ottomata: "We'd prefer not to mirror. We're going to try to produce from esams directly to eqiad. I'm waiting on a fix from Magnus to try this." [operations/puppet/kafka] - 10https://gerrit.wikimedia.org/r/85148 (owner: 10Ottomata)
[12:55:39] (03CR) 10Ottomata: [C: 032] Fixed typo in README [operations/software/varnish/varnishkafka] - 10https://gerrit.wikimedia.org/r/85647 (owner: 10Edenhill)
[12:55:43] (03CR) 10Ottomata: [V: 032] Fixed typo in README [operations/software/varnish/varnishkafka] - 10https://gerrit.wikimedia.org/r/85647 (owner: 10Edenhill)
[12:58:04] (03CR) 10Ottomata: [C: 032 V: 032] Cleaned up the conf.example [operations/software/varnish/varnishkafka] - 10https://gerrit.wikimedia.org/r/85648 (owner: 10Edenhill)
[13:40:53] (03PS1) 10Hashar: ganglia wrapper for py plugins (and add diskstat plugin) [operations/puppet] - 10https://gerrit.wikimedia.org/r/85669
[13:47:11] (03CR) 10Hashar: "(1 comment)" [operations/puppet] - 10https://gerrit.wikimedia.org/r/85669 (owner: 10Hashar)
[14:05:19] hashar: I think there is a bug about such a ganglia plugin for production too
[14:06:54] Nemo_bis: yup bug is in the commit summary
[14:09:02] hashar: but it's in Labs component
[14:09:31] because ops do not use bugzilla
[14:10:06] though some of them do use it under Labs > Infrastructure :-] The original intention was to monitor disks of the beta cluster labs instances
[14:10:14] it is in Gerrit now anyway
[14:10:58] yep
[14:11:21] I might as well have imagined filing that bug report (it was after some grumbling by Tim :p)
[14:23:10] hiiii paravoid, did we ever figure out the log rotate conclusion for varnishkafka?
[16:03:55] mutante: a "free" domain costs $999 ????
[16:05:59] akosiaris: hehe, exactly my thought:)
[16:06:30] !log importing jenkins_1.509.3_all.deb with reprepro for RT #5806
[16:14:26] hasharCall: Conf jenkins (1.509.3 Wikimedia:12.04/precise-wikimedia [all])
[16:19:27] (03PS2) 10Chad: Cirrus as secondary for enwikisource/cawiki, primary for itwikitionary [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/84434
[16:19:40] yum
[16:20:33] 'wikitechwiki' => true, ?
[16:21:32] <^d> wikitechwiki isn't on the cluster.
[16:22:51] yea, it's labs controller, just wondering if we can switch that too
[16:25:38] <^d> I don't think so? I thought it wasn't allowed to reach production stuff?
[16:25:49] <^d> I mean I'm totally cool with trying :)
[16:25:58] i don't know
[16:26:38] mutante, hasharCall, want me to add you as morebots tool members so you can restart things?
[16:26:52] TV is all about broken iphone 5 fingerprint reader
[16:27:03] andrewbogott: yes, thanks
[16:27:28] <^d> mutante: US tv is all about record iPhone 5s/c sales.
[16:27:39] mark, fyi, i'm installing varnishkafka (via .deb) on cp3003 to test esams -> eqiad production
[16:27:39] <^d> *tv, internet, media in general
[16:28:00] ^d: DE tv is about Chaos Computer Club showing how to bypass the sensor, heh
[16:28:20] <^d> Yeah I saw the story :)
[16:28:22] get a 2400dpi scan, transfer to latex..
[16:28:24] you're dzahn on labs?
[16:28:27] yes
[16:29:16] ok --added you to tools & morebots
[16:31:31] <^d> mutante: I'll ask Ryan about it when he gets in. I'm totally cool with doing that though.
[16:32:18] mutante: I will upgrade Jenkins tomorrow :-D
[16:32:21] thank you both
[16:32:53] hasharCall: yea, i did not actually do it, teasing you:)
[16:33:02] puppet won't touch it, correct
[16:35:05] ottomata: ok
[16:45:02] PROBLEM - Disk space on searchidx1001 is CRITICAL: CHECK_NRPE: Socket timeout after 10 seconds.
[16:45:52] RECOVERY - Disk space on searchidx1001 is OK: DISK OK
[16:48:12] RECOVERY - check_job_queue on hume is OK: JOBQUEUE OK - all job queues below 10,000
[16:51:23] PROBLEM - check_job_queue on hume is CRITICAL: CHECK_NRPE: Socket timeout after 10 seconds.
[16:52:28] hiiii LeslieCarr, mark, should I be able to reach analytics1003.eqiad.wmnet from esams?
[16:52:34] PING analytics1004.eqiad.wmnet (10.64.21.104) 56(84) bytes of data.
[16:52:35] From vl100-ve1.csw1-esams.wikimedia.org (91.198.174.1) icmp_seq=1 Destination Net Unreachable
[16:53:55] that's from cp3003.esams.wikimedia.org
[16:53:57] mutante: yeah Jenkins new version is available for upgrade on gallium. I am mailing ops / wikitech about it :D thx!
[16:54:15] yw
[16:59:08] (03PS1) 10Aude: Update config for Wikidata and clients [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683
[16:59:43] (03CR) 10Aude: [C: 04-1] "*** important ***" [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[17:03:39] ottomata: no
[17:04:11] (03PS2) 10Aude: Update config for Wikidata and enabling on Wikimedia Commons [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683
[17:04:27] mark, how should I produce to the Kafka brokers?
[17:04:38] you can't, currently, if they're internal
[17:05:31] in general, a caching center isn't necessarily able to reach our internal networks as they aren't connected
[17:06:30] hm.
[17:06:38] what should we do? make the kafka brokers public?
[17:06:43] we are planning on connecting them, right?
[17:09:31] we are working on connecting them yes
[17:09:37] but future caching centers may again not be
[17:09:52] also, until we have two redundant links, it will probably be less reliable than what we have now
[17:11:14] hm ok
[17:11:18] so, brokers need public IPs?
[17:11:57] for the foreseeable future, it seems so...
[17:12:02] with strong firewalling I'd say
[17:12:04] ya
[17:12:42] k, I'll get LeslieCarr to help me figure that out
[17:12:46] thanks
[17:13:11] er
[17:13:19] you don't need her for that, it wouldn't be on the routers :)
[17:13:25] you should look at the 'ferm' module in puppet
[17:13:32] right, but she volunteered to be involved in the whole thing
[17:14:04] i wanted to get her to help me test the esams setup anyway, so she is involved and can help out with stuff
[17:14:11] will check out the ferm module too
[17:15:12] right
[17:18:05] greg-g: Can I haz deployment window for a centralauth update in the next day or two? Should fix bug 54119.
[17:19:55] csteipp: looks confusing... but yeah
[17:20:18] csteipp: wed at 1pm?
[17:21:05] greg-g: That's fine. It's a pretty small change with low impact, so a lightning would also work.. but Wed is fine too.
[17:21:22] ok, or today's LD
[17:21:38] Today's LD would be good, if there's room
[17:22:28] csteipp: there's room. I'll add you
[17:26:39] (03CR) 10Daniel Kinzler: [C: 031] Update config for Wikidata and enabling on Wikimedia Commons [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[17:27:59] (03CR) 10Aude: [C: 04-1] "** important ***" [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[17:35:56] (03CR) 10CSteipp: [C: 031] "Totally fine, for security, to do this. I'll let wikidata-folk time the deployment, to make sure your javascript isn't using wikidata.org." [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/56153 (owner: 10Aude)
[17:36:47] oh, that is ancient
[17:37:12] i think it's still okay to make that change
[17:52:27] (03PS1) 10Jgreen: get rid of separate spamassassin acl block [operations/puppet] - 10https://gerrit.wikimedia.org/r/85689
[17:54:11] (03CR) 10jenkins-bot: [V: 04-1] get rid of separate spamassassin acl block [operations/puppet] - 10https://gerrit.wikimedia.org/r/85689 (owner: 10Jgreen)
[18:06:33] (03CR) 10Ryan Lane: [C: 032] Simplify git-deploy configuration [operations/puppet] - 10https://gerrit.wikimedia.org/r/83046 (owner: 10Ryan Lane)
[18:07:35] !log updating git-deploy system for a more simplified configuration
[18:15:08] (03PS2) 10Jgreen: get rid of separate spamassassin acl block. jgreen loathes gerrit [operations/puppet] - 10https://gerrit.wikimedia.org/r/85689
[18:15:50] (03CR) 10jenkins-bot: [V: 04-1] get rid of separate spamassassin acl block. jgreen loathes gerrit [operations/puppet] - 10https://gerrit.wikimedia.org/r/85689 (owner: 10Jgreen)
[18:16:32] (03PS3) 10Jgreen: get rid of separate spamassassin acl block. jgreen loathes gerrit even more now [operations/puppet] - 10https://gerrit.wikimedia.org/r/85689
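(The reachability question above, ping from cp3003 failing with "Destination Net Unreachable", boils down to whether a TCP connection can be opened to a broker port from the caching site; the actual fix discussed is public IPs plus the puppet ferm module. As a minimal illustration only, here is a Python sketch of such a check; the hostname and port 9092, Kafka's default, are assumptions rather than values from the log:)

```python
import socket

def broker_reachable(host, port=9092, timeout=5):
    """Return True if a TCP connection to the Kafka broker succeeds."""
    try:
        # create_connection resolves the name and attempts the TCP handshake
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return True
    except (socket.timeout, socket.error):
        return False

# Internal .wmnet names are unreachable from esams, which is why the
# discussion settles on public IPs with strong (ferm) firewalling.
print(broker_reachable("analytics1003.eqiad.wmnet"))
```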
[18:21:27] (03PS1) 10Reedy: Wrap Elastic inclusion in file_exists for wmf17 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85693
[18:21:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 68 statistics
[18:21:58] !log reedy synchronized wmf-config/CirrusSearch-common.php
[18:23:13] !log mw1072 has a readonly file system
[18:23:22] Can someone please depool/debug mw1072
[18:23:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 82 statistics
[18:23:49] Hmm. Guess it might have already been
[18:24:01] apergos has opened an RT ticket for it, nothing in SAL though
[18:25:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 88 statistics
[18:25:33] !log reedy rebuilt wikiversions.cdb and synchronized wikiversions files: wikinews, wikivoyage, wikiversity and wikisource to 1.22wmf18
[18:28:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 67 statistics
[18:28:54] !log reedy rebuilt wikiversions.cdb and synchronized wikiversions files: wikiquote and wiktionary to 1.22wmf18
[18:30:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 1 unauthenticated, 0 locked, 0 copy to table, 85 statistics
[18:30:42] Hello! I had some questions concerning the documentation of the server roles
[18:33:01] asaifm_: You can ask in here, but some folks are in other timezones
[18:33:05] mark still there? I remember the problem I had with FoxyProxy and analytics hadoop urls
[18:33:24] hrmm, Who is on RT duty this week?
[18:33:26] asaifm_: Ask your questions...
[18:33:30] Chris was last week.
[18:33:32] !log reedy rebuilt wikiversions.cdb and synchronized wikiversions files: wikimedia and private and special wikis to 1.22wmf18
[18:33:45] ksnider: Do you recall who is on RT triage duty this week?
[18:34:16] (03PS1) 10Reedy: Non wikipedias to 1.22wmf18 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85696
[18:34:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 75 statistics
[18:34:33] yeah sure... I was wondering what are the exact functions of the text squid and bit squid servers and how do they interact with the apache servers
[18:35:01] we have three kinds of caching servers, text, bit, and upload (image)
[18:35:16] hrmm, lemme see if we have a wikitech page for this.
[18:35:24] RobH: Ack, possibly no one! We should have taken some volunteers during our meeting. :)
[18:35:26] !log upgrading wikitech to 1.22wmf18
[18:35:33] I'll send out an email.
[18:35:54] -_- where's morebots?
[18:35:57] asaifm_: did you want to know how bit and text squid differ?
[18:36:03] yes
[18:36:11] (03CR) 10Jgreen: [C: 032 V: 031] get rid of separate spamassassin acl block. jgreen loathes gerrit even more now [operations/puppet] - 10https://gerrit.wikimedia.org/r/85689 (owner: 10Jgreen)
[18:36:12] (03CR) 10Reedy: [C: 032] Non wikipedias to 1.22wmf18 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85696 (owner: 10Reedy)
[18:37:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 74 statistics
[18:37:29] text caches the text portions of the pages, while bits handles the css and other odds and ends (i may be describing this poorly)
[18:37:33] grrrr !log is not logged on the v
[18:37:35] Server Admin Log
[18:37:39] we also have upload squid, which caches the actual images
[18:38:02] Raymond_: yea, morebots isnt up
[18:38:37] RobH: ok and how long on average does the data stay on the servers before updating the actual servers with them?
[18:38:41] bits is js, css, and some images and other static content
[18:38:46] squid servers*
[18:38:59] asaifm_: mediawiki sends purges for updates to caches
[18:39:02] When an article is edited, the cache on the squids should be purged
[18:39:07] so, practically immediately
[18:39:21] ugh, morebots
[18:39:21] i dont know offhand how many ms
[18:39:31] morebots is in toollabs now, ive not restarted it there
[18:39:31] I am a logbot running on tools-exec-06.
[18:39:32] Messages are logged to wikitech.wikimedia.org/wiki/Server_Admin_Log.
[18:39:32] To log a message, type !log <msg>.
[18:39:41] jeremyb: hey, go back through the backscroll and re !log things, kthx
[18:39:47] RobH: got the main idea, thanks
[18:40:04] ^demon: ok, wikitech is running 1.22wmf18
[18:40:05] asaifm_: sorry, I'm not the best versed in it, but figured better than nothing =]
[18:40:14] * Ryan_Lane goes to upgrade static
[18:40:27] RobH: Thanks for the info
[18:41:08] (03PS1) 10Jeremyb: wikimediastories.com/net/org [operations/apache-config] - 10https://gerrit.wikimedia.org/r/85698
[18:41:18] !log upgraded wikitech to 1.22wmf18
[18:41:20] RobH, Reedy: Another question, are the wikipedias distributed over all clusters or every cluster is concerned with certain languages?
[18:41:21] Logged the message, Master
[18:41:27] PROBLEM - MySQL Processlist on db1052 is CRITICAL: CRIT 0 unauthenticated, 0 locked, 0 copy to table, 80 statistics
[18:41:31] Clusters of what?
[18:41:35] !log upgrading wikitech-static to 1.22wmf18
[18:41:38] Logged the message, Master
[18:41:54] greg-g, Is morebots broken /now/ or were you ughing about the earlier outage?
[18:42:06] urghing about earlier i assume, as it seems ok now
[18:42:17] ok.
[18:42:27] Still no idea how to make it cope better with netsplits :(
[18:42:43] asaifm_: Do you mean database clusters? We have multiple clusters, based off size of project and language
[18:42:44] http://noc.wikimedia.org/dbtree/
[18:42:47] I guess it could just restart itself periodically
[18:42:57] andrewbogott: it may be worth switching to a better irc library/framework
[18:43:23] RECOVERY - MySQL Processlist on db1052 is OK: OK 0 unauthenticated, 0 locked, 0 copy to table, 0 statistics
[18:43:23] !log Created sites and site_identifier tables on commonswiki
[18:43:24] I thought we just did that...
[18:43:26] Logged the message, Master
[18:43:36] did we?
[18:43:42] it's still using irclib isn't it?
[18:44:08] (03CR) 10Reedy: "recheck" [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85696 (owner: 10Reedy)
[18:44:11] * andrewbogott looks
[18:44:25] andrewbogott: RobH yeah, missed his(?) join 5 minutes before I said ugh
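(On the purge flow RobH and Reedy describe above: in production MediaWiki sends multicast HTCP purge packets to the caches, but the idea is easiest to illustrate with a plain HTTP PURGE request. A simplified sketch only; the cache hostname is hypothetical:)

```python
import http.client

def purge(cache_host, path):
    """Tell one cache server to drop its copy of a URL (simplified;
    production MediaWiki sends multicast HTCP purges instead)."""
    conn = http.client.HTTPConnection(cache_host, 80, timeout=5)
    conn.request("PURGE", path)
    status = conn.getresponse().status
    conn.close()
    return status  # 200 when an object was purged, 404 when nothing was cached

# Hypothetical cache host: after an edit, purging the article URL is what
# makes "practically immediately" true for readers.
print(purge("sq37.wikimedia.org", "/wiki/Example"))
```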
[18:44:25] it is
[18:44:27] (03CR) 10Jeremyb: "try again: Id9882e1a8a5927a7074e9d6de386b2d160312de5" [operations/apache-config] - 10https://gerrit.wikimedia.org/r/84902 (owner: 10RobH)
[18:44:29] i rewrote logmsgbot
[18:44:34] but not morebots
[18:44:37] Hm, how to make gerrit show me patch history of a certain project
[18:44:44] is using irclib, I mean.
[18:45:02] andrewbogott: use 'project:'
[18:45:04] ori-l, ok, that's probably what I'm thinking of.
[18:45:09] sorry disconnected
[18:45:10] We'll see which of us gets to morebots first
[18:45:17] Reedy: you have several clusters: eqiad, pmtpa, esams etc
[18:45:27] andrewbogott: project:operations/debs/adminbot
[18:45:32] etc?
[18:45:34] That's it currently
[18:45:50] greg-g: erm, too much stuff to do right now, maybe later
[18:46:02] jeremyb: I forgot the ";)"
[18:46:10] greg-g: :)
[18:46:18] jeremyb: why are you reopening my patchsets?
[18:46:18] asaifm_: There is only really the database servers that are separated in any way. Any wiki can run on any apache (bar testwiki on mw1017)
[18:46:30] oh, linking a merge
[18:46:38] RobH: i'm not sure what the original problem was with that patchset but sent a new one. regex was based on wikimediafoundation.com
[18:46:52] problem was it wasnt parsing, but i didnt get a lot more into it
[18:47:08] cuz i didnt wanna leave it busted, i thought you were reopening a ps, you were just linking, disregard, continue on!
[18:47:11] =]
[18:47:16] Reedy: So, any live wikipedia could be running from any cluster?
[18:47:16] ok, well this one is already used elsewhere so it should work! :)
[18:47:26] No
[18:47:32] esams is only caching
[18:47:42] pmtpa is currently the backup site and not used
[18:48:20] Reedy: what about eqiad?
[18:48:24] RobH: i don't think i *can* reopen? i probably would have if i could. (that keeps everything in one place)
[18:48:54] asaifm: What about it?
[18:49:02] It's neither a backup site, nor a caching only site
[18:49:10] Making it the "primary cluster"
[18:49:33] Reedy: ok, understood the main idea and job of eqiad.
[18:50:08] !log enabling cirrussearch for wikitech
[18:50:11] Logged the message, Master
[18:50:38] (03CR) 10Reedy: [V: 032] Non wikipedias to 1.22wmf18 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85696 (owner: 10Reedy)
[18:50:39] Ryan_Lane: is that the main cluster or there's a separate wikitech elastic cluster?
[18:51:00] main cluster
[18:51:02] * Reedy kicks grrrit-wm
[18:51:22] Reedy: Here is my concern and the reason for asking all of these questions. The eqiad cluster is the only cluster from which one can fetch featured articles on the arabic wikipedia and upon updating any of those articles the data is saved but at the same time the regular user would get a timeout error
[18:51:25] (03PS2) 10Reedy: Non wikipedias to 1.22wmf18 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85696
[18:51:56] Why would they?
[18:52:24] (03PS1) 10Jgreen: cleanup whitespace on exim4.conf.SMTP_IMAP_MM.erb [operations/puppet] - 10https://gerrit.wikimedia.org/r/85701
[18:52:56] jeremyb: the workflow is a bit odd, i'll give you that.
[18:53:51] Reedy, RobH: Is it ok if I update your provided info on this page? https://wikitech.wikimedia.org/wiki/Server_roles
[18:54:08] uh, that page was removed as its outdated
[18:54:18] and the puppet manifests document the server roles better.
[18:54:53] (03CR) 10Jgreen: [C: 032 V: 031] cleanup whitespace on exim4.conf.SMTP_IMAP_MM.erb [operations/puppet] - 10https://gerrit.wikimedia.org/r/85701 (owner: 10Jgreen)
[18:54:57] RobH: Could you provide a link?
[18:55:20] asaifm: no single link
[18:55:31] site.pp in our puppet repo lists every server we run
[18:55:40] (well, 90%)
[18:55:53] RobH: Thanks, it is a start
[18:55:54] so its a document of what servers serve what roles
[18:55:57] lemme see
[18:56:11] https://wikitech.wikimedia.org/wiki/Git
[18:56:22] That walks you through how to clone a copy of any of our repos
[18:56:31] the repo you would be interested in is operations/puppet
[18:57:02] PROBLEM - Puppet freshness on analytics1021 is CRITICAL: No successful Puppet run in the last 10 hours
[18:57:27] asaifm: So in that repo, there are manifests/site.pp which is the core manifest we build off of
[18:57:41] that lists every server, and what it does, and how it does it, via linking to other docs
[18:58:00] keeping a single wikitech page up to date manually as servers changed roles was unmanageable
[18:58:15] (now if we had it parse puppet manifest and output a list, thats something else ;)
[18:59:03] RobH, Reedy: Thanks for all the guidelines, pointers and info. Will check the puppet manifest and let you know if something comes up.
[18:59:31] yea no problem, in future just ask questions and folks tend to read and answer as they see them
[19:00:02] also #wikimedia-tech tends to have a larger population of volunteers and the like who may also have info (this channel tends to stay very ops centric)
[19:00:06] either place is fine to ask though
[19:00:42] I'd link to a 'state of operations' talk but pretty sure no one gave one at last wikimania
[19:00:52] so any link is years outdated and therefore near useless
[19:14:49] (03PS1) 10Petr Onderka: Added documentation comments [operations/dumps/incremental] (gsoc) - 10https://gerrit.wikimedia.org/r/85707
[19:15:00] (03CR) 10Petr Onderka: [C: 032 V: 032] Added documentation comments [operations/dumps/incremental] (gsoc) - 10https://gerrit.wikimedia.org/r/85707 (owner: 10Petr Onderka)
[19:16:22] yay!
[19:19:49] (03PS1) 10Edenhill: Default explicitly configured numeric fields to 0 instead of -. [operations/software/varnish/varnishkafka] - 10https://gerrit.wikimedia.org/r/85710
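(RobH's aside above, about parsing the manifest instead of hand-maintaining a wiki page, is easy to sketch. A rough Python pass over site.pp that lists node blocks and the classes included under them; real node definitions use regexes and inheritance, so this is only a first approximation, and all names here are hypothetical:)

```python
import re

def list_server_roles(site_pp_path):
    """Map node names in a Puppet site.pp to the classes they include."""
    roles = {}
    current = None
    with open(site_pp_path) as f:
        for line in f:
            node = re.match(r"\s*node\s+['\"]?([\w.*-]+)", line)
            if node:
                current = node.group(1)
                roles[current] = []
            elif current:
                inc = re.match(r"\s*include\s+([\w:]+)", line)
                if inc:
                    roles[current].append(inc.group(1))
    return roles

for host, classes in sorted(list_server_roles("manifests/site.pp").items()):
    print(host, "->", ", ".join(classes) or "(no classes found)")
```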
[19:25:07] (03CR) 10Reedy: [C: 032 V: 032] Non wikipedia wikis to 1.22wmf18 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85696 (owner: 10Reedy)
[19:25:09] (03CR) 10Aude: "sites tables are ready" [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[19:25:28] (03PS1) 10Jgreen: add comment for malware discard [operations/puppet] - 10https://gerrit.wikimedia.org/r/85713
[19:27:14] (03CR) 10Reedy: [C: 032] Wrap Elastic inclusion in file_exists for wmf17 [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85693 (owner: 10Reedy)
[19:28:58] (03PS3) 10Reedy: Update config for Wikidata and enabling on Wikimedia Commons [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[19:29:05] (03CR) 10Reedy: [C: 032] Update config for Wikidata and enabling on Wikimedia Commons [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[19:32:17] (03CR) 10Jgreen: [C: 032 V: 031] add comment for malware discard [operations/puppet] - 10https://gerrit.wikimedia.org/r/85713 (owner: 10Jgreen)
[19:33:05] (03Merged) 10jenkins-bot: Update config for Wikidata and enabling on Wikimedia Commons [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85683 (owner: 10Aude)
[19:35:21] !log reedy synchronized wmf-config/
[19:35:23] hashar, regarding https://rt.wikimedia.org/Ticket/Display.html?id=5394 -- when I build from your repo I still get files marked 0.6.0. Does that mean I'm doing something wrong, or misunderstanding something?
[19:35:32] Logged the message, Master
[19:36:31] andrewbogott: hoo
[19:36:38] !log reedy synchronized database lists files:
[19:36:49] Logged the message, Master
[19:36:52] hoo?
[19:37:04] andrewbogott: ah yeah operations/debs/jenkins-debian-glue , I guess git build package attempts to use the upstream branch which I probably forgot to update
[19:37:13] andrewbogott: on labs I am using the upstream deb repo directly :-D
[19:38:05] Do you want to twiddle with the wikimedia repo until it builds the proper version? Then I can put the packages on brewster.
[19:38:29] I will simply push the version I need :D
[19:41:09] andrewbogott: the master branch in Gerrit points to the version I need (sha1 7ab545a ), but git build package takes the version from debian/changelog
[19:43:06] I don't understand… shouldn't the changelog reflect whatever version of the code is checked out?
[19:44:15] andrewbogott: upstream only updates the changelog entry when it tags a new version.
[19:44:32] Right, but it sounds like you're saying the changelog postdates the branch version
[19:44:43] In any case… you can just change the changelog, right?
[19:45:00] Well, you or I, but it's more likely to say something useful if you do it
[19:45:04] dch -v `git describe`
[19:45:05] :)
[19:45:35] git-dch
[19:45:55] oh
[19:46:31] !log reedy synchronized wmf-config 'touch'
[19:46:43] Logged the message, Master
[19:47:12] andrewbogott: so I guess git-dch before building and that will do it.
[19:47:26] ok, trying...
[19:48:08] Any idea what ubuntu package includes git-dch?
[19:49:38] $ git-dch -> gbp: error: Version 0.6.0 not found
[19:50:04] git-buildpackage
[19:50:16] yep, found, installed, now it fails :(
[19:50:18] http://packages.ubuntu.com/precise/git-buildpackage
[19:50:20] * andrewbogott is helpless
[19:55:20] (03CR) 10Ottomata: [C: 032 V: 032] Default explicitly configured numeric fields to 0 instead of -. [operations/software/varnish/varnishkafka] - 10https://gerrit.wikimedia.org/r/85710 (owner: 10Edenhill)
[19:55:28] Snaps_: ^ thanks :)
[19:58:14] :)
[19:59:41] hashar… ?
[19:59:47] andrewbogott: yup?
[20:00:04] um… ^
[20:00:17] gbp: error: Version 0.6.0 not found
[20:00:40] just edit the change log manually with dch -v `git describe`
[20:00:48] I have no better advice
[20:01:05] that change should be checked in though, shouldn't it?
[20:01:10] the only added value I have when using git build package is copy pasting from google result to my labs terminal
[20:01:36] ideally we would rename master to upstream
[20:01:42] (03CR) 10Chad: [C: 032] Cirrus as secondary for enwikisource/cawiki, primary for itwikitionary [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/84434 (owner: 10Chad)
[20:01:55] and have a master branch that would be a fork of upstream with a single change that will be bumping the version in dch
[20:02:24] I have no clue how to handle that with git buildpackage though. I can surely look it up but that is going to take me half a day =)
[20:05:05] !log demon synchronized wmf-config/InitialiseSettings.php 'Cirrus as primary for itwikt, secondary on cawiki and enwikisource'
[20:05:21] Logged the message, Master
[20:05:29] <^demon> mw1072 is complaining about having a r/o filesystem.
[20:06:20] mmm
[20:06:30] Ariel created an rt ticket for it already
[20:08:01] <^demon> mmk
[20:08:21] <^demon> manybubbles: Did you already create cawiki/enwikisource indexes? I'm getting some yelling from the script.
[20:08:37] <^demon> http://p.defau.lt/?oHZ_kvvrJEQWRBh5_50iGA
[20:09:40] ^demon: I didn't but I can take a guess. I bet because it has been synced those machines are already sending updates.
[20:09:52] !log mw1072 has a r/o filesystem
[20:09:58] (noting morebots was broken)
[20:10:02] Logged the message, Master
[20:10:08] <^demon> manybubbles: So it should be ok as-is, right?
[20:10:13] ^demon: safest thing is to do a '--indexIdentifier now' to get a fresh one
[20:10:28] ^demon: prolly not.
[20:11:04] ^demon: another thing that might be happening is that the alias name is taken by the index. then you'd have to manually delete it
[20:11:08] (03PS1) 10Andrew Bogott: Mark the debian changelog for Hashar's special build [operations/debs/jenkins-debian-glue] - 10https://gerrit.wikimedia.org/r/85766
[20:11:12] I don't think the script handles that yet.
[20:11:30] (03CR) 10Andrew Bogott: [C: 032 V: 032] Mark the debian changelog for Hashar's special build [operations/debs/jenkins-debian-glue] - 10https://gerrit.wikimedia.org/r/85766 (owner: 10Andrew Bogott)
[20:11:34] `curl -XDELETE testsearch1001:9200/cawiki` should do you
[20:11:48] andrewbogott: \O/
[20:11:49] and I can file a bug to handle this.
[20:12:52] <^demon> Bah, and it's saying index missing.
[20:13:26] hashar, is there a labs project I can upload these to for you to test?
[20:14:01] <^demon> manybubbles: I'm kinda at a loss here.
[20:14:14] andrewbogott: integration-debian-builder.pmtpa.wmflabs
[20:14:25] ^demon: I got bash search machine!
[20:15:24] <^demon> huh?
[20:16:04] hashar, /data/project/test-this-stuff/
[20:17:31] andrewbogott: luckily the files are small enough that hopefully GlusterFS is not going to corrupt them *grin*
[20:17:33] andrewbogott: thx!
[20:18:35] ^demon: got it.
[20:19:03] (03PS9) 10Hashar: Jenkins validation (please ignore) [operations/debs/pybal] - 10https://gerrit.wikimedia.org/r/84932
[20:19:19] <^demon> manybubbles: enwikisource is fine, I got it fixed.
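(manybubbles' failure mode above: with live search updates enabled, an index gets auto-created under the very name the script wants to use for its alias, and the workaround is the manual `curl -XDELETE`. A Python sketch of the same cleanup; the host and index name mirror the curl command, while the alias check guarding the delete is an assumption about how one might make it safer:)

```python
import json
import urllib.request

ES = "http://testsearch1001:9200"

def delete_stray_index(name):
    """Delete an index that squatted on the name intended for an alias.

    Guard: only delete if the name resolves to a concrete index with no
    aliases, i.e. it was auto-created by stray updates.
    """
    with urllib.request.urlopen("%s/%s/_aliases" % (ES, name)) as resp:
        info = json.load(resp)
    if name in info and not info[name].get("aliases"):
        req = urllib.request.Request("%s/%s" % (ES, name), method="DELETE")
        return urllib.request.urlopen(req).status
    return None  # looks like a real aliased index; leave it alone

delete_stray_index("cawiki")
```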
[20:19:22] ^demon: I imagine we'll want something we can run that blows away all the search indices. in other words --rebuild should do it.
[20:19:25] <^demon> cawiki's the only one acting up still.
[20:19:28] ^demon: I've got it
[20:19:31] <^demon> mmk
[20:20:57] RECOVERY - check_job_queue on fenari is OK: JOBQUEUE OK - all job queues below 10,000
[20:22:14] ^demon: been made to behave
[20:22:27] PROBLEM - Puppet freshness on mw1072 is CRITICAL: No successful Puppet run in the last 10 hours
[20:23:44] andrewbogott: works for me. I am just worried about the version number
[20:23:53] ^demon: if search updates are on you can end up with indices already created when you go try to make them. They are created with the name of the alias, inconveniently, and the script doesn't know how to fix that.
[20:23:56] how so?
[20:24:07] PROBLEM - check_job_queue on fenari is CRITICAL: CHECK_NRPE: Socket timeout after 10 seconds.
[20:24:23] <^demon> manybubbles: Mmk. I've started the indexing process.
[20:24:29] thanks
[20:24:51] andrewbogott: I have no clue how Debian handles it usually, but I think you want some ~git or something in the version string. Might want the date as well sorry :/
[20:25:11] if you tell me exactly what you would like the version string to be, I will use that.
[20:25:18] Having dashes causes the build process to get upset though
[20:26:37] tildes, so?
[20:27:33] Not sure, I can try it.
[20:28:16] ^demon: https://bugzilla.wikimedia.org/show_bug.cgi?id=54481
[20:28:43] <^demon> Cool beans
[20:29:16] andrewbogott: lets go with 0.6.0~20130919.git7ab545a-1 (upstream version, date of commit, sha1, incremental revision)
[20:29:34] ok, trying
[20:29:44] andrewbogott: I have found some other package using that scheme, ex: 1.6~git20120311.dfsg.1-2ubuntu0.1
[20:30:31] andrewbogott: this way we have the date/sha1 of commit and can easily upgrade the package later on
[20:30:37] I know not why, but with the -1 it complains "This package has a Debian revision number but there does not seem to be
[20:30:38] an appropriate original tar file or .orig directory in the parent directory;"
[20:30:41] The ~ works though
[20:30:50] ahh
[20:30:54] that is why I never use dch :-]
[20:32:25] (03PS1) 10Andrew Bogott: A different version tag, I guess? [operations/debs/jenkins-debian-glue] - 10https://gerrit.wikimedia.org/r/85770
[20:32:31] hashar, can you live with this one? ^
[20:33:54] andrewbogott: we will survive :-]
[20:34:29] (03CR) 10Hashar: [C: 031] "That will surely let us upgrade properly later on :) Thanks!" [operations/debs/jenkins-debian-glue] - 10https://gerrit.wikimedia.org/r/85770 (owner: 10Andrew Bogott)
[20:34:56] andrewbogott: I got pybal packages building automatically in Jenkins now :-D
[20:38:16] (03CR) 10Andrew Bogott: [C: 032 V: 032] A different version tag, I guess? [operations/debs/jenkins-debian-glue] - 10https://gerrit.wikimedia.org/r/85770 (owner: 10Andrew Bogott)
[20:43:46] hashar: OK, those packages should be available now. Lemme know if you run into trouble.
[20:44:06] andrewbogott: yeah could you expand my day to 30 hours ? :D
[20:44:20] I can, but you won't have so many of them
[20:44:29] ah that sucks
[20:44:53] yahhhh its in apt :-D
[20:45:47] (03PS10) 10Hashar: Jenkins validation (please ignore) [operations/debs/pybal] - 10https://gerrit.wikimedia.org/r/84932
[20:46:17] !log reedy synchronized php-1.22wmf18/extensions/UploadWizard
[20:46:21] andrewbogott: rebuilding pybal ^^^, if that works (and it will) we are set. Thank you very much!
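(The snapshot scheme hashar and andrewbogott settle on above, `0.6.0~20130919.git7ab545a-1` (upstream version, tilde, commit date, short sha1, Debian revision), works because `~` sorts lower than anything in dpkg version comparison, so the snapshot upgrades cleanly to a real 0.6.0 later. A small hedged sketch of generating such a string from git; the repo path and base version are inputs, not values from the log:)

```python
import subprocess

def snapshot_version(repo, upstream="0.6.0", debian_rev=1):
    """Build a Debian-style snapshot version: upstream~date.gitSHA-rev."""
    date = subprocess.check_output(
        ["git", "-C", repo, "log", "-1", "--format=%cd", "--date=format:%Y%m%d"]
    ).decode().strip()
    sha = subprocess.check_output(
        ["git", "-C", repo, "rev-parse", "--short", "HEAD"]
    ).decode().strip()
    # '~' sorts before everything in dpkg comparisons, so this version
    # stays below the eventual tagged 0.6.0 release.
    return "%s~%s.git%s-%d" % (upstream, date, sha, debian_rev)

# Usage: feed the result to `dch -v` before running git-buildpackage.
print(snapshot_version("."))
```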
[20:46:31] Logged the message, Master
[20:48:00] !log reedy synchronized wmf-config/ 'touch'
[20:48:13] Logged the message, Master
[20:53:25] andrewbogott: works for me. You can take the credits in closing https://rt.wikimedia.org/Ticket/Display.html?id=5394
[20:53:36] ok!
[20:55:18] chr
[20:55:36] (03PS2) 10Chad: Cirrus on all closed wikis [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85239
[20:55:37] (03PS1) 10Chad: testwiki on Cirrus [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85773
[20:55:47] (03CR) 10Chad: [C: 032] testwiki on Cirrus [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85773 (owner: 10Chad)
[20:56:29] (03CR) 10jenkins-bot: [V: 04-1] testwiki on Cirrus [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85773 (owner: 10Chad)
[20:57:05] (03CR) 10Chad: [V: 032] testwiki on Cirrus [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85773 (owner: 10Chad)
[20:57:38] !log demon synchronized wmf-config/InitialiseSettings.php
[20:57:50] Logged the message, Master
[21:00:44] (03PS1) 10Ori.livneh: Add 'webperf' module; provision asset-check service [operations/puppet] - 10https://gerrit.wikimedia.org/r/85774
[21:00:54] paravoid: around?
[21:02:33] paravoid: guess not. check out that patch when you're back if you get a chance
[21:02:41] (03CR) 10jenkins-bot: [V: 04-1] Add 'webperf' module; provision asset-check service [operations/puppet] - 10https://gerrit.wikimedia.org/r/85774 (owner: 10Ori.livneh)
[21:03:14] hashar: whoa, in-line pep8 annotations in jenkins! nice.
[21:03:23] ori-l: oh really?
[21:03:30] https://integration.wikimedia.org/ci/job/operations-puppet-pep8/3772/violations/file/modules/webperf/files/asset-check.py/?
[21:05:20] (03PS2) 10Ori.livneh: Add 'webperf' module; provision asset-check service [operations/puppet] - 10https://gerrit.wikimedia.org/r/85774
[21:05:45] ori-l: yeah that works sometimes
[21:06:34] ori-l: and sometimes says "no violations found". Ideally I would write a post processing script that would send back the review directly in Gerrit inline comment and -1 "Jenkins-bot says: sorry for nitpicking but would you mind fixing the 1490 errors I reported inline? thx"
[21:08:02] speaking of which -- hashar / Krinkle -- check out https://github.com/ariya/phantomjs/wiki/Network-Monitoring
[21:09:02] I've used those hooks to look for 404s
[21:09:31] nice
[21:09:50] the puppet patch above configures a service that reports static asset payload size to ganglia using the same technique
[21:10:02] but catching things in CI would be nicer
[21:10:35] how do you find the time to implement all that stuff?
[21:12:10] my output is like 0.25 Krinkle units
[21:12:23] Ku
[21:14:43] ori-l: Hm.. are the phantomjs monitor and ganglia related? Or is what something else?
[21:14:53] that*
[21:15:28] Ah, they are related.
[21:15:32] yeah, the webperf thing writes the payload size (broken down by asset type: css/js/image) to ganglia
[21:16:02] it's a bit gross because it's a) only the main page of b) only a handful of popular wikis and c) the URLs are hard-coded into the script. but i want to see if it's useful before building it out into something more elaborate.
[21:16:14] ori-l: Interesting, so this would be running against a production url? nice.
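(asset-check.py itself is not quoted in the log; the real service drives phantomjs and its network-monitoring hooks. As a rough stand-in for what ori-l describes, payload size broken down into css/js/image, here is a Python sketch that fetches a list of asset URLs and tallies bytes by type; the URL list and bucket table are assumptions:)

```python
import urllib.request

BUCKETS = {"text/css": "css", "application/javascript": "js",
           "image/png": "image", "image/jpeg": "image", "image/gif": "image"}

def payload_by_type(asset_urls):
    """Sum fetched bytes per asset class (css/js/image/other)."""
    totals = {}
    for url in asset_urls:
        with urllib.request.urlopen(url) as resp:
            ctype = resp.headers.get("Content-Type", "").split(";")[0]
            size = len(resp.read())  # count actual bytes; Content-Length may be absent
        bucket = BUCKETS.get(ctype, "other")
        totals[bucket] = totals.get(bucket, 0) + size
    return totals  # these per-type totals are what would be reported to ganglia

# Placeholder asset list; the real script collects these from a page load.
print(payload_by_type(["https://example.org/style.css"]))
```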
[21:17:18] heh, i thought "uhoh, krinkle is reading my JS", and i instantly found a bug
[21:17:57] (03PS3) 10Ori.livneh: Add 'webperf' module; provision asset-check service [operations/puppet] - 10https://gerrit.wikimedia.org/r/85774
[21:18:05] I'll keep the tab open a bit longer and come back in a minute then :)
[21:18:19] you caught me on a moment where I had only 1 other tab open so I was already reading into it
[21:18:20] o hai Krinkle
[21:18:30] on a rare moment*
[21:19:09] ori-l: Aw, now I don't get to make my whitespace joke :(
[21:19:27] what's the whitespace joke?
[21:19:31] Ace of space.
[21:19:52] I admit, it's not a very good joke :P
[21:20:19] hahaha
[21:33:03] Krinkle about the issue I mentioned yesterday
[21:33:13] I filed this: https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortening_system_for_Wikimedia_sites_to_support_QR_codes
[21:33:23] I noticed it's something you already talked about before
[21:34:01] I talked about https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortener
[21:34:26] ya I noticed it later
[21:34:30] but this isnt the same as that
[21:34:45] I think the two rfcs can be merged though
[21:34:57] I think my idea builds on that
[21:35:03] i am unsure if you'd agree
[21:35:18] As for QR codes specifically, I personally don't believe in their use and have to this day refused to ever scan one as a requirement.
[21:35:39] not that
[21:35:42] for redirection
[21:35:46] sure
[21:36:00] PLLCCCCCC seems like a good codec to me
[21:36:09] will we really have more than 36 projects?
[21:36:16] The URL_shortener RFC focussed on the backend that will enable the ability to map an ID to a stable title
[21:36:25] There are 800+ projects already
[21:36:35] there are 800+ editions
[21:36:45] because both page id and page title are *not* reliable identifiers and should not be used in permanent urls.
[21:36:47] vast majority are wikipedia
[21:37:11] why wouldnt page id be reliable?
[21:37:19] I'm not going to explain that now
[21:37:36] all you need to know is the ShortURL extension adds an extra ID (short_id) that maps to a namespace + title.
[21:37:36] you mean in the event the page is moved?
[21:37:39] no
[21:37:42] doesn't page id change after deletion and undeletion?
[21:37:48] sometimes.
[21:38:07] Krinkle short_id seems to be for something thats more human usable.
[21:38:13] anyway, we want to link to a page title, not page id. Because articles can be renamed and change subject (e.g. disambiguation, split, merge)
[21:38:35] with the ShortURL extension as backend, we have a numerical ID for a page name.
[21:38:43] Krinkle what I wish is to have a system for QR codes, having a short url is a component of it
[21:40:02] then on top of that we can create an interface to access that id (e.g. instead of en.wikipedia.org/wiki/Special:ShortUrl/36 or en.wikipedia.org/s/36) we'd use wmf.ly/wen36 or a higher base code even (we can use base 36 or 48)
[21:41:18] Krinkle I was thinking of PLLCCCCCC where P is the project code (wikipedia, wikimedia, wiktionary, wikinews, etc) LL is the language code (en, fr, ru etc) and CCCCCC the code that relates to the page
[21:41:29] however the 1 or more characters after project+languagecode can be of theoretically unlimited size.
[21:41:30] base 36 seems to be more than sufficient
[21:41:39] as anything more wouldnt work well with QR codes
[21:42:19] Krinkle I want to use version 1 of qr codes as its the simplest
[21:42:29] with 25% error correction
[21:42:37] ToAruShiroiNeko: Why not version 2 or 3?
[21:42:45] https://www.wikidata.org/wiki/Wikidata:Project_chat#Article_specific_QR_codes
[21:42:51] gets kind of too complicated for smaller displays
[21:43:06] my phone has difficulty reading the version 3
[21:43:10] version 1 is 4 characters?
[21:43:54] its 27 characters for version 1
[21:44:08] https://developers.google.com/chart/infographics/docs/qr_codes
[21:44:11] google has a nice table
[21:44:32] v1 can handle 41 chars if you dont have much of an error correction
[21:44:48] 41 digits
[21:44:54] 25 characters
[21:45:01] right
[21:46:20] http://wmf.ly/PLLCCCCCC - 23 chars
[21:47:00] could go for version 2 for the purpose of error correction
[21:47:41] point is I dont want to go for higher complexities if possible
[21:48:46] Krinkle my concept is someone in an office environment scanning their screen to load an article on their phone OR do so on a wikipedia print out
[21:49:08] so that's a max of 2,176,782,336, right?
[21:49:12] consider a printed encyclopedia with each article having a current version link
[21:49:14] 36 ^ 6
[21:50:33] there are about 40,600,000 pages on en.wp
[21:51:01] 2176782335 indeed
[21:55:09] so a 6-char base 36 schema would work well
[21:55:45] ToAruShiroiNeko: Hm.. do all base 48 characters fall within the alphanumeric range that all QR supports?
[21:55:54] that would allow us to buy a few extra characters
[21:56:02] (0 to 9, A to Z, space, $ % * + - . / :)
[21:56:31] we could use chars like $%*+-: but I think that may be problematic
[21:56:31] ^demon: my calendar just told me our deployment window is in 5 minutes. I imagine you've already done everything though.
[21:56:44] <^d> Um, I thought it was earlier?
[21:56:54] <^d> One of us misread a calendar.
[21:56:59] I am glad you two dont handle the space program :p
[21:57:14] ^demon: sweet. I thought it was 3 sf time
[21:57:44] no problem, I guess. I'd still like to do a deploy, however minimal.
[21:57:47] <^d> manybubbles: I have 1pm sf :)
[21:57:50] Krinkle char countwise we can add more base36 I suppose
[21:57:57] so 7 digits instead of 6
[21:58:00] sure
[21:58:10] <^d> manybubbles: So yeah, I deployed awhile ago. Indexes are building for cawiki and enwikisource, gonna take awhile.
[21:58:17] you could go for 29 with ver 2 (25% correction)
[21:58:20] that is assuming we can get a 6 char domain
[21:58:23] ^demon: sounds good :) thanks again
[21:58:43] Krinkle isnt page ID the number of existing pages (deleted or otherwise)
[21:58:48] <^d> np. tomorrow is 9am sf, noon your time.
[21:58:49] that includes all namespaces
[21:59:00] <^d> I can let you do that one.
[21:59:00] ToAruShiroiNeko: Yes
[21:59:34] ToAruShiroiNeko: Latest page
[21:59:34] https://en.wikipedia.org/w/index.php?title=Father_figure_(disambiguation)&oldid=574243487
[21:59:41] Page id 40613247
[22:00:06] so ~ 40,613,247
[22:00:45] Not even 100M
[22:01:32] base_convert('40613247', 10,36) == "o6hdr"
[22:01:40] ya
[22:02:36] not even using 6 digits yet
[22:03:28] 5 can go up to 60,466,175
[22:03:48] so could handle even a ~50% growth
[22:04:05] so I think we can avoid going for higher complexity qr versions
[22:07:18] ToAruShiroiNeko: btw, http://my.register.ly/whois.php?domain=wmf.ly
[22:07:26] It has been reserved
[22:08:03] ah. ours?
[22:08:16] No, it has been reserved by the LY system in general
[22:08:36] thats no good
[22:08:38] It isn't free for open signup. You have to specifically ask them for it as an organisation
[22:08:53] oh, so wmf can ask for it
[22:08:58] It is to avoid domain parking and stuff like that. It's a good thing.
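(The arithmetic in the exchange above checks out and is easy to verify: base 36 turns en.wp's highest page id into five characters, six characters top out at 36^6 - 1 = 2,176,782,335, and the wmf.ly URL stays inside QR version 1's 25-character alphanumeric budget. A short Python confirmation, pure arithmetic with no assumptions beyond the numbers already quoted:)

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyz"

def to_base36(n):
    digits = ""
    while n:
        n, r = divmod(n, 36)
        digits = ALPHABET[r] + digits
    return digits or "0"

assert to_base36(40613247) == "o6hdr"      # matches the base_convert above
assert int("o6hdr", 36) == 40613247        # and it round-trips
assert 36 ** 6 - 1 == 2176782335           # six-char ceiling quoted above
assert 36 ** 5 - 1 == 60466175             # five chars already cover en.wp
print(len("http://wmf.ly/PLLCCCCCC"))      # 23, under v1's 25-char budget
```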
[22:09:03] yes yes
[22:09:11] domain parkers are the scum of the earth :p
[22:09:25] e.g. to avoid a dude from buying a-za-z.ly and selling it to real local Libyan organisations
[22:09:43] we can also use wmf.co
[22:10:16] same length would work
[22:10:16] Slightly less sketchy
[22:10:34] it doesnt even have to be wmf for the purpose of a redirect
[22:10:57] would be nice but not always possible
[22:11:12] wmf.co is up for auction at GoDaddy and others. 90 days left. Current bid is € 35
[22:11:34] I wonder if the foundation is interested
[22:11:36] probably not
[22:11:39] https://www.wikidata.org/wiki/Wikidata:Project_chat#Article_specific_QR_codes
[22:11:49] thats what I envisioned for QR codes, kinda sorta
[22:12:29] the qr code would link to the article it is seen at
[22:12:42] if its a diff, a higher version qr code can be used
[22:13:01] but I am not too concerned with that
[22:20:57] we should probably avoid wmf.uk
[22:21:02] but there's a lot of options
[22:21:04] http://www.mywebsitetool.com/en/domains/int_reg.html
[22:21:22] Although I think wmf.co would be nicest
[22:22:22] note that .co also takes single letter names
[22:22:24] like g.co
[22:22:41] Although getting those is likely going to be harder (e.g. w.co)
[22:22:58] ya
[22:23:26] ICANN changed their rules as you well know probably
[22:23:32] so we could get just "wmf" too
[22:31:32] a couple of trivial merges? https://gerrit.wikimedia.org/r/#/c/85630/ and https://gerrit.wikimedia.org/r/#/c/85629/
[22:31:38] Ryan_Lane: ^ Coren ^
[22:37:38] (03CR) 10Ryan Lane: [C: 032] Ensure that a matching version of JDK is present for the JRE [operations/puppet] - 10https://gerrit.wikimedia.org/r/85630 (owner: 10Yuvipanda)
[22:37:53] Ryan_Lane: can https://gerrit.wikimedia.org/r/#/c/85774/ piggy-back?
[22:39:23] (03PS4) 10Ryan Lane: toollabs: replace snarky comment [operations/puppet] - 10https://gerrit.wikimedia.org/r/85629 (owner: 10Yuvipanda)
[22:39:42] you'll need to give me a min on that
[22:39:45] it's not simple ;)
[22:39:58] yeah, np
[22:42:41] (03CR) 10Ryan Lane: [C: 032] toollabs: replace snarky comment [operations/puppet] - 10https://gerrit.wikimedia.org/r/85629 (owner: 10Yuvipanda)
[22:47:26] (03PS1) 10Bsitu: Enable Echo and Thanks on Various wikis: [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85778
[22:47:44] Krinkle so do you think the two ideas can be merged?
[22:48:17] ToAruShiroiNeko: I'd keep them separate
[22:48:26] ToAruShiroiNeko: But we can have one more clearly use the other
[22:49:28] I'll do an edit pass on both later today to update them and incorporate this
[22:49:39] (03CR) 10Bsitu: [C: 04-2] Enable Echo and Thanks on Various wikis: [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85778 (owner: 10Bsitu)
[22:50:35] (03CR) 10Ryan Lane: [C: 032] Add 'webperf' module; provision asset-check service [operations/puppet] - 10https://gerrit.wikimedia.org/r/85774 (owner: 10Ori.livneh)
[22:51:00] ori-l: the python took me a bit to trace the code for shelling out, looks good, though
[22:51:46] merged all the way through
[22:51:51] woot, thanks very much
[22:52:30] Krinkle well yes
[22:52:37] qr code needs short URLS you seek short urls
[22:53:01] designing a second short url system sounds silly
[22:53:08] I'm limiting the scope of the existing RFC a bit and moving parts of it to yours and merging that
[22:53:14] We're not going to have 2 systems.
[22:53:32] indeed
[22:53:46] but my requirement is low character limit for the actual url
[22:54:05] perhaps something like my encoding to have it work on all projects seamlessly
[22:54:19] RECOVERY - check_job_queue on fenari is OK: JOBQUEUE OK - all job queues below 10,000
[22:54:23] how many projects do we have anyways
[22:54:29] One is for the generic backend for mapping a namespace/title page to a stable numerical id. The other is the front-end implementation for Wikimedia (e.g. currently proposing wmf.ly or wmf.co with a base36 shorturl id redirecting to project/languagecode/shorturl)
[22:54:56] right
[22:55:10] 851
[22:55:13] I am just curious if 36 projects is aiming too low.
[22:55:29] 851 wikipedias?
[22:55:48] I suppose the correct term is "families"
[22:56:07] indeed, anything with a subdomain in front.
[22:56:26] there are fake families too
[22:56:39] like how commons and meta are not in the same family
[22:56:39] plus maybe some exceptions for those without a subdomain (like mediawiki.org)
[22:56:45] and yes
[22:56:47] well, they are both under wikimedia.org
[22:56:52] yes
[22:57:01] likewise chapter wikis
[22:57:18] the interpreter would redirect those properly too
[22:57:22] true, so we can't use "co" for "commons" in the "wikimedia' family as that could clash
[22:57:29] PROBLEM - check_job_queue on fenari is CRITICAL: CHECK_NRPE: Socket timeout after 10 seconds.
[22:57:44] I suppose we'd use numeric mapping for that as well with base36?
[22:57:49] instead of actual language codes
[22:57:49] yeah
[22:58:05] if the codeword for "wikimedia" sites is say A commons would be something like A02
[22:58:20] In that case, if you're going to need a hardcoded integer map, might as well map to full domain name or wiki id
[22:58:33] so instead of PLL, you'd have WWW
[22:58:37] (wiki code)
[22:58:57] 0=> enwiki, 515 => nlwiktionary, 812 => viwikivoyage
[22:59:24] That would work too
[22:59:50] would handle 46,656 wikis
[23:00:12] in fact that seems less wasteful
[23:00:39] two digits would be sufficient for now but 3 would last forever
[23:00:44] yep, PLL is also limited to 46,656, except that PLL is restricting it to 1296 wikis in 36 families
[23:01:11] we can add relevant non-wiki pages too
[23:01:14] like bugzilla
[23:03:04] <^demon> manybubbles: So, we're still working on building out enwikisource and cawiki. This totally isn't going to scale--we're going to have to revisit the initial indexing problem again.
[23:03:26] <^demon> I've got a couple ideas. Want to sleep on it though.
[23:04:37] ToAruShiroiNeko: Actually, on second thought. The existing RFC is indeed directly mergeable. The ShortURL backend implementation is already pretty much agreed upon. If not, it would seem appropriate to deal with that in the same RFC.
[23:04:52] I mean, it is already written and installed on some wikis.
[23:05:14] sure but it doesnt comply with what I am hoping to have, perhaps it can be slightly altered
[23:05:50] its useless without something like wmf.co to shorten the actual url atm which is a WIP
[23:05:53] Yep, doing that now.
[23:06:15] useless in terms of shortening the url
[23:06:21] greg-g, i would like to do a deployment tomorrow morning - i will be traveling next week and want to get it in as early as possible
[23:06:29] sorry that remark was unnecessarily snipey
[23:06:49] greg-g, is there a time that it would be safe/ok to do? :)
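(A sketch of the WWW variant Krinkle floats above: a hardcoded integer map from wiki code to domain, with three base36 characters for the wiki (36^3 = 46,656) and base36 for the page id. The map entries follow the example ids in the log, mapped here to domains; the helper names, padding choice, and ?curid= redirect target are illustrative assumptions. One wrinkle worth noting: QR alphanumeric mode only covers the uppercase range Krinkle listed, so codes would have to be uppercased, or else encoded in byte mode at lower capacity:)

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyz"

def to_base36(n):
    out = ""
    while n:
        n, r = divmod(n, 36)
        out = ALPHABET[r] + out
    return out or "0"

# Example entries from the discussion; a real deployment would generate this map.
WIKI_BY_CODE = {0: "en.wikipedia.org", 515: "nl.wiktionary.org", 812: "vi.wikivoyage.org"}

def encode(wiki_code, page_id):
    # WWW = three base36 chars for the wiki, then the base36 page id
    return to_base36(wiki_code).rjust(3, "0") + to_base36(page_id)

def decode(short):
    wiki = WIKI_BY_CODE[int(short[:3], 36)]
    return "https://%s/?curid=%d" % (wiki, int(short[3:], 36))

code = encode(0, 40613247)              # -> '000o6hdr'
print("http://wmf.ly/" + code.upper())  # 22 chars; uppercase for QR alphanumeric mode
print(decode(code))
```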
[23:18:52] (03PS1) 10Catrope: Set VisualEditor back to opt-in on enwiki [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85790
[23:19:02] (03CR) 10Catrope: [C: 032] Set VisualEditor back to opt-in on enwiki [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85790 (owner: 10Catrope)
[23:19:22] (03Merged) 10jenkins-bot: Set VisualEditor back to opt-in on enwiki [operations/mediawiki-config] - 10https://gerrit.wikimedia.org/r/85790 (owner: 10Catrope)
[23:20:14] !log catrope synchronized wmf-config/InitialiseSettings.php 'Set VisualEditor to opt-in on enwiki'
[23:20:27] Logged the message, Master
[23:35:08] !log catrope synchronized php-1.22wmf17/extensions/VisualEditor 'VisualEditor cherry-picks'
[23:35:24] !log catrope synchronized php-1.22wmf18/extensions/VisualEditor 'VisualEditor cherry-picks'
[23:35:25] Logged the message, Master
[23:35:41] Logged the message, Master
[23:36:00] !log mw1072 is unhappy, sync-file throws warnings about read-only filesystem
[23:36:16] Logged the message, Mr. Obvious
[23:37:56] Known, logged and already RT'd
[23:43:48] (03PS1) 10Kaldari: wikivoyage.org TXT Google Webmaster Tools key addition [operations/dns] - 10https://gerrit.wikimedia.org/r/85794
[23:45:19] anybody want to merge this little DNS change for wikivoyage? https://gerrit.wikimedia.org/r/#/c/85794/
[23:45:34] if not, I'll just file an RT ticket or something
[23:47:52] (03PS2) 10Kaldari: wikivoyage.org TXT Google Webmaster Tools key addition [operations/dns] - 10https://gerrit.wikimedia.org/r/85794
[23:48:12] (03PS1) 10Ori.livneh: Add Ganglia view for static asset payload size [operations/puppet] - 10https://gerrit.wikimedia.org/r/85796
[23:49:11] (03PS3) 10Kaldari: wikivoyage.org TXT Google Webmaster Tools key addition [operations/dns] - 10https://gerrit.wikimedia.org/r/85794
[23:49:14] Ryan_Lane: ^ that one's simple and follows the previous patch rather naturally
[23:50:21] ori-l: thanks for the endorsement ;)
[23:50:51] kaldari: nice try