[00:00:02] it's a container :)
[00:00:04] download at home.. upload it once
[00:00:10] the entire container?
[00:00:12] yes
[00:00:36] ok.. upload it once to your own internal IP.. then change the curl command to download from your own server
[00:00:47] add a cronjob on your own server that syncs with upstream, but just once a week
[00:00:47] hmm, we can't
[00:00:52] due to the licensing of the file
[00:01:03] we have to keep it from being redistributed
[00:01:06] so, i.e., hidden
[00:01:14] https://www.irccloud.com/pastebin/uYebJnot
[00:01:15] Title: [ Snippet | IRCCloud ] - www.irccloud.com
[00:01:16] I don't get it. downloading a lot is allowed but not.. downloading a little?
[00:01:23] ^^
[00:01:28] you are not redistributing
[00:01:57] Note the cloudflare stuff towards the end
[00:02:21] paladox: mutante ^^
[00:02:27] yep
[00:02:41] yea, I bet it's like this..
[00:02:47] if there are more than X connections..
[00:02:49] The whole reason for that fail is the DDoS filtering
[00:02:52] then it routes it via cloudflare
[00:02:54] and that fails
[00:03:20] the obvious fix seems to be to not download it so often
[00:03:34] you are not redistributing if you have it on your own internal IP
[00:03:59] as long as you don't put it on downloads.miraheze.org or something.. the license should be fine
[00:04:07] Travis downloads it every time it runs
[00:04:16] yes, and that is the problem
[00:04:20] PROBLEM - misc4 Puppet on misc4 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[00:04:22] download it from your own server
[00:04:36] and you don't run into DDoS protection or rate limiting
[00:04:51] The question is where would we store it?
[00:05:12] [02miraheze/services] 07MirahezeSSLBot pushed 031 commit to 03master [+0/-0/±1] 13https://git.io/fAhRq
[00:05:14] [02miraheze/services] 07MirahezeSSLBot 033bc73eb - BOT: Updating services config for wikis
[00:05:21] the install server
[00:05:26] would be my first guess
[00:05:35] do you have one?
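The mirror-and-weekly-sync idea discussed above could look roughly like this as a cron entry on the internal server. This is only a sketch: the file path, download URL, and schedule are made-up examples, not Miraheze's actual setup.

```
# /etc/cron.d/geolite-sync  (hypothetical file; paths and URL are examples)
# Re-download the database from upstream once a week (Mondays 04:00),
# so CI builds can fetch it from this server instead of hitting upstream
# on every run.
0 4 * * 1  root  curl -fsSL -o /var/www/geolite/GeoLite2-City.tar.gz 'https://geolite.example/GeoLite2-City.tar.gz'
```

The build's own curl command then points at this internal server, so upstream only ever sees one download per week.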
[00:05:43] paladox: ^^
[00:05:54] ok and nope
[00:05:59] we have no install server
[00:06:02] we have misc* though
[00:06:12] do any of your servers have private IPs?
[00:06:18] nope
[00:06:20] heh
[00:06:23] they are all public
[00:06:34] I believe our hosting provider is working on a private network
[00:06:39] but for now it's all public
[00:06:46] well, then.. set up an apache on port 666 and firewall it off except from your own IPs :P
[00:07:04] We use NGINX mutante
[00:07:10] only problem is the travis ci IP could be anything.
[00:07:18] s/apache/a webserver
[00:07:18] mutante meant to say: well, then.. set up a webserver on port 666 and firewall it off except from your own IPs :P
[00:07:45] who cares what the IP of travis is.. it will just be the client?
[00:08:20] well we don't want everyone downloading it :)
[00:08:27] wait.. you know what your own IP addresses are.. right
[00:08:41] you are saying they are public AND also random?
[00:08:47] oh https://dev.maxmind.com/geoip/geoip2/geolite2/#License
[00:08:48] Title: [ GeoLite2 Free Downloadable Databases « MaxMind Developer Site ] - dev.maxmind.com
[00:09:24] paladox: I would be VERY surprised if you have virtual machines that get a random new IP each time they boot.. really???
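Since they use NGINX, the "firewall it off except from your own IPs" idea can be done in the nginx config itself rather than with iptables. A minimal sketch — the port, IP address, and paths below are placeholder examples (and, as noted in the chat, an IP allow-list alone will not help the Travis case, since the Travis client IP is unpredictable):

```
# hypothetical nginx vhost for an internal file mirror
server {
    listen 666;                  # the joke port from above; any unused port works
    root /var/www/geolite;

    location / {
        allow 203.0.113.10;      # your own (floating) IP -- example address
        deny  all;               # everyone else gets 403
    }
}
```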
[00:09:28] The easier fix may be not using travis on the repo
[00:09:31] mutante nope
[00:09:36] we have a floating IP assigned
[00:09:41] paladox: so you DO know your own IP
[00:09:45] yep
[00:09:48] we know our IP
[00:09:54] then I see no issue
[00:10:10] it's not a problem to set up a webserver, right
[00:10:14] and to put a file on it
[00:10:17] and to download from it
[00:10:28] ok
[00:10:49] you can use hosts.allow / hosts.deny as well
[00:10:58] if you don't want to mess with iptables/ferm
[00:11:11] to block the other people from downloading it
[00:12:10] RECOVERY - mw3 JobQueue on mw3 is OK: JOBQUEUE OK - job queue below 300 jobs
[00:12:20] RECOVERY - misc4 Puppet on misc4 is OK: OK: Puppet is currently enabled, last run 1 minute ago with 0 failures
[00:12:56] or you could put simple auth in front of it
[00:13:02] and tell curl a user/password to use
[00:13:18] yep
[00:13:23] like user: maxmind password: miraheze :P
[00:13:31] lol
[00:14:27] another question
[00:14:35] so travis builds a new container each time, is that right?
[00:14:45] how does it know how to build that
[00:14:52] is there a base image it uses or so?
[00:15:18] I mean.. how does it know what OS to use
[00:15:35] I think so
[00:15:41] we are using ubuntu xenial
[00:15:47] as the default is to use precise
[00:15:50] so somewhere there is an ubuntu image for this
[00:15:59] possibly you can edit that.. right
[00:16:02] and put the file in there
[00:16:10] and not even have to download it
[00:16:28] um I don't think so
[00:16:33] but have no idea :)
[00:16:42] can't you unpack that image, add a file and then repack it again?
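Because the Travis client IP "could be anything", the simple-auth idea is the one that actually covers the CI case: protect the file with HTTP basic auth and hand curl the credentials. A sketch, using a placeholder hostname and the joke credentials from above (real secrets belong in the CI's encrypted environment variables, not in the repo):

```
# on the mirror server: create an htpasswd file (htpasswd ships with apache2-utils)
htpasswd -cb /etc/nginx/.htpasswd maxmind miraheze

# nginx: guard the download location with it
#   auth_basic           "restricted";
#   auth_basic_user_file /etc/nginx/.htpasswd;

# in the Travis build: fetch from the mirror with credentials
curl -fsSL -u maxmind:miraheze -o GeoLite2-City.tar.gz 'https://mirror.example/GeoLite2-City.tar.gz'
```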
[00:16:45] I think you can only use the images travis picks
[00:16:48] just thinking out loud
[00:18:09] https://docs.travis-ci.com/user/enterprise/build-images
[00:18:10] Title: [ Customizing Travis CI Enterprise Build Images - Travis CI ] - docs.travis-ci.com
[00:18:53] start a Docker container based on one of the default build images travis:[language],
[00:18:56] run your customizations inside that container, and
[00:18:58] commit the container to a Docker image with the original travis:language name (tag).
[00:19:06] mutante that costs though
[00:19:06] this sounds like what I meant by "unpacking and repacking"
[00:19:06] https://enterprise.travis-ci.com
[00:19:09] Title: [ Travis CI Enterprise - Test and Deploy Your Code with Confidence ] - enterprise.travis-ci.com
[00:19:56] but docker is free?
[00:20:03] and all it says is using docker?
[00:20:10] mutante docker is free but travis enterprise
[00:20:11] is paid
[00:20:52] I would expect that there is no difference in how Travis uses Docker to generate those images
[00:21:09] "not enterprise" = totally different mechanism... that would surprise me a lot
[00:21:49] I do, because the enterprise features give you more flexibility.
[00:22:00] but I have no idea :)
[00:22:06] anyways I have to go
[00:22:10] well, unless without enterprise you can't even build images.. I dunno
[00:22:35] back to just running a random webserver somewhere that is locked down
[00:22:43] good night paladox
[00:22:50] you too mutante ;)
[00:22:52] *:)
[01:02:49] Looking at icinga there is one warning (db4 disk space)
[01:04:51] run "apt-get clean" and hope for recovery. works quite often
[01:05:04] especially if it's not usually run
[11:24:15] [02mw-config] 07Amanda-Catherine commented on pull request 03#2486: Add userspace to default search on starmetalwiki T3654 - 13https://git.io/fAjvo
[11:27:07] Amanda-Catherine/dns/master/dca571b - Amanda The build has errored.
https://travis-ci.com/Amanda-Catherine/dns/builds/86135115
[11:35:42] [02mw-config] 07JohnFLewis closed pull request 03#2486: Add userspace to default search on starmetalwiki T3654 - 13https://git.io/fAAZT
[11:35:43] [02miraheze/mw-config] 07JohnFLewis pushed 031 commit to 03master [+0/-0/±1] 13https://git.io/fAjJY
[11:35:45] [02miraheze/mw-config] 07Amanda-Catherine 03b26bede - Add userspace to default search on starmetalwiki T3654 (#2486)
[12:11:46] o/
[13:31:45] Hi all
[13:32:00] hi
[13:38:53] spambots...
[13:41:51] yep
[14:33:38] :/ wish I could see what they are saying
[14:34:30] Basically the kind of things that they are saying in #wikimedia-cloud Zppix
[14:34:47] MacFan4000: ah
[14:35:09] I swear spam on freenode is just getting worse and worse
[14:35:26] It used to be a couple of times a month, now it's daily
[14:37:34] They seem to get smarter and smarter
[14:37:37] Like, smart enough to avoid anti-spam bots
[14:38:00] (Such as Sigyn)
[14:39:39] I think Sigyn triggers are human-defined
[16:15:12] Reception123: are you there?
[16:17:35] MacFan4000: Are you there?
[16:18:15] Yes, but no ssh access
[16:18:41] (I'm on my phone)
[16:20:21] ah, no. wanted to ask for comments on this proposal https://meta.miraheze.org/wiki/Talk:Site_updates#Proposal when you have time MacFan4000
[16:20:22] Title: [ Talk:Site updates - Miraheze Meta ] - meta.miraheze.org
[16:22:57] That seems fine to me Wiki-1776
[16:23:24] I'll keep that in mind when doing site updates for September
[16:26:37] ok
[17:15:13] [02miraheze/services] 07MirahezeSSLBot pushed 031 commit to 03master [+0/-0/±1] 13https://git.io/fAj6n
[17:15:15] [02miraheze/services] 07MirahezeSSLBot 03b96f89e - BOT: Updating services config for wikis
[20:31:38] out of curiosity, with which script do you know that wikis use a certain extension? for example UniversalLanguageSelector
[20:34:17] Wiki-1776, did you ever look at Special:Version?
[20:37:39] yes, only there are more than 500 wikis to review, and I seem to remember that there is a script that shows which wikis use a certain extension (it seems they used it with an extension that had bugs) qq[IrcCity]
[20:38:59] Confused grammar. Should be: do you know which wikis use a certain extension
[20:39:27] BTW "wiki" is a common noun.
[20:39:59] uh :(
[20:41:45] Wiki-1776: you ask the API for "siteinfo"
[20:41:56] and that includes a list of extensions it has installed
[20:41:57] https://www.mediawiki.org/wiki/API:Siteinfo
[20:41:58] Title: [ API:Siteinfo - MediaWiki ] - www.mediawiki.org
[20:42:05] https://www.mediawiki.org/wiki/API:Siteinfo#Extensions
[20:42:05] Title: [ API:Siteinfo - MediaWiki ] - www.mediawiki.org
[20:44:42] Wiki-1776: https://publictestwiki.com/w/api.php?action=query&meta=siteinfo&siprop=extensions
[20:44:43] Title: [ MediaWiki API result - TestWiki ] - publictestwiki.com
[20:44:52] this is the example using test.miraheze
[20:45:08] I believe he wants a script to show what each wiki has
[20:45:14] which a URL can't do :)
[20:46:15] eh.. that part seems obvious once you know how to ask one wiki
[20:46:19] then you ask all wikis
[20:46:44] I don't think there is any other way than asking each wiki
[20:46:47] of course use a loop
[20:47:23] yeh
[20:47:35] how to get the list of all miraheze wiki names.. that code can be copied from wikistats.. public repo :)
[20:47:44] lol
[20:47:44] it's asking your server to give back the list of all wikis
[20:47:59] also an API call
[20:53:48] paladox: hmm, the UniversalLanguageSelector extension has not been useful to me and it doesn't even work on my wiki... I don't plan to use it and I have disabled it on my wiki. Will they want to remove it from GitHub/ManageWiki or will it remain for others to use? or is there another wiki that uses the extension?
[20:54:11] I believe other wikis use it
[20:54:12] *I think*
[20:54:18] (remember that it was me who requested that extension)
[20:54:44] yes?
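The "ask each wiki in a loop" approach described above could be sketched like this, using the siteinfo API endpoint shown earlier. The two hostnames are just illustrative; a real run would build the wiki list from the all-wikis API call (or a dblist) mentioned in the chat, and the script assumes curl and jq are installed.

```shell
#!/bin/sh
# For each wiki, query the MediaWiki siteinfo API and report whether
# UniversalLanguageSelector appears in its installed-extensions list.
# The wiki list is hard-coded here only as an example.
for wiki in publictestwiki.com meta.miraheze.org; do
    if curl -fsSL "https://$wiki/w/api.php?action=query&meta=siteinfo&siprop=extensions&format=json" \
        | jq -e '.query.extensions[] | select(.name == "UniversalLanguageSelector")' >/dev/null
    then
        echo "$wiki: has UniversalLanguageSelector"
    else
        echo "$wiki: not installed"
    fi
done
```

`jq -e` exits non-zero when the filter selects nothing, which is what lets the shell `if` distinguish the two cases.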
[20:55:45] actually
[20:55:50] looks like no wiki uses it
[20:55:59] cat all.dblist | grep 'UniversalLanguageSelector'
[20:57:50] * Wiki-1776 observes that he again made a mistake in the wording :(
[20:59:23] Well, you all decide what to do with that extension.
[21:00:18] ok we will