[02:14:10] SMalyshev: wikimedia-lucene-explain-parser is missing most of the library bootstrap files :/
[02:15:16] SMalyshev: next time you can use https://phabricator.wikimedia.org/diffusion/MCCL/ to create the repository, it'll populate all the standard files automatically
[08:02:18] legoktm: not sure what is missing now...Doxyfile? ah maybe also .gitattributes
[08:02:46] SMalyshev: Doxyfile, .gitattributes, composer.json is non-standard, .travis.yaml is non-standard, ...
[08:03:15] I didn't really keep looking after that
[08:03:41] ok, I can fix those pretty easily
[08:03:55] thanks :)
[08:06:13] legoktm: though I am not sure standard composer would work there...
[08:06:28] what do you mean?
[08:06:40] it assumes phpunit and phpcs and phpcbf are already installed system-wide, which is not always the case
[08:06:49] er, no
[08:06:55] composer prepends vendor/bin to the path
[08:07:15] yes? didn't work for me... maybe I tested wrong, let me try again
[08:07:45] nope
[08:07:46] sh: parallel-lint: command not found
[08:07:50] it definitely does :) see https://getcomposer.org/doc/articles/scripts.md#writing-custom-commands and our entire CI depends upon this
[08:08:31] did you run composer install? ls vendor/bin/ ?
[08:08:32] I don't know what happens on CI but it definitely does not work for me
[08:09:01] let me update everything and try again
[08:09:35] hmm now it worked... weird. ok then :)
[08:10:21] according to thiemo on https://github.com/DataValues/Geo/pull/121 including vendor/bin/ is harmful for Windows users
[08:11:25] can we add .idea to the default gitignore?
[08:11:32] phpstorm puts stuff there
[08:11:52] sure, please submit a patch
[08:15:09] legoktm: https://gerrit.wikimedia.org/r/#/c/399780/ should fix those
[08:16:03] thanks :)
[08:20:13] do you know who can enable the travis build for this ext?
[08:20:30] I don't have permissions
[08:21:25] I'm not sure
[08:21:57] ok, I'll file a phab task and see :)
[08:22:09] #GitHub-Mirrors project maybe?
[08:22:12] * legoktm afk -> zzz
[08:22:25] good night :)
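
For reference on the vendor/bin point above: when composer runs a script defined in composer.json, it prepends the project's vendor/bin directory to PATH, so locally installed tools such as parallel-lint and phpunit resolve without being installed system-wide. A minimal shell sketch, assuming a hypothetical composer.json with a Wikimedia-style "test" script (the exact entries in wikimedia-lucene-explain-parser may differ):

    # install dev dependencies locally; the tools land in vendor/bin/
    composer install
    ls vendor/bin/    # expect parallel-lint, phpunit, phpcs, phpcbf, ...

    # hypothetical composer.json scripts section:
    #   "scripts": {
    #       "test": [
    #           "parallel-lint . --exclude vendor",
    #           "phpunit",
    #           "phpcs -p -s"
    #       ]
    #   }
    # running the script via composer puts vendor/bin on PATH first,
    # so the bare tool names resolve:
    composer test

    # outside a composer script, a tool has to be called by its path:
    ./vendor/bin/parallel-lint . --exclude vendor

Running parallel-lint bare in a plain shell (as in the 08:07 error above) fails because nothing has put vendor/bin on PATH in that shell.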
[09:52:25] Is there an easy way to revert a vandal on phabricator? Specifically on https://phabricator.wikimedia.org/M236
[09:53:32] pcoombe: nope
[09:55:51] pcoombe: i've asked a few of the admins in -releng to disable them when they are around
[09:56:10] Ok, thanks p858snake
[11:17:53] pcoombe: I've disabled the user, I see that you've already reverted the changes
[11:20:00] Thanks volans!
[11:20:36] thank you for the heads up ;)
[11:24:09] [[Tech]]; Srdjan m; /* Lua scripts timing out */ new section; https://meta.wikimedia.org/w/index.php?diff=17568421&oldid=17564700&rcid=11096364
[13:29:16] i don't remember who the dump people are here, but every link in https://dumps.wikimedia.org/enwiki/20171220/ gives me 404s
[13:57:25] Headbomb_: the files are being produced but haven't made it over to the web server yet
[13:57:36] I'll have a look today and see what's up
[14:08:01] Zppix
[14:09:24] Is it really the case that the ENTIRE kw-wiki fits into a 1.6MB file?
[14:09:37] https://dumps.wikimedia.org/kwwiki/20171220/
[14:12:50] Dysklyver: these are the current revisions of articles in the main namespace only
[14:12:58] bz2 compressed of course
[14:13:10] so that is all 3000 odd articles?
[14:13:22] however many articles, they are all in there yup
[14:13:30] no images
[14:13:30] I am impressed
[14:13:33] just the text
[14:14:09] the images are linked from commons anyway, I don't believe kwwiki has any local images
[14:14:28] the images would need to be downloaded and scaled if you wanted to make a copy
[14:14:45] if you want just to work with the text though, there it is, please feel free!
[14:15:41] actually I want to make a local copy of the entire wiki so i can test things on it without breaking the real one
[14:16:12] in that case you might want to get all the sql files and stuff too
[14:16:24] and the xml files with the full revision content history
[14:16:35] it's probably still pretty small
[14:17:02] and what am i doing wrong that i get a 404 error?
[14:17:28] because the files for this month haven't been copied over yet
[14:17:40] I've been modifying the copy scripts, probably I have a bug from yesterday
[14:17:53] just get the previous month ones
[14:18:04] if it's only for testing
[14:18:05] ok so that is a temporary server side issue
[14:18:13] a temporary me side issue ;-P
[14:23:53] ok no worries :)
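
For anyone following the "just get the previous month ones" suggestion, a rough sketch of pulling an earlier kwwiki dump and loading it into a local MediaWiki for testing. The date and install path are illustrative, not taken from the channel; importDump.php and rebuildrecentchanges.php are standard MediaWiki maintenance scripts, and a fuller copy would also want the .sql.gz table dumps mentioned above:

    cd ~
    # illustrative date; use whichever earlier run actually exists under
    # https://dumps.wikimedia.org/kwwiki/
    DATE=20171120
    wget "https://dumps.wikimedia.org/kwwiki/${DATE}/kwwiki-${DATE}-pages-articles.xml.bz2"

    # import the page text into a local MediaWiki install (path is hypothetical),
    # then rebuild derived data so the imported pages show up in listings
    cd /var/www/mediawiki
    bzcat ~/"kwwiki-${DATE}-pages-articles.xml.bz2" | php maintenance/importDump.php
    php maintenance/rebuildrecentchanges.php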
[14:36:46] apergos, well the links say the files are completed soooo something weird is going on if that page doesn't reflect the actual status
[14:37:23] Headbomb_ try downloading any file from https://dumps.wikimedia.org/kwwiki/20171220/
[14:37:43] it says: 404 Not Found nginx/1.11.13
[14:39:55] the files are completed
[14:40:08] we copy over the index files separately from the dump download files
[14:40:23] so one is working and I broke the other, figures
[14:40:44] but the files themselves all exist on the server where we will copy from
[14:40:50] so no worry about that
[14:41:03] ok
[14:42:01] I specifically want all the kw stuff, so https://dumps.wikimedia.org/kwwiktionary/20171220/ https://dumps.wikimedia.org/kwwiki/20171220/ and https://dumps.wikimedia.org/kwwikiquote/20171220/
[14:43:23] all the dump output files get copied in one job
[14:43:41] so everything will start to show up over a few hours, once I find and fix the issue
[14:43:59] it's a fancy rsync
[14:44:28] oo fancy rsync
[14:46:08] :-D
[14:47:12] apergos: does fancy mean temperamental?
[14:47:51] no
[14:47:59] it runs fine if I don't break it :-p
[14:48:05] Lol
[15:22:09] on kannada wikipedia a page seems to be blank but when I tried editing it, it already had content
[15:22:22] https://kn.wikipedia.org/wiki/%E0%B2%A1%E0%B2%AE%E0%B2%BE%E0%B2%B8%E0%B3%8D%E0%B2%95%E0%B2%B8%E0%B3%8D
[15:24:45] on kannada wikipedia a page seems to be blank but when I tried editing it, it already had content https://kn.wikipedia.org/wiki/%E0%B2%A1%E0%B2%AE%E0%B2%BE%E0%B2%B8%E0%B3%8D%E0%B2%95%E0%B2%B8%E0%B3%8D
[15:27:38] something wrong with the infobox code
[15:30:54] if you take off " ಸುಮಾರು 4 ದಶಲಕ್ಷ " from the "population_total" then it's working, I don't know why
[15:37:25] yes it's working now /ping Stryn how did you find it out without editing the article
[15:38:25] Minato826: I can use the preview button
[15:38:56] apergos: "it runs fine if I don't break it :-p"
[15:38:58] that's one for http://www.bash.org/
[15:39:53] feel free
[15:40:06] they always reject my stuff, so i never bother
[15:40:10] ah
[15:40:16] I've never submitted so, shrug
[15:40:29] @Stryn earlier, before removing population_total, when I used preview it displayed, but when I saved the content it blanked again
[15:41:02] hmm, it should not
[15:42:27] i thought you might have used something in the browser console to find the bug
[15:43:32] nope, I'm not so good at tech, I just removed lines one by one and always previewed :)
[15:44:02] apergos, http://www.bash.org/?965449 if it ever gets approved
[15:50:48] heh
[15:50:52] thanks
[16:22:50] Headbomb_ and Dysklyver, I've found the issue, it's going to take some hours at least for the rsyncs to catch up
[16:23:06] there's a lot of data, and two hosts that need to get the copy
[16:23:20] ok congrats on fixing it
[16:23:53] yay party time
[16:25:41] party time tomorrow morning probably
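
The actual copy scripts are not shown in the channel, so purely as an illustration of the "fancy rsync" shape described above, with hypothetical hosts and paths: the index/status pages and the dump output files are pushed in separate passes to each of the two web hosts, which is why one can be up to date while the other lags.

    # hypothetical hosts and paths; not the real dump infrastructure scripts
    for host in dumpsweb1.example.org dumpsweb2.example.org; do
        # pass 1: only the per-run index/status pages
        rsync -a --include='*/' --include='*.html' --include='*.json' --exclude='*' \
              /data/dumps/public/kwwiki/ "${host}:/srv/dumps/kwwiki/"
        # pass 2: the dump output files themselves, the big transfer that
        # takes hours and has to reach both hosts
        rsync -a /data/dumps/public/kwwiki/ "${host}:/srv/dumps/kwwiki/"
    done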