[03:51:18] is wiki responding slowly for anyone else? [04:21:06] Magog_the_Ogre: which wkki? [04:21:10] Wiki* [04:21:25] Commons for me, only sometimes [04:21:38] been happening since yesterday [04:24:52] Magog_the_Ogre: is it random or only when doing certain actions? [04:31:56] just now, it took about 5 or 6 seconds for a page to load [04:32:09] the latency doesn't seem to exist on the rest of the web [04:34:55] the scripts don't always load either [04:35:09] oddly, no console logs [04:35:57] the base document just took 5.2 seconds to load (!) [04:36:47] ping Zppix [04:37:09] Hmm [04:38:56] Sadly it's pretty late so I can't exactly contact anyone... have you tried to make sure it's not your connection? Depending on your timezone you could be at peak usage time for your internet provider, causing you to lose internet speed [04:41:53] oh dear, please don't do that. [04:42:08] it's not nearly urgent enough, and it might be my router acting up [04:42:16] I thought maybe someone might know of the issue [06:23:52] I left a note on their talk page with a link to https://wikitech.wikimedia.org/wiki/Reporting_a_connectivity_issue [14:33:05] Phabricator question: How do parent tasks work? Does a parent task block a subtask, or is it the other way around? [14:34:15] Jhs: depends, sometimes parent tasks are a broad "to-do" [14:35:00] Zppix, ok. asking because I'm adding a task that can only be done once a different task is done. so I'm wondering if I should add that other task as parent or subtask [14:35:52] Jhs: subtask, and then make it obvious that the subtask is blocked [14:37:47] Zppix, make it obvious just in the description, yeah? 
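The 5.2-second load time mentioned above can be measured reproducibly from the shell with curl's `-w '%{time_total}'` timer. A sketch; the `file://` URL is a stand-in so the example runs offline, in practice you would point it at the slow page (e.g. a Commons URL):

```shell
# Time a full fetch: -o /dev/null discards the body, -s silences the
# progress meter, -w prints the total transfer time in seconds.
tmp=$(mktemp)
echo "placeholder page" > "$tmp"
total=$(curl -o /dev/null -s -w '%{time_total}' "file://$tmp")
echo "total: ${total}s"
```

Running this a few times against the wiki versus another site would show whether the latency is specific to the wiki or general to the connection.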
[14:38:00] Yes [14:39:22] Generally subtasks block parent tasks [14:39:39] If you're used to the bugzilla nomenclature [14:39:53] bawolff: phab is backwarss [14:40:03] Backwards [14:41:09] If you had task A, that cannot happen until task B is done, then I would say that A should be the parent of B [14:41:52] I never used bugzilla [14:42:01] So I'm used to it the other way [14:42:30] But generally the subtask thing is more meant to denote I have big project A, and to do A I need to do steps B, C, D, so B, C and D become subtasks of A [14:42:45] Jhs: General rule, don't overthink it too much, if it's wrong, it's easy to fix [14:47:09] Zppix: I think I actually disagree with what you're saying. In phab subtasks block parent tasks [14:49:46] Jhs: https://www.mediawiki.org/wiki/Phabricator/Help#Parent_tasks_and_subtasks [14:50:40] Zppix, no, subtasks are NOT blocked by parent tasks. The other way round. [14:51:13] Jhs: In any case, I agree with what you did on https://phabricator.wikimedia.org/T182431 [14:54:05] Jhs: On your last question - if it's a somewhat urgent thing - e.g. if categories are very wrong as it stands, you can also ask for the patch itself to be swatted, and then the change could be done as early as Monday [14:54:48] I'm not sure if the situation is urgent or not. If it's not urgent in any sense, I would probably just wait for Thursday [15:19:42] hello [15:20:46] Someone in? [15:20:59] wqsde: Sure! :) [15:22:18] What country are you from? [15:33:46] andre__: oh sorry [16:55:11] bawolff, thanks for the tips! (was afk) it's not really urgent, so don't want to bother too many people about it :) but would be nice to have before New Year's, but that's just that – nice to have :) [16:55:54] Right, New Year's. 
I forgot about winter holidays, I wonder when the deployment freeze happens [16:56:13] dec 18 apparently [16:56:33] Dec 18 [16:56:50] Unless emergency [17:02:18] yeah, there's one SWAT happening after all wikis have been switched to 1.31-wmf.12, so I'm aiming for that one ;) [18:29:39] I'm wondering if https://phabricator.wikimedia.org/T182448 would be suitable for gci (With a very much improved description) [18:30:08] It's probably not too hard, but there is some stuff to it, and Phantom42 did say he was looking for more challenging tasks [18:31:08] bawolff: (gci is handled by dev relations over at #wikimedia-devrel prob a better place) [18:31:37] as usual, if you explain requirements well in the task description so people can judge if that's their skill set, try? :) [18:38:24] hmm, https://phabricator.wikimedia.org/diffusion/MTPS/browse/master/README.md has some interesting syntax highlighting going on [18:38:57] oh screw it, linking to github (Sorry _joe_ and friends of the anti-github crowd ;) [18:42:46] hmm, the readme says GPLv2 or later, but everything else says just version 2, should probably fix that before other people commit to this repo [18:43:19] oh wait it did say version 2 or later [19:02:12] bawolff: time to create a phab project for the securityplugin? [19:02:25] yeah, that'd probably be a good idea [19:02:50] legoktm: You'll be happy to know I'm adding proper GPL license headers everywhere [19:03:04] <3 [19:03:18] legoktm always loves licensing [19:03:56] https://gerrit.wikimedia.org/r/396442 [19:04:58] +1 from me [19:05:01] looks good [19:05:05] bawolff: also we should set up CI for that repo [19:05:18] yes, I have a task for that [19:05:24] composer test already works [19:05:31] well, I forgot to enable php lint [19:05:31] link? 
I think I can work on that one [19:06:02] https://phabricator.wikimedia.org/T182199 [19:06:20] composer test will run some internal tests, and phan, and phpcs [19:06:42] so all that needs to be set up is to have CI run `composer test` [19:06:54] on php7, since the thingy has a dependency on that version [19:07:04] looks doable, I'll take a look [19:07:20] no promises though eh, I'm just a noob [19:08:40] https://gerrit.wikimedia.org/r/#/c/396442/ <-- weird, merged and later uploaded a patch? [19:13:36] https://phabricator.wikimedia.org/diffusion/MTPS/browse/master/composer.json looks kinda messy bawolff [19:14:01] oh damn spaces and tabs [19:14:11] 8 tab space for everyone! [19:14:14] I should fix that [19:14:48] and the composer schema file as well :D [19:14:52] composer auto-generates a composer.json file with spaces, but I'm super used to tabs and I didn't notice. The entire thing should be converted to tabs probably [19:14:56] got the project created already? [19:15:16] Do you mean phabricator project? [19:15:21] SecurityCheckPlugin standalone or do you want it nested elsewhere? [19:15:23] yes [19:15:50] there's no phabricator work board for it yet [19:16:09] SecurityCheckPlugin is meant to be used as a (dev) dependency of other projects (See the readme) [19:17:37] Hauskatze: btw, while you are there, feel free to add jakub-onderka/php-parallel-lint as a require-dev dependency and add it to composer test (but only if you want to) [19:17:57] bawolff: I'm fixing composer.json first [19:18:02] ok [19:18:06] if that's okay ofc? [19:18:27] Of course, thank you for working on it [19:18:35] saves me from having to figure out how to do it :) [19:22:21] converting spaces to tabs first [19:23:18] btw, I hope you're not doing that manually. 
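The `composer test` alias being discussed bundles lint, phan, and phpcs so CI only has to run one command. A sketch of what the relevant composer.json sections could look like; the exact commands and versions here are assumptions reconstructed from the conversation, not the repo's actual file:

```json
{
    "require-dev": {
        "jakub-onderka/php-parallel-lint": "^0.9.2"
    },
    "scripts": {
        "test": [
            "parallel-lint . --exclude vendor",
            "phan",
            "composer phpcs"
        ],
        "phpcs": "phpcs -p -s"
    }
}
```

With this shape, CI just runs `composer test` on a PHP 7 image, while `composer phpcs` stays available as a standalone alias.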
If you're using vi you can do that really quickly by doing :%s/ /\t/g [19:23:46] too late :P [19:23:56] I cloned that on my pc and I'm using sublime [19:24:12] so I can check line by line [19:24:26] 10 lines left only [19:25:29] done, now https://getcomposer.org/doc/04-schema.md [19:31:42] bawolff: so it's okay to create #SecurityCheckPluging Phabricator standalone project then? [19:31:49] * Hauskatze doing various things at once [19:32:06] Hauskatze: yep. Maybe no g at the end of the name [19:32:18] heh [19:32:23] SecurityCheckPlugin is kind of an uninspired name for this plugin, but oh well [19:32:49] that way I can add that to the composer.json "support" section [19:33:06] &projects=SecurityCheckPlugin [19:45:03] bawolff: "jakub-onderka/php-parallel-lint": "^0.9.2" is already in composer.json in require-dev [19:45:26] oh hmm [19:45:31] maybe someone else added it [19:45:36] I can add parallel-lint --exclude vendor [19:45:39] if needed [19:45:44] no apparently I did [19:45:50] and I just didn't remember [19:45:58] "parallel-lint . --exclude vendor", [19:46:10] yeah, that should probably be added to the "test" composer script [19:46:29] okay I'll upload a first version of the new composer.json file for review [19:46:41] and then you can suggest changes, etc. 
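On the spaces-to-tabs conversion: the `:%s/ /\t/g` substitution suggested above replaces every space, including ones inside JSON string values. A safer sketch using coreutils' `unexpand`, which by default only converts leading whitespace (the file contents are illustrative):

```shell
# Build a small example file indented with 4 spaces; the package name
# deliberately contains spaces to show they survive the conversion.
printf '{\n    "name": "example/security check plugin"\n}\n' > example.json
# unexpand turns leading blanks into tabs (-t 4: tab stop every 4 cols);
# without -a it leaves spaces in the middle of a line alone.
unexpand -t 4 example.json > example.tabs.json
cat example.tabs.json
```

The same approach scales to a whole repo with a `for` loop or `find -exec`, which beats converting line by line in an editor.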
[19:52:06] heh, nice, there's no .gitreview in that repo [19:54:35] Hauskatze: I don't generally use git review [19:54:38] you can do [19:54:45] git push origin HEAD:refs/for/master [19:54:49] to upload changes [19:54:55] was looking for that [19:54:55] or feel free to just add a .gitreview [19:54:56] thanks [19:55:00] maybe later [19:55:57] bawolff: asks me for a change-id [19:56:17] Oh, you still need the git review pre-commit hooks [19:56:24] Just add the following to your commit message [19:56:34] * Hauskatze uses gerrit editor [19:56:37] Change-Id: Ie9106c80b23f23a393912a12c9368078bcdb3123 [19:56:44] or that [19:56:54] The change id just has to be a random hexadecimal string [19:57:05] you can use anything provided it's the right length, and nobody else has used it before [19:57:11] Or just run [19:57:15] git review -s [19:57:18] or is it -S [19:57:23] can't remember [19:57:34] uploaded [19:58:50] looking [20:08:12] did two minor modifications [20:08:26] Hauskatze: oh, I totally didn't realize you were MarcoAurelio [20:09:11] bawolff: yup, it's me. [20:09:24] addshore: I see the backport, but I don't see any deploying [20:09:31] Lol I was going to say something but I wanted to see what would happen [20:09:34] So I don't really know what's happening with that [20:09:54] bawolff: we should backport the Special:Undelete change too [20:10:09] legoktm: I was planning to add it to SWAT on Monday [20:10:59] it's a pretty small change and it's really annoying, so I'd rather not leave it over the weekend like that [20:11:04] ok [20:13:05] Do I merge this in the deployment branch - https://gerrit.wikimedia.org/r/#/c/396465/ ? 
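As bawolff says above, a Change-Id is just "I" followed by a hex string of the right length (40 characters, the shape of a SHA-1). Normally the commit-msg hook installed by `git review -s` generates it, but one can be produced by hand; a sketch:

```shell
# "I" followed by 40 hex digits, the same shape Gerrit's commit-msg
# hook emits. od renders 20 random bytes as hex; tr strips the
# spacing and newlines od adds.
cid="I$(head -c 20 /dev/urandom | od -An -tx1 | tr -d ' \n')"
echo "Change-Id: $cid"
```

Pasting that line at the bottom of the commit message, then `git push origin HEAD:refs/for/master`, is enough for Gerrit to accept the upload.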
I have no idea what the procedures are for deploying non-security patches [20:13:48] yes [20:14:29] Hmm, apparently I cannot (I thought I could since I became a deployer) [20:14:42] uh [20:14:55] Well technically I'm a gerrit admin so can do anything by adding myself to the relevant groups, but I can't with the groups I'm currently in [20:15:33] bawolff: fixed [20:15:48] you weren't in the wmf-deployment group [20:17:26] legoktm: So umm, as this is not in a window, do I just get permission from greg-g to deploy it, and then go do it, or are there other procedures I have to follow? [20:17:54] uh, right, we probably should have asked greg-g before you +2d it [20:20:47] oh umm [20:22:51] legoktm: I un +2'd it, and it didn't merge yet, so should be fine [20:26:04] legoktm: phpcs -p -s is the same as composer phpcs? [20:26:15] yep [20:26:24] okay so that's a dupe, I'll just leave one [20:29:48] hi [20:30:09] No [20:30:10] Is there an easy way to deal with: https://tools.wmflabs.org/ia-upload/log/archaeologiacor00prycgoog [20:30:22] What's wrong with it? [20:30:32] [2017-12-08 01:00:17] LOG.CRITICAL: Client error: `GET http://tools.wmflabs.org/phetools/pdf_to_djvu_cgi.py?cmd=convert&ia_id=archaeologiacor00prycgoog` resulted in a `400 Bad Request` response: {"text": "invalid ia identifier, I can't locate needed files", "error": 2} [] [] [20:30:42] Try it again? [20:30:52] this is the second try [20:31:04] Link to it on IA? 
I'll give it a go [20:31:15] bawolff / legoktm https://gerrit.wikimedia.org/r/#/c/396455/ updated, please check syntax [20:31:16] https://archive.org/details/archaeologiacor00prycgoog [20:31:32] In a moment [20:31:40] now I have to guess how to set up those php7 tests [20:32:00] also, I shall create a patch for jenkins as well :O [20:32:05] * Hauskatze writes down a list [20:34:58] Hauskatze: the license identifier should be "GPL-2.0+" since it has the "or later" [20:35:10] fixing [20:35:11] MaskedCornishman: it failed for me, let me try to figure this out, I'll keep you updated [20:35:21] also type: library is default, so you don't need to specify it [20:35:29] Zppix: ok no worries [20:35:37] but legoktm, spdx says those + are deprecated, just ignore? [20:35:45] uh, where does it say that? [20:36:28] https://spdx.org/licenses/GPL-2.0+.html [20:36:31] that's very confusing [20:36:45] legoktm: https://spdx.org/licenses/ bottom [20:36:58] > This new syntax supports the ability to use a simple “+” operator after a license short identifier to indicate “or later version” (e.g. GPL-2.0+). [20:37:21] I've added those recently but I've always wondered if I should continue to :) [20:37:24] Hauskatze: basically they mean "GPL-2.0+" is not its own license, it's just equal to "GPL-2.0" plus "or later" [20:37:31] so GPL-2.0+ is totally fine [20:37:36] okay, adding :D [20:37:43] and getting rid of library too? [20:37:49] yeah [20:38:23] so done [20:38:27] bbl, dinner time [20:38:40] I feel like fried eggs with chips will work for today :D [20:39:57] MaskedCornishman: nothing I can do sorry :/ I think it fails on pdf convert [20:40:35] ok [20:40:50] is there an easy way to get the cover page off the pdf? [20:41:07] without using adobe X DC that is :) [20:41:08] MaskedCornishman: There used to be an issue where pdfs with jpeg2000 images embedded in them did not work on commons. 
this might be that [20:41:25] bawolff: it's failing at ia-upload's end [20:41:31] It's not commons [20:41:35] I uploaded the actual pdf [20:41:38] MaskedCornishman: there is a pdfimages command line tool [20:41:45] MaskedCornishman: ok [20:41:47] Zppix, ah I misunderstood [20:41:58] bawolff: thanks [20:42:19] bawolff: if it was commons I would be taking time to investigate it :P [20:42:38] yes I had to mark the pdf I uploaded as speedy because it still has the google cover page [20:43:12] https://pdfsam.org/ [20:43:35] lol at the email to all gci mentors: "Tasks that can be easily completed by copying things verbatim from elsewhere often lead to plagiarism. Students often don't understand that it's not okay to take text from Wikipedia" [20:43:40] What if we are wikipedia ;) [20:46:16] "Remix" is the magic word. :P [20:47:26] PDFsam is quite good [20:48:16] andre__: And "properly comply with attribution requirements of license" :P [20:48:51] "Wikipedia told me I had to tell you I stole it from Wikipedia" [20:49:28] MaskedCornishman: want me to upload it w/o cover to commons and you can fill in the info? [20:50:42] Zppix: yes that would be great [20:50:47] K [20:50:58] learning this pdf tool is not the fastest exercise [20:51:32] shouldn't take much to "split" it [21:02:06] [[Tech]]; Oliviapahtayken; [none]; https://meta.wikimedia.org/w/index.php?diff=17524758&oldid=17472794&rcid=11029270 [21:05:47] [[Tech]]; 94.176.240.71; Undo revision 17524758 by [[Special:Contributions/Oliviapahtayken|Oliviapahtayken]] ([[User talk:Oliviapahtayken|talk]]); https://meta.wikimedia.org/w/index.php?diff=17524761&oldid=17524758&rcid=11029274 [21:08:15] Grr, someday I'm going to find the reason SUL is only "Single Unified Logout", but not "Single Unified Login" for me and stop accidentally editing pages as IP -.- [21:09:03] eddiegp: clear your cookies and cache then see if it still happens maybe? 
I've never had that issue [21:11:08] I've got that for at least half a year, across different browsers and devices, so yeah, I actually have cleared cookies and cache a few times (along with all other browser settings and customizations that is). [21:12:20] Hmm [21:14:20] eddiegp: bawolff and I noticed that it does seem worse at keeping you logged in on other wikis [21:16:35] Reedy: Hmm, I'll have to remember to keep an eye on that. So far I haven't got the impression that it's 'only' about _keeping_ me logged in but about logging me in there to begin with. [21:18:45] seems I'm logged in now on every other wiki except wikimania wikis and chapter wikis, sometimes I'm logged in only on a few wikis... (not sure if you're talking about the same problem) [21:21:44] what's your cookie setup? do you have perhaps third-party cookies disabled, or something? [21:25:39] for Firefox, there was some task somewhere, it apparently deletes cookies from the same top-level domain pretty aggressively [21:26:28] so if you're logged in on e.g. Wikipedias in several languages (en.wikipedia.org, de.wikipedia.org, etc.), all the cookies count for the limit of wikipedia.org, and the limit is like 50 or 100 or something. if each wiki sets a couple cookies, you start losing them pretty fast [21:33:26] MatmaRex: maybe we need to combine our cookies if possible then? [21:44:13] I just did a short test, seems it's some domain-related issue. I've logged out, completely cleared cache & cookies and logged into enwiki. It was automatically logged into dewiki and svwiki when I tested it, but not into enwikivoyage, enwikibooks, dewikibooks or enwikinews. [21:50:23] eddiegp: also, if I remember correctly (but I may not remember correctly), it will only try to auto-login you once per 24 hours per wiki. 
if you log in and out a lot, that might explain the problems [21:50:54] eddiegp: actually from what you just said, it might be something with third-party cookies [21:51:09] if you log in on en.wikipedia, and all wikipedias work, but all non-wikipedias do not work [21:51:45] That's what I meant with domain-related, yeah. [21:56:10] third-party cookies :P [21:56:17] what's your browser? [21:56:24] Firefox [22:08:30] Hmm. Two gci students have claimed one of my cloneable tasks at the same time. I hope they don't accidentally fix the same instance [22:09:17] It happens [22:34:23] Hauskatze: both the composer lines are needed [22:34:50] the two lines below have separate keys so they run separate tests [22:35:03] and the idea of composer test [22:35:12] is it runs all the tests [22:35:36] but the individual tests are also there if you want to run them separately [22:36:51] bawolff: thanks for clarifying :) [22:37:00] I'm trying to figure out the zuul config [22:37:19] but jenkins dislikes it https://gerrit.wikimedia.org/r/#/c/396486/ [22:37:34] You also removed the phpcs line [22:37:49] there was a 'composer-test' template but it also runs php55 and lego said we should avoid that [22:37:56] which I think is needed if you manually want to run phpcs [22:38:03] thought it was needed [22:38:11] s/needed/duplicate [22:38:18] re-adding in a sec [22:38:24] yeah, this won't work with php5 [22:39:46] bawolff: so inside test [] or outside? [22:39:50] composer phpcs I mean [22:39:59] Outside [22:41:15] Hauskatze: don't include the test in the check pipeline [22:41:46] the check pipeline should only run phpcs and phplint, not the general tests [22:42:08] for now it's probably fine to just run no tests from the check pipeline [22:42:27] you mean the zuul patch right? 
[22:42:45] Yeah [22:43:13] I think that's why jenkins is failing it [22:43:19] * Hauskatze feels overwhelmed :) [22:43:45] okay so getting rid of 'check' [22:46:52] +2 [22:47:31] looking at the composer phpcs [22:47:37] though I'm not sure here [22:47:57] phpcs -p -s is there, can't you run it standalone that way too? [22:50:01] It's just an alias because I'm lazy [22:50:24] done I think? [22:50:45] And I'm constantly screwing up code style so I actually use it as an alias a lot [22:52:29] bawolff: done hopefully [22:52:38] Oh, I meant the key should be just "phpcs", not "composer phpcs" [22:52:49] otherwise looks good [22:52:59] bawolff: there's PS11 [22:53:28] Also, you need to remove [wip] from commit msg [22:53:35] since it's ready :) [22:53:36] composer phpcs inside test and phpcs = phpcs -p -s [22:53:44] phew [22:55:01] done [22:55:27] https://gerrit.wikimedia.org/r/#/c/396455/12/composer.json [22:55:33] latest patchset [22:55:48] Thanks, looks good [22:55:56] everything? [22:56:22] I don't know enough about zuul config to comment on that [22:56:34] I'll run composer validate just to be sure [22:56:54] too late :P [23:00:11] there's a fixme at L26, submitting follow-up [23:02:27] I've also set automatic build updates https://gerrit.wikimedia.org/r/#/c/396533/ [23:02:33] if Lego approves [23:05:47] bawolff: https://gerrit.wikimedia.org/r/#/c/396543/ fixes the issue with Line 26 extra comma. [23:06:01] (I hate those, I always miss them or add them more than needed) [23:06:39] Worst part of the JSON standard
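The "extra comma" being fixed in that follow-up is the classic pitfall mentioned at the end: JSON, unlike PHP array literals, forbids a comma after the last element, which is why `composer validate` (or any strict parser) rejects it. A quick offline check using Python's stdlib parser; the package name is made up for illustration:

```shell
# A comma after the last pair makes this invalid JSON.
printf '{"require-dev": {"example/lint": "^1.0",}}' > bad.json
# json.tool exits nonzero when the input does not parse.
if python3 -m json.tool bad.json > /dev/null 2>&1; then
    result=valid
else
    result="invalid JSON"
fi
echo "$result"
```

Deleting the comma after `"^1.0"` makes the same check report valid, so running a parser like this (or `composer validate`) before uploading catches the mistake early.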