[11:09:06] What's the correct product/component category in Bugzilla for jQuery.IME?
[11:09:54] se4598_2: MW Extensions -> UniversalLanguageSelector
[15:59:39] help: I'm trying to follow https://www.mediawiki.org/wiki/Wikimedia_engineering_report/How_to#Fill_out_the_Metrics_box so I'm doing ssh ssh://username@gerrit.wikimedia.org:29418/mediawiki/core.git but I get the error ssh: Could not resolve hostname gerrit.wikimedia.org:29418/mediawiki/core.git: Name or service not known
[16:01:08] sumanah: should be ssh gerrit.wikimedia.org -p 29418
[16:01:13] yeah, i'm fixing
[16:01:27] ugh, so confusing that mediawiki.org and enwiki are different VE versions!
[16:02:08] ori-l: so, something like this?
[16:02:11] ssh ssh://sumanah@gerrit.wikimedia.org/mediawiki/core.git -p 29418 "gerrit query ....."
[16:02:34] sumanah: fixed
[16:02:40] sumanah: no, see what i changed it too
[16:02:41] to*
[16:03:01] but what if I just want core?
[16:03:25] you have to find a way to specify that in the query itself
[16:03:27] you have project:^mediawiki.* which will include extensions that don't count here
[16:03:44] so, project:^mediawiki.*" becomes project:mediawiki/core
[16:03:46] i guess
[16:03:48] ok, so, project:^mediawiki/core.*
[16:04:00] no, try with just what i wrote
[16:04:12] but either should work because core has no children
[16:04:17] sumanah, I don't think you can ssh into /mediawiki/core.git :)
[16:04:25] from my experience searching Gerrit, that grabs all extensions in addition to core, jeremy
[16:04:38] sumanah: project:mediawiki/core does?
[16:05:06] project:^mediawiki/core.* can't/shouldn't match anything other than core...
[16:05:14] jeremyb: you're being confusing. you said "just try with what I wrote" and I'm telling you that "project:^mediawiki.*" matches extensions as well as core.
[16:05:31] sumanah: 02 16:03:44 < jeremyb> so, project:^mediawiki.*" becomes project:mediawiki/core
[16:05:33] and "project:^mediawiki.*" is what you wrote.
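The fix worked out above can be sketched as follows. Only the host/port shape comes from the log; the username and the trailing query body are illustrative placeholders, since the actual query is elided ("....." in the log).

```shell
# The broken form embedded the port in a URL-style hostname, which plain
# ssh cannot resolve:
#   ssh ssh://username@gerrit.wikimedia.org:29418/mediawiki/core.git   # fails
# Gerrit's SSH interface takes the port via -p and the query as a remote
# command instead (username and query body are illustrative):
GERRIT_CMD="ssh -p 29418 username@gerrit.wikimedia.org gerrit query status:merged project:mediawiki/core"
echo "$GERRIT_CMD"
# Running it requires a Gerrit account; here we only show the shape.
```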
[16:05:44] sorry, i wrote more than one thing :)
[16:06:05] yeah. specificity is a better thing in general.
[16:06:16] ^mediawiki.* will match all mediawiki.* repos, yes. including core, extensions, etc.
[16:07:00] of course it will also match mediawiki/php/*, mediawiki/skins/*, mediawiki/rcsub, mediawiki/tools/*, mediawiki/vagrant, etc.
[16:07:23] jeremyb: your usage of "becomes" is confusing when you remember that you are talking about a regex and thus "becomes" is a slangy way to talk about a regex expanding to something else.
[16:09:39] sumanah: so, should be 9 for core and 18 for core+extensions. as of now
[16:09:48] are you seeing the same thing?
[16:11:26] jeremyb: did you change the date?
[16:11:32] no
[16:11:57] also, that number doesn't make sense.
[16:11:59] i am
[16:12:39] The number should be around 110-130.
[16:15:12] $ for repo in '^mediawiki.*' mediawiki/core; do echo -n "$repo: "; ssh wmf-gerrit "gerrit query NOT age:40d status:merged project:${repo}" | grep -B4 "createdOn: $(date -d '-27 day' +%Y-%m-)" | grep username | sort | uniq | wc -l; done
[16:15:16] ^mediawiki.*: 114
[16:15:18] sumanah: are those sane #s?
[16:15:20] mediawiki/core: 48
[16:15:49] what are you guys doing?
[16:16:04] there is a total of 394 unmerged changes in mediawiki/core alone
[16:16:07] MatmaRex: quote from the start of this session: help: I'm trying to follow https://www.mediawiki.org/wiki/Wikimedia_engineering_report/How_to#Fill_out_the_Metrics_box so I'm doing ssh ssh://username@gerrit.wikimedia.org:29418/mediawiki/core.git but I get the error ssh: Could not resolve hostname gerrit.wikimedia.org:29418/mediawiki/core.git: Name or service not known
[16:16:12] and 1283 in all of gerrit
[16:16:16] this is not about unmerged changes, Bartosz.
[16:17:13] At least, I believe the thing I'm trying to get is # of unique committers.
[16:17:15] sumanah: that isn't a valid command, i think.
[16:17:28] MatmaRex: i fixed it onwiki
[16:17:32] Yes, we figured that out.
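jeremyb's one-liner above, unrolled with comments. This is a sketch, not the canonical on-wiki version: `wmf-gerrit` is assumed to be an ssh host alias (host gerrit.wikimedia.org, port 29418, set in `~/.ssh/config`), and the `grep`-based date filter relies on `gerrit query`'s default text output showing one `username:` line per change.

```shell
# Month prefix for the month it was ~27 days ago, e.g. "2013-07-".
# Relies on the local clock (GNU date); must be run early enough in the
# month that the 40-day window still covers all of the previous month.
month_prefix() {
  date -d '-27 day' +%Y-%m-
}

# Count unique committers of changes merged in that month for one repo.
# "wmf-gerrit" is an assumed ssh alias for gerrit.wikimedia.org:29418.
count_committers() {
  repo="$1"
  ssh wmf-gerrit "gerrit query NOT age:40d status:merged project:${repo}" \
    | grep -B4 "createdOn: $(month_prefix)" \
    | grep username | sort | uniq | wc -l
}

# Requires Gerrit SSH access; with the alias configured, run:
#   for repo in '^mediawiki.*' mediawiki/core; do
#     printf '%s: ' "$repo"; count_committers "$repo"
#   done
```

Note that `^mediawiki.*` is a Gerrit regex project match (everything under `mediawiki/`), while `mediawiki/core` names the one repo exactly, which is why the two counts differ.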
[16:17:48] why don't you use the REST API?
[16:17:55] now i tweaked even more on wiki
[16:17:56] https://gerrit.wikimedia.org/r/Documentation/rest-api-changes.html#list-changes
[16:18:06] * sumanah has evidently caused others to decide to fix this problem more permanently :)
[16:18:20] * sumanah sits back and awaits others updating the docs so that she and future report-writers will have an easier time
[16:18:37] sumanah: well, do those #s look plausible?
[16:19:35] apergos, parent5446: hello
[16:19:56] i have to run in a couple mins...
[16:20:00] * jeremyb installs shoes
[16:20:06] svick: hey
[16:20:08] jeremyb: they do, but now I am suspicious, because it looks like the previous query specifically wanted to only look at core, so why was it getting different numbers.... oh well.
[16:20:33] Thank you Jeremy.
[16:20:36] sumanah: i don't follow. it was looking at stuff from april changed in the last 40 days. which is bound to be low
[16:21:38] jeremyb: the documentation of the query that was at https://www.mediawiki.org/wiki/Wikimedia_engineering_report/How_to#Fill_out_the_Metrics_box implies that it is only looking for patches created in a particular month, and you're supposed to change it to reflect the month you are reporting on.
[16:22:46] sumanah: right. but now i changed it so that it just automatically looks at whatever month it was 27 days ago. relies on your local machine clock being accurate. and it has to be run soon enough after the start of the month because of the 40 day window
[16:23:34] $ date -d '-27 day' +'%Y-%m-%d'; date -d '-27 day' +'%Y-%m-'
[16:23:34] 2013-07-06
[16:23:34] 2013-07-
[16:24:00] jeremyb: then could you please update the English line saying "but then filtering for createdOn: 2013-04 restricts it to April, for instance." to reflect what your script actually does? :)
[16:24:59] are you sure you saw the latest version with the '-27 day' in it?
[16:25:00] hello
[16:25:03] rats.
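The REST endpoint MatmaRex points at (the rest-api-changes.html link above) can serve the same query over HTTPS instead of SSH. A sketch of the equivalent request; the specific query operators are illustrative, carried over from the ssh version earlier in the log:

```shell
# Gerrit's "List Changes" REST endpoint takes the same search operators
# as "gerrit query" in the q= parameter; n= caps the number of results.
BASE="https://gerrit.wikimedia.org/r/changes/"
QUERY="status:merged+project:mediawiki/core+-age:40d"
URL="${BASE}?q=${QUERY}&n=500"
echo "$URL"
# Fetching needs network access. Gerrit prepends the line ")]}'" to all
# REST JSON responses as XSS protection, so strip the first line before
# parsing, e.g.:
#   curl -s "$URL" | sed 1d | jq length
```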
[16:25:07] there's no april in there any more
[16:25:15] mm not toooo late
[16:25:15] sumanah: anyway, really have to run
[16:25:18] bye jeremyb
[16:25:46] (yes, that line is still in there)
[16:26:14] apergos, parent5446: today i'm working on what we talked about yesterday: reading directly from MediaWiki using dumpBackup and fetchText
[16:26:28] \o. I read about the new "secure browsing" deal, I have a few suggestions.
[16:26:41] how is that looking?
[16:26:42] hi stanley
[16:26:51] hey!
[16:27:05] svick: mhm
[16:27:06] you read our blog post, then?
[16:27:07] sorry if this is totally offbeat or similar but I read something about finding ways to provide a more secure service to fol
[16:27:10] k
[16:27:11] yeah, haha
[16:27:29] https://blog.wikimedia.org/2013/08/01/future-https-wikimedia-projects/ for those in this channel who haven't seen it
[16:27:31] i had to find a library for running processes (Linux and Windows do it differently), but now it's looking like it's going to be okay
[16:27:41] stanley: a more secure service to fol? I don't know that acronym
[16:28:10] I'm not sure how hard it would be to implement disabling editing for it or something like that but a Tor/I2P hidden service for reading (at least, editing wouldn't be as important) could do wonders.
[16:28:21] sumanah: typo. "more secure way to follow wikimedia sites"
[16:28:47] sumanah: Just read that blog post. The only way to achieve step 1 is through one of two patches, both of which are -2ed right now. Do you know when/how they're gonna resolve that?
[16:29:27] parent5446: Greg Grossmeier, the release manager, is the person to ask about that sort of thing
[16:29:30] greg-g: ^
[16:29:54] greg-g: You here?
[16:30:20] svick: out of curiosity, what process library are you using?
[16:31:38] well, i found 3, but two of them required big frameworks (boost and something called POCO); so i went with the small one, it's called libexecstream: http://libexecstream.sourceforge.net/
[16:32:03] parent5446: hi, what's up?
[16:32:07] Mhm, yeah boost is useful but it's very big. This looks interesting
[16:32:11] ah so you compile it in
[16:32:12] parent5446: oh, re ssl/tls etc
[16:32:17] greg-g: yeah
[16:32:37] apergos: yeah, it's just a few files
[16:32:39] Has any decision been made on my patch v. Demon's patch?
[16:32:50] parent5446: neither of those patches are complete enough to do it the way it needs to be done for our infrastructure, so we're working on making one of them better.
[16:33:11] greg-g: What's missing?
[16:33:14] parent5446: I *think* ^d is going to work on his a bit (adding some things to it)
[16:33:22] I, uh, forget
[16:33:31] csteipp and ^d could tell you more ;)
[16:33:53] Anyone got any ideas on my suggestion? I think it could be beneficial.
[16:34:28] stanley: a Tor/I2P hidden service?
[16:34:33] I'm not sure what I2P means
[16:34:43] sumanah: Yeah. I2P is a Tor-like network but focused on hidden services.
[16:34:50] What does it stand for?
[16:34:56] Invisible Internet Project
[16:35:04] sumanah: http://www.i2p2.de/
[16:35:14] Very similar to Tor.
[16:35:37] Discussion from December: http://www.gossamer-threads.com/lists/wiki/wikitech/323006 "Can we help Tor users make legitimate edits?"
[16:35:59] I was thinking of only allowing viewing until that problem is resolved. Reading now.
[16:36:00] tl;dr from that thread: It's unlikely to happen in the near future.
[16:36:01] parent5446: just voiced with ^d, it'll be a combo of the two, probably working from his, pulling in some of the ideas from yours (but not including the other features like groups)
[16:36:22] stanley: we welcome your help
[16:36:50] greg-g: Wonderful. That means I'm gonna have to make an entirely new patch just to fix bug 31323.
[16:37:06] apergos, parent5446: do you have anything else?
[16:37:15] svick: not that I can think of.
[16:37:23] sumanah: you has new docs
[16:37:58] stanley: and popularizing Tor for *reading* sounds like a great idea... are you thinking of running outreach campaigns and teaching people how to use Tor?
[16:38:02] not at this point
[16:38:05] apergos, svick: Well then see you guys tomorrow.
[16:38:12] sumanah: For disabling anonymous editing via a hidden service? I imagine that wouldn't be too hard. As for resolving the issue with editing anonymously.. I can't tell you. my thoughts are that users which are verified normally should be able to do it OK as they wouldn't be able to register.. so it'd have no impact (just block them).
[16:38:15] monday!
[16:38:21] see you monday
[16:38:30] I'll figure out whether it's worth the unnecessary effort to attempt to improve MW security when I'm not exhausted.
[16:38:36] stanley: I'm sorry, I misunderstand you I think
[16:38:38] sumanah: Yes. I'm glad to see how easy it has become, too.
[16:38:38] parent5446: yeah, exactly (re 31323)
[16:38:43] sumanah: oh ok
[16:38:57] stanley: when I said "we welcome your help" I just meant in general
[16:39:08] TIL fixing more than one bug at a time in MediaWiki is not allowed.
[16:39:35] parent5446: hey, it's allowed, just sounds like it's a bunch of weird dependencies in this particular case? go get unexhausted & come back later :)
[16:39:44] depends, but especially for security related things, the smallest reviewable chunk is usually good.
[16:39:46] ah, yes. Wikipedia in particular interests me. :)
[16:40:18] oh, dang,
[16:40:31] was going to say that at least the patch for 31323 will be easier now
[16:40:37] * greg-g shrugs
[16:40:48] stanley: :) might be worth your while to read that thread and dig around a little bit
[16:40:51] was bound to happen after the -2's being thrown around
[16:41:05] sumanah: Aye. Hang on.
[16:44:29] sumanah: Seems focused mainly on not being able to solve the editing problem. While interesting, I think simply anonymous *viewing* could be helpful.
[16:44:45] I think you're right, more anonymous viewing could be helpful
[16:44:55] stanley: check out https://blog.wikimedia.org/2013/08/01/ieg-learnings-call-new-proposals/
[16:45:23] stanley: I think "I want to run twenty how-to-use-Tor workshops to teach marginalized, surveilled people how to use Tor to read Wikipedia" could be a reasonable request
[16:46:15] I'm not in a position to go find twenty marginalized, surveilled people, sadly(?).
[16:46:24] I'm more of a technical person
[16:47:21] stanley: in my opinion we really need more teachers, designers, translators, and customer service people to work on this, but of course developers are useful too :)
[16:47:49] Well you can already read Wikipedia via tor without a hidden service. Not that I'm saying the idea of running a hidden service should be ignored, but it's unlikely to change anything drastically.
[16:47:50] do you work on Tor itself?
[16:47:54] I guess I could be a custom service people.. but mostly a developer/sysadmin type.
[16:47:57] no
[16:48:35] BlastHardcheese: It'd take the issue of HTTPS issues out of the situation.
[16:49:10] stanley: do you already know about http://openitp.org/ ? http://www.openitp.org/openitp/about-the-open-internet-tools-project.html
[16:49:15] What issues are there with reading over HTTPS?
[16:49:55] BlastHardcheese: ssl to wikipedia is blocked by great firewall of china
[16:50:18] sumanah: Yeah; what they're doing is looking out to be pretty interesting. Last I checked, they hadn't funded any projects yet and were still warming up.
[16:50:20] Yes, but they attempt to block tor entry nodes too
[16:51:50] stanley: they've run several skillshare and hacking meetings, and they've granted money to several projects now.
So the warmup phase is over I think :)
[16:52:04] (I have an advantage here; I co-work there on Thursdays)
[16:52:31] Ah. The folk over at I2P are going to be applying hopefully.
[16:52:35] cool
[16:52:52] good to see they're up and going
[16:53:30] "The PRB is still ramping up right now. You can track our progress at the PRB Wiki. We will be looking for both paid and pro bono auditors, and projects interested in being audited. Feedback? Email ella at openitp dot org."
[16:54:16] If you're using tor to connect to Wikipedia, the only things HTTPS adds are a) encrypting the traffic between the tor exit node and Wikipedia, and b) authenticating the connection so you know you're connecting to Wikipedia instead of a man-in-the-middle
[16:54:46] a) is a big factor, potentially. Especially if you're logging in.
[16:55:55] yes, but Wikimedia blocks tor exit nodes for logging in (I'm not sure if ipblock exemptions on accounts still apply to that, but in any case you can only get those by request)
[16:56:49] yeah, you can request those, but it's hard
[16:57:03] ie: you need a reputation, which is hard to get anonymously ;)
[16:57:11] so for people just reading the site over tor they're going to be logged out anyway, most likely
[16:57:12] Allowing logging in to verified accounts which have made $X edits would be nice.
[16:57:12] Or, alternatively, someone who has requested. :P
[16:57:15] greg-g: and if I recall correctly, some part of this is WMF-wide and some part of it is wiki-specific?
[16:57:30] stanley: did you already read the thread I mentioned?
[16:57:46] sumanah: right, the ip exemption policy I'm thinking of is enwiki
[16:58:39] tor exit nodes are blocked on the all-Wikimedia level I believe
[16:59:06] since they have a long, long history of being used by serial abusers, on multiple projects
[16:59:21] (unrelatedly: we've had two "The future of..."
blog posts on the WMF wiki, one for SSL and one for Release Management :) )
[16:59:31] anybody who wants to summarize the current situation and email wikitech-l: this would be a good idea
[16:59:42] so it isn't just a hard-to-scroll IRC log
[16:59:57] https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[17:00:00] stanley: ^
[17:00:03] sumanah: I don't see any conclusions sorry
[17:00:06] did I read wrong?
[17:01:36] stanley: I infer that you mean there weren't any conclusions in http://www.gossamer-threads.com/lists/wiki/wikitech/323006 ? Yeah I think you're right. I just wanted you to know everything in there, including about the state of leniency at https://en.wikipedia.org/wiki/Wikipedia:Request_an_account
[17:01:59] back later.
[17:03:30] sumanah: Right. I think Request_an_account is definitely a start though.
[17:03:36] oops