[01:18:05] hello
[09:43:50] spagewmf: there?
[09:44:15] galorefitz: kind of, it's 2:44am here :-)
[09:44:25] what up?
[09:45:36] Oh. Sorry :/ I wanted to know why we are required to convert .txt, .md and other formats to wikitext, since we won't be editing them on the page anyway
[09:45:55] https://phabricator.wikimedia.org/T91626 under possible/desirable features
[09:52:47] galorefitz: I rephrased the README.md example. The use case there is someone writes a .md file because it's easy and GitHub (and soon doc.wikimedia.org) formats it. But if you want to reuse the info on a wiki page someone has to convert the markup. Extension:GitHub shows it's possible to parse .md files and show their output in wiki pages.
[09:53:00] s/formats it/renders it/
[09:54:53] galorefitz: that line in T91626 could be mis-phrased. I'm not sure if Extension:GitHub converts .md to wikitext, or converts it to HTML that it inserts into the HTML output.
[09:55:37] spagewmf: I'll check that and get back to you. Sleep. I am sorry :P
[09:58:37] galorefitz: no worries, it's a good question. I don't know much about the parser. Also note that that is under "Possible/desirable features", i.e. not phase 1 :)
[10:00:29] spagewmf: Oh. Alright :)
[13:18:31] Anyone know how/where to get help with changing the callback URL for an OAuth consumer,
[13:18:35] ?
[13:20:19] hello
[13:20:35] I'm having issues with a Commons category
[13:20:38] https://commons.wikimedia.org/wiki/Category:Collection_compl%C3%A8te_des_portraits_des_Grands-Aigles_et_des_Grands-Officiers_de_la_L%C3%A9gion_d'Honneur
[13:20:52] this one is empty, and there should be 100+ files in there
[13:21:00] files like this one: https://commons.wikimedia.org/wiki/File:Btv1b8453983r-p133.jpg
[13:25:56] JeanBono, https://phabricator.wikimedia.org/T116001 ?
[13:25:57] JeanBono: maybe the job queue is more backlogged than usual; about 1 million now https://commons.wikimedia.org/w/api.php?action=query&meta=siteinfo&siprop=statistics
[13:26:44] ok, thanks
[13:47:03] my cat is full now
[13:47:07] see you
[13:50:55] meow ?
[14:16:49] I suspect DarkNeko has "cat" as a stalkword given his precedents ;)
[15:10:06] Category updates on Commons are taking ages - is there a problem?
[15:12:13] whee 1.3M https://commons.wikimedia.org/w/api.php?action=query&meta=siteinfo&siprop=statistics
[15:12:31] the job queue keeps increasing so it might be that nothing is really getting done, yes
[15:24:15] Thanks nemo bis
[15:27:36] there's a pretty large spike
[15:48:13] legoktm: due to the elastic stuff?
[15:51:25] I'm not sure
[15:52:05] there are a ton of "enqueue" jobs
[15:53:30] ebernhardson: is CirrusSearch using the enqueue job thing?
[15:54:07] legoktm: not sure what you mean by the "enqueue" thing, you mean just normal jobs?
[15:54:23] enqueue: 1368181 queued; 1 claimed (1 active, 0 abandoned); 0 delayed
[15:54:50] that shouldn't be us, afaik. but i don't know what that particular job is
[15:55:04] we do standard job insertions via: JobQueueGroup::singleton()->push()
[15:55:45] that's a no then
[15:57:21] Can someone look at the history of https://en.wikipedia.org/wiki/MediaWiki:Antispoof-conflict-bottom and explain why it did not render on the user creation page?
[15:57:52] I think it's refreshLinks ?
[15:58:59] it's still increasing
[15:59:37] maybe something got stuck and the rest is just accumulating?
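The siteinfo statistics URL quoted above is how the channel is watching the backlog. A minimal sketch (not from the log) of polling it for the job queue size, assuming Python with the third-party requests library; "jobs" is the counter being discussed:

```python
# Minimal sketch: poll the Commons job queue size via the API endpoint
# quoted in the conversation above. Assumes the `requests` library.
import requests

URL = "https://commons.wikimedia.org/w/api.php"

def job_queue_size():
    r = requests.get(URL, params={
        "action": "query",
        "meta": "siteinfo",
        "siprop": "statistics",
        "format": "json",
    })
    r.raise_for_status()
    # "jobs" is the approximate number of queued jobs site-wide.
    return r.json()["query"]["statistics"]["jobs"]

print(job_queue_size())  # e.g. ~1.3M during the spike discussed above
```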
[16:01:11] 2015-10-20 15:48:59 mw1010 enwiki redis ERROR: Redis exception on server "10.64.0.201" {"redis_server":"10.64.0.201"}
[16:01:16] * legoktm switches to -ops
[16:07:51] on the bright side, job runners drained the queue for all the other wikis :P
[16:22:59] Just gone over 1.5M, so it's adding at a rate of 200K per hour.
[16:23:42] More like 100K, I think
[18:05:16] Krenair: FWIW, it's better to delete spam Flow topics than just the posts in them, as otherwise they still show up.
[18:05:26] Krenair: (And thank you for your help.)
[18:05:44] Did I miss some topics?
[18:05:50] A couple.
[18:05:52] I was trying to get all the topics and all their comments
[18:05:53] oops
[18:05:55] I just killed them.
[18:05:57] No worries. :-)
[18:25:12] James_F: I did that
[18:25:14] Killed all the comments
[18:25:21] Then noticed the topics were still there
[18:25:22] fml etc
[18:34:00] Reedy: Yarp.
[19:03:50] James_F: HENCE!
[19:03:59] Filing the bug to get Flow support for Nuke
[19:03:59] :D
[19:04:03] Reedy: Good. :-)
[19:04:25] And then RoanKattouw_away marked it high <3
[20:17:33] fluff: just register a new application
[20:17:42] see also https://phabricator.wikimedia.org/T59631
[22:00:51] [[Tech]]; Raonjs; /* **/1~844.461.2828/**kodak printer Support phone number */ new section; https://meta.wikimedia.org/w/index.php?diff=14224715&oldid=14194258&rcid=6911422
[22:06:35] [[Tech]]; Krenair; Undo revision 14224715 by [[Special:Contributions/Raonjs|Raonjs]] ([[User talk:Raonjs|talk]]); https://meta.wikimedia.org/w/index.php?diff=14224784&oldid=14224715&rcid=6911441
[22:09:37] They need blocking
[22:09:48] locking, even
[22:10:32] No rollback on meta is annoying
[22:10:36] undo is too many clicks
[22:11:08] Reedy: Don't you have that globally?
[22:11:13] Nope
[22:11:14] https://meta.wikimedia.org/wiki/Special:CentralAuth/$1
[22:11:16] That looks broken
[22:11:37] https://meta.wikimedia.org/wiki/Special:Contributions/Raonjs
[22:11:45] $1 in most of the links at the bottom
[22:11:45] sigh
[22:13:09] https://meta.wikimedia.org/w/index.php?title=MediaWiki:Sp-contributions-footer&action=edit
[22:13:11] I can't edit it
[22:13:37] whee
[22:16:42] Reedy, I can, but why are we not filling out $1?
[22:16:59] No idea
[22:17:16] Has MW stopped passing $1?
[22:18:04] Looks right in the code
[23:27:04] bd808: you around? Re T116065 and "these are the exact reports we need", it sounds like you want a spec for how to aggregate API request parameters. But qgil and I know so little about Hadoop I'm not sure how to provide it.
[23:27:48] spagewmf: I'll take a shot at a first approximation and then we should meet to talk about it
[23:28:07] you can mostly ignore the Hadoop part and just think about SQL
[23:28:45] But I'm going to push back hard against this being a data mining feature. There is just too much sensitive data to store raw
[23:29:02] it needs to work mostly like our page view data
[23:29:43] It's ok to know about how many distinct user-agents there are, but not ok to know what a particular user-agent did exactly
[23:30:34] We will probably end up with one rollup table per report
[23:31:05] like a table that says user-agent X hit api.php 17 times during hour Y
[23:31:36] That would allow us to generate a distinct-user-agents-per-unit-time report
[23:31:55] and not lean anything about what that user-agent did exactly
[23:31:58] *leak
[23:32:40] that one would probably also have a column that told us if the IP was internal/labs/external
[23:33:13] bd808: yes, agreed all.
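bd808's rollup table ("user-agent X hit api.php 17 times during hour Y") amounts to counting (user-agent, hour) pairs. A hypothetical sketch of that aggregation in Python; the input format and function name are assumptions for illustration, not the real Hadoop pipeline:

```python
# Hypothetical sketch of the rollup bd808 describes: count api.php hits
# per (user-agent, hour) so reports can rank distinct user-agents per
# unit time without storing what each user-agent did exactly.
from collections import Counter

def rollup(request_log):
    """request_log yields (user_agent, unix_timestamp) pairs (assumed)."""
    counts = Counter()
    for user_agent, ts in request_log:
        hour = ts - ts % 3600          # truncate to the containing hour
        counts[(user_agent, hour)] += 1
    return counts                      # one entry per rollup-table row

# Each (user_agent, hour) -> count is one row of the rollup table,
# e.g. ("FooBot/1.0", 1445356800) -> 17. The internal/labs/external IP
# class bd808 mentions would just widen the key by one more column.
```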
Re a spec for "Ranking of most requested actions/parameters", I'm not sure how to think about action=parse&prop=modules|jsconfigvars
[23:33:35] that one is going to be tricky I agree
[23:33:53] bd808: whew, I thought it was just me 8-)
[23:34:04] we need to figure out what would actually be useful there in an aggregate report
[23:40:08] bd808: I think we don't care about which props were requested together, so have action and details columns, and increment multiple counters for 'parse,modules' and 'parse,jsconfigvars' for one API request. Query is a special case, maybe a submodule column?
[23:40:39] *nod*
[23:41:21] bd808: O RLY? That barely made sense to me!
[23:41:58] It's along the lines of what I was thinking. Basically we need to figure out the interesting tuples and count them
[23:43:26] for the action=parse&prop=modules|jsconfigvars example we might want to count 3 tuples: (parse, null), (parse, modules), (parse, jsconfigvars)
[23:43:30] k, well it's in good hands. I think anomie and I can trawl through the API modules coming up with the tuple-thingies
[23:44:15] that would be very useful
[23:47:58] It takes every kinda tuple, to make the world go 'round ♩♫ (Robert Palmer)
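To make the tuple counting at 23:43:26 concrete: one request for action=parse&prop=modules|jsconfigvars increments three counters. A sketch under the assumption that raw query strings are available to the report job; the function name count_tuples is hypothetical:

```python
# Sketch of the tuple counting discussed above: a request with
# action=parse&prop=modules|jsconfigvars increments three counters:
# (parse, None), (parse, "modules"), (parse, "jsconfigvars").
from collections import Counter
from urllib.parse import parse_qs

def count_tuples(query_strings):
    counts = Counter()
    for qs in query_strings:
        params = parse_qs(qs)
        action = params.get("action", ["?"])[0]
        counts[(action, None)] += 1            # the bare action itself
        # Multi-valued parameters are pipe-separated in the MediaWiki API.
        for prop in params.get("prop", [""])[0].split("|"):
            if prop:
                counts[(action, prop)] += 1
    return counts

print(count_tuples(["action=parse&prop=modules|jsconfigvars"]))
# Counter({('parse', None): 1, ('parse', 'modules'): 1,
#          ('parse', 'jsconfigvars'): 1})
```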