[00:03:16] Reedy, are you there?
[00:03:31] yes
[00:03:49] Sorry, I still have not got my account working
[00:04:03] the password did not work
[00:05:16] the sent password did not work
[00:05:38] And the password reminder is producing "A password reminder has already been sent within the last 24 hours. To prevent abuse, only one password reminder will be sent per 24 hours."
[00:07:47] Hopefully this time it should work
[00:09:52] okay!!
[00:10:01] Thanks so much
[00:10:27] Just got my WikiMath working on my own wiki too, lol!
[00:10:30] Wheee
[00:10:38] thank you, Sir
[00:10:51] to
[00:12:08] Cheers Reedy
[00:15:48] reedy, will the memcache error resolve itself naturally, or is it going to need a shove? I saw it when I was doing spambot cleanup yesterday, but I have no idea where I was
[00:16:30] memcache re gadgets
[00:17:05] It should fix itself, but for some reason the gadgets one isn't so proactive about it
[00:17:16] It's happened before and I logged a bug for it a while back
[00:17:44] yes, and I was watching that bug, which is how I saw the reference to the new one
[00:18:08] there have been a few occurrences today after one memcached server was down for a while
[00:19:05] it would be nice if there were a system means to touch all the pages
[00:19:32] I did write something for translatewiki before to purge every page using the api...
[00:19:35] but that's extreme
[00:20:34] I was meaning the 700+ MediaWiki:Gadgets-definition pages
[00:20:56] Same idea, different scope
[00:21:42] I will note it for stewards and global sysops
[00:22:35] I do wonder if just explicitly deleting the memcached keys would do it
[00:23:31] <- does his best dumbf look
[03:57:56] hi jayne. are you the right person to ask for a bot cloak?
[04:01:48] ori-l: if you want a wikimedia/bot/* cloak, you have to get it through the GCs
[04:02:05] GCs?
[04:02:18] group contacts
[04:02:18] https://spreadsheets.google.com/viewform?hl=en&formkey=dG1FTWV1RnNBVHFOSnExMHF6aUhya2c6MA
[04:02:22] that's the form
[04:02:26] https://meta.wikimedia.org/wiki/IRC/Cloaks
[04:02:51] legoktm: ah, neat. thanks.
[04:04:14] np
[07:19:05] what's the next office hour or other community communication about Echo? (I want to tell it.wiki)
[11:11:37] here's something to chew on: http://en.wikipedia.org/wiki/WP:VPT#Gadgets_in_the_Slovene_Wikipedia
[11:19:51] [[Tech]]; This, that and the other; /* Identifying wikis with local titlebacklists and issues within */ comment; https://meta.wikimedia.org/w/index.php?diff=5473934&oldid=5464896&rcid=4167294
[11:21:09] yes, that's me, but the two are unrelated issues
[12:30:42] ^ Note for any watchers. Known issue, and it is fixed by a null edit to MediaWiki:Gadgets-definition
[13:25:13] * Susan null edits Reedy.
[13:33:12] [[Tech]]; Anomie; /* Identifying wikis with local titlebacklists and issues within */ re x 2; https://meta.wikimedia.org/w/index.php?diff=5474126&oldid=5473934&rcid=4167641
[14:56:04] [[Tech]]; Anomie; /* Identifying wikis with local titlebacklists and issues within */ list; https://meta.wikimedia.org/w/index.php?diff=5474186&oldid=5474126&rcid=4167721
[15:30:37] Hey guys
[15:31:20] I've tried purging a thumbnail, but to no avail. I believe there is a bug in the software, wanna check it out?
[15:32:31] Anyone?
[15:35:42] Bharel: just link it
[15:36:38] http://en.wikipedia.org/w/index.php?title=Hypercube_graph&action=edit&section=1
[15:36:57] I eventually had to change it to 241 px in there.
[15:37:10] 240 px changes the image back to the old thumbnail
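
(A minimal sketch of the kind of API-driven purge Reedy mentions above, assuming MediaWiki's standard action=purge module and the third-party requests library; the endpoint and page titles are only illustrative, and note the gadgets cache issue above reportedly needed a full null edit rather than a plain purge:)

    import requests

    # Illustrative values: any MediaWiki api.php endpoint and a list of
    # pages whose caches should be refreshed.
    API = "https://meta.wikimedia.org/w/api.php"
    titles = ["MediaWiki:Gadgets-definition"]

    # action=purge must be sent as a POST; forcelinkupdate also refreshes
    # the links tables, which is roughly what a null edit achieves.
    resp = requests.post(API, data={
        "action": "purge",
        "titles": "|".join(titles),
        "forcelinkupdate": 1,
        "format": "json",
    })
    print(resp.json())
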
[19:45:02] Hello
[19:45:28] hi
[19:45:32] Hi Automatik
[19:45:53] Does anybody know how to use an includeonly balise at one level of inclusion (sorry for the English)?
[19:46:06] one and no more
[19:47:15] hi Automatik
[19:47:28] Can you use a different word? I don't know what "balise" means
[19:47:56] ah, sorry, tag
[19:49:19] ok, so you want to edit a wiki page and use an includeonly tag that only goes one layer deep?
[19:49:26] * sumanah still isn't totally sure how to help, sorry
[19:50:15] Ok, thank you for your attention
[19:53:05] maybe you have seen a template that uses something like <includeonly> example </includeonly> (or another example where includeonly and noinclude tags are nested one within the other)
[19:56:45] Automatik: I think #mediawiki might be a better place for this
[19:56:49] ask again there?
[20:04:15] sumanah: thanks, I'll ask there
[20:04:21] Good luck
[20:17:45] ^demon: ram: are your java skills around? I've got a Jenkins thread at 100% CPU; I've got its system number but I have no idea how to find out what that thread is doing :(
[20:20:10] <^demon> Hmm, jstack will let you see what's in the stack right now.
[20:20:18] <^demon> jmap is useful for looking at the heap.
[20:22:47] <^demon> jstack -F 6633 isn't attaching... which sounds bad.
[20:23:50] <^demon> Ah, wrong proc #.
[20:24:48] <^demon> `jstack 6635` was a bit more useful.
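
(One way to map that "system number" to a Java stack, not discussed in the log itself: on Linux, jstack prints each thread's native id as a hex nid field, so the LWP id from `top -H -p <pid>` can be matched against the dump. A minimal sketch under that assumption; the pid and thread id below are made up:)

    import subprocess
    import sys

    # Hypothetical values: the Jenkins JVM pid and the busy thread's
    # system (LWP) id, e.g. as reported by `top -H -p <pid>`.
    jvm_pid = 6635
    hot_tid = 6700

    nid = "nid=0x%x" % hot_tid  # jstack prints native thread ids in hex

    dump = subprocess.check_output(["jstack", str(jvm_pid)]).decode()

    # Thread entries in a jstack dump are separated by blank lines;
    # print the one whose header carries the matching nid.
    for block in dump.split("\n\n"):
        if nid in block:
            print(block)
            break
    else:
        sys.exit("thread %d not found in the dump" % hot_tid)
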
[20:58:36] hello, persons
[20:59:35] hello, person
[21:05:25] Please give me access to #48306
[21:05:30] (bugzilla)
[21:05:42] https://bugzilla.wikimedia.org/show_bug.cgi?id=48306
[21:07:50] (if it is about chunked async upload)
[21:09:32] done
[21:11:08] thank you
[21:12:41] I don't understand: why does https://www.wikidata.org/wiki/Special:DispatchStats *always* have the "Oldest" entry in "Change log statistics" being *exactly* 72 h in the past?
[21:15:59] Data not kept indefinitely?
[21:35:46] Reedy: still, that would mean there is always at least a 3-day backlog?
[21:35:55] which is however truncated
[21:36:47] You should really just ask in #wikimedia-wikidata
[21:38:48] went for https://bugzilla.wikimedia.org/show_bug.cgi?id=45892#c14 instead
[22:36:11] [[Tech]]; Patrick; Notifications become a mess; https://meta.wikimedia.org/w/index.php?diff=5474853&oldid=5474186&rcid=4168576
[23:13:13] How do I replace a project's favicon? Looks like they live at bits.wikimedia.org/favicon/. How do I put things there?
[23:14:33] mediawiki-config.git
[23:14:40] docroot/bits/favicon
[23:14:58] kaldari: Which project?
[23:15:05] officewiki
[23:15:09] Ah.
[23:15:19] remember there is favicon.php too
[23:15:20] It's probably using the default (wmf.ico).
[23:15:23] Krinkle|detached: made that
[23:15:38] Krenair: thanks!
[23:15:55] 'officewiki' => '//bits.wikimedia.org/favicon/office.ico',
[23:16:01] Guess not.
[23:16:08] You'll want to update that file, I guess.
[23:20:53] We should probably normalise that further
[23:25:20] Susan: thanks, I just replaced the file
[23:29:36] Someone needs to normalize Reedy.
[23:30:33] Krenair: Hi.
[23:30:44] Krenair: Can I get you to add a column to the page table?
[23:31:42] <^demon> New column on page?
[23:31:46] Yes.
[23:32:15] <^demon> For what? :)
[23:32:23] Storing the first revision of the page.
[23:32:46] <^demon> Hmm
[23:33:59] https://bugzilla.wikimedia.org/show_bug.cgi?id=42135
[23:36:53] ^demon: Then you can write a script to populate it
[23:37:05] Then you can make MediaWiki populate it on new page creation
[23:37:24] Then you can make MediaWiki update that revision when the original revision gets deleted
[23:38:17] At least OSC should be able to handle doing that to the page table
[23:38:55] I feel like purging can update the value.
[23:39:06] It should be a fairly cheap lookup.
[23:40:23] select * from revision where rev_page = 12345 order by rev_timestamp ASC limit 1
[23:40:54] I'd use MIN(), but sure, whichever.
[23:41:09] There's something about rev_id and rev_timestamp not always being as chronological as you'd hope.
[23:41:15] there should be a column for the second revision too
[23:41:32] Hi domas!
[23:42:23] <^demon> domas: Or maybe a whole new table? revision_page(rev_id,rev_num,page_id) :)
[23:43:02] rp_page,rp_id,rp_num *
[23:43:33] Page creation follows the zero–one–infinity rule.
[23:43:38] And it's a pretty common request.
[23:47:15] nowadays IDs are monotonic
[23:47:27] timestamp is kind of odd
[23:47:56] I can't remember how page histories sort in MediaWiki, but I think it's based on rev_timestamp.
[23:51:01] there's an index
[23:51:08] it allows you to fetch that row
[23:52:05] Right, but the use case is aggregating based on that data.
[23:52:11] So individual lookups seem like a poor idea.
[23:52:57] what do you mean, aggregating?
[23:53:17] ordering in some way, pages created at X time?
[23:53:38] I'd like to be able to generate "pages created by author X".
[23:53:47] Or broadly, all users by article count.
[23:54:12] you can generate that now :)
[23:54:19] Right, it's just costly.
[23:54:24] you can generate that really cheaply
[23:54:28] How?
[23:54:56] I created a separate table a year or two ago to track page_id and page_creator.
[23:55:00] But it's awful to maintain.
[23:55:06] I'd like something built into MediaWiki.
[23:55:24] well, do you want one pass for all data
[23:55:26] or on demand?
[23:55:37] One pass for all data.
[23:56:18] well, sql methods are in http://jan.kneschke.de/2007/2/15/groupwise-max/
[23:56:25] what I'd do is just join in an application :)
[23:56:39] map-reduce!
[23:56:50] Anything to avoid an index? :-)
[23:56:50] hehe
[23:57:00] what do you mean?
[23:57:02] there is an index
[23:57:05] I think indexing the first revision of a page is a fine approach.
[23:57:14] it is indexed
[23:57:24] in the (rev_page,rev_timestamp) index
[23:57:49] there's a patch for mysql that makes very efficient loose scans/joins
[23:57:58] I hope it makes it into 5.7 :)
[23:58:04] I don't see how to efficiently query "Wikipedians by article count" with a revision table that's 600 million rows.
[23:58:14] you aggregate the counts, of course
[23:58:25] In a single query?
[23:58:32] well, you store that data somewhere
[23:58:47] Right, so I have to query 30 million pages or something.
[23:58:50] in my world, if you run count(*), you're already doing something wrong
[23:59:06] well, it is a very simple graph task
[23:59:10] on one side you have users
[23:59:13] on the other side you have pages
[23:59:14] I want to store the first revision.
[23:59:19] you want to have edges of first revisions
[23:59:27] and you can count how many first revisions each user has
[23:59:46] Right, but that requires looking up that information for each page.
[23:59:55] I'm suggesting storing that information to make the lookup much faster.
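
(A minimal sketch of the one-pass "join in the application" aggregation domas describes, assuming revision rows are streamed in (rev_page, rev_timestamp) order, which is the order the existing index already provides, so the first row seen for each page is its first revision; the rows at the bottom are toy data standing in for a real query result:)

    from collections import Counter

    def creators_by_article_count(revisions):
        """One pass over (rev_page, rev_timestamp, rev_user) tuples that
        are pre-sorted by (rev_page, rev_timestamp). The first row seen
        for each page is its first revision; tally those per user."""
        counts = Counter()
        last_page = None
        for rev_page, rev_timestamp, rev_user in revisions:
            if rev_page != last_page:
                counts[rev_user] += 1  # rev_user created this page
                last_page = rev_page
        return counts

    # Toy rows standing in for a streamed result of e.g.
    # SELECT rev_page, rev_timestamp, rev_user FROM revision
    # ORDER BY rev_page, rev_timestamp;
    rows = [
        (1, "20130501000000", "Alice"),
        (1, "20130502000000", "Bob"),
        (2, "20130503000000", "Bob"),
    ]
    print(creators_by_article_count(rows).most_common())
    # e.g. [('Alice', 1), ('Bob', 1)]
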