[02:14:04] k
[02:14:51] could anyone explain to me how I can get the output of Special:PrefixIndex through the API?
[02:57:11] Vogone: allpages should do it, I think.
[02:58:09] https://www.mediawiki.org/w/api.php?action=query&list=allpages&apprefix=Ba&aplimit=100
[06:20:11] [[Tech]]; BrandenburgConcerto; /* Remove sidebar (similar to Nostalgia) */; https://meta.wikimedia.org/w/index.php?diff=5594802&oldid=5593629&rcid=4306557
[06:38:06] [[Tech]]; Rschen7754; Reverted changes by [[Special:Contributions/BrandenburgConcerto|BrandenburgConcerto]] ([[User talk:BrandenburgConcerto|talk]]) to last version by MiszaBot; https://meta.wikimedia.org/w/index.php?diff=5594864&oldid=5594802&rcid=4306620
[12:15:36] when looking at http://dumps.wikimedia.org/enwiki/20130604/ I wonder why you supply bz2 dumps when 7z (xz I assume) looks to be much better?
[12:37:31] AzaToth: because bz2 is block oriented, which means
[12:37:37] just a sec, I think it's a FAQ
[12:37:58] http://meta.wikimedia.org/wiki/Data_dumps/FAQ#Why_do_you_still_use_bzip2_and_not_just_7z_compression.3F
[12:38:01] here ya go
[12:39:03] the "new format" to come out of this summer's project may use something else, tbd: http://www.mediawiki.org/wiki/User:Svick/Incremental_dumps
[12:48:38] apergos: how big is the uncompressed dump?
[12:48:53] that's a FAQ too (for en wp history)
[12:49:17] lol
[12:49:22] * AzaToth hides in shame
[12:49:24] :-D
[12:49:37] I honestly thought you might be trolling me on a Sunday afternoon
[12:49:42] and a nice troll it was too :-D
[13:13:47] apergos: you expect people to read the FAQs?
[13:14:42] no, I expect to be able to snark at them and point them to the FAQs
[13:14:50] saves me typing and it's also free entertainment :-D
[13:15:19] (well, people in the community might read them by accident, you never know...)
[13:23:22] apergos: I didn't even know there was a FAQ
[13:24:07] it's not like there was a big "HERE IS THE FAQ" sign ツ
[13:25:59] apergos: there is one question I've not found an answer for though
[13:26:16] apergos: has anyone actually downloaded and used the whole history dump?
[13:31:46] also I wonder about the yahoo files
[13:44:11] yes, people fo regularly
[13:44:13] *do
[13:44:32] I get questions not infrequently, or I see them on the list, from people who need to create a full mirror or folks doing various sorts of analysis
[13:44:41] the yahoo files, I really have no idea
[13:45:19] I can remember once getting asked by someone about how to get an abstract and pointing them to those files, and I don't know if they in fact used them
[13:45:50] my first instinct was to assume the yahoo files were made to be used by yahoo
[13:46:19] or is "yahoo" a codename for the NSA dumps? ツ
[13:47:12] 10TiB is a lot to unpack though
[13:47:58] * AzaToth wonders how big the actual master db is
[13:48:54] once upon a time yahoo did use them
[13:49:01] but that was years ago, I have no idea if they still do
[13:49:44] ok
[13:50:54] the db claims to be 732gb but this is *only* the metadata
[13:51:25] in order to get you the sizes of all the revisions I'd have to poll all the external text clusters and I don't know if they are all file-per-table yet
[13:51:35] heh
[13:51:46] in technical terms: ginormous
[13:51:56] I can figure ツ
[13:52:20] too big to fail!
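
A minimal sketch of how the list=allpages query from the top of this log (the API equivalent of Special:PrefixIndex) could be scripted. It assumes the third-party Python requests library; the prefix "Ba" and the limit of 100 are simply the values from the example URL above, and the continuation handling assumes the API's current "continue" response format.

import requests

API_URL = "https://www.mediawiki.org/w/api.php"

def pages_with_prefix(prefix, limit=100):
    """Yield page titles starting with prefix, following API continuation."""
    params = {
        "action": "query",
        "list": "allpages",
        "apprefix": prefix,
        "aplimit": limit,
        "format": "json",
    }
    while True:
        data = requests.get(API_URL, params=params).json()
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        # Carry the continuation parameters into the next request.
        params.update(data["continue"])

# Prints the same list that Special:PrefixIndex/Ba shows on mediawiki.org.
for title in pages_with_prefix("Ba"):
    print(title)
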
[13:52:40] who will bail us out when it does, that's what I want to know
[13:52:48] me too
[13:53:20] wonder which would affect people most, shutting down fb or shutting down wp
[13:54:00] losing all your knowledge or losing all your imagined friends
[13:54:28] well, people could go to google plus or fall back to myspace
[13:54:37] not the same thing, but they could sort of bridge the gap
[13:54:41] if wp went... eh
[13:54:45] for fb yes
[13:54:54] a lot of us would lose a bunch of IQ points overnight :-D
[13:55:04] (oblig xkcd ref)
[13:55:10] well, to be fair, google pretty much caches all articles
[13:55:18] hehe
[13:55:32] yes, but that just means we would still be using wp content
[13:55:37] http://xkcd.com/903/
[13:55:46] well also, the dumps are around so someone could put up a copy
[13:55:58] yep, that's the one
[13:56:17] yea
[13:56:38] today's comic is pretty great
[13:59:48] apergos: afaik you mean Friday's comic
[14:30:03] odder: categories added to the previous issues, and now included in the init template. Also, I've automated the links in the init template, so we don't have to change them manually.
[14:33:06] oh yeah, Friday's
[14:33:17] today's for me, I am a bit behind
[14:34:28] guillom: this makes me happy!
[14:34:37] good :)
[14:34:39] how about those: https://meta.wikimedia.org/wiki/Tech/News ?
[14:35:16] odder: I actually just replied: https://meta.wikimedia.org/wiki/Talk:Tech/News#weekly_changing_numbers
[14:36:13] Summary: I don't understand why translations are affected.
[14:36:49] hmm
[14:36:58] odder: on the French version, it was updated automatically by Fuzzybot: https://meta.wikimedia.org/w/index.php?title=Tech/News/fr&diff=5593641&oldid=5557257
[14:38:51] as expected
[14:39:06] perhaps I shouldn't have invalidated existing translations
[14:42:04] Ah, perhaps. I didn't realize it was possible to invalidate existing translations even if the text hadn't changed
[15:04:14] odder: you're going to be happy again :) https://meta.wikimedia.org/wiki/User:Guillom/sandbox2
[15:04:30] I've finished the Lua module that prepares the newsletter automagically.
[15:04:53] well, almost; you still need to list the translations, but I still consider this progress
[15:05:11] I've tested the generated code on frwp & jawp and it seems to work
[15:06:02] And now, to see if I can get Lua to subst: templates.
[15:14:21] hmm, the en text doesn't show
[15:23:19] works now; I'm an idiot.
[17:09:34] * Elsie hugs guillom.
[17:10:10] How sweet...
[17:34:04] * guillom tickles Elsie.
[17:34:24] odder: any objection to launching the global delivery?
[17:34:33] I'm going to prepare the text
[17:35:48] guillom: none
[17:36:57] ok :)
[18:07:08] How long does it take before the direct link to a deleted commons image goes away?
[18:08:29] lfaraone: commons delinker is broken right now
[18:08:39] so about ∞
[18:08:44] Base-w: I mean https://upload.wikimedia.org
[18:09:11] lfaraone: what do you mean by direct link?
[18:09:54] Base-w: let's imagine I delete [[File:Auditorio de Tenerife, Santa Cruz de Tenerife, España, 2012-12-15, DD 17.jpg]]. How long before https://upload.wikimedia.org/wikipedia/commons/e/ee/Auditorio_de_Tenerife%2C_Santa_Cruz_de_Tenerife%2C_España%2C_2012-12-15%2C_DD_17.jpg 404s?
[18:10:33] ah
[18:10:47] do you have sysop rights somewhere to just test?
[18:12:09] Base-w: No. In this specific case, the work was deleted 13 days ago from Commons but is still accessible by URI.
[18:12:46] what work?
[18:12:59] Base-w: an image whose subject requested deletion per OTRS
[18:14:06] hm, let's ping someone.
odder, here?
[18:17:40] yes
[18:31:34] odder: look at lfaraone's question + you have otrs access
[18:32:12] a request from me too: can the next tech news use the header from the very first translation block?
[18:33:07] ah, it's not the same
[18:33:18] then put it in for translation too
[18:33:28] what do you mean?
[18:33:42] * Base-w wants to see not Tech news: 2013-26 but Тех. новини: 2013-26
[18:33:47] in the header
[18:33:51] http://uk.wikipedia.org/wiki/Вікіпедія:Кнайпа_%28технічні_питання%29#Tech_news:_2013-26
[18:34:02] oh, when it's sent out.
[18:34:06] guillom:
[18:34:23] lfaraone: I have no idea, it might take some time. It'll get deleted eventually.
[18:35:02] Base-w: yes, I've wanted to do that for a while, but it's not trivial because I don't know if the bot will support it. I'll try with a switch and see if that works.
[18:35:34] guillom: ok, would be cool
[18:35:39] I agree :)
[18:36:09] :)
[18:49:50] guillom: No.
[18:50:15] Elsie: the switch won't work for the section header?
[18:50:27] A multi-line subject line will break the bot.
[18:50:41] Elsie: what about a single-line switch?
[18:50:59] The bot has a check for subject line length.
[18:51:02] I guess that could be disabled.
[18:51:06] hmm, ok
[18:51:16] It tries to prevent truncated edit summaries.
[18:51:37] You just want to do {{#switch:{{CONTENTLANG}}}} or something?
[18:51:45] With a subst.
[18:51:49] yeah, like for the body
[18:53:29] Okay, I bumped the length limit for now.
[18:53:35] if len(subject_line) > 245000000: # Bump temporarily...
[18:53:47] So as long as it's a single line, it should be fine.
[18:54:06] Thanks; tonight I'm busy with VisualEditor stuff, but I'll test later this week.
[18:54:50] I'm not sure #switch works in a section header generally.
[18:55:03] Might make for ugly edit summaries.
[18:55:55] I'll try on my userpages before using it for wider distribution
[18:57:19] https://test.wikipedia.org/w/index.php?title=User_talk:MZMcBride&diff=174790&oldid=136739
[18:58:34] hmmm
[18:58:56] Ain't life a bitch.
[18:59:17] Ain't it indeed.
[18:59:48] There's likely an open bug about the edit summary wonkiness.
[19:00:19] https://bugzilla.wikimedia.org # guillom: once you begin thinking of the search icon as sad, it's difficult to stop.
[19:00:25] Also, you just lost The Game.
[19:00:44] ... heh
[19:01:04] :-)
[19:03:19] https://bugzilla.wikimedia.org/show_bug.cgi?id=43914
[21:05:38] NzbI'll ajn
[21:11:46] Wlbloyal
[21:11:51] M
[21:12:31] * YuviPanda takes cat off T13|isAFK's keyboard
[21:15:32] Opps
[21:16:02] Pocket typing..
[22:05:42] MatmaRex: huh.
[22:06:11] https://pl.wikipedia.org/w/index.php?title=Wikipedia:Kawiarenka/Kwestie_techniczne&diff=36897166&oldid=36893537 ?
[22:51:54] gn8 folks
[22:53:47] [[Tech]]; MF-Warburg; /* Indefinitely blocked IPs */ new section; https://meta.wikimedia.org/w/index.php?diff=5596641&oldid=5594864&rcid=4308594
[22:56:15] [[Tech]]; Legoktm; /* Indefinitely blocked IPs */ link to query; https://meta.wikimedia.org/w/index.php?diff=5596647&oldid=5596641&rcid=4308600
[23:21:26] [[Tech]]; Xiong Chiamiov; /* Remove sidebar (similar to Nostalgia) */; https://meta.wikimedia.org/w/index.php?diff=5596723&oldid=5596647&rcid=4308676
[23:42:56] > An error has occurred while searching: The search backend returned an error: Error opening index.
[23:42:58] on wikidata
[23:43:11] oh wait
[23:43:13] I'm on test.wd
[23:43:16] nvm
[23:43:51] IIRC the search index isn't set up on some wikis... e.g. testwikidata?
[23:45:37] Definitely for elwikivoyage and vecwiktionary: https://bugzilla.wikimedia.org/show_bug.cgi?id=48715
[23:45:48] yeah, supposedly there's an RT ticket for it
[23:46:00] I meant to be searching on normal wikidata anyway >.>
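
To close out the subject-line thread from around 18:50: a rough, hypothetical sketch (not the actual delivery bot's code) of the checks described there. The function name and the MAX_SUBJECT_LEN value are invented; only the ideas come from the log: the header must stay on one line so it does not break the bot, a length cap prevents truncated edit summaries, and a single-line subst'd #switch on the content language yields a localized header such as the Ukrainian one Base-w asked for.

MAX_SUBJECT_LEN = 200  # invented value; the bot's real limit isn't given in the log

def validate_subject(subject_line):
    """Reject subject lines that would break delivery or truncate edit summaries."""
    if "\n" in subject_line:
        raise ValueError("multi-line subject lines break the delivery bot")
    if len(subject_line) > MAX_SUBJECT_LEN:
        raise ValueError("subject line too long; the edit summary would be truncated")
    return subject_line

# A localized header stays on one line by letting each target wiki substitute a
# #switch on its content language; the 'uk' value is the header from the log.
subject = ("{{subst:#switch:{{subst:CONTENTLANG}}"
           "|uk=Тех. новини: 2013-26"
           "|#default=Tech news: 2013-26}}")
print(validate_subject(subject))
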