[00:01:03] gn8 folks
[01:29:24] Hmmmmmm.
[01:29:32] I wonder if that indicates out-of-sync Apaches. :-/
[03:24:54] Elsie: "that"?
[03:28:53] jeremyb: __DISAMBIG__ showing up in pages.
[03:29:03] uhuh
[03:29:08] that sounds plausible
[08:03:55] Why, on this query (http://goo.gl/OKHRJ), doesn't "Blestium" have any outgoing links, although as you can see it has 79? http://goo.gl/nY99R
[08:04:38] Why does it show the data for the 'links' and 'langlinks' attributes on some entries but not on others?
[08:09:20] also it doesn't show the langlinks for Blestium
[08:09:21] :((
[08:13:55] this is madness!
[08:14:01] WHY ON EARTH DOESN'T IT WORK
[08:22:19] cff: langlink handling recently changed, lemme see where it is now
[08:22:48] I don't understand why it works on some articles but not on others...
[08:23:08] it's because some langlinks are stored in page text and some are stored in wikidata
[08:23:30] on some articles I get 'langlinks' as well as 'links' and on others I don't (but I should, because those articles have langlinks and links too)
[08:24:26] right, because they're stored in wikidata. https://www.wikidata.org/wiki/Q572453#sitelinks-wikipedia
[08:25:27] legoktm: so I should switch to the WikiData API?
[08:25:38] https://www.wikidata.org/w/api.php ?
[08:25:45] well what are you looking for? just langlinks?
[08:26:06] I want coordinates, langlinks, and links only from namespace=0
[08:26:38] ok, you're probably going to want to make 2 queries then
[08:26:51] one on the local API for coordinates+links
[08:26:55] one on the wikidata api for langlinks
[08:27:19] the wikidata api is different though, https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q572453&format=jsonfm
[08:27:36] https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q572453&format=jsonfm&props=sitelinks even
[08:31:44] using the WikiData API I should be able to get the langlinks for all the articles, not just for a few, right?
[08:31:57] unlike what's happening with prop=langlinks
[08:32:14] "for all the articles"?
[08:32:29] for all the articles that have langlinks, yes
[08:32:42] yes, i'd say 99+% are in wikidata
[08:33:31] the query at http://goo.gl/OKHRJ worked just fine yesterday, now it's borked
[08:33:49] or maybe I had the impression it worked...
[08:34:56] I only want to retrieve the links and langlinks for articles that have coordinates
[08:37:15] legoktm: it doesn't seem to work for coordinates&links either: http://goo.gl/Zz7Zy. Blestium still has only coordinates but no 'links' attribute...
[08:38:26] ah
[08:38:40] so the issue is that you can only fetch 500 links at once
[08:38:56] however, across all those pages the 500 links get used up very quickly
[08:39:04] but it sends you more pages
[08:39:18] so you need to use the continue value at the top
[08:39:19] "plcontinue": "8022|0|Pandorea"
[08:39:27] and continue it
[08:39:29] I need to use both plcontinue and gapcontinue?
[08:39:40] when I use prop=links|coordinates?
[08:40:27] cff: see https://www.mediawiki.org/wiki/API:Query#Continuing_queries
[08:40:59] I've run the query with plcontinue... no results for Blestium...
[08:41:19] yes, I know, I use a variation of the code from there
[08:46:07] so if I run the query with the same gapcontinue but use plcontinue until the param is empty, that means I'll get all the links for the current 500 articles, right?
[08:46:32] i think so
[08:46:51] but that doesn't happen... Blestium has no links attribute...
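The root of the confusion above: pllimit caps links per request, not per page, so one page's links can arrive split across several continuation steps, and a page like Blestium simply has no 'links' key in a batch where its links didn't fit. Below is a minimal sketch of the merging pattern legoktm points to, in Python; the endpoint, gaplimit, request cap, and use of the modern simplified `continue` format are assumptions for illustration, not cff's actual script:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"  # assumption: English Wikipedia, which has GeoData's prop=coordinates

def links_and_coordinates(gapfrom="Blestium", max_requests=5):
    params = {
        "action": "query", "format": "json", "formatversion": "2",
        "generator": "allpages", "gapfrom": gapfrom, "gaplimit": "50",
        "prop": "links|coordinates",
        "plnamespace": "0",   # outgoing wikilinks to articles only
        "pllimit": "max",
        "continue": "",       # opt in to the simplified continuation format
    }
    pages = {}  # pageid -> merged data; complete only once continuation is exhausted
    for _ in range(max_requests):  # cap the number of requests for this sketch
        data = requests.get(API, params=params, timeout=30).json()
        for page in data.get("query", {}).get("pages", []):
            merged = pages.setdefault(page["pageid"], {"title": page["title"], "links": []})
            merged["links"].extend(link["title"] for link in page.get("links", []))
            if "coordinates" in page:
                merged["coordinates"] = page["coordinates"]
        if "continue" not in data:
            break  # no more batches: every fetched page's link list is now complete
        params.update(data["continue"])  # carries plcontinue and/or gapcontinue for us
    return pages
```

The server decides at each step whether to hand back plcontinue, gapcontinue, or both, which is why the sketch merges by page id rather than assuming each page appears only once.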
[08:50:01] I think I'm going to make a separate HTTP call for each thing I need: coordinates, langlinks, links
[08:50:53] I'll do it first for coordinates, and then, using the titles of the articles that have coordinates, I'll obtain the other info I need
[08:51:25] i think you could combine langlinks and links
[08:52:04] let me try that
[08:52:41] yes, but then I need to search for the langlinks&links for the articles that have coordinates
[08:53:17] ah, or not
[08:53:40] I can query whether an article has coordinates using its title from the previous langlinks&links response
[08:54:42] but this means a lot more data has to be fetched by the langlinks&links HTTP request, doesn't it?
[08:55:16] i don't know what you are doing exactly
[08:55:26] because there are a lot more articles with links or langlinks than with coordinates
[08:58:58] giftpflanze: I want to retrieve incoming links, outgoing links and language links for all the articles that have coordinates
[09:00:30] giftpflanze: btw the query for langlinks & links doesn't work for Blestium either: http://goo.gl/ZfCLR
[09:01:19] Blestium has no 'links' or 'langlinks' attribute...
[09:01:38] * cff sighs
[09:02:26] legoktm: it's because I can get max 500 links for the ENTIRE request, not for each article's links?
[09:02:38] cff: yes.
[09:04:35] legoktm: damn, you are right, I've counted the links from the first articles and they sum to 500
[09:04:44] legoktm: and the rest of the articles don't have links
[09:05:01] yeah, that's how it works
[09:06:49] yeah
[09:07:12] now when I run plcontinue I get all the previous links, plus I think the new 500 links
[09:07:50] err, no
[09:09:17] or that's it
[09:09:32] I was counting it wrong
[09:09:54] for the articles from the previous query, if they have no more links the links attribute will be missing
[09:10:07] makes sense
[09:19:14] Using prop=links with plnamespace=0 will get me only the internal links, right?
[09:19:37] i.e. no links to external sites will be included?
[09:22:01] (prop=links) == data from the pagelinks table?
[12:41:49] legoktm: I think I'm better off doing separate HTTP requests, because using the continue parameters would require a lot more requests
[12:42:46] I would have to continue through plcontinue and llcontinue even though none of the 500 articles have coordinates
[12:44:31] or only 1 of the 500 has coordinates, and I'd have to plcontinue and llcontinue through the other 499 articles to get the langlinks and links, even though I'm only interested in the langlinks and links for the 1 article with coordinates
[13:28:48] indeed, too many useless requests if I do prop=coordinates|langlinks|links
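A hedged sketch of the two-pass plan cff settles on: first filter a batch of titles down to those that actually have coordinates, then fetch the remaining data only for those titles. The function names are invented for illustration, and since legoktm noted that langlinks now mostly live on Wikidata, the second pass asks wbgetentities for sitelinks instead of prop=langlinks:

```python
import requests

WP_API = "https://en.wikipedia.org/w/api.php"   # assumption: English Wikipedia
WD_API = "https://www.wikidata.org/w/api.php"

def titles_with_coordinates(titles):
    """Pass 1: keep only the titles that have coordinates (max 50 titles per call)."""
    r = requests.get(WP_API, params={
        "action": "query", "format": "json", "formatversion": "2",
        "titles": "|".join(titles), "prop": "coordinates",
    }, timeout=30).json()
    return [p["title"] for p in r["query"]["pages"] if "coordinates" in p]

def langlinks_from_wikidata(titles):
    """Pass 2: language links, read as sitelinks on the Wikidata item."""
    r = requests.get(WD_API, params={
        "action": "wbgetentities", "format": "json",
        "sites": "enwiki", "titles": "|".join(titles),
        "props": "sitelinks",
    }, timeout=30).json()
    out = {}
    for entity in r.get("entities", {}).values():
        sitelinks = entity.get("sitelinks", {})
        local = sitelinks.get("enwiki", {}).get("title")
        if local:  # skip titles that have no Wikidata item / no enwiki sitelink
            out[local] = {site: sl["title"] for site, sl in sitelinks.items() if site != "enwiki"}
    return out
```

This sidesteps the problem described at 12:44:31: no paging through plcontinue/llcontinue for 499 articles whose data nobody wants.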
[16:18:30] apergos, parent5446: hello
[16:18:35] hello
[16:18:47] looks like we have parent5547 today actually
[16:18:56] grr
[16:18:58] parent5446:
[16:19:32] or not :-D
[16:19:47] you scared him away :-)
[16:20:02] ohh looky, a parent5448
[16:20:11] today, i'm working on the build file; it seems i managed to create the makefile using cmake, now i'm working on fixing compile errors from gcc
[16:20:24] ah, I thought there would be some build errors, yep
[16:20:33] as in, I ran into some when I tried it
[16:20:45] Yeah, Android is giving me trouble.
[16:20:58] good you are doing this early in the process, it will save you headaches later
[16:21:08] are you here on a smartphone, parent5448?
[16:21:26] yeah
[16:21:28] Yes. Laptop is out of commission at the moment.
[16:22:07] ok
[16:22:24] how did the comments stuff work out?
[16:24:17] i tried invalid UTF-8 in the middle of a comment, and it got converted to the same U+FFFD character
[16:24:27] blahh
[16:24:33] ok, well that's sucky
[16:24:57] and this is in the raw xml out? before your code processes it?
[16:25:18] yeah
[16:26:21] grr
[16:26:38] ok, well just tossing it is no good; replacing it with some placeholder that's not used anywhere else would be best
[16:26:46] i also looked at the wikis you suggested (elwiki, nlwiki, frwiki), and from those, the only comments over 255 bytes that have FFFD in the middle are two revisions on frwiki
[16:26:51] ok
[16:26:55] so there are a couple
[16:26:56] http://fr.wikipedia.org/w/index.php?diff=prev&oldid=79249886 and http://fr.wikipedia.org/w/index.php?diff=prev&oldid=93079986
[16:27:27] and both have the U+FFFD almost at the end, just before the ellipsis
[16:27:35] yep
[16:29:14] which i think means that there won't be any fixing like you suggested yesterday
[16:29:18] no
[16:29:30] but just tossing the character is no good either (unless it's literally at the end)
[16:30:14] if i tossed it also just before the ellipsis, i think that would be okay too
[16:30:24] which would take care of all problematic cases
[16:30:37] (that i encountered on those 3 wikis)
[16:32:31] if it's the last character then yes, it's ok
[16:32:38] anyways, you can put it off for now for sure
[16:33:17] okay
[16:34:02] anything else going on that's an issue?
[16:34:46] nothing i can think of
[16:35:13] hey parent5448, got anything you wanna bring up? I don't know if you got to see yesterday's logs
[16:35:41] Yeah, I took a quick look. Everything seems OK to me.
[16:35:55] well I got nothin
[16:36:12] just waiting on being able to run on linux :-)
[16:36:41] Mhm
[16:37:10] ok
[16:37:55] well...
[16:38:01] if no one's got anything...
[16:38:08] see folks tomorrow?
[16:38:15] Yep. See you tomorrow.
[16:38:28] have a good one!
[16:38:29] Hopefully I'll be parent5446 by then
[16:38:33] heh, good luck
[16:39:05] thanks, bye
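Picking up the truncated-comments thread from the meeting above: the rule the two agreed on (tossing U+FFFD is acceptable only when it is literally the last character, or sits immediately before the truncation ellipsis) could look like the sketch below. The exact ellipsis string is an assumption, and the rule matches only the cases actually seen on elwiki, nlwiki, and frwiki:

```python
REPLACEMENT = "\ufffd"  # U+FFFD, what a comment cut mid-multibyte-sequence decodes to
ELLIPSIS = "..."        # assumption: truncated comments end with a plain ASCII ellipsis

def strip_truncation_fffd(comment: str) -> str:
    """Drop a U+FFFD only where it is a truncation artifact: the very last
    character, or the character right before the trailing ellipsis. A U+FFFD
    anywhere else may be real data, so it is kept rather than silently tossed."""
    if comment.endswith(REPLACEMENT):
        return comment[:-1]
    if comment.endswith(REPLACEMENT + ELLIPSIS):
        return comment[:-(1 + len(ELLIPSIS))] + ELLIPSIS
    return comment
```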
[16:49:53] Uhhh
[16:50:03] Krinkle|detached: What have you done to mediawiki-config on tin?
[16:50:10] Author
[16:50:10] Timo Tijhof, Jul 18, 2013 5:49 PM
[16:50:10] Committer
[16:50:10] Timo Tijhof, Jul 18, 2013 5:49 PM
[16:50:21] Reedy: busy?
[16:50:44] Not so much
[16:50:49] Could do with fixing that though, apparently
[16:50:56] Reedy: good
[16:50:56] https://www.mediawiki.org/w/index.php?title=Help:VisualEditor/User_guide&diff=736734&oldid=723797
[16:51:01] i need this marked for translation :D
[16:51:31] How?
[16:51:34] I just marked it as accurate..
[16:51:45] how would i know? you're a translationadmin
[16:51:46] https://www.mediawiki.org/w/index.php?title=Special:ListUsers&group=translationadmin
[16:52:10] * Reedy clicks more links
[16:52:17] Done.. I think
[16:52:20] Nemo_bis: you there? :D ^
[16:52:30] ah
[16:52:31] lessie
[16:54:31] okay Reedy, works, thanks
[16:54:33] ilu
[18:50:10] hi. Is it normal for the HTML output of wikipedia pages to contain more than one occurrence of id="toc"?
[18:54:03] twkozlowski, changing the nick like that is abusive :D hi
[18:54:42] Why abusive, Base? :-)
[18:54:43] Twkozlowski, here?
[18:55:05] abartov: no, it's invalid HTML, in fact
[18:55:07] because you're not the usual "odder" :P
[18:55:26] abartov: you can use class=toccolours to make something look like the ToC when it's not
[18:55:28] Not everyone on the planet knows your real name
[18:56:24] Base: not everyone on the planet knows my nickname either :-)
[18:56:56] :D perhaps
[18:58:20] Base: I did a @notify on Sunday because the translation of Tech News #29 into Ukrainian wasn't finished.
[18:58:26] I ended up sending an English version, in the end.
[18:58:36] ended up in the end, yay me.
[18:59:21] but uk was almost done
[19:00:19] Actually, afaik it was done by the time the English letter went out
[19:00:21] Yes. We figured out since then that it might be better to send out unfinished translations instead of English text.
[19:00:57] Base: Yes, I sent the e-mail on Friday, but we froze the issue on Saturday evening (CEST)
[19:00:59] definitely. Not everybody speaks english
[19:01:22] Wee! You're the second person to say so.
[19:01:31] I'll go ahead and add this to the publication manual then
[19:02:26] Now ata_zh mostly translates it
[19:02:41] she's usually away from the PC on Saturdays
[19:02:48] twkozlowski, maybe 50% of translatedness could be a threshold?..
[19:03:01] i can't because i'm on my phone
[19:03:30] ata_zh, i think even 1% translated is better than 0
[19:03:49] :)
[19:04:17] MatmaRex_, thanks. I'm not trying to get the style, I'm actually scraping a page, and was surprised to find a second id=toc table. :(
[19:05:18] Ata_Zh: yes, I thought that 50% might be a good number.
[19:05:40] I personally would hate to get translations not even half-finished.
[19:05:49] But on the other hand, it might encourage further translations.
[19:05:55] then 49% will be too low?
[19:06:34] i don't think such a number is necessary
[19:56:40] best comment about VE ever: https://en.wikipedia.org/wiki/Wikipedia_talk:VisualEditor#With_this_you_will_conquer_the_world.
[20:00:24] I can't wait till we replace Google with VE
[20:01:44] > you will conquer the world♙ --108.38.191.162 (talk) 19:17, 18 July 2013 (UTC)
[20:01:46] I smiled.
[20:02:49] "Don't you mean == With this|} you will [[conquer the|conquer the]] world♙"
[20:03:14] looks about right
[20:03:20] Reedy: help. why is https://www.mediawiki.org/wiki/Special:ExtensionDistributor?extdist_extension=RelatedArticles not showing the version for 1.20 when the REL1_20 branch exists?
[20:03:23] > Wikipedia has become of, for, and by Aspies.
[20:03:30] That talk page has some pretty great quotes.
[20:03:36] MatmaRex: Is the branch new?
[20:03:44] no idea
[20:03:44] become of aspies?
[20:03:47] doesn't look like it
[20:03:54] points to a commit from 2012-11-05 22:13:00
[20:04:00] become by aspies?
[20:04:00] wat
[20:04:12] i was going to merge some 1.20-incompatible changes and wanted to be sure things won't asplode
[20:04:15] MatmaRex: https://github.com/wikimedia/mediawiki-extensions-RelatedArticles/branches
[20:04:20] It doesn't have a REL1_20 branch
[20:04:21] That's why
[20:04:28] <^demon> MatmaRex: Maybe ext-dist couldn't reach github when it tried last? If it 404'd, it caches it as not existing.
[20:04:38] <^demon> Or what reedy said.
[20:04:38] Reedy: it does locally for me at least
[20:04:40] 1.21 too
[20:04:54] Hmm
[20:05:00] github says they were merged into master..
[20:05:24] wat
[20:05:32] <^demon> https://git.wikimedia.org/summary/mediawiki%2Fextensions%2FRelatedArticles.git has a 1.20 and 1.21 as well
[20:05:41] "Showing 2 branches merged into master."
[20:05:50] <^demon> As does https://gerrit.wikimedia.org/r/#/admin/projects/mediawiki/extensions/RelatedArticles,branches, the canonical one
[20:06:24] man, why do *i* always run into this crap.
[20:06:51] You're too active.
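The check MatmaRex, Reedy, and ^demon just did by hand (does a REL branch really exist on the GitHub mirror that ext-dist reads from?) can be scripted against GitHub's branches endpoint. A hypothetical sketch, not how ExtensionDistributor itself works; a stale cached 404, as ^demon suggests, would explain this answer disagreeing with what Special:ExtensionDistributor shows:

```python
import requests

def has_rel_branch(extension, rel="REL1_20"):
    """True if the wikimedia GitHub mirror of `extension` has the branch `rel`."""
    url = (f"https://api.github.com/repos/wikimedia/"
           f"mediawiki-extensions-{extension}/branches/{rel}")
    return requests.get(url, timeout=30).status_code == 200  # 200 = exists, 404 = not

print(has_rel_branch("RelatedArticles"))  # the extension from the discussion above
```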
[20:07:56] > Deployment windows are 'pinned' to the time in San Francisco and thus the UTC time will change due to the as appropriate.
[20:08:08] We've lost a noun, I think.
[20:09:42] is global watchlist ever going to appear?
[20:09:51] One day.
[20:10:17] twkozlowski: you there? when is the next Tech News being sent out?
[20:10:25] actually, Elsie, you might know this, too. do you?
[20:10:45] I think Elsie might have /Status on their watchlist
[20:11:12] Heh, not on Meta-Wiki.
[20:11:23] MatmaRex: Nothing has changed, it'll be sent out on Sunday.
[20:11:26] Well, on my watchlist, but not on my watch list. ;-)
[20:11:34] What twkozlowski said.
[20:11:39] And frozen for publication on Saturday.
[20:11:43] hm. okay.
[20:11:48] Why?
[20:11:59] https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#Duplicated_text_on_tags
[20:11:59] MatmaRex: Is highlighting by #hash in the URL done with :target or :focus?
[20:12:02] I always forget...
[20:12:10] info for this is included in the next one
[20:12:22] :target, right?
[20:12:24] i was under the impression it would be sent out before today's deployment
[20:12:27] guess i was wrong
[20:12:28] Elsie: right
[20:12:51] !technews | MatmaRex
[20:13:02] Hi wm-bot.
[20:13:12] hey, wm-bot's fucked up here too
[20:13:14] petan: ^
[20:13:16] !ping
[20:13:20] !help
[20:13:20] !(stalk|ignore|unstalk|unignore|list|join|part|quit)
[20:13:21] !ping is whatevs
[20:13:21] Unable to modify db, access denied, link to database isn't valid
[20:13:25] kapow.
[20:13:31] Hi snitch.
[20:13:40] Oh. https://meta.wikimedia.org/wiki/Tech/News/Latest and fifth item from the bottom, MatmaRex
[20:14:42] yeah. hm.
[20:14:45] > Future software changes
[20:14:48] > July 17
[20:15:01] let's apply some last-minute fixes. :D
[20:15:47] noooo
[20:16:19] "The Wikipedia database is temporarily in read-only mode for the following reason:
[20:16:20] The database has been automatically locked while the slave database servers catch up to the master"
[20:16:30] Last-minute fixes? That issue's been sent out already, MatmaRex...
[20:16:50] uh
[20:16:55] was it?
[20:17:07] wait
[20:17:11] https://meta.wikimedia.org/wiki/Tech/News/Next is the one that will be sent out on Sunday
[20:17:11] i'm all confused now
[20:17:12] brb
[20:17:44] https://pl.wikipedia.org/wiki/Wikipedia:Kawiarenka/Kwestie_techniczne#Tech_News:_2013-29 MatmaRex
[20:18:03] MatmaRex: Perhaps you confused Tech News with the English Wikipedia Signpost, which is published on Thursday
[20:18:14] * MatmaRex slaps self around a bit with a large trout
[20:18:16] no
[20:18:31] i assumed people actually read the news and would have acted accordingly if it had been sent out
[20:18:33] oh well
[20:18:37] sorry about the trouble
[20:18:44] No worries.
[20:19:06] Makes me think it's time to update this week's issue so we can prepare it for early translation tomorrow.
[20:20:14] MatmaRex: Got a second?
[20:20:31] sorta, yes.
[20:20:46] https://test.wikipedia.org/wiki/MediaWiki:Common.css
[20:20:52] body.action-info #mw-pageinfo-watchers:target { background: #DEF; }
[20:21:10] Is there any way to genericize that to apply to any ID?
[20:21:28] *:target <-- ???
[20:22:26] just ':target'
[20:22:35] Hah!
[20:23:39] Sweeeeeeeeet.
[20:23:58] https://test.wikipedia.org/wiki/Main_Page?action=info#mw-pageinfo-robot-policy
[20:24:05] https://test.wikipedia.org/wiki/Main_Page?action=info#mw-pageinfo-watchers
[20:24:51] That's sick. Awesome.
[20:27:19] What's with all the s5 replag at the moment? (https://noc.wikimedia.org/dbtree/ )
[20:27:28] Someone complained about enwiki as well, but that seems okay?
[20:27:35] springle may know?
[20:27:41] I saw something about Wikidata schema something above.
[20:27:47] Is Wikidata s5?
[20:28:01] Yes, s5 = wikidatawiki & dewiki
[20:28:08] Krenair, Elsie, yes, on it
[20:28:12] :-)
[20:28:53] springle: It's nice to have you around. :-)
[20:48:05] Reedy, can you test some code in eval.php in production for me please?
[20:48:26] hm, I keep on getting "User talk page modification: Failed to save edit: The wiki is currently in read-only mode" but saving my edits works.
[20:49:23] I can't edit
[20:51:07] What wiki, Doug_Weller?
[20:51:35] Hm, en.wiki, but I just managed to restore an old version of an article - it said I'd failed, but it worked
[20:51:49] Krenair: ?
[20:52:48] back to normal now
[20:52:54] Reedy, http://pastebin.com/DADpF7Bh (from a non-de wiki)
[20:54:45] Krenair: http://pastebin.com/WwYkvCaS
[20:54:53] I'm just hoping that won't trip up on some weird networking/security on the production cluster :/
[20:55:27] Oh great, it's fine then. Thanks Reedy
[20:59:36] Doug_Weller, a couple of other people reported that... every time I checked dbtree, enwiki (s1) was fine, it was dewiki & wikidatawiki (s5) which had issues
[21:07:53] kaldari: ping
[21:08:12] kaldari: is it just me, or is Echo's "Show talk page message indicator in my toolbar" on test.wp broken?
[21:08:29] hmm
[21:10:54] MatmaRex: working for me
[21:11:15] hm. wgUserNewMsgRevisionId is null for me, so showing the indicator fails
[21:11:21] and i can't get it not to be null
[21:11:55] hm
[21:12:03] could it be because https://test.wikipedia.org/wiki/User_talk:Matma_Rex has only one revision?
[21:12:25] kaldari: can you add something there for me? :) ^
[21:20:28] kaldari: yeah, i'm pretty sure it's caused by that (although this is a core bug)
[21:23:44] MatmaRex: Sorry, got pulled into a RL discussion
[21:24:35] added another post to your talk page
[21:25:36] MatmaRex: it's entirely possible there's a core bug. All the wgUserNewMsgRevisionId code is relatively new.
[21:25:51] i added one as anonymous earlier and it appeared
[21:26:04] filed as https://bugzilla.wikimedia.org/51640
[21:26:17] i guess this could be caused by something else, but that seems unlikely
[21:26:20] MatmaRex: I'm working on the Mobile web team now, though, so you may want to ping bsitu about it
[21:26:30] bsitu: ping ^
[21:26:31] kaldari: alright :)
[21:27:18] MatmaRex: hi
[21:28:49] bsitu: https://bugzilla.wikimedia.org/51640 seems relevant to Echo
[21:28:50] hey chrismcmahon, how did the QA hangout go?
[21:29:29] bsitu: so if you or someone else who knows how that works could look, it would be cool
[21:29:55] hi ori-l, it went pretty well, but I had more fun doing it in person in SF last month. We've already got a few contributors, looking forward to a few more.
[21:30:32] well, there's space in SF :)
[21:30:47] was anyone new using vagrant, and if so, did it work / not work well for them?
[21:31:29] MatmaRex: The new orange bar depends on wgUserNewMsgRevisionId. Based on the bug description, it sounds like there is a bug with wgUserNewMsgRevisionId
[21:32:29] MatmaRex: i will see how this can be fixed
[21:33:35] bsitu: thanks
[21:39:19] bsitu, MatmaRex: looks like $user->getNewMessageRevisionId() is returning null even after the first post for some reason.
[21:39:28] ^^ chrismcmahon
[21:39:32] see q above
[21:42:58] ori-l: yes, we have at least one pretty sophisticated user on a vagrant VM. she hasn't contributed code yet, but I am enthusiastic; she told me she has a lot of other obligations right now she is clearing away
[21:43:20] bsitu, MatmaRex: I see the problem
[21:43:22] kaldari: thx, this would be a good starting point to investigate
[21:43:33] kaldari: cool, :)
[21:43:49] kaldari: yay
[21:44:09] User::getNewMessageLinks is comparing against the last viewed rev. But in the case of the 1st rev there is no "last viewed rev".
[21:44:20] I'll go ahead and fix it
[21:45:15] chrismcmahon: cool. please do encourage users to file bugs, not just for outright breakage, but also things that seem needlessly confusing or difficult. i really appreciate those.
[21:45:19] kaldari: thx for the quick fix :)
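kaldari's diagnosis boils down to a missing fallback: for a brand-new talk page there is no last-viewed revision to compare against, so the lookup yields null even though the first post is unseen. MediaWiki core is PHP; the Python sketch below uses invented names purely to illustrate the guard, and is not the actual fix:

```python
def new_message_revision_id(last_viewed_rev, latest_rev):
    """Hypothetical model of the guard: which revision triggered the indicator?"""
    if latest_rev is None:
        return None            # no talk page at all
    if last_viewed_rev is None:
        return latest_rev      # first-ever post: there is nothing to compare against
    if latest_rev > last_viewed_rev:
        return latest_rev      # edits have arrived since the last visit
    return None                # nothing new: no orange bar, no indicator
```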
[22:02:18] Reedy: What? I didn't do anything.
[22:02:34] Oh crap, I think I know.
[22:02:40] Krenair: You were set as the name/email of the local config on /a/common on tin
[22:02:48] git config defaults to changing config at repo level, not user level.
[22:02:56] Reedy, eh?
[22:03:05] Oh, you meant Krinkle :)
[22:03:12] I did ;)
[22:03:34] When I committed those presentations from a detached head (didn't want to rsync to local, and not affect local master), I fixed up the committer/author to be me
[22:03:45] should've done git config --global instead
[22:04:03] which is a bit counterintuitive in the context of tin, because I'd say global means for everyone, but in this case it's reversed.
[22:04:26] well, it isn't, I just didn't know what I was thinking. global for all repos, not all users.
[22:04:56] Reedy: Did you restore it to be blank?
[22:05:02] Yes
[22:05:12] (defaulting to global and/or git auto-constructing based on sshuser/host)
[22:05:17] Thanks
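Krinkle's mix-up is easy to reproduce: plain `git config user.name ...` writes to the repository's .git/config, which on a shared checkout such as /a/common on tin silently applies to every deployer, while `git config --global` writes only to the invoking user's ~/.gitconfig. A small sketch for inspecting the two scopes (the helper name is made up):

```python
import subprocess

def git_config(scope, key, repo="."):
    """Read `key` from one scope: 'local' (.git/config) or 'global' (~/.gitconfig)."""
    result = subprocess.run(
        ["git", "-C", repo, "config", f"--{scope}", "--get", key],
        capture_output=True, text=True,  # git exits non-zero when the key is unset
    )
    return result.stdout.strip() or None

# A repo-level identity here is what attributed everyone's commits to one person:
print(git_config("local", "user.name"), git_config("global", "user.name"))
```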
[22:06:27] TimStarling, Reedy: Trijnstel's about to do a bigdelete on enwiki
[22:06:35] ehh, she already did
[22:06:36] Why?
[22:06:39] Awesome
[22:06:47] for a history merge
[22:06:57] * Jasper_Deng always thinks it's good to tell sysadmins first, but oh well
[22:07:01] thanks Jasper_Deng
[22:07:10] in case I cause problems...
[22:07:19] I'm merging a page with 5,000+ revisions
[22:07:34] hmm, right, I did....
[22:07:36] A database error has occurred. Did you forget to run maintenance/update.php after upgrading? See: https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
[22:07:37] Query: SELECT page_id,page_latest FROM `page` WHERE page_namespace = '0' AND page_title = 'Dragon_Ball_Z' LIMIT 1 FOR UPDATE
[22:07:39] Function: PageArchive::undeleteRevisions
[22:07:40] Error: 1205 Lock wait timeout exceeded; try restarting transaction (10.64.32.26)
[22:08:06] I need to restore the edits
[22:08:34] yeah, well done Trijnstel
[22:08:40] sorry....
[22:09:01] and the edits are not restored yet :( -> see https://en.wikipedia.org/w/index.php?title=Dragon_Ball_Z&action=history
[22:09:22] didn't expect this and figured it would be ok (+ forgot that I should inform you)
[22:09:59] Jasper_Deng did inform us, a minute before you joined
[22:10:00] hé, they're restored?
[22:10:08] ah
[22:11:47] yes, I suppose they are restored
[22:12:08] ok, thnx
[22:12:12] and again, sorry
[22:12:24] will inform you next time someone asks me to do this
[22:12:54] just don't do it
[22:14:20] You could remove the 'bigdelete' user right from stewards.
[22:14:32] Rather than giving them a user right and hoping they won't ever use it.
[22:15:03] well, that's not true
[22:15:20] I mean, sometimes it's ok to move as the software doesn't count the number of edits correctly
[22:15:41] Moving pages isn't relevant. :-)
[22:15:45] Moving is cheap. Deleting is the issue.
[22:15:59] and we are - indeed - told to ask you first before doing something
[22:16:15] [00:15] Elsie Moving is cheap. Deleting is the issue. <- yes, and sometimes a page needs to be deleted when it has more than 5000 revisions
[22:16:26] (currently usually on enwiki and ruwiki)
[22:16:35] and I expect it will need to be done more
[22:17:01] as we only get more revisions on pages, rather than fewer
[22:17:59] why would you delete 5000 revisions just to merge in a single revision from a sandbox?
[22:18:01] "Need" is pretty subjective. :-)
[22:18:06] merging page histories certainly doesn't help them stay small
[22:18:09] it's ridiculous
[22:18:25] you're right... didn't think of it
[22:18:41] won't do it anymore
[23:16:00] the other way would be: delete 1 rev, move page, restore rev
[23:16:15] and then move back if needed
[23:32:27] gn8 folks