[14:19:39] Hey flipzagging, you up already?
[14:20:05] yep
[14:20:48] ok so as for the patch
[14:21:00] I can give you one diff you can apply
[14:21:14] I just compiled one myself too, so I'll compare them
[14:21:23] hurr
[14:21:42] okay
[14:21:47] (Compare your diff and mine I mean)
[14:21:59] I'm sure we made different decisions as to what to leave out
[14:22:19] yeah
[14:22:28] all right I'll send you the svn diff then?
[14:22:49] Yeah and I'll send you mine in a second
[14:23:22] List of revs I merged locally: https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Special:Code/MediaWiki/tag/uwdeploy
[14:31:39] flipzagging: Oh it looks like I forgot to merge in the rev that removes the ", i" typo
[14:31:54] that was the last one
[14:32:25] Yeah
[14:32:30] I saw it and tagged it but forgot to merge it
[14:33:08] I also merged in a change that affected a lot of doxygen comments, not by me
[14:33:18] I don't really care either way :)
[14:34:27] Yeah I think that's the only difference between our patches
[14:34:29] + Merged /branches/sqlite/includes/api/ApiQueryStashImageInfo.php:r58211-58321
[14:34:34] *RoanKattouw frowns on SVN
[14:35:15] Well the doxygen changes and the is_a -> instanceof, or -> || rev that I committed just now
[14:35:22] I don't see any significant changes except the first (you missed the 'i' typo) and the last (you remembered the MessagesEn)
[14:35:27] which I forgot
[14:35:31] Oh and you didn't catch the MessagesEn one
[14:35:36] Yeah it's easy to miss
[14:35:46] I happened to remember to tag it when I saw it come by
[14:36:01] I'd like to see how you do this.
[14:36:10] As you probably can tell I merged a lot of individual diffs
[14:36:19] I actually wrote a perl script to do it.
[14:36:34] What I basically did is this
[14:36:55] for rev in list of revs to merge; do svn merge -c $rev ~/mediawiki/trunk/phase3/includes/ includes/; done
[14:37:11] (with pwd = branches/uploadwizard-deployment)
[14:37:30] oh last thing, do you have includes/api/ApiQueryStashImageInfo.php
[14:37:35] In practice, I ran the commands manually one by one because I wasn't sure includes/ would work for all of them (it doesn't for MessagesEn for instance)
[14:37:42] A + includes/api/ApiQueryStashImageInfo.php
[14:37:44] that was an svn copy, I couldn't make it work any other way
[14:37:48] ok
[14:37:51] I just merged the rev
[14:37:59] And it magically appeared
[14:38:01] funny, that broke for me
[14:38:09] ok
[14:38:25] I got a few conflicts in doc comments because I skipped the doxygen revs
[14:38:35] I guess we're good to go -- if you merge into uwd, I can sync down to commons.prototype and we can test it
[14:38:41] Alright
[14:38:44] er, merge into uploadwizard-deployment
[14:38:49] Yeah
[14:38:56] I'll also bump ext/UW although you don't need that right now
[14:39:32] Another nice trick here is to use CR tagging to create a rev list to merge
[14:39:51] yeah, that sounds slick.
[14:39:55] You can then be sure you merge everything in order (ascending rev ID) and you can generate the rev ID list for the commit summary with jQuery
[14:40:00] but AFAIK tagging is copying in SVN.
[14:40:17] or maybe I have this backwards.
[14:40:32] anyway you can work, explain this later.
[14:41:36] No I mean the tag feature in CodeReview
[14:41:41] oh.
[14:42:02] So what I did just now is visit [[Special:Code/MediaWiki/tag/uwdeploy]] and do:
[14:42:04] >>> var a = []; $j('.TablePager_col_cr_id a').each(function() { a.push('r' + $j(this).text()); }); a.join(', ');
[14:42:05] "r76796, r76783, r76782, r76750, r76746, r76740, r76537, r76526, r76503, r76386, r76354, r75995"
[14:42:30] *RoanKattouw is thinking he should note this in his newly-written deploy docs
[14:42:44] Oh oops, I forgot to invert the order
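The merge routine described above, written out as a sketch: re-sort the rev list from the CodeReview tag page into ascending order, then cherry-pick each rev with svn merge. The checkout paths and rev IDs are the ones quoted in the log; everything else is an assumption.

    # Rev IDs pasted from the uwdeploy tag page come out descending;
    # re-sort ascending so the merges apply in commit order.
    echo "r76796, r76783, r76782, r76750, r76746, r76740, r76537, r76526, r76503, r76386, r76354, r75995" \
        | tr -d 'r' | tr ',' '\n' | sort -n > /tmp/uwdeploy-revs

    # pwd = branches/uploadwizard-deployment; includes/ worked for most
    # revs, but not all of them (MessagesEn lives outside includes/),
    # which is why the commands were really run one by one.
    cd ~/mediawiki/branches/uploadwizard-deployment
    while read rev; do
        svn merge -c "$rev" ~/mediawiki/trunk/phase3/includes/ includes/
    done < /tmp/uwdeploy-revs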
[14:46:05] flipzagging: Committed merge
[14:46:48] synced on commons.prototype
[14:48:13] Alright
[14:48:22] Hm so one thing I was worried about was the upload max size limit
[14:48:24] ok, two quick tests show regular uploads are not broken, and UploadWizard style uploads do work.
[14:48:30] Cool
[14:48:50] do you know an easy way to test the regular upload API? That's the one thing I am not exercising here.
[14:48:54] So if and when you have time, could you do a quick test setting $wgUploadMaxSize or whatever its name is to a ridiculously low value and verify it's being honored?
[14:49:05] I don't, no
[14:49:13] Other than writing up a quick HTML form for it
[14:49:33] I tested it by quickly changing a file to a ridiculously high size, and it refused to serve it as expected.
[14:50:06] Refused to serve from stash you mean?
[14:50:20] yes
[14:50:33] I removed a file, and it complained appropriately.
[14:50:49] then I replaced that with a symlink to a super-large file, and it made the file size complaint.
[14:51:01] so I'm sure it works, or reasonably sure I guess.
[14:51:06] OK but what I was getting at is the limit applied when you first upload the file
[14:51:16] oh, I see
[14:51:17] Like, on Commons you can't upload files >100M
[14:51:45] ok I'll try it
[14:52:09] that's a combination of the php.ini setting and the MW global though right?
[14:52:19] Yeah whichever is lowest
[14:52:30] So if you set the MW global to like 5, it should refuse almost all reasonable uploads
[14:52:48] oh really
[14:52:56] I thought it would be slightly different
[14:53:06] if you change the php setting, it will refuse immediately
[14:53:22] I believe the MW setting will happen after PHP's finished with it
[14:53:27] Yes
[14:53:34] So effectively, whichever is lowest wins, right?
[14:54:02] (Currently we don't seem to show reasonable error messages for violating the PHP limit in 1.16wmf4; there's code for that in trunk)
[14:54:21] I'm trying to find the global in MW now
[14:55:07] $wgMaxUploadSize = 1024*1024*100; # 100MB
[14:55:20] nm got it
[14:56:59] I set it to 5 and uploaded a file the old-fashioned way -- Special:Upload -- nothing special happened, no error.
[14:57:13] hm
[14:57:40] That's strange
[14:57:41] that might mean that my changes have broken it for all methods.
[14:57:50] That's what I though
[14:57:51] t
[14:57:55] ooookay.
[14:58:03] Well I thought it'd break for non-UI methods
[14:58:06] Hadn't thought it'd be all
[14:58:12] That's weird though
[14:58:14] or, possibly it doesn't even work
[14:58:32] hah
[14:58:37] can we test this with a wmf deploy branch wiki somewhere, or do we have to make it ourselves
[14:58:44] The only check is in UploadFromURL of all places
[14:58:54] or maybe there's a minimum value for wgMaxUploadSize
[14:58:56] testwiki
[14:59:03] Fairly sure there's not
[14:59:08] Let's poke at testwiki
[14:59:50] I don't use that, what's the exact url?
[15:00:05] 'wgMaxUploadSize' => array(
[15:00:07] // Only affects URL uploads; web uploads are enforced by PHP.
[15:00:08] Aha
[15:00:11] test.wikipedia.org
[15:00:15] doh!
[15:00:23] The definition was *changed* to apply to all uploads
[15:00:31] So this is not technically a regression
[15:00:39] although it sucks.
[15:00:46] I agree
[15:00:54] And the trunk behavior is better
[15:00:56] Krinkle: that's what I thought, but I have a DNS error here.
[15:01:01] But we didn't break anything
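Since this branch's $wgMaxUploadSize only gates URL uploads, the ceiling a web upload actually hits is PHP's own. A quick way to see what PHP will enforce; these are stock php.ini directives, and whichever limit is lowest wins:

    # Web uploads in 1.16wmf4 are capped by PHP itself, not the MW global.
    # post_max_size must also exceed upload_max_filesize, or the file
    # never arrives at the script at all.
    php -r 'echo ini_get("upload_max_filesize"), "\n", ini_get("post_max_size"), "\n";'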
[15:01:17] catrope@roanLaptop:~/mediawiki/trunk/phase3$ host test.wikipedia.org
[15:01:18] test.wikipedia.org is an alias for text.wikimedia.org.
[15:01:20] text.wikimedia.org is an alias for text.esams.wikimedia.org.
[15:01:21] text.esams.wikimedia.org has address 91.198.174.232
[15:01:28] Note it's test.wikiPedia.org, not wikiMedia
[15:01:37] http://test.wikipedia.org/ works here :)
[15:03:12] not working for me, but perhaps it's at&t.
[15:03:35] flipzagging: Try: https://secure.wikimedia.org/wikipedia/test/wiki/Main_Page
[15:03:35] the esams name works.
[15:03:47] flipzagging: Yeah, but doesn't show a real wiki, right ?
[15:03:49] Krinkle: that was clever. Worked.
[15:03:57] Krinkle: yes, just the landing page.
[15:04:14] Yeah test on secure sort-of works
[15:04:16] ok um, I forgot what we were going to do.
[15:04:24] Test the upload max size
[15:04:26] it seems that the feature just doesn't work.
[15:04:26] ::zip::
[15:04:30] But it turns out there's nothing to test
[15:04:41] why did you think there was an issue in the first place
[15:04:48] some other anomaly?
[15:04:51] Because of the way the variable currently works
[15:05:02] I saw revisions made after the definition was changed
[15:05:11] ok
[15:05:23] In particular, one that moved max size checks around and consolidated them with after-the-fact checks to see if the PHP limit was hit
[15:06:05] And I forget why, but I then had reason to believe I half-merged those moves, i.e. removed the checks from the old places but never added them back in the new places
[15:06:22] ok, so false alarm
[15:06:27] Yes
[15:06:47] all right if you're satisfied i am
[15:06:51] well
[15:07:21] I'd like to test the old upload api just a bit somehow, but I'm also lazy
[15:07:22] I think we're good
[15:07:32] Yeah we should do that
[15:07:37] geek... tendencies... struggling... for control
[15:07:50] haha
[15:08:01] I think Bryan T. M. had a test script for this
[15:08:41] what if I hacked in the PHPUnit, just on prototype
[15:09:16] Could do that, good luck :)
[15:09:24] Although it should really just be cURL based, right?
[15:10:45] well here's the irritating part; it isn't
[15:10:52] it's based on faking API requests in PHP.
[15:11:05] so, the HTTP transport is not tested, but the logic is.
[15:11:25] Aaah
[15:11:34] there are some advantages, as you can fiddle with sessions directly and so on.
[15:11:42] but mostly it's not a good idea.
[15:14:56] Yeah
[15:15:12] Bryan mentioned his script in a mailing list post once
[15:15:16] Lemme try to find it
[15:18:32] i'm working on phpunit
[15:18:52] the simple apt-get did not install the right version.
[15:20:36] Hmmm
[15:20:57] <^demon> Install from pear, not apt-get
[15:21:05] <^demon> You definitely want 3.5.x
[15:21:09] yes
[15:21:13] otherwise I get 3.0.x
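A sketch of ^demon's suggested install; the PEAR channel names are from memory of the PHPUnit 3.5 era and should be treated as assumptions.

    # apt on this distro ships PHPUnit 3.0.x; pull 3.5.x from PEAR instead.
    sudo pear channel-discover pear.phpunit.de
    sudo pear channel-discover components.ez.no
    sudo pear channel-discover pear.symfony-project.com
    sudo pear install --alldeps phpunit/PHPUnit
    phpunit --version    # should now report 3.5.x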
[15:23:11] I'll install pywikipediabot and upload to commons.prototype with that
[15:24:17] what PHP version do we run live?
[15:24:30] for wikipedia, commons, et al.
[15:24:39] Lemme check
[15:24:57] Actually, Special:Version is the most reliable source for that
[15:25:13] 5.2.4-2ubuntu5.12wm1 (apache2handler)
[15:25:34] I am surprised at 5.2
[15:25:56] I'm not sure that 1.16wmf4 works nicely in 5.3
[15:25:58] I'm trying to think now if I've ever used a 5.3-ism
[15:26:05] maybe in one-off scripts
[15:26:08] People notice that
[15:26:14] I run 5.2 myself
[15:26:23] ok, mental note, downgrade
[15:26:26] 5.3 is stricter on references though, which results in very annoying behavior
[15:26:38] when you use hooks that get their ampersands wrong
[15:27:28] PHP 5.2 doesn't care if the hook function expects &$object and the caller passes $object (maybe a notice or something), PHP 5.3 silently (?) refuses to execute the function and you get a confusing "Hook did not return a value" error
[15:28:20] <^demon> We really need to update to something newer in 5.2.x
[15:35:15] RoanKattouw: how's that pywikibot thing going. I am in a maze of twisty dependencies.
[15:35:33] Almost done
[15:37:41] trying to get a more modern phpunit results in having to upgrade pear which results in having to upgrade everything :(
[15:38:14] also, I can't even hack it with the older phpunit since we do unusual stuff to set it up
[15:38:24] grumble grumble
[15:59:05] Ugh it looks like pywikipedia screen-scrapes the upload page
[15:59:23] Although that may be because it warns about not being able to get a token, which is also strange
[16:01:51] User::matchEditToken: broken session data
[16:01:53] Hmph
[16:05:41] WTF... pywikipediabot looks broen
[16:05:43] *broken
[16:05:48] At least the upload.py part
[16:06:05] Or wait maybe the version parameter is significant and shouldn't be set to 1.12alpha :D
[16:06:50] Whee!
[16:06:52] Uploading file to commonsproto:en via API....
[16:06:54] Upload successful.
[16:08:19] Alright so pywikipediabot works just fine with uwd
[16:08:22] And now Neil's gone :(
[17:08:25] RoanKattouw: hey, I'm back. Any progress?
[17:09:08] Uploading file to commonsproto:en via API....
[17:09:09] Upload successful.
[17:09:13] Alright so pywikipediabot works just fine with uwd
[17:09:15] And now Neil's gone :(
[17:09:21] heh
[17:09:36] well, you can go ahead and pull the trigger as far as I'm concerned
[17:09:47] I would do that if I weren't about to have dinner
[17:09:50] we're on schedule for 10:00am deploy.
[17:09:52] oh right.
[17:09:55] So let's do it at the scheduled time instead
[17:09:56] whenever you want, then.
[18:03:43] flipzagging: Alright, I'm back, let's do this thing
[18:03:53] ok. Anything I can do to help?
[18:04:42] Not yet
[18:04:56] You can test once I deploy to test.wp.o
[18:05:05] ok
[18:05:07] And you can help me fix things if they explode
[18:05:56] *guillom prepares the extinguisher.
[18:05:59] UW is going on test.wp.o?
[18:06:06] the backend API of it
[18:06:08] not the frontend
[18:06:14] Ah.
[18:06:23] Yeah we're only deploying UW backend changes today
[18:06:36] The actual wizard will be deployed .... in 8 days?
[18:06:40] *RoanKattouw forgets
[18:07:08] *Shirley imagines Mickey in wizard robes.
[18:07:44] ooooh
[18:07:50] something like that
[18:07:55] That's a fantastic idea.
[18:08:09] And publicized later, or something. flipzagging and guillom should be able to explain the schedule more accurately
[18:08:20] I think Mickey should be out of copyright, surely? ;)
[18:08:29] Heh.
[18:08:40] Disney is evil.
[18:09:07] Wouldn't Mickeys drawn in recent years still be copyrighted?
[18:09:23] If that's true, you'd have to use a very old Mickey drawing, or draw a Mickey yourself, I guess :)
[18:10:06] Copyright in the U.S. is such a mess in part due to Mickey.
[18:11:23] RoanKattouw: it's a joke for copyright / free licensing geeks. The law that extends copyright nearly perpetually for large corporations is nicknamed the Mickey Mouse Protection Act, because Disney does it to make sure their trademark stays copyrighted.
[18:12:24] http://en.wikipedia.org/wiki/Copyright_Term_Extension_Act
[18:12:45] haha
[18:13:11] mickey is not out of copyright.
[18:13:29] they keep doing shit to renew the sorcerer's apprentice, anyway.
[18:16:47] flipzagging: Alright, merged uploadwizard-deployment to 1.16wmf4
[18:16:58] ok
[18:17:05] Now running svn up so it'll appear on test
[18:21:14] flipzagging: OK it's on test now
[18:21:47] all right
[18:21:58] can you point that python api test at the server
[18:22:13] Yes
[18:22:54] Having fun with pywikipediabot?
[18:26:52] flipzagging: Successful, see http://test.wikipedia.org/wiki/Special:RecentChanges
[18:26:55] Okay, the normal upload methods seem to be working
[18:26:59] guillom: Best way we knew to test the normal upload API
[18:27:10] but, we want to test the new API as well
[18:27:21] Try reuploading too?
[18:27:41] I'm having difficulty doing it on my end, I thought I could just point the JS of UploadWizard at the new server, but it is failing to obtain an edit token.
[18:27:56] reuploading, like replacing a file you mean?
[18:28:32] Yeah
[18:29:34] Meanwhile, I can rig up an HTML form for testing the stash interface
[18:29:52] Or... just rig up pywikipediabot to do it
[18:29:54] works for me
[18:30:05] flipzagging: To stash something all I have to do is add &stash=1 to a normal API upload, right?
[18:30:09] yes can you just alter pywikipediabot to add "stash=1" to its upload parameters
[18:30:11] yes
[18:30:16] OK on it
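The same request pywikipediabot is being rigged to make, sketched with cURL; the cookie jar, token value, and file names here are placeholders, not values from the log.

    # A normal API upload with stash=1 added. Assumes you are logged in
    # (cookies.txt exported from a browser session) and already hold an
    # edit token for the wiki.
    curl -b cookies.txt \
         -F "action=upload" \
         -F "format=json" \
         -F "filename=Stash-test.png" \
         -F "stash=1" \
         -F "token=${EDIT_TOKEN}" \
         -F "file=@/tmp/stash-test.png" \
         "http://test.wikipedia.org/w/api.php"
    # On success the JSON result should carry the stash key for the file;
    # the exact field name varies between versions of this code.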
[18:30:32] I don't know why my method of testing is failing though... edit token returns a blank response
[18:30:38] although we did get a response
[18:30:49] pywikipediabot is the Devil.
[18:31:02] it seems a bit fragile
[18:31:12] It does the job.
[18:31:28] Sometimes.
[18:31:29] it's amazing to me how people write PHP fake api stuff, and python screen-scraping stuff, nobody just uses the api :)
[18:31:44] pywikipediabot predates the (write) API.
[18:31:54] ah. makes sense, I guess.
[18:32:05] It was supposed to be rewritten at some point.
[18:32:08] I use http://code.google.com/p/python-wikitools/source/browse/trunk/#trunk/wikitools personally.
[18:32:11] RoanKattouw: does pywikipediabot use api.php ?
[18:32:27] Yes
[18:32:31] OK, it says upload successful
[18:32:41] But of course I didn't see the return data so I don't know the stash ID
[18:32:54] er
[18:32:55] I'll have to figure that out using magic
[18:33:04] ok so can you get it to dump that somehow
[18:33:25] Shirley: upload.py uses the API and that's all I care about right now :)
[18:33:28] Probably can
[18:33:33] Using dark magic
[18:33:37] I have an idea how to do it
[18:34:10] We've come from Wizard Mickey Mouse to Dark Magic. The circle is complete.
[18:34:15] Oh wait, it's linked to my *session*, not my user ID
[18:34:20] right.
[18:34:29] So I need to figure out which session ID pywikipedia got
[18:34:59] Hmph
[18:35:23] This isn't gonna be easy
[18:35:31] flipzagging: Where are stashed uploads stored?
[18:35:41] Or which $wg var controls that
[18:37:12] they're in the local images directory, whatever that is, probably images/
[18:37:45] OK
[18:38:42] So they're just in there along with other random files?!?
[18:38:52] How does that protect against naming conflicts?
[18:39:28] they're in whatever local storage is defined by MW
[18:39:36] and the file names are content hashes
[18:40:07] it's supposed to be in temp storage only
[18:40:29] sorry I misspoke, I think a stash file should be in images/temp
[18:40:53] Oh OK
[18:41:18] try this
[18:41:24] find images/temp -type f | perl -wlne 'print ( ( 86400 * (-M $_ ) ) . " " . $_ )' | sort -n | less
[18:42:17] anyway that just tells you if the file is there or not
[18:42:30] not if the response from the API was right
[18:42:49] if you can spy on sessions in test.wikipedia.org, that might also be useful
[18:42:51] That's true, I can't see the API responses
[18:42:57] But yes, I have my session data now
[18:42:59] But it's hard to decode
[18:42:59] why can't you
[18:43:04] erk
[18:43:10] php serialized data right
[18:43:20] you can't spy on your own network traffic or something
[18:43:54] I could
[18:43:58] But I'd have to set that up
[18:44:04] I got the session ID from pywikipedia's cache file
[18:44:13] And fortunately the upload data is last in the session
[18:44:21] So it's easy to pull out and unserialize
[18:44:24] yes, it appends in order
[18:45:42] Looks good: http://mediawiki.pastebin.com/SCEt4WWs
[18:46:24] Trying to remember whether I did indeed stash two files
[18:46:28] yes
[18:46:28] RoanKattouw: what is the number you dial into for the features team meeting?
[18:46:37] ext 2002
[18:46:41] Office number: (415) 839 6885
[18:47:07] thanks!
[18:47:58] Roan: can you obtain that session id and try the URL http://test.wikipedia.org/wiki/Special:UploadStash/4o7swoecph0ser5x32act2t2diqt5vs.png ?
[18:49:10] Working on it
[18:49:13] I've obtained the session ID and am now hitting /frex7j7se8avvfo2pmmt5t643h10zki
[18:49:16] the other thing, we *could* try installing the UploadWizard extension on test.wikipedia.org, temporarily even.
[18:49:31] it's not necessary, but it's easier if you add an extension to the end
[18:49:44] you should still get it though
[18:49:48] Grrr screw you knsq30
[18:50:26] Oh the .png is optional?
[18:50:38] it makes browsers and filesystems happier :)
[18:50:41] Gaaaah our Amsterdam DC is going down
[18:50:41] but yes, it's optional
[18:51:16] ... and coming back up
[18:51:42] Yup, both work
[18:51:47] They were both images I uploaded
[18:51:52] So the stash is totally legt
[18:51:54] *leigt
[18:51:57] *legit
[18:52:11] excellent
[18:52:16] ok, we know that part of this works
[18:52:25] we don't know that the API returns the proper thing, we are just presuming.
[18:52:41] Yeah
[18:52:51] I think I will have to craft a proper test using curl, my hackery with JS isn't going to help.
[18:53:04] The other thing we could try is to install UploadWizard, even temporarily, on test.wp.org
[18:53:14] Yeah just craft a cURL test
[18:53:24] Meanwhile, I'll deploy to the cluster
[18:53:26] but then I have to deal with cookies and logging in and things :(
[18:53:42] Just log in using a browser and copypaste the cookie
[18:53:43] ok, let's call it tentatively okay and apparently we didn't break anything. :)
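Doing it the lazy way suggested above, with a browser session cookie pasted into cURL; the stash key is the one from the log, and only the matching session should get the file back.

    # Fetch a stashed file back out of Special:UploadStash. The .png
    # suffix is optional; it just keeps browsers and filesystems happy.
    curl -b "SESSION_COOKIE_COPIED_FROM_BROWSER" -o /tmp/fetched.png \
         "http://test.wikipedia.org/wiki/Special:UploadStash/4o7swoecph0ser5x32act2t2diqt5vs.png"
    # With the wrong (or no) session this 404s with the
    # "key not found in stash" message instead of serving the file.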
[18:53:57] Or well, that's not nice for a 'real' test
[18:53:59] Yeah
[18:54:04] 90% of it is confirmed OK
[18:54:21] I can pull the stash ID out of $_SESSION and I get the file back when feeding it to Special:UploadStash
[18:54:33] And those Special:UploadStash come up blank for you, right?
[18:54:39] yes.
[18:54:47] OK
[18:54:48] as in yes they are not visible.
[18:54:50] Then let's go cluster-wise
[18:54:58] And you get a reasonable error message?
[18:55:11] the correct error message.
[18:55:13] Oh I can just try that myself :P
[18:55:20] Yeah it 404s with the key not found in stash message
[18:55:23] this wasn't really an exhaustive test though.
[18:55:26] Excellent
[18:55:31] we also should test publishing from stash to live.
[18:55:32] No, I know
[18:55:48] my fault for not having a real test suite for you.
[18:56:02] or, the fault of the deploy branch for not using PHPUnit... :)
[18:56:10] Do you know what the temp directory is/was used for other than stashing?
[18:56:22] Cause there's already stuff in /mnt/upload6/wikipedia/test/temp
[18:56:34] some files get stashed anyway, if there are errors/problems
[18:56:50] also tests for alternative upload APIs like the chunked api, I think
[18:57:08] Oh you mean we already stash for warnings?
[18:57:12] yes
[18:57:14] we didn't invent this
[18:57:21] 20081030045854!Confused.gif 20090129010514!Confused.gif
[18:57:27] So apparently this stuff never gets cleaned up ...
[18:57:37] *RoanKattouw wonders how large the temp dir is
[18:57:44] well, it's not my problem if nobody writes a cronjob to clean up temp dirs
[18:57:47] 37MB for testwiki
[18:57:49] I know :)
[18:57:52] Just sayin'
[18:57:54] sure
[18:57:57] Will bring up on private-l
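The cleanup cronjob nobody has written yet might look something like this; the path is the one quoted above, while the schedule and age cutoff are pure assumptions.

    # Crontab entry: every night at 04:00, delete temp/stash files
    # that have been sitting around for more than a day.
    0 4 * * * find /mnt/upload6/wikipedia/test/temp -type f -mtime +1 -delete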
[18:58:27] ok meeting
[18:58:37] Yeah
[18:58:41] I'll scap soonish then
[18:58:52] It looks like deploying this code cluster-wide won't hurt
[18:58:59] And if it does, I wanna be awake for it
[18:59:18] You in the meeting room now flipzagging_ ?
[18:59:24] (i.e. done moving your laptop)
[18:59:34] Yeah
[18:59:38] I'll scap soonish then
[18:59:39] It looks like deploying this code cluster-wide won't hurt
[18:59:41] And if it does, I wanna be awake for it
[19:00:18] I am fine with whatever you do as long as it's in the next 24 hours or so
[19:00:26] OK
[19:00:28] So I'll do it now
[19:05:34] flipzagging: Just ran scap
[19:05:49] hooray, also fear
[19:05:53] :)
[19:06:15] Whoa, the load spike is huge on the API Apaches
[19:06:20] They now actually have spare capacity
[19:06:25] So CPU usage shot up from 40% to 80%
[19:06:46] It looks different when CPU usage is already at 70-80 in normal operation, which was the case until we got additional API servers
[19:08:20] Strike that, that was the regular cluster
[19:08:24] Two of them even went down xD
[19:10:09] flipzagging: For graphs porn (load spike due to cache rebuild on the Apaches) see http://ganglia.wikimedia.org/?r=hour&s=descending&c= within an hour from now
[19:10:59] Also not seeing any fatal errors in the logs other than the normal slew of OOMs due to l10ncache rebuild, so that's good
[19:14:08] Graphs porn? Fap function!
[19:55:50] arrgh dropped again
[19:56:32] Oh so that was YOU
[19:56:38] They now think it was me
[19:56:41] (That bleep)
[19:58:38] heh
[19:59:35] nah me.. another reason i hate thanksgiving.. apparently i cannot get internet in this apartment till Dec 6th because of thanksgiving (don't ask me what the connection is). Mifi connection sucks and i get no cell phone reception in the apartment
[19:59:57] i feel like i'm camping in the woods in a fancy apartment
[20:00:28] haha
[20:07:21] thank you RoanKattouw œ pdhanda
[20:07:27] s/œ/&
[20:56:02] ^demon: i have some changes merged from trunk r76353 on prototype flagged revs. For some reason i still cannot commit on that branch
[20:56:12] svn: MKACTIVITY of '/svnroot/mediawiki/!svn/act/a0891c9d-08f4-4555-a221-0c44c3e6d673': 403 Forbidden (http://svn.wikimedia.org)
[20:56:53] *^demon is perplexed.
[20:57:46] :(
[21:08:55] <^demon> pdhanda: Merged to staging branch. Not sure what robla's done differently w.r.t. prototype setup, but we could do push there again
[21:34:47] flipzagging, trevor, robla: this is the essay i was talking about: http://www.negativland.com/albini.html
[22:04:09] Hi, I have a small issue with Subclipse, I am trying to do a commit but nothing happens, no error message, no commit. This used to work fine. The only thing I did before the commit was I renamed two files and moved two files. But they have been added to version control. Any suggestion on how to fix this?
[22:04:52] <^demon> Diederik: Any chance you checked out with http instead of svn+ssh?
[22:08:04] I have checked out using svn+ssh
[22:11:48] <^demon> hm.
[22:13:45] I am using the SVNKit as interface
[22:15:30] <^demon> Offhand, I'm not sure. Might try poking around #mediawiki to see if anyone else uses eclipse.
[22:15:48] i never had much luck with svn plugins to eclipse
[22:16:00] (these days i actually use netbeans as editor, but still do my svn and git from command line :D)
[22:16:13] though netbeans understands git natively these days
[22:17:06] <^demon|away> brion: netbeans hasn't given me any problems with svn either.
[22:17:19] <^demon|away> I use it for doing propedits, much easier than doing it on the cli.
[22:20:30] I am using a different interface now (JavaHL) and that does give an error message: it cannot find a file, which is correct as I deleted it, so I will delete it from svn locally as well and that should (hopefully) fix it
[22:24:45] Did you try to move a file while someone else changed it in SVN?
[22:24:53] That can yield nasty results
[22:26:59] No, I am the only one who is committing to this project
[22:38:55] I am still struggling.... when doing a commit it says: Transmitting file data ...
[22:38:56] Skipped
[22:39:16] But the repository is up-to-date
[22:41:05] *robla reads very small backlog, and starts updating prototype
[22:43:16] Diederik: You can install the SVN command line tool to do it, or ask someone who has it to do it for you (I'd volunteer but I need sleep)
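For the record, fixing the missing-file problem from the command line would look roughly like this; the file name is hypothetical.

    # 'svn status' flags files deleted from disk but still tracked with '!'.
    svn status
    # Schedule the deletion properly in SVN, then commit.
    svn delete OldName.java    # hypothetical file name
    svn commit -m "Remove file deleted locally"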
[22:45:48] pdhanda: I see some local mods you made on prototype. looks slightly different than what ^demon|away has checked in
[22:46:29] pdhanda: is that intentionally different, or is it safe to update out of Chad's branch?
[22:49:02] hmm
[22:49:11] let me see what the diff is, one sec
[22:55:54] this mifi picks the worst possible times to drop me
[22:56:10] robla: should be ok to get chad's changes
[22:57:20] and robla while you're here and free can you verify the edit notice and the notice while viewing the latest pending version
[22:58:24] cool....lemme look
[23:01:52] *robla compares to jorm's mockup
[23:08:54] hey...there you are
[23:09:04] :( sorry about that
[23:09:21] pc notice looks great...I think there's just little minor style cleanup
[23:09:33] ok
[23:09:40] most important change: black text, not red
[23:10:51] there are other differences from what Jorm did, but I'm not sure just how wedded he is to those choices
[23:10:53] http://upload.wikimedia.org/wikipedia/commons/2/2d/PendingChangs-Nov16-Pending.png
[23:11:20] I would throw something at him, but I don't see him at his desk
[23:12:20] i need to make that text left-aligned and change the date format
[23:12:26] robla red text?
[23:12:36] pdhanda: http://prototype.wikimedia.org/flaggedrevs/Pakistan
[23:13:00] I wonder if we're seeing different things
[23:15:36] hmm....actually, a more alarming problem is not seeing any text when I click on "Read"
[23:25:20] robla: moved the mifi to the patio, seems better there
[23:25:38] sometimes i wonder why i live in this city :(
[23:25:46] :)
[23:25:46] trying again
[23:26:05] so....I think the red text problem must have been a cache problem
[23:26:14] yup
[23:26:25] it's looking fine for me now
[23:26:35] shift refresh fixed that, so other than the alignment and the date format, it looks ok?
[23:26:54] also i wanted to ask you where to link "accepted revision" to
[23:26:58] in that same notice
[23:27:43] yeah, I think the alignment and date format are all fine. I like this style better, but the one quibble I'd have with it is that it's a little out of sync with the way old revisions look
[23:27:47] *robla pulls up link
[23:28:18] e.g. http://prototype.wikimedia.org/flaggedrevs-w/index.php?title=Pakistan&oldid=7925
[23:28:55] ah
[23:29:10] you want to change that to look similar to the new ones?
[23:29:11] like I said, I like your version better...so I'd rather not fuss with it, but we may want to revisit later
[23:29:24] I'll double check with jorm when he's available
[23:29:39] anyway, a bigger fish....
[23:29:58] what do you see when you click on the "Read" tab?
[23:30:13] eeek
[23:30:20] nothing
[23:31:31] I wonder if that's also a cache problem
[23:31:47] (of a different variety)
[23:32:44] I think this is an intermittent problem that I saw early on which seemed to resolve itself
[23:33:00] ...but it could be a problem we have to figure out the root cause of
[23:35:38] ok, let me look at what's going on
[23:35:44] k...thanks!