Given the memory usage you're seeing (~1.8GB) prior to termination and the fact that you are on 32-bit Windows, which has a 2GB memory limit per process, I'd recommend attempting the merge on a 64-bit machine. If you don't have a 64-bit machine available, try breaking the work up into smaller merges (unless you're reintegrating a branch, in which case I'm not sure how you'd split that up). I've dealt with 3GB working copies and lots of merging (always from the root of the branch) and I've never hit a memory issue, but I've also been on 64-bit for a long time. It's conceivable that merging a branch with a lot of changes could require a lot of memory, but I'm just speculating.

I suggest posting to the Subversion mailing list, and a quick search tells me you already have :) I suspect they will confirm that you need more memory to do a big merge given the size of the repository, but it's also possible something else is going on.

Separate suggestion: search the mailing list for similar problems. I found a thread about too many mergeinfo properties causing increased memory usage. If you've never merged from the root before, I assume you have a lot of mergeinfo properties scattered through the tree.

UPDATED: It's important to understand what svn:mergeinfo properties are; exercise caution in removing them without some understanding. In Richard's case, the repository never had merges committed from the root of the branches, which means the svn:mergeinfo at the root probably did not contain anything, so removing them all will remove svn's knowledge of what was previously merged. This matters when doing full branch merges (e.g. svn merge url/to/src/branch, where no revision is specified), and may cause Subversion to try to re-merge revisions that were previously merged. Merges of specific revisions (specifying revisions x, y, z) should not be affected. Even if it's all removed, it isn't the end of the world; it just means you'll get svn 1.4-like behavior for whole-branch merges that involve branches from before that point in time.
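As a rough sketch of the two suggestions above (the branch URL, revision numbers, and subtree path below are placeholders, not details from the original question), splitting a merge into revision ranges and inspecting svn:mergeinfo look like this:

```shell
# Instead of one full-branch merge that relies on svn:mergeinfo ...
svn merge http://server/repo/branches/feature .

# ... break the work into smaller revision-range merges, committing between each:
svn merge -r 100:200 http://server/repo/branches/feature .
svn commit -m "Merge r100:200 from feature branch"
svn merge -r 200:300 http://server/repo/branches/feature .

# List every path in the working copy that carries an svn:mergeinfo property:
svn propget svn:mergeinfo --depth=infinity .

# Remove the property from one subtree (see the caution above before doing this):
svn propdel svn:mergeinfo path/to/subtree
```

Deleting subtree mergeinfo while leaving a correct root-level svn:mergeinfo intact is usually the safer cleanup; deleting the root property is what loses svn's record of past merges.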
We are seeing the following error when trying to perform a command-line svn merge with Subversion 1.6.9 under 32-bit Windows XP:

This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information.

Inspecting Windows Task Manager around this time reveals the following memory usage: the peak memory usage of the svn.exe process is in excess of 1.8GB. As an aside, we get the same result when trying to perform the merge using TortoiseSVN.

We are trying to perform the merge from the root level of our repository. The total file size (on a developer machine) of the repository is around 3GB. This is the first time that we are attempting a root-level merge.

Edit: After some trial-and-error investigation I've found that this problem seems to be caused by one specific folder in our repository. Performing a merge on just this folder results in the same out-of-memory error (although it takes longer to blow up). Are we hitting an internal svn limit?

There are two factors in major SmartSVN slow-downs in my own usage. I have large sandboxes and prefer to have multiple sandboxes in a single project, and both of the following changes have been required to achieve any kind of usability.

The first is configuring SmartSVN to use more RAM in the startup script. I imagine the mechanism is different on OS X, but on Linux, SmartSVN is started with smartsvn-7_5/bin/smartsvn.sh. There is a line near the beginning of that script containing the max heap size, which I've changed to SMARTSVN_MAX_HEAP_SIZE=1024m. This can also be changed by setting that environment variable before SmartSVN startup, at least in my situation.

The other change (and I think this is Linux-specific, but it might also apply to OS X) is increasing the system-wide maximum for the number of inotify watches. For me on Linux, this meant adding the following lines to my /etc/sysctl.conf and running "sysctl -p" or restarting. The SmartSVN help docs (I can't find the article on the new site) suggested setting it to 32000, but I found that I kept increasing it, and have been happier with a setting of 128000:

# Work around problems with SCM programs and java7
fs.inotify.max_user_watches = 128000

Honestly, I think that SmartSVN has inherited the huge memory consumption problems of Eclipse, and the team should put performance improvements at the top of their priorities for the next release.
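On a typical Linux setup, the two changes described above can be applied from a shell roughly like this (the smartsvn-7_5 path comes from the answer's install and may differ on yours; this assumes sysctl reads /etc/sysctl.conf):

```shell
# Raise the SmartSVN Java heap without editing the startup script,
# by exporting the variable before launch:
export SMARTSVN_MAX_HEAP_SIZE=1024m
smartsvn-7_5/bin/smartsvn.sh &

# Raise the system-wide inotify watch limit and apply it immediately:
echo 'fs.inotify.max_user_watches = 128000' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

# Verify the value the kernel is actually using:
cat /proc/sys/fs/inotify/max_user_watches
```

Exporting the variable is handy for testing a value before committing the edit to the script, since the script-level setting would otherwise be lost on upgrade.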