With terabyte hard disks aplenty these days, using CVS (or Subversion) to store large binary files is quite feasible. The problem with Eclipse is that it does not treat Very Large Files differently from smaller files. For example, a large binary file might be a generated .iso image or a .war file that changes in response to running a build script. These files change frequently, but are committed rarely. The problem is that such Very Large Files clog up operations like Team Synchronize, effectively making the application hang: not even pressing Cancel will make Eclipse responsive again. To reproduce: 1. Add a Very Large File and commit it to CVS. It should be something that takes ca. 5-10 minutes to commit/update over the CVS connection in question. 2. Change the file locally. 3. Start Team Synchronize. => Team Synchronize takes forever due to the large file.
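The reproduction steps above can be sketched from the command line. This is a hedged, minimal sketch: the file size here is deliberately small so the script runs quickly (scale `count` up until a commit takes the stated 5-10 minutes), the file names are illustrative, and the `cvs` commands are shown as comments because they require a configured repository (`$CVSROOT`) that this sketch does not assume.

```shell
# Sketch of the reproduction scenario; paths and sizes are assumptions.
set -e
WORK=$(mktemp -d)
cd "$WORK"

# Step 1: create a large binary file (8 MB here; use a much larger count
# so that commit/update takes ca. 5-10 minutes on your CVS connection).
dd if=/dev/zero of=big.iso bs=1M count=8 2>/dev/null
ls -l big.iso

# With a real CVS working copy you would then add and commit it
# (-kb marks it binary so CVS skips keyword expansion and line-ending
# translation):
#   cvs add -kb big.iso
#   cvs commit -m "add large generated image" big.iso

# Step 2: change the file locally so Synchronize sees an outgoing delta.
printf 'x' >> big.iso

# Step 3: in Eclipse, run Team > Synchronize with Repository.
# Observed result: the view hangs on the large file and Cancel has no effect.
```

The `-kb` flag matters for generated images like .iso or .war files: without it CVS may corrupt the content, and keyword scanning adds yet more overhead to an already slow transfer.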
Is anyone investigating this? I have users working with files as large as 200 MB (text files) and they keep running into these issues (even on machines with huge heap sizes).
No, we haven't had a chance to investigate this yet. I suspect the problem may be with the model-based synchronization. Here is a link to a FAQ entry that describes this: http://wiki.eclipse.org/index.php/CVS_FAQ#Synchronizing_has_some_regressions_in_Eclipse_3.2._Why.3F Could you turn off model-based synchronization and see if that helps? (There are certain optimizations in the old-style synchronization that could not be made in the model-based one.)
(In reply to comment #2) > Could you turn off model-based Synchronization and see if that helps (i.e. > there are certain optimizations in the old style synchronizations that could > not be made in the model based one). Tried it, but there was no change - these are basically plain text files (no Java source or anything).
*** Bug 171129 has been marked as a duplicate of this bug. ***
The issue mentioned in bug 171129 is slightly different but is caused by the transfer of a large file during Synchronize. The scenario mentioned there should be tested when this issue is being addressed.
It would not be a trivial change but it should be possible to add special handling for large files. We'll target this for 3.4.
Mass update - removing 3.4 target. This was one of the bugs marked for investigation (and potential fixing) in 3.4 but we ran out of time. Please ping on the bug if fixing it would be really important for 3.4, and does not require API changes or feature work.
(In reply to comment #7) > Mass update - removing 3.4 target. This was one of the bugs marked for > investigation (and potential fixing) in 3.4 but we ran out of time. Please ping > on the bug if fixing it would be really important for 3.4, and does not require > API changes or feature work. > I've switched to Subversion. It works a million times better with large files and narrow bandwidth connections.
This bug hasn't had any activity in quite some time. Maybe the problem got resolved, was a duplicate of something else, or became less pressing for some reason - or maybe it's still relevant but just hasn't been looked at yet. If you have further information on the current state of the bug, please add it. The information can be, for example, that the problem still occurs, that you still want the feature, that more information is needed, or that the bug is (for whatever reason) no longer relevant.