Build Identifier: M20100211-1343

When the indexer tries to parse a specific .c file from glibc's source (math/test-tgmath2.c), it fails with out-of-memory exceptions (and does not continue parsing the rest of the project). The problem occurs only with that specific file: if it is removed from glibc's source tree, the indexer is able to index the whole glibc project successfully. To reproduce the error it is not necessary to create a project for the whole of glibc; the problem occurs even if this file is the only file in the project.

I use the following JVM:

java version "1.6.0_18"
OpenJDK Runtime Environment (IcedTea6 1.8) (6b18-1.8-0ubuntu2)
OpenJDK 64-Bit Server VM (build 14.0-b16, mixed mode)

My environment is:

Linux laptop 2.6.32-24-generic #39-Ubuntu SMP Wed Jul 28 05:14:15 UTC 2010 x86_64 GNU/Linux

The PC has 2 GB of RAM.

Reproducible: Always

Steps to Reproduce:
1. Download the glibc source code from http://www.gnu.org/software/libc/ (I used glibc-2.12.1).
2. Extract the file math/test-tgmath2.c from the .tgz, put it in a separate directory, and create a new C project from that folder.
3. During the project's indexing (which starts automatically in my environment), an exception occurs (see attached log) and a message suggesting to close Eclipse appears.
Created attachment 176398 [details] Eclipse's log file with stacktrace.
This works for me. Have you tried adjusting the maximum heap size? E.g. -vmargs -Xmx384m
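For readers unfamiliar with where that flag goes: the heap bound can be made permanent at the end of eclipse.ini. A minimal sketch of the file's tail, assuming a stock installation (existing options above -vmargs vary per install; -vmargs must come last, and every argument after it must sit on its own line):

```ini
-vmargs
-Xms40m
-Xmx384m
```

Alternatively, the same arguments can be passed once on the command line: ./eclipse -vmargs -Xmx384m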
(In reply to comment #2)
> This works for me, have you tried to adjust the maximum heap size?
> e.g. -vmargs -Xmx384m

The maximum heap size was set to 512m. What else can I provide to help find the problem?

P.S.: I just realized that the version of Eclipse I was running was not the recent release I downloaded, but a version from Ubuntu's repository. I removed the old version from the system and made sure the new version is used: build 20100617-1415. The problem still exists (the default heap size is now 384m).
(In reply to comment #3)
> (In reply to comment #2)
> > This works for me, have you tried to adjust the maximum heap size?
> > e.g. -vmargs -Xmx384m
> Maximum heap size was set to 512m. What else can I provide to find out the
> problem?

Parsing the file does take quite a bit of memory (there are lots of macro expansions in the file), however I had no problem with 384m of heap. You can try to use jhat to figure out what is consuming the memory: http://download.oracle.com/javase/6/docs/technotes/tools/share/jhat.html
(In reply to comment #4)
> You can try to use jhat to figure out what is consuming the memory

Attached is jhat's histogram from processing a heap dump created during parsing of a project that included only the file test-tgmath2.c. In that case Eclipse was run with -Xmx384m. The heap overflow occurs even with -Xmx1024m, but I would be unable to process the resulting dump file with jhat, since even processing the 384m dump requires ~1.5 GB.

I tried switching the JVM to Sun's 1.6.0_20-b02, then to 1.6.0_21-b06 (both 64-bit), but the results were identical (heap overflow). On a Windows machine (same Eclipse version, JVM 1.6.0_21-b06 32-bit) Eclipse succeeded in parsing the file with -Xmx384m.
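Since the overflow is intermittent, it may be easier to let the JVM write the dump automatically at the moment of failure rather than triggering one by hand. A sketch of an eclipse.ini tail using the standard HotSpot flags (the dump path is a made-up example):

```ini
-vmargs
-Xmx384m
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/tmp/cdt-indexer.hprof
```

The resulting .hprof file can then be opened with jhat or the Eclipse Memory Analyzer.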
Created attachment 176752 [details] jhat histogram mentioned in comment #5
Note: the attached jhat histogram is partial: the list includes only objects for which the total size of all their instances is >100k.
Hmm, maybe the issue can be reproduced on Windows as follows. On Solaris, generate a file that contains the result of preprocessing the following source (gcc -E -P <input-file>):

#undef __NO_MATH_INLINES
#define __NO_MATH_INLINES 1
#include <math.h>
#include <complex.h>
#include <stdio.h>
#include <string.h>
#include <tgmath.h>

On Windows, include the preprocessed headers instead of <math.h> etc. If that works, I should be able to reproduce the issue as well.
(In reply to comment #8)
> On windows include the preprocessed headers instead of <math.h> ....
> If that works, I should be able to reproduce the issue as well.

My test results were exactly the opposite. On the same Linux machine where the heap overflow occurs, when preprocessed headers are used instead of includes, Eclipse parses the file quickly (2-3 seconds; I used -Xmx384m for this test). With includes it takes about 1 minute to parse the file before the heap overflow occurs. (Note: a number of times Eclipse did succeed in parsing the file with includes. When I rebuild the index for the project containing the file (by 'project' I mean a project that includes only test-tgmath2.c), sometimes the index is rebuilt successfully, which again takes about 1 minute, and sometimes the overflow occurs.)

On Windows, on the other hand, both the file with preprocessed headers and the file with includes are parsed quickly (2-3 seconds; Windows is installed on the same machine as Linux, and the project is shared between Linux and Windows).
I'm facing the same issue, but even 1 GB of heap isn't enough to index it! Maybe test-tgmath2.c presents the CDT indexer with a condition that wasn't anticipated when the indexer was developed, so IMO we should go further and investigate possible memory leaks in the indexer. Could someone give me hints on how to debug the CDT indexer?
(In reply to comment #10)
> Someone could give me hints on how to debug the cdt indexer?

Perhaps you could start by getting a stack trace:
http://wiki.eclipse.org/How_to_report_a_deadlock#Getting_a_stack_trace_on_other_platforms

and examining the memory consumption:
http://www.eclipse.org/mat/

Also: http://wiki.eclipse.org/Getting_started_with_CDT_development

I wasn't able to reproduce the problem on Windows.
I tried indexing this file with CDT 8.3 on Linux with 4G of heap. It did not give me an "out of memory" error, but it did take a long time to index (almost 10 minutes).