See the scenario report at http://download.eclipse.org/eclipse/downloads/drops4/S-4.5M5-201501291830/performance/linux.gtk.x86_64/Scenario279.html for org.eclipse.jdt.text.tests.performance.CodeCompletionPerformanceTest#testApplicationWithParamterGuesses2()
There weren't any related changes in JDT UI, but quite a few in JDT Core during that week. Please investigate.
Sasi, please take a look at this. We are looking at commits that went in between I20141216-0800 and I20141223-0800.
I get similar times for both cases (before 16/12 and after 23/12) in my dev environment as well as on the downloaded builds I20150127-0900 and I20141208-2000. Perhaps it's something in the environment? Is there a place where we can see the args, properties, etc. for the test run?
The only place I know of where more information could be available is the Hudson page for perf tests: https://hudson.eclipse.org/perftests/job/ep45ILR-perf-lin64/

E.g. here's some recent console output: https://hudson.eclipse.org/perftests/view/Eclipse%20and%20Equinox/job/ep45ILR-perf-lin64/ws/workarea/I20150203-0800/eclipse-testing/results/consolelogs/linux.gtk.x86_64_8.0_consolelog.txt

Looking at the performance graph, I'm not sure the results before I20141216-0800 (< 2s) are actually accurate. The baseline R-4.4-201406061215 took a lot longer (7.81s). Sasi, are your times closer to 2s or to 10s?
(In reply to comment #4)
> (7.81s). Sasi, are your times closer to 2s or to 10s?

Here's a sample of what I have.

On I20141208-2000:
Scenario 'org.eclipse.jdt.text.tests.performance.CodeCompletionPerformanceTest#testApplicationWithParamterGuesses2()' (average over 10 samples):
  Elapsed Process: 2.8s (95% in [2.75s, 2.84s]) Measurable effect: 77ms (1.3 SDs)
  CPU Time: 3.83s (95% in [3.03s, 4.62s]) Measurable effect: 1.4s (1.3 SDs)
  (required sample size for an effect of 5% of mean: 538)

On I20141230-0800:
Scenario 'org.eclipse.jdt.text.tests.performance.CodeCompletionPerformanceTest#testApplicationWithParamterGuesses2()' (average over 10 samples):
  Elapsed Process: 2.6s (95% in [2.55s, 2.64s]) Measurable effect: 78ms (1.3 SDs)
  CPU Time: 3.78s (95% in [2.9s, 4.66s]) Measurable effect: 1.56s (1.3 SDs)
  (required sample size for an effect of 5% of mean: 685)

Haven't checked with 442. Will do that and attach all the readings I have so far.
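To make the report lines above easier to read, here is a rough, self-contained sketch of how figures like "2.8s (95% in [2.75s, 2.84s])" and "required sample size for an effect of 5% of mean" can be derived from a set of samples. The formulas (a normal z-approximation for the confidence interval) are my assumption; the Eclipse performance harness may use a t-distribution or different constants, so treat this only as an illustration of the statistics, not as the harness's actual implementation. The sample values in `main` are made up, not the real test data.

```java
import java.util.Arrays;

/** Illustrative statistics behind a sampled performance report (assumptions, not the real harness). */
public class PerfStats {

    // Two-sided 95% z value; a small-sample harness would likely use a t value instead.
    static final double Z95 = 1.96;

    static double mean(double[] xs) {
        return Arrays.stream(xs).average().orElse(0.0);
    }

    /** Sample standard deviation (divides by n - 1). */
    static double stdDev(double[] xs) {
        double m = mean(xs);
        double ss = Arrays.stream(xs).map(x -> (x - m) * (x - m)).sum();
        return Math.sqrt(ss / (xs.length - 1));
    }

    /** Half-width of the ~95% confidence interval around the mean. */
    static double ci95HalfWidth(double[] xs) {
        return Z95 * stdDev(xs) / Math.sqrt(xs.length);
    }

    /** Samples needed so the CI half-width shrinks to `effect` (e.g. 5% of the mean). */
    static long requiredSampleSize(double[] xs, double effect) {
        double sd = stdDev(xs);
        return (long) Math.ceil(Math.pow(Z95 * sd / effect, 2));
    }

    public static void main(String[] args) {
        // Hypothetical elapsed-time samples in seconds (NOT the real measurements).
        double[] samples = {2.75, 2.80, 2.84, 2.78, 2.81, 2.79, 2.82, 2.77, 2.83, 2.80};
        double m = mean(samples);
        double h = ci95HalfWidth(samples);
        System.out.printf("Elapsed Process: %.2fs (95%% in [%.2fs, %.2fs])%n", m, m - h, m + h);
        System.out.println("Required samples for a 5%-of-mean effect: "
                + requiredSampleSize(samples, 0.05 * m));
    }
}
```

The "Measurable effect" column in the reports is in the same spirit: it says how small a difference between two builds could be distinguished at the given sample count, which is why two runs with overlapping intervals (as above) don't indicate a real regression.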
Created attachment 250518 [details] reports from the test case on different builds
442 gives pretty much the same times. Attached are reports from different test runs on different builds
So it looks like the issue lies with the baseline or with the readings from the build environment. Lars, just to be sure, can someone from your team run the tests once in your environment too?
(In reply to Jay Arthanareeswaran from comment #8)
> So, looks like there's something about the baseline or the readings from the
> build environment. Lars, just to be sure, can someone from your team run the
> tests once in your environment too?

Where do I find the description of how to run the JDT performance tests?
(In reply to Lars Vogel from comment #9)
> Where do I find the description how to run the JDT performance tests?

Sorry, I actually meant that for Markus :)
(In reply to Jay Arthanareeswaran from comment #10)
> Sorry, actually I meant that for Markus :), sorry about it.

Thanks, I was already getting scared by this request; I haven't touched JDT Core yet.
(In reply to Jay Arthanareeswaran from comment #8)
> So, looks like there's something about the baseline or the readings from the
> build environment. Lars, just to be sure, can someone from your team run the
> tests once in your environment too?

I'm neither Lars nor Markus :) but: what environment are you looking for? Any need to run them on Linux (I'm on Kubuntu)? OTOH, I haven't yet looked at running JDT performance tests, so I, too, could use a quick pointer, if needed.
(In reply to Stephan Herrmann from comment #12)
> but: what environment are you looking for? Any need to run them on linux
> (I'm on Kubuntu)? OTOH, I haven't yet looked at running JDT performance
> tests, so I, too, could use a quick pointer, if needed.

Linux would be okay, I guess. The test in question is in jdt.text, but it can be run just like any other JUnit test. Markus or Sasi can assist you if you need anything, as they are more familiar with the tests.
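Since several people in this thread hadn't run a JDT performance test before, here is a minimal, self-contained sketch of the pattern such a scenario follows: the harness brackets the measured operation with start/stop calls and averages over several samples. Note this stand-in class and its method names are hypothetical, written for illustration only; the real tests use the org.eclipse.test.performance harness, which is not shown here.

```java
/** Hypothetical stand-in for a sampled performance scenario (not the real Eclipse harness API). */
public class ScenarioTimer {
    private final long[] elapsedNanos;
    private int runs = 0;
    private long startedAt;

    public ScenarioTimer(int samples) {
        elapsedNanos = new long[samples];
    }

    /** Called just before the operation under measurement. */
    public void startMeasuring() {
        startedAt = System.nanoTime();
    }

    /** Called just after; records one sample. */
    public void stopMeasuring() {
        elapsedNanos[runs++] = System.nanoTime() - startedAt;
    }

    /** Average elapsed time over all recorded runs, in milliseconds. */
    public double averageMillis() {
        long total = 0;
        for (int i = 0; i < runs; i++) {
            total += elapsedNanos[i];
        }
        return total / (runs * 1_000_000.0);
    }

    public static void main(String[] args) throws InterruptedException {
        ScenarioTimer timer = new ScenarioTimer(10);
        for (int i = 0; i < 10; i++) {
            timer.startMeasuring();
            Thread.sleep(5); // stand-in for invoking code completion in the editor
            timer.stopMeasuring();
        }
        System.out.printf("average over 10 samples: %.1fms%n", timer.averageMillis());
    }
}
```

Because the measured body here is only a sleep, the point is the shape, not the numbers: a real scenario would drive the code-completion operation inside the start/stop bracket and then commit the samples to the harness for aggregation.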
Moving to M7
We are not convinced that this is a real problem with the JDT code. Moving out of Mars.
Target to be set later when we are certain that there is a problem. Moving out.
Let's mark this as WORKSFORME and open a new bug, or reopen this one, if we find a real problem.
We didn't see any evidence of performance issues later on. Verified for 4.6 M7 with build I20160425-1300.