Bug 198894 - Performance tests commit measurements inside a loop in several tests
Summary: Performance tests commit measurements inside a loop in several tests
Status: RESOLVED FIXED
Alias: None
Product: JDT
Classification: Eclipse Project
Component: UI
Version: 3.3
Hardware: PC Windows XP
Importance: P3 normal
Target Milestone: 3.4 M2
Assignee: Benno Baumgartner CLA
QA Contact:
URL:
Whiteboard:
Keywords: performance
Depends on:
Blocks:
 
Reported: 2007-08-05 13:14 EDT by Frederic Fusier CLA
Modified: 2007-08-10 07:00 EDT
CC List: 1 user

See Also:


Attachments
fix (14.47 KB, patch)
2007-08-06 10:54 EDT, Benno Baumgartner CLA
no flags

Description Frederic Fusier CLA 2007-08-05 13:14:04 EDT
Observed in build I20070802-0800, but I guess it has been like this for a (long) time...

The following tests commit performance results inside a loop instead of after it:
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testAllCleanUps()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testCodeStyleCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testControlStatementsCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testConvertLoopCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testExpressionsCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testJava50CleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testNullCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testSortMembersCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testStringCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testUnnecessaryCodeCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testUnusedCodeCleanUp()
org.eclipse.jdt.ui.tests.performance.views.CleanUpPerfTest#testVariableDeclarationCleanUp()
org.eclipse.jdt.ui.tests.performance.views.PackageExplorerPerfTest#testOpen()
org.eclipse.jdt.ui.tests.performance.views.PackageExplorerWarmPerfTest#testOpen()
org.eclipse.jdt.ui.tests.refactoring.reorg.MoveCompilationUnitPerfTests1#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.MoveCompilationUnitPerfTests2#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.MoveStaticMembersPerfTests1#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.MoveStaticMembersPerfTests2#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameMethodPerfTests1#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameMethodPerfTests2#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameMethodWithOverloadPerfTests#test_1000_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameMethodWithOverloadPerfTests#test_100_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameMethodWithOverloadPerfTests#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenamePackagePerfTests1#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenamePackagePerfTests2#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameTypePerfTests1#test_10_10()
org.eclipse.jdt.ui.tests.refactoring.reorg.RenameTypePerfTests2#test_10_10()

This affects the performance result numbers, as the performance framework assumes that each committed result is independent...
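
For illustration, a minimal sketch of the incorrect vs. fixed pattern, assuming the tests build on org.eclipse.test.performance.PerformanceTestCase; the class name and the runOperationUnderTest() helper are hypothetical and not taken from the actual JDT tests:

import org.eclipse.test.performance.PerformanceTestCase;

public class CommitInLoopSketch extends PerformanceTestCase {

    // Wrong: each commit stores a result containing all measurements taken so
    // far, so ten overlapping results (1, 2, ... 10 iterations) get stored.
    public void testCommitInsideLoop() {
        for (int i = 0; i < 10; i++) {
            startMeasuring();
            runOperationUnderTest(); // hypothetical operation being measured
            stopMeasuring();
            commitMeasurements();    // should not be called inside the loop
            assertPerformance();
        }
    }

    // Fixed: measure each iteration, then commit and assert once after the loop.
    public void testCommitAfterLoop() {
        for (int i = 0; i < 10; i++) {
            startMeasuring();
            runOperationUnderTest();
            stopMeasuring();
        }
        commitMeasurements();
        assertPerformance();
    }

    private void runOperationUnderTest() {
        // placeholder for the refactoring or clean-up being measured
    }
}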
Comment 1 Dani Megert CLA 2007-08-06 03:00:58 EDT
There's more: most tests don't do a warm-up run. This can be OK, but normally you should do some warm-up runs first and then do the real measuring.

NOTE: you must backport this change to the perf_33x branch in order to get correct 3.4 perf results.

This one seems to be OK:
org.eclipse.jdt.ui.tests.performance.views.PackageExplorerPerfTest#testOpen()
Comment 2 Benno Baumgartner CLA 2007-08-06 10:54:59 EDT
Created attachment 75434 [details]
fix
Comment 3 Benno Baumgartner CLA 2007-08-06 10:59:13 EDT
Frederic, how did you find out about this? Our implementation is certainly wrong, but I'm not sure if it makes any difference.

Dani, which tests should do a warm-up but don't? All RepeatingRefactoringPerformanceTestCase tests have a testCold test without measurement to warm up.
Comment 4 Frederic Fusier CLA 2007-08-06 12:41:59 EDT
(In reply to comment #3)
> Frederic, how did you find out about this? Our implementation is certainly
> wrong, but I'm not sure if it makes any difference.
> 
I found these problems while trying to improve the generation of the performance test HTML pages. I discovered differences between the results I got with my new implementation and those generated with the current one... While investigating, I got the numbers directly from the releng database and saw that for these tests there were 10 consecutive stored results: the first had one iteration, the second two iterations, etc., until the tenth, which had ten iterations. Then I looked at the code and realized that the commit operation was wrongly done inside the loop instead of after it.

> Dani, which test should do a warm up and does not? All
> RepeatingRefactoringPerformanceTestCase have a testCold test without
> measurement to warm up.
> 
It does make a difference because, while computing the average and standard deviation of a scenario, the performance framework averages all stored numbers if several commits were done (see the computeStatsFromAggregates method of StatisticsSession). As it is said at line 160, this obviously makes the scenario numbers wrong.
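
Purely as an illustration of the skew, not the StatisticsSession code: assume each iteration takes about the same time and that each in-loop commit stores the cumulative elapsed time of the iterations run so far (an assumption consistent with the 1, 2, ... 10 iteration counts seen in the database). Averaging those ten commits then reports roughly 55% of what a single commit after the loop would report:

// Hypothetical numbers only; 't' and the iteration count are made up.
public class SkewSketch {
    public static void main(String[] args) {
        double t = 100.0;               // assumed time per iteration, in ms
        double cumulative = 0.0;        // elapsed time stored by commit k
        double sumOfCommits = 0.0;
        for (int k = 1; k <= 10; k++) {
            cumulative += t;            // commit k would store k * t
            sumOfCommits += cumulative;
        }
        System.out.println("average over in-loop commits: " + sumOfCommits / 10); // 550.0
        System.out.println("single commit after the loop: " + 10 * t);            // 1000.0
    }
}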

I've opened bug 198957 for this problem in the performance framework.
Comment 5 Dani Megert CLA 2007-08-07 04:17:52 EDT
>Dani, which test should do a warm up and does not?
All tests where the first run(s) take(s) longer than all others because they need to init/start stuff first (e.g. class loading).

Of course, if your test has exactly one run, then you can argue that you want to measure the very first invocation.
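
A minimal sketch of such a warm-up, again assuming PerformanceTestCase; the warm-up count of 3 and the runOperationUnderTest() helper are illustrative assumptions:

import org.eclipse.test.performance.PerformanceTestCase;

public class WarmUpSketch extends PerformanceTestCase {

    public void testWithWarmUp() {
        // Warm-up runs: exercise the code path first (class loading, caches,
        // JIT) without measuring anything.
        for (int i = 0; i < 3; i++) {
            runOperationUnderTest();
        }
        // Measured runs, committed once at the end.
        for (int i = 0; i < 10; i++) {
            startMeasuring();
            runOperationUnderTest();
            stopMeasuring();
        }
        commitMeasurements();
        assertPerformance();
    }

    private void runOperationUnderTest() {
        // placeholder for the operation being measured
    }
}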
Comment 6 Benno Baumgartner CLA 2007-08-10 07:00:38 EDT
released to HEAD stream
released to perf_33x stream

fixed > I20070809-1105