RE: [wtp-dev] Clarification of priorities and measurements

I think it's very important that we produce a weekly integration build that buildmasters feel is worth spending time trying to adopt.  We need a flag on the download site that indicates the build has achieved "quality" success.
 
I understand that we also need "api coverage" success for M4.
 
So I see two options for the short term:
1)  quality = test results + component lead email
     api coverage = test coverage
2)  quality = test results
     api coverage = test coverage + component lead email
 
I personally think 2) is a more practical and enforceable policy.  1) is fine with me, if there's a flag on the download site that indicates the quality result (and assuming you can get timely component lead email).
 
Another comment on the phrase "we'd expect JUnit success to always be high".  I believe we should expect JUnit success to always be 100%.  It already seems pretty lenient to only enforce this once a week.
 
-Ted

	-----Original Message----- 
	From: wtp-dev-bounces@xxxxxxxxxxx on behalf of David M Williams 
	Sent: Thu 4/7/2005 8:18 AM 
	To: General discussion of project-wide or architectural issues. 
	Cc: 
	Subject: [wtp-dev] Clarification of priorities and measurements
	
	

	Thanks Chuck. Just to be clear, the priorities are definitely (still) to 1) define high quality API (or decide something cannot be API this release), and 2) produce a high quality milestone (in that order, but both high priority). 
	
	It just seems to me that many of the failing tests didn't have to do with the "defining high quality API" process, so that's what I was asking component leads to clarify. 
	
	So if those are tests failing simply due to the "defining high quality API" process, then it's fair for you to simply say so (for now, until M4) under the rules established by the project leads, in order to have those counted in the coverage report. 
	
	My concern about our process in general is that if we have lots of tests failing due to the "defining high quality API in advance" process, then they are not suddenly going to go away overnight after M4, so we'll have this problem of using JUnits to judge the quality of the build for a long time. 
	
	So ... you are right, using JUnits to judge the quality of the build (the traditional use) and using JUnits to judge API definition in advance of being implemented (the proposed new use we are trying out) are in conflict -- we'd expect them to merge over time as the API is implemented .... but that leaves a fair amount of time of ambiguity. Communication is the key, so I appreciate your (continued) comments. 
	
	I'll make this concrete proposal -- one that is, and is intended to be, consistent with our priorities and goals -- 
	
	let's shoot for complete coverage of overview docs and JavaDoc for proposed API for M4, and then use the JUnit coverage *trend*, as implemented (not defined in advance), to judge quality during the M5 timeframe. That is, we'd expect JUnit success to always be high, and coverage to steadily increase as we approach the 1.0 release. 
	I'm pretty sure this is consistent with the project leads' and PMC's view and quality goals, but they can verify it if others think this is a good idea. 
	
	If anyone has any other suggestions to improve our process to accomplish our goals 
	(high quality API, high quality implementation) and measure our progress towards them, 
	then I think we are all open to suggestions. 
	
	
	
	
	
From: Chuck Bridgham/Raleigh/IBM@IBMUS
Sent by: wtp-dev-bounces@xxxxxxxxxxx
Date: 04/07/2005 09:49 AM
Reply-To: "General discussion of project-wide or architectural issues."
To: "General discussion of project-wide or architectural issues." <wtp-dev@xxxxxxxxxxx>
Subject: RE: [wtp-dev] Thursday I-build schedule

	I will abstain from voting on this issue, but I feel we need to clarify the priority for this milestone. 
	
	Our main focus is to define quality API and build tests that cover this API - we have been directed to build on this coverage in this milestone. 
	
	So I will go ahead and remove our failing tests, which will damage our coverage in the API report. 
	
	Just wanted to bring up these conflicting policies, so we can have a clear understanding of what's expected. 
	
	Thanks - Chuck 
	
	Rational J2EE Tooling Team Lead
	IBM Software Lab - Research Triangle Park, NC
	Email:  cbridgha@xxxxxxxxxx
	Phone: 919-254-1848 (T/L: 444)
	
	
	
From: David M Williams/Raleigh/IBM@IBMUS
Sent by: wtp-dev-bounces@xxxxxxxxxxx
Date: 04/06/2005 07:58 PM
Reply-To: "General discussion of project-wide or architectural issues."
To: "General discussion of project-wide or architectural issues." <wtp-dev@xxxxxxxxxxx>
Subject: RE: [wtp-dev] Thursday I-build schedule

	+1 for "Build will not be declared a success if there are any test errors" 
	
	The "easy way out" was intended to be a very temporary condition, just until M4 is complete (which is not that far off!) intended to make it easier for developers to specify proposed API, provide a JUnit test for it in advance so it could be counted by the test counter, and have the proposed-api-junit failure not prevent the build from being declared a success. 
	
	But, it's not working. I'm not sure if it's just a bad idea or if component leads just haven't been communicating well. (And, I say this knowing full well I'm one of the culprits, allowing failing XML validation tests last week to go unmentioned and unexplained -- they are non-critical but should have been removed until fixed.) 
	
	So, fellow component leads ... please make sure tests are clean this week, or communicate clearly if they are failing due to the "provide JUnit tests in advance" policy. That way, we'll be able to easily decide next week if a change in policy/procedure is needed. (My current impression is that we do not have any that are failing due to the "specified in advance" policy.) 
	
	And, IMHO, I think the main criterion of success of an I-build is whether it can be used as a target for further development, so, in many cases, it's legitimate to remove failing tests if they are failing because they test some fringe cases, etc. Might it even be possible to provide the "proposed-api-tests" and rig things to always have them "fake" success? Long, long term, it would be great if the JUnit framework provided a way to specify the "level" of the test, right in the test ... critical, noncritical, stress-only, performance, etc. .... but that's a Release 2 item, if even then. 
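	For illustration, a rough sketch of how such a "rigged" proposed-API test could look, in JUnit 3.x style (the class name and system property below are invented for this example, not existing WTP code):

	import junit.framework.TestCase;

	// Hypothetical base class for proposed-API tests: when the flag is set,
	// a failure is logged instead of failing the I-build, so the test still
	// exists and gets counted for API coverage.
	public abstract class ProposedApiTestCase extends TestCase {

	    // e.g. run the build with -Dwtp.tolerate.proposed.api.failures=true until M4
	    private static final boolean TOLERATE_FAILURES =
	            Boolean.getBoolean("wtp.tolerate.proposed.api.failures");

	    public void runBare() throws Throwable {
	        try {
	            super.runBare(); // setUp(), the test method, tearDown()
	        } catch (Throwable failure) {
	            if (TOLERATE_FAILURES) {
	                // Report the test as passing, but note the proposed API is not there yet.
	                System.err.println("Proposed-API test not yet passing: " + getName() + " -- " + failure);
	            } else {
	                throw failure;
	            }
	        }
	    }
	}

	A proposed-API test would then extend this base class instead of TestCase directly, and the flag could simply be dropped once M4 is complete. 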
	
	Thanks for the comments Ted -- it helps to have another perspective.  And three cheers for continuous improvement through continuous iteration :) 
	
	Others' opinions welcome. 
	
	
	
"Ted Bashor" <tbashor@xxxxxxx> 
Sent by: wtp-dev-bounces@xxxxxxxxxxx 

04/06/2005 01:32 PM 

Please respond to
"General discussion of project-wide or architectural issues."




To
"General discussion of project-wide or architectural issues." <wtp-dev@xxxxxxxxxxx> 
cc
Subject
RE: [wtp-dev] Thursday I-build schedule	




		


	
	
	
	
	"Component owners must justify test errors if they want to declare it a pass despite errors."
	Is there any way we can change that policy to "Build will not be declared a success if there are any test errors"?
	
	Seems to me that either the error is critical, and the build is unstable, or it's non-critical, and the test should be commented out until there's time to fix it.  The impression one gets is that failing tests are allowed to hang out indefinitely through the milestone.
	
	This "successful with errors" state is really inconvenient for someone trying to determine whether to adopt the build.  Ideally one should be able to look at the download site and see a green check or red X and get a general sense of stability.  Obviously there are going to be bugs and unfinished work in an integration build, but I think we really need a binary thumbs up or thumbs down that doesn't require digging through email.  
	
	If people really don't want to comment out tests, we could have two status flags, one for "no test errors" and one for "test errors but declared successful", but to me that indicates the need for separate test suites.
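	As a rough sketch of what separate suites might look like, in JUnit 3.x style (the suite names and the commented-out test classes are invented for illustration, not existing WTP suites):

	import junit.framework.Test;
	import junit.framework.TestSuite;

	// Run by the builder to decide the green check / red X on the download site.
	public class CriticalTests {
	    public static Test suite() {
	        TestSuite suite = new TestSuite("Must-pass tests: any failure marks the I-build as failed");
	        // suite.addTestSuite(SomeImplementedFunctionTest.class); // each component adds its implemented-function tests
	        return suite;
	    }
	}

	// Run only for the API coverage report; failures here would not block the build.
	class ProposedApiCoverageTests { // public, in its own source file
	    public static Test suite() {
	        TestSuite suite = new TestSuite("Proposed-API tests: counted for coverage, allowed to fail before M4");
	        // suite.addTestSuite(SomeProposedApiTest.class); // tests written in advance of the implementation
	        return suite;
	    }
	}

	The builder could then report one flag per suite, and nobody would have to dig through email to judge stability.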
	
	Another recommendation is that test results start going out with the first candidate build (that builds successfully).  I know that's more spam to the mailing list, but I think it will give us a better shot at achieving a clean I-build sometime between Tues and Thurs.
	
	Thanks, Ted
	
	
	              -----Original Message----- 
	              From: wtp-dev-bounces@xxxxxxxxxxx on behalf of David M Williams 
	              Sent: Wed 4/6/2005 9:35 AM 
	              To: wtp-dev@xxxxxxxxxxx 
	              Cc: 
	              Subject: [wtp-dev] Thursday I-build schedule
	              
	              
	
	              Thanks goes to Ozgur for confirming. And thinking generally that if I'm confused others might be (which ... I know ... is seldom the case :) I thought I'd re-post the I-build schedule for Thursday. And, now that most of us have gone through daylight-savings-time change, I will even do the math for EDT: 
	                      midnight, 11:30 AM, 6:30 PM. 
	              
	              And, BTW, no need to wait until asked for the "Component owners must justify test errors" part. Let us all know the status of failed tests here on wtp-dev. 
	              
	              Thanks, 
	              
	              = = = = =  original post = = = = = 
	              Final-Thursday:
	              Builds are started at 4:00 AM GMT, 15:30 GMT, and 22:30 GMT
	              Failure Policy:
	              Rebuild until success.  (Success will be achieved when there are no compile/testing errors.)  Component owners must justify test errors if they want to declare it a pass despite errors.  All other unfixed errors will be the source of public humiliation. You must fix your errors.  This becomes the top priority on Thursdays.
	
	_______________________________________________
	wtp-dev mailing list
	wtp-dev@xxxxxxxxxxx
	https://dev.eclipse.org/mailman/listinfo/wtp-dev
	
	

