Bug 365859 - [compiler][null] distinguish warnings based on flow analysis vs. null annotations
Status: VERIFIED FIXED
Alias: None
Product: JDT
Classification: Eclipse Project
Component: Core
Version: 3.8
Hardware: Other Linux
Importance: P3 normal
Target Milestone: 3.8 M6
Assignee: Stephan Herrmann CLA
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2011-12-07 05:02 EST by Stephan Herrmann CLA
Modified: 2012-03-14 09:02 EDT
CC List: 6 users

See Also:
amj87.iitr: review+


Attachments
Fix (21.58 KB, patch)
2012-02-23 09:34 EST, Stephan Herrmann CLA
no flags Details | Diff
fix for API names (31.47 KB, patch)
2012-03-13 12:10 EDT, Ayushman Jain CLA
no flags Details | Diff

Description Stephan Herrmann CLA 2011-12-07 05:02:26 EST
from bug 365387 comment #38
> "Redundant null check: The variable f cannot be null at this location"
> 
> Do we want to introduce a variant of this message for a @NonNull annotated
> local ? The phrase: "at this location" is redundant here as the variable 
> cannot be null at any location. (IMO it is OK either way)

Whenever a warning is directly based on a null annotation we should clearly
communicate this fact.
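
For illustration, a minimal sketch of the two situations (assuming the types from org.eclipse.jdt.annotation; all names are made up):

  import org.eclipse.jdt.annotation.NonNull;

  class Sketch {
    void flowBased(Object o) {
      if (o == null) return;
      if (o != null) { }   // redundant by flow analysis: o was checked above
    }
    void annotationBased(@NonNull Object o) {
      if (o != null) { }   // redundant by specification: o is declared @NonNull
    }
  }

Today both checks draw the same "... cannot be null at this location" message, although only the first is justified by a particular "location".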
Comment 1 Stephan Herrmann CLA 2012-02-19 07:47:00 EST
Also in this snippet
  
  void foo(@Nullable Object o) {
    bar(o);
  }
  void bar(@NonNull Object o) { }
  
we should do better. The reported message isn't actually bad:
  Type mismatch: required '@NonNull Object' but the provided value can be null
but currently this is wrongly controlled by the setting for "Potential violation of null specification", while the type error is not "potential".

This should be cleaned up by introducing one more IProblem, so that we'll have:
  IProblem.RequiredNonNullButProvidedPotentialNull // if the potential null is computed by flow analysis
  IProblem.RequiredNonNullButProvidedNullable // if the value comes from a variable specified as @Nullable
The former IProblem will remain controlled by "Potential violation of null specification", whereas the latter will be controlled by "Violation of null specification".
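
To illustrate the intended split (a sketch of the proposal, using the names introduced above):

  void potential(boolean b) {
    Object o = b ? null : new Object();
    bar(o);   // RequiredNonNullButProvidedPotentialNull: flow analysis computes "can be null"
  }
  void declared(@Nullable Object o) {
    bar(o);   // RequiredNonNullButProvidedNullable: o is specified as @Nullable
  }
  void bar(@NonNull Object o) { }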

Additionally we can rephrase the new message to something like:
   Type mismatch: required '@NonNull Object' but the provided value is declared as '@Nullable Object'

Targeting for M6 since this requires an API addition.
Comment 2 Stephan Herrmann CLA 2012-02-23 09:34:06 EST
Created attachment 211490 [details]
Fix

This patch introduces three new problem messages:

930 = Redundant null check: The variable {0} is specified as @{1}
931 = Null comparison always yields false: The variable {0} is specified as @{1}
932 = Type mismatch: required ''@{0} {1}'' but the provided value is specified as @{2}

In action:

void foo(@NonNull Object o) { if (o != null) ... 
-> Redundant null check: The variable o is specified as @NonNull

void foo(@NonNull Object o) { if (o == null) ... 
-> Null comparison always yields false: The variable o is specified as @NonNull

void bar(@NonNull Object o) {}
void foo(@Nullable Object on) { bar(on); }
-> Type mismatch: required '@NonNull Object' but the provided value is specified as @Nullable


Two comments on these messages:
- The specification may be a @NonNullByDefault at any enclosing scope.
  I'm afraid people will complain when they don't see a @NonNull declaration
  but we'd report "is declared as @NonNull". For vague reasons I feel that
  "is specified as ..." is slightly better to get them thinking about where
  this specification might be.
  If that's too weak we should probably file another bug for distinguishing
  explicit and implicit @NonNull specifications. Currently, we don't have this
  information in the analysis.

- The third message is slightly asymmetric: required '@NonNull Object' but
  is @Nullable. Again just based on vague feelings I thought being pedantic
  and repeatedly mentioning the full type ('@Nullable Object') would distract
  rather than help. OTOH, the first mention of '@NonNull Object' should help
  to justify this as a type mismatch.


Luckily all this could be implemented inside ProblemReporter, i.e., without touching the analysis proper.
Comment 3 Srikanth Sankaran CLA 2012-02-24 00:15:12 EST
Ayush, thanks for a quick once-over.
Comment 4 Ayushman Jain CLA 2012-02-25 12:42:27 EST
Patch looks good but a few small comments:
(1) IMHO, even the message for RequiredNonNullButProvidedPotentialNull should be reworded to say
Potential type mismatch: required ''@{0} {1}'' but the provided value can be null
(2) RequiredNonNullButProvidedSpecdNull should be named RequiredNonNullButProvidedSpecdNullable for more clarity
(3) The two methods isNonNull and isNullable should belong to LocalVariableBinding. We're anyway not handling fields for now (see the sketch below).
(4) nullityMismatch(..) can be simplified to not redundantly calculate the arguments and argumentsShort. :)
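
A minimal sketch of what (3) could look like (hypothetical; assuming the TagBits constants used by the null analysis, the actual patch may differ):

  // in LocalVariableBinding, since only locals are analyzed for now:
  public boolean isNonNull() {
    return (this.tagBits & TagBits.AnnotationNonNull) != 0;
  }
  public boolean isNullable() {
    return (this.tagBits & TagBits.AnnotationNullable) != 0;
  }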

Can you also check if the quick fixes for these new wordings are still good?
Comment 5 Stephan Herrmann CLA 2012-02-25 14:37:08 EST
(In reply to comment #4)
> Patch looks good but a few small comments:
> (1)IMHO, Even the message for RequiredNonNullButProvidedPotentialNull should be
> reworded to say
> Potential type mismatch: required ''@{0} {1}'' but the provided value can be
> null

I disagree :)

I mentioned this particular error message in my recent blog post:
http://blog.objectteams.org/2012/02/help-the-jdt-compiler-helping-you-3-the-essence-of-null-annotations/
under the heading "Null warnings vs. type errors"

If this doesn't convince you I'd be interested in learning which step of
my argumentation you disagree with.

> (2)RequiredNonNullButProvidedSpecdNull should be named
> RequiredNonNullButProvidedSpecdNullable for more clarity

sure.

> (3)The two methods isNonNull and isNullable should belong to
> LocalVariableBinding. We're anyway not handling fields for now.

I'll push it down and look forward to pulling it up again.

> (4)nullityMismatch(..) can be simplified  to not redundantly calculate the
> arguments and argumentsShort. :)

I didn't consider error reporting as a hot spot for performance, but you're
right, this is wasteful.
 
> Can you also check if the quick fixes for these new wordings are still good?

Which ones?
Comment 6 Deepak Azad CLA 2012-02-26 05:20:28 EST
(In reply to comment #2)
> 932 = Type mismatch: required ''@{0} {1}'' but the provided value is specified
> as @{2}

> void bar(@NonNull Object o) {}
> void foo(@Nullable Object on) { bar(on); }
> -> Type mismatch: required '@NonNull Object' but the provided value is
> specified as @Nullable

+1. This is good.

(In reply to comment #4)
> Patch looks good but a few small comments:
> (1)IMHO, Even the message for RequiredNonNullButProvidedPotentialNull should be
> reworded to say
> Potential type mismatch: required ''@{0} {1}'' but the provided value can be
> null

+1. Here, we are talking about problem 911, right? And Stephan, in your blog you are talking about problem 932, no?

"... *can* be null" indicates 'potential', but 'Type mismatch' has a ring of 'certainty' to it. Looks confusing to me..
Comment 7 Ayushman Jain CLA 2012-02-26 05:29:20 EST
(In reply to comment #5)
> (In reply to comment #4)
> > Patch looks good but a few small comments:
> > (1)IMHO, Even the message for RequiredNonNullButProvidedPotentialNull should be
> > reworded to say
> > Potential type mismatch: required ''@{0} {1}'' but the provided value can be
> > null
> 
> I disagree :)
> 
> I mentioned this particular error message in my recent blog post:
> http://blog.objectteams.org/2012/02/help-the-jdt-compiler-helping-you-3-the-essence-of-null-annotations/
> under the heading "Null warnings vs. type errors"
> 
> If this doesn't convince you I'd be interested in learning which step of
> my argumentation you disagree with.
Well, that example doesn't exactly apply to my point here. Let me explain a bit:
Before this patch, we had only one warning controlled by PotentialNullSpecViolation, which is IProblem.RequiredNonNullButProvidedPotentialNull.
This warning was "Type mismatch: required ''@{0} {1}'' but the provided value can be null" for both cases (a) and (b) below:

(a) void foo(@Nullable Object o) {
        @NonNull Object var;
        var = o;   // Type mismatch between @NonNull and @Nullable
    }
(b) void foo(boolean b) {
        Object o;
        if (b) o = null;
        else o = new Object();
        @NonNull Object var;
        var = o;   // Type mismatch between @NonNull and 'can be null' value
    }

However, with the patch the warning for (a) is now changed and controlled by NullSpecViolation instead. So, the warning in the example on your blog is also changed and I agree with that. However, for (b) it makes more sense to say "Potential type mismatch" just like we do for unknown nullness. Adding "potential" will also be consistent with the controlling option for this warning, which is PotentialNullSpecViolation.

> > Can you also check if the quick fixes for these new wordings are still good?
> 
> Which ones?
Sorry, I was under the impression that bug 337977 is fixed, but it isn't actually, so nope. :)
Comment 8 Stephan Herrmann CLA 2012-02-26 07:12:56 EST
Thanks for your comments, I think we're about to crack this communication bug.

I have the feeling the disagreement is rooted in one oversimplification that I'm guilty of: I'm using the word "potential" in situations where I probably should have invented a word that is not yet in use in the JDT. Overloading is bad, in Java and even more so in English :)

Let's talk about type checking and see what different messages we want the compiler to say:

(a.type)
  Cat cat = b ? new Cat() : new Dog();

The compiler answers: "Type mismatch: ...", good.

(a.null)
  @NonNull Cat cat = b ? new Cat() : null;

The compiler must basically answer the same as in (a.type).
If we have more to say, it should be recognizable as additional explanation ("OK, OK, I see that on one path you are right, but still that isn't enough").

(b.type)
Consider an integration of Java with a dynamic language that doesn't have static typing. Consider that the Java part could see methods from the dynamic language as using the type "dyn" to denote: we don't have a type specification for this value.

  interface to the dynamic language:
  def dyn getCat()

  in Java:
  Cat cat = getCat();

Compiler should answer: "I can't type check this assignment".
It could be right or it could be wrong, but a useful spec for getCat() is missing.
  
(b.null)
   
  legacy library:
  Cat getCat();

  annotated Java:
  @NonNull Cat cat = getCat();

Same answer as (b.type).


How can these four messages be phrased as to optimally explain what the compiler finds?

(a.type) is given.

(a.null) should emphasize the similarity to (a.type). I believe the addition "can be null" is already quite good. We could still be more explicit by saying "is not provably @NonNull", but I'm afraid that's too complicated for an error message (don't use no double negations!).

(b.type) is hypothetical in Java (although such gradual type systems exist and that's really cool stuff!!).
This means we are free to invent anything for (b.null).

(b.null)
Saying anything with "potential" seems to tell people that this situation has something to do with flow analysis. That's a bug, we should fix this. We need a wording here that says: "maybe the guy who wrote 'Cat getCat()' can prove that the return is a '@NonNull Cat', but since he doesn't use annotations he did not express this in the signature, so it could also be wrong".

While I'm writing this I think I should have borrowed from another situation, where we already support such an interface between legacy and new code:

   List getList();

   List<String> strings = getList();

=> Type safety: The expression of type List needs unchecked conversion to conform to List<String>

That's it!

At (b.null) the compiler should probably say:
=> Type safety: The expression of type Cat needs unchecked conversion to conform to '@NonNull Cat'.


Hooray, no "potential"! :)

If we can agree on these messages, we can proceed to re-check the mapping to irritants and to defaults. But first I'd like to hear if this set of messages makes sense to you:

910 = Type mismatch: required ''@{0} {1}'' but the provided value is null
911 = Type mismatch: required ''@{0} {1}'' but the provided value can be null
912 = Type safety: The expression of type ''{0}'' needs unchecked conversion to conform to ''@{1} {2}''
932 = Type mismatch: required ''@{0} {1}'' but the provided value is specified as @{2}

Examples for the above:
(910)
   @NonNull Cat cat = null;
(911)
   @NonNull Cat cat = b ? new Cat() : null;
(912)
   Cat getCat();
   ..
   @NonNull Cat cat = getCat();
(932)
   @Nullable Cat getCat();
   ..
   @NonNull Cat cat = getCat();
Comment 9 Stephan Herrmann CLA 2012-02-26 07:33:07 EST
Specific answers:

(In reply to comment #6)
> +1. Here, we are talking about problem 911, right? And Stephan in your blog you
> are talking about problem 932, no?

You're right. In my blog I was still on the verge of introducing this distinction. Sorry for the confusion.
 
> "... *can* be null" indicates 'potential', but 'Type mismatch' has a ring of
> 'certainty' to it. Looks confusing to me..

Should we omit the "can be null" part?

If, OTOH, you suggest adding 'potential' this must be paralleled by saying:
   Cat cat = b ? new Cat() : new Dog();
=> Potential type mismatch: required 'Cat' but the provided value can be 'Dog'

I don't think you want to suggest that, do you? :)


More seriously, there is indeed one more thing to learn from the actual message for the cat-or-dog example: tell what the compiler inferred.
Transferring this to nullness could yield the following message:


Type mismatch: required ''@NonNull Cat'' but the provided value is inferred to ''@Nullable Cat''

Yea, I like that, because it explains a two step process:
(1) compiler infers nullness of RHS (since it cannot prove @NonNull it assumes @Nullable)
(2) compiler compares RHS and LHS and detects a type mismatch.

Choosing one message over the other implies choosing how much we want users to think of an analysis going on inside the compiler. But it might be worth explaining this much.
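
On a concrete line the two steps read like this (a sketch, reusing the Cat example):

  @NonNull Cat cat = b ? new Cat() : null;
  // (1) the RHS cannot be proven @NonNull, so its nullness is inferred as '@Nullable Cat'
  // (2) '@Nullable Cat' does not conform to the required '@NonNull Cat': type mismatch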
Comment 10 Deepak Azad CLA 2012-02-26 07:35:29 EST
(In reply to comment #8)
> Let's talk about type checking and see what different messages we want the
> compiler to say:
> 
> (a.type)
>   Cat cat = b ? new Cat() : new Dog();
> 
> The compiler answers: "Type mismatch: ...", good.
> 
> (a.null)
>   @NonNull Cat cat : b ? new Cat() : null;
> 
> The compiler must basically answer the same as in (a.type).
> If we have more to say, it should be recognizable as additional explanation
> ("OK, OK, I see that on one path you are right, but still that isn't enough").

I agree for this specific example, if the message is changed to say "Type mismatch: required '@NonNull Cat' but the provided value is null". 

In the first case, I get "Type mismatch: cannot convert from Object to Cat". Note the absence of 'can' in the first case :-)
Comment 11 Deepak Azad CLA 2012-02-26 07:37:58 EST
(In reply to comment #9)
> Type mismatch: required ''@NonNull Cat'' but the provided value is inferred to 
> ''@Nullable Cat''

I am OK with this as well, but then it also means that this specific case is not governed by 'Potential null specification violation' preference.
Comment 12 Deepak Azad CLA 2012-02-26 07:53:46 EST
(In reply to comment #7)
> (b) void foo(boolean b) {
>         Object o;
>         if (b) o = null;
>         else o = new Object();
>         @NonNull Object var;
>         var = o;   // Type mismatch between @NonNull and 'can be null' value
>     }

Let's also go back to the above case

class Snippet{
	void foo3(boolean b) {
		Object o;
		if (b)
			o = null;
		else
			o = new Object();
		@NonNull
		Object var = o; // Problem 1
	}

	void foo4(boolean b) {
		Animal o;
		if (b)
			o = new Cat();
		else
			o = new Dog();
		Cat var = o; // Problem 2
	}
}
class Animal {}
class Cat extends Animal {}
class Dog extends Animal {}

Currently the messages are
1. Type mismatch: required '@NonNull Object' but the provided value can be null
2. Type mismatch: cannot convert from Animal to Cat

I guess we can remove 'can be' from (1) even in this case, and also move the problem out of 'Potential violation of null specification'.

- This would mean that 911 is no longer covered by 'Potential violation of null specification', correct? (Or did I miss something?)
- Also 'Potential violation of null specification' now covers only 912, right? 
If yes, can we change the default for 'Potential violation of null specification' to 'Ignore'?
Comment 13 Stephan Herrmann CLA 2012-02-26 09:00:07 EST
(In reply to comment #10)
> (In reply to comment #8)
> > Let's talk about type checking and see what different messages we want the
> > compiler to say:
> > 
> > (a.type)
> >   Cat cat = b ? new Cat() : new Dog();
> > 
> > The compiler answers: "Type mismatch: ...", good.
> > 
> > (a.null)
> >   @NonNull Cat cat : b ? new Cat() : null;
> > 
> > The compiler must basically answer the same as in (a.type).
> > If we have more to say, it should be recognizable as additional explanation
> > ("OK, OK, I see that on one path you are right, but still that isn't enough").
> 
> I agree for this specific example, if the message is changed to say "Type
> mismatch: required '@NonNull Cat' but the provided value is null". 
> 
> In the first case, I get "Type mismatch: cannot convert from Object to Cat".
> Note the absence of 'can' in the first case :-)

After also reading your comment 11 and comment 12 I'm confused whether you still propose to remove 'can' from the message. Let's try to settle this first by slightly expanding the first example, so it shows a bit more of the existing glorious details.

Look at this:
   String s = b ? new String() : new Integer(1);
=> "Type mismatch: cannot convert from Object&Serializable&Comparable<?> to String"

Oops, what's that? :)
Compiler indeed infers the type for the RHS by computing the LUB of the two branches. None of the types String, Integer, or Object is optimal. Inference can and should do better.

Transferring this to the null variant:
The most suitable type for "b ? new Cat() : null" is '@Nullable Cat', no reason to drop 'can'. The type of 'null' is not a good approximation, it isn't even a correct approximation.

(I'm not answering about irritants and options before we settle the individual messages :) )
Comment 14 Stephan Herrmann CLA 2012-02-26 09:30:52 EST
(In reply to comment #12)
> (In reply to comment #7)
> > (b) void foo(boolean b) {
> >         Object o;
> >         if (b) o = null;
> >         else o = new Object();
> >         @NonNull Object var;
> >         var = o;   // Type mismatch between @NonNull and 'can be null' value
> >     }
> 
> Let's also go back to the above case
> 
> class Snippet{
>     void foo3(boolean b) {
>         Object o;
>         if (b)
>             o = null;
>         else
>             o = new Object();
>         @NonNull
>         Object var = o; // Problem 1
>     }
> 
>     void foo4(boolean b) {
>         Animal o;
>         if (b)
>             o = new Cat();
>         else
>             o = new Dog();
>         Cat var = o; // Problem 2
>     }
> }
> class Animal {}
> class Cat extends Animal {}
> class Dog extends Animal {}

OK, what has changed compared to my example? By replacing "?:" with "if" you're forced to denote the intermediate type, i.e., you participate in the process of type inference.

For the type variant (foo4) you're lucky, you find (you created) a denotable common type. As shown in the previous comment this is not always possible. But anyway.

For the null variant (foo3) you chose a common type which the compiler should never pick: the legacy type Object means: we have no null-ness information (similar to always assuming Object as the common type in the type variant).
Indeed the compiler will infer better: combining the two branches it finds both options, so it's neither "type of null" nor "@NonNull Object" but "'potentially null' Object". The most specific denotable type to which this is assignable is "@Nullable Object".

> Currently the messages are
> 1. Type mismatch: required '@NonNull Object' but the provided value can be null

To be changed perhaps to
"Type mismatch: required '@NonNull Object' but the provided value is inferred to '@Nullable Object'

This is the exact same change as proposed for the ternary operator.
To be perfectly explicit: I'm proposing the "is inferred" wording for all cases where information from flow analysis is used in null checking.

> 2. Type mismatch: cannot convert from Animal to Cat

Since type checking is governed by the JLS we cannot carry forward any results along flow analysis, so we have to use whatever type 'o' is declared as. Type inference only happens in specific syntactic contexts. However, we should rather assimilate with the stronger analysis, where inference is admitted. Otherwise, we would have to completely abandon flow analysis for nullness (which is a form of inference).

(I'm still not answering about irritants and options before we settle the individual messages :) )
Comment 15 Ayushman Jain CLA 2012-02-26 10:35:14 EST
In thinking about this more, I had a wild idea that we should just reword the warning messages to align with the controlling options, i.e., instead of prefixing "Type mismatch", just say "Null mismatch". ;)

Applying your arguments to the warning 912, I don't understand why it says "Potential type mismatch". Even that should just say "Type mismatch" then, no?

Also, even if we agree that "Type mismatch: required ''@{0} {1}'' but the provided value can be null" is not bad in itself, I'm still worried about confusion arising out of this option being controlled by PotentialNullSpecViolation. So, while we can say that this is analogous to compiler warnings and conclude that the warning is kosher, we cannot exactly say the same about what the user will expect to see when he sets the "Potential null spec violation" option to WARNING and "Null spec violation" to ERROR. For one kind of type mismatch problem he sees an error, and for others he sees a warning, and that may be confusing.

So, even if we change the wording to "Type mismatch: required ''@NonNull Cat'' but the provided value is inferred to ''@Nullable Cat''", it will still be controlled by "Pot. null spec. violation". So it still doesn't solve the confusion. 

I'm actually seeing it in two ways: from a compiler standpoint, what you say is correct, but from a user standpoint, it seems a bit vague. Deepak and I are two users who're confused by the wording; there may be others in the future. ;)

Deepak, to clarify your comment:
(In reply to comment #12)
> I guess we can remove 'can be' from (1) even in this case. and also move the
> problem out of 'Potential violation of null specification'.
> 
> - This would mean that 911 is no longer covered by 'Potential violation of null
> specification', correct? (Or did I miss something?)
Well, even if we change the wording the controlling option will still be the same, and as I said above, that's confusing.
> - Also 'Potential violation of null specification' now covers only 912, right?
No, 912 is covered by another option "insufficient info for null analysis"
Comment 16 Deepak Azad CLA 2012-02-26 10:39:13 EST
Stephan, I thought I agreed with your proposal about wording for 911 in comment 11 :-)

However, when you describe a change in wording you should also mention what corresponding changes, if any, you have in mind for the 'problem to preference' mapping, because the whole picture needs to make sense. (We can come to the defaults in the end)
Comment 17 Deepak Azad CLA 2012-02-26 10:52:51 EST
(In reply to comment #15)
> Deepak, to clarify your comment:
> (In reply to comment #12)
> > I guess we can remove 'can be' from (1) even in this case. and also move the
> > problem out of 'Potential violation of null specification'.
> > 
> > - This would mean that 911 is no longer covered by 'Potential violation of null
> > specification', correct? (Or did I miss something?)
> Well, even if we change the wording the controlling option will still be the
> same, and as I said above, thats confusing.

I can't agree to that. Either the wording says 'potential' (or something equivalent) or the problem is not covered by 'Potential violation....'. But Stephan has already argued successfully that the problem has nothing potential about it, so 911 can no longer be covered by 'Potential violation....'. Any proposal to the contrary gets my -1.
Comment 18 Stephan Herrmann CLA 2012-02-26 11:25:10 EST
(In reply to comment #15)
> Applying your arguments to the warning 912, I don't understand why it says
> "Potential type mismatch". Even that should just say "Type mismatch" then, no?

I think we need a re-sync here.

As of the latest proposal (comment 8) 912 reads:

912 = Type safety: The expression of type ''{0}'' needs unchecked conversion to
conform to ''@{1} {2}''


The word "potential" is burnt. It causes more harm than it helps in this context. Please join me in an experiment to avoid it in everything relating to null annotations.
Comment 19 Ayushman Jain CLA 2012-02-26 11:56:45 EST
(In reply to comment #18)
> As of the latest proposal (comment 8) 912 reads:
> 
> 912 = Type safety: The expression of type ''{0}'' needs unchecked conversion to
> conform to ''@{1} {2}''
Oops, my bad! I didn't notice that this replaces the earlier message.
> 
> The word "potential" is burnt. It causes more harm than it helps in this
> context. Please join me in an experiment to avoid it in everything relating to
> null annotations.

Ok, so if we agree on that, then we need to think about associating these with either "Null spec violation" or another option replacing "Potential null spec violation". Having "potential" in the option will not help anymore.
Comment 20 Stephan Herrmann CLA 2012-02-26 12:35:26 EST
(In reply to comment #16)
> However, when you describe a change in wording you should also mention what
> corresponding changes, if any, do you have in mind for 'problem to preference'
> mapping, because the whole picture needs to make sense. (We can come to the
> defaults in the end)

OK, let's give it a try, let's pretend we had an agreement on these messages:

910
Type mismatch: required '@NonNull Cat' but the provided value is null
932
Type mismatch: required '@NonNull Cat' but the provided value is specified as @Nullable
911
Type mismatch: required '@NonNull Cat' but the provided value is inferred to '@Nullable Cat'
912
Type safety: The expression of type 'Cat' needs unchecked conversion to conform to '@NonNull Cat'

The first three are spec violations. 
912 signals a data flow from legacy to annotated.

So 912 clearly needs a category of its own.
Current label "Insufficient information for null analysis" isn't bad, maybe we can still improve. For alignment with the problem message we could say:
  "Unchecked conversion from legacy type to @NonNull type"
Also the constant COMPILER_PB_NULL_SPECIFICATION_INSUFFICIENT_INFO is OK.
Might want to reconsider the @SW token, could perhaps use "unchecked". I'm ambivalent here between "null" and "unchecked".
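
(To make the token question concrete, a sketch of how it would surface; getCat() is a made-up legacy method:)

  @SuppressWarnings("null")   // or "unchecked", exactly the token under discussion
  void accept() {
    @NonNull Cat cat = getCat();   // 912: unchecked conversion from the legacy type 'Cat'
  }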

When taking a fresh look at 910 we could actually do without a configuration option for this. If annotation-based null analysis is enabled, then this problem is an error, unconditionally (are we trusting our flow analysis that when it says "is null" this is always correct? I think this is safe.)

Coming to 932 I again don't see why this problem should ever be globally downgraded. We *might* still want to rank it as an optional error so that @SW works. I wouldn't recommend using @SW here, people should use an assertNonNull helper for conversion, but I can't predict their excuses why @SW should be OK. (@SW is *not* OK for 910 - unless we want to shield against buggy analysis).
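
(For reference, a sketch of the kind of helper meant here; this is not an existing JDT API:)

  @NonNull static Object assertNonNull(@Nullable Object value) {
    if (value == null)
      throw new IllegalStateException("must not be null");
    return value;   // after the check, flow analysis can prove the non-null return
  }

  bar(assertNonNull(maybeNull));   // explicit conversion instead of @SuppressWarnings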

This leaves us with the "mixed" case 911 (mixed because specification *and* inference is involved): This is a bit tricky, because at the point of the problem we may not be able to tell *why* inference was involved: are we trying to improve from a @Nullable value or from a legacy value.

  void test(Object legacy, @Nullable Object nullable, boolean b) {
      @NonNull Object o1 = b ? legacy : new Object();    // 912
      @NonNull Object o2 = b ? legacy : null;            // 911
      @NonNull Object o3 = b ? nullable : new Object();  // 911
      @NonNull Object o4 = b ? nullable : null;          // 911
  }

Luckily, o1 is already reported as 912, so "Insufficient information ..." is the correct classification.

What degree of control do we need for o2 - o4? From the given examples one might be tempted to treat all these the same as 932. But as soon as more complex flow analysis is involved we should account for the possibility of false positives (e.g., due to lack of correlation analysis, pot.null is not as reliable as def.null).

Ergo: This should be separately configurable. Let's try some new wording here, too, e.g.:
  "Violation of null specification on some path"
or
  "Conflict between null specification and null inference"
The second one should be unambiguous.

Let's put it together into one UI proposal:

[x] Enable annotation-based null analysis
    Insufficient information for null analysis             [E/W/I]
    Conflict between null specification and null inference [E/W/I]

Alternative wording for "Insufficient information ...":
    Unchecked conversion from legacy type to @NonNull type


The mapping would be

[x] Enable annotation-based null analysis      -> set 910 & 932 to error
    Insufficient information for null analysis             -> 912   
    Conflict between null specification and null inference -> 911


Look, Ma, no "potential" :)
Comment 21 Ayushman Jain CLA 2012-02-27 01:30:15 EST
(In reply to comment #20)
> So 912 clearly needs a category of its own.
> Current label "Insufficient information for null analysis" isn't bad, maybe we
> can still improve. For alignment with the problem message we could say:
>   "Unchecked conversion from legacy type to @NonNull type"
> Also the constant COMPILER_PB_NULL_SPECIFICATION_INSUFFICIENT_INFO is OK.
> Might want to reconsider the @SW token, could perhaps use "unchecked". I'm
> ambivalent here between "null" and "unchecked".
Agreed. "null" looks fine, to differentiate it from unchecked generic type warnings.

> When taking a fresh look at 910 we could actually spare a configuration option
> for this. If annotation-based null analysis is enabled, then this problem is an
> error, unconditionally (are we trusting our flow analysis that when it says "is
> null" this is always correct? I think this is safe.)
Even if we do trust our analysis, users may want to see warnings from the analysis and not errors. Flexibility is not bad in this case. So we can let it be. :) 

> This leaves us with the "mixed" case 911 (mixed because specification *and*
> inference is involved): This is a bit tricky, because at the point of the
> problem we may not be able to tell, *why* inference was involved: are we trying
> to improve from a @Nullable value or from a legacy value.
Yeah, that's definitely a point of contention.

> Ergo: This should be separately configurable. Let's try some new wording here,
> too, e.g.:
>   "Violation of null specification on some path"
> or
>   "Conflict between null specification and null inference"
> The second one should be unambiguous.
Umm, from a user pov these are weird. Also, Violation of null specification on some path = Potential null spec. violation :P
Ok, I know you're gonna beat me up for this, but of all the wordings we discussed, the best one seems to be the original
"Type mismatch: required ''@{0} {1}'' but the provided value can be
null" :)

However, let this also be controlled by the "null spec violation" option

> Let's put it together into one UI proposal:
> The mapping would be
> 
> [x] Enable annotation-based null analysis      -> set 910 & 932 to error
>     Insufficient information for null analysis             -> 912   
>     Conflict between null specification and null inference -> 911
My suggestion:
 [x] Enable annotation-based null analysis      
     Null spec violation  -> set 910,911 & 932 to error
     Insufficient information for null analysis (or unchecked conversion)-> 912
Comment 22 Stephan Herrmann CLA 2012-02-28 07:51:50 EST
After a brief personal communication let me try to summarize the remaining open issue(s); please let me know if I missed something relevant:

When assigning a value to a @NonNull variable we seem to agree on the top-level classification (a sketch follows the list):

- if a legacy type is involved, rank this as an unchecked conversion

- if either null or @Nullable is involved rank this as a type mismatch
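
(A sketch of this classification, with made-up parameter names:)

  void sketch(Object legacy, @Nullable Object nullable) {
    @NonNull Object a = legacy;     // unchecked conversion: a legacy type is involved
    @NonNull Object b = nullable;   // type mismatch: @Nullable is involved
    @NonNull Object c = null;       // type mismatch: null is involved
  }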

Two questions remain regarding the second:

(1) do we need to weaken the type mismatch message due to some uncertainty in the analysis? (where and when?)

(2) how much detail / explanation do we put into the messages?

Also, the exact wording, grouping into options, and defaults are still subject to discussion, but these should be derivable from answers to these questions. 
At this level, did I miss a question?
Comment 23 Stephan Herrmann CLA 2012-02-28 08:03:21 EST
(In reply to comment #22)
> (1) do we need to weaken the type mismatch message due to some uncertainty in
> the analysis? (where and when?)
> 
> (2) how much detail / explanation do we put into the messages?

Trying to narrow down the scope of (1):

If value is "null", uncertainty can only result from bugs in the analysis. If it says def.null we should be able to rely on that. If it's broken, we must fix it - short term workaround until the fix: disable null annotations.

If specified @Nullable directly flows into @NonNull I don't see any uncertainty.

If the flow analysis result says pot.null I mentioned that uncertainty might result from a legacy type and/or from behavioral differences on different branches. However, in comment 20 I concluded that the relevant case "o1" is already detected as a legacy situation ("insufficient info"), so we can precisely mention this in the problem message.
For all cases o2-o4 we are *sure* that null *can* legally occur, confirming that this is indeed a type mismatch, and contrary to a previous statement I'd like to avoid speaking of any "false positives" in this group. These are real type errors; we are just not sure that null *will* occur and produce a problem at runtime (exactly as also "normal" type errors could be correct programs in an untyped language).
While flow analysis is stronger than directly using legacy types, it is also weaker than explicit specification of each single variable. Basically, we're making flow analysis the weakest link in the chain, and considering, e.g., the lack of correlation analysis we know that in some situations users expect the analysis to be smarter than it is. Thus it should be relevant to tell the user when flow analysis was involved in detecting a type error.

For that reason I think that weaker wording is required for o2 & o3 and only for these. o4 is borderline, it may remain in this group, but could perhaps be improved towards certainty and a stronger message (since we're only uncertain whether null or @Nullable) -> Future work.


Regarding (2) we already have good results for legacy, null, and @Nullable situations.

For the remaining flow-based problems we have proposals to be rather unspecific, e.g., by saying "can be null", without giving any reason why null can occur or why this is only a "can be". I proposed as an alternative to make the inference explicit (assuming we can equate null flow analysis with null type inference). At this point the difference appears to be mostly a matter of taste, so we might want to try which one better supports the last step: finding labels for the UI.

---
As the next step, I'd like to narrow down which configurable options we need.

Considering the situations of assigning either null or @Nullable to @NonNull, let me ask whether anybody has substantial reasons for downgrading these from error to warning or even ignore. Otherwise we don't need a user preference here.

I think we agree that "insufficient info" (whatever wording) deserves an option of its own.

Does anybody see reasons (means?) to further split the flow-based problems?

---
Finally, for the UI, I'm having difficulty finding a label that clearly relates to a "can be null" message. Any suggestions? (please don't suggest "Potential type mismatch", we're only searching for details at the RHS of "Type mismatch:..").

I do think the word "infer"/"inference" could suitably create this link.
Comment 24 Stephan Herrmann CLA 2012-02-28 08:10:56 EST
(In reply to comment #23)
> Considering the situations of assigning either null or @Nullable to @NonNull,
> let me call out, whether anybody has substantial reasons for downgrading these
> from error to warning or even ignore? Otherwise we don't need a user preference
> here.

Obviously, there always is a way out: if you don't want to see these errors the natural action should be to disable annotation based null analysis :)
Comment 25 Stephan Herrmann CLA 2012-02-28 12:13:19 EST
This bug now has a UI sibling: Bug 372768
Comment 26 Ayushman Jain CLA 2012-02-28 13:18:19 EST
(In reply to comment #24)
> (In reply to comment #23)
> > Considering the situations of assigning either null or @Nullable to @NonNull,
> > let me call out, whether anybody has substantial reasons for downgrading these
> > from error to warning or even ignore? Otherwise we don't need a user preference
> > here.
> 
> Obviously, there always is a way out: if you don't want to see these errors the
> natural action should be to disable annotation based null analysis :)
I agree with your analysis mostly. Here's one thing we can consider as a case for someone to downgrade to warning:
I realized this recently when Deepak changed some dependency versions in JDT/UI, and on pulling the jdt.ui repo, the project started giving access restriction errors in my workspace. The only way to get rid of them is by updating to a newer build. Until then, I cannot even launch a runtime workbench without closing the UI project.
Similarly, suppose a user sees a false positive because of a bug - he now sees a prickly error which may be beyond his control (maybe a reference to something in an annotated library). The only way to get rid of the error may be to change the annotation in his code to, say, @NonNull from @Nullable. However, that would cause a flood of additional errors and he doesn't want that. He makes a note of this false positive and files a bug. Meanwhile, he downgrades to "warning" so that he can enjoy the convenience of the analysis while he's made a note of the false positive.
Now in the model you suggest, the user does not have such flexibility. Because of a bad error, he cannot even launch a runtime workbench. It's not a compiler error as such, so it's a bit irritating. That's why I prefer the flexibility that a configurable option brings for non-fatal, static-analysis kind of errors.
Comment 27 Stephan Herrmann CLA 2012-02-28 14:17:53 EST
(In reply to comment #26)
> (In reply to comment #24)
> > Obviously, there always is a way out: if you don't want to see these errors
> the
> > natural action should be to disable annotation based null analysis :)
> I agree with your analysis mostly. Here's one thing we can consider as a case
> for someone to downgrade to warning:
> I realized this recently when Deepak changed some dependency versions in JDT/UI,
> and on pulling the jdt.ui repo, the project started giving access restriction
> errors in my workspace.

sounds familiar :)

> The only way to get rid of them is by updating to a
> newer build. Until then, I cannot even launch a runtime workbench without
> closing the UI project.
> Similarly, suppose a user sees a false positive because of a bug - he now sees a
> prickly error which may be beyond his control (May be a reference to something
> in an annotated library). The only way to get rid of the error may be to change
> the annotation in his code to, say, @NonNull from @Nullable. However, that would
> cause a flood of additional errors and he doesn't want that.

Sure, changing the specification to account for a compiler bug is a dead end.
However, I'm sure there will be workarounds with explicit null checks and intermediate local variables that will do a better job than changing the annotation.
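
(A sketch of such a workaround, with made-up names:)

  @Nullable Object value = lib.get();   // assume a false positive is reported here
  if (value != null) {                  // explicit null check
    @NonNull Object checked = value;    // now justified by flow analysis
    use(checked);
  }
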
But let's still assume that for some obscure reason an additional null check cannot be used.

> He makes a note of
> this false positive and files a bug. Meanwhile, he downgrades to "warning" so
> that he can enjoy the convenience of the analysis while he's made a note of the
> false positive.
> Now in the model you suggest, the user does not have such a flexibility. Because
> of a bad error, he cannot even launch a runtime workbench.

That'd be horrible, sure!

But, in this (hopefully very rare) case, is it really unacceptable to disable null annotation checking for the 120 minutes until we provide a corrected JDT/Core? :}

> Its not a compiler error as such, so its a bit irritating. Thats why, I prefer the flexibility that
> a configurable option brings for non-fatal static analysis kind of errors.

It's a question of balancing: how many minutes will a user be in this mode on average?
We don't know but we hope to get a *very* low number. 
It's only in case of an implementation bug in the null analysis, call it a compiler bug or something else, if it occurs it's our bug.
Balance that against: how many times will users look at this option (while searching for something else) and be puzzled about when it should be used? Or worse: be tempted to use it when they shouldn't!

Is this kind of implementation bug more likely than other implementation bugs?
Maybe by a slight offset, but there's already a way out; must it be a smooth six-lane highway, or could a few bumps be acceptable?

Normal compiler configuration options should be intended for accommodating different styles of programming, and not be there just as a back door for a compiler bug.
And the back door is already there.

Am I missing anything?
Comment 28 Ayushman Jain CLA 2012-03-05 08:01:27 EST
(In reply to comment #27)
> But, in this (hopefully very rare) case, is it really unacceptable to disable
> null annotation checking for the 120 minutes until we provide a corrected
> JDT/Core? :}
Yes, as of now. Can't predict the future though :)

> Normal compiler configuration options should be intended for accommodating
> different styles of programming, and not be there just as a back door for a
> compiler bug.
> And the back door is already there.
Yeah, I did not intend to have the configuration options as a safety net for the user to escape a compiler bug. I guess the argument for whether we should have configurable options for this case should be the same as whether we should have it for the other options on the errors/warnings page. Comment 26 is just an example to illustrate one case where a user may want to use 'warning'. There may be other cases, or it may just be a matter of taste. How about a master option "Use annotations for null analysis [E/W/I]" and have more enable/disable options for the other three sub-configurations we're proposing?

I think it's high time we take a call on this and take it to completion. M6 is around the corner. :)
Comment 29 Stephan Herrmann CLA 2012-03-05 16:35:19 EST
(In reply to comment #28)
> I guess the argument for whether we should
> have configurable options for this case should be the same as whether we 
> should have it for the other options on the errors/warnings page. 

This tells me you don't agree that some errors are different from all those
we have on the preference page. But I still don't know where I lost you
when I argued that "910" & "932" errors are type errors and that ignoring
some type errors makes the whole concept of type checking pretty pointless.
Following this argumentation 910/932 errors aren't normal optional errors.
Can't you give me a hint, on where you see a hole in this argumentation?

> There may be other cases,

Maybe not. I'm waiting to be convinced by a strong example.


> How about a master option
> "Use annotations for null analysis [E/W/I]" and have more enable/disable
> options for the other three sub-configurations we're proposing?

That means, all irritants will get the same [E/W/I] level?
That sounds like sacrificing *useful flexibility*.
Why should 912 be forced to the same level as 910?
Comment 30 Srikanth Sankaran CLA 2012-03-06 00:51:09 EST
Satyam, could you study this issue and come up with a recommendation on
what should happen for M6? (3.8?) We also have to have a clear picture
of what UI changes are required - we have a standing request to minimize
load on them as much as possible since they are under severe constraints.
Comment 31 Ayushman Jain CLA 2012-03-06 01:23:25 EST
(In reply to comment #29)
> (In reply to comment #28)
> > I guess the argument for whether we should
> > have configurable options for this case should be the same as whether we 
> > should have it for the other options on the errors/warnings page. 
> 
> This tells me, you don't agree that some errors are different from all those
> we have on the preference page. But I still don't know where I lost you
> when I argued that "910" & "932" errors are type errors and that ignoring
> some type errors makes the whole concept of type checking pretty pointless.
> Following this argumentation 910/932 errors aren't normal optional errors.
> Can't you give me a hint, on where you see a hole in this argumentation?
The only difference I see is that these errors are not compiler errors per se. I know that with null annotations, these are indeed very legit errors and the user will never want to set them to "ignore". But perhaps error/warning. :)
Ok, since I can't come up with any example of when the user may want to downgrade to warning, I raise the white flag. :)
Comment 32 Ayushman Jain CLA 2012-03-06 01:29:36 EST
(In reply to comment #31)
> The only difference
I didn't mean the difference between these and other options on the page. I mean even if null-annotation errors are different from the other options, they're also different from standard compiler errors. This is just a clarification. I'm OK with removing the configuration option.
Comment 33 Stephan Herrmann CLA 2012-03-06 07:59:05 EST
(In reply to comment #30)
> Satyam, could you study this issue and come up with a recommendation on
> what should happen for M6 ? (3.8 ?). 

Satyam, let me try to summarize from my pov as input for your recommendation:

I think we all agreed on the implementation changes in comment 2.

For any additional changes regarding messages / categories / UI etc., my proposal is as detailed in bug 372768 (against JDT/UI).

After Ayush 'raised the white flag' I don't see any active objections, but we'd probably need a +1 also from JDT/UI.

Should a compromise be needed, I primarily see these options:
- further polish the wording (UI labels as well as compiler messages)
- retain configurability for definite null spec violations.
  From a technical and also usability pov I'm arguing against this, but
  perhaps a management perspective gives good reasons which I didn't see.

> We also have to have a clear picture
> of what the UI changes required are - we have a standing request to minimize
> load on them as much as possible since they are under severe constraints.

The request in bug 372768 amounts to removing one option and changing the labels for two others (plus a small follow-up in bug 371968). The main effort would go into discussing the approach; implementation is trivial, I suppose.

HTH
Comment 34 Satyam Kandula CLA 2012-03-07 05:04:19 EST
Thanks Stephan for the summary. Considering the state, these comments should probably go in bug 372768. However, I am writing here as this has more context.

As I understand it, the current state is that we still need to finalize or agree on
 - Exact UI option and messages
 - Removal of the option 'Violation of null specification'

1. Proposal to change 'Potential violation of null specification':

Your proposal:
(UI)Conflict between null specification and null inference [E/W/I]
(CM)Type mismatch: required '@NonNull Cat' but the provided value is inferred
to'@Nullable Cat'

My comments:
The compiler message looks fine to me, but the UI message is confusing. If I look only at the UI message, I will be confused between 'Violation of null specification' and 'Conflict between null specification and null inference'. What is the difference between the two? Somebody has to read some other doc to understand the differentiation. Ideally it would be good to be able to find it from the message itself. I have to confess that I didn't understand the complete difference when 'Potential' was used either, but I think it is better than the proposed one.
I understand the reason that you need to modify the UI message in the first place - it is probably to have some kind of an easy user mapping between the UI message and the compiler message. I also agree that the proposed compiler message is better than the previous compiler message.
I also agree that 'Potential Type mismatch' doesn't look right. I am not satisfied with Ayush's other proposal to have 'Null mismatch'. I could think of 'Null specification violation'. It is a big name, but I like it otherwise. I will be happy if we could shorten this. 'Null violation' doesn't sound correct either.

2. Proposal to change 'Insufficient information for null analysis'
(UI)Unchecked conversion from legacy type to @NonNull type  [E/W/I]
(CM) Type safety: The expression of type 'Cat' needs unchecked conversion to
conform to '@NonNull Cat'

Both these look fine except for the word 'legacy'. 'legacy' could mean many things and I think we need a better word. I don't have a better word either. 

3. Removal of the option 'Violation of null specification'
From a static checking tool or an annotation processor pov, this is a clear error and configuration probably need not be specified. Things are becoming tricky when we add this to the compiler. You could argue that 'turning off the annotation' is the right way to go in those cases when one doesn't want to have these errors, but that alone cannot justify the removal.
Stephan, you were saying that there will be wrong diagnostics if this is not an error by default. Can you please elaborate?
Comment 35 Satyam Kandula CLA 2012-03-07 10:32:30 EST
Deepak had pointed me to bug 365763 which talks about why 'Type mismatch' is correct. Now if 'Potential Type mismatch' is incorrect and yes, it is really a potential 'error', then probably the term 'Type mismatch' is itself incorrect. I think this argument is wrong, but it forced me to think more about this term, and here are some of my findings/thoughts.

A null annotation can be seen as a type qualifier. Should a change in the type qualifier be termed a type mismatch? I don't know, but here is an argument. According to the JSR 308 spec, an annotated type can be considered a subtype or supertype of the original type. So, differently annotated types probably are not in the hierarchy and hence could classify as a 'Type Mismatch'. However, if one is not annotated, the annotated type is a sub- or supertype of it, and this is not really a 'Type Mismatch'! I agree that an annotation processor could think otherwise, but assuming not, an obvious violation is a 'Type mismatch' and an unobvious violation is not a 'Type mismatch'. It is not even a 'Potential Type mismatch'. So, my statement in the first paragraph is wrong.
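
To illustrate that reading, a sketch under the assumption '@NonNull Cat' <: 'Cat' <: '@Nullable Cat' (one possible interpretation, not mandated by the spec):

  @NonNull Cat  a = new Cat();
  Cat           b = a;    // un-annotated type: within the hierarchy, no mismatch
  @Nullable Cat c = b;    // still within the hierarchy
  @NonNull Cat  d = c;    // @Nullable vs. @NonNull: the reported violation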

I tried to look at the 'null' error messages of the Checker Framework - http://code.google.com/p/checker-framework/source/browse/checkers/src/checkers/nullness/messages.properties. 'Type mismatch' is not used.

As I see from bug 365763, there are no reservations about using this, but at the same time it wasn't considered to be the best way.

I don't have any other arguments to put in, but considering the above points, I think we should not use the term 'Type mismatch'.
Comment 36 Stephan Herrmann CLA 2012-03-07 15:57:07 EST
(In reply to comment #34)
> Stephan, you were telling that there will be wrong diagnostics if not turned
> error by default. Can you please elaborate?

Sure. Assume a user sets preferences like this (using labels as of M5):

  Null Pointer Access : Error
  Redundant Null Check: Warning
  [x] Enable annotation-based null analysis
    Violation of null specification:           Warning
    Potential violation of null specification: Warning
[x] Treat above errors like fatal errors ...

Now this method

	public static void main(String... args) {
		Object val = null;
		Object o2 = null;
		if (val == null) {
			o2 = "It works";
		}
		System.out.println(o2.toString());
	}

compiles with a (suppressable) warning, and can be run to print "It works".

Add one annotation:

	public static void main(String... args) {
		@NonNull Object val = null;
		Object o2 = null;
		if (val == null) {
			o2 = "It works";
		}
		System.out.println(o2.toString());
	}

Now you get one error: the last statement is incorrectly flagged as:
  "Null pointer access: The variable o2 can only be null at this location"
This is because analysis now "believes" that the then-branch is dead code.

The program cannot be run anymore, which is OK, but the error tells the user to change the wrong location.

By ignoring the warning against the first statement (we may even think of @SuppressWarnings("null") for the entire method) we don't see the root cause. Moreover, the compiler will use the wrong assumption that val is nonnull and this mistake can arbitrarily propagate into all subsequent analysis.
Comment 37 Stephan Herrmann CLA 2012-03-07 16:36:44 EST
(In reply to comment #35)
> I tried to look at the 'null' error messages of Checker's framework -
> http://code.google.com/p/checker-framework/source/browse/checkers/src/checkers/nullness/messages.properties.
> 'Type mismatch' is not used. 

This is easily explained: the type error messages seem to be taken from the core of the Checker Framework, rather than the file you cited:

import checkers.nullness.quals.*;
public class T {
	void test() {
		@Nullable Object o1 = null;
		@NonNull Object o2 = o1;
	}
}

gives:

T.java:5: error: incompatible types.
                @NonNull Object o2 = o1;
                                     ^
  found   : @Nullable Object
  required: @NonNull Object
1 error

Bottom line: they're re-using the exact same message as for other type errors.
Comment 38 Stephan Herrmann CLA 2012-03-11 18:41:46 EDT
Released for 3.8 M6 via commit f59cd62ee82097eb13f8e87f8fbab8b81f747c29

I've addressed issues (2)-(4) from comment 4.

Compiler messages are in line with bug 372768 comment 9.
Comment 39 Ayushman Jain CLA 2012-03-12 10:08:45 EDT
Reopening to work on bug 372768 comment 15. I'll take care of that.
Comment 40 Stephan Herrmann CLA 2012-03-12 15:14:14 EDT
(In reply to comment #39)
> Reopening to work on bug 372768 comment 15. I'll take care of that

While we're at it: also the doc text needs an update to account for RequiredNonNullButProvidedSpecdNullable.
I propose the following paragraph:

	 * <p>Depending on this option, the compiler will issue either an error or a warning
	 *    whenever one of the following situations is detected:
	 *    <ol>
	 *    <li>A method declared with a nonnull annotation returns a <em>nullable</em> expression.</li>
	 *    <li>A <em>nullable</em> expression is passed
	 *        as an argument in a method call where the corresponding parameter of the called
	 *        method is declared with a nonnull annotation.</li>
	 *    <li>A <em>nullable</em> expression is assigned
	 *        to a local variable that is declared with a nonnull annotation.</li>
	 *    <li>A method that overrides an inherited method declared with a nonnull annotation
	 *        tries to relax that contract by specifying a nullable annotation
	 *        (prohibition of contravariant return).</li>
	 *    <li>A method that overrides an inherited method which has a nullable declaration
	 *        for at least one of its parameters, tries to tighten that null contract by
	 *        specifying a nonnull annotation for its corresponding parameter
	 *        (prohibition of covariant parameters).</li>
	 *    </ol>
	 *    In the above an expression is considered as <em>nullable</em> if either it is statically
	 *    known to evaluate to the value <code>null</code>, or if it is declared with a nullable annotation.
	 * </p>
   ...
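
For illustration, a sketch of cases (4) and (5) from the list above (made-up classes):

  class LibrarySuper {
    @NonNull Object produce() { return this; }
    void consume(@Nullable Object o) { }
  }
  class Client extends LibrarySuper {
    @Nullable Object produce() { return null; }   // (4): relaxes the return contract
    void consume(@NonNull Object o) { }           // (5): tightens the parameter contract
  }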
Comment 41 Ayushman Jain CLA 2012-03-12 15:24:47 EDT
(In reply to comment #40)
> (In reply to comment #39)
> > Reopening to work on bug 372768 comment 15. I'll take care of that
> 
> While we're at it: also the doc text needs an update to account for
> RequiredNonNullButProvidedSpecdNullable.
You mean for JavaCore.COMPILER_PB_NULL_ANNOTATION_REFERENCE_CONFLICT?
Comment 42 Stephan Herrmann CLA 2012-03-12 15:46:35 EDT
(In reply to comment #41)
> (In reply to comment #40)
> > (In reply to comment #39)
> > > Reopening to work on bug 372768 comment 15. I'll take care of that
> >
> > While we're at it: also the doc text needs an update to account for
> > RequiredNonNullButProvidedSpecdNullable.
> You mean for JavaCore.COMPILER_PB_NULL_ANNOTATION_REFERENCE_CONFLICT ?

I drafted the change for COMPILER_PB_NULL_SPECIFICATION_VIOLATION but it seems we're on the same page :)
Comment 43 Ayushman Jain CLA 2012-03-13 12:10:46 EDT
Created attachment 212569 [details]
fix for API names
Comment 44 Ayushman Jain CLA 2012-03-13 13:38:35 EDT
(In reply to comment #43)
> Created attachment 212569 [details]
> fix for API names

Released in jdt/core master via commit d00b9ad70dbedf29a635b964d926113aa73c3dfc
Comment 46 Satyam Kandula CLA 2012-03-14 09:02:25 EDT
Verified for 3.8M6 using build I20120313-2000