From bug 365387 comment 47, I learned that the compiler creates synthetic @NonNull annotations if an enclosing element is annotated with @NonNullByDefault. This makes the Eclipse compiler incompatible with a standard Java compiler, since the generated class files are not equivalent any more. We should not deviate unless absolutely necessary.
I think we'd discussed this point already during the review, and the alternative strategy proposed in bug 365531 should cover this. (Basically I had suggested that null annotations should work just like @Override, which becomes effective for all enclosed classes and methods without actually generating synthetic annotations.) Stephan, feel free to close as a dup.
(In reply to comment #1)
> (Basically I had suggested that null annotations should work just like
> @Override, which becomes effective for all enclosed classes and methods
> without actually generating synthetic annotations)

For the record: you mean @Deprecated. Anyway, these are two related but distinguishable issues: bug 365531 starts out from how information is internally encoded during compilation. The question there is basically about TypeBindings vs. tagBits. One of the questions is whether resolving the NonNull annotation is needed if the code makes no mention of it. (This could eventually affect the class files, too, but we're not there yet.) This current bug is clearly about the class files.

Let me explain my initial motivation for generating these synthetic annotations: Performance. The lookup of nullness defaults is more involved than the case of @Deprecated for two reasons:
- it affects everything from each single method parameter to methods, to types, to packages, to project settings;
- in addition to plain outside-in propagation we also have cancellation.

I surely didn't want to penalize each reading of a binary type with sophisticated computations. While some issues could possibly be solved with some diligence, I currently don't see how we could integrate a project-wide (or even workspace-wide) non-null default without synthetic annotations. Any ideas? Otherwise this might already hit the clause "unless absolutely necessary".

As to the risks involved, it would take an additional nullness checker to potentially produce different results w/ vs. w/o synthetic annotations (no other tool should care about these).

(1) Another nullness checker reads annotations from class files into which the JDT compiler generated synthetic annotations: the checker will see a binary that syntactically differs from the source code but is semantically equivalent.
If the other nullness checker has no notion of @NonNullByDefault, *only* the version with synthetic annotations will give a full representation of the intended semantics.

(2) The JDT compiler consumes class files from javac (other checkers are typically not part of a compiler): if our compiler relies on seeing synthetic annotations, it will not see any defaults applying to those class files. We will fall back to seeing lots of signatures without nullness specification, i.e., with javac in the loop we're back in legacy mode. So, when consuming class files from javac for which defaults should apply, our compiler may raise more errors/warnings than when consuming its own class files. In this case these errors/warnings would need to be downgraded to warning/ignore in order to compile.

What conclusion should we draw from these?

PS: why did you mark this bug as blocking bug 366007?
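The @Deprecated analogy from the previous comment can be observed directly with reflection. A minimal sketch (class and method names are hypothetical): the annotation is present only on the enclosing class, yet the compiler still reports uses of the enclosed method as deprecated, without synthesizing an annotation onto it.

```java
import java.lang.reflect.Method;

public class DeprecationDemo {

    @Deprecated
    static class Legacy {
        // No @Deprecated annotation is written into the class file for this
        // method; compilers derive its deprecated status from the enclosing
        // class at lookup time.
        void helper() {}
    }

    static boolean classIsDeprecated() {
        return Legacy.class.isAnnotationPresent(Deprecated.class);
    }

    static boolean methodIsDeprecated() {
        try {
            Method m = Legacy.class.getDeclaredMethod("helper");
            return m.isAnnotationPresent(Deprecated.class);
        } catch (NoSuchMethodException e) {
            throw new AssertionError(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(classIsDeprecated());  // true: annotation on the class
        System.out.println(methodIsDeprecated()); // false: nothing synthesized on members
    }
}
```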
(In reply to comment #0)
> This makes the Eclipse compiler incompatible with a standard Java compiler,
> since the generated class files are not equivalent any more. We should not
> deviate unless absolutely necessary.

I agree with this goal. So we did absolutely the right thing in disqualifying the earlier proposal around COMPILER_DEFAULT_IMPORT_NULL_ANNOTATION_TYPES (see https://bugs.eclipse.org/bugs/show_bug.cgi?id=186342#c83). But in the current issue at hand I don't see how this concern comes into play.

- These annotations are propagated into individual annotatable entities in binaries only when the annotation-based null analysis is on.
- Toggling the analysis on/off would trigger a rebuild, so what is persisted in class files reflects the current setting.
- Annotations are *the* linguistic machinery in Java for doing extra-lingual work. Every Java compiler is required to accept non-predefined annotations in source and binaries and, other than processing the meta-annotations in the former case, to consume them gracefully and not choke on them.
- Annotation processors are not expected to choke on "alien" annotations.
- What is persisted follows the language-recommended reverse-domain-name package naming convention to avoid namespace collision/pollution.
- Another way of looking at this is that the transformations are applied not by the compiler, but by the annotation processor for the null family of annotations (though by design annotation processors are presented only a read-only view of the sources, so this argument is not exactly true).
- So the only party who could be influenced by this decision is another null annotation processor that recognizes org.eclipse.jdt.annotation.*.

> since the generated class files are not equivalent any more. We should not
> deviate unless absolutely necessary.
So I see this case falling in the same class as our generated instruction stream not being byte-identical, or local variable ranges or exception tables looking different.

(In reply to comment #2)
[...]
> Let me explain my initial motivation for generating these synthetic
> annotations: Performance. The lookup of nullness defaults is more involved
> than the case of @Deprecated for two reasons:

See also https://bugs.eclipse.org/bugs/show_bug.cgi?id=365387#c25. The master switch is not truly "global"; for example, we should not impose a NonNull contract on methods coming from JRE classes. If annotations are not synthesized and propagated but are to be materialized out of thin air from binaries, this distinction must be maintained. (I don't claim that it cannot be done, or done easily.)

> As to the risks involved, it would take an additional nullness checker
> to potentially produce different results w/ vs. w/o synthetic annotations
> (no other tool should care about these).

To further qualify: it would take an additional nullness checker that is equipped to understand our annotations to potentially produce different results.
I think the strongest argument in this issue is that project/workspace-wide defaults need to be persisted somehow so that dependent projects can see the effective annotations. From this I'm inclined to close as WONTFIX. Still, I see two options we might want to weigh:

(a) Store all annotations exactly as the compiler sees them after applying any defaults.
(b) Translate the global default into a @NonNullByDefault on each affected type, without adding anything to methods (future: fields).

Pro (a): for the small possibility that a third-party processor for null annotations wants to leverage our annotations, that processor will get the correct semantics without the need to interpret our @NonNullByDefault.

Potentially pro (b): for quick fixes it should be interesting to distinguish explicit annotations vs. annotations generated from a default, because some quick fixes should only offer to add new annotations, not replace existing ones with their opposite (even more relevant for clean-ups). However, I don't know ATM exactly what information the quick fixes use; depending on that, we should check how Core can provide the required distinction.

(a) is currently implemented and works fine. For realizing (b) we would probably need bug 365531, whose feasibility I haven't fully evaluated. It seems doable, but comes with some effort.

Did I miss any options? Markus, can you comment on the overall theme and also on the quick fix issue?
I am curious, what happens if Eclipse embeds its @NonNull annotations in my classes (which seems seriously wrong) and I go run my classes outside the IDE? Is my program going to blow up because it references annotations that are not on my classpath?
(In reply to comment #5)
> I am curious, what happens if Eclipse embeds its @NonNull annotations in my
> classes (which seems seriously wrong)

Why wrong? Do you see a better option? All annotations with @Retention(CLASS) or @Retention(RUNTIME) are stored in your class file as per the JLS/JVM spec. This is not specific to null annotations.

> and I go run my classes outside the IDE?
> Is my program going to blow up because it references annotations that are not
> on my classpath?

Nothing like that will happen. These annotations have @Retention(CLASS), i.e., *not* @Retention(RUNTIME), so the VM will completely ignore them during program execution. It has been designed this way specifically to allow running programs with these annotations without adding demands on your runtime classpath.
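The retention behavior described above can be verified in a few lines. A minimal sketch, where `MyNonNull` is a hypothetical stand-in for an annotation like org.eclipse.jdt.annotation.NonNull: a CLASS-retention annotation is written into the class file but is invisible to the VM, so running the code needs no annotation types on the runtime classpath.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class RetentionDemo {

    // Hypothetical stand-in for a null annotation: CLASS retention means
    // the annotation is stored in the class file but never loaded by the VM.
    @Retention(RetentionPolicy.CLASS)
    @interface MyNonNull {}

    @MyNonNull
    static String greet() {
        return "ok";
    }

    // Counts the annotations the runtime can actually see on greet().
    static int visibleAnnotationCount() {
        try {
            return RetentionDemo.class.getDeclaredMethod("greet")
                    .getAnnotations().length;
        } catch (NoSuchMethodException e) {
            throw new AssertionError(e);
        }
    }

    public static void main(String[] args) {
        // Prints 0: the VM ignores CLASS-retention annotations entirely.
        System.out.println(visibleAnnotationCount());
    }
}
```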
(In reply to comment #6)
> why wrong? Do you see a better option?

I can't make any better points than what was said before. Markus has a nice explanation that equals my sentiments:

> This makes the Eclipse compiler incompatible with a standard Java compiler,
> since the generated class files are not equivalent any more. We should not
> deviate unless absolutely necessary.

I think it is a justifiable cost to have slower analysis for null-checking, if it must be, rather than the current solution of creating synthetic annotations. It doesn't seem "right" to substitute/add annotations simply because one IDE (i.e., Eclipse) finds it convenient. I prefer compatible binaries any day of the week.
(In reply to comment #7)
> (In reply to comment #6)
> > why wrong? Do you see a better option?
>
> I can't make any better points than what was said before. Markus has a nice
> explanation that equals my sentiments:
>
> > This makes the Eclipse compiler incompatible with a standard Java compiler,
> > since the generated class files are not equivalent any more. We should not
> > deviate unless absolutely necessary.
>
> I think it is a justifiable cost to have slower analysis for null-checking, if
> it must be, rather than the current solution of creating synthetic annotations.
> It doesn't seem "right" to substitute/add annotations simply because one IDE
> (i.e., Eclipse) finds it convenient. I prefer compatible binaries any day of
> the week.

Two different compilers will hardly ever create exactly the same bytes. Please let us know what you think will break.
(In reply to comment #7)
> I think it is a justifiable cost to have slower analysis for null-checking, if
> it must be,

Forgot to answer this: it's not just performance, without generating any annotations we *cannot* support the option to specify a project or workspace wide default.
(In reply to comment #7)
> (i.e., Eclipse) finds it convenient. I prefer compatible binaries any day of
> the week.

The binaries _are_ compatible. See comment 3. The VM shouldn't care about these decorations. Nor should a compiler. Nor should an annotation processor that is not designed to process these annotations.

That said, there can be some pathological situations where ripples can occur: see for example bug 288658. If the test case in https://bugs.eclipse.org/bugs/show_bug.cgi?id=288658#c8 had been coded a bit "loosely" to use java.lang.reflect.Method.getDeclaredAnnotations() in place of java.lang.reflect.Method.getAnnotation(Class<T>), then interesting results could occur with the propagated annotation in place.

(In reply to comment #9)
> (In reply to comment #7)
> > I think it is a justifiable cost to have slower analysis for null-checking, if
> > it must be,
>
> Forgot to answer this: it's not just performance, without generating any
> annotations we *cannot* support the option to specify a project or workspace
> wide default.

Yes.

(In reply to comment #4)
> (b) Translate the global default into a @NonNullByDefault on each affected type,
> without adding anything to methods (future: fields).

This definitely looks much less invasive/intrusive while maintaining the same semantics, and so is worth a look post M5.
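The "loose" reflection pattern mentioned above can be sketched concisely. This is a hypothetical example (both annotation names are invented; `@Extra` stands in for an annotation a tool propagated into the class file beyond what the source declared): asking for a specific annotation is robust, while assumptions about the size of the full annotation list break once extra annotations appear.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

public class LooseLookupDemo {

    // Hypothetical annotations; @Extra models one that was propagated by a
    // tool in addition to what the source code declared.
    @Retention(RetentionPolicy.RUNTIME) @interface Marker {}
    @Retention(RetentionPolicy.RUNTIME) @interface Extra {}

    @Marker @Extra
    public static void target() {}

    // Robust lookup: asks for exactly the annotation the caller cares about.
    static boolean markerPresent() {
        try {
            return LooseLookupDemo.class.getMethod("target")
                    .getAnnotation(Marker.class) != null;
        } catch (NoSuchMethodException e) {
            throw new AssertionError(e);
        }
    }

    // "Loose" lookup: the full list grows when annotations are propagated.
    static int declaredCount() {
        try {
            return LooseLookupDemo.class.getMethod("target")
                    .getDeclaredAnnotations().length;
        } catch (NoSuchMethodException e) {
            throw new AssertionError(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(markerPresent()); // true
        System.out.println(declaredCount()); // 2, not the 1 a "loose" caller might assume
    }
}
```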
There are a few real problems:

a) "Pollution" of class files with many annotations that are not in source.

b) Class files compiled with a non-Eclipse compiler will not contain the synthetic annotations, so compiling against such class files will yield different results than compiling against Eclipse-generated class files. In that scenario, the class files are really incompatible.

c) Generated Javadoc won't contain the defaults, so the class files and the Javadoc will deviate.

For quick fixes / clean-ups, the storage doesn't matter, since we have all annotations available when computing the fixes, so the implementation can play the game of defaults by walking the enclosing scopes without much overhead.

But for the Javadoc hovers (bug 366007), it will make a difference. The method parameter annotations are currently fetched from Java model elements: IMethod#getParameters() and then getAnnotations(). This currently returns the synthetic annotations from class files, but doesn't return them for source elements. We could also find the defaults using IJavaElements, but I think that will be more expensive, and we will have to pay this price all the time (because every method parameter is potentially affected by defaults). So we will have to talk about caching the defaults anyway.

Whatever we do in Eclipse, we will deviate from the output of the Javadoc tool as soon as we add any defaults. Maybe we should just remove the default setting from the project, or turn it into a warning if a package or type in the project doesn't have the @NonNullByDefault annotation. That way, at least the Javadoc and the generated class files stay in sync. And then we can also avoid generating class files with inherited annotations. In the Eclipse hovers, we could still add some smartness to show inherited null annotations.

Sorry if it looks like I deviated from the main topic of this bug, but the storage and presentation of null annotations are really intertwined and must be considered together.
(In reply to comment #11)
Thanks, Markus, for your comments. And it was me who asked about the impact on the UI, so I don't see that as a deviation :)

> a) "Pollution" of class files with many annotations that are not in source.

What "environmental damage" are you thinking of? Is that a concern regarding file size?

> b) Class files compiled with a non-Eclipse compiler will not contain the
> synthetic annotations, so compiling against such class files will yield
> different results than compiling against Eclipse-generated class files. In that
> scenario, the class files are really incompatible.

Using Srikanth's POV (I think), the compiler will not behave differently; it's just the integrated annotation processor. In other words, only null-related warnings will differ; the produced class files will be strictly compatible wrt their runtime behavior (thanks, actually, to the fact that we never eliminate any code that appears to be dead based on null analysis). So, yes, if you use a compiler that doesn't support this kind of analysis, then the analysis will be incomplete. That's a tautology.

> c) Generated Javadoc won't contain the defaults, so the class files and the
> Javadoc will deviate.

Yes, that's a pity. I wish there were a mechanism to make this information show up in the Javadoc, too. I have no answer for this.

> Maybe we should just remove the default setting from the project or ...

Really?? I'm surprised to see that you are willing to sacrifice the global defaults in favor of avoiding synthetic annotations. From my recollection, global defaults were unanimously considered a necessary feature to make null annotations practical. As argued above, I'm not convinced that synthetic annotations really cause any harm that motivates such a drastic conclusion, but if that is indeed your conclusion, then how about the following solution: instead of generating synthetic annotations, just generate a custom bytecode attribute using the org.eclipse.jdt namespace.
Such bytecode attributes are explicitly designed for the purpose of storing tool-specific information in a class file. No other tool is supposed to ever look at any bytecode attributes it doesn't know of. At that point we could actually stick to storing this information at all individual locations using any representation of our choosing without the need to tell anybody about it. Again, I don't see the need but as a last resort this could make us perfectly honest without sacrificing essential functionality.
(In reply to comment #11)
> b) Class files compiled with a non-Eclipse compiler will not contain the
> synthetic annotations, so compiling against such class files will yield
> different results than compiling against Eclipse-generated class files. In that
> scenario, the class files are really incompatible.

Can you describe a case or cases where we would see the difference you are referring to? Perhaps I am overlooking some nuances altogether and categorically asserting all will be well?

On further thought, there _are_ a few cases where the Eclipse compiler behaves differently from javac when compiling against class files that reference a missing type in annotations. I have prima facie held these to be bugs on our side, but they do show such behavioral differences today. See
https://bugs.eclipse.org/bugs/show_bug.cgi?id=320965
https://bugs.eclipse.org/bugs/show_bug.cgi?id=239639
Also:
https://bugs.eclipse.org/bugs/show_bug.cgi?id=36397
https://bugs.eclipse.org/bugs/show_bug.cgi?id=283687
> > a) "Pollution" of class files with many annotations that are not in source.
>
> What "environmental damage" are you thinking of?

File size, unexpected annotations, confusion of somebody looking at the class files and comparing them to source and Javadoc.

> > b) Class files compiled with a non-Eclipse compiler will not contain the
> > synthetic annotations, so compiling against such class files will yield
> > different results than compiling against Eclipse-generated class files. In that
> > scenario, the class files are really incompatible.

To be specific, the scenario is that code with null annotations gets compiled by a non-Eclipse compiler (e.g. on an external build system), and then another project uses those pre-compiled class files as dependency. In that case, any tricks we do to add more annotations than what's in source or to add custom attributes will cause trouble, since the compiled dependency is incomplete. The only way out is to strictly adhere to the original JVM spec. Therefore my proposal to avoid custom information in class files.

> > Maybe we should just remove the default setting from the project or ...
>
> Really?? I'm surprised to see that you are willing to sacrifice the global
> defaults in favor of avoiding synthetic annotations.

Yes, I like and want to keep the global defaults, but I think they are not practical because they essentially store a flag for a compile participant, but Java doesn't support compile participants that modify existing types. Therefore my suggestion to just emit a warning if a global default is set but a package doesn't specify @NonNullByDefault. Then the source stays completely clean and self-contained, and even the Javadocs are correct. And a simple quick fix can add the missing annotation in the source.
(In reply to comment #14)
> > > b) Class files compiled with a non-Eclipse compiler will not contain the
> > > synthetic annotations, so compiling against such class files will yield
> > > different results than compiling against Eclipse-generated class files. In that
> > > scenario, the class files are really incompatible.
>
> To be specific, the scenario is that code with null annotations gets compiled
> by a non-Eclipse compiler (e.g. on an external build system), and then another
> project uses those pre-compiled class files as dependency. In that case, any
> tricks we do to add more annotations than what's in source or to add custom
> attributes will cause trouble, since the compiled dependency is incomplete.

Sorry for being particularly unimaginative; I am still going to need a bit more help/information to understand what trouble you are anticipating. Thanks.
Maybe we can avoid some unnecessary iterations in this discussion if we first agree on what exactly is to be shown: speaking of incompatibilities and troubles caused can mean quite different things. IMHO comment 5 illustrates that speaking generally of incompatible class files raises fears, e.g., of bad things happening at runtime. Specifically, do we see any of these dangers:

- A virtual machine could fail to execute our class files, or could execute them with wrong runtime behavior?
- Another compiler could fail to read and interpret class files produced by the JDT?
- The JDT compiler could fail to interpret class files produced by another compiler, to the effect that legal Java would not result in legal Java class files?

Should more tools aside from the JVM and compilers be considered? Which? What bad effects can happen?

By contrast, all unintended effects that I can see so far boil down to different ways in which our advanced null analysis could be incomplete if class files produced by another compiler are involved. This is different insofar as no aspects of a program that are under the laws of the JLS and JVM spec are affected, only the additional service of our advanced analysis. The worst case that I can see so far is: when consuming class files from another compiler, a project could fail to compile because our advanced null analysis is lacking information in those class files; but by disabling this analysis the consuming project can always be compiled without problems.

Before we discuss the severity of this problem (and possible remedies), could we please first agree whether this is indeed the worst case or whether anyone anticipates worse scenarios.
(In reply to comment #16) > Before we discuss the severity of this problem (and possible remedies), could > we please first agree whether this is indeed the worst case or whether anyone > anticipates worse scenarios. I *did* intend to place a question mark :) I mean it as an open question indeed.
(In reply to comment #16)
I don't see any run-time incompatibilities, only compile-time problems and confusion when someone (or a tool like the Eclipse Java model) inspects class files.

=> I agree about the worst-case scenario, but I'd say it's not only a problem for compilers but also for other tools that consume class files (reflective tools, the Java model).

Before this change, class files produced by javac and by ecj were functionally equivalent, i.e., for correct Java source, they produced results that only had small differences in terms of performance, but no other externally visible differences. Therefore, it was always possible to replace ecj by javac without any impact on the compilation results. With this change, this is no longer true.

Concrete scenario:
- project A uses null annotations
- project B requires A and uses null annotations as well
- project A is built independently of B, and developers of B work with a compiled JAR of A and generated Javadocs of A
- if project A is compiled using ecj, the compiled class files contain synthetic null annotations (but the Javadocs don't)
- if project A changes its build system to use javac, the class files will not contain the synthetic null annotations any more. When B uses ecj to compile against the new A.jar, it will get compile errors because the synthetic null annotations in A.jar are missing.

Hence, class files produced by the two compilers are not completely exchangeable any more.
(In reply to comment #18)
> Concrete scenario:
> - project A uses null annotations
> - project B requires A and uses null annotations as well
> - project A is built independently of B, and developers of B work with a
> compiled JAR of A and generated Javadocs of A
>
> - if project A is compiled using ecj, the compiled class files contain
> synthetic null annotations (but the Javadocs don't)
> - if project A changes its build system to use javac, the class files will not
> contain the synthetic null annotations any more. When B uses ecj to compile
> against the new A.jar, it will get compile errors because the synthetic null
> annotations in A.jar are missing.

Stephan, is the last statement strictly true? Can you construct an example? I can readily construct a test case for the scenario where the difference goes the other way: something that would not compile if A were compiled with ecj starts compiling when A is compiled with javac.

(In reply to comment #12)
> Instead of generating synthetic annotations just generate a custom bytecode
> attribute using the org.eclipse.jdt name space. Such bytecode attributes are
> explicitly designed for the purpose of storing tool-specific information in a
> class file. No other tool is supposed to ever look at any bytecode attributes
> it doesn't know of.

This would then have the same problem?
An alternative for implementing a global @NonNullByDefault would be to put this into the MANIFEST.MF. Then, also packaged JARs would contain all necessary information. This would still have the disadvantage that the default is not visible in the Javadoc, but at least the compiler wouldn't need to know any magic. The MANIFEST.MF came to my mind when I was reflecting on bug 369079 and bug 331651. Maybe we should also put the annotation names into the MANIFEST.MF, so that a JAR can be self-contained and tell the compiler what annotations it uses.
(In reply to comment #20)
> An alternative for implementing a global @NonNullByDefault would be to put this
> into the MANIFEST.MF.

Markus, would this still be an effective strategy with tools like Maven? I use Maven, but ECJ is simply where things are compiled in the IDE; my project is totally recompiled outside Eclipse at package time. BTW, the project built with Maven wouldn't have the synthetic @NonNull annotations, due to shelling out to another compiler.
(In reply to comment #21)
> Markus, would this still be an effective strategy with tools like Maven? I use
> Maven, but ECJ ...

Just a quick note on "Maven vs. ECJ": although Maven by default uses javac, not ecj, this is not hard-coded. It is not difficult to configure your poms to compile with the "jdt" compiler. So, strictly speaking, there is no "Maven vs. ECJ". In fact, for anybody interested in annotation-based null analysis I'd strongly recommend using ecj for Maven-based builds, too. Otherwise you won't see these warnings/errors on the build server.
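For reference, a minimal sketch of what "configuring your poms" could look like, using maven-compiler-plugin with the plexus-compiler-eclipse backend (the version number is illustrative; check for a current release):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Delegate compilation to ecj instead of javac -->
    <compilerId>eclipse</compilerId>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-compiler-eclipse</artifactId>
      <!-- illustrative version; pick a current release -->
      <version>2.8</version>
    </dependency>
  </dependencies>
</plugin>
```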
(In reply to comment #22)
> So, strictly speaking, there is no "Maven vs. ECJ".

Stephan, you are correct, this is not a Maven vs. ECJ argument. My apologies for any implication; that wasn't my intention. My assumption is that most people use Oracle's javac, because the build process is usually done with Hudson/Jenkins, totally outside of Eclipse's influence. In my experience, I haven't met a developer who uses ECJ for anything but the IDE. I was alluding to comment #14, where Markus said:

> To be specific, the scenario is that code with null annotations gets compiled
> by a non-Eclipse compiler (e.g. on an external build system), and then another
> project uses those pre-compiled class files as dependency. In that case, any
> tricks we do to add more annotations than what's in source or to add custom
> attributes will cause trouble, since the compiled dependency is incomplete.

With that said, I think the solution chosen for @NonNullByDefault must be portable across compilers. It shouldn't rely on ECJ to get correct diagnostics if my binaries were rebuilt by another compiler.
(In reply to comment #20)
> An alternative for implementing a global @NonNullByDefault would be to put this
> into the MANIFEST.MF. Then, also packaged JARs would contain all necessary
> information. This would still have the disadvantage that the default is not
> visible in the Javadoc, but at least the compiler wouldn't need to know any
> magic.

I like this suggestion much better than, e.g., the idea to generate lots of package-info.java files (bearing also in mind how little tested the JDT's support for those was until recently :) ).

Before you said this I was going to propose another view: projects with null annotations which OTOH are not compiled using the JDT compiler could be regarded as falling back to "legacy" status, where missing null annotations would need to be provided using the mechanisms of bug 331651. And of course we could let our compiler create those profiles as a by-product of in-IDE compilation. But maintaining the few entries inside a MANIFEST.MF is of course a lot easier.

Looking at the spec of MANIFEST.MF, I only see an indirect mention of custom entries, where it says that unknown attributes must be ignored. Is it safe to assume that we may add JDT-specific attributes to the manifest?

I'll soon start working on bug 331651. While doing so I will keep in mind the idea of connecting the different bits via the MANIFEST.MF. What actually are the options for non-OSGi projects to manage their manifests using the JDT? Would it be straightforward to transparently store information from the project preferences directly in the manifest, too? Or would management of the manifest vary too much depending on the kind of project?
Sigh! I finally made it through all the comments (sorry for being late). I agree with Markus on this topic. We shoot ourselves in the foot if we assume that the additional information is in the class file. While people might use our compiler in the IDE, they might build using javac, as already noted in this bug. Those projects will lose the information when they import/use the built JARs. Also, in the spirit of being open, we should give non-Eclipse projects which only use javac the ability to enrich their code so that it can be consumed by Eclipse projects using our null annotation analysis.
(In reply to comment #24)
> (In reply to comment #20)
> > An alternative for implementing a global @NonNullByDefault would be to put this
> > into the MANIFEST.MF. Then, also packaged JARs would contain all necessary
> > information. This would still have the disadvantage that the default is not
> > visible in the Javadoc, but at least the compiler wouldn't need to know any
> > magic.
>
> I like this suggestion much better than, e.g., the idea to generate lots of
> package-info.java files (bearing also in mind how little tested the JDT's
> support for those was until recently :) ).

Pardon my ignorance: in a scenario where some classes are compiled using javac and packaged as a JAR for consumption in other Eclipse projects, do we have control over the manifest that goes with the JAR? Always, usually, sometimes?
Another way to solve this problem is for the JDT compiler not to propagate the defaults, but for Eclipse to allow defaults to be specified at the dependency level. It would/could be clumsy to materialize this setting when handling binary types, but it can be made to work and would remove the concerns around compatibility.

Thinking aloud (not necessarily having thought it through): maybe we could invent an IClasspathAttribute that points to an XML options file? Or maybe the UI could allow someone to specify nullness defaults on the build path page? I don't know whether these introduce more headaches than they solve, or how well they would fit within the overall UI design. Proposing these for discussion nevertheless, so we can explore ways out of this deadlock.

If the dependency comes in as a project, we can already handle that. Only for JAR files do we need to think of a way.
Created attachment 210141 [details]
two projects showing that A must be compiled by ecj

(In reply to comment #19)
> > When B uses ecj to compile
> > against the new A.jar, it will get compile errors because the synthetic null
> > annotations in A.jar are missing.
>
> Can you construct an example ?

See the attached projects A and B. There's a compile error in B, which goes away when you modify
<classpathentry kind="lib" path="/A/javac_bin"/>
in B/.classpath to refer to /A/bin.

(In reply to comment #24)
> Would it be straightforward to transparently store information
> from the project preferences directly in the manifest, too?

We won't store the same option in multiple places: either it's in the manifest, or it's in the preferences. For PDE projects, the location of the manifest is well-defined, but for plain Java projects, it can really be anywhere. In that case, we would have to ask the user for the location of the manifest and tell him to include it in the build. Or we just say that we support <project>/META-INF/MANIFEST.MF and maybe the first <sourcefolder>/META-INF/MANIFEST.MF, and that's it.

External information is always a pain to deal with, so I would try to avoid solutions that require the client to fiddle with classpath attributes when adding a JAR.
(In reply to comment #26) > Pardon my ignorance. In a scenario where some classes are compiled > using javac and packaged as a jar for consumption in other eclipse > projects, do we have control over the manifest that goes with the > jar ? Always, usually, sometimes ? Corporations that use Ant or Maven in their build process won't be able to use the MANIFEST.MF solution. These tools are well known for creating the MANIFEST.MF file at packaging time.
(In reply to comment #29) > (In reply to comment #26) > > Pardon my ignorance. In a scenario where some classes are compiled > > using javac and packaged as a jar for consumption in other eclipse > > projects, do we have control over the manifest that goes with the > > jar ? Always, usually, sometimes ? > > Corporations who use Ant or Maven in their build process won't be able to use > the MANIFEST.MF solution. These tools are well-known to creating the > MANIFEST.MF file at packaging time. How about, e.g., http://maven.apache.org/shared/maven-archiver/examples/manifestFile.html :)
(In reply to comment #30) > (In reply to comment #29) > > (In reply to comment #26) > > > Pardon my ignorance. In a scenario where some classes are compiled > > > using javac and packaged as a jar for consumption in other eclipse > > > projects, do we have control over the manifest that goes with the > > > jar ? Always, usually, sometimes ? > > > > Corporations who use Ant or Maven in their build process won't be able to use > > the MANIFEST.MF solution. These tools are well-known to creating the > > MANIFEST.MF file at packaging time. > > How about, e.g., > http://maven.apache.org/shared/maven-archiver/examples/manifestFile.html > :) Okay, yes, you can change your build process... but why would you? That's my point. People who already have an established configuration/process won't go for this solution. If I want to use Eclipse's @NonNullByDefault, I am forced to use ECJ and modify my corporate process to consume a different manifest? It's too much; the solution is overdone. It's not simple enough.
(In reply to comment #31) > (In reply to comment #30) > > (In reply to comment #29) > > > (In reply to comment #26) > > > > Pardon my ignorance. In a scenario where some classes are compiled > > > > using javac and packaged as a jar for consumption in other eclipse > > > > projects, do we have control over the manifest that goes with the > > > > jar ? Always, usually, sometimes ? > > > > > > Corporations who use Ant or Maven in their build process won't be able to use > > > the MANIFEST.MF solution. These tools are well-known to creating the > > > MANIFEST.MF file at packaging time. > > > > How about, e.g., > > http://maven.apache.org/shared/maven-archiver/examples/manifestFile.html > > :) > > Okay, yes, you can change your build process... but why would you? That's my > point. People who already have an established configuration/process won't go > for this solution. If I want to use Eclipse's @NonNullByDefault, I am forced to > use ECJ and modify my corporate process to consume a different manifest? It's > too much; the solution is overdone. It's not simple enough. If you don't want to move even a millimeter, just don't use default nullness and be content. We can't do miracles, OK? If we are not allowed any location where the information about a nullness default can be stored, then what should we do?
(In reply to comment #31) > If I want to use Eclipse's @NonNullByDefault, I am forced to > use ECJ and modify my corporate process to consume a different manifest? It's > too much; the solution is overdone. It's not simple enough. To clarify: if you have followed this bug, you should know that the manifest approach is not for using @NonNullByDefault annotations but for specifying a global default. OTOH, if you haven't followed the discussion, please be more careful with your statements.
(In reply to comment #28) > Created attachment 210141 [details] > two projects showing that A must be compiled by ecj > > (In reply to comment #19) > > > When B uses ecj to compile > > > against the new A.jar, it will get compile errors because the synthetic null > > > annotations in A.jar are missing. > > > > Can you construct an example ? > > See the attached projects A and B. There's a compile error in B, which goes > away when you modify <classpathentry kind="lib" path="/A/javac_bin"/> in > B/.classpath to refer to /A/bin. OK, thanks Markus. I see this as a specific instance of the general problem of a project B expecting the annotation-processed A but instead encountering plain A. I imagine we would be able to construct similar examples outside of the null annotation context. Nevertheless, combined with the earlier feedback about teams using hybrid build setups, I agree such build breakage is a real possibility, and when that happens we can't quibble over pedantry. So I concede the point that we should not add synthetic annotations. At the moment, having considered all suggestions, the simplest one appears to be the suggestion to warn if a package in the project doesn't have the @NonNullByDefault annotation (and to augment that with quick fixes to generate package-info where necessary and/or apply the annotations). The way I see it, anyone who applies project-wide or workspace-wide defaults is in for a certain amount of tweaking work anyway, and this could be a part of it. I believe this is an issue only for classes in a jar file. If project B consumes project A as a required project, or for classes within a project in an incremental build, this should not be an issue (it may need some code changes, but that is not the major hurdle).
I quickly thought about using the package-info.java of the default package to define a project-wide default, but that doesn't work, since that file cannot be syntactically correct without a package declaration. We could also redefine the meaning of @NonNullByDefault and say that this also applies to subpackages. That's IMO still not as good as declaring it on each package, but it would avoid many package-info.java files.
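For reference, the per-package variant being weighed here is declared in a package-info.java like the following (a sketch assuming the org.eclipse.jdt.annotation.NonNullByDefault annotation; the package name p is illustrative). The mandatory package declaration is exactly why the default package cannot carry such a file:

```java
// p/package-info.java -- applies the nullness default to every type in package p.
// A default (unnamed) package cannot have this file, because the package
// declaration below is syntactically required.
@org.eclipse.jdt.annotation.NonNullByDefault
package p;
```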
(In reply to comment #35) > I quickly thought about using the package-info.java of the default package to > define a project-wide default, but that doesn't work, since that file cannot be > syntactically correct without a package declaration. I was thinking about default packages, too: since package-info.java doesn't work for default packages, here we will need to insert @NonNullByDefault into each top-level type, right? > We could also redefine the meaning of @NonNullByDefault and say that this also > applies to subpackages. That's IMO still not as good as declaring it on each > package, but it would avoid many package-info.java files. For the time being, yes, if @NonNullByDefault can be applied recursively, the burden is much smaller (with all hesitation connected to the concept of "sub-packages"). OTOH, I can see situations where a top-level package contains API where @NonNullByDefault is desired, whereas all its subpackages could be internal without such a default. I'm saying this to illustrate that *always* applying the default recursively will cause a penalty for other use cases. Thus, I suggest rethinking what parameters @NonNullByDefault should have. Currently we have one boolean: default on or default off. In the context of null annotations for fields I was thinking that for certain programming styles fields tend to be nullable but methods provide nonnull access (e.g., lazily initialized fields). So, sometimes we'd want @NonNullByDefault to apply to method signatures only, whereas in other code fields should be affected, too. Now we have the question of whether or not "sub-packages" should be affected. If we come up with a clean and powerful model for tuning @NonNullByDefault (and find an appropriate default (meta-default?)) I could actually start to like this solution. @Markus: in Ludwigsburg we deferred further parameterization to keep things simple. Is now the time to rethink this?
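The field-vs-method distinction mentioned above can be illustrated with a lazily initialized field: the field itself is legitimately null until first use, while the accessor always returns a non-null value. This is only an illustrative sketch in plain Java, with the intended nullness noted in comments rather than real JDT annotations:

```java
import java.util.ArrayList;
import java.util.List;

public class LazyCache {
    // Conceptually @Nullable: stays null until the first call to getEntries().
    private List<String> entries;

    // Conceptually @NonNull: never returns null, initializing lazily on demand.
    public List<String> getEntries() {
        if (entries == null) {
            entries = new ArrayList<>();
        }
        return entries;
    }

    public static void main(String[] args) {
        LazyCache c = new LazyCache();
        c.getEntries().add("first");
        System.out.println(c.getEntries().size()); // prints 1
    }
}
```

With a default that covers fields, the declaration of `entries` would be flagged; with a default restricted to method signatures, only `getEntries()` would carry the non-null obligation.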
Stephan, can you take a look to see how easy or hard it is (while internalizing a binary type) to look up package-info for a particular package at different levels of the subpackage structure ? Thanks.
(In reply to comment #35) > We could also redefine the meaning of @NonNullByDefault and say that this also > applies to subpackages. That's IMO still not as good as declaring it on each > package, but it would avoid many package-info.java files. Should we care about there being many package info files ? Why don't we keep it simple and opt for the most straightforward scheme ?
(In reply to comment #36) Default packages are not meant for serious programming, so having to add @NonNullByDefault to all top-level types is not an issue. For tuning @NonNullByDefault with more options, we can always add more members. If the declaration specifies a default, then old clients stay compatible, e.g. boolean includeSubpackages() default true; (In reply to comment #38) > Should we care about there being many package info files ? Why don't we > keep it simple and opt for the most straightforward scheme ? Also fine with me. It shouldn't be too hard to adjust the New Package and New Type wizards to also create the package-info.java when the "non-null by default" option is set on the project. But with the subpackage solution, we could even avoid the project-wide option and easily specify everything in code.
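To illustrate the compatibility claim: adding a member with a default value to an annotation keeps existing uses compiling unchanged, since they simply pick up the default. The sketch below uses a local stand-in annotation, not the real org.eclipse.jdt.annotation.NonNullByDefault, and reads the member back via reflection:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Stand-in for the proposed extension; not the real JDT annotation.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PACKAGE, ElementType.TYPE, ElementType.METHOD})
@interface NonNullByDefault {
    // New member with a default: old-style uses stay source-compatible.
    boolean includeSubpackages() default true;
}

@NonNullByDefault                               // old-style use, no member given
class Api {}

@NonNullByDefault(includeSubpackages = false)   // tuned use
class Internal {}

public class DefaultedMemberDemo {
    public static void main(String[] args) {
        NonNullByDefault onApi = Api.class.getAnnotation(NonNullByDefault.class);
        NonNullByDefault onInternal = Internal.class.getAnnotation(NonNullByDefault.class);
        System.out.println(onApi.includeSubpackages());      // true (the default)
        System.out.println(onInternal.includeSubpackages()); // false
    }
}
```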
(In reply to comment #39) > (In reply to comment #38) > > Should we care about there being many package info files ? Why don't we > > keep it simple and opt for the most straightforward scheme ? > > Also fine with me. It would be great if a user could sort the problems by id in the problems view, multi-select several messages and apply a quick fix at one go. Do we support such multi-fixes in other cases ?
Can we agree on the course of action before Markus goes on vacation, so that work in JDT/Core is not stalled and can still proceed ? My preference is to keep things simple and not worry about there being many package-info files. I worry that any shortcut we devise may have some as-yet-unforeseen implication. In particular, if the UI supports multi-selection and quick fix, it is a non-issue. This is not a strong preference, however - if the majority leans towards the includeSubpackages approach, I am fine with it too.
Discussed this again with Dani. The manifest approach entails many little things that will add complexity. Finding the manifest in all cases is not trivial (special cases for class folders or other projects on the classpath, etc.), it would often also need changes in external build systems, it would not be visible in generated Javadoc, and it would also need more special support in refactorings where you wouldn't expect it (e.g. moving a package from one project to another). The subpackages approach has similar limitations. We agree with Srikanth that the best solution is to keep this simple and in line with the Java language: packages are the biggest elements that can have an @NonNullByDefault annotation in the source files. We keep the per-project option, but change its meaning to "the project *intends* to use @NonNullByDefault in all packages". The only disadvantage of that approach is that for non-null-by-default projects, each package needs a package-info.java file. We can ease the pain with a few enhancements:
- Core: Show a warning when the project setting is enabled.
- UI: Have a multi-quick fix for the above warning that can fix all these warnings at once.
- UI: Toggling the project-wide default adds/removes @NonNullByDefault for all packages (remove only after asking the user).
- UI: New package/type wizards automatically add the package-info.java when a new package is created.
(In reply to comment #42) > The subpackges approach has similar limitations. Do you have an example? > - Core: Show a warning when the project setting is enabled. What element do I report the warning against? It seems most natural to report against each package lacking the annotated package-info.java, but I don't think the compiler has the infrastructure to report anything against a package. In bug 186342 I made experiments with reporting configuration problems with CAT_BUILDPATH. In bug 186342 comment 162 I reverted those experiments following a discussion in the team. This would leave me with reporting the problem against each individual compilation unit ;-P
(In reply to comment #43) > [..] In bug 186342 I made experiments with > reporting configuration problems with CAT_BUILDPATH. In bug 186342 comment 162 > I reverted those experiments following a discussion in the team. Here's a better xref: bug 363858.
(In reply to comment #43) > What element do I report the warning against? Can we report it against the package declaration of the "first" type from the package we encounter ? "Encounter" here could be in the context of full build, incremental build (we need to consider multiple waves not producing multiple warnings from the same package), reconcile operations in editor etc. > It seems most natural to report against each package lacking the annotated > package-info.java, but I don't think the compiler has the infrastructure to > report anything against a package. What would it take/mean to invent a CAT_PACKAGE ?
(In reply to comment #45) > > It seems most natural to report against each package lacking the annotated > > package-info.java, but I don't think the compiler has the infrastructure to > > report anything against a package. > > What would it take/mean to invent a CAT_PACKAGE ? We'd have to define at least two things: - What's the life cycle of the corresponding markers? This will require more stuff in the builder, I guess. - Ask UI to implement visualization and navigation. Not sure if that's all.
> - Ask UI to implement visualization and navigation. Not sure what you mean here? The annotations should go on each package. Inventing some kind of hierarchy where the annotation is inherited from a super package will not work well outside Eclipse.
(In reply to comment #47) > > - Ask UI to implement visualization and navigation. > > Not sure what you mean here? The annotations should go on each package. visualization: Does the package explorer display error markers on packages? navigation: Does the Problems view support double click on a marker that refers to a package? I don't think we have use cases for this, yet, do we?
(In reply to comment #48) > (In reply to comment #47) > > > - Ask UI to implement visualization and navigation. > > > > Not sure what you mean here? The annotations should go on each package. > > visualization: > Does the package explorer display error markers on packages? Yes. > navigation: > Does the Problems view support double click on a marker that refers to a > package? Double-click means open it in an editor, hence the answer is "no, and it won't be implemented". To select a marker in a specific view, one can use 'Show In', which also works for packages.
(In reply to comment #49) > (In reply to comment #48) > > (In reply to comment #47) > > > > - Ask UI to implement visualization and navigation. > > > > > > Not sure what you mean here? The annotations should go on each package. > > > > visualization: > > Does the package explorer display error markers on packages? > Yes. > > > > navigation: > > Does the Problems view support double click on a marker that refers to a > > package? > Double-click means open it in an editor, hence the answer is "no, and it won't > be implemented". To select a marker in a specific view, one can use 'Show In', > which also works for packages. Thanks for clarification. That means UI is done in this regard. :) Remains the issue of the markers' life cycle. I'll look into that.
This hybrid compilation model is worrisome: consider a situation where a @NonNull annotated method actually returns null but is compiled with javac and so gets by. On the Eclipse project side, the @NonNull method's return could be captured in a @NonNull local and dereferenced without a guard, such guarding code having been deleted by the user because Eclipse encouraged him/her to. I.e., the original Eclipse-side code:

class X {
    void foo() {
        @NonNull Object o = Y.goo();
        if (o != null) { // <<<--- we encourage elimination of this check
            o.toString();
        }
    }
}

Y.java (compiled with javac, or with Eclipse without null annotations turned on, and loaded into a jar which is added to the build path of the project with X.java):

class Y {
    @NonNull static Object goo() {
        return null;
    }
}

If the programmer deleted the "if (o != null)" because of our advice, he/she would be in trouble. So it is not just the question of how the annotations are persisted in the binary file that matters; we also need to know somehow whether the annotations were validated at the time the class file was generated. I don't yet have a suggestion how this may be handled, but would like to hear opinions.
Sure, the intention behind systematically using null checks exactly where needed includes the advice to eliminate unnecessary null checks. And, sure, the guarantees given by any contract are only valid if the obligations of the same contract are also checked. If Y advertises goo() as non-null this is part of the contract and the party breaking the contract acts illegally, here the programmer writing that goo() method returning null has to expect a possibly high fine. IMHO that's the nature of specifications and contracts. It will always be possible to deliberately create examples that violate a contract if the contract checker doesn't see all parts of the game. It's not our job to make NPE *impossible*, is it?
Srikanth, At second reading I'm actually not 100% sure which way your comment was heading, so another attempt at interpreting: (In reply to comment #51) > This hybrid compilation model is worrisome: Are you saying, we should advise users against using code from two different compilers in the same project?
(In reply to comment #52) > It will always be possible to deliberately create examples that violate > a contract if the contract checker doesn't see all parts of the game. Agreed. > It's not our job to make NPE *impossible*, is it? Agreed again. My point was about whether we could inadvertently encourage users to shoot themselves in the foot. And a more academic side of the question was to see if there is a way to know, when looking at a class file, whether a persisted annotation was actually acted upon by a processor and, if so, which one. It looks like a reasonably interesting, important and useful query to me. (In reply to comment #53) > Are you saying, we should advise users against using code from two different > compilers in the same project? No. Actually, after I wrote about javac and Eclipse being mixed in a hybrid model, I realized that you don't even need two compilers. ECJ itself is enough. A jar file could be loaded with classes that were annotated but compiled without annotations enabled. If this jar file is added to the build path of a project which uses null annotations, then the problem scenario could show up again. So the intent of the question is really to know whether there is a way to find out if the presence of @NonNull on a binary method constitutes a guarantee, and to recover suitably if that is possible.
So the question is: can the compiler find out whether it can trust any contract annotations it finds in the byte code of already-compiled classes? This, IMHO, is a perfectly legal use case for a custom bytecode attribute: just drop a note: This class file was generated by ecj version 3.8.0 with the following irritants checked as errors: ... If the attribute is there: go ahead, fully trust the contracts (depending on the irritants). If the attribute is not there but contracts are contained: use the contracts during null analysis, but report that the other side of the contract may not have been checked (default: error). XXL-variant: provide an additional compiler option: Handle null annotations not checked by ecj: ( ) trust ( ) ignore (o) report as error
(In reply to comment #55) > So the question is: can the compiler find out whether it can trust any contract > annotations it finds in the byte code of already-compiled classes? Yes, > This, IMHO, is a perfectly legal use case for a custom bytecode attribute: just > drop a note: > This class file was generated by ecj version 3.8.0 with the following irritants > checked as errors: ... So out of the box, there is no way to know ? > XXL-variant: provide an additional compiler option: > Handle null annotations not checked by ecj: > ( ) trust > ( ) ignore > (o) report as error This sounds like a good idea particularly since we encourage programmers to remove null checks. I'll file a follow up bug for post 3.8.
(In reply to comment #51) I find the proposed compiler option / custom bytecode attribute worrisome, and I don't think we need this. The presence of the annotations should be enough for a nullness checker to treat them as contracts. Similarly, an API method that promises to return a List<Integer> can also just return a raw List, but we still trust the API. If a build system uses javac and e.g. FindBugs to verify the @NonNull annotations, then we would generate useless warnings, although everything is fine. In a setup where not even the developers of a library use the Eclipse compiler to verify the nullness annotations, I don't think there's much justification for them to believe that the nullness annotations in their APIs are validated anywere -- so this is not a situation we should have to worry about.
(In reply to comment #57) > (In reply to comment #51) > I find the proposed compiler option / custom bytecode attribute worrisome, and > I don't think we need this. I can agree that there's no (strong) need for it, but I didn't understand what's worrisome about it. Do you care to expand, or should we just forget about it? BTW, a patch for the blocking bug 365531 is about to be released.
The proposal doesn't worry me on the technical side, but I think it hampers usability. The invisible bytecode attribute is hard to understand, especially from the viewpoint that the nullness checker should just act like an external annotation processor. That we implemented this in the compiler is an implementation detail.
After the fix in bug 365531 I reverted the now broken global default via commit c5e448ecc90e8d75bf431155e497df695586ca94. At this point we no longer generate any synthetic annotations but the global default is completely without effect. As a consequence several tests had to be disabled. I propose to open a new bug for the replacement for the global default (aka the intended default). My previous attempts in implementing errors without a compilation unit to report against were not received enthusiastically (bug 363858), so I'd like to step back from that task.
(In reply to comment #60) > My previous attempts in implementing errors without a compilation unit to > report against were not received enthusiastically (bug 363858), so I'd like to > step back from that task. Stephan, I vaguely remember discussing this. Did we investigate the option of a buildpath error instead? I think it won't cause bug 363858 -like problems. If it's not too complicated, it may be worth a shot. :)
(In reply to comment #61) > (In reply to comment #60) > > My previous attempts in implementing errors without a compilation unit to > > report against were not received enthusiastically (bug 363858), so I'd like to > > step back from that task. > Stephan, I vaguely remember discussing this. Did we investigate the option of a > buildpath error instead? I think it won't cause bug 363858 -like problems. If > it's not too complicated, it may be worth a shot. :) One of the issues is to determine the life-cycle for those error markers. The builder handles markers per CU which can be incrementally added and removed. And it also handles build path markers which are much more sticky but I don't have the details right now. I'm afraid that for package level markers the builder lacks the event for removal.
(In reply to comment #60) > I propose to open a new bug for the replacement for the global default (aka the > intended default). Hi Stephan, Has a new bug been raised ? I see two issues with the current state of affairs: (1) NonNullByDefault annotation at the package/type level is not honored for binary types on HEAD. and (2) (In reply to comment #11) [...] > Whatever we do in Eclipse, we will deviate from the output of the Javadoc tool > as soon as we add any defaults. Maybe we should just remove the default setting > from the project or turn it into a warning if a package or type in the project > doesn't have the @NonNullByDefault annotation. That way, at least the Javadoc > and the generated class files stay in sync. And then we can also avoid > generating class files with inherited annotations. We need to implement this behavior. Markus, Do you want to propose what the UI label for this new scheme would say, so we have a common vocabulary to speak/think about this new behavior ? (1) Is a must for M6, (2) is nice to have for M6.
(In reply to comment #63) > (1) NonNullByDefault annotation at the package/type level is not > honored for binary types on HEAD. bug 372011 > (2) > We need to implement this behavior. bug 372012 > Markus, > > Do you want to propose what the UI label for this new scheme would say, > so we have a common vocabulary to speak/think about this new behavior ? bug 372013
Verified for 3.8 M6 using Build id: I20120306-0800