
RE: [cdt-dev] CDT Conference call - Build Model Docs

> After a bird's-eye view, I have some questions to help me
> get a better understanding.
> Feel free to answer them in different emails.  Thanks again
> for the API templates.

I'll try to answer all here.  If anything seems unclear, well,
blame it on the cold medicine :-/
 
> == Scenario 1: Existing proprietary builders.
> In a previous email from Sky M. @ Rational, he outlined some
> of the things the build model should accommodate, and one of
> them was cooperation with others.
> For example, for the QNX IDE's C/C++ projects, QNX provides
> a builder that has very good knowledge of the way things
> should be compiled (drivers, resource managers, etc.) and
> even of where things are in the filesystem.
> 
> Sky's vision is below (he could correct me if I missed the point 8-)

<snip>
 
> It was not clear to me how those other builders can plug in
> to the CDT.
> Unless the answer is: if you use the IDE, you have to use
> the CDT's builder.

It seems like there are two ways to do this:

- Provide an abstract builder that is used to implement the default
  CDT builder.  ISVs could use the capabilities of the abstract
  builder to implement a custom builder as desired.

- Provide a default builder that is driven in part or in whole by the
  toolchain.  The default builder would order resource builds to
  fulfill dependencies reported by the toolchain.

A combination of the two would probably work best - develop a
flexible builder that relies on the toolchain to encapsulate
knowledge about how to build things properly, but make it a
straightforward task to derive a completely different builder.
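To make the layering concrete, here is a minimal sketch of that combination. All of the names (Toolchain, AbstractCBuilder, DefaultCBuilder) are illustrative, not actual CDT API; the point is only that the abstract builder owns the ordering logic, the toolchain owns the dependency knowledge, and an ISV can subclass the abstract builder to spawn its own tools:

```java
import java.util.*;

// Hypothetical: the toolchain reports, for each resource, the
// resources it depends on.
interface Toolchain {
    Map<String, List<String>> getDependencies();
}

// Hypothetical abstract builder: orders resource builds to fulfill
// the dependencies reported by the toolchain, and leaves the actual
// build step to subclasses.
abstract class AbstractCBuilder {
    // ISVs override this to invoke their own compiler/driver.
    protected abstract void buildResource(String resource);

    // Build every resource, dependencies first (depth-first
    // topological order), and return the order used.
    public List<String> build(Toolchain toolchain) {
        Map<String, List<String>> deps = toolchain.getDependencies();
        List<String> order = new ArrayList<>();
        Set<String> visited = new HashSet<>();
        for (String r : deps.keySet()) {
            visit(r, deps, visited, order);
        }
        for (String r : order) {
            buildResource(r);
        }
        return order;
    }

    private void visit(String r, Map<String, List<String>> deps,
                       Set<String> visited, List<String> order) {
        if (!visited.add(r)) return;            // already handled
        for (String d : deps.getOrDefault(r, List.of())) {
            visit(d, deps, visited, order);     // dependencies first
        }
        order.add(r);
    }
}

// The default CDT builder would be just one concrete subclass; a
// QNX builder could extend AbstractCBuilder instead and spawn its
// own tools in buildResource().
class DefaultCBuilder extends AbstractCBuilder {
    final List<String> log = new ArrayList<>();
    @Override protected void buildResource(String resource) {
        log.add("cc " + resource);  // stand-in for a real compiler call
    }
}
```

A derived builder only has to answer "how do I build one resource"; everything about ordering and dependency bookkeeping stays in the shared base.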

> == Scenario 2: Existing projects
> Lots of talk about usability, where new users hit a brick wall,
> i.e. they have to put up with arcane Makefiles, etc. - all true.
> However, what about the flip side of the coin? For example, we
> have customers with very complex build procedures, some of which
> simply defy sanity (for some hysterical raisins).
> Not surprisingly, those customers were quite happy with the
> default CDT builder 8-)
>
> It would be nice for them to be able to use the IDE even if
> they do not have the bells and whistles of the super-duper
> IDE builder.
> It would probably be an incentive to migrate in the long term.
> 
> So we can say to clients: "Oh! You have a monstrous project,
> a build developed by summer students, and you are in a time
> crunch and cannot do the conversion? OK, we can give you an
> adapter for the IDE that will spawn your shell scripts -
> minimal, and you may lose some features, but you can get to
> work right away."

I don't see any reason at all to drop the current builder - if
nothing else, there are probably hundreds of thousands of projects
(open source and corporate) that have hand-made Makefiles and
build scripts.  I've actually got a handful of these that I have
used the current version of CDT with.

I don't know what the best method of handling this is, though.
In TS2, we use a project property to indicate if the Makefile
is to be maintained automatically or simply used as-is.  Perhaps
a similar flag on projects is called for (use internal builder
vs. use external build command).
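Sketching that flag, purely as a guess at the shape: the property key, the ProjectProperties store, and the dispatch logic below are all hypothetical (a plain map stands in for Eclipse's persistent project properties), but they show how a single per-project setting could route between the managed builder and a spawn-the-user's-command mode:

```java
import java.util.*;

// Stand-in for Eclipse's persistent project properties.
class ProjectProperties {
    private final Map<String, String> props = new HashMap<>();
    void set(String key, String value) { props.put(key, value); }
    String get(String key, String dflt) { return props.getOrDefault(key, dflt); }
}

class BuilderDispatch {
    // Illustrative property keys, not real CDT identifiers.
    static final String BUILDER_MODE_KEY = "org.eclipse.cdt.builderMode";
    static final String BUILD_COMMAND_KEY = "org.eclipse.cdt.buildCommand";

    // Decide what the IDE should run for this project.
    static String buildCommand(ProjectProperties project) {
        String mode = project.get(BUILDER_MODE_KEY, "internal");
        if ("external".equals(mode)) {
            // Use the Makefile/scripts as-is; the IDE only spawns
            // the user's build command.
            return project.get(BUILD_COMMAND_KEY, "make");
        }
        // Otherwise the internal builder maintains the build itself.
        return "<internal managed build>";
    }
}
```

The nice part of keying this off a project property is that it is per-project, persistent, and (if stored in a shared file) travels with the project through CVS.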
 
> == Scenario 3: Information for other modules.
> For example, the debugger needs to know the source paths for
> the sourceLocator to find files when stepping.  The parser may
> need some macro definitions to do parsing correctly.  The
> indexer, search, and refactoring may also need the include
> paths; in C++, the entire definition of a class can be inline
> in a header, and the class definition is needed to do correct
> code completion/assist.  Some work was started in the
> ICBuilder, but we did not have a chance to refine it.

All these pieces of information (source file locations, macros,
include paths, etc.) should be exposed by the build model in a
generic way.  One way to do this is to add convenience methods
to the ICBuildConfig interface to retrieve standard information
(getSourcePaths(), getIncludePaths(), getCppMacros(), etc.)
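Roughly, the interface could look like the sketch below. ICBuildConfig is the interface under discussion, but this particular method set and the trivial implementation are only a guess at the shape, to show what the debugger, parser, and indexer would consume:

```java
import java.util.List;

// Sketch of the convenience accessors suggested above; the exact
// method set is a proposal, not settled API.
interface ICBuildConfig {
    List<String> getSourcePaths();   // for the debugger's source locator
    List<String> getIncludePaths();  // for the parser/indexer/search
    List<String> getCppMacros();     // e.g. "DEBUG=1", for correct parsing
}

// A trivial in-memory implementation, just to show consumers the
// shape of the data; the values here are made up.
class SimpleBuildConfig implements ICBuildConfig {
    public List<String> getSourcePaths()  { return List.of("src"); }
    public List<String> getIncludePaths() { return List.of("include", "/usr/include"); }
    public List<String> getCppMacros()    { return List.of("DEBUG=1"); }
}
```

The win is that every consumer (debugger, indexer, code assist) goes through the same generic accessors instead of each one parsing build settings on its own.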

> == Scenario 4: Extension points.
> Can it (should it) be possible to overload the entire build model?

I don't know.  My inclination is to say "no", though others may
disagree.

> Where are the other extension points?  For example, for the error parsers?

Not present yet.  Still a work in progress.
 
> == Scenario 5: Integration with the current CDT
> In the CDT core there is a framework for extension points; it
> is saved in the project, in a ".cdtproject" file.  The
> ".cdtproject" file is somewhat equivalent to the ".project"
> file, but for the CDT, so it is possible to export/share
> information.  For example, when checking out a C/C++ project,
> the binary parser type, build model type, etc., can be shared.
> Here is an example of what is generated now:
> 
> <?xml version="1.0" encoding="UTF-8"?>
> <cdtproject id="org.eclipse.cdt.core.make">
>     <extension id="org.eclipse.cdt.core.makeBuilder" 
> point="org.eclipse.cdt.core.CBuildModel">
>         <attribute key="command" value="make"/>
>     </extension>
> </cdtproject>
> 
> Sharing properties/settings (via CVS or other means) seems to
> be important for some of our customers .. they even ask to
> share breakpoints ... sigh ..

The current implementation is set up to save build config settings
to one or more files (one per build configuration).  The files are
plain text, modeled after launcher config files, and exist explicitly
because of the need to share build configs.

Having one file (.cdtconfig) or merging the configuration information
into the .cdtproject file is possible.  Is there a way to inject (and
later extract) arbitrary information into the .cdtproject file?

Haven't implemented this yet, but the concept of "currently active
build configuration" for a project probably should be a local,
persistent project property.
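For what it's worth, the merged form might look something like the fragment below. This is purely illustrative - the buildConfig element, its attributes, and the "active" flag are guesses at a shape, not an agreed format; only the outer elements come from the example you quoted:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<cdtproject id="org.eclipse.cdt.core.make">
    <extension id="org.eclipse.cdt.core.makeBuilder"
               point="org.eclipse.cdt.core.CBuildModel">
        <attribute key="command" value="make"/>
    </extension>
    <!-- hypothetical: build configs folded into .cdtproject,
         with the active config marked for local use -->
    <buildConfig name="debug" active="true">
        <attribute key="cflags" value="-g -O0"/>
    </buildConfig>
    <buildConfig name="release">
        <attribute key="cflags" value="-O2"/>
    </buildConfig>
</cdtproject>
```

One wrinkle: if "currently active build configuration" is a local property, the active flag probably should not live in the shared file at all, or checkouts would fight over it.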

-Samrobb

