[Scons-dev] SCons tools refactoring progress
Gary Oberbrunner
garyo at oberbrunner.com
Tue Oct 20 20:59:02 EDT 2015
Comments inline.
On Tue, Oct 20, 2015 at 12:52 AM, anatoly techtonik <techtonik at gmail.com>
wrote:
> On Tue, Oct 20, 2015 at 3:57 AM, Gary Oberbrunner <garyo at oberbrunner.com>
> wrote:
>>
>> On Sun, Oct 18, 2015 at 7:52 AM, anatoly techtonik <techtonik at gmail.com>
>> wrote:
>>
>>>
>>> I see the implementation, but I don't see any use cases. I know it
>>> sounds too formal, but I can't validate the assumptions we had about the
>>> new toolchain without a formal list. Do you have some notes or maybe BDD
>>> tests for that?
>>>
>>
>> A very good question. I don't have enough design notes in there; I'll
>> write up some of my motivations and design goals here and add them to the
>> README.rst.
>>
>> The basic idea is that the current concept of a Tool is too low level;
>> the primary motivating use case is that users (SConscript authors) should
>> be able to select groups of related tools, _as_ a group, with fallbacks. A
>> Toolchain is that abstraction.
>>
>
> That use case is already worth recording. But there are two cases here:
> 1. Selecting a group of related tools
> 2. Falling back
> Do we have concrete groups of related tools already?
>
Yes, I described a few. (Compiler, linker, and maybe assembler is the
obvious one.)
> The fallback story needs to be expanded. Are there fallback choices inside
> one group (with priority or some other preference strategy), or does the
> fallback mean "fall back to another group"? Maybe I am being too detailed,
> but also - in what cases do fallbacks occur?
>
This is all implemented. Check the code and test cases (Toolchain-test.py).
I think it's pretty solid but please review. Basically it's recursive AND
and OR trees with optional or required elements.
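To give a feel for the shape of it, here's a tiny sketch of the recursive AND/OR
idea -- the class names and API below are made up for illustration, not what's
actually in the repo, so check Toolchain-test.py for the real thing:

class ToolNode:
    """Leaf: one (abstract or concrete) tool name, optionally optional."""
    def __init__(self, name, optional=False):
        self.name, self.optional = name, optional
    def resolve(self, registry):
        tool = registry.get(self.name)
        if tool is not None and tool.exists():
            return [tool]
        return [] if self.optional else None   # None means "this branch failed"

class AllOf:
    """AND node: every required child must resolve."""
    def __init__(self, *children):
        self.children = children
    def resolve(self, registry):
        found = []
        for child in self.children:
            tools = child.resolve(registry)
            if tools is None:
                return None
            found.extend(tools)
        return found

class AnyOf:
    """OR node: the first child that resolves wins -- this is the fallback."""
    def __init__(self, *children):
        self.children = children
    def resolve(self, registry):
        for child in self.children:
            tools = child.resolve(registry)
            if tools is not None:
                return tools
        return None

# "Intel 12 if it's there, else any gcc", plus a linker; assembler optional:
chain = AllOf(AnyOf(ToolNode('intelc-12-x86'), ToolNode('gcc')),
              ToolNode('gnulink'),
              ToolNode('as', optional=True))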
> It is not necessary to design everything upfront now - the primary goal of
> my questions is to find *real world* stories (not even use cases) for which
> the new mechanism is needed. I am afraid of designing a system that will be
> too perfect to ever get implemented in free time.
>
Well, at least for the toolchain, I think it's mostly there
implementation-wise (check it out). The interesting stuff is improving the
Tool ideas (there's a sort of registry so you can give concrete tools names
and look them up that way), adding Finders, and all the other stuff that
we've been discussing. But if you have some real-world stories you'd like
to see covered, this is a good time to get them out there.
>> All the rest is just there to make that idea work well. The secondary goal,
>> in service of the main one, is that Tools need to obey their abstraction,
>> for instance always calling exists().
>>
>
> What is the story behind this? Because I know the opposite story -
> exists() for the default tools is called even if I don't build anything with
> those tools - this delays the build start and prints messages about a
> missing compiler to the screen.
>
exists() is how a tool knows whether its binaries (or whatever it needs to
run) exist or not, so toolchains can't work without it. As for default
tools, the idea here is that SConscript writers will (finally!) be able to
specify exactly which tools they want. I hope to do away with the current
default tool initialization system, though some of that still needs to be
thought out. The current design is much "lazier" but still needs work.
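As a rough illustration of the "lazier" direction (made-up names, not the
actual classes in the repo): nothing probes or initializes until a toolchain
actually asks for the tool.

import shutil

class LazyTool:
    def __init__(self, name, probe, setup):
        self.name = name
        self._probe = probe     # callable -> bool: can this tool run here?
        self._setup = setup     # callable(env): add builders/variables
        self._found = None      # cached probe result

    def exists(self):
        if self._found is None:          # probe only once, and only on demand
            self._found = self._probe()
        return self._found

    def generate(self, env):
        if not self.exists():
            raise RuntimeError('tool %s not available' % self.name)
        self._setup(env)

gcc = LazyTool('gcc',
               probe=lambda: shutil.which('gcc') is not None,
               setup=lambda env: env.update(CC='gcc'))   # env is dict-like here
# Nothing has touched the filesystem yet; only a toolchain that actually
# needs gcc will ever call gcc.exists().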
>> The new system also creates a distinction between an abstract tool, such as
>> intelc, and a concrete instance of it, such as intelc v12 x86. This is
>> needed so the user can create chains of either specific or general tools.
>>
>
> I understand where this might be useful, but still - is there a real world
> story where this was needed?
>
Absolutely - in my day job we specify very carefully what toolchain is
used to build any given product. If the machine doesn't have the right
version of the Intel compiler, the build should fail. Open-source
projects, on the other hand, should try to build on as many
configurations as possible, so they want to keep things general -- use any
Intel compiler, or any gcc, or any cc (for instance).
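In terms of the sketch above, the two styles would look something like this
(again, the names are illustrative):

# Locked-down commercial build: exactly these concrete tools or fail.
strict_chain = AllOf(ToolNode('intelc-12-x86'), ToolNode('intellink-12'))

# Open-source build: any instance of an abstract compiler/linker will do.
relaxed_chain = AllOf(AnyOf(ToolNode('intelc'), ToolNode('gcc'), ToolNode('cc')),
                      AnyOf(ToolNode('intellink'), ToolNode('gnulink'), ToolNode('link')))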
>> One restriction I'm imposing in the new system is that Tools have to be
>> configurable outside of any Environment; I don't like the current system
>> where tool-configuration variables ("tool args" basically) are mixed into
>> the Environment. This poses some challenges for a fully generalizable
>> system but I think I have a decent handle on that. The current Intel
>> compiler has an initial attempt at such tool args.
>>
>
> How to handle "tool args" is a question for a separate thread. We will
> need a standard set for every *type* of tool and a specific set for every
> supported tool. For maintenance that means tables, and perhaps tests
> for those args. And neither unit tests nor the BB wiki are good places for
> human-readable tests and human-writable tables.
>
Maybe. I think it is _vital_ at this stage to separate mechanism from
policy. We need a mechanism that can support many policies; table-driven
args or standard arg sets are fine policies, and we'll probably have one
for the SCons built-in tools, and users will probably develop their own,
which is fine.
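For what "tool args outside the Environment" might look like mechanically,
here's a hypothetical sketch (the class and arg names are made up; the real
first attempt is in the Intel compiler tool):

class ConfigurableTool:
    # Self-documenting arg table: name -> (default, help text)
    ARGS = {
        'version': (None,  'exact version to require, e.g. "12"'),
        'arch':    ('x86', 'target architecture'),
        'topdir':  (None,  'explicit install dir, overrides searching'),
    }

    def __init__(self, **kwargs):
        unknown = set(kwargs) - set(self.ARGS)
        if unknown:
            raise ValueError('unknown tool args: %s' % ', '.join(sorted(unknown)))
        self.args = {name: kwargs.get(name, default)
                     for name, (default, _help) in self.ARGS.items()}

# The args live on the tool instance; the Environment never sees them.
intelc = ConfigurableTool(version='12', arch='x86')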
> Again, the design needs concrete stories. Like "I want to use
> intelc v12 x86, which uses its own args, on Windows". My goal with all these
> stories is to dig up the cases where a new, more complicated system just
> isn't worth it and a simple Environment variable hack is good enough.
>
I sure hope this isn't complicated! It's just a few classes. Simplicity is
an over-arching goal. "Simple Environment hacks" is how we got to the
complexity we have now where even documenting what args a tool requires is
error-prone. (I've added some self-documentation to Tools -- see the code.)
>
>
>> Some use cases:
>> * a simple SConstruct should automatically select the "best"
>> compiler/linker
>>
>
> What is "best" in real-world projects? For example, a pre-configured chain
> for a specific combination of OS, environment, flags, and source file types?
> How many criteria may we (which arguments should we) pass to a
> get_the_best_tool() function?
>
Again, separate mechanism (what can we support) from policy (what is best
for a particular project). I'm concerned with building the mechanism.
> Also, I think it would be good to make it independent of the SCons codebase
> for now.
>
I don't see how that is possible. It will need to replace the current Tool
logic.
>
>
>> * a non-C-related SConstruct (e.g. doc prep, asset management,
>> scientific data handling) shouldn't have to care about compilers, linkers
>> or other unrelated tools
>>
>
> Good. Now we need to expand what this means and record it as a separate
> story. "When I invoke the Sphinx and SCSS tools, SCons initializes compilers,
> linkers and other unrelated tools and complains about a missing Visual Studio
> compiler. It also takes a long time."
>
Well, that's not a story about the requirements of the system, it's a story
about how the current system is bad. Please come up with some user stories
that describe desired behavior: "As a user, I want to be able to specify
that the only tools I need are Sphinx and SCSS. Other tools should not be
initialized nor slow down the build." I've captured that in the README;
please send more.
>
>
>> * a SConstruct author should be able to specify toolchains and
>> alternatives, and handle failures gracefully
>>
>
> What is the failure? Only "tool does not exist"? Where is it needed right
> now?
>
Yes, "tool does not exist" is the only one I foresee now -- I have a
provision for an additional error message the tool can pass back ("not
found in $PATH" or whatever).
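Something along these lines, say (a hypothetical signature -- the actual
provision in the code may differ):

import shutil

def exists_with_reason(binary):
    """Return (found, detail) so a toolchain can report *why* a tool failed."""
    path = shutil.which(binary)
    if path is None:
        return False, '%s not found in $PATH' % binary
    return True, path

found, detail = exists_with_reason('icc')
if not found:
    print('skipping this toolchain:', detail)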
>
>
>> * it should be possible to write a SConstruct where the tools and
>> toolchains can be manipulated by cmd line args if desired
>> * it should be possible to specify desired tools generally
>> ("gcc/gnulink") or very specifically ("gcc 4.3, x86 cross compiler, with
>> gnulink 4.3 - and TeXLive")
>>
>
> I don't see how that's better than just passing the absolute path to a
> specific binary. Making specific restrictions for "gcc" and "4.3" in
> SCons-defined classes may make it harder to build stuff with compatible
> toolchains that don't have SCons definitions.
>
Think about Mac: there are three different places gcc can be found
(macports, built-in, and another one I can't remember right now). A
SConscript for an open-source project shouldn't have to have any paths
hardcoded.
>> There's a bunch of tests in that dir; they're mostly unit-test level at
>> this point. There are also some examples of the new subsystem in use (see
>> the README) which are closer to actual use cases.
>>
>
> There is a risk that such use cases become skewed towards the easier
> implementation, and in the end they may not solve the original problem,
> or may solve it in a very strange and complicated way. That's why, as far
> as formalism goes, I prefer real-world stories.
>
OK, please send some.
>
>
>> My current task is to design a good way for Tools to find their
>> executables and configure themselves. The current ad-hoc method is not
>> consistent and doesn't encourage reuse. Jason has some ideas in Parts (see
>> its concept of Finders); I don't intend to reuse that directly but at least
>> take some inspiration from it.
>>
>
> I haven't seen the code, but why? What is the problem with Jason's approach?
> And what is the difference between finding executables and configuration?
>
It's too complicated to get into here. The Parts code is not quite
complete, requires some heavy infrastructure, and is a bit policy-oriented
rather than mechanism-oriented, among other things.
>
>
>> My current design I'm working on is something like this: ... a good
>> architecture might be to have a list of finders, to be tried in order; each
>> one can be any type (env, reg, path, fixed path, function, whatever); build
>> a list of all the results, and then have a finder-picker that picks the
>> best from that list (the simplest finder-picker might be return paths[0]).
>> But where that list comes from and how it's manipulated is still TBD.
>>
>
> So, you need a bunch of finder functions - findinenv, findinreg,
> findinfixedpath, ... which can be quickly combined into a list, and when a
> tool is needed, the first match wins. You also need to cache the result -
> either for every check, or for the tool in general. Then one day you will
> need to clear the cache for the check.
>
Yes, that's basically the idea. See the code. If you have a user story for
why cleaning the cache could be needed, please send it along.
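Roughly this shape, in sketch form (the finder names and the cache here are
illustrative, not the code in the repo):

import os, shutil

def find_in_path(binary):
    p = shutil.which(binary)
    return [p] if p else []

def find_in_env(binary, var='TOOLDIR'):
    d = os.environ.get(var)
    full = os.path.join(d, binary) if d else None
    return [full] if full and os.path.isfile(full) else []

def find_in_fixed_paths(binary, dirs=('/opt/local/bin', '/usr/local/bin')):
    return [os.path.join(d, binary) for d in dirs
            if os.path.isfile(os.path.join(d, binary))]

_cache = {}

def find_tool(binary,
              finders=(find_in_env, find_in_fixed_paths, find_in_path),
              pick=lambda paths: paths[0]):
    """Try each finder in order, collect all candidates, let the picker choose."""
    if binary not in _cache:
        candidates = []
        for finder in finders:
            candidates.extend(finder(binary))
        _cache[binary] = pick(candidates) if candidates else None
    return _cache[binary]

def clear_cache():
    """If a tool gets installed mid-run, clearing this lets the checks rerun."""
    _cache.clear()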
>
> For example, my story for the Wesnoth automated build is that if a tool does
> not exist, I download the tool and then redo the check. So it resembles the
> way SCons deals with sources. So, should SCons treat Tools as such nodes?
> Should these nodes be isolated from the main FS and Build graph?
>
This may be beyond the capabilities of the system I propose; I think adding
that would significantly complicate it. If you need to download something
(to even see if it exists()) or have a dependency-graph to create the
tools, I think a different system would be needed.
>
> To make this actionable, I propose creating a simple "finders" repository
> that will concentrate on finding tools and will contain:
>
> 1. lookup helpers
> 2. descriptions of tool locations and discovery for
>    1. different tools
>    2. different platforms
> 3. an attempt to design a file format for expressing this info declaratively
> 4. a description of stories where a declarative spec doesn't help or would
>    be too complex
>
OK, feel free to start something. I can't promise to use it but at least
some of what you're describing sounds useful. I'm currently hung up on how
to allow configurability of those "lookup helpers" (aka Finders: path,
registry, function-call) from various levels (within a tool, by toolchain,
by SConstruct, or ultimately by cmd line arg) in an elegant and extensible
way, but of course writing the basic code is the first step.
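The layering I'm picturing is roughly "nearest override wins", something like
this (purely illustrative, reusing the finder functions from the sketch above):

def effective_finders(tool_default, toolchain=None, sconstruct=None, cmdline=None):
    """cmd line beats SConstruct beats toolchain beats the tool's own default."""
    for layer in (cmdline, sconstruct, toolchain):
        if layer is not None:
            return layer
    return tool_default

finders = effective_finders(tool_default=(find_in_path,),
                            sconstruct=(find_in_fixed_paths, find_in_path))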
> Even if we fail to patch SCons to accommodate all use cases, the repo
> will be useful for other build tools, which may bring better ideas to chain
> management.
>
>
>> Additionally I'd like to see how far it makes sense to go in having
>> standard args to configure Tools; one way is just to leave args up to each
>> Tool (but maybe have the Finders have some high-level control); the other
>> is to define many standard ones (like ARCH, search in PATH or some custom
>> path, and so on) - the latter will make it easier someday to pass these all
>> the way down from the cmd line. I want Tools (and their Toolchains) to
>> stand alone easily, but also have some way to override their decisions at a
>> high level (globally, from cmd line, etc.)
>>
>> And yes, with this system it is a goal that there would be no more
>> "missing Visual Studio" errors on Windows. :-)
>>
>
> In that case we need to start with the use case from a different end - after
> reading the SConstruct, detect whether Visual Studio is needed. We would
> really need to draw diagrams for that. =)
>
I explicitly reject that goal. Automatically analyzing a SConstruct to see
what Tools it needs is much too complicated. My idea is that users
(SConstruct writers) will either use pre-set toolchains, or will set up
their own at the top of the SConstruct (before creating any Environment).
"Explicit is better than implicit," as the saying goes.
--
Gary