[Scons-dev] SCons tools refactoring progress
Jason Kenny
dragon512 at live.com
Wed Oct 21 10:33:11 EDT 2015
I think I am trying to talk about the high-level concepts needed, at the base level, to do what you suggest your main goals are with this revamp. The "policy" is what I think we are trying to agree on as the tool_args concept. I think I tried to state that the main difference between what exists and what is proposed going forward is moving the tool args into a formal object, versus the current approach of adding variables to the SCons environment.
When you talk about things such as graphics pipelines, document prep systems, etc., I agree those should use the same general mechanism. However, I don't agree that the idea of how a toolchain is set up is that different. A toolchain is a set of tools; a document or web app simply uses different tools. A given tool is, in general, a command-line program that our tool logic needs to make sure:
1) Exists
2) Has whatever environment it needs so it can run
3) Has some meta information so we can select/find the correct tool if there is more than one on the system, such as
a. Location
b. Version
c. System information for the target output.
These are common themes all tools have. Sure, some tools, like document tools, are platform independent, and for some tools we may not care about the version. But all tools have some idea of a version (else we have no way to deal with bugs and improvements in the tool), and they all have some idea of the host platform we are on. You look for a tool in different locations if you are on a Mac, or Linux (Red Hat is different from Ubuntu for some items), or Windows. On a given platform, what you look for is different for 64-bit vs 32-bit.
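To make those three requirements concrete, here is a minimal sketch (ToolDescription and its fields are hypothetical names I am making up for illustration, not the Parts or SCons API):

from dataclasses import dataclass
from typing import Optional
import shutil

@dataclass
class ToolDescription:
    name: str                       # command-line program, e.g. "gcc"
    location: Optional[str] = None  # root path if known, else search PATH
    version: Optional[str] = None   # version required/found, when it matters
    target: Optional[str] = None    # system the output is for, e.g. "x86_64"

    def exists(self):
        """1) Exists: can we actually locate the executable?"""
        return shutil.which(self.name, path=self.location) is not None

    def env_additions(self):
        """2) Environment needs: values to add so the tool can run."""
        return {"TOOL_PATH": self.location} if self.location else {}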
What I talk about below are the concepts I think we need to have for a system to work. I think we need standards to make it easy and efficient for these different chains to exist. For example, in SCons how do we separate BSD from Linux? Or how do we separate 32-bit from 64-bit? What I see at the moment is that the SCons tools for MSVC and Intel C use different ways to state 32-bit and 64-bit. In Parts this is done the same way for all tools, and Parts also has common mappings, such as i386 to x86, to help make it safer. My point here is that I think SCons would be a more robust system if, for example, we added some form of the Parts SystemPlatform object to SCons, as this would become a common and standard way to express an important concept in a build system, whether or not your toolchain is platform independent for the target output. The currently proposed system, as I understand it, has people making intelc-x86 and intelc-ia32 tools as different instances even though they are the same. I think this adds to the confusion.
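As an illustration of the common-mapping idea (a hedged sketch, not the Parts implementation; ARCH_ALIASES and canonical_arch are made-up names), a small alias table lets "i386", "ia32" and "x86" all resolve to one canonical architecture, so intelc-x86 and intelc-ia32 do not become separate tools:

# Sketch: normalize architecture aliases to one canonical name so different
# tools (and users) can write "i386", "ia32" or "x86" and mean the same thing.
ARCH_ALIASES = {
    "x86":    {"x86", "i386", "i486", "i586", "i686", "ia32"},
    "x86_64": {"x86_64", "amd64", "x64", "em64t"},
    "arm64":  {"arm64", "aarch64"},
}

def canonical_arch(value):
    v = value.lower()
    for canonical, aliases in ARCH_ALIASES.items():
        if v in aliases:
            return canonical
    raise ValueError("unknown architecture: %s" % value)

assert canonical_arch("ia32") == canonical_arch("i386") == "x86"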
Anyways… let me look more at the current code, and let me put together some example objects with both systems for us to discuss.
Jason
Sent from Mail for Windows 10
From: Gary Oberbrunner
Sent: Tuesday, October 20, 2015 8:13 PM
To: Jason Kenny
Cc: SCons developer list
Subject: Re: [Scons-dev] SCons tools refactoring progress
You talk a lot about what I consider "policy": specific types of args that would be used to configure specific types of Tools, like compilers. This is all fine, but my goal is to build a framework where such policies can be applied, and then get into the specifics of platforms, cross-compilers, versions, targets and all that. Building a game asset manager, or a visual-effects pipeline, or a document-preparation system, or a scientific-data analysis pipeline, will have very different _specifics_ but the same general mechanism should be usable for all, or at least that's my goal.
Jason, since you've already been down this path once, I'd very much appreciate a code and design review of what I have so far. It's only 700 lines or so, including tests. I'm especially interested in use cases Parts can handle that my proposed system wouldn't be able to. (Again, from a mechanism or base framework standpoint; not that I don't specifically define a target architecture argument for instance.)
One use case I see from your notes below is this one, which I'll capture:
As a SConscript writer, I'd like to be able to enumerate all versions of a tool on the system so I can choose which one to use.
If there are others, please send them along.
-- Gary
On Tue, Oct 20, 2015 at 11:39 AM, Jason Kenny <dragon512 at live.com> wrote:
Hi Gary,
From looking at the below, this seems to be in line with what is going on in Parts. I had hoped that what I did here would be used/improved on in SCons. That is a goal of Parts: to improve SCons.
From what you talk about below, I think we are in general on the same page. I know when I started this I was also of the view that the "args" should be external. At the time, Steve Knight pushed hard for the idea that all values could be defined in the environment. I see some value in this; however, like you, I wanted something separate to define the arguments used to set up a tool. I also wanted a standard set of values. For example, we should use the same values to talk about architecture, version, etc. when we can, as this makes it easier to use and develop new tools. Given what I have seen in practice, I was moving towards the IATAP idea from Greg. This is what I call Settings in Parts. While I have not finished this object yet, I think it gets toward this idea better. Here we have an object that, given values, creates an environment to be used to build stuff. I think at a high level we want to move this way.
For a design of the tools, I think these are the items we want to understand and define in some way.
Platform - Parts defines a system platform object; SCons defines a string for the OS and has nothing standard for the architecture. I think we should agree we need to define this better in SCons, and I would like to see something in SCons defining it. I suggest what I have in Parts as a way to start. In a given environment I define a HOST_PLATFORM object and a TARGET_PLATFORM object. This extends the PLATFORM string we have now. The way I did it in Parts allows these items to be synced, and to be extended with new OS or architecture values at run time. I think having something like this in SCons will allow for a better, more reusable tool design.
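A minimal sketch of what a SystemPlatform-like value could look like (hypothetical; the real Parts object does more, and the arch string would go through the alias mapping sketched earlier):

import platform
from dataclasses import dataclass

@dataclass(frozen=True)
class SystemPlatform:
    os: str    # e.g. "win32", "linux", "darwin", "freebsd"
    arch: str  # canonical architecture, e.g. "x86", "x86_64", "arm64"

    def __str__(self):
        return "%s-%s" % (self.os, self.arch)

def host_platform():
    # Detect the host; a real implementation would normalize machine()
    # through the alias mapping above before storing it.
    return SystemPlatform(platform.system().lower(), platform.machine().lower())

# HOST_PLATFORM is detected; TARGET_PLATFORM defaults to it until retargeted:
# env['HOST_PLATFORM'] = host_platform()
# env['TARGET_PLATFORM'] = env['HOST_PLATFORM']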
Finders - In Parts I have finders and scanners. They differ in what they return, but are in effect the same thing, and I believe for what Gary is talking about in SCons they serve the same purpose. The goal is to report whether we found something at a given resource location, whether that is a path, a registry entry, or an environment variable. I should point out that in some ways this is like an autoconf test to find a compiler, etc.; I would like to think this way is better, however. The difference in Parts is that a scanner returns a set of values with version information, where a finder is much simpler: "we have a match at resource X". The reason for the difference is that some tools are easier to find as a set versus defining every known case. For example, when I worked at Intel, the Intel compiler would make lots of internal drops, as one would expect, and these drops would have different versions. People would want to test the new drops out. Instead of making a new entry for each new drop, I scanned the disk for what was installed and returned all versions found. This scaled better for this tool set and its users. Another example (which I should improve a bit more) is the GNU toolchain. People can install different versions of the compiler on a Linux system, and they often install them in different ways, such as under /opt or in their home directory in a standard way. I tried to have the GNU logic in Parts look for these common places and use those versions if requested. The use of scanner logic was a lot easier here than making lots of custom finders. The main value of a finder vs a scanner for me, however, is that a finder can be more declarative. This makes it easier to read and understand. For a system like SCons this is important, so people can add new tools in a common way that can be reused.
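A rough sketch of the finder/scanner distinction as I read it (hypothetical classes, not the Parts API): a finder answers "is there a match at this one resource?", while a scanner enumerates everything installed, with versions:

import glob
import os
import re

class PathFinder:
    """Finder: declarative check against known locations; returns a path or None."""
    def __init__(self, candidates):
        self.candidates = candidates     # e.g. ["/opt/foo", "/usr/local/foo"]

    def find(self):
        for path in self.candidates:
            if os.path.isdir(path):
                return path
        return None

class VersionDirScanner:
    """Scanner: enumerate every install it can see, returning (version, path) pairs."""
    def __init__(self, pattern):
        self.pattern = pattern           # e.g. "/opt/intel/compiler-*"
        self.version_re = re.compile(r"(\d+(?:\.\d+)*)")

    def scan(self):
        results = []
        for path in glob.glob(self.pattern):
            match = self.version_re.search(os.path.basename(path))
            if match:
                results.append((match.group(1), path))
        return sorted(results, reverse=True)   # newest first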
Tool args - I don't define this at the moment as an object; I define some common values, but not an object. Originally I was going to define an object, but Steve Knight convinced me otherwise. There is some value in that choice; making it an object has a different kind of value. I am not against or for making a tool_args-like object. What I am against is the notion of a **kw object, which is basically what Steve Knight had and what seems to be defined here in the new tool from Gary. The reason I have an issue with this is that it allows the utter chaos we have in the current tools, i.e. there is no standard way to set a given tool up. I agree a tool should be able to define unique items if that makes sense, but we should strive for a common set of basic values that we reuse in all tools. Some values, like a version, may be ignored when they do not make sense for a tool. We should still define a common set, as most tools have common ground and people need a common way to set them up. Sure, advanced cases will exist, and with **kw we have a way to add advanced stuff as it happens. For me, the values that are useful here are (this is just a base set I found useful, not the end-all; see the sketch after this list):
1) Root path – this is the base path we use to find the tool and extend with extra info
2) Version – this is the version of the tool we want, or a selector for the best match (i.e. 3.2.2 is exact, whereas 3 finds the highest version of 3 on the system, such as 3.4). Yes, many times we want the latest, or the version does not matter for the tool. This value should exist, but it does not have to be defined to get a tool to work.
3) Use_script – for me this tells us how to set up the environment. I use it as a way to point to a custom script to set up the environment, versus a set of known defaults, or to point to a known script (based on the root path) that sets up the environment.
4) Target platform – this is the target platform we want to build for. It is not always needed, as we default to the host. Having a common way to talk about the target is powerful and very useful. I should note that many tool sets ship different binaries based on what host-target combo exists. In many cases we have both x86 -> x64 and x64 -> x64 tools on a platform. I have never seen a case in which preferring the native tool was not the correct answer. This can be an important detail, as some installs may have a native x64 tool while others only have the x86 -> x64 case, even though they run an x64 OS.
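A hedged sketch of what a common tool_args set along these lines could look like (the names are illustrative only, reusing the SystemPlatform idea from the earlier sketch):

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ToolArgs:
    root_path: Optional[str] = None        # base path used to locate the tool
    version: Optional[str] = None          # exact ("3.2.2") or partial ("3") selector
    use_script: Optional[str] = None       # custom or known setup script, if any
    target_platform: Optional["SystemPlatform"] = None  # None means "same as host"
    extra: dict = field(default_factory=dict)            # tool-specific **kw escape hatch

def version_matches(found, wanted):
    """Partial selectors match prefixes: wanted "3" matches found "3.4.1"."""
    if wanted is None:
        return True
    return found.split(".")[:len(wanted.split("."))] == wanted.split(".")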
Tool environment instance - In Parts I called this ToolInfo. This is not really different from what Gary talks about as a concrete instance of the tool. It is the object that holds the information for a given instance of a tool we may want to use, including what to add to the environment. For me, at least, these objects are defined in a file and given finders and other information used to fill in the environment with what is needed to allow a given tool to run/work. The main interface for this object at run time is an exists call and a call to get the environment key/values to add (or a function to set the environment directly). Both of these functions would take some sort of tool_args. This object would use the finders and other information given to it to determine whether that instance exists. So, for example, it would have finders for getting information from the path, or a shell variable pointing at a root directory, used to create the real path to the exe, which it can then use to test that the tool really exists. A common case would be that you might have:
Standard lookup
1) A finder on the path finds the path
2) This object uses that path and tests that, when we subst the full path value, the provided tool name exists
More custom
1) The finder on the path fails to find anything
2) The next finder in the list is an environment finder; it finds that the variable defined for it exists
3) We use the value of this shell variable to subst the full path, and it finds the tool
Either way, this gives us a way to find and set up what we need. In Parts I did the setup to allow setting the environment via defined defaults, or by calling a predefined script or a custom script. Scripts are slower to call and add a risk of differences (as they could be modified), but they add a lot of flexibility for custom setup cases while reusing the main tool logic.
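A minimal sketch of a ToolInfo-like object walking a finder chain, as described above (my reading of the idea, not the Parts code; it assumes the PathFinder-style finders and ToolArgs sketched earlier):

import os

class ToolInfo:
    """One concrete instance of a tool: can test that it exists and report what
    to add to the environment so the tool runs."""
    def __init__(self, exe_name, finders, env_additions=None, setup_script=None):
        self.exe_name = exe_name
        self.finders = finders                  # tried in order: path finder, env-var finder, ...
        self.env_additions = env_additions or {}
        self.setup_script = setup_script        # optional script-based setup

    def _root(self, tool_args):
        if tool_args.root_path:
            return tool_args.root_path
        for finder in self.finders:
            root = finder.find()
            if root:
                return root
        return None

    def exists(self, tool_args):
        root = self._root(tool_args)
        return root is not None and os.path.isfile(os.path.join(root, self.exe_name))

    def get_env(self, tool_args):
        """Return key/values to add to the environment, or None if not found."""
        root = self._root(tool_args)
        if root is None:
            return None
        additions = dict(self.env_additions)
        additions["TOOL_ROOT"] = root           # hypothetical variable name
        return additions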
Tool info container - The other object needed is something to hold all these different tool cases. I called this ToolSetting in Parts. I did not make it one global object, but many different objects, such as MSVC, GCC, etc. There might be some value in making it one object, given what Gary proposes; I think that would be an improvement on my design. The main point of this object, for me, is that this is what we register a set of tool information with, i.e. the exact platform requirements (the host and target it is for) and the tool info itself. This interface is a simple register at load time. If done right/well, it should allow for an easy declarative definition, which should make it easy for others to read, copy-paste and tweak if needed. The other function is a query function, to get a tool instance based on some tool_args and to test for the existence of a tool based on tool_args. The one major function I would add, which I don't have in this form in Parts, is a function to return all known information. This can be great for allowing people to easily check, via a simple command, what the build system knows about. In Parts I have this working via a hack: defining something we know does not exist, to get an error message about what it could not find and what valid options the system knows about. I note it here just as something I would change (along with the internally ugly but working implementation I have at the moment). I also think this is the place I would add logic to cache known information about known tools, to avoid re-looking it up, as a polish step to improve startup time. That would be useful for the generic toolchain setups that check a lot of tools, as these could then be defined cheaply.
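A sketch of such a container, building on the ToolArgs/ToolInfo sketches above (the names register/query/known are illustrative, not a proposed final interface):

class ToolSetting:
    """Registry of ToolInfo instances for one tool family (e.g. GCC or MSVC)."""
    def __init__(self, name):
        self.name = name
        self._entries = []   # (host_platform, target_platform, tool_info) tuples

    def register(self, host, target, tool_info):
        """Called declaratively at load time."""
        self._entries.append((host, target, tool_info))

    def query(self, host, tool_args):
        """Return the first registered instance that matches and actually exists."""
        target = tool_args.target_platform or host
        for reg_host, reg_target, info in self._entries:
            if reg_host == host and reg_target == target and info.exists(tool_args):
                return info
        return None

    def known(self):
        """Everything registered, for a 'what does the build system know about' report."""
        return [(str(h), str(t), info.exe_name) for h, t, info in self._entries]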
Toolchain – This is a key feature that we need to have added. I think the core idea of easily defining a toolchain in some way is critical. Some quick thoughts. In Parts I never really got this done. What I do have is a way to define things in files under a toolchain directory. What is great about this is that it is easy for users to define their own changes, local to their own builds or systems. That is a plus. What is not good: I did not define a good interface for this (lack of time, and I never got back to it properly), which means I cannot easily define a toolchain in an SConstruct. I hope to address this with the Settings (IAPAT) object going forward. Users need to be able to define a toolchain in the SConstruct, in independent files, and possibly have some control on the command line. On the command line I think users really need and want a way to influence a toolchain setup, such as "use version X of a tool" or "use this path instead to find the tool", versus what is defined by default. In Parts I allow this to a degree with --toolchain (--tc for short). The argument is in a tool[_version],[…] format, where tool can be a tool or a toolchain. Ideally I would redo this to have more of a --tc=toolchain[@key:value[@key:value]],[...] style format, to pass in a key/value dict to help influence a toolchain setup (see the sketch below). Beyond this we need a good way to define a toolchain. There are a lot of ideas out there for this. Common ones include some sort of anyof()/oneof() combination to define more complex toolchains with fallbacks, or the idea of a common tool type such as CC, where all tools define themselves as a subset of a "master" tool concept. I am really open to ideas here myself.
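A small sketch of how the suggested --tc=toolchain[@key:value...] syntax could be parsed into key/value dicts (my reading of the proposed format, not existing Parts behaviour):

def parse_tc(value):
    """Parse e.g. "gcc@version:5.2@root_path:/opt/gcc-5.2,msvc@version:14" into
    [("gcc", {"version": "5.2", "root_path": "/opt/gcc-5.2"}),
     ("msvc", {"version": "14"})]."""
    result = []
    for chunk in value.split(","):
        parts = chunk.split("@")
        name, kv = parts[0], {}
        for pair in parts[1:]:
            key, _, val = pair.partition(":")
            kv[key] = val
        result.append((name, kv))
    return result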
Ideas we can layer on top of this
Once we have a better tools setup, we can layer an idea on top of it like configurations, which apply common flags to the environment after the tools are configured. This is great, as it allows common values to be used as a whole and easily swapped for a new set, without having to modify the build scripts. In the case of Parts this has been great for complex builds and cross-build cases, as users can easily define common flags, etc. for a given tool/version/etc. to be used based on the toolchain an environment is using. This should stay separate at this point; however, I point it out as a layer to think about, since it would reuse the tool information in a toolchain to provide more value. Something to keep in mind in the design of building up the tools correctly.
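A rough sketch of the layering idea (env here is a plain dict standing in for an SCons Environment, and the table contents are only examples):

# Common flag sets applied after the tools are configured, selectable by name
# and tweakable per tool/version without touching the build scripts.
CONFIGURATIONS = {
    "debug":   {"CCFLAGS": ["-O0", "-g"]},
    "release": {"CCFLAGS": ["-O2"], "CPPDEFINES": ["NDEBUG"]},
}

def apply_configuration(env, config_name, per_tool_overrides=None):
    """Apply a named set of common flags to an already-configured environment."""
    for var, values in CONFIGURATIONS[config_name].items():
        env.setdefault(var, []).extend(values)
    # per-tool/version tweaks layered on top, e.g. {"CCFLAGS": ["-fno-common"]}
    for var, values in (per_tool_overrides or {}).items():
        env.setdefault(var, []).extend(values)
    return env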
I would hope that the concepts I talked about here are something we can find common ground on, and that we can talk about defining some common names for them. I know this is a large e-mail; however, I would very much like some thoughts on what I suggest at a high level, to form a base of communication for us on the tool design. I am not saying my implementation in Parts is what I would like it to be, but I think it does show common design points we want to discuss, and the value a reasonably easy-to-use and reusable system can have.
Jason
Sent from Mail for Windows 10
--
Gary