[Scons-dev] Mini announcement: v2.4 is near...
William Blevins
wblevins001 at gmail.com
Thu Aug 6 18:55:57 EDT 2015
On Thu, Aug 6, 2015 at 6:01 PM, Dirk Bächle <tshortik at gmx.de> wrote:
> William,
>
> On 06.08.2015 20:59, William Blevins wrote:
>
>> Dirk,
>> I don't see information on the hardware and/or threads used for your
>> profile attachment.
>> Also, it is interesting that the update time slightly increased. I
>> assume this is a side-effect of the lazy loading overhead. I am
>> not worried about it though.
>> Good work :)
>>
>>
> thanks for your feedback. I could list things like OS/machine/RAM...but
> this is all about the relative comparison of one commit to the other. In
> both runs the same machine and settings were used...does that help, or do
> you need more information?
>
I was curious about the thread count because of the slight reduction in full build
time. I imagine those savings grow with the number of threads, since the
bottleneck here was the taskmaster forking a new process per command (and the
memory copy that happens on each fork). I ask because I noticed a similar small
savings in my builds at work. We already have posix_spawn wrapper changes on the
list of upcoming work, but this might be another data point for optimizing SCons
further: is there more work happening in a single-threaded context that *could*
possibly be pushed elsewhere?
I know that the java emitter code falls into this category, but that's a
separate issue.
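
To make the posix_spawn point concrete, here is a minimal sketch of what such a
spawner could look like (not the wrapper on the upcoming-work list, just an
illustration; it assumes a Python that exposes os.posix_spawnp, and the function
name is made up). Unlike fork()+exec(), posix_spawn does not duplicate the
parent's process state, so the per-command cost stays flat even when the SCons
process has a large memory footprint:

    import os
    import shlex

    def posix_spawn_command(cmd, env):
        # Hypothetical spawner: posix_spawn skips the fork-time duplication
        # of the parent's process state, so spawning cost does not scale
        # with the size of the build process.  Needs os.posix_spawnp
        # (Python 3.8+).
        argv = shlex.split(cmd)
        pid = os.posix_spawnp(argv[0], argv, env)
        _, status = os.waitpid(pid, 0)
        return os.WEXITSTATUS(status)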
> @all: I just pushed a new commit for fixing a few tests/tools that were
> still accessing the Node attributes directly, mainly ".abspath" and
> ".path". Doesn't show up usually, because the "__getattr__" is in place in
> "Node/FS.py"...but I like to have the core sources clean.
> During testing I found a problem with "test/packaging/rpm/cleanup.py". It
> will usually PASS, but every now and then when repeatedly calling
>
> python runtest.py test/packaging/rpm/cleanup.py
>
> it fails. I haven't found an underlying pattern yet and will keep trying, but
> is anyone else seeing this behaviour on their machine...or has an idea what
> could go wrong?
>
I will try to give it a skim when I get home.
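
In the meantime, a throwaway loop like the one below can help shake out an
intermittent failure (this is not part of runtest.py, just a quick harness for
reproducing the problem):

    import subprocess
    import sys

    # Rerun the flaky test until it fails (or we give up), so the failing
    # run can be inspected more closely.
    for attempt in range(1, 51):
        rc = subprocess.call(
            [sys.executable, 'runtest.py', 'test/packaging/rpm/cleanup.py'])
        if rc != 0:
            print('failed on attempt %d' % attempt)
            break
    else:
        print('no failure in 50 attempts')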
>
> Regards,
>
> Dirk
>
> _______________________________________________
> Scons-dev mailing list
> Scons-dev at scons.org
> https://pairlist2.pair.net/mailman/listinfo/scons-dev
>