Charles C wrote:
> I did put together something of a "minimal" scene which
> demonstrates the behavior, & it looks even more like it's to do with
> token_count. I'll head over to general & post that.
As I said, it is a known 3.6 problem that is really easy to fix. The token
counting variable is (unfortunately) misused for different purposes: counting
tokens is not actually its main job, despite what its name suggests. What it
does instead is flag special uses with the magic value 1000. To reduce
progress output during parsing on newer, faster systems, the range of this
variable was changed in tokenize.cpp (iirc to 2500 or 5000, but I don't have
an unfixed 3.6 code copy at hand right now). Changing that single line in
tokenize.cpp back to the "proper" magic value will fix the problem. No need
for you to continue searching. As I said, we know very well what exactly is
wrong; the issue is just finding the time to do an official 3.6 release on
all platforms.
Thorsten