kenobi
1/11/2015 6:52:00 PM
On Saturday, 10 January 2015 at 23:02:51 UTC+1, Bart wrote:
> On 10/01/2015 19:19, Kaz Kylheku wrote:
> > On 2015-01-09, BartC <bc@freeuk.com> wrote:
>
> >> a=b+c
> >>
> >> In a scripting language with dynamic types, what would you compile it
> >> to? You don't know what the types are! They could be integers, floats,
> >> strings, lists, tuples, ranges, functions, classes or any combination.
> >> You have to sort this all out at runtime, which is not going to be
> >> helped by compiling.
> >
> > Yes, it is still helped by compiling.
> >
> > Firstly, the compiler knows certain facts, such as that a, b, and c are (let us
> > say) lexical variables and can generate fast references to them.
> >
> > A pure interpreter, by contrast, has to look these things up: it looks at a
> > and wonders, is that a local? A global? It has to do some environmental lookup.
>
> If we're talking about languages such as Python, then there will already
> be a compilation stage, usually transparent, that converts source code
> to bytecode. I assumed the OP was talking about compilation to native code.
>
> With bytecode, it will already know much of that information.
>
> > basically, you're neglecting to look at this from the point of view that
> > a, b and c can be regarded as variables having the generic type "value",
> > and that the expression is basically just like C:
> >
> > a = builtin_plus(b, c);
> >
> > you wouldn't say that compiling a function call and assignment isn't
> > beneficial, right? Even if a, b and c are pointers, and the function
> > actually has to be called.
>
> It's not necessarily beneficial. Most of the work is going to be done
> inside that builtin_plus() function (say a C function that is a
> permanent part of the implementation, while the a=builtin_plus(b,c) part
> is the result of a 'compiling' a bit of script code into C).
>
> The only advantage might be in not having a dispatch loop that finds out
> what the next operation might be.
>
> However, imagine you had 100 lines of a=b+c, then you will have 100
> successive calls to a=builtin_plus(b,c), these are actual calls with
> parameters needing pushing and so on, which are not going to be
> practical to inline.
>
> Now compare with a pure bytecode interpreter implemented with a switch:
>
> while (1) {
>     switch (*pcptr) {
>     ....
>     case kadd: do_add(); break;
>     ....
>     }
> }
>
> This is an actual example, where do_add() corresponds to
> a=builtin_plus(b,c) (this uses a stack model so there will be extra
> bytecode instructions to push the operands and pop the result).
>
> The compiler will likely inline those handler functions like do_add()
> since there is only one call to each. It will spend those 100 lines in
> this loop, with no function calls and no parameter passing.
>
> > On top of that, advanced dynamic languages support optimization. If you
> > promise to the compiler, via declarations, that a, b, and c have certain
> > types, then it can generate better code.
>
> That wouldn't count as pure Python then, but as a different dialect
> (perhaps Cython). When I tried adding type info to my own language
> efforts, the modest gains weren't worth the extra complexity or losing
> the elegance of the language's variant types.
>
> --
> Bartc
So you think the efficiency loss (and Python is considered slow) comes not from the lack of compilation but from the Python-shaped structure
of the executable content? (That part I don't know: is it doing operations through runtime layers?)