[LispM-Hackers] Funcall notes

James A. Crippen james@unlambda.com
19 Oct 2001 15:33:01 -0800


Paul Fuqua <pf@ti.com> writes:

>     Date: 18 Oct 2001 13:34:26 -0800
>     From: james@unlambda.com (James A. Crippen)
>     
>     What is it that pushes its own address?  And why do things return
>     here?  Where's 'here'?
> 
> "Here" means to this step in the process, ie, about to calculate the LC
> offset.

Okay, that's what I figured but I wasn't sure if 'here' was an
explicit reference to a location in microcode or macrocode space, or
just a spot in the algorithm.

The SSDN isn't exactly the most lucid of texts...  Seems like it was
written in a hurry.

> "Pushes its own address" is a reference to the type of dispatch
> microinstruction, basically meaning it's (mostly) a call instead of
> a branch.

What exactly is the address being pushed, though?  Is it a microcode
address or pc?  Or is it just the address of the current
macroinstruction?  And is it getting pushed on the PDL?

> Lots of things were functional objects on the Explorer.
> DTP-FUNCTION, obviously.  DTP-SYMBOL would fetch the function
> cell of the symbol and come back with a DTP-FUNCTION.  DTP-INSTANCE
> would do a method-table lookup and come back.  DTP-ARRAY would turn
> into an AREF and *not* come back (allowed for old historical
> reasons).  DTP-LIST would fake a call to SYS:APPLY-LAMBDA to jump
> into the interpreter, and come back with that DTP-FUNCTION (I
> think).  DTP-LEXICAL-CLOSURE and DTP-CLOSURE would pull the function
> out of the closure and come back.  And so on; see "Calling Anything
> Else."

Okay, I grok that.  If the call was on a DTP-FUNCTION, function calling
falls straight through the step of retrieving the function; otherwise
the object is examined in a DTP-dependent way to find the appropriate
DTP-FUNCTION to call (plus whatever other cruft has to be set up before
the actual call is initiated).  The one exception is DTP-ARRAY, which
is just translated into an AREF of the appropriate form.
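
To make sure I've got the shape of it right, here's roughly how I'd
sketch that dispatch in C for the emulator.  The tag constants and
helper functions are all invented placeholders, not the Explorer's
actual names or microcode entry points:

    typedef struct lobj lobj;              /* tagged Lisp object (hypothetical) */

    extern int   data_type(lobj *o);                     /* read the data-type tag */
    extern lobj *symbol_function_cell(lobj *sym);
    extern lobj *closure_function(lobj *clo);
    extern lobj *instance_method_lookup(lobj *inst);
    extern lobj *interpreter_apply_lambda(lobj *list);   /* fake SYS:APPLY-LAMBDA */
    extern lobj *translate_to_aref(lobj *array);
    extern lobj *trap_wrong_type(lobj *o);               /* "Calling Anything Else" */

    enum { DTP_FUNCTION, DTP_SYMBOL, DTP_CLOSURE, DTP_LEXICAL_CLOSURE,
           DTP_INSTANCE, DTP_LIST, DTP_ARRAY };

    lobj *resolve_callee(lobj *o)
    {
        for (;;) {
            switch (data_type(o)) {
            case DTP_FUNCTION:        return o;   /* common case, falls through */
            case DTP_SYMBOL:          o = symbol_function_cell(o);     break;
            case DTP_CLOSURE:
            case DTP_LEXICAL_CLOSURE: o = closure_function(o);         break;
            case DTP_INSTANCE:        o = instance_method_lookup(o);   break;
            case DTP_LIST:            o = interpreter_apply_lambda(o); break;
            case DTP_ARRAY:           return translate_to_aref(o);  /* no return here */
            default:                  return trap_wrong_type(o);
            }
        }
    }

The loop is just the "come back" part: everything except DTP-ARRAY
eventually lands on a DTP-FUNCTION and takes the common exit.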

> In other words, the simple common case, DTP-FUNCTION, falls through to
> the next step, while most other things jump out of the common path to go
> find a DTP-FUNCTION.

These instructions all seem to be interlaced with each other,
algorithm-wise.  The microcode must have been implemented as one large
routine with lots of jumps and conditionals which together implemented
the whole set (or few sets) of CALL instructions.

I'm wondering whether we should worry about code duplication in this
sense.  If a lot of code for each CALL macroinstruction is going to be
the same, should there be a convenient way to avoid duplicating so
much of it?  Or should we worry about that later after they all work?
Debugging one chunk of code is often easier than fixing the same bug
multiple times in different but similar algorithms...
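
If we do want to share it, the obvious shape (reusing the names from
the sketch above, plus a couple more invented helpers) would be thin
per-opcode wrappers around one common routine:

    extern void push_call_frame(lobj *fn, int nargs, int dest);   /* invented */
    extern void transfer_control(lobj *fn);                       /* invented */

    static void common_call(lobj *callee, int nargs, int dest)
    {
        lobj *fn = resolve_callee(callee);   /* shared type dispatch from above */
        push_call_frame(fn, nargs, dest);    /* shared frame/state-word setup */
        transfer_control(fn);
    }

    void op_call_0(lobj *callee, int dest)         { common_call(callee, 0, dest); }
    void op_call_n(lobj *callee, int n, int dest)  { common_call(callee, n, dest); }

Then the common path only gets debugged once, and the CALL variants
differ only in how they decode their operands.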

>     Another question is where the location-counter offset goes when it's
>     calculated from the previous function.
> 
> Into the call frame, one of the five state words, where it can be used
> to restart the LC in the caller when returning.

The LC offset is always calculated for the *previous* function and put
in the *previous* function's state on the stack, right?  So the LC
offset is never for the current function being called, but for the
function it's being called from?  Just want to clarify that.

> It's just the cache.  I think overflowing in that sense just meant that
> some had to be written out to make space for more.

Good, we can completely ignore this step then.

> If there was an actual overflow of the PDL array itself, it would
> cause a trap, where the debugger would offer an option to extend the
> array.  Either way, it's essentially transparent to the CALL
> instruction (aside from the check) as either the situation gets
> cleaned up or the call is aborted.

I get the impression that at the macroinstruction level most problems
are traps, which are restartable and/or continuable.  Essentially we
won't have to worry about errors in macroinstructions unless they're
explicitly mentioned, at least until we implement the trap handling.
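
So from the CALL instruction's point of view the whole thing reduces
to a single check, something like this (trap name and helpers are
invented; the real cleanup lives in the handler):

    #define TRAP_PDL_OVERFLOW 1

    extern void take_trap(int condition);   /* invented trap entry, see below */

    /* Returns nonzero when there is room to push 'words_needed' more words. */
    int check_pdl_room(unsigned pdl_pointer, unsigned pdl_limit, unsigned words_needed)
    {
        if (pdl_pointer + words_needed > pdl_limit) {
            take_trap(TRAP_PDL_OVERFLOW);    /* debugger may extend the array,
                                                or the call gets aborted */
            return 0;
        }
        return 1;
    }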

> (And while I forget the exact mechanism, trapping involves switching to
> the debugger stack group, saving the micro-pc so we can look it up in a
> list that gives us an atom that indicates the type of condition to
> signal.)

Hmm.  We won't have a micro-pc, AFAICT.  How would we simulate this?
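
The only thing I can think of is to skip the lookup entirely: every
trap site in our code is explicit anyway, so each one can just pass
the condition tag that the micro-pc table would have produced.
Something like (names invented):

    extern void switch_to_debugger_stack_group(void);   /* invented */
    extern void signal_condition(int condition);        /* invented: signals the
                                                            corresponding condition */

    void take_trap(int condition)
    {
        switch_to_debugger_stack_group();   /* same effect as on the real machine */
        signal_condition(condition);        /* the atom the micro-pc lookup would
                                               have given us, passed in directly */
    }

Does that seem like a reasonable substitute for the micro-pc lookup?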

'james

-- 
James A. Crippen <james@unlambda.com> ,-./-.  Anchorage, Alaska,
Lambda Unlimited: Recursion 'R' Us   |  |/  | USA, 61.2069 N, 149.766 W,
Y = \f.(\x.f(xx)) (\x.f(xx))         |  |\  | Earth, Sol System,
Y(F) = F(Y(F))                        \_,-_/  Milky Way.