GC refactor ahead

Patrick R. Michaud pmichaud at pobox.com
Fri Dec 4 19:52:09 UTC 2009


On Fri, Dec 04, 2009 at 11:26:12AM -0500, Andrew Whitworth wrote:
> 
> Let me try to explain better what I am thinking:
> [...]
> So if we call Parrot_new_immortal_pmc() to create the parent, and we
> call it again to create the child, we've manually shown that the
> two are in the same generation and we don't need intergenerational
> pointers. However, if the parent is immortal and the
> child is normal, the write barrier will detect that and add the
> intergenerational pointer record.
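
(For readers following the thread: the scheme described above is a
standard generational write barrier.  The sketch below is only
illustrative -- the PMC layout, generation flag, and remembered-set
structure are invented for the example and are not Parrot's actual GC
API -- but it shows the bookkeeping being proposed: stores between
PMCs of the same generation need no record, while a store from an
old/"immortal" parent to a young child gets remembered.)

    /* Illustrative only -- not Parrot's GC API.  A store that makes an
     * old ("immortal") PMC point at a young PMC is recorded in a
     * remembered set, so a young-generation collection can treat the
     * old parent as a root without scanning the whole old generation. */

    typedef enum { GEN_YOUNG, GEN_OLD } gc_generation;

    typedef struct PMC {
        gc_generation  gen;
        struct PMC    *child;              /* one slot, for brevity */
    } PMC;

    #define REMEMBERED_MAX 1024
    static PMC *remembered[REMEMBERED_MAX];  /* old->young references */
    static int  remembered_count = 0;

    /* hypothetical write barrier, run on every pointer store */
    static void
    gc_write_barrier(PMC *parent, PMC *child)
    {
        parent->child = child;

        if (parent->gen == child->gen)
            return;                  /* same generation: nothing to track */

        if (parent->gen == GEN_OLD && child->gen == GEN_YOUNG
                && remembered_count < REMEMBERED_MAX)
            remembered[remembered_count++] = parent;  /* record old->young */
    }

The appeal of the scheme is that the common case costs only a
generation comparison; only old-to-young stores pay for an entry in
the remembered set.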

Speaking from the perspective of an HLL developer and user, almost
nothing in the above paragraph makes any sense to me.  What would I do 
in a language like Perl 5, Python, Perl 6, or even PIR to say "hey, 
this object I'm creating is going to be part of an immortal data 
structure, use the 'immortal PMC' creator instead of the 'mortal' 
one"?  Or, if my compiler is supposed to be smart enough to figure it
out, then how will it know?  (This sounds like a variation of the 
halting problem to me.)

I also think the term "immortal PMC" is highly misleading and open
to misinterpretation -- long-lived (but still potentially mortal)
PMCs appear to be the real issue here.  I'd prefer we stick with
"long-lived" instead of "immortal".

And to be extra explicit: *all* of the long-lived PMCs that cause
the slowdown in the fib.nqp benchmark are created from PIR, at
:load :init time.  There are no special-purpose PMCs involved, no
calls to NCI subroutines, and nothing else that could yet make use
of a "Parrot_new_immortal_pmc()" interface.

Pm

