I'm working on research that builds on Cranelift's scoped elaboration algorithm. Is there a preferred way to cite this work? (@Chris Fallin)
Great question! I should really publish an actual paper :sweat_smile: For now, the EGRAPHS 2023 keynote is probably the most academic citation -- something like (in BibTeX):
@misc{aegraphs-egraphs2023,
  author       = {Chris Fallin},
  title        = {aegraphs: Acyclic E-graphs for Efficient Optimization in a Production Compiler},
  howpublished = {{EGRAPHS} 2023 keynote},
  year         = {2023},
}
if that works?
and you could also cite the RFC (https://github.com/bytecodealliance/rfcs/pull/27)
That definitely works, thanks! Unless you'd prefer the RFC to be cited, I'll go with the keynote.
One more small question, if you don't mind: if I understand correctly, e-graph extraction is currently greedy? That is, node reuse is not considered during extraction, only during elaboration, is that right?
Sure, yep, keynote is great.
Re: extraction -- that's correct, the costs don't update as we do selection. Doing so is an NP-hard problem I think (weighted set cover or something like that?). If you have ideas here on better heuristics though I'd love to hear them!
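(For concreteness, here is a minimal sketch of what "greedy, non-updating" extraction looks like. The types below are illustrative only, not Cranelift's actual aegraph data structures: each e-class gets one cost computed bottom-up, the cheapest node per class wins, and nothing rewards a node that is shared by several parents.)

use std::collections::HashMap;

// Illustrative types only -- not Cranelift's actual aegraph representation.
struct Node {
    op_cost: u64,          // cost of the operation itself
    children: Vec<usize>,  // child e-class ids
}

struct EGraph {
    classes: Vec<Vec<Node>>, // e-class id -> candidate e-nodes
}

/// Greedy extraction: for every e-class, pick the node whose cost
/// (own cost + best child-class costs) is minimal. Costs are computed
/// once and never updated as selections are made, so reuse of a node
/// by multiple parents is not rewarded during extraction.
fn extract_greedy(egraph: &EGraph) -> HashMap<usize, usize> {
    // Assumes classes are topologically ordered (acyclic), children first.
    let mut best_cost: Vec<u64> = vec![u64::MAX; egraph.classes.len()];
    let mut best_node: HashMap<usize, usize> = HashMap::new();

    for (class_id, nodes) in egraph.classes.iter().enumerate() {
        for (node_idx, node) in nodes.iter().enumerate() {
            let cost = node.children.iter().fold(node.op_cost, |acc, &child| {
                acc.saturating_add(best_cost[child])
            });
            if cost < best_cost[class_id] {
                best_cost[class_id] = cost;
                best_node.insert(class_id, node_idx);
            }
        }
    }
    best_node
}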
Chris Fallin said:
Re: extraction -- that's correct, the costs don't update as we do selection. Doing so is an NP-hard problem I think (weighted set cover or something like that?). If you have ideas here on better heuristics though I'd love to hear them!
Thanks for your answers, they're extremely helpful!
Yes, NP-hard (https://effect.systems/blog/egraph-extraction.html). In my work I formulated the problem as an ILP, and for the cases I'm looking at it works well, but that can't be the way to go for a production compiler, especially in a JIT context :sweat_smile:
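(A schematic version of such an ILP, assuming an acyclic e-graph; this is a generic formulation, not necessarily the exact one used in the work above. A binary variable x_n selects e-node n with cost c_n.)

\begin{align*}
\min\quad & \sum_{n} c_n \, x_n && x_n \in \{0,1\}\\
\text{s.t.}\quad & \sum_{n \in r} x_n \ge 1 && \text{for each root e-class } r\\
& \sum_{m \in k} x_m \ge x_n && \text{for each e-node } n \text{ and each child e-class } k \text{ of } n
\end{align*}

Because each node's cost is paid once no matter how many parents select it, the ILP accounts for reuse exactly where a greedy, non-updating pass cannot.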
(I mean, all you have to do is show that P=NP with a practical reduction...)
I guess it's an interesting question whether we could have some heuristic that does better -- divide costs by number of users, or propagate (decayed) cost upward or downward, or something like that
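(Purely as an illustration of the first idea, and hypothetical rather than anything Cranelift implements: discount a node's own cost by its number of users, so that widely shared nodes look cheaper to each parent considering them during greedy selection.)

// Hypothetical heuristic, not part of Cranelift: discount a node's own cost
// by how many e-classes use it.
fn discounted_cost(op_cost: u64, num_users: u64) -> u64 {
    // Integer division as a crude stand-in for "divide cost by number of
    // users"; a real implementation would likely want fixed-point weights.
    op_cost / num_users.max(1)
}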