Problem solvers based on graphs are hard to get your head around at first, but once you do, you get extremely elegant and powerful solutions to seemingly unsolvable problems.
The only gotchas are: 1) the time it takes to get your head around them, and 2) the algorithmic complexity of the resulting solution.
Graph theory is probably the most fulfilling application of math in computer science. In a way, graph-based algorithms do magic similar to AI, but in a fully deterministic manner. If you think about it more broadly, a graph resembles a subset of a neural network, but with only {0, 1} weights.
Maybe some day neural networks will be so obvious and well-known to the general public that this is how we'd explain graphs to kids: imagine a NN where the weights are always 0/1...
Neural networks are not so complicated. They are much, much simpler than they seem when you think about something as complex as intelligence. It even makes me sad that things as simple as neural networks perform such complex intellectual feats...
Building blocks (real ones, not metaphorical ones) are also simple.
Your brain is made of relatively simple cells. Even earthworms have neurons.
But the emergent complexity of systems made of simple neurons is staggering! That's the point, I guess. Simple bricks make complex systems.
The general public believes 1/3 is smaller than 1/4.
Only 1/4 [0] of the general public believes that, but marketers get hurt by any loss of customers...
[0] https://mises.org/mises-wire/87-statistics-are-made
You know what happened millions of years ago when the monkeys started to use tools.
It's time for AI to learn how to use tools, like a formal mathematical solver. Well, it has already been done, but not with LLMs... so... academics only?
If you like reasoning about a program in terms of expression trees/graphs, I recently discovered that Wolfram Language has built-ins for this:
https://reference.wolfram.com/language/ref/ExpressionTree.ht...
This seems, at least upon first read, analogous to global value numbering (GVN). Or, depending on how you look at it, common subexpression elimination (CSE). I am mostly wondering why they are not mentioned in the article.
GVN and CSE only identify duplicate/common subexpressions. They do not tell you where to place the computation of the common subexpression.
The canonical algorithm to do that is to compute the dominance relation. A node X dominates Y if every path to Y must go through X. Once you have computed the dominance relation, if a common subexpression is located at nodes N1, N2, N3, you can place the computation at some shared dominator of N1, N2, and N3. Because dominance is a statement about /all/ paths, there is a unique lowest dominator [1]. This is exactly the "lowest single common ancestor."
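For concreteness, here is a minimal Python sketch of that placement idea (my own illustration, not from the article or this thread): compute dominator sets with the classic iterative fixpoint, then intersect the dominator sets of the use sites and take the lowest shared dominator. The graph and node names are made up for the example.

    def dominators(graph, root):
        # graph: node -> list of successors; assumes every node is reachable.
        nodes = set(graph)
        preds = {n: set() for n in nodes}
        for n, succs in graph.items():
            for s in succs:
                preds[s].add(n)
        dom = {n: set(nodes) for n in nodes}  # start from "everything dominates n"
        dom[root] = {root}
        changed = True
        while changed:
            changed = False
            for n in nodes - {root}:
                new = {n} | set.intersection(*(dom[p] for p in preds[n]))
                if new != dom[n]:
                    dom[n], changed = new, True
        return dom

    def lowest_common_dominator(dom, uses):
        # The shared dominators of all use sites form a chain (see the claim
        # below), so the one with the largest dominator set is the lowest.
        shared = set.intersection(*(dom[u] for u in uses))
        return max(shared, key=lambda d: len(dom[d]))

    # Diamond: a subexpression used at both "b" and "c" gets placed at "root".
    g = {"root": ["a", "b"], "a": ["c"], "b": ["c"], "c": []}
    print(lowest_common_dominator(dominators(g, "root"), ["b", "c"]))  # root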
Note that dominance is also defined for cyclic graphs. There may be faster algorithms to compute dominance for acyclic graphs. Expression graphs in non-lazy programming languages are almost always acyclic (in a lazy language like Haskell, by contrast, you can write cyclic expressions).
[1] Claim. Let A, B, and C be reachable nodes. Suppose A and B both dominate C. Then either A dominates B or B dominates A.
Proof. Since C is reachable, take a path from the root to C and remove any cycles to obtain a simple path P. Both A and B lie on P, since each dominates C. Without loss of generality, suppose A appears before B on P. We show A dominates B. If not, there is a path p from the root to B that avoids A. Let P' be the suffix of P from B to C; since P is simple and A precedes B on it, P' also avoids A. Then p followed by P' is a path from the root to C avoiding A, contradicting that A dominates C. Hence A dominates B.
I’m also a bit weirded out by the lack of mentions of dominators (in TFA or in the article[1] it links to). Once you have the dominator tree, though, you’re still going to need to answer LCA queries on it (in this problem and in an SSA-based compiler both), so there seems to be no way around this part.
The algorithm in the article does O(1) queries with O(V+E) preprocessing (assuming linear-preprocessing LCA, which, yeah). What’s the best algorithm for dominator trees? People usually talk about Lengauer–Tarjan[2], which is linear in practice (linear except for UNION-FIND), and not the linear one by Georgiadis[3,4]. Unfortunately, I’m not a compiler person. (A simple binary-lifting LCA sketch follows the references below.)
[1] https://doi.org/10.1016/j.ipl.2010.02.014
[2] https://maskray.me/blog/2020-12-11-dominator-tree
[3] https://www.cs.princeton.edu/research/techreps/43
[4] https://dl.acm.org/doi/abs/10.1137/070693217
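If you don't need the fancy O(1)-query structures, binary lifting on the dominator tree is the simple option: O(V log V) preprocessing, O(log V) per query. A sketch (my own illustration, not from the linked papers; the tree is given as a parent array with parent[root] == root):

    import math

    def preprocess(parent, root):
        n = len(parent)
        LOG = max(1, math.ceil(math.log2(n)) + 1)
        depth = [0] * n
        children = [[] for _ in range(n)]
        for v, p in enumerate(parent):
            if v != root:
                children[p].append(v)
        order = [root]                      # fill depths root-to-leaf
        for v in order:
            for c in children[v]:
                depth[c] = depth[v] + 1
                order.append(c)
        up = [list(parent)]                 # up[k][v] = 2^k-th ancestor of v
        for k in range(1, LOG):
            up.append([up[k - 1][up[k - 1][v]] for v in range(n)])
        return up, depth

    def lca(up, depth, u, v):
        if depth[u] < depth[v]:
            u, v = v, u
        diff = depth[u] - depth[v]
        for k in range(len(up)):            # lift u to v's depth
            if (diff >> k) & 1:
                u = up[k][u]
        if u == v:
            return u
        for k in reversed(range(len(up))):  # lift both to just below the LCA
            if up[k][u] != up[k][v]:
                u, v = up[k][u], up[k][v]
        return up[0][u]

    up, depth = preprocess([0, 0, 0, 1, 1], root=0)
    print(lca(up, depth, 3, 4))  # 1
    print(lca(up, depth, 3, 2))  # 0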
You don't need to compute dominators. A topological sort of the graph gives you a computation order where your definitions are computed before their use.
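To illustrate (my own sketch with a made-up dependency map, not the article's code), Kahn's algorithm yields exactly such an order, with definitions emitted before their uses:

    from collections import deque

    def topo_order(deps):
        # deps: node -> list of nodes it depends on.
        # Returns an order in which every node follows all its dependencies.
        indegree = {n: len(ds) for n, ds in deps.items()}
        users = {n: [] for n in deps}
        for n, ds in deps.items():
            for d in ds:
                users[d].append(n)
        ready = deque(n for n, k in indegree.items() if k == 0)
        order = []
        while ready:
            n = ready.popleft()
            order.append(n)
            for u in users[n]:
                indegree[u] -= 1
                if indegree[u] == 0:
                    ready.append(u)
        if len(order) != len(deps):
            raise ValueError("cycle detected")
        return order

    # "$1 = g $0 c" depends on "$0 = f a b", so $0 is computed first.
    print(topo_order({"$0": [], "$1": ["$0"]}))  # ['$0', '$1']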
I mean, it does, but that’s not what TFA’s problem statement is, and I can imagine why one may not want to immediately take apart the expression into assembly-like SSA

    let $0 = f a b in
    let $1 = g $0 c in
    ...

and instead leave some original structure in place and some tree-level simplifications available.

Shit. I had the semi hard problem at my last job and I just realized that the most efficient way I could solve it would be to write a blog post on how proud I was of my (shitty) solution and wait for the snarky commenters to tell me the proper way to do it..
I love this fact about the internet! Thanks guys! Keep it up! Including the snarkiness. It’s part of what makes it great!
(I am aware this is not a novel idea. Posting the wrong solution is better than asking for help... It is just fun to see it in action.)
This is not snark! This is me genuinely wondering.
Wondered about the same thing. Perhaps the author deals with graphs with no side effects or branches? It would then trivially become CSE on a single basic block.
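For what it's worth, on a single block without side effects this is just local value numbering; a minimal sketch (my own illustration; the instruction format is made up):

    def cse_block(instrs):
        # instrs: list of (dest, op, args) with no side effects and no
        # reassignment (SSA-like). Repeated (op, args) reuse the first result.
        seen = {}   # (op, canonicalized args) -> first dest computing it
        canon = {}  # dest -> name that replaces it
        out = []
        for dest, op, args in instrs:
            key = (op, tuple(canon.get(a, a) for a in args))
            if key in seen:
                canon[dest] = seen[key]      # duplicate: drop the instruction
            else:
                seen[key] = dest
                out.append((dest, op, key[1]))
        return out

    block = [("t0", "add", ("a", "b")),
             ("t1", "add", ("a", "b")),    # duplicate of t0
             ("t2", "mul", ("t1", "c"))]   # rewritten to use t0
    print(cse_block(block))
    # [('t0', 'add', ('a', 'b')), ('t2', 'mul', ('t0', 'c'))]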
SSA transformations are essentially equivalent to what the author appears to be doing in terms of let-bindings [0].
[0] https://dl.acm.org/doi/10.1145/278283.278285
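A tiny example of the correspondence (my own illustration, not taken from the paper): each SSA assignment becomes a let-binding,

    x1 = f(a)                let x1 = f a in
    x2 = g(x1, x1)    ~      let x2 = g x1 x1 in
    return x2                x2

and phi-nodes correspond to the parameters of local join-point functions.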
I came here to mention this as well. If this problem was so critical to the company the author was working at, it seems negligent to spend a _year_ reinventing a solved problem from scratch, especially given the author's apparent history of compiler experience.
A great example of the value of academia and scientific research.
Equally important: professionals of all kinds (engineers, for example) should be aware of the state of the art in the fields of science that touch their profession, and know how to find it (if necessary through third parties, as the author did).
Knowledge is much less useful if it doesn't get applied.