It's been a high-input, low-output sort of week. In the interest of accountability and perhaps higher retention, I'll try to summarize some of the things I've learned. Themes include monads, parallelism, and geometry. :)

Umut Acar on Dynamic Parallel Programming

Umut describes two ideas from CMU theses with which I'm familiar: parallel cost semantics using computation graphs (Daniel Spoonhower) and self-adjusting computation (Umut himself, though I'm familiar with it through Ruy Ley-Wild). In the former, one can analyze the parallelism of an algorithm as a function of the dependencies between its computations; the algorithm is characterized in terms of work, the total cost of all computations, and depth, the length of the longest critical path through the dependency graph. Bob Harper wrote a nice post summarizing these ideas. With self-adjusting (or dynamic) computation, the idea is that you can re-run a program on slightly changed inputs and, if the problem is stable (small input changes …
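To make the work/depth distinction concrete, here's a minimal sketch (my own illustration, not from the talk) that computes both quantities over a tiny computation DAG, assuming unit cost per task: work counts every task once, while depth follows the longest dependency chain.

```python
# Work/depth analysis over a hypothetical computation DAG.
# Each key is a task; each value lists the tasks it depends on.
from functools import lru_cache

dag = {
    "a": [],          # two independent leaves...
    "b": [],
    "c": ["a", "b"],  # ...joined by c
    "d": ["c"],       # ...then a final sequential step
}

def work(dag):
    """Total cost: every unit-cost task counts once."""
    return len(dag)

def depth(dag):
    """Length of the longest critical path through the dependencies."""
    @lru_cache(maxsize=None)
    def d(task):
        deps = dag[task]
        return 1 + (max(map(d, deps)) if deps else 0)
    return max(d(t) for t in dag)

print(work(dag))   # 4: four tasks in total
print(depth(dag))  # 3: a (or b) -> c -> d
```

With perfect scheduling on p processors, running time is bounded below by max(work/p, depth), so a and b here could run in parallel but c and d cannot.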