We have also applied self-adjusting computation to a number of domains, including algorithms, machine learning, and software systems. In these applications, we have solved several important open problems (e.g., geometric problems involving changing and moving data sets, inference algorithms for learning from changing models).
Perhaps the most important idea in self-adjusting computation is the treatment of computation as a "first-class" mathematical and computational object that can be remembered, reused, and adapted to changes in its environment. A second important idea is the use of type systems that separate "changeable" computations from "stable" or "invariable" computations that do not and cannot change. A third is the realization of the first-class nature of computation in efficient algorithms that represent a computation as a dynamic dependency graph. Such a graph can be constructed efficiently and then updated efficiently, via a change-propagation algorithm, in response to changes in the computation's data. The research stands on advances in what computer science knows as Theory A and Theory B, building on techniques from both approaches to theoretical computer science. A key technique that permeates much of this work is the use of language-based and algorithmic cost models for predictable, analyzable efficiency. We have demonstrated that carefully designed languages and compilers can realize in practice the efficiency predicted by such theoretical models.
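To make the dependency-graph idea concrete, the following is a minimal sketch, not an implementation from any particular self-adjusting-computation system: a modifiable value records the computations that read it (the edges of a dynamic dependency graph), and writing a new value triggers a simple change-propagation step that re-runs only those dependents. All names (`Mod`, `read`, `write`) are illustrative assumptions.

```python
class Mod:
    """A modifiable value; tracks the computations that read it."""

    def __init__(self, value):
        self.value = value
        self.readers = []  # dependent computations: edges of the dependency graph

    def read(self, f):
        """Run f on the current value and record the dependency edge."""
        self.readers.append(f)
        f(self.value)

    def write(self, value):
        """Change the value and propagate the change to all dependents."""
        if value != self.value:
            self.value = value
            for f in list(self.readers):
                f(self.value)


# Build a two-node computation: result depends on x.
x = Mod(3)
result = Mod(0)
x.read(lambda v: result.write(v * v))  # result.value is now 9

x.write(5)  # change propagation re-runs only the recorded reader
print(result.value)  # → 25
```

A real system additionally timestamps dependencies, memoizes sub-computations, and re-executes them in a consistent order so that propagation cost tracks the affected part of the computation rather than its whole size; this sketch shows only the core graph-plus-propagation structure.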