This sounds impressive and very cool:
Alexander G. Gray, Bernd Fischer, Johann Schumann, Wray Buntine - Automatic Derivation of Statistical Algorithms: The EM Family and Beyond
sadly, "We made a push of progress in 2001 but currently things are stalled." (here)
Alexander G. Gray, Bernd Fischer, Johann Schumann, Wray Buntine - Automatic Derivation of Statistical Algorithms: The EM Family and Beyond
Abstract
Machine learning has reached a point where most probabilistic methods can be understood as variations, extensions and combinations of a much smaller set of abstract themes, e.g., as different instances of the EM algorithm. This enables the systematic derivation of algorithms customized for different models. Here, we demonstrate the AUTOBAYES system which takes a high-level statistical model specification, uses powerful symbolic techniques based on schema-based program synthesis and computer algebra to derive an efficient specialized algorithm for learning that model, and generates executable code implementing that algorithm. This capability is far beyond that of code collections such as Matlab toolboxes or even tools for model-independent optimization such as BUGS for Gibbs sampling: complex new algorithms can be generated without new programming, algorithms can be highly specialized and tightly crafted for the exact structure of the model and data, and efficient and commented code can be generated for different languages or systems. We present automatically-derived algorithms ranging from closed-form solutions of Bayesian textbook problems to recently-proposed EM algorithms for clustering, regression, and a multinomial form of PCA.
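For a concrete sense of the kind of algorithm the system derives, here is a minimal hand-written sketch of EM for a one-dimensional mixture of Gaussians, the clustering case the abstract mentions. This is my own illustration in Python/NumPy, not AUTOBAYES output, and all names in it are made up for the example:

```python
# Hand-written sketch (not AutoBayes output) of the kind of algorithm
# the paper derives automatically: EM for a 1-D Gaussian mixture.
import numpy as np

def em_gmm_1d(x, k, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture to data x via EM."""
    rng = np.random.default_rng(seed)
    n = len(x)
    w = np.full(k, 1.0 / k)                  # mixture weights
    mu = rng.choice(x, size=k, replace=False)  # initial means from the data
    var = np.full(k, np.var(x))              # initial variances
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i).
        log_p = (-0.5 * np.log(2 * np.pi * var)
                 - (x[:, None] - mu) ** 2 / (2 * var)
                 + np.log(w))
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: the closed-form updates that AutoBayes works out
        # symbolically from the model specification.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Usage on synthetic two-cluster data.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
print(em_gmm_1d(data, k=2))
```

The point of the paper is that code like this, specialized to whatever model you actually wrote down, would be generated for you rather than written by hand.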
Sadly, "We made a push of progress in 2001 but currently things are stalled." (here)