my talk

Feb. 11th, 2008 11:59 pm
[personal profile] gusl
I am giving the Machine Learning Lunch talk next Monday at noon, at NSH 1507. If you're on time, you can get free sandwiches from Quiznos.

I'm thinking my slides can pretty much just be "snapshots" of the paper, presented slowly (since, IMHO, the major fault of 90% of speakers is going too fast). Maybe I should add extra pictures to support the audience's intuition.

abstract:

This talk will start by presenting Shimizu et al.'s (2006) ICA-based
approach (LiNGAM) for discovering acyclic (DAG) linear Structural
Equation Models (SEMs) from causally sufficient, continuous-valued
observational data. This is remarkable because it determines the
direction of every causal arrow even when no experimental data are
available.
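To give a feel for why non-Gaussianity makes the direction identifiable, here is a minimal two-variable demo of the idea behind LiNGAM (not the algorithm from the paper itself, which works via ICA on the full system): with non-Gaussian errors, regressing in the true causal direction leaves a residual that is independent of the regressor, while the reverse regression does not. The variable names, coefficients, and the cubic dependence score below are all illustrative choices of mine.

```python
import numpy as np

# Simulate a linear SEM x1 -> x2 with non-Gaussian (uniform) errors.
rng = np.random.RandomState(0)
n = 50_000
e1 = rng.uniform(-1, 1, n)
e2 = rng.uniform(-1, 1, n)
x1 = e1
x2 = 0.8 * x1 + e2

def dependence(x, y):
    """Regress y on x by least squares; score how dependent the
    residual is on the regressor, via |E[r_std * x_std^3]|.

    In the true causal direction the residual equals the error term,
    which is independent of the regressor, so the score is near zero;
    in the reverse direction it is not. (With Gaussian errors both
    scores would vanish, and the direction would be unidentifiable.)
    """
    b = np.cov(x, y)[0, 1] / np.var(x)
    r = y - b * x
    x_std = (x - x.mean()) / x.std()
    r_std = (r - r.mean()) / r.std()
    return abs(np.mean(r_std * x_std ** 3))

forward = dependence(x1, x2)   # fit x2 on x1: the true direction
backward = dependence(x2, x1)  # fit x1 on x2: the reversed direction
print(forward, backward)       # forward score is much smaller
```

The full LiNGAM procedure generalizes this to any number of variables by running ICA on the observed data and permuting the unmixing matrix to read off the coefficient matrix B, but the source of identifiability is the same asymmetry shown here.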

Our work generalizes the above. By relaxing the acyclicity constraint,
our approach, LiNG-DG, enables the discovery of arbitrary directed
graph (DG) linear SEMs. We present several algorithm sketches for
causal discovery with LiNG-DG, and show simulation results for one
such algorithm.

When the error terms are non-Gaussian, LiNG-DG discovery algorithms
output a smaller set of candidate SEMs than Richardson's Cyclic Causal
Discovery (CCD) algorithm. We prove that all the models output by
LiNG-DG entail the same observational distribution and are equally
simple (i.e., they have the same number of edges). This implies that,
without further assumptions, no algorithm can reliably narrow the set
of candidate SEMs output by LiNG-DG using only observational data.

However, we show that under the additional assumption of "stability",
the set of candidate models output by LiNG-DG can be further narrowed
down (under some conditions, to a single model).
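The abstract does not spell out the definition of "stability", so the sketch below uses one common formalization for linear cyclic SEMs x = Bx + e (an assumption on my part): the model is stable when the spectral radius of B is below 1, so that the feedback series I + B + B² + … converges to an equilibrium. The candidate pair is hypothetical, chosen so that a two-variable feedback loop and a reversed-cycle alternative with reciprocal coefficients entail the same kind of equivalence class the abstract describes.

```python
import numpy as np

def spectral_radius(B):
    """Largest eigenvalue modulus of the coefficient matrix B."""
    return max(abs(np.linalg.eigvals(B)))

# Two-variable feedback loop x1 <-> x2 with coefficients 0.5 and 0.4.
# Cycle product 0.5 * 0.4 = 0.2, so the equilibrium is attracting.
B_stable = np.array([[0.0, 0.5],
                     [0.4, 0.0]])

# A hypothetical candidate from the same equivalence class: the cycle
# reversed, with reciprocal coefficients (cycle product 1/0.2 = 5 > 1).
B_unstable = np.array([[0.0, 1 / 0.4],
                       [1 / 0.5, 0.0]])

# The stability filter: keep only candidates with spectral radius < 1.
candidates = [B_stable, B_unstable]
stable = [B for B in candidates if spectral_radius(B) < 1]
print(len(stable))  # only the first candidate survives
```

Under this assumption the filter discards the unstable alternative outright, which is the sense in which stability can narrow the candidate set to a single model.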


As someone not yet enrolled in a PhD program, I might well be the speaker with the fewest formal qualifications in the history of the seminar.
