Let me know if you come up with something that will automate memory optimization... even if it decreases speed efficiency by a little bit. Analysts are willing to wait for big answers to big problems, but they don't like it when the program errs, stalls, or crashes because there isn't enough memory!
Oh, I see. Memory optimization is what we do when we streamline a program that keeps redundant data around... maybe it uses a non-optimal representation, like using a matrix to represent a graph with sparse connectivity, where a list of edges is much better.
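To give a sense of scale, here is a back-of-the-envelope sketch in Python. The vertex and edge counts, and the per-cell and per-edge byte sizes, are made up purely for illustration:

```python
# Compare memory for a dense adjacency matrix vs. an edge list
# for a sparse graph with n vertices and m edges, where m << n*n.

def matrix_bytes(n, cell_bytes=1):
    """Dense n x n adjacency matrix: one cell per vertex pair."""
    return n * n * cell_bytes

def edge_list_bytes(m, edge_bytes=8):
    """Edge list: one (u, v) pair per edge that actually exists."""
    return m * edge_bytes

n, m = 100_000, 300_000              # assumed: 100k vertices, ~3 edges per vertex
print(matrix_bytes(n))               # 10_000_000_000 bytes -- about 10 GB
print(edge_list_bytes(m))            # 2_400_000 bytes      -- about 2.4 MB
```

Same graph, four orders of magnitude less memory, at the cost of slower "is there an edge between u and v?" queries.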
I can imagine lots of useful applications of algorithm discovery / data-structure discovery to indexing problems. When should you use an array vs. (multiply) linked list vs. hash table, etc?
Of course, we would need to give the program more info:
* our estimate of typical input parameters (e.g. size of matrix, connectivity, etc.)
* our preferences in the space vs. time trade-off, etc. (plus vs. bandwidth in networked apps)
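A hypothetical sketch of that idea: the caller supplies estimated input parameters and a space-vs-time preference, and a simple rule picks a representation. The function name and thresholds are invented for illustration, not tuned to anything:

```python
def choose_structure(expected_size, lookup_heavy, prefer_space):
    """Pick an indexing structure from rough estimates of the workload."""
    if expected_size < 64:
        return "array"          # tiny collections: linear scan is cheap and compact
    if lookup_heavy and not prefer_space:
        return "hash table"     # O(1) expected lookups, pays for buckets / load factor
    if prefer_space:
        return "sorted array"   # binary search, no per-node pointer overhead
    return "linked list"        # cheap insertion/removal in the middle

print(choose_structure(expected_size=1_000_000, lookup_heavy=True, prefer_space=False))
# -> "hash table"
```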
Can you give me concrete examples of this memory optimization work?
I think it's feasible for us to start with a very specific automatic optimizer by looking at narrow cases (what is the programmer actually doing, etc.).
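One narrow case such an optimizer could watch for, as a purely hypothetical heuristic: a dict whose values carry no information, e.g. all None, which suggests the programmer really wanted a set:

```python
def suggest_set_instead_of_dict(d):
    """Flag dicts that are only used for membership tests."""
    if d and all(v is None for v in d.values()):
        return "values are all None; a set would store the same information in less memory"
    return None

seen = {"alice": None, "bob": None, "carol": None}
print(suggest_set_instead_of_dict(seen))
```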