“it’s like the discrete version of convex functions”

optimization is sick. our favorite kind of optimization is convex optimization because if you have a bowl, you just roll a ball to the bottom of the bowl. ezpz. sometimes the bowl is very canoe-shaped (the long axis of the canoe is 10^9 times longer than the short axis), and you can do all sorts of wacky math to roll/teleport the ball to the bottom faster.

the key to this is the rolling part. we rely on differentiability, the ability to take a gradient. sometimes we have a discrete optimization problem instead (for example: which set of 6 pokemon do i bring to the elite four). either you bring charizard or you don’t. there’s no derivative with respect to charizard because there’s no continuous variable here. no gradients, so no rolling. these problems are called “combinatorially hard” and usually you just throw some dogshit heuristic at them, and there’s no math and no guarantees and life is bad.

a submodular function (or an approximately submodular one) is a function mapping from sets (a discrete thing) to the real number line. using something called the lovasz extension, we can turn it into a continuous but not-differentiable function. the construction is kind of stupid: given a point x in [0,1]^n, sort the coordinates from largest to smallest, then the output is the cumulative sum of marginal gains (f of the top-k elements minus f of the top-(k-1)) weighted by the sorted coordinates. the magic fact is that if f is submodular, the lovasz extension is convex, so we get our bowl back — it just has sharp creases in it.

the problem is that the lovasz extension is still not differentiable at those creases, so we still can’t roll our ball the usual way. so we use something called a subgradient descent method, which basically makes an underwhelming-but-valid guess at a downhill direction every time you hit a nondifferentiable point.

so now, given a set function we wanna minimize, if it has this kind of structure, we can pervert our love of ball-rolling into rolling a ball down a bowl that has sharp corners, and that still sort of works, just not as fast as usual.

does this work? is this interesting? can i weaponize this into something useful?
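a minimal sketch of the whole pipeline, assuming a toy submodular function i made up for illustration (a graph cut on the 3-node path 0-1-2 plus a modular cost term — the edge list and costs `c` are not from anything above, just an example): compute the lovasz extension and a subgradient at a point, do projected subgradient descent over [0,1]^n, and round by checking every level set of the iterate.

```python
import numpy as np

def f(S):
    # hypothetical submodular example (my assumption, not from the post):
    # cut of the path graph 0-1-2, plus a modular (per-element) cost c
    edges = [(0, 1), (1, 2)]
    c = [-2.0, 1.0, 2.0]
    cut = sum(1 for u, v in edges if (u in S) != (v in S))
    return cut + sum(c[i] for i in S)

def lovasz_extension(f, x):
    """value and one subgradient of the lovasz extension at x in [0,1]^n.
    sort coordinates largest-first; the value is the cumulative sum of
    marginal gains f(S_k) - f(S_{k-1}) weighted by the sorted coordinates,
    and those marginal gains are exactly a subgradient."""
    order = np.argsort(-x)              # indices of x, largest coordinate first
    g = np.zeros_like(x)                # subgradient to fill in
    val, prev, S = 0.0, f(set()), set()
    for i in order:
        S.add(i)
        cur = f(S)
        g[i] = cur - prev               # marginal gain of adding element i
        val += x[i] * g[i]
        prev = cur
    return val, g

def minimize(f, n, steps=200, eta=0.1):
    """projected subgradient descent on the lovasz extension over [0,1]^n,
    tracking the best level set seen as the rounding step."""
    x = np.full(n, 0.5)
    best_set, best_val = set(), f(set())
    for t in range(steps):
        _, g = lovasz_extension(f, x)
        # every level set S_k = top-k coordinates of x is a candidate answer
        S = set()
        for i in np.argsort(-x):
            S.add(i)
            v = f(S)
            if v < best_val:
                best_val, best_set = v, set(S)
        # diminishing step size, then project back onto the unit box
        x = np.clip(x - eta / np.sqrt(t + 1) * g, 0.0, 1.0)
    return best_set, best_val

best_set, best_val = minimize(f, 3)
print(best_set, best_val)
```

for this toy f, brute force over all 8 subsets shows the minimizer is {0} with value -1, and the subgradient loop finds it. the only guarantee being used is that the lovasz extension of a submodular f is convex, so the subgradient method converges in function value at the usual slow O(1/sqrt(t)) rate.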
fuck if i know, ask me in a month