Submitted by TobusFire in MachineLearning
rm-rf_ wrote
Reply to comment by sugar_scoot in [D] Are Genetic Algorithms Dead? by TobusFire
Don't Bayesian approaches generally work better in gradient-free optimization?
scawsome wrote
Not necessarily. Bayesian methods work great when objective function evaluations are expensive and can only be run serially (or with limited parallelism). They're less well suited when evaluations are massively parallelizable (>100 points at a time) or relatively cheap, because the cost of optimizing the acquisition function starts to dominate the cost of the evaluations themselves. I've actually played around with combining BO with evolutionary algorithms to extend BO toward massively parallel evaluations and have seen some promising results. A rough sketch of what that hybrid can look like is below.
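Here's a minimal sketch of one way to do it (not a specific library's API; all function names, the (mu + lambda)-style selection, and the hyperparameters are my own illustrative choices): fit a GP surrogate, then run a simple evolutionary loop on the expected-improvement acquisition to propose a whole batch of points you can evaluate in parallel.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(X, gp, y_best):
    """Standard EI acquisition for minimization."""
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def propose_batch(gp, y_best, bounds, batch_size=100, generations=30):
    """Evolve a population under the acquisition function; return the final population."""
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = lo + (hi - lo) * np.random.rand(batch_size, bounds.shape[0])
    for _ in range(generations):
        # Gaussian mutation produces offspring; keep the fittest half overall.
        children = np.clip(pop + 0.05 * (hi - lo) * np.random.randn(*pop.shape), lo, hi)
        combined = np.vstack([pop, children])
        fitness = expected_improvement(combined, gp, y_best)
        pop = combined[np.argsort(-fitness)[:batch_size]]
    return pop

# Toy usage: fit the GP on evaluated points, then dispatch the batch in parallel.
rng = np.random.default_rng(0)
f = lambda X: np.sum((X - 0.3) ** 2, axis=1)    # cheap stand-in objective
bounds = np.array([[0.0, 1.0]] * 3)
X = rng.random((20, 3)); y = f(X)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
batch = propose_batch(gp, y.min(), bounds)      # 100 candidates to evaluate at once
```

The point of the EA here is that it only ever touches the surrogate, which is cheap, so you can afford a large population and hand back the whole final population as one parallel batch instead of sequentially maximizing the acquisition for one point at a time.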
_TheHalfTruth_ wrote
Metaheuristic algorithms like GA and simulated annealing are almost identical to Bayesian methods/MCMC. A metaheuristic becomes a Bayesian method if you treat your objective as an unnormalized probability distribution you want to maximize; for simulated annealing the correspondence is exact, since it's Metropolis sampling of p(x) proportional to exp(-f(x)/T) with the temperature T lowered over time (see the sketch below). They just take unique approaches to exploring the posterior distribution, but conceptually they're identical.
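To make that concrete, here's a minimal sketch of simulated annealing written explicitly as a Metropolis sampler (the cooling schedule, step size, and all names are illustrative assumptions, not a reference implementation):

```python
import numpy as np

def simulated_annealing(f, x0, n_steps=5000, t_start=1.0, t_end=1e-3, step=0.1):
    """Metropolis sampling of p(x) proportional to exp(-f(x)/T), with T decreasing."""
    rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    for i in range(n_steps):
        t = t_start * (t_end / t_start) ** (i / n_steps)  # geometric cooling
        x_new = x + step * rng.standard_normal(x.shape)   # symmetric proposal
        f_new = f(x_new)
        # Metropolis acceptance: accept with prob min(1, exp((fx - f_new) / t)).
        if np.log(rng.random()) < (fx - f_new) / t:
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x.copy(), fx
    return best_x, best_f

x_opt, f_opt = simulated_annealing(lambda x: np.sum((x - 2.0) ** 2), np.zeros(3))
```

At a fixed temperature this is exactly Metropolis-Hastings targeting exp(-f(x)/T); lowering T is what turns the sampler into an optimizer, since the distribution concentrates on the minimizer of f.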