Traditional methods like Markov Chain Monte Carlo (MCMC) struggle with complex distributions [4]. These methods often suffer from mode collapse, where the chain becomes trapped in a single mode of the target, reducing the diversity of generated samples and limiting the model's effectiveness. Additionally, MCMC techniques for discrete objects with combinatorial constraints are less developed than their continuous counterparts.
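The mode-trapping failure is easy to reproduce in a few lines. The sketch below is an illustrative example, not taken from the source: it runs a random-walk Metropolis sampler on a one-dimensional mixture of two well-separated Gaussians, and a chain started in one mode essentially never reaches the other.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density: two narrow Gaussians centered at -10 and +10.
    return np.logaddexp(-0.5 * ((x + 10.0) / 0.5) ** 2,
                        -0.5 * ((x - 10.0) / 0.5) ** 2)

def random_walk_metropolis(n_steps=50_000, step_size=1.0, x0=-10.0):
    x, samples = x0, []
    for _ in range(n_steps):
        proposal = x + step_size * rng.normal()
        # Accept with the usual Metropolis ratio; otherwise keep the current state.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

samples = random_walk_metropolis()
# The modes are separated by a region of negligible density, so local proposals
# are almost never accepted across the gap: the chain stays near x = -10.
print("fraction of samples near the unvisited mode at +10:", np.mean(samples > 0))
```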
Generative Flow Networks (GFlowNets) are a machine learning framework for probabilistic and generative modeling [6]. They address the challenge of sampling from unnormalized probability distributions by learning a policy on a constructed graph, enabling efficient sampling and approximating the target distribution. GFlowNets can be regarded as a combination of reinforcement learning, deep generative models, and energy-based models, offering a robust alternative to traditional sampling methods like MCMC.
Concretely, a GFlowNet builds each sample through a sequence of steps on this graph, adding one piece of the object at a time, so that the trained policy draws from an approximation of the target probability distribution [23]. Because probability mass is spread across all high-reward regions rather than trapped near a single one, GFlowNets generate diverse samples and avoid the mode collapse that limits traditional methods like MCMC.
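To make the step-by-step construction concrete, the following is a minimal, hypothetical sketch of a GFlowNet that builds binary strings one bit at a time and is trained with the trajectory-balance objective, one common GFlowNet training loss. The reward function, network architecture, and hyperparameters are assumptions made for this example, not details from the source.

```python
import torch
import torch.nn as nn

N = 8  # length of the binary strings being generated (assumed toy setting)

def log_reward(bits):
    # Toy unnormalized target with two modes: strings close to all-ones or
    # all-zeros get the highest reward, so sample diversity is easy to check.
    ones = int(bits.sum().item())
    return torch.log(torch.tensor(float(max(ones, N - ones))))

class ForwardPolicy(nn.Module):
    """Predicts log-probabilities over the two actions: append 0 or append 1."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(N + 1, 64), nn.ReLU(), nn.Linear(64, 2))
        self.log_Z = nn.Parameter(torch.zeros(()))  # learned log-partition estimate

    def forward(self, bits, t):
        # Encode the partial string: bits chosen so far, -1 padding, plus progress.
        state = torch.full((N + 1,), -1.0)
        state[:t] = bits[:t]
        state[-1] = t / N
        return torch.log_softmax(self.net(state), dim=-1)

policy = ForwardPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for step in range(2000):
    bits = torch.zeros(N)
    log_pf = torch.zeros(())
    for t in range(N):  # build the object through successive steps
        logits = policy(bits, t)
        action = torch.distributions.Categorical(logits=logits).sample()
        log_pf = log_pf + logits[action]
        bits[t] = action.float()
    # Trajectory balance: log Z + log P_F(trajectory) should match log R(x).
    # (Each string has a unique construction path here, so the backward-policy
    # term is zero.)
    loss = (policy.log_Z + log_pf - log_reward(bits)) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch, the forward policy is pushed to place probability on every high-reward string rather than a single one, so sampling trajectories after training should yield strings near both the all-ones and all-zeros modes.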