Fun times with energy-based models


This post covers Energy-Based Models (EBMs), a class of generative models that learn an underlying data distribution by assigning a scalar energy to each data point, with low energy corresponding to high probability. It reviews generative modeling in general, introduces the EBM formulation built around an energy function, discusses sampling from EBMs with methods such as Langevin dynamics, and surveys training approaches including contrastive divergence (CD), score matching, and noise contrastive estimation (NCE). The post also offers practical tips for common EBM training issues and concludes with the prospect of hardware implementations of EBMs for more efficient sampling in probabilistic AI.

  • EBMs frame generative modeling via an energy function, with density p(x) ∝ exp(-E(x)).
  • Langevin dynamics aids in sampling from EBMs (a minimal sketch follows this list).
  • CD provides a practical way to train EBMs (sketched below).
  • Score matching avoids sampling from the model during training (sketched below).
  • NCE trains the energy by discriminating data from a known noise distribution (sketched below).
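
To make the energy-function framing and Langevin sampling concrete, here is a minimal NumPy sketch. The quadratic toy energy, step size, and step count are illustrative assumptions rather than details from the post; the update rule itself is the standard unadjusted Langevin step.

```python
import numpy as np

def energy_grad(x):
    # Gradient of a toy quadratic energy E(x) = 0.5 * ||x||^2; its
    # Boltzmann density p(x) ∝ exp(-E(x)) is a standard Gaussian.
    return x

def langevin_sample(x0, step=0.01, n_steps=5000, seed=0):
    # Unadjusted Langevin dynamics:
    #   x_{t+1} = x_t - (step / 2) * grad E(x_t) + sqrt(step) * noise
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        x = x - 0.5 * step * energy_grad(x) \
              + np.sqrt(step) * rng.standard_normal(x.shape)
    return x

print(langevin_sample(np.zeros(2)))  # roughly one draw from N(0, I)
```

Only the gradient of the energy is needed; the normalizing constant of p(x) never appears, which is what makes Langevin dynamics attractive for EBMs.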
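
Contrastive divergence can be sketched in the same toy setting: a one-parameter energy E_theta(x) = 0.5 * (x - theta)^2, negatives produced by a short Langevin chain started at the data (CD-k), and a parameter update that lowers the energy of data while raising it on the negatives. The dataset, learning rate, and chain length here are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, size=256)  # hypothetical dataset from N(2, 1)
theta = 0.0                           # E_theta(x) = 0.5 * (x - theta)^2

step, k, lr = 0.1, 10, 0.5
for _ in range(200):
    # CD-k: start negative chains at the data, run k Langevin steps
    neg = data.copy()
    for _ in range(k):
        neg = neg - 0.5 * step * (neg - theta) \
                  + np.sqrt(step) * rng.standard_normal(neg.shape)
    # Gradient of [mean E(data) - mean E(neg)] w.r.t. theta,
    # using dE/dtheta = -(x - theta)
    g = -(data - theta).mean() + (neg - theta).mean()
    theta -= lr * g

print(theta)  # approaches 2.0, the data mean
```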
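
Score matching sidesteps sampling entirely. For the same toy energy, the model score is s_theta(x) = -dE/dx = theta - x, and the one-dimensional Hyvärinen objective E_data[0.5 * s(x)^2 + s'(x)] depends only on data. This closed-form toy case is an assumption for illustration; real EBMs compute these terms with automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, size=1024)  # hypothetical dataset from N(2, 1)

# Model score for E_theta(x) = 0.5 * (x - theta)^2:
#   s_theta(x) = theta - x, with constant derivative s' = -1, so the
#   Hyvarinen objective is J(theta) = mean(0.5 * s(x)^2) - 1.
theta, lr = 0.0, 0.5
for _ in range(100):
    # dJ/dtheta = mean(s_theta(x)); no model samples are needed
    theta -= lr * np.mean(theta - data)

print(theta)  # approaches 2.0, the data mean
```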
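
Noise contrastive estimation turns density estimation into binary classification: label data 1 and samples from a known noise distribution 0, then fit a logistic classifier whose logit is log p_model(x) - log p_noise(x). With an unnormalized model, the log-normalizer c is learned as an extra parameter. The Gaussian noise distribution and hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, size=2048)             # hypothetical dataset
noise_sigma = 3.0
noise = rng.normal(scale=noise_sigma, size=2048)  # samples from known noise

def log_noise(x):
    # Log density of the noise distribution N(0, noise_sigma^2)
    return -0.5 * (x / noise_sigma) ** 2 - np.log(noise_sigma * np.sqrt(2 * np.pi))

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Unnormalized model: log p_model(x) = -0.5 * (x - theta)^2 - c,
# where c is a learned stand-in for the log partition function.
theta, c, lr = 0.0, 0.0, 0.1
for _ in range(500):
    for x, y in ((data, 1.0), (noise, 0.0)):
        u = -0.5 * (x - theta) ** 2 - c - log_noise(x)  # classifier logit
        err = sigmoid(u) - y          # d(logistic loss)/du
        theta -= lr * np.mean(err * (x - theta))   # du/dtheta = x - theta
        c += lr * np.mean(err)                     # du/dc = -1

print(theta, c)  # theta ~ 2.0; c ~ log sqrt(2*pi), the true log-normalizer
```

Because the classifier only ever needs density ratios, the intractable partition function of the EBM never has to be computed.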