Deep Learning Quick Reference
Mike Bernico
The Adam optimizer
Adam is one of the best-performing optimizers known, and it's my first choice. It works well across a wide variety of problems, and it combines the best parts of momentum and RMSProp into a single update rule:
$$m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t$$

$$v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2$$

$$\hat{m}_t = \frac{m_t}{1 - \beta_1^t} \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}$$

$$\theta_t = \theta_{t-1} - \frac{\eta \, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}$$

Where $\epsilon$ is some very small number to prevent division by 0. Here $g_t$ is the gradient at step $t$, $m_t$ is an exponentially decaying average of the gradient (the momentum part), $v_t$ is an exponentially decaying average of the squared gradient (the RMSProp part), $\hat{m}_t$ and $\hat{v}_t$ are their bias-corrected versions, and $\beta_1$, $\beta_2$, and the learning rate $\eta$ are hyperparameters.
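To make the update rule concrete, here is a minimal NumPy sketch of a single Adam step. This is an illustration rather than this book's code: the function name adam_step, the state dictionary, and the hyperparameter defaults (the commonly used 0.001, 0.9, 0.999, and 1e-8) are assumptions for the example.

```python
import numpy as np

def adam_step(theta, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. `state` carries m, v, and the step counter t."""
    state["t"] += 1
    # Momentum part: decaying average of the gradient
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    # RMSProp part: decaying average of the squared gradient
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    # Bias-correct both moving averages
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    # Parameter update; eps prevents division by zero
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta
theta = np.array([1.0])
state = {"m": np.zeros_like(theta), "v": np.zeros_like(theta), "t": 0}
for _ in range(2000):
    theta = adam_step(theta, 2 * theta, state)
print(theta)  # approaches the minimum at 0
```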
Adam is often a great choice, and it's an excellent place to start when you're prototyping, so save yourself some time by starting with it.
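In Keras, for example, selecting Adam is a one-line change at compile time. The tiny model below is a throwaway sketch (the layer sizes and loss are arbitrary assumptions); the optimizer arguments shown are the Keras defaults for Adam, and note that newer Keras releases spell lr as learning_rate.

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

# An arbitrary toy model, just to have something to compile
model = Sequential([
    Dense(32, activation="relu", input_shape=(10,)),
    Dense(1),
])

# Pass an Adam instance to compile(); these values are the defaults
model.compile(optimizer=Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-8),
              loss="mse")
```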