
arXiv:2405.12807 (cs)
[Submitted on 21 May 2024 (v1), last revised 3 Sep 2024 (this version, v11)]

Title: FAdam: Adam is a natural gradient optimizer using diagonal empirical Fisher information

Authors: Dongseong Hwang
Abstract: This paper establishes a mathematical foundation for the Adam optimizer, elucidating its connection to natural gradient descent through Riemannian and information geometry. We provide an accessible and detailed analysis of the diagonal empirical Fisher information matrix (FIM) in Adam, clarifying all of the approximations involved and advocating, due to the limitations of the empirical FIM, for the use of log-probability functions over discrete distributions as the loss. Our analysis uncovers flaws in the original Adam algorithm, leading to proposed corrections such as enhanced momentum calculations, adjusted bias corrections, adaptive epsilon, and gradient clipping. We also refine the weight decay term based on our theoretical framework. Our modified algorithm, Fisher Adam (FAdam), demonstrates superior performance across diverse domains including LLMs, ASR, and VQ-VAE, achieving state-of-the-art results in ASR.
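
The abstract's central identification, that Adam's exponential moving average of squared gradients estimates the diagonal empirical Fisher information matrix, so the update is an approximate natural-gradient step, can be sketched in a few lines of Python. The following is a minimal illustrative sketch, not the paper's exact FAdam algorithm: the epsilon placement, the clipping threshold, the ordering of momentum after Fisher preconditioning, and the Fisher-preconditioned weight decay are assumptions standing in for the corrections the abstract enumerates.

import numpy as np

def fadam_like_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                    eps=1e-8, clip=1.0, weight_decay=0.0):
    """One Adam-like natural-gradient step (illustrative sketch only)."""
    t = state["t"] = state.get("t", 0) + 1
    # EMA of squared gradients ~ diagonal empirical Fisher information.
    state["fisher"] = (beta2 * state.get("fisher", np.zeros_like(grad))
                       + (1 - beta2) * grad ** 2)
    fisher_hat = state["fisher"] / (1 - beta2 ** t)  # bias-corrected FIM estimate
    # Natural gradient: precondition by the inverse (square root of the) diagonal FIM.
    nat_grad = grad / (np.sqrt(fisher_hat) + eps)
    # Clip the preconditioned gradient (assumed form of the abstract's gradient clipping).
    norm = np.linalg.norm(nat_grad)
    if norm > clip:
        nat_grad = nat_grad * (clip / norm)
    # Momentum applied to the natural gradient rather than the raw gradient (assumed ordering).
    state["m"] = (beta1 * state.get("m", np.zeros_like(grad))
                  + (1 - beta1) * nat_grad)
    m_hat = state["m"] / (1 - beta1 ** t)  # bias-corrected momentum
    # Weight decay preconditioned by the same Fisher factor (assumed form of the refined term).
    decay = weight_decay * theta / (np.sqrt(fisher_hat) + eps)
    return theta - lr * (m_hat + decay), state

# Usage sketch: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta, state = np.array([1.0, -2.0, 3.0]), {}
for _ in range(200):
    theta, state = fadam_like_step(theta, 2 * theta, state, lr=0.1)

The design point the sketch illustrates is the one the abstract makes: the square-rooted second-moment denominator is not an arbitrary heuristic but an inverse Fisher preconditioner, which is why the corrections (where to clip, where epsilon enters, how weight decay is scaled) follow from the geometric reading rather than from tuning.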
Comments: 21 pages, 4 figures, 6 tables
Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Information Theory (cs.IT)
Cite as: arXiv:2405.12807 [cs.LG]
  (or arXiv:2405.12807v11 [cs.LG] for this version)
  https://doi.org/10.48550/arXiv.2405.12807

Submission history

From: Dongseong Hwang
[v1] Tue, 21 May 2024 13:58:17 UTC (166 KB)
[v2] Thu, 23 May 2024 14:46:39 UTC (167 KB)
[v3] Sun, 26 May 2024 10:59:04 UTC (166 KB)
[v4] Tue, 28 May 2024 15:07:28 UTC (167 KB)
[v5] Mon, 3 Jun 2024 11:55:11 UTC (167 KB)
[v6] Fri, 7 Jun 2024 12:11:11 UTC (168 KB)
[v7] Fri, 28 Jun 2024 03:55:48 UTC (170 KB)
[v8] Tue, 9 Jul 2024 05:15:47 UTC (170 KB)
[v9] Sun, 4 Aug 2024 03:55:24 UTC (171 KB)
[v10] Thu, 22 Aug 2024 03:20:11 UTC (172 KB)
[v11] Tue, 3 Sep 2024 21:00:39 UTC (173 KB)