torchopt: Advanced Optimizers for Torch

Optimizers for the 'torch' deep learning library. These functions implement recent results from the literature and are not among the optimizers shipped with 'torch'. Prospective users should test these optimizers with their own data, since performance depends on the specific problem being solved. The package includes the following optimizers: (a) 'adabelief' by Zhuang et al. (2020), <arXiv:2010.07468>; (b) 'adabound' by Luo et al. (2019), <arXiv:1902.09843>; (c) 'adahessian' by Yao et al. (2021), <arXiv:2006.00719>; (d) 'adamw' by Loshchilov & Hutter (2019), <arXiv:1711.05101>; (e) 'madgrad' by Defazio and Jelassi (2021), <arXiv:2101.11075>; (f) 'nadam' by Dozat (2016), <https://openreview.net/pdf/OM0jvwB8jIp57ZJjtNEZ.pdf>; (g) 'qhadam' by Ma and Yarats (2019), <arXiv:1810.06801>; (h) 'radam' by Liu et al. (2019), <arXiv:1908.03265>; (i) 'swats' by Keskar and Socher (2017), <arXiv:1712.07628>; (j) 'yogi' by Zaheer et al. (2018), <https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization>.
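As a rough illustration of how such optimizers are used, here is a minimal sketch of a toy linear regression with 'torch'. It assumes the package exports constructors following torch's usual optimizer interface (the name optim_adamw and its arguments are assumptions, not taken from this page; consult the reference manual for the actual exported names):

```r
# Sketch only: assumes torchopt exposes torch-style optimizer constructors
# such as optim_adamw(params, lr = ...). Verify names in the reference manual.
library(torch)
library(torchopt)

# Toy data: y = x %*% c(1, -2, 0.5) plus a little noise
x <- torch_randn(100, 3)
y <- x$matmul(torch_tensor(c(1, -2, 0.5))) + torch_randn(100) * 0.1

# A single weight vector to fit
w <- torch_zeros(3, requires_grad = TRUE)
opt <- optim_adamw(list(w), lr = 0.05)  # assumed constructor name

for (i in 1:200) {
  opt$zero_grad()
  loss <- torch_mean((x$matmul(w) - y)^2)  # mean squared error
  loss$backward()
  opt$step()
}
```

The other optimizers listed above would be swapped in the same way, replacing the constructor while keeping the zero_grad/backward/step loop unchanged.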

Version: 0.1.4
Depends: R (≥ 4.0.0)
Imports: graphics, grDevices, stats, torch
Suggests: testthat
Published: 2023-06-06
Author: Gilberto Camara [aut, cre], Rolf Simoes [aut], Daniel Falbel [aut], Felipe Souza [aut]
Maintainer: Gilberto Camara <gilberto.camara.inpe at gmail.com>
License: Apache License (≥ 2)
URL: https://github.com/e-sensing/torchopt/
NeedsCompilation: no
Language: en-US
Materials: NEWS
CRAN checks: torchopt results

Documentation:

Reference manual: torchopt.pdf

Downloads:

Package source: torchopt_0.1.4.tar.gz
Windows binaries: r-devel: torchopt_0.1.4.zip, r-release: torchopt_0.1.4.zip, r-oldrel: torchopt_0.1.4.zip
macOS binaries: r-release (arm64): torchopt_0.1.4.tgz, r-oldrel (arm64): torchopt_0.1.4.tgz, r-release (x86_64): torchopt_0.1.4.tgz
Old sources: torchopt archive

Reverse dependencies:

Reverse suggests: sits

Linking:

Please use the canonical form https://CRAN.R-project.org/package=torchopt to link to this page.