Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent
Authors: Wes Whiting, Bao Wang, Jack Xin. Communications on Applied Mathematics and Computation (EI), 2024, Issue 2, pp. 1175-1188 (14 pages).
Abstract: We prove, under mild conditions, the convergence of a Riemannian gradient descent method for a hyperbolic neural network regression model, both in batch gradient descent and stochastic gradient descent. We also discuss a Riemannian version of the Adam algorithm. We show numerical simulations of these algorithms on various benchmarks.
Keywords: Hyperbolic neural network; Riemannian gradient descent; Riemannian Adam (RAdam); training convergence
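The abstract refers to Riemannian (stochastic) gradient descent on hyperbolic space. As a rough illustration of the general technique, and not the paper's actual method or experiments, the sketch below performs Riemannian SGD steps on the Poincaré ball of curvature -1: the Euclidean gradient is rescaled by the inverse metric and the update follows the exponential map. The toy objective, function names, and learning rate are assumptions for illustration only.

```python
# Minimal sketch of Riemannian SGD on the Poincare ball (curvature -1).
# Standard textbook formulas; the toy loss and hyperparameters are illustrative.
import numpy as np

def mobius_add(x, y):
    """Mobius addition x (+) y on the Poincare ball."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den

def exp_map(x, v, eps=1e-12):
    """Exponential map exp_x(v): move from x along tangent vector v."""
    v_norm = np.linalg.norm(v)
    if v_norm < eps:
        return x
    lam = 2.0 / (1.0 - np.dot(x, x))              # conformal factor lambda_x
    step = np.tanh(lam * v_norm / 2.0) * v / v_norm
    return mobius_add(x, step)

def rsgd_step(x, euclid_grad, lr):
    """One Riemannian SGD step: rescale the Euclidean gradient by the
    inverse metric (1 - ||x||^2)^2 / 4, then follow the exponential map."""
    scale = (1.0 - np.dot(x, x)) ** 2 / 4.0
    riem_grad = scale * euclid_grad
    return exp_map(x, -lr * riem_grad)

# Toy usage: drive a point toward a target inside the unit ball.
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(2)
target = np.array([0.3, -0.2])
for _ in range(200):
    euclid_grad = 2.0 * (x - target)              # gradient of a toy squared loss
    x = rsgd_step(x, euclid_grad, lr=0.05)
print(x)  # approaches the target while remaining inside the unit ball
```

The same update structure underlies Riemannian Adam-type methods: the Euclidean gradient is first converted to a Riemannian one via the metric, and the step is taken with the exponential map (or a retraction) so iterates stay on the manifold.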