Abstract: Proximal gradient descent and its accelerated variant are effective methods for minimizing the sum of a smooth and a non-smooth function. When the smooth function can be represented as a sum of multiple component functions, the stochastic proximal gradient method performs well; however, its accelerated version remains far less understood. This paper proposes a proximal stochastic accelerated gradient (PSAG) method for problems combining smooth and non-smooth components, where the smooth part is the average of multiple block sums. Moreover, most existing convergence analyses hold only in expectation. To address this, under some mild conditions, we establish almost sure convergence of the unbiased gradient estimator in the non-smooth setting, and we show that the minimum of the squared gradient-mapping norm converges to zero with probability one.
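The abstract does not spell out the iteration, so the following is a minimal sketch of one plausible proximal stochastic accelerated gradient step for an l1-regularized least-squares problem, assuming a Nesterov-style extrapolation and a uniformly sampled block; every function and parameter name here is illustrative rather than taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def psag_sketch(A, b, lam=0.1, n_iters=200, seed=0):
    """Illustrative loop for min_x (1/n) sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1,
    where the smooth part is an average over blocks (here, single rows of A)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    step = n / (np.linalg.norm(A, 2) ** 2)   # rough 1/L guess for the averaged loss
    x = np.zeros(d)
    x_prev = x.copy()
    for k in range(1, n_iters + 1):
        # Nesterov-style extrapolation (one common choice of momentum weight)
        y = x + (k - 1.0) / (k + 2.0) * (x - x_prev)
        # Unbiased stochastic gradient of the smooth part at y (one sampled block)
        i = rng.integers(n)
        grad = (A[i] @ y - b[i]) * A[i]
        # Proximal step on the non-smooth l1 term
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)
    return x

# Example usage with random data:
# x_hat = psag_sketch(np.random.randn(100, 20), np.random.randn(100))
```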
Funding: supported by the National Natural Science Foundation of China (61304085) and the Beijing Natural Science Foundation (4152040).
Abstract: An iterative learning control (ILC) algorithm using quantized error information is proposed in this paper for both linear and nonlinear discrete-time systems with stochastic noises. A logarithmic quantizer is used to guarantee an adaptive improvement in tracking performance. A decreasing learning gain is introduced into the algorithm to suppress the effects of stochastic noises and quantization errors. The input sequence is proved to converge strictly to the optimal input with respect to the given performance index. Illustrative simulations verify the theoretical analysis.
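The abstract describes the scheme only in words; the sketch below shows one way a quantized-error ILC update with a logarithmic quantizer and a decreasing learning gain a_k = 1/k might look for a scalar stochastic system. The system parameters, noise levels, and quantizer settings are assumptions made for illustration, not the paper's setting.

```python
import numpy as np

def log_quantize(v, rho=0.8, u0=1.0, eps=1e-12):
    """Logarithmic quantizer: snap each entry to the nearest level +/- u0 * rho^i."""
    mag = np.maximum(np.abs(v), eps)
    i = np.round(np.log(mag / u0) / np.log(rho))
    q = np.sign(v) * u0 * rho ** i
    return np.where(np.abs(v) < eps, 0.0, q)

def ilc_quantized(y_ref, a=0.9, b=1.0, c=1.0, n_trials=200, noise=0.01, seed=0):
    """Quantized-error ILC sketch for the scalar system
    x(t+1) = a*x(t) + b*u(t) + w(t), y(t) = c*x(t) + v(t).
    Update: u_{k+1}(t) = u_k(t) + (1/k) * Q(e_k(t+1)), with a decreasing gain 1/k."""
    rng = np.random.default_rng(seed)
    T = len(y_ref) - 1
    u = np.zeros(T)
    for k in range(1, n_trials + 1):
        x = 0.0
        y = np.zeros(T + 1)
        for t in range(T):
            x = a * x + b * u[t] + noise * rng.standard_normal()
            y[t + 1] = c * x + noise * rng.standard_normal()
        e = y_ref - y                              # tracking error on trial k
        u = u + (1.0 / k) * log_quantize(e[1:])    # quantized error, decreasing gain
    return u

# Example usage: learn the input that tracks a ramp reference over 20 steps.
# u_final = ilc_quantized(np.linspace(0.0, 1.0, 21))
```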
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 11401590 and 11571052).
Abstract: Let (Z_n) be a supercritical branching process with immigration in a random environment. Firstly, we prove that under a simple log-moment condition on the offspring and immigration distributions, the naturally normalized population size W_n converges almost surely to a finite random variable W. Secondly, we give criteria for the non-degeneracy of W and for the existence of its moments. Finally, we establish a central limit theorem, a large deviation principle and a moderate deviation principle for log Z_n.
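As a rough illustration of the almost sure stabilization of the normalized population size, the sketch below simulates a branching process with immigration in a random environment and tracks W_n = Z_n / Pi_n, where Pi_n is the product of the conditional offspring means along the environment sequence. The choice of Poisson offspring and immigration and of the environment law is purely illustrative and not taken from the paper.

```python
import numpy as np

def simulate_bpre_immigration(n_steps=30, z0=1, seed=0):
    """Simulate a supercritical branching process with immigration in a random
    environment and record W_n = Z_n / Pi_n, with Pi_n the product of the
    conditional offspring means. Environment: offspring mean m_k ~ Uniform(1.2, 2.0)
    drawn i.i.d. per generation; offspring counts ~ Poisson(m_k); immigration ~ Poisson(1)."""
    rng = np.random.default_rng(seed)
    z = z0
    pi_n = 1.0
    w_path = []
    for _ in range(n_steps):
        m_k = rng.uniform(1.2, 2.0)                        # random environment for this generation
        offspring = rng.poisson(m_k, size=z).sum() if z > 0 else 0
        z = int(offspring) + int(rng.poisson(1.0))         # reproduction plus immigration
        pi_n *= m_k
        w_path.append(z / pi_n)                            # normalized size; should stabilize near W
    return w_path

# Example usage: inspect the tail of the normalized path.
# print(simulate_bpre_immigration(n_steps=40)[-5:])
```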