Journal Articles — 2 articles found
1. Performance Enhancement of Adaptive Neural Networks Based on Learning Rate
Authors: Swaleha Zubair, Anjani Kumar Singha, Nitish Pathak, Neelam Sharma, Shabana Urooj, Samia Rabeh Larguech. Computers, Materials & Continua (SCIE, EI), 2023, No. 1, pp. 2005–2019 (15 pages).
Deep learning is the process of determining the parameters that minimize a cost function derived from the dataset; the parameters at which this minimum is reached are known as the optimal parameters. Optimization begins by initializing the parameters, and at the global minimum the cost function should no longer vary with them. The momentum technique is a common parameter-optimization approach; however, it has difficulty stopping the parameter updates once the cost function reaches the global minimum (the non-stop problem). Moreover, existing approaches rely on schedules that reduce the learning rate over the iterations, decreasing monotonically at a fixed rate over time; our goal is instead to make the learning rate respond to the parameters. We present a method that adjusts the learning rate according to the current cost-function value, so that once the cost has been minimized, the rate schedule naturally terminates. This approach is shown to guarantee convergence to the optimal parameters, meaning that our strategy minimizes the cost function (effective learning). The proposed method builds on the momentum approach; to solve its non-stop problem, we incorporate the cost-function value into the update, so the magnitude of the parameter update shrinks as the cost decreases. To verify that the method learns effectively, we provide a proof of convergence and empirical comparisons with current methods; the results were obtained using Python.
Keywords: deep learning; optimization; convergence; stochastic gradient methods
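The abstract does not give the paper's exact update rule, but its core idea — a momentum update whose learning rate is scaled by the current cost value, so steps vanish as the cost approaches its minimum — can be sketched as follows. The function name, the toy quadratic cost, and the hyperparameters (`base_lr`, `beta`, `steps`) are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def adaptive_momentum_descent(grad, cost, w0, base_lr=0.05, beta=0.5, steps=300):
    """Gradient descent with momentum where the step size is scaled by the
    current cost value, so updates shrink as the cost nears its minimum."""
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        lr = base_lr * cost(w)       # learning rate proportional to current cost
        v = beta * v - lr * grad(w)  # momentum accumulation
        w = w + v
    return w

# Toy quadratic: f(w) = 0.5 * ||w||^2 has its minimum (0) at w = 0.
cost = lambda w: 0.5 * float(np.dot(w, w))
grad = lambda w: w
w_star = adaptive_momentum_descent(grad, cost, w0=[2.0, -3.0])
```

Because the learning rate is proportional to the cost, the update magnitude decays together with the cost value, which is one way to address the non-stop problem the abstract describes.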
2. Design of ANN Based Non-Linear Network Using Interconnection of Parallel Processor
Authors: Anjani Kumar Singha, Swaleha Zubair, Areej Malibari, Nitish Pathak, Shabana Urooj, Neelam Sharma. Computer Systems Science & Engineering (SCIE, EI), 2023, No. 9, pp. 3491–3508 (18 pages).
Suspicious mass traffic constantly evolves, making network behaviour tracing and structure more complex. Neural networks yield promising results when a sufficient number of processing elements with strong interconnections between them is considered. They offer efficient computation through Hopfield neural network models, whose optimization constraints exploit a high degree of parallelism to yield optimal results. An artificial neural network (ANN) offers optimal solutions for classifying and clustering various kinds of data, and the quality of the results depends on how the problem is identified. In this research work, the design of optimized applications is presented in an organized manner. In addition, this work examines theoretical approaches to achieving optimized results using an ANN, focusing mainly on design rules. The optimizing design approach analyzes the internal processes of the neural network; practices in developing the network are based on the interconnections among the hidden nodes and their learning parameters. The methodology proves best for nonlinear resource-allocation problems with a suitable design, including complex cases. The ANN proposed here uses roughly 46k hidden nodes and 49 million connections deployed on full-fledged parallel processors. It delivered optimal results on real-world application problems, and the results were obtained using MATLAB.
Keywords: artificial neural network (ANN); multiprocessor; hidden node; nonlinear optimization; parallel processing
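The abstract's mention of Hopfield models — strongly interconnected units that settle into energy minima in parallel — can be illustrated with a minimal discrete Hopfield network. This is a generic textbook sketch, not the paper's 46k-node design; the function name, the 8-unit pattern, and the update schedule are illustrative assumptions:

```python
import numpy as np

def hopfield_recall(patterns, probe, steps=20):
    """Discrete Hopfield network: Hebbian weights store the given ±1 patterns,
    and asynchronous sign updates drive the state toward a stored attractor."""
    P = np.asarray(patterns, dtype=float)   # each row is a ±1 pattern
    n = P.shape[1]
    W = P.T @ P / n                         # Hebbian outer-product rule
    np.fill_diagonal(W, 0.0)                # no self-connections
    s = np.asarray(probe, dtype=float).copy()
    rng = np.random.default_rng(0)
    for _ in range(steps):
        for i in rng.permutation(n):        # asynchronous unit updates
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Store one 8-unit pattern and recover it from a corrupted probe.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
probe = pattern.copy()
probe[:2] *= -1                             # flip two units
recalled = hopfield_recall([pattern], probe)
```

Each update can only lower the network's energy, which is why Hopfield models map naturally onto the kind of parallel constraint-optimization the abstract refers to.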