
Gradient descent algorithm

Showing results 1-15 of 15 for Gradient descent algorithm
  • This article presents the results of improving an artificial neural network (ANN) to predict tool wear in high-speed dry turning of SKD11 steel. The original ANN was a backpropagation (BPN) model trained with the gradient descent (GD) algorithm. In the improved model, called ANN-CS, some parameters were optimized by the Cuckoo search (CS) algorithm.

    pdf15p vibego 02-02-2024 2 1   Download
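
    A minimal sketch (with hypothetical layer sizes and data, not the tool-wear model from the article) of the plain gradient descent weight update used by such a backpropagation network; the Cuckoo search step that tunes parameters in the ANN-CS variant is not shown.

        import numpy as np

        def gd_train_step(X, y, W1, b1, W2, b2, lr=0.01):
            # Forward pass: one hidden layer with tanh, linear output (regression).
            h = np.tanh(X @ W1 + b1)
            y_hat = h @ W2 + b2
            err = y_hat - y                       # prediction error, shape (n, 1)

            # Backpropagation: gradients of the mean squared error.
            n = X.shape[0]
            dW2 = h.T @ err / n
            db2 = err.mean(axis=0)
            dh = (err @ W2.T) * (1.0 - h**2)      # tanh derivative
            dW1 = X.T @ dh / n
            db1 = dh.mean(axis=0)

            # Plain gradient descent update (the "GD" training rule).
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
            return W1, b1, W2, b2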

  • This study aims to propose a solution to handle the uncertainty and imprecise knowledge associated with the collected data using Interval type-2 fuzzy inference system, or IT2FIS. IT2FIS has been shown to be capable of generalizing functional relationship between input and output while reducing computational complexity.

    pdf6p viannee 02-08-2023 5 4   Download

  • In this paper, we consider adaptive sliding mode control with radial basis function neural networks for an omni-directional mobile robot. This is a holonomic robot that can operate easily in small and narrow spaces, thanks to its ability to rotate and translate flexibly, simultaneously and independently.

    pdf8p vidoctorstrange 06-05-2023 12 6   Download

  • In this paper, two new approaches based on neural networks are proposed to find estimated solutions of the fully fuzzy nonlinear system (FFNS). To obtain the estimated solutions, a gradient descent algorithm is proposed to train the networks. An example is given to show the efficiency of the considered approaches.

    pdf15p redemption 20-12-2021 4 0   Download
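
    The FFNS in the paper involves fuzzy arithmetic; the sketch below only illustrates the underlying idea on a crisp, made-up nonlinear system: adjust the unknowns by gradient descent on the squared residual of the system.

        import numpy as np

        def F(x):
            # Illustrative crisp nonlinear system F(x) = 0 (not the fuzzy system from the paper).
            return np.array([x[0]**2 + x[1] - 3.0,
                             x[0] + x[1]**2 - 5.0])

        def jacobian(x):
            return np.array([[2 * x[0], 1.0],
                             [1.0, 2 * x[1]]])

        def solve_by_gd(x0, lr=0.05, steps=2000):
            # Minimise 0.5 * ||F(x)||^2; its gradient is J(x)^T F(x).
            x = np.asarray(x0, dtype=float)
            for _ in range(steps):
                x -= lr * jacobian(x).T @ F(x)
            return x

        print(solve_by_gd([1.0, 1.0]))   # converges near a root of the system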

  • A proper design of the architecture of Artificial Neural Network (ANN) models can provide a robust tool in water resources modelling and forecasting. The performance of different neural networks in groundwater level forecasting was examined in order to identify an optimal ANN model for groundwater level forecasting. The Devasugur nala watershed, located in the northern part of Raichur district, Karnataka, within the middle Krishna river basin, was selected for the study.

    pdf10p nguaconbaynhay6 23-06-2020 38 1   Download

  • The new parallel multiclass stochastic gradient descent algorithms aim at classifying millions of images with very-high-dimensional signatures into thousands of classes. We extend stochastic gradient descent (SGD) for support vector machines (SVM-SGD) in several ways to develop a new multiclass SVM-SGD for efficiently classifying large image datasets into many classes.

    pdf9p vititan2711 13-08-2019 32 1   Download
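
    A minimal single-machine sketch of one-versus-rest SVM-SGD in the Pegasos style (hypothetical data layout; the paper's contribution, the parallel extensions for thousands of classes, is not reproduced here).

        import numpy as np

        def svm_sgd_ovr(X, y, n_classes, lam=1e-4, epochs=5):
            # One-vs-rest linear SVMs trained with stochastic gradient descent on the
            # L2-regularised hinge loss.
            n, d = X.shape
            W = np.zeros((n_classes, d))
            t = 0
            for _ in range(epochs):
                for i in np.random.permutation(n):
                    t += 1
                    eta = 1.0 / (lam * t)                 # decaying step size
                    for c in range(n_classes):
                        target = 1.0 if y[i] == c else -1.0
                        margin = target * (W[c] @ X[i])
                        grad = lam * W[c]
                        if margin < 1.0:                  # hinge loss is active
                            grad -= target * X[i]
                        W[c] -= eta * grad
            return W

        def predict(W, X):
            return np.argmax(X @ W.T, axis=1)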

  • This paper discusses the determination and effective usage of guided information in MOEAs. The paper also surveys the use of directional information to best guide the search and improve the balance between convergence and diversity in multi-objective evolutionary algorithms.

    pdf19p viengland2711 23-07-2019 20 2   Download

  • The aim of this paper is to implement various adaptive noise cancellers (ANC) for speech enhancement based on the gradient descent approach, namely the least-mean-square (LMS) algorithm, which is then enhanced with a variable step-size strategy. In practical applications of the LMS algorithm, a key parameter is the step size. As is well known, if the step size is large, the convergence rate of the LMS algorithm will be rapid, but the steady-state mean square error (MSE) will increase. On the other hand, if the step size is small, the steady-state MSE will be small, but the convergence rate will be slow.

    pdf5p girlsseek 27-02-2019 25 0   Download
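
    A minimal sketch of an LMS adaptive noise canceller with one common variable step-size rule, where the step size tracks the smoothed error power; the parameter names and the exact VSS update are assumptions, not necessarily the rule used in the paper.

        import numpy as np

        def lms_anc(primary, reference, n_taps=16, mu_min=0.001, mu_max=0.05, alpha=0.97):
            # primary: 1-D array of speech + noise; reference: 1-D array of correlated noise.
            # Large error -> larger mu for fast convergence; small error -> smaller mu
            # for low steady-state MSE (one common VSS heuristic).
            w = np.zeros(n_taps)
            mu = mu_max
            out = np.zeros(len(primary))
            for n in range(n_taps, len(primary)):
                x = reference[n - n_taps:n][::-1]        # most recent reference samples
                e = primary[n] - w @ x                   # error = enhanced speech estimate
                mu = alpha * mu + (1 - alpha) * min(e * e, mu_max)
                mu = np.clip(mu, mu_min, mu_max)
                w += 2 * mu * e * x                      # LMS weight update
                out[n] = e
            return out, w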

  • Many machine learning problems can be cast as optimization problems. This lecture introduces optimization. The objective is for you to learn: The definitions of gradient and Hessian; the gradient descent algorithm; Newton’s algorithm; stochastic gradient descent (SGD) for online learning; popular variants, such as AdaGrad and Asynchronous SGD;...

    pdf26p allbymyself_08 22-02-2016 45 4   Download
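
    A minimal sketch, on an illustrative 2-D quadratic rather than an example from the lecture, of three of the updates the lecture covers: gradient descent with a fixed step, Newton's method, and AdaGrad.

        import numpy as np

        # Minimise f(x) = 0.5 * x^T A x - b^T x with three different update rules.
        A = np.array([[3.0, 0.5], [0.5, 1.0]])    # Hessian of f (constant for a quadratic)
        b = np.array([1.0, -2.0])

        def grad(x):
            return A @ x - b

        x_gd = np.zeros(2)
        x_newton = np.zeros(2)
        x_ada = np.zeros(2)
        g2_sum = np.zeros(2)                      # AdaGrad's accumulated squared gradients

        for _ in range(100):
            x_gd -= 0.1 * grad(x_gd)                           # gradient descent: fixed step
            x_newton -= np.linalg.solve(A, grad(x_newton))     # Newton: rescale by the Hessian
            g = grad(x_ada)
            g2_sum += g * g
            x_ada -= 0.5 * g / (np.sqrt(g2_sum) + 1e-8)        # AdaGrad: per-coordinate step

        print(x_gd, x_newton, x_ada)              # all approach the solution A^{-1} b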

  • Lecture Algorithm design - Chapter 12: Local search covers the following content: gradient descent, the Metropolis algorithm, Hopfield neural networks, maximum cut, and Nash equilibria. We invite you to refer to it.

    pdf37p youcanletgo_03 14-01-2016 48 3   Download
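
    A minimal sketch tying together two of the chapter's topics: local search with a Metropolis acceptance rule applied to maximum cut. The graph and parameters are made up for illustration.

        import math
        import random

        def metropolis_max_cut(edges, n_nodes, temperature=0.5, steps=10000):
            # Local search for MAX-CUT: each step proposes flipping one node to the
            # other side of the cut. Improving moves are always accepted; worsening
            # moves are accepted with probability exp(delta / T) (Metropolis rule),
            # which lets the search escape poor local optima.
            side = [random.randint(0, 1) for _ in range(n_nodes)]

            def flip_gain(v):
                # Change in cut weight if node v switches sides.
                d = 0.0
                for a, b, w in edges:
                    if v in (a, b):
                        other = b if v == a else a
                        d += w if side[v] == side[other] else -w
                return d

            for _ in range(steps):
                v = random.randrange(n_nodes)
                delta = flip_gain(v)
                if delta >= 0 or random.random() < math.exp(delta / temperature):
                    side[v] ^= 1

            cut_weight = sum(w for a, b, w in edges if side[a] != side[b])
            return side, cut_weight

        # Illustrative 4-cycle with unit weights; the best cut has weight 4.
        edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0)]
        print(metropolis_max_cut(edges, n_nodes=4))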

  • We present a novel semi-supervised training algorithm for learning dependency parsers. By combining a supervised large margin loss with an unsupervised least squares loss, a discriminative, convex, semi-supervised learning algorithm can be obtained that is applicable to large-scale problems. To demonstrate the benefits of this approach, we apply the technique to learning dependency parsers from combined labeled and unlabeled corpora.

    pdf9p hongphan_1 15-04-2013 38 2   Download

  • Stochastic gradient descent (SGD) uses approximate gradients estimated from subsets of the training data and updates the parameters in an online fashion. This learning framework is attractive because it often requires much less training time in practice than batch training algorithms. However, L1-regularization, which is becoming popular in natural language processing because of its ability to produce compact models, cannot be efficiently applied in SGD training, due to the large dimensions of feature vectors and the fluctuations of approximate gradients. ...

    pdf9p hongphan_1 14-04-2013 72 3   Download
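
    A minimal sketch of the general clipping idea for L1 regularization in SGD: apply the penalty after each gradient step and clip weights at zero so the penalty alone never flips a weight's sign. The paper's cumulative-penalty bookkeeping is not reproduced, and the numbers below are made up.

        import numpy as np

        def sgd_l1_step(w, grad_loss, eta, C):
            # One SGD step followed by a clipped L1 penalty (a minimal sketch of the
            # general idea; the paper's cumulative-penalty variant keeps extra
            # bookkeeping so the total penalty matches the true L1 regulariser).
            w = w - eta * grad_loss             # unregularised gradient step
            penalty = eta * C                   # L1 penalty budget for this step
            # Shrink each weight toward zero, clipping at zero -- this is what
            # keeps the model compact (sparse).
            w = np.sign(w) * np.maximum(np.abs(w) - penalty, 0.0)
            return w

        # Hypothetical usage with a made-up gradient:
        w = np.array([0.30, -0.02, 1.50])
        w = sgd_l1_step(w, grad_loss=np.array([0.1, -0.01, 0.4]), eta=0.1, C=0.5)
        print(w)   # small weights are driven exactly to zero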

  • Data-Reusing Adaptive Learning Algorithms. In this chapter, a class of data-reusing learning algorithms for recurrent neural networks is analysed. This is achieved starting from the case of feedforward neurons, through to the case of networks with feedback, trained with gradient descent learning algorithms. It is shown that the class of data-reusing algorithms outperforms the standard (a priori) algorithms for nonlinear adaptive filtering in terms of the instantaneous prediction error.

    pdf14p doroxon 12-08-2010 95 9   Download
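
    A minimal sketch of the data-reusing idea for a single linear (feedforward) adaptive filter: the same input/target pair is reused for several a-posteriori updates before the next sample arrives. The recurrent-network analysis in the chapter is not reproduced, and the function and parameter names are assumptions.

        import numpy as np

        def data_reusing_lms(x_buf, d, w, mu=0.05, reuses=3):
            # Standard (a priori) LMS does one update per sample; the data-reusing
            # variant repeats the update on the same (input, target) pair, each time
            # recomputing the error with the already-updated weights (a posteriori error).
            for _ in range(reuses):
                e = d - w @ x_buf            # error with the current weights
                w = w + mu * e * x_buf       # LMS-style correction, reusing the same data
            return w, e

    With reuses=1 this reduces to the standard a priori LMS update; more reuses lower the instantaneous prediction error at the cost of extra computation per sample.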

  • Chapter contents (Geoffrey A. Williamson, Illinois Institute of Technology): The System Identification Framework for Adaptive IIR Filtering; Algorithms and Performance Issues; Some Preliminaries. 23.2 The Equation Error Approach: The LMS and LS Equation Error Algorithms; Instrumental Variable Algorithms; Equation Error Algorithms with Unit Norm Constraints; Gradient-Descent Algorithms Based on Stability Theory. 23.3 The Output Error Approach: Output Error Algorithms. 23.4 Equation-Error/Output-Error Hybrids: The Steiglitz-McBride Family of Algorithms. 23.5 Alternate Parametrizations. 23.

    pdf21p longmontran 18-01-2010 299 6   Download

  • Introduction to Adaptive Filters, chapter contents (sections 18.1-18.5): What is an Adaptive Filter?; The Adaptive Filtering Problem; Filter Structures; The Task of an Adaptive Filter; Applications of Adaptive Filters (System Identification, Inverse Modeling, Linear Prediction, Feedforward Control); General Form of Adaptive FIR Algorithms; The Mean-Squared Error Cost Function; The Wiener Solution; The Method of Steepest Descent; The LMS Algorithm; Other Stochastic Gradient Algorithms; Finite-Precision Effects and Other Implementation Issues; System Identification Example. 18.

    pdf20p longmontran 18-01-2010 69 8   Download
